Does Kylin support Kerberos when the cube connects to HBase?


Does Kylin support Kerberos when the cube connects to HBase?

ran gabriele
Hello,


I am using Kylin 2.0.0 for CDH 5.7/5.8. My cluster is configured with Kerberos authentication.


Here is the error log I got:


17/05/17 17:25:16 WARN ipc.RpcClientImpl: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
17/05/17 17:25:16 ERROR ipc.RpcClientImpl: SASL authentication failed. The most likely cause is missing or invalid credentials. Consider 'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
    at org.apache.hadoop.hbase.security.HBaseSaslRpcClient.saslConnect(HBaseSaslRpcClient.java:181)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupSaslConnection(RpcClientImpl.java:617)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.access$700(RpcClientImpl.java:162)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:743)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection$2.run(RpcClientImpl.java:740)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:740)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:906)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:873)
    at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1242)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:227)
    at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:336)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:34094)
    at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:394)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:203)
    at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:64)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:381)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:355)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
    at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:80)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
    at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
    at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
    at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
    at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
    at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
    at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
    ... 26 more
17/05/17 17:25:16 ERROR persistence.ResourceStore: Create new store instance failed
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:91)
    at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
    at org.apache.kylin.cube.CubeDescManager.getStore(CubeDescManager.java:370)
    at org.apache.kylin.cube.CubeDescManager.reloadAllCubeDesc(CubeDescManager.java:298)
    at org.apache.kylin.cube.CubeDescManager.<init>(CubeDescManager.java:109)
    at org.apache.kylin.cube.CubeDescManager.getInstance(CubeDescManager.java:81)
    at org.apache.kylin.cube.CubeInstance.getDescriptor(CubeInstance.java:109)
    at org.apache.kylin.cube.CubeSegment.getCubeDesc(CubeSegment.java:119)
    at org.apache.kylin.cube.CubeSegment.isEnableSharding(CubeSegment.java:467)
    at org.apache.kylin.cube.kv.RowKeyEncoder.<init>(RowKeyEncoder.java:48)
    at org.apache.kylin.cube.kv.AbstractRowKeyEncoder.createInstance(AbstractRowKeyEncoder.java:48)
    at org.apache.kylin.engine.spark.SparkCubingByLayer$2.call(SparkCubingByLayer.java:205)
    at org.apache.kylin.engine.spark.SparkCubingByLayer$2.call(SparkCubingByLayer.java:193)
    at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
    at org.apache.spark.api.java.JavaPairRDD$$anonfun$pairFunToScalaFun$1.apply(JavaPairRDD.scala:1018)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)
    at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:64)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:73)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
    at org.apache.spark.scheduler.Task.run(Task.scala:89)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: File not exist by 'kylin_metadata@hbase': /mnt/disk2/yarn/nm/usercache/kylin/appcache/application_1493867056374_0598/container_e21_1493867056374_0598_01_000002/kylin_metadata@hbase
    at org.apache.kylin.common.persistence.FileResourceStore.<init>(FileResourceStore.java:49)
    ... 29 more


Re: Does Kylin support Kerberos when the cube connects to HBase?

Billy Liu
I think so. Have you given a proper Kerberos ticket to the current user?
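
For example, running klist as the user that starts Kylin should show a valid, unexpired ticket; if it doesn't, kinit with that user's keytab first (the keytab path and principal below are placeholders):

    klist
    kinit -kt /path/to/kylin.keytab kylin@YOUR.REALM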

2017-05-18 17:10 GMT+08:00 ran gabriele <[hidden email]>:


Re: Does Kylin support Kerberos when the cube connects to HBase?

shaofengshi
Kylin supports Kerberos authentication, and it doesn't need any code or configuration change on the Kylin side.

Since Kylin works as a Hadoop client and connects to the cluster in the standard ways, you just need to prepare a client machine from which the hive/yarn/hbase command lines work normally; then Kylin will work. Remember that the Kerberos ticket will expire, so you need a cron job that refreshes it periodically.
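
For example, a crontab entry for the user that runs the jobs could look like this (the keytab path and principal are placeholders; use the ones from your own environment):

    # re-obtain a Kerberos ticket from the keytab every 6 hours
    0 */6 * * * /usr/bin/kinit -kt /home/kylin/kylin.keytab kylin@YOUR.REALM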

2017-05-19 15:10 GMT+08:00 Billy Liu <[hidden email]>:




--
Best regards,

Shaofeng Shi 史少锋

Re: Does Kylin support Kerberos when the cube connects to HBase?

ranmx
In reply to this post by Billy Liu
Thank you for your reply.

Below is what I have done:
1. I have obtained a Kerberos ticket for the user and put the keytab under the home directory.
2. I have set up a crontab job to refresh the Kerberos ticket (using the keytab).
3. I set kylin.properties as follows:
    kylin.env.hadoop-conf-dir=/home/kylin/hadoop-conf
    kylin.engine.spark-conf.spark.principal=kylin/[hidden email]
    kylin.engine.spark-conf.spark.keytab=/home/kylin/kylin.keytab
    kylin.engine.spark-conf.spark.yarn.keytab=/home/kylin/kylin.keytab
    kylin.engine.spark-conf.spark.yarn.principal=kylin/[hidden email]
    kylin.engine.spark-conf.spark.yarn.security.tokens.hbase.enabled=true
4. I put all my Hadoop configuration files under '/home/kylin/hadoop-conf'.
5. I set hbase-site.xml as follows:
      <property>
          <name>hbase.security.authentication</name>
          <value>kerberos</value>
      </property>

Is there anything more I should do to make it work?
Looking forward to your reply. Thank you!


 



Re: Does Kylin support Kerberos when the cube connects to HBase?

ran gabriele
In reply to this post by shaofengshi
Thank you for your reply.

Below is what I have done:
1. I have obtained a Kerberos ticket for the user and put the keytab under the home directory.
2. I have set up a crontab job to refresh the Kerberos ticket (using the keytab).
3. I set kylin.properties as follows:
    kylin.engine.spark-conf.spark.master=yarn
    kylin.engine.spark-conf.spark.submit.deployMode=client
    kylin.env.hadoop-conf-dir=/home/kylin/hadoop-conf
    kylin.engine.spark-conf.spark.principal=kylin/kylin@MYSERVER.COM
    kylin.engine.spark-conf.spark.keytab=/home/kylin/kylin.keytab
    kylin.engine.spark-conf.spark.yarn.keytab=/home/kylin/kylin.keytab
    kylin.engine.spark-conf.spark.yarn.principal=kylin/kylin@MYSERVER.COM
    kylin.engine.spark-conf.spark.yarn.security.tokens.hbase.enabled=true
4. I put all my Hadoop configuration files under '/home/kylin/hadoop-conf'.
5. I set hbase-site.xml as follows:
      <property>
          <name>hbase.security.authentication</name>
          <value>kerberos</value>
      </property>
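
For reference, a quick way to sanity-check that the keytab and HBase access work from this client machine, using the principal and keytab above, would be something like:

    kinit -kt /home/kylin/kylin.keytab kylin/kylin@MYSERVER.COM
    klist
    echo "list" | hbase shell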

Is there anything more I should do to make it work?
Looking forward to your reply. Thank you!

Re: Does Kylin support Kerberos when the cube connects to HBase?

ran gabriele
In reply to this post by shaofengshi
Well, I found this in debug mode:

17/06/15 11:12:13 DEBUG Client: HBase Class not found: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration

I found the method, and I also found 'org.apache.hbase:hbase-common:0.98.7-hadoop2' in Maven:

def obtainTokenForHBase(
      sparkConf: SparkConf,
      conf: Configuration,
      credentials: Credentials): Unit = {
    if (shouldGetTokens(sparkConf, "hbase") && UserGroupInformation.isSecurityEnabled) {
      val mirror = universe.runtimeMirror(getClass.getClassLoader)

      try {
        val confCreate = mirror.classLoader.
          loadClass("org.apache.hadoop.hbase.HBaseConfiguration").
          getMethod("create", classOf[Configuration])
        val obtainToken = mirror.classLoader.
          loadClass("org.apache.hadoop.hbase.security.token.TokenUtil").
          getMethod("obtainToken", classOf[Configuration])

        logDebug("Attempting to fetch HBase security token.")

        val hbaseConf = confCreate.invoke(null, conf).asInstanceOf[Configuration]
        if ("kerberos" == hbaseConf.get("hbase.security.authentication")) {
          val token = obtainToken.invoke(null, hbaseConf).asInstanceOf[Token[TokenIdentifier]]
          credentials.addToken(token.getService, token)
          logInfo("Added HBase security token to credentials.")
        }
      } catch {
        case e: java.lang.NoSuchMethodException =>
          logInfo("HBase Method not found: " + e)
        case e: java.lang.ClassNotFoundException =>
          logDebug("HBase Class not found: " + e)
        case e: java.lang.NoClassDefFoundError =>
          logDebug("HBase Class not found: " + e)
        case e: Exception =>
          logError("Exception when obtaining HBase security token: " + e)
      }
    }
  }

So, do you have any idea what I should do to solve this problem?

Re: Does Kylin support Kerberos when the cube connects to HBase?

shaofengshi
Hi Ran,

I'm afraid Spark cannot connect to the HBase cluster when Kerberos authentication is enabled; a JIRA is open for this:
https://issues.apache.org/jira/browse/KYLIN-2653

2017-06-15 13:59 GMT+08:00 ran gabriele <[hidden email]>:




--
Best regards,

Shaofeng Shi 史少锋

Re: Does Kylin support Kerberos when the cube connects to HBase?

ran gabriele
Thank you for telling me that.

After some struggle, I succeeded in making Spark on YARN work for HBase token distribution.

However, I got this error instead:

17/06/19 09:50:10 ERROR ResourceStore: Create new store instance failed
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.kylin.common.persistence.ResourceStore.createResourceStore(ResourceStore.java:91)
        at org.apache.kylin.common.persistence.ResourceStore.getStore(ResourceStore.java:110)
        at org.apache.kylin.cube.CubeManager.getStore(CubeManager.java:812)
        at org.apache.kylin.cube.CubeManager.loadAllCubeInstance(CubeManager.java:732)
        at org.apache.kylin.cube.CubeManager.<init>(CubeManager.java:143)
        at org.apache.kylin.cube.CubeManager.getInstance(CubeManager.java:107)
        at org.apache.kylin.engine.spark.SparkCubingByLayer.execute(SparkCubingByLayer.java:160)
        at org.apache.kylin.common.util.AbstractApplication.execute(AbstractApplication.java:37)
        at org.apache.kylin.common.util.SparkEntry.main(SparkEntry.java:44)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.kylin.common.persistence.StorageException: Error when open connection hbase
        at org.apache.kylin.storage.hbase.HBaseConnection.get(HBaseConnection.java:242)
        at org.apache.kylin.storage.hbase.HBaseResourceStore.getConnection(HBaseResourceStore.java:73)
        at org.apache.kylin.storage.hbase.HBaseResourceStore.createHTableIfNeeded(HBaseResourceStore.java:90)
        at org.apache.kylin.storage.hbase.HBaseResourceStore.<init>(HBaseResourceStore.java:86)
        ... 22 more
Caused by: java.io.IOException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
        at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
        at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
        at org.apache.kylin.storage.hbase.HBaseConnection.get(HBaseConnection.java:229)
        ... 25 more
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
        ... 28 more
Caused by: java.lang.IllegalAccessError: tried to access class org.apache.hadoop.hbase.client.AsyncProcess from class org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.createAsyncProcess(ConnectionManager.java:2433)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:712)
        at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:652)
        ... 33 more


Do you have any idea about that?

Re: Does Kylin support Kerberos when the cube connects to HBase?

shaofengshi
I think the root cause is: "Caused by: java.lang.IllegalAccessError: tried to access class org.apache.hadoop.hbase.client.AsyncProcess from class org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation"

For what this error means, you can check:
https://stackoverflow.com/questions/7076414/java-lang-illegalaccesserror-tried-to-access-method

So there might be HBase jars of a mismatched version in Kylin's classpath.
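
One way to check is to compare the HBase jars that Kylin and Spark actually pick up against the cluster's HBase version, for example (KYLIN_HOME and SPARK_HOME depend on your installation):

    # HBase jars visible to Kylin and to Spark
    find $KYLIN_HOME -name "hbase-*.jar"
    find $SPARK_HOME -name "hbase-*.jar"
    # HBase version running on the cluster
    hbase version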


2017-06-19 13:51 GMT+08:00 ran gabriele <[hidden email]>:




--
Best regards,

Shaofeng Shi 史少锋

Re: Does Kylin support Kerberos when the cube connects to HBase?

Billy Liu
A friend told me that putting hdfs-site.xml into HADOOP_CONF_DIR will resolve the HBase Kerberos issue. Give it a try.
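
For example, assuming the hadoop-conf directory from the earlier kylin.properties and a typical CDH client config location (adjust the source path to your cluster):

    cp /etc/hadoop/conf/hdfs-site.xml /home/kylin/hadoop-conf/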

2017-06-20 0:23 GMT+08:00 ShaoFeng Shi <[hidden email]>:
