Error when building a cube


Error when building a cube

周湘伦
Hi all,
When I built a cube, an error occurred in step 16 (Convert Cuboid Data to HFile).

The versions are as follows:
hadoop-2.8.0, hbase-1.2.5, jdk1.8.0_131, kylin-2.0.0

The error appears in the logs under hadoop/userlogs; the relevant log is shown below:

2017-06-03 17:57:04,106 FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.hadoop.hbase.util.ChecksumType.getChecksumObject()Ljava/util/zip/Checksum;
        at org.apache.hadoop.hbase.io.hfile.ChecksumUtil.generateChecksums(ChecksumUtil.java:73)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.finishBlock(HFileBlock.java:943)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.ensureBlockReady(HFileBlock.java:895)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.finishBlockAndWriteHeaderAndData(HFileBlock.java:1011)
        at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.writeHeaderAndData(HFileBlock.java:997)
        at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexWriter.writeIndexBlocks(HFileBlockIndex.java:883)
        at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.close(HFileWriterV2.java:331)
        at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.close(StoreFile.java:996)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.close(HFileOutputFormat2.java:269)
        at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat2$1.close(HFileOutputFormat2.java:277)
        at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.close(ReduceTask.java:550)
        at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:629)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:389)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:175)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:169)

Following others' suggestions, I copied the jars from hadoop-2.8.0 into hbase-1.2.5, but the problem is still not resolved.

How can I solve this problem?

Thanks a lot.

Re: Error when building a cube

shaofengshi
The method "getChecksumObject" does not exist in "org.apache.hadoop.hbase.util.ChecksumType" in HBase 1.2.5, yet in your environment "org.apache.hadoop.hbase.io.hfile.ChecksumUtil" is still invoking it. This indicates there is an "hbase-server*.jar" somewhere on the classpath that isn't v1.2.5. Please search your environment globally to identify and remove it.

Usually we suggest using a formal release of CDH/HDP/MapR, which won't have such version conflicts. If you install these components separately, you will run into this kind of environment issue.
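
To find the offending jar, something along these lines may help (a sketch; the *_HOME variables are assumptions about your layout, so adjust the search roots):

    # List every hbase-server jar visible to Hadoop, HBase, Hive and Kylin;
    # any copy that is not 1.2.5 is a candidate for removal.
    find "$HADOOP_HOME" "$HBASE_HOME" "$HIVE_HOME" "$KYLIN_HOME" \
        -name 'hbase-server*.jar' 2>/dev/null

    # Alternatively, run the task JVM with -verbose:class and grep the task
    # log to see which jar the class was actually loaded from, e.g.
    #   grep 'Loaded org.apache.hadoop.hbase.io.hfile.ChecksumUtil' <task-log>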

--
Best regards,

Shaofeng Shi 史少锋

Re: Error when building a cube

周湘伦
Hi ShaoFeng,

I replaced hbase-server-1.1.1.jar in hive/lib with hbase-server-1.2.5.jar, and the above problem is solved.
Thank you!

But another problem arises:

14:16:52.441 [Job 90064ee6-2b3a-4ef7-b035-33f40391aafb-141] ERROR org.apache.kylin.job.execution.AbstractExecutable - error running Executable: MapReduceExecutable{id=90064ee6-2b3a-4ef7-b035-33f40391aafb-15, name=Convert Cuboid Data to HFile, state=RUNNING}
14:16:52.496 [Job 90064ee6-2b3a-4ef7-b035-33f40391aafb-141] ERROR org.apache.kylin.job.execution.AbstractExecutable - error running Executable: CubingJob{id=90064ee6-2b3a-4ef7-b035-33f40391aafb, name=Kylin_Sample_cube_1 - 20120101000000_20130101000000 - BUILD - GMT+08:00 2017-06-08 15:06:49, state=RUNNING}
14:16:52.537 [pool-9-thread-1] ERROR org.apache.kylin.job.impl.threadpool.DefaultScheduler - ExecuteException job:90064ee6-2b3a-4ef7-b035-33f40391aafb
org.apache.kylin.job.exception.ExecuteException: org.apache.kylin.job.exception.ExecuteException: java.lang.NoSuchFieldError: DEFAULT_TEMPORARY_HDFS_DIRECTORY
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:134) ~[kylin-core-job-2.0.0.jar:2.0.0]
        at org.apache.kylin.job.impl.threadpool.DefaultScheduler$JobRunner.run(DefaultScheduler.java:142) [kylin-core-job-2.0.0.jar:2.0.0]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]
Caused by: org.apache.kylin.job.exception.ExecuteException: java.lang.NoSuchFieldError: DEFAULT_TEMPORARY_HDFS_DIRECTORY
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:134) ~[kylin-core-job-2.0.0.jar:2.0.0]
        at org.apache.kylin.job.execution.DefaultChainedExecutable.doWork(DefaultChainedExecutable.java:64) ~[kylin-core-job-2.0.0.jar:2.0.0]
        at org.apache.kylin.job.execution.AbstractExecutable.execute(AbstractExecutable.java:124) ~[kylin-core-job-2.0.0.jar:2.0.0]
        ... 4 more


I have configured this option in hbase-site.xml:

    <property>
        <name>hbase.fs.tmp.dir</name>
        <value>hdfs://master:9000/hbase/tmp/hadoop-staging</value>
        <!-- <value>/tmp/hadoop-staging</value> -->
        <description>I have tried both of these values</description>
    </property>

I even configured this option in kylin_job_conf.xml, but it has no effect.

Of course, I have created the directory in both the local file system and HDFS (with commands along the lines of the sketch below).
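
A minimal sketch, assuming the stock hdfs CLI and the paths from the hbase.fs.tmp.dir values above:

    # create the staging directory on HDFS and locally, then verify it exists
    hdfs dfs -mkdir -p hdfs://master:9000/hbase/tmp/hadoop-staging
    mkdir -p /tmp/hadoop-staging
    hdfs dfs -ls hdfs://master:9000/hbase/tmp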

Can you tell me how to solve this problem?
Thank you!


Re: Error when building a cube

shaofengshi
It looks like another jar version mismatch. As I mentioned previously, if you install these Hadoop components separately, you will hit many errors of this kind. You need to accept the risk and spend the time fixing them one by one. Such issues are usually out of Kylin's scope, so I may not be able to help.
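
One way to narrow such mismatches down (a sketch; the *_HOME variables are assumptions about your layout) is to compare the hbase jars each component actually puts on its classpath:

    # the jars the Hadoop and HBase CLIs resolve at runtime
    hadoop classpath | tr ':' '\n' | grep hbase
    hbase classpath | tr ':' '\n' | grep 'hbase-'

    # the copies bundled with Hive and Kylin
    find "$HIVE_HOME/lib" "$KYLIN_HOME" -name 'hbase-*.jar' 2>/dev/null | sort

Any version that differs from the rest is a suspect.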

--
Best regards,

Shaofeng Shi 史少锋

Re: Error when building a cube

周湘伦
OK, I will re-install using CDH and hope to avoid these problems.

Thank you very much!
