Found the answer - beyond what is mentioned in the docs here - https://github.com/splicemachine/spliceengine/blob/branch-2.8/platforms/hdp2.6.5/docs/HDP-installation.md.
The HBase master log showed multiple NoClassDefFoundError errors for classes that live in the spark2 jars.
One of them was:
master.HMaster: The coprocessor com.splicemachine.hbase.SpliceMasterObserver threw java.io.IOException: Unexpected exception
java.io.IOException: Unexpected exception
at com.splicemachine.si.data.hbase.coprocessor.CoprocessorUtils.getIOException(CoprocessorUtils.java:30)
at com.splicemachine.hbase.SpliceMasterObserver.start(SpliceMasterObserver.java:111)
at org.apache.hadoop.hbase.coprocessor.CoprocessorHost$Environment.startup(CoprocessorHost.java:415)
at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.loadInstance(CoprocessorHost.java:256)
at org.apache.hadoop.hbase.coprocessor.CoprocessorHost.loadSystemCoprocessors(CoprocessorHost.java:159)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.<init>(MasterCoprocessorHost.java:93)
at org.apache.hadoop.hbase.master.HMaster.finishActiveMasterInitialization(HMaster.java:774)
at org.apache.hadoop.hbase.master.HMaster.access$900(HMaster.java:225)
at org.apache.hadoop.hbase.master.HMaster$3.run(HMaster.java:2038)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoClassDefFoundError: org/spark_project/guava/util/concurrent/ThreadFactoryBuilder
at com.splicemachine.timestamp.impl.TimestampClient.<init>(TimestampClient.java:108)
at com.splicemachine.timestamp.hbase.ZkTimestampSource.initialize(ZkTimestampSource.java:62)
at com.splicemachine.timestamp.hbase.ZkTimestampSource.<init>(ZkTimestampSource.java:48)
at com.splicemachine.si.data.hbase.coprocessor.HBaseSIEnvironment.<init>(HBaseSIEnvironment.java:146)
at com.splicemachine.si.data.hbase.coprocessor.HBaseSIEnvironment.loadEnvironment(HBaseSIEnvironment.java:100)
at com.splicemachine.hbase.SpliceMasterObserver.start(SpliceMasterObserver.java:81)
... 8 more
Caused by: java.lang.ClassNotFoundException: org.spark_project.guava.util.concurrent.ThreadFactoryBuilder
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 14 more
We found that this class lives in the spark-network-common jar and symlinked that jar into one of the folders on HBASE_CLASSPATH. But the master then threw another NoClassDefFoundError for a different class.
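For the follow-up errors, the quickest way to tell which spark2 jar supplies a missing class is to scan the jar listings. A minimal sketch, assuming an HDP-style spark2 jars directory (adjust the path to your install):

# Sketch: find which spark2 jar contains a missing class (directory is an assumption for a typical HDP layout).
CLASS_FILE='org/spark_project/guava/util/concurrent/ThreadFactoryBuilder.class'
SPARK_JARS_DIR='/usr/hdp/current/spark2-client/jars'
for jar in "$SPARK_JARS_DIR"/*.jar; do
  unzip -l "$jar" 2>/dev/null | grep -q "$CLASS_FILE" && echo "found in: $jar"
done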
We ended up symlinking all of the jars in the spark2/jars folder into one of the folders that is part of HBASE_CLASSPATH.
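On the master node that amounted to something like the following (a sketch; both directories are assumptions based on a typical HDP 2.6.5 layout, so point HBASE_CP_DIR at whichever folder actually appears on your HBASE_CLASSPATH):

# Sketch: symlink every spark2 jar into a directory that is already on HBASE_CLASSPATH.
SPARK_JARS_DIR='/usr/hdp/current/spark2-client/jars'   # assumption: spark2 install location
HBASE_CP_DIR='/usr/hdp/current/hbase-master/lib'       # assumption: any directory on HBASE_CLASSPATH works
ln -sf "$SPARK_JARS_DIR"/*.jar "$HBASE_CP_DIR"/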
After that, the HBase master started successfully, brought up the Splice DB processes, and we were able to connect with sqlshell.sh.
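To confirm the coprocessor loaded cleanly, we watched the master log and then started the shell (the log path below is an assumption; sqlshell.sh is the SQL shell that ships with the Splice Machine install):

# Check that SpliceMasterObserver no longer fails on startup (log path is an assumption).
grep -i "SpliceMasterObserver" /var/log/hbase/hbase-hbase-master-*.log | tail -n 5
# Then connect with the Splice Machine SQL shell from its install directory.
./sqlshell.sh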
Note: make sure the symlinks are created on every node that runs a region server.
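The region servers load the same Splice coprocessors, so the symlinks have to exist on those nodes as well. A rough way to push them out over ssh (hostnames and directories are placeholders; Ambari, pdsh, or your config management tool would do the same job):

# Sketch: repeat the symlinking on every region server node (hostnames are placeholders).
SPARK_JARS_DIR='/usr/hdp/current/spark2-client/jars'     # assumption
HBASE_CP_DIR='/usr/hdp/current/hbase-regionserver/lib'   # assumption
for host in regionserver1 regionserver2 regionserver3; do
  ssh "$host" "ln -sf $SPARK_JARS_DIR/*.jar $HBASE_CP_DIR/"
done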
What helped us work this out were these two Java files from the Splice Machine source:
https://github.com/splicemachine/spliceengine/blob/master/hbase_sql/src/main/java/com/splicemachine/hbase/SpliceMasterObserver.java
and
https://github.com/splicemachine/spliceengine/blob/ee61cadf17c97a0c5d866e2b764142f8e55311a5/splice_timestamp_api/src/main/java/com/splicemachine/timestamp/impl/TimestampServer.java