
I am trying to integrate Shark 0.9.1 (built for Hadoop 1) with Hive on a DataStax Enterprise 4.0.3 Hadoop node. I have successfully installed and configured Scala 2.10.3 and Spark 1.0.0, and both the Scala and Spark shells work fine. Now, when I try to open the Shark shell with SHARK_HOME/bin/shark, I get this error:

 14/06/26 10:04:14 DEBUG metastore.CassandraHiveMetaStore: Creating CassandraHiveMetaStore
 14/06/26 10:04:15 ERROR config.DatabaseDescriptor: Fatal configuration error
    org.apache.cassandra.exceptions.ConfigurationException: Cannot locate cassandra.yaml
    at org.apache.cassandra.config.YamlConfigurationLoader.getStorageConfigURL(YamlConfigurationLoader.java:64)
    at org.apache.cassandra.config.YamlConfigurationLoader.loadConfig(YamlConfigurationLoader.java:75)
    at org.apache.cassandra.config.DatabaseDescriptor.loadConfig(DatabaseDescriptor.java:135)
    at org.apache.cassandra.config.DatabaseDescriptor.<clinit>(DatabaseDescriptor.java:111)
    at org.apache.cassandra.service.StorageService.getPartitioner(StorageService.java:148)
    at org.apache.cassandra.service.StorageService.<init>(StorageService.java:142)
    at org.apache.cassandra.service.StorageService.<clinit>(StorageService.java:144)
    at com.datastax.bdp.hadoop.hive.metastore.CassandraClientHolder.<init>(CassandraClientHolder.java:69)
    at com.datastax.bdp.hadoop.hive.metastore.SchemaManagerService.<init>(SchemaManagerService.java:111)
    at com.datastax.bdp.hadoop.hive.metastore.CassandraHiveMetaStore.setConf(CassandraHiveMetaStore.java:114)
    at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
    at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.<init>(RetryingRawStore.java:64)
    at org.apache.hadoop.hive.metastore.RetryingRawStore.getProxy(RetryingRawStore.java:73)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:415)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:402)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:441)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:326)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:286)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:54)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:59)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newHMSHandler(HiveMetaStore.java:4060)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:121)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1210)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2136)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2147)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1043)
    at shark.memstore2.TableRecovery$.reloadRdds(TableRecovery.scala:49)
    at shark.SharkCliDriver.<init>(SharkCliDriver.scala:283)
    at shark.SharkCliDriver$.main(SharkCliDriver.scala:162)
    at shark.SharkCliDriver.main(SharkCliDriver.scala)
 Cannot locate cassandra.yaml
  Fatal configuration error; unable to start. See log for stacktrace.
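From the trace, the failure is in Cassandra's YamlConfigurationLoader, which resolves the config location from the cassandra.config system property before looking anywhere else. One possible workaround sketch (untested here; it assumes the Shark launch scripts forward SPARK_JAVA_OPTS to the JVM and a package-style DSE install path, so verify both against SHARK_HOME/run and your node):

    # cassandra.config is the standard system property read by Cassandra's
    # YamlConfigurationLoader; the yaml path below assumes a DSE package
    # install, and the SPARK_JAVA_OPTS hand-off is an assumption about the
    # Shark launch scripts. Adjust both to match the actual setup.
    export SPARK_JAVA_OPTS="$SPARK_JAVA_OPTS -Dcassandra.config=file:///etc/dse/cassandra/cassandra.yaml"
    $SHARK_HOME/bin/shark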

Has anyone run into this problem? Any suggestions would be appreciated. Thanks.


1 Answer


Try DataStax Enterprise 4.5, which ships with built-in support for Spark and Shark.
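For reference, a minimal sketch of how the bundled shells are typically started once the integrated analytics package is in place (the dse wrapper commands below are an assumption about a standard DSE 4.5 install; check dse --help on your node):

    dse spark    # Spark shell against the DSE-managed cluster
    dse shark    # Shark SQL shell, no separate SHARK_HOME or manual Spark setup needed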

answered 2014-07-04T04:41:59.563