SparkSQL + Hive + HBase + HBaseIntegration doesn't work (2)

I get an error when I try to connect to a Hive table (created through HBaseIntegration) from Spark.

Steps I followed. Hive table creation code:

CREATE TABLE test.sample(id string, name string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,details:name")
TBLPROPERTIES ("hbase.table.name" = "sample");

DESCRIBE test.sample;

col_name   data_type   comment
id         string      from deserializer
name       string      from deserializer

Starting the Spark shell with this command:

spark-shell --master local[2] --driver-class-path /usr/local/hive/lib/hive-hbase-handler-1.2.1.jar:/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-protocol-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-hadoop2-compat-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-hadoop-compat-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-client-0.98.9-hadoop2.jar:/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2.jar:/usr/local/hbase/lib/htrace-core-2.04.jar:/usr/local/hbase/lib/hbase-common-0.98.9-hadoop2-tests.jar:/usr/local/hbase/lib/hbase-server-0.98.9-hadoop2-tests.jar:/usr/local/hive/lib/zookeeper-3.4.6.jar:/usr/local/hive/lib/guava-14.0.1.jar

In spark-shell:

val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
sqlContext.sql("select count(*) from test.sample").collect()

Stack trace:

SQL context available as sqlContext.

scala> sqlContext.sql("select count(*) from test.sample").collect()
16/09/02 04:49:28 INFO parse.ParseDriver: Parsing command: select count(*) from test.sample
16/09/02 04:49:35 INFO parse.ParseDriver: Parse Completed
16/09/02 04:49:40 INFO metastore.HiveMetaStore: 0: get_table : db=test tbl=sample
16/09/02 04:49:40 INFO HiveMetaStore.audit: ugi=hdfs ip=unknown-ip-addr cmd=get_table : db=test tbl=sample
java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes
    at org.apache.hadoop.hive.hbase.HBaseSerDe.parseColumnsMapping(HBaseSerDe.java:184)
    at org.apache.hadoop.hive.hbase.HBaseSerDeParameters.<init>(HBaseSerDeParameters.java:73)
    at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:117)
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:53)
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:521)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:391)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:276)
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:258)
    at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:605)
    at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:331)
    at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1$$anonfun$3.apply(ClientWrapper.scala:326)
    at scala.Option.map(Option.scala:145)
    at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:326)
    at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$getTableOption$1.apply(ClientWrapper.scala:321)
    at org.apache.spark.sql.hive.client.ClientWrapper$$anonfun$withHiveState$1.apply(ClientWrapper.scala:279)
    at org.apache.spark.sql.hive.client.ClientWrapper.liftedTree1$1(ClientWrapper.scala:226)
    at org.apache.spark.sql.hive.client.ClientWrapper.retryLocked(ClientWrapper.scala:225)
    at org.apache.spark.sql.hive.client.ClientWrapper.withHiveState(ClientWrapper.scala:268)
    at org.apache.spark.sql.hive.client.ClientWrapper.getTableOption(ClientWrapper.scala:321)
    at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:122)
    at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:60)
    at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:384)
    at org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:457)
    at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:161)
    at org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation(HiveContext.scala:457)
    at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.getTable(Analyzer.scala:303)

I am using Hadoop 2.6.0, Spark 1.6.0, Hive 1.2.1, and HBase 0.98.9.

I added this configuration in hadoop-env.sh:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HBASE_HOME/lib/*
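To confirm that an export like this actually took effect, one can inspect the classpath the Hadoop launcher reports. This is a sketch of mine, not from the original post; it assumes `hadoop` is on the PATH and that hadoop-env.sh has been re-sourced:

```shell
# List every HBase jar visible on the Hadoop classpath.
# `hadoop classpath` prints a single colon-separated string,
# so split it into one entry per line before filtering.
hadoop classpath | tr ':' '\n' | grep -i hbase
```

If this prints nothing, the jars never made it onto the classpath and the `NoClassDefFoundError` is expected.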

Can anybody suggest a solution?


java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/util/Bytes

happens because the HBase-related jars are not on the classpath.

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:`hbase classpath`

should pull in all the HBase-related jars, or see my answer here about using --jars.
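Note that `--jars` expects a comma-separated list, while `hbase classpath` prints a colon-separated one. A minimal sketch of the conversion (my own, assuming `hbase` and `spark-shell` are on the PATH):

```shell
# `hbase classpath` emits entries separated by ':' ; spark-shell's
# --jars flag wants them separated by ','. Translate one into the other.
HBASE_JARS=$(hbase classpath | tr ':' ',')

# Launch the shell with all HBase jars shipped to driver and executors.
spark-shell --master local[2] --jars "$HBASE_JARS"
```

This avoids hand-maintaining the long jar list from the question, at the cost of shipping a few jars that Spark does not strictly need.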

Note: to verify the classpath, you can add the following code in the driver to print all classpath resources.

Scala version:

val cl = ClassLoader.getSystemClassLoader
cl.asInstanceOf[java.net.URLClassLoader].getURLs.foreach(println)

Java version:

import java.net.URL;
import java.net.URLClassLoader;
...
ClassLoader cl = ClassLoader.getSystemClassLoader();
URL[] urls = ((URLClassLoader) cl).getURLs();
for (URL url : urls) {
    System.out.println(url.getFile());
}


I got it working. You have to use the jars below.

spark-shell --master yarn-client --executor-cores 10 --executor-memory 20G --num-executors 15 --driver-memory 2G \
  --driver-class-path /usr/hdp/current/hbase-client/lib/hbase-common.jar:/usr/hdp/current/hbase-client/lib/hbase-client.jar:/usr/hdp/current/hbase-client/lib/hbase-server.jar:/usr/hdp/current/hbase-client/lib/hbase-protocol.jar:/usr/hdp/current/hbase-client/lib/guava-12.0.1.jar:/usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar \
  --jars /usr/hdp/current/hbase-client/lib/hbase-client.jar,/usr/hdp/current/hbase-client/lib/hbase-common.jar,/usr/hdp/current/hbase-client/lib/hbase-server.jar,/usr/hdp/current/hbase-client/lib/guava-12.0.1.jar,/usr/hdp/current/hbase-client/lib/hbase-protocol.jar,/usr/hdp/current/hbase-client/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/current/hive-client/lib/hive-hbase-handler.jar \
  --files /etc/spark/conf/hbase-site.xml