
How to fix this error in vanilla Hadoop / Hive



mapreduce hadoop

I am facing the following error when running a MapReduce job on Linux (CentOS). I added all the jars to the classpath. The database and the table already exist in Hive, and the table has data in its columns, yet I still cannot access the table data from the Hive database. I am using a vanilla Hadoop distribution for the job. Should I edit the hive-site.xml file with the MySQL driver path and the Hive username and password? If so, please tell me the procedure for adding the username and password for Hive. Thanks in advance.
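For reference, I believe the relevant hive-site.xml entries would look something like the sketch below; the JDBC URL, database name, user, and password here are placeholders, not my actual values:

<!-- hive-site.xml: metastore connection (placeholder values) -->
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hiveuser</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hivepassword</value>
  </property>
</configuration>

In the log below, the MetaStoreDirectSql line says "MySQL check failed, assuming we are not on mysql", so it looks like the job is not picking up this configuration and is opening a local/embedded metastore instead, which might be why bigdata.categories is reported as not found even though it exists in Hive.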

murali]# hadoop jar /home/murali/workspace/hadoop/HiveInputForMapper/target/HiveInputForMapper-0.0.1-SNAPSHOT.jar com.cosmonet.HiveInputDriver -libjars $LIBJARS
Java HotSpot(TM) Server VM warning: You have loaded library /hadoop/hadoop/lib/native/libhadoop.so which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/11/21 11:26:19 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/11/21 11:26:20 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
14/11/21 11:26:20 INFO metastore.ObjectStore: ObjectStore, initialize called
14/11/21 11:26:20 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
14/11/21 11:26:20 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
14/11/21 11:26:22 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
14/11/21 11:26:22 INFO metastore.MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "@" (64), after : "".
14/11/21 11:26:23 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
14/11/21 11:26:23 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
14/11/21 11:26:23 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
14/11/21 11:26:23 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
14/11/21 11:26:23 INFO DataNucleus.Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery@0" since the connection used is closing
14/11/21 11:26:23 INFO metastore.ObjectStore: Initialized ObjectStore
14/11/21 11:26:24 INFO metastore.HiveMetaStore: Added admin role in metastore
14/11/21 11:26:24 INFO metastore.HiveMetaStore: Added public role in metastore
14/11/21 11:26:24 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
14/11/21 11:26:24 INFO metastore.HiveMetaStore: 0: get_databases: NonExistentDatabaseUsedForHealthCheck
14/11/21 11:26:24 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_databases: NonExistentDatabaseUsedForHealthCheck
14/11/21 11:26:24 INFO metastore.HiveMetaStore: 0: get_table : db=bigdata tbl=categories
14/11/21 11:26:24 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=bigdata tbl=categories
Exception in thread "main" java.io.IOException: NoSuchObjectException(message:bigdata.categories table not found)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:97)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:71)
    at com.cosmonet.HiveInputDriver.run(HiveInputDriver.java:27)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
    at com.cosmonet.HiveInputDriver.main(HiveInputDriver.java:49)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: NoSuchObjectException(message:bigdata.categories table not found)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.get_table(HiveMetaStore.java:1560)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.invoke(RetryingHMSHandler.java:105)
    at com.sun.proxy.$Proxy9.get_table(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getTable(HiveMetaStoreClient.java:997)
    at org.apache.hive.hcatalog.common.HCatUtil.getTable(HCatUtil.java:191)
    at org.apache.hive.hcatalog.mapreduce.InitializeInput.getInputJobInfo(InitializeInput.java:105)
    at org.apache.hive.hcatalog.mapreduce.InitializeInput.setInput(InitializeInput.java:86)
    at org.apache.hive.hcatalog.mapreduce.HCatInputFormat.setInput(HCatInputFormat.java:95)
    ... 10 more
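For context, the driver follows the general pattern below. This is a simplified sketch rather than the exact com.cosmonet.HiveInputDriver source; the bigdata/categories names come from the log above, and the mapper and output path are illustrative placeholders.

package com.cosmonet;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.WritableComparable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;
import org.apache.hive.hcatalog.data.HCatRecord;
import org.apache.hive.hcatalog.mapreduce.HCatInputFormat;

public class HiveInputDriver extends Configured implements Tool {

    // Trivial mapper: emits the first column of every Hive row with a count of 1.
    public static class CategoryMapper
            extends Mapper<WritableComparable, HCatRecord, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(WritableComparable key, HCatRecord value, Context context)
                throws IOException, InterruptedException {
            context.write(new Text(String.valueOf(value.get(0))), ONE);
        }
    }

    @Override
    public int run(String[] args) throws Exception {
        Job job = Job.getInstance(getConf(), "hcat-input-example");
        job.setJarByClass(HiveInputDriver.class);

        // This is the call that fails with NoSuchObjectException: it asks the
        // metastore that the job configuration points at for db "bigdata",
        // table "categories".
        HCatInputFormat.setInput(job, "bigdata", "categories");
        job.setInputFormatClass(HCatInputFormat.class);

        job.setMapperClass(CategoryMapper.class);
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(IntWritable.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        job.setOutputFormatClass(TextOutputFormat.class);

        // Placeholder output path for this sketch.
        FileOutputFormat.setOutputPath(job, new Path("/tmp/hcat-categories-out"));

        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new HiveInputDriver(), args));
    }
}

As the stack trace shows, the exception is thrown on the HCatInputFormat.setInput(...) line inside run(), before any map task starts, so the job dies at submission time while looking up the table in the metastore.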