Some issues when importing from MySQL into Hive with Sqoop

Posted: 2018-09-23 11:42:15

1. Error: Sqoop: Import failed: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf

Solution 1:

When importing data from MySQL into Hive, the job fails with:

ERROR tool.ImportTool:Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
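
For context, this kind of error shows up when running a Hive import along the lines of the command below; the connection URL, credentials, and table names are placeholders, not values from the original post:

# example Sqoop 1.x Hive import (placeholder host/db/credentials)
sqoop import \
  --connect jdbc:mysql://localhost:3306/testdb \
  --username root \
  --password 123456 \
  --table orders \
  --hive-import \
  --hive-table testdb.orders \
  -m 1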

This happens because Sqoop needs one of Hive's jars on its classpath. Copy hive-common-2.3.3.jar from hive/lib into Sqoop's lib directory and rerun the import.
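
A minimal sketch of the copy, assuming Hive and Sqoop are installed under $HIVE_HOME and $SQOOP_HOME and that your Hive release ships hive-common-2.3.3.jar (adjust the version to match your installation):

# put the Hive common jar where Sqoop can find HiveConf
cp $HIVE_HOME/lib/hive-common-2.3.3.jar $SQOOP_HOME/lib/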

Solution 2:

Add the following line to /etc/profile:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*
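
Then reload the profile so the variable takes effect in the current shell (assuming a bash login shell; new sessions pick it up automatically):

source /etc/profile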

2. Error: main ERROR Could not register mbeans java.security.AccessControlException: access denied ("javax.management.MBeanTrustPermission" "register")

Solution:

Add the following lines to the java.policy file under <JRE_HOME>/lib/security:
grant {
  permission javax.management.MBeanTrustPermission "register";
};
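
One way to apply this from the shell, assuming the JRE used by Sqoop lives under $JAVA_HOME/jre (the path is an assumption; point it at your actual JRE):

# append the MBeanTrustPermission grant to the JRE-wide policy file
cat >> $JAVA_HOME/jre/lib/security/java.policy <<'EOF'
grant {
  permission javax.management.MBeanTrustPermission "register";
};
EOF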

3. Error: ERROR exec.DDLTask: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.ObjectMapper.readerFor(Ljava/lang/Class;)Lcom/fasterxml/jackson/databind/ObjectReader;

Solution: the Jackson jar versions conflict. Back up the jackson*.jar files under sqoop/lib, then copy the jackson*.jar files from hive/lib into sqoop/lib.
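
A sketch of the swap, again assuming $SQOOP_HOME and $HIVE_HOME point at your installations (the backup directory name is arbitrary):

# move Sqoop's own Jackson jars out of the way, keeping them as a backup
mkdir -p $SQOOP_HOME/lib/jackson.bak
mv $SQOOP_HOME/lib/jackson*.jar $SQOOP_HOME/lib/jackson.bak/
# use the Jackson jars that ship with Hive instead
cp $HIVE_HOME/lib/jackson*.jar $SQOOP_HOME/lib/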

4. Error: Required table missing : "DBS" in Catalog "" Schema "". DataNucleus requires this table to perform its persistence operations. Either your MetaData is incorrect, or you need to enable "datanucleus.schema.autoCreateTables"

Solution:

In hive-site.xml, find the datanucleus.schema.autoCreateAll property and set its value to true.
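
The resulting property block would look roughly like this (a sketch; the rest of your hive-site.xml stays unchanged):

<property>
  <name>datanucleus.schema.autoCreateAll</name>
  <value>true</value>
</property>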

Reference: http://blog.itpub.net/31506529/viewspace-2213877/

 
