Import command:
./sqoop import \
  -Dmapreduce.map.java.opts=-Xmx3000m \
  -Dmapreduce.map.memory.mb=3200 \
  --connect jdbc:oracle:thin:@192.168.113.17:1521:btobbi \
  --username tianlianbi --P \
  --table BIO_PRODUCT_MAIN \
  --hive-import --hive-overwrite \
  -m 4
The data has already landed in HDFS and the Hive table was created automatically, but the job fails at the very last step and the Hive table ends up empty.
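For reference, the "table exists but is empty" symptom can be double-checked from the Hive CLI. A quick sketch, assuming the table landed in the default database under its original name:

hive -e "DESCRIBE FORMATTED BIO_PRODUCT_MAIN;"    # confirm the table exists and note its LOCATION
hive -e "SELECT COUNT(*) FROM BIO_PRODUCT_MAIN;"  # returns 0 if nothing was loaded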
The error log is as follows:
17/03/23 17:19:23 INFO hive.HiveImport: OK
17/03/23 17:19:23 INFO hive.HiveImport: Time taken: 1.361 seconds
17/03/23 17:22:20 INFO hive.HiveImport: FAILED: SemanticException Line 2:17 Invalid path ''hdfs://cluster/user/root/BIO_PRODUCT_MAIN''
17/03/23 17:22:20 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Hive exited with status 64
    at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:389)
    at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:339)
    at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:240)
    at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:514)
    at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:605)
    at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
    at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
    at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
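For what it's worth, the path named in the SemanticException is the HDFS staging directory Sqoop writes before Hive's LOAD DATA step, and it can be inspected directly. A diagnostic sketch using the path exactly as it appears in the log above (not a confirmed fix):

# list the staging directory that the generated Hive script tries to load from
hdfs dfs -ls hdfs://cluster/user/root/BIO_PRODUCT_MAIN
# or the same path relative to fs.defaultFS, without the scheme
hdfs dfs -ls /user/root/BIO_PRODUCT_MAIN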
Could anyone advise on how to fix this?