I am 靠谱客 blogger 忧心康乃馨. This article, collected during recent development, mainly covers how to import a table from MySQL into Hive using Java. I found it quite useful and am sharing it here for reference.

Overview

I am trying to import a table from MySQL into Hive, but I am getting the following error. Can you please suggest a solution?

SqoopOptions loading .....
Import Tool running ....
14/03/18 06:48:34 WARN sqoop.ConnFactory: $SQOOP_CONF_DIR has not been set in the environment. Cannot check for additional configuration.
14/03/18 06:48:43 INFO mapred.JobClient: SPLIT_RAW_BYTES=87
14/03/18 06:48:43 INFO mapred.JobClient: Map output records=2
14/03/18 06:48:43 INFO mapreduce.ImportJobBase: Transferred 18 bytes in 5.5688 seconds (3.2323 bytes/sec)
14/03/18 06:48:43 INFO mapreduce.ImportJobBase: Retrieved 2 records.
14/03/18 06:48:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM student AS t WHERE 1=0
14/03/18 06:48:43 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM student AS t WHERE 1=0
14/03/18 06:48:43 INFO hive.HiveImport: Loading uploaded data into Hive
WARNING: org.apache.hadoop.metrics.jvm.EventCounter is deprecated. Please use org.apache.hadoop.log.metrics.EventCounter in all the log4j.properties files.
Logging initialized using configuration in jar:file:/home/master/apps/hive-0.10.0/lib/hive-common-0.10.0.jar!/hive-log4j.properties
Hive history file=/tmp/master/hive_job_log_master_201403180648_1860851359.txt
FAILED: Error in metadata: MetaException(message:file:/user/hive/warehouse/student is not a directory or unable to create one)
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
FAIL !!!
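A likely cause of the MetaException above is that Hive resolved the warehouse path against the local filesystem (note the `file:` scheme in the message) rather than HDFS, which happens when hive-site.xml or `fs.default.name` is not actually picked up. As a minimal sketch (the helper class and check below are my own illustration, not part of Sqoop or Hive), you can sanity-check that the warehouse directory is an explicit HDFS URI before kicking off the import:

```java
import java.net.URI;

public class WarehouseCheck {

    /**
     * Returns true when the warehouse directory is an explicit HDFS URI,
     * i.e. it will not silently resolve against the local filesystem.
     */
    static boolean isHdfsUri(String warehouseDir) {
        URI uri = URI.create(warehouseDir);
        return "hdfs".equals(uri.getScheme());
    }

    public static void main(String[] args) {
        // Fully qualified HDFS URI: safe to use as a warehouse dir
        System.out.println(isHdfsUri("hdfs://localhost:9000/user/hive/warehouse/student"));
        // Bare path: resolves against fs.default.name, which may be file:///
        System.out.println(isHdfsUri("/user/hive/warehouse/student"));
    }
}
```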

The code I have written:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

import com.cloudera.sqoop.SqoopOptions;
import com.cloudera.sqoop.tool.ImportTool;

public class SqoopJavaInterface {

    private static final String JOB_NAME = "Sqoop Hive Job";
    private static final String MAPREDUCE_JOB = "Hive Map Reduce Job";
    private static final String DBURL = "jdbc:mysql://localhost:3306/test";
    private static final String DRIVER = "com.mysql.jdbc.Driver";
    private static final String USERNAME = "root";
    private static final String PASSWORD = "root";
    private static final String HADOOP_HOME = "/home/master/apps/hadoop-1.0.4";
    private static final String JAR_OUTPUT_DIR = "/home/master/data";
    private static final String HIVE_HOME = "/home/master/apps/hive-0.10.0";
    private static final String HIVE_DIR = "/user/hive/warehouse/";
    private static final String WAREHOUSE_DIR = "hdfs://localhost:9000/user/hive/warehouse/student";
    private static final String SUCCESS = "SUCCESS !!!";
    private static final String FAIL = "FAIL !!!";

    /**
     * @param table the MySQL table to import
     * @throws IOException
     */
    public static void importToHive(String table) throws IOException {
        System.out.println("SqoopOptions loading .....");
        Configuration config = new Configuration();
        // Hadoop/Hive configuration files
        config.addResource(new Path(HADOOP_HOME + "/conf/core-site.xml"));
        config.addResource(new Path(HADOOP_HOME + "/conf/hdfs-site.xml"));
        config.addResource(new Path(HIVE_HOME + "/conf/hive-site.xml"));
        FileSystem dfs = FileSystem.get(config);
        // MySQL connection parameters
        SqoopOptions options = new SqoopOptions(config);
        options.setConnectString(DBURL);
        options.setTableName(table);
        options.setDriverClassName(DRIVER);
        options.setUsername(USERNAME);
        options.setPassword(PASSWORD);
        options.setHadoopMapRedHome(HADOOP_HOME);
        options.setHiveHome(HIVE_HOME);
        options.setHiveImport(true);
        options.setHiveTableName(table);
        options.setOverwriteHiveTable(true);
        options.setFailIfHiveTableExists(false);
        options.setFieldsTerminatedBy(',');
        options.setDirectMode(true);
        options.setNumMappers(1); // number of mappers to launch for the job
        options.setWarehouseDir(WAREHOUSE_DIR);
        options.setJobName(JOB_NAME);
        options.setMapreduceJobName(MAPREDUCE_JOB);
        options.setJarOutputDir(JAR_OUTPUT_DIR);
        System.out.println("Import Tool running ....");
        ImportTool it = new ImportTool();
        int retVal = it.run(options);
        if (retVal == 0) {
            System.out.println(SUCCESS);
        } else {
            System.out.println(FAIL);
        }
    }
}

When I execute the above code, I get the following error. Can you please suggest a solution?

Execution failed while executing command: 192.168.10.172

Error message: bash: 192.168.10.172: command not found

Now wait 5 seconds to begin next task ...

Connection channel disconnect

net.neoremind.sshxcute.core.Result@60c2be20

Command is sqoop import --connect jdbc:mysql://localhost:3316/hadoop --username root --password root --table employees --hive-import -m 1 -- --schema default

Connection channel established succesfully

Start to run command

Connection channel closed

Check if exec success or not ...

Execution failed while executing command: sqoop import --connect jdbc:mysql://localhost:3316/hadoop --username root --password root --table employees --hive-import -m 1 -- --schema default

Error message: bash: sqoop: command not found

Now wait 5 seconds to begin next task ...

Connection channel disconnect

SSH connection shutdown
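Both bash errors point at how the remote command is assembled: `bash: 192.168.10.172: command not found` suggests the host IP leaked into the command string and got executed as a command, and `bash: sqoop: command not found` means the non-interactive SSH shell does not have sqoop on its PATH. A defensive sketch (the helper below and the `$SQOOP_HOME/bin` layout are my own assumptions, not something sshxcute provides) is to build the remote command from an absolute path to the sqoop binary:

```java
public class RemoteSqoopCommand {

    /**
     * Builds the remote command line with an absolute path to the sqoop
     * binary, so it works even when the SSH login shell has a minimal PATH.
     */
    static String build(String sqoopHome, String connect, String user,
                        String password, String table) {
        return String.join(" ",
                sqoopHome + "/bin/sqoop", "import",
                "--connect", connect,
                "--username", user,
                "--password", password,
                "--table", table,
                "--hive-import", "-m", "1");
    }

    public static void main(String[] args) {
        // Example: the sqoop home path here is a placeholder
        String cmd = build("/usr/local/sqoop", "jdbc:mysql://localhost:3316/hadoop",
                "root", "root", "employees");
        System.out.println(cmd);
    }
}
```

The resulting string can then be passed to sshxcute as a single command, keeping the host out of the command text.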

Solution

As the SqoopOptions-based method is deprecated, you can use the following code instead (the connection string, table name, and Hive table name are passed in as parameters; substitute your own values):

public static void importToHive(String connectString, String table,
        String hiveTable, String userName, String password) throws Exception {
    Configuration config = new Configuration();
    config.addResource(new Path("/usr/local/hadoop/conf/core-site.xml"));
    config.addResource(new Path("/usr/local/hadoop/conf/hdfs-site.xml"));
    String[] cmd = {"import",
            "--connect", connectString,
            "--username", userName,
            "--password", password,
            "--hadoop-home", "/usr/local/hadoop",
            "--table", table,
            "--hive-import",
            "--create-hive-table",
            "--hive-table", hiveTable,
            "--target-dir", "hdfs://localhost:54310/user/hive/warehouse",
            "-m", "1",
            "--delete-target-dir"};
    Sqoop.runTool(cmd, config);
}

Please use the proper Hadoop and Hive warehouse paths, and the correct MySQL username and password. Check your HDFS port in core-site.xml (in my case it is 54310).
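If the flat argument array gets hard to read, one option (a sketch of my own, not a Sqoop API) is to assemble it with a small builder before handing it to Sqoop.runTool:

```java
import java.util.ArrayList;
import java.util.List;

public class SqoopArgs {
    private final List<String> args = new ArrayList<>();

    SqoopArgs(String tool) { args.add(tool); }

    // Adds a flag that takes a value, e.g. --connect jdbc:mysql://...
    SqoopArgs opt(String flag, String value) { args.add(flag); args.add(value); return this; }

    // Adds a bare flag, e.g. --hive-import
    SqoopArgs flag(String flag) { args.add(flag); return this; }

    String[] build() { return args.toArray(new String[0]); }

    public static void main(String[] args) {
        String[] cmd = new SqoopArgs("import")
                .opt("--connect", "jdbc:mysql://localhost:3306/test")
                .opt("--username", "root")
                .opt("--password", "root")
                .opt("--table", "student")
                .flag("--hive-import")
                .opt("-m", "1")
                .build();
        System.out.println(String.join(" ", cmd));
        // then: Sqoop.runTool(cmd, config);
    }
}
```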

Finally

That is all of the content collected by 忧心康乃馨 on importing a table from MySQL into Hive with Java. I hope this article helps you solve the development problems you encountered.

If you find 靠谱客's content useful, feel free to recommend the site to your programmer friends.

This content was contributed by netizens or collected from the web for learning and reference; copyright belongs to the original authors.