JDBC: Connecting Java to MySQL, Hive, and HBase

Overview

Create a Maven project

New Project -> Maven -> check Create from archetype -> select maven-archetype-quickstart -> Next

Click Enable Auto Import so dependency changes are imported automatically.

In the pom's <properties>, set the compiler source and target versions to 1.8.

In Project Structure, change the project language level to 8.

Under File -> Settings, set the Java compiler's target bytecode version to 1.8.
Under src/main, create a folder named resources and mark it as a resources root; all configuration files can then live in this folder and will be on the classpath when the Java code runs.

Project Structure -> Project Settings -> Modules -> select the resources folder, click Resources -> Apply -> OK


The Maven project is now set up. All three connection examples below start from this same setup.

Connecting Java to MySQL

Pom dependency

Adding the mysql-connector-java dependency is enough:

<dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>5.1.38</version>
    </dependency>
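
Before wiring up the full project, you can verify the dependency works with a minimal, self-contained smoke test. The host, database, and credentials below come from the datasource.properties used later in this article; substitute your own:

import java.sql.Connection;
import java.sql.DriverManager;

public class MysqlSmokeTest {
    public static void main(String[] args) throws Exception {
        // with the 5.x driver, load the class explicitly (JDBC 4 can also auto-load it)
        Class.forName("com.mysql.jdbc.Driver");
        String url = "jdbc:mysql://192.168.182.131:3306/kb06mysqltestdb"
                + "?useUnicode=true&characterEncoding=utf8&useSSL=true";
        try (Connection con = DriverManager.getConnection(url, "root", "javakb10")) {
            System.out.println("connected: " + !con.isClosed());
        }
    }
}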

The complete pom file

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>cn.wxj.mysql.jdbc</groupId>
  <artifactId>java_mysql</artifactId>
  <version>1.0-SNAPSHOT</version>

  <name>java_mysql</name>
  <!-- FIXME change it to the project's website -->
  <url>http://www.example.com</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>

    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>5.1.38</version>
    </dependency>
  </dependencies>

  <build>
    <pluginManagement><!-- lock down plugins versions to avoid using Maven defaults (may be moved to parent pom) -->
      <plugins>
        <!-- clean lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#clean_Lifecycle -->
        <plugin>
          <artifactId>maven-clean-plugin</artifactId>
          <version>3.1.0</version>
        </plugin>
        <!-- default lifecycle, jar packaging: see https://maven.apache.org/ref/current/maven-core/default-bindings.html#Plugin_bindings_for_jar_packaging -->
        <plugin>
          <artifactId>maven-resources-plugin</artifactId>
          <version>3.0.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.8.0</version>
        </plugin>
        <plugin>
          <artifactId>maven-surefire-plugin</artifactId>
          <version>2.22.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <version>3.0.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-install-plugin</artifactId>
          <version>2.5.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-deploy-plugin</artifactId>
          <version>2.8.2</version>
        </plugin>
        <!-- site lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#site_Lifecycle -->
        <plugin>
          <artifactId>maven-site-plugin</artifactId>
          <version>3.7.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-project-info-reports-plugin</artifactId>
          <version>3.0.0</version>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
</project>

Configuration file

Create a new file datasource.properties under the resources folder
and fill in the following four settings:

driver=com.mysql.jdbc.Driver
url=jdbc:mysql://192.168.182.131:3306/kb06mysqltestdb?useUnicode=true&characterEncoding=utf8&useSSL=true
username=root
password=javakb10

Implementation

1. The BaseConfig class reads the driver, database URL, username, and password from datasource.properties and uses them to connect to the MySQL database.

Define an inner class Config and keep an instance of it as a member of BaseConfig to hold the driver and connection settings

 //the inner Config class holds the driver and connection settings
    private class Config{
        String driver;
        String url;
        String username;
        String password;
    }
    private Config config;

Define a valid method that checks whether the url is well-formed. Note that the backslashes must be doubled in a Java string literal, and the trailing (\?\S+)? group accepts optional query parameters such as ?useUnicode=true:

//valid() checks whether the url format is correct
  private boolean valid(String url)  {
        Pattern p = Pattern.compile("jdbc:\\w+://((\\d{1,3}\\.){3}\\d{1,3}|\\w+):\\d{1,5}/\\w+(\\?\\S+)?");
        Matcher m = p.matcher(url);
        return m.matches();
    }
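
A quick sanity check of the pattern (a standalone sketch, not part of the final class; the first two URLs are the ones used in this article, the third is illustrative):

import java.util.regex.Pattern;

public class UrlPatternCheck {
    public static void main(String[] args) {
        Pattern p = Pattern.compile(
                "jdbc:\\w+://((\\d{1,3}\\.){3}\\d{1,3}|\\w+):\\d{1,5}/\\w+(\\?\\S+)?");
        // true: plain URL with host, port, and database
        System.out.println(p.matcher("jdbc:hive2://192.168.182.131:10000/default").matches());
        // true: the optional (\?\S+)? group accepts query parameters
        System.out.println(p.matcher(
                "jdbc:mysql://192.168.182.131:3306/kb06mysqltestdb?useUnicode=true&useSSL=true").matches());
        // false: the port is required by the pattern
        System.out.println(p.matcher("jdbc:mysql://192.168.182.131/testdb").matches());
    }
}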

The init method loads the configuration into BaseConfig's config member

 //init() loads the configuration into the config member
    private void init() throws Exception {

        //resolve the path of datasource.properties via the thread context class loader
        String path=Thread.currentThread().getContextClassLoader()
                .getResource("datasource.properties").getPath();

        //parse the properties file with the Properties helper class
        Properties pro=new Properties();
        pro.load(new FileReader(path));
        String url=pro.getProperty("url");
        if(url==null|| !valid(url)){
            throw new Exception("no or invalid url exception");
        }
        config=new Config();
        config.url=url;
        config.driver =pro.getProperty("driver","com.mysql.jdbc.Driver");
        config.username=pro.getProperty("username","root");
        config.password=pro.getProperty("password","javakb10");
        pro.clear();

    }

An instance initializer block calls init to load the configuration and register the driver class

 //instance initializer: load the configuration and register the driver class
    {
        try {
            init();
            Class.forName(config.driver);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

Getting a database Connection

//get a database Connection
    protected Connection getCon() throws SQLException {
        return DriverManager.getConnection(config.url,config.username,config.password);
    }
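
Because Connection implements AutoCloseable, callers can also manage it with try-with-resources instead of the close helper shown next; a minimal sketch of hypothetical caller code inside a subclass of BaseConfig:

// hypothetical usage in a subclass of BaseConfig
try (Connection con = getCon()) {
    // the connection is closed automatically, even if this block throws
    System.out.println(con.getMetaData().getDatabaseProductName());
} catch (SQLException e) {
    e.printStackTrace();
}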

A generic method for closing resources

//generic helper: closes any number of resources in turn
    protected void close(AutoCloseable...acs){
        if(acs!=null){
            for (AutoCloseable ac : acs) {
                try {
                    ac.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }

The complete BaseConfig class

package cn.wxj.mysql.jdbc;

import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.Properties;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class BaseConfig {
    //the inner Config class holds the driver and connection settings
    private class Config{
        String driver;
        String url;
        String username;
        String password;
    }
    private Config config;

//valid() checks whether the url format is correct
    private boolean valid(String url)  {
        Pattern p = Pattern.compile("jdbc:\\w+://((\\d{1,3}\\.){3}\\d{1,3}|\\w+):\\d{1,5}/\\w+(\\?\\S+)?");
        Matcher m = p.matcher(url);
        return m.matches();
    }

    //instance initializer: load the configuration and register the driver class
    {
        try {
            init();
            Class.forName(config.driver);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    //get a database Connection
    protected Connection getCon() throws SQLException {
        return DriverManager.getConnection(config.url,config.username,config.password);
    }

//generic helper: closes any number of resources in turn
    protected void close(AutoCloseable...acs){
        if(acs!=null){
            for (AutoCloseable ac : acs) {
                try {
                    ac.close();
                } catch (Exception e) {
                    e.printStackTrace();
                }
            }
        }
    }
    //init() loads the configuration into the config member
    private void init() throws Exception {

        //resolve the path of datasource.properties via the thread context class loader
        String path=Thread.currentThread().getContextClassLoader()
                .getResource("datasource.properties").getPath();

        //parse the properties file with the Properties helper class
        Properties pro=new Properties();
        pro.load(new FileReader(path));
        String url=pro.getProperty("url");
        if(url==null|| !valid(url)){
            throw new Exception("no or invalid url exception");
        }
        config=new Config();
        config.url=url;
        config.driver =pro.getProperty("driver","com.mysql.jdbc.Driver");
        config.username=pro.getProperty("username","root");
        config.password=pro.getProperty("password","javakb10");
        pro.clear();

    }
}

2. The Result class wraps the outcome of executing a SQL statement, making results easy to consume

package cn.wxj.mysql.jdbc;

public class Result<T> {
    private T data;
    private boolean isErr;

    public Result( boolean isErr,T data) {
        this.data = data;
        this.isErr = isErr;
    }

    public T getData() {
        return data;
    }

    public boolean isErr() {
        return isErr;
    }

    public static <T> Result<T> Succeed(T data){
        return new Result<>(false,data);
    }

    public static <T> Result<T> Fail(){
        return new Result<>(true,null);
    }
}

3. The BaseDao class compiles and executes SQL statements

package cn.wxj.mysql.jdbc;

import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class BaseDao extends BaseConfig {

    //PreparedStatement precompiles the SQL (and guards against SQL injection)
    private PreparedStatement getPst(Connection con, String sql, Object...params) throws SQLException {
        PreparedStatement pst=con.prepareStatement(sql);
        if(params.length>0){
            for (int i = 0; i < params.length; i++) {
                pst.setObject(i+1,params[i]);
            }
        }
        return  pst;
    }

    //runs non-query statements (INSERT/UPDATE/DELETE)
    public Result<Integer> exeNonQuery(String sql,Object...params){
        Connection con=null;
        PreparedStatement pst=null;
        try {
            con=getCon();
            pst=getPst(con,sql,params);
            return Result.Succeed(pst.executeUpdate());
        } catch (SQLException e) {
            e.printStackTrace();
            return Result.Fail();
        }finally {
            close(pst,con);
        }
    }

    //runs query statements
    public Result<List<List<String>>> exeQuery(String sql,Object...params){
        Connection con=null;
        PreparedStatement pst=null;
        ResultSet rst=null;
        List<List<String>> data=new ArrayList<>();
        try {
            con=getCon();
            pst=getPst(con,sql,params);
            rst=pst.executeQuery();
            final int COUNT=rst.getMetaData().getColumnCount();
            while (rst.next()){
                List<String> row=new ArrayList<>(COUNT);
                for (int i = 1; i <= COUNT; i++) {
                    //String.valueOf tolerates NULL columns
                    row.add(String.valueOf(rst.getObject(i)));
                }
                data.add(row);
            }
            return Result.Succeed(data);
        } catch (SQLException e) {
            e.printStackTrace();
            return Result.Fail();
        }finally {
            close(rst,pst,con);
        }
    }

    //reads the SQL statement from a file (default path: sql/sql.sql)
    public String readSql(String...paths) throws Exception {
        String path=paths.length==0?"sql/sql.sql":paths[0];
        StringBuilder builder=new StringBuilder();
        BufferedReader read=new BufferedReader(new FileReader(path));
        String line=null;
        while (null!=(line=read.readLine())){
            builder.append(line.trim()+" ");
        }
        read.close();
        return builder.toString();
    }

}
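
Note that readSql defaults to sql/sql.sql, resolved against the program's working directory rather than the classpath. A hypothetical example of that file's contents (the table name is a placeholder):

select * from userinfos;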

4. A Test class whose main method executes the SQL and prints the result

package cn.wxj.mysql.jdbc;

import java.util.List;

public class Test {
    public static void main(String[] args) throws Exception{

        BaseDao dao=new BaseDao();
        Result<List<List<String>>> result = dao.exeQuery(dao.readSql());

        List<List<String>>  table=result.getData();
        table.forEach(tab->{
            tab.forEach(row->{
                System.out.print(row+"\t");
            });
            System.out.println();
        });

        Result<Integer> result1 = dao.exeNonQuery(dao.readSql());
        System.out.println(result1.getData());
    }
}

Running main with a query statement prints the result table, one row per line with tab-separated columns.

Connecting Java to Hive

Pom dependencies

<properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <hadoop.version>2.6.0</hadoop.version>
    <hive.version>1.1.0</hive.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>${hive.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-auth</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>

The complete pom file

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>cn.wxj.hive.jdbc</groupId>
  <artifactId>java2hive</artifactId>
  <version>1.0-SNAPSHOT</version>

  <name>java2hive</name>
  <!-- FIXME change it to the project's website -->
  <url>http://www.example.com</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <hadoop.version>2.6.0</hadoop.version>
    <hive.version>1.1.0</hive.version>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>${hive.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-auth</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
      <exclusions>
        <exclusion>
          <artifactId>jdk.tools</artifactId>
          <groupId>jdk.tools</groupId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>

  <build>
    <pluginManagement><!-- lock down plugins versions to avoid using Maven defaults (may be moved to parent pom) -->
      <plugins>
        <!-- clean lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#clean_Lifecycle -->
        <plugin>
          <artifactId>maven-clean-plugin</artifactId>
          <version>3.1.0</version>
        </plugin>
        <!-- default lifecycle, jar packaging: see https://maven.apache.org/ref/current/maven-core/default-bindings.html#Plugin_bindings_for_jar_packaging -->
        <plugin>
          <artifactId>maven-resources-plugin</artifactId>
          <version>3.0.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-compiler-plugin</artifactId>
          <version>3.8.0</version>
        </plugin>
        <plugin>
          <artifactId>maven-surefire-plugin</artifactId>
          <version>2.22.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <version>3.0.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-install-plugin</artifactId>
          <version>2.5.2</version>
        </plugin>
        <plugin>
          <artifactId>maven-deploy-plugin</artifactId>
          <version>2.8.2</version>
        </plugin>
        <!-- site lifecycle, see https://maven.apache.org/ref/current/maven-core/lifecycles.html#site_Lifecycle -->
        <plugin>
          <artifactId>maven-site-plugin</artifactId>
          <version>3.7.1</version>
        </plugin>
        <plugin>
          <artifactId>maven-project-info-reports-plugin</artifactId>
          <version>3.0.0</version>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
</project>

The datasource.properties file needs no password

driver=org.apache.hive.jdbc.HiveDriver
url=jdbc:hive2://192.168.182.131:10000/default
username=root
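
To verify the HiveServer2 connection without going through BaseConfig, a minimal standalone sketch (host and port from the configuration above; note the empty password):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveSmokeTest {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // HiveServer2 without authentication accepts an empty password
        try (Connection con = DriverManager.getConnection(
                "jdbc:hive2://192.168.182.131:10000/default", "root", "");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("show tables")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}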

Add a log4j.properties configuration file under the resources folder with the following content:

log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d %p [%c] - %m%n
log4j.appender.logfile=org.apache.log4j.FileAppender
log4j.appender.logfile.File=target/hadoop.log
log4j.appender.logfile.layout=org.apache.log4j.PatternLayout
log4j.appender.logfile.layout.ConversionPattern=%d %p [%c] - %m%n

BaseConfig's init method changes slightly: the default driver becomes org.apache.hive.jdbc.HiveDriver and the default password becomes the empty string

 config=new Config();
        config.url=url;
        config.driver =pro.getProperty("driver","org.apache.hive.jdbc.HiveDriver");
        config.username=pro.getProperty("username","root");
        config.password=pro.getProperty("password","");

Everything else is identical to the MySQL version.

Connecting Java to HBase

The Windows hosts mapping must be configured first, since the HBase client resolves region servers by hostname.
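
For example, append an entry like the following to C:\Windows\System32\drivers\etc\hosts (the hostname here is a placeholder; use your HBase node's actual hostname):

192.168.182.131 hbase-node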

Pom dependencies

<?xml version="1.0" encoding="UTF-8"?>

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>cn.kgc.kb10.wxj.hbase</groupId>
  <artifactId>hbasejdbc</artifactId>
  <version>1.0-SNAPSHOT</version>

  <name>hbasejdbc</name>
  <!-- FIXME change it to the project's website -->
  <url>http://www.example.com</url>

  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <hadoop.version>2.6.0-cdh5.14.2</hadoop.version>
    <hive.version>1.1.0-cdh5.14.2</hive.version>
    <hbase.version>1.2.0-cdh5.14.2</hbase.version>
  </properties>

  <repositories>
    <repository>
      <id>cloudera</id>
      <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
    </repository>
  </repositories>

  <dependencies>
    <!--hadoop-->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-common</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>${hadoop.version}</version>
    </dependency>

    <!--日志-->
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.2</version>
    </dependency>

    <!--MapReduce-->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-auth</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-core</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
      <version>${hadoop.version}</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.zookeeper/zookeeper -->
    <!--zookeeper-->
    <dependency>
      <groupId>org.apache.zookeeper</groupId>
      <artifactId>zookeeper</artifactId>
      <version>3.4.5</version>
      <type>pom</type>
    </dependency>

    <!--hbase-->
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-client</artifactId>
      <version>${hbase.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-common</artifactId>
      <version>${hbase.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hbase</groupId>
      <artifactId>hbase-server</artifactId>
      <version>${hbase.version}</version>
    </dependency>

    <!--log4j-->
    <dependency>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
      <version>1.2.17</version>
    </dependency>

    <!--测试-->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <!--<scope>test</scope>-->
    </dependency>
  </dependencies>
</project>

Implementation

package cn.kgc.kb10.wxj.hbase;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.*;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;
import org.junit.Test;
import java.io.IOException;

public class HBaseClientDemo {
    //create a table
    @Test
    public void createTable() throws IOException {
        //1. build the HBase configuration
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum","192.168.182.131");
        conf.set("hbase.zookeeper.property.clientPort","2181");
        //2. create the connection
        Connection conn = ConnectionFactory.createConnection(conf);
        //3. get an Admin
        Admin admin=conn.getAdmin();
        //4. describe the table, starting with its name
        HTableDescriptor student = new HTableDescriptor(TableName.valueOf("student100"));
        //5. add the column families
        student.addFamily(new HColumnDescriptor("info"));
        student.addFamily(new HColumnDescriptor("score"));
        //6. create the table
        admin.createTable(student);
        //7. close the connection
        conn.close();
    }
    @Test
    public void putData2Table() throws IOException {
        //1. build the HBase configuration
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum","192.168.182.131");
        conf.set("hbase.zookeeper.property.clientPort","2181");
        //2. create the connection
        Connection conn = ConnectionFactory.createConnection(conf);
        //3. get the table
        Table student =conn.getTable(TableName.valueOf("student100"));
        //4. create a Put for the rowkey
        Put put=new Put(Bytes.toBytes("1001"));
        //5. add columns, e.g. info:name = zhangsan
        put.addColumn(Bytes.toBytes("info"),Bytes.toBytes("name"),Bytes.toBytes("zhangsan"));
        put.addColumn(Bytes.toBytes("info"),Bytes.toBytes("gender"),Bytes.toBytes("male"));
        put.addColumn(Bytes.toBytes("info"),Bytes.toBytes("age"),Bytes.toBytes("11"));
        //6. insert the data
        student.put(put);
        //7. close the table and connection
        student.close();
        conn.close();
    }

    //read data
    @Test
    public void getDataFromTable() throws IOException {
        //1. build the HBase configuration
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum","192.168.182.131");
        conf.set("hbase.zookeeper.property.clientPort","2181");
        //2. create the connection
        Connection conn = ConnectionFactory.createConnection(conf);
        //3. get the table
        Table student =conn.getTable(TableName.valueOf("student100"));
        //4. build a Get for the rowkey
        Get get = new Get(Bytes.toBytes("1001"));
        //5. fetch the result
        Result result = student.get(get);

        //6. iterate over the cells
        Cell[] cells = result.rawCells();
        for (Cell cell : cells) {
            System.out.println("rowkey:"+Bytes.toString(CellUtil.cloneRow(cell)));
            System.out.println("family:"+Bytes.toString(CellUtil.cloneFamily(cell)));
            System.out.println("qualifier:"+Bytes.toString(CellUtil.cloneQualifier(cell)));
            System.out.println("value:"+Bytes.toString(CellUtil.cloneValue(cell)));
            System.out.println("-----------------");
        }
        student.close();
        conn.close();
    }

    //disable and delete the table
    @Test
    public void dropTable() throws IOException {
        //1. build the HBase configuration
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum","192.168.182.131");
        conf.set("hbase.zookeeper.property.clientPort","2181");
        //2. create the connection
        Connection conn = ConnectionFactory.createConnection(conf);
        //3. get an Admin
        Admin admin=conn.getAdmin();
        //4. a table must be disabled before it can be deleted
        admin.disableTable(TableName.valueOf("student100"));
        //5. delete the table
        admin.deleteTable(TableName.valueOf("student100"));
        conn.close();
    }
}
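
All four tests repeat the same connection boilerplate; a small private helper inside HBaseClientDemo could factor it out (a sketch using the same settings as above):

    //sketch: shared helper that each test could call instead of repeating the setup
    private Connection getConnection() throws IOException {
        Configuration conf = HBaseConfiguration.create();
        conf.set("hbase.zookeeper.quorum", "192.168.182.131");
        conf.set("hbase.zookeeper.property.clientPort", "2181");
        return ConnectionFactory.createConnection(conf);
    }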
