Java: Authenticating to HDFS via Kerberos and Performing Hadoop File Upload/Download (Working Version)

Overview


1. Error 1: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
    Resolved: add commons-configuration to the Maven dependencies (the entry is shown below).
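    The Maven dependency, identical to the entry in the pom.xml at the end of this post:

    <dependency>
        <groupId>commons-configuration</groupId>
        <artifactId>commons-configuration</artifactId>
        <version>1.7</version>
    </dependency>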
2. Error 2: Can't get Kerberos realm
    Resolved: add two system properties in HDFSMain.java:
    System.setProperty("java.security.krb5.kdc","192.168.13.7:21732");
    System.setProperty("java.security.krb5.realm","HADOOP.com");
    It was later found that both can be removed and replaced by:
    System.setProperty("java.security.krb5.conf","/tmp/hcweb/krb5.conf");
3. Error 3: java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set
    Resolved: set the hadoop.home.dir system property in HDFSMain.java:
    System.setProperty("hadoop.home.dir", "/opt/client/HDFS/hadoop");
4. Error 4: javax.security.auth.login.LoginException: Unable to obtain password from user
    Resolved. Solutions attempted (a verification sketch for the AES policy follows this list):
    a. Copied cacerts from /opt/client/JDK/jdk/jre/lib/security/ to /home/run/jre/lib/security/ to replace the original certificate store.
    b. Copied local_policy.jar from /opt/client/JDK/jdk/jre/lib/security to /home/run/jre/lib/security/.
    c. Modified the component_env configuration under the client's HDFS directory.
    d. Copied US_export_policy.jar from /opt/client/JDK/jdk/jre/lib/security to /home/run/jre/lib/security/ to enable AES 192/256-bit encryption and decryption.
    e. Added JVM startup parameters in /tomcat/bin/catalina.bat:
        set JAVA_OPTS=-Dhttps.protocols="TLSv1.1,TLSv1.2"
        set JAVA_OPTS=-Djdk.tls.client.protocols="TLSv1.1,TLSv1.2"
    f. Set the same JVM parameters in code:
        System.setProperty("https.protocols", "TLSv1.1,TLSv1.2");
        System.setProperty("jdk.tls.client.protocols", "TLSv1.1,TLSv1.2");
    g. Downloaded local_policy.jar and US_export_policy.jar from CSDN and replaced the same jars under /home/run/jre/lib/security/, enabling AES 192/256-bit encryption and decryption.
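    To verify whether fixes b/d/g actually took effect, the maximum allowed AES key length can be queried through the standard javax.crypto API. A minimal sketch (the class name is illustrative):

    import javax.crypto.Cipher;

    public class JcePolicyCheck {
        public static void main(String[] args) throws Exception {
            // 2147483647 (Integer.MAX_VALUE) means the unlimited-strength policy is active;
            // 128 means the default restricted policy is still in place.
            int maxKeyLen = Cipher.getMaxAllowedKeyLength("AES");
            System.out.println("Max allowed AES key length: " + maxKeyLen);
            if (maxKeyLen < 256) {
                System.out.println("AES-256 unavailable: replace local_policy.jar / US_export_policy.jar");
            }
        }
    }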
5. Error 5: javax.security.auth.login.LoginException: ICMP Port Unreachable
    Resolved. Solutions attempted (a sample krb5.conf follows this list):
    a. Manually added a record for "192.168.13.7" to /etc/hosts.
    b. Checked krb5.conf in the Hadoop client and configured the java.security.krb5.kdc host and port from its realms section.
    c. Removed the startup parameters:
        System.setProperty("java.security.krb5.kdc","192.168.13.7:21732");
        System.setProperty("java.security.krb5.realm","HADOOP.com");
6. Error 6: java.lang.NoClassDefFoundError: org/apache/htrace/SamplerBuilder
    Resolved by importing htrace-core-3.1.0-incubating.jar (took 0.5 hours); the Maven Central coordinates are shown below.
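    If the jar is pulled from Maven Central instead of being installed locally (this post installs it locally under com.huicai), the upstream coordinates are:

    <dependency>
        <groupId>org.apache.htrace</groupId>
        <artifactId>htrace-core</artifactId>
        <version>3.1.0-incubating</version>
    </dependency>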
7. Error 7: java.lang.NoClassDefFoundError: org/apache/commons/cli/ParseException
    Resolved by importing commons-cli-1.2.jar.
    Also checked for other missing jars and imported everything the example required (took 1.5 hours).
  * Login now succeeds and the FileSystem initializes. The code follows:

package com.talkweb.huicai.io;

import java.io.File;
import java.io.FileNotFoundException;
import java.io.IOException;

import javax.annotation.PostConstruct;
import javax.annotation.Resource;

import com.talkweb.huicai.common.PropertyUtil;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocatedFileStatus;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.fs.RemoteIterator;
import org.apache.hadoop.security.UserGroupInformation;
import org.springframework.stereotype.Service;

/**
 * Hadoop-related file transfer methods.
 *
 * @author zhaoxinyu
 * @create 2018/02/02
 */
@Service("fileSystemHadoop")
public class FileSystemHadoop {

    @Resource(name = "propertyUtil")
    private PropertyUtil propertyUtil;

    /** Base path for files stored on HDFS */
    public String hadoopBasePath;

    /** Base path for locally generated files */
    public String localBasePath;

    /** Absolute path of the configuration files */
    private static String CONFIG_PATH = "/tmp/hcweb/";
    private static final String PRINCIPAL = "username.client.kerberos.principal";
    private static final String KEYTAB = "username.client.keytab.file";
    private static final String KRBFILE = "java.security.krb5.conf";
    private static String HDFS_SITE_PATH = CONFIG_PATH + "hdfs-site.xml";
    private static String CORE_SITE_PATH = CONFIG_PATH + "core-site.xml";
    private static String USER_KEYTAB_PATH = CONFIG_PATH + "hwcdm_user.keytab";
    private static String KRB5_CONF_PATH = CONFIG_PATH + "krb5.conf";
    private static Configuration conf;
    private static FileSystem fileSystem;
    private static String PRINCIPAL_NAME = "hwcdm@HADOOP.COM";

    /**
     * Read the base paths after dependency injection completes. A final field
     * initializer would run before the @Resource injection and throw a
     * NullPointerException, so the paths are assigned here instead.
     */
    @PostConstruct
    private void loadBasePaths() {
        hadoopBasePath = propertyUtil.getProperty("hadoopBasePath");
        localBasePath = propertyUtil.getProperty("localBasePath");
    }

    /**
     * Initialization: obtain a FileSystem instance.
     * @throws IOException
     */
    public void init() throws IOException {
        confLoad();
        authentication();
        instanceBuild();
    }
    /**
     * Load the configuration files.
     */
    public void confLoad() throws IOException {
        conf = new Configuration();
        conf.addResource(new Path(HDFS_SITE_PATH));
        conf.addResource(new Path(CORE_SITE_PATH));
    }

    /**
     * Kerberos security authentication.
     */
    public void authentication() throws IOException {
        if ("kerberos".equalsIgnoreCase(conf.get("hadoop.security.authentication"))) {
            System.setProperty("java.security.krb5.conf", KRB5_CONF_PATH);
            System.setProperty("hadoop.home.dir", "/opt/client/HDFS/hadoop");
            System.setProperty("sun.security.krb5.debug", "true");
            System.setProperty("https.protocols", "TLSv1.1,TLSv1.2");
            System.setProperty("jdk.tls.client.protocols", "TLSv1.1,TLSv1.2");
            System.out.println("https.protocols == " + System.getProperty("https.protocols"));
            System.out.println("jdk.tls.client.protocols == " + System.getProperty("jdk.tls.client.protocols"));
            conf.set(PRINCIPAL, PRINCIPAL_NAME);
            conf.set(KEYTAB, USER_KEYTAB_PATH);
            conf.set("hdfs.connection.timeout", "5000");
            System.setProperty(KRBFILE, KRB5_CONF_PATH);
            UserGroupInformation.setConfiguration(conf);
            try {
                UserGroupInformation.loginUserFromKeytab(conf.get(PRINCIPAL), conf.get(KEYTAB));
                System.out.println("Login success!");
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }

    /**
     * Build the HDFS FileSystem instance.
     */
    public void instanceBuild() throws IOException {
        fileSystem = FileSystem.get(conf);
    }
    /**
     * Upload a file to HDFS.
     */
    public void addFileToHdfs(String localPath, String fileName) throws Exception {
        Path src = new Path(localPath);
        boolean isExists = fileSystem.exists(new Path(hadoopBasePath));
        if (!isExists) {
            fileSystem.mkdirs(new Path(hadoopBasePath));
        }
        Path dst = new Path(hadoopBasePath + File.separator + fileName);
        fileSystem.copyFromLocalFile(src, dst);
        // This closes the shared FileSystem: call init() again before the next operation.
        fileSystem.close();
    }

    /**
     * Copy a file from HDFS to the local file system.
     */
    public void downloadFileToLocal(String resultPath) throws IllegalArgumentException, IOException {
        // Bug fix: the original passed the literal string "resultPath" instead of the parameter.
        fileSystem.copyToLocalFile(new Path(resultPath), new Path(localBasePath));
        fileSystem.close();
    }
    /**
     * Delete a file.
     */
    public static void delete(String filePath) throws IOException {
        Path path = new Path(filePath);
        // delete(path, true) removes the path immediately (recursively); the original
        // used deleteOnExit(), which only schedules deletion for when the FileSystem closes.
        boolean isok = fileSystem.delete(path, true);
        if (isok) {
            System.out.println("delete ok!");
        } else {
            System.out.println("delete failure");
        }
        fileSystem.close();
    }
    /**
     * List directory information, showing files only.
     */
    public void listFiles() throws FileNotFoundException, IllegalArgumentException, IOException {
        System.out.println("-------------- directory listing, files only --------------");
        RemoteIterator<LocatedFileStatus> listFiles = fileSystem.listFiles(new Path("/"), true);
        while (listFiles.hasNext()) {
            LocatedFileStatus fileStatus = listFiles.next();
            System.out.println(fileStatus.getPath().getName());
            System.out.println(fileStatus.getBlockSize());
            System.out.println(fileStatus.getPermission());
            System.out.println(fileStatus.getLen());
            BlockLocation[] blockLocations = fileStatus.getBlockLocations();
            for (BlockLocation bl : blockLocations) {
                System.out.println("block-length:" + bl.getLength() + "--" + "block-offset:" + bl.getOffset());
                String[] hosts = bl.getHosts();
                for (String host : hosts) {
                    System.out.println(host);
                }
            }
            System.out.println("--------------------------------");
        }
    }
    /**
     * List file and directory information.
     */
    public void listAll() throws FileNotFoundException, IllegalArgumentException, IOException {
        System.out.println("-------------- files and directories --------------");
        FileStatus[] listStatus = fileSystem.listStatus(new Path("/"));
        for (FileStatus fstatus : listStatus) {
            // Decide the flag per entry; the original set it once outside the loop,
            // so every entry after the first file was mislabelled "f--".
            String flag = fstatus.isFile() ? "f--" : "d--";
            System.out.println(flag + fstatus.getPath().getName());
        }
    }
}
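
A hedged usage sketch of the class above. The Spring context file name and the file paths are illustrative assumptions; note that every file operation closes the shared FileSystem, so init() must be called again before the next operation:

import com.talkweb.huicai.io.FileSystemHadoop;
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class FileSystemHadoopDemo {
    public static void main(String[] args) throws Exception {
        // "applicationContext.xml" and the paths below are assumptions for illustration.
        ApplicationContext ctx = new ClassPathXmlApplicationContext("applicationContext.xml");
        FileSystemHadoop fs = (FileSystemHadoop) ctx.getBean("fileSystemHadoop");
        fs.init();                                       // load conf, Kerberos login, build FileSystem
        fs.addFileToHdfs("/tmp/hcweb/report.csv", "report.csv");
        fs.init();                                       // the upload closed the FileSystem, so re-init
        fs.downloadFileToLocal("/user/hwcdm/report.csv");
    }
}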

The pom.xml is shown below. The HDFS jars are local (all com.huicai artifacts were installed into the local Maven repository).
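A hedged example of how such a jar can be installed into the local repository under the com.huicai groupId (the jar path is an illustrative assumption):

mvn install:install-file -Dfile=/opt/client/HDFS/hadoop/share/hadoop/common/lib/htrace-core-3.1.0-incubating.jar \
    -DgroupId=com.huicai -DartifactId=htrace-core -Dversion=3.1.0 -Dpackaging=jar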

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>huicai</groupId>
<artifactId>huicai</artifactId>
<packaging>war</packaging>
<version>1.0-SNAPSHOT</version>
<name>huicai Maven Webapp</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<junit.version>4.12</junit.version>
<spring.version>3.2.3.RELEASE</spring.version>
<mybatis-spring.version>1.2.1</mybatis-spring.version>
<mybatis.version>3.2.6</mybatis.version>
<jackson.version>1.9.3</jackson.version>
<servlet.version>2.5</servlet.version>
<slf4j.version>1.7.6</slf4j.version>
<commons-dbcp.version>1.4</commons-dbcp.version>
<mysql.version>5.1.45</mysql.version>
<gson.version>2.4</gson.version>
<hadoop.version>2.7.2</hadoop.version>
</properties>
<dependencies>
<!-- dependency for invoking shell commands remotely -->
<dependency>
<groupId>org.jvnet.hudson</groupId>
<artifactId>ganymed-ssh2</artifactId>
<version>build210-hudson-1</version>
</dependency>
<!-- test -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>${junit.version}</version>
<scope>test</scope>
</dependency>
<!-- servlet -->
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>servlet-api</artifactId>
<version>${servlet.version}</version>
<scope>provided</scope>
</dependency>
<!-- spring -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-orm</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context-support</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-beans</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-context</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-webmvc</artifactId>
<version>${spring.version}</version>
</dependency>
<dependency>
<groupId>com.jcraft</groupId>
<artifactId>jsch</artifactId>
<version>0.1.53</version>
</dependency>
<dependency>
<groupId>javax.servlet</groupId>
<artifactId>jstl</artifactId>
<version>1.2</version>
</dependency>
<!-- aspect -->
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjweaver</artifactId>
<version>1.7.4</version>
</dependency>
<!-- mybatis -->
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis-spring</artifactId>
<version>${mybatis-spring.version}</version>
</dependency>
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis</artifactId>
<version>${mybatis.version}</version>
</dependency>
<!-- json -->
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-core-asl</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>org.codehaus.jackson</groupId>
<artifactId>jackson-mapper-asl</artifactId>
<version>${jackson.version}</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.4.4</version>
</dependency>
<!-- log -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<!-- dbcp -->
<dependency>
<groupId>commons-dbcp</groupId>
<artifactId>commons-dbcp</artifactId>
<version>${commons-dbcp.version}</version>
</dependency>
<!-- fileupload -->
<dependency>
<groupId>commons-fileupload</groupId>
<artifactId>commons-fileupload</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>commons-io</groupId>
<artifactId>commons-io</artifactId>
<version>2.4</version>
</dependency>
<!-- mysql -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>${mysql.version}</version>
</dependency>
<!--hadoop -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hadoop-annotations</artifactId>
<version>2.7.2</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hadoop-auth</artifactId>
<version>2.7.2</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hadoop-client</artifactId>
<version>2.7.2</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hadoop-common</artifactId>
<version>2.7.2</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hadoop-hdfs</artifactId>
<version>2.7.2</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hadoop-hdfs-client</artifactId>
<version>2.7.2</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hadoop-hdfs-colocation</artifactId>
<version>2.7.2</version>
</dependency>
<!-- hdfs-jetty-util -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>jetty-util</artifactId>
<version>6.1.26</version>
</dependency>
<!-- hdfs-jetty -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>jetty</artifactId>
<version>6.1.26</version>
</dependency>
<!-- hdfs-jersey-server -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>jersey-server</artifactId>
<version>1.9</version>
</dependency>
<!-- hdfs-jersey-core -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>jersey-core</artifactId>
<version>1.9</version>
</dependency>
<!-- hdfs-commons-daemon -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>commons-daemon</artifactId>
<version>1.0.13</version>
</dependency>
<!-- hdfs-inode-provider -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>hdfs-inode-provider</artifactId>
<version>2.7.2</version>
</dependency>
<!-- hdfs-javaluator -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>javaluator</artifactId>
<version>3.0.1</version>
</dependency>
<!-- hdfs-commons-configuration -->
<dependency>
<groupId>commons-configuration</groupId>
<artifactId>commons-configuration</artifactId>
<version>1.7</version>
</dependency>
<!-- hdfs-htrace-core -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>htrace-core</artifactId>
<version>3.1.0</version>
</dependency>
<!-- hdfs-commons-codec -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>commons-codec</artifactId>
<version>1.4</version>
</dependency>
<!-- hdfs-commons-cli -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>commons-cli</artifactId>
<version>1.2</version>
</dependency>
<!-- hdfs-asm -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>asm</artifactId>
<version>3.2</version>
</dependency>
<!-- hdfs-jsr305 -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>jsr305</artifactId>
<version>3.0.0</version>
</dependency>
<!-- hdfs-leveldbjni-all -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>leveldbjni-all</artifactId>
<version>1.8</version>
</dependency>
<!-- hdfs-log4j -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>log4j</artifactId>
<version>1.2.17</version>
</dependency>
<!-- hdfs-netty -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>netty</artifactId>
<version>3.6.2</version>
</dependency>
<!-- hdfs-netty-all -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>netty-all</artifactId>
<version>4.0.23</version>
</dependency>
<!-- hdfs-protobuf-java -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>protobuf-java</artifactId>
<version>2.5.0</version>
</dependency>
<!-- hdfs-xercesImpl -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>xercesImpl</artifactId>
<version>2.9.1</version>
</dependency>
<!-- hdfs-xml-apis -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>xml-apis</artifactId>
<version>1.3.04</version>
</dependency>
<!-- hdfs-xmlenc -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>xmlenc</artifactId>
<version>0.52</version>
</dependency>
<!-- greenplum -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>greenplum</artifactId>
<version>1.0</version>
</dependency>
<!-- sso -->
<dependency>
<groupId>com.huicai</groupId>
<artifactId>sso-client</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>sso-common</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>commons-lang</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>jdom</artifactId>
<version>1.0</version>
</dependency>
<dependency>
<groupId>com.huicai</groupId>
<artifactId>commons-logging</artifactId>
<version>1.0</version>
</dependency>
<!-- json -->
<dependency>
<groupId>net.sf.json-lib</groupId>
<artifactId>json-lib</artifactId>
<version>2.4</version>
<classifier>jdk15</classifier>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20160212</version>
</dependency>
<!-- gson -->
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>${gson.version}</version>
</dependency>
<!-- guava -->
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>
<version>18.0</version>
</dependency>
<!--commons-httpclient -->
<dependency>
<groupId>commons-httpclient</groupId>
<artifactId>commons-httpclient</artifactId>
<version>3.1</version>
</dependency>
<!-- poi -->
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi</artifactId>
<version>3.9</version>
</dependency>
<dependency>
<groupId>org.apache.poi</groupId>
<artifactId>poi-ooxml</artifactId>
<version>3.9</version>
</dependency>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.4</version>
</dependency>
<!-- quartz -->
<dependency>
<groupId>org.quartz-scheduler</groupId>
<artifactId>quartz</artifactId>
<version>1.8.5</version>
</dependency>
<dependency>
<groupId>org.apache.httpcomponents</groupId>
<artifactId>httpclient</artifactId>
<version>4.5</version>
</dependency>
</dependencies>
<build>
<finalName>huicai</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
<resources>
<resource>
<directory>src/main/java</directory>
<includes>
<include>**/*.xml</include>
<!-- package the xml files under the source directory -->
</includes>
<filtering>true</filtering>
</resource>
<resource>
<directory>src/main/resources</directory>
<includes>
<!-- package the properties, xml and json files under src/main/resources -->
<include>**/*.properties</include>
<include>**/*.xml</include>
<include>**/*.json</include>
</includes>
<filtering>true</filtering>
</resource>
</resources>
</build>
</project>

Reposted from: https://my.oschina.net/dreambreeze/blog/1811269
