Sample code for operating HDFS from Java

This article walks through operating HDFS from Java. First, download a Hadoop package from the official site that matches the version running on your server and configure the Hadoop and Maven environment variables locally. Then add the Maven dependencies so the project's libraries are managed for you. Finally, write the code that connects to HDFS and performs the read and write operations.

Configure the environment variables locally

Add the following entries to the <dependencies> section of your pom.xml:

        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>3.1.3</version>
        </dependency>

        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <version>4.12</version>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
            <version>1.7.30</version>
        </dependency>

Connect to HDFS

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class JavaToHDFS {
    // FileSystem handle used to talk to HDFS
    FileSystem fs = null;

    // Initialization: connect to HDFS as the "root" user
    public void init() throws Exception {
        Configuration conf = new Configuration();
        // Replace with your NameNode address, e.g. hdfs://itcast01:9000
        fs = FileSystem.get(new URI("hdfs://<namenode-ip>:9000"), conf, "root");
    }
}
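For reference, the sketch below shows one way the connection could be used once init() has run. The JavaToHDFSMain class, the root-directory listing, and the explicit close() are illustrative additions (they assume the class sits in the same package as JavaToHDFS so the fs field is visible); they are not part of the original article.

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.Path;

public class JavaToHDFSMain {
    public static void main(String[] args) throws Exception {
        JavaToHDFS client = new JavaToHDFS();
        client.init();  // connect to HDFS

        // Print every entry directly under the root directory
        for (FileStatus status : client.fs.listStatus(new Path("/"))) {
            System.out.println(status.getPath());
        }

        client.fs.close();  // release the connection when done
    }
}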

Complete HDFS example code

HDFSDemo.java
package cn.itcast.hadoop.hdfs;

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.junit.Before;
import org.junit.Test;

public class HDFSDemo {

	FileSystem fs = null;

	@Before
	public void init() throws Exception {
		// Connect to the NameNode as the "root" user
		fs = FileSystem.get(new URI("hdfs://itcast01:9000"), new Configuration(), "root");
	}

	@Test
	public void testUpload() throws Exception {
		// Upload the local file /root/install.log to /log123.log on HDFS
		InputStream in = new FileInputStream("/root/install.log");
		OutputStream out = fs.create(new Path("/log123.log"));
		IOUtils.copyBytes(in, out, 1024, true);
	}

	@Test
	public void testMkdir() throws IllegalArgumentException, IOException {
		// Create the directory /a/aa (parent directories are created as needed)
		boolean flag = fs.mkdirs(new Path("/a/aa"));
		System.out.println(flag);
	}

	@Test
	public void testDel() throws IllegalArgumentException, IOException {
		// Recursively delete the directory /a and everything under it
		boolean flag = fs.delete(new Path("/a"), true);
		System.out.println(flag);
	}

	public static void main(String[] args) throws Exception {
		// Download /jdk from HDFS to the local file /home/jdk1.7.tar.gz
		// (no user name is given here, so the current OS user is used)
		FileSystem fs = FileSystem.get(new URI("hdfs://itcast01:9000"), new Configuration());
		InputStream in = fs.open(new Path("/jdk"));
		OutputStream out = new FileOutputStream("/home/jdk1.7.tar.gz");
		IOUtils.copyBytes(in, out, 4096, true);
	}
}
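Downloading can also be done without manual streams by using FileSystem's built-in copy helper. The following method is an illustrative sketch that could be added to HDFSDemo; the paths /log123.log and /root/log123.copy.log are example values, not from the original article.

	@Test
	public void testDownload() throws IllegalArgumentException, IOException {
		// Same effect as main(), but using FileSystem's copy helper instead of manual streams.
		// The source and destination paths below are only examples.
		fs.copyToLocalFile(new Path("/log123.log"), new Path("/root/log123.copy.log"));
	}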

Summary

That concludes this article on operating HDFS from Java.
