1. To let Java connect to the Hadoop file system running in your virtual machine, HDFS must be reachable over the machine's IP and the client needs permission on the file system.
(1) In /usr/local/hadoop/etc/hadoop/core-site.xml, configure fs.defaultFS with the virtual machine's actual IP:
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://ip:9000</value>
    </property>
</configuration>
(2) Grant permissions on the HDFS file tree (777 opens everything, so this is only suitable for a test environment):
hdfs dfs -chmod -R 777 /
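If you prefer not to open the whole tree with 777, the client can instead connect as a specific HDFS user so the normal permissions keep working. A minimal sketch, assuming the HDFS user on the virtual machine is named "hadoop" (replace with your own user):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

Configuration configuration = new Configuration();
// the third argument is the user the client acts as on HDFS ("hadoop" is an assumption here)
FileSystem fileSystem = FileSystem.get(new URI("hdfs://ip:9000"), configuration, "hadoop");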
2. Maven dependencies
<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.9.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.9.2</version>
</dependency>
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>2.9.2</version>
</dependency>
3. HDFS operations
(1) Create a directory
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Point the client at the NameNode and create the directory
Configuration configuration = new Configuration();
configuration.set("fs.defaultFS", "hdfs://ip:9000");
FileSystem fileSystem = FileSystem.get(configuration);
boolean bool = fileSystem.mkdirs(new Path("/test"));
System.out.println(bool);  // true if the directory was created (or already exists)
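To verify the result on the client side, the same FileSystem object can be queried right after the call. A minimal sketch reusing the fileSystem object from above:

// exists() should now report true for /test
System.out.println(fileSystem.exists(new Path("/test")));
// getFileStatus() exposes the owner and permission that were applied
System.out.println(fileSystem.getFileStatus(new Path("/test")).getPermission());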
(2) Create a file
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

Configuration configuration = new Configuration();
configuration.set("fs.defaultFS", "hdfs://ip:9000");
FileSystem fileSystem = FileSystem.get(configuration);
// create() builds any missing parent directories and overwrites an existing file
Path path = new Path("/demo/test1.txt");
FSDataOutputStream out = fileSystem.create(path);
out.write("hfajdhfkafa".getBytes());
out.flush();
out.close();
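To confirm the write, the file can be read back through the same API. A minimal sketch, assuming the fileSystem object from above is still open:

import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.io.IOUtils;

// open() returns an input stream over /demo/test1.txt; copy its bytes to the console
FSDataInputStream in = fileSystem.open(new Path("/demo/test1.txt"));
IOUtils.copyBytes(in, System.out, 4096, true);  // the final true closes the stream when done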