/**
 * Read a file via the Hadoop API
 * @throws IOException
 */
@Test
public void readFileByAPI() throws IOException {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.75.201:8020/");
    FileSystem fs = FileSystem.get(conf);
    Path path = new Path("/user/index.html");
    FSDataInputStream fis = fs.open(path);
    byte[] bytes = new byte[1024];
    int len = -1;
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    // Standard stream-copy loop: read up to 1024 bytes at a time until EOF
    while ((len = fis.read(bytes)) != -1) {
        baos.write(bytes, 0, len);
    }
    System.out.println(new String(baos.toByteArray()));
    fis.close();
    baos.close();
}
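The `while` loop above is the standard stream-copy pattern; it is not HDFS-specific. A minimal, runnable sketch of the same loop over a plain `java.io` stream (no Hadoop cluster needed, and the class name here is just illustrative):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class StreamCopyDemo {
    // Same read loop as readFileByAPI, but over any InputStream.
    static String readAll(InputStream in) throws IOException {
        byte[] buffer = new byte[1024];
        int len;
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        // Read chunks until read() returns -1 (end of stream)
        while ((len = in.read(buffer)) != -1) {
            baos.write(buffer, 0, len);
        }
        return new String(baos.toByteArray());
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("hello hdfs".getBytes());
        System.out.println(readAll(in)); // prints "hello hdfs"
    }
}
```

An `FSDataInputStream` extends `java.io.InputStream`, so the same helper would work on a stream returned by `fs.open(path)`.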
The second approach:
/**
 * Read a file via the Hadoop API, letting IOUtils copy the stream
 * @throws IOException
 */
@Test
public void readFileByAPI2() throws IOException {
    Configuration conf = new Configuration();
    conf.set("fs.defaultFS", "hdfs://192.168.75.201:8020/");
    FileSystem fs = FileSystem.get(conf);
    Path path = new Path("/user/index.html");
    FSDataInputStream fis = fs.open(path);
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    // Replaces the manual read loop with a single call (1024-byte buffer)
    IOUtils.copyBytes(fis, baos, 1024);
    System.out.println(new String(baos.toByteArray()));
    fis.close();
    baos.close();
}
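`IOUtils.copyBytes(in, out, bufferSize)` collapses the manual loop from the first approach into one call. Since Java 9 the JDK itself offers the analogous `InputStream.transferTo`; a runnable sketch with plain `java.io` streams (class and method names here are illustrative, not from the original post):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class TransferDemo {
    // One-call stream copy, analogous to IOUtils.copyBytes(fis, baos, 1024)
    static String copyAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        in.transferTo(out); // JDK 9+ built-in copy loop
        return new String(out.toByteArray());
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("hello hdfs".getBytes());
        System.out.println(copyAll(in)); // prints "hello hdfs"
    }
}
```

Either way, the copy helper also hides the buffer bookkeeping, which is why the second version is shorter than the first.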
Original article by Maggie-Hunter; if reposting, please credit the source: https://blog.ytso.com/tech/opensource/193031.html