This article collects typical usage examples of the Java class org.apache.hadoop.hdfs.HftpFileSystem. If you are unsure what HftpFileSystem does, or how and where to use it, the curated class examples below may help.
The HftpFileSystem class belongs to the org.apache.hadoop.hdfs package. Six code examples are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
Example 1: testFileCrcInternal
import org.apache.hadoop.hdfs.HftpFileSystem; // import the required package/class

void testFileCrcInternal(boolean inlineChecksum) throws IOException {
  ((Log4JLogger) HftpFileSystem.LOG).getLogger().setLevel(Level.ALL);

  Random random = new Random(1);
  final long seed = random.nextLong();
  random.setSeed(seed);

  FileSystem fs = FileSystem.getLocal(new Configuration());

  // generate random data
  final byte[] data = new byte[1024 * 1024 + 512 * 7 + 66];
  random.nextBytes(data);

  // write data to a file
  Path foo = new Path(TEST_ROOT_DIR, "foo_" + inlineChecksum);
  {
    final FSDataOutputStream out = fs.create(foo, false, 512, (short) 2, 512);
    out.write(data);
    out.close();
  }

  // compute data CRC
  DataChecksum checksum = DataChecksum.newDataChecksum(
      DataChecksum.CHECKSUM_CRC32, 1);
  checksum.update(data, 0, data.length);

  // compute checksum
  final int crc = fs.getFileCrc(foo);
  System.out.println("crc=" + crc);
  TestCase.assertEquals((int) checksum.getValue(), crc);
}
Developer ID: rhli, Project: hadoop-EAR, Lines: 33, Source: TestLocalFileSystem.java
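The CRC step in Example 1 can be sketched with only the JDK: `java.util.zip.CRC32` implements the same standard CRC-32 that Hadoop's `DataChecksum.CHECKSUM_CRC32` uses for a single chunk covering the whole buffer. This is a minimal stand-in, not Hadoop's implementation; the class name `CrcSketch` is made up for illustration.

```java
import java.util.zip.CRC32;

// Minimal sketch of the CRC computation in Example 1, using the JDK's
// java.util.zip.CRC32 instead of Hadoop's DataChecksum (assumption:
// both compute standard CRC-32 over the full buffer).
public class CrcSketch {
    // Compute a CRC32 over the whole buffer and narrow to int,
    // mirroring (int) checksum.getValue() in the example.
    public static int crcOf(byte[] data) {
        CRC32 crc = new CRC32();
        crc.update(data, 0, data.length);
        return (int) crc.getValue();
    }

    public static void main(String[] args) {
        byte[] data = "hello".getBytes();
        System.out.println("crc=" + crcOf(data));
    }
}
```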
Example 2: getDTfromRemote
import org.apache.hadoop.hdfs.HftpFileSystem; // import the required package/class

static public Credentials getDTfromRemote(String nnAddr,
                                          String renewer) throws IOException {
  DataInputStream dis = null;
  InetSocketAddress serviceAddr = NetUtils.createSocketAddr(nnAddr);
  try {
    StringBuffer url = new StringBuffer();
    if (renewer != null) {
      url.append(nnAddr).append(GetDelegationTokenServlet.PATH_SPEC)
          .append("?").append(GetDelegationTokenServlet.RENEWER).append("=")
          .append(renewer);
    } else {
      url.append(nnAddr).append(GetDelegationTokenServlet.PATH_SPEC);
    }
    URL remoteURL = new URL(url.toString());
    URLConnection connection = SecurityUtil.openSecureHttpConnection(remoteURL);
    InputStream in = connection.getInputStream();
    Credentials ts = new Credentials();
    dis = new DataInputStream(in);
    ts.readFields(dis);
    for (Token<?> token : ts.getAllTokens()) {
      token.setKind(HftpFileSystem.TOKEN_KIND);
      SecurityUtil.setTokenService(token, serviceAddr);
    }
    return ts;
  } catch (Exception e) {
    throw new IOException("Unable to obtain remote token", e);
  } finally {
    if (dis != null) dis.close();
  }
}
Developer ID: ict-carch, Project: hadoop-plus, Lines: 32, Source: DelegationTokenFetcher.java
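The URL construction in Example 2 can be sketched in isolation with plain strings. The inlined values "/getDelegationToken" and "renewer" mirror what `GetDelegationTokenServlet.PATH_SPEC` and `GetDelegationTokenServlet.RENEWER` resolve to in the Hadoop source, but that is an assumption and may differ across versions; `TokenUrlSketch` is a made-up class name.

```java
// Sketch of the delegation-token URL construction from Example 2, with
// the servlet constants inlined as plain strings (assumed values, see
// lead-in). No network call is made; this only builds the request URL.
public class TokenUrlSketch {
    public static String buildUrl(String nnAddr, String renewer) {
        StringBuilder url = new StringBuilder();
        if (renewer != null) {
            // with an explicit renewer, add it as a query parameter
            url.append(nnAddr).append("/getDelegationToken")
               .append("?").append("renewer").append("=").append(renewer);
        } else {
            // without a renewer, request the token with no query string
            url.append(nnAddr).append("/getDelegationToken");
        }
        return url.toString();
    }

    public static void main(String[] args) {
        System.out.println(buildUrl("http://namenode:50070", "alice"));
        System.out.println(buildUrl("http://namenode:50070", null));
    }
}
```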
Example 3: testLazyRenewerStartup
import org.apache.hadoop.hdfs.HftpFileSystem; // import the required package/class

@Test
public void testLazyRenewerStartup() {
  DelegationTokenRenewer<HftpFileSystem> dtr =
      new DelegationTokenRenewer<HftpFileSystem>(HftpFileSystem.class);
  assertFalse(dtr.isAlive());
  dtr.start();
  assertFalse(dtr.isAlive());
  dtr.addRenewAction(null);
  assertTrue(dtr.isAlive());
}
Developer ID: Seagate, Project: hadoop-on-lustre, Lines: 11, Source: TestDelegationTokenRenewer.java
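The property Example 3 verifies is a lazy-startup pattern: `start()` does not launch the background thread; only the first `addRenewAction` does. A JDK-only sketch of that pattern, with `LazyRenewer` as a made-up stand-in for `DelegationTokenRenewer` (not Hadoop's actual implementation):

```java
// Sketch of the lazy-startup pattern that Example 3 tests: isAlive()
// stays false after start(), and the worker thread only launches when
// the first renew action is added. LazyRenewer is hypothetical.
public class LazyRenewer {
    // A parked daemon thread standing in for the real renewal loop.
    private final Thread worker = new Thread(() -> {
        try { Thread.sleep(Long.MAX_VALUE); } catch (InterruptedException e) { /* exit */ }
    });
    private volatile boolean started = false;

    public LazyRenewer() { worker.setDaemon(true); }

    public boolean isAlive() { return worker.isAlive(); }

    // start() only marks the renewer as usable; no thread is launched yet.
    public void start() { started = true; }

    // The worker thread is launched lazily, on the first action.
    public synchronized void addRenewAction(Object action) {
        if (started && !worker.isAlive()) {
            worker.start();
        }
    }
}
```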
Example 4: initialValue
import org.apache.hadoop.hdfs.HftpFileSystem; // import the required package/class

protected SimpleDateFormat initialValue() {
  return HftpFileSystem.getDateFormat();
}
Developer ID: rhli, Project: hadoop-EAR, Lines: 4, Source: ListPathsServlet.java
Example 5: initialValue
import org.apache.hadoop.hdfs.HftpFileSystem; // import the required package/class

@Override
protected SimpleDateFormat initialValue() {
  return HftpFileSystem.getDateFormat();
}
Developer ID: ict-carch, Project: hadoop-plus, Lines: 5, Source: ListPathsServlet.java
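Examples 4 and 5 override `ThreadLocal.initialValue()` because `SimpleDateFormat` is not thread-safe, so each servlet thread gets its own instance. A JDK-only sketch of the same pattern; the pattern string below is an assumed stand-in for whatever `HftpFileSystem.getDateFormat()` returns, and `DateFormatSketch` is a hypothetical class name:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

// Per-thread SimpleDateFormat via ThreadLocal.initialValue(), as in
// Examples 4 and 5. The ISO-style pattern is an assumption, not
// necessarily what HftpFileSystem.getDateFormat() uses.
public class DateFormatSketch {
    private static final ThreadLocal<SimpleDateFormat> FORMAT =
        new ThreadLocal<SimpleDateFormat>() {
            @Override
            protected SimpleDateFormat initialValue() {
                // each thread lazily gets its own (non-thread-safe) formatter
                return new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ssZ");
            }
        };

    public static String format(Date d) {
        return FORMAT.get().format(d);
    }

    public static Date parse(String s) {
        try {
            return FORMAT.get().parse(s);
        } catch (ParseException e) {
            throw new IllegalArgumentException(e);
        }
    }
}
```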
Example 6: getDTfromRemote
import org.apache.hadoop.hdfs.HftpFileSystem; // import the required package/class

/**
 * Utility method to obtain a delegation token over http
 * @param nnAddr Namenode http addr, such as http://namenode:50070
 * @param renewer User that is renewing the ticket in such a request
 */
static public Credentials getDTfromRemote(String nnAddr,
                                          String renewer) throws IOException {
  DataInputStream dis = null;
  InetSocketAddress serviceAddr = NetUtils.createSocketAddr(nnAddr);
  try {
    StringBuffer url = new StringBuffer();
    if (renewer != null) {
      url.append(nnAddr).append(GetDelegationTokenServlet.PATH_SPEC).append("?")
          .append(GetDelegationTokenServlet.RENEWER).append("=").append(renewer);
    } else {
      url.append(nnAddr).append(GetDelegationTokenServlet.PATH_SPEC);
    }
    if (LOG.isDebugEnabled()) {
      LOG.debug("Retrieving token from: " + url);
    }
    URL remoteURL = new URL(url.toString());
    boolean isSecure = "https".equals(remoteURL.getProtocol().toLowerCase());
    URLConnection connection =
        SecurityUtil.openSecureHttpConnection(remoteURL);
    InputStream in = connection.getInputStream();
    Credentials ts = new Credentials();
    dis = new DataInputStream(in);
    ts.readFields(dis);
    for (Token<?> token : ts.getAllTokens()) {
      if (isSecure) {
        token.setKind(HsftpFileSystem.TOKEN_KIND);
      } else {
        token.setKind(HftpFileSystem.TOKEN_KIND);
      }
      SecurityUtil.setTokenService(token, serviceAddr);
    }
    return ts;
  } catch (Exception e) {
    throw new IOException("Unable to obtain remote token", e);
  } finally {
    if (dis != null) dis.close();
  }
}
Developer ID: Seagate, Project: hadoop-on-lustre, Lines: 46, Source: DelegationTokenFetcher.java
Note: the org.apache.hadoop.hdfs.HftpFileSystem class examples in this article were collected from source code and documentation platforms such as GitHub and MSDocs, with snippets selected from open-source projects contributed by many developers. Copyright of the source code belongs to the original authors; consult the corresponding project's License before distributing or using it. Do not reproduce without permission.