I have a Hadoop cluster with Kerberos enabled, and I want to put files on HDFS from a Windows/Linux machine outside the cluster.
The Hadoop admin team has provided me with a username and a keytab file. How should I use them in my Java code?
I went through many resources on the internet, but none of them explain how to access kerberized Hadoop from outside the cluster.
Also, is it necessary to run the code using hadoop jar? If yes, how would I run it from outside the cluster?
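For context, this is roughly the kind of client code I am trying to write (a minimal sketch; the NameNode URI, the Kerberos principal/realm, and the keytab path are placeholders for my cluster's actual values):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class HdfsPut {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder values; normally taken from the cluster's
        // core-site.xml / hdfs-site.xml
        conf.set("fs.defaultFS", "hdfs://namenode:8020");
        conf.set("hadoop.security.authentication", "kerberos");

        // Log in with the principal and keytab the admin team provided
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab(
                "qjdht93@EXAMPLE.COM", "/path/to/qjdht93.keytab");

        // Copy a local file onto HDFS
        FileSystem fs = FileSystem.get(conf);
        fs.copyFromLocalFile(new Path("local.txt"),
                new Path("/user/qjdht93/local.txt"));
        fs.close();
    }
}
```

I am not sure whether this login-from-keytab approach is correct for a machine outside the cluster, or whether additional client-side configuration is required.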
References:
http://blog.rajeevsharma.in/2009/06/using-hdfs-in-java-0200.html
http://princetonits.com/technology/using-filesystem-api-to-read-and-write-data-to-hdfs/
Update: I got Kerberos working and am able to generate a ticket now, but curl is not working (on Windows):
curl -i --negotiate u:qjdht93 "http://server:50070/webhdfs/v1/user/qjdht93/?op=LISTSTATUS"
It gives this error:
HTTP/1.1 401 Authentication required
Cache-Control: must-revalidate,no-cache,no-store
Date: Mon, 01 Jun 2015 15:26:37 GMT
Pragma: no-cache
Date: Mon, 01 Jun 2015 15:26:37 GMT
Pragma: no-cache
Content-Type: text/html; charset=iso-8859-1
WWW-Authenticate: Negotiate
Set-Cookie: hadoop.auth=; Version=1; Path=/; Expires=Thu, 01-Jan-1970 00:00:00 GMT; HttpOnly
Content-Length: 1416
Server: Jetty(6.1.26)
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1"/>
<title>Error 401 Authentication required</title>
</head>
<body><h2>HTTP ERROR 401</h2>
<p>Problem accessing /webhdfs/v1/user/qjdht93. Reason:
<pre> Authentication required</pre></p><hr /><i><small>Powered by Jetty://</small></i><br/>
<br/>
<br/>
Please suggest.
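For reference, this is the form of the command I understand should work once a valid ticket exists (the host name and realm are placeholders; note the empty -u : rather than a literal username, so that curl takes the credentials from the Kerberos ticket cache):

```shell
# Obtain a ticket from the keytab first (principal/realm are placeholders)
kinit -kt qjdht93.keytab qjdht93@EXAMPLE.COM

# SPNEGO with curl: --negotiate plus an empty user:password pair;
# the ticket obtained by kinit supplies the actual credentials
curl -i --negotiate -u : "http://server:50070/webhdfs/v1/user/qjdht93/?op=LISTSTATUS"
```

I am not certain this is the issue on Windows, but the u:qjdht93 in my command above may not be what curl expects for --negotiate.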