forked from ukwa/httpfs
-----------------------------------------------------------------------------
HttpFS backport for cdh3u5 - Hadoop HDFS over HTTP
The HttpFS source for this backport has been taken from the following
Apache Hadoop Subversion branch@revision:
https://svn.apache.org/repos/asf/hadoop/common/trunk@1363175
HttpFS is a server that provides a REST HTTP gateway to HDFS with full
filesystem read & write capabilities.
HttpFS can be used to transfer data between clusters running different
versions of Hadoop (overcoming RPC versioning issues), for example using
Hadoop DistCP.
HttpFS can be used to access data in HDFS on a cluster behind a firewall
(the HttpFS server acts as a gateway and is the only system that is allowed
to cross the firewall into the cluster).
HttpFS can be used to access data in HDFS using HTTP utilities (such as curl
and wget) and HTTP libraries from languages other than Java (for example, Perl).
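As a rough illustration of what such HTTP access looks like, the sketch below
composes WebHDFS-style request URLs like those the HttpFS server accepts. The
host name is a placeholder; port 14000 and the '/webhdfs/v1' path prefix are
assumed here based on Apache Hadoop's HttpFS defaults, so check your own
server configuration.

```python
from urllib.parse import urlencode

# Hypothetical gateway address; substitute your HttpFS host. Port 14000 is
# the Apache Hadoop HttpFS default, and /webhdfs/v1 is its REST path prefix.
HTTPFS = "http://httpfs-host.example.com:14000"

def httpfs_url(path, op, params=None):
    """Build a WebHDFS-style HttpFS request URL for an HDFS path and operation."""
    query = urlencode({"op": op, **(params or {})})
    return f"{HTTPFS}/webhdfs/v1{path}?{query}"

# Read a file, e.g. with: curl -L "<url>"
print(httpfs_url("/user/foo/data.txt", "OPEN", {"user.name": "foo"}))

# List a directory:
print(httpfs_url("/user/foo", "LISTSTATUS", {"user.name": "foo"}))
```

The same URLs work from wget, Perl's HTTP libraries, or any other HTTP client,
which is what makes HttpFS usable outside the Java/Hadoop RPC stack.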
Requirements:
* Unix OS
* JDK 1.6.*
* Maven 3.*
How to build:
Clone this Git repository. Checkout the cdh3u5 branch.
Run 'mvn package -Pdist'.
The resulting TARBALL will be under the 'target/' directory.
How to install:
Expand the built TARBALL.
Follow the setup instructions:
http://cloudera.github.com/httpfs/
-----------------------------------------------------------------------------