a. On the master node (ubuntu-v01), edit hdfs-site.xml and add the following property.
This disables HDFS permission checking. It resolves the error I hit after configuring the Map/Reduce connection in Eclipse on my Windows machine: org.apache.hadoop.security.AccessControlException: Permission denied.
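The original post does not reproduce the property itself; in Hadoop 2.x the switch that turns off HDFS permission checking is dfs.permissions.enabled (it was dfs.permissions in 1.x), so the addition presumably looks like this sketch:

```xml
<!-- hdfs-site.xml: disable HDFS permission checks.
     Suitable for a dev/test cluster only; do not do this in production. -->
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>
```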
b. Also on the master node (ubuntu-v01), edit hdfs-site.xml and add the following property.
This addresses a warning reported at runtime: WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping: got exception trying to get groups for user jack.
The cause is that my Windows username is jack, which has no access rights on the cluster.
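Again the post doesn't show the exact property. One plausible workaround (an assumption on my part, not taken from the original) is to map the Windows user statically to a group in core-site.xml so that the shell-based group lookup is never attempted for it; the user name "jack" and group "hadoop" below are illustrative:

```xml
<!-- core-site.xml: statically map the Windows user "jack" to the
     "hadoop" group, so ShellBasedUnixGroupsMapping is not consulted -->
<property>
  <name>hadoop.user.group.static.mapping.overrides</name>
  <value>jack=hadoop</value>
</property>
```

Alternatively, creating a matching "jack" account on the Linux nodes silences the warning as well.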
For more on permission configuration, see the official documentation:
HDFS Permissions Guide: http://hadoop.apache.org/docs/r1.0.4/cn/hdfs_permissions_guide.html
After changing the configuration, restart the Hadoop cluster:
hadoop@ubuntu-v01:~/data$ ./sbin/stop-dfs.sh
hadoop@ubuntu-v01:~/data$ ./sbin/stop-yarn.sh
hadoop@ubuntu-v01:~/data$ ./sbin/start-dfs.sh
hadoop@ubuntu-v01:~/data$ ./sbin/start-yarn.sh
II. Preparing the Windows environment
Windows 7 (x64), JDK, Ant, Eclipse, Hadoop
1. JDK setup
After installing jdk-6u26-windows-i586.exe, set the JAVA_HOME environment variable and add its bin directory to PATH.
2. Eclipse setup
Extract eclipse-standard-luna-sr1-win32.zip into the d:\eclipse\ directory and rename it eclipse-hadoop.
Download: http://developer.eclipsesource.com/technology/epp/luna/eclipse-standard-luna-sr1-win32.zip
3. Ant setup
Extract apache-ant-1.9.4-bin.zip into the d:\apache\ directory, set the ANT_HOME environment variable, and add its bin directory to PATH.
Download: http://mirror.bit.edu.cn/apache//ant/binaries/apache-ant-1.9.4-bin.zip
4. Download hadoop-2.5.2.tar.gz
http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.5.2/hadoop-2.5.2.tar.gz
5. Download hadoop-2.5.2-src.tar.gz
http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-2.5.2/hadoop-2.5.2-src.tar.gz
6. Download hadoop2x-eclipse-plugin
https://github.com/winghc/hadoop2x-eclipse-plugin
7. Download hadoop-common-2.2.0-bin
https://github.com/srccodes/hadoop-common-2.2.0-bin
Extract hadoop-2.5.2.tar.gz, hadoop-2.5.2-src.tar.gz, hadoop2x-eclipse-plugin, and hadoop-common-2.2.0-bin into the f:\hadoop\ directory.
8. Edit the local hosts file and add the following entry:
192.168.1.112 ubuntu-v01
III. Building and configuring hadoop-eclipse-plugin-2.5.2.jar
1. Add the environment variable HADOOP_HOME=f:\hadoop\hadoop-2.5.2\
Append to the PATH environment variable: %HADOOP_HOME%\bin
2. Update the build and dependency version information
Edit f:\hadoop\hadoop2x-eclipse-plugin-master\ivy\libraries.properties:
hadoop.version=2.5.2
jackson.version=1.9.13
3. Build with Ant (note that the -D flags are case-sensitive):
f:\hadoop\hadoop2x-eclipse-plugin-master\src\contrib\eclipse-plugin>
ant jar -Dversion=2.5.2 -Declipse.home=d:\eclipse\eclipse-hadoop\eclipse -Dhadoop.home=f:\hadoop\hadoop-2.5.2
When the build completes, hadoop-eclipse-plugin-2.5.2.jar will be in the f:\hadoop\hadoop2x-eclipse-plugin-master\build\contrib\eclipse-plugin directory.
IV. Eclipse configuration
1. Copy the built hadoop-eclipse-plugin-2.5.2.jar into Eclipse's plugins directory, then restart Eclipse.
2. Open Window -> Preferences -> Hadoop Map/Reduce and configure it, as shown below:
3. Show the Hadoop connection view via Window -> Show View -> Other -> MapReduce Tools, as shown below:
4. Configure the Hadoop connection, as shown below:
If the connection succeeds, you will see information like the following:
V. Adding test files to the Hadoop cluster
(skip this step if the files already exist)
a. Create the input directory on DFS:
hadoop@ubuntu-v01:~/data/hadoop-2.5.2$ bin/hadoop fs -mkdir -p input
b. Copy README.txt from the Hadoop directory into the newly created input directory on DFS:
hadoop@ubuntu-v01:~/data/hadoop-2.5.2$ bin/hadoop fs -copyFromLocal README.txt input
VI. Creating a Map/Reduce project
1. Create a new project via File -> New -> Other -> Map/Reduce Project and name it MR1.
Then create the class org.apache.hadoop.examples.WordCount and overwrite it with the copy from hadoop-2.5.2-src
(f:\hadoop\hadoop-2.5.2-src\hadoop-mapreduce-project\hadoop-mapreduce-examples\src\main\java\org\apache\hadoop\examples\WordCount.java).
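To see what the copied WordCount class computes without needing a cluster, here is a plain-Java sketch of the same two phases, a map step that tokenizes each line into (word, 1) pairs and a reduce step that sums the counts per word (the class and method names here are illustrative, not from the Hadoop source):

```java
import java.util.Arrays;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.StringTokenizer;

public class WordCountSketch {

    // Map phase: tokenize each input line; Reduce phase: sum counts per word.
    // Hadoop distributes these two steps; here they run in one process.
    public static Map<String, Integer> count(List<String> lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            StringTokenizer itr = new StringTokenizer(line);
            while (itr.hasMoreTokens()) {
                counts.merge(itr.nextToken(), 1, Integer::sum);
            }
        }
        return counts;
    }

    public static void main(String[] args) {
        System.out.println(count(Arrays.asList("hello hadoop", "hello eclipse")));
    }
}
```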
2. Create a log4j.properties file
Create log4j.properties under the src directory with the following content:
log4j.rootLogger=DEBUG,stdout,R
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p - %m%n
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=mapreduce_test.log
log4j.appender.R.MaxFileSize=1MB
log4j.appender.R.MaxBackupIndex=1
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%p %t %c - %m%n
log4j.logger.com.codefutures=DEBUG
3. Fixing the java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I) exception
(your environment may differ from mine, so you can postpone this change until the problem actually appears)
Copy the source file org.apache.hadoop.io.nativeio.NativeIO into the project,
then locate line 570 and change the statement there to return true;
as shown below:
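After the edit, the patched method should look roughly like this (reconstructed from the Hadoop 2.5.x source from memory; the exact line number may differ slightly in your copy):

```java
// org.apache.hadoop.io.nativeio.NativeIO.Windows#access
// The original body delegated to the native access0 call, which fails
// without a matching hadoop.dll; short-circuit it for local testing only.
public static boolean access(String path, AccessRight desiredAccess)
    throws IOException {
  return true;
  // was: return access0(path, desiredAccess.accessRight());
}
```

This skips the Windows file-access check entirely, so it is a local-development workaround, not something to ship.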
VII. Windows runtime environment configuration
(if the change does not take effect, reboot the machine)
hadoop.dll and winutils.exe are required.
I simply copied the contents of f:\hadoop\hadoop-common-2.2.0-bin-master\bin over f:\hadoop\hadoop-2.5.2\bin.
VIII. Running the project
In Eclipse, select WordCount.java, right-click, and choose Run As -> Run Configurations, then configure the program arguments, i.e. the input and output directories:
hdfs://ubuntu-v01:9000/user/hadoop/input hdfs://ubuntu-v01:9000/user/hadoop/output
As shown below: