Question tag: [hadoop2]
hadoop - Single Serialization Type (SST) of Pig/Cascading versus Multiple Serialization Type (MST) of Apache Crunch
In their FAQ, the Crunch team highlights the main difference as Crunch's multiple serialization types (MST) versus Cascading's single serialization type (SST). I am not sure how these differ. Can someone explain with an example?
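The distinction can be sketched in plain Java. This is a conceptual sketch only: `Tuple` and `Purchase` below are hypothetical stand-ins, not the real Cascading or Crunch APIs. Under SST, every record flowing between operations is one framework type (a tuple of fields) handled by a single serializer, so field types are only checked at runtime; under MST, each stage declares its own element type, each with its own serialization, so the compiler can check the pipeline end to end.

```java
import java.util.List;
import java.util.stream.Collectors;

// Conceptual sketch only -- Tuple/Purchase are hypothetical stand-ins,
// NOT the real Cascading or Crunch APIs.
public class SstVsMst {

    // SST style (Cascading/Pig): one generic record type for every stage;
    // one serializer handles everything, but field types live in your head.
    record Tuple(Object... fields) {}

    static Tuple parseSst(String line) {
        String[] parts = line.split(",");
        return new Tuple(parts[0], Integer.parseInt(parts[1]));
    }

    // MST style (Crunch): each stage has its own element type
    // (Writable, Avro record, POJO), each with its own serialization.
    record Purchase(String user, int amount) {}

    static Purchase parseMst(String line) {
        String[] parts = line.split(",");
        return new Purchase(parts[0], Integer.parseInt(parts[1]));
    }

    public static void main(String[] args) {
        List<String> lines = List.of("alice,10", "bob,5");

        // SST: pulling a field back out needs a position and a cast.
        int first = (Integer) parseSst(lines.get(0)).fields()[1];

        // MST: the "pipeline" String -> Purchase -> Integer is checked
        // at compile time via the typed accessor.
        List<Integer> amounts = lines.stream()
                .map(SstVsMst::parseMst)
                .map(Purchase::amount)
                .collect(Collectors.toList());

        System.out.println(first);    // 10
        System.out.println(amounts);  // [10, 5]
    }
}
```

In real Crunch, a stage would be a `PCollection<Purchase>` backed by a `PType` describing its serialization, while the SST analogue in Cascading is `cascading.tuple.Tuple` flowing through every pipe.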
hadoop - Datanode does not start properly
I am trying to install Hadoop 2.2.0 in pseudo-distributed mode. When I try to start the datanode service, it shows the following error. Can anyone tell me how to resolve this?
hadoop - Unable to load a sample file into hadoop 2.2.0
I installed 2.2.0 in pseudo-distributed mode and tried to run copyFromLocal to copy some sample data.
I am now using /input as the destination path, e.g. bin/hadoop fs -copyFromLocal /home/prassanna/Desktop/input /input
I think it works now; I verified the file with bin/hadoop fs -ls /input, which printed:
14/03/12 09:31:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r-- 1 root supergroup 64 ...
I also checked the datanode UI, but the used percentage it shows is only '0', yet it should show a few KB for the 64-byte file, right? Please tell me whether the input file was copied into HDFS correctly, and also where the file is physically stored on the local machine. Please help clear up this confusion. Thanks in advance.
maven - Maven artifactId for hadoop-core in hadoop 2.2.0
I am migrating my application from hadoop 1.0.3 to hadoop 2.2.0, and the maven build flags hadoop-core as a dependency, since hadoop-core does not exist for hadoop 2.2.0. I tried replacing it with hadoop-client and hadoop-common, but I still get this error for ant.filter. Can someone suggest which artifact to use?
Error:
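For reference, a minimal sketch of the usual 2.x replacement, assuming a plain MapReduce client application: hadoop-core was split up in the 2.x line, and hadoop-client aggregates the common, HDFS, and MapReduce client jars it used to bundle (this does not by itself explain an ant.filter error, which usually points at a separate missing ant dependency).

```xml
<!-- hadoop-core no longer exists in 2.x; hadoop-client pulls in the
     common + HDFS + MapReduce client jars that hadoop-core bundled. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.2.0</version>
</dependency>
```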
hadoop - Data backup and recovery in hadoop 2.2.0
I am new to Hadoop and interested in Hadoop administration, so I installed Hadoop 2.2.0 in pseudo-distributed mode on Ubuntu 12.04, got it running successfully, and ran some sample jar files. Now I am trying to go further and learn the data backup and recovery part. Can anyone tell me how to back up and restore data in hadoop 2.2.0? Please also recommend good books on Hadoop administration and the steps to learn it.
Thanks in advance.
hadoop - Hadoop release missing /conf directory
I am trying to install a single node setup of Hadoop on Ubuntu. I started following the instructions on the Hadoop 2.3 docs.
But I seem to be missing something very simple.
First, it says:
To get a Hadoop distribution, download a recent stable release from one of the Apache Download Mirrors.
Then,
Unpack the downloaded Hadoop distribution. In the distribution, edit the file conf/hadoop-env.sh to define at least JAVA_HOME to be the root of your Java installation.
However, I can't seem to find the conf directory.
I downloaded a release of 2.3 from one of the mirrors, then unpacked the tarball; an ls of the contents returns:
I was able to find the file they were referencing, just not in a conf directory:
Am I missing something, or am I grabbing the wrong package? Or are the docs just outdated?
If so, does anyone know where some more up-to-date docs are?
hadoop - Hadoop command hadoop fs -ls throws a "retrying connect to server" error?
When I type hadoop fs -ls, I get the following error message:
The output of hadoop namenode -format is:
hadoop - Hadoop fs -cp says the file does not exist?
The file new.txt is definitely available; I don't know why, when I try to copy it into the hdfs directory, it says the file does not exist.
PS: I have already created a directory named hdfs:
java - Is the Hadoop Text data type mutable or immutable?
In one of my mapreduce programs, I use new Text() during context.write.
When I use the above statement, I would like to know how the Text object is stored and how its memory is managed. I also want to know whether the object's value survives after it is no longer referenced by any object.
Please let me know.
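For what it's worth, Text is mutable: Text.set() overwrites the instance's backing byte buffer, and context.write() serializes the key/value immediately, which is why reusing one instance per mapper is the common pattern. Once nothing references the object any more, it is ordinary garbage for the JVM collector; its value does not persist on its own. The classic pitfall is storing references to a reused instance, sketched here in pure JDK code (FakeText is a hypothetical stand-in, since org.apache.hadoop.io.Text is not assumed on the classpath):

```java
import java.util.ArrayList;
import java.util.List;

// Pure-JDK sketch: Hadoop's Text is a MUTABLE holder whose set()
// overwrites the same backing buffer; FakeText stands in for it.
public class MutableTextDemo {
    static final class FakeText {
        private String value = "";
        void set(String v) { value = v; }   // mutates in place, like Text.set()
        @Override public String toString() { return value; }
    }

    public static void main(String[] args) {
        FakeText text = new FakeText();     // one reused instance, like new Text()
        List<FakeText> saved = new ArrayList<>();
        List<String> copied = new ArrayList<>();

        for (String word : new String[] {"hadoop", "text"}) {
            text.set(word);
            saved.add(text);                // BUG: stores the same mutable object
            copied.add(text.toString());    // OK: copies the current value
        }
        // Both saved references point at one object holding the LAST value.
        System.out.println(saved);   // [text, text]
        System.out.println(copied);  // [hadoop, text]
    }
}
```

This is also why reusing one Text across context.write calls is safe (the framework copies the bytes at write time), while caching the values handed to a reducer's iterator is not.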
hadoop - Hadoop, error when trying to add a mapper class
I have successfully installed the Hadoop plugin, and I can also access the DFS locations from Eclipse. But when I try to add a Mapper or Reducer class to the project I created, I get the following error:
The superclass cannot be parameterized unless the source level is 1.5.
I checked the build path and all the Hadoop jars are available. What could the error be?