Datanode process not running in Hadoop
I set up and configured a multi-node Hadoop cluster using this tutorial.
When I type in the start-all.sh command, it shows all the processes initializing properly as follows:
starting namenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-namenode-jawwadtest1.out
jawwadtest1: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-jawwadtest1.out
jawwadtest2: starting datanode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-datanode-jawwadtest2.out
jawwadtest1: starting secondarynamenode, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-secondarynamenode-jawwadtest1.out
starting jobtracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-jobtracker-jawwadtest1.out
jawwadtest1: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-jawwadtest1.out
jawwadtest2: starting tasktracker, logging to /usr/local/hadoop/libexec/../logs/hadoop-root-tasktracker-jawwadtest2.out
However, when I type the jps command, I get the following output:
31057 NameNode
4001 RunJar
6182 RunJar
31328 SecondaryNameNode
31411 JobTracker
32119 Jps
31560 TaskTracker
As you can see, there's no datanode process running. I tried configuring a single-node cluster but got the same problem. Would anyone have any idea what could be going wrong here? Are there any configuration files that are not mentioned in the tutorial, or that I may have overlooked? I am new to Hadoop and am kind of lost; any help would be greatly appreciated.
EDIT: hadoop-root-datanode-jawwadtest1.log:
STARTUP_MSG: args = []
STARTUP_MSG: version = 1.0.3
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common/branches/$
************************************************************/
2012-08-09 23:07:30,717 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: loa$
2012-08-09 23:07:30,734 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapt$
2012-08-09 23:07:30,735 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
2012-08-09 23:07:30,736 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
2012-08-09 23:07:31,018 INFO org.apache.hadoop.metrics2.impl.MetricsSourceAdapt$
2012-08-09 23:07:31,024 WARN org.apache.hadoop.metrics2.impl.MetricsSystemImpl:$
2012-08-09 23:07:32,366 INFO org.apache.hadoop.ipc.Client: Retrying connect to $
2012-08-09 23:07:37,949 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: $
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(Data$
        at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransition$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNo$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.(DataNode.java$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNod$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode($
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataN$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.$
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1$
2012-08-09 23:07:37,951 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: S$
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at jawwadtest1/198.101.220.90
************************************************************/
Answer by giltsl for Datanode process not running in Hadoop
You need to do something like this:

bin/stop-all.sh (or stop-dfs.sh and stop-yarn.sh in the 2.x series)
rm -Rf /app/tmp/hadoop-your-username/*
bin/hadoop namenode -format (or hdfs namenode -format in the 2.x series)

The solution was taken from: http://pages.cs.brandeis.edu/~cs147a/lab/hadoop-troubleshooting/. Basically it consists of restarting from scratch, so make sure you won't lose data by formatting the HDFS.
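The steps above can be sketched as a small script. This is only a sketch: the tmp path is a placeholder for whatever hadoop.tmp.dir points to in your core-site.xml, and the stop/format/start commands are left as comments because they need a live Hadoop install. The demo runs the destructive step against a throwaway directory.

```shell
# clear_hadoop_tmp wipes everything under the given hadoop.tmp.dir.
# The surrounding stop/format/start commands are shown as comments
# because they require a running Hadoop installation.
clear_hadoop_tmp() {
    # bin/stop-all.sh                # 1.x (2.x: stop-dfs.sh and stop-yarn.sh)
    rm -rf "${1:?usage: clear_hadoop_tmp <hadoop.tmp.dir>}"/*
    # bin/hadoop namenode -format    # 1.x (2.x: hdfs namenode -format)
    # bin/start-all.sh
}

# Demo on a throwaway directory instead of a live cluster:
demo=$(mktemp -d)
touch "$demo/blk_0001" "$demo/VERSION"
clear_hadoop_tmp "$demo"
ls -A "$demo"    # prints nothing: the directory is now empty
```

Note that this destroys all HDFS block data, which is exactly why the answer warns about backing up first.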
Answer by HypnoticSheep for Datanode process not running in Hadoop
I was having the same problem running a single-node pseudo-distributed instance. Couldn't figure out how to solve it, but a quick workaround is to manually start a DataNode with
hadoop-x.x.x/bin/hadoop datanode
Answer by Neha Milak for Datanode process not running in Hadoop
Try this:
1. stop-all.sh
2. vi hdfs-site.xml
3. change the value given for the property dfs.data.dir
4. format the namenode
5. start-all.sh
Answer by sunskin for Datanode process not running in Hadoop
I ran into the same issue. I had created an hdfs folder '/home/username/hdfs' with sub-directories name, data, and tmp, which were referenced in the XML config files under hadoop/conf.

When I started Hadoop and ran jps, I couldn't find the datanode, so I tried to start it manually using bin/hadoop datanode. The error message made me realize it had a permissions issue accessing dfs.data.dir=/home/username/hdfs/data/, which was referenced in one of the Hadoop config files. All I had to do was stop Hadoop, delete the contents of the /home/username/hdfs/tmp/* directory, and then run this command:

chmod -R 755 /home/username/hdfs/

Then start Hadoop. The datanode appeared!
Answer by user1431921 for Datanode process not running in Hadoop
I found the details of the issue in the log file: "Invalid directory in dfs.data.dir: Incorrect permission for /home/hdfs/dnman1, expected: rwxr-xr-x, while actual: rwxrwxr-x". From there I identified that the file permission on my datanode folder was 777. I corrected it to 755 and it started working.
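As the log message shows, the DataNode wants its data directory no more permissive than 755 (rwxr-xr-x). A minimal sketch of checking and fixing this — run against a throwaway directory here; point it at your real dfs.data.dir in practice:

```shell
# fix_dfs_perms tightens a directory to the 755 (rwxr-xr-x) mode the
# DataNode expects. The demo path below is a temp directory, not a
# real dfs.data.dir.
fix_dfs_perms() {
    dir=$1
    # GNU stat uses -c %a; BSD/macOS stat uses -f %Lp
    mode=$(stat -c %a "$dir" 2>/dev/null || stat -f %Lp "$dir")
    if [ "$mode" != "755" ]; then
        chmod -R 755 "$dir"
    fi
}

demo=$(mktemp -d)
chmod 775 "$demo"                                        # reproduce the rwxrwxr-x problem
fix_dfs_perms "$demo"
stat -c %a "$demo" 2>/dev/null || stat -f %Lp "$demo"    # 755
```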
Answer by natalinobusa for Datanode process not running in Hadoop
Check that the tmp directory property points to a valid directory in core-site.xml:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hduser/data/tmp</value>
</property>
If the directory is misconfigured, the datanode process will not start properly.
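A quick way to sanity-check this is to pull the value out of core-site.xml and test whether the directory exists. A sketch — the config here is a mock written to a temp file, and the sed extraction naively assumes one <value> per line, as in the stock config layout:

```shell
# Sketch: extract the hadoop.tmp.dir value from core-site.xml and check
# that the directory exists. We write a mock config to a temp file;
# point CONF at your real core-site.xml instead.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/hduser/data/tmp</value>
  </property>
</configuration>
EOF

# naive extraction: assumes the <value> element sits on its own line
tmpdir=$(sed -n 's:.*<value>\(.*\)</value>.*:\1:p' "$CONF" | head -1)
echo "hadoop.tmp.dir = $tmpdir"
[ -d "$tmpdir" ] || echo "warning: $tmpdir does not exist; the datanode will not start"
```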
Answer by JackeyXu for Datanode process not running in Hadoop
Instead of deleting everything under the "hadoop tmp dir", you can point it somewhere else. For example, if your core-site.xml has this property:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hduser/data/tmp</value>
</property>

you can change it to:

<property>
  <name>hadoop.tmp.dir</name>
  <value>/home/hduser/data/tmp2</value>
</property>

Then scp core-site.xml to each node, run "hadoop namenode -format", and restart Hadoop.
Answer by apurva.nandan for Datanode process not running in Hadoop
This is for newer versions of Hadoop (I am running 2.4.0):

- In this case, stop the cluster: sbin/stop-all.sh
- Then go to /etc/hadoop for the config files.
- In hdfs-site.xml, look for the directory paths configured for dfs.namenode.name.dir and dfs.datanode.data.dir.
- Delete both directories recursively (rm -r).
- Now format the namenode via bin/hadoop namenode -format
- And finally sbin/start-all.sh
Hope this helps.
Answer by sindhu Y for Datanode process not running in Hadoop
You need to check:

/app/hadoop/tmp/dfs/data/current/VERSION and /app/hadoop/tmp/dfs/name/current/VERSION

Compare the namespaceID in those two files — one belongs to the namenode, the other to the datanode. The datanode will run if and only if its namespaceID is the same as the namenode's.

If they differ, copy the namenode's namespaceID into the datanode's VERSION file using vi or gedit, save, and re-run the daemons. It will work perfectly.
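The manual vi edit described above can also be scripted. A sketch — the function mirrors the /app/hadoop/tmp paths from the answer in its comments, but the demo operates on mock VERSION files so it can run without a cluster:

```shell
# sync_namespace_id copies the namenode's namespaceID into the
# datanode's VERSION file, which is what the manual vi edit does.
sync_namespace_id() {
    name_version=$1    # e.g. /app/hadoop/tmp/dfs/name/current/VERSION
    data_version=$2    # e.g. /app/hadoop/tmp/dfs/data/current/VERSION
    id=$(sed -n 's/^namespaceID=//p' "$name_version")
    sed -i.bak "s/^namespaceID=.*/namespaceID=$id/" "$data_version"
}

# Demo with mock VERSION files instead of a live cluster:
dir=$(mktemp -d)
printf 'namespaceID=123456789\ncTime=0\n' > "$dir/name_VERSION"
printf 'namespaceID=987654321\ncTime=0\n' > "$dir/data_VERSION"
sync_namespace_id "$dir/name_VERSION" "$dir/data_VERSION"
grep namespaceID "$dir/data_VERSION"    # namespaceID=123456789
```

The sed -i.bak form works on both GNU and BSD sed and leaves a backup of the original VERSION file.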
Answer by The joker for Datanode process not running in Hadoop
If formatting the tmp directory does not work, then try this:

- First stop all the entities like namenode, datanode etc. (you will have some script or command to do that)
- Format the tmp directory
- Go to /var/cache/hadoop-hdfs/hdfs/dfs/ and delete all the contents in the directory manually
- Now format your namenode again
- Start all the entities, then use the jps command to confirm that the datanode has started
- Now run whichever application you have
Hope this helps.
Answer by Swapnil Gangrade for Datanode process not running in Hadoop
Run the commands below in order:

- stop-all.sh (stop all the Hadoop processes)
- rm -r /usr/local/hadoop/tmp/ (the Hadoop tmp directory you configured in hadoop/conf/core-site.xml)
- sudo mkdir /usr/local/hadoop/tmp (make the same directory again)
- hadoop namenode -format (format your namenode)
- start-all.sh (start all the Hadoop processes)
- jps (it will show the running processes)
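The final jps check can be automated. A sketch — the sample below is canned jps-style output so the demo runs without a JVM; in real use you would pass "$(jps)" instead:

```shell
# check_datanode scans jps-style output for a DataNode entry.
check_datanode() {
    if echo "$1" | grep -q 'DataNode'; then
        echo "DataNode is running"
    else
        echo "DataNode missing: check the datanode log"
    fi
}

# Canned sample output standing in for a real `jps` call:
sample='31057 NameNode
31328 SecondaryNameNode
31411 JobTracker
31560 TaskTracker
32200 DataNode'
check_datanode "$sample"    # DataNode is running
```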
Answer by Bhaskar for Datanode process not running in Hadoop
- I configured hadoop.tmp.dir in conf/core-site.xml
- I configured dfs.data.dir in conf/hdfs-site.xml
- I configured dfs.name.dir in conf/hdfs-site.xml
- Deleted everything under the "/tmp/hadoop-/" directory
- Changed file permissions from 777 to 755 for the directory listed under dfs.data.dir

And the data node started working.
Answer by Harish Pathak for Datanode process not running in Hadoop
Follow these steps and your datanode will start again:

- Stop dfs.
- Open hdfs-site.xml.
- Remove the data.dir and name.dir properties from hdfs-site.xml and format the namenode again.
- Then remove the hadoopdata directory, add the data.dir and name.dir properties back in hdfs-site.xml, and format the namenode again.
- Then start dfs again.
Answer by cocoliso for Datanode process not running in Hadoop
- Erase the files where the data and name directories of dfs live.

In my case, I have Hadoop on Windows under C:/. According to core-site.xml etc., these files were in tmp/Administrator/dfs/data (and name, etc.), so erase them.

Then run namenode -format and try again.
Answer by Sneha Priya Ale for Datanode process not running in Hadoop
Stop all the services: ./stop-all.sh

Clear the hdfs tmp directory on all the masters and slaves. Don't forget to clear it on the slaves.

Format the namenode (hadoop namenode -format).

Now start the services on the namenode: ./bin/start-all.sh

This is what made the datanode service start for me.
Answer by juggernaut1996 for Datanode process not running in Hadoop
mv /usr/local/hadoop_store/hdfs/datanode /usr/local/hadoop_store/hdfs/datanode.backup
mkdir /usr/local/hadoop_store/hdfs/datanode
hadoop datanode OR start-all.sh
jps
Answer by Sunil Suthar for Datanode process not running in Hadoop
Step 1: stop-all.sh
Step 2: go to this path: cd /usr/local/hadoop/bin
Step 3: run this command: hadoop datanode

Now the DataNode works.