We were setting up a small Hadoop cluster for an internal requirement and decided to go with the Hortonworks distribution of Hadoop (HDP 1.1 for Windows).
After installing, we realized that for some reason the Oozie service was not starting up (we tried both the CLI command provided with the HDP installation and starting the service manually via services.msc).
Looking through the log files (the <HDP 1.1 installation path>\oozie-<version>\Service directory, specifically the oozieservice.out.log file), we found the following error -
and so realized that the Oozie server depends on a JDK; it does not work with only a JRE.
To correct this we uninstalled the JRE and installed a JDK, then manually changed the JAVA_HOME environment variable to point to the new installation path. After that, the Oozie server started up fine.
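A quick way to confirm the fix is to check what JAVA_HOME actually points to. Below is a minimal sketch (not part of the HDP tooling) that distinguishes a JDK from a JRE by looking for bin\javac.exe, which only a JDK ships with:

```python
# Minimal sketch: confirm JAVA_HOME points to a JDK rather than a JRE.
# On Windows, a JDK ships bin\javac.exe (the compiler); a JRE does not.
import os

java_home = os.environ.get("JAVA_HOME")
if not java_home:
    print("JAVA_HOME is not set")
elif os.path.isfile(os.path.join(java_home, "bin", "javac.exe")):
    print("JAVA_HOME points to a JDK:", java_home)
else:
    print("JAVA_HOME looks like a JRE (no bin\\javac.exe):", java_home)
```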
But then all the remaining services (Hadoop jobtracker, Hadoop datanode, etc.) stopped starting.
Checking the service logs, we found that all of them were still referring to the old JRE path, which no longer existed. Thinking that the environment variables might be cached by the services, we restarted the machine, to no avail. Exploring further, we saw that many XML files contained the original JAVA_HOME folder as part of the XML configuration, rather than picking up the environment variable dynamically at run time.
Finally we got all the Hadoop services on our master node running by modifying the following XML files -
- <hdp1.1 installation path>\hive-0.9.0\bin\metastore.xml
- <hdp1.1 installation path>\hive-0.9.0\bin\hwi.xml
- <hdp1.1 installation path>\hive-0.9.0\bin\hiveserver.xml
- <hdp1.1 installation path>\hive-0.9.0\bin\derbyserver.xml
- <hdp1.1 installation path>\hadoop-1.1.0-SNAPSHOT\bin\tasktracker.xml
- <hdp1.1 installation path>\hadoop-1.1.0-SNAPSHOT\bin\secondarynamenode.xml
- <hdp1.1 installation path>\hadoop-1.1.0-SNAPSHOT\bin\namenode.xml
- <hdp1.1 installation path>\hadoop-1.1.0-SNAPSHOT\bin\jobtracker.xml
- <hdp1.1 installation path>\hadoop-1.1.0-SNAPSHOT\bin\historyserver.xml
- <hdp1.1 installation path>\hadoop-1.1.0-SNAPSHOT\bin\datanode.xml
Similarly for the XML files under the templeton and sqoop directories.
I suggest using TextPad's search-in-subdirectories feature to find all the locations where the old JRE path is mentioned.
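If you prefer to script it, here is a minimal sketch of the same idea: walk the installation directory, flag any XML file that still references the stale Java path, and optionally rewrite it. The HDP root and the old/new Java paths below are placeholders for illustration; substitute the paths from your own setup.

```python
# Minimal sketch: find (and optionally rewrite) service XML files that still
# reference the old JRE path. The paths below are placeholders; adjust them
# to your HDP installation and Java directories.
import os

HDP_ROOT = r"C:\hdp"                              # placeholder install path
OLD_JAVA = r"C:\Program Files\Java\jre7"          # stale JRE path baked into the XMLs
NEW_JAVA = r"C:\Program Files\Java\jdk1.6.0_31"   # new JDK path

for dirpath, _, filenames in os.walk(HDP_ROOT):
    for name in filenames:
        if not name.lower().endswith(".xml"):
            continue
        path = os.path.join(dirpath, name)
        with open(path, "r", encoding="utf-8", errors="ignore") as f:
            text = f.read()
        if OLD_JAVA in text:
            print("stale Java path in:", path)
            # Uncomment to rewrite in place (back up the files first):
            # with open(path, "w", encoding="utf-8") as f:
            #     f.write(text.replace(OLD_JAVA, NEW_JAVA))
```

Running it in report-only mode first gives you the same list of files that a TextPad subdirectory search would, before you commit to any edits.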