Adding Solr via Ambari

Solr is an open source search engine service. The service is part of the Hortonworks Data Platform, but before it can be installed via Ambari, it has to be added to the list of services (Zeppelin Notebook went through the same process in HDP 2.4 – it had to be added manually before installing). Here is the link to the post about how to add Solr to the list of services.

Once the Solr service is available in the Add Services list, it can be installed. It looks like a simple process – click Next a few times, deploy, and that is it. Well, not really.

The following error message appears:

(Screenshot: error while installing)

The documentation states the following at the bottom:

If the Java preinstall check fails, the easiest remediation is to log in to each machine that Solr will be installed on, temporarily set the JAVA_HOME environment variable, and then use yum/zypper/apt-get to install the package. For example, on CentOS:

export JAVA_HOME=/usr/jdk64/jdk1.8.0_77
yum install lucidworks-hdpsearch

No Java was found on the DataNodes, so I installed it:

sudo add-apt-repository ppa:openjdk-r/ppa && sudo apt-get update && sudo apt-get install openjdk-8-jdk -y

Added JAVA_HOME to /etc/environment:

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
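Worth noting: adding the variable to a file like this only affects new login sessions; for the shell you are currently in, it still has to be exported by hand. A minimal sketch to set and verify it (the path is the OpenJDK 8 location used above):

```shell
# Set JAVA_HOME for the current shell session and confirm it took effect.
# The path is the OpenJDK 8 location used in this walkthrough.
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
echo "JAVA_HOME is set to: $JAVA_HOME"
```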

And I installed the packages on every DataNode:

cd /etc/apt/sources.list.d
sudo wget http://public-repo-1.hortonworks.com/HDP-SOLR-2.5-100/repos/ubuntu14/hdp-solr.list
sudo apt-get update -y
sudo apt-get install lucidworks-hdpsearch

Do not forget to do the same on the client nodes and the NameNode(s).
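The per-node steps above can also be batched from one machine over SSH. A rough sketch, assuming passwordless SSH and made-up hostnames – by default it only prints the commands (dry run); set DO_RUN=ssh to actually execute them:

```shell
#!/bin/sh
# Repeat the install steps on the remaining nodes over SSH.
# NODES is an assumption -- replace with your own hostnames.
NODES="t-client1 t-namenode1"
RUNNER="${DO_RUN:-echo}"   # default 'echo' = dry run; set DO_RUN=ssh to execute

for node in $NODES; do
  "$RUNNER" "$node" "export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64 && \
    cd /etc/apt/sources.list.d && \
    sudo wget http://public-repo-1.hortonworks.com/HDP-SOLR-2.5-100/repos/ubuntu14/hdp-solr.list && \
    sudo apt-get update -y && \
    sudo apt-get install -y lucidworks-hdpsearch"
done
```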

When the installation finishes, and if all went well, the following output is given.

Result on all nodes:

====
Package lucidworks-hdpsearch was installed
====

The NameNode challenge

The previous steps worked on all nodes except the NameNode! I figured out the cause by checking JAVA_HOME on each node:

echo $JAVA_HOME

The NameNode shows the following:

/usr/jdk64/jdk1.8.0_77

The DataNodes and the client nodes give me this:

/usr/lib/jvm/java-8-openjdk-amd64
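A quick way to see which JDK a node is actually on – not just JAVA_HOME, but what `java` on the PATH resolves to – is something along these lines, run on each node:

```shell
# Show both JAVA_HOME and the real path behind the 'java' binary,
# so a mismatch between the two is immediately visible.
echo "JAVA_HOME: ${JAVA_HOME:-<not set>}"
JAVA_BIN=$(command -v java || true)
if [ -n "$JAVA_BIN" ]; then
  echo "java on PATH resolves to: $(readlink -f "$JAVA_BIN")"
else
  echo "no java binary on PATH"
fi
```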

So I installed Java on the NameNode as described above (without removing any existing versions) and ran the following commands (same as above):

export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
cd /etc/apt/sources.list.d
sudo wget http://public-repo-1.hortonworks.com/HDP-SOLR-2.5-100/repos/ubuntu14/hdp-solr.list
sudo apt-get update -y
sudo apt-get install lucidworks-hdpsearch

The output was:

====
Package lucidworks-hdpsearch was installed
====

I went back to Ambari, deleted the Solr service that was in the Install Failed state, installed it again – and it was a success!

I have no explanation why (or even whether) the issue was the Java version.


NameNode hangs when restarting – can't leave safemode

I am using the Hortonworks distribution and Ambari for Hadoop administration. Sometimes HDFS has to be restarted, and sometimes the NameNode hangs in the process, giving the following output:

2016-04-21 06:12:47,391 - Retrying after 10 seconds. Reason: Execution of 'hdfs dfsadmin -fs hdfs://t-namenode1:8020 -safemode get | grep 'Safe mode is OFF'' returned 1.
2016-04-21 06:12:59,595 - Retrying after 10 seconds. Reason: Execution of 'hdfs dfsadmin -fs hdfs://t-namenode1:8020 -safemode get | grep 'Safe mode is OFF'' returned 1.
2016-04-21 06:13:11,737 - Retrying after 10 seconds. Reason: Execution of 'hdfs dfsadmin -fs hdfs://t-namenode1:8020 -safemode get | grep 'Safe mode is OFF'' returned 1.
2016-04-21 06:13:23,918 - Retrying after 10 seconds. Reason: Execution of 'hdfs dfsadmin -fs hdfs://t-namenode1:8020 -safemode get | grep 'Safe mode is OFF'' returned 1.
2016-04-21 06:13:36,101 - Retrying after 10 seconds. Reason: Execution of 'hdfs dfsadmin -fs hdfs://t-namenode1:8020 -safemode get | grep 'Safe mode is OFF'' returned 1.

To get out of this loop, I run the following command from the command line on the NameNode:

sudo -u hdfs hdfs dfsadmin -safemode leave

The output is the following:

Safe mode is OFF

If you have High Availability in the cluster, the output looks something like this:

Safe mode is OFF in t-namenode1/10.x.x.171:8020
Safe mode is OFF in t-namenode2/10.x.x.164:8020

After the command is executed, the NameNode restart process in Ambari continues.
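If forcing safemode off feels heavy-handed, there is a gentler variant: `hdfs dfsadmin -safemode wait` blocks until the NameNode leaves safemode on its own, which is appropriate when the DataNodes are still reporting blocks and you just need to wait it out. A guarded sketch:

```shell
# Block until the NameNode exits safemode by itself, instead of forcing
# it off with 'leave'. Guarded so the snippet degrades cleanly on a
# machine without an hdfs client installed.
if command -v hdfs >/dev/null 2>&1; then
  sudo -u hdfs hdfs dfsadmin -safemode wait
  RESULT="safemode wait finished"
else
  RESULT="hdfs client not found on this machine"
fi
echo "$RESULT"
```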