Sunday, 20 July 2014

Building Native Hadoop Libraries to Fix the VM Stack Guard Error on a 64-bit Machine

Sometimes the NodeManager may not start on a 64-bit machine. You will see a message similar to the following if you are running on a 64-bit server using the Apache distribution without modification:
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /home/hduser/bigdata/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/02/01 17:02:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable.
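
The usual cause is that the native libraries bundled with the Apache tarball are 32-bit builds that a 64-bit JVM cannot use. You can check this with the file command (the path below is the one from the warning above; adjust it to your installation):

> file /home/hduser/bigdata/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0

A 32-bit build reports something like "ELF 32-bit LSB shared object, Intel 80386", while a library that matches a 64-bit JVM should report "ELF 64-bit LSB shared object, x86-64".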


Solution:
To get started, you need to have the following setup:

  • Basic dev tools already installed: gcc, make
  • Java 1.6+ installed
  • Maven 3.1.1

If you start with this setup, you also need to install these components (Ubuntu users can use sudo apt-get install; see the example right after this list):

  • g++
  • cmake
  • zlib1g-dev
  • protobuf 2.5 (downloaded and built from source in the first step below)
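
On Ubuntu, for example, the first three can be installed in one command (this is just one way to get them; package names may differ on other distributions):

> sudo apt-get install g++ cmake zlib1g-dev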


First Step: Building protobuf 2.5
Building the native libraries requires protobuf 2.5, and you will need to download and build it yourself. You can get the download from here. Download version 2.5, which is the latest version as of this post.
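
Assuming you saved the 2.5.0 source archive as protobuf-2.5.0.tar.gz (the file name may differ depending on where you downloaded it from), extract it and change into the source directory:

> tar xzf protobuf-2.5.0.tar.gz
> cd protobuf-2.5.0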

To build protobuf, run these commands from the main protobuf directory:

> ./configure
> make
Once the build has finished, run this command to execute the unit tests and verify that protobuf was built successfully:

> make check
Look for this in the output:

==================
All 5 tests passed
==================
If you see this, then protobuf was built successfully, and you can move on to building the Hadoop libraries.
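
You can also sanity-check the freshly built compiler without installing it system-wide; the protoc binary is produced under src/ inside the protobuf directory:

> ./src/protoc --version
libprotoc 2.5.0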

Second Step:  Building the Hadoop Libraries
To build the Hadoop libraries, start with a Hadoop source distribution archive from here.
Extract the archive, then move into the hadoop-common-project/hadoop-common directory:

$ cd hadoop-common-project/hadoop-common
Before building, you need to define the location of the protoc binary inside the protobuf source tree you just built:

$ export HADOOP_PROTOC_PATH=[path to protobuf]/src/protoc
From this directory, use Maven to build the native code:

$ mvn compile -Pnative
Look for the typical Maven BUILD SUCCESS message to indicate that you have built the libraries properly:

[INFO] --------------------------------------------
[INFO] BUILD SUCCESS
[INFO] --------------------------------------------
Maven will generate the libraries in target/native/target/usr/local/lib.
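
Before copying anything over, it is worth confirming that the freshly built library really is a 64-bit object (the exact output of file varies slightly by platform):

$ file target/native/target/usr/local/lib/libhadoop.so.1.0.0

This should report something like "ELF 64-bit LSB shared object, x86-64" rather than the 32-bit variant shipped with the stock distribution.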

Final step:  Copying the libraries into Hadoop
Once the libraries are built, all you need to do is copy them to your Hadoop installation. If you have been following the instructions to set up a cluster on this site, that path is $HADOOP_HOME. Copy the files as the hduser user, since that user has permission to write to the Hadoop installation:

hduser> cp target/native/target/usr/local/lib/libhadoop.a $HADOOP_HOME/lib/native
hduser> cp target/native/target/usr/local/lib/libhadoop.so.1.0.0 $HADOOP_HOME/lib/native
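
As an optional check (assuming your Hadoop release ships the checknative command, which recent 2.x versions do), you can verify that Hadoop now picks up the native library before restarting the daemons:

hduser> $HADOOP_HOME/bin/hadoop checknative -a

The output should list hadoop: true followed by the path to the libhadoop.so.1.0.0 you just copied.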


Now restart (stop, then start) your Hadoop daemons. The NodeManager will come up along with the other services, and you can confirm it is running with jps. :)


6 comments:

  1. command for installing zlib :
    sudo apt-get install zlib1g-dev

  2. Thanks Suman for pointing that out, I have updated the zlib package name.

  3. In Hadoop 2.7.3 you need to apply a patch; I also used the following dependencies.

    sudo apt-get install cmake
    sudo apt-get install libc6-dev
    sudo apt-get install zlib1g-dev
    sudo apt-get install maven
    wget https://issues.apache.org/jira/secure/attachment/12570212/HADOOP-9320.patch
    patch < HADOOP-9320.patch

  4. Hi Kuntal,
    thank you VERY MUCH for your post because, although I don't use Hadoop anyway, I had a similar problem and error with the Network Security Services (NSS) library of Mozilla and a Java application used for online signing with the DNI-e.
    Your way of facing the problem showed me how to solve mine: recompiling the NSS library for 64-bit. I do not understand why these f25 packages come precompiled for 32 bits when my system is 64!!!!
    Once again, thank you for showing me the light.

  5. This comment has been removed by the author.

  6. Hi, when I tried to run the ./configure command in the protobuf-3.6.1 home directory, it showed this error:
    checking for strtol... no
    checking zlib version... headers missing or too old (requires 1.2.0.4)
    checking whether gcc is Clang... no
    checking whether pthreads work with -pthread... yes
    checking for joinable pthread attribute... PTHREAD_CREATE_JOINABLE
    checking whether more special flags are required for pthreads... no
    checking for PTHREAD_PRIO_INHERIT... yes
    checking the location of hash_map...
    configure: WARNING: could not find an STL hash_map
    checking for library containing sched_yield... no
    configure: error: in `/home/user/Desktop/protobuf-3.6.1':
    configure: error: sched_yield was not found on your system
    See `config.log' for more details

    and make is also not working.

    What do I have to do?
