Tag : hajime-osako

How to semi-automate deploying a dev HDP cluster

Purpose of this article:

When you install HDP for a dev/test environment, you tend to repeat the same commands to set up the host OS. To save time, I created a BASH script which helps set up the host OS (Ubuntu only) and a Docker image (CentOS).


What this script does:

  1. Installs packages on the Ubuntu host OS
  2. Sets up Docker, such as creating an image and spawning containers
  3. [Optional] Sets up a local repository for HDP (not Ambari) with Apache2


What this script does NOT do:

  1. As of this writing, it does not install HDP itself.
  2. Please use an Ambari Blueprint if you would like to automate the HDP installation as well.
  3. This setup is NOT for a production environment, but it is useful for testing HA components.
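An Ambari Blueprint, mentioned above, describes a whole cluster as JSON so the HDP installation itself can be automated. As a rough sketch only (the file path, host group names and component layout here are illustrative, not something this script produces):

```shell
# Write a minimal two-host-group blueprint for HDP 2.4
# (host group names and component placement are just an example).
cat > /tmp/dev-hdp-blueprint.json <<'EOF'
{
  "Blueprints": { "stack_name": "HDP", "stack_version": "2.4" },
  "host_groups": [
    { "name": "master", "cardinality": "1",
      "components": [ { "name": "NAMENODE" },
                      { "name": "RESOURCEMANAGER" },
                      { "name": "ZOOKEEPER_SERVER" } ] },
    { "name": "worker", "cardinality": "1+",
      "components": [ { "name": "DATANODE" },
                      { "name": "NODEMANAGER" } ] }
  ]
}
EOF
# With a running Ambari Server (default admin/admin login), it would be
# registered like this:
# curl -u admin:admin -H 'X-Requested-By: ambari' -X POST \
#   -d @/tmp/dev-hdp-blueprint.json \
#   http://localhost:8080/api/v1/blueprints/dev-hdp
```

After registering a blueprint, a second POST to /api/v1/clusters with a host mapping kicks off the actual installation; see the Ambari Blueprints documentation for the full flow.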

Host setup steps:


Step 1: Install Ubuntu 14.x LTS on your VirtualBox/VMware/Azure/AWS.

It should be easy to deploy an Ubuntu VM if you use Azure or AWS.
If you are using VirtualBox/VMware, you might want to keep a freshly installed Ubuntu VM as a template, so that you can clone it later.


Step 2: Log in to Ubuntu and become root (sudo -i)


Step 3: Download the script using the command below

wget https://raw.githubusercontent.com/hajimeo/samples/master/bash/start_hdp.sh -O ./start_hdp.sh && chmod u+x ./start_hdp.sh


Step 4: Start the script with Install mode

./start_hdp.sh -i


Step 5: Start the interview

The script will ask a few questions, such as your choice of guest OS, Ambari version, HDP version, etc. Normally the default values are fine, so you can just keep pressing the Enter key.
NOTE: At the end of the interview, it asks you to save your answers in a text file. You can reuse this file to skip the interview when you install a new cluster.


Step 6: Confirm your answers 

After saving your responses, it asks “Would you like to start setup this host? [Y]:”. If you answer yes, it starts setting up your Ubuntu host OS. After a while the script finishes, or stops if it hits an error.

How long this takes depends on your choices. If you selected setting up a local repo, downloading the repository may take a long time.


Step 7: Complete the setup

Once the script completes successfully, your chosen version of Ambari Server should be installed and running in the specified Docker container on port 8080.

NOTE: At the moment, the Docker containers are on a private network, so you need to do one of the following (“1” would be the easiest):

  1. Create a SOCKS proxy from your local PC on port 18080:

ssh -D 18080 username@ubuntu-hostname

  2. Forward localhost:8080 on your PC to node1:8080:

ssh -L 8080:node1.localdomain:8080 username@ubuntu-hostname

  3. Set up a proper proxy, such as Squid.

If you decide to set up a proxy, installing a browser add-on such as “SwitchySharp” is handy.
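Once a tunnel is up, you can sanity-check Ambari from your local PC without opening a browser. A small sketch, assuming the ssh -L forwarding above is active and Ambari's default admin/admin login:

```shell
# Ask Ambari's REST API for its HTTP status through the forwarded port.
status=$(curl -s -o /dev/null -w '%{http_code}' \
  -u admin:admin http://localhost:8080/api/v1/clusters 2>/dev/null)
[ -n "$status" ] || status=000   # curl missing or produced no output
# 200 means the tunnel and Ambari Server are both up;
# 000 means the connection could not be made at all.
echo "Ambari HTTP status: $status"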

  1. Once you have confirmed that the Ambari web interface works, proceed to install HDP.
    If you chose to set up a local HDP repository, replace “public-repo-1.hortonworks.com” with “dockerhost1.localdomain” (if you used the default value).
  2. The private key is /root/.ssh/id_rsa on every node.
  3. The remaining steps are the same as a normal HDP installation.
    NOTE: If you decided to install an older Ambari version, there is a known issue, AMBARI-8620.


Host startup step

If you shut down the VM, next time you can just run “./start_hdp.sh -s”, which starts up the containers, Ambari Server, Ambari Agents and the HDP services.
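To confirm the containers actually came back up after “./start_hdp.sh -s”, something like the following can help (it assumes the script's default node* container names, which may differ in your setup):

```shell
# List running containers whose names start with "node"
# (the script's default naming; adjust the pattern if you changed it).
if command -v docker >/dev/null 2>&1; then
  nodes=$(docker ps --format '{{.Names}}' | grep '^node' || true)
else
  nodes=""   # docker is not installed on this machine
fi
echo "running nodes: ${nodes:-none found}"
```

If a container is missing here, re-running “./start_hdp.sh -s” or checking “docker ps -a” for stopped containers is a reasonable next step.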


Did you like this article? Please feel free to send an email to info@crazyadmins.com if you have any further questions. Please don’t forget to like our Facebook page. Happy Hadooping!! :)



About US

crazyadmins.com is developed by Hadoop lovers to share knowledge about open source technologies, the latest Hadoop technologies and much more. If you think you are crazy about sharing your knowledge, please email your articles on any open source technology to crazyadmins.com.

email ID –  kuldeepkulkarni09@gmail.com

Like Our Facebook page – https://www.facebook.com/crazyhadoopers


  Kuldeep Kulkarni: Website founder and Contributor

     LinkedIn : https://in.linkedin.com/pub/kuldeep-kulkarni/23/167/707
     Facebook: https://www.facebook.com/cools1990
     HDP Certified Administrator
     Cloudera Certified Hadoop Administrator [ License number – 100-014-213 ]



In Kuldeep’s words…
Having a great passion for technologies and an interest in knowing all that is possible and achievable. At the same time, I believe in sharing that knowledge and experience with everyone. This is one such initiative to do so!
Crazy about Hadoop, I have been learning it since the beginning of my career. With 4+ years of experience, I am currently exploring administration of all Big Data technologies.
Also good at shell scripting, I strongly support automating every possible task to make implementation and management easier. Let's Hadoop, everyone! :)



  Arti Wadhwani: Contributor

   LinkedIn : https://in.linkedin.com/in/artiwadhwani
   Facebook: https://www.facebook.com/WadhwaniArti
   HDP Certified Administrator
   Cloudera Certified Hadoop Administrator [ License number – 100-014-210 ]



In Arti’s words…
Started as a fresher as part of a Technical Operations team. Got great opportunities to grow, and then started exploring more of Hadoop. It has been more than 4 years now and my love for Hadoop is only increasing! :) Currently working on core Big Data technologies and enjoying being a Hadooper! :-)



   Mayur Bhokase: Contributor

   LinkedIn : https://in.linkedin.com/pub/mayur-bhokase/76/14b/56
   Facebook: https://www.facebook.com/bhokasemayur



In Mayur’s words…
I have been a professional Java developer for the last 3 years at a well-known multinational IT company, known for my Jasper, Spring, AngularJS and Hibernate skills. I am always enthusiastic about sharing my domain knowledge with fellow techies.





   Tushar Bodhale: Contributor

   LinkedIn : https://in.linkedin.com/pub/tushar-bodhale/24/4b3/665
   Facebook: https://www.facebook.com/tusharbodhale


In Tushar’s words…
“Having 4+ years of experience in application design, development and maintenance, but there is still a lot to learn.
Technology has always been an integral part of my life, and hence I love to share my experience through blogs like this.
I like experimenting with new technologies by working as a freelancer.”

Believe: “When we know it, you’ll know it…”





  Mahendra Tonape: Contributor

   LinkedIn : https://www.linkedin.com/pub/mahendra-tonape/22/28b/641 
   Facebook: https://www.facebook.com/mahendra.tonape.9


In Mahendra’s words…
“I have been working as a Java developer for more than 3 years. I really feel Java is the base platform for so many of the latest technologies and new software developments, thanks to Java's great open source community, and I would really love to be part of that large community.”




  Hajime Osako: Contributor

   LinkedIn : https://www.linkedin.com/in/hajime-osako-a3280048 
   HDP Certified Administrator



In Hajime’s words…
“Started as a graphic designer, including 3D animation, in Japan. More than 10 years of experience as a web application developer with PHP, Python and Java. Also worked as a MySQL, SQL Server and PostgreSQL DBA, and as a FreeBSD and Ubuntu Server system administrator, but new to Hadoop and enjoying learning new things every day.”