Hadoop Operations – Past, Present, and Future

Thursday, October 11
4:50 PM - 5:30 PM

Hadoop operations started on-premises, driven primarily by Apache Ambari. However, the agility and flexibility of the cloud have drawn many Hadoop cluster operations to cloud and hybrid environments. The cloud enables many ephemeral, on-demand use cases, a game-changing opportunity for analytic workloads. But all of this comes with the challenge of running enterprise workloads in the cloud securely and with ease.

Apache Ambari is used by thousands of Hadoop operators to manage deployment, lifecycle, and DevOps automation for Hadoop ecosystem projects. Starting out, Apache Ambari installed a handful of Apache Hadoop ecosystem projects on a few operating systems and helped with the most basic Hadoop operational tasks. Today, the product manages over 20 different services, runs on multiple major operating systems and versions, and automates many of the most challenging Hadoop operational tasks in the most secure customer environments.

In this session, we will also take you through Cloudbreak as a solution that simplifies provisioning and managing enterprise workloads while providing an open, common experience for deploying workloads across clouds. We will discuss the challenges (and opportunities) of running enterprise workloads in the cloud and walk through a live demo of how the latest from Cloudbreak enables enterprises to run Apache Hadoop easily and securely. This includes a deep-dive discussion of Ambari Blueprints, recipes, custom images, and enabling Kerberos, all key capabilities for enterprise deployments.
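For readers unfamiliar with Ambari Blueprints: a blueprint is a JSON document that declares the stack version and which components run in each host group, so a cluster layout can be captured and replayed. The sketch below is a minimal illustrative example; the group names, cardinalities, and stack version are placeholders, not a recommended topology.

```json
{
  "Blueprints": {
    "stack_name": "HDP",
    "stack_version": "2.6"
  },
  "host_groups": [
    {
      "name": "master",
      "cardinality": "1",
      "components": [
        { "name": "NAMENODE" },
        { "name": "RESOURCEMANAGER" },
        { "name": "ZOOKEEPER_SERVER" }
      ]
    },
    {
      "name": "workers",
      "cardinality": "3",
      "components": [
        { "name": "DATANODE" },
        { "name": "NODEMANAGER" }
      ]
    }
  ]
}
```

A blueprint like this is registered with Ambari and then instantiated via a cluster creation template that maps hosts to the host groups; Cloudbreak builds on the same blueprint mechanism when provisioning clusters in the cloud.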

As part of this talk, we will walk you through what we've learned, the challenges we've overcome, and how the Apache Ambari and Cloudbreak communities have changed the products to handle them. The future is fast approaching, and with it come new on-premises and cloud deployment architectures. See how Apache Ambari and Cloudbreak are being re-imagined to handle these new challenges.


Santhosh B Gowda
Principal Software Engineer
Santhosh B Gowda is an Engineering Manager, QE, at Hortonworks. He takes care of the scale, resiliency, and performance aspects of Hortonworks products (i.e., HDP, HDF, and DataPlane). He has more than 13 years of IT experience and has focused on Hadoop for the last 2 years. He holds an M.S. in Software Systems from BITS, Pilani.