Course Overview
This course covers the essentials of deploying and managing an Apache™ Hadoop® cluster. The course is lab intensive: each participant builds their own Hadoop cluster using either the CDH (Cloudera's Distribution, including Apache Hadoop) or Hortonworks Data Platform stack. Core Hadoop services are explored in depth, with emphasis on troubleshooting and recovering from common cluster failures. The fundamentals of related services such as Ambari, ZooKeeper, Pig, Hive, HBase, Sqoop, Flume, and Oozie are also covered. The course is approximately 60% lecture and 40% labs.
CLASS INFORMATION
Price: $2,200
Duration: 3 days
Version: D03
Module 1: HADOOP OVERVIEW
- Data Analysis
- Big Data
- Origins of Hadoop
- Hadoop Marketplace
- Hadoop Core
- Hadoop Ecosystem
- Cluster Architecture
- Hardware/Software Requirements
- Running Commands on Multiple Systems
Module 2: HDFS
- Design Goals
- Design
- Blocks
- Block Replication
- NameNode Daemon
- Secondary NameNode Daemon
- DataNode Daemon
- Accessing HDFS
- Permissions and Users
- Adding and Removing DataNodes
- Balancing
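The block and replication topics above lend themselves to a quick back-of-the-envelope calculation. The sketch below is illustrative and not part of the course materials; it assumes the common HDFS defaults of a 128 MB block size (`dfs.blocksize`) and a replication factor of 3, both of which are configurable per cluster.

```python
# Illustrative sketch: how HDFS splits a file into fixed-size blocks,
# and how replication multiplies the raw DataNode storage consumed.
import math

BLOCK_SIZE = 128 * 1024 * 1024  # common dfs.blocksize default: 128 MB

def hdfs_blocks(file_size, block_size=BLOCK_SIZE):
    """Number of blocks a file of `file_size` bytes occupies."""
    return max(1, math.ceil(file_size / block_size))

def raw_storage(file_size, replication=3):
    """Bytes of raw DataNode storage consumed, counting all replicas."""
    return file_size * replication

one_gb = 1024 ** 3
print(hdfs_blocks(one_gb))             # a 1 GB file -> 8 blocks
print(raw_storage(one_gb) // one_gb)   # 3 GB of raw storage at replication 3
```

Numbers like these come up in the lab exercises on capacity planning: a cluster's usable space is roughly its raw space divided by the replication factor.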
Module 3: YARN
- YARN Design Goals
- YARN Architecture
- Resource Manager
- Node Manager
- Containers
- YARN: Other Important Features
- Slider
Module 4: MAPREDUCE
- MapReduce
- Terminology and Data Flow
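The map, shuffle/sort, and reduce data flow covered in this module can be sketched in plain Python using word count, the canonical example. This is a conceptual sketch only, not course material: a real job runs as YARN containers over data in HDFS.

```python
# Illustrative sketch of the MapReduce data flow: map -> shuffle -> reduce.
from collections import defaultdict

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in the input split."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    """Shuffle/sort: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts emitted for each word."""
    return {word: sum(counts) for word, counts in groups.items()}

counts = reduce_phase(shuffle(map_phase(["big data", "big cluster"])))
print(counts)  # {'big': 2, 'data': 1, 'cluster': 1}
```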
Module 5: INSTALLING HADOOP WITH AMBARI
LAB TASKS
- CDH Uninstall
- Installing Hadoop with Ambari
- Tez
Module 6: DATA INGESTION
- Sqoop
- Flume
- Kafka
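Conceptually, a Sqoop import reads table rows over JDBC and writes them to HDFS as delimited text files. The rough analogue below is illustrative only and is not real Sqoop: `sqlite3` stands in for the JDBC source, and a list of strings stands in for the output file lines.

```python
# Illustrative analogue of a Sqoop import: rows read over a database
# connection, written out as one comma-separated line per row (the
# default layout Sqoop produces in HDFS).
import sqlite3

# A toy table standing in for the relational source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO employees VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

# "Import": serialize each row as delimited text.
lines = [",".join(str(col) for col in row)
         for row in conn.execute("SELECT id, name FROM employees ORDER BY id")]
print("\n".join(lines))
# 1,alice
# 2,bob
```

The real tool is a single CLI call, e.g. `sqoop import --connect <jdbc-url> --table employees`, which launches parallel map tasks to do the reading and writing.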
Module 7: DATA LINEAGE AND GOVERNANCE
- Falcon
- Atlas
- Oozie
Module 8: DATA PROCESSING FRAMEWORKS
- The Bane of MapReduce
- Tez Overview
- Pig
- Hive
- Spark
- Storm
- Solr
Module 9: NOSQL IMPLEMENTATIONS
- HBase
- Phoenix
Module 10: CLUSTER MANAGEMENT
- Ambari Metrics System (AMS)
- ZooKeeper
Module 1 LAB TASKS
- Running Commands on Multiple Hosts
- Preparing to Install Hadoop
Module 2 LAB TASKS
- Single-Node HDFS
- Multi-Node HDFS
- Files and HDFS
- Managing and Maintaining HDFS
Module 3 LAB TASKS
- YARN
Module 4 LAB TASKS
- MapReduce
Module 6 LAB TASKS
- Sqoop
Module 8 LAB TASKS
- Pig
Qualified participants should be comfortable with Linux commands and have some systems administration experience; no previous Hadoop experience is required.