
Udemy - Flume and Sqoop for Ingesting Big Data



Size: 495.80 MB
Peers: Seeders: 0 / Leechers: 0
Added: 5 years ago » by tutsgalaxy » in Tutorials
Language: English
Last Updated: 7 months ago
Info_Hash: AC5B73E3CB2BB825C6C748F8BCDF2CA05D1CBCBA

Torrent File Contents

Udemy - Flume and Sqoop for Ingesting Big Data
  Flume and Sqoop for Ingesting Big Data.zip
  -  495.8 MB

  Downloaded from TutsGalaxy.com.txt
  -  73 Bytes

  Download more courses.url
  -  123 Bytes

  TutsGalaxy.com.txt
  -  53 Bytes



Torrent Description


Taught by a team that includes two Stanford-educated ex-Googlers, with decades of combined practical experience working with Java and with billions of rows of data.

Use Flume and Sqoop to import data into HDFS, HBase, and Hive from a variety of sources, including Twitter and MySQL.

Let’s parse that.

Import data: Flume and Sqoop play a special role in the Hadoop ecosystem. They transport data from sources that hold or produce it, such as local file systems, HTTP endpoints, MySQL, and Twitter, to data stores like HDFS, HBase, and Hive. Both tools come with built-in functionality and abstract away the complexity of moving data between these systems.

Flume: Flume Agents can transport data produced by a streaming application to data stores like HDFS and HBase.
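
To make that concrete, here is a minimal sketch of a Flume agent configuration that watches a spooling directory and writes to HDFS. The agent name (agent1), the directory, and the HDFS path are hypothetical placeholders; the property keys follow standard Flume 1.x syntax.

  # flume-agent.properties -- minimal sketch; names and paths are placeholders
  agent1.sources  = src1
  agent1.channels = ch1
  agent1.sinks    = sink1

  # Source: pick up files dropped into a local spooling directory
  agent1.sources.src1.type     = spooldir
  agent1.sources.src1.spoolDir = /var/flume/incoming
  agent1.sources.src1.channels = ch1

  # Channel: buffer events in memory between source and sink
  agent1.channels.ch1.type     = memory
  agent1.channels.ch1.capacity = 10000

  # Sink: write events into HDFS, bucketed by date
  agent1.sinks.sink1.type                   = hdfs
  agent1.sinks.sink1.hdfs.path              = hdfs://namenode:8020/flume/events/%Y-%m-%d
  agent1.sinks.sink1.hdfs.fileType          = DataStream
  agent1.sinks.sink1.hdfs.useLocalTimeStamp = true
  agent1.sinks.sink1.channel                = ch1

An agent configured this way is started with flume-ng agent --conf-file flume-agent.properties --name agent1.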

Sqoop: Use Sqoop to bulk import data from traditional RDBMS to Hadoop storage architectures like HDFS or Hive.
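
As an illustration, a bulk import from MySQL into HDFS might look like the following. The connection string, credentials, table, and target directory are hypothetical; the flags themselves are standard Sqoop.

  sqoop import \
    --connect jdbc:mysql://dbhost:3306/sales \
    --username reporter \
    --password-file /user/hadoop/.db-password \
    --table customers \
    --target-dir /user/hadoop/customers \
    --num-mappers 4

Sqoop splits the table across the requested number of mappers (by default on the primary key) and writes one HDFS output file per mapper.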

What’s Covered:

Practical implementations for a variety of sources and data stores ..

  Sources: Twitter, MySQL, Spooling Directory, HTTP
  Sinks: HDFS, HBase, Hive

.. Flume features:

Flume Agents, Flume Events, Event bucketing, Channel selectors, Interceptors
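
For example, interceptors and channel selectors are both configured on the source. The sketch below (hypothetical component names, standard Flume 1.x keys) attaches a timestamp interceptor and uses a multiplexing channel selector to route events by the value of a header.

  # The source must list every channel the selector can route to
  agent1.sources.src1.channels = ch1 ch2

  # Interceptor: stamp each event's headers with the current time
  agent1.sources.src1.interceptors         = ts
  agent1.sources.src1.interceptors.ts.type = timestamp

  # Channel selector: route events on the value of the "region" header
  agent1.sources.src1.selector.type       = multiplexing
  agent1.sources.src1.selector.header     = region
  agent1.sources.src1.selector.mapping.us = ch1
  agent1.sources.src1.selector.default    = ch2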

.. Sqoop features:

Sqoop import from MySQL, Incremental imports using Sqoop Jobs
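
Incremental imports are usually wrapped in a saved Sqoop job, so that Sqoop records the last imported value and resumes from it on the next run. A sketch, with hypothetical names and standard flags:

  sqoop job --create daily_customers -- import \
    --connect jdbc:mysql://dbhost:3306/sales \
    --username reporter \
    --password-file /user/hadoop/.db-password \
    --table customers \
    --target-dir /user/hadoop/customers \
    --incremental append \
    --check-column id \
    --last-value 0

  # Re-run the job; Sqoop substitutes the stored last-value automatically
  sqoop job --exec daily_customers
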
Who this course is for:

  Yep! Engineers building an application with HDFS/HBase/Hive as the data store
  Yep! Engineers who want to port data from legacy data stores to HDFS

Requirements

  Knowledge of HDFS is a prerequisite for the course
  HBase and Hive examples assume basic understanding of HBase and Hive shells
  A working HDFS installation is required to run most of the examples

Last updated 10/2016