F14 - Using Event Publishing for Near Real Time, Hadoop Data Ingestion

Session Number: 5150
Track: Ala Carte - I
Session Type: Podium Presentation
Primary Presenter: Peter Corran [Carefirst]
Time: May 02, 2018 (04:55 PM - 05:55 PM)
Room: Congress A

Session Code: F14
Speaker Bio: Pete has been a DBA for the last 20 years, starting in 1997 with a Sybase trouble-reporting system for Verizon. He helped engineer the conversion of that system to Oracle, later became involved with DB2 LUW data warehouses running DPF, and has worked exclusively with DB2 LUW ever since.

High availability and disaster recovery have always been a major focus of his work.

Pete lives in Silver Spring, Maryland with his wife of 29 years, Ann, and has two grown children. He is an avid golfer, tennis player, and pretend baseball player.

Audience experience level: Intermediate
Presentation Category: Information Integration
Presentation Platform: DB2 for Linux, UNIX, Windows
Technical areas this presentation will apply to: Data Warehousing and Business Intelligence
Objective 1: Learn about the DB2 and MQ tools available to ingest data directly into a Hadoop warehouse.

Abstract: As more and more organizations explore the potential cost savings of running their analytics processing on Hadoop, the question arises: how can data be ingested into Hadoop from legacy OLTP systems (DB2 for z/OS or LUW) in near real time (NRT)?

This presentation will show how to exploit the Java Message Service (JMS) facilities of IBM WebSphere MQ, combined with Q Replication's Event Publishing, to seamlessly present changed data for Hadoop NRT ingestion.
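
In outline, the data flows like this (the Flume channel and HDFS sink are standard Flume components, assumed here rather than named in the session):

    DB2 recovery log -> Q Capture (Event Publishing) -> XML (or delimited) messages on an MQ send queue
        -> Flume JMS source -> Flume channel -> HDFS sink (Hadoop)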

Specific topics include:
1) Using the IBM WebSphere MQ JMSAdmin utility to generate the .bindings file needed by Hadoop's "Flume" component (a JMSAdmin sketch follows this list)
2) Setting up the basic underlying MQ infrastructure required (an MQSC sketch follows)
3) Setting up the basic Event Publishing infrastructure necessary (covered, together with topic 4, in the ASNCLP sketch below)
4) Creating the Q Replication publications through Replication Center and scripting
5) Monitoring message flow-through (example monitoring commands follow)
6) A very basic overview of the Hadoop/Flume side of things (a sample Flume agent configuration follows)
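
A rough sketch of the JMSAdmin step from topic 1. All names below (the JNDI directory /var/mqm/jndi, queue manager QM1, host mqhost, and the object names hadoopQCF and hadoopQ) are illustrative assumptions, not values from the session. JMSAdmin takes its JNDI context from JMSAdmin.config:

    # JMSAdmin.config: use the file-system JNDI context
    INITIAL_CONTEXT_FACTORY=com.sun.jndi.fscontext.RefFSContextFactory
    PROVIDER_URL=file:///var/mqm/jndi

and then serializes the administered objects it defines into a .bindings file under that directory:

    DEF QCF(hadoopQCF) QMGR(QM1) TRANSPORT(CLIENT) HOSTNAME(mqhost) PORT(1414) CHANNEL(SYSTEM.DEF.SVRCONN)
    DEF Q(hadoopQ) QMGR(QM1) QUEUE(ASN.QM1.DATAQ)
    END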
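For topic 2, a minimal MQSC sketch of the queues Event Publishing needs on the queue manager: a restart queue and an administration queue for Q Capture, plus a send queue that carries the published messages. Queue names and limits are illustrative.

    DEFINE QLOCAL(ASN.QM1.RESTARTQ) DESCR('Q Capture restart queue')
    DEFINE QLOCAL(ASN.QM1.ADMINQ) DESCR('Q Capture administration queue')
    DEFINE QLOCAL(ASN.QM1.DATAQ) DESCR('Send queue for published messages') +
           MAXDEPTH(500000) MAXMSGL(4194304)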
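Topics 3 and 4 can both be scripted with ASNCLP. The sketch below assumes a source database SAMPLE, capture schema ASN, the queue names from the MQSC sketch, and a source table DB2INST1.EMPLOYEE; exact option syntax varies by DB2 version, so review the generated SQL before running it.

    ASNCLP SESSION SET TO Q REPLICATION;
    SET RUN SCRIPT NOW STOP ON SQL ERROR ON;
    SET SERVER CAPTURE TO DB SAMPLE;
    SET CAPTURE SCHEMA SOURCE ASN;
    SET QMANAGER QM1 FOR CAPTURE SCHEMA;

    # Event Publishing infrastructure: Q Capture control tables and a publishing queue map
    CREATE CONTROL TABLES FOR CAPTURE SERVER USING RESTARTQ "ASN.QM1.RESTARTQ" ADMINQ "ASN.QM1.ADMINQ";
    CREATE PUBQMAP PUBQMAP1 USING SENDQ "ASN.QM1.DATAQ" MESSAGE FORMAT XML MESSAGE CONTENT TYPE T;

    # One publication per source table
    CREATE PUB USING PUBQMAP PUBQMAP1 (PUBNAME "EMP_PUB" DB2INST1.EMPLOYEE ALL CHANGED ROWS N BEFORE VALUES N);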
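For topic 5, two quick flow-through checks, using the names assumed above: the current depth of the send queue from runmqsc, and Q Capture status and latency from asnqccmd (the same counters are also available in the ASN.IBMQREP_CAPMON control table for SQL access).

    $ echo "DIS QL(ASN.QM1.DATAQ) CURDEPTH" | runmqsc QM1
    $ asnqccmd capture_server=SAMPLE capture_schema=ASN status show details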
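For topic 6, Flume's built-in JMS source can read the published messages straight off MQ via the .bindings file produced in the JMSAdmin step; the IBM MQ client jars must be on Flume's classpath. The agent layout below (memory channel, HDFS sink, and all path names) is an illustrative assumption.

    a1.sources = mq
    a1.channels = mem
    a1.sinks = hdfs

    # JMS source: looks up hadoopQCF and hadoopQ in the .bindings file
    a1.sources.mq.type = jms
    a1.sources.mq.initialContextFactory = com.sun.jndi.fscontext.RefFSContextFactory
    a1.sources.mq.providerURL = file:///var/mqm/jndi
    a1.sources.mq.connectionFactory = hadoopQCF
    a1.sources.mq.destinationName = hadoopQ
    a1.sources.mq.destinationType = QUEUE
    a1.sources.mq.channels = mem

    a1.channels.mem.type = memory

    # HDFS sink: lands the messages in the Hadoop warehouse
    a1.sinks.hdfs.type = hdfs
    a1.sinks.hdfs.hdfs.path = hdfs://namenode/landing/db2
    a1.sinks.hdfs.channel = mem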
