
Developers, Heads Up: Register For Data Engineering Workshop By Google Cloud, Qubole & AIM


Analytics India Magazine, in collaboration with Google Cloud and Qubole, is organising a workshop for developers who work extensively on data analytics platforms. This is a great opportunity for developers and data engineers looking to gain hands-on experience in leveraging Google Cloud Platform and building end-to-end solutions.

This workshop will be held in Mumbai on 25 January and in Gurugram on 1 February.

Click here to register.



Mumbai venue:

Gurugram venue:

Selected participants will receive a confirmation email, and only they will be allowed to enter the workshop premises. Walk-in, on-the-day, or on-the-spot registrations will not be permitted.

The last date for registration is:

  • 24th January 2020, 11 AM, for the Mumbai Workshop, and
  • 31st January 2020, 11 AM, for the Gurugram Workshop.

Workshop Details At A Glance

10:00 AM – 10:30 AM  Registration

10:30 AM – 11:00 AM  Workshop Begins: Introductions and Workshop Overview

11:00 AM – 12:30 PM  Hands-On Session on Building Pipelines for ML

12:30 PM – 1:00 PM  Q&A Session

1:00 PM Onwards  Networking + Lunch


Register for the workshop here.

Who Should Attend This Workshop?

This workshop will benefit those participants who:

  • want to learn how to acquire and transform data sets for data science and analytics
  • want to learn how to make data sets available to different users and fully leverage a GCP data lake throughout their organisation
  • want access to a pre-configured Qubole environment loaded with data sets and the appropriate tools, including Apache Spark and Airflow, as well as interactive notebooks
  • want to build an end-to-end solution that addresses common business scenarios

Tools And Techniques That Will Be Used

Participants will get hands-on experience with the following tools and techniques at the workshop:

Create Metadata on Data

  • Metastore
  • DDLs

Transformation/Cleaning/Denormalization

  • Hive/Spark Jobs (see the Spark sketch after this list)
  • Airflow/Scheduler
  • BigQuery Integration
  • REST APIs
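
For readers unfamiliar with these tools, here is a minimal, hypothetical sketch of the kind of Spark cleaning and denormalisation job this module covers. It assumes PySpark; the GCS bucket, file paths and column names are placeholders, not workshop material.

```python
# Hypothetical PySpark cleaning/denormalisation job.
# All bucket names, paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("workshop-cleaning-sketch").getOrCreate()

# Read raw data from a GCS data lake (placeholder locations).
orders = spark.read.json("gs://example-bucket/raw/orders/")
customers = spark.read.parquet("gs://example-bucket/raw/customers/")

# Basic cleaning: drop duplicates, fill missing amounts, normalise timestamps.
orders_clean = (
    orders.dropDuplicates(["order_id"])
          .withColumn("amount", F.coalesce(F.col("amount"), F.lit(0.0)))
          .withColumn("order_ts", F.to_timestamp("order_ts"))
)

# Denormalise: join customer attributes onto each order.
denorm = orders_clean.join(customers, on="customer_id", how="left")

# Write the curated table back to the lake in a columnar format.
denorm.write.mode("overwrite").parquet("gs://example-bucket/curated/orders_denorm/")
```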

Data Analysis

  • Demonstration with examples: Autoscaling and Preemptible VMs
  • Presto queries
  • Qu Workbench

Notebooks Demo

  • Zeppelin
  • Package Management

Key Takeaways

By the end of this workshop, participants are expected to know about:

  • Ingesting data to and from a Google Cloud Storage (GCS) data lake.
  • Performing interactive data analysis and building AI/ML models using Spark or custom Python packages.
  • Transforming data sets with Spark and building interactive dashboards.
  • Seamlessly interacting with other data sources such as BigQuery through SQL Workbench.
  • Deploying an end-to-end data pipeline using Apache Airflow (a minimal sketch of such a pipeline follows this list).
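
To illustrate the last takeaway, here is a minimal, hypothetical Apache Airflow DAG (Airflow 2.x-style imports) that chains an ingestion step, the Spark transformation sketched earlier, and a BigQuery load. The commands, bucket, dataset and table names are assumptions for the sketch, not workshop material.

```python
# Hypothetical end-to-end pipeline DAG in Apache Airflow (2.x-style imports).
# Operator choice, schedule and shell commands are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="workshop_pipeline_sketch",
    start_date=datetime(2020, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:

    # Step 1: pull raw exports into the GCS data lake (placeholder command).
    ingest = BashOperator(
        task_id="ingest_to_gcs",
        bash_command="gsutil cp /data/exports/*.json gs://example-bucket/raw/orders/",
    )

    # Step 2: run the Spark cleaning job sketched earlier (placeholder command).
    transform = BashOperator(
        task_id="spark_transform",
        bash_command="spark-submit clean_orders.py",
    )

    # Step 3: load the curated table into BigQuery for analysis (placeholder command).
    load_bq = BashOperator(
        task_id="load_bigquery",
        bash_command=(
            "bq load --source_format=PARQUET analytics.orders_denorm "
            "gs://example-bucket/curated/orders_denorm/*.parquet"
        ),
    )

    # Run the steps in order: ingest, then transform, then load.
    ingest >> transform >> load_bq
```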

Register for the workshop here.



