noman-xg/GCP


s204

Implementation of ETL pipeline on GCP Infrastructure

In this cloud specialization module we'll implement an ETL (Extract, Transform, Load) pipeline as a Google Cloud Function written in Go. We'll use Terraform as the IaC tool to provision the following GCP resources for our ETL pipeline.

  • Google Cloud Storage
  • Google Cloud Functions
  • Google Cloud Scheduler
  • Google BigQuery
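
As a rough sketch, a Terraform configuration wiring these resources together might look like the following. This is illustrative only — the resource arguments, runtime, entry point, and schedule are assumptions, not the module's actual variables.tf/main.tf:

```hcl
# Illustrative sketch — attributes and values are assumptions,
# not the module's actual configuration.
resource "google_storage_bucket" "etl_bucket" {
  name     = "for_s204_xgrid"
  location = "US-EAST1"
}

resource "google_cloudfunctions_function" "etl_function" {
  name                  = "etl_function"
  runtime               = "go116"      # assumed Go runtime version
  entry_point           = "ETL"        # assumed exported Go function name
  trigger_http          = true
  source_archive_bucket = google_storage_bucket.etl_bucket.name
  source_archive_object = "ETL.zip"
}

resource "google_cloud_scheduler_job" "invoke_etl" {
  name     = "invoke_ETL"
  schedule = "*/10 * * * *"            # assumed cadence
  http_target {
    uri = google_cloudfunctions_function.etl_function.https_trigger_url
  }
}

resource "google_bigquery_dataset" "covid" {
  dataset_id = "covidDataset"
}
```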

Features

  • Extract 100 CSV records per function invocation.
  • Transform the records that meet the transformation criteria.
  • Insert the transformed records into the BigQuery table.
  • Invoke the function periodically using Google Cloud Scheduler.
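
The extract-and-transform step can be sketched in Go. The filter below (drop any row with an empty field) is an assumed stand-in for the module's actual transformation criteria, and the sample CSV data is made up:

```go
package main

import (
	"encoding/csv"
	"fmt"
	"strings"
)

// filterRecords keeps only rows whose fields are all non-empty — an
// assumed stand-in for the real (unspecified) transformation criteria.
func filterRecords(rows [][]string) [][]string {
	var kept [][]string
	for _, row := range rows {
		ok := true
		for _, f := range row {
			if strings.TrimSpace(f) == "" {
				ok = false
				break
			}
		}
		if ok {
			kept = append(kept, row)
		}
	}
	return kept
}

func main() {
	// Made-up sample data standing in for the downloaded COVID CSV.
	data := "date,region,cases\n2021-01-01,US,100\n2021-01-02,,50\n"
	rows, err := csv.NewReader(strings.NewReader(data)).ReadAll()
	if err != nil {
		panic(err)
	}
	kept := filterRecords(rows[1:]) // skip the header row
	fmt.Printf("kept %d of %d records\n", len(kept), len(rows)-1)
	// prints: kept 1 of 2 records
}
```

In the deployed function the kept rows would then be streamed into the BigQuery table rather than printed.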

Step-by-Step Guide

1 - Clone the xldp repo to your system and navigate to the directory relevant to this module.

git clone https://github.com/X-CBG/xldp.git
cd xldp/cloud_specializations/s204/noman

2 - Download the CSV file from this link.

3 - Navigate to the CloudFunction directory and create a zip archive (ETL.zip) of the cloud function source code files.

cd CloudFunction/
zip ETL.zip main.go go.mod && cd ..

4 - Navigate to the Terraform directory and initialize Terraform in it.

cd Terraform && terraform init

5 - Open the directory in an editor of your choice (for example, VS Code) and update the values of the sourcepath and csv_local_path
variables in variables.tf to the absolute paths of ETL.zip and the CSV file on your system. Further, update the value of project in main.tf to the project ID of your GCP project.
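
The variable blocks you are editing might be shaped roughly like this — an illustrative config fragment, not the module's actual variables.tf; check the real file for its defaults and descriptions:

```hcl
# Illustrative shape only — replace the defaults with your own
# absolute paths before running terraform apply.
variable "sourcepath" {
  type    = string
  default = "/absolute/path/to/ETL.zip"
}

variable "csv_local_path" {
  type    = string
  default = "/absolute/path/to/covid-csv-s204.csv"
}
```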

6 - Authenticate your gcloud CLI with GCP.

gcloud auth login
# A window opens in your default browser; approve the requested access
# and the gcloud CLI is authenticated with GCP automatically.

7 - Apply the Terraform configuration to set up the infrastructure resources.

terraform apply --auto-approve

GCP Resources

Log in to your GCP account and verify that the following resources, with the names shown below, have been created inside your project.

| Resource              | Name               |
| --------------------- | ------------------ |
| Cloud Storage Bucket  | for_s204_xgrid     |
| Cloud Storage Object  | covid-csv-s204.csv |
| Cloud Function        | etl_function       |
| Cloud Scheduler       | invoke_ETL         |
| BigQuery Dataset      | covidDataset       |
| BigQuery Table        | covid-table        |

Verify

You may verify the functionality by manually triggering the cloud function's HTTP endpoint from your browser.

https://us-east1-your-projectID.cloudfunctions.net/etl_function/?a=your-projectID&b=covidDataset&c=covid-table

Note: Replace "your-projectID" with the project ID of your project in your GCP account.

You can now verify the functionality of the ETL pipeline by previewing the BigQuery table in the dataset: after the first invocation it will contain the first 100 records from the CSV file.
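
For a scripted check, you can build the trigger URL in the shell and invoke it with curl. The project ID below is a placeholder you must replace with your own:

```shell
# Placeholder project ID — replace with your own before running.
PROJECT_ID="my-sample-project"

# Build the trigger URL (query parameters joined with a single '&').
URL="https://us-east1-${PROJECT_ID}.cloudfunctions.net/etl_function/?a=${PROJECT_ID}&b=covidDataset&c=covid-table"
echo "$URL"

# Uncomment to actually invoke the function:
# curl -sS "$URL"
```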

