Data Engineer - GCP
Location: Downtown Toronto - Hybrid model (1 day a week in office; no specific days)
Contract Duration: 3 months, with possible extension
Project: International Banking Salesforce effectiveness - support the migration of IB Commercial Banking data to Google Cloud Platform (GCP) and to CB Commercial Banking's Salesforce Financial Services Cloud (FSC) instance, creating a global commercial Salesforce org. A total of 10 people are working on this project.
Typical Day in Role:
- Design, develop, and maintain robust data pipelines for the ingestion, transformation, and distribution of large datasets.
- Utilize services and tools to automate data workflows and streamline the data engineering process.
- Collaborate with stakeholders and product managers to analyze requirements and build data mappings, data models, and reports.
- Monitor application and pipeline performance.
- Conduct data quality checks.
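
To make the pipeline and data-quality responsibilities above concrete, here is a minimal Python sketch against BigQuery (the GCP warehouse); the project, dataset, and column names are hypothetical placeholders, not the actual IB Commercial Banking schema.

from google.cloud import bigquery  # assumes google-cloud-bigquery is installed

# Hypothetical names, for illustration only.
PROJECT = "my-gcp-project"
SOURCE = f"{PROJECT}.raw.accounts"
TARGET = f"{PROJECT}.curated.accounts"

client = bigquery.Client(project=PROJECT)

# Transformation: materialize a cleaned copy of the raw table in the curated layer.
client.query(f"""
    CREATE OR REPLACE TABLE `{TARGET}` AS
    SELECT
      account_id,
      TRIM(account_name) AS account_name,
      DATE(created_at)   AS created_date
    FROM `{SOURCE}`
    WHERE account_id IS NOT NULL
""").result()  # blocks until the query job finishes

# Data quality check: fail the run if the key column contains duplicates.
rows = list(client.query(f"""
    SELECT COUNT(*) AS dupes FROM (
      SELECT account_id FROM `{TARGET}`
      GROUP BY account_id HAVING COUNT(*) > 1
    )
""").result())
if rows[0].dupes:
    raise ValueError(f"Data quality check failed: {rows[0].dupes} duplicate account_id values")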
Candidate Requirements / Must-Have Skills:
- 10 years of experience with data warehouses/data platforms
- 5 years of experience creating ELT data pipelines from scratch, working with structured, semi-structured, and unstructured data and SQL (a short ingestion sketch follows this list)
- 2 years of experience configuring and using data ingestion tools such as Fivetran, Qlik, Airbyte, or similar
- 3 years of experience with cloud platforms, specifically GCP
- 5 years of experience working as a data developer, data engineer, or programmer on ETL/ELT processes for data integration
- 5 years of experience with continuous integration and continuous deployment (CI/CD) pipelines, source control systems such as GitHub and Bitbucket, and infrastructure-as-code tools such as Terraform
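
As a concrete illustration of the from-scratch ELT and semi-structured-data experience listed above, the Python sketch below loads newline-delimited JSON into a BigQuery staging table; the table ID and sample records are assumptions for illustration, not project specifics.

import io
from google.cloud import bigquery  # assumes google-cloud-bigquery is installed

# Hypothetical staging table; real dataset names would come from the project.
TABLE_ID = "my-gcp-project.staging.salesforce_accounts"

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # infer the schema from the semi-structured payload
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# In practice this payload would stream from an extract (e.g., a Salesforce export).
payload = io.BytesIO(
    b'{"account_id": "001", "account_name": "Acme"}\n'
    b'{"account_id": "002", "account_name": "Globex"}\n'
)

load_job = client.load_table_from_file(payload, TABLE_ID, job_config=job_config)
load_job.result()  # wait for the load job to complete
print(f"Loaded {client.get_table(TABLE_ID).num_rows} rows into {TABLE_ID}")

In a real pipeline, a step like this would typically run from a CI/CD-deployed job rather than an ad hoc script.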
Nice-To-Have Skills:
- Experience in data modelling, manipulating large data sets with raw SQL, and applying other data cleaning techniques
- Python
- dbt
Education & Certificates:
Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required, or equivalent experience.