About the Role
Duties & Responsibilities:
- Design and build a scalable, cloud-native data platform aligned with microservices.
- Develop real-time and batch data pipelines to power data-driven products.
- Implement SQL, NoSQL, and hybrid storage strategies for diverse business needs.
- Enable self-serve data access with secure, well-documented data APIs.
- Collaborate with Product & Business teams to define and optimize data products.
- Ensure data quality, lineage, and governance in all data pipelines and products.
- Build event-driven architectures using Kafka, Azure Event Hub, or Service Bus.
- Develop scalable ETL/ELT processes for ingestion, transformation, and distribution.
- Optimize query performance, indexing, and caching for data-intensive apps.
- Enforce data privacy, security, and access controls aligned with compliance standards.
- Implement observability and monitoring for data infrastructure and pipelines.
- Work with DevSecOps teams to integrate security into CI/CD workflows.
Qualifications: Must Haves
- 5 years of experience in data engineering, with exposure to Data Mesh and Data as a Product preferred.
- Expertise in modern data storage and processing, including relational databases (e.g. PostgreSQL), NoSQL stores (e.g. Cosmos DB), and data lakes (Azure Data Lake, Delta Lake, Apache Iceberg).
- Proficiency in ETL and data processing frameworks (e.g. Apache Kafka, Airflow, Flink, Spark, Azure Data Factory, Databricks).
- Experience with event-driven architectures using queues and pub/sub services (e.g. Azure Service Bus, Azure Event Grid, Amazon EventBridge) and containerized environments (Azure Container Apps, AWS ECS).
- Experience with Apache and/or Azure data platforms or similar, e.g. Fabric, Databricks, Snowflake, and Apache Hudi.
- Strong API development skills using GraphQL, REST, and/or gRPC for enabling data as a product.
- Proficiency in Go, Java, and/or Python.
- Deep understanding of data governance, security, lineage, and compliance using Microsoft Purview, OpenLineage, Apache Ranger, or Azure Key Vault.
- Experience with Infrastructure as Code (IaC) using Bicep, Terraform, or CloudFormation for managing cloud-based data solutions.
- Strong problem-solving and collaboration skills, working across data, engineering, and business teams.
Nice to Have
- Knowledge of ML, AI, and LLMs, including data engineering for model training and inference with Azure Machine Learning, Databricks ML, and MLflow.
- Hands-on experience with notebooks (e.g. Jupyter, Databricks, Azure Synapse) for data workflows.
- Experience in real-time data streaming architectures.
- Exposure to data monetization strategies and analytics frameworks.
- Familiarity with federated data governance and self-serve data platforms.
What We Offer
- Building a diverse and inclusive team, supporting career growth and development.
- Competitive base salaries.
- Share Appreciation Rights program for salaried employees.
- Paid vacation days and sick days.
- An employee charitable donation program.
- Hotel and travel discounts.
- Comprehensive benefits package including extended health, vision, dental, Health Spending Account, TeleDoc, Employee Assistance Program, Life, Long-term Disability, AD&D, and Critical Illness Insurance.
- Calgary's Head Office: located in beautiful Eau Claire in downtown Calgary, within a 5-minute walk to Prince's Island Park.
- Company-hosted events and a game room.
- Ability to join our Social Club for fun events with colleagues, such as golf, bowling, curling, Stampede events, and more.
- Free access to the Aspen Properties Fitness Centers.
Black Diamond Group