- Work together with other data engineers, data analysts, business analysts, business SMEs, records analysts, and privacy analysts to understand data needs and create effective, secure data workflows.
- Responsible for designing, building, and maintaining secure and compliant data processing pipelines using various Microsoft Azure data services and frameworks, including but not limited to Azure Databricks, Azure Data Factory, ADLS storage, and PySpark (an illustrative PySpark sketch appears at the end of this section).
- Build databases, data marts, or data warehouses, and perform data migration work.
- Build reporting and analytical tools that utilize the data pipeline and provide actionable insight into key business performance metrics.
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in the Microsoft Azure cloud.
- Create and maintain data storage solutions, including Azure SQL Database, Azure Data Lake, and/or Azure Blob Storage.
- Using Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations.
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data (a brief validation sketch also appears at the end of this section).
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines.
- Monitor and resolve data pipeline problems to ensure consistency and availability of the data.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.
- Adapt to and learn new technologies as business requirements evolve.
- Ensure compliance with data governance, privacy, and security policies.
- Foster and maintain an organizational culture that promotes equity, diversity and inclusion, mutual respect, teamwork, and service excellence.
- Experience with architectures and datasets using Microsoft Azure technologies, including Spark.
- Experience with data migration projects within a Microsoft and Azure environment.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Ability to plan, prioritize, and manage workload within a time-sensitive environment.
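As a point of reference for the pipeline responsibilities above, the following is a minimal sketch of the kind of ingestion and transformation step an Azure Databricks / PySpark pipeline might contain. It is illustrative only; the storage account, container, paths, and table names are placeholders and assume the cluster already has credentials configured for the ADLS Gen2 account.

```python
# Minimal PySpark sketch of an ADLS-to-Delta ingestion step (illustrative only).
# The storage account, container, path, and table names below are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Read raw CSV files landed in ADLS Gen2 (abfss URI assumes storage credentials
# are already configured on the cluster).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/orders/2024/")
)

# Light cleanup: normalize column names and stamp each row with a load date.
cleaned = (
    raw.toDF(*[c.strip().lower().replace(" ", "_") for c in raw.columns])
       .withColumn("load_date", F.current_date())
)

# Persist as a Delta table for downstream marts, warehouses, and reporting.
(
    cleaned.write
    .format("delta")
    .mode("append")
    .partitionBy("load_date")
    .saveAsTable("bronze.orders_raw")
)
```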
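And a brief sketch of a data validation and cleansing step of the kind described above. Again illustrative: the table names, column names, and rules are assumptions, not a specification, and the quarantine pattern shown is one common way to keep rejected rows reviewable.

```python
# Illustrative PySpark validation/cleansing step. Table names, column names,
# and the specific rules are assumptions, not taken from any real pipeline.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-validate").getOrCreate()

df = spark.table("bronze.orders_raw")  # hypothetical source table

# Rows must have an order id and a non-negative amount that casts to a decimal.
checks = (
    F.col("order_id").isNotNull()
    & F.col("amount").cast("decimal(18,2)").isNotNull()
    & (F.col("amount").cast("decimal(18,2)") >= 0)
)

valid = df.filter(checks).dropDuplicates(["order_id"])
rejected = df.filter(~checks).withColumn("rejected_at", F.current_timestamp())

# Valid rows feed downstream marts; rejected rows go to a quarantine table so
# data quality issues can be reviewed and root-caused.
valid.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
rejected.write.format("delta").mode("append").saveAsTable("quarantine.orders_rejected")
```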