Company Description
Make an impact at a global and dynamic investment organization
- Stimulating work in a fast-paced and intellectually challenging environment
- Accelerated exposure and responsibility
- Global career development opportunities
- Diverse and inspiring colleagues and approachable leaders
- A hybrid-flexible work environment with an emphasis on in-person collaboration
- A culture rooted in principles of integrity, partnership, and high performance
- An organization with an important social purpose that positively impacts lives
Job Description
Accountabilities
- Own aspects of designing, building, and maintaining scalable and efficient data pipelines that deliver Data Products from various sources to storage and analytical systems.
- Provide on-call support for production systems; identify and create automation to reduce MTTR and mature operational runbooks.
- Help implement processes and tools to monitor and improve data quality, including data profiling, data freshness checks, and SLA adherence.
- Partner with the Data Governance team to establish data governance policies and procedures that ensure data accuracy, privacy, and security.
- Advocate for and drive adoption of DevOps & QE culture; support presentations in internal forums and sessions on product releases and best practices, and provide consulting, guidance, and implementation assistance to users and application teams.
- Collaborate with cross-functional teams such as data scientists, analysts, and software engineers to understand data requirements and deliver solutions that meet business needs.
- Stay updated on emerging technologies, trends, and best practices in data engineering and related fields. Continuously evaluate and adopt new tools and techniques to enhance productivity and innovation.
Qualifications
- Undergraduate degree or college diploma in a related field (e.g., Engineering, Computer Science).
- 4 years of relevant experience.
- Experience working with Big Data, including change data capture, data quality, data lineage, etc.
- Solid programming skills with professional experience in an AWS Data & Analytics context (Python, EMR, Glue, Lake Formation, Airflow, PySpark, Trino, Hudi, Iceberg).
- Expertise with data quality principles, data profiling techniques, and data governance best practices.
- AWS Certified Data Analytics - Specialty certification.
- Nice to have: Experience with Databricks, GitHub Actions, Terraform, and Rust.