Position Snapshot
Business areas: Nestlé Canada
Job title: Sr. Specialist Product Ownership-Data Engineer (1-year contract)
Location: 25 Sheppard Ave W, North York, ON M2N 6S8;
Hybrid
A little bit about us
While Nestlé is known for KitKat, Gerber, Nescafe, and Häagen-Dazs, our recipe for success comes down to one thing: our people.
We strive to lead a people-focused culture that empowers employees to bring their authentic selves to work each day. There are 3,000+ members of Nestlé Canada celebrated for taking action with agility, courage, and trust to find solutions that benefit the business or the greater good. We’re a team of curious changemakers who challenge the status quo and take risks that help drive us forward.
Our focus is not only on nourishing our customers, but also on enriching you. We know that empowerment leads to strong employee engagement, a great work culture, and motivated employees.
Position Summary
This role combines advanced data engineering with product ownership. The Sr. Specialist Product Ownership-Data Engineer is responsible for designing, building, and optimizing data products using structured and unstructured data, supporting analytics and visualization needs. They collaborate with IT and product teams to embed models into operations, manage product lifecycles, and drive continuous improvement. Key skills include Python, R, SQL, big data technologies, machine learning, and data visualization. Strong problem-solving, communication, and stakeholder management are essential.
A day in the life of a Sr. Specialist Product Ownership-Data Engineer:
Data Engineering - Data Products
Clean and prepare data for analysis and consumption
Optimize the performance of the data ecosystem
Integrate data from various sources
Manage complex queries for data extraction and manipulation
Derive insights and perform analysis on structured and unstructured data
Collaborate with IT Solution Architects and Data Product Owners to embed successful models into operations
Continuously improve data products based on user feedback and evolving business needs
Produce clear and concise technical documentation and architecture designs
Operational Effectiveness & Efficiency
Streamlined data processes: Implementing efficient and standardized data processing workflows to improve operational effectiveness.
Automated data pipelines: Developing automated data integration and transformation pipelines to enhance efficiency and reduce manual effort.
Scalable data infrastructure: Designing and implementing a scalable data infrastructure that can handle increasing data volumes and user demands.
Monitoring and optimization: Establishing monitoring systems to track data performance and identifying areas for optimization to improve operational efficiency.
Agile project delivery: Applying agile methodologies to ensure timely and efficient delivery of data products and solutions.
Continuous improvement: Encouraging a culture of continuous improvement to identify and implement enhancements in operational processes and workflows.
Collaboration and communication: Facilitating effective collaboration and communication between cross-functional teams to ensure smooth and efficient data operations.
Documentation and knowledge sharing: Creating comprehensive documentation and promoting knowledge sharing to enhance operational effectiveness and efficiency.
Performance metrics and reporting: Establishing performance metrics and reporting mechanisms to measure and communicate operational effectiveness and efficiency to stakeholders.
Stakeholder engagement
Develops and maintains partnerships with strategic business stakeholders, including senior business leaders, to stay apprised of their data solution needs.
Works with stakeholders throughout the organization to identify opportunities to leverage company data for business solutions and to provide data-sourcing advice.
Collaborates with senior business leaders on data-related opportunities, gaps, and strategy.
Role Requirements
Bachelor’s or Master’s degree in Computer Science, Information Systems, Data Analytics, or a related field from a recognized institution.
Minimum of 5 years working with large or complex data sets in a data engineering role.
Strong understanding of data engineering concepts, including idempotency, data modelling, ETL processes, and data integration.
Effective communication skills at all levels of the organization, including with senior business leaders, and fluency in English, including the ability to translate technical information into business relevance.
Extensive knowledge of designing a transactional schema, star schema and/or snowflake schema.
Experience with cloud-based platforms: Microsoft Azure, Azure Databricks, and Snowflake.
Expertise in working with structured and semi-structured data, applying methods, technologies, and techniques that address data architecture, integration, and governance.
Experience with Azure Cloud services – Azure Synapse Analytics, Data Factory, Logic Apps, Azure Data Lake Storage Gen2.
Experience with Snowflake architecture and data ingress/egress methodologies.
Expert in utilizing Databricks for writing advanced data transformation scripts and designing efficient data pipelines for seamless data movement between Azure and Databricks cloud systems.
Expert programming skills in Python, PySpark, and/or Scala, with a demonstrated ability to modularize code, apply DRY principles, refactor code for optimization, and perform data preprocessing using native or custom modules.
Experience working in Azure DevOps to adopt continuous integration and continuous delivery (CI/CD) practices.
Demonstrable experience with Agile projects and knowledge of Scrum techniques and artifacts, such as sprint planning, planning poker, feature creation, user stories, and backlog refinement.
Benefits
Flexible and hybrid work arrangements
Excellent training and development programs as well as opportunities to grow within the company
Access to the Discount Company store with Nestlé, Nespresso, and Purina products (located at various Nestlé offices/sites)
Additional discounts on a variety of products and services offered by our preferred vendors and partners
What you need to know
We will be considering applicants as they apply, so please don’t delay in submitting your application.
Nestlé Canada is an equal-opportunity employer committed to diversity, equity, inclusion, and accessibility. We welcome qualified applicants to bring their diverse and unique experiences as a result of their education, perspectives, culture, ethnicity, race, sex, gender identity and expression, national origin, age, languages spoken, veteran status, colour, religion, disability, sexual orientation, and beliefs.
If you are selected to participate in the recruitment process, please inform Human Resources of any accommodations you may require. Nestlé will work with you to ensure that you are able to fully participate in the process.