Location: Downtown Toronto | Hybrid, 3 days onsite per week
Duration: 12 months | Strong likelihood of extension or conversion to full-time employment
Must-have Skills/Experiences and/or Education, certifications, qualifications, designations:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 3-5 years of architecture experience on both the Databricks and Snowflake platforms, designing and implementing big data solutions, including migration toward a hybrid architecture.
- Professional certification in designing solutions for Azure and/or AWS public clouds.
- Strong knowledge of ETL processes, OLAP/OLTP systems, and SQL and NoSQL databases.
- Experience building batch and real-time data pipelines using big data technologies such as Spark, Airflow, NiFi, Kafka, Cassandra, and Elasticsearch.
- Expertise in DataOps/DevOps practices for deploying and monitoring automated data pipelines and data lifecycle management.
- Proficiency in writing and optimizing SQL queries, and in at least one programming language such as Java, Scala, or Python.
- A continuous-learning mindset and enjoyment of open-ended problems.
Nice-to-have Skills/Experience and/or Education, certifications, qualifications, designations:
- System administration experience, including the Docker and Kubernetes platforms.
- Experience with OpenShift, S3, Trino, Ranger and Hive.
- Knowledge of machine learning and data science concepts and tools.
- Experience with BI tools.