Rockstar is recruiting for a fast-growing, mission-driven technology company focused on workforce development. The client is dedicated to building innovative digital solutions that empower individuals and organizations to thrive in the modern economy. Rockstar is supporting this client in their search for a talented Sr. Data Engineer to help evolve their core data platform and drive impactful business outcomes.
The Sr. Data Engineer will play a key role in evolving the core data platform, which spans data pipelines, machine learning models, and various databases. The ideal candidate combines deep data engineering expertise with strong business intuition to build a foundation of reliable data and strong, performant pipelines.
What You’ll Own
- Data infrastructure: Building and maintaining the infrastructure that powers the data platform, including pipeline orchestration, data warehousing, and machine learning
- Data solutions that drive the product: Developing and maintaining data solutions alongside a team of data scientists that enable the product to function at scale and with quality
- Data governance and quality: Upholding best practices in data governance, ensuring accuracy, accessibility, and compliance across data systems
- Cross-platform data sourcing: Surfacing and integrating data from across the platform to address real business needs in product, engineering, and go-to-market (GTM)
- Evolving core data models: Continuously evolving foundational models by identifying and incorporating new, high-value data sources
30/60/90 Day Plan
30 days:
- Onboarding and learning the stack and product
- Learning who the customers are, what their problems are, and how data can be leveraged to support them
- Gaining an understanding of core data entities and how they drive the product
- Contributing to core data pipelines by adding data quality and data enrichment layers
60 days:
- Working with data scientists to develop datasets and processes that streamline complex workflows
- Contributing to and owning aspects of the data catalog by defining and maintaining metrics, dimensions, and lineage
- Supporting surrounding teams in getting value out of the platform’s data through regular reporting and analysis
90 days:
- Owning and automating reporting workflows from data ingestion all the way to building out dashboards and tools
- Independently gathering reporting and insights requirements from stakeholders
- Presenting findings to stakeholders and providing recommendations that drive data-driven decision-making across the organization
Required Experience
- Proven ability to translate ambiguous business problems into clear, actionable insights
- Hands-on experience using SQL and Python for analysis in a professional setting
- Experience building and maintaining data pipelines, warehouses, and infrastructure
- Strong communication skills to convey technical insights to both technical and non-technical stakeholders
- Demonstrated ownership of analytics solutions, ensuring accuracy, reliability, and business alignment
- Familiarity with data visualization tools such as Looker, Power BI, or Tableau
- Familiarity with modeling structured and unstructured data, including NoSQL databases like MongoDB
- A sharp, kind, and open-minded approach, driven by both excellence and impact
Preferred Experience
- Hands-on experience with modern data tools like dbt and Airflow
- Experience with SageMaker or an equivalent machine learning / data science platform
- Experience in the workforce development industry
Our Tech Stack
- Languages: SQL, Python
- Data orchestration and transformation: Airflow, dbt
- Data storage and warehousing: PostgreSQL, Redshift, MongoDB (for unstructured data)
- Machine learning and experimentation: AWS SageMaker
- Visualization and reporting: Looker
- Infrastructure: AWS ecosystem (S3, Lambda, Glue, Redshift)