Hatch Ltd. logo

Senior Data Engineer

Job Overview

Location

Indiana, USA

Job Type

Full-time

Category

Data Engineer

Date Posted

February 16, 2026

Full Job Description

📋 Description

  • Hatch is at the forefront of revolutionizing customer engagement through advanced AI, and we are seeking a highly skilled Senior Data Engineer to join our rapidly growing data team. This role is critical for building, optimizing, and maintaining the data pipelines and platform services that underpin our analytics, reporting, and AI initiatives. We are not just looking for a data professional; we are looking for a seasoned software developer who brings production-grade software engineering principles to the data domain. Your primary responsibility will be to architect and implement scalable, reliable, and efficient data infrastructure that powers our AI-driven customer conversations.
  • You will design and build batch and real-time data pipelines using industry-leading technologies such as Kinesis, Pub/Sub, Flink, Spark, Airflow, and dbt. You will also develop and maintain production-quality APIs, SDKs, and backend services that integrate with our data infrastructure, keeping data accessible, reliable, and actionable for downstream applications and teams.
  • A significant part of this role is architecting and implementing multi-tier data lake architectures: defining raw, staging, and curated layers, establishing clear promotion criteria between them, implementing rigorous data quality gates, and defining consumption patterns for the various stakeholders. This foundational work ensures the integrity, usability, and scalability of our data assets.
  • You will apply software engineering best practices rigorously across all data platform work: modular design, established design patterns, comprehensive testing strategies, CI/CD pipelines for automated deployments, built-in observability, and active participation in code reviews. Your commitment to code quality and maintainability will be paramount.
  • You will model and optimize datasets in BigQuery and Aurora PostgreSQL, with a keen eye for performance tuning, cost management, and data governance, ensuring that our data stores are both efficient and compliant.
  • Collaboration is key. You will work closely with our backend engineering teams to define data contracts, streaming interfaces, and service boundaries, fostering a cohesive, well-integrated data ecosystem.
  • You will implement infrastructure-as-code (IaC) with tools like Terraform, Docker, and Kubernetes (EKS) to automate the deployment and management of our data infrastructure, ensuring consistency and repeatability.
  • You will establish and monitor Service Level Objectives (SLOs) for data quality, latency, and availability, and proactively troubleshoot production issues across our distributed systems to keep our data services reliable and continuously operating.
  • This role is designed for individuals with a strong software development background who want to apply those skills to data engineering challenges. It is not a business intelligence, pure analytics, or low-code configuration role; candidates whose experience centers on dashboard creation, report generation, or configuring off-the-shelf BI tools will not be a good fit. We need engineers with a proven track record of building and shipping production services that other engineers consume, demonstrating a deep understanding of API design and software architecture.
  • Your contributions will directly impact the performance and scalability of our AI models and customer engagement platforms, helping Hatch maintain its leadership position in the market. You will have the opportunity to shape the future of our data infrastructure and influence engineering best practices across the company.
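
To give a flavor of the data lake responsibilities above (promoting records from a raw layer to a curated layer behind a quality gate), here is a minimal sketch in Python. The required fields and the pass-rate threshold are illustrative assumptions, not Hatch's actual promotion criteria:

```python
from dataclasses import dataclass


@dataclass
class QualityReport:
    """Summary of a quality-gate run over a batch of raw records."""
    total: int
    passed: int

    @property
    def pass_rate(self) -> float:
        return self.passed / self.total if self.total else 0.0


def quality_gate(raw_records, required_fields=("id", "ts"), min_pass_rate=0.95):
    """Promote raw records to the curated layer only if enough pass validation.

    A record passes when every required field is present and non-null
    (hypothetical check; real gates would also validate types, ranges, etc.).
    Returns (curated_records, report); curated is empty when the gate fails,
    so bad batches never reach the curated layer.
    """
    valid = [r for r in raw_records
             if all(r.get(f) is not None for f in required_fields)]
    report = QualityReport(total=len(raw_records), passed=len(valid))
    curated = valid if report.pass_rate >= min_pass_rate else []
    return curated, report
```

In a real pipeline the same gate pattern would run as a step in an orchestrator such as Airflow, with the report emitted as a metric rather than returned in memory.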
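
Similarly, the SLO monitoring mentioned above can be pictured as a simple compliance check over a window of observations. The 500 ms threshold and 99% objective below are made-up example values, not Hatch's targets:

```python
def slo_compliance(latencies_ms, threshold_ms=500, objective=0.99):
    """Check a latency SLO over one observation window.

    Returns (compliance_ratio, meets_objective), where compliance is the
    fraction of requests at or under the threshold.
    """
    if not latencies_ms:
        return 1.0, True  # an empty window trivially meets the objective
    within = sum(1 for latency in latencies_ms if latency <= threshold_ms)
    compliance = within / len(latencies_ms)
    return compliance, compliance >= objective
```

In production this computation would typically live in a monitoring system that alerts when the error budget implied by the objective is being burned too quickly.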

Skills & Technologies

Python
PostgreSQL
AWS
GCP
Docker
Senior
Remote


About Hatch Ltd.

Hatch is a global, multidisciplinary management, engineering, and professional services company. They offer a comprehensive suite of services across the project lifecycle, from concept and feasibility studies to design, construction, and operation. Hatch specializes in the mining, metals, energy, and infrastructure sectors, providing innovative solutions to complex challenges. Their expertise includes project management, engineering design, environmental consulting, and digital transformation. With a focus on sustainability and client collaboration, Hatch aims to deliver safe, efficient, and responsible projects that create lasting value for their clients and the communities they serve. They are committed to leveraging technology and deep industry knowledge to drive progress and shape a better future.
