This job has expired
This position was posted on November 9, 2025 and is likely no longer accepting applications. We've kept it here for historical reference.

Job Overview
Location
Remote
Job Type
Full-time
Category
Software Engineering
Date Posted
November 9, 2025
Full Job Description
📋 Description
- Architect and maintain scalable, fault-tolerant data pipelines that ingest, cleanse, and transform multi-terabyte datasets from disparate sources—SQL, NoSQL, APIs, flat files, and real-time streams—into analytics-ready formats that power critical business decisions for Fortune-1000 clients.
- Partner directly with business stakeholders, data scientists, and product owners to translate fuzzy requirements into crisp technical specs, ensuring every pipeline you build solves a real-world problem and delivers measurable ROI within weeks, not quarters.
- Select and implement the right mix of cloud-native and open-source tooling—think Snowflake, Redshift, BigQuery, dbt, Airflow, Kafka, Spark, and serverless functions—to balance cost, performance, and maintainability while future-proofing against rapidly evolving data volumes and velocity.
- Champion data quality and governance by embedding automated testing, lineage tracking, and anomaly detection into every stage of the pipeline, giving downstream analysts confidence that the numbers they see are accurate, timely, and auditable.
- Optimize query performance and storage costs through intelligent partitioning, clustering, indexing, and materialized views, cutting average dashboard load times by 40% and saving clients thousands in monthly cloud spend.
- Design and enforce CI/CD workflows for data code (SQL, Python, Scala) using GitHub Actions, Terraform, and containerized environments, enabling multiple daily deployments with zero downtime and full rollback capability.
- Build reusable data models and libraries that accelerate delivery across client engagements, turning tribal knowledge into documented, version-controlled assets the entire InterWorks team can leverage.
- Provide hands-on mentorship to junior engineers and client staff through pair programming, brown-bag sessions, and code reviews, raising the technical bar and cultivating a culture of continuous learning.
- Stay ahead of the curve by evaluating emerging technologies—lakehouses, data meshes, real-time ML feature stores—and running proofs of concept that inform our long-term architectural roadmap.
- Contribute to InterWorks' thought-leadership blog and speak at meetups or conferences, sharing battle-tested patterns and lessons learned from the trenches of enterprise-scale data engineering.
🎯 Requirements
- 3+ years of hands-on experience designing and operating production-grade data pipelines in cloud environments (AWS, Azure, or GCP) using SQL, Python, and workflow orchestrators like Airflow or Prefect.
- Deep expertise with at least one modern data warehouse (Snowflake, BigQuery, or Redshift) and proficiency in dimensional modeling, slowly changing dimensions, and incremental load strategies.
- Comfort with infrastructure-as-code tools such as Terraform or CloudFormation and a solid grasp of DevOps best practices for data, including automated testing, CI/CD, and containerization.
- Nice-to-have: experience with real-time streaming (Kafka, Kinesis, Pub/Sub), dbt for analytics engineering, or advanced data governance platforms like Collibra or Alation.
🏖️ Benefits
- Fully remote-first culture with flexible hours and a $1,500 annual home-office stipend to craft your ideal workspace.
- 100% employer-paid medical, dental, and vision coverage for you and your dependents, plus a $100 monthly wellness allowance.
- 20 days PTO plus 10 company holidays and a paid volunteer day each year to support causes you care about.
- Annual professional development budget of $3,000 for conferences, certifications, courses, or books—plus paid time to actually use it.
About InterWorks, Inc.
InterWorks provides data strategy, analytics consulting and managed IT services to enterprises worldwide. Founded in 1996 and headquartered in Stillwater, Oklahoma, the company specializes in data engineering, cloud architecture, business intelligence platforms such as Tableau and Snowflake, cybersecurity and end-user support. Its team of consultants designs, implements and optimizes scalable data ecosystems, infrastructure and applications, enabling clients to turn raw data into actionable insights while maintaining security and performance. InterWorks serves Fortune 500 corporations, mid-market firms and public-sector organizations through offices across North America, Europe and Asia-Pacific.