This job has expired

This position was posted on September 19, 2025 and is likely no longer accepting applications. We've kept it here for historical reference.

Data Engineering (Supply Chain AI) Co-Op

Job Overview

Location: USA
Job Type: Full-time
Category: Data Engineer
Date Posted: September 19, 2025

Full Job Description

📋 Description

  • Join Danaher’s pioneering Supply Chain AI team as a Data Engineering Co-Op and spend 4–6 months turning terabytes of global logistics, planning, and manufacturing data into production-grade machine-learning fuel. You will sit at the intersection of operations and artificial intelligence, building the pipes that let our data scientists predict demand, optimize inventory, and cut millions in cost and carbon.
  • Architect end-to-end data pipelines that ingest real-time feeds from ERP (SAP), WMS (Manhattan), TMS (Oracle OTM), IoT sensors, and external weather/port-delay APIs. You will design idempotent ETL/ELT workflows in Apache Airflow, dbt, and Spark that run on Azure Kubernetes Service and auto-scale to 10k+ tasks per day while keeping latency under five minutes. (A minimal sketch of an idempotent Airflow DAG appears after this list.)
  • Curate a “supply-chain feature store” in Snowflake: star-schema models, slowly changing dimensions, and time-window aggregations that are version-controlled, unit-tested, and documented so that any data scientist can discover and reuse them via our internal data catalog (DataHub).
  • Partner with ML engineers to productionize feature-engineering code—translate Jupyter notebooks into scalable Python services, wrap them in Docker containers, and deploy via CI/CD (GitHub Actions + Terraform) with built-in data-quality gates (Great Expectations, Monte Carlo) that halt bad data before it reaches models. (The gating pattern is sketched after this list.)
  • Build streaming anomaly-detection microservices (Kafka + Flink) that flag sudden supplier lead-time spikes, port closures, or demand shocks; surface alerts in Power BI and Teams so planners can act within minutes instead of days. (A simplified detector is sketched after this list.)
  • Automate data-validation and reconciliation jobs that compare source-system totals to lakehouse totals, send Slack alerts on divergence >0.5%, and auto-trigger rewind-and-replay jobs to guarantee a 99.9% data-accuracy SLA. (See the reconciliation sketch after this list.)
  • Experiment with emerging lakehouse patterns (Delta Lake, Iceberg) to merge the reliability of warehouses with the flexibility of lakes; benchmark query performance and cost to guide enterprise-wide adoption.
  • Contribute to our inner-source “Data Engineering Toolkit”: reusable Airflow operators, dbt macros, and Python libraries that reduce duplicate code across 15 Danaher operating companies. Your pull requests will be reviewed by senior engineers and, once merged, used by teams on three continents.
  • Present findings to VP-level stakeholders every other week—translate technical metrics (“95% pipeline reliability, 40% reduction in run time”) into business impact (“$3M working capital freed, 1.2 kt CO₂e avoided”).
  • Own one “passion project” from ideation to demo: e.g., use computer vision on satellite imagery to detect port congestion, or apply graph neural networks to predict supplier risk. You will showcase the outcome at Danaher’s global AI summit and may spin it into a patent filing.
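
The idempotent Airflow workflows mentioned in the second bullet can be sketched briefly. Below is a minimal illustration using Airflow's TaskFlow API (Airflow 2.4+); the shipments feed, file paths, and table are hypothetical stand-ins, and the extract/load bodies are elided. The key idea is keying every run to its logical date (`ds`) and overwriting that date's partition, so reruns never duplicate data.

```python
# Minimal sketch of an idempotent daily ETL DAG (hypothetical feed and paths).
import pendulum
from airflow.decorators import dag, task


@dag(
    schedule="@daily",
    start_date=pendulum.datetime(2025, 1, 1, tz="UTC"),
    catchup=False,
    tags=["supply-chain"],
)
def shipments_etl():
    @task
    def extract(ds=None):
        # Pull exactly one day of data keyed by the logical date, so a
        # rerun of the same interval always fetches the same slice.
        path = f"/tmp/shipments_{ds}.parquet"
        # ... call the source feed for [ds, ds + 1 day) and write to path ...
        return path

    @task
    def load(path, ds=None):
        # Overwrite the partition for `ds` rather than appending, so
        # replays never create duplicate rows (idempotent load).
        # e.g. DELETE FROM shipments WHERE load_date = :ds, then COPY.
        pass

    load(extract())


shipments_etl()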
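
The posting names Great Expectations and Monte Carlo as its data-quality gates. As a dependency-free stand-in, the sketch below shows the same gating idea: validate a batch and fail loudly before it can reach downstream models. The column names and check thresholds are hypothetical.

```python
# Stand-in for a data-quality gate; column names and limits are assumptions.
import pandas as pd


class DataQualityError(Exception):
    """Raised to halt the pipeline when a batch fails validation."""


def quality_gate(batch: pd.DataFrame) -> pd.DataFrame:
    checks = {
        "no null order ids": batch["order_id"].notna().all(),
        "non-negative quantities": (batch["qty"] >= 0).all(),
        "lead time under 365 days": (batch["lead_time_days"] < 365).all(),
    }
    failures = [name for name, ok in checks.items() if not ok]
    if failures:
        # Fail the task so the orchestrator stops the run and the bad
        # batch never reaches the feature store or the models.
        raise DataQualityError(f"batch rejected: {failures}")
    return batch
```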
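
The streaming bullet describes Kafka + Flink microservices; the transport-agnostic Python sketch below shows only the detection logic, a rolling z-score over recent lead times. The window size, warm-up count, and threshold are assumptions, not values from the posting.

```python
# Simplified lead-time spike detector (detection logic only; no Kafka/Flink).
from collections import deque
import statistics


class LeadTimeMonitor:
    def __init__(self, window=50, z_threshold=3.0):
        self.window = deque(maxlen=window)  # recent lead times, in days
        self.z_threshold = z_threshold

    def observe(self, lead_time_days):
        """Return True if this observation looks like a spike."""
        is_spike = False
        if len(self.window) >= 10:  # need some history before judging
            mean = statistics.fmean(self.window)
            stdev = statistics.pstdev(self.window) or 1e-9
            is_spike = abs(lead_time_days - mean) / stdev > self.z_threshold
        self.window.append(lead_time_days)
        return is_spike
```

In the real service, each `observe` call would be driven by a Kafka consumer, and a `True` result would publish an alert to Power BI and Teams.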
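
The reconciliation bullet's 0.5% divergence threshold translates directly into a small check. In this sketch the totals are passed in as plain numbers; in practice they would come from source-system and lakehouse queries, and the alert would go to Slack rather than stdout.

```python
# Source-vs-lakehouse reconciliation; 0.5% tolerance comes from the posting.
def reconcile(source_total, lake_total, tolerance=0.005):
    """Return True if totals agree within tolerance; otherwise alert."""
    if source_total == 0:
        return lake_total == 0
    divergence = abs(source_total - lake_total) / source_total
    if divergence > tolerance:
        # In the real job this would post to Slack and enqueue a
        # rewind-and-replay of the affected partitions.
        print(f"ALERT: divergence {divergence:.2%} exceeds {tolerance:.2%}")
        return False
    return True
```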

🎯 Requirements

  • Currently enrolled in a Bachelor’s or Master’s program in Computer Science, Data Engineering, Analytics, or a related STEM field, with a graduation date of December 2025 or later.
  • Solid Python and SQL skills—can write performant joins, window functions, and unit tests; familiarity with pandas, PySpark, or similar data-processing libraries. (A short pandas example appears after this list.)
  • Exposure to at least one major cloud stack (Azure, AWS, or GCP) and comfort with basic services like blob storage, VMs, and identity management; Snowflake experience is a strong plus.
  • Demonstrated curiosity about supply-chain domains (coursework, projects, or internships) and a passion for applying AI to real-world operations problems.
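
As an illustration of the window-function and unit-testing skills asked for above, here is a small pandas example: a 7-row rolling mean of daily demand per SKU, plus a pytest-style test. The data and column names are made up.

```python
# Rolling window per group, with a unit test; columns are hypothetical.
import pandas as pd


def rolling_demand(df: pd.DataFrame) -> pd.DataFrame:
    """Add a 7-row rolling mean of demand within each SKU, ordered by date."""
    df = df.sort_values(["sku", "date"]).copy()
    df["demand_7d_avg"] = df.groupby("sku")["demand"].transform(
        lambda s: s.rolling(7, min_periods=1).mean()
    )
    return df


def test_rolling_demand():
    df = pd.DataFrame(
        {
            "sku": ["A"] * 3,
            "date": pd.date_range("2025-01-01", periods=3),
            "demand": [10, 20, 30],
        }
    )
    out = rolling_demand(df)
    assert out["demand_7d_avg"].tolist() == [10.0, 15.0, 20.0]
```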

🏖️ Benefits

  • Fully remote, flexible schedule—work from any US location and align hours with your class commitments; we provide a MacBook Pro, a 27-inch monitor, and a $250 home-office stipend.
  • Competitive hourly pay of $30–$35 with overtime eligibility and a 401(k) enrollment option; 11 paid holidays and one volunteer day during the co-op term.
  • Direct mentorship from senior data engineers and ex-Amazon and ex-Microsoft ML specialists; guaranteed code review within 24 hours and a personalized learning plan that includes Snowflake, Airflow, and Terraform certifications—paid for by Danaher.

Skills & Technologies

Python
AWS
Azure
Remote

About Danaher Corporation

Danaher Corporation is a global science and technology innovator headquartered in Washington, D.C. Founded in 1969, it designs, manufactures, and markets medical, industrial, and commercial products and services to professional, medical, and scientific customers. Operating through three segments—Life Sciences, Diagnostics, and Environmental & Applied Solutions—Danaher owns brands such as Beckman Coulter, Cepheid, Leica Microsystems, and Hach. The company applies the Danaher Business System, a continuous-improvement methodology, to drive quality, delivery, cost, and innovation. With approximately 63,000 associates worldwide, Danaher focuses on accelerating diagnostics, advancing life-science research, and ensuring water quality and public health.
