
Senior Data Engineer

Job Overview

Location

United Kingdom - Remote

Job Type

Full-time

Category

Data Engineer

Date Posted

March 25, 2026

Full Job Description

📋 Description

  • As a Senior Data Engineer at Henry Schein, you will design, build, and own the core business data platforms and pipelines that enable analytics, reporting, and future AI/ML workloads across HSO and business units such as Dentally. The role is strategically important: establishing best-practice data engineering foundations and embedding a scalable architecture, trusted data models, robust data quality, and secure access to healthcare data.
  • You will design and implement a modern, cloud-native data platform on AWS, with Redshift and dbt at its core, building scalable, reliable ingestion and transformation pipelines from operational, product, and commercial systems, and delivering trusted datasets that power core reporting, dashboards, and self-serve analytics across the business.
  • You will partner with Data, Product, Engineering, Commercial, Operational, and Finance teams to ensure data models align with real business value drivers, while keeping the platform extensible for more advanced use cases such as streaming data, advanced analytics, and future machine learning and AI workloads.
  • You will collaborate with the Head of Data & AI to support emerging MLOps and AI data pipeline needs over time, defining and implementing best practices for data quality, testing, monitoring, and documentation using modern frameworks (e.g. dbt tests, data contracts, SLAs), and building secure, compliant data access patterns appropriate for healthcare and regulated environments, including data minimisation, hashing, tokenisation, and role-based access.
  • You will own operational excellence (reliable, performant, scalable, cost-aware, and observable) for data ingestion and transformation pipelines, act as a senior technical contributor and role model within the data function, and support knowledge-sharing and mentoring as the team expands.
  • You will bring proven experience building data platforms and pipelines from the ground up in a modern cloud environment (AWS, Azure, GCP, Redshift, Databricks, Snowflake, dbt, etc.), strong data engineering fundamentals (data modelling, ETL/ELT design, performance optimisation, governance, and reliability), and experience scaling data systems as usage, data volume, and organisational demand grow.
  • You will have hands-on experience with AWS data services; strong experience with cloud data warehouses, ideally Amazon Redshift, Snowflake, and/or Databricks; production experience using dbt for data transformations, testing, and documentation; solid SQL expertise and proficiency in at least one general-purpose programming language (e.g. Python); and experience implementing data quality frameworks, data contracts, data access frameworks, and SLAs.
  • You will understand data security, privacy, and compliance considerations; have exposure to, or strong interest in, ML/AI data pipelines (SageMaker, Bedrock) and MLOps concepts; be comfortable operating in ambiguity and autonomous enough to make pragmatic decisions in a greenfield environment that align with data principles; and focus on delivering value fast and on value enablement.
  • You will possess strong stakeholder communication skills, with the ability to translate technical concepts into business impact, and a mindset focused on building pragmatic, sustainable, well-governed systems rather than ad-hoc solutions and workarounds. You are curious, comfortable with ownership, and bring a continuous-learning and improvement mindset.
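As an illustration of the data minimisation techniques the role mentions (hashing and tokenisation of identifiers before data lands in the warehouse), here is a minimal Python sketch. The field names and the salt handling are hypothetical examples, not taken from the posting; in production the key would live in a secrets manager and the scheme would be agreed with compliance.

```python
import hashlib
import hmac

# Hypothetical secret key; in practice load this from a secrets manager
# (e.g. AWS Secrets Manager), never hard-code it.
PEPPER = b"example-secret-key"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (e.g. a patient email) with a stable,
    non-reversible token using a keyed hash (HMAC-SHA256)."""
    return hmac.new(PEPPER, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: strip the direct identifier before loading into the warehouse,
# keeping a stable token so records can still be joined and counted.
record = {"patient_email": "jane@example.com", "visit_count": 3}
safe_record = {
    "patient_token": pseudonymise(record["patient_email"]),  # stable token
    "visit_count": record["visit_count"],                    # non-identifying field kept
}
```

Because the hash is keyed and deterministic, the same identifier always maps to the same token (so joins still work), but the token cannot be reversed without the key.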

🎯 Requirements

  • Proven experience building data platforms and pipelines from the ground up in a modern cloud environment (AWS, Azure, GCP, Redshift, Databricks, Snowflake, dbt, etc.)
  • Strong data engineering fundamentals, including data modelling, ETL/ELT design, performance optimisation, governance, and reliability
  • Experience scaling data systems as usage, data volume, and organisational demand grow
  • Hands-on experience with AWS data services
  • Strong experience with cloud data warehouses, ideally Amazon Redshift, Snowflake, and/or Databricks
  • Production experience using dbt for data transformations, testing, and documentation
  • Solid SQL expertise and proficiency in at least one general-purpose programming language (e.g. Python)
  • Experience implementing data quality frameworks, data contracts, data access frameworks, and SLAs
  • Understanding of data security, privacy, and compliance considerations
  • Exposure to, or strong interest in, ML/AI data pipelines (SageMaker, Bedrock) and MLOps concepts
  • Comfort operating in ambiguity, with the autonomy to make pragmatic decisions in a greenfield environment that align with data principles
  • Strong stakeholder communication skills, with the ability to translate technical concepts into business impact
  • A mindset focused on building pragmatic, sustainable, well-governed systems rather than ad-hoc solutions and workarounds
  • Curiosity, comfort with ownership, and a continuous-learning and improvement mindset
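The dbt testing and documentation experience listed above typically looks like the following schema file. This is a hypothetical sketch using dbt's built-in generic tests (`unique`, `not_null`); the model and column names are invented for illustration and are not from the posting.

```yaml
# models/schema.yml — hypothetical dbt model with tests and documentation
version: 2

models:
  - name: dim_practice            # invented example model name
    description: "One row per dental practice"
    columns:
      - name: practice_id
        description: "Surrogate key for the practice"
        tests:
          - unique                # built-in dbt generic test
          - not_null              # built-in dbt generic test
```

Running `dbt test` would then compile these declarations into SQL assertions against the warehouse.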

🏖️ Benefits

  • Permanent contract; individual contributor role at Job Grade P4
  • Opportunity to work on strategically important data platforms with high visibility and direct impact on company strategy, product innovation, automation, and commercial outcomes
  • Remote working, based in the United Kingdom
  • Exposure to cutting-edge healthcare data environments and regulated data systems
  • Mentorship and knowledge-sharing opportunities as the data team expands
  • Support for continuous learning and improvement, with access to modern data engineering tools and frameworks
  • Alignment with Henry Schein’s core values of diversity, inclusion, ethics, and caring for people as the greatest asset
  • Involvement in emerging MLOps and AI data pipeline initiatives under the Head of Data & AI
  • Ownership of operational excellence and the chance to shape best practices for data quality, testing, monitoring, and documentation
  • Meaningful problems that enable analytics, reporting, and future AI/ML workloads across business units like Dentally

Skills & Technologies

Python
AWS
Azure
GCP
Senior
Remote



About Henry Schein, Inc.

Henry Schein, Inc. is a Fortune 500 distributor of healthcare products and services to office-based dental, animal health, and medical practitioners. Founded in 1932 and headquartered in Melville, New York, the company supplies pharmaceuticals, equipment, software, and practice-management solutions across more than 30 countries. Its offerings include infection-control products, diagnostic equipment, vaccines, and financial services, supported by a global logistics network and value-added consulting. The firm also operates continuing-education programs and sustainability initiatives, serving approximately one million customers while emphasizing efficient supply-chain management and integrated technology platforms for healthcare practices.

