This job has expired
This position was posted on January 8, 2026, and is likely no longer accepting applications. We've kept it here for historical reference.

Job Overview
Location
Indiana, USA
Job Type
Full-time
Category
Software Engineering
Date Posted
January 8, 2026
Full Job Description
📋 Description
- Architect and own the end-to-end lifecycle of petabyte-scale data products that power real-time analytics for millions of users across global markets. You will translate complex business questions into elegant, high-performance data models in Snowflake, ensuring sub-second query response times and 99.9% pipeline reliability.
- Lead the design and continuous optimization of ELT/ETL pipelines that ingest 500+ data sources daily, leveraging AWS Glue, Lambda, and S3 to process 10 TB of streaming and batch data. You will implement idempotent, event-driven workflows that auto-scale with demand and self-heal on failure, reducing manual intervention by 80%.
- Write production-grade Python automation that orchestrates data workflows, enforces data-quality SLAs, and exposes clean RESTful APIs for downstream analytics teams. Your code will be peer-reviewed, unit-tested, and deployed through fully automated CI/CD pipelines using GitHub Actions and CloudFormation.
- Champion the adoption of dbt across the organization by creating reusable macros, enforcing testing standards, and mentoring analysts on modular data transformation patterns. You will build a centralized dbt repository with 200+ documented models, cutting onboarding time for new analysts from weeks to days.
- Model and maintain dimensional schemas (star, snowflake, data vault) that balance analytical flexibility with storage efficiency. You will conduct quarterly schema retrospectives, driving a 30% reduction in storage costs while improving query performance by 40%.
- Integrate external SaaS platforms via secure REST APIs, handling OAuth, rate limiting, and incremental load strategies. You will design a unified API gateway that abstracts 50+ third-party sources, enabling analysts to query cross-platform data through a single semantic layer.
- Collaborate in cross-functional squads with product managers, ML engineers, and business stakeholders to deliver data features that directly impact revenue. You will run weekly "data clinics," translating stakeholder pain points into sprint-ready user stories and measurable KPIs.
- Establish and monitor data observability frameworks that track freshness, volume, and lineage anomalies. You will configure alerts in Slack/PagerDuty, achieving a mean time to resolution under 15 minutes and maintaining a 99.5% data-quality score across all critical tables.
- Drive continuous performance tuning using Snowflake's query profiler, materialized views, and clustering keys. You will publish monthly performance dashboards, identifying optimization opportunities that save 1,000+ compute hours annually.
- Contribute to the engineering culture by authoring internal tech blogs, leading lunch-and-learn sessions, and participating in open-source dbt or Python projects. Your thought leadership will position Nagarro as a top-tier destination for data talent.
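The "idempotent, event-driven workflows that self-heal on failure" described above can be sketched in a few lines of Python. This is a minimal illustration of the pattern, not Nagarro's actual implementation; all names (`handle`, `SEEN_IDS`, `MAX_RETRIES`) are hypothetical, and in production the dedup store would be durable (e.g., a database table) rather than an in-memory set.

```python
import hashlib
import json

SEEN_IDS = set()   # illustrative only; production would use a durable store
MAX_RETRIES = 3

def event_id(event: dict) -> str:
    """Derive a stable ID so replayed events can be detected and skipped."""
    return hashlib.sha256(json.dumps(event, sort_keys=True).encode()).hexdigest()

def handle(event: dict, transform) -> str:
    """Process an event at most once, retrying transient failures."""
    eid = event_id(event)
    if eid in SEEN_IDS:
        return "skipped"                 # idempotency: duplicate delivery is a no-op
    last_err = None
    for _ in range(MAX_RETRIES):
        try:
            transform(event)             # the actual ELT step would go here
            SEEN_IDS.add(eid)            # mark done only after success
            return "processed"
        except Exception as err:         # self-healing: retry transient failures
            last_err = err
    raise RuntimeError(f"gave up after {MAX_RETRIES} attempts") from last_err
```

The key design choice is marking the event as seen only after the transform succeeds, so a crash mid-processing leads to a retry on redelivery rather than silent data loss.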
🎯 Requirements
- 5+ years of hands-on experience writing advanced SQL in Snowflake, including window functions, UDFs, and performance tuning
- Expert-level Python skills for data processing, automation, and REST API integration (pandas, PySpark, FastAPI)
- Proven track record designing and operating ELT/ETL pipelines on AWS (Glue, Lambda, S3, CloudFormation)
- Production experience with dbt for data transformation, testing, and documentation
- Deep understanding of dimensional modeling, data warehousing best practices, and version control with Git
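The REST API integration work mentioned in the requirements typically involves client-side rate limiting. As a rough sketch of that technique, here is a minimal token-bucket limiter in pure Python; the class name and parameters are hypothetical, and a real client would also honor the provider's `Retry-After` and rate-limit response headers.

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter for outbound API calls (sketch only)."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate                  # tokens replenished per second
        self.capacity = capacity          # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        """Return True if a request may be sent now, consuming one token."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A caller would check `allow()` before each request and sleep or queue when it returns `False`, smoothing bursts to the provider's advertised quota.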
🏖️ Benefits
- Fully remote-first culture with flexible hours and no-meeting Fridays
- Annual professional development budget of $3,500 for conferences, courses, and certifications
- Comprehensive health, dental, and vision insurance plus mental-wellness stipend
- Stock option plan and quarterly performance bonus tied to company OKRs
- 25 days PTO, 12 company holidays, and paid volunteer time off
About Nagarro SE
Nagarro SE is a publicly listed global digital engineering company headquartered in Munich, Germany. It provides strategy, experience design, cloud, data and AI, and platform modernization services to Fortune 500 and mid-market enterprises across banking, insurance, manufacturing and retail. Operating from more than 35 countries with 18,000+ employees, the company delivers agile, scalable solutions that accelerate digital transformation and improve time-to-market for clients worldwide.