
Job Overview
Location
Remote
Job Type
Full-time
Category
Data Engineer
Date Posted
March 24, 2026
Full Job Description
📋 Description
- As a Senior ETL Engineer at Weekday, you will design and lead the development of scalable, cloud-native data platforms that help enterprises adopt AI at scale with proactive compliance and sustainable automation. Your work directly supports the integration of advanced AI technologies, including LLM/BERT-based workflows, by ensuring high-quality, governed, and performant data pipelines, making you a critical enabler of innovation for enterprise clients and startups such as Amelia.ai.
- You will architect and implement high-performance data engineering solutions using Databricks, Apache Airflow, Spark, Delta Lake, Python, and SQL, collaborating closely with ML teams to design and optimize data workflows for AI-driven applications. Your responsibilities include building robust ETL/ELT pipelines, establishing data governance frameworks, enforcing best practices in data modeling and orchestration, and leveraging modern GenAI tools to improve pipeline efficiency, data quality, and developer productivity, all within a fully remote, India-based work environment.
- You will join a forward-thinking team at Weekday that delivers cutting-edge data infrastructure for AI-centric enterprises. The company is closely connected to innovative AI startups and focuses on bridging the gap between raw data and intelligent automation, giving you exposure to real-world AI use cases and emerging technologies in a collaborative, impact-driven culture.
- In this role, you will deepen your expertise in modern data architecture, cloud-native ETL/ELT patterns, and AI-integrated data workflows, positioning yourself as a leader at the intersection of data engineering and generative AI. You will gain hands-on experience with the Databricks Lakehouse, Airflow orchestration, Delta Lake transactions, and LLM-enhanced data processing, skills that are in high demand across industries adopting AI at scale.
🎯 Requirements
- Minimum 7 years of hands-on experience in data engineering, with strong expertise in designing and implementing ETL/ELT pipelines using Databricks, Apache Airflow, Spark, and Delta Lake.
- Proficiency in Python and SQL for developing scalable data transformation logic, including experience with complex SQL queries, UDFs, and data validation frameworks.
- Proven experience working with cloud platforms (Azure, AWS, or GCP), particularly in deploying and managing Databricks workspaces, clusters, and jobs in enterprise environments.
- Familiarity with LLMs/NLP and their integration into data workflows, including experience preparing, transforming, and validating data for BERT-based or similar AI models.
- Strong understanding of data architecture principles, including dimensional modeling, data governance, metadata management, and pipeline orchestration best practices.
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field is preferred.
🏖️ Benefits
- Fully remote work arrangement based anywhere in India, offering flexibility and work-life balance without geographic constraints.
- Opportunity to work on cutting-edge AI-driven data projects involving LLM/BERT-based workflows and GenAI tooling, keeping your skills at the forefront of technological innovation.
- Exposure to enterprise-scale data platforms and collaboration with innovative startups such as Amelia.ai, enhancing your professional network and impact.
- Competitive salary range of INR 1.5–3.0 million per annum (15–30 LPA), reflecting the seniority and strategic importance of the role.
- Access to modern GenAI tools to augment developer efficiency, improve data quality, and streamline pipeline development, empowering you to work smarter and faster.
About Weekday Technologies Inc.
Weekday Technologies operates a hiring platform that connects tech companies with pre-vetted software engineers through community referrals. The product crowdsources candidate recommendations from existing engineering teams, verifies skills, and offers employers a searchable talent pool for contract and full-time roles. Founded in 2021 and headquartered in San Francisco, the company focuses on reducing time-to-hire for startups and scale-ups by leveraging trusted peer networks rather than traditional recruiting pipelines.


