
Job Overview
Location
India
Job Type
Full-time
Category
Data Engineer
Date Posted
March 10, 2026
Full Job Description
📋 Description
- We are seeking a highly skilled and experienced Senior Data Engineer specializing in Microsoft Fabric to join our team and play a pivotal role in designing, building, and scaling modern, cloud-native data platforms. This is a full-time, remote position based in India, offering the opportunity to work with a leading client on cutting-edge data solutions.
- In this role, you will develop robust ETL/ELT pipelines, architect sophisticated data solutions, and implement high-performance data engineering strategies using Microsoft Fabric and other Azure data technologies. Your expertise will be crucial in transforming raw data into actionable insights that drive business value and support advanced analytical initiatives.
- The ideal candidate combines strong architectural thinking with deep hands-on engineering expertise. You will build scalable data pipelines, support advanced analytics teams, and collaborate closely with machine learning specialists to develop and refine AI-driven data workflows. The position demands a deep understanding of tools such as Databricks, Spark, Delta Lake, Python, and SQL, alongside a solid grasp of modern data orchestration, governance, and best practices.
- **Data Architecture & Platform Design:** Design and implement highly scalable, cloud-native data architectures tailored to modern data-intensive applications. Define and enforce best practices for data governance, establish robust architecture standards, and ensure the long-term scalability and resilience of our data platforms. Build sophisticated data models and comprehensive data warehouse architectures that support both analytical reporting and AI workloads.
- **ETL/ELT Pipeline Development:** Design and develop high-performance ETL (extract, transform, load) and ELT (extract, load, transform) pipelines engineered to handle large-scale data processing efficiently and reliably. Build and maintain these pipelines in Python and SQL to process and transform complex, diverse datasets, ensuring reliability, scalability, and optimal performance across all data workflows.
- **Data Engineering & Platform Development:** Develop and manage data engineering workflows with a strong emphasis on Databricks, Apache Spark, and Delta Lake. Implement flexible, efficient data ingestion frameworks and support large-scale data processing environments. Continuously optimize data pipelines for performance, reliability, and cost efficiency.
- **Orchestration & Automation:** Design and implement workflow orchestration strategies, using tools such as Airflow or Azure-native orchestration services to manage complex data processes. Automate data processing pipelines wherever possible, ensuring seamless operation and high operational reliability across all integrated systems.
- **AI & Advanced Data Workflows:** Collaborate with machine learning teams to provide data engineering support for advanced AI initiatives, including large language models (LLMs), natural language processing (NLP), and other AI-driven data workflows. Enable effective feature engineering and develop data pipelines optimized for training and deploying advanced AI models.
- **Governance & Best Practices:** Establish and champion best practices across the data engineering lifecycle, including standards for data architecture, pipeline management, documentation, and security. Ensure strict adherence to enterprise data governance policies and maintain high data quality standards.
- This role offers a unique opportunity to work at the forefront of data engineering, using the latest Microsoft Fabric technologies to build impactful data solutions. If you are a seasoned data professional with a passion for cloud-native architectures and a proven track record of delivering complex data projects, we encourage you to apply.