
Lead Data Engineer (Snowflake/dbt)

Job Overview

Location

Delaware, India

Job Type

Full-time

Category

Data Engineer

Date Posted

March 5, 2026

Full Job Description

📋 Description

  • As a Lead Data Engineer at Tide Platform Limited, you will be instrumental in architecting, building, and maintaining the robust data pipelines and services that power our business intelligence, reporting, and analytical functions. You will play a pivotal role in ensuring that critical data is accessible, reliable, and optimized to support data-driven decision-making across the organization. This role is central to our mission of helping SMEs save time and money by providing them with cutting-edge financial and administrative solutions.
  • You will be responsible for developing end-to-end ETL/ELT pipelines, collaborating closely with Data Analysts from various business functions to translate their needs into effective data solutions. This involves designing, developing, and implementing scalable, automated processes for data extraction, transformation, and loading, with a strong emphasis on a Data Mesh architecture.
  • A key aspect of this leadership role is mentoring junior engineers within the team. You will act as a subject matter expert, a go-to resource for data technologies and solutions, and provide guidance on best practices in data engineering.
  • You will be expected to provide on-the-ground troubleshooting and diagnosis for architecture and design challenges, proactively identifying and resolving technical issues as they arise to ensure the smooth operation of our data infrastructure.
  • Continuous improvement is paramount. You will actively seek opportunities to enhance both the 'what' and the 'how' of data pipeline delivery, ensuring efficiency, scalability, and maintainability.
  • Translating complex business requirements into precise technical specifications will be a core responsibility. This includes defining data models, specifying dbt models to be built, setting delivery timelines, implementing tests, and generating reports.
  • You will own the end-to-end delivery of data models and reports, from conception through to deployment and ongoing maintenance.
  • Performing exploratory data analysis is crucial for identifying data quality issues early in the process. You will implement robust testing strategies to prevent future data integrity problems.
  • Working collaboratively with Data Analysts, you will ensure that all data feeds are optimized and available precisely when needed. This includes leveraging advanced techniques such as Change Data Capture (CDC) and other delta loading approaches for efficient data synchronization.
  • The role involves discovering, transforming, testing, deploying, and meticulously documenting data sources to ensure clarity and accessibility for all stakeholders.
  • You will be a champion for data warehouse governance, actively applying, helping to define, and promoting best practices in data quality, testing methodologies, coding standards, and peer review processes.
  • While not the primary focus, there may be instances where you will build Looker dashboards for specific use cases, requiring a good understanding of BI tool integration.
  • Our technology stack is modern and robust, relying heavily on Snowflake for data warehousing, dbt for transformation, Airflow for orchestration, Fivetran for ingestion, and Looker for business intelligence. We leverage AWS extensively, with some integration of GCP services.
  • You will work within an agile, cross-functional delivery team, contributing to a fast-paced and innovative environment. High development standards are expected, including code quality, rigorous code reviews, unit testing, and continuous integration/continuous deployment (CI/CD).
  • Strong technical documentation skills are essential, enabling you to communicate complex technical details clearly and precisely to both technical and business users.
  • Business-level English proficiency and excellent communication skills are necessary for effective collaboration with global teams.
  • Experience in architecting analytical databases within a Data Mesh architecture is a significant advantage, aligning with our forward-thinking data strategy.
  • A basic understanding of various AWS platform services is beneficial for seamless integration and operation.
  • Previous experience working in a digitally native company, particularly within the fintech sector, is highly preferred, offering valuable insights into our industry and operational pace.
  • Experience with Python, governance tools (e.g., Atlan, Alation, Collibra), or data quality tools (e.g., Great Expectations, Monte Carlo, Soda) will be considered a strong asset, further enhancing your capabilities in data management and quality assurance.
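The delta loading mentioned above can be sketched with a simple high-water-mark filter: rather than reloading a table in full, each run extracts only rows changed since the previous load. This is a minimal illustration only; the function and field names below are hypothetical and not part of Tide's actual pipelines.

```python
from datetime import datetime

def extract_delta(rows, last_loaded_at):
    """Return rows changed since the previous load, plus the new
    high-water mark. `rows` is a list of dicts carrying an `updated_at`
    timestamp; both names are illustrative, not from the posting."""
    delta = [r for r in rows if r["updated_at"] > last_loaded_at]
    # Advance the watermark only if new rows arrived.
    new_watermark = max((r["updated_at"] for r in delta), default=last_loaded_at)
    return delta, new_watermark

# Example: two of three source rows changed after the last load.
rows = [
    {"id": 1, "updated_at": datetime(2026, 3, 1)},
    {"id": 2, "updated_at": datetime(2026, 3, 4)},
    {"id": 3, "updated_at": datetime(2026, 3, 5)},
]
delta, watermark = extract_delta(rows, datetime(2026, 3, 2))
# delta holds rows 2 and 3; the watermark advances to 2026-03-05
```

In practice the watermark would be persisted between runs (dbt's incremental models apply the same idea inside the warehouse), while log-based CDC tools capture changes directly from the database transaction log instead of comparing timestamps.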

Skills & Technologies

Python
AWS
GCP
Git
Senior
Remote



About Tide Platform Limited

Tide Platform Limited is a UK-based financial technology company providing business banking services to small and medium-sized enterprises. It offers digital current accounts, expense cards, invoicing, accounting integrations, and credit products through a mobile-first platform. Operating under an e-money licence with banking services provided by ClearBank, Tide serves over 500,000 UK businesses. The company focuses on streamlining financial administration for entrepreneurs, freelancers, and growing companies through automated features and integrations with accounting software like Xero and FreeAgent.
