
Job Overview
Location
Australia
Job Type
Full-time
Category
Data Engineer
Date Posted
March 17, 2026
Full Job Description
📋 Description
- Join a transformative Data Engineering Internship at EverCommerce, a leading force in digitally transforming the service economy. As a rising junior or senior, you will get a dynamic, hands-on experience contributing to critical projects on our Data and Analytics (DnA) team. The internship is designed to build practical skills and give you exposure to enterprise-level data operations, supporting both the company-wide intern program and the DnA team's processes.
- Develop, test, and maintain robust data pipelines using Python and SQL within our enterprise data platform. Your work will directly affect the efficiency and reliability of our data infrastructure, keeping data flowing smoothly across the organization.
- Support data ingestion and integration, using Fivetran to bring data from internal and third-party systems into our platform. You will gain a clear understanding of how diverse data sources are unified and made accessible for analysis.
- Develop and execute ETL/ELT workflows orchestrated with Apache Airflow, learning how complex data transformation processes are scheduled, managed, and monitored to ensure timely, accurate delivery.
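Airflow expresses a workflow as a DAG of dependent tasks that the scheduler runs in dependency order. A minimal stdlib sketch of that idea, using Python's `graphlib` and hypothetical task names (not EverCommerce's actual pipelines):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG wires extract >> transform >> validate >> load.
PIPELINE = {
    "extract": set(),
    "transform": {"extract"},
    "validate": {"transform"},
    "load": {"validate"},
}

def run_order(graph):
    """Return tasks in an order that respects every dependency."""
    return list(TopologicalSorter(graph).static_order())

print(run_order(PIPELINE))  # ['extract', 'transform', 'validate', 'load']
```

In Airflow itself, each key would be an operator and each dependency an edge declared with `>>`; the scheduler adds retries, backfills, and monitoring on top of this ordering.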
- Contribute to data transformations and analytics models using dbt (data build tool), applying established best practices in data modeling and transformation so data is structured optimally for analytics and business intelligence.
- Support data processing tasks in Databricks, working with Spark-based workloads under the guidance of experienced senior engineers and gaining insight into large-scale data processing and distributed computing frameworks.
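Spark's core pattern is split-apply-combine: data is partitioned, each partition is transformed in parallel, and results are merged. A stdlib stand-in for a Spark-style word count (on Databricks, Spark would run the flatMap/reduceByKey stages across executors; here each "partition" is a plain list processed locally, just to show the shape of the computation):

```python
from collections import Counter
from itertools import chain

# Hypothetical partitions of a text dataset.
partitions = [
    ["alpha beta", "beta gamma"],
    ["gamma gamma"],
]

def count_words(parts):
    """Combine per-line word counts across all partitions."""
    lines = chain.from_iterable(parts)
    words = chain.from_iterable(line.split() for line in lines)
    return Counter(words)

print(count_words(partitions))  # gamma: 3, beta: 2, alpha: 1
```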
- Assist in deploying, monitoring, and troubleshooting data workloads in Amazon Web Services (AWS), focusing on S3 and various compute services. Hands-on cloud experience is crucial for modern data engineering roles.
- Perform critical data quality checks, validation, and debugging. Your diligence here ensures the accuracy, reliability, and trustworthiness of the data that powers our business decisions.
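Data quality checks typically boil down to rules applied per record: required fields present, values in expected ranges. A minimal sketch with hypothetical rules and field names (the real checks would reflect the platform's actual schemas):

```python
def check_record(record):
    """Return a list of quality issues found in one record (empty = clean)."""
    issues = []
    # Rule 1: required identifier must be present and non-empty.
    if not record.get("order_id"):
        issues.append("missing order_id")
    # Rule 2: amount must be a non-negative number.
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        issues.append("invalid amount")
    return issues

records = [
    {"order_id": "A-100", "amount": 42.5},
    {"order_id": "", "amount": -1},
]
bad = [r for r in records if check_record(r)]
print(len(bad))  # 1
```

In practice checks like these run as pipeline steps (e.g. an Airflow task or dbt test) that fail loudly or quarantine bad rows rather than letting them flow downstream.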
- Create and maintain comprehensive documentation for data pipelines, models, and integrations. Clear, thorough documentation is vital for knowledge sharing, team collaboration, and the long-term maintainability of data systems.
- Collaborate closely with experienced data engineers, analytics engineers, and other cross-functional partners, learning how data initiatives align with broader business objectives and how to translate business requirements into technical solutions.
- Throughout the internship, you will learn how to design and operate scalable, cloud-based data pipelines in a real-world enterprise setting, becoming proficient with modern data tools including Airflow, Databricks, Fivetran, dbt, and AWS, and with best practices in data modeling, transformation, and analytics engineering.
- You will learn the practical applications of orchestration, automation, and monitoring in managing complex data systems, along with techniques for ensuring data quality, reliability, and observability across diverse data sources. You will also work with large datasets using distributed processing frameworks and gain an understanding of enterprise-scale data architecture and integration patterns.
- You will also be exposed to professional software engineering practices, including version control, code reviews, documentation, and testing, preparing you for a career in data engineering.
Skills & Technologies
Python
AWS
Apache Spark
Junior
Remote
Degree Required
About EverCommerce Inc.
EverCommerce provides vertical software and payments platforms for service-based small and medium businesses. The company offers integrated SaaS solutions that streamline operations, marketing, customer engagement, and payments across health and wellness, home services, and fitness markets. Its cloud-based tools help businesses manage scheduling, billing, client relationships, and workforce operations. EverCommerce serves over 700,000 customers globally through a portfolio of brands including DrChrono, ServiceTitan, and Mindbody. The company went public in 2021 and is headquartered in Denver, Colorado.


