This job has expired

This position was posted on September 15, 2025 and is likely no longer accepting applications. It is kept here for historical reference.


Data Analytics Intern

Job Overview

Location

Zurich

Job Type

Full-time

Category

Data Engineer

Date Posted

September 15, 2025

Full Job Description

📋 Description

  • Own the end-to-end design and implementation of a scalable data pipeline that ingests, cleans, and stores terabytes of high-frequency sensor, actuator, and telemetry data generated by our fleet of autonomous excavators and simulation environments.
  • Architect and maintain cloud-native data infrastructure (AWS S3, Redshift, Glue, Lambda, Step Functions) that guarantees sub-minute latency for operational dashboards and supports long-term trend analytics for strategic decision-making.
  • Build robust ETL/ELT workflows that transform raw machine logs, LiDAR point clouds, GNSS tracks, and CAN-bus signals into curated, query-ready datasets that power both engineering debug sessions and executive KPI reports.
  • Develop intuitive data models and schema designs that balance query performance with storage cost; continuously profile and optimize queries to cut dashboard load times by 50% and reduce monthly cloud spend.
  • Create automated data-quality monitors and anomaly-detection alerts that flag missing packets, sensor drift, or unexpected machine behavior within minutes—enabling field engineers to react before costly downtime occurs.
  • Collaborate daily with perception, controls, and simulation teams to translate research-grade prototypes into production-grade analytics features; your code will directly influence how operators interact with machines in our gamified control center.
  • Design self-service BI tools (Metabase, Superset) that let non-technical stakeholders slice and dice data without writing SQL; empower finance, operations, and customer-success teams to answer their own questions in real time.
  • Build lightweight Python/C++ micro-services that expose REST and gRPC endpoints for on-demand data retrieval, enabling downstream ML training pipelines to pull fresh datasets with zero manual intervention.
  • Establish rigorous version control, CI/CD, and automated testing for every data artifact—treat data as a product, ensuring reproducibility and rollback capabilities across releases.
  • Document architecture decisions, data dictionaries, and runbooks in Notion and Confluence so that new team members can ramp up within days, not weeks.
  • Present weekly findings to the executive team: translate complex statistical insights into clear narratives that influence roadmap priorities, customer pricing, and go-to-market strategy.
  • Champion a culture of data-driven experimentation by running A/B tests on operator UI changes and quantifying the impact on productivity, safety, and fuel efficiency.
  • Contribute to open-source communities and publish technical blog posts about novel techniques in construction-tech data engineering—position Gravis as a thought leader in heavy-machine autonomy analytics.
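To give a concrete flavor of the data-quality monitoring described above, here is a minimal sketch of how missing-packet and sensor-drift checks on telemetry might look. All names, thresholds, and sample values are hypothetical illustrations, not Gravis code; a production monitor would tune thresholds per sensor and stream data rather than use in-memory lists.

```python
from statistics import mean, stdev

# Hypothetical thresholds -- a real pipeline would tune these per sensor.
MAX_GAP_S = 0.5   # flag gaps between consecutive packets longer than this
DRIFT_Z = 3.0     # flag readings more than 3 sigma from the baseline window


def find_gaps(timestamps, max_gap=MAX_GAP_S):
    """Return (prev, curr) timestamp pairs whose spacing exceeds max_gap."""
    return [(a, b) for a, b in zip(timestamps, timestamps[1:]) if b - a > max_gap]


def find_drift(values, baseline, z=DRIFT_Z):
    """Return indices of readings that deviate beyond z sigmas of a baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [i for i, v in enumerate(values) if abs(v - mu) > z * sigma]


# Toy telemetry: 10 Hz packets with one dropped burst, and one outlier reading.
ts = [0.0, 0.1, 0.2, 0.3, 1.2, 1.3]                  # gap between 0.3 and 1.2
baseline = [20.0, 20.1, 19.9, 20.0, 20.2, 19.8]      # nominal sensor window
readings = [20.1, 19.9, 26.0, 20.0]                  # index 2 has drifted

print(find_gaps(ts))                    # [(0.3, 1.2)]
print(find_drift(readings, baseline))   # [2]
```

In practice these checks would run on each micro-batch as it lands, with flagged gaps and drift indices routed to an alerting channel so field engineers can react within minutes, as the posting describes.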

🎯 Requirements

  • Bachelor’s degree (or higher) in Computer Science, Data Engineering, Robotics, or a related quantitative field
  • Demonstrated experience designing and operating production-grade data pipelines using Python or C++ in cloud environments (AWS preferred)
  • Solid SQL skills and familiarity with relational and columnar databases; hands-on experience with data warehousing concepts
  • Clear written and verbal communication in English; ability to explain technical trade-offs to both engineers and business stakeholders
  • Nice-to-have: prior exposure to time-series data, robotics telemetry, or construction-equipment domains; familiarity with infrastructure-as-code tools like Terraform or Pulumi

🏖️ Benefits

  • Six-month paid internship located in vibrant Zurich with a fair market salary and the possibility of extension or conversion to a full-time role
  • Hybrid work model and flexible hours—tailor your schedule around deep-focus coding blocks or field visits to live construction sites
  • Direct mentorship from founders and senior engineers who have over a decade of experience in large-scale robotics and machine learning
  • Access to cutting-edge sensor suites, high-performance GPU clusters, and a rooftop testbed where you can see your data pipelines drive real 20-ton excavators autonomously
  • International, inclusive team culture with regular team retreats, hackathons, and a generous learning budget for courses, conferences, and certifications

Skills & Technologies

Python
AWS
Junior
Remote
Degree Required


About Gravis Robotics AG

Gravis Robotics AG develops AI-driven software that transforms heavy construction machines into autonomous robots. The Zurich-based spin-off of ETH Zurich combines machine learning, perception, and motion-planning to let excavators, dozers, and compactors work without human operators, increasing productivity and safety on job sites while reducing fuel consumption and emissions.

