
Senior DevOps Engineer (Cloud)

Job Overview

Location

India

Job Type

Full-time

Category

Data Science

Date Posted

March 11, 2026

Full Job Description

📋 Description

  • Join phData, a leader in the modern data stack, known for strategic partnerships with premier cloud data platforms including Snowflake, AWS, Azure, GCP, Fivetran, Pinecone, Glean, and dbt. We help global enterprises conquer their most complex data challenges.
  • phData is a remote-first global company, with teams across the United States, Latin America, and India. We value and celebrate our team members' diverse cultures and cultivate an environment of technological curiosity, ownership, and trust. Despite rapid growth, we maintain a dynamic, engaging work atmosphere where top performers have the autonomy to achieve exceptional results.
  • Our accolades include being a 6x Snowflake Partner of the Year (2020-2025) and Partner of the Year for Fivetran, dbt, and AWS. We hold the #1 position in Snowflake Advanced Certifications and more than 600 expert cloud certifications across platforms such as Sigma, AWS, and Azure.
  • We are seeking skilled Senior DevOps Engineers to join our growing Elastic Operations and services team in Bangalore, India, as we scale to meet escalating customer demand for advanced data and platform solutions.
  • As a Senior DevOps Engineer on our Consulting Team, you will lead technical delivery for critical projects involving Snowflake, cloud platforms (AWS/Azure), and cloud-hosted services, ensuring the smooth operation and continuous improvement of sophisticated data environments.
  • Your core responsibilities cover the support and management of modern data platforms, from streaming data ingestion through data lakes and analytics, in a rapidly evolving technical landscape.
  • Continuously learn and adapt to new technologies, staying ahead of the curve in a quickly changing field to provide the best solutions for our clients.
  • Take ownership of task execution and guide other engineers on project-related tasks, ensuring alignment and efficient progress.
  • Respond to critical incidents via on-call pager duty, tackling complex and challenging problems head-on; this requires a deep understanding of customer processes and workflows to diagnose and resolve issues effectively.
  • Demonstrate clear ownership and accountability across multiple simultaneous customer accounts, each with its own technical stack and requirements.
  • Consistently grow your skills, learn new technologies, and stay current with the latest advancements in the Managed Services technology stack.
  • This role operates on a 24x7 rotational shift basis, requiring flexibility and commitment to provide continuous support for our global client base.
  • Use your working knowledge of SQL to write, debug, and optimize complex queries, ensuring data integrity and performance.
  • Provide operational support across a large user base for cloud-native data warehouses such as Snowflake and/or Redshift, ensuring high availability and performance.
  • Hands-on experience with relational database management systems such as Oracle or MSSQL is crucial for managing and optimizing database operations.
  • Experience on a 24x7 production support team, monitoring and supporting scheduled data jobs and pipelines (ETL/ELT), is vital for maintaining operational continuity.
  • Proficiency in Unix/Linux environments is required for system administration and troubleshooting.
  • A basic understanding of writing and optimizing Python programs will be beneficial for scripting and automation tasks.
  • Experience with cloud-native data technologies in AWS or Azure is essential for managing and deploying cloud infrastructure.
  • Familiarity with ITIL processes and SLA-driven support environments ensures adherence to best practices and service level agreements.
  • Strong troubleshooting and performance-tuning skills are necessary to identify and resolve complex technical issues efficiently.
  • Excellent client-facing communication skills, both written and verbal, are required to interact with stakeholders and provide clear technical explanations.
  • A proactive attitude toward learning new technology stacks and up-skilling/training other team members fosters a collaborative, knowledgeable team environment.
  • Preferred candidates will have production support experience and relevant certifications in core data platforms such as Snowflake, AWS, Azure, or Databricks.
  • Experience with Qlik Sense support tasks is a plus.
  • Familiarity with cloud and distributed data storage technologies such as S3, ADLS, HDFS, or other NoSQL storage systems is advantageous.
  • Experience with data integration technologies such as Spark, Kafka, event/streaming platforms, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory, or similar tools is highly desirable.
  • Experience with workflow management and orchestration tools such as Airflow, AWS Managed Airflow, Luigi, or NiFi is beneficial.
  • Expertise in scripting languages, particularly Python, for automating repetitive tasks is preferred.
  • Knowledge of continuous integration and deployment frameworks, with hands-on experience using CI/CD tools such as Bitbucket, GitHub, Flyway, and Liquibase, is a significant advantage.
  • A Bachelor's degree in Computer Science or a related field is preferred.

Skills & Technologies

Python
AWS
Azure
GCP
GitHub
DevOps
Senior
Remote
Degree Required



About phData, Inc.

phData is a professional services firm that specializes in designing, deploying, and managing modern data platforms based on Snowflake, AWS, Azure, and GCP. The company provides strategy, migration, architecture, machine learning, and managed services to large enterprises, helping them centralize, secure, and operationalize data at scale. Founded in 2014 and headquartered in Minneapolis, phData supports clients across North America and Europe with certified engineers and 24x7 support. Its offerings span data engineering, governance, analytics, and DevOps automation, enabling organizations to accelerate analytics adoption and reduce total cost of ownership for cloud data initiatives.
