phData is hiring a Senior DevOps Engineer responsible for the technical delivery of projects involving Snowflake, the AWS/Azure cloud platforms, and cloud-hosted services. This role focuses on supporting modern data platforms, responding to incidents, and managing tasks across multiple customer accounts from our Bangalore location.
What You'll Do
- Support and manage modern data platforms, including streaming, data lakes, and analytics, across a continually evolving technical stack.
- Own the execution of your tasks and guide other engineers on theirs within the project.
- Respond to pager incidents, solve challenging problems, and investigate customer processes and workflows to resolve issues.
- Demonstrate clear ownership of tasks on multiple simultaneous customer accounts across various technical stacks.
- Continually grow, learn, and stay up-to-date with the Managed Services technology stack.
- Work 24/7 rotational shifts.
What We're Looking For
- Working knowledge of SQL and the ability to write, debug, and optimise SQL queries.
- Experience providing operational support across a large user base for a cloud-native data warehouse like Snowflake and/or Redshift.
- Hands-on experience with a Relational Database Management System such as Oracle or MSSQL.
- Experience in a 24/7 production support team, monitoring and supporting scheduled data jobs and pipelines (ETL/ELT).
- Working knowledge of Unix/Linux environments.
- Basic understanding of writing and optimising Python programs.
- Experience with cloud-native data technologies in AWS or Azure.
- Familiarity with ITIL processes and working in SLA-driven support environments.
- Strong troubleshooting and performance tuning skills.
- Client-facing written and verbal communication skills and experience.
- Openness to learning new technology stacks and up-skilling/training other team members.
Nice to Have
- Production support experience and certifications in core data platforms like Snowflake, AWS, Azure, or Databricks.
- Production support experience with Qlik Sense.
- Production support experience with cloud and distributed data storage technologies such as S3, ADLS, HDFS, or other NoSQL storage systems.
- Production support experience with data integration technologies such as Spark, Kafka, event/streaming platforms, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory, or similar.
- Production support experience with workflow management and orchestration tools such as Airflow, Amazon Managed Workflows for Apache Airflow (MWAA), Luigi, or NiFi.
- Expertise in a scripting language to automate repetitive tasks, preferably Python.
- Knowledge of continuous integration and deployment (CI/CD) frameworks, with hands-on experience using tools such as Bitbucket, GitHub, Flyway, and Liquibase.
- Bachelor's degree in Computer Science or a related field.
Technical Stack
- Languages & Databases: SQL, Python, Snowflake, Redshift, Oracle, MSSQL
- Cloud & Infrastructure: AWS, Azure, Unix/Linux, S3, ADLS, HDFS
- Data & Integration: Spark, Kafka, Matillion, Fivetran, HVR, NiFi, AWS Database Migration Service, Azure Data Factory
- Orchestration & Tools: Airflow, Luigi, Bitbucket, GitHub, Flyway, Liquibase
Team & Environment
You will join the Elastic Operations and Services team at phData.
Benefits & Compensation
- Competitive compensation plan
- Annual bonus
- Training and certifications
- Equity
Work Mode
This role is based locally in Bangalore, India. phData is a remote-first global company.
phData celebrates diversity and is committed to creating an inclusive environment for all employees. We are proud to be an equal opportunity employer.