Groupon is hiring a Data Reliability Engineer II

Bangalore (Gopalan Axis SEZ) · Hybrid

About the Role

We are looking for a Data Reliability Engineer II to join our team. In this role, you will design, build, and maintain reliable, scalable data systems. You will work closely with cross-functional teams to ensure data integrity and performance, and collaborate on projects that drive business value through data.

Responsibilities

  • Design and implement data pipelines and systems to ensure reliability and scalability.
  • Monitor and maintain data infrastructure to prevent and resolve issues.
  • Collaborate with data engineers, analysts, and other stakeholders to understand data requirements and deliver solutions.
  • Develop and implement data quality checks and monitoring tools.
  • Optimize data storage and retrieval processes to enhance performance.
  • Ensure data security and compliance with relevant regulations and standards.
  • Troubleshoot and resolve data-related issues and outages.
  • Create and maintain documentation for data systems and processes.
  • Participate in on-call rotations to provide 24/7 support for data systems.
  • Stay updated with the latest trends and best practices in data engineering and reliability.
  • Contribute to the development and improvement of data engineering standards and practices.
  • Work with cross-functional teams to integrate data systems with other business applications.
  • Perform data audits and assessments to identify and mitigate risks.
  • Implement automated testing and validation processes for data systems.
  • Provide technical guidance and mentorship to junior team members.
  • Develop and maintain data dashboards and reporting tools.
  • Ensure data systems are scalable to handle increasing data volumes and user demands.
  • Collaborate with IT and operations teams to manage data infrastructure and resources.
  • Participate in the design and implementation of data governance policies and procedures.
  • Conduct performance tuning and optimization of data systems.
  • Implement data backup and recovery solutions to ensure data availability.
  • Work with vendors and third-party providers to integrate data systems and services.
  • Develop and maintain data migration strategies and plans.

Nice to Have

  • Master's degree in Computer Science, Engineering, or a related field.
  • Certification in data engineering or a related field.
  • Experience with big data technologies, such as Hadoop or Spark.
  • Familiarity with machine learning and data science principles.
  • Experience with data streaming and real-time processing.
  • Knowledge of data lake architectures and data mesh frameworks.
  • Experience with data governance and compliance tools, such as Collibra or Alation.
  • Familiarity with data cataloging and metadata management tools.
  • Experience with data lineage and impact analysis tools.
  • Knowledge of data privacy and protection regulations, such as GDPR or CCPA.
  • Experience with data anonymization and pseudonymization techniques.
  • Familiarity with data governance and management frameworks, such as DAMA-DMBOK.
  • Experience with data quality and validation frameworks, such as Great Expectations or Deequ.
  • Knowledge of data integration and API development frameworks, such as Apache Camel or MuleSoft.
  • Experience with data visualization and reporting tools, such as Looker or Power BI.
  • Familiarity with data warehousing and ETL tools, such as Talend or Informatica.
  • Experience with data modeling and database design tools, such as ER/Studio or Toad Data Modeler.
  • Knowledge of data encryption and security protocols, such as AES or RSA.
  • Experience with data backup and recovery solutions, such as Veeam or Commvault.



What You'll Need

  • Bachelor's degree in Computer Science, Engineering, or a related field.
  • Proven experience as a Data Reliability Engineer or in a similar role.
  • Strong knowledge of data engineering principles and best practices.
  • Experience with data pipeline tools and technologies, such as Apache Kafka, Apache Spark, and Airflow.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Experience with cloud platforms, such as AWS, Azure, or Google Cloud.
  • Strong problem-solving and analytical skills.
  • Excellent communication and teamwork skills.
  • Experience with data warehousing and ETL processes.
  • Knowledge of data security and compliance standards.
  • Experience with monitoring and logging tools, such as Prometheus, Grafana, and ELK Stack.
  • Familiarity with containerization and orchestration tools, such as Docker and Kubernetes.
  • Experience with SQL and NoSQL databases.
  • Knowledge of data modeling and database design.
  • Experience with data visualization tools, such as Tableau or Power BI.


Our Benefits

  • Competitive salary and benefits package.
  • Hybrid work environment with a mix of remote and on-site work.
  • Join a dynamic and collaborative team of data professionals.
  • Visa sponsorship is available for eligible candidates.


About Groupon

Groupon is a marketplace where customers discover new experiences and services every day and where local businesses thrive. Its mission is to revolutionise the underserved local experiences and services market.

Job Details

Department: Data Reliability Engineering (PRE)
Posted: 3 hours ago