Granica is hiring a Software Engineer - Data Platform (India, Remote)

Responsibilities

  • Design and implement scalable and efficient data platforms.
  • Collaborate with cross-functional teams to define, design, and ship new features.
  • Ensure the best possible performance, quality, and responsiveness of applications.
  • Identify and correct bottlenecks and fix bugs.
  • Help maintain code quality, organization, and automation.
  • Implement security and data protection.
  • Write clean, scalable, and efficient code using best practices.
  • Create and maintain technical documentation.
  • Participate in code reviews and pair programming sessions.
  • Stay up-to-date with emerging technologies and industry trends.
  • Work on data migration, integration, and ETL processes.
  • Develop and maintain data pipelines and data warehouses.
  • Ensure data accuracy, consistency, and integrity.
  • Implement data governance and compliance measures.
  • Collaborate with data scientists and analysts to support data-driven decision-making.
  • Provide technical support and troubleshooting for data-related issues.
  • Optimize data storage and retrieval processes.
  • Develop and maintain data dashboards and reporting tools.
  • Implement data security and access control measures.
  • Conduct data analysis and reporting to support business objectives.
  • Collaborate with stakeholders to understand data requirements and deliver solutions.
  • Participate in on-call rotations and provide 24/7 support as needed.
  • Contribute to the development and maintenance of data infrastructure.
  • Work on data modeling and database design.
  • Implement data validation and cleansing processes.

Nice to Have

  • Experience with data lakes and data warehouses.
  • Experience with data governance and compliance measures.
  • Experience with data security and access control measures.
  • Experience with data validation and cleansing processes.
  • Experience with data analysis and reporting tools.
  • Experience with Agile methodologies and version control systems such as Git.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Experience with big data technologies such as Hadoop, Spark, or Kafka.
  • Experience with databases such as MySQL, PostgreSQL, or NoSQL databases.
  • Experience with ETL tools such as Apache NiFi, Talend, or Informatica.
  • Experience with data modeling and database design.
  • Experience with data warehousing solutions such as Redshift, BigQuery, or Snowflake.
  • Experience with programming languages such as Java, Python, or Scala.
  • Experience with data pipelines and data warehouses.
  • Experience with data migration, integration, and ETL processes.
  • Experience with data dashboards and reporting tools.

What You'll Get

  • Competitive salary and benefits package
  • Remote work with flexible hours
  • Opportunity to work on cutting-edge technologies
  • A collaborative, innovative team that is global and diverse
  • Exposure to multiple projects across the data platform

What You'll Need

  • Proven experience as a Software Engineer or similar role.
  • Strong proficiency in programming languages such as Java, Python, or Scala.
  • Experience with big data technologies such as Hadoop, Spark, or Kafka.
  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.
  • Experience with databases such as MySQL, PostgreSQL, or NoSQL databases.
  • Experience with data warehousing solutions such as Redshift, BigQuery, or Snowflake.
  • Experience with ETL tools such as Apache NiFi, Talend, or Informatica.
  • Experience with data modeling and database design.
  • Experience with data governance and compliance measures.
  • Experience with data security and access control measures.
  • Experience with data validation and cleansing processes.
  • Experience with data analysis and reporting tools.
  • Experience with Agile methodologies and version control systems such as Git.
  • Strong problem-solving skills and attention to detail.
  • Ability to work independently and in a team environment.
  • Excellent communication and interpersonal skills.
  • Ability to manage multiple tasks and prioritize workload effectively.
  • Ability to work in a fast-paced and dynamic environment.
  • Ability to adapt to new technologies and tools quickly.
  • Comfortable working remotely on a flexible schedule.
  • Willingness to participate in a 24/7 on-call support rotation.
  • Ability to collaborate effectively within a global, diverse team.

No visa sponsorship available

About Granica
Granica is an AI research and systems company building infrastructure for intelligence that is structured, efficient, and deeply integrated with data, operating at exabyte scale and processing petabytes of data daily for prominent enterprises.
Job Details

Department: Engineering
Category: Other
Posted: 3 months ago