About the Role
Design and build robust data pipelines and architectures on Azure to support large-scale data processing and analytics.
Responsibilities
- Develop and maintain data workflows using Azure Data Factory
- Implement data storage solutions with Azure Data Lake and Blob Storage
- Create ETL processes for structured and unstructured data sources
- Optimize data models for performance and scalability
- Collaborate with data analysts and scientists to understand requirements
- Ensure data quality and integrity across systems
- Support integration of data from multiple enterprise systems
- Monitor and troubleshoot data pipeline issues
- Apply infrastructure-as-code principles using ARM templates or Bicep
- Follow security and compliance standards for data handling
- Document technical designs and system configurations
- Participate in code reviews and system testing
- Contribute to continuous improvement of data platforms
- Work with streaming data using Azure Event Hubs or Stream Analytics
- Integrate with Power BI for reporting and visualization
Nice to Have
- Azure Data Engineer certification or related Azure specialty certifications
- Experience with real-time data processing systems
- Background in financial or enterprise software domains
- Knowledge of machine learning pipelines
- Familiarity with containerization and orchestration tools
Compensation
Competitive salary based on experience
Work Arrangement
Hybrid work model with flexibility
Team
Collaborative team focused on cloud data solutions
Project Context
- Work within a global delivery team serving clients in North America and Europe
- Engage in long-term projects with evolving data challenges
- Focus on cloud-native solutions built on Azure architecture
Growth Opportunities
- Access to training programs and certification support
- Mentorship from experienced technical leads
- Opportunities to lead technical initiatives