The Fabric Developer plays a critical role in designing, implementing, and maintaining scalable data solutions within Microsoft Fabric. This includes developing robust SQL-based transformations, orchestrating data pipelines in Azure Data Factory, and ensuring seamless integration across various data platforms. The role is central to enabling accurate, timely, and secure data delivery for analytics and reporting in the insurance domain. Developers collaborate closely with data engineers, business analysts, and stakeholders to translate business requirements into technical solutions, while adhering to data governance, performance standards, and cloud best practices.
Responsibilities
- Design, develop, and maintain SQL-based data models and transformation logic
- Build and manage Azure Data Factory pipelines for data ingestion from diverse source systems
- Develop and enhance data solutions using Microsoft Fabric components such as Lakehouse and Data Pipelines
- Implement complete data workflows covering ingestion, transformation, and orchestration
- Ensure data accuracy, consistency, and quality across insurance-related data sources
- Optimize SQL queries and data pipelines for performance and scalability
- Collaborate with business analysts and stakeholders to meet reporting requirements in the insurance space
- Support data validation, reconciliation processes, and resolution of data issues
- Participate in code reviews, deployment activities, and production support tasks
- Contribute to continuous improvement of data architecture in alignment with enterprise and industry standards
- Work the evening shift from 2 PM to 11 PM and attend customer meetings, which are conducted via video calls
Requirements
- Extensive hands-on experience with SQL and T-SQL for data analysis and transformation tasks
- Demonstrated experience building and managing Azure Data Factory pipelines, triggers, and integrations
- Practical understanding of Microsoft Fabric including Lakehouse, Data Pipelines, and Warehouses
- Proven ability to design, build, and optimize ETL and ELT data pipelines
- Solid knowledge of data modeling concepts, including dimensional models and schema design
- Experience working with Azure Data Lake, OneLake, and cloud-based storage solutions
- Ability to manage large-scale data processing, with expertise in performance tuning
- Familiarity with Agile delivery methodologies and DevOps practices
- Strong analytical abilities and experience collaborating with engineering and business teams
Nice to Have
- Familiarity with insurance industry data domains such as claims, policies, customers, and financial data
Tech Stack
SQL, T-SQL, Azure Data Factory (ADF), Microsoft Fabric, Lakehouse, Data Pipelines, Warehouses, ETL, ELT, Azure Data Lake, OneLake, Cloud Storage, Data Modeling, Dimensional Modeling, Agile, DevOps
Benefits
- Work within an inclusive, adaptable, and forward-thinking organization
- Join a globally recognized Top Employer with expertise spanning over 50 countries
- Access a broad innovation ecosystem including collaboration with technology partners and research centers
- Eligibility for flexible work arrangements, including remote or hybrid work options
- Commitment to diversity and inclusion in all employment practices
Compensation
Not specified
Work Arrangement
Remote or hybrid options available; subject to change based on client requirements
Team
Collaborative environment involving data engineers, business analysts, and stakeholders in the insurance domain
- Inclusive
- Adaptable
- Forward-thinking
- Committed to responsible innovation
- Flexible and responsive to evolving client and employee needs
Additional Information
- Shift hours are from 2 PM to 11 PM
- All customer interactions require video calls
- Employees are typically hired near NTT DATA offices or client locations where possible
- Remote or hybrid work is offered but subject to operational and client needs
- Regular participation in sprint planning and backlog refinement sessions is expected
- On-call support may be required during critical project phases
- Professional development and certification opportunities are encouraged and supported
- All code and documentation must adhere to established version control and governance standards