Responsibilities
- Create and manage data processing workflows using Azure Databricks
- Develop and refine data transformations with PySpark and SQL within the Databricks environment
- Implement and maintain Lakehouse architectures using Delta Lake
- Build and orchestrate ETL and ELT pipelines using Azure Data Factory
- Consolidate data from diverse sources into unified data platforms and analytical systems
- Design and manage data models and warehouse schemas for analytical use
- Maintain high standards for data accuracy, system scalability, and processing efficiency
- Partner with BI teams to deliver curated datasets for Power BI and other reporting tools
- Support and enhance legacy SQL Server platforms and SSIS-based ETL processes as needed
- Help shape modern data architectures built on cloud infrastructure