Merkle (Dentsu) is hiring a Data & Integration Architect to architect, design, and build cloud solutions on Azure Data Factory. In this role, you will design and build global data warehouse solutions that keep data consistent, high-quality, and compliant across international data sources.
What You'll Do
- Architect, design, and build Cloud solutions using Azure Data Factory.
- Design and build global data warehouse solutions, ensuring data consistency, quality, and compliance across international data sources.
- Develop and optimize ETL/ELT workflows and their CI/CD processes using ADF pipelines, Data Flows, Linked Services, Integration Runtimes, and Triggers.
- Use PySpark, Kafka, Kinesis, and Python for data transformation, cleansing, and enrichment tasks within Azure Synapse or Databricks environments.
- Collaborate with cross-functional teams to define data architecture standards, governance, and best practices.
- Provide technical leadership and mentorship to junior engineers.
- Ensure performance tuning, monitoring, and troubleshooting of data pipelines and workflows.
What We're Looking For
- 9+ years of experience in data warehousing/engineering.
- 3+ years of experience in Azure Data Factory architecture and implementation (migration or new implementation).
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience with ADF components: Pipelines, Datasets, Linked Services, Integration Runtime, Data Flows, and Triggers.
- Proven experience in building and managing global data warehouse solutions, integrating data from multiple countries and ensuring localization and compliance.
- Experience with the broader Azure tool stack.
- Experience with Python, PySpark, Kafka, and Kinesis for data processing and scripting.
- Familiarity with Azure Synapse Analytics, Azure Data Lake, and Azure Key Vault.
- Hands-on experience with any ETL tool.
- Good understanding of modern ELT practices, data ingestion patterns, and streaming pipelines.
- Knowledge of data modeling, data governance, and data security principles.
- Expertise in work estimation and resource management.
- Experience with data privacy regulations (e.g., GDPR, HIPAA) in multi-country data environments.
- Experience with pipeline automation tools like Fivetran or custom connectors.
- Expertise in Databricks (on Azure) for large-scale data engineering and transformation workflows, including the use of PySpark, Scala, Delta Lake, and MLflow.
- Familiarity with Notebook-based collaboration and version-controlled data pipelines.
- Proficiency in SQL (T-SQL or Spark SQL) for developing complex queries, views, and stored procedures, and for query optimization.
- Solid experience in Python, especially data-manipulation libraries such as pandas and NumPy, and their integration with PySpark.
- Experience building and consuming REST APIs for data exchange.
- Familiarity with OAuth 2.0, token-based authentication, and secure API practices in cloud environments.
- Working knowledge of Microsoft Fabric (OneLake, Lakehouse, Notebooks, Pipelines) as an interactive environment for unified data analytics and collaborative workflows across Power BI, Synapse, and Data Engineering workloads.
- Experience with Azure Databricks Unity Catalog and Azure Purview for data cataloging and lineage.
- Experience working with structured, semi-structured (JSON, Parquet), and unstructured data.
- Experience with schema design and optimization for performance on Azure.
Nice to Have
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect Expert).
- Experience with CI/CD pipelines for data solutions using Azure DevOps.
- Experience with stored procedures in SQL Server and Oracle.
- Experience with Informatica PowerCenter/Cloud and Oracle PL/SQL.
Technical Stack
- Azure Data Factory, PySpark, Kafka, Kinesis, Python, Azure Synapse, Databricks, Scala, Delta Lake, MLflow, SQL (T-SQL, Spark SQL), pandas, NumPy, REST APIs, OAuth 2.0, Microsoft Fabric, Azure Databricks Unity Catalog, Azure Purview, Azure DevOps, Informatica PowerCenter/Cloud, Oracle PL/SQL, Fivetran
Team & Environment
You will report to the Vice President, Data Engineering Lead.
Benefits & Compensation
- Medical, vision, and dental insurance
- Life insurance
- Short-term and long-term disability insurance
- 401k
- Flexible paid time off
- At least 15 paid holidays per year
- Paid sick and safe leave
- Paid parental leave
- Compensation range: $113,000-$182,850
Work Mode
This is a remote-friendly position open to candidates based in Ohio, USA.
Dentsu is committed to providing equal employment opportunities to all applicants and employees. We do this without regard to race, color, national origin, sex, sexual orientation, gender identity, age, pregnancy, childbirth or related medical conditions, ancestry, physical or mental disability, marital status, political affiliation, religious practices and observances, citizenship status, genetic information, veteran status, or any other basis protected under applicable federal, state, or local law.