Merkle Schweiz is seeking a Data & Integration Architect to design and build cloud solutions using Azure Data Factory. The role involves developing global data warehouse solutions and optimizing ETL/ELT workflows.
Responsibilities:
- You will architect, design, and build cloud solutions using Azure Data Factory.
- You will design and build global data warehouse solutions, ensuring data consistency, quality, and compliance across international data sources.
- You will develop and optimize ETL/ELT CI/CD workflows using ADF pipelines, Data Flows, Linked Services, Integration Runtimes, and Triggers.
- You will use PySpark, Kafka, Kinesis, and Python for data transformation, cleansing, and enrichment tasks within Azure Synapse or Databricks environments.
- You will collaborate with cross-functional teams to define data architecture standards, governance, and best practices.
- You will provide technical leadership and mentorship to junior engineers.
- You will ensure performance tuning, monitoring, and troubleshooting of data pipelines and workflows.
- You will report to the Vice President, Data Engineering Lead.

Required Experience:
- 9+ years of experience in data warehousing/engineering.
- 3+ years of experience in Azure Data Factory architecture and implementation (migration or new implementation).
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Experience with ADF components: Pipelines, Datasets, Linked Services, Integration Runtime, Data Flows, and Triggers.
- Proven experience in building and managing global data warehouse solutions, integrating data from multiple countries and ensuring localization and compliance.
- Experience with the Azure tool stack.
- Experience in Python, PySpark, Kafka, and Kinesis for data processing and scripting.
- Familiarity with Azure Synapse Analytics, Azure Data Lake, and Azure Key Vault.
- Hands-on experience with any ETL tool; Informatica PowerCenter/Cloud and Oracle PL/SQL preferred.
- Good understanding of modern ELT practices, data ingestion patterns, and streaming pipelines.
- Knowledge of data modeling, data governance, and data security principles.
- Experience with data privacy regulations (e.g., GDPR, HIPAA) in multi-country data environments.
- Experience with pipeline automation tools like Fivetran or custom connectors.
- Expertise in Databricks (on Azure) for large-scale data engineering and transformation workflows, including the use of PySpark, Scala, Delta Lake, and MLflow.
- Familiarity with notebook-based collaboration and version-controlled data pipelines.
- Proficiency in SQL (T-SQL or Spark SQL) for developing complex queries, views, stored procedures, and optimizations.
- Solid experience in Python, especially data manipulation libraries like pandas and NumPy, and integration with PySpark.
- Experience building and consuming REST APIs for data exchange.
- Familiarity with OAuth 2.0, token-based authentication, and secure API practices in cloud environments.
- Working knowledge of Microsoft Fabric (OneLake, Lakehouse, Notebooks, Pipelines) as an interactive environment for unified data analytics and collaborative workflows across Power BI, Synapse, and Data Engineering workloads.
- Experience with Azure Databricks Unity Catalog and Azure Purview for data cataloging and lineage.
- Experience working with structured, semi-structured (JSON, Parquet), and unstructured data.
- Schema design and optimization for performance.
- Azure certifications (e.g., Azure Data Engineer Associate, Azure Solutions Architect Expert) are a plus.
- Experience with CI/CD pipelines for data solutions using Azure DevOps is a plus.
- Experience with stored procedures in SQL Server and Oracle is a plus.

The annual salary range for this position is $113,000-$182,850. Placement within the salary range is based on a variety of factors, including relevant experience, knowledge, skills, and other factors permitted by law. Benefits available with this position include: medical, vision, and dental insurance; life insurance; short-term and long-term disability insurance; 401(k); flexible paid time off; at least 15 paid holidays per year; paid sick and safe leave; and paid parental leave.
Dentsu also complies with applicable state and local laws regarding employee leave benefits, including, but not limited to, providing time off pursuant to the Colorado Healthy Families and Workplaces Act, in accordance with its plans and policies. For further details regarding dentsu benefits, please visit www.dentsubenefitsplus.com.

To begin the application process, please click on the "Apply" button at the top of this job posting. Applications will be reviewed on an ongoing basis, and qualified candidates will be contacted for next steps.

Beware of Job Scams
We are aware of several scams targeting job seekers and candidates. Please be vigilant. All communication throughout the recruitment process will come from an official member of the dentsu recruitment team, using corporate email addresses (e.g., @dentsu.com or @merkle.com). We will never ask you to send money or vouchers to secure employment. If you suspect you have been a victim of a scam, please report the incident to your bank, local police, or fraud protection authority immediately. You can also report the scam to us at jobfraud@dentsu.com so we can take appropriate action to request that the website be taken down. Please note that Merkle and dentsu are not responsible for any losses incurred as a result of these scams. We advise all individuals to exercise caution and verify the authenticity of any job offers or communications received.