HYR Global Source Inc is seeking an Azure Data Engineer in Raleigh, NC to build scalable data pipelines and deploy Azure-based data solutions. The role requires strong experience in Azure Data Factory, Databricks, and Azure Data Lake.
Job Title: Azure Data Engineer
Location: Raleigh, NC (3 Days Hybrid)
Job Type: Full-Time

Job Description
We are seeking a highly skilled Azure Data Engineer to join our data engineering team in Raleigh, NC. This hybrid role involves collaborating with cross-functional teams to build scalable data pipelines and deploy robust Azure-based data solutions. The ideal candidate will have strong experience with Azure Data Factory, Databricks, and Azure Data Lake, and a proven ability to build and optimize real-time and batch data pipelines.

Responsibilities
• Build large-scale batch and real-time data pipelines using Azure cloud platform technologies.
• Design and implement high-performance data ingestion pipelines from multiple structured and unstructured sources using Azure Databricks (a minimal sketch follows this posting).
• Lead the design and development of ETL, data integration, and migration solutions.
• Develop scalable and reusable frameworks for ingesting, processing, and transforming data sets.
• Partner with data architects, analysts, engineers, and stakeholders to deploy enterprise-grade data platforms.
• Integrate end-to-end data pipelines, from source systems to target data repositories, ensuring high data quality, reliability, and consistency.
• Work with event-driven and streaming technologies for real-time data ingestion and processing.
• Support related project components, including API integrations and search services.
• Evaluate tools and platforms against performance and business requirements.

Required Skills
• Azure Data Factory
• Azure Databricks
• Azure Data Lake Storage (ADLS)
• Azure SQL Database and Data Warehouse
• Proven hands-on expertise in:
  • Implementing and optimizing Azure cloud data solutions
  • Performance tuning in the Databricks environment
  • SQL, T-SQL, and/or PL/SQL

Working Knowledge Of
• Azure Storage Accounts
• Azure Data Catalog, Azure Data Analytics
• Logic Apps and Function Apps
• Agile methodology and tools such as JIRA
• Experience handling end-to-end data ingestion projects in an Azure environment is preferred

Follow us on LinkedIn - https://www.linkedin.com/company/hyr-global-source-inc
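To illustrate the kind of batch and real-time ingestion work this posting describes, here is a minimal PySpark sketch of a landing pipeline on Azure Databricks. It assumes a Databricks workspace with Delta Lake and Auto Loader available and access to an ADLS Gen2 account; the storage account, container, paths, and table names below are hypothetical placeholders, not details from the posting.

```python
# Minimal sketch of batch + streaming ingestion into a Delta table on Databricks.
# Assumptions: Databricks runtime (Delta Lake and Auto Loader available), and
# credentials already configured for the ADLS Gen2 account. All names below
# (storage account, container, paths, table) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

RAW_PATH = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"  # hypothetical
BRONZE_TABLE = "bronze.orders"                                        # hypothetical

# Batch ingestion: read structured CSV files from the data lake and append them
# to a Delta table, stamping each row with a load timestamp for auditing.
batch_df = (
    spark.read
    .option("header", "true")
    .csv(RAW_PATH)
    .withColumn("_ingested_at", F.current_timestamp())
)
batch_df.write.format("delta").mode("append").saveAsTable(BRONZE_TABLE)

# Streaming ingestion: Auto Loader (cloudFiles) picks up newly arriving JSON
# files incrementally, one common way to cover the real-time side of the
# pipelines described above.
stream_df = (
    spark.readStream
    .format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", RAW_PATH + "_schema/")
    .load(RAW_PATH)
    .withColumn("_ingested_at", F.current_timestamp())
)
(
    stream_df.writeStream
    .option("checkpointLocation", RAW_PATH + "_checkpoint/")
    .toTable(BRONZE_TABLE)
)
```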
Tiger Analytics is seeking a Principal Data Engineer with expertise in Azure and Databricks to design and build scalable data ingestion pipelines. This role involves collaborating with multiple teams to deliver high-performance data solutions.
HYR Global Source Inc is seeking an Azure Data Engineer in Columbus, OH to build scalable data pipelines and deploy Azure-based data solutions. This full-time hybrid role requires collaboration with cross-functional teams and expertise in Azure technologies.
Relias LLC is seeking an Azure Data Engineer - Integrations in Morrisville, NC, to build and maintain ETL pipelines for HRIS and ATS integrations. This role involves collaboration with cross-functional teams to deliver actionable insights and scalable solutions.
Infosys Limited Digital is seeking an Azure Data Engineer in Raleigh, NC, to implement and manage technology solutions using Azure Databricks and Python. The role requires collaboration with industry experts to deliver innovative data solutions.
We are looking for an experienced Azure Data Engineer to join our team in Hopkins, MN, focusing on building scalable data pipelines and Azure-based data solutions. This hybrid role requires collaboration with cross-functional teams to optimize real-time and batch data processing.