NDIM Technologies Inc. is seeking an Azure Data Engineer to design and maintain scalable data solutions on Microsoft Azure. This remote position requires expertise in data engineering and ETL processes, along with close collaboration with cross-functional teams.
Azure Data Engineer (Full-Time, Remote – Lahore / Islamabad)
Company: NDIM Technologies Inc.
Location: Lahore / Islamabad (Remote)
Working Hours: 6 PM – 2 AM Pakistan Time
Compensation: Fixed Monthly Salary

⸻

About the Role
NDIM Technologies Inc. is hiring a talented Azure Data Engineer to design, build, and maintain scalable data solutions on Microsoft Azure. You’ll work closely with cross-functional teams to deliver robust data pipelines, optimize data workflows, and support data-driven decision-making. This role involves hands-on work with Azure Data Factory, Synapse, Databricks, Data Lake, and Streamlit dashboards for data validation and performance insights.

⸻

Key Responsibilities
• Design and develop end-to-end data pipelines using Azure Data Factory, Synapse, Databricks, and Data Lake Storage.
• Build and manage ETL/ELT processes to integrate on-premises, cloud, and streaming data sources.
• Develop data models and schemas optimized for analytics, reporting, and BI use cases.
• Implement data governance, security, and access control policies (encryption, permissions, auditing).
• Optimize query performance and Delta Lake tables using Z-ORDER, OPTIMIZE, and statistics collection (see the maintenance sketch after this posting).
• Collaborate with data scientists, analysts, and stakeholders to define and deliver reliable data solutions.
• Monitor data pipelines, troubleshoot failures, and configure automated alerting via Azure Monitor and Log Analytics.
• Create Streamlit applications for tracking validation metrics, performance comparisons, and data health KPIs (see the dashboard sketch after this posting).
• Maintain technical documentation, data flow diagrams, and solution design references.

⸻

Required Qualifications
• Bachelor’s degree in Computer Science, Information Systems, or a related discipline.
• 3–5 years of experience in data engineering or related roles.
• Proven experience with Azure Data Factory, Synapse Analytics, Databricks, and Data Lake Storage.
• Advanced SQL skills and understanding of data warehousing and dimensional modeling.
• Proficiency in Python or PySpark for data transformation and orchestration.
• Familiarity with ETL/ELT design patterns, Delta Lake optimization, and data validation frameworks.
• Experience with Git or Azure DevOps for version control and CI/CD pipelines.

⸻

Preferred Qualifications
• Azure Data Engineer Associate (DP-203) certification or equivalent.
• Experience with streaming data technologies (Event Hubs, Stream Analytics, Kafka).
• Understanding of DevOps practices and Infrastructure as Code (Terraform, ARM templates).
• Working knowledge of Power BI or similar visualization tools.
• Familiarity with medallion architecture (bronze/silver/gold layers).
• Experience developing Streamlit-based dashboards integrated with Databricks or Delta Lake.

⸻

Technical Stack
• Azure Services: Data Factory, Synapse, Databricks, Data Lake Storage
• Databases: Azure SQL Database, SQL Server, Cosmos DB
• Languages: Python, PySpark, SQL
• Frameworks: Delta Lake, Apache Spark, Streamlit
• DevOps: GitHub / Azure DevOps (CI/CD)
• Monitoring: Azure Monitor, Log Analytics

⸻

Why Join NDIM Technologies Inc.?
• Work with cutting-edge Azure and Databricks solutions on real enterprise migration projects.
• Be part of a US-based data modernization initiative focused on Teradata-to-Databricks migration.
• Collaborate in a remote-first, growth-oriented engineering culture.
• Competitive fixed salary and performance incentives.
• Opportunity to grow your technical depth in data engineering, AI integration, and real-time analytics.
⸻

Working Conditions
• Remote position – must be based in Lahore or Islamabad for coordination.
• Schedule: 6 PM to 2 AM Pakistan Time to align with US business hours.
• Fixed monthly salary with periodic performance reviews.

⸻

How to Apply
Please include:
• Your updated resume highlighting Azure, Databricks, and Streamlit experience.
• A brief note on projects you’ve built with Azure Data Factory, Databricks, or Streamlit.
• Links (if any) to GitHub repositories, dashboards, or demos relevant to this role.
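
For illustration, below is a minimal PySpark sketch of the Delta Lake maintenance routine the responsibilities mention (OPTIMIZE, Z-ORDER, and statistics collection). It assumes a Databricks environment with Delta Lake available; the table name sales.orders and the order_date column are hypothetical placeholders, not part of the posting.

# Minimal sketch of routine Delta Lake maintenance, assuming Databricks.
# The table `sales.orders` and column `order_date` are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks this returns the built-in session

# Compact small files and co-locate rows by a frequently filtered column,
# so queries that filter on order_date can skip more data files.
spark.sql("OPTIMIZE sales.orders ZORDER BY (order_date)")

# Refresh table and column statistics so the optimizer has current row counts
# and selectivity estimates for join ordering and filter pushdown.
spark.sql("ANALYZE TABLE sales.orders COMPUTE STATISTICS FOR ALL COLUMNS")

# Remove data files no longer referenced by the table (default retention applies).
spark.sql("VACUUM sales.orders")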
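
Likewise, a minimal sketch of the kind of Streamlit data-health dashboard described above. The file validation_metrics.csv and its columns (run_date, table_name, row_count, null_pct, match_pct) are assumptions made for the example; in practice the metrics would more likely be read from a Delta table or Databricks SQL. Run it with: streamlit run validation_dashboard.py

# Minimal sketch of a Streamlit dashboard for validation metrics and data health KPIs.
# The CSV file and its column names are hypothetical stand-ins for a real metrics source.
import pandas as pd
import streamlit as st

st.title("Data Validation Dashboard")

# Load precomputed validation metrics (one row per table per validation run).
metrics = pd.read_csv("validation_metrics.csv", parse_dates=["run_date"])

# Let the user pick a table and show its most recent health KPIs.
table = st.selectbox("Table", sorted(metrics["table_name"].unique()))
selected = metrics[metrics["table_name"] == table]
latest = selected.sort_values("run_date").iloc[-1]

col1, col2, col3 = st.columns(3)
col1.metric("Row count", f"{latest['row_count']:,}")
col2.metric("Null %", f"{latest['null_pct']:.2f}%")
col3.metric("Source/target match %", f"{latest['match_pct']:.2f}%")

# Trend of row counts over time for the selected table.
st.line_chart(selected.set_index("run_date")[["row_count"]])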
JobRialto is seeking an Azure Data Engineer in Sunnyvale, CA, with expertise in Databricks, PySpark, and Azure services. The role involves building data governance solutions and optimizing data processes.
Realign LLC is seeking an experienced Azure Databricks Developer proficient in Python, PySpark, and SQL for a long-term contract role in Pittsburgh, PA. The candidate will focus on designing and optimizing data pipelines in Azure cloud environments.
Realign LLC is seeking an experienced Azure Databricks Data Engineer proficient in Python, PySpark, and SQL to develop and optimize data pipelines on Azure Cloud. This is a contract position based in Pennsylvania.
QData Inc is seeking a Senior Azure Databricks Engineer with expertise in Python for a remote contract position. The role involves building scalable data pipelines and contributing to Lakehouse architecture within the Azure cloud ecosystem.
Northern Trust Corp. is seeking a Sr Lead in Software Engineering with expertise in Python, AI, Azure, and SQL to lead the design and development of scalable software solutions. The role requires extensive experience in investment management and technical leadership in a collaborative environment.