Abode TechZone LLC is seeking a Lead Databricks Engineer to design and optimize large-scale data pipelines using Databricks and Azure technologies. This hybrid role requires collaboration with multiple teams to deliver efficient data solutions.
Role: Lead Databricks Engineer
Location: Iselin, NJ / NY (Hybrid)

Azure Data Engineer with Databricks Expertise

Job Summary:
We are seeking a highly skilled Azure Data Engineer with strong expertise in Databricks to join our data team. The ideal candidate will design, implement, and optimize large-scale data pipelines, ensuring scalability, reliability, and performance. This role involves working closely with multiple teams and business stakeholders to deliver cutting-edge data solutions.

Key Responsibilities:
• Data Pipeline Development:
• Build and maintain scalable ETL/ELT pipelines using Databricks.
• Leverage PySpark/Spark and SQL to transform and process large datasets.
• Integrate data from multiple sources, including Azure Blob Storage, ADLS, and other relational/non-relational systems.
• Collaboration & Analysis:
• Work closely with multiple teams to prepare data for dashboards and BI tools.
• Collaborate with cross-functional teams to understand business requirements and deliver tailored data solutions.
• Performance & Optimization:
• Optimize Databricks workloads for cost efficiency and performance.
• Monitor and troubleshoot data pipelines to ensure reliability and accuracy.
• Governance & Security:
• Implement and manage data security, access controls, and governance standards using Unity Catalog.
• Ensure compliance with organizational and regulatory data policies.
• Deployment:
• Leverage Databricks Asset Bundles for seamless deployment of Databricks jobs, notebooks, and configurations across environments.
• Manage version control for Databricks artifacts and collaborate with the team to maintain development best practices.

Technical Skills:
• Strong expertise in Databricks (Delta Lake, Unity Catalog, Lakehouse architecture, table triggers, Delta Live Tables, Databricks Runtime, etc.).
• Proficiency in Azure cloud services.
• Solid understanding of Spark and PySpark for big data processing.
• Experience with relational databases.
• Knowledge of Databricks Asset Bundles and GitLab.

Preferred Experience:
• Familiarity with Databricks Runtimes and advanced configurations.
• Knowledge of streaming frameworks such as Spark Streaming.
• Experience developing real-time data solutions.

Certifications (optional):
• Azure Data Engineer Associate or Databricks Certified Data Engineer Associate.
Cyborgwave is seeking a Databricks Architect in Raleigh, North Carolina, to design and maintain the Databricks environment for the NCDIT-Transportation Database Team. The role involves mentoring team members and ensuring the platform's scalability, performance, and security.
SGS U.S. Holding Inc. is seeking an Analytics Engineer to develop and maintain data solutions using Azure Data Platform and Power BI. This role involves collaborating across teams to deliver data products and optimize business processes.
EY is seeking a Senior Consultant Data Engineer with expertise in Databricks and cloud data engineering to design and implement analytics solutions. The role involves client collaboration, data architecture design, and leading data pipeline development.
CloudIngest is seeking an Azure Databricks Lead/Architect with AI expertise to design and implement data solutions. This long-term contract role requires strong experience in data architecture, big data technologies, and machine learning frameworks.
Evolution USA is seeking a Senior Data Engineer to design and deliver enterprise-grade data solutions using Azure Databricks. This senior-level role involves building scalable data infrastructure and mentoring junior engineers.
Saviance is seeking a Cloud Data Engineer with expertise in Azure and Databricks to design and implement cloud solutions. The role involves collaboration with clinical teams and ensuring compliance with healthcare regulations.