Job Title: Azure Data Engineer (Databricks)
Location: Remote
Working Model: Contract

Job Description

Required Skills

Must-Have Skills:
- Databricks Expertise: Advanced hands-on experience with Databricks and Spark (PySpark preferred) for building and optimizing workflows and pipelines.
- Pipeline Development: Proficiency in designing, building, and maintaining ETL/ELT pipelines using Python, PySpark, and related tools.
- SQL and Data Warehousing: Strong SQL skills.
- Cloud Platforms: Hands-on experience with cloud services (AWS, Azure, or GCP) and cloud-native data tools.
- Big Data and Integration: Expertise in handling big data systems and integrating data from diverse sources (e.g., APIs, flat files, databases).
- Network and External Connections: Experience configuring secure external connections to data sources (e.g., Azure Data Lake Storage, on-prem SQL Server) using service principals, SAS tokens, OAuth, or mounting techniques.
- Security and Governance: Strong understanding of data security, encryption, and compliance standards; experience implementing data governance frameworks to manage data quality.
- Azure Data Factory (ADF): Proven expertise in using ADF to orchestrate and automate data workflows across cloud and on-premises systems.

Nice-to-Have Skills:
- Data Vault Modeling: Knowledge of Data Vault methodology for designing scalable and flexible data models.
- Other Tools: Familiarity with workflow orchestration and data-integration tools such as Fivetran.
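For illustration only (not part of the posting): a minimal sketch of the service-principal (OAuth) connection pattern the Network and External Connections bullet refers to, using the Spark configuration keys Databricks documents for Azure Data Lake Storage Gen2. The storage account, tenant, client, and secret values below are placeholders.

```python
# Hypothetical sketch: Spark configuration for reading ADLS Gen2 from
# Databricks with a service principal (OAuth). All identifiers are placeholders.
storage_account = "examplestorageacct"   # placeholder storage account name
tenant_id = "example-tenant-id"          # placeholder Entra ID tenant
client_id = "example-client-id"          # placeholder service-principal app ID
client_secret = "example-secret"         # in practice: dbutils.secrets.get(scope, key)

suffix = f"{storage_account}.dfs.core.windows.net"
conf = {
    f"fs.azure.account.auth.type.{suffix}": "OAuth",
    f"fs.azure.account.oauth.provider.type.{suffix}":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
    f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
    f"fs.azure.account.oauth2.client.endpoint.{suffix}":
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
}

# In a Databricks notebook each pair would be applied with
# spark.conf.set(key, value); data is then readable at paths like
# abfss://<container>@<account>.dfs.core.windows.net/<path>
```

A SAS token or mount-based setup follows the same shape with different `fs.azure.*` keys; the secret should always come from a secret scope rather than being hard-coded.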
Job Type: Full-time
Skills Required: Python, Azure
Location: Murphy, Texas
Salary: Not specified
Date Posted: May 31, 2025
Veridian Tech is seeking an Azure Data Engineer with expertise in Databricks to design and maintain ETL/ELT pipelines. This remote contract position requires strong skills in cloud services and data integration.