Citadel Information Services Inc is seeking an Azure Cloud & Databricks Developer to design and maintain scalable data pipelines and ETL processes using Azure technologies. This role involves collaboration with data engineers to optimize cloud infrastructure and support analytics initiatives.
JOB SUMMARY
We are seeking an Azure Cloud & Databricks Developer who will be part of the Risk Management group, responsible for designing, developing, and maintaining scalable data pipelines and ETL processes using the Azure cloud platform and Azure Databricks.

SCOPE
The Azure Cloud Data Engineer is responsible for strategizing, designing, and developing scalable cloud infrastructure and DevOps solutions. Working collaboratively with a team of skilled and passionate data engineers, this role plays a critical part in driving the automation and optimization of our Azure-based technology environment. A key focus is leveraging Azure Databricks to build and manage advanced data pipelines, perform large-scale data processing, and support analytics and machine learning initiatives. The role also contributes to the growth and scalability of our cloud infrastructure while managing complex interface development, resolving technical issues, and providing support during weekend maintenance and production operations.

PRIMARY RESPONSIBILITIES
• Develop and maintain Databricks notebooks using Python and SQL.
• Configure and manage Databricks clusters and integrate with version control systems such as GitHub.
• Enable seamless integration between on-premises databases and Power BI for reporting and analytics.
• Design and build large-scale data pipelines using Azure-native data processing frameworks.
• Collaborate with architects, engineers, analysts, and business stakeholders to deliver enterprise-grade, data-driven solutions.
• Provide technical leadership and guidance on cloud architecture and implementation strategies.
• Coordinate with platform, Azure API Management (APIM), GitHub, and support teams to ensure smooth operations.
• Analyze business requirements and design scalable, secure, and efficient solutions on the Azure cloud platform.
• Develop, test, and optimize software components to enhance the performance and reliability of data platforms.
• Lead end-to-end project execution, working closely with business users, IT teams, data stewards, and third-party vendors.
• Integrate and standardize data from diverse sources while ensuring compliance with data quality and accessibility standards.
• Implement streaming data solutions and reusable design patterns in a big data environment.
• Collaborate with data scientists to operationalize machine learning models and algorithms within automated data workflows.
• Apply sound judgment and technical expertise to resolve moderately complex data engineering challenges.
• Review and provide feedback on core code changes, and support production deployments.

CORE TECHNOLOGIES
• Azure: Azure Databricks, Azure Data Factory, Azure Synapse Analytics, Azure Functions, Azure Data Lake Storage Gen2, Azure Event Grid, Azure Event Hubs, Azure Service Bus, Azure Key Vault, Azure Monitor, Azure Log Analytics, Azure API Management (APIM), Azure DevOps
• Scripting: Python, SQL, Bash
• Databases: SQL Server, Oracle, PostgreSQL, Delta Lake
• Big Data: Apache Spark
• Version Control: GitHub, Git, Azure DevOps
• Visualization: Power BI, including integration with REST APIs for custom dashboards
• Data Integration & Workflow Orchestration: Azure Data Factory, Databricks Workflows

QUALIFICATIONS
• IT professional experience in Azure Cloud, with a minimum of 3 years developing and maintaining data pipelines using Azure Databricks, Spark, and other big data technologies.
• Proficiency in programming languages such as Python and SQL.
• Ability to re-create existing legacy application logic and functionality in Azure Databricks/Data Lake, SQL Database, and SQL Data Warehouse environments.
• Experience with Azure services such as Data Factory, Azure Machine Learning, and Azure DevOps.
• Strong understanding of ETL processes and data warehousing concepts.
• Excellent interpersonal and communication skills.
• Experience with software configuration management tools such as Git/GitHub.
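The qualifications above emphasize building ETL pipelines in Python. Purely as an illustrative sketch of that kind of work (standard-library Python only — this is not actual Databricks or Spark code, and every name and field below is hypothetical), a minimal clean-and-aggregate transform might look like:

```python
from dataclasses import dataclass

@dataclass
class Trade:
    desk: str
    notional: float

def clean(records):
    """Transform step: drop malformed rows, normalize desk names, cast types."""
    return [
        Trade(r["desk"].strip().upper(), float(r["notional"]))
        for r in records
        if r.get("desk") and r.get("notional") is not None
    ]

def aggregate(trades):
    """Aggregation step: total notional per desk."""
    totals = {}
    for t in trades:
        totals[t.desk] = totals.get(t.desk, 0.0) + t.notional
    return totals

raw = [
    {"desk": "rates", "notional": "1000"},
    {"desk": "fx", "notional": "250"},
    {"desk": None, "notional": "99"},   # malformed row: dropped by clean()
    {"desk": "rates", "notional": "500"},
]
print(aggregate(clean(raw)))  # {'RATES': 1500.0, 'FX': 250.0}
```

In a real Databricks notebook the same shape would typically be expressed as Spark DataFrame transformations writing to Delta Lake, but the pattern (validate, normalize, aggregate) is the same.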