Job description

Key Responsibilities
- Data Pipeline Design and Development: Designing, developing, and maintaining scalable data pipelines and ETL processes using Azure Databricks, Azure Data Factory, and other Azure services
- Spark Job Implementation and Optimization: Implementing and optimizing Spark jobs, data transformations, and workflows within Databricks
- Data Model Development: Developing and maintaining data models and data dictionaries
- Data Quality and Governance: Developing and maintaining data quality checks, governance policies, and security procedures
- ETL Process Design and Maintenance: Designing and creating ETL processes to supply data to various destinations, including data warehouses
- Data Integration: Integrating data from various sources into Azure Databricks
- Collaboration: Working with other teams, including data engineers, data scientists, and analysts, to ensure data quality and consistency
- Monitoring and Optimization: Implementing effective monitoring processes to track performance and optimize workflows
- Data Lakehouse Solutions: Contributing to the design and implementation of data lakehouse solutions using Databricks

Skills and Qualifications
- Azure Databricks Proficiency: Experience with Azure Databricks, PySpark, and Spark
- Programming Languages: Proficiency in languages such as Python, SQL, and Scala
- ETL and Data Warehousing: Strong understanding of ETL processes, data warehousing concepts, and data modeling
- Cloud Platforms: Experience with cloud platforms, particularly Microsoft Azure
- Data Engineering: Experience in data engineering, data pipelines, and data integration
- Data Governance and Security: Knowledge of data governance policies and procedures
- Problem-solving and Debugging: Excellent problem-solving and debugging skills
- Communication and Teamwork: Strong communication and teamwork skills

Mandatory Skills
ANSI-SQL, Azure BLOB, Azure Data Factory, Azure Data Lake, Azure Functions, Azure SQL, Azure Synapse Analytics, Databricks, Java, Python, Scala, Snowflake
Job Type
Contractor role
Skills required
Azure, Python, Synapse, Java
Location
Cincinnati, Ohio
Salary
Not specified
Date Posted
May 13, 2025
Galent is seeking a Cloud Specialist in Cincinnati, Ohio, to design and maintain scalable data pipelines and ETL processes using Azure services. The role involves collaboration with data teams to ensure data quality and governance.