Motion Recruitment is seeking a Cloud Data Engineer to design scalable cloud-based data solutions in a collaborative environment. The role involves leveraging Azure and Databricks to optimize data processes and ensure data integrity.
Position: Cloud Data Engineer (On-Site / Tulsa, OK)

Are you a problem-solver with a passion for working in the cloud? We are looking for a Cloud Data Engineer who thrives in a fast-paced, collaborative environment and is eager to design scalable, cloud-based data solutions that support growing business needs. If you want to make an impact by leveraging Azure and Databricks to optimize data processes, we want to hear from you!

Key Responsibilities:
• Lead the creation of data pipelines from start to finish, connecting disparate systems and enabling the seamless flow of information.
• Architect and deploy cloud-based data solutions that optimize both performance and cost, using Azure and Databricks technologies.
• Work with a dynamic team to integrate data sources, performing data transformation, cleaning, and analysis for real-time decision-making.
• Ensure high data quality and maintainability across all deployed solutions, resolving performance issues and safeguarding data integrity.
• Act as a technical resource for team members and stakeholders, guiding them through complex data engineering challenges.

What You’ll Be Working With:
• Databricks and Azure Synapse for designing and maintaining large-scale data pipelines.
• Python, PySpark, and SQL for developing efficient data transformations and managing large datasets.
• Azure Data Factory and SSIS for end-to-end data pipeline management.
• Azure DevOps for streamlined code development and deployment.

What We’re Looking For:
• 3+ years of hands-on experience in cloud data engineering, specifically in Azure environments.
• Strong understanding of cloud architecture and best practices for building scalable, secure data systems.
• Knowledge of modern data processing frameworks and languages (e.g., Databricks, Python, PySpark).
• A proactive, solution-oriented mindset with excellent problem-solving abilities.
• Strong communication skills and the ability to interact effectively with business leaders, developers, and data scientists.

If you’re a self-starter looking to take the lead on exciting cloud-based data projects and make an immediate impact, apply now!
Rocket is seeking a Senior Data Engineer with expertise in Microsoft Fabric and Azure Synapse to support a high-impact data integration initiative for the Finance team. This role requires hands-on development and collaboration in a multi-tenant Azure environment, based in Irving, TX.
Cognizant is seeking an Azure Databricks Tech Lead to design and implement data solutions for insurance workflows in Hartford, CT. The role involves developing data pipelines and guiding junior engineers while leveraging Azure tools, Python, and SQL.
NavitsPartners is seeking a Cloud Data Engineer - Consultant in Columbia, SC, to develop and optimize cloud-based data pipelines. The role requires expertise in Python, SQL, and cloud environments like AWS or Azure.
Brandon Consulting Associates, Inc. is seeking a senior Enterprise Data Modeler to develop data models for a cloud-based data management platform in Richmond, VA. The role requires onsite work and involves translating business needs into data models while ensuring data governance and quality.