Codvo.ai is seeking a Fullstack Data Engineer with expertise in Databricks to design and optimize data pipelines and analytics solutions. This role requires strong skills in data engineering, cloud platforms, and software development.
Job Description: Data Engineer - Databricks

Role Overview
We are looking for a highly skilled Full Stack Data Engineer with expertise in Databricks to design, develop, and optimize end-to-end data pipelines, data platforms, and analytics solutions. This role combines strong data engineering, cloud platform expertise, and software engineering skills to deliver scalable, production-grade solutions.

Key Responsibilities
• Design and develop ETL/ELT pipelines on Databricks (PySpark, Delta Lake, SQL).
• Architect data models (batch and streaming) for analytics, ML, and reporting.
• Optimize performance of large-scale distributed data processing jobs.
• Implement CI/CD pipelines for Databricks workflows using GitHub Actions, Azure DevOps, or similar.
• Build and maintain APIs, dashboards, or applications that consume processed data (the full-stack aspect).
• Collaborate with data scientists, analysts, and business stakeholders to deliver solutions.
• Ensure data quality, lineage, governance, and security compliance.

Required Skills & Qualifications
• Core Databricks Skills:
  • Strong in PySpark, Delta Lake, and Databricks SQL.
  • Experience with Databricks Workflows, Unity Catalog, and Delta Live Tables.
• Programming & Full Stack:
  • Python (mandatory), SQL (expert).
  • Exposure to Java/Scala (for Spark jobs).
  • Knowledge of APIs and microservices (FastAPI/Flask) or basic front-end (React/Angular) is a plus.
• Cloud Platforms:
  • Proficiency with at least one of Azure Databricks, AWS Databricks, or GCP Databricks.
  • Knowledge of cloud storage (ADLS, S3, GCS), IAM, and networking.
• DevOps & CI/CD:
  • Git and CI/CD tools (GitHub Actions, Azure DevOps, Jenkins).
  • Containerization (Docker; Kubernetes is a plus).
• Data Engineering Foundations:
  • Data modeling (OLTP/OLAP).
  • Batch and streaming data processing (Kafka, Event Hub, Kinesis).
  • Data governance and compliance (Unity Catalog, Lakehouse security).

Nice-to-Have
• Experience with machine learning pipelines (MLflow, Feature Store).
• Knowledge of data visualization tools (Power BI, Tableau, Looker).
• Exposure to graph databases (Neo4j) or RAG/LLM pipelines.

Qualifications
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
• 4-7 years of experience in data engineering, with at least 2 years on Databricks.

Soft Skills
• Strong problem-solving and analytical skills.
• Ability to work in fusion teams (business + engineering + AI/ML).
• Clear communication and documentation abilities.

About Us
At Codvo, we are committed to building scalable, future-ready data platforms that power business impact. We believe in a culture of innovation, collaboration, and growth, where engineers can experiment, learn, and thrive. Join us to be part of a team that solves complex data challenges with creativity and cutting-edge technology.
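The responsibilities above center on ETL/ELT pipelines built with PySpark and Delta Lake. As a minimal sketch of what such a batch job can look like, the snippet below reads raw events, aggregates them by day, and writes a Delta table; the storage path, column names, and table name are hypothetical placeholders, not part of the posting.

```python
# Minimal batch ETL sketch for Databricks; path, schema, and table
# names are illustrative assumptions only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession is provided; getOrCreate() reuses it.
spark = SparkSession.builder.getOrCreate()

# Read raw JSON events from cloud storage (hypothetical ADLS path).
raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/raw_events/")

# Basic cleansing and a daily aggregation for reporting.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "event_type")
       .agg(F.count("*").alias("event_count"))
)

# Persist as a Delta table, overwriting the table so reruns are idempotent.
(
    daily.write.format("delta")
         .mode("overwrite")
         .option("overwriteSchema", "true")
         .saveAsTable("analytics.daily_events")
)
```

On Databricks the Delta format and session wiring come out of the box; running the same sketch elsewhere would require the delta-spark package and explicit storage credentials.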
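The posting also lists building APIs that consume processed data (FastAPI/Flask) as the full-stack aspect of the role. Below is a minimal, self-contained FastAPI sketch of a read-only endpoint over the daily aggregates; the in-memory store stands in for a real Databricks SQL query, and the route and field names are assumptions for illustration.

```python
# Minimal read-only API sketch over processed analytics data.
from fastapi import FastAPI, HTTPException

app = FastAPI(title="Analytics API")

# Stub data so the sketch stays self-contained; a real service would
# query the analytics.daily_events table via Databricks SQL instead.
FAKE_STORE = {
    "2024-01-01": [{"event_type": "login", "event_count": 120}],
}

@app.get("/daily-events/{event_date}")
def daily_events(event_date: str):
    # Return the aggregates for one date, or 404 if none exist.
    rows = FAKE_STORE.get(event_date)
    if rows is None:
        raise HTTPException(status_code=404, detail="No data for that date")
    return {"event_date": event_date, "events": rows}
```

Saved as app_module.py, this runs with `uvicorn app_module:app --reload` (file name assumed for the example).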
ICF is seeking a Senior Databricks SME Data Engineer to support data modernization initiatives for federal clients. This fully remote role involves designing data pipelines and ensuring data quality for emergency management applications.
Upwork is seeking a Cloud Data Architect with expertise in Azure and Databricks for a long-term remote project in the logistics industry. The role requires strong skills in big data technologies and cloud architecture.
Altera is seeking a skilled Data Architect specializing in Lakehouse architecture using Databricks, Azure, and Microsoft Fabric to design and manage cloud-based data solutions. The role involves collaboration with cross-functional teams to optimize data pipelines and ensure data governance.
Join ICF as a Databricks Solutions Engineer to leverage your expertise in data engineering and analytics for federal clients. This remote role focuses on modernizing data platforms to enhance disaster management and service delivery.
ICF International is seeking a Senior Databricks Solutions Engineer to enhance data infrastructure and analytics for federal clients in the emergency management sector. This fully remote role requires expertise in Databricks and data engineering to drive effective service delivery.