We are looking for a skilled Technical Lead to oversee the design and implementation of large-scale data engineering and migration projects.
• Databricks & Delta Lake - Hands-on experience with Databricks, including development in PySpark or Spark SQL; efficient use of Delta Lake for scalable data pipelines; and data lineage in Databricks (see the sketch after this list for an illustration of this kind of pipeline work).
• Azure Data Factory - Building and managing ETL pipelines in ADF; using ADF to orchestrate Databricks, Blob Storage, and SAP sources; monitoring, error handling, and pipeline performance tuning.
• Performance Optimization - Tuning and optimizing workloads on Databricks.
• Python & PySpark - Writing robust, maintainable data processing scripts; using Python/Spark for custom transformations and integration logic.
• SAP HANA - Integrating SAP data with other platforms; handling large-scale SAP data extraction, transformation, and migration; good understanding of SAP HANA architecture and data modeling.
• ETL Tools (SAP Data Services) - Creating, deploying, and optimizing data jobs in SAP BODS/Data Services; working with complex mappings and SAP-specific data types; handling change data capture (CDC) scenarios.
• Data Profiling & Validation - Experience with data profiling, validation, and reconciliation during migrations.
• Azure Cloud & Networking - Good understanding of Azure compute, storage, networking, and security services; experience resolving firewall, VPN, and VNet issues that impact data pipelines; familiarity with IAM, RBAC, and secure credential storage.
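For illustration only, here is a minimal sketch of the kind of Databricks/Delta Lake work described above: a CDC-style upsert written in PySpark. The table name, key column, and landing path are hypothetical, and the snippet assumes a Databricks environment where Delta Lake and a SparkSession are already available.

```python
# A minimal sketch, assuming a Databricks workspace with Delta Lake available;
# the table name, key column, and landing path below are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # pre-created on Databricks clusters

# Incremental batch of changed records, e.g. landed by an ADF copy activity
# from an SAP extraction.
updates_df = spark.read.parquet("/mnt/landing/sap/customers_cdc/")

# Target Delta table in the lakehouse.
target = DeltaTable.forName(spark, "silver.customers")

# CDC-style upsert: update rows whose key already exists, insert the rest.
(
    target.alias("t")
    .merge(updates_df.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```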
Job Type
Full-time
Skills required
Azure
Location
San Jose, California
Salary
Not specified
Date Posted
June 27, 2025
Apptad Inc is seeking a Technical Lead to manage large-scale data engineering and migration projects using Databricks, Azure, and SAP HANA. The role requires expertise in ETL processes, data profiling, and performance optimization.