Job Title: Tech Lead Azure Data Engineer with SAP HANA
Location: On-Site (Santa Clara, CA)
Job Type: Full-Time

Job Description:
Task: Migrating from SAP to a Databricks-based tech stack (DPaaS 2.0) as Technical Lead, Data Engineering & Migration.

Key Responsibilities:
• Databricks & Delta Lake: hands-on development in Databricks using PySpark or Spark SQL; efficient use of Delta Lake for scalable data pipelines; data lineage in Databricks
• Azure Data Factory: building and managing ETL pipelines using ADF; using ADF for orchestration with Databricks, Blob Storage, and SAP sources; monitoring, error handling, and pipeline performance tuning
• Performance Optimization: tuning and optimizing workloads on Databricks
• Python & PySpark: writing robust, maintainable data processing scripts; using Python/Spark for custom transformations and integration logic
• SAP HANA: integrating SAP data with other platforms; handling large-scale SAP data extraction, transformation, and migration

Good-to-Have Skills:
• ETL Tools (SAP Data Services): creating, deploying, and optimizing data jobs in SAP BODS/Data Services; working with complex mappings and SAP-specific data types; handling change data capture (CDC) scenarios
• Data Profiling & Validation: experience in data profiling, validation, and reconciliation during migrations
• Azure Cloud & Networking: Azure services for compute, storage, networking, and security; experience resolving firewall, VPN, and VNet issues impacting data pipelines; familiarity with IAM, RBAC, and secure credential storage
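The data profiling, validation, and reconciliation responsibility mentioned above can be sketched as a minimal, library-free row-level check between a source extract and a migrated target. This is an illustrative sketch, not the employer's actual tooling; the function names and sample records are hypothetical.

```python
import hashlib

def row_fingerprint(row: dict) -> str:
    """Stable hash of a row's sorted key/value pairs."""
    canon = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canon.encode()).hexdigest()

def reconcile(source: list[dict], target: list[dict]) -> dict:
    """Compare row counts and per-row fingerprints between two datasets."""
    src = {row_fingerprint(r) for r in source}
    tgt = {row_fingerprint(r) for r in target}
    return {
        "source_rows": len(source),
        "target_rows": len(target),
        "missing_in_target": len(src - tgt),
        "unexpected_in_target": len(tgt - src),
    }

# Hypothetical sample: one row was dropped during migration
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
target = [{"id": 1, "amt": 10}]
report = reconcile(source, target)
```

In practice the same count-and-fingerprint pattern would run at scale in PySpark rather than plain Python, but the reconciliation logic is the same.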
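The change data capture (CDC) scenario listed under good-to-have skills can be sketched as a minimal upsert/delete application over a keyed table. This is a simplified illustration under assumed conventions (an `op` field with `I`/`U`/`D` codes and an `id` key, both hypothetical), not a specific SAP BODS feature.

```python
def apply_cdc(target: dict, changes: list[dict]) -> dict:
    """Apply insert/update/delete change records, keyed by 'id'."""
    result = dict(target)
    for rec in changes:
        op, key = rec["op"], rec["id"]
        if op in ("I", "U"):      # insert or update: upsert the row
            result[key] = {k: v for k, v in rec.items() if k != "op"}
        elif op == "D":           # delete: drop the row if present
            result.pop(key, None)
    return result

# Hypothetical base table and change feed
base = {1: {"id": 1, "amt": 10}, 2: {"id": 2, "amt": 20}}
cdc = [{"op": "U", "id": 2, "amt": 25},
       {"op": "D", "id": 1},
       {"op": "I", "id": 3, "amt": 30}]
state = apply_cdc(base, cdc)
```

On Databricks the equivalent operation is typically expressed as a Delta Lake `MERGE INTO` statement rather than hand-rolled Python.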
Job Type
Full-time role
Skills required
Azure
Location
Dover, Delaware
Salary
Not specified
Date Posted
June 20, 2025
FalconSmartIT is seeking a Tech Lead Azure Data Engineer with expertise in SAP HANA to lead data migration projects. The role involves hands-on experience with Databricks, Azure Data Factory, and performance optimization.