Role: Azure Databricks Engineer
Location: Jersey City, NJ - Hybrid (2 days onsite)
Note: Candidates must be available in person for the screening and final round of interviews.

Responsibilities:
1. Design and implement a scalable data warehouse on Azure Databricks, using data and dimensional modeling techniques to support analytical and reporting requirements.
2. Develop and optimize ETL/ELT pipelines with Python, Azure Databricks, and PySpark for large-scale data processing, ensuring data quality, consistency, and integrity.
3. Establish and implement best practices for data ingestion, transformation, and storage using the medallion architecture (Bronze, Silver, Gold); a minimal illustrative sketch follows the qualifications list below.
4. Architect and develop highly scalable data applications using Azure Databricks and distributed computing.
5. Optimize Databricks clusters and ETL/ELT workflows for performance and scalability.
6. Manage data storage solutions using Azure Data Lake Storage (ADLS) and Delta Lake, leveraging Unity Catalog for data governance, security, and access control.
7. Develop and schedule Databricks notebooks and jobs for automated daily execution, implementing monitoring, alerting, and automated recovery for job failures.
8. Identify and resolve bottlenecks in existing code and follow coding best practices to improve performance and maintainability.
9. Use GitHub for version control and effective collaboration with other developers; build and maintain CI/CD pipelines for deployment and testing with Azure DevOps and GitHub.
10. Create comprehensive documentation for data architecture, ETL processes, and business logic.
11. Work closely with business stakeholders to understand project goals and architect scalable, efficient solutions.
12. Apply knowledge of user authentication in Unity Catalog and authorization across multiple systems, servers, and environments.
13. Ensure programs are written to the highest standards (e.g., with unit tests) and meet technical specifications.
14. Collaborate on projects and work independently when required.

Qualifications:
1. 10+ years of experience as a developer in the required technologies (Azure Databricks, Python, PySpark, data warehouse design).
2. Solid organizational skills and the ability to multitask across different projects.
3. Experience with Agile methodologies.
4. Skilled at independently researching topics using all available means to discover relevant information.
5. Ability to work in a team environment.
6. Excellent verbal and written communication skills.
7. Self-starter with the ability to multitask and maintain momentum.
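For context on responsibility 3, below is a minimal PySpark sketch of a Bronze/Silver/Gold (medallion) flow on Databricks with Delta Lake. The ADLS path, schema names (bronze, silver, gold), table names, and columns (order_id, order_ts, amount) are illustrative assumptions, not details from this posting.

# Minimal medallion-architecture sketch; all names and paths are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Bronze: land raw files as-is into a Delta table (assumes the schemas already exist).
raw = (spark.read.format("json")
       .load("abfss://landing@examplestorage.dfs.core.windows.net/orders/"))  # hypothetical ADLS path
raw.write.format("delta").mode("append").saveAsTable("bronze.orders_raw")

# Silver: deduplicate, enforce types, and drop invalid records.
silver = (spark.table("bronze.orders_raw")
          .dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .filter(F.col("order_id").isNotNull()))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: aggregate into a reporting-ready table for analytics.
gold = (spark.table("silver.orders")
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("daily_revenue")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_revenue")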
Job Type
Contractor role
Skills required
Python, PySpark, Azure Databricks, Agile
Location
Edison, New Jersey
Salary
Not specified
Date Posted
May 14, 2025
Crox Consulting Inc is seeking an experienced Azure Databricks Engineer to design and implement scalable data solutions in a hybrid work environment. The role involves developing ETL/ELT pipelines and optimizing data applications on Azure Databricks.