Job Title: Lead ADB Developer
📍 Location: Pleasanton, CA (Hybrid)
🕒 Job Type: Contract

About the Role:
We are looking for a highly skilled Senior Azure Databricks (ADB) Lead Developer to join our Data Engineering team. This role involves developing large-scale batch and streaming data pipelines on Azure Cloud.

Key Responsibilities:
• Design and Develop: Create real-time and batch data pipelines using Azure Databricks, Apache Spark, and Structured Streaming.
• Data Processing: Write efficient ETL scripts and automate workflows using Python.
• Data Integration: Integrate with various data sources and destinations, including DB2, MongoDB, and other enterprise-grade data systems.
• Performance Optimization: Tune Spark jobs for optimal performance and cost-effective compute usage on Azure.
• Collaboration: Work with platform and architecture teams to ensure secure, scalable, and maintainable cloud data infrastructure.
• CI/CD Support: Implement CI/CD for Databricks pipelines and notebooks using tools like GitHub and Azure DevOps.
• Stakeholder Communication: Interface with product owners, data scientists, and business analysts to translate data requirements into production-ready pipelines.

Required Skills:
• Proven hands-on experience in data engineering.
• Python Proficiency:
  • Data Manipulation: Using libraries like Pandas and NumPy for data manipulation and analysis.
  • Data Processing: Writing efficient ETL scripts.
  • Automation: Automating repetitive tasks and workflows.
  • Debugging: Strong debugging skills to troubleshoot and optimize code.
• Database Management:
  • SQL: Advanced SQL skills for querying and managing relational databases.
  • NoSQL: Experience with NoSQL databases like MongoDB or Cassandra.
  • DB2: Understanding of DB2 architecture, SQL queries, and database management.
  • MongoDB: Knowledge of MongoDB schema design, indexing, and query optimization.
  • Data Warehousing: Knowledge of data warehousing solutions like Google BigQuery or Snowflake.
• Big Data Technologies:
  • Kafka: Knowledge of data streaming platforms like Apache Kafka.
• Version Control:
  • Git: Using version control systems for collaborative development.
• Data Modeling:
  • Schema Design: Designing efficient and scalable database schemas.
  • Data Governance: Ensuring data quality, security, and compliance.
• Programming Skills:
  • Proficiency in languages such as Java, Python, or JavaScript for writing data extraction and transformation scripts.
  • Experience with BSON (Binary JSON) for data conversion.
• Cloud Services:
  • Experience with cloud platforms like AWS or Azure for deploying and managing databases.

Preferred Skills:
• Experience with Java or Scala for Spark streaming.
• Familiarity with Azure services like Data Lake, Data Factory, Synapse, and Event Hubs.
• Background in building data platforms in regulated or large-scale enterprise environments.

Lovepreet Singh
Account Manager
m: +1 669-309-1773
w: www.e-solutionsinc.com
e: Love.s@e-solutionsinc.com
LinkedIn: https://www.linkedin.com/in/singhlovepreet/

“Disclaimer: E-Solutions Inc. provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran in accordance with applicable federal, state and local laws. We especially invite women, minorities, veterans, and individuals with disabilities to apply.
EEO/AA/M/F/Vet/Disability.”
Job Type: Contract
Skills Required: Python, NoSQL, MongoDB, Java, JavaScript, Azure
Location: Pleasanton, California
Salary: Not listed
Date Posted: July 1, 2025
E-Solutions is seeking a Senior Azure Databricks (ADB) Lead Developer to build large-scale batch and streaming data pipelines on Azure Cloud. This hybrid contract role is based in Pleasanton, California.