Alasus Technologies is seeking a Lead Python Django Developer with expertise in Azure and Big Data to lead backend development and microservices architecture. The role involves mentoring junior developers and optimizing data pipelines in a collaborative environment.
Key Responsibilities:
• Lead the design and development of backend systems using Python Django.
• Build, deploy, and manage RESTful APIs and microservices.
• Integrate and manage Azure Cloud services for application hosting and CI/CD.
• Design and optimize data pipelines using PySpark and Azure Databricks.
• Work with Delta Lake and Delta Tables for efficient data storage and querying.
• Collaborate cross-functionally with front-end teams, product managers, and stakeholders.
• Conduct peer code reviews, enforce coding standards, and ensure best practices.
• Mentor junior developers and foster a high-performing engineering culture.
• Optimize database performance and implement scalable SQL/NoSQL solutions.
• Participate in architectural discussions and technical strategy planning.

Required Qualifications:
• Bachelor’s degree in Computer Science, Engineering, or a related field.
• 8+ years of professional experience in backend development.
• 8+ years of hands-on experience with the Django framework.
• Strong expertise in Python programming and microservices architecture.
• Proven experience with Azure Cloud services and CI/CD pipelines.
• Experience with Azure Databricks, PySpark, Delta Lake, and big data tools.
• Deep understanding of REST APIs and SQL/NoSQL databases.
• Proficiency with version control systems such as Git.
• Familiarity with Agile/Scrum methodologies.
• Strong communication, leadership, and problem-solving skills.

Preferred Skills:
• Experience with containerization tools such as Docker and orchestration platforms such as Kubernetes.
• Exposure to data engineering and data lake architectures.
• Background in financial or enterprise-grade applications is a plus.

Why Join Us?
• Work on cutting-edge technologies with a high-performing team
• Opportunity to lead complex technical initiatives
• Dynamic work environment with growth potential
The Senior Data Architect at loanDepot will design and maintain scalable data architectures in a cloud environment, focusing on Big Data solutions and modern data warehousing. The role requires extensive experience in data modeling, batch and streaming data processing, and strong expertise in Azure and Apache Spark technologies.
i.t.motives is seeking a Data Engineer with expertise in Azure, Python, and Databricks to build foundational data infrastructure for AI-driven solutions. This remote contract position emphasizes collaboration, problem-solving, and ethical data practices.
REALIGN LLC is seeking an experienced Azure Databricks Developer proficient in Python, PySpark, and SQL for a long-term contract in Pittsburgh, PA. The role involves designing and optimizing data pipelines in Azure cloud environments.
Eliassen Group is seeking an Advanced Backend Python API Developer with expertise in Azure and Databricks for a remote contract position. The role involves designing and implementing APIs and data pipelines for a complex data and analytics web application.