The Senior Data Architect at loanDepot is responsible for designing and maintaining scalable enterprise data architectures using Azure Databricks and Python. The role requires extensive experience in Big Data solutions and collaboration with engineering teams.
Position: Senior Data Architect (Azure Databricks/Python) at loanDepot

Position Summary
The Senior Data Architect is responsible for designing, developing, and maintaining robust, scalable, high-performance enterprise data architectures within a modern cloud environment. The role requires deep expertise in Big Data solutions (Delta Lake architecture), modern data warehouse practices and operations, semantic layering, dimensional modeling (star schemas), transactional OLTP databases (3NF modeling), and advanced data modeling techniques.

The ideal candidate will have at least 15 years of data modeling experience within Business Intelligence and Analytics contexts, extensive hands-on experience with batch and streaming data processing, and strong expertise with Apache Spark, Databricks, and Spark Structured Streaming. Required skills include proficiency in Python programming, Azure cloud technologies, semantic modeling, and modern CI/CD deployment practices. Experience in ML engineering is highly desirable. The candidate must be able to collaborate quickly and effectively with data and engineering teams, clearly document source-to-target mappings, and reverse engineer existing database objects such as stored procedures, views, and complex SQL queries.

Responsibilities
• Lead architecture design and implementation of enterprise-scale data platforms leveraging Databricks, Delta Lake, Azure cloud, and modern Big Data technologies.
• Design, build, and maintain modern data warehouse solutions using dimensional modeling (star schema) and semantic layering to optimize analytics and reporting capabilities.
• Define and enforce data modeling standards, guidelines, and best practices within analytics and BI contexts.
• Architect robust batch processing and real-time streaming solutions using Apache Spark, Databricks, Kafka, Kinesis, and Spark Structured Streaming.
• Collaborate effectively with engineering teams to rapidly deliver data architecture solutions and support agile development practices.
• Provide clear, comprehensive source-to-target documentation, data lineage mappings, and semantic layer definitions.
• Reverse engineer existing database structures, including stored procedures, views, and complex SQL logic, to document existing data processes and support modernization initiatives.
• Provide technical leadership, mentoring, and guidance to data engineering teams, ensuring alignment with architectural standards and best practices.
• Evaluate and continuously improve existing data architectures, optimize performance, and recommend enhancements for efficiency and scalability.
• Collaborate closely with stakeholders to define long-term data strategies and clearly communicate architectural decisions.
• Ensure compliance with industry standards, data governance practices,…
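For context on the dimensional-modeling responsibility above, a star schema centers a fact table (measurable events at a fixed grain) on surrogate keys into descriptive dimension tables. The sketch below is a minimal, self-contained illustration using Python's built-in sqlite3; all table and column names are hypothetical and not taken from the posting.

```python
import sqlite3

# Illustrative star schema: one fact table joined to two dimensions.
# Names (dim_date, dim_product, fact_loan) are invented for this sketch.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,   -- surrogate key, e.g. YYYYMMDD
    calendar_date TEXT
);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT
);
CREATE TABLE fact_loan (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL                     -- the measured fact at this grain
);
""")

cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(20240101, "2024-01-01"), (20240102, "2024-01-02")])
cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "30yr-fixed"), (2, "15yr-fixed")])
cur.executemany("INSERT INTO fact_loan VALUES (?, ?, ?)",
                [(20240101, 1, 300000.0),
                 (20240101, 2, 200000.0),
                 (20240102, 1, 150000.0)])

# Typical star-schema query: aggregate the fact grain by a dimension attribute.
rows = cur.execute("""
    SELECT p.product_name, SUM(f.amount) AS total_amount
    FROM fact_loan f
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY p.product_name
    ORDER BY p.product_name
""").fetchall()
print(rows)  # [('15yr-fixed', 200000.0), ('30yr-fixed', 450000.0)]
```

The same pattern scales up in Databricks: dimensions and facts become Delta Lake tables, and the semantic layer exposes these join paths to BI tools.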