The Senior Data Architect at loanDepot is responsible for designing and implementing scalable data architectures in Azure, focusing on Big Data solutions and modern data warehousing. The role requires extensive experience in data modeling, batch and streaming data processing, and collaboration with engineering teams.
The Senior Data Architect is responsible for designing, developing, and maintaining robust, scalable, high-performance enterprise data architectures within a modern cloud environment. The role requires deep expertise in Big Data solutions (Delta Lake architecture), modern data warehouse practices and operations, semantic layering, dimensional modeling (star schemas), transactional OLTP databases (3NF modeling), and advanced data modeling techniques. The ideal candidate has at least 15 years of data modeling experience specifically within Business Intelligence and Analytics contexts, extensive hands-on experience with batch and streaming data processing, and strong expertise with Apache Spark, Databricks, and Spark Structured Streaming. Required skills include proficiency in Python programming, Azure cloud technologies, semantic modeling, and modern CI/CD deployment practices; ML engineering experience is highly desirable. The candidate must collaborate quickly and effectively with data and engineering teams, clearly document source-to-target mappings, and reverse engineer existing database objects such as stored procedures, views, and complex SQL queries.

Responsibilities:
- Lead architecture design and implementation of enterprise-scale data platforms leveraging Databricks, Delta Lake, Azure cloud, and modern Big Data technologies.
- Design, build, and maintain modern data warehouse solutions using dimensional modeling (star schema) and semantic layering to optimize analytics and reporting capabilities.
- Define and enforce data modeling standards, guidelines, and best practices within analytics and BI contexts.
- Architect robust batch processing and real-time streaming solutions using Apache Spark, Databricks, Kafka, Kinesis, and Spark Structured Streaming.
- Collaborate effectively with engineering teams to rapidly deliver data architecture solutions and support agile development practices.
- Provide clear, comprehensive source-to-target documentation, data lineage mappings, and semantic layer definitions.
- Reverse engineer existing database structures, including stored procedures, views, and complex SQL logic, to document current data processes and support modernization initiatives.
- Provide technical leadership, mentoring, and guidance to data engineering teams, ensuring alignment with architectural standards and best practices.
- Evaluate and continuously improve existing data architectures, optimize performance, and recommend enhancements for efficiency and scalability.
- Collaborate closely with stakeholders to define long-term data strategies and clearly communicate architectural decisions.
- Ensure compliance with industry standards, data governance practices, regulatory requirements, and security guidelines.
- Champion modern DevOps and CI/CD practices for data and analytics pipelines.

Requirements:
- Expert working knowledge of the data platform landscape and of best practices for relational databases, NoSQL databases, and Big Data architectures.
- Minimum of 15 years of hands-on data modeling experience, specifically in Business Intelligence (BI) and Analytics contexts.
- Extensive experience designing and implementing modern data architectures, including Big Data solutions (Delta Lake), modern data warehouses (star schema/dimensional modeling), semantic layering, and transactional OLTP (3NF) data modeling.
- Prior experience designing and building a Delta Lake using the medallion architecture (see the sketch after this list).
- Deep understanding of relational and dimensional modeling, normalization (3NF), semantic modeling, and transactional database design principles.
- Proven ability to produce detailed source-to-target mappings, data lineage documentation, and semantic definitions, and to reverse engineer existing stored procedures, views, and SQL logic.
- Demonstrated expertise in Apache Spark, Databricks, and Spark Structured Streaming for batch and real-time data processing.
- Proficiency in Python programming required.
- Strong experience with Azure cloud technologies, including Azure Data Factory, Azure Storage, Azure Databricks, and related data services.
- Solid experience designing streaming data solutions using Kafka, Kinesis, or similar streaming technologies.
- Knowledge and hands-on experience implementing modern CI/CD practices for data engineering and analytics solutions.
- ML engineering experience, or exposure to ML pipelines and model deployment processes, highly desirable.
- Experience in the mortgage or financial services industry preferred but not required.
- Exposure to Fivetran and Dynamics CRM a plus.
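To make the medallion-architecture and Structured Streaming requirements above concrete, here is a minimal, illustrative PySpark sketch, not a loanDepot implementation: it streams raw JSON events into a bronze Delta table, then refines them into a silver table. The paths, schema, and column names (loan_id, amount, event_ts) are hypothetical, and the delta-spark package is assumed to be installed.

```python
# Illustrative medallion-architecture sketch with PySpark + Delta Lake.
# All paths, the schema, and column names are hypothetical examples.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("medallion-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Bronze: land raw JSON events as-is in a Delta table, preserving history.
raw_events = (
    spark.readStream.format("json")
    .schema("loan_id STRING, amount DOUBLE, event_ts TIMESTAMP")  # hypothetical schema
    .load("/landing/loan_events/")
)
(raw_events.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/bronze_loan_events")
    .start("/lake/bronze/loan_events"))

# Silver: read the bronze table as a stream, drop malformed rows, and
# de-duplicate late arrivals within a one-hour watermark.
clean_events = (
    spark.readStream.format("delta").load("/lake/bronze/loan_events")
    .filter(F.col("loan_id").isNotNull())
    .withWatermark("event_ts", "1 hour")
    .dropDuplicates(["loan_id", "event_ts"])
)
(clean_events.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/silver_loan_events")
    .start("/lake/silver/loan_events"))

spark.streams.awaitAnyTermination()  # keep both streaming queries running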
In a full medallion design, a gold layer would typically aggregate the silver tables into star-schema fact and dimension tables feeding the semantic layer; that step is omitted here for brevity.

About loanDepot: loanDepot (NYSE: LDI) is a digital commerce company committed to serving its customers throughout the homeownership journey. Since its launch in 2010, loanDepot has revolutionized the mortgage industry with a digital-first approach that makes it easier, faster, and less stressful to purchase or refinance a home. Today, as the nation's second-largest non-bank retail mortgage lender, loanDepot enables customers to achieve the American dream of homeownership through a broad suite of lending and real estate services that simplify one of life's most complex transactions. With headquarters in Southern California and offices nationwide, loanDepot is committed to serving the communities in which its team lives and works through a variety of local, regional, and national philanthropic efforts.

We are an equal opportunity employer and value diversity in our company. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Employment type: Full-time
Job function: Engineering and Information Technology