The Senior Data Architect at loanDepot is responsible for designing and maintaining scalable enterprise data architectures using Azure Databricks and Python. The role requires extensive experience in Big Data solutions and collaboration with engineering teams.
Position: Senior Data Architect (Azure Databricks/Python)

Position Summary

The Senior Data Architect is responsible for designing, developing, and maintaining robust, scalable, and high-performance enterprise data architectures within a modern cloud environment. This role requires deep expertise in Big Data solutions (Delta Lake architecture), modern data warehouse practices and operations, semantic layering, dimensional modeling (star schemas), transactional OLTP databases (3NF modeling), and advanced data modeling techniques. The ideal candidate will have at least 15 years of data modeling experience specifically within Business Intelligence and Analytics contexts, extensive hands-on experience with batch and streaming data processing, and strong expertise with Apache Spark, Databricks, and Spark Structured Streaming. Required skills include proficiency in Python programming, Azure cloud technologies, semantic modeling, and modern CI/CD deployment practices. Experience in ML engineering is highly desirable. The candidate must be able to collaborate quickly and effectively with data and engineering teams, clearly document source-to-target mappings, and reverse engineer existing database objects such as stored procedures, views, and complex SQL queries.

Responsibilities

• Lead architecture design and implementation of enterprise-scale data platforms leveraging Databricks, Delta Lake, Azure cloud, and modern Big Data technologies.
• Design, build, and maintain modern data warehouse solutions using dimensional modeling (star schema) and semantic layering to optimize analytics and reporting capabilities.
• Define and enforce data modeling standards, guidelines, and best practices within analytics and BI contexts.
• Architect robust batch processing and real-time streaming solutions using Apache Spark, Databricks, Kafka, Kinesis, and Spark Structured Streaming (a minimal sketch of this pattern follows the list).
• Effectively collaborate with engineering teams to rapidly deliver data architecture solutions and support agile development practices.
• Provide clear, comprehensive source-to-target documentation, data lineage mappings, and semantic layer definitions.
• Reverse engineer existing database structures, including stored procedures, views, and complex SQL logic, to document existing data processes and support modernization initiatives.
• Provide technical leadership, mentoring, and guidance to data engineering teams, ensuring alignment with architectural standards and best practices.
• Evaluate and continuously improve existing data architectures, optimize performance, and recommend enhancements for efficiency and scalability.
• Collaborate closely with stakeholders to define long-term data strategies and clearly communicate architectural decisions.
• Ensure compliance with industry standards, data governance practices,…
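To make the streaming responsibility concrete, here is a minimal sketch of the pattern the posting names: Spark Structured Streaming reading events from Kafka and appending them to a Delta Lake table. The topic name, event schema, checkpoint path, and table name below are illustrative assumptions, not details from the posting; on Databricks the Kafka and Delta connectors are built in, while a plain Spark deployment would need the corresponding packages (Spark 3.1+ for toTable).

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("loan-events-ingest").getOrCreate()

# Hypothetical event schema, invented for the example.
schema = StructType([
    StructField("loan_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read a Kafka topic as a stream; broker address and topic are placeholders.
raw = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "loan-events")
    .load()
)

# Kafka delivers raw bytes; parse the JSON payload into typed columns.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append to a Delta table; the checkpoint makes the stream restartable
# with exactly-once writes into Delta.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/loan_events")
    .outputMode("append")
    .toTable("bronze.loan_events")
)
query.awaitTermination()
```

Landing raw events in a bronze table like this is the usual first step before the dimensional (star schema) modeling and semantic layering the posting also calls for.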
Similar Roles

Cognizant North America is seeking an experienced Azure Databricks Architect to design and implement scalable data solutions remotely. The ideal candidate will have extensive experience in Azure technologies and data engineering.
Altera is seeking a skilled Data Architect specializing in Lakehouse architecture using Databricks, Azure, and Microsoft Fabric in San Jose, California. The role involves designing and managing cloud-based data architectures to support large-scale data processing and analytics, collaborating with cross-functional teams to optimize data pipelines and ensure data governance.
Shrive Technologies is seeking a Technical Lead with expertise in Python, PySpark, Snowflake, Databricks, and Azure. The role requires strong leadership skills and extensive experience in software development and cloud platforms.
i.t.motives is seeking a Data Engineer with expertise in Azure, Python, and Databricks to build foundational data infrastructure for AI-driven solutions. This remote contract position emphasizes collaboration, problem-solving, and ethical data practices.