The Azure DataOps / Databricks Architect role centers on designing and managing cloud-native data solutions on Azure, with a focus on Databricks and Snowflake infrastructure. The position requires expertise in Terraform and data governance, along with close collaboration with cross-functional teams.
Data Platform Infrastructure (Data Infrastructure resources 1-2):
- Capacity-plan, configure, deploy, and maintain Databricks clusters, workspaces, and Snowflake infrastructure on Azure.
- Use Terraform to automate provisioning of Databricks clusters, workspaces, Snowflake, and associated Azure resources, treating infrastructure as code for consistency and repeatability (a minimal provisioning sketch follows this list).
- Monitor and optimize Databricks cluster performance and Snowflake resource utilization; troubleshoot issues to ensure optimal performance and cost-effectiveness.
- Implement and manage access controls and security policies to protect sensitive data.
- Develop environment strategies and governance across the technology stack based on best practices.
- Provide technical support to Databricks and Snowflake users, including troubleshooting and issue resolution.
- Implement and enforce security policies, RBAC, access controls, and encryption mechanisms.
- Develop and maintain backup and disaster recovery strategies to ensure data integrity and availability.
- Collaborate with cross-functional teams, including data scientists, data engineers, and business analysts, to understand their requirements and provide technical solutions.

Data Governance and Quality Management:
- Create and enforce data governance standards, ensuring robust data quality and compliance through tools such as Databricks Unity Catalog, Collibra, and Snowflake Polaris.
- Enforce data governance, data quality, and enterprise standards to support a robust production environment.

Required Experience:
- Data platform engineering: proven track record of architecting and delivering cloud-native data solutions on Azure using Terraform infrastructure as code.
- Azure, Databricks, and Snowflake proficiency: strong data warehousing and lakehouse skills with hands-on experience in Azure, Databricks, Delta Lake, and Snowflake.
- Tooling knowledge: experience with version control (GitHub), CI/CD pipelines (Azure DevOps, GitHub Actions), and data orchestration and dashboarding tools.
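As a rough illustration of the provisioning responsibility above, the sketch below creates a small auto-terminating Databricks cluster using the Databricks SDK for Python rather than Terraform (the tool the posting names); the cluster name, runtime version, and node type are placeholder values, not requirements of the role.

```python
# Minimal sketch: provision an auto-terminating Databricks cluster.
# A Terraform databricks_cluster resource would express the same settings
# declaratively; all values here are illustrative placeholders.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()  # auth resolved from environment variables or ~/.databrickscfg

cluster = w.clusters.create(
    cluster_name="analytics-shared",    # placeholder name
    spark_version="14.3.x-scala2.12",   # example LTS runtime
    node_type_id="Standard_DS3_v2",     # example Azure VM size
    num_workers=2,
    autotermination_minutes=30,         # shut down when idle to control cost
).result()                              # block until the cluster is running

print(cluster.cluster_id, cluster.state)
```

In practice the equivalent Terraform definition would live in version control and be applied through a CI/CD pipeline (Azure DevOps or GitHub Actions), which is what "consistency and repeatability" in the responsibility above refers to.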
Altera is seeking a skilled Data Architect specializing in Lakehouse architecture using Databricks, Azure, and Microsoft Fabric to design and manage cloud-based data solutions. The role involves collaboration with cross-functional teams to optimize data pipelines and ensure data governance.
HiCounselor is seeking an Azure Databricks Architect to modernize and migrate data platforms for a client in the Midwest. The role involves designing scalable architectures and developing automated data pipelines using Databricks and Azure technologies.
Infosys is seeking an Azure Databricks Senior Cloud Architect to drive digital transformation for clients through innovative architectural solutions. The role requires extensive experience in Azure, Databricks, and data engineering tools.
RIT Solutions Inc is seeking a DataOps / Databricks Architect in Phoenix, Arizona, with strong expertise in Databricks and Terraform. The role focuses on architecting and managing cloud-native data solutions on Azure.