HG Solutions is looking for a Data Engineer Associate III to design and build scalable data pipelines using Azure, Databricks, DBT, and Snowflake. The role involves transforming raw data into insights and developing robust data models.
Data Engineer Associate III - Data Engineering

We are seeking a talented and experienced Data Engineer with strong expertise in Azure, Databricks, DBT, and Snowflake to join our team. In this role, you will be responsible for designing and building scalable data pipelines, transforming raw data into valuable insights, and developing robust data models using DBT in combination with Azure, Databricks, and Snowflake.

The opportunity:
- Design, develop, and maintain data pipelines using Azure Data Factory, Databricks, DBT, and Snowflake for seamless data integration, transformation, and analysis.
- Implement data transformation models using DBT to ensure high-quality and consistent data transformation within the Snowflake ecosystem.
- Leverage Databricks and Apache Spark to process large datasets and integrate them with Snowflake for efficient data storage and analytics.
- Work closely with data scientists, analysts, and business teams to define data requirements, deliver insights, and ensure data quality and consistency.
- Develop and maintain data models and data pipelines that support reporting, analytics, and business intelligence applications.
- Automate and orchestrate data workflows with Azure Data Factory and DBT to streamline data processing and delivery.
- Optimize Snowflake data structures (e.g., schemas, tables, and views) to ensure efficient data storage, retrieval, and performance.

What you need:
The ideal candidate will be passionate about working with cutting-edge technologies to solve complex data engineering challenges.
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent experience).
- Proven experience as a Data Engineer with expertise in Azure, Databricks, DBT, and Snowflake.
- Strong experience with Azure Data Factory, Azure Databricks, Azure Data Lake, and other Azure cloud services for data integration and processing.
- Proficiency with DBT for implementing data transformation workflows, creating models, and writing SQL-based scripts.
- Expertise in working with Snowflake for data warehousing, including experience with schema design, performance tuning, and optimization.
- Strong experience with Apache Spark and working in Databricks for large-scale data processing.
- Solid programming skills in SQL (advanced), Python, and Scala for developing data pipelines and transformation logic.
- Experience with ETL/ELT processes, data orchestration, and automating data workflows using Azure and DBT.
- Knowledge of data governance, security, and best practices for cloud data architectures.
- Familiarity with version control systems such as Git, and experience in Agile environments.

Preferred Qualifications:
- DBT certifications or experience with advanced features such as DBT testing, macros, and hooks.
- Azure, Databricks, or Snowflake certifications.
- Experience with Snowflake performance tuning, including optimization of queries, schemas, and data partitioning.
- Familiarity with CI/CD practices and experience building automated pipelines for data workflows.
- Knowledge of cloud cost optimization in Azure and Snowflake for better resource utilization.
Phoenix Staff Inc is seeking an Applications and Data Engineering Projects Lead to oversee enterprise applications and data architecture in a hybrid role. The position requires strong technical leadership and project management skills in healthcare IT.
The BI Data Engineering Manager at McLane Company, Inc. will lead the Data Engineering team to enhance data infrastructure and productivity using cloud technologies. This hybrid position requires hands-on experience with Azure and big data technologies.
The IT Analyst III - Database Administrator at the State of Utah is responsible for managing and maintaining database performance, security, and availability across various platforms. This hybrid role involves both in-office and remote work, focusing on database administration for executive branch agencies.
The Judge Group Inc. is seeking a Product Manager specializing in Data Engineering and Analytics to lead the development of a scalable data delivery platform. This remote position requires strong collaboration across technology and analytics teams.
The Head of Data Engineering & Analytics at AlTi Tiedemann Global will lead the development and execution of the firm's enterprise data engineering strategy. This role involves architecting scalable data solutions and managing a high-performing data engineering team to support business intelligence and analytics across global operations.