Spiceorb is seeking a skilled Data Engineer with expertise in Azure, Databricks, DBT, and Snowflake to design and build scalable data pipelines. The role involves transforming raw data into insights and developing robust data models.
Data Engineer (Azure Databricks, DBT, Snowflake)
Seattle, WA | Full-time

Job Description:
• We are seeking a talented and experienced Data Engineer with strong expertise in Azure, Databricks, DBT, and Snowflake to join our team. In this role, you will be responsible for designing and building scalable data pipelines, transforming raw data into valuable insights, and developing robust data models using DBT in combination with Azure, Databricks, and Snowflake.
• This position description identifies the responsibilities and tasks typically associated with the position; other relevant essential functions may be required.
• The ideal candidate will be passionate about working with cutting-edge technologies to solve complex data engineering challenges.

Skills Required:
• Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent experience).
• Proven experience as a Data Engineer with expertise in Azure, Databricks, DBT, and Snowflake.
• Strong experience with Azure Data Factory, Azure Databricks, Azure Data Lake, and other Azure cloud services for data integration and processing.
• Proficiency with DBT for implementing data transformation workflows, creating models, and writing SQL-based scripts.
• Expertise in Snowflake data warehousing, including schema design, performance tuning, and optimization.
• Strong experience with Apache Spark and Databricks for large-scale data processing.
• Solid programming skills in SQL (advanced), Python, and Scala for developing data pipelines and transformation logic.
• Experience with ETL/ELT processes, data orchestration, and automating data workflows using Azure and DBT.
• Knowledge of data governance, security, and best practices for cloud data architectures.
• Familiarity with version control systems such as Git, and experience working in Agile environments.

Preferred Qualifications:
• DBT certifications or experience with advanced features such as DBT testing, macros, and hooks.
• Azure, Databricks, or Snowflake certifications.
• Experience with Snowflake performance tuning, including optimization of queries, schemas, and data partitioning.
• Familiarity with CI/CD practices and experience building automated pipelines for data workflows.
• Knowledge of cloud cost optimization in Azure and Snowflake for better resource utilization.
Tiger Analytics is seeking a Principal Data Engineer with expertise in Azure and Databricks to design and build scalable data solutions. This role involves collaborating with various teams to deliver high-performance data processing and analytics solutions.
Saransh Inc is seeking a Data Engineer with expertise in Azure and Snowflake to join their team in Washington, DC. The role involves developing data pipelines and working with cloud services to solve complex data engineering challenges.
Saransh Inc is seeking a Data Engineer with expertise in Azure and Snowflake to join their team in Seattle. The role involves developing data pipelines and transformation logic using cutting-edge technologies.
Join Motion Recruitment Partners LLC as a Data Engineer in Philadelphia, PA, focusing on Snowflake, Databricks, and Azure technologies. This hybrid role offers the chance to collaborate across teams in a well-established advertising company.
Exusia is seeking a Lead Data Engineer with expertise in Azure, Databricks, and Snowflake to design and develop data systems for their clients. This full-time remote position requires strong analytical skills and experience in ETL processes.