Drillo.AI is seeking an Azure Data Engineer with Snowflake experience to design and optimize data pipelines. This onsite role in Berkeley Heights, NJ requires expertise in Azure Data services and strong ETL development skills.
This is an onsite-only role open to residents of New Jersey and New York; at this time, only Green Card holders and US Citizens are being considered.

About the Role
The Azure Data Engineer will design, build, and optimize scalable data pipelines and solutions using Azure Data services, Databricks, and Snowflake. This position requires deep technical expertise in Azure cloud, strong ETL/data pipeline development experience with Databricks, and hands-on administration and optimization of Snowflake for analytics and reporting.

Responsibilities
• Design, develop, and maintain scalable data pipelines using Azure Data Factory, Azure Databricks, and Snowflake.
• Administer, configure, and monitor Databricks clusters, workspaces, integrations, RBAC, and ACLs, and manage workloads for optimal performance and cost.
• Administer, configure, and optimize Snowflake warehouses, users, policies, roles, RBAC/permissions, resource monitors, storage, and compute.
• Develop ETL and ELT solutions in Databricks with Spark, Python (PySpark), and SQL for large data sets (a minimal illustrative sketch follows this posting).
• Implement data modeling techniques and maintain data structures in Snowflake (star schema, data cubes, etc.).
• Build and troubleshoot complex SQL queries; tune and optimize data transformations for performance.
• Collaborate with business users, data scientists, and analysts to understand requirements and deliver robust, secure data solutions.
• Ensure data governance, security, privacy, and compliance across Azure and Snowflake platforms.
• Monitor and tune data workflows for efficiency and cost effectiveness; proactively identify and resolve platform issues.
• Document data workflows and configurations, and provide knowledge transfer to peers and stakeholders.

Qualifications
• 5+ years of experience in data engineering or related data platform roles.
• Demonstrated expertise in Azure Data Lake, Databricks, and Data Factory.
• Expert-level skills in Spark (PySpark) and Python for building scalable ETL workflows.
• Strong experience with Snowflake: user/role management, warehouse configuration, query optimization, and resource control.
• Deep knowledge of data modeling/design and warehouse concepts (star and snowflake schemas).
• Solid working knowledge of SQL optimization and data processing on cloud platforms.
• Familiarity with DevOps, CI/CD, and version control (Azure DevOps, Git, Azure Pipelines).
• Strong troubleshooting, communication, and collaboration skills.
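The sketch below is a minimal illustration of the kind of PySpark ETL work described in the responsibilities above: reading raw data from Azure Data Lake Storage, applying a simple aggregation, and loading the result into Snowflake via the Spark Snowflake connector. The storage paths, credentials, and table names are hypothetical placeholders, and the spark-snowflake connector is assumed to be installed on the Databricks cluster; this is not the employer's actual pipeline.

```python
# Illustrative PySpark ETL sketch (hypothetical names throughout):
# read raw parquet from ADLS Gen2, aggregate, and load the result into Snowflake.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily_etl").getOrCreate()

# Hypothetical ADLS Gen2 source path; storage access is assumed to be configured on the cluster.
raw_orders = spark.read.parquet(
    "abfss://raw@examplelake.dfs.core.windows.net/sales/orders/"
)

# Simple transformation: keep completed orders and aggregate revenue per day and region.
daily_revenue = (
    raw_orders
    .filter(F.col("order_status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_timestamp"))
    .groupBy("order_date", "region")
    .agg(
        F.sum("order_amount").alias("total_revenue"),
        F.countDistinct("order_id").alias("order_count"),
    )
)

# Hypothetical Snowflake connection options; in practice these would come from
# a secret scope or key vault rather than being hard-coded.
sf_options = {
    "sfURL": "example_account.snowflakecomputing.com",
    "sfUser": "ETL_SERVICE_USER",
    "sfPassword": "<from-secret-scope>",
    "sfDatabase": "ANALYTICS",
    "sfSchema": "SALES",
    "sfWarehouse": "ETL_WH",
}

# Write via the spark-snowflake connector (assumed installed on the cluster).
(
    daily_revenue.write
    .format("snowflake")
    .options(**sf_options)
    .option("dbtable", "DAILY_REVENUE")
    .mode("overwrite")
    .save()
)
```

In a production Databricks job, the credentials would typically be pulled from a secret scope and the write mode chosen per table (overwrite for full refreshes, append or merge for incremental loads).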
Infosys Limited is seeking a Lead Data Engineer with expertise in Azure, Databricks, Snowflake, Python, and SQL to design and optimize scalable data solutions. The role is based in Bellevue, WA, and requires collaboration with various stakeholders to solve complex data challenges.
Beechwood Computing Limited is seeking a Certified Azure Synapse Engineer with experience in Databricks and Snowflake. The role involves designing and optimizing scalable data pipelines and requires strong proficiency in Python or Scala.
Yantran LLC is seeking a Senior Software Developer with strong expertise in Azure Data Factory and data integration solutions. The role involves designing, developing, and implementing cloud-based data solutions in a collaborative environment.
Floga Technologies is seeking a Python Full Stack Developer with Azure Databricks experience to develop scalable applications and manage data solutions. The role requires strong expertise in Python, SQL, and Azure services.
Ccube is seeking a Senior/Lead Azure Data Engineer with expertise in Snowflake to design and maintain data infrastructure. The role involves developing scalable data pipelines and ensuring data quality for analysis and reporting.