Insight Global is seeking a Lead Data Engineer with expertise in Databricks and Snowflake to design and implement data solutions. This onsite role in Tempe, Arizona requires strong leadership and technical skills to enhance the organization's data infrastructure.
A client in Charlotte, NC is looking for a Lead Data Engineer to join their team. As the Technical Lead Data Engineer, you will lead the design, development, and implementation of data solutions that enable the organization to derive actionable insights from complex datasets. You will guide a team of data engineers, foster collaboration with cross-functional teams, and drive initiatives to strengthen our data infrastructure, CI/CD pipelines, and analytics capabilities.
Responsibilities:
- Apply advanced knowledge of data engineering principles, methodologies, and techniques to design and implement data loading and aggregation frameworks across broad areas of the organization.
- Gather and process raw, structured, semi-structured, and unstructured data using batch and real-time data processing frameworks.
- Implement and optimize data solutions in enterprise data warehouses and big data repositories, with a primary focus on migration to the cloud.
- Deliver new and enhanced capabilities to Enterprise Data Platform partners to meet the needs of product, engineering, and business teams.
- Build enterprise systems using Databricks, Snowflake, and cloud platforms such as Azure, AWS, and GCP.
- Leverage strong Python, Spark, and SQL programming skills to construct robust pipelines for efficient data processing and analysis.
- Implement CI/CD pipelines to automate build, test, and deployment processes and accelerate the delivery of data solutions.
- Apply data modeling techniques to design and optimize data schemas, ensuring data integrity and performance.
- Drive continuous improvement initiatives to enhance the performance, reliability, and scalability of our data infrastructure.
- Collaborate with data scientists, analysts, and other stakeholders to understand business requirements and translate them into technical solutions.
- Implement best practices for data governance, security, and compliance to ensure the integrity and confidentiality of our data assets.
We are a company committed to creating inclusive environments where people can bring their full, authentic selves to work every day. We are an equal opportunity employer that believes everyone matters. Qualified candidates will receive consideration for employment opportunities without regard to race, religion, sex, age, marital status, national origin, sexual orientation, citizenship status, disability, or any other status or characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please submit a request through the Human Resources Request Form. The EEOC "Know Your Rights" Poster is available here. To learn more about how we collect, keep, and process your private information, please review Insight Global's Workforce Privacy Policy: https://insightglobal.com/workforce-privacy-policy/.
Required Skills & Experience
- Must be onsite 5 days per week
- 8+ years of experience in a data engineering role, with expertise in designing and building data pipelines, ETL processes, and data warehouses
- 3+ years of experience working as a lead and developing/mentoring other engineers
- Experience working with Snowflake
- Strong proficiency in SQL, Python, and Spark programming languages
- Strong experience with cloud platforms such as AWS, Azure, or GCP
- Hands-on experience with big data technologies such as Hadoop, Spark, Kafka, and distributed computing frameworks
- Knowledge of data lake and data warehouse solutions, including Databricks, Snowflake, Amazon Redshift, Google BigQuery, Azure Data Factory, and Airflow
- Experience implementing CI/CD pipelines to automate build, test, and deployment processes
- Solid understanding of data modeling concepts, data warehousing architectures, and data management best practices
- Excellent communication and leadership skills, with the ability to effectively collaborate with cross-functional teams and drive consensus on technical decisions
Benefit packages for this role will start on the 31st day of employment and include medical, dental, and vision insurance, as well as HSA, FSA, and DCFSA account options, and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
Eliassen Group is seeking an Advanced Backend Python API Developer with expertise in Azure and Databricks for a remote contract position. The role involves designing and implementing APIs and data pipelines for a complex data and analytics web application.
Infosys is seeking an Azure and Databricks Lead Data Engineer to drive digital transformation for clients. The role involves leveraging Azure technologies and collaborating with stakeholders in a global delivery model.
Computer Enterprises, Inc. is seeking a Senior QA Engineer (SDET) specializing in Databricks and reporting tool automation. This contract-to-hire position requires expertise in data validation and automation within Azure ecosystems.
Shrive Technologies is seeking a Technical Lead with expertise in Python, PySpark, Snowflake, Databricks, and Azure. The role requires strong leadership skills and extensive experience in software development and cloud platforms.
Lensa is seeking an Advanced Backend Python API Developer with expertise in Azure and Databricks for a remote contract position. The role involves designing and implementing APIs and data pipelines for a complex web application.