Responsibilities
• Develop and optimize data pipelines, ETL/ELT workflows, and data transformations.
• Handle complex data integration challenges, crafting solutions that balance performance, scalability, and flexibility.
• Take ownership of coding and implementation, ensuring projects are executed efficiently and effectively.
• Develop and enforce data quality measures to maintain high levels of integrity and consistency.
• Apply critical thinking and non-traditional approaches to improve efficiency and solve data problems in innovative ways.
• Challenge conventional solutions when needed, focusing on customized, efficient, and future-proof designs.
• Build data solutions that align with business needs and support scalability and long-term growth.
• Implement query performance tuning, indexing, partitioning, and caching strategies for optimized processing (see the sketch after this section).
• Identify and eliminate performance bottlenecks before they impact business users.
• Design and maintain data models, warehouses, and integration frameworks that support rapid analytics.
• Develop automated monitoring, alerting, and logging to reduce manual oversight.
• Implement governance best practices, ensuring security, compliance, and auditability.
• Establish and document best practices, workflows, and troubleshooting guides.
• Act as a bridge between engineering and analytics, ensuring data is structured and accessible for end users.
• Work alongside Data Modeling Engineers, Analysts, and IT teams to align technical solutions with business objectives.
• Take ownership of assigned tasks and ensure projects move forward without direct oversight.
Qualifications & Experience
• 3–7 years of hands-on experience in data engineering, data pipeline development, or database architecture.
• Expert-level SQL skills (Snowflake, SQL Server, etc.), with experience in performance tuning.
• Degree-level education in Computer Science or a similar field.
• Strong experience in ETL/ELT development (Azure Data Factory, dbt, Apache Airflow, SSIS, etc.).
• Experience working with large-scale data processing, transformation, and storage solutions.
• Ability to write clean, efficient, and scalable code for data movement and transformation.
• Problem-solving mindset, able to adapt and find unconventional solutions when needed.
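The posting does not prescribe a specific approach, but as a minimal sketch of the partitioning and pruning work described in the responsibilities, the example below uses Snowflake syntax (one of the platforms named above); the table and column names are hypothetical.

```sql
-- Hypothetical fact table, clustered on the date column so that
-- date-filtered queries prune non-matching micro-partitions (Snowflake).
CREATE OR REPLACE TABLE sales_fact (
    sale_date DATE,
    store_id  NUMBER,
    amount    NUMBER(12, 2)
)
CLUSTER BY (sale_date);

-- Filtering on the clustering key lets the engine skip partitions
-- instead of scanning the full table.
SELECT store_id, SUM(amount) AS total_sales
FROM   sales_fact
WHERE  sale_date >= '2025-01-01'   -- pruning happens on this predicate
GROUP  BY store_id;
```

Clustering by the column most queries filter on is one common way to meet the "eliminate performance bottlenecks before they impact business users" responsibility; the equivalent on SQL Server would be table partitioning with a partition function on the date column.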
Job Type
Full-time role
Skills required
SQL (Snowflake, SQL Server), ETL/ELT development (Azure Data Factory, dbt, Apache Airflow, SSIS), performance tuning, large-scale data processing.
Location
Collierville, Tennessee
Salary
Not specified.
Date Posted
July 15, 2025
Robert Half is seeking a Data Engineer in Collierville, Tennessee, to develop and optimize data pipelines and ensure data quality. The role requires strong SQL skills and experience in ETL/ELT development.