Responsibilities
- Implement new data engineering features in a highly collaborative work environment alongside product managers and fellow engineers across multiple disciplines
- Help drive the design and implementation of ETL workflows across multiple disparate systems and data sources, and help provide clarity across the customer's data
- Collaborate closely with business partners and internal customers
- Own end-to-end operations of resilient and observable ETL workflows
- Share technical solutions and product ideas through team planning, design reviews, and technical discussions
- Design, build, and operate large-scale distributed data processing systems, data lakes, and lakeshore applications
- Take the initiative on projects and help improve processes, documentation, and overall design by adopting data engineering best practices
- Design and work with large swaths of disparate vendor data sets, document stores, and tabular data

Minimum Requirements
- 5 years of experience with common data engineering languages such as Python (primary language), SQL, and PySpark
- 5 years of experience orchestrating pipelines with Airflow
- 5 years of proven experience building and deploying complex ETL pipelines in on-prem, hybrid, and cloud environments
- 5 years of experience building within AWS in a production environment (e.g., MWAA, Glue/EMR (Spark), S3, ECS/EKS, IAM, Lambda)
- 5 years of experience with core statistical principles, industry-standard data modeling, and data security and classification methodologies
- 5 years of experience with Agile software development, test-driven development, unit testing, code reviews, and design documentation
- 5 years of experience with object-oriented or functional design, including an understanding of common design patterns

Preferred Qualifications
- Experience in the financial industry
- Experience with other modern programming languages such as Java, C++, and C#
- Experience with the Databricks and/or Dremio lakehouse platforms
- Experience with DevOps and CI/CD pipelines, including Terraform or another IaC platform
- Ability to own all stages of the development process: design, testing, implementation, and operational support
- Experience documenting complex data processing workflows and operational guides
Job Type
Hybrid role
Skills required
Python, Agile, Java, C++, C#, CI/CD
Location
Irvine, CA | Los Angeles, CA | Seattle, WA
Salary
Not disclosed.
Date Posted
July 16, 2025
Valorem Reply is seeking a Data Engineer 3 to develop and enhance data engineering solutions. The role involves collaborating with teams to leverage data assets to solve complex business problems.