About Reach Security

Reach Security (https://reach.security) builds self-driving cybersecurity. Reach employs the first generative AI for cybersecurity to oversee an organization's current security stack, aiming to achieve the best possible security posture with the products already in use.

About the Role

We are seeking Data Platform Engineers at all levels to design, build, and manage the platform and infrastructure that power robust data pipelines and analytic query engines. You'll play a key role in developing scalable, performant solutions using technologies like Trino (Presto), Redshift, BigQuery, and Apache Iceberg, supporting advanced analytics and reporting.

The ideal candidate is a motivated problem solver who prioritizes high-quality solutions and excels at navigating ambiguity. As an early team member, you will have the opportunity to take ownership of various aspects of our backend from day one. Your role will be pivotal in establishing engineering best practices, balancing engineering priorities with business needs, and identifying innovative approaches to deliver outstanding value to our users. You will apply your engineering knowledge by designing top-notch architectures, offering insightful feedback on technical designs, solving difficult problems, and conducting thorough code reviews, all aimed at ensuring the software we build is both maintainable and dependable.

In this role, you will:
• Architect, build, and maintain scalable and reliable data platform infrastructure.
• Implement and optimize analytic query engines using technologies like Trino (Presto), Redshift, and BigQuery.
• Design and support robust data management solutions leveraging Apache Iceberg.
• Collaborate closely with Data Engineering and Analytics teams to ensure effective integration, schema detection, and schema evolution.
• Develop and maintain observability frameworks to monitor and troubleshoot data pipelines and platform performance.
• Implement best practices for data modeling, schema design, and pipeline fan-out strategies.
• Ensure data integrity, quality, and consistency across Medallion architectures, star schemas, and Lakehouse environments.
• Proactively identify opportunities to enhance platform scalability, efficiency, and reliability.

Success in this role requires:
• 3+ years of experience in data platform engineering or related roles focused on infrastructure and data management.
• Deep experience with analytic query engines and platforms such as Trino (Presto), Redshift, BigQuery, and Apache Iceberg.
• Expertise in designing platforms that support Medallion architecture, star schemas, and schema evolution.
• A solid foundation in automated schema detection, observability, and performance optimization.
• Proficiency with cloud-based platforms and services (AWS, Azure, GCP, etc.).
• Strong skills in Python and experience with modern infrastructure automation tools.
• Excellent problem-solving abilities, particularly in complex data scenarios requiring performance tuning and scalability solutions.
• Strong communication skills, with the ability to articulate platform design concepts clearly to both technical and non-technical audiences.
• A proactive and collaborative mindset, comfortable working independently and within fast-paced teams.
• Must be a US citizen or Green Card holder.

Ways to stand out:
• Extensive experience developing highly observable and scalable data platforms.
• Proven expertise in advanced query optimization, database scaling, and platform architecture.
• Familiarity with Infrastructure as Code (IaC) tools such as Terraform, Pulumi, or AWS CDK.
• Demonstrated ability to align platform engineering strategies closely with strategic business objectives.

What we offer:
• Competitive salary and equity.
• Comprehensive health, dental, and vision insurance.
• Remote work flexibility.
Job Type
Full-time role
Skills required
Azure, Python
Location
San Francisco, California
Salary
Not specified.
Date Posted
July 9, 2025
Reach Security is seeking Data Platform Engineers to design, build, and manage robust data pipelines and analytic query engines. The role involves developing scalable solutions using technologies like Trino, Redshift, BigQuery, and Apache Iceberg.