For this position, you will have:
- Undergraduate degree in data or computer science, IT, statistics, or mathematics preferred
- Minimum of 2 years of experience as a Data Engineer in a Databricks environment
- Specific expertise in Databricks Delta Lake, notebooks, and clusters
- Data Vault modeling experience
- Knowledge of big data technologies such as Hadoop, Spark, and Kafka
- Strong understanding of relational data structures, theories, principles, and practices
- Proficiency in Python and SQL
- Strong understanding of data modeling, algorithms, and data transformation strategies for data science consumption
- Experience with performance metric monitoring and improvement
- Experience analyzing and specifying technical and business requirements
- Ability to create consistent requirements documentation in both technical and user-friendly language
- Excellent critical thinking skills and an understanding of the relationships between data and business intelligence
- Strong communication skills with technical and non-technical audiences
- Ability to work remotely and collaborate with a geographically distributed team

In this position, you will:
- Support BPM's culture of data, representing the firm's approach to data management, stewardship, lineage, architecture, collection, storage, and utilization for delivering analytic results
- Deliver, maintain, and build trusted business relationships that contribute to BPM's data culture
- Stay current with the latest technologies and methodologies while keeping a pragmatic mindset
- Participate in technology roadmaps and maintain data pipeline and tool documentation

Data Pipeline Development
- Build, maintain, and govern data pipelines in an Azure environment using best-of-breed technology
- Develop pipelines to the data lakehouse, ensuring scalability, reliability, security, and usability for insights and decision-making
- Develop, deploy, and support high-quality, fault-tolerant data pipelines
- Build infrastructure for optimal extraction, loading, and transformation of data from various sources
- Support the architecture for observing, cataloging, and governing data

ETL / ELT
- Build and optimize ELT functionality using Python, dbt, and SQL
- Monitor and troubleshoot ELT processes to ensure accuracy and reliability
- Implement development best practices, including technical design reviews, test plans, peer code reviews, and documentation

Data Governance & Security
- Implement data governance and access controls to ensure data security and compliance
- Collaborate with security to implement encryption, authentication, and authorization mechanisms
- Monitor and audit data access to maintain data privacy and integrity

Collaboration & Communication
- Collaborate with cross-functional stakeholders and IT to deliver meaningful outcomes
- Profile data sources and understand data relationships to support analytics
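For candidates unfamiliar with the ELT pattern named in the responsibilities (Python, dbt, and SQL), here is a minimal illustrative sketch. It is not from the posting: it uses Python's stdlib sqlite3 as a stand-in for the warehouse, and the SQL transform step plays the role a dbt model would own (load raw data first, then clean and type it inside the database).

```python
import sqlite3

# Minimal ELT sketch: load raw, untyped records first ("EL"),
# then transform with SQL inside the database ("T").
# sqlite3 here is only a stand-in for a cloud warehouse.
raw_rows = [
    ("2025-06-01", "acme", "  142.50 "),
    ("2025-06-01", "globex", "87.00"),
    ("2025-06-02", "acme", None),  # bad record: missing amount
]

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_sales (sale_date TEXT, customer TEXT, amount TEXT)"
)
conn.executemany("INSERT INTO raw_sales VALUES (?, ?, ?)", raw_rows)

# Transform step: cast types and filter bad rows in SQL,
# materializing a curated table (analogous to a dbt model).
conn.execute("""
    CREATE TABLE sales AS
    SELECT sale_date,
           customer,
           CAST(TRIM(amount) AS REAL) AS amount
    FROM raw_sales
    WHERE amount IS NOT NULL
""")

totals = dict(conn.execute(
    "SELECT customer, SUM(amount) FROM sales GROUP BY customer"
))
print(totals)  # {'acme': 142.5, 'globex': 87.0}
```

Keeping the raw table untouched and doing all cleanup in a downstream SQL transform is what distinguishes ELT from classic ETL, and it is why tools like dbt fit this workflow.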
Job Type
Remote role
Skills required
Python, Azure
Location
United States
Salary
Not specified
Date Posted
June 4, 2025
BPM is seeking a Data Engineer to build and maintain data pipelines in a cutting-edge Azure environment, supporting a data-driven culture. The role involves collaborating with teams to optimize data infrastructure and ensure data governance.