• Must be local to Denver, CO and be able to work on a W2; position is 5 days a week onsite

POSITION SUMMARY:

Job Title: Techno-Functional Data Integration Engineer – Databricks & Enterprise Applications

Job Description:
We are seeking a Techno-Functional Data Integration Engineer with deep expertise in Databricks, Azure data services, and enterprise applications (ERP & CRM) such as Oracle ERP, SAP, and Salesforce. This hybrid role is ideal for someone who can bridge the gap between business and technology, driving data integration efforts while aligning closely with functional teams to understand business processes, data requirements, and enterprise system logic. The successful candidate will lead the development of robust data pipelines, facilitate stakeholder collaboration, and ensure smooth integration across internal systems and third-party platforms.

Key Responsibilities:
• Collaborate with business and IT stakeholders to gather requirements and translate functional needs into scalable data integration solutions.
• Design and implement ETL/ELT pipelines to connect and transform data between enterprise systems (Salesforce, Oracle ERP, SAP) and cloud data platforms.
• Serve as a liaison between data engineering and functional teams (finance, sales, operations) to align on data mapping, usage, and validation logic.
• Build and optimize Databricks pipelines using Python, SQL, Delta Lake, and Spark for ingestion, transformation, and enrichment.
• Ensure data models reflect business logic and ERP/CRM workflows, maintaining data quality, consistency, and traceability.
• Use tools such as Fivetran, Azure Data Factory, and custom connectors to manage end-to-end integration processes.
• Manage CI/CD pipelines in Azure DevOps for reliable deployment of data workflows and infrastructure.
• Maintain documentation for functional mappings, data flows, technical specifications, and system interfaces.
• Work with architects and security teams to implement and monitor data security, governance, and access control policies.
• Contribute to ongoing optimization of system performance, ensuring responsiveness and scalability of data solutions.
• Provide guidance and mentorship to junior engineers and support user acceptance testing (UAT) from both a functional and technical perspective.
• Participate in cross-functional project planning and act as a subject matter expert for data from ERP/CRM platforms.

Qualifications:
• Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field; Master's preferred.
• 5–7 years of experience in data engineering, integration, or analytics roles, with a focus on enterprise applications.
• Strong knowledge of Databricks, Spark, SQL, Python, and Delta Lake.
• Hands-on experience integrating data from Salesforce, Oracle ERP, SAP, or other major enterprise platforms.
• Solid understanding of ERP/CRM business processes (order-to-cash, procure-to-pay, lead-to-opportunity, etc.).
• Experience with Azure services (Data Lake, Azure Data Factory, Key Vault, DevOps).
• Ability to gather business requirements and translate them into technical solutions.
• Familiarity with data modeling, data governance practices, and security/access control strategies.
• Strong analytical and problem-solving skills with a collaborative mindset.

Preferred Skills:
• Experience with Delta Live Tables, Unity Catalog, Delta Sharing, and Fivetran connectors.
• Background working with business-facing stakeholders or cross-functional teams (e.g., finance, operations, sales).
• Knowledge of cloud data platforms (Azure, AWS, or GCP) and hybrid on-prem/cloud architectures.
• Agile and DevOps knowledge to support fast iteration and reliable deployment.
• Excellent written and verbal communication skills, able to present to both technical and non-technical audiences.
• Ability to manage tasks across geographically distributed teams and complex enterprise environments.
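To give candidates a concrete feel for the data-mapping and validation work this role involves, here is a minimal, hypothetical sketch: it renames Salesforce-style opportunity fields to a canonical warehouse schema and flags rows that fail basic checks before loading. All field names, the mapping, and the rules are illustrative assumptions, not taken from any actual system in use here.

```python
# Hypothetical sketch of ERP/CRM-to-warehouse field mapping and validation.
# The mapping and rules below are illustrative assumptions only.

FIELD_MAP = {              # source field -> canonical warehouse field
    "Id": "opportunity_id",
    "AccountId": "account_id",
    "Amount": "amount_usd",
    "StageName": "stage",
}

REQUIRED = {"opportunity_id", "account_id", "amount_usd"}

def transform(record: dict) -> dict:
    """Rename source fields to the canonical schema, dropping unmapped fields."""
    return {dst: record[src] for src, dst in FIELD_MAP.items() if src in record}

def validate(row: dict) -> list:
    """Return a list of validation errors; an empty list means the row is clean."""
    errors = ["missing " + f for f in sorted(REQUIRED) if row.get(f) in (None, "")]
    if isinstance(row.get("amount_usd"), (int, float)) and row["amount_usd"] < 0:
        errors.append("negative amount_usd")
    return errors

if __name__ == "__main__":
    raw = [
        {"Id": "006A1", "AccountId": "001B2", "Amount": 1200.0, "StageName": "Closed Won"},
        {"Id": "006A2", "Amount": -50.0, "StageName": "Prospecting"},  # missing account, bad amount
    ]
    rows = [transform(r) for r in raw]
    clean = [r for r in rows if not validate(r)]
    rejected = [r for r in rows if validate(r)]
    print("clean=%d rejected=%d" % (len(clean), len(rejected)))
```

In practice this logic would live in a Databricks notebook or job, with the validated rows written to a Delta table; the pure-Python form here just shows the shape of the mapping and validation step.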
Job Type: Contract
Pay: $50.00 - $55.00 per hour
Schedule:
• 8 hour shift
• Monday to Friday
Work Location: Hybrid remote in Denver, CO 80206
Job Type
Contractor role
Skills required
Azure, Python, Agile
Location
Denver, Colorado
Salary
$50 - $55 per hour
Date Posted
April 22, 2025
SSH Cloud IT Solutions LLC is seeking a Techno-Functional Data Integration Engineer with expertise in Databricks and enterprise applications. This role involves developing data integration solutions and collaborating with business and IT stakeholders in Denver, CO.