Req No: #11211
Title: Senior Data Engineer with Databricks
Location: Remote (a candidate in the Cincinnati or Chicago area who can come onsite would be appealing)
Duration: 6+ months, contract-to-hire (CTH)
Interview: MS Teams
Note: No prescreening

Job Description:

Project: This person will support Azure measurement pipelines/workflows for campaign performance.

What you'll do: As a Senior Data Engineer, you are part of the software development team. We develop strategies and solutions to ingest, store, and distribute our big data. Our developers use Big Data technologies including (but not limited to) PySpark, Hive, JSON, and SQL to develop products, tools, and software features.

Top Skills:
• Azure
• Databricks
• Services Development

Responsibilities: Take ownership of features and drive them to completion through all phases of the client's SDLC. This includes external-facing and internal applications as well as process improvement activities such as:
• Participate in the design of Big Data platforms and SQL-based solutions
• Perform development of Big Data platforms and SQL-based solutions
• Perform unit and integration testing
• Partner with senior resources to gain insights
• Participate in retrospective reviews
• Participate in the estimation process for new work and releases
• Be driven to improve yourself and the way things are done

Minimum Skills Required:
• Proven Big Data technology development experience, including Hadoop, Spark (PySpark), and Hive
• Understanding of Agile principles (Scrum)
• Experience developing with Python
• Cloud development (Azure)
• Exposure to version control systems (Git, SVN)

Position-Specific Skill Preferences: Experience with the following:
• SQL development (Oracle, SQL Server)
• NoSQL (MongoDB, Cassandra)
• Apache NiFi
• Airflow
• Docker

Key Responsibilities:
• Innovate, develop, and drive the development and communication of data strategy and roadmaps across the technology organization to support the project portfolio and business strategy
• Drive the development and communication of enterprise standards for data domains and data solutions, focusing on simplified integration and streamlined operational and analytical uses
• Drive digital innovation by leveraging new technologies and approaches to renovate, extend, and transform existing core data assets, including SQL-based, NoSQL-based, and cloud-based data platforms
• Define high-level migration plans to address the gaps between the current and future state, typically in sync with budgeting or other capital planning processes
• Lead the analysis of the technology environment to detect critical deficiencies and recommend solutions for improvement
• Mentor team members in data principles, patterns, processes, and practices
• Promote the reuse of data assets, including management of the data catalog for reference
• Draft and review architectural diagrams, interface specifications, and other design documents
• Proactively and holistically lead activities that create deliverables to guide the direction, development, and delivery of technological responses to targeted business outcomes
• Provide the facilitation, analysis, and design tasks required to develop the enterprise's data and information architecture, treating data as an asset for the enterprise
• Develop target-state guidance (i.e., reusable standards, design patterns, guidelines, individual parts, and configurations) to evolve the technical infrastructure related to data and information across the enterprise, including direct collaboration with the client

Skills: Big Data, Hadoop, PySpark/Spark, Hive, JSON, SQL/Oracle, Agile/Scrum, Python, Azure, Databricks, Version Control/SVN/Git, NoSQL/MongoDB/Cassandra, Apache NiFi, Airflow, Docker
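For context, the day-to-day work described above (PySpark on Databricks reading JSON from Azure storage and producing SQL-queryable measurement tables) might look something like the minimal sketch below. All storage paths, table names, and column names (campaign_id, impressions, clicks) are hypothetical illustrations, not details from this posting.

    # Illustrative sketch only; names and paths are placeholders.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("campaign-measurement").getOrCreate()

    # Ingest raw campaign events landed as JSON (e.g., in Azure Data Lake Storage).
    events = spark.read.json(
        "abfss://raw@examplestorage.dfs.core.windows.net/campaign_events/"
    )

    # Aggregate daily performance metrics per campaign.
    daily_metrics = (
        events
        .withColumn("event_date", F.to_date("event_timestamp"))
        .groupBy("campaign_id", "event_date")
        .agg(
            F.sum("impressions").alias("impressions"),
            F.sum("clicks").alias("clicks"),
        )
        .withColumn("ctr", F.col("clicks") / F.col("impressions"))
    )

    # Persist to a managed table for downstream reporting and SQL access.
    daily_metrics.write.mode("overwrite").saveAsTable("marketing.campaign_daily_metrics")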
Job Type: Temporary role
Skills required: Agile, Python, NoSQL, Azure
Location: Cincinnati, OH
Salary: Not specified
Date Posted: October 8, 2024
Indus Valley Consultants is seeking a Senior Data Engineer with expertise in Databricks to support Azure measurement pipelines for campaign performance. This remote position requires strong skills in Big Data technologies and cloud development.