Delmock Technologies Inc is seeking a Data Engineer - Databricks Tester to analyze and validate complex data solutions. The role requires expertise in ETL, Databricks, SQL, and Python to ensure high-quality data processes.
About Our Company:
Delmock Technologies, Inc. (DTI) is a leading HUBZone business in Baltimore, known for delivering sophisticated IT (Information Technology) and Health solutions with a commitment to ethics, expertise, and superior service. Actively engaged in the local community, DTI creates opportunities for talented residents while maintaining a stellar reputation as an award-winning contractor, earning accolades such as the Government Choice Award for IRS (Internal Revenue Service) Systems Modernizations.

Clearance:
• Active IRS MBI Clearance is required.

Location:
This position is remote.

Role:
Delmock Technologies, Inc. is seeking a highly skilled Data Engineer Tester to join our team. The ideal candidate will have strong expertise in ETL, Databricks, workflow orchestration, SQL, Python, PySpark (or Spark's Java API), and Java scripting. The role requires someone capable of analyzing complex data logic, extracting business rules from code, designing and executing test strategies, and automating testing processes to ensure high-quality data solutions.

Responsibilities:
• Analyze business and technical requirements, ensuring complete test coverage across data mapping, data pipelines, transformations, and reporting layers.
• Review code conversions, extract logic, and validate complex source-to-target transformation rules to determine test coverage.
• Develop, execute, and maintain test cases, test scripts, test data, and automation frameworks for ETL and Databricks pipelines, Databricks notebooks, and workflow orchestration jobs.
• Validate data ingestion, transformation, aggregation, cleansing, and reporting logic against business requirements, code conversions, and business rules.
• Validate flat files (CSV, TSV, TXT, fixed-length), including delimiter handling, header validation, null-value handling, and schema verification.
• Conduct data reconciliation between source, staging, and target systems to ensure accuracy and completeness.
• Design and implement SQL- and Python-based automation frameworks for regression, smoke, and system integration testing.
• Test data quality dimensions such as accuracy, completeness, consistency, timeliness, and validity.
• Perform negative testing, boundary testing, and exception handling to ensure the robustness of pipelines.
• Collaborate with developers, data engineers, architects, and business analysts to identify data gaps, defects, and performance issues.
• Conduct performance testing of queries and transformations to identify bottlenecks and recommend optimizations.
• Provide clear and detailed defect reports, test execution results, and testing dashboards to stakeholders.
• Support CI/CD integration of automated test scripts into deployment pipelines.
• Participate in Agile ceremonies (stand-ups, sprint planning, retrospectives) and contribute to continuous improvement of test processes.
• Mentor junior team members in best practices for data testing and automation.

Minimum Requirements:
• Bachelor's degree in a related field.
• 5+ years of experience in data engineering or data testing roles.
• Proven experience with ETL testing and data validation in large-scale enterprise environments.
• Strong skills in creating test cases and writing SQL/Python scripts for data manipulation, data validation, report and file validation, and comparison testing.
• Hands-on experience with Databricks (notebooks, workflows, clusters, pipelines, jobs, reports, Delta Lake).
• Expertise in workflow orchestration tools such as Airflow, Azure Data Factory, or Control-M.
• Advanced proficiency in SQL (complex joins, CTEs, window functions, query optimization, stored procedures).
• Strong scripting skills in Python (pandas, PySpark, unittest/pytest) and Java/JavaScript for test automation frameworks such as Selenium and TestNG, plus SoapUI for backend API testing.
• Ability to interpret complex transformation logic and translate it into test validation rules.
• Strong knowledge of data warehouse concepts, star/snowflake schemas, and fact/dimension validation.
• Experience testing structured and semi-structured data (JSON, XML, Parquet, Avro, CSV) and performing data comparisons.
• Experience with data quality frameworks and metadata-driven testing.
• Experience with defect management tools (JIRA) and test management tools (qTest, Zephyr, TestRail).
• Exposure to version control (Git, Bitbucket), CI/CD pipelines, and DevOps practices.
• Strong problem-solving, analytical, and debugging skills.
• Excellent written and verbal communication skills to interface with development teams, clients, and technical stakeholders.

Preferred/Nice to Have:
• IRS GFE (badge, laptop).
• Experience with cloud data platforms (Azure Data Lake, AWS Redshift, GCP BigQuery, Snowflake).
• Knowledge of Big Data technologies (Spark, Kafka).
• Hands-on experience with test data management and data masking for compliance.
• Familiarity with BI/reporting tools (Tableau) to validate data visualizations against backend logic.
• Prior experience in federal or regulated environments with compliance standards (HIPAA, IRS, CMS, SOX).
• Background in performance and scalability testing of data pipelines.

Recently ranked as high as #3 among HUBZone companies in a GOVWIN survey, DTI offers a dynamic environment for those passionate about impactful projects, community involvement, and contributing to top-ranking Federal and State Commissionaires project support teams. At DTI, we balance continuous growth and innovation with a strong dedication to corporate social responsibility. Join our talented team and be part of a company that values both professional excellence and community impact. Explore the exciting career opportunities awaiting you at DTI!

DTI is committed to hiring and maintaining a diverse workforce. We are an equal opportunity employer making decisions without regard to race, color, religion, sex, national origin, age, veteran status, disability, or any other protected class.
CEDENT is seeking an experienced Azure Databricks Engineer to design and optimize data pipelines and analytics solutions in Dallas, TX. The role involves collaboration with cross-functional teams and mentoring junior engineers.
Vinsys Information Technology Inc is seeking a Data Engineer proficient in Databricks, Python, and SQL to develop data pipelines and cloud-based solutions. The role involves collaboration with product managers and data scientists to ensure high-quality software delivery.
The Senior Consultant - Data Engineering role at Internetwork Expert Inc focuses on building real-time data pipelines and implementing data mesh architectures. The position requires deep technical expertise in data engineering and a client-facing consulting approach.
A-1 Consulting Inc is seeking a Senior Data Engineer with expertise in Databricks to design and optimize data pipelines in New York, NY. The role requires strong skills in Python, SQL, and cloud platforms, with a focus on scalable data solutions.
The Principal Data Engineer at Internetwork Expert will focus on building real-time data pipelines and implementing data mesh architectures. This hands-on role requires deep technical expertise in modern data engineering methods and a client-facing consulting approach.