Trigyn is seeking a Senior Data Engineer with expertise in Azure Data Factory and Databricks to design and optimize ETL pipelines for a major utility firm in White Plains, NY. The role involves API development, data quality governance, and collaboration with stakeholders.
Our client - a major utility firm based in Westchester County, NY - has an immediate need for a Senior Data Engineer. The particulars of the position are as follows.

Job Functions & Responsibilities:

ETL & Data Integration:
• Design, develop, and optimize ETL pipelines using Azure Databricks, ADF, and Pentaho to support enterprise data workflows.
• Implement and maintain data movement, transformation, and integration across multiple systems.
• Ensure seamless data exchange between cloud, on-prem, and hybrid environments.
• Work with Globalscape FTP for secure file transfers and automation.

API Development & Integration:
• Develop, consume, and integrate RESTful and SOAP APIs to facilitate data exchange.
• Work with API gateways and authentication methods such as OAuth, JWT, certificates, and API keys.
• Implement and optimize API-based data extractions and real-time data integrations.

Data Quality & Governance:
• Implement data validation, cleansing, and enrichment techniques.
• Develop and execute data reconciliation processes to ensure accuracy and completeness.
• Adhere to data governance policies and security compliance standards.

BAU Support & Performance Optimization:
• Troubleshoot and resolve ETL failures, data load issues, and performance bottlenecks.
• Optimize SQL stored procedures and complex queries for better performance.
• Support ongoing enhancements and provide operational support for existing data pipelines.

Collaboration & Documentation:
• Work closely with Data Analysts, Business Analysts, and stakeholders to understand data needs.
• Document ETL processes, data mappings, and workflows for maintainability and knowledge sharing.
• Provide guidance and best practices to ensure scalability and efficiency of data solutions.

Required Skills & Experience:
• 7+ years of experience in ETL development, data integration, and SQL scripting.
• Strong expertise in Azure Databricks, ADF (Azure Data Factory), and Pentaho.
• Experience handling secure file transfers using Globalscape FTP.
• Hands-on experience developing and consuming APIs (REST/SOAP).
• Experience working with API security protocols (OAuth, JWT, API keys, etc.).
• Proficiency in SQL, stored procedures, performance tuning, and query optimization.
• Understanding of data modeling, data warehousing, and data governance best practices.
• Hands-on experience with cloud-based data platforms (Azure/AWS) is a plus.
• Strong problem-solving and troubleshooting skills, with the ability to work independently.
• Excellent communication skills and the ability to work in a fast-paced environment.

Preferred Qualifications:
• Experience working on large-scale enterprise data integration projects.
• Knowledge of Python and PySpark for big data processing.
• Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).

Education & Certifications:
• Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field.

Nice-to-have certifications:
• Databricks Certified Data Engineer
• Azure Data Engineer Associate

For an immediate response, please call 732-876-7632, or send your resume to RecruiterCSE@Trigyn.com. Trigyn-8181

TRIGYN TECHNOLOGIES, INC. is an EQUAL OPPORTUNITY EMPLOYER and has been in business for 35 years. TRIGYN is an ISO 27001:2022 and CMMI Level 5 certified company.
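As a rough illustration of the Databricks-based ETL work this posting describes (ingest, validate, cleanse, and publish data), the short PySpark sketch below shows one possible shape of such a pipeline step. It is not part of the posting: the file paths and column names (meter_readings.csv, meter_id, reading_kwh, reading_ts) are hypothetical placeholders chosen for the example.

# Minimal sketch, assuming a raw CSV landed by an upstream copy activity
# and hypothetical column names; illustrative only, not the client's actual pipeline.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read the hypothetical landed file.
raw = spark.read.option("header", True).csv("/mnt/landing/meter_readings.csv")

# Basic validation and cleansing: drop rows missing a key, cast the measure,
# and remove duplicate readings.
clean = (
    raw.filter(F.col("meter_id").isNotNull())
       .withColumn("reading_kwh", F.col("reading_kwh").cast("double"))
       .dropDuplicates(["meter_id", "reading_ts"])
)

# Write a curated output for downstream consumers.
clean.write.mode("overwrite").parquet("/mnt/curated/meter_readings")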
TrekRecruit LLC is seeking an Azure Data Factory Developer with extensive experience in Databricks to join their team in Jersey City, NJ. This role involves developing and managing data pipelines and migrating legacy applications to Azure Cloud.
ACS Consultancy Services is seeking a Data Engineer with expertise in Microsoft Fabric and Azure Databricks to design and maintain ETL/ELT data pipelines. This hybrid position is based in Atlanta, Georgia, and requires a strong background in data engineering and related technologies.
Maxonic Inc. is seeking a Senior Data Engineer with expertise in Azure Databricks and Data Factory to develop and maintain data pipelines. This role is based in Los Angeles, California, and requires local candidates only.
Radiansys Inc. is seeking an Azure Data Lead with expertise in Databricks, ETL, and Azure Data Factory to lead data engineering projects in San Jose, CA. The role involves designing ETL solutions and mentoring team members while ensuring data governance and compliance.
The Data Lead will oversee the implementation and management of data solutions using Azure Data Factory, Databricks, and Snowflake. This role requires strong technical leadership and collaboration with cross-functional teams in a fast-paced environment.