Vinsys Information Technology Inc is seeking a Data Engineer proficient in Databricks, Python, and SQL to develop data pipelines and cloud-based solutions. The role involves collaboration with product managers and data scientists to ensure high-quality software delivery.
Responsibilities
• Develop data pipelines to ingest, load, and transform data from multiple sources.
• Leverage the Data Platform, running on Google Cloud, to design, optimize, deploy, and deliver data solutions in support of scientific discovery.
• Use programming languages such as Java, Scala, and Python, along with open-source RDBMS, NoSQL databases, and cloud-based data store services such as MongoDB, DynamoDB, ElastiCache, and Snowflake.
• Continuously deliver technology solutions from product roadmaps, adopting Agile and DevOps principles.
• Collaborate with digital product managers and deliver robust cloud-based solutions that drive powerful experiences.
• Design and develop data pipelines, including Extract, Transform, Load (ETL) programs that extract data from various sources and transform it to fit the target model (a minimal sketch follows this posting).
• Test and deploy data pipelines to ensure compliance with data governance and security policies.
• Move from implementation to ownership of real-time and batch processing, as well as data governance and policies.
• Maintain and enforce the business contracts on how data should be represented and stored.
• Ensure that technical delivery is fully compliant with Security, Quality, and Regulatory standards.
• Keep relevant technical documentation up to date in support of the lifecycle plan for audits and reviews.
• Proactively engage in experimentation and innovation to drive relentless improvement, e.g., evaluating new data engineering tools and frameworks.
• Implement ETL processes, moving data between systems including S3, Snowflake, Kafka, and Spark.
• Work closely with our Data Scientists, SREs, and Product Managers to ensure software is high quality and meets user requirements.

Required Qualifications
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• 5+ years of experience as a data engineer building ETL/ELT data pipelines.
• Experience with data engineering best practices across the full software development life cycle, including coding standards, code reviews, source control management (Git), continuous integration, testing, and operations.
• Experience with Python and SQL; Java, C#, C++, Go, Ruby, and Rust are good to have.
• Experience with Agile, DevOps, and automation (testing, build, deployment, CI/CD, etc.), and Airflow.
• Experience with Docker, Kubernetes, and shell scripting.
• 2+ years of experience with a public cloud (AWS, Microsoft Azure, Google Cloud).
• 3+ years of experience with distributed data/computing tools (MapReduce, Hadoop, Hive, EMR, Kafka, Spark, Gurobi, or MySQL).
• 2+ years of experience working on real-time data and streaming applications.
• 2+ years of experience with NoSQL implementations (DynamoDB, MongoDB, Redis, ElastiCache).
• 2+ years of data warehousing experience (Redshift, Snowflake, Databricks, etc.).
• 2+ years of experience with UNIX/Linux, including basic commands and shell scripting.
• Experience with visualization tools such as SSRS, Excel, Power BI, Tableau, Google Looker, and Azure Synapse.

Required Skills: Python, SQL
Additional Skills: This is a high PRIORITY requisition. This is a PROACTIVE requisition.
Background Check: Yes
Drug Screen: No
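For illustration only, the ETL responsibility above might resemble the following minimal PySpark sketch. The bucket paths, dataset, and column names are hypothetical placeholders, not details of this role's actual systems.

# Minimal ETL sketch in PySpark. Paths and column names are hypothetical;
# real pipelines would read from the project's actual sources (e.g., S3, Kafka)
# and load into its warehouse (e.g., Snowflake, Databricks).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl-example").getOrCreate()

# Extract: read raw order events from a source location.
raw = spark.read.json("s3a://example-bucket/raw/orders/")

# Transform: drop invalid rows and reshape the data to fit the target model.
orders = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("order_date", F.to_date("created_at"))
       .select("order_id", "customer_id", "order_date", "total_amount")
)

# Load: write the curated table as Parquet, partitioned by date.
orders.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3a://example-bucket/curated/orders/"
)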
Empower Professionals is seeking a Data Engineer & AI Specialist with expertise in AI/ML and data engineering technologies for a hybrid role in Austin, TX. The ideal candidate will have advanced skills in Python, R, and various data warehousing technologies.
MARKS IT SOLUTIONS LLC is seeking a Data Engineer with expertise in Python and Databricks to join their team in New York. The role involves developing scalable applications and working with Azure cloud components.
Infosys Limited is seeking a Lead Data Engineer with expertise in Azure, Databricks, Snowflake, Python, and SQL to design and optimize scalable data solutions. The role is based in Bellevue, WA, and requires collaboration with various stakeholders to solve complex data challenges.
Realign LLC is seeking an experienced Azure Cognitive Services Integration Engineer to design and manage AI-driven data solutions using Databricks and SQL Server. This remote contract position involves integrating Azure Cognitive Services with data pipelines and optimizing AI workflows.
Lensa is seeking an Advanced Backend Python API Developer with expertise in Azure and Databricks for a remote contract position. The role involves designing and implementing APIs and data pipelines for a complex web application.