Infinitive Inc is seeking a skilled DevOps Engineer with expertise in ElasticSearch and data engineering to optimize infrastructure and deployment pipelines. The role involves collaboration with teams to build scalable and secure data platforms.
About Infinitive:
Infinitive is a data and AI consultancy that enables its clients to modernize, monetize, and operationalize their data to create lasting and substantial value. We possess deep industry and technology expertise to drive and sustain adoption of new capabilities. We match our people and personalities to our clients' culture while bringing the right mix of talent and skills to enable a high return on investment. Infinitive has been named one of Consulting Magazine's "Best Small Firms to Work For" seven times, most recently in 2024. Infinitive has also been named a Washington Post "Top Workplace," a Washington Business Journal "Best Places to Work," and a Virginia Business "Best Places to Work."

About the Role:
We are seeking a skilled DevOps Engineer with data engineering experience to join our dynamic team. The ideal candidate will have expertise in ElasticSearch, CI/CD, Git, and Infrastructure as Code (IaC), along with hands-on data engineering experience. You will be responsible for designing, automating, and optimizing infrastructure, deployment pipelines, and data workflows. This role requires close collaboration with data engineers, software developers, and operations teams to build scalable, secure, and high-performance data platforms.

Key Responsibilities:

DevOps & Infrastructure Management:
• Design, deploy, and manage ElasticSearch clusters, ensuring high availability, scalability, and performance for search and analytics workloads.
• Develop and maintain CI/CD pipelines for automating build, test, and deployment processes using tools like Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
• Manage and optimize version control workflows using Git, ensuring best practices for branching, merging, and release management.
• Implement Infrastructure as Code (IaC) solutions using Terraform, CloudFormation, or Ansible for cloud and on-prem infrastructure.
• Automate system monitoring, alerting, and incident response using tools such as Prometheus, Grafana, Elastic Stack (ELK), or Datadog.

Data Engineering & Pipeline Automation:
• Collaborate with data engineering teams to design and deploy scalable ETL/ELT pipelines using Apache Kafka, Apache Spark, Kinesis, Pub/Sub, Dataflow, Dataproc, or AWS Glue.
• Optimize data storage and retrieval for large-scale analytics and search workloads using ElasticSearch, BigQuery, Snowflake, Redshift, or ClickHouse.
• Ensure data pipeline reliability and performance, implementing monitoring, logging, and alerting for data workflows.
• Automate data workflows and infrastructure scaling for high-throughput real-time and batch processing environments.
• Implement data security best practices, including access controls, encryption, and compliance with industry standards such as GDPR, HIPAA, or SOC 2.

Required Skills & Qualifications:
• 3+ years of experience in DevOps, Data Engineering, or Infrastructure Engineering.
• Strong expertise in ElasticSearch, including cluster tuning, indexing strategies, and scaling.
• Hands-on experience with CI/CD pipelines using Jenkins, GitHub Actions, GitLab CI/CD, or ArgoCD.
• Proficiency in Git for version control, branching strategies, and code collaboration.
• Experience with Infrastructure as Code (IaC) using Terraform, CloudFormation, Ansible, or Pulumi.
• Solid experience with cloud platforms (AWS, GCP, or Azure) and cloud-native data engineering tools.
• Proficiency in Python, Bash, or Scala for automation, data processing, and infrastructure scripting.
• Hands-on experience with containerization and orchestration (Docker, Kubernetes, Helm).
• Experience with data engineering tools, including Apache Kafka, Spark Streaming, Kinesis, Pub/Sub, or Dataflow.
• Strong understanding of ETL/ELT workflows and distributed data processing frameworks.

Preferred Qualifications:
• Experience working with data warehouses and lakes (BigQuery, Snowflake, Redshift, ClickHouse, S3, GCS).
• Knowledge of monitoring and logging solutions for data-intensive applications.
• Familiarity with security best practices for data storage, transmission, and processing.
• Understanding of event-driven architectures and real-time data processing frameworks.
• Certifications such as AWS Certified DevOps Engineer, Google Cloud Professional Data Engineer, or Certified Kubernetes Administrator (CKA).
BAE Systems is seeking an Experienced DevOps Engineer to develop automated pipelines for cloud environments and support deployments for a Space Ground mission software application. This hybrid role requires expertise in DevSecOps tools and cloud technologies.
ONE Elite Staffing is seeking a Data Analyst with expertise in PowerCenter, MuleSoft, and DevOps for a hybrid contract position in Austin, Texas. The role involves data integration, SQL development, and leading agile teams.
Dignify Solutions LLC is seeking a Cloud DevOps Engineer with extensive Azure experience to automate infrastructure and manage CI/CD pipelines. The role requires proficiency in Ansible, Terraform, and Python, with a focus on cloud architecture and networking.
Expeditors is seeking a Configuration Engineer to join their Analytics team, focusing on enhancing the EXP.O NOW platform. The role involves managing CI/CD pipelines, cloud infrastructure, and implementing security best practices.
Tekcogno is seeking an experienced Azure DevOps Lead with a strong background in banking to guide the modernization of DevOps processes. This full-time role is based in New York City and requires extensive technical leadership and project management skills.