RIT Solutions, Inc. is seeking a Kafka and Data Lake Engineer in Durham, North Carolina, to design and manage data pipelines and oversee data lake architecture. The role requires expertise in Apache Kafka, big data technologies, and cloud platforms.
Responsibilities
Design data pipelines: Build robust, scalable, and secure pipelines to ingest, process, and move data from various sources into the data lake using Kafka.
Administer Kafka clusters: Deploy, configure, and maintain Kafka clusters and related ecosystem tools, such as Kafka Connect and Schema Registry, ensuring high availability and performance.
Manage the data lake: Oversee the architecture and governance of the data lake, including data storage (e.g., in AWS S3 or ADLS), security, and metadata.
Develop data processing applications: Create producers and consumers that interact with Kafka topics, using languages such as Python, Java, or Scala.
Perform stream processing: Use tools such as Kafka Streams, Apache Flink, or ksqlDB for real-time data transformations and analytics.
Ensure data quality and security: Implement data quality checks, manage data lineage, and enforce security controls such as encryption, access control lists (ACLs), and regulatory compliance (e.g., GDPR).
Monitor and troubleshoot: Set up monitoring and alerting for Kafka and data lake infrastructure, and respond to incidents to ensure operational reliability.
Collaborate with teams: Work closely with data scientists, analysts, and other engineering teams to understand data requirements and deliver reliable data solutions.

Essential skills and qualifications
Experience: Proven experience designing and managing data platforms with Apache Kafka and big data technologies.
Programming: Strong proficiency in languages such as Python, Java, or Scala.
Big data technologies: Expertise in big data processing frameworks such as Apache Spark and Apache Flink.
Cloud platforms: Hands-on experience with cloud environments (AWS, Azure, or GCP) and relevant services such as S3, Glue, or Azure Data Lake Storage.
Data lake architecture: A solid understanding of data lake design principles, including storage formats (e.g., Delta Lake, Apache Iceberg), data modeling, and governance.
Databases: Experience with a range of database systems, both SQL and NoSQL.
Infrastructure management: Familiarity with infrastructure-as-code tools such as Terraform or Ansible, and with containerization using Docker and Kubernetes.

Career path
Professionals in this field can advance from entry-level data engineering positions to senior roles, and then to Big Data Architect or Solutions Architect positions overseeing large-scale data infrastructure.

Education
Bachelor's degree in engineering or a related scientific or technical discipline is required.
Substitution for education: 10 or 8 years of additional relevant experience may be substituted for the degree requirement.

Relevant certifications
Pursuing certifications can validate your expertise and boost your career.
For Kafka:
Confluent Certified Administrator for Apache Kafka (CCAAK)
Confluent Certified Developer for Apache Kafka (CCDAK)
For data lake and cloud:
Databricks Certified Data Engineer
AWS Certified Data Engineer
Microsoft Certified: Azure Data Engineer Associate
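As a rough illustration of the "data quality checks" duty described in the posting, the sketch below validates a record before it is serialized for a Kafka producer. The field names and rules are hypothetical, not taken from the posting, and a real pipeline would more likely enforce this through Schema Registry.

```python
# Minimal pre-ingest data-quality gate for records headed to a Kafka topic.
# Field names and rules here are illustrative assumptions only.
import json

REQUIRED_FIELDS = {"event_id", "timestamp", "source"}

def validate_record(record: dict) -> list:
    """Return a list of quality violations; an empty list means the record passes."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append("missing fields: %s" % sorted(missing))
    for field in REQUIRED_FIELDS & record.keys():
        if record[field] in (None, ""):
            errors.append("null/empty field: %s" % field)
    return errors

def to_kafka_value(record: dict) -> bytes:
    """Serialize a validated record to the JSON bytes a producer would send."""
    violations = validate_record(record)
    if violations:
        raise ValueError("; ".join(violations))
    return json.dumps(record, sort_keys=True).encode("utf-8")

good = {"event_id": "e1", "timestamp": "2024-01-01T00:00:00Z", "source": "claims"}
print(to_kafka_value(good))
```

The resulting bytes would typically be passed to a producer's send/produce call; keeping validation in a pure function like this makes the quality rules unit-testable without a running broker.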
GEICO is seeking a Data Analytics and Vertical Engineering Claims Domain Engineer to build high-performance platforms and applications. This role involves driving insurance business transformation through engineering excellence and collaboration.
Insight Global is seeking a Lead Data Engineer with expertise in Databricks and Snowflake to design and implement data solutions. This onsite role in Tempe, Arizona, requires strong leadership and technical skills to enhance the organization's data infrastructure.
Morgan Stanley is hiring multiple Cybersecurity Data Analysts and Cybersecurity DevOps Engineers in Baltimore, Maryland. These roles focus on enhancing cybersecurity measures and managing risks related to AI and LLM systems.
Vinsys Information Technology Inc is seeking a Data Engineer proficient in Databricks, Python, and SQL to develop data pipelines and cloud-based solutions. The role involves collaboration with product managers and data scientists to ensure high-quality software delivery.
Infosys Limited is seeking an Azure and Snowflake Data Engineer in Boston, MA, to drive digital transformation and implement cloud data solutions. The role requires strong technical skills in Azure, Snowflake, and Python, along with experience in data engineering.