LexisNexis Risk Solutions is seeking a Senior Software Engineer II with expertise in Python, Apache Spark, and Databricks to enhance the Truth Central Data Platform. The role involves collaborating with various teams to ensure reliable data delivery and support client onboarding.
## Senior Software Engineer II - Python, Apache Spark, and Databricks

Locations: Massachusetts; Florida; Georgia
Time type: Full time
Posted on: Posted Today
Job requisition: R99902

LexisNexis Risk Solutions provides customers with innovative technologies, information-based analytics, decision-making tools, and data management services that help them solve problems, make better decisions, stay compliant, reduce risk, and improve operations. Headquartered in metro Atlanta, Georgia, it operates within the Risk market segment of RELX, a global provider of information-based analytics and decision tools for professional and business customers.

About the role: The Data Software Engineer will support the day-to-day operations and ongoing development of the Truth Central Data Platform. In this role, you'll work closely with Implementation Services, client managers, Data Engineering, and Analytics teams to support client onboarding, troubleshoot production issues, deploy new features, and ensure consistent and reliable data delivery. You'll help maintain the platform architecture and ensure smooth, reliable operations.

About the team: This team has built a suite of products that plays a foundational role in enabling robust risk assessment via deep behavioral data.
Responsibilities:

• Collaborating with Implementation Services to onboard new clients
• Working with client managers to investigate and resolve data anomalies and the production issues they cause
• Maintaining the Truth Central Data Platform by monitoring the architecture and maintaining or building new data processing pipelines as needed
• Supporting infrastructure and deployment workflows using CI/CD
• Coordinating with Data Engineering to deploy data artifacts into production
• Partnering with analytics teams to ensure data availability and provide retrospective support
• Building and maintaining tools for data validation and quality monitoring

Requirements:

• Current experience in data engineering or a similar technical role
• Current proficiency in Python and Java
• Experience with big data tools such as Apache Spark and Databricks
• Experience working with NoSQL databases
• Solid understanding of data pipeline development and ETL workflows
• Knowledge of Azure fundamentals (or any other cloud platform)
• Familiarity with Infrastructure as Code tools (e.g., Terraform) and CI/CD platforms (e.g., GitHub Actions)
• Strong analytical and problem-solving skills, with experience in Agile environments

We know that your wellbeing and happiness are key to a long and successful career.
These are some of the benefits we are delighted to offer:

• Health Benefits: Comprehensive, multi-carrier program for medical, dental, and vision benefits
• Retirement Benefits: 401(k) with match and an Employee Share Purchase Plan
• Wellbeing: Wellness platform with incentives, Headspace app subscription, Employee Assistance and Time-off Programs
• Short- and Long-Term Disability, Life and Accidental Death Insurance, Critical Illness, and Hospital Indemnity
• Family Benefits, including bonding and family care leaves, adoption and surrogacy benefits
• Health Savings, Health Care, Dependent Care, and Commuter Spending Accounts
• In addition to annual Paid Time Off, we offer up to two days of paid leave each to participate in Employee Resource Groups and to volunteer with your charity of choice

Posting start date: 8/27/2024. We anticipate this posting will remain open for 30 days.

Position is eligible for base salary plus an annual bonus.

We are committed to providing a fair and accessible hiring process. If you have a disability or other need that requires accommodation or adjustment, please let us know by completing our … or please contact …

Criminals may pose as recruiters asking for money or personal information. We never request money or banking details from job applicants. Learn more about spotting and avoiding scams.

Please read our …. We are an equal opportunity employer: qualified applicants are considered for and treated during employment without regard…
Lensa is seeking an Advanced Backend Python API Developer with expertise in Azure and Databricks for a remote contract position. The role involves designing and implementing APIs and data pipelines for a complex web application.
Eliassen Group is seeking an Advanced Backend Python API Developer with expertise in Azure and Databricks for a remote contract position. The role involves designing and implementing APIs and data pipelines for a complex data and analytics web application.
PacifiCorp is seeking an Energy Supply Data Scientist I or II in Portland, OR, to provide data-driven insights for optimizing operations in the renewable energy sector. The role involves data mining, statistical analysis, and collaboration with various teams to enhance energy supply management.
Gov Services Hub is seeking a Senior ADB Developer with expertise in Azure Databricks, Python, and Spark Streaming to develop large-scale data pipelines. This contract position is based in Pleasanton, California, offering a competitive hourly rate.