Job Role: Databricks Data Engineer (W2 Only)
Location: Los Angeles, CA (Remote)
Duration: 12-Month Contract
Is this role located on-site, hybrid, or remote?: Remote
Is a Livescan Required for Position?: Yes

Candidates should clear the following background checks for at least the past 7 years:
• Employment History – verification of past employment records
• Educational Background – validation of academic credentials
• Criminal/Felony – check for any criminal records
• SSN – Social Security Number (SSN) verification

Technical Skills:
• Databricks platform expertise (workspace management, clusters, job scheduling)
• Spark (PySpark/Scala) proficiency
• SQL optimization and advanced querying skills
• Azure Data Factory (ADF) or equivalent ETL pipeline experience
• Data warehousing concepts and architecture (Delta Lake)
• Python scripting for data manipulation and automation
• Familiarity with the Azure cloud environment (Blob Storage, ADLS)
• Knowledge of data governance, data security, and compliance

Analytical Skills:
• Data modeling and database design
• Experience with BI/reporting tools (Power BI, Tableau)
• Data quality assurance and validation

Soft Skills:
• Strong problem-solving abilities
• Effective communication and documentation
• Ability to work independently under tight deadlines
• Collaborative teamwork and adaptability

Experience Preferred:
• 3+ years of hands-on experience with Databricks, including cluster management and optimization
• 3+ years of PySpark or Scala for data engineering tasks
• 3+ years of demonstrated success building efficient ETL pipelines, preferably using Azure Data Factory (ADF)
• 3+ years migrating legacy reports to modern cloud analytics platforms
• 3+ years of strong SQL skills, with expertise in data modeling and performance tuning
• 3+ years of Delta Lake architecture and implementation
• 3+ years with BI tools such as Power BI or Tableau
Education Preferred:
Bachelor’s degree or higher in Computer Science, Information Systems, Data Science, or a related technical field. Relevant industry certifications in Databricks, Azure, data engineering, or similar fields.

Additional Information:
This contractor will play a critical role in accelerating our data modernization efforts. Their expertise will support timely report conversion, improve data pipeline efficiency, and help ensure data integrity and accessibility across the organization. The role is time-sensitive and project-based, with clearly defined deliverables aligned to strategic analytics goals.

Skills: Analysis Skills, Apache Spark, Business Intelligence, Business Intelligence Software, Cloud Computing, Communication Skills, Computer Science, Data Management, Data Modeling, Data Quality, Data Science, Database Design, Database Extract Transform and Load (ETL), Documentation, Information Technology & Information Systems, Information/Data Security (InfoSec), Microsoft Azure, Performance Modeling, Performance Tuning/Optimization, Power BI, Quality Assurance, Query Optimization, SQL (Structured Query Language), Scala Programming Language, Software Engineering, Strategic Analysis, Tableau, Team Player, Time Management

About the Company: Pyramid Technology Solutions, Inc.
Job Type
Full-time (12-month W2 contract)
Skills required
Databricks, Apache Spark (PySpark/Scala), SQL, Azure, Python
Location
Los Angeles, California
Salary
Not disclosed
Date Posted
April 13, 2025
Pyramid Technology Solutions, Inc. is seeking a Databricks Data Engineer for a 12-month remote contract role based in Los Angeles, CA. The ideal candidate will have extensive experience with Databricks, PySpark, and ETL pipelines built with Azure Data Factory.