Interview: Video (prefer local candidates who can attend a face-to-face interview, but open to non-local candidates via video interview)
Visa: USC, GC, GC EAD, H4, L2

Description:
• Initial Project: Migrate the existing Python data extraction tool and codebase to the Alpine Data Lake, set up API extraction in Synapse, establish a daily ETL for updates, and update Power BI reports to use the data lake as their source. [Alpine Strategic Credit (ASC)]
• Subsequent Project: Extract and migrate 20+ years of historical data from the Portfolio system to a data warehouse, set up a real-time API for a continuous data feed from Tamarac, establish a daily ETL for updates (see the ETL sketch below), and enable improved Power BI reporting. [Alpine Private Wealth (APW)]
• Both projects involve ensuring data accuracy and integrity and structuring data for seamless integration into the data warehouse and Power BI.
• Data modeling will use a Star Schema or Snowflake Schema approach (see the star-schema sketch below).

Responsibilities:
• Design, develop, and maintain data pipelines and warehousing solutions.
• Key tasks include API integration, ETL development, data modeling (Star Schema or Snowflake Schema), and supporting Power BI reporting.
• Collaborate with internal project teams to ensure data accuracy, integrity, and structured organization for business intelligence.

Tech stack:
• Azure Synapse Analytics
  • Two separate environments (e.g., Development and Production).
  • Handles data warehousing and large-scale analytics workloads.
• Azure Data Lake
  • Centralized storage layer.
  • Supports both structured and unstructured data.
  • Scalable foundation for analytics and data integration.
• Azure Key Vault
  • Manages secrets, encryption keys, and certificates.
  • Ensures secure access across both environments (see the Key Vault sketch below).
• Azure DevOps
  • CI/CD pipelines for automated builds and deployments.
  • Manages the data pipeline lifecycle and component delivery.
• Apache Spark Notebooks
  • Deployed in both environments.
  • Used for interactive data exploration, transformation, and analytics.
• Azure Integration Runtime
  • Facilitates secure and scalable data movement.
  • Enables transformations across network boundaries within Synapse or Data Factory.
• Metastore Data Warehouse
  • Centralized metadata repository.
  • Maintains schema definitions and table metadata.
• ARM Template (Azure Resource Manager)
  • Defines and automates infrastructure deployment.
  • Enables consistent provisioning of Synapse, Data Lake, Key Vault, and other resources across environments.

About the Company: Shiftcode Analytics, Inc
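To make the pipeline work above concrete, here is a minimal daily-ETL sketch as a PySpark notebook cell: it reads the day's API extract from a raw zone in the Data Lake, applies basic integrity checks, and appends the increment to a curated table that Power BI can read. The storage path, container layout, and the curated.fact_transaction table are hypothetical, not the actual Alpine environment.

```python
# Minimal daily-ETL sketch for a Synapse Spark notebook.
# Assumptions (not from the posting): API extracts land as JSON files in a
# "raw" container, and curated tables are stored as Delta in the metastore.
from datetime import date

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-etl").getOrCreate()

run_date = date.today().isoformat()
# Hypothetical Data Lake layout: one folder of JSON extracts per day.
raw_path = f"abfss://raw@<storage-account>.dfs.core.windows.net/tamarac/{run_date}/"

raw = spark.read.json(raw_path)

cleaned = (
    raw.dropDuplicates(["transaction_id"])        # basic integrity: no double-loads
       .filter(F.col("amount").isNotNull())       # reject incomplete records
       .withColumn("load_date", F.lit(run_date))  # audit column for lineage
)

# Append the day's increment to the curated zone consumed by Power BI.
cleaned.write.mode("append").format("delta").saveAsTable("curated.fact_transaction")
```

In practice a cell like this would be parameterized and triggered once a day by a Synapse or Data Factory pipeline through the Integration Runtime.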
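The star-schema target that such a curated table would feed could be declared as below. All table and column names are illustrative guesses; the real Portfolio/Tamarac fields will differ.

```python
# Illustrative star schema: one fact table referencing two dimensions by
# surrogate key. Names are hypothetical, not the actual APW model.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("apw-star-schema").getOrCreate()

spark.sql("""
    CREATE TABLE IF NOT EXISTS curated.dim_account (
        account_key   BIGINT,   -- surrogate key
        account_id    STRING,   -- natural key from the source system
        account_name  STRING,
        open_date     DATE
    ) USING DELTA
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS curated.dim_date (
        date_key  INT,          -- e.g., 20250606
        full_date DATE,
        year      INT,
        quarter   INT,
        month     INT
    ) USING DELTA
""")

spark.sql("""
    CREATE TABLE IF NOT EXISTS curated.fact_transaction (
        transaction_id  STRING,
        account_key     BIGINT,          -- FK -> dim_account
        date_key        INT,             -- FK -> dim_date
        amount          DECIMAL(18, 2),
        security_symbol STRING
    ) USING DELTA
""")
```

A snowflake variant would further normalize the dimensions (for example, splitting households out of dim_account); the flat star layout is typically the simpler source for Power BI models.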
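Finally, the Key Vault item in the stack usually means resolving API credentials at runtime rather than hard-coding them in notebooks. Here is a sketch using the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are placeholders.

```python
# Sketch: fetch an API credential from Azure Key Vault at runtime.
# Vault name and secret name are placeholders, not the actual environment.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up a managed identity when running in Azure,
# or developer credentials (az login) when running locally.
credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://<vault-name>.vault.azure.net",
    credential=credential,
)

tamarac_api_key = client.get_secret("tamarac-api-key").value  # hypothetical secret
```

Synapse notebooks also expose mssparkutils.credentials.getSecret as a built-in shortcut for the same lookup.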
Job Type
Full-time role
Skills required
Azure
Location
St. Louis, Missouri
Salary
Not specified.
Date Posted
June 6, 2025
Shiftcode Analytics, Inc is seeking a Sr. Data Engineer to migrate data extraction tools to the Alpine Data Lake and enhance Power BI reporting. The role involves designing data pipelines and ensuring data integrity for business intelligence.