
Business Analyst - Retail

Artech, Wilmington, DE, United States


Request ID: 70328-1

Title: Business Analyst - Retail

Location: Arden, DE

Duration: 6 Months

Pay Range: $48 - $53/Hour on W2 (all-inclusive); applicants must be willing to work on W2

Job Description

Role Summary

The Databricks Consultant / Developer will design, build, and support data pipelines and integrations on the Azure data platform using Databricks, Airflow, ADF, and ADLS. The role spans both project development (Build) and production support (Run), with a strong focus on DAG orchestration, data transformation, and end-to-end pipeline reliability across retail and supply chain use cases.

Role Description:

Databricks Python L2 Support Engineer

Provide extensive support for the integration services

Analyze issues and identify their root causes

Understand batch programs and interfaces in the Aptos Store Inventory Management System

Handle high-priority (P1/P2) incidents through closure

Key Responsibilities

1. Data Pipeline & DAG Development (Build)

Design and develop Spark / PySpark notebooks and jobs in Azure Databricks to ingest, transform, and publish batch and near-real-time data across multiple domains (merchandising, supply chain, digital, finance, store systems).

Build and maintain DAGs (primarily in Airflow / Astronomer and Databricks Workflows) to orchestrate dependencies between upstream systems (e.g., RMS, e-commerce, WMOS, Kafka topics) and downstream data products, ensuring correct ordering, retries, and alerting.

Use ADF and other integration services where required for file, API, and database ingestion into ADLS Gen2, applying appropriate partitioning and optimization techniques for large-scale workloads.

Implement reusable frameworks for common ETL / ELT patterns (SFTP/Kafka ingestion, CDC, merge/upsert to Delta tables, slowly changing dimensions, data quality checks).
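Purely as an illustration of the merge/upsert pattern named above (hypothetical data, not part of the listing): in Databricks this would typically be expressed as a Delta Lake MERGE, but the core matched-update / unmatched-insert logic can be sketched in dependency-free Python as:

```python
# Minimal sketch of the merge/upsert (MERGE-style) pattern: incoming rows
# are matched to a target table on a key; matches are updated, non-matches
# are inserted. Illustrative only -- a real Databricks job would express
# this as a Delta Lake MERGE INTO over Delta tables.

def merge_upsert(target: dict, updates: list[dict], key: str) -> dict:
    """Apply updates to target keyed by `key`: update matches, insert the rest."""
    merged = dict(target)  # leave the original target untouched
    for row in updates:
        merged[row[key]] = row  # matched key -> overwrite, unmatched -> insert
    return merged

# Hypothetical example rows (names and values invented for illustration)
target = {
    1: {"sku": 1, "qty": 10},
    2: {"sku": 2, "qty": 5},
}
updates = [
    {"sku": 2, "qty": 7},   # existing key -> update
    {"sku": 3, "qty": 4},   # new key -> insert
]
result = merge_upsert(target, updates, key="sku")
```

The same shape generalizes to the other patterns the bullet lists (CDC, slowly changing dimensions), which differ mainly in how the "matched" branch is handled.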

2. Data Transformation & Modeling

Translate business and source-to-target requirements into robust data transformations, aggregations, and business rules implemented in Databricks (Delta Lake).

Design and optimize logical and physical data models (staging, core, marts) to support reporting, analytics, and downstream applications, with attention to performance and cost optimization.

Apply Databricks and Spark best practices (partitioning, clustering, caching, broadcast joins, shuffle reduction, incremental processing) to meet SLAs for nightly and intraday jobs.
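As an illustration of the incremental-processing practice listed above (hypothetical data, not from the listing): a high-water-mark filter lets a nightly job pick up only rows newer than the last successful run, sketched here in plain Python rather than Spark SQL:

```python
# Sketch of high-water-mark incremental processing: each run handles only
# rows whose event time is strictly greater than the watermark recorded by
# the previous successful run. Illustrative only; a Databricks job would
# apply the same predicate in Spark SQL against a Delta table.

def incremental_batch(rows: list[dict], watermark: int) -> tuple[list[dict], int]:
    """Return rows newer than `watermark`, plus the new watermark."""
    fresh = [r for r in rows if r["event_ts"] > watermark]
    new_watermark = max((r["event_ts"] for r in fresh), default=watermark)
    return fresh, new_watermark

# Hypothetical events (epoch seconds, invented for illustration)
rows = [
    {"id": "a", "event_ts": 100},
    {"id": "b", "event_ts": 200},
    {"id": "c", "event_ts": 300},
]
fresh, wm = incremental_batch(rows, watermark=150)
```

Re-running with the returned watermark yields no rows, which is what makes the pattern safe to retry after a failed nightly run.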

Required Skills & Experience

5-8+ years of overall data engineering experience, with 3+ years hands-on in Azure Databricks / Spark building production-grade data pipelines.

Strong expertise in PySpark / Spark SQL, Delta Lake, and performance tuning for large-volume ETL / ELT workloads.

Solid experience with DAG based orchestration tools such as Airflow / Astronomer (or equivalent), including sensors, task dependencies, and advanced scheduling patterns.

Hands-on experience with Azure Data Factory, ADLS Gen2, and the broader Azure data ecosystem for data movement and storage.

Strong SQL skills for data profiling, validation, and reconciliation across multiple source and target systems.

Proven experience supporting production data pipelines (L2/L3 or DevOps-style), including incident management, root-cause analysis, and permanent fixes for job failures and data issues.

Experience with Git / Azure DevOps or similar tools for version control, CI/CD, and work tracking (Jira/ServiceNow/Confluence).

Ability to work in a fast-paced, business-critical environment with multiple stakeholders, balancing project delivery with ongoing support.

Competencies:

6-8+ years of experience required

Business Analysis in Retail

Python

Databricks

PL/SQL

ServiceNow

Databricks Consultant / Developer - Data Pipelines & Integrations

Company Benefits & Culture

Inclusive and diverse work environment

Opportunities for professional growth and development

Comprehensive health and wellness benefits

Appreciate your quick response, and please feel free to reach out to me with any questions you may have.

Thanks