
Sr. Data Engineer

3B Staffing LLC, Orlando, FL, United States



Client: Capco/Comerica

Location: Hybrid (2-3 days/week on site); candidates must be local to Dallas, Orlando, Chicago, or NYC

Visa: US Citizen (USC) or Green Card (GC) holder

Rate: $53-$60/hr on C2C

Duration: 6-month contract

An in-person interview in one of these locations will be required

Must have banking experience

Required Qualifications

5+ years of experience with PySpark, including performance tuning, DataFrames, Spark SQL, and distributed data processing.

3+ years of hands-on experience with Snowflake, including Snowpipe, stages, tasks, streams, and performance optimization.

Strong experience building data pipelines on AWS.

Strong SQL skills with the ability to write optimized, complex queries.

Solid understanding of ETL/ELT concepts, data warehousing, and modern data architecture.

Job Description:

Data Engineer (PySpark + Snowflake, AWS)

Position Overview

We are seeking an experienced Data Engineer with strong skills in PySpark and hands-on expertise in Snowflake on the AWS platform. The ideal candidate has 5+ years of PySpark experience and 3+ years working with Snowflake, with proven ability to build, optimize, and maintain large-scale data pipelines.

Key Responsibilities

Data Pipeline Engineering

Design, build, and maintain high-performance ETL/ELT pipelines using PySpark on AWS.

Develop automated ingestion, transformation, and validation workflows for large structured and semi-structured datasets.

Optimize Spark jobs for performance, scalability, and cost efficiency.

Snowflake Development

Build and manage data pipelines that load into Snowflake using PySpark, Snowpipe, and external stages.

Create and maintain Snowflake objects including:

Databases, schemas, tables

Virtual warehouses

Internal/external stages, file formats

Streams, Tasks, Dynamic Tables

Implement Snowpipe for continuous or incremental ingestion.

Apply Snowflake optimization techniques (clustering, micro-partitioning, query profiling, etc.).
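The Snowflake objects listed above (stages, file formats, Snowpipe, streams, tasks) fit together roughly as in the DDL sketch below. All database, schema, bucket, and warehouse names are hypothetical; a real setup would also use a storage integration for S3 credentials.

```sql
-- File format and external stage over an (assumed) S3 landing bucket
CREATE FILE FORMAT IF NOT EXISTS raw_db.public.csv_fmt
  TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

CREATE STAGE IF NOT EXISTS raw_db.public.txn_stage
  URL = 's3://example-bucket/transactions/'
  FILE_FORMAT = (FORMAT_NAME = 'raw_db.public.csv_fmt');

-- Snowpipe: continuous ingestion triggered by S3 event notifications
CREATE PIPE IF NOT EXISTS raw_db.public.txn_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_db.public.transactions
  FROM @raw_db.public.txn_stage;

-- Stream + task: incremental processing of newly loaded rows
CREATE STREAM IF NOT EXISTS raw_db.public.txn_stream
  ON TABLE raw_db.public.transactions;

CREATE TASK IF NOT EXISTS raw_db.public.merge_txns
  WAREHOUSE = etl_wh
  SCHEDULE = '5 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('raw_db.public.txn_stream')
AS
  INSERT INTO raw_db.public.transactions_clean
  SELECT * FROM raw_db.public.txn_stream;
```

The stream records only rows added since the last consumption, so the task processes data incrementally rather than rescanning the table.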

AWS Integration

Work with AWS services such as S3, IAM, Lambda, CloudWatch, and EventBridge for data ingestion and automation.

Implement event-driven ingestion using SNS/SQS or other AWS-native triggers.
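Event-driven ingestion of the kind described usually starts from an S3 event notification delivered to Lambda (directly or via SNS/SQS). The handler below is a minimal sketch assuming the standard S3 notification shape; the bucket and key values are placeholders.

```python
def lambda_handler(event, context):
    """Collect s3:// URIs for newly created objects from an S3 event
    notification, the payload shape S3 sends to Lambda/SQS triggers."""
    objects = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            objects.append(f"s3://{bucket}/{key}")
    # Downstream, these URIs would be handed to a Spark job or Snowpipe.
    return {"objects": objects}
```

In practice the handler would enqueue these URIs or trigger the ingestion step, with CloudWatch/EventBridge handling retries and scheduling.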