
Data Warehouse Engineer
3B Staffing LLC, Jersey City, NJ, United States
About the Role
Our client is modernizing their data ecosystem and building a new ingestion framework using the latest cloud and data engineering tools. We're looking for hands-on Data Warehouse Engineers with experience in Snowflake, Azure, and modern data pipelines. Ideal candidates bring a mix of ETL development, data analysis, and strong SQL, with the ability to work in a metrics-driven, quality-focused environment.
Top Skills We're Screening For
- Snowflake + DBT-Snowflake
- Microsoft Fabric / Snowflake Fabric
- Azure Data Factory (ADF)
- ADLS Gen2 / Apache Iceberg
- GoldenGate (bonus)
- Modern pipeline tools: Kafka, Airflow, IDMC
Core Responsibilities
- Design, build, and optimize modern data pipelines
- Develop and maintain ETL/ELT workflows using Informatica and Snowflake
- Perform data analysis focused on data quality, lineage, and integrity
- Support and contribute to the new ingestion framework
- Build and enhance data warehouse, data mart, and ODS solutions
- Collaborate with engineering, analytics, and cross-functional teams
Tech Environment
- Snowflake
- Informatica
- Azure (ADLS Gen2, ADF, Microsoft Fabric)
- Kafka, GoldenGate
- DBT-Snowflake, IDMC
- PySpark, Airflow
- Metadata pipeline management tools
Preferred Background
- 5+ years in data engineering, ETL development, or database applications
- Strong SQL, query optimization, and RDBMS design
- End-to-end Snowflake architecture experience (RBAC, tuning, cloning, etc.)
- Experience with data warehouse, data mart, and ODS solutions
- Familiarity with healthcare payer data (member, enrollment, claims, provider)
- Cloud engineering: Azure preferred, AWS acceptable
Nice to Have
- CDMP or CBIP certification
- ERwin, ER/Studio, or PowerDesigner
- DataStage or SSIS experience (Informatica preferred)
- Agile or ITIL certification