
GCP Data Engineer

3B Staffing LLC, Hartford, CT, United States


Role - GCP Data Engineers

Location - At least 2x/week on-site at 151 Farmington Avenue, Hartford, CT 06156 - Locals Only

Interview Process:

One and done: a 1-hour, 100% technical interview. Candidates will be asked to share their screen and code!

NEED ONE MANAGER REFERENCE, INCLUDING THEIR LINKEDIN PROFILE AND OFFICIAL EMAIL ADDRESS.

Duration:

6-12 mo. contract

Top 3 must-have technologies:
GCP
Python
SQL
Nice-to-Haves:

Teradata (platform they are moving off of)
Healthcare experience
Will be looking for:

Candidates who are well-versed + hands-on with required technologies
Candidates who are self-sufficient and able to work independently
Position Detail:

We are seeking skilled Data Engineer(s) to support a high-impact enterprise data migration initiative. The goal is to migrate data warehouse assets and ETL pipelines from Teradata to Google Cloud Platform (GCP).

The role involves hands-on development, testing, and optimization of data pipelines and warehouse structures in GCP, ensuring minimal disruption and maximum performance.

Key Responsibilities:

Lead and execute migration of data and ETL workflows from Teradata to GCP-based services such as BigQuery, Cloud Storage, Dataflow, Dataproc, and Composer (Airflow).
Analyze and map existing Teradata workloads to appropriate GCP equivalents.
Rewrite SQL logic, scripts, and procedures in GCP-compliant formats (e.g., standard SQL for BigQuery).
Collaborate with data architects and business stakeholders to define migration strategies, validate data quality, and ensure compliance.
Develop automated workflows for data movement and transformation using GCP-native tools and/or custom scripts (Python).
Optimize data storage, query performance, and costs in the cloud environment.
Implement monitoring, logging, and alerting for all migration pipelines and production workloads.
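To illustrate the kind of SQL translation work the role involves, below is a minimal, hypothetical sketch of rewriting a few common Teradata idioms into BigQuery standard SQL. The specific rewrite rules (SEL, ADD_MONTHS, ZEROIFNULL) are illustrative assumptions; a real migration would rely on proper SQL parsing or dedicated translation tooling such as the BigQuery migration service rather than regex substitution.

```python
import re

# Illustrative only: toy mapping of common Teradata SQL idioms to
# BigQuery standard SQL equivalents. Not production migration tooling.
TERADATA_TO_BQ = [
    # Teradata's SEL shorthand becomes SELECT (word boundary avoids touching SELECT itself)
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    # ADD_MONTHS(col, n) becomes DATE_ADD(col, INTERVAL n MONTH)
    (re.compile(r"\bADD_MONTHS\(\s*([^,]+),\s*(-?\d+)\s*\)", re.IGNORECASE),
     r"DATE_ADD(\1, INTERVAL \2 MONTH)"),
    # ZEROIFNULL(col) becomes IFNULL(col, 0)
    (re.compile(r"\bZEROIFNULL\(\s*([^)]+)\)", re.IGNORECASE),
     r"IFNULL(\1, 0)"),
]

def translate(sql: str) -> str:
    """Apply each rewrite rule in order and return the converted SQL."""
    for pattern, replacement in TERADATA_TO_BQ:
        sql = pattern.sub(replacement, sql)
    return sql

# Example: a legacy Teradata snippet converted for BigQuery
legacy = "SEL id, ZEROIFNULL(amt) FROM orders WHERE dt > ADD_MONTHS(CURRENT_DATE, -3)"
print(translate(legacy))
```

In practice, candidates would be expected to handle far richer dialect differences (BTEQ control flow, QUALIFY, SET tables, stored procedures), which is why hands-on Teradata and BigQuery experience is listed as required.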
Required Skills:

4 to 6+ years of experience in Data Engineering, with at least 2 years in GCP.
Strong hands-on experience in Teradata data warehousing, BTEQ, and complex SQL.
Solid knowledge of GCP services: BigQuery, Dataflow, Cloud Storage, Pub/Sub, Composer, and Dataproc.
Experience with ETL/ELT pipelines using custom scripting tools (Python/Java).
Proven ability to refactor and translate legacy logic from Teradata to GCP.
Familiarity with CI/CD, Git, Argo CD, and DevOps practices in cloud data environments.
Strong analytical, troubleshooting, and communication skills.
Preferred Qualifications:

GCP certification (Preferred: Professional Data Engineer).
Exposure to Apache Kafka, Cloud Functions, or AI/ML pipelines on GCP.
Experience working in the healthcare domain.
Knowledge of data governance, security, and compliance in cloud ecosystems.
Behavioral Skills:

Problem-solving mindset
Attention to detail
Accountability and ownership
Curiosity and staying current with evolving GCP services

Preference:

Ability to work in the Hartford, CT office at least three times a week.