DevOps Engineer/Data Pipeline Engineer/Data Platform Engineer

3B Staffing LLC, Boston, MA, United States


Location:

Boston, MA (candidates must be local to MA, RI, or NH) and must be able to meet for a face-to-face (F2F) interview if requested

Duration:

12+ month contract

Title:

DevOps Engineer/Data Pipeline Engineer/Data Platform Engineer (note: this is NOT a Data Engineer opening)

Key Skills:

CI/CD, Data Pipelines, AWS, Airflow, Informatica/IICS, Terraform

Job Description:

EOHHS is seeking an experienced DevOps Engineer/Data Pipeline Engineer/Data Platform Engineer to support our cloud data warehouse modernization initiative, migrating from a SQL Server/AWS-based system to a Snowflake-based data platform. The engineer is responsible for developing, maintaining, and optimizing the data pipelines and integration processes that support analytics, reporting, and business operations, and will design and implement CI/CD pipelines, automate data pipeline deployments, and ensure operational reliability across the Snowflake, Informatica, and Apache Airflow environments. This role works closely with our EHS IT team supporting the Department of Mental Health and the Department of Public Health Hospitals.

The primary work location for this role is 40 Broad Street, Boston, Massachusetts. The work schedule is Monday through Friday, 9:00 AM to 5:00 PM EST.

DETAILED LIST OF JOB DUTIES AND RESPONSIBILITIES:

Build and maintain CI/CD (Continuous Integration/Continuous Delivery) pipelines for Snowflake, Informatica (IICS), and Airflow DAG (Directed Acyclic Graph) deployments

Implement automated code promotion between development, test, and production environments

Integrate testing, linting, and security scanning into deployment processes

Develop IaC (Infrastructure as Code) using Terraform or similar tools to manage Snowflake objects, networking, and cloud resources

Manage configuration and environment consistency across multi-region/multi-cloud setups

Maintain secure connectivity between cloud and on-prem systems (VPNs, private links, firewalls)

Implement logging and alerting for Airflow DAGs, Informatica workflows, and Snowflake performance (see the sketch after this list)

Develop proactive monitoring dashboards for job failures, data quality triggers, and warehouse usage

Optimize pipeline performance, concurrency, and cost governance in Snowflake

Own deployment frameworks for ETL/ELT code, SQL scripts, and metadata updates

Support user access provisioning and RBAC (role-based access control) alignment across Snowflake, Informatica, and Airflow

Troubleshoot platform and orchestration issues and lead incident response during outages

Enforce DevSecOps practices including encryption, secrets management, and key rotation

Implement audit, logging, compliance, and backup/restore strategies aligned with governance requirements

Participate in testing, deployment, and release management for new data workflows and enhancements
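For illustration only (this is not part of the client's specification), the following is a minimal sketch of the Airflow logging-and-alerting duty above, assuming Airflow 2.4+; the DAG ID, the task, and the notify_on_failure helper are hypothetical placeholders.

```python
# Minimal sketch: an Airflow DAG wired with a failure-alerting callback.
# DAG ID, task, and the notify_on_failure helper are illustrative only.
import logging
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

log = logging.getLogger(__name__)


def notify_on_failure(context):
    # Airflow invokes this with the task-instance context when a task fails.
    # A real implementation would forward to Slack, PagerDuty, CloudWatch, etc.
    ti = context["task_instance"]
    log.error("DAG %s task %s failed (run %s)",
              ti.dag_id, ti.task_id, context["run_id"])


def load_to_snowflake():
    # Placeholder for the real pipeline step, e.g. triggering an IICS
    # job or running COPY INTO against Snowflake.
    pass


with DAG(
    dag_id="example_warehouse_load",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
    default_args={
        "retries": 2,
        "retry_delay": timedelta(minutes=5),
        "on_failure_callback": notify_on_failure,
    },
):
    PythonOperator(task_id="load_to_snowflake",
                   python_callable=load_to_snowflake)
```

In practice the callback would push to whatever alerting channel the team standardizes on; the callback-plus-retries pattern stays the same.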

Required Qualifications:

3-7+ years of experience in DevOps, Cloud Engineering, or Data Platform Engineering roles

Hands-on experience with:

Snowflake (roles, warehouses, performance tuning, cost control; see the sketch after this list)

Apache Airflow (DAG orchestration, monitoring, deployments)

Informatica (IICS pipeline deployment automation preferred)

Strong CI/CD skills using GitLab, GitHub Actions, Azure DevOps, Jenkins, or similar

Proficiency with Terraform, Python, and Shell scripting

Deep understanding of cloud platforms: AWS, Azure, or GCP

Experience with containerization (Docker, Kubernetes), especially for Airflow

Strong knowledge of networking concepts and security controls
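As a hedged sketch of the Snowflake role and warehouse skills called out above (first "Hands-on experience" bullet), the snippet below uses the snowflake-connector-python package; the connection settings, the ETL_DEPLOY_ROLE role, and the ETL_WH warehouse are hypothetical, not names from this posting.

```python
# Sketch: pinning an RBAC role and warehouse per session with the
# Snowflake Python connector. All identifiers are placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],  # prefer key-pair auth or a secrets manager
    role="ETL_DEPLOY_ROLE",                     # hypothetical RBAC role
    warehouse="ETL_WH",                         # hypothetical warehouse
)
try:
    cur = conn.cursor()
    # A short auto-suspend keeps idle warehouses from burning credits.
    cur.execute("ALTER WAREHOUSE ETL_WH SET AUTO_SUSPEND = 60")
    cur.execute("SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE()")
    print(cur.fetchone())
finally:
    conn.close()
```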

Preferred Knowledge, Skills & Abilities:

Experience migrating from SQL Server or other legacy DW platforms

Knowledge of FinOps practices for Snowflake usage optimization (see the sketch after this list)

Background in healthcare, finance, or other regulated industries is a plus
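One hedged illustration of the FinOps bullet above: summing per-warehouse credit burn from Snowflake's standard SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY view. The connection parameters are placeholders, and the session role must have ACCOUNT_USAGE access.

```python
# Sketch: per-warehouse credit consumption over the last 7 days, a
# common FinOps check. Requires a role with ACCOUNT_USAGE access.
import os

import snowflake.connector

QUERY = """
    SELECT warehouse_name,
           SUM(credits_used) AS credits_7d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_7d DESC
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)
try:
    for warehouse, credits in conn.cursor().execute(QUERY):
        print(f"{warehouse}: {credits:.2f} credits")
finally:
    conn.close()
```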

Soft Skills:

Effective communication with technical and non-technical stakeholders

Ability to troubleshoot complex distributed data workloads

Strong documentation and cross-team collaboration skills

Proactive and committed to process improvement and automation

Detail-oriented, with a focus on data accuracy

Education and Certification:

Bachelor's degree, or equivalent experience, in Computer Science, Information Systems, Data Engineering, Health Informatics, or a related field.