
Data Engineer (Python & SQL) (Atlanta)
Holistic Partners, Inc., Atlanta, GA, United States
Job Title: Data Engineer (Python & SQL)
Location: Atlanta, GA & Boston, MA (Onsite)
Duration: 6 Months
Interview Process: Video
Project Overview
This role will support a key data engineering workstream, focused on building and maintaining enterprise data pipelines and transformations in a modern cloud environment. The individual should be able to work independently, ramp up quickly, and operate comfortably in a client-facing setting.
Top Required (MUST – High Proficiency)
• Python
• SQL
Required Skills / Experience
• Snowflake
• dbt
• Airflow on AWS
• AI / Automation
Role: Senior Associate Technology L2 (Data Platforms)
As a Senior Associate Technology L2 specializing in Data Platforms, you will play a key role in designing, developing, and optimizing data solutions that enable scalable, high-performance data processing. You will work with cutting-edge technologies to build robust data pipelines, data lakes, and analytics platforms that drive business insights and innovation.
Your Impact
• Design, develop, and maintain scalable data platforms that support enterprise data needs.
• Build and optimize data pipelines, ETL processes, and data integration workflows.
• Collaborate with data scientists, analysts, and business stakeholders to ensure data solutions meet business requirements.
• Implement best practices in data governance, security, and compliance.
• Work with cloud-based data platforms such as AWS, Azure, or GCP.
• Utilize big data technologies such as Hadoop, Spark, Kafka, and Snowflake.
• Automate data processing and monitoring using tools like Airflow, Kubernetes, or Apache NiFi.
• Troubleshoot and optimize data performance, ensuring high availability and reliability.
• Stay updated on emerging trends in data engineering and contribute to innovation within the team.
Skills & Experience
• 5+ years of experience in data engineering, data platforms, or related fields.
• Strong expertise in SQL, NoSQL, and data modeling.
• Hands-on experience with big data technologies such as Hadoop, Spark, Kafka, or Snowflake.
• Proficiency in cloud-based data solutions (AWS, Azure, GCP).
• Experience with data pipeline orchestration tools like Apache Airflow, NiFi, or Kubernetes.
• Strong programming skills in Python, Java, Scala, or similar languages.
• Knowledge of data governance, security, and compliance best practices.
• Ability to work in an Agile environment and collaborate with cross-functional teams.
• Strong problem-solving and analytical skills with a focus on data-driven decision-making.
Set Yourself Apart With
• Experience with real-time data processing and streaming analytics.
• Knowledge of machine learning pipelines and data science workflows.
• Certifications in cloud platforms (AWS Certified Data Analytics – Specialty, Azure Data Engineer, GCP Professional Data Engineer).
• Exposure to DevOps practices for data engineering.