
Associate Data & AI Engineer
Career Techniques, New York, NY, United States
Overview
We are seeking an Associate Data & AI Engineer to help build and scale the data and AI infrastructure that powers our analytics, automation, and decision-making. This role sits at the intersection of data engineering and applied AI, with a specific focus on deploying LLM-powered agents that automate high-value workflows across the organization. It is ideal for an early-career engineer who wants hands‑on experience across data engineering, AI agent development, and marketing analytics. You will work closely with cross‑functional teams to design data pipelines, build and deploy LLM-based automation agents, and deliver high-quality datasets and reporting that drive business impact. Real‑world use cases include automating paid media reporting, daily KPI distribution, and performance dashboards.
What You’ll Do
Data Engineering & Infrastructure
Design, build, and maintain scalable data pipelines (batch and real‑time)
Develop ETL/ELT workflows to ingest and transform data from multiple sources
Support the development and optimization of data warehouses and data models
Ensure data quality, reliability, and performance across systems
AI & Data Applications
Design and deploy LLM-powered agents to automate repeatable reporting workflows, including paid media performance summaries and daily KPI digests
Collaborate on building AI-driven tools and business dashboards
Product & Business Impact
Partner with the broader data and analytics team to deliver clean, usable datasets
Translate business needs into scalable data solutions
Enable analytics, reporting, and AI tools
Engineering & Collaboration
Contribute to testing and debugging
Participate in code reviews and maintain high engineering standards
Document workflows, pipelines, and project progress
Support monitoring, observability, and continuous improvement of data systems
What We’re Looking For
Bachelor’s degree in Computer Science, Engineering, Mathematics, or a related field
0–2 years of experience (internships, projects, or coursework included) in data engineering, software development, or data science
Proficiency in Python and SQL
Familiarity with data pipelines, data modeling, and ETL processes
Basic understanding of cloud platforms (AWS, GCP, or Azure)
Exposure to data warehouses (e.g., Snowflake, BigQuery, Redshift)
Understanding of machine learning concepts (e.g., NLP, neural networks, or data science fundamentals)
Familiarity with LLM APIs and prompt engineering (e.g., OpenAI, Anthropic), with an understanding of how to build reliable AI-driven workflows
Strong problem‑solving skills and ability to work both independently and collaboratively
Nice to Have
Experience with orchestration tools (Airflow, Dagster)
Experience with digital analytics (Adobe), social analytics (Sprout), and business intelligence tools (Domo, Power BI)
Exposure to real‑time or event-driven systems
Experience with agentic frameworks (LangChain, LlamaIndex, AutoGen, or CrewAI)
Familiarity with workflow automation tools (n8n, Make, or Zapier)
Portfolio or projects demonstrating data or AI work
Comp: TBD (will update shortly)