
Azure Data Engineer ADF at Inherent Technologies Cincinnati, OH

Inherent Technologies, Cincinnati, OH, United States



Position

Azure Data Engineer ADF

Location

Cincinnati, OH (Onsite from Day 1)

Competencies

  • 10+ years' experience required
  • Digital: Microsoft Azure
  • Digital: Python
  • Agile Way of Working

An Azure Data Engineer with ADF, PySpark, and Python experience is needed.

Role Description

  • Take ownership of stories and drive them to completion through all phases of the 84.51 SDLC.
  • Develop distributed data processing data pipeline solutions.
  • Orchestrate multi-step data transformation pipelines.
  • Perform unit, integration, and regression testing on packaged code.
  • Build transformation logic and code in an object-oriented programming style.
  • Enhance CI/CD pipelines in the path to production.
  • Create data quality checks for ingested and post-processed data.
  • Ensure data observability via alerting and monitoring of automated pipeline solutions.
  • Maintain and enhance existing applications.
  • Build cloud resources via infrastructure as code.
  • Provide mentoring to junior team members.
  • Participate in retrospective reviews.
  • Participate in the estimation process for new work and releases.
  • Bring new perspectives to problems.
  • Be driven to improve yourself and the way things are done.
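The posting itself contains no code; as a purely illustrative sketch of the "data quality checks" responsibility listed above (function, field names, and rules are assumptions, not taken from the posting), a minimal row-level check in Python might look like:

```python
# Hypothetical illustration only: a minimal data quality check of the
# kind this role describes. Names and validation rules are assumed,
# not specified by the employer.

def check_rows(rows, required_fields):
    """Split rows into those passing basic quality rules and a list of issues.

    A row passes when every required field is present and non-empty.
    Returns (passed_rows, issues), where each issue is (row_index, message).
    """
    passed, issues = [], []
    for i, row in enumerate(rows):
        missing = [f for f in required_fields if row.get(f) in (None, "")]
        if missing:
            issues.append((i, f"missing fields: {missing}"))
        else:
            passed.append(row)
    return passed, issues


# Example: validate two ingested records against required fields.
rows = [
    {"id": 1, "amount": 10.5},
    {"id": 2, "amount": None},  # fails the check: empty "amount"
]
passed, issues = check_rows(rows, required_fields=["id", "amount"])
```

In a production pipeline this kind of check would typically run as an orchestrated step (e.g., in ADF or PySpark) with failures routed to alerting, per the data observability bullet above.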

Essential Skills

  • Bachelor's degree, typically in Computer Science, Management Information Systems, Mathematics, Business Analytics, or another technically strong program.
  • 2 years of proven professional data development experience.
  • 2 years developing with SQL.
  • Proficient with software engineering best practices.
  • Proficient with data pipeline orchestration.
  • Proficient with automated testing (PyTest, etc.).
  • Proficient with VCS (Git, GitHub).
  • Proficient using Python frameworks.
  • Proficient with object-oriented programming principles.
  • Proficient with Java and the Spring Framework.
  • Experience with distributed data processing (PySpark and/or Snowpark).
  • Experience with Data Observability.
  • Experience with Data Quality checks and processes.
  • Experience with cloud services (Azure preferred; GCP, AWS).
  • Experience with CI/CD.
  • Experience with dependency management (Conda, venv, etc.).
  • Experience with debugging enterprise applications.
  • Understanding of performance tuning enterprise processes.
  • Understanding of Infrastructure as Code.
  • Understanding of SOLID principles.
  • Understanding of Agile principles (Scrum).
