
Data Engineer

Themesoft Inc., Nashville, TN, United States


Job Title: Senior Staff API/GCP Data Engineer.

Work Arrangement: Hybrid.

Work Authorization:

Candidates must be authorized to work in the U.S. without current or future sponsorship requirements.

The Senior Staff Data Engineer - API serves as a primary development resource, responsible for designing, coding, testing, implementing, documenting, and maintaining NextGen solutions for GCP Cloud enterprise data initiatives. The role requires working closely with data teams, frequently in a matrixed environment as part of a broader project team. Because GCP technology and practice are emerging and fast-evolving, the position requires staying well-informed of technological advancements and putting new innovations into effective practice.

As a Senior Staff Data Engineer, you will collaborate closely with all team members to create a modular, scalable solution that addresses current needs and serves as a foundation for future success. The position is critical in building the team's API engineering practices in test-driven development, continuous integration, and automated deployment, and is a hands-on role that actively coaches the team to solve complex problems.

This role will provide application development for specific business environments. The focus is on setting technical direction for groups of applications and similar technologies, as well as taking responsibility for technically robust solutions that encompass all business, architecture, and technology constraints.

Responsibilities

Work with data engineers, data architects, data scientists, and other internal stakeholders to understand product requirements and then design, build, and monitor data platforms and pipelines that meet today’s requirements but can gracefully scale.

Implement automated workflows that lower manual/operational costs, define and uphold SLAs for timely delivery of data, and move the company closer to democratizing data.

Enable a self‑service data architecture supporting query exploration, dashboards, data catalog, and rich data discovery.

Promote a collaborative team environment that prioritizes effective communication, team member growth, and success of the team over success of the individual.

Design and create APIs that accelerate the time from idea to insight.

Adhere to and support API best practices, processes, and standards.

Produce high quality, modular, reusable code that incorporates best practices and serves as an example for less experienced engineers.

Help mentor team members on complex data projects and on following the Agile process.

Build and support a GCP‑based ecosystem designed for enterprise‑wide analysis of structured, semi‑structured, and unstructured data.

Analyze requirements, design AI/ML-based solutions, and integrate those solutions into customer environments.

Qualifications

Strong understanding of best practices and standards for GCP Data process design and implementation.

Over 10 years of total experience, with at least 2 years of hands‑on experience with the GCP platform and several of the following tools and services: Postman, Dynatrace, Cloud Run, GKE, Cloud Functions.

4+ years of hands‑on experience with the following technologies:

API Development

Python FastAPI Framework

Spark Streaming, Kafka

Java, Python, or Scala

Certifications (a plus, but not required):

Google Cloud Professional Data Engineer
