
ETL Architect / Data Architect – Azure

GovServicesHub, Tallahassee, FL, United States


Posted on 01/07/2026

Education Requirements
A bachelor’s degree from an accredited college or university in Computer Science, Information Systems, or a related field is required.

Alternatively, equivalent work experience, including experience with Service-Oriented Architecture (SOA) and Microsoft Azure cloud solutions, may be substituted for the educational requirement on a year-for-year basis.

Purpose
The Florida Department of Corrections (Department), Office of Information Technology (OIT) is issuing this Request for Quote (RFQ) to define the scope and requirements of this Task Order pursuant to the State Term Contract (STC) for IT Staff Augmentation Services, STC 80101507-23-STC-ITSA.

General Experience Expectations
A minimum of seven (7) years of experience working with large and complex database management systems.

Scope of Work / Job Characteristics
The Data Architect, under the working job title of Extract, Transform, Load (ETL) Architect, will serve as the principal point of contact for the project team.

The ETL Architect will drive the development of data integration pipelines, enabling efficient and reliable access to critical data within the Correction Information Management System (CIMS) Data Warehouse/Data Lake on Azure.

The role requires hands‑on experience with Azure Data Factory (ADF), Azure Databricks, Azure Synapse Analytics, Power BI, and Azure Purview. The ETL Architect will be at the forefront of transforming complex data into actionable insights, while ensuring data integrity, security, and performance to meet mission‑critical needs.

Duties and Responsibilities

Lead the design and development of high‑performing ETL processes to integrate and transform data across disparate sources

Deliver efficient and reliable pipelines that meet business needs while maintaining the highest standards of security

Utilize Azure Data Factory (ADF) to automate and streamline data workflows, ensuring smooth movement of data from source to target

Build and manage complex ETL workflows that extract, transform, and load data for downstream analytics and reporting, ensuring data is accurate, timely, and secure

Take ownership of data quality and validation, creating resilient ETL processes that ensure only trusted data reaches its destination

Leverage the full power of the Azure ecosystem—ADF, Databricks, Synapse, and Purview—to manage and process high volumes of structured and unstructured data with scalable and performance‑optimized solutions

Integrate large datasets into Azure Synapse Analytics, enabling analytics teams to deliver data‑driven insights that support the Department’s mission

Continuously optimize ETL jobs to minimize latency and maximize throughput

Ensure the architecture supports fast and reliable data access for end‑users and systems, meeting stringent performance metrics

Embed security and compliance best practices at every step of the ETL process

Protect sensitive data by adhering to industry standards and ensuring compliance with the Department’s data governance policies

Use Azure Purview to enforce data governance, track data lineage, and ensure data handling meets the highest standards of integrity

Partner with cross‑functional teams (data engineers, analysts, business stakeholders, and security experts) to design and implement ETL solutions that meet evolving Department needs

Act as a technical leader and mentor, guiding junior team members and providing expert direction on data processing and transformation best practices

Develop and maintain clear and detailed documentation for ETL processes to ensure consistent delivery of high‑quality solutions

Establish and enforce best practices for data handling, ETL development, and security, fostering a culture of excellence and accountability
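The data-quality and validation responsibilities above follow a common pattern: every record either passes all validation rules and is loaded, or is quarantined with its errors recorded. The following is a minimal, illustrative sketch of that pattern in plain Python; the record fields and rules are hypothetical, and a real implementation would run inside ADF or Databricks rather than a standalone script.

```python
# Minimal sketch of an ETL quality gate: only records that pass every
# validation rule reach the target; the rest are quarantined for review.
# Field names (inmate_id, facility, sentence_years) are hypothetical.

def extract():
    """Extract step: stand-in for reading from a source system."""
    return [
        {"inmate_id": "A100", "facility": "FL-01", "sentence_years": 5},
        {"inmate_id": "",     "facility": "FL-02", "sentence_years": 3},   # missing key
        {"inmate_id": "A101", "facility": "FL-01", "sentence_years": -1},  # invalid value
    ]

def validate(record):
    """Return a list of rule violations; an empty list means the record is trusted."""
    errors = []
    if not record["inmate_id"]:
        errors.append("missing inmate_id")
    if record["sentence_years"] < 0:
        errors.append("negative sentence_years")
    return errors

def run_pipeline():
    """Transform/load step: route each record to the target or to quarantine."""
    loaded, quarantined = [], []
    for rec in extract():
        errors = validate(rec)
        if errors:
            quarantined.append({"record": rec, "errors": errors})
        else:
            loaded.append(rec)  # stand-in for the write to the warehouse target
    return loaded, quarantined

loaded, quarantined = run_pipeline()
```

The key design choice, reflected in the duties above, is that validation failures never silently drop data: quarantined records carry their error reasons, so data quality issues can be traced back to the source.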

Compliance Requirements

Any successful candidate with access to the Department’s network must complete Security Awareness Training within 30 calendar days of hire

All selected candidates must successfully complete a Level II Background Check

Required Qualifications

Seven (7) or more years of experience in ETL development and data engineering

Three (3) or more years of hands‑on experience with Azure Data Factory (ADF), Azure Cloud, Azure Databricks, Azure Synapse Analytics, and Azure Purview

Proven track record of building and optimizing large‑scale ETL pipelines for high‑performance, high‑availability environments

Extensive expertise in Spark, Python, and/or Scala for large‑scale data transformations

Strong SQL proficiency with experience working with complex data structures

In‑depth knowledge of data governance, security protocols, and role‑based access control (RBAC) within the Azure ecosystem

Ability to design ETL processes that are resilient, efficient, and fully compliant with regulatory standards
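The SQL proficiency called for above typically shows up in transforms like deduplicating a staging table so only the latest version of each record flows downstream. The sketch below illustrates that pattern using Python's built-in sqlite3 module; the table and column names are hypothetical, not the actual CIMS schema, and in practice this query would run in Synapse or Databricks SQL.

```python
import sqlite3

# Illustrative warehouse transform: keep only the latest version of each
# record from a staging table by joining it to the per-key max timestamp.
# Table/column names are hypothetical, not from the actual CIMS schema.

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE staging_movements (
        inmate_id TEXT, facility TEXT, updated_at TEXT
    );
    INSERT INTO staging_movements VALUES
        ('A100', 'FL-01', '2025-01-01'),
        ('A100', 'FL-03', '2025-02-01'),  -- later version supersedes FL-01
        ('A101', 'FL-02', '2025-01-15');
""")

rows = conn.execute("""
    SELECT s.inmate_id, s.facility, s.updated_at
    FROM staging_movements AS s
    JOIN (
        SELECT inmate_id, MAX(updated_at) AS max_ts
        FROM staging_movements
        GROUP BY inmate_id
    ) AS latest
      ON s.inmate_id = latest.inmate_id
     AND s.updated_at = latest.max_ts
    ORDER BY s.inmate_id
""").fetchall()
```

The same dedup logic is often written with `ROW_NUMBER() OVER (PARTITION BY ...)` in Synapse or Spark SQL; the GROUP BY/JOIN form shown here is the portable equivalent.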

Preferred Qualifications

Microsoft certifications such as Azure Data Engineer Associate, Azure Solutions Architect Expert, or Azure Fundamentals
