Cognos Optimization Specialist

Data Freelance Hub, Austin, TX, United States


⭐ Featured Role | Apply directly with Data Freelance Hub

This contract role in Austin, Texas (remote) is for a Data Architect to support UT Austin's enterprise Data to Insights (D2I) modernization initiative. The architect will design and guide the transition from legacy mainframe-based data environments to a modern cloud-based data platform built on Databricks, and will support downstream analytics tools including Tableau and Cognos.

Location: Remote (Austin, Texas)

Responsibilities

Design, implement, and scale cloud‑based data architectures within our AWS environment.

Modernize the unified data platform using Databricks to enable advanced analytics, machine learning, and real‑time data processing.

Collaborate closely with Data Engineering, DevOps, Data Modeling, Analytics, and Metadata teams to create scalable, efficient, and well‑governed data solutions.

Partner with the Analytics and Data Modeling Group to ensure alignment on data modeling standards, schema design, and data pipeline integration.

Architect efficient ETL/ELT pipelines for data ingestion, transformation, and delivery to support operational and analytical workloads.

Develop and maintain comprehensive data strategies that align with enterprise goals for real‑time and batch data processing.

Create technical artifacts, standards, and architectural frameworks to address current and future business requirements.

Ensure data quality, governance, compliance, and security throughout the data lifecycle.

Lead the implementation and optimization of Databricks, driving adoption of Delta Lake architectures for high‑performance pipelines.

Serve as a technical advisor to stakeholders, communicating complex data concepts to leadership and cross‑functional teams.

Foster a culture of collaboration, innovation, and continuous improvement within the team.

Qualifications

Bachelor’s degree in Computer Science, Information Systems, Engineering, or related field.

Proven experience designing and implementing enterprise‑scale data architectures in AWS environments.

Strong expertise in data modeling, schema design, and database structures.

Hands‑on experience with Databricks for big‑data processing, analytics, and machine learning.

Proficiency in building ETL/ELT pipelines and working with data integration tools (e.g., AWS Glue, Informatica).

Deep understanding of SQL, NoSQL, and data storage technologies (e.g., Redshift, RDS, S3).

Experience ensuring data governance, quality, and compliance.

Strong troubleshooting skills and ability to optimize performance of cloud data solutions.

Collaborative team player with excellent communication and leadership skills.

Relevant education and experience may be substituted as appropriate.

Preferred: Master’s degree in a relevant field.

Preferred: Certifications in AWS (e.g., AWS Solutions Architect) and Databricks (e.g., Databricks Certified Professional).

Preferred: Expertise in Delta Lake architecture design and implementation.

Preferred: Familiarity with Agile development methodologies and tools (e.g., JIRA, Confluence).

Preferred: Proven experience leading data modernization initiatives across cross‑functional teams.

Contact

For applications and inquiries: hirings@openkyber.com
