
USA_Developer
Varite, Louisville, KY, United States
Pay Rate Range: $37.83 - $38.94/hr.
GBAM Req ID: 10712631
Job Description:
Must-Have Technical/Functional Skills
• Experienced in data modeling and transformation using SQL, Spark, and Data Flow capabilities.
• Skilled in scripting and programming with Python, .NET, and PowerShell to support data pipeline automation and workflow development.
• Capable of designing, developing, and consuming REST and SOAP APIs for seamless data exchange.
• Hands-on experience with Azure SQL Database, Azure Data Lake, Cosmos DB, and cloud storage management.
• Familiar with CI/CD and DevOps practices, including Azure DevOps pipelines and infrastructure automation using ARM templates and Terraform.
• Strong understanding of data security, encryption, secure transmission, and privacy compliance requirements.
• Experienced in monitoring and troubleshooting data pipelines using Azure Monitor, Log Analytics, and related diagnostic tools.
• Knowledgeable in integrating event-driven and messaging platforms such as Kafka, Event Grid, and Service Bus for both batch and real-time ingestion.
• Adept at creating and executing unit, integration, and end-to-end test cases to validate data workflows and ensure quality.
• Skilled in diagnosing and resolving access and data integration issues, with escalation to support teams when required.
• Able to collaborate effectively with business stakeholders to gather data integration requirements, define data movement needs, and establish transformation rules.
• Able to design end-to-end integration architectures, including logical solution designs and data flow diagrams.
• Experienced in preparing technical documentation, build sheets, and architectural artifacts to support onboarding and deployment activities.
• Able to communicate effectively with business teams, IT partners, and leadership to ensure alignment with Cloud 3.0 standards and guidelines.
Roles & Responsibilities
• Extensive experience in data integration, ETL development, and processing data within cloud environments.
• Solid understanding of Cloud 3.0 architecture principles and enterprise-approved integration patterns.
• Hands-on expertise with Azure Databricks, Azure Data Lake Storage (ADLS), and Unity Catalog.
• Proficient in working with relational databases, particularly PostgreSQL, with additional exposure to DB2.
• Skilled in SQL, including query tuning and optimizing database performance.
• Knowledgeable in data profiling, source-to-target mapping, and implementing data quality processes.
• Familiarity with Kubernetes and AKS for orchestrating data transformation workloads.
• Proficient in CI/CD practices, version control, and automation using tools such as GitHub, Maven, Jenkins, and Docker.
• Strong foundation in Master Data Management (MDM), data governance, data lineage, and data modeling.
• Experienced in managing large-scale enterprise data environments and leveraging modern cloud data platforms.
• Ability to assess and revise existing analytical data models, identifying reuse opportunities and implementing changes based on business and technical requirements.
• Applies integration design patterns and avoids anti-patterns to deliver scalable and maintainable Cloud 3.0 data integration solutions.
• Collaborates with business and technical stakeholders to define reporting and analytics requirements, and architect/build appropriate data models, ETL processes, and data applications.
• Analyzes source systems and data attributes to identify authoritative data sources, enhancing trust in enterprise reporting.
• Adheres to coding standards, design patterns, unit testing, TDD, and CI/CD workflows using modern tools and technologies.
• Conducts data profiling, source-to-target mapping, data analysis, data quality assessment, database design, SQL tuning, query optimization, and data architecture tasks.
• Supports data governance, data quality management, data modeling, and data lineage across multiple enterprise data repositories.
• Demonstrates strong problem-solving skills in developing solutions for complex analytical and data-driven challenges.
• Works collaboratively with cross-functional teams and stakeholders to achieve shared business objectives.
• Contributes to cloud-based data integration initiatives leveraging approved Cloud 3.0 platforms and patterns, including Azure Databricks, ADLS, Unity Catalog, and PostgreSQL as applicable.
• Facilitates data ingestion, transformation, and movement across cloud platforms, following enterprise-approved integration and lakehouse patterns.
Essential Skills:
Data Integration Developer experienced in migrating DB2, SQL, and PostgreSQL.
Skills:
Cognos Data Integration
Experience Required:
8-10 years
Skills:
  Category Name: SkillCategoryTest1_MN
  Skill:         Cognos Data Integration
  Required:      Yes
  Importance:    1
  Experience:    7+ years