
Senior Data Engineer (Banking)
Accord Technologies Inc., New York, NY, United States
Overview
Job Title:
Senior Data Engineer (Banking)
Location:
New York City, NY
Position type:
W2 contract
Mandatory skills:
Data Engineering, Snowflake, Databricks, Python, PySpark, SQL, Banking
Key Responsibilities
Design and develop robust ETL/ELT pipelines using Snowflake, Databricks, Python, PySpark, and SQL.
Build and optimize data warehouses, data marts, and real-time data solutions for banking applications.
Collaborate with quantitative analysts, data scientists, and business stakeholders to deliver actionable data products.
Implement data governance, quality checks, and monitoring frameworks aligned with banking regulations (SOX, GDPR, CCAR).
Mentor junior engineers and contribute to architectural decisions and best practices.
Required Skills
9+ years of hands-on data engineering experience, with 3+ years in banking/financial services.
Expertise in Snowflake (SnowSQL, performance tuning, security) and Databricks (Delta Lake, Spark optimization).
Proficiency in Python and PySpark for large-scale data processing.
Advanced SQL skills for complex data modeling and query optimization.
Experience with cloud platforms (AWS/Azure) and CI/CD tools (Jenkins, Git).
Strong understanding of banking data domains: trading, risk, compliance, customer, transactions.
Qualifications
Experience:
9+ years of hands-on data engineering experience, with 3+ years in banking/financial services.
Technical:
Snowflake (SnowSQL, performance tuning, security); Databricks (Delta Lake, Spark optimization); Python and PySpark; advanced SQL; cloud platforms (AWS/Azure); CI/CD tools (Jenkins, Git).
Certifications and Additional Skills
Certifications such as SnowPro, Databricks, or AWS/Azure cloud certifications.
Knowledge of real-time streaming (Kafka, Spark Streaming).
Experience with data orchestration tools (Airflow, Dagster).
Familiarity with BI/visualization tools (Tableau, Power BI).