
Data Analyst (SQL & Snowflake & Marketing Data)
Novia Infotech, New York, NY, United States
Data Analyst 1
Duration: 12 Months + possible extension
Location: One Pennsylvania Plaza, 26th Floor, New York, NY, USA
Schedule: 4 days in office until December 26, then fully on-site
Interview Process: 2 rounds - first round will be technical; second round will be on-site with the hiring manager and 2 others
Additional Information provided by the Hiring Manager (HM)
- SQL is the #1 skill needed to succeed in this role
- Must be a highly communicative person
- Will communicate with internal sales stakeholders and outside clients
- Customer facing role
- Data analysis role
- Will become an account specialist: assigned a specific client to deliver current reports and ad hoc requests via Jira, build the relationship, and help automate reporting
- Any marketing background would be helpful and given preference
- Building data pipelines will be a big part of the role (pipeline-building experience is required): building new pipelines from Snowflake, taking raw, uncleaned data from Snowflake
- AWS experience is preferred
- A bachelor's degree is required
- Work with big data
The team works with Snowflake and raw marketing data for the client, works with the client on what they want to measure, then implements the SQL and builds the reporting structure.
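As a rough illustration of that workflow, a Python script might aggregate raw ad data into the metric a client asked to measure. This is a hedged sketch only: the posting's actual stack is Snowflake (typically accessed via the `snowflake-connector-python` library), but SQLite stands in here so the example is self-contained, and the table and column names are invented for illustration.

```python
import sqlite3

# Stand-in for a Snowflake connection; real code would connect with
# snowflake.connector instead (an assumption, not from the posting).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_ad_events (
        client TEXT, campaign TEXT, impressions INTEGER, clicks INTEGER
    );
    INSERT INTO raw_ad_events VALUES
        ('acme', 'spring_launch', 1000, 50),
        ('acme', 'spring_launch', 2000, 40),
        ('acme', 'brand_awareness', 500, 5);
""")

# Roll raw events up into a per-campaign report with click-through rate,
# the kind of measure a client might ask the team to implement in SQL.
report = conn.execute("""
    SELECT campaign,
           SUM(impressions) AS impressions,
           SUM(clicks) AS clicks,
           ROUND(1.0 * SUM(clicks) / SUM(impressions), 4) AS ctr
    FROM raw_ad_events
    WHERE client = 'acme'
    GROUP BY campaign
    ORDER BY campaign
""").fetchall()

for row in report:
    print(row)
```

In practice the SQL would run inside Snowflake itself and the script would only orchestrate and deliver the result, but the shape of the work (raw events in, structured report out) is the same.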
Qualifications
• 3+ years in data analysis
• Strong SQL skills – 2+ years using SQL.
• 2+ years of Python experience.
• Exposure to Snowflake.
• Experience with AWS tools such as IAM, Lambda, EC2 and S3.
• Jira, Confluence, GitHub and other agile development tool experience.
• Experience with automation and the ability to build custom solutions for unique client needs across various software tools.
• Ability to communicate technical roadblocks to non-technical stakeholders.
Responsibilities
- Working closely with internal stakeholders, translate business requirements into SQL queries and Python scripts for ad hoc analysis and custom reporting solutions
- Transform and cleanse outputs into meaningful analysis for business/sales teams
- Explore data to identify client ad performance opportunities, trends, and anomalies, and contribute to new reporting capabilities/solutions.
- Architect data pipelines to connect different data sources and automate reporting flows.
- Develop and maintain Extract, Transform & Load (ETL) processes tailored to evolving client needs.
- Tune and optimize current reporting solutions to meet updated client needs, new measurement methodologies, or improve query performance.
Bonus:
• Degree in computer science/engineering or a quantitative discipline (Economics, Statistics, Engineering, Physics, Mathematics).
• Familiarity with media, TV measurement.
• Experience designing A/B tests.
• Big Data / Data Lake Experience.
• Exposure to Data Cleanrooms.
• Athena Experience.