
Data Engineer
3B Staffing LLC, Kansas City, MO, United States
LinkedIn profile required
Direct phone line required
Rate: $55/hr on W2
This role is primarily focused on the following:
Must have a strong Microsoft tech stack (all Azure components):
1. SQL Server
2. Azure SQL
3. Azure Data Factory
4. Databricks
5. Python (PySpark) - new, required
Technologies listed in the job description are relevant: strong Azure, Azure Data Factory, Data Lakehouse, etc., with prior experience in SQL, SSRS, and SSAS.
Working remotely while on vacation is not permitted.
Laptops are to be secured at all times; travel with the laptop is not allowed.
Must be prepared to keep the camera on at all times while working.
Must have a quiet, dedicated space for remote work with a strong internet connection.
No background noise or other distractions.
Working Hours: Mostly 8 AM - 5 PM CT, with the ability to flex more as required. Must be able to work across time zones. Overtime must be approved in advance; do not exceed 8 hours/day unless approved by a supervisor.
Submission Criteria:
1. Formatted Resume
2. SparkHire Required
3. Two rounds of interviews via Teams (45 minutes each)
Elevate
Position Title: Data Engineer
SUMMARY
The purpose of this position is to perform data development functions, including designing new or enhancing existing enterprise database systems, maintaining or developing critical database processes, performing unit and system testing, and performing support and help desk tasks. It also requires defining and adopting best practices for each of the data development functions as well as for visualization and ETL processes. The position is responsible for architecting report solutions using SSRS reports in collaboration with the management team. This position is also responsible for architecting ETL functions between a multitude of relational databases and external data files.
ESSENTIAL DUTIES AND RESPONSIBILITIES
1. Work with a highly dynamic team focused on Digital Transformation
2. Understand the domain and business processes to implement successful data pipelines
3. Provide work status and coordinate with Data Engineers
4. Manage customer deliverables and regularly report status via weekly/monthly reviews
5. Ability to work in a dynamic environment with changing requirements
6. Good communication and presentation skills
7. Working experience with a wide range of data technologies, data modeling, and metadata management
8. Working experience with T-SQL and relational databases, including currently supported versions of Microsoft SQL Server in Azure
9. Design, develop, and maintain stored procedures, functions, and views
10. Working experience with development and maintenance of Data Factory data pipelines
11. Working experience with development and maintenance of data pipelines on the Databricks platform in Azure
12. Working experience using DevOps CI/CD in Azure
13. Program in T-SQL with relational databases, including currently supported versions of Microsoft SQL Server
14. Working experience using SQL and Python
15. Write high-performance SQL queries using joins, CROSS APPLY, aggregate queries, MERGE, and PIVOT
16. Design normalized database tables with proper indexing and constraints
17. Perform SQL query tuning and performance optimization on complex, inefficient queries
18. Provide guidance on the appropriate use of table variables, temporary tables, and CTEs when working with large datasets
19. Collaborate with DBAs on database design and performance enhancements
20. Lead all phases of the software development life cycle in a team environment
21. Debug existing code and troubleshoot issues
22. Design and provide a framework for maintaining the existing data warehouse for reporting and data analytics
23. Following best practices, design, develop, test, and document ETL processes
24. Develop data cleansing routines using typical data quality functions, including standardization, transformation, rationalization, linking, and matching
25. Keep up to date with the latest database features, techniques, and technologies
26. Support current business applications, including implementing bug fixes as needed
27. Able to multi-task and adapt to shifting priorities
28. Able to meet deadlines set in project planning
29. Effectively communicate progress throughout the project execution phase
30. Follow industry- and company-standard coding practices
31. Produce technical and application documentation
32. Produce quality deliverables upon deployment
33. Other duties as assigned.
QUALIFICATIONS AND REQUIREMENTS
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skills, and/or abilities required.
1. Minimum 10 years of work-related experience in T-SQL, SSRS, and ETL. If the candidate has relevant education in computer science, computer information systems, or a related field, a lack of experience can be supplemented by education as follows:
2. Microsoft Certifications - 1 year
3. Minimum 7 years of experience with development and maintenance of Data Factory data pipelines.
4. Minimum 5 years of experience with development and maintenance of data pipelines on the Databricks platform in Azure.
5. Minimum of 3 years of experience using DevOps CI/CD in Azure.
6. Understanding of data lakehouse architecture in Azure.
7. Minimum of 3 years supervising data infrastructure and data flows.
8. Experience in developing, maintaining, and supporting database processes using Microsoft SQL Server with emphasis on .NET technologies.
9. Proficient at a senior level with the following: T-SQL, SSRS, SSIS, SSAS, data warehousing, ETL, and Python.
10. Demonstrated ability to perform the above-listed essential job duties.