Vacancy expired!
Overview
We have an exciting opportunity with our prestigious client, an industry-leading integrator that provides a wide range of services and solutions to the Federal Government. They have a contract opening for a talented Data Engineering Lead to work on a project that manages retirement assets for more than 5 million Federal Government employees and servicemembers. In this role, you will be responsible for implementing an end-to-end data pipeline for the program-specific analytics platform. Your experience with and understanding of ETL processes are key to your success. This position is located in Washington, DC, or San Antonio, TX, with work being performed remotely due to COVID. Per our Federal Government contract, candidates must be U.S. Citizens and able to pass the client's background check.
Responsibilities
- Own the ETL pipeline process; monitor and troubleshoot task and workflow scheduling and runs
- Write Python scripts to automate schema conversion, table definitions, and load script generation
- Write scripts to log errors using AWS CloudWatch
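The schema-conversion responsibility above can be sketched in a few lines. This is a minimal illustration, not the client's actual tooling: the type map, table names, and S3 path are hypothetical, and a real pipeline would read source schemas from metadata rather than hard-coded dictionaries.

```python
# Hypothetical mapping from source column types to warehouse types.
TYPE_MAP = {
    "varchar": "VARCHAR(256)",
    "int": "INTEGER",
    "timestamp": "TIMESTAMP",
}

def make_table_ddl(table, columns):
    """Render a CREATE TABLE statement from a {name: source_type} mapping."""
    cols = ",\n  ".join(
        f"{name} {TYPE_MAP.get(src_type, 'VARCHAR(256)')}"
        for name, src_type in columns.items()
    )
    return f"CREATE TABLE {table} (\n  {cols}\n);"

def make_copy_script(table, s3_path):
    """Render a Redshift-style COPY command loading the table from S3."""
    return f"COPY {table} FROM '{s3_path}' FORMAT AS PARQUET;"

# Illustrative source schema and load target.
ddl = make_table_ddl("employees", {"id": "int", "name": "varchar", "hired": "timestamp"})
load = make_copy_script("employees", "s3://example-bucket/employees/")
print(ddl)
print(load)
```

Generating DDL and COPY statements from one schema definition keeps table structure and load scripts from drifting apart as source schemas change.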
Requirements
- 5+ years of experience with Extract, Transform, Load (ETL)
- 2+ years of experience with the following:
- Apache Spark
- AWS Redshift
- Solid understanding of using AWS Simple Storage Service (S3)
- Strong SQL knowledge for testing and troubleshooting
- Per our Federal Government Contract, candidates must be U.S. Citizens and be able to pass the client's background check
Preferred Qualifications
- 2+ years of experience with the following:
- Python Programming Language
- Snowflake Data Warehouse
- AWS CloudWatch
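The SQL testing and troubleshooting skills listed above often come down to reconciliation checks after a load. Below is a hedged sketch of one such check; sqlite3 stands in for Redshift or Snowflake, and the table names and the simulated dropped row are illustrative only. In production, the mismatch would be logged (e.g. to CloudWatch) rather than printed.

```python
import sqlite3

# In-memory database standing in for the warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging_accounts (id INTEGER, balance REAL)")
cur.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
cur.executemany("INSERT INTO staging_accounts VALUES (?, ?)",
                [(1, 100.0), (2, 250.0), (3, 75.0)])
# Simulate a load that silently dropped a row.
cur.execute("INSERT INTO accounts SELECT * FROM staging_accounts WHERE id != 3")

def row_count(cursor, table):
    """Count rows in a table; the basic unit of load reconciliation."""
    return cursor.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

src = row_count(cur, "staging_accounts")
dst = row_count(cur, "accounts")
if src != dst:
    # A real pipeline would emit this to CloudWatch and fail the workflow run.
    print(f"Row count mismatch: staging={src} target={dst}")
```

Running the same count against source and target after every load catches silent data loss before downstream analytics consume the table.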