Vacancy expired!
Skills:
Databricks, Python, AWS & Redshift

Primary Job Duties & Responsibilities
- Perform analysis, design, development, and configuration functions, as well as define technical requirements for assignments of intermediate complexity.
- Participate with a team in the analysis, assessment, and resolution of defects and incidents of intermediate complexity, escalating as appropriate.
- Work within guidelines set by the team to independently tackle well-scoped problems.
- Seek opportunities to expand technical knowledge and capabilities.
Qualifications
- 5+ years of programming/development experience preferred
- Data Engineering experience
- Experience with data platform and ETL tools such as Databricks and Redshift
- Experience coding in Python and PySpark
- Experience with Python SDLC tools (flake8, commitizen, CircleCI)
- Comfortable working with APIs
- Cloud experience, specifically working with AWS (ECS, Redshift)
- Experience working with relational databases and SQL scripts
- (Nice-to-have) Experience training, evaluating, and deploying ML models