Vacancy expired!
- 4+ years developing in an Object-Oriented Programming language (Java preferred).
- 3+ years developing in Python (may be combined with the above requirement if the Python experience is strictly object-oriented).
- 2+ years working in Public Cloud environments (AWS or Google Cloud Platform required, others desirable).
- Working knowledge of CloudFormation.
- 4+ years working in Linux environments.
- 4+ years in Integration Engineering, CI/CD or related field.
- Practical experience setting up build pipelines.
- 4+ years working with Git and other SCM tools.
- Familiarity with specialized scripting/SDKs related to data management (e.g., Boto3, PySpark).
- Familiarity with key file formats: JSON, YAML, AVRO, Parquet.
- Strong familiarity with AWS environment.
- Self-motivated approach to learning and problem solving.
- Strong team communication.
- Experience working in a structured environment with a controlled software life cycle.
- Excellent writing and documentation skills; experience with Visio, Draw.io, LucidChart, or other diagramming tools is a plus.
- This role supports an Enterprise Data Management Team and includes general application development (Java, Python, etc.) for a multi-cloud data platform.
- Engineers in this role are expected to focus on automating deployment and integration tasks (CI/CD) and to drive cloud infrastructure management and automation for a Cloud Data Engineering team.
- The role offers the opportunity to work in Data Engineering and to grow into that role, in keeping with the team’s DevOps culture (full ownership of the entire SDLC, from architecture, design, and development through deployment and operational support).
- Supports Cloud/Big Data Engineers in coding and integrating data pipelines for automated deployment.
- Evaluates new tools and approaches to continuously evolve the team’s DevOps practice.
- Acts as an interface with other DevOps teams across the Enterprise.
- Participates in Cloud Data Engineering tasks.