Vacancy expired!
- Degree in a technology-related field (e.g. Engineering, Computer Science).
- 8+ years of experience implementing data engineering solutions in the data analytics space
- 3+ years of experience developing data applications in the cloud (AWS)
- Extensive experience with object-oriented programming (Python, Scala, Java), data movement technologies (ETL/ELT), integration technologies (SQS, SNS, Airflow, Step Functions), and analytics services (AWS Glue, Data Pipeline, EMR, Kinesis)
- Expertise in data warehousing (Snowflake) and data modeling techniques
- Experience with relational and NoSQL databases (DynamoDB, Elasticsearch, graph databases)
- Solid understanding of building highly scalable distributed systems using AWS services and open-source technologies
- Experience with DevOps, Continuous Integration and Continuous Delivery (Maven, Jenkins, Stash, Ansible, Chef, Docker)
- Solid experience with Agile methodologies (Kanban and Scrum)
- Experience with API and in-memory technologies is a plus
- Understanding of Machine Learning is a plus.
- Snowflake (used for data modeling and SQL)
- Python scripting – writing code in Python and deploying automation (all automation, not just DevOps automation).
- AWS data movement services – EMR, Kinesis, Glue
- Developing ETL processes (extraction, transformation and loading).
- Will be building processes to prepare data.
- Good experience with Snowflake and understanding of Python scripting.
- Hands-on experience in AWS (data movement technologies)
- Database background (Snowflake); must know how to model tables and how to curate and transform data for consumers.
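To illustrate the day-to-day work described above, here is a minimal sketch of an extract–transform–load process in Python. All names (the `orders` table, its columns, the sample CSV) are hypothetical; an in-memory SQLite database stands in for the Snowflake warehouse, and a string stands in for raw data that in practice would arrive via S3, Kinesis, or Glue.

```python
import csv
import io
import sqlite3

# Hypothetical raw source data; in a real pipeline this would be pulled
# from S3, Kinesis, or an upstream system rather than hard-coded.
RAW_CSV = """order_id,amount,currency
1001,19.99,usd
1002,5.00,USD
1003,,usd
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw CSV into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: drop incomplete records and normalize types and codes."""
    cleaned = []
    for row in rows:
        if not row["amount"]:
            continue  # skip rows with a missing amount
        cleaned.append({
            "order_id": int(row["order_id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"].upper(),
        })
    return cleaned

def load(rows: list[dict], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :currency)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")  # stand-in for the warehouse connection
load(transform(extract(RAW_CSV)), conn)
count, total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(count, round(total, 2))  # 2 24.99
```

The same extract/transform/load split maps directly onto the managed services named in the posting: extraction from Kinesis or S3, transformation in Glue or EMR, and loading into Snowflake tables modeled for downstream consumers.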