Vacancy expired!
- Advanced knowledge of application, data and infrastructure architecture disciplines
- Experience with AWS, Redshift and Snowflake
- Experience with security, isolation and multi-tenant design of distributed cloud services
- Understanding of RESTful API design best practices and experience in developing them
- Experience with Hadoop ecosystem technology stacks such as HDFS, HBase, Hive, Pig, Spark, MapReduce, Cloudera, etc.
- Experience using Eclipse/IntelliJ, Maven, Jenkins, Git, JIRA, Control-M, or equivalent tools
- Working knowledge of at least a few common frameworks, such as Spring, and Hibernate or similar ORM tools
- Experience with Cassandra and Kafka preferred
- Experience with ETL-based, high-volume real-time and batch application processing
- Experience developing large-scale enterprise applications using open-source Big Data solutions such as Hadoop, Spark, Kafka, and Elasticsearch
- Experience with Scala, Java and/or Python
- Hands-on experience with RDBMS (Oracle, MySQL) and NoSQL (Cassandra) databases
- Experience with Change Management and Incident Management processes