Vacancy expired!
- This position is responsible for developing, integrating, testing, and maintaining existing and new applications. It requires proficiency in one or more programming languages and familiarity with one or more development methodologies / delivery models.
- Provide subject matter expertise to Service Delivery and other Center teams, as required.
- Contribute to product feature prioritization and technology roadmaps.
- Traditional Development
- Design, build and unit test highly scalable applications.
- Provide maintenance support to applications as required, including support for incident escalations.
- Identify new technologies, trends, and opportunities.
- Participate in sprint planning, design, coding, unit testing, sprint reviews.
- Provide basic design documents and translate them into component-level designs to accelerate development.
- Design, develop, and distribute reusable technical components.
- Assist in developing technical documentation; participate in test-plan development, integration, and deployment.
- Define and develop project requirements, functional specifications, and detailed designs of application solutions for clients.
- Must have extensive hands-on experience designing, developing, and maintaining software solutions on Big Data platforms such as the Hadoop ecosystem.
- Must have strong UNIX shell scripting experience.
- Must have experience with an IDE such as Eclipse.
- Must have working experience with Spark and Scala/Python.
- Preferred: experience developing Pig scripts/HiveQL, HBase, Sqoop, and UDFs for analyzing semi-structured, unstructured, and structured data flows.
- Preferred: experience developing MapReduce programs that run on a Hadoop cluster using Java/Python.
- Not mandatory, but prior experience using Talend with Hadoop technologies is a big advantage.
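For candidates unfamiliar with the MapReduce model named in the requirements above, a minimal sketch of the map/reduce pattern in plain Python (no Hadoop cluster involved; function names here are illustrative, not part of any listed tool's API):

```python
from collections import defaultdict

def map_phase(lines):
    # Mapper: emit (word, 1) pairs for every word in the input,
    # mirroring what a MapReduce mapper does per input split.
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Reducer: sum the values grouped by key, mirroring what a
    # MapReduce reducer does after the shuffle/sort step.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

def word_count(lines):
    # End-to-end word count: map then reduce.
    return reduce_phase(map_phase(lines))

print(word_count(["big data", "data flows"]))
# → {'big': 1, 'data': 2, 'flows': 1}
```

On an actual cluster the shuffle and parallelism are handled by the framework (e.g. Hadoop or Spark); this sketch only shows the shape of the mapper and reducer logic a developer would write.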