Vacancy expired!
- Function as an integrator between business needs and technology, helping to create technology solutions that meet clients' requirements.
- Be responsible for developing and testing solutions that align with clients' systems strategy, requirements, and design, as well as supporting system implementation.
- Manage the data pipeline process from acquisition and ingestion through storage and provisioning of data to the point of impact, modernizing it and enabling new capabilities.
- Facilitate data integration across traditional and Hadoop environments by assessing clients' enterprise IT environments.
- Guide clients toward their future-state IT environment in support of their long-term business goals.
- Enhance business drivers through enterprise-scale applications that enable visualization, consumption, and monetization of both structured and unstructured data.
- Hadoop (Cloudera distribution)
- Spark with Scala or Python programming
- Experience in building Microservices using Java
- Hive tuning, bucketing, partitioning, UDFs, and UDAFs
- NoSQL databases such as HBase, MongoDB, or Cassandra
- Experience with Kafka, Spark Streaming, Sqoop, Oozie, Airflow, and Control-M
- Knowledge of the financial/insurance domain
- 6+ years of professional work experience
- Strong technical skills including understanding of software development principles
- Hands-on programming experience
- Current federal guidance requires that GPS professionals must be fully vaccinated against COVID-19 by December 8, 2021, unless legally entitled to an accommodation.
- 6+ years' experience working with the Big Data ecosystem, including tools such as Hadoop, Spark, MapReduce, Sqoop, HBase, Hive, and Impala
- Proficiency in one or more modern programming languages like Python or Scala
- Experience with data lake and data hub implementations
- Knowledge of AWS or Azure platforms
- Knowledgeable in techniques for designing Hadoop-based file layouts optimized to meet business needs
- Able to translate business requirements into logical and physical file structure design
- Ability to build and test solutions in an agile delivery manner
- Ability to articulate reasons behind the design choices being made
- Bachelor of Science in Computer Science, Engineering, or MIS, or equivalent experience
- Any big data certification is a plus
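For candidates reviewing the Hive bucketing and partitioning requirement above, a minimal stand-alone sketch of how bucketing assigns rows may be useful. The bucket count, the sample customer keys, and the use of CRC32 are illustrative assumptions only; Hive applies its own hash function (Murmur3 in recent versions) to the column declared in `CLUSTERED BY ... INTO N BUCKETS`:

```python
import zlib

NUM_BUCKETS = 4  # stands in for "CLUSTERED BY (customer_id) INTO 4 BUCKETS"

def bucket_for(key: str, num_buckets: int = NUM_BUCKETS) -> int:
    """Assign a row to a bucket by hashing the bucketing column modulo
    the bucket count, as Hive does for bucketed tables. CRC32 is a
    stand-in here; Hive uses its own hash function."""
    return zlib.crc32(key.encode("utf-8")) % num_buckets

# Group some sample keys into their buckets.
rows = ["cust-001", "cust-002", "cust-003", "cust-004"]
buckets: dict[int, list[str]] = {}
for key in rows:
    buckets.setdefault(bucket_for(key), []).append(key)
```

Because the hash is deterministic, every row with the same key always lands in the same bucket file, which is what makes bucket map joins and efficient sampling possible in Hive.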