Vacancy expired!
- Define technology/big data strategy and roadmaps for client accounts, and guide implementation of that strategy within projects.
- Strong understanding of Big Data Analytics platforms and ETL in the context of Big Data.
- Should be able to design complex, high-performance data architectures.
- Develop and maintain strong client relationships: develop new insights into the client's business model and pain points, and deliver actionable, high-impact results.
- Manage team members to ensure the project plan is adhered to over the course of the project.
- Manage client stakeholders and their expectations through a regular cadence of weekly meetings and status updates.
- Advise clients in the planning, design, management, execution, and reporting of the core components for big data offerings. Skills include architecture, tool selection, data lake design and implementation.
- Bachelor’s Degree in Computer Science, Information Systems, or equivalent
- Hands-on expertise in building and implementing data architectures for large enterprises.
- Java development experience
- Strong hands-on experience developing real-time data ingestion and analytics solutions using Hadoop, Kafka, NoSQL, and Spark Streaming.
- Understanding of microservices architectures, distributed systems and CI/CD
- Ability to write distributed data pipelines in Java/Scala and high-volume services in Java.
- Experience with distributed messaging and streaming technologies: Amazon SQS, Kinesis, Apache Kafka
- Proficient in writing complex SQL and data pipelines on Hive, core Spark, and Kafka, with shell scripting, using Java, Python, or Scala.
- Good knowledge of data modeling and database design involving any combination of data warehousing and business intelligence systems and tools.
- Hands-on technical experience in designing, coding, developing, and managing large Hadoop implementations.
- Experience implementing Hadoop solutions on cloud platforms using AWS EMR, Azure Databricks, and Google Dataproc.
- Extensive experience in the architecture, data modeling, design, development, data migration, and data integration aspects of the SDLC.
- Knowledge and experience in cloud architectures and cloud tools (Azure/Google Cloud Platform/AWS)
- Familiarity and experience with agile (Scrum) development processes.
- Strong experience with version control systems such as Git.
- Strong experience with NoSQL databases such as MongoDB and HBase.
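The real-time ingestion skills listed above (Kafka, Spark Streaming) center on one recurring pattern: consuming an unbounded stream of events and emitting incremental, windowed aggregates. A minimal, framework-free Python sketch of that pattern follows; the function name and the simulated event list are illustrative only, not part of any Kafka or Spark API:

```python
from collections import defaultdict

def aggregate_stream(events, window_size=3):
    """Consume an iterable of (key, value) events and yield one
    aggregate dict per fixed-size window -- the core shape of a
    Kafka consumer feeding micro-batch aggregations."""
    window = defaultdict(int)
    for i, (key, value) in enumerate(events, start=1):
        window[key] += value
        if i % window_size == 0:   # window boundary: emit and reset
            yield dict(window)
            window.clear()
    if window:                     # flush the final partial window
        yield dict(window)

# Simulated event stream; in practice these would arrive from a Kafka topic.
events = [("clicks", 1), ("views", 2), ("clicks", 1),
          ("views", 1), ("clicks", 3)]
batches = list(aggregate_stream(events, window_size=3))
# batches[0] aggregates the first three events; batches[1] the remainder.
```

In a production system a framework such as Spark Structured Streaming handles the windowing, fault tolerance, and distribution, but the per-window fold above is the logic a candidate would be expected to reason about.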