Vacancy expired!
ACS Group has an immediate need for a Big Data Engineer/Architect (5 openings) with experience in the investment/banking industry. This is a 6+ month contract opportunity with long-term potential. Please review the job description below:

Position: Big Data Engineer/Architect (5 openings)
Location: Houston, TX 77002; Columbus, OH; Jersey City, NJ
Duration: 6+ month contract to hire (Direct Hire)

Job Description:

Top Skills (minimum 3+ years of experience required):
- Streaming data applications (Spark Streaming, Kafka, Kinesis, and Flink)
- Data pipeline open source products (e.g., Airflow, Jupyter, etc.)
- Hadoop data platform (MapReduce, Pig, Hive, HBase, Impala); experience with the Cloudera stack is preferred.
- Expert in cloud services and design techniques, with experience working across large environments with multiple operating systems/infrastructures for large-scale programs; expert engineers are increasingly firm-wide resources working on projects across the organization.
- Multi-skilled, with expertise across the software development lifecycle and toolset.
- May be recognized as a leader in Agile practices and in cultivating teams working in Agile frameworks.
- Sought out as a coach for at least one technical skill.
- Strong understanding of techniques such as Continuous Integration, Continuous Delivery, Test-Driven Development, cloud development, resiliency, and security.
- Work with our LOB (line of business) users and collaborate with other technology teams to design, develop, and test full-stack cloud data solutions.
- Lead and be responsible for the craftsmanship, security, availability, resilience, and scalability of your solutions.
- Lead innovation, perform proofs of concept, and implement cutting-edge technologies.
- BS/BA degree or equivalent experience
- 5+ years of experience in application development using any of these programming languages: Java, Scala, Python. Experience working in the financial industry is preferred.
- 3+ years of experience working with Amazon Web Services (AWS), including entitlements and IAM policies. An AWS professional certification is preferred.
- 3+ years of experience working on streaming data applications (Spark Streaming, Kafka, Kinesis, and Flink)
- 3+ years' experience working with data pipeline open source products (e.g., Airflow, Jupyter, etc.)
- 3+ years' hands-on experience with the Hadoop data platform (MapReduce, Pig, Hive, HBase, Impala); experience with the Cloudera stack is preferred.
- 1+ years of experience with DevOps automation tools (AWS CloudFormation, Terraform)
- 1+ years of experience developing, deploying, and monitoring large distributed and parallel systems using container technologies including Docker, Kubernetes, AWS EKS, and AWS Fargate
- Lead by influence, mentor junior software engineers, and navigate across different teams and businesses to find solutions to complex problems