Job Details

ID #14389591
State Texas
City Dallas / Fort Worth
Job type Contract
Salary USD $56.24 - $61.24 per hour
Source Randstad Corporate Services
Showed 2021-05-27
Date 2021-05-27
Deadline 2021-07-26
Category Et cetera

Hadoop Developer

Dallas / Fort Worth, Texas 75202, USA

Vacancy expired!

job summary:

Randstad Technologies is hiring and we're looking for someone like YOU to join our team! If you are seeking a new opportunity, looking to grow in your career, or you know someone who is - we want to hear from you! Take a look at the below opportunity, or feel free to visit RandstadUSA.com to view and apply to any of our open roles.

location: Dallas, Texas

job type: Contract

salary: $56.24 - $61.24 per hour

work hours: 8am to 5pm

education: Bachelor's

responsibilities:

Design and build a high-performing, scalable data pipeline platform using Hadoop, Apache Spark, MongoDB, and object storage architecture.

Design and build data services on container-based architectures such as Kubernetes and Docker.

Partner with enterprise data teams, such as Data Management & Insights and the Enterprise Data Environment (Data Lake), to identify the best place to source data.

Work with business analysts, development teams, and project managers to define requirements and business rules.

Collaborate with source system and approved provisioning point (APP) teams, architects, data analysts, and modelers to build scalable, performant data solutions.

Work effectively in a hybrid environment where legacy ETL and data warehouse applications coexist with new big-data applications.

Work with infrastructure engineers and system administrators, as appropriate, to design the big-data infrastructure.

Work with DBAs to troubleshoot problems and optimize performance.

Support ongoing data management efforts for the development, QA, and production environments.

Utilize a thorough understanding of available technology, tools, and existing designs.

Leverage knowledge of industry trends to build best-in-class technology that provides a competitive advantage.

Act as an expert technical resource to programming staff in the program development, testing, and implementation process.

- Software engineering experience

- ETL (Extract, Transform, Load) Programming experience

- Agile experience

- Hadoop experience

- Java or Python experience

- Design and development experience with columnar databases using Parquet or ORC file formats on Hadoop

- Apache Spark design and development experience using Scala, Java, or Python with DataFrames or Resilient Distributed Datasets (RDDs) (a minimal sketch follows this list)

- Experience integrating with the IBM BPM RESTful API

- Nice to have - Operational risk, conduct risk, or compliance domain experience

- Nice to have - Experience with containers: Docker, Kubernetes

- Nice to have - Experience with Amazon Web Services (AWS), Azure, object storage, Elastic Compute Cloud, on-demand compute, etc.
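
For illustration only, since the role centers on Spark DataFrames and columnar formats: a minimal PySpark sketch that reads raw data, applies a simple transformation, and writes Parquet. The paths, column names, and application name below are assumptions made for the example, not details from this posting.

  # Minimal PySpark batch job: raw CSV in, aggregated Parquet out.
  # All paths and column names are illustrative assumptions.
  from pyspark.sql import SparkSession
  from pyspark.sql import functions as F

  spark = (
      SparkSession.builder
      .appName("example-columnar-pipeline")  # hypothetical app name
      .getOrCreate()
  )

  # Read raw CSV landed on HDFS or object storage (path is an assumption).
  raw = spark.read.option("header", "true").csv("hdfs:///landing/transactions/")

  # DataFrame transformation: cast, filter, and aggregate.
  daily = (
      raw.withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("amount") > 0)
         .groupBy("account_id", "txn_date")
         .agg(F.sum("amount").alias("daily_total"))
  )

  # Persist in a columnar format (Parquet), partitioned by date.
  daily.write.mode("overwrite").partitionBy("txn_date").parquet("hdfs:///curated/daily_totals/")

  spark.stop()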

qualifications:

  • Experience level: Experienced
  • Minimum 7 years of experience
  • Education: Bachelor's

skills:
  • Hadoop

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
