Job Details

ID #19596869
State Illinois
City Chicago
Job type Contract
Salary USD Depends on Experience
Source Make Corporation
Showed 2021-09-14
Date 2021-09-13
Deadline 2021-11-11
Category Et cetera

Big Data Developer - No third parties, please

Chicago, Illinois 60290, USA

Vacancy expired!

Big Data Developer #15570

Location: Chicago, IL

Duration: 3 Months+

Job Description:
  • This position is responsible for developing, integrating, testing, and maintaining existing and new applications. It requires proficiency in one or more programming languages and one or more development methodologies / delivery models, along with domain expertise in healthcare.
  • This position requires extensive data and integration experience.
  • An integration or data architecture background is preferred, but not required.

Required Job Qualifications:
  • Bachelor's degree and 5 years of Information Technology experience, OR technical certification and/or college courses and 7 years of Information Technology experience, OR 9 years of Information Technology experience. Master's degree in a technical subject preferred but not required.
  • Ability to manage workload, multiple priorities, and conflicts with customers, employees, and managers, as applicable. Ability to direct and manage a team of integration designers, developers, and testers in building large-scale, complex integrations across a modern data ecosystem.
  • Must have extensive hands-on experience designing, developing, and maintaining software solutions on a Hadoop cluster, with extensive integration experience (ETL, message-based, streaming, and API styles of integration) and data warehousing experience, preferably with tools such as Talend Data Integration, Talend Big Data Migration Platform Edition 6.2.1, or comparable toolsets.
  • Talend is the preferred tool for data integration.
  • If you have extensive experience with another tool, you are expected to transfer those skills to the Talend tools within 30-60 days.
  • Must have strong experience with UNIX shell scripting, Sqoop, Eclipse, and HCatalog.
  • Must have experience with NoSQL databases such as HBase, MongoDB, Cosmos DB, graph databases, or Cassandra.
  • Must have experience developing Pig scripts, HiveQL, and UDFs for analyzing semi-structured, unstructured, and structured data flows.
  • Must have working experience developing MapReduce programs that run on the Hadoop cluster, using Java or Python.
  • Experience with Spark and Scala, or another JVM-based language, along with data integration experience.
  • Must have working knowledge of cloud computing infrastructure (e.g. Amazon Web Services EC2, Azure) and considerations for scalable, distributed systems
  • Must demonstrate integration best practices, with a focus on Talend.
  • Must have extensive experience working with version control tools such as Git and SVN.
  • Hands-on experience with PCF using the Talend suite.
  • Experience implementing complex business rules in Talend by creating reusable transformations and robust mappings/mapplets. Experience loading data and troubleshooting, debugging, and tuning Talend mappings.
  • Experience designing and developing complex mappings using varied transformation logic, such as Unconnected and Connected Lookups, Source Qualifier, Sorter, Normalizer, Sequence Generator, Router, Filter, Expression, Aggregator, Joiner, Rank, Update Strategy, Stored Procedure, XML Source Qualifier, and Input and Output transformations.
  • Expertise in data modeling concepts, including dimensional modeling and star and snowflake schemas. Experience with CDC and daily load strategies for data warehouses and data marts, slowly changing dimensions (Type 1, Type 2, and Type 3), surrogate keys, and general data warehouse concepts.
  • Hands-on experience in performance tuning of Talend and Informatica ETL, integrations, queries, and jobs.
  • Demonstrates broad knowledge of technical solutions, design patterns, and code for medium-to-complex applications deployed in Hadoop production. Additionally, experience designing for sustainability: minimizing the impact on operations crews in the event of system outages, unexpected data, etc., and engineering the code base for straightforward extensions and expansions.
  • Must have working experience with data warehousing and business intelligence systems. Additionally, experience building in-line data quality analysis into integration flows, and experience working with metadata across the integration landscape in support of data governance and operational needs.
  • Participate in design reviews, code reviews, unit testing and integration testing.
  • Assume ownership and accountability for the assigned deliverables through all phases of the development lifecycle.
  • SDLC Methodology (Agile / Scrum / Iterative Development).
  • System performance management.
  • Systems change / configuration management.
  • Business requirements management.
  • Problem solving /analytical thinking.
  • Creative thinking.
  • Ability to execute.

Preferred Job Qualifications:
  • Master’s Degree in Computer Science or Information Technology.

