Job Details

ID: #23791320
State: Arizona
City: Gilbert
Job type: Permanent
Salary: USD TBD
Source: Deloitte
Showed: 2021-12-03
Date: 2021-12-02
Deadline: 2022-01-31
Category: Internet engineering

Virtual Senior Python, PySpark, Scala Developer Consultant

Gilbert, Arizona 85233, USA

Vacancy expired!

Are you an experienced, passionate pioneer in technology - a solutions builder, a roll-up-your-sleeves technologist who wants a collaborative, think-tank environment where you can share new ideas with colleagues every day - without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center - we are breaking the mold of a typical Delivery Center.

Our US Delivery Centers have been growing since 2014, with significant, continued growth on the horizon. Interested? Read more about our opportunity below.

Work you'll do/Responsibilities

Function as an integrator between business needs and technology solutions, helping to create technology solutions that meet clients' requirements. Develop and test solutions that align with clients' systems strategy, requirements, and design, and support system implementation. Manage the data pipeline process from acquisition through ingestion, storage, and provisioning of data to the point of impact by modernizing capabilities and enabling new ones. Facilitate data integration on traditional and Hadoop environments by assessing clients' enterprise IT environments. Guide clients to a future-state IT environment that supports their long-term business goals. Enhance business drivers through enterprise-scale applications that enable visualization, consumption, and monetization of both structured and unstructured data.
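
As a rough illustration of the acquisition-to-provisioning flow described above, here is a minimal PySpark sketch; the paths, column names, and table names are hypothetical assumptions, not details from the posting.

    # Minimal sketch of an acquisition -> ingestion -> storage -> provisioning flow.
    # All paths, columns, and table names below are illustrative assumptions.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("ingestion-sketch").getOrCreate()

    # Acquisition/ingestion: read raw files from a (hypothetical) landing zone.
    raw = spark.read.option("header", "true").csv("/landing/transactions/*.csv")

    # Light standardization before storage.
    cleaned = (
        raw.withColumn("ingest_date", F.current_date())
           .dropDuplicates(["transaction_id"])  # assumed business key
    )

    # Storage: write a partitioned Parquet layout into the data lake.
    cleaned.write.mode("append").partitionBy("ingest_date").parquet("/lake/transactions")

    # Provisioning: expose a queryable view for downstream consumers.
    spark.read.parquet("/lake/transactions").createOrReplaceTempView("transactions_curated")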

The Team

From our centers, we work with Deloitte consultants to design, develop, and build solutions to help clients reimagine, reshape, and rewire the competitive fabric of entire industries. Our centers house a multitude of specialists, ranging from systems designers, architects, and integrators to creative digital experts to cyber risk and human capital professionals. All work together on diverse projects, from advanced preconfigured solutions and methodologies to brand-building and campaign management.

We are a unique blend of skills and experiences, yet we underline the value of each individual, providing customized career paths, fostering innovation and knowledge development with a focus on quality. The US Delivery Center supports a collaborative team culture where we work and live close to home with limited travel.

Qualifications

Required

  • Current federal guidance requires that GPS professionals be fully vaccinated against COVID-19 by December 8, 2021, unless legally entitled to an accommodation.
  • Bachelor of Science in Computer Science, Engineering, or MIS, or equivalent experience
  • 6+ years of Hadoop (Cloudera distribution) experience
  • 6+ years of experience in Spark with Scala or Python programming
  • 6+ years of experience with Hive tuning, bucketing, partitioning, UDFs, and UDAFs (partitioning and bucketing are illustrated in the sketch after this list)
  • 6+ years of experience with NoSQL databases such as HBase, MongoDB, or Cassandra
  • 6+ years of experience working with Kafka, Spark Streaming, Sqoop, Oozie, Airflow, Control-M, Presto, NoSQL, and SQL
  • 6+ years of experience in the financial/insurance domain
  • 6+ years of strong technical skills, including an understanding of software development principles
  • 6+ years of hands-on programming experience
  • Must live within a commutable distance of one of the following cities: Atlanta, GA; Austin, TX; Boston, MA; Charlotte, NC; Chicago, IL; Cincinnati, OH; Cleveland, OH; Dallas, TX; Detroit, MI; Gilbert, AZ; Houston, TX; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Los Angeles, CA; Mechanicsburg, PA; Miami, FL; McLean, VA; Minneapolis, MN; Nashville, TN; Orange County, CA; Philadelphia, PA; Phoenix, AZ; Pittsburgh, PA; Rosslyn, VA; Sacramento, CA; St. Louis, MO; San Diego, CA; Seattle, WA; Tallahassee, FL; Tampa, FL; or be willing to relocate to one of the following USDC locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA.
  • Limited Immigration sponsorship may be available.
  • Ability to travel up to 15% (while 15% travel is a requirement of the role, non-essential travel has been suspended due to COVID-19 until further notice)
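
As referenced in the Hive requirement above, here is a minimal PySpark sketch of writing a partitioned, bucketed table; the table, path, and column names are hypothetical assumptions.

    # Minimal sketch: Hive-style partitioning and bucketing from PySpark.
    # Table, path, and column names are illustrative assumptions.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("bucketing-sketch")
             .enableHiveSupport()
             .getOrCreate())

    df = spark.read.parquet("/lake/transactions")  # assumed curated dataset

    # Partitioning prunes whole directories at query time; bucketing pre-clusters
    # rows by key so joins and aggregations on that key can avoid a full shuffle.
    (df.write
       .partitionBy("ingest_date")      # coarse-grained pruning column
       .bucketBy(8, "transaction_id")   # assumed join key; bucketBy requires saveAsTable
       .sortBy("transaction_id")
       .mode("overwrite")
       .saveAsTable("transactions_bucketed"))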

Preferred

  • 6+ years of experience working with the Big Data ecosystem, including tools such as MapReduce, Sqoop, HBase, Hive, and Impala
  • Expert-level proficiency with Jenkins and GitHub
  • Experience with data lake and data hub implementations
  • Knowledge of AWS or Azure platforms
  • Knowledgeable in techniques for designing Hadoop-based file layouts optimized to meet business needs
  • Able to translate business requirements into logical and physical file structure design
  • Ability to build and test solutions in an agile delivery manner
  • Ability to articulate reasons behind the design choices being made
  • Any big data certification is a plus

