Job Details

ID: #43588722
State: Georgia
City: Dunwoody
Job type: Permanent
Salary: USD TBD
Source: State Farm
Showed: 2022-06-24
Date: 2022-06-23
Deadline: 2022-08-22
Category: Et cetera

Data Engineer

Dunwoody, Georgia 30338, USA

Vacancy expired!

Overview

We are not just offering a job but a meaningful career! Come join our passionate team!

As a Fortune 50 company, we hire the best employees to serve our customers, making us a leader in the insurance and financial services industry. State Farm embraces diversity and inclusion to ensure a workforce that is engaged, builds on the strengths and talents of all associates, and creates a Good Neighbor culture.

We offer competitive benefits and pay with the potential for an annual financial award based on both individual and enterprise performance. Our employees have an opportunity to participate in volunteer events within the community and engage in a learning culture. We offer programs to assist with tuition reimbursement, professional designations, employee development, wellness initiatives, and more!

Visit our Careers page for more information on our benefits, locations and the process of joining the State Farm team!

What's In It for You
  • Work with cutting-edge technologies and business models.
  • Contribute to brand-new, patent-worthy concepts and products.
  • Be part of small, self-empowered teams.
  • Participate in customizable skill-level and personal development training.
  • Opportunity to identify, research, and feed the development of new and experimental products.
  • Freedom to utilize different technologies, languages, and frameworks that apply to the problem being solved.
  • Influence and inform solution design efforts that consider performance, risk mitigation, user experience, and testability.
  • Participate in Design Thinking to identify personas, develop problem-solving ideas, and pitch ideas to leadership as a team.
  • Competitive Benefits, Pay, and Bonus Potential.
  • STEM Mentoring Opportunities: Give back to the community in your area of expertise through volunteering at STEM events for students!
  • Local volunteer opportunities.
  • 401k plan
  • A Learning Culture: Mentoring, Tuition Reimbursement, Health Initiatives, and more!

Office Location: State Farm hub in Dunwoody, GA

Hybrid Work Environment: Selected applicants should plan to spend some time working from home and some time working in the office as part of our flexible work environment.

SPONSORSHIP: Applicants are required to be eligible to lawfully work in the U.S. immediately; the employer will not sponsor applicants for U.S. work authorization (e.g., H-1B visa) for this opportunity.

Responsibilities

Duties and Responsibilities:
  • Applies skills, tools, security processes, applications, environments, and programming language(s) to complete complex assignments.
  • Understands, develops, and maintains data movement scripts for storing, retrieving, or acting on data housed in the AWS Cloud (see the sketch after this list).
  • Tests requirements for the movement, replication, synchronization, and validation of data.
  • Identifies ways to automate and improve upon existing automation.
  • Develops and improves monitoring solutions.
  • Is willing to take on special assignments that may require additional learning.
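
To illustrate the kind of data movement script described above, here is a minimal sketch that copies a relational table to S3 as Parquet. The connection string, table name, bucket, and key are illustrative assumptions, not details from the posting, and the Parquet step assumes pyarrow is installed.

```python
# Minimal sketch of a data movement script: copy a Postgres table to S3 as Parquet.
# All connection details, names, and paths below are illustrative placeholders.
import pandas as pd
import boto3
from sqlalchemy import create_engine

def copy_table_to_s3(conn_url: str, table: str, bucket: str, key: str) -> None:
    """Read a table into a Pandas data frame and land it in S3 as Parquet."""
    engine = create_engine(conn_url)
    df = pd.read_sql_table(table, engine)            # extract from the source database
    local_path = f"/tmp/{table}.parquet"
    df.to_parquet(local_path, index=False)           # stage locally as Parquet (needs pyarrow)
    boto3.client("s3").upload_file(local_path, bucket, key)  # load into the data lake bucket

if __name__ == "__main__":
    copy_table_to_s3(
        "postgresql+psycopg2://user:password@host:5432/exampledb",  # placeholder DSN
        table="policies",
        bucket="example-data-lake",
        key="raw/policies/policies.parquet",
    )
```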

Qualifications

Skillsets for the role should include:

Required:
  • Understanding of programming (e.g., Python) and database functionality (e.g., SQL, NoSQL)
  • Understanding of compute environments, including but not limited to Linux, mainframe, and public cloud
  • Understanding of Application Programming Interfaces (APIs)
  • Data validation and qualitative and quantitative analysis
  • Strong understanding of database technologies such as IBM DB2, Postgres, and AWS RDS, Redshift, and Aurora
  • Certifications in AWS Cloud technologies

Experience in:
  • Advanced, hands-on Python and SQL
  • Linux with strong Bash scripting; Python with Pandas data frames; and Spark with PySpark or Scala Spark
  • AWS Cloud, including Glue, Lambda, DMS, Step Functions, Redshift, and API Gateway
  • Building and using CI/CD pipelines, leveraging tools like GitLab CI/CD
  • SAS or R
  • Python with Pandas data frames
    • Pandas Profiling
  • Spark with PySpark or Scala Spark, with a data-frame-driven development style (illustrated in the sketch after this list)
  • SQL clients, e.g., DBeaver, WinSQL, pgAdmin, SQL Workbench
  • Jupyter Notebooks
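
As a rough illustration of the data-frame-driven Spark style listed above, the sketch below reads Parquet from S3, applies a simple validation filter, and writes the result back partitioned for downstream loads. The bucket paths and column names are assumptions for the example only.

```python
# Minimal PySpark sketch: data-frame-driven read, validate, and write.
# Bucket paths and column names are illustrative assumptions only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("policy-validation").getOrCreate()

# Read raw records landed by an upstream data movement job.
raw = spark.read.parquet("s3://example-data-lake/raw/policies/")

# Simple data-validation rule: keep rows with a policy id and a non-negative premium.
valid = raw.filter(F.col("policy_id").isNotNull() & (F.col("premium") >= 0))

# Write validated records partitioned by state for downstream warehouse loads.
valid.write.mode("overwrite").partitionBy("state").parquet(
    "s3://example-data-lake/validated/policies/"
)

spark.stop()
```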

