Job Details

ID #45686869
State North Carolina
City Durham
Job type Permanent
Salary USD TBD
Source Adroit Software, Inc.
Showed 2022-09-14
Date 2022-09-13
Deadline 2022-11-11
Category Et cetera

Data Engineer

Durham, North Carolina 27709, USA

Vacancy expired!

For a financial client, we need a Data Engineer. This position is based in Durham, NC. We are primarily looking for W2 candidates and are not considering third-party candidates.

The Role

We have an outstanding technology opportunity in the Data Engineering Chapter to deliver modern data solutions to the Fidelity CIO Tech Group. As Senior Data Engineer, you will own the design and development of data services in support of the Enterprise Data Lakes and Analytics Platform. In this role you will set the technical direction for the team and work closely with our data architect to craft secure, scalable, resilient cloud-based services. As the most senior technical member of your team, you will guide other specialists and help develop their technical skills.

The Expertise we're looking for

  • Bachelor's or Master's in a technology-related field (e.g., Computer Science, Engineering) required.
  • 6+ years of related experience in data engineering, analysis, data warehouses, and data lakes. Specialist understanding of and experience with methodologies such as data warehousing, data visualization, and data integration.
  • Experience with IBM DB2 on Unix (AIX, Solaris, HP-UX, Linux) is essential.
  • Strong experience with relational database technologies (Oracle SQL and PL/SQL or a similar RDBMS), preferably including Snowflake data warehousing services.
  • Strong expertise in all aspects of data movement technologies (ETL/ELT) and experience with schedulers.
  • Practical experience delivering and supporting Cloud strategies including migrating legacy products and implementing SaaS integrations.
  • Crafted and implemented operational data stores, as well as data lakes in production environments.
  • Experience with DevOps, Continuous Integration and Continuous Delivery. Developing and deploying pipelines. Deploying within a cloud native infrastructure would be advantageous.
  • Ability to collaborate with a geographically distributed team.
The Skills you bring
  • Consistent track record of working in collaborative teams to deliver high-quality data solutions in a multi-developer agile environment, following design and coding standards and methodologies.
  • Outstanding SQL skills and experience performing deep data analysis on multiple database platforms.
  • Ability to develop ELT/ETL pipelines to move data to and from the Snowflake data store using a combination of Python and Snowflake SnowSQL.
  • Knowledge and expertise in data modeling techniques and standard methodologies (Relational, Dimensional), plus any prior experience with data modeling tools (e.g., PowerDesigner).
  • Prior experience with data ingestion tool sets (e.g., Apache NiFi, Kafka) is advantageous.
  • Experience working with AWS, MS Azure, or other cloud providers. Experience with AWS services such as Lambda or S3; AWS certification is a plus.
  • Data Architecture experience (Database design, performance optimization).
  • Prior experience in setting up reliable infrastructure (Hardware, Scalable data management systems, and frameworks) to perform data-related tasks, particularly with Kafka.
  • Understanding of the basics of distributed systems and Kubernetes.
  • Strong focus on resiliency and reliability.
  • You have excellent written and oral communication skills.
  • Nice to have: Scripting/coding experience in any of the following: Python, Unix, Java.
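For candidates unfamiliar with the ELT/ETL work described above, the pattern can be sketched in a few lines of Python. This is a minimal, illustrative example only: it uses SQLite as a stand-in for Snowflake (the posting names Python and SnowSQL but provides no connection details), and the table and field names are hypothetical. A real implementation would use the snowflake-connector-python package and SnowSQL statements such as COPY INTO.

```python
import sqlite3

# Illustrative ETL sketch. SQLite stands in for Snowflake here; the
# "trades" table and its fields are hypothetical examples, not part of
# the client's actual schema.

def extract():
    """Extract: pull raw records (here, a hard-coded sample batch)."""
    return [
        {"trade_id": 1, "symbol": "abc", "amount": "1200.50"},
        {"trade_id": 2, "symbol": "xyz", "amount": "340.00"},
    ]

def transform(rows):
    """Transform: normalize symbols to uppercase and cast amounts to float."""
    return [(r["trade_id"], r["symbol"].upper(), float(r["amount"])) for r in rows]

def load(conn, rows):
    """Load: bulk-insert the transformed rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS trades "
        "(trade_id INTEGER PRIMARY KEY, symbol TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    """Run extract -> transform -> load, then return the loaded rows."""
    load(conn, transform(extract()))
    return conn.execute(
        "SELECT symbol, amount FROM trades ORDER BY trade_id"
    ).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    print(run_pipeline(conn))  # [('ABC', 1200.5), ('XYZ', 340.0)]
```

In production, each stage would typically be scheduled (the posting mentions experience with schedulers) and the load step would target a cloud warehouse rather than a local database.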
