Job Details

ID #8572656
State North Carolina
City Charlotte
Job type Permanent
Salary USD TBD TBD
Source Bank of America
Showed 2021-01-25
Date 2021-01-24
Deadline 2021-03-25
Category Architect/engineer/CAD

Data Engineer (Software Engineer III), Enterprise Finance Technology

North Carolina, Charlotte, 28201 Charlotte USA

Vacancy expired!

Job Description:

This is an exciting opportunity to be at the forefront of developing new data engineering methods of transporting, manipulating and conforming data for business consumption. Corporate Investments Data Warehouse (CIDW) and Transaction Hub (THUB) are data warehouses supported by the Data Horizontal team within Risk & Finance Technology. CIDW is both a general-purpose LOB data store and calculation engine supporting the needs of Corporate Treasury (Corporate Investments, Global Funding, Finance, Market Risk, etc.) and the Enterprise Authorized Data Source (ADS) for Cash & Cash Equivalents, Intercompany Loans and Long Term Debt. THUB is a BofA data store of Intrader (3rd-party hosted SOR) position and transactional data. Agile teams consist of a scrum master, developers, data quality analysts and data/business analysts who support front office, middle office, market risk and finance users in collecting, transforming, loading and reporting end-of-day and intraday fixed income and derivative trading positions, as well as other financial data.

The role is for a Data Engineer on one of the CIDW agile teams. As a Data Engineer, you will be expected to help the team craft data solutions to meet business and enterprise requirements. While our core stack is currently Informatica / Oracle / SQL, we are exploring new methods of moving data. Candidates experienced with Big Data Technologies such as Hadoop, Kafka, Spark, Hive, NiFi and with Python and/or Java are strongly encouraged to apply.

MINIMUM QUALIFICATIONS:

  • 5+ years of development experience in Oracle, SQL Server, Netezza, or another industry-accepted database platform
  • 3+ years in Data Warehouse / Data Mart / Business Intelligence delivery
  • 1+ years with an industry ETL tool (preferably Informatica PowerCenter)
  • 2+ years of Linux / shell scripting (e.g. Bash, Perl, Python)
  • 2+ years of Python (e.g. Pandas, DataFrames) and its use in data processing solutions
  • Experience with an enterprise job scheduling tool (e.g. Autosys, Airflow)
  • Proven experience designing and building integrations supporting standard data warehousing models (star schema, snowflake) and different normal forms
  • Strong analytical and problem-solving skills
  • Passion for working with data
  • Experience with one or more of the following:
    • Java and its use in implementing web services and data processing solutions
    • Developing data pipeline solutions with Python-based methodologies rather than industry-standard ETL tools
    • Modern job orchestration tools for data pipelines, such as Airflow
    • Integrating rules engines (e.g. Sapiens) into data pipeline workflows
    • Big Data and/or emerging data technology tools and methodologies
    • Kafka, Sqoop, Spark, NiFi
    • Data wrangling tools such as Alteryx and/or Trifacta
    • Data visualization tools such as Tableau and/or MicroStrategy
  • Bachelor's degree in STEM related field
  • Ability to present technical concepts to senior-level business stakeholders
  • Excellent communication skills, verbal and written
  • Self-motivated
  • Excellent interpersonal skills, positive attitude, team player
  • Willingness to learn and adapt to change
  • Experience working in a global technology development model
  • Ability to manage multiple deadline-driven, customer-sensitive projects and tasks
  • Knowledge of agile methodology and frameworks like Scrum, Kanban, etc.
  • Experience working in a SAFe Agile delivery model

Desired Skills:
  • 10+ years Data Engineering experience
  • Banking / Capital Markets / Accounting domain knowledge
  • Experience automating QA tests as part of the development workflow
  • Experience creating low-level and high-level design artifacts
  • Advanced degree

Bank of America, Global Technology & Operations:
  • Believes diversity makes us stronger so we can reflect, connect and meet the diverse needs of our clients and employees around the world.
  • Is committed to building a workplace where every employee is welcomed and given the support and resources to perform their jobs successfully.
  • Wants to be a great place for people to work and strives to create an environment where all employees have the opportunity to achieve their goals.
  • Provides continuous training and development opportunities to help employees achieve their career goals, whatever their background or experience.
  • Is committed to advancing our tools, technology, and ways of working to better serve our clients and their evolving business needs.
  • Believes in responsible growth and is dedicated to supporting our communities by connecting them to the lending, investing and giving they need to remain vibrant and vital.

Shift: 1st shift (United States of America)

Hours Per Week: 40


