Job Details

ID #21718802
State California
City Glendale
Job type Full-time
Salary USD TBD
Source Public Storage
Showed 2021-10-26
Date 2021-10-25
Deadline 2021-12-24
Category Et cetera

Analytics Engineer - ML

Glendale, CA 91201, USA

Vacancy expired!

Job Description

We are currently looking for an Analytics Engineer to join our Machine Learning Data & Analytics practice in Glendale, CA. The Analytics Engineer acts as the bridge between Data Engineers and Data Scientists, building data products that support analytics use cases by expanding and optimizing our data warehouse and data pipeline architecture. The ideal candidate is an experienced data pipeline builder and data modeler who enjoys optimizing data systems and building them from the ground up, collaborating with business stakeholders and internal Data Analytics team members. This role will have significant ownership in the design and implementation of the future analytics data warehouse for Public Storage.

Responsibilities

  • Maintain our data warehouse with timely and quality data
  • Build and maintain data pipelines from internal databases and APIs
  • Create and maintain architecture and systems documentation
  • Drive initiatives around architecture design and implementation
  • Plan and execute system expansion as needed to support the company's growth and analytic needs
  • Collaborate with Data Engineers and Data Scientists to drive efficiencies for their work
  • Collaborate with other functions to ensure data needs are addressed
  • Improve, manage, and teach standards for code maintainability and performance in code submitted and reviewed

Qualifications

  • Bachelor's/Master's degree in a STEM field such as computer science, math, or physics, or in business with strong technical acumen, required
  • 3+ years of hands-on experience deploying production-quality code
  • Demonstrably deep understanding of SQL and analytical data warehouses - we use BigQuery
  • Strong data modeling skills and familiarity with the Kimball methodology
  • Hands-on experience implementing ETL (or ELT) best practices at scale
  • Hands-on experience with data pipeline tools (Airflow, Luigi, Azkaban, dbt) – we use Airflow and dbt
  • Professional experience using Python for data processing
  • Knowledge of and experience with data-related Python packages
  • Experience with software engineering best practices like version control and using Git
  • Experience with cloud environments (e.g., AWS, GCP) is a plus
  • Desire to continually keep up with advancements in engineering practices
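For candidates unfamiliar with the Kimball methodology mentioned above, a minimal sketch of the core idea follows: denormalized source records are split into a dimension table (with surrogate keys) and a fact table that references it. All table and field names here are illustrative examples, not Public Storage's actual schema.

```python
# Hypothetical sketch of Kimball-style dimensional modeling:
# split flat rental records into one dimension table and one fact table.
# Field names (facility_id, revenue_usd, ...) are invented for illustration.

def build_star_schema(records):
    """Assign surrogate keys to unique facilities (the dimension) and
    emit fact rows that reference facilities only by surrogate key."""
    facility_keys = {}          # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for r in records:
        nk = r["facility_id"]
        if nk not in facility_keys:
            # First time we see this facility: add a dimension row.
            facility_keys[nk] = len(facility_keys) + 1
            dim_rows.append({"facility_key": facility_keys[nk],
                             "facility_id": nk,
                             "city": r["city"]})
        # Every record yields a fact row keyed to its dimension entry.
        fact_rows.append({"facility_key": facility_keys[nk],
                          "rental_date": r["rental_date"],
                          "revenue_usd": r["revenue_usd"]})
    return dim_rows, fact_rows

records = [
    {"facility_id": "F1", "city": "Glendale", "rental_date": "2021-10-01", "revenue_usd": 120.0},
    {"facility_id": "F1", "city": "Glendale", "rental_date": "2021-10-02", "revenue_usd": 95.0},
    {"facility_id": "F2", "city": "Burbank",  "rental_date": "2021-10-01", "revenue_usd": 110.0},
]
dims, facts = build_star_schema(records)
print(len(dims), len(facts))  # → 2 3
```

In production this transformation would typically live in dbt models orchestrated by Airflow rather than hand-written Python, but the shape of the output is the same: one conformed dimension per business entity, with facts joining to it on surrogate keys.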

#REITjobs

Additional Information

All your information will be kept confidential according to EEO guidelines.


