- Partner with technical and non-technical colleagues to understand data and reporting requirements.
- Work with engineering teams to collect required data from internal and external systems.
- Design table structures and define ETL strategy to build performant data solutions that are reliable and scalable in a fast-growing data ecosystem.
- Develop data quality checks and visualizations/dashboards.
- Develop and maintain ETL routines using ETL and orchestration tools such as Airflow and NiFi.
- Implement database deployments using tools like Liquibase.
- Perform ad hoc analysis as necessary.
- Perform SQL and ETL tuning as necessary.
- Develop and maintain dashboards/reports using Looker.
- Degree in an analytical field such as economics, mathematics, or computer science, required.
- 3+ years of relevant data engineering experience, required.
- 2+ years of work experience implementing and reporting on business key performance indicators in data warehousing environments, required.
- 2+ years of experience using analytic SQL, working with traditional relational databases and/or distributed systems (Snowflake or Redshift), required.
- 1+ years of experience with programming languages (e.g., Python, PySpark), preferred.
- 1+ years of experience with data orchestration/ETL tools (Airflow, NiFi), preferred.
- Strong understanding of data modelling principles, including dimensional modelling and data normalization.
- Good understanding of SQL engines and the ability to conduct advanced performance tuning.
- Familiarity with data exploration / data visualization tools like Tableau, Looker, Chartio.
- Ability to think strategically, analyse and interpret market and consumer information.
- Strong communication skills, both written and verbal.
- Excellent conceptual and analytical reasoning competencies.
- Comfortable working in a fast-paced and highly collaborative environment.
- Familiarity with Agile Scrum principles and ceremonies.