Vacancy expired!
- Designing and implementing highly performant data ingestion pipelines from multiple sources using Azure Databricks
- Integrating the end-to-end data pipeline, taking data from source systems to target data repositories while ensuring data quality and consistency are maintained at all times
- Working with event-based / streaming technologies to ingest and process data
- Working with other members of the project team to support the delivery of additional project components (API interfaces, Search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Strong knowledge of Data Management principles
- Experience in building ETL / data warehouse transformation processes
- Direct experience in building data pipelines using Azure Data Factory and Databricks
- Hands-on experience designing and delivering solutions using the Azure Data Analytics platform (Cortana Intelligence Platform), including Azure Storage, Azure SQL Data Warehouse, Azure Data Lake, Azure Cosmos DB, and Azure Stream Analytics
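The ETL / data-pipeline pattern the role describes (extract from a source system, transform with quality checks, load into a target repository) can be illustrated with a minimal sketch. This is plain Python for readability, not Azure Data Factory or Databricks code; in practice the same shape would be expressed with Spark DataFrames. All names here (`extract`, `transform`, `load`, `SOURCE_ROWS`) are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical raw records from a source system, with typical quality issues.
SOURCE_ROWS = [
    {"id": "1", "amount": " 10.50 ", "country": "uk"},
    {"id": "2", "amount": "not-a-number", "country": "US"},
    {"id": "3", "amount": "7", "country": "de"},
]

@dataclass
class Order:
    id: int
    amount: float
    country: str

def extract(rows):
    """Extract: read raw records from the source system."""
    return list(rows)

def transform(raw):
    """Transform: normalise fields and quarantine rows that fail quality checks."""
    clean, rejected = [], []
    for row in raw:
        try:
            clean.append(Order(
                id=int(row["id"]),
                amount=float(row["amount"].strip()),
                country=row["country"].upper(),
            ))
        except (KeyError, ValueError):
            # Quarantine bad records instead of failing the whole run,
            # so data quality issues are visible but non-fatal.
            rejected.append(row)
    return clean, rejected

def load(orders):
    """Load: write to the target repository (here, an in-memory 'table')."""
    return {o.id: o for o in orders}

raw = extract(SOURCE_ROWS)
clean, rejected = transform(raw)
table = load(clean)
print(len(table), len(rejected))  # 2 rows loaded, 1 quarantined
```

Keeping the quarantine explicit is the design choice the bullet about "quality and consistency of data" points at: bad rows are isolated and countable rather than silently dropped or allowed to abort the pipeline.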