Vacancy expired!
- Experience in data integration activities, including the architecting, designing, coding, and testing phases
- Architect the data warehouse and provide guidance to the team in implementation using Snowflake SnowSQL and other big data technologies
- Hands-on experience with Snowflake utilities, SnowSQL, SnowPipe, Big Data model techniques using Python
- Experience in performance tuning of Snowflake pipelines and the ability to troubleshoot issues quickly
- Extensive experience with relational as well as NoSQL data stores, methods, and approaches (star and snowflake schemas, dimensional modeling)
- Understanding of data transformation and translation requirements, and the ability to suggest tools to leverage to get the job done
- Understanding of data pipelines and modern ways of automating and testing them using cloud-based implementations, and the ability to clearly document the requirements to create technical and functional specs
- Possesses strong leadership skills with a willingness to lead, generate ideas, and be assertive.
- Perform performance tuning, application support, and user acceptance training
- Identify process improvement opportunities.
- Able to maintain confidentiality of sensitive information.
- Document and communicate risk assessments pertaining to new functionality and enhancements.
- Collect, analyze, and report data for early detection, correction, and continual improvement.
- Recognize and attend to important details with accuracy and efficiency.
- Engage with the onsite-offshore team for daily activities; report status on a weekly and monthly basis.
- Model new features and subject areas and integrate them with existing structures to provide a consistent solution; develop and maintain documentation of the data architecture, data flows, and data models.
- Previous experience with Azure PaaS technologies, Databricks
- Take ownership of technical solutions from design and architecture perspective, ensure the right direction and propose resolution to potential data pipeline-related problems.
- Expertise in advanced Snowflake concepts such as setting up resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and time travel, and an understanding of how to use these features
- Should have a minimum of 12 years of IT experience.
- At least 5 years of experience in designing and implementing a fully operational solution on Snowflake Data Warehouse
- Experience with Python and a major relational database.
- Experience with version control and understanding of version control concepts.
- Understanding of RESTful API design
- Passion for industry best practices and computer programming
- Excellent understanding of Snowflake Internals and integration of Snowflake with other data processing and reporting technologies
- Good presentation and communication skills, both written and verbal
- Ability to problem-solve and convert requirements into designs
- Ability to troubleshoot issues as and when they arise.
- Ability to test the developed jobs and prepare test documents
- Work experience optimizing the performance of Spark jobs
- Previous experience with end-to-end implementation of a Snowflake cloud data warehouse, or end-to-end on-premises data warehouse implementations
- Experience in Data Migration from RDBMS to Snowflake cloud data warehouse
- Experience with Azure data storage and management technologies such as ADLS
- Must have expertise in Azure Platform as a Service (PaaS).
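For context on the advanced Snowflake concepts named in the requirements above (resource monitors, virtual warehouse sizing, zero-copy cloning, time travel), these map to Snowflake SQL statements like the ones sketched below. The statements are shown as plain Python strings for readability; all object names (`reporting_mon`, `etl_wh`, `orders`, `orders_backup`) are hypothetical examples, not part of the role description.

```python
# Illustrative Snowflake SQL for the advanced concepts listed above.
# Object names are hypothetical; statements are built as plain strings.

# Resource monitor: cap monthly credit spend and suspend at a threshold.
create_monitor = (
    "CREATE RESOURCE MONITOR reporting_mon "
    "WITH CREDIT_QUOTA = 100 FREQUENCY = MONTHLY "
    "START_TIMESTAMP = IMMEDIATELY "
    "TRIGGERS ON 90 PERCENT DO SUSPEND"
)

# Virtual warehouse sizing: resize compute without moving any data.
resize_wh = "ALTER WAREHOUSE etl_wh SET WAREHOUSE_SIZE = 'LARGE'"

# Zero-copy clone: metadata-only copy of a table, no data duplication.
clone_table = "CREATE TABLE orders_backup CLONE orders"

# Time travel: query the table as it existed one hour (3600 s) ago.
time_travel = "SELECT * FROM orders AT(OFFSET => -3600)"

for stmt in (create_monitor, resize_wh, clone_table, time_travel):
    print(stmt)
```

In practice these would be executed through a Snowflake session (e.g. the SnowSQL CLI or a connector), with quotas, sizes, and retention offsets chosen per workload.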