Vacancy expired!
- DB2 on Linux, not on Mainframe: the role involves stored procedures and Db2 running on Linux. No COBOL code is involved. The database technology is DB2 on Linux.
- Experience with IBM DB2 on Unix (AIX, Solaris, HP-UX, Linux) is essential.
- Very strong PL/SQL and Oracle SQL experience.
- DB2 on Linux and Snowflake knowledge with cloud platform experience.
- Bachelor's or Master's in a technology-related field (e.g. Computer Science, Engineering) required.
- 6+ years of related experience in data engineering, analysis, data warehouses, and data lakes. Specialist understanding of and experience with methodologies such as data warehousing, data visualization, and data integration.
- Strong experience with relational database technologies (Oracle SQL & PL/SQL or similar RDBMS), preferably Snowflake Data warehousing services.
- Strong expertise in all aspects of data movement technologies (ETL/ELT) and experience with schedulers.
- Practical experience delivering and supporting Cloud strategies including migrating legacy products and implementing SaaS integrations.
- Experience designing and implementing operational data stores, as well as data lakes, in production environments.
- Experience with DevOps, Continuous Integration and Continuous Delivery, including developing and deploying pipelines. Deploying within a cloud-native infrastructure would be advantageous.
- Collaborate with a geographically complementary team.
- Consistent track record of working in collaborative teams to deliver high quality data solutions in a multi-developer agile environment following design & coding standard methodologies.
- Outstanding SQL skills and experience performing deep data analysis on multiple database platforms.
- Ability to develop ELT/ETL pipelines to move data to and from the Snowflake data store using a combination of Python and Snowflake SnowSQL.
- Knowledge and expertise of data modeling techniques and standard methodologies (Relational, Dimensional), plus any prior experience with data modeling tools (e.g. PowerDesigner).
- Prior experience with data ingestion toolsets (e.g. Apache NiFi, Kafka) is advantageous.
- Experience working with AWS, MS Azure, or other cloud providers. Experience with AWS services such as Lambda or S3; AWS certification is a plus.
- Data Architecture experience (Database design, performance optimization).
- Prior experience in setting up reliable infrastructure (Hardware, Scalable data management systems, and frameworks) to perform data-related tasks, particularly with Kafka.
- Understanding of the basics of distributed systems and Kubernetes.
- Strong focus on resiliency and reliability.
- You have excellent written and oral communication skills.
- Nice to have: Scripting/coding experience in any of the following: Python, Unix shell, Java.
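As an illustration of the Snowflake ELT requirement above (moving data with Python and SnowSQL), the sketch below builds the kind of `COPY INTO` statements a loader would execute. This is a minimal, hypothetical example: the table and stage names are invented, and in practice the statements would be run via the snowflake-connector-python library or the snowsql CLI.

```python
# Illustrative sketch only: SnowSQL statement builders for a simple ELT load.
# Table, stage, and file-format names are hypothetical, not from the posting.

def build_copy_into(table: str, stage: str, file_format: str = "CSV") -> str:
    """Build a COPY INTO statement to load staged files into a Snowflake table."""
    return (
        f"COPY INTO {table} FROM @{stage} "
        f"FILE_FORMAT = (TYPE = {file_format})"
    )

def build_unload(table: str, stage: str) -> str:
    """Build a COPY INTO @stage statement to unload a table back to a stage."""
    return f"COPY INTO @{stage} FROM {table}"

if __name__ == "__main__":
    # In a real pipeline these strings would be passed to cursor.execute()
    # on a snowflake-connector-python connection.
    print(build_copy_into("SALES_RAW", "my_s3_stage"))
    print(build_unload("SALES_RAW", "my_s3_stage"))
```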