Vacancy expired!
- Total of 12+ years of IT experience, predominantly in the Data Integration/Data Warehouse area
- Must have 7 years of ETL Design and Development experience using Ab Initio
- 1-2 years of Data Integration project experience on the Hadoop platform, preferably Cloudera
- At least one project implementation of Ab Initio CDC (Change Data Capture) in a Data Integration/ETL project
- Working knowledge of HDFS, Hive, Impala, and other related Hadoop technologies
- Sound understanding of SQL and the ability to write well-performing SQL queries
- Good knowledge of OLTP and OLAP data models and other data warehouse fundamentals
- Rigor in high code quality, automated testing, and other engineering best practices; ability to write reusable code components
- Ability to unit test the code thoroughly and to troubleshoot issues in production environments
- Must have some working experience with Unix/Linux shell scripting
- Must be able to work independently and support other developers as needed
- Some Java development experience is nice to have
- Knowledge of Agile Development practices is required
- Participate in solution architecture and solution design meetings and provide inputs
- Work with Data Analysts and Product team to gather technical data requirements
- Transform the data to create a consumable data layer for various application uses
- Support the data pipeline with bug fixes and additional enhancements
- Lead/guide a team of Ab Initio developers and other data engineers
- Perform code reviews and help with standardization of ETL processes
- Document technical design, operational runbooks, etc.