Vacancy expired!
Qualifications:
- Must be able to work on a W-2
- Strong understanding of Data warehousing (Dimensional Modeling, ETL etc.) and RDBMS concepts
- 5 years of experience with:
- ETL tools such as Talend, Informatica, DataStage, etc.
- SQL, stored procedures, and table design
- SQL query optimization and ETL data-loading performance
- 2 years of experience with:
- Hadoop platform components such as Hive, Kafka, NiFi, Spark, etc.
- Shell scripting
- Experience with:
  - Snowflake cloud technologies (preferred)
  - Real-time streaming technologies (preferred)
  - Deploying machine learning models and automating processes in production (a plus)
  - Cloud technologies (AWS, Azure, GCP) (a big plus)
Responsibilities:
- Design, develop, and maintain secure, consistent, and reliable ETL solutions supporting critical business processes across the various business units
- Ensure data solutions are compliant with enterprise security standards
- Work in complex multi-platform environments on multiple project assignments
- Develop and execute tests, validate data flows, and prepare ETL processes to meet complex business requirements, including designing and implementing end-to-end solutions using BI platforms
- Coordinate with analysts and developers to ensure that jobs are designed and developed to meet minimum support standards and best practices before migration to the production environment
- Partner with infrastructure teams, application teams, and architects to design processes and complex transformations of data elements that provide the business with insight into its processes
- Use strategies such as indexing and partitioning to fine-tune the data warehouse and big data environments, improving query response time and scalability
- Define and capture metadata and rules associated with ETL processes
- Assist the production support team in resolving production job failures and data issues and in performance tuning