Vacancy expired!
Your Opportunity
At Schwab, the Data and Rep Technology (DaRT) organization governs the strategy and implementation of the enterprise data warehouse, Data Lake, and emerging data platforms. Our mission is to drive activation of data solutions, rep engagement technology (Sales, Marketing and Service) and client intelligence to achieve targeted business outcomes, address data risk and safeguard our competitive edge. We help Marketing, Finance, Risk and executive leadership make fact-based decisions by integrating and analyzing data.

As part of the Business Data Delivery team, you will partner with our business stakeholders and Data Engineering team to design and develop data solutions for data science, analytics and reporting. We are a team of passionate data engineers and SMEs who bring energy, focus and fresh ideas in support of our mission to provide value by seeing the world "Through Clients' Eyes". ETL Developers work with large teams, including onshore and offshore developers, using best-in-class technologies including Teradata, Informatica, Hadoop and BigQuery. You will design, develop and implement enterprise data integration solutions, with opportunities to grow in responsibility, work on exciting and challenging projects, train on new technologies and collaborate with other developers to shape the future of the Data Warehouse.

What you are good at
- Developing and implementing new data ingestion workflows independently through practical application of existing and new data engineering techniques
- Developing data ingestion workflows across a wide variety of data sources and ingestion patterns, such as batch, near real-time and real-time
- Working with business analysts to understand business requirements, new data requirements and use cases
- Crafting and updating ETL specifications and supporting documentation
- Working with Data Modelers and cross-functional teams to ensure accurate, efficient implementation of requirements and adherence to standards
- Defining and executing quality assurance and test scripts
- Ensuring consistency with published development, coding and testing standards
What you have
- Demonstrated ability to work independently as an ETL Developer, with a track record of delivering code with minimal defects
- 1-3 years of hands-on experience with data integration tools such as Informatica PowerCenter and Talend
- 1-3 years of experience with Data Warehouse platforms such as Teradata and Big Data/Hadoop
- At least 1 year of experience in data modeling (logical and/or physical)
- At least 1 year of hands-on experience working with near real-time and/or real-time data ingestion techniques
- SQL experience with the ability to develop, tune and debug complex SQL applications is required
- Experience with Google Cloud Platform, BigQuery and Informatica Intelligent Cloud Services (IICS) highly desirable
- Experience with scheduling tools (e.g., Control-M, ESP)
- Ability to quickly learn and become proficient with new technologies
- Strong analytical, problem-solving, influencing, prioritization, decision-making and conflict resolution skills
- Exceptional interpersonal skills, including teamwork and communication