Vacancy expired!
- Data, Big Data tools, and new machine learning technologies
- Logical/physical database design, development, analysis, architecture, and modeling
- Designing and developing large scale applications utilizing Big Data tech
- Engineering trade-offs, with an ability to understand the impact of software changes on extensibility, scalability, performance, and maintainability
- 5+ years of experience with Big Data tools, ideally Google Cloud Platform Big Data tools
- Must be able to code in SQL (SQL scripting, not MS SQL specifically)
- Must have experience working with on-premises and cloud data platforms, sometimes both at once
- Data platform experience with DB2, Netezza, Teradata, or similar
- Experience programming in Java or Python for a data platform; not looking for a Java application developer, but for someone with data transformation experience
- 3 years of experience with the Hadoop stack
- 1 year of experience with Apache Beam
- Experience working with Google Cloud Platform services such as Dataflow, Bigtable, BigQuery, and Cloud Storage buckets
- Experience in architecting multi-tier, distributed database applications
- Experience with Kafka or Pub/Sub, SQL programming, and performance tuning
- Design, architect, and build a data platform using a variety of Big Data technologies
- Work closely with the team to analyze and develop data architecture: ETL processes, ERD modeling, and physical database implementation with Google Cloud Platform data services (BigQuery, Bigtable, Dataflow)
- Design, develop and roll out new application features that impact databases
- Develop and maintain an in-depth understanding of the data/ETL architecture and the general application functionality used to maintain data integrity
- Develop Dataflow jobs to answer complex analytical and real-time operational questions
- Innovate by exploring, recommending, benchmarking, and implementing data-centric platform technologies
- Provide hardware architectural guidance, estimate cluster capacity, and create roadmaps for Hadoop or Big Data cloud services
- Provide support for both analytics and operational platforms
- Work closely with team members, including IT managers, to deliver defect-free solutions in a timely manner; update work status frequently (as often as daily)
- Follow and improve upon processes and policies for database application development methodologies and lifecycles
- Work on multiple projects at a time either independently or as a team member
- Work with developers and business owners to provide database needs for the entire company platform
- Oversee the development and release of solutions to non-production environments
- Are you willing to collaborate with some of the best Java architects to establish platform standards when new technologies are introduced in the company platform?
- Are you curious and want to continually investigate new technologies and their possible application to the company's business requirements?