Vacancy expired!
- Bachelor's degree in computer science, information technology, software engineering, or a related field, or equivalent experience
- 5 years of experience building data solutions
- 3 years of experience with big data solutions
- 2 years of experience building data solution frameworks
- 4 years of experience with Hadoop ecosystem tools
- Master's degree in computer science, information technology, software engineering, or a related field
- Proficiency in the Microsoft Office suite
- Knowledge of NoSQL modeling techniques and semi-structured data formats, including Avro and Parquet
- Knowledge of general-purpose programming languages, including Java, Python, C/C++ and Scala, as well as Unix/Linux scripting skills
- Knowledge of Hadoop ecosystem tools
- Architect scalable and secure big data solutions to provide visibility and decision support to business end users
- Ensure data solutions are built for performance, fault tolerance and security with reusable patterns
- Define technology capabilities required to develop scalable applications for data science and data platform goals
- Design integration patterns with data source platforms
- Identify and enable data science capabilities across the enterprise
- Promote best practices, guide the team in following them, and incorporate team feedback to improve those practices
- Design data schemas and processes to transform and analyze data streams
- Partner with business and technology to deliver road maps
- Work with a team to build a holistic view of the organization's data strategy, processes, information and information technology assets
- Define best practices and internal guidelines
- Research new technologies and use cases
- Mentor engineers
- Capture program metrics