Vacancy expired!
- Responsible for architecting and engineering data pipelines for an enterprise data platform, democratizing datasets and enabling advanced analytics
- This role participates in client projects across the organization and defines logical/physical data architectures that create the foundation for application development and data analytics, delivering enhanced process automation and insights
- Make key architectural decisions, create technical designs and define technical enablers, ensuring appropriate technologies are effectively leveraged
- Work closely with the Data Science, Engineering, and leadership teams to define and deliver the roadmaps
- Lead the definition of architecture and of high-level and low-level designs for our data science and analytics platform, in support of the platform's consumers
- Work with stakeholders to define the data science technology roadmap
- Expected to be a leader in defining MLOps, DevOps, and CI/CD practices
- Assess the data architecture currently in place and work with technical staff to improve it
- Monitor and analyze industry technology trends, internal and external business challenges, and regulatory issues to determine their potential impact on or application to the organization, and apply that knowledge to architecture designs
- Work with engineering teams to collect required data from internal and external systems
- Coach and mentor team members to improve their designs and ETL processes
- Work side by side with other architects and independently drive projects from inception and specification through execution and launch
- Must be hands on and willing to code
- 5+ years of experience in a technical lead or architect role
- B.S. or M.S. degree in a computing, data, or data science related field from a top US university
- 8+ years of experience in large-scale data engineering and management, with responsibility for delivering data migration initiatives
- Understanding of cloud technology (preferably AWS: EC2, S3, Lambda, EMR, ECS, API Gateway, Glue) and Big Data platforms (Hadoop: Sqoop, Hue, Impala, Hive, HBase)
- Modern Data Management and Architecture skills to manage and process complex datasets across multiple sources.
- Experience developing and maintaining data roadmaps, data standards, data domain structures and defining data asset types.
- Experience with large-scale Big Data and AI/ML platform engineering, with recent coding experience in one or more of the following languages: Java, Python, Scala
- Modern Data Platforms - Expertise in data storage concepts and structures (relational/non-relational databases, data lakes, etc.)
- Experience working with NoSQL and graph database technologies (DynamoDB, MongoDB, MarkLogic, MariaDB, Neptune, TigerGraph)
- Design platforms as consumable data services across the organization using the Big Data tech stack
- Must be self-motivated as well as creative and efficient in proposing solutions to sophisticated, time-critical problems
- Demonstrates decision-making skills through consistent logic and rational, objective thinking and reasoning