Job Details

ID #23748352
State Nevada
City Las Vegas
Job type Contract
Salary USD, Depends on Experience
Source Photon Infotech
Showed 2021-12-02
Date 2021-12-01
Deadline 2022-01-30
Category Et cetera

Data Architect (Day 1 Onsite)

Las Vegas, Nevada 89101, USA

Vacancy expired!

Greetings, everyone!

Who are we? For the past 20 years, we have powered many Digital Experiences for the Fortune 500. Since 1999, we have grown from a few people to more than 4,000 team members across the globe, engaged in various Digital Modernization efforts. For a brief one-minute video about us, see https://youtu.be/uJWBWQZEA6o.

What will you do?
  • Provide technical leadership in Information and Data Architecture, working with Enterprise Data Architects and leveraging the TOGAF architecture methodology, with oversight of Domain Modeling, Logical Data Modeling, and Physical Data Model implementation.
  • Apply modern data management toolsets and coding methods to design, build, implement, and optimize data solutions of all types – including data warehouses, data lakes, ODS, streaming data, analytic and BI/visualizations, etc.
  • Translate business issues and needs into data and system requirements, and architect the management of data assets and their flow through the enterprise.
  • Architect and design Data Services (DaaS) for data consumption and manipulation throughout the Data Ecosystem, applying the “contract first” design principle and including the use of APIs, Microservices, Microbatch, ELT Pipelines, and other methods.
  • Transform legacy data structures and processes into modern, capable, and secure solutions in a hybrid cloud setup.
  • Apply Data Engineering & Design best practices to architect solutions, using a deep understanding of various data formats and database design approaches
  • Architect Data Storage solutions for OLTP (CRM, etc.) systems, Analytics Platforms, Data Lakes, and Data Warehouses, using Relational Database and Object Storage methods tailored to best fit the needs.
  • Architect a best-practice data ingestion framework for batch and real-time data flows; develop tooling to increase scale, accuracy, and automation in data pipelines and to integrate with decisioning, AI/NLP, and consuming systems.
  • Architect Data Catalogue and Metadata Management
  • Architect and Model for Master Data Management.
  • Architect for various Analytics Methods, including Descriptive, Diagnostic, Predictive, and Prescriptive, and Capabilities including Real-time Analytics, Advanced Analytics, and Machine Learning (ML/AI/NLP).
  • Work with Enterprise Architects and Information Security Architects to design a highly secure data platform ecosystem by designing controls and protection strategies.
  • Enable application performance and modernization by creating appropriate data capabilities to match.
  • Determine best-of-breed tools and technologies, leveraging CNCF-backed Open Source, Managed Solutions, and Engineered Solutions where applicable.

What are we looking for?
  • 5+ years of experience in data analysis, engineering, architecture, and operations roles, including experience with transformational efforts.
  • Strong database skills with RDBMS (e.g., Oracle, SQL Server) as well as modern relational and unstructured data sources (such as NoSQL), including cloud services (AWS/Google Cloud Platform/Azure). Hands-on experience using these tools is strongly preferred.
  • Experience with tools such as (or similar to): the Hadoop stack, Airflow, Kafka, NiFi, PostgreSQL, Oracle, SQL Server, Elasticsearch (ELK); JSON, Parquet, Avro, and other data storage formats; Tableau, Superset, and other visualization tools; Apache Atlas and other data-centric Apache packages.
  • Extensive Knowledge of Design Patterns for Software and Data Engineering.
  • Experience coding with Java, JavaScript (Node.js), Python, Go, Rust, and similar languages.
  • Experience in on-prem and hybrid cloud infrastructure, including service and cost optimization
  • Experience with production and analytics data, both batch and real-time/streaming.
  • Experience in regulated industries preferred (such as financial services, insurance, healthcare, etc.)
  • Familiarity with optimization tools and techniques, including Bayesian modelling and a variety of machine learning techniques.
  • Ability to manage large programs and projects is essential.

