Job Details

ID #45975868
State Georgia
City Atlanta
Job type Contract
Salary USD Depends on Experience
Source Donato Technologies Inc
Showed 2022-09-25
Date 2022-09-23
Deadline 2022-11-21
Category Et cetera

GCP Data Solutions Architect

Atlanta, Georgia 30301, USA

Vacancy expired!

Job Title: GCP Data Solutions Architect

Location: Atlanta GA

Duration: Long-Term Contract

Job Responsibilities:

• Develop a comprehensive data architecture capable of supporting various data types (structured, semi-structured, unstructured) at different ingestion speeds (scheduled, streaming) and analytics needs, from reporting to machine learning, based on the client's unique business requirements
• Implement future-state data management architecture designs and roadmaps for Mindtree clients
• Provide initiative-specific data architecture solution options with supporting recommendations
• Develop a deep understanding of the client's business domain data and how it is used for metrics and analytical solutions
• Define information/data flows and drive the technical data quality process
• Support client product, development, and delivery teams on functional and technical design specifications and activities
• Define the technical design for solutions, design POCs and prototypes, and guide solutions through the SDLC
• Provide subject matter expertise around data infrastructure, analytical output, and applications

Requirements:

• 15+ years of professional work experience as a database engineer, administrator, or designer
• 5+ years of expertise in the data lifecycle: data migration, data validation, data cleansing, and data modeling
• 2+ years of experience with private and public cloud architectures, their pros and cons, and migration considerations
• Advanced experience with cloud computing (GCP/AWS/Azure/Snowflake)
• Experience with non-relational data technologies such as Cassandra, MongoDB, AWS Redshift, etc.
• Experience with digital transformation and application modernization
• Implementation and tuning experience in the big data ecosystem, including Hadoop, Scala, Hive, Pig, Spark, HBase, HDFS, YARN, Sqoop, NiFi, Storm, Impala, Hawk, Flume, Kafka, etc.
• Experience developing data pipelines with tools such as AWS Glue, Talend, Informatica, or MS Data Factory
• Excellent problem-solving skills, verbal/written communication, and the ability to explain technically complex architecture to business stakeholders
• Ability to engage with executive-level stakeholders from the client team to effectively translate business problems into infrastructure designs
• Strong project management and team management skills; ability to work with a global team
• Experience securing IaaS/PaaS private or hybrid cloud and DevOps environments
• Understanding of security controls dealing with PCI, HIPAA, PHI, PII, locational data restrictions, etc.
• Ability to think independently, understand complex business requirements, and render them as prototype systems with quick turnaround
• Ability to deliver high-quality solutions to clients
• Provide thought leadership in design and solutioning for given client requirements

Preferences:

• Big data platform certification (GCP/AWS/Azure/Snowflake)
• Deep understanding of application and cloud security

