Job Details

ID: #19903249
State: Connecticut
City: Hartford
Job type: Permanent
Salary: USD TBD
Source: SmartIT Frame
Showed: 2021-09-19
Date: 2021-09-19
Deadline: 2021-11-17
Category: Et cetera

Big Data GCP Engineer

Hartford, Connecticut 06101, USA

Vacancy expired!

Position: Big Data - Cloud Engineer (Google Cloud)
Location: Hartford, CT

1. Job Title: Big Data - Cloud Engineer (Google Cloud)

2. Job Summary:

a. Hands-on experience with Google Cloud Platform: Dataproc, Dataflow, Airflow DAGs, and Cloud Composer (a minimal pipeline sketch follows this list).

b. Hands-on experience with Google Cloud Platform DevOps tools and services.

c. Hands-on experience with GCP Compute, Storage, and Security components.

d. Hands-on experience setting up cloud platforms for specific use cases.

e. Experience migrating workloads from on-premises to cloud, and between clouds.

f. Strong programming skills in Java and Python.

g. Strong Unix/Linux shell scripting skills.

h. Good to have: knowledge of VPCs, private/public subnets, network security groups, and firewalls.

i. Good to have: knowledge of Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps.

j. Good to have: experience using cloud-provided Big Data services (e.g., Cloud Storage, Databricks, Dataproc, Pub/Sub, BigQuery, Dataflow).

k. Strong analytical, problem-solving, data analysis, and research skills.

l. Demonstrable ability to think outside the box rather than depend on readily available tools.

m. Excellent communication, presentation, and interpersonal skills.
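The posting itself contains no code; as a rough illustration of the Dataproc/Airflow/Cloud Composer work described in item a, the sketch below shows a minimal Airflow 2.x DAG that submits a PySpark job to an existing Dataproc cluster. The project, region, cluster, and bucket names are hypothetical placeholders, not details from the posting.

# Illustrative sketch only: a minimal Cloud Composer (Airflow 2.x) DAG that
# submits a PySpark job to an existing Dataproc cluster. All identifiers
# below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocSubmitJobOperator,
)

PROJECT_ID = "my-gcp-project"    # hypothetical project id
REGION = "us-east1"              # hypothetical region
CLUSTER_NAME = "etl-cluster"     # hypothetical Dataproc cluster

PYSPARK_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "pyspark_job": {"main_python_file_uri": "gs://my-bucket/jobs/transform.py"},
}

with DAG(
    dag_id="daily_dataproc_pipeline",
    start_date=datetime(2021, 9, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Submit the PySpark job; Composer handles scheduling and retries.
    submit_transform = DataprocSubmitJobOperator(
        task_id="submit_transform",
        project_id=PROJECT_ID,
        region=REGION,
        job=PYSPARK_JOB,
    )

In Cloud Composer this file would simply be uploaded to the environment's DAGs bucket; the scheduler picks it up and runs it on the declared daily schedule.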

3. Shift: N/A

4. Roles & Responsibilities:

a. Big Data - Cloud Engineer (Google Cloud) - 3 openings.

b. Hands-on work on Google Cloud Platform: Dataproc, Dataflow, Airflow DAGs, and Cloud Composer.

c. Hands-on work with Google Cloud Platform DevOps tools and services.

d. Hands-on experience with GCP Compute, Storage, and Security components.

e. Hands-on experience setting up cloud platforms for specific use cases.

f. Experience migrating workloads from on-premises to cloud, and between clouds.

g. Strong programming skills in Java and Python.

h. Strong Unix/Linux shell scripting skills.

i. Good to have: knowledge of VPCs, private/public subnets, network security groups, and firewalls.

j. Good to have: knowledge of Big Data technologies (Hadoop, Sqoop, Hive, and Spark), including DevOps.

k. Good to have: experience using cloud-provided Big Data services (e.g., Cloud Storage, Databricks, Dataproc, Pub/Sub, BigQuery, Dataflow); a minimal load-job sketch follows this list.

l. Strong analytical, problem-solving, data analysis, and research skills.

m. Demonstrable ability to think outside the box rather than depend on readily available tools.

n. Excellent communication, presentation, and interpersonal skills.
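As a rough illustration of the cloud-provided Big Data services mentioned in item k, the sketch below loads a CSV file from Cloud Storage into a BigQuery table using the google-cloud-bigquery Python client. The project, dataset, table, and bucket names are hypothetical placeholders, not details from the posting.

# Illustrative sketch only: load a CSV file from Cloud Storage into BigQuery
# with the google-cloud-bigquery client. All identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

table_id = "my-gcp-project.analytics.events"        # hypothetical table
source_uri = "gs://my-bucket/exports/events.csv"    # hypothetical file

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,     # skip the header row
    autodetect=True,         # infer the schema from the data
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

# Start the load job and block until it completes.
load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
load_job.result()

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")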
