Job Details

ID #20230070
State California
City San Francisco
Job type Contract
Salary USD $150,000 - $170,000 /yr
Source Stefanini
Showed 2021-09-25
Date 2021-09-24
Deadline 2021-11-22
Category Et cetera

Data Engineer

San Francisco, California 94105, USA

Vacancy expired!

Stefanini is looking for a Data Engineer in San Francisco, CA.

Responsibilities:
- Design, develop, and operationalize large-scale enterprise data solutions using AWS data and analytics services: S3, Glue, Athena, Lambda, Redshift, EMR, Spark, DynamoDB
- Analyze, re-design, and re-platform on-premise data solutions from Cloudera Hadoop platforms to the AWS-native data stack
- Design, develop, and deploy data pipelines from ingestion to consumption within a big data architecture using Java, Python, or Scala
- Participate in discussions on architecture, design, and product development to clearly understand business requirements and translate them into technical solutions
- Work on one or more projects as a technical team member, taking responsibility for complete user stories through analysis, design, development, and testing per project timelines
- Develop and maintain scripts to automate batch jobs
- Support and maintain existing applications, troubleshooting and resolving technical issues
- Follow appropriate technical best practices and internal processes, complying with various information security controls
- Maintain existing data solutions running on the Hadoop stack (HDFS, Oozie, Impala, Hive), handling enhancements until the cloud migration

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or another relevant field
- 7+ years of software development experience related to data engineering
- Hands-on experience in Java, Python, or Scala, with the ability to understand and write complex SQL queries
- Proficient in S3, Glue, Athena, Lambda, Redshift, EC2, EMR, Spark, and DynamoDB with Java, Python, Scala, or PySpark
- Experience deploying software solutions to the cloud platform through CI/CD in a DevOps model
- General understanding of application and data security concepts, with exposure to AWS IAM, CloudTrail, CloudWatch, AWS Config, Secrets Manager, and KMS
- Hands-on experience with Hadoop-related technologies such as HDFS, Oozie, Impala, and Hive
- Nice to have: experience with Ansible, Terraform, or CloudFormation scripts to develop or support Infrastructure as Code
- Nice to have: experience with RDBMS and data warehouses
- Nice to have: experience with Machine Learning technologies and data visualization tools
- Experience working with Hadoop-based big data architecture and solutions
- Experience working in an Agile development environment using Agile tools such as Jira, Rally, etc.
- Experience with UNIX commands and shell scripts
- Ability to communicate effectively, collaborate, and work in a team environment while delivering high-quality work independently

