Vacancy expired!
Make a difference: Ciber Global wants you. Come build new things with us and advance your career. At Ciber Global you'll collaborate with experts and join successful teams that contribute to our clients' success. You'll work side by side with our clients and have long-term opportunities to advance your career with the latest emerging technologies.
- Responsible for designing the transformation and modernization of the client's big data solutions on GCP, integrating native GCP services and third-party data technologies.
- Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
- Looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design the right solutions with an appropriate combination of GCP and third-party technologies for deployment on GCP.
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for the successful deployment of the client's Data Platform.
- Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
- Identify, develop, evaluate and summarize Proof of Concepts to prove out solutions.
- Test and compare competing solutions and report out a point of view on the best solution.
- Design and build production data engineering solutions that deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer, Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Migrate existing Big Data pipelines into Google Cloud Platform and bring them to production.
- 5+ years of Java coding experience.
- In-depth understanding of Google's product technology and underlying architectures.
- 5+ years of application development experience required; 3+ years of GCP experience.
- Experience working in GCP-based Big Data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, etc.
- Someone who understands the cloud as a way to operate, not merely a place to host systems.
- Someone in this domain who understands data architecture and design independently of the underlying technology.
- Someone who understands fundamental techniques such as parallel processing, partitioning, and sharding: how and when to apply these approaches, not just how they are used in a tech stack like MR/Hive/TD. These concepts are powerful when applied at the right time and place in our data engineering and analytics projects. Experience with Python and shell scripting preferred.
- Exceptional problem-solving and communication skills, with experience managing multiple stakeholders.
- Experience working with Agile and Lean methodologies.
- TDD experience.
- Spring Boot development experience.
- Google Cloud Platform (GCP) Certification.
- B.S. in Information Systems, Computer Science, or equivalent work experience in the relevant field.