Make a difference: Ciber Global wants you. Come build new things with us and advance your career. At Ciber Global you'll collaborate with experts and join successful teams contributing to our clients' success. You'll work side by side with our clients, with long-term opportunities to grow your career with the latest emerging technologies.
Key Responsibilities:
- Responsible for designing the transformation and modernization of the client's big data solutions on the GCP cloud, integrating native GCP services and third-party data technologies.
- Experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP is a must.
- Broad technology skills across these areas, with a demonstrated ability to design the right solutions using an appropriate combination of GCP and third-party technologies for deployment on the GCP cloud.
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for the successful deployment of the client's Data Platform.
- Implement methods for standardization of all parts of the pipeline to maximize data usability and consistency.
- Identify, develop, evaluate, and summarize proofs of concept to validate solutions.
- Test and compare competing solutions and present a point of view on the best one.
- Design and build data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow, Pub/Sub, Bigtable, Data Fusion, and Dataproc (a minimal pipeline sketch follows this list).
- Data modeling and schema design spanning multiple business domains and industries for large enterprise data warehouse and data lake solutions in the cloud (see the schema sketch at the end of this posting).
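
For illustration, here is a minimal sketch of the kind of streaming pipeline pattern described above: Apache Beam in Python on the Dataflow runner, reading JSON events from Pub/Sub and writing them to BigQuery. This is not Ciber's or the client's actual code; the project, topic, bucket, and table names are placeholders.

    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    def run():
        # Placeholder project/region/bucket values; swap in real ones before running.
        options = PipelineOptions(
            project="my-project",
            region="us-central1",
            runner="DataflowRunner",  # use "DirectRunner" for local testing
            temp_location="gs://my-bucket/tmp",
        )
        options.view_as(StandardOptions).streaming = True

        with beam.Pipeline(options=options) as p:
            (
                p
                # Pub/Sub delivers raw bytes; parse each message as JSON.
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    topic="projects/my-project/topics/events")
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                # Append rows into an existing table (created separately;
                # see the schema sketch at the end of this posting).
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="my-project:analytics.events",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                )
            )

    if __name__ == "__main__":
        run()
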
Qualifications:
- In-depth understanding of Google's product technology and underlying architectures.
- 5+ years of application development experience required, including 3+ years of GCP experience.
- Experience working in GCP-based big data deployments (batch/real-time) leveraging BigQuery, Bigtable, Google Cloud Storage, Pub/Sub, Data Fusion, Dataflow, Dataproc, etc.
- Experience with Informatica for ELT/ETL and/or data cataloging is preferred.
- Experience with development ecosystem tools such as Git, Jenkins, and CI/CD pipelines.
- Coding skills in Python, shell scripting, or similar technologies.
- Exceptional problem-solving and communication skills.
- Experience working with Agile and Lean methodologies.
- Team player with attention to detail.
- Google Cloud Data Engineer Certification.
- B.S. in Information Systems or Computer Science, or equivalent work experience in the requested field.
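
As a companion to the pipeline sketch above, here is a hedged example of the schema design work called out in the responsibilities, using the google-cloud-bigquery Python client to create a time-partitioned table. The project, dataset, table, and field names are assumptions for illustration only.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # placeholder project

    # A simple event schema; real designs would span multiple business domains.
    schema = [
        bigquery.SchemaField("event_id", "STRING", mode="REQUIRED"),
        bigquery.SchemaField("event_ts", "TIMESTAMP", mode="REQUIRED"),
        bigquery.SchemaField("payload", "JSON", mode="NULLABLE"),
    ]

    table = bigquery.Table("my-project.analytics.events", schema=schema)
    # Partition by event timestamp to keep large warehouse scans cheap.
    table.time_partitioning = bigquery.TimePartitioning(field="event_ts")

    client.create_table(table, exists_ok=True)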