Vacancy expired!
- Expected to provide hands-on software development for a large data project hosted in a cloud environment.
- Develop and refine the technical architecture used with Teradata, Python, Spark and Hadoop development teams.
- Provide expertise in developing estimates for Epics and User Stories for planning and execution.
- Be able to help others break down large team goals into specific and manageable tasks.
- Be involved in and supportive of the agile sprint model of development, helping to enforce the practice and the discipline.
- Coach and mentor team members on Teradata, Python, Spark and Hadoop development best practices.
- Define and enforce application coding standards and best practices.
- Identify and resolve technical and process impediments preventing delivery teams from meeting delivery commitments.
- Align and collaborate with architects, other team leads, and IT leadership to develop technical architectural runways supporting upcoming features and capabilities.
- Diagnose and troubleshoot performance and other issues.
- Collaborate with peers, reviewing complex change and enhancement requests.
- Evaluate potential changes and enhancements for objectives, scope and impact.
- Take a proactive approach to development work, leading peers and partners to strategic technical solutions in a complex IT environment.
- Document functional/technical requirements and design based on requirements or objectives.
- Mentor peers on coding standards, patterns and strategy.
- Guide the team on best practices in Teradata, Python, Spark and Hadoop as well as perform code reviews.
- Build and maintain active relationships with customers to determine business requirements.
- Partner with other IT teams during integration activities to facilitate successful implementations.
- Participate in on-call application support and respond to application issues when identified.
- Communicate effectively and clearly with technical peers, while also being able to articulate complex solutions in ways nontechnical business partners can understand.
- Have a good understanding of where the project fits into the larger goals for engineering, and adapt work so that the priorities of the systems being created match those of the organization.
- BA/BS degree or technical institute training or equivalent work experience
- 4+ years of hands-on Teradata, Python, Spark and Hadoop development experience
- 1+ years of combined hands-on Google Cloud Platform (GCP) development experience
- Expertise working with the GCS Connector, Dataproc and BigQuery
- Experience working in ADF Python is an added advantage
- Experience with Big Data processing frameworks (Spark, Hadoop) is required.
- Experience with DevOps tools and techniques (Continuous Integration, Jenkins, Puppet) is required.
- Experience with one or more software version control systems (e.g., Git, Subversion)
- Experience overseeing team members.
- Excellent communication and presentation skills.
- Experience in an agile environment
- Experience with Spring Boot, Maven and Bamboo, and strong debugging skills.
- Strong understanding of builds, software development and Git.
- Strong effective communication skills, both written and verbal
- Gitflow
- Atlassian products: Bitbucket, Jira, Confluence, etc.
- Continuous Integration tools such as Bamboo, Jenkins, or TFS