Vacancy expired!
- Work with the team to evaluate business needs and priorities; liaise with key business partners and address team needs related to data systems and management.
- Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components and enhancements.
- Participate in project planning: identify milestones, deliverables, and resource requirements; track activities and task execution.
- Produce design documents, development and test plans, detailed functional specifications, user interface designs, and process flow charts to guide programming work.
- Develop data pipelines and APIs using Python and SQL, and potentially Spark, on AWS, Azure, or Google Cloud Platform.
- Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
- Build large-scale batch and real-time data pipelines with data processing frameworks on AWS, Azure, or Google Cloud Platform.
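The pipeline-building responsibilities above can be illustrated with a minimal batch extract-transform-load sketch in Python. This is a toy stand-in, not a production design: the CSV payload, table name, and column names are all hypothetical, and sqlite3 stands in for a cloud warehouse (Redshift, Snowflake, etc.).

```python
import csv
import io
import sqlite3

# Hypothetical raw input, standing in for a file pulled from object storage.
RAW_CSV = """order_id,amount,currency
1,19.99,usd
2,5.00,USD
3,42.50,usd
"""

def extract(raw: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Normalize types and upper-case the currency codes."""
    return [(int(r["order_id"]), float(r["amount"]), r["currency"].upper())
            for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Write transformed rows to a warehouse-style table (sqlite as stand-in)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
    print(f"loaded {len(transform(extract(RAW_CSV)))} rows, total: {total:.2f}")
```

In a real cloud deployment the extract step would read from S3/Blob Storage, transform might run on Spark, and load would target the warehouse, but the extract → transform → load decomposition stays the same.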
- 3+ years of experience in data engineering with an emphasis on data analytics and reporting.
- 3+ years of experience with at least one of the following cloud platforms: Microsoft Azure, Amazon Web Services (AWS), Google Cloud Platform (GCP), or others.
- 3+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one database platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Data Warehouse, etc.).
- 3+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines.
- 3+ years of experience with one or more of the following languages and technologies: Python, SQL, Kafka, and/or others.
- 3+ years of experience designing and building solutions using cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, API Gateway, etc.
- Bachelor's degree or equivalent work experience.
- AWS, Azure and/or Google Cloud Platform Certification.
- Master's degree or higher.
- Expertise in one or more programming languages, preferably Scala, PySpark and/or Python.
- Experience working with either a MapReduce or an MPP system at any size/scale.
- Experience working with agile development methodologies such as Scrum and sprint-based planning.
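The SQL-transformation and statistical-analysis requirement above can be sketched with a small aggregation query. The schema and data are hypothetical, and sqlite3 again stands in for any of the listed database platforms; the same GROUP BY pattern carries over to Redshift, Snowflake, or PostgreSQL.

```python
import sqlite3

# Hypothetical web-analytics events table; sqlite3 stands in for a real warehouse.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE page_views (user_id INTEGER, page TEXT, ms_on_page INTEGER);
    INSERT INTO page_views VALUES
        (1, 'home', 1200), (1, 'pricing', 3400),
        (2, 'home',  800), (2, 'docs',   5600),
        (3, 'home', 1500);
""")

# Transformation: aggregate raw events into per-page view counts and mean dwell time.
query = """
    SELECT page,
           COUNT(*)        AS views,
           AVG(ms_on_page) AS avg_ms
    FROM page_views
    GROUP BY page
    ORDER BY views DESC, page
"""
for page, views, avg_ms in conn.execute(query):
    print(f"{page:8s} views={views} avg_ms={avg_ms:.0f}")
```

The troubleshooting part of the requirement is then largely about verifying queries like this against expectations (row counts, null handling, duplicate keys) on each target platform.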