Vacancy expired!
- Act in a technical leadership capacity: Mentor junior engineers and new team members, and apply technical expertise to challenging programming and design problems
- Develop high-volume, low-latency, data-driven solutions utilizing current and next generation technologies to meet evolving business needs
- Establish standards and guidelines for the design and development, tuning, deployment, and maintenance of information, advanced data analytics, and text mining models and physical data persistence technologies
- Acquire big data input from numerous partners; Key technologies may include Python, Elastic, Prometheus, and Kafka
- Normalize complicated data sources to convert potentially unusable data into a format that can be efficiently used by software and/or employees; Key technologies may include Spark, NiFi, Lambda
- Maintain a CI/CD pipeline for our data software to ensure we keep quality high and time to market low; Key technologies may include GitLab
- Master's degree in Computer Science, Computer Engineering, or a related technical field; six years of related experience; or equivalent combination of education and experience
- 4 or more years' experience in data streaming technologies, such as Kafka
- 4 or more years' experience using ETL (Extract, Transform, and Load) concepts
- Experience with querying and designing databases using one or more of the following: MySQL, MS SQL, Oracle SQL, or other professional database system
- Ability to work in teams and collaborate with others to clarify requirements, quickly identify problems, and find creative solutions together
- Ability to assist in documenting requirements as well as resolve conflicts or ambiguities
- 6 or more years' experience in programming using one or more of the following: Java, C, Perl, Python, or advanced Shell scripting
- 6 or more years' experience in implementing data-driven solutions in a production environment using tools such as Hadoop, Impala, Hive, NiFi, Athena, Redshift, Elasticsearch, Bigtable, or Airflow
- 4 or more years' experience in machine learning and statistical modeling
- 2 or more years' experience in Cloud Native tools, such as Kubernetes and Docker
- 1 or more years' experience with using the R statistical computing language
- 1 or more years' experience with Agile at Scale, SAFe, and Lean Systems Engineering