Vacancy expired!
- Work closely with data scientists and business experts to develop modeling solutions for actuarial and underwriting business problems
- Build and maintain data pipelines for the development, implementation, execution, validation, monitoring, and improvement of data science solutions
- Establish business domain knowledge for State Farm data sources
- Investigate, recommend, and initiate acquisition of new data resources from internal and external data sources
- Identify critical and emerging technologies, techniques, tools, data sources, and platforms in the data engineering field, including cloud-based solutions, that support and extend quantitative analytic deployment solutions
- Up-to-date expertise in data engineering practices, including the ability to provide solutions for the identification, acquisition, cleansing, profiling, and ETL (extraction, transformation, and loading) of data used in data science discovery and deployment solutions
- Conformance with State Farm data management and governance policies
- An understanding of the basics of predictive analytics/statistical models/machine learning models
- Ability to develop and maintain an effective network of both scientific and business contacts and knowledge
- Strong business acumen and the technical ability to acquire, transform and interpret complex data
- Excellent communication skills and the ability to work with multiple, diverse stakeholders across business areas and leadership levels
- Willingness to learn and adapt in an agile development environment
- Ability to learn and share new technical concepts quickly
- A minimum of 3 years' relevant work experience
- Bachelor's Degree in Computer Science or a related field
- AWS Certification(s)
- Hands-on experience with AWS and cloud concepts
- Experience working with P&C data
- Experience with version control (e.g., GitHub, GitLab)
- Hands-on GitOps experience
- Familiarity with one of the following programming languages: SAS, Python, R
- Experience working in Hadoop or Linux
- Experience with gathering and creating analytic business requirements, researching potential data sources (both internal and external), and designing, developing, and maintaining data assets
- Familiarity with building SQL and NoSQL queries
- Experience with and knowledge of Linux-based containers (e.g., Docker)
- Kubernetes experience (deploying/hosting runners, applications)
- Infrastructure as code (e.g., Terraform)
- CI/CD pipelines, including integration of code dependency scans
- Knowledge of version control and DevOps tools such as GitLab