Vacancy expired!
In this age of disruption, organizations need to navigate the future with confidence by tapping into the power of data analytics, robotics, and cognitive technologies such as Artificial Intelligence (AI). Our Strategy & Analytics portfolio helps clients apply rigorous analytical capabilities and a pragmatic mindset to solve the most complex of problems. By joining our team, you will play a key role in helping our clients uncover hidden relationships in vast troves of data and in transforming the Government and Public Services marketplace.
Work you'll do
We are looking for experienced Data Engineers to build and deliver innovative, mission-driven data pipelines. On this project, you will lead the architecture and setup of hosted data lakes, as well as the ingestion pipeline and processing for large datasets, working closely with Agile software development teams. Responsibilities include creating and managing schedules for data management efforts (migration, integration, etc.), working with clients to validate migrated data, and working with Agile development teams to understand changes and their impact on data migration efforts, among other tasks.

The team
Deloitte's Government and Public Services (GPS) practice - our people, ideas, technology, and outcomes - is designed for impact. Serving federal, state, and local government clients as well as public higher education institutions, our team of more than 15,000 professionals brings fresh perspective to help clients anticipate disruption, reimagine the possible, and fulfill their mission promise. The GPS Analytics and Cognitive (A&C) offering develops advanced analytics products and applies data visualization and statistical programming tools to enterprise data in order to advance and enable key mission outcomes for our clients. Our team supports all phases of analytic work product development, from identifying key business questions through data collection and ETL, to performing analyses with a wide range of statistical, machine learning, and applied mathematical techniques to deliver insights to decision-makers. Our practitioners give special attention to the interplay between data, the business processes that produce it, and the decision-makers that consume insights.

Qualifications
- Active Secret security clearance required
- Must be legally authorized to work in the United States without the need for employer sponsorship, now or at any time in the future
- 7+ years of experience with extract, transform, and load (ETL) methods and tools
- 7+ years of experience with data modeling, data warehousing, and building ETL pipelines
- 7+ years of experience with SQL queries and JSON objects
- 7+ years of experience with both SQL and NoSQL databases, including PostgreSQL and MongoDB
- Familiarity with microservice architectures
- Interest in event streaming architectures, such as Apache Kafka
- Prior professional services or federal consulting experience
- Knowledge of data mining, machine learning, data visualization and statistical modeling
- Ability to thrive in a fast-paced work environment with multiple stakeholders
- Creativity and innovation - desire to learn and apply new technologies, products, and libraries