Vacancy expired!
Are you an experienced, passionate pioneer in technology: a solutions builder and roll-up-your-sleeves technologist and leader who wants a collaborative, think-tank environment where you can share new ideas with colleagues, without the extensive demands of travel? If so, consider an opportunity with our US Delivery Center. We are breaking the mold of the typical Delivery Center.
Our US Delivery Centers have been growing since 2014, with significant continued growth on the horizon. Interested? Read more about our opportunity below.
Work you'll do / Responsibilities
- Evaluate business needs and priorities, liaise with key business partners, and address team needs related to big data systems and management.
- Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components and enhancements.
- Lead project planning: identify milestones, deliverables, and resource requirements; track activities and task execution.
- Generate design, development, test plans, detailed functional specifications documents, user interface design, and process flow charts for execution of programming.
- Develop data pipelines and APIs using Python, SQL, and potentially Spark, on AWS, Azure, or Google Cloud Platform.
- Use an analytical, data-driven approach to develop a deep understanding of a fast-changing business.
- Build large-scale batch and real-time data pipelines with data processing frameworks on AWS, Azure, or Google Cloud Platform.
Qualifications
- 10+ years of experience in data engineering with an emphasis on data analytics and reporting.
- 10+ years of experience with one or more cloud platforms: Microsoft Azure, Amazon Web Services (AWS), or Google Cloud Platform (GCP).
- 8+ years of experience in SQL, data transformations, statistical analysis, and troubleshooting across more than one database platform (Cassandra, MySQL, Snowflake, PostgreSQL, Redshift, Azure SQL Data Warehouse, etc.).
- 8+ years of experience in the design and build of data extraction, transformation, and loading processes by writing custom data pipelines.
- 8+ years of experience with one or more of the following: Python, SQL, or Kafka.
- 8+ years of experience designing and building solutions using cloud services such as EC2, S3, EMR, Kinesis, RDS, Redshift/Spectrum, Lambda, Glue, Athena, and API Gateway.
- 5+ years of experience managing technical teams.
- 5+ years of consulting experience.
- Bachelor's degree
- AWS, Azure and/or Google Cloud Platform Certification.
- Master's degree
- Expertise in Scala, PySpark and/or Python.
- Experience working with a MapReduce or MPP system at any size or scale.
- Experience working with agile development methodologies such as Scrum and sprint-based delivery.
- Travel up to 10% annually.
- Must live near one of the following cities: Atlanta, GA; Austin, TX; Boston, MA; Charlotte, NC; Chicago, IL; Cincinnati, OH; Cleveland, OH; Dallas, TX; Detroit, MI; Gilbert, AZ; Houston, TX; Indianapolis, IN; Kansas City, MO; Lake Mary, FL; Los Angeles, CA; Mechanicsburg, PA; Miami, FL; McLean, VA; Minneapolis, MN; Nashville, TN; Orange County, CA; Philadelphia, PA; Phoenix, AZ; Pittsburgh, PA; Rosslyn, VA; Sacramento, CA; St. Louis, MO; San Diego, CA; Seattle, WA; Tallahassee, FL; Tampa, FL; or be willing to relocate to one of the following USDC locations: Gilbert, AZ; Lake Mary, FL; Mechanicsburg, PA.