
Website: Apple
Job Description:
We are looking for a dedicated Hadoop/Data Platform DevOps Engineer to handle one of the largest Big Data infrastructures. The successful candidate will enjoy managing data across distributed database systems that serve Apple services around the globe!
Job Responsibilities:
- Measure and optimize system performance
- Engage in and improve the full service life cycle, from inception and design through deployment, operation, migration, and sunset
- Work with different teams to coordinate and execute critical projects
- Prioritize and work efficiently in a fast-paced environment
- Stay well organized and maintain strict alignment with SLAs
- Write, review and develop code and documentation that solves the hardest problems on some of the largest and most complex systems
- Bring a real passion for quality and automation, the ability to understand complex systems, and a desire to constantly make things better
Job Requirements:
- Experience handling data pipelines running complex AI/ML models and aggregations
- Deep understanding of Java applications
- BS in engineering, computer science or other technical disciplines (or equivalent experience) plus 3+ years of related experience
- Experience handling Big Data environments such as Spark, Hadoop, ELK, Kafka, etc.
- Deep understanding of and experience with one or more of: Docker, Mesos, AWS, Ansible, Puppet, Chef
- Proficiency in scripting languages such as Python, Shell, etc.
- 2-5+ years of experience managing services in a large-scale Unix environment
- Experience with and understanding of scaling, capacity planning, and disaster recovery is important
Job Details:
Company: Apple
Job Category: Private
Vacancy Type: Full Time
Job Location: London, Ontario, CA
Application Deadline: N/A