Apple is seeking a Model Optimization Engineer to develop and optimize algorithms within their Core ML stack. You will design APIs, manage training jobs, and collaborate with teams to implement cutting-edge compression techniques, contributing to advancements in machine learning applications.
Cupertino, California, United States
Software and Services
We work on a Python library that implements a variety of training-time and post-training quantization algorithms and provides them to developers as simple-to-use, turnkey APIs, while ensuring that these optimizations work seamlessly with the Core ML inference stack and Apple hardware. Our algorithms are implemented using PyTorch. We optimize models across domains, including NLP, vision, text, and generative models.

In this role, the Model Optimization Engineer will be an expert in the internal workings of PyTorch: graph capture and graph editing mechanisms, methods to observe and modify intermediate activations and weights, tensor subclasses, custom ops, and the different types of parallelism used to train models. You will use this knowledge to implement and extend the core infrastructure of the optimization library, enabling efficient and scalable implementations of various classes of compression algorithms. You will also set up and debug training jobs, datasets, evaluation, and performance benchmarking pipelines.

Additionally, you will:
- Design and develop the core infrastructure that powers the implementations of various compression algorithms (training-time, post-training, data-free, calibration-data-based, etc.).
- Implement the latest algorithms from research papers for model compression in the optimization library.
- Collaborate with software and hardware engineers from the ML compiler and inference stack to co-develop new compression operations and model export flows for on-device deployment.
- Design clean, intuitive, maintainable APIs.
- Run detailed experiments and ablation studies to profile algorithms on various models and tasks, across different model sizes.
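To give a concrete, hypothetical flavor of the mechanisms mentioned above, the minimal sketch below uses PyTorch's torch.fx graph capture to inspect a model's forward graph and then applies a simple data-free, post-training weight quantization pass. SmallNet and quantize_tensor_int8 are illustrative names invented for this example; they are not part of Apple's Core ML or optimization stack.

import torch
import torch.nn as nn
import torch.fx as fx


class SmallNet(nn.Module):
    # Hypothetical toy model used only to demonstrate graph capture.
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(16, 32)
        self.fc2 = nn.Linear(32, 8)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


def quantize_tensor_int8(w: torch.Tensor) -> torch.Tensor:
    # Symmetric per-tensor int8 "fake" quantization: quantize then dequantize,
    # so the module still holds float weights with quantization error applied.
    scale = w.abs().max() / 127.0
    q = torch.clamp(torch.round(w / scale), min=-128, max=127)
    return q * scale


model = SmallNet()

# Graph capture: symbolic_trace records the forward pass as an editable fx.Graph.
traced = fx.symbolic_trace(model)
for node in traced.graph.nodes:
    print(node.op, node.target)  # e.g. call_module fc1, call_function relu, ...

# Data-free, post-training quantization of Linear weights on the captured module.
with torch.no_grad():
    for module in traced.modules():
        if isinstance(module, nn.Linear):
            module.weight.copy_(quantize_tensor_int8(module.weight))

# The modified GraphModule runs like any nn.Module.
out = traced(torch.randn(4, 16))
print(out.shape)

Real calibration-based or training-time algorithms would additionally insert observers or fake-quantization nodes into the captured graph and run data through the model, but the same capture-inspect-edit workflow applies.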
Apple is an equal opportunity employer that is committed to inclusion and diversity. We seek to promote equal opportunity for all applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, Veteran status, or other legally protected characteristics. Learn more about your EEO rights as an applicant.