How Advanced Hyperparameter Optimization Drives Performance without Compromising Privacy

Abstract: 

Most enterprises are investing in AI to either differentiate their products or grow profits. In these cases, model performance, along with the engineering, training, and tuning that powers it, is the most critical outcome of the modeling process. At the same time, many industries must balance the need for performance with other competing objectives like privacy.

During this talk, Scott Clark will discuss how black-box hyperparameter optimization can be used to improve model performance without compromising privacy. In particular, he will focus on advanced hyperparameter optimization techniques that can significantly boost performance. Along the way, he will discuss specific use cases for each of these techniques to contextualize their potential impact on your modeling process. This conversation will include discussion of:

Metric definition and how comparisons of multiple objectives impact it
Model search and the use of conditional parameters to address it
Long training cycles and algorithmic techniques that improve efficiency
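To make the black-box setting concrete, here is a minimal, self-contained sketch of hyperparameter optimization via random search. The objective function, parameter names, and search ranges are all hypothetical stand-ins (a real objective would train and validate a model; SigOpt's actual techniques are more sophisticated than random search):

```python
import random

def objective(params):
    # Hypothetical stand-in for a model's validation loss.
    # In practice this would train a model and return a held-out metric.
    lr, layers = params["lr"], params["layers"]
    return (lr - 0.01) ** 2 + 0.1 * abs(layers - 3)

def random_search(n_trials=50, seed=0):
    # Black-box loop: sample hyperparameters, evaluate the objective,
    # and keep the best configuration seen so far. The optimizer never
    # inspects the model or its data, only the returned metric value.
    rng = random.Random(seed)
    best_params, best_score = None, float("inf")
    for _ in range(n_trials):
        params = {
            "lr": 10 ** rng.uniform(-4, 0),  # log-uniform learning rate
            "layers": rng.randint(1, 8),     # integer-valued parameter
        }
        score = objective(params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

best_params, best_score = random_search()
print(best_params, best_score)
```

Because the optimizer only sees parameter values and a metric, the training data never leaves the modeler's environment, which is what allows tuning without compromising privacy.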

This talk is great for any machine learning engineer, data scientist, deep learning engineer or leader of these teams.

Bio: 

Scott Clark, CEO & Co-Founder of SigOpt, is passionate about empowering experts to achieve their full potential with optimization solutions. He conceived of the idea for SigOpt while completing his Applied Mathematics PhD at Cornell, and went on to build and open-source the Metric Optimization Engine at Yelp to help solve this problem. That experience taught him that optimization needed to be productized to be effective for enterprises, leading him to found SigOpt in 2014. SigOpt has since been funded by Y Combinator, Andreessen Horowitz, Blumberg Capital, DCVC, In-Q-Tel, and others, and now helps firms and academics around the world accelerate and amplify their research through optimization in fields from machine learning to algorithmic trading and beyond. Scott holds a PhD in Applied Mathematics and an MS in Computer Science from Cornell University, and BS degrees in Mathematics, Physics, and Computational Physics from Oregon State University. He was chosen as one of Forbes' 30 Under 30 in 2016.