Senior Principal Deep Learning Engineer
Leo Dirac started his career building software at big companies including Microsoft and Google. In 2012 he fell in love with neural networks when he first saw AlexNet, and subsequently dedicated himself to applying deep learning. He spent six years at Amazon, where he launched Amazon’s first deep learning projects for visual search, product similarity, and recommendations. He was the lead engineer on AWS’s first ML service, Amazon Machine Learning; he used those lessons to design the core of AWS SageMaker and led the teams that built SageMaker’s AutoML features, including Automatic Model Tuning and AutoPilot. Leo continues seeking ways to accelerate the pace of technological advancement for all of society.
AI of Tomorrow
July 18, 17:05
The Curse of Dimensionality in Configuration Space
Training a high-quality deep learning model requires good choices of hyperparameters like learning rate and regularization, along with an appropriately chosen network structure and various algorithmic options. Collectively these choices comprise the configuration space (CS) for an ML problem. Many configuration spaces share structural similarities, like double descent, but these patterns are not widely understood. The curse of dimensionality makes it impractical to fully map any non-trivial CS. AutoML techniques attempt to automatically learn and optimize a CS. This talk attempts to impart some geometric intuition for what a CS looks like, and to give practical advice for exploring configuration spaces efficiently.
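A small sketch (not from the talk itself) illustrates why the curse of dimensionality bites here: exhaustively gridding a configuration space costs k^d evaluations for d hyperparameters with k candidate values each, while random sampling stays within a fixed budget regardless of d. The hyperparameter names and candidate values below are made-up examples, not ones used by the speaker.

```python
import random

# Exhaustive grid search: cost grows exponentially with dimension.
# With 5 candidate values per hyperparameter, a grid over d hyperparameters
# needs 5**d full training runs.
grid_per_axis = 5
for num_hyperparams in (2, 4, 8):
    total = grid_per_axis ** num_hyperparams
    print(f"{num_hyperparams} hyperparameters -> {total} grid points")

# Random search samples configurations independently, so its cost is set by
# the evaluation budget rather than the dimensionality -- a common practical
# workaround. This hypothetical space is purely illustrative.
space = {
    "learning_rate": [1e-4, 1e-3, 1e-2, 1e-1, 1.0],
    "weight_decay": [0.0, 1e-5, 1e-4, 1e-3, 1e-2],
    "num_layers": [2, 4, 8, 16, 32],
}
random.seed(0)
budget = 10
samples = [{k: random.choice(v) for k, v in space.items()} for _ in range(budget)]
print(f"random search evaluates {len(samples)} configs regardless of dimension")
```

Even at only 5 values per axis, 8 hyperparameters already demand 390,625 training runs under grid search, which is why AutoML systems lean on sampling and learned search strategies instead.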