Tomas Mikolov is a Senior Researcher at the Czech Institute of Informatics, Robotics, and Cybernetics. His research has covered topics in natural language understanding and representation learning, including Word2Vec, as well as complexity.
Tomas's PhD thesis is titled "Statistical Language Models Based on Neural Networks", which he completed in 2012 at the Brno University of Technology. We discuss compression and recurrent language models, the backstory behind Word2Vec, and his recent work on complexity & automata.
Episode notes: https://cs.nyu.edu/~welleck/episode25.html
Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview