Faculty Candidate: Dani Yogatama

Event Type: Colloquium
Dates: Wednesday, December 2, 2015 - 14:15 to 15:15
Location: Room 120
Title: Learning to Represent Language: Embeddings and Optimization
 
Abstract:
The performance of a machine learning model depends heavily on how the data is represented. For example, when working with text we can represent words as strings, binary vectors, or real-valued vectors. Recent advances seek to automate these representation choices, reducing human effort and achieving better results.
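To make the contrast concrete, here is a minimal sketch of the two vector representations mentioned above. The toy vocabulary and the random 4-dimensional vectors are purely illustrative and do not come from the talk:

```python
import numpy as np

vocab = ["professor", "academic", "student"]
index = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    # Binary vector: a single 1 at the word's vocabulary index.
    v = np.zeros(len(vocab))
    v[index[word]] = 1.0
    return v

# Dense real-valued vectors: in practice learned from data;
# random here only for illustration.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))

print(one_hot("professor"))            # [1. 0. 0.]
print(embeddings[index["professor"]])  # a dense 4-d real vector
```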
 
In this talk, I will first discuss how to encode prior knowledge into a representation learning model. I will show an application of this technique to learn a word embedding model that encourages hierarchical ordering of word meanings (e.g., "professor" is a hyponym of "academic").
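As one illustration of encoding hierarchical prior knowledge, the following hypothetical penalty (in the spirit of order embeddings; the actual model in the talk may differ) is zero exactly when the hyponym's embedding dominates the hypernym's in every coordinate:

```python
import numpy as np

def order_violation(hyponym_vec, hypernym_vec):
    # Zero when the hyponym dominates the hypernym componentwise, so
    # minimizing this term pushes embeddings toward a hierarchical order.
    return float(np.sum(np.maximum(0.0, hypernym_vec - hyponym_vec) ** 2))

professor = np.array([2.0, 1.5, 0.9])  # more specific concept
academic = np.array([1.0, 1.0, 0.5])   # more general concept
print(order_violation(professor, academic))  # 0.0: ordering satisfied
print(order_violation(academic, professor))  # > 0: ordering violated
```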
 
I will also talk about an alternative method for learning representations that treats the problem as optimization over a very large discrete set of choices. I will show how Bayesian optimization can solve this problem efficiently, and briefly describe an improved Bayesian optimization algorithm that generalizes across multiple datasets.
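To fix ideas, here is a minimal Bayesian optimization loop over a discrete candidate set, assuming scikit-learn's Gaussian process regressor and an expected-improvement acquisition function on a toy objective. This is a generic sketch, not the algorithm from the talk or its multi-dataset extension:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

def expected_improvement(mu, sigma, best):
    # Expected improvement for minimization: how much we expect to
    # drop below the current best value at each candidate.
    sigma = np.maximum(sigma, 1e-9)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(candidates, objective, n_init=3, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    picked = list(rng.choice(len(candidates), size=n_init, replace=False))
    y = [objective(candidates[i]) for i in picked]
    gp = GaussianProcessRegressor(normalize_y=True)
    for _ in range(n_iter):
        gp.fit(candidates[picked], y)
        mu, sigma = gp.predict(candidates, return_std=True)
        ei = expected_improvement(mu, sigma, min(y))
        ei[picked] = -np.inf  # never re-evaluate a candidate
        i = int(np.argmax(ei))
        picked.append(i)
        y.append(objective(candidates[i]))
    best = int(np.argmin(y))
    return candidates[picked[best]], y[best]

# Toy example: pick the discrete configuration minimizing a quadratic.
grid = np.array([[x] for x in np.linspace(-2, 2, 41)])
x_best, y_best = bayes_opt(grid, lambda v: (v[0] - 0.7) ** 2)
print(x_best, y_best)
```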
 
I will conclude by discussing applications of my work on representation learning in real-world systems and outlining future directions.