Deep Learning Travels
Don't panic

Relationships from Entity Stream
WHY? The Relation Network (RN) showed great performance on relational reasoning, but its computation and memory consumption grow quadratically with the number of objects because of the fully connected pairing process. WHAT? As in the RN, the cells of the last CNN feature map are treated as objects. Instead of pairing these objects, an RNN is used...
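A quick sketch of the cost argument above, counting the work per forward pass (function names here are illustrative, not from the paper): with n objects from the CNN feature map, the RN must score all ordered pairs, while an RNN consumes the same objects as a stream.

```python
# Illustrative cost comparison: RN pairwise scoring vs. RNN over an entity stream.

def rn_pair_count(n):
    # the RN's relation function g is applied to every ordered object pair: O(n^2)
    return n * n

def rnn_step_count(n):
    # an RNN processes the entity stream one object per step: O(n)
    return n

# an 8x8 CNN feature map yields 64 objects
n = 8 * 8
print(rn_pair_count(n))   # 4096 pairwise evaluations
print(rnn_step_count(n))  # 64 sequential steps
```

So doubling the feature-map resolution quadruples the object count but multiplies the RN's pairing work by sixteen, which is the quadratic blow-up the paper avoids.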

Linguistic Regularities in Sparse and Explicit Word Representations
WHY? The vector offset method is used for the word analogy task. WHAT? The objective function of the vector offset method can be interpreted as similarity in direction (PairDirection). This objective can also be reinterpreted as the addition of three cosine similarities (3CosAdd). These two objectives show different performance: since PairDirection does not take into account...
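The two objectives can be written out directly. A minimal sketch, assuming the standard formulations for an analogy a : b :: c : d (3CosAdd sums three cosine similarities; PairDirection compares the two offset directions):

```python
import numpy as np

def cos(u, v):
    # cosine similarity between two vectors
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def cos_add(a, b, c, d):
    # 3CosAdd: cos(d, b) - cos(d, a) + cos(d, c)
    return cos(d, b) - cos(d, a) + cos(d, c)

def pair_direction(a, b, c, d):
    # PairDirection: similarity of the offsets b - a and d - c
    return cos(b - a, d - c)

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
# an offset trivially matches itself in direction
print(pair_direction(a, b, a, b))  # 1.0
```

PairDirection only cares about the direction of the offsets, which is why the two scores can rank candidate words differently.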

Linguistic Regularities in Continuous Space Word Representations
WHY? Vector-space word representations capture syntactic and semantic regularities in language well. WHAT? To test how well continuous word representations capture regularities, this paper introduces a relation-specific vector offset method. Every analogy task can be formulated as “a is to b as c is to __”. The syntactic test asks grammatical...
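The offset method answers “a is to b as c is to __” by returning the vocabulary word closest to b - a + c. A toy sketch with hand-made 2-d embeddings chosen so the singular/plural offset is consistent (real experiments use learned embeddings):

```python
import numpy as np

# toy embeddings, hypothetical values for illustration only
emb = {
    "year":  np.array([1.0, 0.0]),
    "years": np.array([1.0, 1.0]),
    "law":   np.array([0.0, 0.2]),
    "laws":  np.array([0.0, 1.2]),
    "rain":  np.array([0.5, 0.1]),
}

def analogy(a, b, c):
    # vector offset method: find the word nearest to b - a + c in cosine,
    # excluding the three query words themselves
    target = emb[b] - emb[a] + emb[c]
    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)
    candidates = [w for w in emb if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(emb[w], target))

print(analogy("year", "years", "law"))  # laws
```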

Scalable Distributed DNN Training Using Commodity GPU Cloud Computing
WHY? Synchronization is an important issue in distributed SGD. Too little synchronization among nodes causes unstable training, while too frequent synchronization causes high communication cost. WHAT? This paper drastically reduces the communication cost of distributed SGD through compression. It makes two points. First, many techniques...
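One common gradient-compression scheme, sketched here to make the communication-saving idea concrete (this is top-k sparsification with residual accumulation, not necessarily the paper's exact method):

```python
import numpy as np

def compress_topk(grad, k):
    # keep only the k largest-magnitude gradient entries for transmission;
    # the rest accumulate locally in a residual and are sent in later rounds
    idx = np.argsort(np.abs(grad))[-k:]   # indices of the k largest entries
    values = grad[idx]
    residual = grad.copy()
    residual[idx] = 0.0                    # zero out what was sent
    return idx, values, residual

grad = np.array([0.01, -0.9, 0.05, 0.7, -0.02])
idx, vals, res = compress_topk(grad, k=2)
# only 2 of 5 entries (indices and values) cross the network this round
```

Each worker now transmits k index/value pairs instead of the full gradient, trading a little staleness (the residual) for a large drop in communication volume.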

Neural Word Embedding as Implicit Matrix Factorization
WHY? Skip-gram with negative sampling (SGNS) showed amazing performance compared to traditional word-embedding methods. However, it was not clear what SGNS converges to. WHAT? This paper proved that minimizing the loss function of SGNS is equivalent to factorizing the word-context matrix whose association measure is shifted PMI. The loss function of...
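The matrix being implicitly factorized has entries PMI(w, c) - log k, where k is the number of negative samples. A small sketch computing that matrix from a toy co-occurrence table (the counts are made up for illustration):

```python
import numpy as np

# toy word-context co-occurrence counts: rows are words, columns are contexts
counts = np.array([[4.0, 1.0],
                   [1.0, 4.0]])
total = counts.sum()
pw = counts.sum(axis=1) / total   # P(w)
pc = counts.sum(axis=0) / total   # P(c)
pwc = counts / total              # P(w, c)

k = 1  # number of negative samples in SGNS
pmi = np.log(pwc / np.outer(pw, pc))   # PMI(w, c) = log P(w,c) / (P(w)P(c))
shifted_pmi = pmi - np.log(k)          # the matrix SGNS implicitly factorizes
```

With k = 1 the shift vanishes and SGNS factorizes the plain PMI matrix; larger k pushes every entry down by log k.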