103062134 #35 (Open): wants to merge 4 commits into master
README.md (21 additions, 1 deletion)
@@ -5,4 +5,24 @@
To avoid writing the same person, please report the person's name in
https://docs.google.com/spreadsheets/d/153XruMO7DPONzBTkxh8ZoYSto1E_2zO021vs0prWZ_Q/edit?usp=sharing
First come first serve!
-------
Write here
# Thang Luong
![Thang Luong](https://avatars2.githubusercontent.com/u/396613?v=4&s=460 "Thang Luong")
# Motivation
Anyone paying attention to Deep Learning research in NLP, especially in Neural Machine Translation, will definitely have heard of **Thang Luong**. His most famous work is [Effective Approaches to Attention-based Neural Machine Translation](https://arxiv.org/abs/1508.04025). The multiplicative attention mechanism he proposed for sequence-to-sequence training has since been applied to many Deep Learning tasks.
Because of my recent research interests, I have mainly been focusing on Deep Learning for NLP tasks. I spent a lot of time and energy studying the whole process and the concepts of NMT, and through that process I came to understand how deeply the attention mechanism affects NMT.
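
Since the multiplicative ("general") score is the part of that paper people reuse most, here is a minimal NumPy sketch of a single attention step: score(h_t, h̄_s) = h_tᵀ W_a h̄_s, followed by a softmax over the source positions. All shapes and variable names are illustrative assumptions, not taken from any released code.

```python
import numpy as np

def luong_general_attention(h_t, enc_states, W_a):
    """One step of Luong-style multiplicative ("general") attention.

    h_t        : decoder hidden state at the current step, shape (d,)
    enc_states : encoder hidden states for the source sentence, shape (T, d)
    W_a        : learned weight matrix, shape (d, d)
    """
    # score(h_t, h_s) = h_t^T W_a h_s, computed for all T source positions
    scores = enc_states @ (W_a @ h_t)            # shape (T,)
    # softmax turns the scores into alignment weights over the source
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # the context vector is the weighted average of the encoder states
    context = weights @ enc_states               # shape (d,)
    return weights, context

# Toy usage with random states (d = hidden size, T = source length).
rng = np.random.default_rng(0)
d, T = 8, 5
weights, context = luong_general_attention(
    rng.standard_normal(d),
    rng.standard_normal((T, d)),
    rng.standard_normal((d, d)),
)
```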

# Introduction
Thang Luong is now a research scientist at Google Brain, focusing on Neural Machine Translation and Deep Learning for Natural Language Processing tasks. After finishing his bachelor's degree at the National University of Singapore, he pursued a Ph.D. at Stanford University, focusing on Neural Machine Translation and research on other language models.

# Influence
Here is the repository where he presents his NMT research work himself: [lmthang/thesis](https://github.com/lmthang/thesis)
Besides his most famous work proposing the attention mechanism, I'd like to present some of his other research that I'm really impressed with:

[Addressing the rare word problem in neural machine translation](https://arxiv.org/pdf/1410.8206)

[A hierarchical neural autoencoder for paragraphs and documents](https://arxiv.org/pdf/1506.01057)

[When Are Tree Structures Necessary for Deep Learning of Representations?](https://arxiv.org/pdf/1503.00185)