NLP_COURSE: A Deep Learning YSDA Natural Language Processing Course on GitHub
Natural language processing (NLP) is the ability of a computer program to understand human (natural) language as it is spoken or written.
More precisely, NLP is the branch of artificial intelligence (AI) concerned with programming computers to process and analyze large amounts of natural language data.
Here is a Deep Learning YSDA Natural Language Processing Course hosted on GitHub:
NLP_Course is a 13-week deep-learning course in Natural Language Processing (NLP) by the Yandex School of Data Analysis (YSDA), hosted on GitHub. It is organized into lectures and seminars, with homework assignments and deadlines.
So what is the syllabus of the course?
The course comprises lectures and seminars, with the following syllabus:
Week 1: Embeddings
- Lecture: Word embeddings. Distributional semantics, LSA, Word2Vec, GloVe. Why and when we need them.
- Seminar: Playing with word and sentence embeddings.
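To make the Week 1 material concrete, here is a minimal sketch (not taken from the course notebooks) that trains Word2Vec on a toy corpus and queries nearest neighbours; it assumes gensim >= 4.0, and the example sentences are made up.

```python
# A minimal sketch, not from the course materials: Word2Vec on a toy corpus.
from gensim.models import Word2Vec

# Toy tokenized corpus; the course seminar works with real pre-trained embeddings instead.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "dog"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the word vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,
)

print(model.wv.most_similar("cat", topn=3))   # distributionally similar words
print(model.wv.similarity("cat", "dog"))      # cosine similarity of two word vectors
```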
Week 2: Text Classification
- Lecture: Text classification. Classical approaches to text representation: BOW, TF-IDF. Neural approaches: embeddings, convolutions, RNNs.
- Seminar: Salary prediction with convolutional neural networks; explaining network predictions.
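As an illustration of the classical BOW/TF-IDF baseline that the Week 2 lecture contrasts with neural models, here is a minimal scikit-learn sketch; the sentiment data is a toy stand-in, not the course's salary-prediction task.

```python
# A minimal sketch of a classical TF-IDF text classifier (toy data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["great movie, loved it", "terrible plot and acting",
         "wonderful cast", "boring and slow"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative (toy labels)

# TF-IDF features over unigrams and bigrams, followed by a linear classifier.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)
print(clf.predict(["slow but wonderful"]))
```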
Week 3: Language Models
- Lecture: Language models: N-gram and neural approaches; visualization of trained models.
- Seminar: Generating ArXiv papers with language models.
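For the classical side of Week 3, here is a minimal sketch of a bigram language model with add-one (Laplace) smoothing built from raw counts; the corpus is a toy example, not the course's ArXiv data.

```python
# A minimal sketch of a count-based bigram language model with add-one smoothing.
from collections import Counter, defaultdict

corpus = [["<s>", "deep", "learning", "for", "nlp", "</s>"],
          ["<s>", "deep", "models", "for", "text", "</s>"]]

unigrams = Counter()
bigrams = defaultdict(Counter)
for sent in corpus:
    for prev, word in zip(sent, sent[1:]):
        unigrams[prev] += 1
        bigrams[prev][word] += 1

vocab_size = len({w for sent in corpus for w in sent})

def bigram_prob(prev, word):
    # P(word | prev) with add-one smoothing over the vocabulary
    return (bigrams[prev][word] + 1) / (unigrams[prev] + vocab_size)

print(bigram_prob("deep", "learning"))
print(bigram_prob("deep", "text"))
```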
Week 4: Seq2Seq/Attention
- Lecture: Seq2seq: the encoder-decoder framework. Attention: the Bahdanau model. Self-attention, Transformer. Pointer networks. Attention for analysis.
- Seminar: Machine translation of hotel and hostel descriptions.
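To illustrate the core of Week 4, here is a minimal NumPy sketch of scaled dot-product self-attention, the building block of the Transformer; the shapes and variable names (Q, K, V) follow the standard notation rather than the course notebooks.

```python
# A minimal sketch of scaled dot-product (self-)attention in plain NumPy.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # scores[i, j]: how much position i attends to position j
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 16))                        # 5 token vectors of size 16
out, attn = scaled_dot_product_attention(X, X, X)   # self-attention: Q = K = V = X
print(out.shape, attn.shape)                        # (5, 16) (5, 5)
```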
Week 5: Structured Learning
- Lecture: Structured learning: structured perceptron, structured prediction, dynamic oracles, RL basics.
- Seminar: POS tagging.
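A key ingredient of structured prediction for POS tagging is dynamic-programming decoding over tag sequences; here is a minimal Viterbi sketch using toy per-position tag scores and tag-transition scores (the tag set and numbers are made up for illustration).

```python
# A minimal sketch of Viterbi decoding for sequence tagging (toy scores).
import numpy as np

tags = ["NOUN", "VERB", "DET"]
# emission[i, t]: score of tag t at position i; transition[s, t]: score of tag s -> t
emission = np.array([[2.0, 0.1, 1.5],
                     [0.2, 2.5, 0.1],
                     [1.8, 0.3, 0.2]])
transition = np.array([[0.1, 1.0, 0.2],
                       [1.2, 0.1, 0.8],
                       [1.5, 0.4, 0.1]])

n, T = emission.shape
score = np.zeros((n, T))
back = np.zeros((n, T), dtype=int)
score[0] = emission[0]
for i in range(1, n):
    for t in range(T):
        cand = score[i - 1] + transition[:, t] + emission[i, t]
        back[i, t] = cand.argmax()
        score[i, t] = cand.max()

# Backtrack the highest-scoring tag sequence
best = [int(score[-1].argmax())]
for i in range(n - 1, 0, -1):
    best.append(int(back[i, best[-1]]))
print([tags[t] for t in reversed(best)])
```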
Week 6: Expectation Maximization
- Lecture: Expectation-Maximization and word alignment models.
- Seminar: Implementing expectation-maximization.
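In the spirit of Week 6, here is a minimal sketch of expectation-maximization for word alignment, roughly following the IBM Model 1 recipe on a three-sentence toy parallel corpus (not the course's data).

```python
# A minimal sketch of EM for word alignment in the style of IBM Model 1 (toy data).
from collections import defaultdict

parallel = [(["the", "house"], ["das", "haus"]),
            (["the", "book"], ["das", "buch"]),
            (["a", "book"], ["ein", "buch"])]

f_vocab = {f for _, fs in parallel for f in fs}
t = defaultdict(lambda: 1.0 / len(f_vocab))   # uniform initialization of t(f | e)

for _ in range(20):                           # EM iterations
    count = defaultdict(float)                # expected counts c(f, e)
    total = defaultdict(float)                # expected counts c(e)
    for es, fs in parallel:
        for f in fs:
            norm = sum(t[(f, e)] for e in es)
            for e in es:                      # E-step: soft alignment posteriors
                delta = t[(f, e)] / norm
                count[(f, e)] += delta
                total[e] += delta
    for (f, e), c in count.items():           # M-step: re-estimate t(f | e)
        t[(f, e)] = c / total[e]

print(round(t[("haus", "house")], 3), round(t[("haus", "the")], 3))
```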
Week 7: Machine Translation
- Lecture: Machine translation: a review of the key ideas from PBMT, the application-specific ideas developed in NMT over the past three years, and some of the open problems in the area.
- Seminar: Presentations given by the students.
Week 8: Transfer Learning and Multi-Task Learning
- Lecture: What a network learns and why: a "model" is never just a "model"! Transfer learning in NLP. Multi-task learning in NLP. How to understand what kind of information the model's representations contain.
- Seminar: Improving named entity recognition by learning it jointly with other tasks.
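To give a flavour of the multi-task idea in Week 8, here is a minimal PyTorch sketch (not the course's seminar code) of a shared encoder with separate NER and POS heads trained with a summed loss; the vocabulary size, tag counts, and random data are placeholders.

```python
# A minimal sketch of multi-task sequence tagging: one shared encoder, two task heads.
import torch
import torch.nn as nn

class MultiTaskTagger(nn.Module):
    def __init__(self, vocab_size, n_ner_tags, n_pos_tags, emb=64, hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
        self.ner_head = nn.Linear(2 * hidden, n_ner_tags)   # task-specific heads
        self.pos_head = nn.Linear(2 * hidden, n_pos_tags)

    def forward(self, tokens):
        states, _ = self.encoder(self.embed(tokens))         # shared representation
        return self.ner_head(states), self.pos_head(states)

model = MultiTaskTagger(vocab_size=1000, n_ner_tags=9, n_pos_tags=17)
tokens = torch.randint(0, 1000, (2, 6))                      # batch of 2 toy sentences
ner_gold = torch.randint(0, 9, (2, 6))
pos_gold = torch.randint(0, 17, (2, 6))

ner_logits, pos_logits = model(tokens)
loss_fn = nn.CrossEntropyLoss()
loss = loss_fn(ner_logits.reshape(-1, 9), ner_gold.reshape(-1)) \
     + loss_fn(pos_logits.reshape(-1, 17), pos_gold.reshape(-1))
loss.backward()                                              # one joint training step
print(float(loss))
```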
Week 9: Domain Adaptation
Week 10: Dialogue Systems
Week 11: Adversarial Methods
Week 12-13: TBA
For more information, see the link below:
Source and more information: GitHub