Natural Language Processing with Probabilistic Models
In Course 2 of the Natural Language Processing Specialization, you will:
a) Create a simple auto-correct algorithm using minimum edit distance and dynamic programming,
b) Apply the Viterbi Algorithm for part-of-speech (POS) tagging, which is vital for computational linguistics,
c) Write a better auto-complete algorithm using an N-gram language model, and
d) Write your own Word2Vec model that computes word embeddings with a neural network, using the continuous bag-of-words (CBOW) architecture.
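To give a flavor of item (a), here is a minimal sketch of minimum edit distance computed with dynamic programming. The specific edit costs (insert 1, delete 1, substitute 2) are an assumption, not something stated in this description:

```python
def min_edit_distance(source, target, ins_cost=1, del_cost=1, sub_cost=2):
    """Return the minimum edit distance between two strings via dynamic
    programming. Cost values here (insert 1, delete 1, substitute 2) are
    one common convention, assumed for illustration."""
    m, n = len(source), len(target)
    # D[i][j] = cost of transforming source[:i] into target[:j]
    D = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):          # deleting all of source[:i]
        D[i][0] = D[i - 1][0] + del_cost
    for j in range(1, n + 1):          # inserting all of target[:j]
        D[0][j] = D[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            match_cost = 0 if source[i - 1] == target[j - 1] else sub_cost
            D[i][j] = min(
                D[i - 1][j] + del_cost,        # delete from source
                D[i][j - 1] + ins_cost,        # insert into target
                D[i - 1][j - 1] + match_cost,  # match or substitute
            )
    return D[m][n]
```

An auto-correct system can use such a distance to rank candidate corrections for a misspelled word, preferring candidates with the smallest edit distance.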
By the end of this Specialization, you will have designed NLP applications that perform question-answering and se…