Introduction to Transformer Models for NLP: Unit 2
This course covers the fundamentals and advanced applications of BERT and GPT models. You will learn how BERT processes text, including tokenization and vectorization, and practice fine-tuning BERT for tasks such as sequence classification, token classification, and question answering. The course also explains how GPT generates text, adapts to different writing styles, and can be fine-tuned for tasks like translating English to code. Additional topics include semantic search using Siamese BERT and multi-task learning with GPT through prompt engineering. By the end of the course, you will have …
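To make the tokenization topic above concrete, here is a minimal sketch of the greedy longest-match-first WordPiece splitting that BERT's tokenizer uses; the tiny vocabulary is hypothetical, and a real course exercise would use a full pretrained vocabulary.

```python
def wordpiece_tokenize(word, vocab):
    """Split a word into subwords, WordPiece-style:
    repeatedly take the longest prefix found in the vocab,
    marking non-initial pieces with '##'. Unknown words map to [UNK]."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        match = None
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation-piece marker
            if piece in vocab:
                match = piece
                break
            end -= 1
        if match is None:
            return ["[UNK]"]  # no subword covers this span
        tokens.append(match)
        start = end
    return tokens

# Hypothetical toy vocabulary for illustration only
vocab = {"play", "##ing", "token", "##ize", "##rs"}
print(wordpiece_tokenize("playing", vocab))     # ['play', '##ing']
print(wordpiece_tokenize("tokenizers", vocab))  # ['token', '##ize', '##rs']
```

After splitting, each subword is mapped to an integer ID and then to a learned embedding vector, which is the "vectorization" step the description mentions.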
Watch on Coursera
DeepCamp AI