New Audio Transformers Course: Live Launch Event with Paige Bailey, Seokhwan Kim, and Brian McFee
Join us for an exciting live event as we celebrate the launch of the new free and open-source Audio Transformers Course by Hugging Face!
We have invited an amazing group of guest speakers, experts in audio and AI with academic, open-source, and industry backgrounds, whose presentations will complement the course materials and get you excited about audio AI.
Our guests are:
* Paige Bailey, Product Lead for Generative Models at Google DeepMind, https://twitter.com/DynamicWebPaige
* Seokhwan Kim, a Principal Applied Scientist at Amazon Alexa AI.
Bio: Before joining Amazon in 2019, Seokhwan worked on natural language understanding and spoken dialog systems as an NLP Research Scientist at Adobe Research and as a Research Scientist at the Institute for Infocomm Research (I2R) in Singapore. He completed his PhD at Pohang University of Science and Technology (POSTECH) in Korea, focusing on cross-lingual, weakly supervised language understanding under the guidance of Prof. Gary Geunbae Lee. He has authored 80 peer-reviewed papers in international journals and conferences, which have garnered over 2,200 citations. His recent research centers on knowledge-grounded conversational modeling. Seokhwan actively contributes to the research community as a member of the IEEE Speech and Language Processing Technical Committee, a board member of SIGDIAL, a steering committee member of the DSTCs, and a PC member of major NLP, dialogue, AI, and speech conferences.
Talk:
"How Robust R U?”: Evaluating Task-oriented Dialogue Systems on Spoken Conversations
Most prior work on dialogue modeling has focused on written conversations, largely because of the data sets available. However, written dialogues do not fully capture the nature of spoken conversations, nor the speech recognition errors that arise in practical spoken dialogue systems. In this talk, I will introduce a public benchmark that we have organized as a main track of the