PyTorch NLP Multitask Learning - A PyTorch multi-task Natural Language Processing model is trained on AI Platform using a custom Docker container.
Multitask learning is an approach to inductive transfer that improves generalization by using the domain information contained in the training signals of related tasks as an inductive bias. This allows the model to exploit commonalities and differences across tasks, improving efficiency and prediction accuracy compared to training a separate model per task. In the age of BERT, a multi-task model typically consists of a shared BERT-style transformer encoder with a separate task-specific head for each task, as sketched below. Since HuggingFace's Transformers library provides implementations for single-task models but not modular task heads, a few architectural changes to the library are required.
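The shared-encoder pattern can be sketched in plain PyTorch. The snippet below is a minimal illustration under stated assumptions, not the sample's actual code: `MultiTaskModel`, the task names, and the label counts are hypothetical, and it assumes the HuggingFace `transformers` package with a `bert-base-uncased` checkpoint.

```python
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer


class MultiTaskModel(nn.Module):
    """Shared BERT-style encoder with one lightweight head per task."""

    def __init__(self, encoder_name, task_num_labels):
        super().__init__()
        # Shared transformer encoder, e.g. "bert-base-uncased".
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # One classification head per task, keyed by task name.
        self.heads = nn.ModuleDict({
            task: nn.Linear(hidden, num_labels)
            for task, num_labels in task_num_labels.items()
        })

    def forward(self, task, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Sequence-level tasks read the [CLS] token representation.
        cls_repr = out.last_hidden_state[:, 0]
        return self.heads[task](cls_repr)


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = MultiTaskModel("bert-base-uncased", {"sentiment": 2, "topic": 4})
batch = tokenizer(["multitask learning shares one encoder"], return_tensors="pt")
with torch.no_grad():
    logits = model("sentiment", batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, 2])
```

Using `nn.ModuleDict` registers every head as a submodule, so a single optimizer updates the shared encoder and all task heads; during training, batches from different tasks simply route through the matching head.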
Date | Time |
---|---|
June 22, 2023 (Thursday) | 09:30 AM - 04:30 PM |
July 6, 2023 (Thursday) | 09:30 AM - 04:30 PM |
July 20, 2023 (Thursday) | 09:30 AM - 04:30 PM |
August 3, 2023 (Thursday) | 09:30 AM - 04:30 PM |
August 17, 2023 (Thursday) | 09:30 AM - 04:30 PM |
August 31, 2023 (Thursday) | 09:30 AM - 04:30 PM |