
Abstract by Mason Poggemann

Personal Information


Presenter's Name

Mason Poggemann

Degree Level

Undergraduate

Abstract Information


Department

Computer Science

Faculty Advisor

Kevin Seppi

Title

Transfer Learning with Meta Learning

Abstract

In the field of natural language processing, much research has shown that transfer learning, in which a large pretrained model is fine-tuned on a new task, can achieve state-of-the-art results across a variety of tasks. This success, however, is largely limited to problems with large datasets available for fine-tuning. While there has been research on few-shot learning, current approaches such as MAML are not well suited to transfer learning. We have been working to fill that gap, starting with a meta-learned optimizer that learns to fine-tune weights during transfer, allowing more efficient use of data and smaller datasets.
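
To make the idea of a meta-learned optimizer for fine-tuning concrete, the sketch below is a minimal, illustrative example only, not the authors' implementation: a small network maps each parameter's gradient to an update, and its weights are trained in an outer loop so that a few inner fine-tuning steps on a task's support set give low loss on that task's query set. All names (MetaOptimizer, inner_finetune, the synthetic regression tasks) are hypothetical, and the base model is a toy linear regressor rather than a pretrained language model.

```python
# Illustrative sketch of a meta-learned update rule (PyTorch).
# Not the authors' method; names and task setup are hypothetical.
import torch
import torch.nn as nn


class MetaOptimizer(nn.Module):
    """Learned coordinate-wise update rule: gradient -> parameter update."""

    def __init__(self, hidden=16):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def forward(self, grad):
        flat = grad.reshape(-1, 1)          # treat every scalar gradient independently
        return self.net(flat).reshape(grad.shape)


def inner_finetune(weights, meta_opt, x, y, steps=3):
    """Fine-tune a linear model with the learned update rule, keeping the
    computation graph so the outer loop can backprop through the updates."""
    w, b = weights
    for _ in range(steps):
        loss = nn.functional.mse_loss(x @ w + b, y)
        gw, gb = torch.autograd.grad(loss, (w, b), create_graph=True)
        w = w + meta_opt(gw)                # learned update replaces "-lr * grad"
        b = b + meta_opt(gb)
    return w, b


# Outer loop: train the meta-optimizer across synthetic few-shot tasks.
torch.manual_seed(0)
meta_opt = MetaOptimizer()
outer = torch.optim.Adam(meta_opt.parameters(), lr=1e-3)

for step in range(200):
    # Each "task" is a random linear regression with a small support/query split.
    true_w = torch.randn(5, 1)
    x_support, x_query = torch.randn(10, 5), torch.randn(10, 5)
    y_support, y_query = x_support @ true_w, x_query @ true_w

    # Shared initialization, standing in for a pretrained model (zeros for brevity).
    w0 = torch.zeros(5, 1, requires_grad=True)
    b0 = torch.zeros(1, requires_grad=True)

    w, b = inner_finetune((w0, b0), meta_opt, x_support, y_support)
    query_loss = nn.functional.mse_loss(x_query @ w + b, y_query)

    outer.zero_grad()
    query_loss.backward()                   # gradients flow into the meta-optimizer
    outer.step()
```

The key design choice the sketch illustrates is bilevel training: the inner loop applies the learned update rule to task-specific weights, while the outer loop adjusts the update rule itself so that fine-tuning succeeds from only a handful of examples.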