Fast Few-shot Debugging for NLU Test Suites
Publication Date: April 18, 2022
Event: arXiv
Reference: https://arxiv.org/abs/2204.06555
Authors: Christopher Malon, Kai Li, and Erik Kruus (NEC Laboratories America, Inc.)
Abstract: We study few-shot debugging of transformer-based natural language understanding models, using recently popularized test suites not just to diagnose but to correct a problem. Given a few debugging examples of a certain phenomenon, and a held-out test set of the same phenomenon, we aim to maximize accuracy on the phenomenon at a minimal cost of accuracy on the original test set. We examine several methods that are faster than full-epoch retraining. We introduce a new fast method, which samples a few in-danger examples from the original training set. Compared to fast methods using parameter distance constraints or Kullback-Leibler divergence, we achieve superior original accuracy for comparable debugging accuracy.
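The core idea of the abstract's in-danger sampling can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes "in danger" is approximated by a small top-1 vs. top-2 probability margin on the original training set, and the function name `select_in_danger` is a hypothetical helper introduced here for clarity.

```python
import numpy as np

def select_in_danger(probs, k):
    """Pick the k training examples whose predicted-class margin is smallest,
    i.e. those most 'in danger' of being flipped by a debugging update.
    `probs` is an (n_examples, n_classes) array of model probabilities."""
    sorted_p = np.sort(probs, axis=1)
    margin = sorted_p[:, -1] - sorted_p[:, -2]  # top-1 minus top-2 probability
    return np.argsort(margin)[:k]

# Toy probabilities for four original training examples (binary task).
train_probs = np.array([
    [0.99, 0.01],   # confident prediction -> safe
    [0.55, 0.45],   # near the decision boundary -> in danger
    [0.90, 0.10],
    [0.52, 0.48],   # near the decision boundary -> in danger
])

in_danger = select_in_danger(train_probs, k=2)
print(sorted(in_danger.tolist()))  # -> [1, 3]

# A debugging batch would then mix the few bug examples with these sampled
# originals before a brief fine-tuning pass, instead of a full-epoch retrain.
```

The sampled originals act as an anchor during the fast update, which is how the method trades little original-test accuracy for the gain on the debugged phenomenon.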
Publication Link: https://arxiv.org/pdf/2204.06555.pdf