IEEE Access (Jan 2025)
Construction of Prompt Verbalizer Based on Dynamic Search Tree for Text Classification
Abstract
Prompt tuning has shown impressive performance on few-shot text classification tasks, yet the coverage of its crucial module, the verbalizer, has a considerable effect on the results. Existing methods have not jointly addressed breadth and depth in constructing the verbalizer. Specifically, breadth refers to the cross-granularity coverage of label words, while depth refers to the number of elements within a granularity that make a positive contribution to classification. This study proposes a dynamic search tree (DST) method to further enhance the coverage of the verbalizer. The core idea is to exploit the hierarchical relationships within the tree to automatically uncover hidden high-quality words, ensuring that the constructed verbalizer attains both greater breadth and greater depth. DST builds on knowledgeable prompt tuning (KPT), leveraging the breadth of KPT's label-word space, which encompasses characteristics at various granularities and from various perspectives, thereby addressing the verbalizer's breadth. Subsequently, a method for measuring the interrelation between words on a designated feature is proposed by analyzing the word vectors; it removes the noise introduced by irrelevant dimensions when extending the verbalizer and effectively improves the quality of the verbalizer in terms of depth. Extensive experiments on zero- and few-shot text classification tasks demonstrate the effectiveness of our method. Our source code is publicly available at https://github.com/XianliangXia/VerbalizerConstrucionByDynamicSearchTree.
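To illustrate the depth criterion described above, the following is a minimal sketch (our own illustration, not the paper's implementation) of measuring word relatedness on a designated feature: cosine similarity is computed only over embedding dimensions assumed relevant to that feature, with the remaining dimensions masked out as noise. The vectors and mask here are toy values, not learned embeddings.

```python
import numpy as np

def masked_cosine(u: np.ndarray, v: np.ndarray, mask: np.ndarray) -> float:
    """Cosine similarity restricted to the dimensions selected by `mask`;
    dimensions assumed irrelevant to the designated feature are zeroed out."""
    u_m, v_m = u * mask, v * mask
    denom = np.linalg.norm(u_m) * np.linalg.norm(v_m)
    return float(u_m @ v_m / denom) if denom else 0.0

# Toy word vectors: the first two dimensions carry the designated feature;
# the last two are treated as irrelevant noise.
a = np.array([1.0, 0.8, 5.0, -3.0])
b = np.array([0.9, 1.0, -4.0, 6.0])
mask = np.array([1.0, 1.0, 0.0, 0.0])

full = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
masked = masked_cosine(a, b, mask)
# The masked score stays high, while the full-vector score is
# dragged down by the noisy, feature-irrelevant dimensions.
```

Under this sketch, two words that agree on the designated feature but differ elsewhere would still score as closely related, which is the behavior the depth extension relies on.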
Keywords