AI Bibliography

WIKINDX Resources  

Sorensen, T., Robinson, J., Rytting, C. M., Shaw, A. G., Rogers, K. J., Delorey, A. P., et al. (2022). An information-theoretic approach to prompt engineering without ground truth labels. arXiv preprint arXiv:2203.11364.
Resource type: Preprint
BibTeX citation key: Sorensen2022
Categories: Artificial Intelligence, Cognitive Science, Computer Science, Data Sciences, Decision Theory, Engineering, General
Subcategories: Big data, Deep learning, Human factors engineering, Informatics, Machine learning, Psychology of human-AI interaction
Creators: Delorey, Fulda, Khalil, Robinson, Rogers, Rytting, Shaw, Sorensen, Wingate
Collection: arXiv preprint arXiv:2203.11364
Abstract
Pre-trained language models derive substantial linguistic and factual knowledge from the massive corpora on which they are trained, and prompt engineering seeks to align these models to specific tasks. Unfortunately, existing prompt engineering methods require significant amounts of labeled data, access to model parameters, or both. We introduce a new method for selecting prompt templates *without labeled examples* and *without direct access to the model*. Specifically, over a set of candidate templates, we choose the template that maximizes the mutual information between the input and the corresponding model output. Across 8 datasets representing 7 distinct NLP tasks, we show that when a template has high mutual information, it also has high accuracy on the task. On the largest model, selecting prompts with our method gets 90% of the way from the average prompt accuracy to the best prompt accuracy and requires no ground truth labels.
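The selection rule described in the abstract can be sketched as follows. This is an illustrative estimator only, not the authors' exact implementation: it scores each candidate template by the standard decomposition I(X; Y) = H(Y) − H(Y|X), where H(Y) is the entropy of the model's output distribution averaged over inputs and H(Y|X) is the average per-input output entropy, then picks the highest-scoring template. The function and variable names are hypothetical.

```python
import math

def entropy(dist, eps=1e-12):
    """Shannon entropy (in nats) of a discrete distribution."""
    return -sum(p * math.log(max(p, eps)) for p in dist)

def mutual_information(probs):
    """Estimate I(X; Y) = H(Y) - H(Y|X) for one template.

    `probs` is a list of per-input output distributions, e.g. the
    model's probabilities over answer classes for each unlabeled input.
    """
    n = len(probs)
    k = len(probs[0])
    # Marginal p(y): average the per-input distributions.
    marginal = [sum(row[j] for row in probs) / n for j in range(k)]
    # Conditional entropy H(Y|X): mean per-input entropy.
    h_cond = sum(entropy(row) for row in probs) / n
    return entropy(marginal) - h_cond

def select_template(template_probs):
    """Return (index of highest-MI template, all MI scores)."""
    scores = [mutual_information(p) for p in template_probs]
    return max(range(len(scores)), key=scores.__getitem__), scores
```

A template whose outputs are confident but varied across inputs (low H(Y|X), high H(Y)) scores high; one that always answers uniformly scores near zero, which matches the abstract's claim that high mutual information tracks high task accuracy.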
  
WIKINDX 6.7.0 | Total resources: 1621 | Username: -- | Bibliography: WIKINDX Master Bibliography | Style: American Psychological Association (APA)