Xun Liang1*,
Hanyu Wang1*,
Yezhaohui Wang2*,
Shichao Song1,
Jiawei Yang1,
Simin Niu1,
Jie Hu3,
Dan Liu3,
Shunyu Yao3,
Feiyu Xiong2,
Zhiyu Li2†
1Renmin University of China
2Institute for Advanced Algorithms Research, Shanghai
3China Telecom Research Institute
Important
🌟 Star Us! If you find our work helpful, please consider starring our GitHub repository to stay updated with the latest in Controllable Text Generation! 🌟
- [2024/08/26] We have updated our paper list, which can now be accessed on our GitHub page.
- [2024/08/23] Our paper is published on the arXiv platform: https://arxiv.org/abs/2408.12599.
- [2024/08/23] Our paper secured the second position on Hugging Face's Daily Papers module: https://huggingface.co/papers/2408.12599.
Welcome to the GitHub repository for our survey paper titled "Controllable Text Generation for Large Language Models: A Survey." This repository includes all the resources, code, and references related to the paper. Our objective is to provide a thorough overview of the techniques and methodologies used to control text generation in large language models (LLMs), with an emphasis on both theoretical underpinnings and practical implementations.
Our survey explores the following key areas:
Controllable Text Generation (CTG) must meet two main requirements:
- Meeting Predefined Control Conditions: Ensuring that the generated text adheres to specified criteria, such as thematic consistency, safety, and stylistic adherence.
- Maintaining Text Quality: Ensuring that the text produced is fluent, helpful, and diverse while balancing control with overall quality.
We define CTG as follows:
- Relationship with LLM Capabilities: CTG is an ability dimension that is orthogonal to the objective knowledge capabilities of LLMs, focusing on how information is presented to meet specific needs, such as style or sentiment.
- Injection of Control Conditions: Control conditions can be integrated into the text generation process at various stages using resources like text corpora, graphs, or databases.
- Quality of CTG: High-quality CTG strikes a balance between adherence to control conditions and maintaining fluency, coherence, and helpfulness in the generated text.
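This definition can be sketched formally. The factorization below is the standard conditional-generation view of CTG, written out here for illustration rather than copied from the paper:

```latex
% CTG as conditional generation: a control condition C (e.g., a topic,
% sentiment, or safety constraint) conditions the model's distribution.
% Autoregressive factorization of the controlled distribution:
P(X \mid C) = \prod_{t=1}^{n} P\!\left(x_t \mid x_{<t},\, C\right)
```

Training-stage methods change the model so that this conditional distribution is learned directly, while inference-stage methods approximate it by modifying an unconditional model's predictions at generation time.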
CTG tasks are categorized into two main types:
- Content Control (Linguistic Control/Hard Control): Focuses on managing content structure, such as format and vocabulary.
- Attribute Control (Semantic Control/Soft Control): Focuses on managing attributes like sentiment, style, and safety.
CTG methods are systematically categorized into two stages:
- Training-Stage Methods: Techniques such as model retraining, fine-tuning, and reinforcement learning that occur during the training phase.
- Inference-Stage Methods: Techniques such as prompt engineering, latent space manipulation, and decoding-time intervention applied during inference.
We review the evaluation methods and their applications in CTG:
- Evaluation Methods: We introduce a range of automatic and human-based evaluation metrics, along with benchmarks that assess the effectiveness of CTG techniques, focusing on how well they balance control and text quality.
- Applications: We explore CTG applications across both specialized vertical domains and general tasks.
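Two of the most common automatic metrics in this space are attribute (control) accuracy and distinct-n diversity. The helper below is a minimal sketch of both, assuming attribute labels come from some external classifier; the example labels and texts are invented:

```python
def control_accuracy(labels, target):
    """Fraction of generations whose classifier-assigned attribute matches the target."""
    return sum(label == target for label in labels) / len(labels)

def distinct_n(texts, n=2):
    """Distinct-n: unique n-grams / total n-grams across generations (diversity proxy)."""
    ngrams, total = set(), 0
    for text in texts:
        tokens = text.split()
        for i in range(len(tokens) - n + 1):
            ngrams.add(tuple(tokens[i:i + n]))
            total += 1
    return len(ngrams) / total if total else 0.0

# Hypothetical outputs: 3 of 4 generations hit the target sentiment.
labels = ["positive", "positive", "negative", "positive"]
acc = control_accuracy(labels, "positive")  # 0.75

texts = ["the movie was great", "the movie was fun"]
d2 = distinct_n(texts, n=2)  # 4 unique bigrams out of 6 total
```

Reporting both kinds of metrics together is what lets a benchmark quantify the control-versus-quality balance discussed above.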
This survey addresses key challenges in CTG research and suggests future directions:
- Key Challenges: Issues such as achieving precise control, maintaining fluency and coherence, and handling multi-attribute control in complex scenarios.
- Proposed Appeals: We advocate for a greater focus on real-world applications and the development of robust evaluation frameworks to advance CTG techniques.
This paper aims to provide valuable insights and guidance for researchers and developers working in the field of Controllable Text Generation. All references, along with a Chinese version of this survey, are open-sourced and available at https://github.com/IAAR-Shanghai/CTGSurvey.
We’ve compiled a comprehensive spreadsheet of all the papers we reviewed, accessible here. A more user-friendly table format is in progress.
Below, you'll find a categorized list of papers from 2023 and 2024, organized by Type, Phase, and Classification.
- CTRL: A Conditional Transformer Language Model for Controllable Generation
Salesforce Research, arxiv'19, 2019 [Paper] - Parallel Refinements for Lexically Constrained Text Generation with BART
HKU, EMNLP'20, 2020 [Paper] - PAIR: Planning and Iterative Refinement in Pre-trained Transformers for Long Text Generation
Northeastern University, EMNLP'20, 2020 [Paper] - Pre-Training Based Personalized Dialogue Generation Model with Persona-sparse Data
THU, AAAI'20, 2020 [Paper] - POINTER: Constrained Progressive Text Generation via Insertion-based Generative Pre-training
Microsoft, EMNLP'20, 2020 [Paper] - CoCon: A Self-Supervised Approach for Controlled Text Generation
NTU, ICLR'21, 2021 [Paper] - A Simple and Efficient Multi-Task Learning Approach for Conditioned Dialogue Generation
Université de Montréal, NAACL'21, 2021 [Paper] - CHAE: Fine-Grained Controllable Story Generation with Characters, Actions and Emotions
Tongji University, COLING'22, 2022 [Paper] - Director: Generator-Classifiers For Supervised Language Modeling
McGill University, IJCNLP'22, 2022 [Paper] - Emotional Text Generation with Hard Constraints
Huaqiao University, ICFTIC'22, 2022 [Paper] - FAST: Improving Controllability for Text Generation with Feedback Aware Self-Training
Microsoft, arxiv'22, 2022 [Paper] - Fine-Grained Controllable Text Generation Using Non-Residual Prompting
Research Institutes of Sweden, ACL'22, 2022 [Paper] - Genre-Controllable Story Generation via Supervised Contrastive Learning
Sungkyunkwan University, WWW'22, 2022 [Paper] - Fine-Grained Sentiment-Controlled Text Generation Approach Based on Pre-Trained Language Model
Zhejiang University of Technology, Appl. Sci., 2023 [Paper] - Lexical Complexity Controlled Sentence Generation for Language Learning
BLCU, CCL'23, 2023 [Paper] - Semantic Space Grounded Weighted Decoding for Multi-Attribute Controllable Dialogue Generation
Shanghai Jiao Tong University, EMNLP'23, 2023 [Paper] - SweCTRL-Mini: a data-transparent Transformer-based large language model for controllable text generation in Swedish
KTH Royal Institute of Technology, arxiv'23, 2023 [Paper]
- Technical Report: Auxiliary Tuning and its Application to Conditional Text Generation
AI21, arxiv'20, 2020 [Paper] - DisCup: Discriminator Cooperative Unlikelihood Prompt-tuning for Controllable Text Generation
BIT, EMNLP'22, 2022 [Paper] - Finetuned Language Models are Zero-Shot Learners
Google, ICLR'22, 2022 [Paper] - MReD: A Meta-Review Dataset for Structure-Controllable Text Generation
Alibaba, ACL'22_findings, 2022 [Paper] - Language Detoxification with Attribute-Discriminative Latent Space
KAIST, ACL'23, 2023 [Paper] - Controlled Text Generation with Hidden Representation Transformations
UCLA, ACL'23_findings, 2023 [Paper] - CLICK: Controllable Text Generation with Sequence Likelihood Contrastive Learning
THU, ACL'24_findings, 2023 [Paper] - Seen to Unseen: Exploring Compositional Generalization of Multi-Attribute Controllable Dialogue Generation
BUPT, ACL'23, 2023 [Paper] - DeepPress: guided press release topic-aware text generation using ensemble transformers
Université de Moncton, Neural Computing and Applications, 2023 [Paper] - DuNST: Dual Noisy Self Training for Semi-Supervised Controllable Text Generation
The University of British Columbia, ACL'23, 2023 [Paper] - Controlled text generation with natural language instructions
ETH Zürich, ICML'23, 2023 [Paper] - Controlling keywords and their positions in text generation
Hitachi, Ltd. Research and Development Group, INLG'23, 2023 [Paper] - Toward Unified Controllable Text Generation via Regular Expression Instruction
ISCAS, IJCNLP-AACL'23, 2023 [Paper] - Controllable Text Generation with Residual Memory Transformer
BIT, arxiv'23, 2023 [Paper] - Continuous Language Model Interpolation for Dynamic and Controllable Text Generation
Harvard University, arxiv'24, 2024 [Paper] - CoDa: Constrained Generation based Data Augmentation for Low-Resource NLP
UMD, arxiv'24, 2024 [Paper] - Contrastive Perplexity for Controlled Generation: An Application in Detoxifying Large Language Models
SAP, arxiv'24, 2024 [Paper] - CTGGAN: Controllable Text Generation with Generative Adversarial Network
JIUTIAN Team, China Mobile Research, Appl. Sci., 2024 [Paper] - ECCRG: An Emotion- and Content-Controllable Response Generation Model
TJU, Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, 2024 [Paper] - LiFi: Lightweight Controlled Text Generation with Fine-Grained Control Codes
THU, arxiv'24, 2024 [Paper]
- Learning to summarize with human feedback
OpenAI, NeurIPS'20, 2020 [Paper] - A Distributional Approach to Controlled Text Generation
Muhammad Khalifa, ICLR'21, 2021 [Paper] - Efficient Reinforcement Learning for Unsupervised Controlled Text Generation
agaralabs, arxiv'22, 2022 [Paper] - Training language models to follow instructions with human feedback
OpenAI, NeurIPS'22, 2022 [Paper] - STEER: Unified Style Transfer with Expert Reinforcement
University of Washington, EMNLP'23_findings, 2023 [Paper] - Prompt-Based Length Controlled Generation with Multiple Control Types
NWPU, ACL'24_findings, 2024 [Paper] - Reinforcement Learning with Dynamic Multi-Reward Weighting for Multi-Style Controllable Generation
University of Minnesota, arxiv'24, 2024 [Paper] - Safe RLHF: Safe Reinforcement Learning from Human Feedback
PKU, ICLR'24_spotlight, 2024 [Paper] - Token-level Direct Preference Optimization
IACAS, arxiv'24, 2024 [Paper] - Reinforcement Learning with Token-level Feedback for Controllable Text Generation
HUST, NAACL'24, 2024 [Paper]
- AutoPrompt: Eliciting Knowledge from Language Models with Automatically Generated Prompts
UCI&UCB, EMNLP'20, 2020 [Paper] - Attribute Alignment: Controlling Text Generation from Pre-trained Language Models
University of California, EMNLP'21_findings, 2021 [Paper] - GPT Understands, Too
THU, arxiv'21, 2021 [Paper] - Prefix-Tuning: Optimizing Continuous Prompts for Generation
University of California, Santa Barbara, ACL'21, 2021 [Paper] - The Power of Scale for Parameter-Efficient Prompt Tuning
Google, EMNLP'21, 2021 [Paper] - Controllable Natural Language Generation with Contrastive Prefixes
UCSB, ACL'22_findings, 2022 [Paper] - A Distributional Lens for Multi-Aspect Controllable Text Generation
HIT, EMNLP'22_oral, 2022 [Paper] - P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Across Scales and Tasks
THU, ACL'22, 2022 [Paper] - Controlled Text Generation using T5 based Encoder-Decoder Soft Prompt Tuning and Analysis of the Utility of Generated Text in AI
Queen Mary University of London, arxiv'22, 2022 [Paper] - Controllable Generation of Dialogue Acts for Dialogue Systems via Few-Shot Response Generation and Ranking
University of California Santa Cruz, SIGDIAL'23, 2023 [Paper] - PCFG-based Natural Language Interface Improves Generalization for Controlled Text Generation
Johns Hopkins University, SEM'23, 2023 [Paper] - Harnessing the Plug-and-Play Controller by Prompting
BUAA, GEM'23, 2023 [Paper] - An Extensible Plug-and-Play Method for Multi-Aspect Controllable Text Generation
THU&Meituan, ACL'23, 2023 [Paper] - Tailor: A Soft-Prompt-Based Approach to Attribute-Based Controlled Text Generation
Alibaba, ACL'23, 2023 [Paper] - InstructCMP: Length Control in Sentence Compression through Instruction-based Large Language Models
CNU, ACL'24_findings, 2024 [Paper] - Topic-Oriented Controlled Text Generation for Social Networks
WHU, Journal of Signal Processing Systems, 2024 [Paper] - Plug and Play with Prompts: A Prompt Tuning Approach for Controlling Text Generation
University of Toronto, AAAI'24_workshop, 2024 [Paper] - TrustAgent: Towards Safe and Trustworthy LLM-based Agents through Agent Constitution
UCSB, arxiv'24, 2024 [Paper]
- Deep Extrapolation for Attribute-Enhanced Generation
Salesforce Research, NeurIPS'21, 2021 [Paper] - Extracting Latent Steering Vectors from Pretrained Language Models
Allen Institute for Artificial Intelligence, ACL'22_findings, 2022 [Paper] - Activation Addition: Steering Language Models Without Optimization
UC Berkeley, arxiv'23, 2023 [Paper] - Evaluating, Understanding, and Improving Constrained Text Generation for Large Language Models
PKU, arxiv'23, 2023 [Paper] - In-context Vectors: Making In Context Learning More Effective and Controllable Through Latent Space Steering
Stanford University, arxiv'23, 2023 [Paper] - MacLaSa: Multi-Aspect Controllable Text Generation via Efficient Sampling from Compact Latent Space
ICT CAS, EMNLP'23_findings, 2023 [Paper] - Miracle: Towards Personalized Dialogue Generation with Latent-Space Multiple Personal Attribute Control
HUST, EMNLP'23_findings, 2023 [Paper] - Controllable Text Generation via Probability Density Estimation in the Latent Space
HIT, EMNLP'23, 2023 [Paper] - Self-Detoxifying Language Models via Toxification Reversal
The Hong Kong Polytechnic University, EMNLP'23, 2023 [Paper] - DESTEIN: Navigating Detoxification of Language Models via Universal Steering Pairs and Head-wise Activation Fusion
Tongji University, arxiv'24, 2024 [Paper] - FreeCtrl: Constructing Control Centers with Feedforward Layers for Learning-Free Controllable Text Generation
NTU, ACL'24, 2024 [Paper] - InferAligner: Inference-Time Alignment for Harmlessness through Cross-Model Guidance
FuDan, arxiv'24, 2024 [Paper] - Multi-Aspect Controllable Text Generation with Disentangled Counterfactual Augmentation
NJU, ACL'24, 2024 [Paper] - Style Vectors for Steering Generative Large Language Models
German Aerospace Center (DLR), EACL'24_findings, 2024 [Paper]
- Generalization through Memorization: Nearest Neighbor Language Models
Urvashi Khandelwal, ICLR'20, 2020 [Paper] - MEGATRON-CNTRL: Controllable Story Generation with External Knowledge Using Large-Scale Language Models
USTHK, EMNLP'20, 2020 [Paper] - Plug-and-Play Conversational Models
USTHK, EMNLP'20 findings, 2020 [Paper] - Plug and Play Language Models: A Simple Approach to Controlled Text Generation
California Institute of Technology, ICLR'20, 2020 [Paper] - DExperts: Decoding-Time Controlled Text Generation with Experts and Anti-Experts
University of Washington, ACL'21, 2021 [Paper] - FUDGE: Controlled Text Generation With Future Discriminators
UC Berkeley, NAACL'21, 2021 [Paper] - GeDi: Generative Discriminator Guided Sequence Generation
Salesforce Research, EMNLP'21, 2021 [Paper] - Controllable Generation from Pre-trained Language Models via Inverse Prompting
THU, KDD'21, 2021 [Paper] - A Plug-and-Play Method for Controlled Text Generation
ETH Zürich, EMNLP'21 findings, 2021 [Paper] - Controlled Text Generation as Continuous Optimization with Multiple Constraints
CMU, NeurIPS'21, 2021 [Paper] - NeuroLogic Decoding: (Un)supervised Neural Text Generation with Predicate Logic Constraints
Allen Institute for Artificial Intelligence, NAACL'21, 2021 [Paper] - Plug-and-Blend: A Framework for Controllable Story Generation with Blended Control Codes
Georgia Institute of Technology, Workshop on Narrative Understanding'21, 2021 [Paper] - A Causal Lens for Controllable Text Generation
UC San Diego, NeurIPS'21, 2021 [Paper] - Self-Diagnosis and Self-Debiasing: A Proposal for Reducing Corpus-Based Bias in NLP
LMU Munich, TACL'21, 2021 [Paper] - Controllable Text Generation for All Ages: Evaluating a Plug-and-Play Approach to Age-Adapted Dialogue
University of Amsterdam, GEM'22, 2022 [Paper] - BeamR: Beam Reweighing with Attribute Discriminators for Controllable Text Generation
Vanguard, AACL'22_findings, 2022 [Paper] - Classifiers are Better Experts for Controllable Text Generation
Tinkoff, NeurIPS'22 workshop Transfer Learning for Natural Language Processing, 2022 [Paper] - Improving Controllable Text Generation with Position-Aware Weighted Decoding
HIT, ACL'22_findings, 2022 [Paper] - Controllable Text Generation with Language Constraints
Princeton University, arxiv, 2022 [Paper] - COLD Decoding: Energy-based Constrained Text Generation with Langevin Dynamics
University of Washington, NeurIPS'22, 2022 [Paper] - Collocation2Text: Controllable Text Generation from Guide Phrases in Russian
Vyatka State University, Dialogue-2022 conference, 2022 [Paper] - CounterGeDi: A Controllable Approach to Generate Polite, Detoxified and Emotional Counterspeech
Indian Institute of Technology, IJCAI'22, 2022 [Paper] - Bridging the Gap Between Training and Inference of Bayesian Controllable Language Models
THU, arxiv'22, 2022 [Paper] - Nearest Neighbor Language Models for Stylistic Controllable Generation
University of Marburg, GEM'22, 2022 [Paper] - Mix and Match: Learning-free Controllable Text Generation using Energy Language Models
University of California San Diego, ACL'22, 2022 [Paper] - Gradient-based Constrained Sampling from Language Models
CMU, EMNLP'22, 2022 [Paper] - Controllable Text Generation with Neurally-Decomposed Oracle
UCLA, NeurIPS'22, 2022 [Paper] - NeuroLogic A*esque Decoding: Constrained Text Generation with Lookahead Heuristics
Allen Institute for Artificial Intelligence, NAACL'22, 2022 [Paper] - Plug-and-Play Recipe Generation with Content Planning
University of Cambridge, EMNLP 2022 GEM workshop, 2022 [Paper] - Sequentially Controlled Text Generation
University of Southern California, EMNLP'22, 2022 [Paper] - Air-Decoding: Attribute Distribution Reconstruction for Decoding-Time Controllable Text Generation
USTC, EMNLP'23, 2023 [Paper] - A Block Metropolis-Hastings Sampler for Controllable Energy-based Text Generation
UCSD, CoNLL'23, 2023 [Paper] - BOLT: Fast Energy-based Controlled Text Generation with Tunable Biases
University of Michigan, ACL'23_short, 2023 [Paper] - Controlled Decoding from Language Models
Google, NeurIPS_SoLaR'23, 2023 [Paper] - Focused Prefix Tuning for Controllable Text Generation
Tokyo Institute of Technology, ACL'23_short, 2023 [Paper] - Goodtriever: Adaptive Toxicity Mitigation with Retrieval-augmented Models
Cohere For AI, EMNLP'23_findings, 2023 [Paper] - GRACE: Gradient-guided Controllable Retrieval for Augmenting Attribute-based Text Generation
National University of Defense Technology, ACL'23_findings, 2023 [Paper] - An Invariant Learning Characterization of Controlled Text Generation
Columbia University, ACL'23, 2023 [Paper] - Style Locality for Controllable Generation with kNN Language Models
University of Marburg, SIGDIAL'23_TamingLLM workshop, 2023 [Paper] - Detoxifying Text with MaRCo: Controllable Revision with Experts and Anti-Experts
University of Washington&CMU, ACL'23_short, 2023 [Paper] - MIL-Decoding: Detoxifying Language Models at Token-Level via Multiple Instance Learning
PKU, ACL'23, 2023 [Paper] - Controllable Story Generation Based on Perplexity Minimization
Vyatka State University, AIST 2023, 2023 [Paper] - PREADD: Prefix-Adaptive Decoding for Controlled Text Generation
UC Berkeley, ACL'23_findings, 2023 [Paper] - Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model
UNC-Chapel Hill, EMNLP'23_short, 2023 [Paper] - Controlled Text Generation for Black-box Language Models via Score-based Progressive Editor
Seoul National University, ACL'24, 2023 [Paper] - Successor Features for Efficient Multisubject Controlled Text Generation
Microsoft, arxiv'23, 2023 [Paper] - Controlled Text Generation via Language Model Arithmetic
ETH Zurich, ICLR'24_spotlight, 2024 [Paper] - COLD-Attack: Jailbreaking LLMs with Stealthiness and Controllability
UIUC, arxiv'24, 2024 [Paper] - Controlled Text Generation for Large Language Model with Dynamic Attribute Graphs
RUC, arxiv'24, 2024 [Paper] - DECIDER: A Rule-Controllable Decoding Strategy for Language Generation by Imitating Dual-System Cognitive Theory
BIT, TKDE_submitted, 2024 [Paper] - Word Embeddings Are Steers for Language Models
UIUC, ACL'24, 2024 [Paper] - RAIN: Your Language Models Can Align Themselves without Finetuning
PKU, ICLR'24, 2024 [Paper] - ROSE Doesn't Do That: Boosting the Safety of Instruction-Tuned Large Language Models with Reverse Prompt Contrastive Decoding
WHU, arxiv'24, 2024 [Paper] - Uncertainty is Fragile: Manipulating Uncertainty in Large Language Models
Rutgers, arxiv'24, 2024 [Paper]
- Exploring Controllable Text Generation Techniques
CMU, COLING'20, 2020 [Paper] - Conditional Text Generation for Harmonious Human-Machine Interaction
NWPT, TIST'21, 2021 [Paper] - How to Control Sentiment in Text Generation: A Survey of the State-of-the-Art in Sentiment-Control Techniques
DCU, WASSA'23, 2023 [Paper] - A Survey of Controllable Text Generation using Transformer-based Pre-trained Language Models
BIT, ACM CSUR, 2023 [Paper] - A Recent Survey on Controllable Text Generation: A Causal Perspective
Tongji, Fundamental Research, 2024 [Paper]