Retrieval-Augmented LLMs: NeurIPS 2023 & ICLR 2024
- Self-RAG: Self-reflective Retrieval Augmented Generation (NeurIPS 2023)
- Retrieval-Augmented Multiple Instance Learning (NeurIPS 2023)
- Fine-grained Late-interaction Multi-modal Retrieval for Retrieval Augmented Visual Question Answering (NeurIPS 2023)
- Accelerating Retrieval-augmented Language Model Serving with Speculation (ICLR 2024)
- RA-DIT: Retrieval-Augmented Dual Instruction Tuning (ICLR 2024)
- Self-RAG: Learning to Retrieve, Generate, and Critique through Self-Reflection (ICLR 2024)
- In-Context Learning with Retrieval Augmented Encoder-Decoder Language Models (ICLR 2024)
- BTR: Binary Token Representations for Efficient Retrieval Augmented Language Models (ICLR 2024)
- Hybrid Retrieval-Augmented Generation for Real-time Composition Assistance (ICLR 2024)
- Retrieval-augmented Text-to-3D Generation (ICLR 2024)
- InstructRetro: Instruction Tuning post Retrieval-Augmented Pretraining (ICLR 2024)
- PaperQA: Retrieval-Augmented Generative Agent for Scientific Research (ICLR 2024)
- Making Retrieval-Augmented Language Models Robust to Irrelevant Context (ICLR 2024)
- RECOMP: Improving Retrieval-Augmented LMs with Context Compression and Selective Augmentation (ICLR 2024)
- RAPTOR: Recursive Abstractive Processing for Tree-Organized Retrieval (ICLR 2024)
- Retrieval-augmented Vision-Language Representation for Fine-grained Recognition (ICLR 2024)
- KITAB: Evaluating LLMs on Constraint Satisfaction for Information Retrieval (ICLR 2024)
- Understanding Retrieval Augmentation for Long-Form Question Answering (ICLR 2024)
- Personalized Language Generation via Bayesian Metric Augmented Retrieval (ICLR 2024)
- Retrieval is Accurate Generation (ICLR 2024)
- Adapting Retrieval Models to Task-Specific Goals using Reinforcement Learning (ICLR 2024)
- Adaptive Chameleon or Stubborn Sloth: Revealing the Behavior of Large Language Models in Knowledge Conflicts (ICLR 2024)
- Don't forget private retrieval: distributed private similarity search for large language models (ICLR 2024)
- TabR: Tabular Deep Learning Meets Nearest Neighbors in 2023 (ICLR 2024)