| 1 |
Does Visual Grounding Enhance the Understanding of Embodied Knowledge in Large Language Models? |
Proposes a benchmark for embodied-knowledge understanding to evaluate whether visual grounding improves the perceptual abilities of large language models. |
large language model multimodal visual grounding |
|
|
| 2 |
Real-Time World Crafting: Generating Structured Game Behaviors from Natural Language with Large Language Models |
Proposes an LLM-based framework for real-time game-world construction that generates structured game behaviors from natural language. |
large language model chain-of-thought |
|
|
| 3 |
Knowing the Facts but Choosing the Shortcut: Understanding How Large Language Models Compare Entities |
Reveals heuristic biases in how large language models compare entities and investigates the effect of model scale. |
large language model chain-of-thought |
|
|
| 4 |
ChiKhaPo: A Large-Scale Multilingual Benchmark for Evaluating Lexical Comprehension and Generation in Large Language Models |
ChiKhaPo: a large-scale multilingual benchmark for evaluating lexical comprehension and generation in large language models. |
large language model |
|
|
| 5 |
Beacon: Single-Turn Diagnosis and Mitigation of Latent Sycophancy in Large Language Models |
Beacon: single-turn diagnosis and mitigation of latent sycophancy in large language models. |
large language model |
|
|
| 6 |
Structured yet Bounded Temporal Understanding in Large Language Models |
Studies large language models' temporal understanding across different temporal frames of reference. |
large language model |
|
|
| 7 |
Natural Language Processing for Cardiology: A Narrative Review |
A narrative review of how natural language processing techniques address challenges in cardiology. |
large language model multimodal |
|
|
| 8 |
Mapping from Meaning: Addressing the Miscalibration of Prompt-Sensitive Language Models |
Proposes a prompt-calibration method based on sampling in a semantic concept space, improving language models' uncertainty estimation. |
large language model |
|
|
| 9 |
Vocab Diet: Reshaping the Vocabulary of LLMs with Vector Arithmetic |
Proposes Vocab Diet, which reshapes the LLM vocabulary via vector arithmetic to improve vocabulary diversity. |
large language model |
|
|
| 10 |
Bits Leaked per Query: Information-Theoretic Bounds on Adversarial Attacks against LLMs |
Proposes an information-theoretic framework that quantifies information leakage in adversarial attacks against LLMs, guiding the trade-off between transparency and security. |
large language model |
|
|
| 11 |
Neuronal Group Communication for Efficient Neural Representation |
Proposes the Neuronal Group Communication (NGC) framework to improve the efficiency, modularity, and interpretability of neural networks. |
large language model |
|
|
| 12 |
Verifiable Fine-Tuning for LLMs: Zero-Knowledge Training Proofs Bound to Data Provenance and Policy |
Proposes a verifiable fine-tuning method that provides zero-knowledge training proofs for large language models, bound to data provenance and policy constraints. |
large language model |
|
|
| 13 |
U-Codec: Ultra Low Frame-rate Neural Speech Codec for Fast High-fidelity Speech Generation |
Proposes U-Codec, an ultra-low frame-rate neural speech codec for fast, high-fidelity speech generation. |
large language model |
|
|
| 14 |
so much depends / upon / a whitespace: Why Whitespace Matters for Poets and LLMs |
On whitespace in poetry: studies how whitespace affects poetry writing and LLM-generated poems. |
large language model |
|
|
| 15 |
Investigating the Impact of Rationales for LLMs on Natural Language Understanding |
Investigates the impact of rationales on LLM performance in natural language understanding tasks and proposes corresponding optimizations. |
chain-of-thought |
|
|
| 16 |
All You Need is One: Capsule Prompt Tuning with a Single Vector |
Proposes CaPT: capsule prompt tuning with a single vector, improving large language model performance on downstream tasks. |
large language model |
|
|