cs.LG (2025-06-14)
📊 18 papers total | 🔗 1 with code
🎯 Interest Area Navigation
Pillar 2: RL Algorithms & Architecture (10)
Pillar 9: Embodied Foundation Models (7 🔗1)
Pillar 8: Physics-based Animation (1)
🔬 Pillar 2: RL Algorithms & Architecture (10 papers)
🔬 Pillar 9: Embodied Foundation Models (7 papers)
| # | Title | One-line Summary | Tags | 🔗 | ⭐ |
|---|---|---|---|---|---|
| 11 | Unveiling Confirmation Bias in Chain-of-Thought Reasoning | Reveals confirmation bias in large language models during chain-of-thought reasoning | large language model chain-of-thought | ✅ | |
| 12 | Exploring the Secondary Risks of Large Language Models | Explores secondary risks of LLMs under benign interactions and proposes the SecLens evaluation framework | large language model | | |
| 13 | HYPER: A Foundation Model for Inductive Link Prediction with Knowledge Hypergraphs | Proposes HYPER for inductive link prediction on knowledge hypergraphs | foundation model | | |
| 14 | Beyond Frequency: The Role of Redundancy in Large Language Model Memorization | Reveals the role of redundancy in LLM memorization and proposes redundancy-based data preprocessing | large language model | | |
| 15 | A Framework for Generating Conversational Recommendation Datasets from Behavioral Interactions | ConvRecStudio: a framework for generating conversational recommendation datasets from behavioral interactions | large language model | | |
| 16 | Automatic Expert Discovery in LLM Upcycling via Sparse Interpolated Mixture-of-Experts | Proposes SIMoE, which uses sparse interpolated mixture-of-experts for automatic expert discovery and capability gains in LLM upcycling | large language model | | |
| 17 | QiMeng-Attention: SOTA Attention Operator is generated by SOTA Attention Algorithm | Proposes QiMeng-Attention, using LLMs to automatically generate high-performance attention operators, addressing long-context performance bottlenecks | large language model | | |
🔬 Pillar 8: Physics-based Animation (1 paper)
| # | Title | One-line Summary | Tags | 🔗 | ⭐ |
|---|---|---|---|---|---|
| 18 | Path-specific effects for pulse-oximetry guided decisions in critical care | Uses path-specific effects to study how pulse-oximeter bias affects critical-care decisions | PULSE | | |