| 1 |
Synthesize-on-Graph: Knowledgeable Synthetic Data Generation for Continue Pre-training of Large Language Models |
Proposes the Synthetic-on-Graph framework, which leverages knowledge graphs to generate synthetic data and improve LLM performance in data-scarce scenarios. |
large language model chain-of-thought |
|
|
| 2 |
Enhancing ML Model Interpretability: Leveraging Fine-Tuned Large Language Models for Better Understanding of AI |
Proposes an interactive XAI reference architecture based on fine-tuned LLMs to improve model interpretability. |
large language model |
|
|
| 3 |
On the effectiveness of Large Language Models in the mechanical design domain |
Evaluates the effectiveness of large language models in the mechanical design domain, revealing domain-specific failure modes. |
large language model |
|
|
| 4 |
Towards High-Fidelity Synthetic Multi-platform Social Media Datasets via Large Language Models |
Proposes an LLM-based method for generating multi-platform social media datasets, addressing data-acquisition challenges. |
large language model |
|
|
| 5 |
Helping Large Language Models Protect Themselves: An Enhanced Filtering and Summarization System |
Proposes a filtering and summarization system that requires no retraining, strengthening LLM defenses against adversarial attacks. |
large language model |
|
|
| 6 |
Multimodal Transformers are Hierarchical Modal-wise Heterogeneous Graphs |
Proposes the graph-structured interlaced-masked multimodal Transformer (GsiT), improving the efficiency of multimodal sentiment analysis. |
multimodal |
|
|
| 7 |
Do We Need a Detailed Rubric for Automated Essay Scoring using Large Language Models? |
For LLM-based automated essay scoring, shows that a simplified rubric can reduce token usage while maintaining accuracy. |
large language model |
|
|
| 8 |
Large Language Model-Driven Dynamic Assessment of Grammatical Accuracy in English Language Learner Writing |
Uses large language models to dynamically assess grammatical accuracy in English language learners' writing. |
large language model |
|
|
| 9 |
AURA: A Diagnostic Framework for Tracking User Satisfaction of Interactive Planning Agents |
AURA: a diagnostic framework for tracking user satisfaction with interactive planning agents. |
large language model instruction following |
|
|
| 10 |
Always Tell Me The Odds: Fine-grained Conditional Probability Estimation |
Proposes a fine-grained conditional probability estimation model, improving the precision of LLM probability predictions under uncertain information. |
large language model |
|
|
| 11 |
PREMISE: Matching-based Prediction for Accurate Review Recommendation |
Proposes PREMISE, a matching-based architecture for improving the accuracy of multimodal review recommendation. |
multimodal |
|
|
| 12 |
Leveraging LLMs to Create Content Corpora for Niche Domains |
Uses large language models to build high-quality content corpora for niche domains. |
large language model |
|
|
| 13 |
MateICL: Mitigating Attention Dispersion in Large-Scale In-Context Learning |
MateICL: mitigating attention dispersion in large-scale in-context learning. |
large language model |
✅ |
|
| 14 |
Position: Enough of Scaling LLMs! Lets Focus on Downscaling |
Moving past the LLM scaling race: a position paper proposing a new paradigm focused on downscaling models. |
large language model |
|
|