| # | Title | Summary | Keywords | Selected |
|---|-------|---------|----------|----------|
| 1 | Weighted Multi-Prompt Learning with Description-free Large Language Model Distillation | Proposes DeMul, a description-free multi-prompt learning method that improves VLM performance on downstream tasks via LLM knowledge distillation. | distillation, large language model | |
| 2 | Squeeze the Soaked Sponge: Efficient Off-policy Reinforcement Finetuning for Large Language Model | Proposes ReMix, which improves the reasoning ability of large language models through efficient off-policy reinforcement finetuning. | reinforcement learning, PPO, large language model | |
| 3 | Simple Yet Effective: An Information-Theoretic Approach to Multi-LLM Uncertainty Quantification | MUSE: a simple yet effective information-theoretic approach to multi-LLM uncertainty quantification. | distillation, large language model, chain-of-thought | ✅ |
| 4 | Transferable Parasitic Estimation via Graph Contrastive Learning and Label Rebalancing in AMS Circuits | Proposes CircuitGCL, which achieves transferable parasitic estimation in AMS circuits via graph contrastive learning and label rebalancing. | representation learning, contrastive learning | ✅ |
| 5 | Few-shot Learning on AMS Circuits and Its Application to Parasitic Capacitance Prediction | CircuitGPS: a few-shot learning method for parasitic capacitance prediction on AMS circuits. | representation learning, MAE | |