We present a universal parameter-efficient transfer learning method, termed Predict-Interpolate Tuning ($\pi$-Tuning), for vision, language, and vision-language tasks.
Figure 2. Overview of the π-Tuning method. Step 1 is the traditional parameter-efficient transfer learning step; the interpolation weights (and experts) are then trained on the target task.
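To make the interpolation step concrete, here is a minimal PyTorch sketch of combining frozen task-specific expert parameters with learnable mixing weights trained on the target task. It assumes all experts share one parameter shape; the class and names (`PiInterpolation`, `alpha`) are illustrative and not taken from the official repo.

```python
import torch
import torch.nn as nn

class PiInterpolation(nn.Module):
    """Sketch: interpolate frozen expert parameters with learnable weights."""

    def __init__(self, expert_params):
        super().__init__()
        # Frozen expert parameters (e.g., LoRA/adapter/prompt weights),
        # stacked along a new leading "expert" dimension.
        self.register_buffer(
            "experts", torch.stack([p.detach() for p in expert_params])
        )
        # One logit per expert; softmax keeps the mixing weights on the simplex.
        self.alpha = nn.Parameter(torch.zeros(len(expert_params)))

    def forward(self):
        w = torch.softmax(self.alpha, dim=0)  # interpolation weights
        # Weighted sum over the expert dimension -> one merged parameter tensor.
        return torch.einsum("e,e...->...", w, self.experts)

# Toy usage: merge three same-shaped expert weight matrices.
experts = [torch.randn(8, 8) for _ in range(3)]
merge = PiInterpolation(experts)
merged = merge()  # (8, 8) tensor, usable in place of a single expert's weight
```

In this sketch only `alpha` receives gradients from the target-task loss while the backbone and the stacked experts stay frozen, which mirrors the parameter-efficient spirit of the method.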
This repo is the official implementation of the paper $\pi$-Tuning: Transferring Multimodal Foundation Models with Optimal Multi-task Interpolation.
Foundation models have achieved great advances in multi-task learning with a unified interface of unimodal and multimodal tasks.
Predict-Interpolate Tuning aggregates the parameters of lightweight task-specific experts learned from similar tasks to aid the target downstream task.
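The "predict" half of the method first ranks candidate tasks by similarity so that only the most related experts are interpolated. Below is a hedged sketch of that selection step using cosine similarity over hypothetical task embeddings; how the embeddings themselves are computed is left abstract here and may differ from the paper's modality-independent similarity space.

```python
import torch
import torch.nn.functional as F

def predict_similar_tasks(target_emb, task_embs, k=3):
    """Rank candidate tasks by cosine similarity to the target task
    and return the top-k indices and scores (illustrative helper)."""
    # Broadcast (1, d) against (n, d) -> per-task similarity of shape (n,).
    sims = F.cosine_similarity(target_emb.unsqueeze(0), task_embs, dim=-1)
    topk = torch.topk(sims, k)
    return topk.indices, topk.values

# Toy example: 10 candidate tasks with 64-dim embeddings.
task_embs = torch.randn(10, 64)
target_emb = torch.randn(64)
idx, scores = predict_similar_tasks(target_emb, task_embs, k=3)
print(idx.tolist(), scores.tolist())
```

The selected experts would then be handed to the interpolation step sketched above.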
π-Tuning: Transferring Multimodal Foundation Models with Optimal Multi-task Interpolation. Chengyue Wu, Teng Wang, Yixiao Ge, Zeyu Lu, Ruisong Zhou, ...
π-Tuning: Transferring Multimodal Foundation Models with Optimal Multi-task Interpolation surpasses fine-tuning and other parameter-efficient transfer learning methods.