Research

Conference Publications

Tackling Dimensional Collapse toward Comprehensive Universal Domain Adaptation
Hung-Chieh Fang, Po-Yi Lu, and Hsuan-Tien Lin
We investigate a self-supervised loss that tackles dimensional collapse in the target representation, stepping toward more comprehensive Universal Domain Adaptation.
Tan-Ha Mai, Nai-Xuan Ye, Yu-Wei Kuan, Po-Yi Lu, and Hsuan-Tien Lin
We make Complementary-Label Learning practical on large-scale data with Vision-Language Model-based auto-labeling.
Oscar Chew, Po-Yi Lu, Jayden Lin, and Hsuan-Tien Lin
We propose straightforward textual perturbations on prompts to defend text-to-image models against backdoor attacks.
A More Robust Baseline for Active Learning by Injecting Randomness to Uncertainty Sampling
Po-Yi Lu, Chun-Liang Li, and Hsuan-Tien Lin
We investigate injecting slight randomness into uncertainty sampling, offsetting the bias of pure uncertainty sampling at the cost of a small variance.
Journal Publications
Po-Yi Lu, Yi-Jie Cheng, Chun-Liang Li, and Hsuan-Tien Lin
We re-benchmark active learning on tabular datasets and affirm the effectiveness of uncertainty sampling.
This template is adapted from Jon Barron.