Weilin Zhao

THUNLP Lab, Department of Computer Science, Tsinghua University.

I’m Weilin Zhao, a first-year PhD student at the THUNLP Lab, Department of Computer Science and Technology, Tsinghua University.

Email: zwl23 [at] mails.tsinghua.edu.cn

Google Scholar: link

Research Interests: Efficient/Low-Resource Methods for NLP, Pre-trained Language Models, Parameter-Efficient Tuning.

selected publications

  1. BMInf
    BMInf: An Efficient Toolkit for Big Model Inference and Tuning
    Han, Xu, Zeng, Guoyang, Zhao, Weilin, Liu, Zhiyuan, Zhang, Zhengyan, Zhou, Jie, Zhang, Jun, Chao, Jia, and Sun, Maosong
    In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations May 2022
  2. BMCook
    BMCook: A Task-agnostic Compression Toolkit for Big Models
    Zhang, Zhengyan, Gong, Baitao, Chen, Yingfa, Han, Xu, Zeng, Guoyang, Zhao, Weilin, Chen, Yanxu, Liu, Zhiyuan, and Sun, Maosong
    In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing: System Demonstrations Dec 2022
  3. OpenPrompt
    OpenPrompt: An Open-source Framework for Prompt-learning
    Ding, Ning, Hu, Shengding, Zhao, Weilin, Chen, Yulin, Liu, Zhiyuan, Zheng, Haitao, and Sun, Maosong
    In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics: System Demonstrations May 2022
  4. PTR
    PTR: Prompt Tuning with Rules for Text Classification
    Han, Xu, Zhao, Weilin, Ding, Ning, Liu, Zhiyuan, and Sun, Maosong
    arXiv preprint arXiv:2105.11259 May 2021