Hi there! I’m a fourth-year Ph.D. student in Computer Science at City University of Hong Kong (CityU), advised by Prof. Chun Jason Xue at CityU and Prof. Tei-Wei Kuo at National Taiwan University. My research interests lie in building next-generation data compression algorithms and systems (e.g., combined with neural networks), neural-network acceleration, and edge computing.

🔥 News

  • 2024.05:   I successfully completed my thesis defense 🎉.
  • 2023.10:   We won second place in the ICCAD 2023 TinyML Contest.
  • 2023.05:   I was invited as the female student representative to give a talk at the CityU PhD forum.
  • 2023.02:   Our paper “Faster and Stronger Lossless Compression with Optimized Autoregressive Framework” was accepted to DAC 2023.

📝 Selected Preprints and Publications

  • Weight Rescaling: Effective and Robust Regularization for Deep Neural Networks with Batch Normalization, Ziquan Liu, Yufei Cui, Jia Wan, Yu Mao, Antoni Bert Chan, arXiv preprint arXiv:2102.03497, 2021
  • Trace: A fast transformer-based general-purpose lossless compressor, Yu Mao, Yufei Cui, Tei-Wei Kuo, Chun Jason Xue, Proceedings of the ACM Web Conference 2022, 1829-1838, 2022
  • Accelerating General-Purpose Lossless Compression via Simple and Scalable Parameterization, Yu Mao, Yufei Cui, Tei-Wei Kuo, Chun Jason Xue, Proceedings of the 30th ACM International Conference on Multimedia, 3205-3213, 2022
  • Variational Nested Dropout, Yufei Cui, Yu Mao, Ziquan Liu, Qiao Li, Antoni Bert Chan, Xue Liu, Tei-Wei Kuo, Chun Jason Xue, IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023
  • Faster and Stronger Lossless Compression with Optimized Autoregressive Framework, Yu Mao, Jingzong Li, Yufei Cui, Chun Jason Xue, 2023 60th ACM/IEEE Design Automation Conference (DAC), 1-6, 2023
  • Moby: Empowering 2D Models for Efficient Point Cloud Analytics on the Edge, Jingzong Li, Yik Hong Cai, Libing Liu, Yu Mao, Chun Jason Xue, Hong Xu, Proceedings of the 31st ACM International Conference on Multimedia, 9012-9021, 2023
  • On the compressibility of quantized large language models, Yu Mao, Weilan Wang, Hongchao Du, Nan Guan, Chun Jason Xue, arXiv preprint arXiv:2403.01384, 2024
  • STEM: Streaming-based FPGA Acceleration for Large-Scale Compactions in LSM KV, Dongdong Tang, Weilan Wang, Yu Mao, Jinghuan Yu, Tei-Wei Kuo, Chun Jason Xue, 40th IEEE International Conference on Data Engineering, 2024 (Corresponding author)
  • Pre-processing Matters: A Segment Search Method for WSI Classification, Jun Wang, Yufei Cui, Yu Mao, Nan Guan, Chun Jason Xue, arXiv preprint arXiv:2404.11161, 2024
  • IHC Matters: Incorporating IHC analysis to H&E Whole Slide Image Analysis for Improved Cancer Grading via Two-stage Multimodal Bilinear Pooling Fusion, Jun Wang, Yu Mao, Yufei Cui, Nan Guan, Chun Jason Xue, arXiv preprint arXiv:2405.08197, 2024

🎖 Honors and Awards

  • 2023.10 TinyML Contest, ICCAD 2023, Second Place, San Francisco, USA.
  • 2023.09 Outstanding Academic Performance Award, City University of Hong Kong.
  • 2023.07 DAC Young Research Fellow, San Francisco, USA.
  • 2022.10 EDAthon 2022, Second Place, Hong Kong.

💬 Invited Talks

  • 2023.10, Invited talk on Efficient Large Language Models at the Institute of Microelectronics, Chinese Academy of Sciences.

Services

  • I currently serve as a reviewer for ICLR, NeurIPS, CVPR, TKDE, and ACM MM, and as an External Review Committee (ERC) member for USENIX ATC.

💻 Internships

  • 2018.05 - 2018.10, Microsoft, Beijing, China.
  • 2017.10 - 2018.05, Baidu, Beijing, China.