Biography

I am Wenyong Zhou (周文涌), a PhD graduate (2025) from The University of Hong Kong (HKU), where I had the privilege of being supervised by Prof. Ngai Wong and Prof. Can Li in the Next Gen AI Lab. I received my Bachelor's degree from the School of Microelectronics at Tianjin University (TJU) in 2019, where I was fortunate to be mentored by Prof. Yugong Wu, and my Master's degree in Electrical and Computer Engineering from Northwestern University (NU) in 2021, under the mentorship of Prof. Seda Ogrenci. My research focuses on Implicit Neural Representations (INRs) as an efficient data paradigm and on emerging Compute-in-Memory (CIM) architectures, with a recent passion for Large Language Models (LLMs). I enjoyed enriching internships at ByteDance (2021) and JD.com (2023), and I currently work at Zhicun (Witmem) Technology on low-bit training of LLMs for compatibility with analog CIM hardware.

Research Interests

  • Data is the fuel of the AI era, and INRs offer an innovative approach for encoding data such as images, signals, and 3D scenes in a continuous and compact form using neural networks. This reduces the need for extensive storage while maintaining high-fidelity representations.

  • Computing power drives modern AI, but classical digital computers are constrained by the separation of compute and memory units (the von Neumann bottleneck). CIM integrates memory and computation into a single unit, significantly reducing data movement and making it especially effective for accelerating AI workloads.

  • LLMs have become a cornerstone of modern AI, driving advancements in applications ranging from natural language understanding to content generation. Minimizing their size and computational requirements without compromising performance democratizes access to advanced AI capabilities and broadens their range of applications.
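The INR idea above can be illustrated with a toy sketch: a small coordinate MLP whose weights *are* the stored signal, queryable at any resolution. The architecture, sizes, and sine activation (SIREN-style) here are illustrative assumptions, not taken from any specific paper of mine; a real INR would also be trained to fit a target signal.

```python
# Toy INR sketch: an MLP mapping continuous (x, y) coordinates to values,
# so the compact set of weights stands in for the full sampled signal.
import numpy as np

rng = np.random.default_rng(0)

# A 2-layer sine-activated MLP (SIREN-style), with illustrative sizes.
W1 = rng.normal(size=(2, 32)); b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1)); b2 = np.zeros(1)

def inr(coords):
    """Evaluate the INR at an (N, 2) array of coordinates in [0, 1]^2."""
    h = np.sin(coords @ W1 + b1)   # periodic activation captures high frequencies
    return h @ W2 + b2             # (N, 1) predicted signal values

# Because the representation is continuous, it can be sampled at any grid size.
grid = np.stack(np.meshgrid(np.linspace(0, 1, 64), np.linspace(0, 1, 64)), -1)
values = inr(grid.reshape(-1, 2)).reshape(64, 64)

# Compactness: parameter count vs. the grid it can be rendered onto.
n_params = sum(a.size for a in (W1, b1, W2, b2))
print(n_params, values.shape)  # 129 parameters rendered onto a 64x64 grid
```

Here 129 weights reproduce (after training) a signal that would otherwise need 4096 stored pixels, which is the storage-reduction argument made above.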

Recent News

  • 2025.07 - One paper was accepted by IEEE TCAD.
  • 2025.07 - One paper was accepted by ACM MM 2025.
  • 2025.06 - One paper was accepted by ICCV 2025.
  • 2025.06 - One paper was accepted by IEEE TCAS-II.
  • 2025.03 - One paper was accepted by ICME 2025.
  • 2024.12 - Two papers were accepted by ICASSP 2025.
  • 2024.11 - One paper was accepted by DATE 2025.
  • 2023.12 - Two papers were accepted by ICASSP 2024.
  • 2023.09 - Two papers were accepted by ASPDAC 2024.

Selected Publications

  • W. Zhou*, B. Li*, T. Wu, C. Ding, Z. Liu and N. Wong. QuadINR: Quadratic Implicit Neural Representations for Efficient Memristor-based CIM System, IEEE Transactions on Circuits and Systems II: Express Briefs.
  • W. Zhou, Z. Liu, Y. Ren and N. Wong. Binary Weight Multi-Bit Activation Quantization for Compute-in-Memory CNN Accelerators, IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems.
  • W. Zhou*, J. Ren*, T. Wu, Y. Cheng, Z. Liu and N. Wong. Distribution-Aware Hadamard Quantization for Hardware-Efficient Implicit Neural Representations, 2025 IEEE International Conference on Multimedia and Expo (ICME), Nantes, France, 2025.
  • W. Zhou, T. Wu, C. Ding, Y. Ren, Z. Liu and N. Wong. Towards RRAM-based Transformer-based Vision Models with Noise-aware Knowledge Distillation, 2025 Design, Automation & Test in Europe Conference & Exhibition (DATE), Lyon, France, 2025.

More about me

  • I am passionate about competitive sports and enjoy playing table tennis, badminton, and basketball. These activities have taught me valuable lessons about handling failure and have shaped my character.

  • I enjoy reading, particularly exploring history and politics through a financial lens to uncover how fundamental financial principles remain unchanged despite human intentions.

  • I love traveling the world to experience beautiful natural landscapes and diverse human cultures, which enrich my understanding of humanity and inspire new ways of thinking.

(Last updated: August 2025)