
ㄓㄨㄛˊㄩㄝˋ

Terry Yue Zhuo

I am an IBM PhD Fellow (2024–2025) and a PhD student at CSIRO's Data61 and Monash University.

My research focuses on Code Intelligence + X (e.g., agentic workflows, system efficiency, cybersecurity).

I will be organizing the 1st tutorial on "NLP+Code: Code Intelligence in Language Models" at EMNLP'25 in Suzhou, China. Please stay tuned for updates and consider attending the conference!

I will serve as a Senior Area Chair for EMNLP 2025. Looking forward to your high-quality submissions!

I am open to collaboration on code intelligence. Feel free to contact me if you are interested in working together.
Schedule a meeting with me.

My Working Principles

Why Code Intelligence?

Fun Facts


Recent Publications

BigCode Team, “BigCodeBench: Benchmarking Code Generation with Diverse Function Calls and Complex Instructions”, ICLR 2025. (Among Top 5 Highest-Rated Papers! 🌟 Before→After Rebuttal: 6→8, 8→8, 8→10, 10→10) [GitHub]

BigCode Team, “Parameter-Efficient Instruction Tuning Code Large Language Models: An Empirical Study”, Deep Learning for Code workshop (DL4C) @ ICLR, 2025.

BigCode Team, “OctoPack: Instruction Tuning Code Large Language Models”, International Conference on Learning Representations (ICLR), 2024. [GitHub]

Terry Yue Zhuo, “ICE-Score: Instructing Large Language Models to Evaluate Code”, Findings of the European Chapter of the Association for Computational Linguistics (EACL), 2024. [GitHub]

BigCode Team, “StarCoder: May The Source Be With You!”, Transactions on Machine Learning Research (TMLR), 2023. [Tweet 1] [Tweet 2] [TechCrunch] [StarCoderBase] [StarCoder] [GitHub]

BigCode Team, “SantaCoder: don't reach for the stars!”, Deep Learning for Code workshop (DL4C) @ ICLR, 2023. [Best Paper Award] [Tweet 1] [Tweet 2] [SantaCoder]