
ㄓㄨㄛˊㄩㄝˋ

Terry Yue Zhuo

I am pursuing my PhD at CSIRO's Data61 and Monash University. I currently lead the next generation of Code LLMs at BigCode, collaborating with a range of external partners.

My research focuses on code intelligence, particularly function calling, program reasoning, and code generation. My long-term goal is to build AGI through code and other executable languages.

I am open to collaboration. Feel free to contact me if you are interested in working together.

My Working Principles

Why Code Intelligence?

Fun Facts


Recent Publications



BigCode Team, “BigCodeBench: Benchmarking Code Generation with Diverse Function Calls and Complex Instructions”, Manuscript, 2024. [Github]

BigCode Team, “OctoPack: Instruction Tuning Code Large Language Models”, International Conference on Learning Representations (ICLR), 2024. [Github]

Terry Yue Zhuo, “ICE-Score: Instructing Large Language Models to Evaluate Code”, Findings of the European Chapter of the Association for Computational Linguistics (EACL), 2024. [Github]

BigCode Team, “Astraios: Parameter-Efficient Instruction Tuning Code Large Language Models”, Manuscript, 2024.

BigCode Team, “StarCoder: May The Source Be With You!”, Transactions on Machine Learning Research (TMLR), 2023. [Tweet 1] [Tweet 2] [TechCrunch] [StarCoderBase] [StarCoder] [Github]

BigCode Team, “SantaCoder: don't reach for the stars!”, Deep Learning For Code workshop (DL4C) @ ICLR, 2023. [Best Paper Award] [Tweet 1] [Tweet 2] [SantaCoder]