Tianyu Chen
University of Texas at Austin; tianyuchen@utexas.edu
White Sands, New Mexico
I am a third-year PhD student at the University of Texas at Austin, supervised by Professor Mingyuan Zhou. Before joining UT, I obtained my master's degree in Statistics from the University of Chicago, where I was supervised by Professor Jingshu Wang and closely collaborated with Kevin Bello, Bryon Aragam, Pradeep Ravikumar, and Francesco Locatello. I earned my bachelor's degree in Statistics from Fudan University, where I spent some of the most memorable moments of my life.
Research Interests: I have broad interests in statistical machine learning. Specifically, my research focuses on:
- Generative Models: diffusion models, model distillation, and controllable image editing.
- Reinforcement Learning for LLMs: post-training for both autoregressive (AR) and diffusion language models (dLLMs), with a focus on reasoning enhancement and algorithm design.
- Agentic AI: constructing autonomous agents for data generation and multi-turn evaluation.
- Statistical Sampling & Inference: neural posterior sampling and hypothesis testing under uncertainty.
I am actively seeking Summer 2026 research internship opportunities, especially at the intersection of generative AI and reinforcement learning. If my background aligns with your team's focus, I would love to connect! 📧 Feel free to reach out via email at tianyuchen@utexas.edu.
news
| Nov 01, 2025 | Excited to announce that two papers have been accepted to NeurIPS 2025! 🎉 CoLT (the Conditional Localization Test for assessing neural posterior estimates) was accepted as a Spotlight, and Improving Data Efficiency for LLM Reinforcement Fine-tuning introduces difficulty-targeted online data selection for more efficient LLM alignment. See you in San Diego! |
|---|---|
| Sep 01, 2025 | Our new paper EdiVal-Agent introduces an object-centric VLM agent for automated, fine-grained evaluation of multi-turn image editing! Our benchmark reveals that SeedDream v4.0 > Nano Banana > GPT-Image-1 in multi-turn editing quality. The project received media coverage and is showcased on our project page. |
| Mar 01, 2025 | Our new papers Denoising Score Distillation (DSD) and Restoration Score Distillation (RSD) propose a novel way to apply distillation methods to corrupted data, including Fourier-space MRI data. We prove that distillation can further improve generative quality even when only corrupted data are available. |