Zhengyang Geng
Verified email at cs.cmu.edu · Homepage
Title · Cited by · Year
Is attention better than matrix decomposition?
Z Geng, MH Guo, H Chen, X Li, K Wei, Z Lin
ICLR, 2020
Cited by 150 · 2020
Medusa: Simple LLM inference acceleration framework with multiple decoding heads
T Cai, Y Li, Z Geng, H Peng, JD Lee, D Chen, T Dao
ICML, 2024
Cited by 104* · 2024
Deep Equilibrium Optical Flow Estimation
S Bai, Z Geng, Y Savani, JZ Kolter
CVPR, 2022
Cited by 66 · 2022
On Training Implicit Models
Z Geng, XY Zhang, S Bai, Y Wang, Z Lin
NeurIPS, 2021
Cited by 63 · 2021
Residual Relaxation for Multi-view Representation Learning
Y Wang, Z Geng, F Jiang, C Li, Y Wang, J Yang, Z Lin
NeurIPS, 2021
Cited by 33 · 2021
Eliminating Gradient Conflict in Reference-based Line-art Colorization
Z Li, Z Geng, Z Kang, W Chen, Y Yang
ECCV, 2022
Cited by 27 · 2022
Deep Equilibrium Approaches to Diffusion Models
A Pokle, Z Geng, Z Kolter
NeurIPS, 2022
Cited by 24 · 2022
One-step diffusion distillation via deep equilibrium models
Z Geng, A Pokle, JZ Kolter
Advances in Neural Information Processing Systems 36, 2024
Cited by 6 · 2024
Equilibrium image denoising with implicit differentiation
Q Chen, Y Wang, Z Geng, Y Wang, J Yang, Z Lin
IEEE Transactions on Image Processing 32, 1868-1881, 2023
Cited by 5 · 2023
TorchDEQ: A library for deep equilibrium models
Z Geng, JZ Kolter
arXiv preprint arXiv:2310.18605, 2023
Cited by 3 · 2023
Consistency Models Made Easy
Z Geng, A Pokle, W Luo, J Lin, JZ Kolter
arXiv preprint arXiv:2406.14548, 2024
Cited by 1 · 2024
Articles 1–11