Weiqi Feng
fengweiqi99@gmail.com
I'm a Research Scientist on the Seed Training Infra team at ByteDance, where I focus on improving the efficiency and scalability of Multimodal Large Language Model (MLLM) training. Before joining ByteDance, I worked at Databricks for a year on data warehousing, contributing to Databricks SQL (DBSQL).
I earned my Master's degree in Computer Science from Harvard University, where I had the privilege of working with Prof. Minlan Yu on systems and networking. Prior to that, I received my B.Eng. in Computer Science from Shanghai Jiao Tong University in 2021.
LinkedIn | Email | GitHub | Google Scholar
Selected Publications
-
(ATC) Optimus: Accelerating Large-Scale Multi-Modal LLM Training by Bubble Exploitation
Weiqi Feng, Yangrui Chen, Shaoyu Wang, Yanghua Peng, Haibin Lin, Minlan Yu. USENIX ATC 2025. Jun. 2025
[paper]
-
(CoNEXT) F3: Fast and Flexible Network Telemetry with an FPGA coprocessor
Weiqi Feng, Jiaqi Gao, Xiaoqi Chen, Gianni Antichi, Ran Ben Basat, Michael Mingchao Shao, Ying Zhang, Minlan Yu. CoNEXT 2024. Dec. 2024
[paper]
-
(SIGMOD) Allign: Aligning All-Pair Near-Duplicate Passages in Long Texts
Weiqi Feng, Dong Deng. ACM SIGMOD 2021. Jun. 2021
[paper]