Stories by zhisbug
Create a 5s 1080p Video in 4.5s with FastVideo on a Single GPU
12 points by zhisbug | 2026-03-13T21:10:41Z | 1080p.fastvideo.org
1 point by zhisbug | 2025-08-04T22:18:49Z | news.ycombinator.com
1 point by zhisbug | 2025-06-30T20:35:28Z | news.ycombinator.com
1 point by zhisbug | 2025-06-13T21:46:52Z | news.ycombinator.com
1 point by zhisbug | 2025-05-12T22:10:17Z | news.ycombinator.com
1 point by zhisbug | 2025-04-08T21:27:22Z | news.ycombinator.com
1 point by zhisbug | 2025-03-14T23:05:16Z | news.ycombinator.com
1 point by zhisbug | 2025-03-07T00:06:50Z | news.ycombinator.com
Can LLMs play real-time games like Super Mario (other than Pokémon Red)?
3 points by zhisbug | 2025-02-28T20:04:27Z | twitter.com
Sliding Tile Attention: A New Method That Speeds Up HunyuanVideo's Outputs by 3x
2 points by zhisbug | 2025-02-20T18:26:37Z | old.reddit.com
Fast Video Generation with Sliding Tile Attention
12 points by zhisbug | 2025-02-19T00:19:30Z | hao-ai-lab.github.io
More Efficient Chain-of-Thought Reasoning Through Certainty Probing
6 points by zhisbug | 2025-02-18T02:36:07Z | huggingface.co
AI Space Escape: Playing Games While Evaluating LLM Reasoning
13 points by zhisbug | 2025-02-11T20:22:59Z | lmgame.org
Efficient LLM Scheduling by Learning to Rank
2 points by zhisbug | 2025-01-14T21:11:19Z | hao-ai-lab.github.io
FastVideo: a lightweight framework for accelerating large video diffusion models
110 points by zhisbug | 2024-12-17T20:56:01Z | github.com
MuxServe: Flexible Spatial-Temporal Multiplexing for Multiple LLM Serving
2 points by zhisbug | 2024-06-24T21:40:27Z | hao-ai-lab.github.io
Consistency LLM: converting LLMs to parallel decoders accelerates inference 3.5x
355 points by zhisbug | 2024-05-08T19:55:07Z | hao-ai-lab.github.io
Throughput Is Not All You Need: Maxing Goodput in LLM Serving via Disaggregation
5 points by zhisbug | 2024-03-18T21:28:58Z | hao-ai-lab.github.io
Break the Sequential Dependency of LLM Inference Using Lookahead Decoding
16 points by zhisbug | 2023-11-21T20:12:07Z | lmsys.org
Important and *MUST-KNOW* techniques for a 2023 LLM serving system
1 point by zhisbug | 2023-09-13T22:44:54Z | twitter.com