Stories by 3Sophons
Self-host any GGUF LLM on Hugging Face and run it across devices (1 point, 3Sophons, 2024-02-04T10:45:38Z, www.secondstate.io)
Run the leaked Mistral Medium (miqu-1-70B) across GPUs, CPUs, and OSes (3 points, 3Sophons, 2024-02-02T06:15:26Z, www.secondstate.io)
Self-host StableLM-2-Zephyr-1.6B, portable across GPUs, CPUs, and OSes (2 points, 3Sophons, 2024-01-31T07:01:45Z, www.secondstate.io)
LFX Mentorship 2024 Spring LLM Projects: Build Open Source AI Inference Infra (1 point, 3Sophons, 2024-01-30T10:38:07Z, www.secondstate.io)
LlamaEdge 0.2.9 released: now works with Hugging Face's 3,000 GGUF models (1 point, 3Sophons, 2024-01-30T10:22:00Z, twitter.com)
Rust boosts LLM app development: make a serverless Japanese-learning bot in minutes (1 point, 3Sophons, 2024-01-25T15:29:43Z, flows.network)
Demo: Run LLMs on your own devices, portable across GPUs/CPUs/OSes (1 point, 3Sophons, 2024-01-24T14:41:10Z, thenewstack.io)
Demo: Use WebAssembly to Run LLMs on Your Own Device with WasmEdge (2 points, 3Sophons, 2024-01-22T10:52:15Z, www.youtube.com)
Demo: Interact with open-source LLMs via a local web interface [video] (1 point, 3Sophons, 2024-01-19T14:39:37Z, www.youtube.com)
Single command to self-host open-source LLMs on Mac, Jetson, and more; portable across devices (1 point, 3Sophons, 2024-01-19T13:12:58Z, llamaedge.com)
Run Nous-Hermes-2-Mixtral-8x7B with one command on Mac, Jetson, and more (1 point, 3Sophons, 2024-01-19T11:58:21Z, www.secondstate.io)
LlamaEdge: Lightweight, portable LLM tools for your local, edge, and server devices (2 points, 3Sophons, 2024-01-16T17:27:19Z, github.com)
Demo: Self-host the Mixtral 8x7B MoE on Mac and across devices with a portable 2MB inference app (1 point, 3Sophons, 2024-01-16T10:47:28Z, www.youtube.com)
Use WASM as a cross-platform LLM backend for LangChain: any LLM on any device (3 points, 3Sophons, 2024-01-04T06:41:21Z, github.com)
Self-host the SOLAR-10.7B-Instruct-v1.0 LLM with a portable 2MB AI inference app (1 point, 3Sophons, 2024-01-03T10:37:07Z, www.secondstate.io)
Self-host LLMs like Mixtral 8x7B on the edge and across devices (1 point, 3Sophons, 2024-01-03T09:53:17Z, www.secondstate.io)
Easy setup: self-host Mixtral-8x7B across devices with a 2MB inference app (2 points, 3Sophons, 2024-01-02T10:12:10Z, www.secondstate.io)
A unikernel designed specifically for running WASM apps and compatible with WASI (4 points, 3Sophons, 2023-12-29T10:46:28Z, github.com)
Run the Japanese LLM CALM2-7B on Mac with a portable 2MB inference app and create an API (4 points, 3Sophons, 2023-12-29T10:38:29Z, www.secondstate.io)
Untitled submission (1 point, 3Sophons, 2023-12-20T03:20:38Z, news.ycombinator.com)
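
Most of these submissions describe the same workflow: serving a GGUF model locally with a small, portable WebAssembly inference app (LlamaEdge on WasmEdge) that exposes an OpenAI-compatible HTTP API. Below is a minimal sketch of calling such a self-hosted endpoint from Python, assuming a server is already running and listening at http://localhost:8080/v1/chat/completions; the port, route, and model name are assumptions for illustration, not details taken from the stories above.

import json
import urllib.request

# Assumed local endpoint of a self-hosted, OpenAI-compatible LLM server
# (e.g. a LlamaEdge API server); adjust host, port, and route as needed.
URL = "http://localhost:8080/v1/chat/completions"

payload = {
    # Placeholder model name; use whatever GGUF model the server was started with.
    "model": "local-gguf-model",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Say hello in one sentence."},
    ],
}

request = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request) as response:
    reply = json.loads(response.read())

# OpenAI-compatible responses nest the generated text under choices[0].message.content.
print(reply["choices"][0]["message"]["content"])

Any OpenAI-compatible client library can be pointed at the same base URL; only the host and model name need to change.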