Old 25.04.2025., 02:12   #120
calypso
Premium
Registration date: Oct 2002
Location: Zagreb/Rijeka
Posts: 1,996
Has anyone tried using a RAMDisk for a local AI deployment, something like DeepSeek or similar?

I asked that other statistical guesser (DeepSeek, that is...) what a RAMDisk could be used for in a local installation, and this is what it wrote...


What DeepSeek Can Use the RAMDisk For

Model Cache:
- Store frequently accessed model weights (if using llama.cpp or similar). [see the sketch after this list]

Temporary File Storage:
- Store intermediate files (logs, tokenized inputs).

Swap for Small-Batch Inference:
- If your system has limited RAM, a RAMDisk can help avoid SSD thrashing.

Faster Disk-Based KV Caching:
- Some inference engines allow offloading cache to disk - use RAMDisk for speed.
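
For the "Model Cache" and "KV Caching" points, this is roughly what I had in mind on Linux with llama.cpp. Just a sketch, untested; the /dev/shm path, the GGUF filename and the llama-cli binary are my own assumptions, not something DeepSeek spelled out:

# Sketch: stage a GGUF model on a tmpfs RAMDisk and run llama.cpp from it.
# The paths, the model filename and the llama-cli binary name are assumptions.
import shutil
import subprocess
from pathlib import Path

RAMDISK = Path("/dev/shm/llm")  # tmpfs-backed on most Linux distros, no root needed
MODEL_SRC = Path("/models/deepseek-coder-1.3b-instruct.Q4_K_M.gguf")  # assumed path

RAMDISK.mkdir(parents=True, exist_ok=True)
model_dst = RAMDISK / MODEL_SRC.name

# Copy the weights into RAM once; subsequent model loads never touch the NVMe.
if not model_dst.exists():
    shutil.copy2(MODEL_SRC, model_dst)

# Keep the llama.cpp prompt cache on the RAMDisk too, so repeated prompts reload fast.
subprocess.run([
    "llama-cli",                                      # llama.cpp CLI, assumed on PATH
    "-m", str(model_dst),
    "--prompt-cache", str(RAMDISK / "prompt.cache"),
    "-p", "Write hello world in C.",
])

On Windows a RAMDisk tool like ImDisk would stand in for /dev/shm, but the idea is the same.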



Best for:
- Small models (e.g., deepseek-coder-1.3b in 4-bit) that fit entirely in RAMDisk. [see the check after this list]
- Frequent disk I/O tasks (like caching prompts).
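
And for the "fits entirely in RAMDisk" part, a quick check before copying could look like this (again just a sketch, same assumed paths):

# Sketch: check that the model actually fits on the tmpfs RAMDisk before copying,
# since tmpfs space comes straight out of system RAM.
import shutil
from pathlib import Path

model = Path("/models/deepseek-coder-1.3b-instruct.Q4_K_M.gguf")  # assumed path
free = shutil.disk_usage("/dev/shm").free

if model.stat().st_size < free * 0.8:  # leave ~20% headroom for prompt/KV cache files
    print("Model fits on the RAMDisk.")
else:
    print("Not enough tmpfs space - better to keep the model on the NVMe.")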
__________________

GMKTec K8Plus
AMD Ryzen 7 PRO 8845HS, 3.80-5.10GHz 8-core
64GB DDR5-5200 (2x32GB), quad-channel
NVMe1 - 512GB Micron 3400, PCI-E 4.0 x4
NVMe2 - 512GB Samsung PM981, PCI-E 3.0 x4
AMD Radeon 780M integrated graphics


2x EIZO EV2336WFS3-BK
23" 1920x1080 IPS


Dell Latitude 7390
Intel Core i7-8650U, 1.90-4.20GHz, 4-core
32GB DDR4-2666
512GB Samsung PM981a, PCI-E 3.0 x4
Intel UHD 620
13.3" FullHD IPS Touchscreen

