Settings for Mixture of Experts GGUF models? #122
Unanswered · stopandgo-44 asked this question in Q&A
Replies: 0
Hi all, I'm trying to make sure I've optimized the settings for running a 4x8B Mixture of Experts GGUF model. How do I configure the number of experts used?
System:
Linux Mint, latest kernel
2 x Radeon VII (16 GB VRAM each)
ROCm backend
32 GB system RAM
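The discussion doesn't name the inference software, but as a sketch assuming a llama.cpp-based backend: GGUF MoE models store the total and active expert counts in metadata (keys like `llama.expert_count` and `llama.expert_used_count`), and llama.cpp's `--override-kv` flag can override the number of experts activated per token. The model filename below is hypothetical; older builds name the binary `main` rather than `llama-cli`.

```shell
# Inspect the model's MoE metadata (hypothetical filename; adjust to yours):
./llama-cli --model 4x8b-model.Q4_K_M.gguf --verbose 2>&1 | grep expert

# Run with an overridden active-expert count (here: 2 experts per token),
# offloading all layers and splitting them across both GPUs:
./llama-cli \
  --model 4x8b-model.Q4_K_M.gguf \
  --n-gpu-layers 99 \
  --split-mode layer \
  --override-kv llama.expert_used_count=int:2 \
  -p "Hello"
```

Lowering the active expert count trades quality for speed; the default baked into the GGUF metadata is what the model was merged or trained with, so overriding it is an experiment rather than an optimization guarantee.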