

@bubundas17

Added Codestral-22B Q6 and Q5 quantizations for consumer GPUs with 24 GB of VRAM, such as the RTX 3090 and RTX 4090.
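
For context, a rough back-of-the-envelope check of why these quantizations target 24 GB cards. This is a minimal sketch, not part of the PR: the bits-per-weight figures are approximate values for GGUF-style Q6/Q5 quants, and the estimate covers weights only (KV cache and context overhead are excluded).

```python
# Approximate VRAM needed for the weights of a 22B-parameter model
# at different quantization levels. Figures are illustrative only.
PARAMS = 22.2e9  # Codestral-22B parameter count, approximate

quant_bits_per_weight = {
    "Q6_K": 6.56,   # approximate effective bits per weight (assumption)
    "Q5_K_M": 5.67, # approximate effective bits per weight (assumption)
}

for name, bpw in quant_bits_per_weight.items():
    gib = PARAMS * bpw / 8 / 2**30  # bits -> bytes -> GiB
    print(f"{name}: ~{gib:.1f} GiB of weights")
```

Both estimates land well under 24 GB, leaving headroom for the KV cache on an RTX 3090/4090.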
