OS: Windows
GPU Library: CUDA 12.x
Python version: 3.11
PyTorch version: 2.7.1+cu128
Model: No response
Describe the bug
I copied the script at examples/chat.py and ran it inside the Oobabooga environment. However, there are no modules called chat_formatting and chat_prompts. Does anyone know where they can be found? There are no other mentions of them in this repository, nor do they appear on PyPI or in any Google searches.
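A quick way to confirm which imports are failing, and where Python is actually searching for them, is to query the import machinery directly. This is just a diagnostic sketch (the module names are taken from the report above; it does not assume anything about where they should live):

```python
# Diagnostic sketch: check whether Python can resolve the modules
# mentioned in examples/chat.py, and where it would load them from.
import importlib.util
import sys

print("sys.path:", sys.path)  # directories Python searches for top-level modules

for name in ("chat_formatting", "chat_prompts"):
    spec = importlib.util.find_spec(name)
    if spec is None:
        print(f"{name}: not found on sys.path")
    else:
        print(f"{name}: found at {spec.origin}")
```

If both come back "not found", the files are simply not in any directory on sys.path; running the script from the same directory as its sibling example files (rather than from a copy elsewhere) is the usual fix for this class of error.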
Reproduction steps
- Copy examples/chat.py as chat.py into the directory of text-generation-webui.
- Open cmd_windows.bat.
- Run python chat.py.
Expected behavior
It runs without errors.
Logs
No response
Additional context
No response
Acknowledgements