Ollama default setup #490
Conversation
…and any other valid configuration for Ollama and another model. Added toolsEnabled inside ChatAgent (from the dev mode prototype) and refactored the rest of the code to work with it. Signed-off-by: Siaa Gor <Siaa.Gor@ibm.com>
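For context, here is a minimal sketch of what a toolsEnabled flag on a chat-agent wrapper might look like. Only the names ChatAgent and toolsEnabled come from the description above; the constructor, send method, and Model abstraction are assumptions for illustration, not the actual code from the dev mode prototype.

```java
// Hypothetical sketch: a chat agent wrapper that can disable tool calling.
// Only the names "ChatAgent" and "toolsEnabled" come from the PR description;
// everything else (constructor, send method, Model interface) is assumed.
public class ChatAgent {

    private final Model model;          // assumed abstraction over the configured LLM
    private final boolean toolsEnabled; // flag described in this PR

    public ChatAgent(Model model, boolean toolsEnabled) {
        this.model = model;
        this.toolsEnabled = toolsEnabled;
    }

    public String send(String prompt) {
        if (toolsEnabled) {
            // Route through the tool-aware path only when tools are configured.
            return model.chatWithTools(prompt);
        }
        return model.chat(prompt);
    }

    interface Model {
        String chat(String prompt);
        String chatWithTools(String prompt);
    }
}
```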
gkwan-ibm left a comment:
Also, when I turn off AI mode, I cannot turn it back on.
Did you check all scenarios in https://github.ibm.com/dev-engage/liberty-ai-dev-mode-prototype/issues/216#issue-56125826?
[Six resolved review comments on src/main/java/io/openliberty/tools/common/plugins/util/DevUtil.java (outdated)]
Hello, yes, I tested the different scenarios. I did notice one case earlier (an incorrect Ollama base URL provided together with a correct model): langchain4j throws a retryable exception, and the server is configured to print that retryable exception. I am wondering whether I should change this or keep it printed along with my own error message. Thank you.
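One way to frame the question above is as a small wrapper around the model call: print a custom, user-facing message when the Ollama server cannot be reached, and optionally keep the underlying retryable exception for debugging. This is a hedged sketch of the pattern only; the exact langchain4j exception type and model call are assumptions, not taken from this PR.

```java
import java.util.function.Supplier;

// Hypothetical sketch of the error-handling question raised above:
// if the Ollama base URL is wrong, langchain4j surfaces a retryable
// runtime exception. This wrapper prints a friendlier message and
// optionally keeps the original exception visible. The specific
// langchain4j exception type and model call are assumed here.
public class SafeModelCall {

    public static String callModel(Supplier<String> modelCall, boolean printCause) {
        try {
            return modelCall.get();
        } catch (RuntimeException e) {
            // Custom, user-facing message instead of only the raw stack trace.
            System.err.println("Could not reach the configured Ollama server. "
                    + "Check that the base URL is correct and the server is running.");
            if (printCause) {
                // Keep the underlying retryable exception for debugging.
                e.printStackTrace();
            }
            return null;
        }
    }
}
```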
Signed-off-by: Gilbert Kwan <gkwan@ca.ibm.com>
addresses this issue