ModelNotReady("Model has started running, but not ready yet.") #1372
Describe the bug
I'm trying to build a model based on the Hugging Face LTXImageToVideoPipeline. During the load procedure I got this error several times:
ModelNotReady("Model has started running, but not ready yet.")
After a while, the model loads perfectly, but if I call the model while it is still starting up, I get this error message in return. The second call works perfectly.
I'm building my model directly on the Baseten service and cannot run this locally.
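Since a later call succeeds once load() finishes, a client-side retry can paper over the startup window while waiting for a real fix. A minimal sketch (the helper name, retry count, and delay are illustrative, not Baseten-recommended values):

```python
import time


def call_with_retry(fn, retries=10, delay=5, exceptions=(Exception,)):
    """Retry fn until it succeeds or retries are exhausted.

    While the model's load() is still running, the control plane raises
    ModelNotReady, so the caller can simply back off and call again.
    """
    for attempt in range(retries):
        try:
            return fn()
        except exceptions:
            if attempt == retries - 1:
                raise
            time.sleep(delay)
```

In practice `fn` would be the HTTP call to the model endpoint, and `exceptions` would be narrowed to the error the client library raises for a not-ready model.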
To Reproduce
code:
import torch
from diffusers import LTXImageToVideoPipeline
from diffusers.utils import export_to_video, load_image


class Model:
    def __init__(self, **kwargs):
        self.pipe = None

    def load(self):
        self.pipe = LTXImageToVideoPipeline.from_pretrained(
            "a-r-r-o-w/LTX-Video-0.9.1-diffusers",
            torch_dtype=torch.bfloat16,
        ).to("cuda")
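One way to see how long the ModelNotReady window actually lasts is to time load() itself, since that call bounds the period during which requests are rejected. A hypothetical wrapper (not part of Truss; the logging setup is illustrative):

```python
import logging
import time


def timed_load(load_fn):
    # Wrap load() to log how long model initialization takes. While this
    # runs, calls to the model return ModelNotReady, so the logged
    # duration bounds the window during which retries are expected.
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = load_fn(*args, **kwargs)
        logging.info("load() finished in %.1fs", time.perf_counter() - start)
        return result
    return wrapper
```

Applied as `self.load = timed_load(self.load)` in `__init__`, or by timing the `from_pretrained` call directly inside load().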
config:
python_version: py311
requirements:
- diffusers==0.32.2
- torch==2.5.1
- transformers
- accelerate
- safetensors
- imageio
- imageio-ffmpeg
- sentencepiece
- pillow
What I have tried so far
Python 3.11, 3.10
Multiple torch versions
Multiple diffusers versions
Multiple GPUs (L4, H100)
Complete error log
raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7fa2c1cb0910 state=finished raised ModelNotReady>]
Exception in ASGI application
Traceback (most recent call last):
File "/control/control/endpoints.py", line 69, in proxy
raise ModelNotReady("Model has started running, but not ready yet.")
helpers.errors.ModelNotReady: Model has started running, but not ready yet.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/control/.env/lib/python3.11/site-packages/uvicorn/protocols/http/h11_impl.py", line 408, in run_asgi
result = await app( # type: ignore[func-returns-value]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/control/.env/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 84, in __call__
return await self.app(scope, receive, send)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/control/.env/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
await super().__call__(scope, receive, send)
File "/control/.env/lib/python3.11/site-packages/starlette/applications.py", line 113, in __call__
await self.middleware_stack(scope, receive, send)
File "/control/.env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 187, in __call__
raise exc
File "/control/.env/lib/python3.11/site-packages/starlette/middleware/errors.py", line 165, in __call__
await self.app(scope, receive, _send)
File "/control/.env/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
File "/control/.env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "/control/.env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "/control/.env/lib/python3.11/site-packages/starlette/routing.py", line 715, in __call__
await self.middleware_stack(scope, receive, send)
File "/control/.env/lib/python3.11/site-packages/starlette/routing.py", line 735, in app
await route.handle(scope, receive, send)
File "/control/.env/lib/python3.11/site-packages/starlette/routing.py", line 288, in handle
await self.app(scope, receive, send)
File "/control/.env/lib/python3.11/site-packages/starlette/routing.py", line 76, in app
await wrap_app_handling_exceptions(app, request)(scope, receive, send)
File "/control/.env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 62, in wrapped_app
raise exc
File "/control/.env/lib/python3.11/site-packages/starlette/_exception_handler.py", line 51, in wrapped_app
await app(scope, receive, sender)
File "/control/.env/lib/python3.11/site-packages/starlette/routing.py", line 73, in app
response = await f(request)
^^^^^^^^^^^^^^^^
File "/control/control/endpoints.py", line 50, in proxy
for attempt in Retrying(
File "/control/.env/lib/python3.11/site-packages/tenacity/__init__.py", line 384, in __iter__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/control/.env/lib/python3.11/site-packages/tenacity/__init__.py", line 363, in iter
raise retry_exc from fut.exception()
tenacity.RetryError: RetryError[<Future at 0x7fa2c1e44650 state=finished raised ModelNotReady>]
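The traceback shows the control plane's tenacity Retrying loop in endpoints.py exhausting its attempts while load() was still in progress, then surfacing RetryError chained from ModelNotReady. In stdlib terms, the pattern is roughly this (a sketch with illustrative names and attempt count, not the actual Truss source):

```python
class ModelNotReady(Exception):
    pass


class RetryError(Exception):
    pass


def proxy(call, max_attempts=3):
    # Retry the inference call a bounded number of times; if the model
    # never becomes ready, raise RetryError chained from the last
    # ModelNotReady, which is the shape seen in the log above.
    last_exc = None
    for _ in range(max_attempts):
        try:
            return call()
        except ModelNotReady as exc:
            last_exc = exc
    raise RetryError("retries exhausted") from last_exc
```

So the error is expected during the cold-start window; the question is only whether the window (and the retry budget) is long enough for a large pipeline like LTX-Video to load.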
Is this a problem with the diffusers library or with Truss? Any ideas?
Thank you.