Allow usage of more RAM as shared memory for GPU #100
DuckersMcQuack started this conversation in Ideas
Replies: 2 comments · 2 replies
The idea here is to use RAM as swap so that only parts of the tensors stay on the GPU? If so, you will not see any performance improvement from TensorRT: PCIe bandwidth, and memory bandwidth in general, is the most limiting factor.
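To make the bandwidth argument concrete, here is a rough back-of-the-envelope comparison. The figures are assumptions for illustration (roughly PCIe 4.0 x16 versus the VRAM of a card like an RTX 3090), not measurements from this extension:

```python
# Illustrative peak-bandwidth figures (assumed, not measured) showing
# why spilling tensors to system RAM over PCIe hurts performance.
pcie4_x16_gbps = 32   # ~32 GB/s one direction, PCIe 4.0 x16 (assumption)
gddr6x_gbps = 936     # ~936 GB/s, e.g. an RTX 3090's VRAM (assumption)

tensor_gb = 4         # hypothetical 4 GB of tensor data to read

t_vram = tensor_gb / gddr6x_gbps   # time if the data sits in VRAM
t_pcie = tensor_gb / pcie4_x16_gbps  # time if it must cross PCIe

print(f"VRAM read: {t_vram * 1000:.1f} ms")
print(f"PCIe copy: {t_pcie * 1000:.1f} ms (~{t_pcie / t_vram:.0f}x slower)")
```

Under these assumed numbers, any tensor spilled to shared RAM is read roughly 30x slower than from VRAM, which is why swapping tends to erase TensorRT's speedup.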
Hey, take a look at:

I want to make 16:9/21:9 images at as high a base resolution as I can, but even with a static 960x960 resolution, the engine build cancels and fails because I don't have enough memory. Regular generations work just fine with shared memory, but I can't create TensorRT engines for those same resolutions. Please make it possible to use shared RAM when building higher-resolution engines.
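The behavior being requested can be sketched conceptually. The class and capacities below are purely hypothetical, a minimal illustration of a "spill to shared RAM instead of failing" allocation policy, not code from this extension or from TensorRT:

```python
# Conceptual sketch (hypothetical, not the extension's actual code):
# an allocator that prefers dedicated VRAM but falls back to shared
# system RAM instead of aborting the engine build with an OOM error.
class FallbackAllocator:
    def __init__(self, gpu_capacity, host_capacity):
        self.gpu_free = gpu_capacity    # bytes of dedicated VRAM left
        self.host_free = host_capacity  # bytes of shared system RAM left

    def allocate(self, size):
        """Prefer VRAM; spill to shared RAM rather than failing."""
        if size <= self.gpu_free:
            self.gpu_free -= size
            return "vram"
        if size <= self.host_free:
            self.host_free -= size
            return "shared_ram"
        raise MemoryError("out of memory even with shared RAM fallback")

# Hypothetical 8 GB card with 16 GB of shareable system RAM.
alloc = FallbackAllocator(gpu_capacity=8 << 30, host_capacity=16 << 30)
print(alloc.allocate(6 << 30))  # fits in VRAM -> "vram"
print(alloc.allocate(4 << 30))  # exceeds remaining VRAM -> "shared_ram"
```

With today's behavior, the second allocation would instead fail and cancel the build; the request is for it to spill as regular generation already does.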