Inference takes one and a half days to run on a single movie #99

@fatihdinc

Description

Hi!

I have been using DeepInterpolation on our large-scale calcium imaging movies, but even with a state-of-the-art GPU (3090 Ti), the inference step takes around 1.5 days on a 512x512x50,000 movie. I suspect that inference is not using the GPU by default. Which value should I change to make it use the GPU, and also to load data with multiprocessing? In contrast, training is reasonable: it takes 1-2 hours with good convergence on the first 2,000 frames, which seems to be enough for these movies.
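Not a maintainer answer, but two things worth checking while waiting for a reply. DeepInterpolation's inference path runs through TensorFlow/Keras, so if `tf.config.list_physical_devices('GPU')` returns an empty list, that would confirm the CPU-fallback suspicion (commonly a CUDA/cuDNN version mismatch with the installed `tensorflow` build). For the multiprocessing side, one common workaround is to split the 50,000 frames into contiguous ranges and run one inference process per range. The sketch below illustrates that idea; `infer_range` is a hypothetical placeholder, not part of the DeepInterpolation API:

```python
from multiprocessing import Pool

def frame_chunks(n_frames, n_workers):
    """Split [0, n_frames) into n_workers contiguous (start, stop) ranges."""
    base, extra = divmod(n_frames, n_workers)
    chunks, start = [], 0
    for i in range(n_workers):
        stop = start + base + (1 if i < extra else 0)
        chunks.append((start, stop))
        start = stop
    return chunks

def infer_range(chunk):
    # Placeholder: in practice each worker would build its own
    # DeepInterpolation inference generator restricted to frames
    # chunk[0]:chunk[1] and write its denoised output to a separate file.
    start, stop = chunk
    return stop - start

if __name__ == "__main__":
    chunks = frame_chunks(50_000, 8)  # e.g. one chunk per worker process
    with Pool(len(chunks)) as pool:
        counts = pool.map(infer_range, chunks)
    assert sum(counts) == 50_000  # every frame assigned exactly once
```

One caveat with this approach: each worker process loads its own copy of the model, so GPU memory limits how many workers can share a single card.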

Also, is there a demo for inference on 1p movies? I was able to find a training script, but not an inference demo. Thank you!
