
validation loss always lower than train loss #225

@ycc66104116

Description


Hi, I recently used this code to train on my own dataset and noticed something strange: on TensorBoard, my validation loss is always lower than my training loss, even though my training accuracy is higher than my validation accuracy.
What I don't understand is why the validation loss stays below the training loss. The gap between the two curves stays roughly constant, as in the image below, and I can't get them to converge.
[Image: TensorBoard plot with the validation loss curve consistently below the training loss curve]

I know there are differences between model.train() and model.eval(), but I expected the two curves to converge as the epochs increase.
Has anyone run into this before? Could it happen because the training dataset is too small?
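For reference, here is a minimal PyTorch sketch of how I think the two losses should be compared: re-running the training set through the model in eval mode, so dropout and batch norm behave the same way as during validation. The names (model, train_loader, val_loader, criterion) are placeholders for my setup, not code from this repo.

```python
import torch

def evaluate(model, loader, criterion, device="cpu"):
    """Average per-sample loss over a DataLoader with the model in eval mode."""
    model.eval()  # disables dropout, uses running batch-norm statistics
    total_loss, total_samples = 0.0, 0
    with torch.no_grad():
        for inputs, targets in loader:
            inputs, targets = inputs.to(device), targets.to(device)
            outputs = model(inputs)
            loss = criterion(outputs, targets)
            total_loss += loss.item() * inputs.size(0)
            total_samples += inputs.size(0)
    return total_loss / total_samples

# Comparing both losses under the same (eval-mode) conditions:
# train_eval_loss = evaluate(model, train_loader, criterion)
# val_loss = evaluate(model, val_loader, criterion)
```

If the eval-mode training loss comes out close to (or below) the validation loss, then the gap in my curves is presumably just the train/eval-mode difference rather than something wrong with the data split.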
