Energy Based Joint Embedding Predictive Architectures (JEPA)

EB JEPA

An open source library and tutorial aimed at learning representations for prediction and planning using joint embedding predictive architectures. Examples include learning image (a), video (b), and action-conditioned video (c) predictive representation models, as well as planning with them (d).

Each example is (almost) self-contained, and training takes up to a few hours on a single GPU.

Image Representations

This example demonstrates learning self-supervised representations from unlabeled CIFAR-10 images, evaluated on image classification.
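A common way to evaluate frozen self-supervised representations is a linear probe: fit a linear classifier on top of the embeddings and report accuracy. The sketch below illustrates the idea with random stand-in embeddings and a least-squares probe; the dimensions and data are hypothetical, not the repository's actual CIFAR-10 pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in embeddings from a (pretend) frozen self-supervised encoder.
# Hypothetical sizes; the real example encodes CIFAR-10 images.
n, emb_dim, n_classes = 200, 8, 3
Z = rng.normal(size=(n, emb_dim))
y = rng.integers(0, n_classes, size=n)

# Linear probe: least-squares fit of a linear classifier on one-hot labels,
# keeping the encoder fixed.
Y = np.eye(n_classes)[y]
W, *_ = np.linalg.lstsq(Z, Y, rcond=None)

# Accuracy of the probe on the same embeddings (illustration only).
acc = float(np.mean(np.argmax(Z @ W, axis=1) == y))
print(acc)
```

With random embeddings and labels the accuracy hovers around chance; with representations that actually separate the classes, the same probe would score much higher.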

Predictive Video Representations

Moving MNIST

A model is trained to predict the next image representation in a sequence.
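The defining feature of a JEPA is that the prediction error is measured in embedding space rather than pixel space. Below is a minimal sketch of that objective with linear stand-ins for the encoder and predictor; the real models are neural networks and the target embeddings would not receive gradients, but the loss structure is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (hypothetical; the real example uses Moving MNIST frames).
obs_dim, emb_dim = 16, 4

# Linear stand-ins for the encoder and the predictor networks.
W_enc = rng.normal(size=(emb_dim, obs_dim)) * 0.1
W_pred = rng.normal(size=(emb_dim, emb_dim)) * 0.1

def encode(x):
    return W_enc @ x

def predict(z):
    return W_pred @ z

# A short sequence of flattened frames.
frames = rng.normal(size=(5, obs_dim))

# JEPA-style objective: from frame t's embedding, predict frame t+1's
# embedding, and penalize the error between the two embeddings.
loss = 0.0
for t in range(len(frames) - 1):
    z_t = encode(frames[t])
    z_next = encode(frames[t + 1])  # target embedding (stop-gradient in practice)
    loss += float(np.mean((predict(z_t) - z_next) ** 2))
loss /= len(frames) - 1
print(loss)
```

Because targets live in embedding space, the model never has to reconstruct pixels, which is what lets it ignore unpredictable low-level detail.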

Action Conditioned Prediction and Planning

This example demonstrates a Joint Embedding Predictive Architecture (JEPA) for action-conditioned world modeling in the Two Rooms environment. The model learns to predict future states based on current observations and actions. These representations enable planning towards a goal observation embedding.

Figures: episode task definition (from initial to goal state) and a successful planning episode.
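Once an action-conditioned predictor is trained, planning can be done entirely in embedding space: roll candidate action sequences through the learned dynamics and pick the one whose predicted final embedding lands closest to the goal embedding. The sketch below uses random shooting with a linear stand-in for the dynamics; the matrices, horizon, and sample count are all illustrative assumptions, not the repository's planner.

```python
import numpy as np

rng = np.random.default_rng(1)

emb_dim, act_dim = 4, 2

# Linear stand-in for the learned action-conditioned dynamics:
# z_{t+1} ≈ A z_t + B a_t  (the real model is a neural network).
A = np.eye(emb_dim) * 0.9
B = rng.normal(size=(emb_dim, act_dim))

def rollout(z0, actions):
    """Predict the final embedding after applying a sequence of actions."""
    z = z0
    for a in actions:
        z = A @ z + B @ a
    return z

z_init = rng.normal(size=emb_dim)  # embedding of the initial observation
z_goal = rng.normal(size=emb_dim)  # embedding of the goal observation

# Random shooting: sample action sequences, keep the one whose predicted
# final embedding is closest (in squared distance) to the goal embedding.
horizon, n_samples = 5, 256
best_cost, best_actions = np.inf, None
for _ in range(n_samples):
    actions = rng.normal(size=(horizon, act_dim))
    cost = float(np.sum((rollout(z_init, actions) - z_goal) ** 2))
    if cost < best_cost:
        best_cost, best_actions = cost, actions

print(best_cost)
```

More sophisticated planners (e.g. the cross-entropy method or gradient-based optimization through the predictor) follow the same pattern: the cost is always a distance between predicted and goal embeddings.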

Installation

We use the uv package manager to install and manage dependencies. Once you have installed uv, run the following to create a new virtual environment:

uv sync

This will create a virtual environment within the project folder at .venv/. To activate this environment, run source .venv/bin/activate.

Alternatively, if you don't want to run activate every time, you can prepend uv run to your Python scripts:

uv run python main.py

Running test cases

Libraries added to eb-jepa must have their own test cases. To run the tests:

uv run pytest tests/

Development

  • The uv environment comes with black and isort, which must be run before adding any file to this repo. Continuous integration will check the linting of PRs and new files.
  • Every PR should be reviewed by the people listed in CODEOWNERS.

License

EB JEPA is Apache licensed, as found in the LICENSE file.
