Welcome to the PyTorch wavelet toolbox. This package implements discrete (DWT) as well as continuous (CWT) wavelet transforms:
- the fast wavelet transform (fwt) via `wavedec` and its inverse by providing the `waverec` function,
- the two-dimensional fwt is called `wavedec2`; the synthesis counterpart is `waverec2`,
- `wavedec3` and `waverec3` cover the three-dimensional analysis and synthesis case,
- `fswavedec2`, `fswavedec3`, `fswaverec2`, and `fswaverec3` support separable transformations,
- `MatrixWavedec` and `MatrixWaverec` implement sparse-matrix-based fast wavelet transforms with boundary filters,
- 2d sparse-matrix transforms with separable & non-separable boundary filters are available,
- `MatrixWavedec3` and `MatrixWaverec3` allow separable 3D-fwts with boundary filters,
- `cwt` computes a one-dimensional continuous forward transform (a small sketch follows this list),
- single- and two-dimensional wavelet packet forward and backward transforms are available via the `WaveletPacket` and `WaveletPacket2D` objects,
- finally, this package provides adaptive wavelet support (experimental).
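As a quick illustration of the continuous transform listed above, here is a minimal sketch. It assumes that `ptwt.cwt` mirrors the `pywt.cwt` call signature and return values, i.e. it takes the signal, a set of scales, and a continuous wavelet name and returns the coefficient matrix together with the corresponding frequencies:
```python
import numpy as np
import torch
import ptwt

# a short sine burst as an arbitrary test signal
t = torch.linspace(0, 1, 512)
signal = torch.sin(2 * np.pi * 64 * t)

# continuous transform over 30 scales with a Mexican-hat wavelet;
# scales are given as a NumPy array, following the pywt.cwt convention.
coefficients, frequencies = ptwt.cwt(signal, np.arange(1, 31), "mexh")
print(coefficients.shape)  # expected: (30, 512)
```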
This toolbox extends PyWavelets. In addition to boundary wavelets, we provide GPU and gradient support via a PyTorch backend. Complete documentation of our Python API is available at: https://pytorch-wavelet-toolbox.readthedocs.io/en/latest
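Gradient support means that backpropagation works through the transforms. A minimal sketch (the random input and the objective below are arbitrary placeholders):
```python
import torch
import ptwt

# differentiate through the fast wavelet transform
signal = torch.randn(16, requires_grad=True)
coefficients = ptwt.wavedec(signal, "haar", mode="zero", level=2)
# an arbitrary scalar objective on the coefficients
loss = sum(c.square().sum() for c in coefficients)
loss.backward()
print(signal.grad.shape)  # gradients with respect to the input signal
# for GPU support, move the input tensor to the device first, e.g. signal.cuda()
```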
This toolbox is independent work; it has not been endorsed by Meta or the PyTorch team.
Installation
Install the toolbox via pip or clone this repository. In order to use pip, type:
```
pip install ptwt
```
You can remove it later by typing `pip uninstall ptwt`.
Single dimensional transform
One way to compute fast wavelet transforms is to rely on padding and convolution. Consider the following example:
```python
import torch
import numpy as np
import pywt
import ptwt  # use "from src import ptwt" for a cloned repo
# generate an input of even length.
data = np.array([0, 1, 2, 3, 4, 5, 6, 7, 7, 6, 5, 4, 3, 2, 1, 0])
data_torch = torch.from_numpy(data.astype(np.float32))
wavelet = pywt.Wavelet('haar')
# compare the forward fwt coefficients
print(pywt.wavedec(data, wavelet, mode='zero', level=2))
print(ptwt.wavedec(data_torch, wavelet, mode='zero', level=2))
# invert the fwt.
print(ptwt.waverec(ptwt.wavedec(data_torch, wavelet, mode='zero'),
                   wavelet))
```
The functions `wavedec` and `waverec` compute the 1d-fwt and its inverse.
Internally, both rely on `conv1d` and its transposed counterpart `conv_transpose1d`
from the `torch.nn.functional` module. This toolbox also supports discrete wavelets,
see `pywt.wavelist(kind='discrete')` for an overview.
Daubechies wavelets (db-x) and symlets (sym-x) have been tested and are usually a good starting point.
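For example, a fourth-order Daubechies wavelet can be passed by name; a minimal sketch, where the choice of signal and level is arbitrary:
```python
import torch
import ptwt

data = torch.arange(16, dtype=torch.float32)
# fourth-order Daubechies wavelet, passed by name instead of a pywt.Wavelet object
coefficients = ptwt.wavedec(data, "db4", mode="zero", level=2)
reconstruction = ptwt.waverec(coefficients, "db4")
print(reconstruction[: data.shape[0]])  # boundary padding may append extra samples
```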
Two-dimensional transform
Analogous to the 1d case, `wavedec2` and `waverec2` rely on
`conv2d` and its transposed counterpart `conv_transpose2d`.
To test an example, run:
```python
import ptwt, torch
from scipy import datasets

data = torch.tensor(datasets.face(), dtype=torch.float64)
# permute [H, W, C] -> [C, H, W]
data = data.permute(2, 0, 1)

coefficients = ptwt.wavedec2(data, "haar", level=2, mode="constant")
reconstruction = ptwt.waverec2(coefficients, "haar")
torch.max(torch.abs(data - reconstruction))
```
Speed tests
Speed tests comparing our tools to related libraries are available.
Boundary Wavelets with Sparse-Matrices
In addition to convolution and padding approaches,
sparse-matrix-based code with boundary wavelet support is available.
In contrast to padding, boundary wavelets do not add extra pixels at
the edges.
Internally, boundary wavelet support relies on `torch.sparse.mm`.
Generate 1d sparse-matrix forward and backward transforms with the
`MatrixWavedec` and `MatrixWaverec` classes.
Reconsidering the 1d case, try:
```python
import torch
import ptwt  # use "from src import ptwt" for a cloned repo

# generate an input of even length.
data = torch.arange(16, dtype=torch.float32)

# forward
matrix_wavedec = ptwt.MatrixWavedec("haar", level=2)
coeff = matrix_wavedec(data)
print(coeff)

# backward
matrix_waverec = ptwt.MatrixWaverec("haar")
rec = matrix_waverec(coeff)
print(rec)
```
The process for the 2d transforms `MatrixWavedec2` and `MatrixWaverec2` works similarly.
By default, a separable transformation is used.
To use a non-separable transformation, pass `separable=False` to `MatrixWavedec2` and `MatrixWaverec2`.
Separable transformations use a 1D transformation along both axes, which might be faster since fewer matrix entries
have to be orthogonalized.
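A minimal sketch of the 2d case, assuming a leading batch dimension on the input (the random image and its size are arbitrary placeholders):
```python
import torch
import ptwt

# a single 32x32 image with a leading batch dimension, i.e. [N, H, W]
image = torch.randn(1, 32, 32, dtype=torch.float64)

# non-separable boundary-filter transform and its inverse
matrix_wavedec2 = ptwt.MatrixWavedec2("haar", level=2, separable=False)
matrix_waverec2 = ptwt.MatrixWaverec2("haar", separable=False)
coeff = matrix_wavedec2(image)
rec = matrix_waverec2(coeff)
print(torch.max(torch.abs(image - rec)))
```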
Adaptive Wavelets
Experimental code to train an adaptive wavelet layer in PyTorch is available in the examples folder. In addition to static wavelets
from pywt,
- Adaptive product-filters
- and optimizable orthogonal-wavelets are supported.
See https://github.com/v0lta/PyTorch-Wavelet-Toolbox/tree/main/examples/network_compression/ for a complete implementation.
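As a rough illustration of the idea (not the toolbox's implementation), one ingredient of such training is a soft orthogonality penalty on a learnable filter; real adaptive-wavelet training combines several such terms. The sketch below is hypothetical and optimizes a two-tap filter towards the orthonormal Haar lowpass filter:
```python
import torch

# learnable two-tap lowpass filter, initialized away from the Haar solution
filt = torch.nn.Parameter(torch.tensor([0.5, 0.5]))

def orthogonality_penalty(f):
    # autocorrelation r[k] = sum_n f[n] f[n+k] over all lags via conv1d
    length = f.shape[0]
    acf = torch.nn.functional.conv1d(
        f.view(1, 1, -1), f.view(1, 1, -1), padding=length - 1
    ).view(-1)
    even = acf[(length - 1) % 2 :: 2]    # keep the even lags only
    target = torch.zeros_like(even)
    target[(length - 2) // 2] = 1.0      # Kronecker delta at lag zero
    # orthonormal filters satisfy sum_n f[n] f[n - 2k] = delta_k
    return torch.sum((even - target) ** 2)

optimizer = torch.optim.SGD([filt], lr=0.1)
for _ in range(200):
    optimizer.zero_grad()
    loss = orthogonality_penalty(filt)
    loss.backward()
    optimizer.step()
print(filt)  # approaches the Haar lowpass filter [1/sqrt(2), 1/sqrt(2)]
```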
Testing
The tests folder contains multiple tests to allow independent verification of this toolbox.
The GitHub workflow executes a subset of all tests for efficiency reasons.
After cloning the repository, moving into the main directory, and installing nox with `pip install nox`, run
```
nox --session test
```
for all existing tests.
If you use this work in a scientific context, please cite the following:
@article{JMLR:v25:23-0636,
author = {Moritz Wolter and Felix Blanke and Jochen Garcke and Charles Tapley Hoyt},
title = {ptwt - The PyTorch Wavelet Toolbox},
journal = {Journal of Machine Learning Research},
year = {2024},
volume = {25},
number = {80},
pages = {1--7},
url = {http://jmlr.org/papers/v25/23-0636.html}
}