Home
The Musical Gestures Toolbox for Python is a collection of high-level modules targeted at researchers working with video recordings. It includes visualization techniques such as motion videos, motion history images, and motiongrams; techniques that allow for exploring video recordings from various temporal and spatial perspectives. It also includes basic computer vision analysis, such as extracting the quantity and centroid of motion, and using such features in further analysis.
The toolbox was initially developed to analyze music-related body motion (of musicians, dancers, and perceivers) but is equally helpful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.
The Musical Gestures Toolbox contains functions to analyze and visualize video, audio, and motion capture data. There are three categories of functions:
- Preprocessing (trimming, cropping, color adjustments, etc.)
- Visualization (video playback, image display, plotting)
- Processing (videograms, average images, motion images, etc.)
- Quick Start Tutorial
- Jupyter Notebook – can also be run in Colab
- Browse the detailed function documentation in this wiki
- Examples Overview
- Complete Documentation Site
This wiki provides detailed documentation for individual MGT functions:
- Installation - Setup instructions for all platforms
- Video Basics - Understanding video processing concepts
- Loading Videos - How to load and display videos
- Preprocessing - Video preprocessing techniques
- Video Analysis - Motion analysis and visualization
- Audio Analysis - Audio processing functions
- Output Management - Working with results
- Filtering Effects - Understanding filter parameters
- Function Chaining - Combining multiple operations
- File Naming - Output file conventions
The speed and efficiency of the MGT are made possible by the excellent FFmpeg project. Many of the toolbox functions are Python wrappers around FFmpeg commands executed in subprocesses.
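To illustrate the wrapper pattern described above, here is a minimal sketch of how a preprocessing function can assemble an FFmpeg command and run it in a subprocess. The function names and signatures here are illustrative assumptions, not the toolbox's actual API; only the FFmpeg flags are real.

```python
import subprocess

def build_trim_cmd(infile, outfile, start, duration):
    """Assemble an FFmpeg command that trims a clip without re-encoding.
    (Hypothetical helper, for illustration only.)"""
    return [
        "ffmpeg", "-y",        # overwrite the output file without asking
        "-ss", str(start),     # seek to the start time (seconds)
        "-t", str(duration),   # keep this many seconds
        "-i", infile,
        "-c", "copy",          # stream copy: fast, no re-encoding
        outfile,
    ]

def trim_video(infile, outfile, start, duration):
    """Run the assembled command in a subprocess, raising on failure."""
    cmd = build_trim_cmd(infile, outfile, start, duration)
    subprocess.run(cmd, check=True)
```

Separating command construction from execution keeps the FFmpeg invocation easy to inspect and test, which is one reason this pattern performs well: the heavy lifting stays in FFmpeg's optimized C code rather than in Python.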
Please help improve the toolbox by reporting bugs and submitting feature requests in the issues section.
A project from the fourMs Lab, RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, Department of Musicology, University of Oslo.