Created by Caspar Addyman [email protected]
Version - 0.2
Videos of interacting humans are converted into time series data with OpenPose. The data are processed and then various statistical measures of synchrony and causality between actors are calculated using scipy.
This repository provides Python code and annotated Jupyter notebooks to perform these actions.
- Step 0: Getting started with this project. What to install (besides these files).
- Step 1: Process a video (or videos) with OpenPose, creating a JSON file per frame with wireframe data for all identified persons ('actors'). Extract the frame-by-frame data from the JSON files, video by video, and combine it into a single NumPy array (see the first sketch after this list).
- Step 2: Load the NumPy array from Step 1 and perform basic validation (identifying individuals over time, tagging windows of interest, handling missing data).
- Step 3: Perform Fourier analysis to extract rhythmic movements and compare across groups.
- Step 4: Calculate cross-correlations, Granger causality and other measures between multiple actors in the same video. Still in development; a minimal cross-correlation sketch follows this list.
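To give a flavour of what Step 1 produces, here is a minimal sketch (not the notebook code itself) of reading a folder of per-frame OpenPose JSON files into one NumPy array. Each person in an OpenPose JSON file has a `pose_keypoints_2d` entry that is a flat list of x, y, confidence triples; the folder name, the two-person limit and the BODY_25 value count here are assumptions:

```python
import glob
import json

import numpy as np

def load_openpose_frames(json_dir, max_people=2, n_values=75):
    """Read one OpenPose JSON file per frame into an array of shape
    (frames, people, values). The BODY_25 model gives 25 keypoints
    x (x, y, confidence) = 75 values; absent people are left as NaN."""
    files = sorted(glob.glob(json_dir + "/*.json"))
    data = np.full((len(files), max_people, n_values), np.nan)
    for f_idx, path in enumerate(files):
        with open(path) as f:
            frame = json.load(f)
        for p_idx, person in enumerate(frame["people"][:max_people]):
            data[f_idx, p_idx, :] = person["pose_keypoints_2d"]
    return data

# e.g. keypoints = load_openpose_frames("output_json")  # hypothetical folder name
```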
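And for Step 4, a minimal sketch of one such synchrony measure, the normalised cross-correlation between two actors' movement time series, using `scipy.signal`. The inputs are assumed to be equal-length 1-D movement series derived from the wireframe data:

```python
import numpy as np
from scipy import signal

def max_crosscorr(x, y):
    """Peak normalised cross-correlation between two movement series
    and the lag (in frames) at which it occurs."""
    x = (x - np.mean(x)) / (np.std(x) * len(x))  # scale so outputs are correlations
    y = (y - np.mean(y)) / np.std(y)
    corr = signal.correlate(x, y, mode="full")
    lags = signal.correlation_lags(len(x), len(y), mode="full")
    peak = np.argmax(np.abs(corr))
    return corr[peak], lags[peak]

# e.g. r, lag = max_crosscorr(actor0_movement, actor1_movement)  # hypothetical series
```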
To get these scripts working on a new system, you need to do the following.
First, make sure you have the supporting software installed.
- You need a working Python environment (Python v3.7 or higher) with support for Jupyter notebooks. The easiest way to do this is to install Anaconda.
- Install OpenPose
- Next you need to download the trained neural-network models that OpenPose uses. To do this, go to the `models` subdirectory of the OpenPose directory and double-click / run the `models.bat` script. (A sketch for checking your installation follows this list.)
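Step 1 will process your videos with OpenPose, but it can be worth checking the installation directly first. A minimal sketch, assuming a Windows install in `C:\openpose` and a test video of your own (on Linux/macOS the binary is `openpose.bin` rather than `OpenPoseDemo.exe`):

```python
import subprocess

# hypothetical paths - point these at your OpenPose install and your own video
openpose_dir = r"C:\openpose"
subprocess.run(
    [r"bin\OpenPoseDemo.exe",
     "--video", r"C:\videos\test.mp4",
     "--write_json", r"C:\videos\output_json"],
    cwd=openpose_dir,  # OpenPose looks for its models folder relative to here
    check=True,
)
```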
You have two options to install this code. Either download the contents of this repository as a zip file to your local machine, or, if you are familiar with GitHub, fork this repository and keep your own copy of the files with a version history. We recommend using the GitHub Desktop app to manage this.
- From your Anaconda folder, launch an Anaconda command prompt.
- Create a new environment for this project from our `environment.yml` file using the command `conda env create -f environment.yml`.
- Switch to your newly created environment with the command `conda activate VASC`.
- Launch Jupyter from the command line with the command `jupyter notebook` or `jupyter lab`, or launch it by clicking the Jupyter icon within Anaconda Navigator (remember to switch environment first in the 'Applications on ...' dropdown).
- Open the notebook `Step 0.GettingStarted.ipynb` and follow the instructions in there.
The main requirements for this project are found in the `environment.yml` file in this directory. This can be used to create a new (ana)conda environment like so:
`conda env create -f environment.yml`
The main requirements are:
- numpy, pandas, glob2, opencv
- pyarrow, xlrd, jupytext
- ipywidgets, ipycanvas
- nodejs
(and their dependencies).
The folder `DrumTutorial` will step you through a small example of using Fourier transforms to extract drumming tempo from a set of short videos of infants and adults drumming. It will be downloaded when you install the contents of this folder.
In the meantime, you can watch this walkthrough video.
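To give a flavour of the approach, here is a minimal sketch of pulling a dominant drumming tempo out of a movement time series with a Fourier transform; the `movement` array and the 30 fps frame rate are assumptions, not values taken from the tutorial:

```python
import numpy as np

def dominant_tempo(movement, fps=30.0):
    """Return the strongest movement frequency (Hz) and the same tempo in beats per minute."""
    movement = movement - np.mean(movement)  # remove the constant (DC) component
    spectrum = np.abs(np.fft.rfft(movement))
    freqs = np.fft.rfftfreq(len(movement), d=1.0 / fps)
    peak = np.argmax(spectrum[1:]) + 1       # skip the zero-frequency bin
    return freqs[peak], freqs[peak] * 60.0
```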
These tools were developed for a scientific project that aims to see whether parents and babies move in synchrony with each other and whether this predicts caring outcomes. The details are found here:
Automated measurement of responsive caregiving at scale using machine learning. Royal Academy of Engineering / Global Challenges Research Fund Overview document
This project was supported by the Royal Academy of Engineering Global Challenges Research Fund Grant: Frontiers of Development - Tranche 2 - FoDSF\1920\2\100020
If you have any comments or questions, either contact Caspar Addyman [email protected] or submit an issue report here on GitHub.