# Activate this only if you're working with the git version, inside the repository (for developers)
import sys
sys.path.insert(0, './src')
%load_ext autoreload
%autoreload 2
import logging
logging.getLogger().setLevel('INFO')
# Otherwise, install the released package from PyPI:
# ! pip install keras-video-generators
This package provides generators that yield video frames from a video dataset. It offers three classes: VideoFrameGenerator, SlidingFrameGenerator and OpticalFlowGenerator. The first one is the parent class of the other two.
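If you only need the generator classes, they can be imported directly from the package. This is a minimal sketch; it assumes the three classes are exported at the package level, as keras_video.VideoFrameGenerator is used later in this notebook.
from keras_video import VideoFrameGenerator, SlidingFrameGenerator, OpticalFlowGenerator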
This Notebook uses a subset of "HMDB: a large human motion database": https://serre-lab.clps.brown.edu/resource/hmdb-a-large-human-motion-database/
The first class is the simplest and probably the best to use if your dataset is large enough.
import keras_video
Here, we create a generator for the classes above. As you can see, the classname pattern is mapped to the directory name.
Since v1.0.10, you can omit the classes argument and let the generator discover every class matching the glob_pattern.
gen = keras_video.VideoFrameGenerator(batch_size=4, nb_frames=5, glob_pattern='./_test/{classname}/*', split_val=.2)
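Because split_val=.2 reserves part of the dataset for validation, a matching validation generator can be retrieved. This is a hedged sketch: it assumes the get_validation_generator() method is available in the installed version of the library.
# Generator serving the held-out validation split (assumes get_validation_generator() exists in this version)
val_gen = gen.get_validation_generator()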
You can use the transformation parameter to pass an ImageDataGenerator object that will randomly transform frames at each epoch.
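For example, a Keras ImageDataGenerator can be passed through transformation. The sketch below is illustrative: the augmentation values and the import path are assumptions, so adjust them to your Keras/TensorFlow setup.
# Illustrative augmentation settings (not from the original notebook)
from keras.preprocessing.image import ImageDataGenerator
data_aug = ImageDataGenerator(horizontal_flip=True, rotation_range=8, zoom_range=.1)
gen_aug = keras_video.VideoFrameGenerator(
    batch_size=4, nb_frames=5,
    glob_pattern='./_test/{classname}/*',
    split_val=.2,
    transformation=data_aug)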
To check your generator, you can use keras_video.utils, which provides a show_sample() function to display one batch (in a notebook).
from keras_video import utils as ku
ku.show_sample(gen, random=True)