We don’t have a document that describes specifically how to access hardware acceleration. I looked for information on hardware acceleration for the Atom and found some interesting documents, but nothing specific to the Edison.
What is your goal, specifically? To do something like drejkim/edi-cam on GitHub (video streaming on Intel Edison)?
Actually, I am trying to use a FLIR Lepton and a USB webcam to do multi-spectral imaging and feature detection. I'm not sure yet whether processing on the Edison or broadcasting to a separate compute machine will be better, but either way I need a reliable way to stream the resulting video off the Edison (right now I just have a hacked-up MJPEG HTTP servlet).
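For anyone curious what the "hacked-up MJPEG HTTP servlet" approach looks like, here is a minimal sketch in Python using only the standard library. The `get_frame()` stub is hypothetical; in a real setup it would be replaced by actual camera capture (V4L2 webcam or Lepton readout) returning JPEG-encoded bytes.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

BOUNDARY = b"frameboundary"

def get_frame():
    # Placeholder: substitute real camera capture here (e.g. a V4L2
    # webcam read or Lepton frame encoded to JPEG).
    return b"\xff\xd8...\xff\xd9"  # stand-in for JPEG bytes

def mjpeg_part(jpeg_bytes):
    """Wrap one JPEG frame as a multipart/x-mixed-replace part."""
    return (b"--" + BOUNDARY + b"\r\n"
            b"Content-Type: image/jpeg\r\n"
            b"Content-Length: " + str(len(jpeg_bytes)).encode() + b"\r\n\r\n"
            + jpeg_bytes + b"\r\n")

class MJPEGHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The browser keeps the connection open and replaces the image
        # each time a new multipart chunk arrives.
        self.send_response(200)
        self.send_header("Content-Type",
                         "multipart/x-mixed-replace; boundary="
                         + BOUNDARY.decode())
        self.end_headers()
        while True:
            self.wfile.write(mjpeg_part(get_frame()))

# To serve on port 8080 (blocks forever):
# HTTPServer(("", 8080), MJPEGHandler).serve_forever()
```

This works for quick viewing in a browser, but it re-sends full JPEGs per frame with no inter-frame compression, which is part of why a proper encoder pipeline scales better.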
From the repository you linked, it seems that relying on FFmpeg for the optimizations is the current way to go.
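As a starting point, something like the following FFmpeg invocation could push the webcam stream off the device. The device path, resolution, and destination address are assumptions to adjust for your setup; if the camera already emits MJPEG, stream-copying avoids re-encoding on the Edison's CPU entirely.

```shell
# Grab MJPEG frames directly from a V4L2 webcam and relay them over UDP
# without re-encoding (-c:v copy). Device, size, and host are examples.
ffmpeg -f v4l2 -input_format mjpeg -video_size 640x480 -i /dev/video0 \
       -c:v copy -f mjpeg udp://192.168.1.100:9999
```

Transcoding to H.264 instead (e.g. `-c:v libx264`) would cut bandwidth considerably, but that is where Atom-specific encoder optimizations in the FFmpeg build start to matter.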