Yet Another Computer Vision Index To Datasets (YACVID) - Details

As of: 2017-10-19 12:51:36

Name (Institute + Shorttitle): Berkeley Multimodal Human Action Database (MHAD)
Description (include details on usage, files and paper references): The Berkeley Multimodal Human Action Database (MHAD) contains 11 actions performed by 7 male and 5 female subjects aged 23-30, plus one elderly subject. All subjects performed 5 repetitions of each action, yielding about 660 action sequences, which correspond to about 82 minutes of total recording time.

F. Ofli, R. Chaudhry, G. Kurillo, R. Vidal and R. Bajcsy. Berkeley MHAD: A Comprehensive Multimodal Human Action Database. In Proceedings of the IEEE Workshop on Applications of Computer Vision (WACV), 2013.
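
As a quick sanity check on the figures quoted above, the short Python sketch below (not part of the dataset or the paper) recomputes the sequence count from the subject/action/repetition breakdown and derives an approximate average clip length from the quoted 82 minutes of recording time.

# Sanity check for the numbers in the description (sketch only).
num_actions = 11       # actions performed by each subject
num_subjects = 12      # 7 male + 5 female
num_repetitions = 5    # repetitions of each action per subject

num_sequences = num_actions * num_subjects * num_repetitions  # 660
total_minutes = 82                                            # approximate total recording time
avg_clip_seconds = total_minutes * 60 / num_sequences         # roughly 7.5 s per sequence

print(num_sequences, round(avg_clip_seconds, 1))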
URL Link: http://tele-immersion.citris-uc.org/berkeley_mhad/
Files (#): 660
References (SKIPPED):
Category (SKIPPED): Action, Classification
Tags (single words, spaced): action classification multiview motion recognition
Last Changed: 2017-10-19