MATLAB code for BART, used in Lu, H., Ichien, N., & Holyoak, K. J. (in press). Probabilistic analogical mapping with semantic relation networks. Psychological Review.
Relation learning: BART
Verbal analogy dataset
Causal reasoning: SS prior
MATLAB code for simulating models of human causal induction, used in Lu, H., Yuille, A., Liljeholm, M., Cheng, P. W., & Holyoak, K. J. (2008). Bayesian generic priors for causal learning. Psychological Review, 115(4), 955-984. [PDF]
Causal reasoning and analogy
MATLAB code for Holyoak, K. J., Lee, H. S., & Lu, H. (2010). Analogical and category-based inference: A theoretical integration with Bayesian causal models. Journal of Experimental Psychology: General, 139(4), 702-727. [PDF] Note: requires the MATLAB Symbolic Toolbox.
Biological motion research is an increasingly active field with great potential to contribute to a wide range of applications, such as behavioral monitoring and motion detection in surveillance settings, intention inference in social interactions, and diagnostic tools in autism research. In recent years, a large amount of motion capture data has become freely available online, potentially providing rich stimulus sets for biological motion research. However, there is currently no easy-to-use tool for extracting, presenting, and manipulating motion capture data in the MATLAB environment, which many researchers use to program their experiments. We have developed the Biomotion Toolbox, which allows researchers to import motion capture data in a variety of formats, to display actions using Psychtoolbox 3, and to manipulate action displays in specific ways (e.g., inversion, 3D rotation, spatial scrambling, phase scrambling, and limited lifetime). The toolbox was designed so that researchers with minimal MATLAB programming skills can code experiments using biological motion stimuli. Download the Biomotion Toolbox V2 here.
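To illustrate what these manipulations do, here is a minimal sketch in base MATLAB operating on a generic motion-capture array (a hypothetical J x 3 x T array of joint coordinates over time; the variable names are illustrative and are not the Biomotion Toolbox API):

```matlab
% Hypothetical point-light data: J joints x 3 coordinates (x,y,z) x T frames.
J = 13; T = 100;
pts = randn(J, 3, T);          % stand-in for imported motion-capture data

% Inversion: flip the display about the horizontal axis (negate y).
inverted = pts;
inverted(:, 2, :) = -inverted(:, 2, :);

% Spatial scrambling: add a fixed random offset to each joint's trajectory,
% preserving local joint motion but destroying the global body structure.
offsets = repmat(2 * randn(J, 3), [1, 1, T]);
scrambled = pts + offsets;

% Phase scrambling: start each joint's trajectory at a random time point
% (wrapping around), so each joint moves naturally but out of phase.
phased = pts;
for j = 1:J
    shift = randi(T);
    phased(j, :, :) = circshift(pts(j, :, :), shift, 3);
end
```

The toolbox itself handles importing, display via Psychtoolbox 3, and these manipulations without requiring users to write such code by hand.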
Human action online database
Human3.6M contains 3.6 million 3D human poses and corresponding images for 17 scenarios (Ionescu, Papava, Olaru, & Sminchisescu, 2014).
UCF101 includes 13,320 videos covering 101 action types (Soomro, Zamir, & Shah, 2012).
HMDB-51 includes 6,766 videos covering 51 action types (Kuehne, Jhuang, Garrote, Poggio, & Serre, 2011).
CMU Graphics Lab Motion Capture Database, with six motion categories and more than 100 motion sequences.
HumanEva Dataset contains 4 subjects performing 6 common actions (e.g., walking, jogging, gesturing).
CVonline: Action databases includes hundreds of action videos used for computer vision research and algorithm evaluation.
Point-light display movies from Dr. Thomas Shipley’s lab.
A motion-capture library for the study of identity, gender, and emotion perception (Ma, Paterson, & Pollick).
Human biological and nonbiological point-light movements: Creation and validation of the dataset (Lapenta, O. M., Xavier, A. P., Côrrea, S. C., et al., Behavior Research Methods, 2016).
Leuven Action Database (Vanrie & Verfaillie, 2004).
Leuven Communicative Interaction Database (Manera, Schouten, Becchio, Bara, & Verfaillie, 2010).
Statistical test tools
Test of Mardia’s coefficients of multivariate skewness and kurtosis
J. Arthur Woodward and Hongjing Lu
MATLAB code: [download]
Reference: Bonett, D. G., Woodward, J. A., & Randall, R. L. (2002). Estimating p-values for Mardia’s coefficients of multivariate skewness and kurtosis. Computational Statistics, 17(1), 117-121.
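As background for the p-value estimation in the reference above, Mardia's coefficients themselves can be computed in a few lines of MATLAB. This is a minimal sketch using the standard definitions with the classical large-sample tests, not the downloadable code (X is assumed to be an n x p data matrix):

```matlab
% Mardia's multivariate skewness (b1p) and kurtosis (b2p) for an n x p data matrix X.
n  = size(X, 1);  p = size(X, 2);
Xc = X - mean(X, 1);            % center the data (implicit expansion)
S  = (Xc' * Xc) / n;            % biased sample covariance, as in Mardia (1970)
D  = Xc / S * Xc';              % D(i,j) = (x_i - xbar)' * inv(S) * (x_j - xbar)

b1p = sum(D(:).^3) / n^2;       % multivariate skewness coefficient
b2p = sum(diag(D).^2) / n;      % multivariate kurtosis coefficient

% Classical asymptotic tests under multivariate normality:
skewStat = n * b1p / 6;                           % ~ chi-square
skewDf   = p * (p + 1) * (p + 2) / 6;             % degrees of freedom
kurtZ    = (b2p - p*(p+2)) / sqrt(8*p*(p+2)/n);   % ~ standard normal
```

The Bonett, Woodward, and Randall (2002) method improves on these asymptotic approximations for estimating p-values, which is what the downloadable MATLAB code implements.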