Title: expyfun: Python experimental paradigm functions, version 2.0.0

Type: Software

Citation: Eric Larson, Daniel McCloy, Ross Maddox, Dean Pospisil (2014): expyfun: Python experimental paradigm functions, version 2.0.0. Zenodo. Software. https://zenodo.org/record/11640

Authors: Eric Larson (University of Washington); Daniel McCloy (University of Washington); Ross Maddox (University of Washington); Dean Pospisil (University of Washington)

Summary

expyfun is a Python module designed for running audiovisual psychophysics experiments. Prominent features include integration with TDT audio hardware and EyeLink eye-tracking hardware, audio synchronization with display refresh times, and a wide variety of helper scripts for stimulus generation, analysis, and plotting.

Summary of changes since first release (not an exhaustive list):

  • dropped Pyo in favor of Pyglet audio
  • expanded EyeLink support
  • new stimulus creation, analysis, and visualization methods, including convolve_hrtf, fit_sigmoid, and barplot
  • some changes to the API of ExperimentController (most notably the replacement of flip_and_play with start_stimulus)

More information

  • DOI: 10.5281/zenodo.11640

Subjects

  • python, psychophysics

Dates

  • Publication date: 2014
  • Issued: April 09, 2014


Format

electronic resource

Related items

  • IsSupplementTo: https://github.com/LABSN/expyfun/tree/2.0.0
  • IsVersionOf: https://doi.org/10.5281/zenodo.592546
  • IsPartOf: https://zenodo.org/communities/zenodo