Re: reaction time measures (Christophe Pallier)


Subject: Re: reaction time measures
From:    Christophe Pallier  <christophe.pallier@xxxxxxxx>
Date:    Fri, 8 Jun 2007 12:22:10 +0200
List-Archive: <http://lists.mcgill.ca/scripts/wa.exe?LIST=AUDITORY>

Check psychopy (http://www.psychopy.org/) (for visual presentations).

If you do not mind programming, python/pygame is good enough for many
experiments.

It is difficult to ensure that you have millisecond precision on each
trial, because non-real-time, pre-emptive operating systems like Windows
or Linux can switch to other processes and create latencies (i.e. delays
in stimulus delivery or in the detection of a button press). But this is
not often a difficulty in practice: first, you can check for latencies in
your program; second, the variance of human subjects' decision times is
typically an order of magnitude larger than the variance in the
computer's timing.

Christophe

2007/6/7, Hornsby, Benjamin Wade Young <ben.hornsby@xxxxxxxx>:
>
> Hi,
>
> Sorry if this is not a relevant topic for the list. I am interested in
> doing a visual reaction time task where subjects respond when a simple
> visual stimulus is presented on a computer screen. I'm looking for
> accuracy within ~5-10 ms and I'd like, if possible, to make this
> portable and use a laptop computer without having to drag along any
> external hardware for running the experiment. Does anyone know of
> existing software that could do this? Any leads would be greatly
> appreciated.
>
> Thanks Much,
>
> Ben Hornsby

--
Christophe Pallier ( www.pallier.org )
INSERM CEA Cognitive Neuroimaging unit ( www.unicog.org )

Currently at the Grup Recerca Neurociencia Cognitiva of the University
of Barcelona ( www.ub.es/pbasic/sppb/ingl )

Tel: +34 93 280 4000 ext. 4420 or +34 93 600 97 69
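[Archive note: the "check for latencies in your program" suggestion above can be sketched in plain Python. This is a hypothetical example, not code from the original message: it measures how much the OS overshoots a requested sleep interval, which is one way to estimate the scheduling jitter your experiment would see on a given laptop. The function name and parameters are illustrative.]

```python
import time

def measure_sleep_jitter(target_ms=5.0, trials=100):
    """Estimate how late the OS wakes us from a requested sleep.

    On a pre-emptive OS, time.sleep() can return later than asked
    because other processes run in between. The overshoot (elapsed
    minus requested time) approximates the timing latency a simple
    software-only reaction-time program would suffer.

    Returns (mean_overshoot_ms, worst_overshoot_ms).
    """
    overshoots = []
    for _ in range(trials):
        start = time.perf_counter()          # high-resolution monotonic clock
        time.sleep(target_ms / 1000.0)
        elapsed_ms = (time.perf_counter() - start) * 1000.0
        overshoots.append(elapsed_ms - target_ms)
    return sum(overshoots) / len(overshoots), max(overshoots)

mean_ms, worst_ms = measure_sleep_jitter()
print(f"mean overshoot: {mean_ms:.2f} ms, worst: {worst_ms:.2f} ms")
```

If the worst-case overshoot stays within a few milliseconds, a laptop without external hardware is plausibly inside the ~5-10 ms accuracy the question asks for; if not, the same measurement tells you by how much it falls short.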


This message came from the mail archive
http://www.auditory.org/postings/2007/
maintained by:
DAn Ellis <dpwe@ee.columbia.edu>
Electrical Engineering Dept., Columbia University