Volume 3, Issue 2
  • ISSN: 2665-9085
  • E-ISSN: 2665-9085

Abstract

Measurement noise differs by instrument and limits the validity and reliability of findings. Researchers collecting reaction time data introduce noise in the form of response time latency from hardware and software, even when collecting data on standardized computer-based experimental equipment. Reaction time is a measure with broad application for studying cognitive processing in communication research, and it is vulnerable to response latency noise. In this study, we used an Arduino microcontroller to generate a ground-truth value of average response time latency in , an open-source, naturalistic, experimental video game stimulus. We tested whether response time latency differed across computer operating system, software, and trial modality. Here we show that reaction time measurements collected using were susceptible to response latency variability on par with that of other response-latency measurement software. These results demonstrate that is a valid and reliable stimulus for collecting reaction time data. Moreover, we provide researchers with a low-cost, open-source tool for evaluating response time latency in their own labs. Our results highlight the importance of validating measurement tools, and they support the philosophy of contributing methodological improvements to communication science.
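The validation procedure described above amounts to comparing hardware-recorded (Arduino) event times against software-logged response times and summarizing the difference. As a minimal sketch of that comparison (not the authors' code; the function name and all timestamps are illustrative), per-trial latency and its trial-to-trial variability could be computed like this:

```python
from statistics import mean, stdev

def latency_stats(hardware_ms, software_ms):
    """Per-trial response time latency: the software-logged timestamp minus
    the microcontroller's ground-truth timestamp (both in milliseconds)."""
    latencies = [s - h for h, s in zip(hardware_ms, software_ms)]
    return {
        "mean_latency_ms": mean(latencies),   # average added delay
        "sd_latency_ms": stdev(latencies),    # trial-to-trial jitter
    }

# Hypothetical timestamps for five trials
hw = [100.0, 350.0, 602.0, 851.0, 1100.0]
sw = [118.0, 371.0, 618.0, 872.0, 1121.0]
print(latency_stats(hw, sw))
```

The mean latency captures the systematic offset a given operating system and software stack adds, while the standard deviation captures the jitter that would inflate measurement noise in reaction time data.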

2021-10-01
2021-10-20
