Conversational Agent Research Toolkit | Amsterdam University Press Journals Online
Computational Communication Research, Volume 2, Issue 1 (2020)
E-ISSN: 2665-9085
DOI: 10.5117/CCR2020.1.002.ARAU

Abstract

Conversational agents in the form of chatbots available on messaging platforms are becoming increasingly relevant in our communication environment. Built on natural language processing and generation techniques, they interact automatically with users in a variety of contexts. We present the Conversational Agent Research Toolkit (CART), a tool aimed at enabling researchers to create conversational agents for experimental studies. CART integrates existing APIs frequently used in practice and provides functionality that allows researchers to create and manage multiple versions of a chatbot to be used as stimuli in experimental studies. This paper provides an overview of the tool and a step-by-step tutorial on how to design an experiment with a chatbot.
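
The abstract notes that CART integrates commonly used APIs and lets researchers create and manage multiple versions of a chatbot as experimental stimuli. As a purely illustrative sketch of that experimental logic (not CART's actual interface; names such as ChatbotCondition and assign_condition are hypothetical), the Python snippet below shows how participants might be assigned to distinct chatbot versions:

    import random
    from dataclasses import dataclass

    # Hypothetical sketch only: these names are not part of the CART API.
    # Each condition bundles the design cues that distinguish one chatbot
    # version (e.g., human-like vs. machine-like framing) from another.
    @dataclass
    class ChatbotCondition:
        name: str
        greeting: str
        uses_small_talk: bool

    CONDITIONS = [
        ChatbotCondition("human_cues", "Hi, I'm Sam! How can I help you today?", True),
        ChatbotCondition("machine_cues", "Automated assistant ready. State your request.", False),
    ]

    def assign_condition(participant_id: str) -> ChatbotCondition:
        """Assign a participant to one chatbot version, reproducibly."""
        rng = random.Random(participant_id)  # seed with the ID so assignment is stable
        return rng.choice(CONDITIONS)

    if __name__ == "__main__":
        condition = assign_condition("participant_042")
        print(condition.name, "->", condition.greeting)

In an actual study, the assigned condition would determine which chatbot version the participant interacts with; CART's own mechanism for managing versions may differ from this sketch.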

