
Listener Research Participation Opportunity: Simulation-Based Training of Active Listening Skills

SoulfullyAButterfly January 26th
Stanford University is conducting a research study to collaboratively develop technologies for simulation-based training of active listening skills. We are seeking participants to shape the design and evaluation of a Simulated Support Seeker, which we call a customized Member Bot. Eventually, we hope listeners will be able to interactively practice different conversational scenarios without needing a trained actor to role-play with them.


At this second stage, we are seeking feedback on several versions of the Simulated Support Seeker. As a participant, you will be invited to interact with a Simulated Support Seeker and to evaluate sample dialogues for their believability and usefulness.

The study will take approximately 1 hour to complete, and you will be compensated with a $25 Amazon gift card for your participation.

Participant Requirements:

Listeners interested in participating:

  • Must be 18 years of age or older.

  • Must be located in the U.S.

  • Must have helped 30+ members through 1-1 chats (People Helped).

Participants with experience in the 7 Cups Mock Chat program are especially encouraged to apply.

If you are interested in participating in this research, please fill out this interest form. We will reach out to you if you are a fit for the study.


Sounds lovely! :)

kindRiver6 January 28th

Unfortunately I can't take part as I'm located in the UK ☹️

Nightowl01 January 30th

@SoulfullyAButterfly

Every time I see something like this, where a gift card is offered as bribery to compel a person to do something, it raises a red flag for me. It's like going back to the Covid pandemic years, when people were bribed in various ways to get a vaccine which could potentially harm them. Unfortunately, many poor people in need of that Amazon gift card may fall for this without thinking about the potential harm this AI simulation information may cause in the future. You will literally be training a system so that we humans are no longer needed. Real human interpersonal relationships matter, and they are not something that should be replaced or replicated by a bot. Once society takes this jump, there will be no pulling it back.

SoulfullyAButterfly OP February 29th

@Nightowl01 The research team wanted to clarify: we are not seeking to replace humans during actual 7 Cups sessions; we want to retain and amplify the human connection and human support in these listening sessions. Instead, we aim to empower listeners who are rusty or less confident in their active listening skills by giving them a Member Bot to practice with. To do this ethically and responsibly, we want to involve 7 Cups listeners so we can get feedback on the bot's potential benefits and potential harms.
