
Using Technology to Facilitate Healing Human Connection

GlenM November 10th, 2016

7 Cups is a technology company. We use technology to expand access to free, high-quality emotional support. Before 7 Cups, you had to visit a therapist to get support; now, you can open an app or visit our site and connect with a trained and compassionate listener. Technology has enabled us to make getting help convenient and easy.

Some tech companies are interested in using technology to replace humans. For example, Uber wants to create self-driving cars to replace human drivers. Replacing humans unequivocally helps Uber.

7 Cups is not interested in using technology to replace humans. That is the opposite of our mission. We do not want to decrease human connection. We want to significantly increase and facilitate healing human connection. Replacing humans would unequivocally harm 7 Cups.

According to some estimates, people in the developed world (think the US, Europe, etc.) have eight times as much mental illness as people in the developing world. Isn't that staggering? I believe one big reason for this is that there is much more human connection in the developing world than in the developed world. People are much more isolated in countries with more financial resources. It is a major problem and one that we are working hard to help solve.

The developed world has access to more technology. The answer, however, is not to become a Luddite (someone opposed to new technology). The answer is instead to leverage technology - much like we've already been doing (!) - to facilitate human connection. In other posts, I've talked about how we can use technology to better match people with listeners and better train listeners. These improvements increase human connection. They help us better fulfill our mission.

Last week we did some testing with Noni (our bot) leading a group discussion. It was an experiment. We wanted to see how a really basic version of Noni would do. It was nothing more than a queue of questions fired at certain times. Super low tech. It was essentially just a list of questions/thoughts/ideas. Imagine if a person just typed in a question every few minutes and didn't do anything else. That is what Noni did.
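To make the "super low tech" point concrete, the whole experiment can be sketched in a few lines. This is a hypothetical illustration, not 7 Cups' actual code: the prompts and the `post_to_room` function are assumptions, and the bot reads no replies at all - it just posts each question in turn and waits.

```python
import time

# A minimal sketch of a "queue of questions" discussion bot, as described
# above. All names and prompts here are illustrative assumptions.

DISCUSSION_PROMPTS = [
    "Welcome everyone! What brought you to today's discussion?",
    "What is one coping strategy that has helped you this week?",
    "How do you usually reach out when you need support?",
]

def run_discussion(post_to_room, prompts=DISCUSSION_PROMPTS, interval_seconds=300):
    """Post each prompt in order, pausing between them; replies are never read."""
    for prompt in prompts:
        post_to_room(prompt)
        time.sleep(interval_seconds)
```

Everything else in the room - the nuance, the mutual support - came from the people, which is the point of the paragraph above.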

We learned some interesting things. People naturally kind of fit in around Noni. They brought nuance to the discussion in a way that Noni never could. They supported one another and, overall, it seemed to be a fairly positive experience. The people in the group and Noni complemented one another. Noni was a member of the team. A pretty limited member, but still a member.

I think this is a good metaphor or way to approach technology on 7 Cups. Tech, or Noni, can be a part of our team. Her role, and the role of all our tech, is to facilitate human connection. Tech needs to increase the way that we care for one another. If it doesn't, then it is fundamentally undermining all that we stand for here at 7 Cups. I think about our team of amazing group moderators. They are doing incredibly important work. (Side note: Group therapy/moderating is actually a tech innovation on individual therapy. Individual therapy helps one person; group therapy helps multiple people at the same time, with similar levels of efficacy.) @Heather and our mods work very hard to scale support. We do not have research yet, but soon will, on just how much they help people. I think we'll all be surprised by the power of group support.

Do we want to replace moderators with Noni? No. Never. Replacing a moderator would be against our mission. It would decrease human connection. If that is the case, then when does it make sense to use Noni? It makes sense to have Noni as a part of the moderating team when we do not have a moderator. This might be late at night. It might be at certain time intervals when all the mods are busy. It might be for a new group room that doesn't yet have a mod team. Noni is there to fill in the gaps as long as we need her. Once we are covered, we can turn her off or remove her from that temporary role.

We are in a very interesting time right now. Bots are coming and they are taking jobs (bank tellers, taxi drivers, translators...the list goes on and on). This trend will accelerate. Our mission is to increase human connection and healing. Talking with a bot alone cannot do that (check out the film Her for a great portrayal). Bots can support us, but they cannot replace us and they cannot provide direct human connection.

Here is another way of looking at it:

Decreased human connection = increased mental health problems

Increased human connection = decreased mental health problems

7 Cups = increased human connection :)

7 Cups will always, fundamentally, be about human to human connection. That is why we exist.

As always, please share thoughts, questions, and feedback below! Thank you!

MidniteAngel November 10th, 2016

@Glen @Heather @Laura @Iara @Amelia

I think the main goal for Noni is to fill in spaces that lack human interaction - having her act as this sort of waiting-room receptionist gives users a connection where there otherwise wouldn't be one. Perhaps for new listeners or members we can give them the option to view a guided tour by Noni that explains the use of the site - a little tutorial on how the feed, forum, and chatrooms work (with, of course, the option to skip this tutorial or view it later). Also, I saw that you were all very interested in giving Noni a human face - but I thought making Noni an animated little non-human character would be more appealing, because this means there is no gender, race, appearance, or age bias, discrimination, or connotations that will make users uncomfortable - it would make it easier for all users to connect to Noni.

GlenM OP November 10th, 2016

@MidniteAngel, another great idea! @krinkthemellowunicorn has been thinking about this idea as well. We want to get there; we just have it pushed down the list while we try to knock out some other high-level items. Look for a new home page and other pages later today or tomorrow. Would love to hear your feedback :)

Navylady November 11th, 2016

@GlenM

I actually love this idea that @MidniteAngel has. I think that would be a wonderful addition. I am pretty computer savvy, so I may not have needed it, but not everyone is like me. Plus, not only could this help them learn the site, it could actually REALLY help them USE the site - giving them examples of what different areas are used for, like the chat rooms for group discussions or hangouts, and the 1-1 chats for, well, one-on-one time. Things like that would help them learn how to get the most out of the site.

GlenM OP November 11th, 2016

@Navylady and @MidniteAngel, totally agree and think it is a very smart idea :). I've tagged @krinkthemellowunicorn here. I'm sure we can get something like this together. Appreciate it!

KrinkTheMellowUnicorn November 14th, 2016

@Navylady- I love these suggestions. Keep them coming, please!

SilentSerenityy November 10th, 2016

@GlenM

I am all for Noni doing more discussions. I wasn't part of the one it did, but if it got a positive reception, it should be done more. Listeners' time is becoming more and more taken up by hosting discussions that many don't fully appreciate or get involved in, rather than by listening and helping out elsewhere on the site. Noni could host a couple of discussions at one time, would always start on time, and would never miss one. The more intricate/sensitive topics should be led by a human, but simple ones could be done by Noni, in my opinion.

As a moderator, I also support Noni helping out. She could instantly stop a message being posted if it contained any swear words that have been manipulated to avoid the censor, or any phrase that's not appropriate, and mute someone after several of their messages have been deleted. I'd be all for her being around 24/7, as I'm more comfortable when someone else is moderating with me.
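The kind of auto-moderation suggested above can be sketched simply: normalize common character substitutions used to dodge a censor, check each word against a blocklist, and mute a user once several of their messages have been deleted. This is a hypothetical sketch, not 7 Cups' actual moderation code; the substitution map, blocklist entries, and threshold are all illustrative assumptions.

```python
import re

# Undo common censor-dodging substitutions, e.g. "b@dw0rd" -> "badword".
SUBSTITUTIONS = str.maketrans({"@": "a", "$": "s", "0": "o", "1": "i", "3": "e"})
BLOCKLIST = {"badword", "worseword"}  # placeholder entries
MUTE_THRESHOLD = 3                    # deletions before a mute

deletions = {}  # user -> count of deleted messages

def is_blocked(message):
    """True if any normalized word in the message is on the blocklist."""
    normalized = message.lower().translate(SUBSTITUTIONS)
    return any(word in BLOCKLIST for word in re.findall(r"[a-z]+", normalized))

def handle_message(user, message):
    """Return the moderation action for a message: 'post', 'delete', or 'mute'."""
    if not is_blocked(message):
        return "post"
    deletions[user] = deletions.get(user, 0) + 1
    return "mute" if deletions[user] >= MUTE_THRESHOLD else "delete"
```

A real filter would need far more than this (phrases, spacing tricks, context), but the shape - normalize, match, escalate - is the idea being suggested.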

Sasher November 10th, 2016

Peace be upon you.

People are unable to connect with the group leader. It's kind of like saying we don't want to let our child drink coffee, but they seemed to really enjoy the froth. It seems like an easy solution to fix the shortage of mods.

There are apps like Kik where, when people are bored and have no one to talk to, they can talk to a bot instead.
I'm glad that the connection issues have been recognised, and it makes sense that lack of connection is a contributor to mental health problems. The way to solve this is to teach family values, and to teach people to respect, love, care for, and communicate with their parents and families.

"Replacing humans would unequivocally harm 7 Cups" Why have a bot lead a discussion instead of a human, and why put your feet in that paddling pool?
I understand some of you are very excited and geared up to introduce Noni and make it work, but I think 7 Cups is being used to further propagate the technological world, and I can imagine 7 Cups will be pitched to invest more finances into better, less simplistic bots.
"Noni was a member of the team. A pretty limited member, but still a member." Noni has replaced a potential human member.

I do understand that 7 Cups is a charity-based company and that, without a cash incentive, it must be difficult to find willing and committed moderators, but I still don't believe that bots are the way forward. Surely it would make more sense to find a solution to recruiting more moderators first, and to go from there.

"Bots are coming and they are taking jobs" "This trend will accelerate" There is a shortage of jobs and the human population is increasing. It's only going to become more difficult for humans to get jobs. The Eastern world has more to offer the Western world: morals, and traditional medicine and knowledge. But how long will it be before the Eastern world has little to no choice other than to accept the Western world's way of thinking? Will we then see a decline in their mental health also?

If you are familiar with the film I, Robot... films like these are usually designed to make sure that the product (in this case, bots) is not rejected once it reaches the market, as that would cause catastrophic loss to the industry. I think the film portrays the idea that robots are not a threat, as the humans win in the end. There are also robots in the world of weapons of mass destruction, and I believe it is geared towards removing human responsibility.

It isn't about fear of technology, but I have noticed that this generation is resistant to bots. I don't think that will be the case with the next generation, as I think they will be confused and willing to connect to a bot. I also believe that the future is only paving the way for further godlessness and decadence, and I think there is little hope for the masses and the world as a whole.

You don't have much option to shy away from bots if you want to continue to grow your "technology company". I am merely a member here of 7 Cups, but if you choose to completely ignore the issues that I have highlighted, everyone responsible for 7 Cups can never claim they didn't know.

BrightWriter November 11th, 2016

Innovation is great and exciting and all that, but I would also recommend putting some of that drive to innovate into improving the training for listeners and/or possibly even screening those who want to become one. This is to ensure that a "trained and compassionate listener" is who every guest or member connects with when they come seeking help, because right now that's not always the case.

KrinkTheMellowUnicorn November 14th, 2016

@BrightWriter - those are good points! Thank you. One thing we have done recently is completely revamp the Listener Progress Path, to provide another mechanism to encourage ongoing training and improvement among listeners.

Navylady November 11th, 2016

@GlenM

I just wanted to also say that I love this site. I like how I can connect with someone in a 1-1 chat at any time of the day. Or connect with multiple folks also pretty much any time of the day. That it is in multiple countries is even nicer. I can't wait to see what other things can be done with the site.

Although a listener myself, I would like to see more training available or even required. I'd love to see something where maybe every few months - like every quarter - listeners have to retake or redo some training, just to remind them of what they should be doing, and also of how what they do can affect others.

Some other things I was thinking of concern how members search for listeners. Although I don't mind talking to folks in other time zones, it can make things difficult at times when they need help now but I am at work or sleeping. Maybe we could add a search option for the time zone the listener is in?

Another thing I have mentioned in a forum post before is adding the user name to emails. For example, I use the same email address for both my listener and my member account, but that means that when I get emails I don't always know which me they are for. Usually I can figure it out, but every so often I can't.

I am sure I have other ideas and thoughts, but my brain is shutting down for the night, so I think that's my cue to go to bed.

soulsings November 11th, 2016

@GlenM thank you for sharing a wonderful option with Noni leading discussions. That is a great application.

Maybe more discussions that others do not have time to lead could be led by Noni. Noni is programmed to STOP when a member or guest is waiting for a listener. What if Noni was programmed to move to the next question when someone types "Next Noni"?

Noni could also be on a timer, so after five minutes Noni could say "ready for the next question?"
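The flow suggested above - stop for waiting members, advance on "Next Noni", offer the next question after a timeout - could be sketched as a single decision function. This is purely a hypothetical illustration; none of these names or rules come from the real Noni.

```python
def advance_question(questions, index, last_message, waiting_for_listener,
                     seconds_since_prompt, timeout=300):
    """Decide Noni's next action in a discussion; returns an action string."""
    if waiting_for_listener:
        return "stop"      # a member or guest needs a human listener
    if index >= len(questions):
        return "done"      # question queue exhausted
    if last_message.strip().lower() == "next noni":
        return "ask"       # a member explicitly asked to move on
    if seconds_since_prompt >= timeout:
        return "offer"     # say "ready for the next question?"
    return "wait"          # let the conversation continue
```

The ordering matters: the waiting-for-listener check comes first, so human support always takes priority over the bot's script.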

Technology can help us expand beyond what is humanly possible.

Thanks for sharing the process Glen.

RarelyCharlie November 12th, 2016

Some of those opening statements go far beyond reality, I think. Before 7 Cups you did not have to visit a therapist to get support. Supportive social networks are older than language. Technology didn't enable convenient and easy help. It only provided some additional channels, channels which only some people find convenient and easy.

Whether Uber wants to replace human drivers is unclear, even to Uber. In the end the decision will be crowdsourced to Uber's customers. If customers reject self-driving cars, Uber won't argue with them and go bust; Uber will accept the crowdsourced decision.

I think some of the reasons for Uber customers' collective decision are likely to be based on their perceptions of risk—not just road accidents, but also the bad things that human drivers sometimes do. Even though these things are rare, they can loom large in people's perception.

The car I drive is pretty old, but even so, when I want to go somewhere it speaks directions in English, choosing routes that avoid heavy traffic. If I want to phone home the car understands my voice commands. It knows the number and dials it for me. No one thinks of this as replacing humans, although once I would have needed a human passenger to map-read and to dial.

Yes, people naturally fit in around a simple bot that takes on a human-like role. This has been well known since people started to tell ELIZA their troubles half a century ago. Now, there's a self-driving minibus called Olli that understands spoken English and replies to its passengers in spoken English. I have no doubt that people will sometimes want to tell Olli their troubles just like they sometimes do with human cab drivers, and I have no doubt that Olli can be programmed to be as good a listener as any cab driver.

The time will come, I suspect, when some technology company makes a bot that's as good a listener as the best of our listeners at 7 Cups. Maybe it will be one of Olli's children. Then the people who now come to 7 Cups will be able to make a choice, just like Uber's customers.

I think some of the reasons for their decision will be their perceptions of risk—being invalidated, getting inappropriate advice, being blocked are just the first three that come to mind, based on what I read in these forums. In real life I recently heard about someone who sought support at 7 Cups and who reported becoming suicidal as a result of what one of our listeners said to him. Olli's children won't make these mistakes.

So, like it or not, I think one day the decision on whether bots can provide people with emotional support will be crowdsourced. I don't think we get to make a choice about whether or not that will happen. I think the only choice we get is whether to be the tech company that accepts the crowdsourced decision, or whether to be the tech company that argues and goes bust.

@GlenM