The Return On Immersion – A New Metric For Charities & Their Virtual Narratives.

Much of the nonprofit tech discussed in our sector revolves around software: state-of-the-art platforms that lift our fundraising, new CRM plugins that automate and personalize our processes, and other operational offerings that help us deliver our work more effectively and efficiently for both internal and external clients.

What we haven’t really seen in the industry are robust discussions around hardware. You know, the kind that really excites us about what is coming, and how we might adopt it.

Over the past few years we have seen nonprofits and associations begin to use virtual reality, augmented reality, 360 video, and other innovative storytelling vehicles to help advance their missions, with some encouraging (if not exceptional) results. So how far away is the charitable sector from using these mediums as legitimate communications and fundraising tools?

The use of VR in the nonprofit sector is still in its infancy and only truly available to the largest charities in the world, due to the costs involved in creating original content. The ROI just isn’t there, especially if the tech is a transactional novelty, much like the headset displays you see at conferences that have no real connection to the content other than looking ‘futuristic’.

Current examples that have worked though (and are cause for further exploration for the broader sector) include:

Conferences: As mentioned, large nonprofit conferences have included some form of VR component for attendees for the past few years, ranging from simple VR booths that expose people to these computer-simulated realities to breakout sessions discussing the possibilities of its future use. Content shown here is largely replicated, not original, and is set up to drive experiences rather than foster a call to action.

Gala Events/One-Off Fundraising Events: Given that VR is still relatively new, some first movers have taken the opportunity to create special events around unique virtual experiences, allowing charities to reap rewards in real time and apply out-of-the-box thinking to highlight stories of impact and combat major-donor fatigue.

The leading example of this experiential approach is Pencils of Promise, which created a 16ft replica of a Ghanaian classroom to set the scene for its recent Wall St Gala. The star-studded event went on to raise $1.9m with just a 90-second video. We predict that a number of events this year will become even more innovative in their use of VR, with the possibility of a single event or multi-day conference raising $5m+.

Major Gifts: UN-backed conferences that used VR to show donors the devastation in Syria have reported major-gift projections being surpassed by up to 70%. It was also found that 1 in 6 people pledged donations after participating in that same experience, double the normal rate.

One donor who recently visited Charity:Water’s office, and who had already committed to giving $60,000, watched a VR film on their work in Africa and was so moved by the story that he gave $400,000 instead.

By 2025 the extended reality economy is expected to reach $333bn, and it’s safe to assume that a VR headset will take pride of place in a number of nonprofit CEOs’ offices for when that all-important ask comes. This figure alone should justify why the sector needs to embark on significant research projects to help fundraisers understand major donors’ empathy triggers, motivations, and the power of immersive experiences.

But again it comes down to cost. The dollars that accompany an early-adopter approach are hardly conducive to nonprofit budgets. The cost of creating original VR content can easily exceed six figures, so expectations will need to be refocused over the coming years from the traditional lens of return on investment to that of a return on immersion. This will require the foresight (and goodwill) of first movers in this space to continue to raise awareness of VR in marketing and fundraising, and ultimately to share their results so that more organizations (both charities and production houses) become comfortable using these lived experiences for the common good.

The Return on Immersion (a term I originally coined in 2017 at the PRSA International Conference) is a proposed performance measure used to evaluate the impact, empathy and investment of a user exposed to virtual, mixed, and augmented realities.

Much of my thinking on this in relation to fundraising has been shaped by the research of Paul J. Zak, a Professor of Economic Sciences, Psychology & Management and the Director of the Center for Neuroeconomics Studies.

While many nonprofits already understand the power of storytelling and how compelling a well-constructed narrative can be, Zak’s recent scientific work is putting a much finer point on just how stories change our attitudes, beliefs, and behaviors. As Zak writes:

A decade ago, my lab discovered that a neurochemical called oxytocin is a key “it’s safe to approach others” signal in the brain. Oxytocin is produced when we are trusted or shown a kindness, and it motivates cooperation with others. It does this by enhancing the sense of empathy, our ability to experience others’ emotions. Empathy is important for social creatures because it allows us to understand how others are likely to react to a situation, including those with whom we work.

More recently my lab wondered if we could “hack” the oxytocin system to motivate people to engage in cooperative behaviors. To do this, we tested if narratives shot on video, rather than face-to-face interactions, would cause the brain to make oxytocin. By taking blood draws before and after the narrative, we found that character-driven stories do consistently cause oxytocin synthesis. Further, the amount of oxytocin released by the brain predicted how much people were willing to help others; for example, donating money to a charity associated with the narrative.

In subsequent studies we have been able to deepen our understanding of why stories motivate voluntary cooperation. (This research was given a boost when, with funding from the U.S. Department of Defense, we developed ways to measure oxytocin release noninvasively at up to one thousand times per second.) We discovered that, in order to motivate a desire to help others, a story must first sustain attention – a scarce resource in the brain – by developing tension during the narrative. If the story is able to create that tension then it is likely that attentive viewers/listeners will come to share the emotions of the characters in it, and after it ends, likely to continue mimicking the feelings and behaviors of those characters.

Zak’s research definitely supports my view that VR can be a straight-up empathy machine: one that, if harnessed correctly, can help drive deeper connections with donors, generate more meaningful and transformative gifts to your organization, and capture the unique nature of what the nonprofit sector does and how it serves and supports society.

Zak’s lab discovered in 2004 that the brain chemical oxytocin helps us determine whom to trust, and trust is a big barrier to tech adoption. So to advance this new ROI into the mainstream we ultimately need a deeper understanding of empathy and what motivates people to give.

People trust science, right?

Well, people have been trying to capture empathy through testing for decades. The empathy quotient is one measure, intended to gauge how easily you pick up on other people’s feelings and how strongly you are affected by them. Another is situational empathy, measured either by asking subjects about their experiences immediately after they were exposed to a particular situation, or by studying the “facial, gestural, and vocal indices of empathy-related responding”.

All interesting, but hardly likely to be captured in a post-VR-session survey, regardless of whether it’s delivered on an iPad or not.

I have been excited to see the development of wearables to help build the scaffolding for a new ROI. Neurosensors will play a role too, which is why I have been following their development and roll-out in the entertainment industry.

There currently exists a ‘return on experience’ platform and wearable neurosensor that captures what audiences care about in real time. Using this tech, corporate event-planning professionals are tracking moments of peak immersion, spotting when people become frustrated, and identifying those who are extraordinarily immersed in a program. You can see where I’m going with this: the point at which a person should probably be asked to make a gift.

So if neurosensors are how we capture a return on experience, how do we steer that towards immersion? Again, Zak’s research around narrative will provide the clues, perhaps supported by data modelling that compares data sets drawn from past engagements, contact reports, and giving history.
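As a purely hypothetical sketch of what that data-modelling step might look like, imagine a neurosensor emitting one immersion score per second during a VR experience; a simple moving average could then flag the moment of peak immersion. Every name, number, and threshold below is an illustrative assumption, not the API of any real platform:

```python
# Hypothetical sketch: locate the moment of peak immersion in a
# per-second neurosensor signal using a simple moving average.
# All names and values are illustrative assumptions.

def peak_immersion(scores, window=5):
    """Return (start_index, smoothed_value) for the highest-scoring
    window of consecutive readings."""
    if len(scores) < window:
        raise ValueError("need at least one full window of readings")
    # Smooth the raw signal with a moving average to ignore blips.
    smoothed = [
        sum(scores[i:i + window]) / window
        for i in range(len(scores) - window + 1)
    ]
    # The peak of the smoothed signal marks the most immersed stretch.
    best = max(range(len(smoothed)), key=smoothed.__getitem__)
    return best, smoothed[best]

# Example: immersion rises mid-experience, then tapers off.
readings = [0.2, 0.3, 0.4, 0.7, 0.9, 0.95, 0.9, 0.6, 0.4, 0.3]
idx, value = peak_immersion(readings)
```

A real system would of course fold in the comparative data mentioned above (past engagements, contact reports, giving history) rather than rely on the raw signal alone, but the core idea of surfacing a peak moment is this simple.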

It is one thing to read and contextualize impact, but quite another when you can see the real difference it makes to a person’s life, and see it from their eyes. Feeling an emotional connection to a foundation’s work in Kenya and Tanzania, for example, and appreciating the reality of everyday problems that are so often overlooked in favor of more dramatic conflict, is a start.

And while I’m buoyed by the trajectory of fundraising AI to provide that undercurrent of predicting a quantitative outcome, I still think we are a ways away from resolving the ethical elements of calculating immersion. I for one would have issues asking someone for a major gift knowing that we had potentially elevated their emotions and taken advantage of the moment.

So to sum up: we have a ways to go, but we are not moving blindly in this space, with both academic research and innovative tech ensuring that this is going to be more a case of when, not if.

With many nonprofits desperate to break through the clutter of a sector clamoring for new sources of funding, virtual reality may be the key to increasing the amount of donations. It can provide stakeholders with a unique vantage point to understand and be emotionally moved by what you do and the communities you benefit.

I don’t think I have been more blown away by a technological advance than this one, and this new ROI, once understood and quantifiable, will help justify its adoption and investment. My gut still tells me that the future of virtual reality does not lie in the hands of Hollywood production houses or video game enthusiasts, but in those of documentarians and storytellers worldwide. We are, however, going to need a bit of sector-wide help.

To close this gap, some major national foundations could expand or create grant opportunities that allow smaller organizations to create VR content. It may also come through partnerships with major companies and platforms such as Oculus and Vive, which have already explored this approach through their VR for Good programs, to pair VR production companies with impactful nonprofits.

‘Telling your story’ has always been at the forefront of advice given to nonprofits, and now the sector has a new tool to add to its fundraising arsenal and assist in its ongoing narrative. It is one thing to read and contextualize impact, but quite another when you can see the real difference your donation makes to a person’s life, through their eyes, without the distance that makes harsh realities all too easy to ignore. The VR industry understands that adoption and conversion will be driven from the outside in, and that VR won’t flourish in this sector without its intervention and ongoing commitment.
