Last July, an idealistic young entrepreneur by the name of Zhao Shuping had an epiphany: “Everything on the street,” he proclaimed, “can now be shared.”
Capitalizing on China’s sharing-economy fetish, Zhao raised 10m yuan (~$1.6m USD) from a cadre of drooling investors, purchased 300k umbrellas, and rented them out at train stations across 11 Chinese cities for a fee of $0.80 per half-hour.
Within 2 weeks, all 300k umbrellas had been stolen.
The sharing economy is poised to grow to $335B worldwide by 2025 — and, as these platforms become more common, so too do the tales of their utter failure. Yet, our trust in collaborative consumption remains astronomically high.
Why?
In its purest form, the “sharing economy” leverages technology to facilitate transactions between people with idle goods and resources, and people willing to pay for them.
This system is highly dependent on us trusting complete strangers: we get into their cars, sleep in their beds, invite them into our homes to assemble IKEA furniture, and message them to watch our pets.
On paper, there seems to be a glaring problem with that: trust is at an all-time low. Back in the ‘70s, nearly 50% of Americans thought most people could be trusted; today, that figure sits at just 31%.
Other polls show that we have abysmally low trust not just in the pillars of our democracy — the press (12%), banks (14%), and government officials (16%) — but even in our own neighbors (42%) and co-workers (58%). These rates are even worse among millennials.
The weird thing is, despite this, our trust in the strangers of the sharing economy — like rideshare drivers — is sky-high, at 88%.
What’s going on here?
According to Rachel Botsman, one of the world’s leading experts on the sharing economy, companies use technology to establish an almost instantaneous “virtual trust” among users.
But sometimes, the sharing economy scales faster than user trust can be gained — and that’s when people start stealing umbrellas.
In any system that requires collaborative trust, there are going to be people who act according to their own self-interest, at the expense of everyone else. No kid, for instance, ever takes just one piece of candy from the unattended bowl on Halloween.
It’s a phenomenon known as the tragedy of the commons.
We’ve seen it play out many times in the sharing economy — especially in China, where a rapidly expanding market has prioritized growth and profit over building trust with users.
Last year, more than 70 dockless bike-sharing companies sprang up in China, raised $1B in capital, and dumped millions of bikes on city streets all over the country.
This oversupply came with consequences: thieves stole bikes by the tens of thousands and ripped them into parts to sell; lazy riders discarded them in alleys and rivers; vandals circumvented the security system by smashing the locks and lighting the bikes on fire.
In the eastern city of Hangzhou, authorities rounded up more than 20k bikes and dumped them in 16 enormous “graveyards,” where they currently rot in mangled piles.
“There’s no sense of decency anymore,” said one resident. “We treat each other like enemies.”
An actual image of a bike “graveyard” in China (photo via The Guardian; animation/illustration by Z. Crockett, The Hustle)
This behavior isn’t exclusive to bicycles: shared nap pods, basketballs, cell-phone chargers, clothing, luxury handbags, and — of course — umbrellas have all experienced high rates of vandalism and theft.
Yet we still seem to trust even a flawed sharing-economy system more than certain traditional forms of commerce.
Way back in pre-industrial times, when we wanted to trade a goat for 50 pounds of wheat, we based our trust on close-knit personal relationships.
After the Industrial Revolution, we got most of our goods from large corporations. Transactions became less personal, our trust eroded, and companies gained it back by creating strong brands and submitting to federal regulations.
Today, companies gain trust through technologies like digital ranking systems, which aim to recreate a model of capitalism that is highly (and, by certain measures, artificially) personal.
This “engineered trust” is baked into our daily interactions with the sharing economy.
When you call a Lyft, you can see your driver’s face, name, and sometimes even his or her music preferences. You can follow this person’s journey in real time as they zigzag toward you like a fare-munching Pac-Man. And most importantly, you can see the person’s star rating, validated by the collective trust of other users.
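For the curious, here’s a rough sense of how a rating system can “engineer” that trust. Lyft doesn’t publish its actual formula, so the method and numbers below are purely illustrative: many platforms blend a newcomer’s handful of reviews with a platform-wide average, so a stranger shows up with borrowed credibility until real ratings accumulate.

```python
# Purely illustrative -- not Lyft's (unpublished) algorithm.
# A common trick: blend a driver's own reviews with a platform-wide
# prior, so new drivers "borrow" the crowd's trust until they earn their own.

def smoothed_rating(ratings, prior_mean=4.8, prior_weight=10):
    """Bayesian-style average of 1-5 star ratings.

    ratings      -- star scores from past riders
    prior_mean   -- assumed platform-wide average (a made-up number)
    prior_weight -- how many "phantom" ratings the prior counts for
    """
    total = sum(ratings) + prior_mean * prior_weight
    count = len(ratings) + prior_weight
    return total / count

# A brand-new driver with two perfect rides shows ~4.83, not 5.0;
# the score only becomes truly "theirs" as real reviews pile up.
print(round(smoothed_rating([5, 5]), 2))   # 4.83
```

The design choice matters: the prior keeps one disgruntled rider from tanking a newcomer’s score, which is part of why the collective trust of other users feels so instant.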
These digital systems are often coupled with large teams of (human) monitors. Airbnb, for instance, hired a team of 600 people to scour the site for bad actors after a host had her home ransacked by a guest in 2011.
Arun Sundararajan, author of “The Sharing Economy,” tells us these technologies have essentially “expedited” the process of gaining trust.
“If you meet a stranger and know nothing about him or her, trust takes time to develop,” he says. “But if you have a digital system that gives you a bunch of info about the authenticity of that stranger, trust can be gained instantly.”
Despite what Sundararajan calls “a constant game between platforms and people trying to game the system,” we find ourselves readily willing to commit to a collaborative system before we know how the person on the other side of the transaction will behave. Though bolstered by data, our trust in the sharing economy essentially operates on blind faith.
Platforms are starting to figure out how far this trust can go — and for now, the limits are lofty.
Just ask Jessica*, an Uber driver in San Francisco.
“I’ve had the sketchiest rides you could imagine… I once had a dude jack off in my backseat,” she tells us. “But do I trust my rides? Do I trust the [platform]? Sure. Who ever let a few bad apples ruin the fun?”
***
* Note: Ironically, the driver did not entrust us with her real name; it has been changed, at her request.