As you may have noticed, we have decided to suspend our work on Credport. Below you will find the email we sent to our users. We will have a more in-depth post-mortem coming up.
I wrote a post for Ouishare on the topic - hope you enjoy it! http://ouishare.net/2013/03/why-its-so-hard-to-build-a-peer-to-peer-marketplace/
Increasingly, peer to peer services have been pivoting their model or even shutting down due to the challenges they face. While it’s true that many startups fail, there are a few unique challenges for p2p startups. By keeping them in mind, we can build towards the next phase of the Sharing Economy.
Through the process of building Credport, we've talked to a large number of people running peer to peer marketplaces (especially in the Sharing Economy) and read even more about the topic. One key factor to success is building single user utility into your platform.
The first thing you should focus on in a marketplace is how you're going to get to liquidity. As Simon Rothman says, "Liquidity is the reasonable expectation of selling something you list or finding what you’re looking for...Until you reach liquidity, you’re vulnerable. After, you have the opportunity for dominance."
Many people familiar with the movement known as the Sharing Economy frequently cite a statistic about how a drill is only used for "x minutes in its entire lifetime." This is supposed to encourage us to either borrow a drill from someone, or hire someone to drill a hole for us. However, based on my experience in talking to people both inside and outside the sharing economy, as well as observing the startups who have done really well in this sector, I believe the drill is a bad example to use.
Trust and reputation are two words that get used so often and similarly on the web that they get confused with each other. Can I trust him? What's her reputation? Discussing the meaning of trust and reputation as well as examining how they fit together will hopefully give you a better idea of how Credport works.
To start, I'd like to refer back to a previous blog post, "What is trust?", to come up with a definition of trust: "In terms of a person, this means that someone is trustworthy if his or her actions are almost always what you expect an ideal person to do. Someone who isn’t trustworthy will frequently deviate from your expectations. In short, trust is your ability to accurately predict another person’s behavior."
If I were to say that a babysitter was trustworthy, I would mean that there is a high probability she will take good care of the children she is tasked with watching. The more I trust her, the better the probability.
On the other hand, reputation isn't a prediction of the future, but knowledge of the past. As sociologist Barbara Misztal states, reputation is a memory tied to a specific identity. It's a collectively agreed upon version of how history has taken place.
Your eBay reputation is a combined history of how you've behaved in all of your previous transactions. It's eBay's memory of your actions.
So then how do trust and reputation fit together?
Simply, a strong reputation builds trust. If I can see how you've acted in the past, then I'm more likely to be able to predict how you will act in the future. For example, if I can see that you've gotten 10 positive reviews as a host on Airbnb (your reputation), then I am much more likely to believe you will be a good host if I choose to stay with you (trust).
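To make the "10 positive reviews" intuition concrete, here's a minimal sketch of how a review history could be turned into a trust estimate. The formula (Laplace smoothing, i.e. a uniform prior over good and bad outcomes) and the function name are my own illustration, not Credport's actual model:

```python
# Hypothetical sketch: turning a reputation (review history) into a
# trust estimate (predicted probability of a good experience).
# Laplace smoothing adds one imaginary positive and one imaginary
# negative review, so more history moves the estimate further.

def trust_estimate(positive: int, negative: int) -> float:
    """Expected probability of a good experience, given past reviews."""
    return (positive + 1) / (positive + negative + 2)

new_host = trust_estimate(1, 0)        # one positive review -> ~0.67
seasoned_host = trust_estimate(10, 0)  # ten positive reviews -> ~0.92
assert seasoned_host > new_host        # a longer track record earns more trust
```

The point of the smoothing is exactly the blog's argument: a host with ten positive reviews is a safer bet than one with a single positive review, even though both have a perfect record.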
Credport is a way to aggregate your reputation in one place - therefore increasing the likelihood that people trust you. Seeing that you've been a good citizen in one place on the web is a much better indication that you will be in the future as opposed to just a name and picture.
And for those people who don't necessarily have a big reputation already out there, don't worry. Since you bring your social graph to Credport as well, people can also see how they're connected to you, whether it's via a common friend or a business partner.
Hopefully that clears the meaning up - I'd love to hear what you think in the comments!
Just to give you a bit of background: I wanted to write this post after reading a book (http://trustmojo.com/) on trust written by the founders of Soundcloud, Eric Wahlforss and Alexander Ljung. Funny to find out that one of the biggest startups in Berlin was founded by a few guys who were just as into online trust as we are (they actually wrote a book about it!). This book has a few topics worth posting about, but I just wanted to start here.
(Found on page 95 of "People, Profiles, and Trust.")
I had the great opportunity to speak at TEDx Berlin about trust and privacy, and though the video is probably the best recap of my thoughts, I wanted to provide a text version with the same ideas as a quicker read. You can also check out the slides as well, though they're mostly pictures.
Though some think of privacy as a fundamental human right that's always been around, it hasn't really always been that way. The first human beings living in very small societies and villages probably had very little privacy. Living in a cave or a small circle of huts meant that everyone likely knew what everyone else was doing and who brought what home for dinner. Only through technology advancements that allowed humans to live in larger and larger groups did we gain more and more privacy through separate and remote homes, larger marketplaces to buy and sell goods, and the ability to just move to a new place and start over.
But now that same technology is taking away our privacy. Whether it's being tracked on the Internet, email tools that let us "spy" and research very easily, or just the much wider availability of information, it's hard to deny that many people are losing bits and pieces of privacy. In my experience, one can find out almost anything about anyone - it just depends how much time and money you're willing to spend trying.
Many people see this as a negative, with the EU a noted example of fighting for privacy rights. We agree - there should always be places on the Internet where anonymity rules, whether for political, social, or personal reasons. But it's highly unlikely that those places will rule the web, and even some supposedly anonymous forums like Reddit and 4chan actually aren't so anonymous.
Is Information Bad?
Instead of seeing this privacy loss as a negative, we at Credport are presenting the possibility that maybe information flow isn't so bad. Instead of thinking about information on human beings, let's switch the focus to something else.
Generally, people see transparency and information about corporations and governments as positive. The more we know about the workings and decisions of our government, the better our chances to catch corruption. In most advanced democracies, there are oversight committees upon oversight committees, public reports and records, and ways to check out just about everything.
The same logic follows for corporations. We want to know how they affect the environment we live in, whether their products are safe, and if they're charging consumers a fair price. This can only be accomplished through transparency.
So if we want to know more about governments and corporations, what about transparency for people?
The Future of Trust
The reason that transparency is so important is that it builds trust. The more you know about someone, the more you can trust them. The cool thing is that trust enables new kinds of interactions online. Sites like eBay and Airbnb simply couldn't exist without some sort of transparent rating/reputation system to build trust between people. Because both buyer and seller have a very good idea they'll be satisfied, they feel safe sending their stuff all the way across the country in return for money, or staying in an apartment with someone they've never met before.
These marketplaces are just the beginning, and that's why we're building Credport. We want to enable every kind of interaction between people through trust, to build a peer to peer world instead of a hierarchical one. This added efficiency of cutting out the middleman and these new connections to people we've never met before are extremely powerful. By combining a verified identity, a reputation trail, and social networks, Credport allows people to feel safe interacting with a stranger in the real world - and that's what the next phase of the Internet will really be about.
So maybe this new technology really will take away some of our privacy, but we don't have to view it as a bad thing. We can use information to build trust. We might even end up collaborating in ways like our ancient ancestors who lived in caves and huts did.
We'd love to hear your thoughts!
I just recently finished reading Geoffrey A. Moore's Crossing the Chasm, and found myself drawing parallels to the sharing economy and collaborative consumption throughout the entire book. Though it was originally written in 1991 (revised 1999) and focuses mainly on high technology products, it holds so many valuable lessons that I'm devoting a few blog posts to discussing it.
The book starts with the "Technology Adoption Life Cycle", which is an excellent way of framing any market. It's designed to trace how a new technological innovation enters the market and gains widespread adoption. It was originally developed at Iowa State University to track the purchase of hybrid seed corn, and in the book is used to follow new software, PDAs, and some "hard tech" products, but it holds true for sharing as well.
Even though the sharing economy isn't a strict technological innovation, the new products, services, and ideas it is responsible for follow the same curve of adoption. Though it is enabled by new technologies (Leah Busque and Lisa Gansky discuss SoLoMo here and here), the heart of the sharing economy is really about an idea: sharing stuff instead of buying it.
As you can see from the curve, there are 5 categories in the life cycle - innovators, early adopters, the early majority, the late majority, and of course laggards. The book focuses on helping companies cross the chasm between early adopters and the early majority, but I think it's first important to talk about the characteristics of each group.
The innovators (besides the creators of technology themselves) are the consumers most drawn to new products. In the sharing economy, these are people who are already carpooling, bartering, and efficiently using their resources. They actively look for new ways to share, and have jumped at the chance to Couchsurf, TaskRabbit (it's a verb now), or Getaround.
Next are the early adopters. They're open to try new technology (or in this case ideas), and easily understand the benefits - "Renting a room on Airbnb instead of a hotel saves me money. Cool!" The vast majority of existing sharing services only have members from these first two groups.
Next, across the chasm, are the early majority. They're also known as pragmatists, which starts to explain why it's difficult for companies to attract them. They are open to new technology, but wish to see a solid reference base. Especially for a large behavior change (renting out a car that normally is always in your driveway, for example), the early majority wants to know that the model works.
Once the early majority is on board, the late majority wants to see the technology (or idea) as a standard before committing. They're another very large segment, but only adopt a new technology after it's been proven consistently. Last are the laggards - they're not worth devoting much time to, simply because they very rarely actually adopt a new technology.
Phew - now that the market part is out of the way, I can spend some time in future posts talking about companies on both sides of this chasm, and what lessons there are to be learned. I'd love to hear your thoughts if anyone else has read the book, or anything you'd like to see in the future.
A Response to "Can Trust Systems Build a New Economy From Ruin?"
Shareable just had a great article about trust - this post won't make sense unless you read it here. The article discusses a handful of very difficult questions. There's an extended discussion of FICO in comparison to trust. We thought a trust score deserved a post in itself, since it seems to be a popular approach.
In order to create a global trust system, one has to define what trust is. A reasonable attempt to quantify something, like some try to do with a trust score, has to lay out exactly what is being quantified.
A quick look at Merriam-Webster brings up "assured reliance on the character, ability, strength, or truth of someone or something". One of our previous blog posts also has a lengthier discussion in attempting to define trust. In my mind, trust is (confidence in) our ability to predict future behavior. For the sharing economy, it comes down to whether or not you'll have a good experience when someone stays in your home, drives your car, or any other interaction.
When you look at trust in this context, the differences between FICO and a trust score become immediately apparent. FICO only predicts one thing: your ability to pay back debts. A credit score can be reasonably accurate based on a few numerical attributes - your past history of paying debts, your current debt load, the length of your credit history, etc. It also has the benefit of very measurable outcomes. Did you pay your debts back? How quickly?
In contrast, a trust score requires the prediction of a seemingly endless list of behaviors. Are you a good driver? Are you courteous when staying in someone's home? Do you take care of dogs well? Are you dangerous around my children? Will you do a good job putting together my IKEA furniture? It seems hard enough to gather data on one person in all of these situations, let alone predict future behavior in any of them.
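The contrast between a single global score and the many contexts trust actually lives in can be sketched in a few lines. The contexts, ratings, and averaging scheme below are illustrative assumptions, not a real scoring system:

```python
# Hypothetical sketch: why one global "trust score" loses information.
# Reputation is kept per context; collapsing it into a single average
# hides that the same person can be reliable in one role and not another.

from statistics import mean

reputation = {
    "driving":   [5, 5, 4, 5],  # ratings as a driver, on a 1-5 scale
    "hosting":   [5, 4, 5],     # ratings as a host
    "handiwork": [2, 1, 2],     # ratings assembling furniture
}

global_score = mean(r for ratings in reputation.values() for r in ratings)
per_context = {ctx: mean(ratings) for ctx, ratings in reputation.items()}

# The single number reads "pretty trustworthy" (3.8 out of 5), while
# the per-context view shows you should not hire this person to put
# together your IKEA furniture.
```

This is the crux of the argument above: averaging away the context produces a number that is misleading for every individual situation.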
Even more important than the prediction itself is the error associated with a false prediction. In credit scores, a score that's artificially high will in the worst case mean lost money for the lender. A credit score which is too low will only lead to opportunity cost.
On the other hand, a trust score which is too high won't just jeopardize money. It potentially puts human life in danger to blindly trust someone malicious. If we as a society become dependent on a score, every false prediction threatens human safety, and irreparably damages the scoring agency itself. Erring with a score that's too low might not threaten safety, but it does shut off participation in sharing marketplaces, which is the last thing they need right now.
In short, one score can't encompass the vast number of situations, contexts, and definitions which fall under the term trust. Even if some score did manage to come reasonably close to predicting human behavior, every false prediction would have such a negative effect that the score would be unsustainable.
We haven't really spent time introducing ourselves or explaining why we're building Webcred, but we figured late is better than never. (Especially since our public image seems to be described by the picture on the left.) We're also planning to include more personal posts in the future, as well as more about Webcred and trust.
So first - why Webcred?
Well, our roots go back a while. Nam started getting into the online startup world a bit and somewhat dragged me along since we happened to live together - after enough TechCrunch and Hacker News links, I think I just started reading on my own.
Airbnb was immediately a company we both admired, mainly because it combined two things we highly value: meeting new people, and efficiency. It might sound weird to say we love efficiency, but the idea of staying in space which would otherwise go unused - "efficient allocation of resources" - and people making money off of it was pure gold. We also loved that while traveling, you could connect with new people in ways that a hotel would never offer.
We started to see this sharing model being used in other businesses like ridesharing and carsharing - basically anything sharing. There's even a topic on Quora called "Airbnb for X": http://www.quora.com/Airbnb-For-X. We also discovered the work of thought leaders like Rachel Botsman and Lisa Gansky who have managed to describe the new sharing economy in terms of collaborative consumption and the mesh.
After we realized our passion for the space, we just said, "Shit, everybody should do this." My mom should be making money off of my room and car while I'm away at college.
However, we realized not everybody could do this, at least yet. Nam and his family took a trip to Rome, but he struggled mightily to convince his parents to use Airbnb instead of a hotel. They eventually relented and had an awesome trip - they stayed right next to the Vatican, and it was much cheaper than any hotel. Now they use Airbnb all the time, but unfortunately not everyone has a son as persuasive as Nam.
The lack of trust is a major barrier to sharing transactions. My mom probably wouldn't ever let some random stranger from the internet stay in my room or drive my car. However, we realized that if we could solve this trust problem, we could bring sharing to the mainstream. The key is to make my mom feel like the person in my room isn't some random stranger.
Webcred is all about helping the sharing economy to grow by reducing friction. We're passionate about a new economic model, and we'd love for it to be accessible to everyone, not just the visionaries and early adopters we see now.
And here's a little bit about ourselves.
My co-founder Nam is a native of Berlin, so he loves being back in his hometown for Startupbootcamp. He heads up the technical/product side, will probably beat you at ping pong, and also wrote Maps Offline (check out the brilliantly narrated video here.)
I'm Connor, and apparently am in charge of everything else. I grew up in Ohio which means that whenever someone mentions roller coasters, I brag about Cedar Point because it's the only thing we have to brag about. I'm also a huge Green Bay Packers fan.
We met as random freshman roommates at Boston University, and basically have been hanging out ever since, so it seemed like a good idea to start a company together.
Well first and foremost, you may have noticed that Webcred is now Credport. We had to change due to a trademark issue, but we're pretty happy with our new name and logo. We also really like the word port (for ships, passport, airport, portal, portable, etc. - all awesome words).
We also now have an alpha up! Nam and Samir have made an amazing amount of progress with fixing bugs, polishing everything up, and finally getting Credport live. We still have a ton of things in the pipeline, but it feels really good to take the first step.
Check out our GitHub repo to see the activity from the past few days!
One of the reasons we were working harder than usual was HY Berlin, a pitch competition here in Berlin last night. It was great to spend a few minutes talking about Credport, but also a good deadline for us to stick to. We'll post a video here as soon as it's available.
That's all we have for now, but let us know what you think of the initial version!
The best thing about the Internet is that it can connect you with anybody anywhere in the world. You can Skype from Boston to San Francisco, tweet from London to Hong Kong, and plan which restaurant in Helena, Montana, you’re going to visit next week before you drive 30 hours to get there.
However, meeting people on the Internet can also be more difficult and complicated than shaking hands with somebody new at the coffee shop.
The main issue with the Internet is that you don’t know if the person you’re interacting with is even a real person or is pretending to be someone else. You could be talking to someone made of code and a processor rather than skin and bones. There are thousands, even millions, of spambots like Lisa that you might encounter.
Scammers can claim to be the King of Nigeria or offer you a great deal on Viagra to get your credit card number and personal information, which they can then use to steal from you. Phishing—where scammers pretend to be a site such as eBay or your bank in order to steal information—has also been used frequently.
For the most part, spam and unwanted interactions can be screened so we can get to the people we really want. Gmail has excellent spam filters, and sites such as Twitter and Facebook allow us to flag fraudulent profiles. Though we still run into spam emails and get friend requests from fake profiles on Facebook, these distractions are relatively rare.
When someone does manage to steal personal information, the worst thing that can usually happen is financial loss. But credit card companies and banks often protect against identity theft, so losing money is unlikely to happen with proper reporting.
On the other hand, screening becomes much more difficult when money is not the only thing at risk. When interactions move from the Internet to the real world, they can become much more dangerous. Craigslist, online dating, ridesharing, and many other services can all jeopardize safety.
Screening the same way doesn’t seem to make sense. Correct spelling and a reasonable email address usually merit a response, but say nothing about the safety of meeting in person. You might notice if you’re browsing on a suspicious URL, but a trusted URL doesn’t promise that the person you are about to meet has no malicious intent.
Existing methods of filtration are well developed for interactions that take place solely on the Internet, but once these interactions transition to the real world, we lose our sense of security. We need a new way to screen people that’s more than just typing a name or email into Google.
Privacy is one of the most important parts of trust. While there might be more trust if everyone knew everything about everyone else, that would completely compromise personal privacy. We value privacy as a fundamental right and believe that no one should have to give up any personal information against his or her will.
Interacting in the online world often leaves us questioning where our privacy went. We’ve heard stories about photographs posted on Facebook becoming advertisements across the globe. And we’ve all seen how advertisements seem to target us a little too well, mentioning products specific to our home state, our music tastes, or our college major. Online advertisements seem to know exactly who we are. These somewhat creepy moments can make people more likely to guard their personal information.
But keeping all personal information private also has drawbacks. To hide all personal information, you would basically have to remove yourself completely from society - no bank account, no cable or Internet, no insurance. Without sharing your information, there is also no way to build trust. How can you trust someone you know nothing about?
An ideal system to build trust would respect privacy, but still provide necessary information. You don’t want people to have enough information to break into your house or to steal your identity. However, you do want to be able to research other people before you meet them and make sure they're safe and not a scam.
To find a good middle ground, it’s important to realize what kind of information is necessary to building trust. If you can verify some basic information, see a history of past actions, and know how you’re connected, you’re able to decide whether you feel safe interacting with this person. You also don’t compromise anyone’s privacy in the process.
An even better system would let you control which information people see. If you don’t worry about sharing your name, your Craigslist purchases, and how many Facebook friends you have, then you should be free to share. However, if you’d rather keep some of this information to yourself or wait until you learn more about the other person, you should also be able to keep some things hidden.
It’s important to remember that the point of a trust system is to build trust. If you keep all your information secret, other people can’t determine whether they can trust you.
Striking a balance between privacy and trust is crucial to the success of any system and is an attainable goal with proper planning.
The more we've thought about trust at Webcred, the more we have realized that trust is an integral part of societal development. Increases in trust have led to many human advances, while trust’s absence has resulted in many setbacks.
From the beginning, trust has led people to live efficient, productive lives. The first hunter-gatherers worked in groups to hunt large animals. They had to trust one another to complete their assigned duties and to share the rewards of their labor equally. When humans discovered that working cooperatively is more efficient than working independently, they began to live in small societies of about 150 people.
On the other hand, trust’s absence has led to warfare, a flagrant squandering of resources. People have fought over land, resources, power and money, and distrust between groups always played a role. Successful societies grew in size by building trust between people and sharing resources.
Roman soldiers agreed to fight because they trusted their leaders to provide them with food, money, and weapons. Likewise, farmers cultivated their crops because they were guaranteed protection from invaders.
One technology crucial to human progress was the development of a codified set of laws guaranteed by a government. Once citizens knew exactly how their society would respond to their actions and the actions of others, they could devote less time to worrying about threats and more time to engaging in productive activity.
Instead of building moats around their houses to keep people out, people could build plows to make farming easier or wagons to make transport faster. A more trustworthy government means better-off citizens.
There are countless other examples of the importance of trust in human history. Building more trust in our rapidly changing society could only allow us to continue to progress.
As we were traversing the internet recently, we stumbled upon an interesting definition of trust in this paper. The paper focuses on security and cryptography, but its definition applies to a wide variety of activities. The authors reason that “the local trust depends on the gap between behavior and expected behavior of an ideal agent in that role.”
In terms of a person, this means that someone is trustworthy if his or her actions are almost always what you expect an ideal person to do. Someone who isn’t trustworthy will frequently deviate from your expectations. In short, trust is your ability to accurately predict another person’s behavior.
Let’s say you have a cousin who you have seen steal cookies from the cookie jar many times, breaking a promise to his mother. If today your cousin says he won’t steal any cookies, you probably will expect him to steal them anyway. His actual behavior—stealing cookies—is not the same as the ideal, promised behavior—not stealing cookies—so he is less likely to be considered trustworthy.
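The paper's idea, that trust shrinks with the gap between actual and expected behavior, can be sketched as a simple agreement rate. The data format and the cookie-jar numbers below are my own illustration of the cousin example, not anything from the paper:

```python
# Hypothetical sketch: trust as the fraction of observations where
# actual behavior matched the promised (ideal) behavior.

def trust(observations: list[tuple[str, str]]) -> float:
    """Fraction of observations where behavior matched the promise."""
    matches = sum(1 for promised, actual in observations if promised == actual)
    return matches / len(observations)

# The cookie-stealing cousin: three promises, only one kept.
cousin = [
    ("no cookies", "stole cookies"),
    ("no cookies", "stole cookies"),
    ("no cookies", "no cookies"),
]
print(trust(cousin))  # 1 of 3 promises kept, so roughly 0.33
```

A person whose behavior always matches expectations would score 1.0; the cousin's frequent deviations drag his score down, which is exactly why you expect him to steal the cookies again.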
In simple situations, evaluating trustworthiness is easy. However, trust becomes much more difficult in new situations when you’re interacting with new people because it’s hard to predict the behavior of a stranger.
People don’t always think rationally when evaluating trust. Since trust is your judgment of how well you can predict someone else’s behavior, evaluating trustworthiness depends on your own thoughts and feelings. For example, if you repeatedly witness your cousin stealing cookies, you may be predisposed not to trust another person with cookies, even though your cousin’s stealing history has nothing to do with the other person.
At the end of the day, trust is subjective. It’s nearly impossible to predict human behavior, so people have developed their own methods to determine whether a person can be trusted.
We plan to give people the power to make their own decisions about trust online by providing them with the right information.
Webcred is about building trust, so we've spent a great deal of time thinking about exactly how and why people trust one another in the real world. Our goal is to bring that same level of trust to online interactions.
The easiest way to build trust is to get to know someone. Some of the people you trust the most are probably the ones you’ve spent the most time with. From family to friends and co-workers to classmates, the people who spend the most time with you are often the ones you trust most. The more familiar you are with a person, the better equipped you are to decide if he or she is trustworthy.
Another way we understand trust is through the transitive property--yes, all the way back to algebra--which says that if A=B and B=C, then A=C. Now let’s apply that to trust: If person A trusts person B, and person B trusts person C, then person A trusts person C. In simpler terms, a friend of a friend is usually someone you can trust.
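The friend-of-a-friend idea maps naturally onto a social graph: anyone within two friendship hops is at least weakly connected to you. Here's a minimal breadth-first-search sketch; the names, the graph, and the two-hop reading are illustrative assumptions:

```python
# Hypothetical sketch: transitive trust as distance in a social graph.
# A distance of 1 is a friend; 2 is a friend of a friend.

from collections import deque

friends = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"bob"},
    "dave": set(),  # no connections
}

def degrees_apart(graph, start, target):
    """Shortest number of friendship hops from start to target (None if unreachable)."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        person, dist = queue.popleft()
        if person == target:
            return dist
        for friend in graph.get(person, ()):
            if friend not in seen:
                seen.add(friend)
                queue.append((friend, dist + 1))
    return None

print(degrees_apart(friends, "alice", "carol"))  # 2: a friend of a friend
```

Of course, real trust isn't perfectly transitive; in practice it weakens with every hop, which is why "friend of a friend" carries weight but "friend of a friend of a friend" carries much less.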
A third way to measure trust is through a person’s relations to institutions. For example, you may trust a graduate of Harvard Medical School more than a doctor from your local community college, especially if you went to Harvard. You also might trust a lawyer who passed the bar exam in your state more than a neighbor who has read a few law books when you need legal advice. While institutional relations aren’t a perfect measure (you can’t trust every Harvard graduate), in many cases institutions can signal trustworthiness.
These three ways of measuring trustworthiness help govern our social and business interactions. By adapting these trust-measuring tools to the online world, we believe we can make online interactions just as safe as real-world interactions.