Tuesday, November 21, 2006

overcoming mistrust

I'm doing my homework on Howard so that I'll hopefully have something intelligent to ask when he visits Seattle for the OCLC Symposium at Midwinter. Following is a little tidbit I found in his rapid decision-making paper:

In cooperative strategy, creating a "shadow of the future" is a concept for thinking about how to create trust among strangers. The notion is to aggregate cues and indicators from the present and past that will reduce uncertainty about another person's future action. The auction site eBay does this by providing a rating system for sellers. Buyers rate their experience with sellers, so that prospective buyers have some indication of how a particular seller performed. If rating, ranking, and reputation systems can be created for other kinds of contexts, they can be used to help reduce the fear and mistrust among strangers in quick-response situations. For example, if organizations made peer-based ratings for key indicators of cooperation available companywide, individuals could use these indicators as a proxy for direct experience. Also, strategies that leverage the transitive nature of trust can help reduce the risk and uncertainty of interactions with strangers. Making social networks and degrees of separation visible could serve as proxies for how a person is connected to others with whom there may be a great deal of trust.


Let's read that one sentence again:

...if organizations made peer-based ratings for key indicators of cooperation available companywide, individuals could use these indicators as a proxy for direct experience.

So what he's suggesting is that if we can figure out how to indicate to one another where cooperation has successfully taken place, we can use those indicators to facilitate quick decision-making, rather than having to rely on our own personal experience. Developing systems that help us communicate those indicators is key.
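Just to make that idea concrete for myself, here's a minimal sketch (in Python, with entirely made-up names and ratings, not anything from Howard's paper or from WebJunction) of what a peer-based cooperation indicator might look like. The point is simply that the average of other people's ratings can stand in for direct experience when you have to decide quickly whether to trust someone:

    # Hypothetical peer ratings: (rater, person_rated, cooperation score 1-5).
    # These tuples are invented for illustration only.
    ratings = [
        ("alice", "cceller", 5),
        ("bob",   "cceller", 4),
        ("carol", "cceller", 5),
        ("alice", "newuser", 3),
    ]

    def cooperation_indicator(ratings, person):
        """Average the peer ratings for one person -- a stand-in for
        direct experience when a quick decision is needed."""
        scores = [s for rater, rated, s in ratings if rated == person]
        if not scores:
            return None  # no history: you're back to your own judgment
        return sum(scores) / len(scores), len(scores)

    print(cooperation_indicator(ratings, "cceller"))   # about 4.67 across 3 ratings
    print(cooperation_indicator(ratings, "stranger"))  # None

Crude, obviously, but it captures the proxy-for-experience idea in the quote above.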

WebJunction has been grappling with the concepts of trust and cooperation in our online community since we opened our message boards to public posting in May of 2003. On that very first day, cceller posted for the first time. He has since posted nearly 200 questions and answers over the last three years. Though we didn't know who "cceller" was when he started, he has come to be known to us as Chad, the Automation Services Manager at a North Carolina library. We now know his homepage URL and the things he cares about and works on in his library. Over time, the number of posts next to his name and the amount of personalized information he has shared with us have created a trusted peer-based resource for the library community.

In the absence of the time it takes to read posts and get connected to an online community, it would be nice to be able to communicate quickly whether or not newcomers can and should trust the advice of the members on our message boards. The only such indicator that we currently offer on the message boards (technically speaking) is the number of posts that shows up next to a member's handle. This indicates the amount of time any one member has spent in the community and can be used as a rapid decision-making tool. But has it been? And how do newcomers to a virtual social networking space know that the hundreds of posts that so-n-so has made aren't spam, or flames, or trolls? Are there now better systems for quickly identifying whether a community-based source is worth hanging our decisions or actions on? What are they?
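To poke at that question, here's one more rough sketch. It assumes fields that don't exist on our boards today (helpful votes and spam/flame flags on each post), so treat it as a thought experiment rather than a feature proposal: it contrasts a raw post count with a reputation score that discounts posts the community has flagged.

    # Hypothetical post records -- "helpful_votes" and "flags" are invented
    # fields, not anything WebJunction's message boards actually track.
    posts = [
        {"author": "cceller", "helpful_votes": 12, "flags": 0},
        {"author": "cceller", "helpful_votes": 3,  "flags": 0},
        {"author": "driveby", "helpful_votes": 0,  "flags": 4},
        {"author": "driveby", "helpful_votes": 1,  "flags": 2},
    ]

    def reputation(posts, author):
        """Return (post count, peer-weighted score) for one member."""
        authored = [p for p in posts if p["author"] == author]
        count = len(authored)
        score = sum(p["helpful_votes"] - 2 * p["flags"] for p in authored)
        return count, score

    for name in ("cceller", "driveby"):
        count, score = reputation(posts, name)
        print(f"{name}: {count} posts, reputation {score}")
    # cceller: 2 posts, reputation 15
    # driveby: 2 posts, reputation -11

Both members have the same post count, but only one of them looks like a source worth trusting in a hurry, which is exactly the distinction a raw count can't make.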

I still have a lot of catching up to do on Howard's work in this space, so I'm not quite ready to say this is my question for him yet. But, I thought I'd share where I'm at today...
