Transparency

Summary

The general level of trust in digital platforms, both in the accuracy of their content and in their operating practices, has hit a new low.[1] Most content sites have optimized for the popularity of their content and the speed at which they pump it out, often to the detriment of quality and trustworthiness. As a culture, we can now see the social costs of moving fast and breaking things.[2] And while it’s easy to bemoan the experience gap between wiki projects (slow and old) and other people-powered platforms, there’s a hidden upside. Our projects adapt at a plodding, human pace; change is slow and painfully incremental. But slowness is an advantage when it comes to trust, because trust is built on consistency and predictability. The open knowledge model has an innate stability and is inherently more reliable, since very little of its output changes at the pace of the world around us.

In the current climate of distrust, being perceived as trustworthy makes this an opportune moment for Wikipedia. With so much positive social capital built up over so many years, it’s now time to take a risk: we must openly critique the flaws in this edifice in order to retain trust in the long run. Being more intentionally transparent about the messy process by which knowledge is created would almost certainly invite criticism, but it’s also the only way to begin addressing gaps and bias at a system level.


References
