This is the third of four parts in a management series for nonprofits on platforms as organizational forms. The series will describe the dynamics of how platforms work and how to approach balancing, growing, and sustaining them over time.

While not all platforms are big, many are big enough that people who study them compare some to nation states.1 Whether big or small, platforms are not only labs for innovation but also spaces to practice shared decision-making, which is central to democracy. In fact, applying platform expertise to social change efforts starts to look a lot like civil society.

Platform governance has been defined as “the set of rules concerning who gets to participate in an ecosystem, how to divide the value, and how to resolve conflicts.” Because platforms create value both on and off the platform, ethical governance, or governance where the platform does not rule selfishly, is critical.

Platforms are based on meaningful interactions, and interaction failures occur when good interactions fail to take place or bad ones do. There are four main causes of these failures: information asymmetry, externalities, monopoly power, and risk. As the phrase suggests, information asymmetry occurs when one user knows facts that others don’t and uses that knowledge to gain an advantage, as in the case of counterfeit goods. Externalities arise when costs or benefits accrue to people not involved in the interaction, such as when a friend gives your information to a company in order to gain a reward. Dropbox, for example, gives extra storage space to users who invite friends who sign up. Externalities also include public goods, “whose value is not fully captured by the party that created it.”2 Monopoly power is advantage resulting from the capture of a valued good, such as access to resources. Finally, risk is the possibility of an interaction going bad, such as a user not delivering on her end of the interaction. All of these must be mitigated through governance.

There are four main sets of tools for platform governance: laws, norms, architecture, and markets. Laws are the explicit rules that “moderate behavior at both the user and the ecosystem level.”3 They include terms of service and rules of engagement and should generally be transparent.4 For example, Apple allows users to share digital content with up to six devices or family members. This balances incentivizing the purchase of additional Apple products and services with allowing reasonable levels of sharing.
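Platform “laws” like Apple’s sharing limit are ultimately enforced in code. A minimal sketch of such a rule check, with illustrative names and a hypothetical helper (the source describes the six-member limit, not this implementation):

```python
FAMILY_SHARING_LIMIT = 6  # mirrors Apple's cap on devices/family members

def may_add_member(shared_members, limit=FAMILY_SHARING_LIMIT):
    """Allow sharing only while the family group is under the cap.

    This is how an explicit platform rule becomes an enforced
    constraint: the law is transparent, and the check is automatic.
    """
    return len(shared_members) < limit
```

A platform would call a check like this at the moment a user tries to add a seventh member, rejecting the action rather than relying on after-the-fact moderation.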

Platforms are essentially dedicated communities, nurtured by norms that create the desired culture. This includes principles to guide interactions and actions. For example, the iStockPhoto community’s norms include “feedback, high-quality content, open engagement, and a natural progression to greater levels of authority.”5 Norms are created by what platform designers call behavior design, “a recurring sequence of trigger, action, reward, and investment.”6 The trigger is a signal from the platform that prompts the user to take some action; the action produces a reward, and the platform then asks the user to make an investment, usually of time, data, social capital, or money. For example, you may see a Facebook ad for an interesting vacation adventure. You click on it and receive useful information about how to bring that adventure closer to reality. In return, you provide information about yourself so that you can continue to receive more of this kind of information.
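The trigger-action-reward-investment sequence can be sketched as a simple loop. This is an illustrative model only; the function and parameter names are assumptions, not part of the source:

```python
def behavior_loop(triggers, act, reward, invest):
    """Run one trigger -> action -> reward -> investment cycle per trigger.

    Returns the accumulated investments (time, data, social capital,
    or money), which the platform can use to shape the next round
    of triggers, closing the recurring sequence.
    """
    investments = []
    for trigger in triggers:
        action = act(trigger)              # user responds to the trigger
        value = reward(action)             # platform delivers a reward
        investments.append(invest(value))  # user invests in return
    return investments


# Hypothetical walk-through of the vacation-ad example from the text:
ads = ["vacation adventure ad"]
collected = behavior_loop(
    ads,
    act=lambda ad: f"clicked: {ad}",
    reward=lambda action: "travel planning tips",
    invest=lambda value: {"interest": value, "shared_profile": True},
)
```

The point of the sketch is the cycle itself: each investment feeds the platform data that makes the next trigger more compelling.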

However, especially when dealing with public goods, “as a rule, it’s desirable to have users participate in shaping the systems that govern them.”7 This has been shown to follow a pattern:

1. Clearly defined boundaries between who is and who is not entitled to community benefits
2. People affected by decisions regarding community resources can influence decision making
3. People who monitor community behavior are accountable to the community
4. Graduated sanctions are applied to those who violate rules
5. Community members have access to low-cost dispute resolution
6. As community resources grow, nested tiers define governance, with simple issues addressed by small, local groups and complex ones by formally organized groups

In platform governance, architecture refers to well-designed systems that encourage and reward desirable behavior and correct for the aforementioned interaction failures. For example, the Bitcoin digital currency and the blockchain protocol governing it offer unforgeable currency that is decentralized, that is, not controlled by a government, bank, or individual. In this case, “the blockchain protocol makes decentralized governance possible.”
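The unforgeability mentioned above rests on hash chaining: each block’s hash depends on the previous block’s hash, so altering any record changes every hash after it. A minimal sketch of that idea (not Bitcoin’s actual protocol, which adds proof-of-work and peer consensus):

```python
import hashlib
import json

def chain_hash(prev_hash, record):
    """Hash a record together with the previous block's hash."""
    payload = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link records so tampering with any one invalidates all later hashes."""
    hashes, prev = [], "genesis"
    for record in records:
        prev = chain_hash(prev, record)
        hashes.append(prev)
    return hashes
```

Because no participant can rewrite history without producing visibly different hashes, the architecture itself enforces honesty, which is what makes governance without a central authority possible.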

The value exchanged on the platform market is usually in the form of social currency, giving something to get something. For example, when you offer fun via a photo post, you get people who like it and maybe even share it. Further, this may get you more followers, which builds your online reputation, which you can then leverage off platform. When users create and share intellectual property that may be useful as public goods, a different aspect of the market emerges. The platform must seek to balance individual ownership, which incentivizes idea sharing, with platform ownership, which enriches the platform ecosystem. Ownership questions like these are also a source of risk, whose reduction is always a platform concern. In particular, platforms must focus on minimizing risk for users, which maximizes value creation.

Platform governance must orient towards new value, not protecting the past. It must promote evolution. Therefore, the ultimate governance is “‘design for self-design’—that is, it encourages platform members to collaborate freely and experiment fearlessly in order to update the rules as necessary.” Platform managers must be on the lookout for signs of change. This includes new behavior by users, unanticipated conflicts among users, and encroachment by competitors. When change is spotted, information about it should spread quickly throughout the platform and encourage conversations about creative governance evolution. Governance should pay attention to speed and design both for issues that require a slow response and those that require a fast one.

Notes

  1. Parker, Geoffrey G., Marshall W. Van Alstyne, and Sangeet Paul Choudary. Platform Revolution. New York, London: W.W. Norton & Company, 2016. 159.
  2. Ibid., 163.
  3. Ibid., 166.
  4. Ibid., 166–167.
  5. Ibid., 168.
  6. Ibid.
  7. Ibid., 169.