Social media, public safety and virtues

Virtue ethics can help to steer the development and usage of social media apps for public safety. It is especially helpful for developers, who can ask questions like "How could this app help people to cultivate, e.g., honesty, self-control, justice and civility?". Careful reflection on such questions can help them to develop apps that support people in cultivating these very virtues.

Virtue ethics

A virtue ethics approach raises questions like:

  • How can we create a just society?—one in which people can experience, for example, freedom, equality and democracy.
  • How can people cultivate virtues that will help them to flourish?—where virtues are understood as dispositions or patterns of thinking, feeling and acting—and indeed the alignment of thinking, feeling and acting.

In her book Technology and the Virtues (2016), Shannon Vallor, professor at Santa Clara University in Silicon Valley, puts forward virtue ethics as a framework for discussing emerging technologies. She proposes to "investigate how various forms of emerging technologies, depending upon how we choose to develop and engage them, may enable or frustrate our efforts to individually and collectively become virtuous: to make ourselves into the sorts of human being able to live truly good lives." (p. 159).

Below, we will discuss Neighbourhood Watch apps and the ways in which they can help or hinder people in cultivating the virtues of self-control, justice and civility.

Neighbourhood Watch 

(see also D4.1, p. 15)

Increasing numbers of people use social media like WhatsApp as a Neighbourhood Watch: to share information and communicate, e.g. about garbage lying on the street. In The Netherlands, for example, some 640,000 people in over 7,000 neighbourhoods use WhatsApp for this purpose. Apart from the benefits of having this channel for information and communication, it also poses risks: for privacy and data protection (with personal data stored on servers of companies abroad), for bullying and personal vendettas, and for prejudices or biases propagating unchecked via this channel.

Now, let us look at the virtues of self-control, justice and civility, and discuss how WhatsApp or other Neighbourhood Watch apps can help or hinder people in cultivating these virtues.


Self-control

Many apps are designed to attract and hold people's attention. Tristan Harris, former design ethicist at Google and founder of the Center for Humane Technology, warns us of the power of apps to hijack our attention, typically to serve a business model that aims to maximize the time people spend in the app.

We cannot change the business model or algorithm behind WhatsApp, but we can introduce and use guidelines for wiser ways of using the app. For example:

  • making agreements on which topics are in scope (e.g. burglaries) and which are out of scope (e.g. pets), to combat information overload;
  • validating one's observation before posting it (e.g. verifying whether the neighbour is indeed on holiday), to prevent invalid information, gossip and noise;
  • never sharing pictures from a closed group to open social media like Twitter or Facebook (see also: Guidelines in Dutch).


Justice

Sharing information on social media is so easy and so quick that people easily and quickly share biased, incorrect or outright discriminatory messages. It is, however, possible to design apps in ways that help people to cultivate justice. Let's look at the social media app Nextdoor, which is also used for Neighbourhood Watch purposes.

Staff at Nextdoor found that people posted potentially racist messages, for example about a "suspicious person" walking around, referring to the person's skin colour. To combat such ethnic profiling, they changed the app's user interface: the user is now first posed a question: "Ask yourself: is what I saw actually suspicious after I take race or ethnicity out of the equation?" Moreover, users are required to describe hair, clothing top and bottom, shoes, age and build before lastly identifying race.
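The interface change described above amounts to a simple validation rule: a report may only mention race once every other descriptive field is filled in. As a rough sketch, under the assumption of a form with these fields (the field names and function below are illustrative, not Nextdoor's actual code):

```python
# Hypothetical sketch of a "suspicious person" report form that requires
# a full description before race or ethnicity is accepted.

REQUIRED_FIELDS = ["hair", "clothing_top", "clothing_bottom", "shoes", "age", "build"]

def validate_report(report: dict) -> list:
    """Return a list of problems; an empty list means the report may be posted."""
    problems = ["missing description: " + field
                for field in REQUIRED_FIELDS if not report.get(field)]
    # Race may only be stated once every other descriptive field is filled in.
    if report.get("race") and problems:
        problems.append("race/ethnicity cannot be the only identifying detail")
    return problems
```

A report consisting only of a race field would be rejected, while a report that first describes hair, clothing, shoes, age and build would pass: the design nudges users away from ethnic profiling without forbidding relevant descriptions.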


Civility

Finally, social media can be used to cultivate the virtue of civility. Vallor defines this virtue as "a sincere disposition to live well with one's fellow citizens … to collectively and wisely deliberate about matters … and to work cooperatively towards [common goods]" (2016: 141). Civility understood in this sense also requires virtues like empathy and perspective: it depends upon people's abilities to empathise with each other and to apply and combine different perspectives, e.g. individual and collective concerns, short-term and long-term interests.

Sherry Turkle, professor at MIT, has studied people's usage of technology for over 30 years. In her recent book, Reclaiming Conversation, she offers a diagnosis of the problems with social media: we are obsessed with being always 'on' and connected, which has made us forget the benefits of solitude; we have lost basic skills for making conversation and avoid 'awkward' conversations; and we have forgotten how to organize conversations in groups. Turkle observes that the key problem, a lack of conversation, can be remedied precisely by what is lacking: better conversations.

We can design apps in ways that help people to cultivate virtues that facilitate conversations: empathy, e.g. learning to connect to others, even when they seem at first glance to have little in common with us; and perspective, e.g. learning about 'the other side of the story'.


If we want to use social media for collaboration between citizens, and between citizens and the police, with the goal of co-creating public safety, then we need to do better in the design and usage of these social media apps. Developers can design these apps in ways that support citizens in cultivating virtues like self-control, justice and civility: virtues that are needed to build a just and safe society.

Marc Steen


