7. Wellness, well-being and digitalization

“We are beginning to witness the emergence of a greater sense of collective responsibility and collective action in tech workers of all kinds, driven by their growing awareness of the urgency and importance of changing social course and by a renewed sense of solidarity…” (Becker 2023)

Many people assume that an interest in sustainable digitalization mostly attracts humanists, ethicists, philosophers and others without core IT skills, people who are not engaged in the so-called practical reality. Nevertheless, there are many people with strong ethical ambitions for a more sustainable digital world and IT culture, and those ambitions resonate with their IT expertise.

An early example of addressing social issues from within the ranks of the digital profession can be found in computer scientist Ursula Franklin’s The Real World of Technology from 1998. She sets out seven principles for what we, consumers, citizens and IT professionals, should ask of technology and ourselves at every stage of development:

  1. Does it promote justice?
  2. Does it establish reciprocity?
  3. Does it favor divisible over indivisible benefits?
  4. Does it favor humans over machines?
  5. Does it favor minimizing disaster over maximizing profit?
  6. Does it favor conservation over waste?
  7. Does it favor the reversible over the irreversible?

Franklin operates with broad categories and does not focus specifically on sustainability (as we talk about it today), but we highlight her principles here to make a simple point: critical thinking about the role of digitalization in relation to issues such as justice, humanistic values and the overuse of resources is not a new concern, and it comes from within the disciplines themselves. Enhancing sustainability is not just an external, academic maneuver. Digital practitioners are often the most knowledgeable about the problematic, emerging issues that come with digital innovation.

What is the problem with digitalization?

If you have a purely instrumental view of the internet and mobile phones and operate with an idealized, resourceful user who knows what problem they have, you might think that digitalization contributes greatly to well-being. In addition to credible and free online resources about health and guidance on treatment options, there are built-in health functions in mobile phones (which can count steps or remind you to go to bed on time), and there are countless wellness and meditation apps in Google Play and Apple’s App Store.

You can buy pallets of books, podcasts and documentaries on how to live healthier, become less stressed, complete a marathon, care for your loved ones, and so on. With all these offerings, there is no excuse not to do something about your wellbeing issues!

Nevertheless, the proliferation of smartphones is linked to a number of problems, all related to poor wellbeing. The dominant theme is addiction. In 2024, the Danish news site dr.dk reported that “86,000 Danes are addicted to social media”, based on a report from the National Institute of Public Health (Santini et al. 2024). For many, this addiction leads to depression, loneliness and loss of social relationships. One of the most debated effects of our media habits and the global spread of smartphones is the phenomenon of “manipulative algorithms”. Films like The Social Dilemma (Orlowski 2020), for example, highlight how algorithms trick us into making choices that were not part of our original intention with a digital action, along with other unfortunate phenomena connected to the use of smartphones:

  • Scrolling for an inordinate amount of time on social media, even though the intention may only have been to quickly check something.
  • Buying two T-shirts instead of the one you were looking for.
  • Being tricked into subscribing to a newsletter that you later have a hard time unsubscribing from.
  • Buying insurance or renting a car, even if you just need a plane ticket.
  • Infinitely long pages (leading to “doomscrolling”).
  • The unhealthy preoccupation with how others will react to a post or comment.
  • Phenomena like “perverse effects”: algorithms are coded to give us more of what we want (even when those interests are unhealthy and counterproductive) based on previously documented behavior such as a purchase, a like or a group you have joined (see the sketch after this list).
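The mechanism behind such “perverse effects” is easy to sketch. Below is a deliberately naive illustration, not any platform’s actual algorithm: content is ranked purely by how well it matches previously documented behavior, and all names and numbers are invented for the example.

```typescript
// A deliberately naive sketch of "more of what you engaged with": content is
// ranked purely by similarity to previously documented behavior (likes,
// purchases, joined groups), regardless of whether those interests are
// healthy. All names and numbers are invented for illustration.

interface Item {
  id: string;
  topics: string[];
}

// Topics the user has engaged with before, weighted by how often.
const pastEngagement: Record<string, number> = { gambling: 5, fitness: 1 };

// Sum the user's historical engagement over the item's topics.
function engagementScore(item: Item): number {
  return item.topics.reduce((sum, topic) => sum + (pastEngagement[topic] ?? 0), 0);
}

const feed: Item[] = [
  { id: "casino-ad", topics: ["gambling"] },
  { id: "running-tips", topics: ["fitness"] },
];

// The item that best matches past behavior is shown first, which is exactly
// the "perverse effect" described above.
feed.sort((a, b) => engagementScore(b) - engagementScore(a));
console.log(feed.map((item) => item.id)); // ["casino-ad", "running-tips"]
```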

Many of these phenomena are a consequence of clever user experience tactics (UX tactics) that entice us to overuse screen time by exploiting vulnerabilities in our cognitive apparatus. Much literature has been published in this area, mostly from the neurosciences and medical domains (e.g. by Michel Desmurget, Manfred Spitzer and Imran Rashid).

An attempt to compile criticism and research documenting these problematic conditions can be found on the website “Ledger of Harms” (Center for Humane Technology). The Ledger groups the harms into eight categories, with the headings and explanations summarized below; on the website, each category is supported by quotes from research reports.

  • Future generations: Exposure to unrestricted access to technology can impact child development and create permanent changes in brain structure.
  • Disinformation: A broken information ecology is hindering our ability to tackle complex, global problems (such as COVID-19).
  • Attention: Constant distraction prevents us from thinking, solving problems and interacting in a fulfilling way.
  • Physical and mental well-being: Overuse of technology can create stress, loneliness, self-doubt and have negative consequences for our mental health.
  • Social connections: Digitalization can create distance between us and the people who are close to us.
  • Politics: Social media platforms are built to spread the content that generates the most reactions, with the consequence that public attention is often directed towards polarizing and misleading content.
  • Systemic oppression: Technology often has mechanisms embedded in it that can reinforce racism, sexism, ableism, homophobia, and create an attention economy that works against marginalized communities.
  • Treat your neighbor as she would want to be treated: This slightly different category is not an inventory of harms, but of Silicon Valley leaders and their statements about the riskiness of the products they helped bring to market.

Is the Center for Humane Technology alone in this focus? No, and most readers will have noticed the media debate in recent years. This debate is not led only by individuals; a number of institutions have joined the fight against these harms. It is not just about screen use, but about the way our social and working lives are organized and the consequences this has for how we interact with each other, for the acceleration of decision-making processes and for much else.

You might think about digital development in the same way as cars in the post-war era: they became wildly popular and widespread, but they caused many accidents, pollution and environmental disruption. This created the need for what one might call “negative innovations” (particle filters, noise reduction measures, ABS brakes, parking assistance technologies), innovations concerned with something other than getting from point A to point B quickly.

The flip side of digitalization: examples of unhappiness in a technological world

We must acknowledge the growing body of research on potential harms, increase our collective understanding of the risks associated with social media use, and take urgent steps to create safe and healthy digital environments that minimize harm and protect the mental health and well-being of children and youth during their critical developmental stages. (U.S. Surgeon General 2023)

In Denmark, the Ministry of Business and Industry has been at the forefront of a showdown with “Big Tech”, and the ministry addresses some of the same themes as the U.S. Surgeon General does in the quote above. In 2023, an expert group’s recommendations were published in the report “Democratic control of tech giants’ business models”.

From the foreword you can read: “The reason why the expert group has chosen to focus on the business models of tech giants as its first theme is because many of the problems in the digital world today can be traced back to business models. These include problems related to democratic discourse, geopolitics, competition and consumerism, and the well-being of children and young people.” (Flyverbom 2023).

The recommendations include regulating data harvesting, limiting the use of retention mechanisms and raising the age limit for access to social media. The report continues work started in a white paper published by the government in 2021, Towards a better society with tech giants. Here the focus is especially on the well-being of children: social media offers new opportunities for interaction, but it also has negative impacts on the well-being of children and young people, who are at risk of being exposed to inappropriate or offensive content, bullying, and misleading and false information. This often happens in secret, without adults knowing about it or being able to intervene (The Danish Government 2021).

A memo from the European Data Protection Board points out the following problems and thus paves the way for a broader regulation (European Data Protection Board 2023):

  • Overloading: Confronting users with an avalanche of requests, information or options in order to prompt them into sharing more data (than originally intended) or allowing their data to be used against their expectations.
  • Skipping: Designing the user interface in such a way that the user forgets to consider aspects of their privacy.
  • Stirring: Appealing to the user’s emotions or using visual cues to get them to act in a certain way.
  • Obstructing: Preventing or blocking users from making decisions about their data.
  • Fickle: Designing the user interface to be unclear and inconsistent, making it difficult for the user to access their data protection tools and/or understand what their data is used for.
  • Left in the dark: Designing the interface so that users cannot find information or privacy tools, or are unsure what control they have over them.

In the above, we have not included all the sub-variants of manipulative tactics that the EU report describes, e.g. “privacy maze”, “deceptive coziness”, “dead end” and many others. The report offers a taxonomy (or classification) of deceptive design patterns, of which the research literature contains many proposals. In Integrating Dark Pattern Taxonomies, Frank Lewis and Julita Vassileva (Lewis and Vassileva 2024) try to create an overview of the different proposals for categorizing the patterns, and they end up identifying nine overall patterns.

Other policy focal points for social sustainability

On the one hand, there is a political focus on the unscrupulous methods that some companies have used and continue to use; on the other hand, there are legislative packages that require companies to document how they relate to a number of parameters of social sustainability. In other words, factors beyond energy and environmental conditions are becoming important non-financial indicators from a management and governance perspective.

The logic behind the focus on social sustainability seems simple: if we don’t thrive in our personal lives, we won’t have the resources to address the external problems associated with the consequences of the climate crisis and the challenges of the green transition. The Karlskrona Manifesto talks about “other dimensions” of sustainability that digital technologies must take into account. The footnote to the manifesto explains what these other dimensions are: individual sustainability in terms of human capital (e.g. health, education, skills, knowledge, leadership), and social sustainability in terms of preserving communities of solidarity (Becker et al. 2015).

The UN SDGs also address sustainability as it relates to well-being and health, education, gender equality, economic equality and peace (SDGs 3, 4, 5, 10 and 16). Another way to talk about social sustainability is to use the Brundtland Report’s principle of meeting the needs of the present without compromising the ability of future generations to meet their own needs, but thought of as social and cultural capital rather than access to energy and resources (Brundtland Commission 1987). The concept of cultural capital was developed by the French sociologist Pierre Bourdieu (1930-2002), who distinguishes between economic capital and cultural capital. Cultural (and social) forms of capital are those that support the individual in getting an education, becoming cultivated, developing an understanding of social conditions, entering social networks, and gaining access to knowledge about how to act to promote personal happiness and well-being.

Cultural capital creates distinctions that signal to your peers that you belong to the same segment (and that others are excluded or must adapt). For example, if you want to gain access to the royal family, it is an advantage to have a thorough knowledge of the cultural codes, including etiquette, manners and politeness.

The cultural and social capital in the form of educational, cultural and social institutions and the resources they provide must also be available for the next generations. But what is the role of digitalization in relation to these SDGs? In what follows, we will share some of the criticisms of IT practices that are thought to directly undermine the SDGs, and we review a number of solutions that attempt to address or prevent the problems digitalization is thought to create in the areas of well-being and health, education, gender equality, economic equality and peace. Some of the solutions are software-based, some rely on individual strategies, while others work on a more systemic level through legislation, developing management models or regulating the networks themselves.

CSRD: requirements for documentation of sustainability efforts

In the section above, we highlighted the political appetite to regulate in favor of the social sustainability that social media, among other things, has eroded. In the following, we look at how this regulation translates into a very concrete legislative package being rolled out over the coming years: the CSRD, the Corporate Sustainability Reporting Directive. The CSRD prescribes that companies must continuously document their ESG efforts. In addition to the environment, ESG reporting focuses on diversity, well-being and health, and on social sustainability.

Specifically, this means that large companies must document how they fulfill their social responsibility. They must do this in a transparent and understandable way that makes it easy for investors to assess the company’s ability to work with the ESG criteria. This should be done on four levels: in the company’s own workforce, among workers in the value chain, in the communities it touches, and finally among customers and end users. In 2023, the Danish association Lederne (an interest organization for managers) conducted a survey among more than 1,000 companies about their approach to ESG (Lederne.dk 2023). In the social category, companies were asked about their documentation practices regarding a number of social aspects, including issues such as:

  • The presence of a code of ethics.
  • Measures to prevent workplace injuries.
  • Dialogue with subcontractors about sustainability parameters.
  • Training employees on the Code of Ethics.
  • Offering flexible working arrangements to employees.
  • Satisfaction surveys in the workplace.
  • Policies to prevent sexual harassment and discrimination.
  • Attitudes towards diversity and inclusion.
  • Principles for working with health and safety.
  • Accessibility in the physical workplace.

In the survey in question, the top scorer was ‘Principles for working with health and safety’ (74% of companies reported having them). What does all this have to do with digitalization? A lot, and on three levels:

  1. A new documentation reality: the ESG framework is being incorporated into a series of legislative packages coming from the EU. Companies in the digital sector will face a number of documentation requirements related to social parameters. You need to prepare for this.

  2. A potential to innovate on documentation technologies (in the form of questionnaires, various measurement tools, or sensors of different kinds) for the B2B market. ESG is likely to impose a heavy documentation burden on companies, which will create a market for technologies that can ease it: automated measurements and digital systems that make the various processes of measuring sustainability parameters easier.

  3. Greater need for data visualization: With all the new documentation, there will be a lot of new data that needs to be communicated. It must be visualized so it can be used by investors, but also in marketing (“Look at this nice figure showing how good we are at including ethnic minorities in our workforce!”). Various marketing websites talk about how this kind of value-based communication particularly resonates with millennials and Generation Z.
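To make the third point concrete, here is a minimal sketch of how ESG indicators could be structured as data and summarized year over year before being handed to a charting or reporting tool. The indicator names, years and figures are purely illustrative and are not taken from the CSRD or the survey above.

```typescript
// Purely illustrative: a data structure for ESG indicators and a helper that
// computes year-over-year change before the numbers are handed to a charting
// or reporting tool. Indicator names, years and figures are invented.

interface EsgIndicator {
  name: string;
  unit: string;
  valuesByYear: Record<number, number>;
}

const indicators: EsgIndicator[] = [
  { name: "Employees trained in the code of ethics", unit: "%", valuesByYear: { 2023: 61, 2024: 74 } },
  { name: "Reported workplace injuries", unit: "count", valuesByYear: { 2023: 12, 2024: 9 } },
];

// Positive numbers mean the indicator went up from one year to the next.
function yearOverYearChange(indicator: EsgIndicator, from: number, to: number): number {
  return indicator.valuesByYear[to] - indicator.valuesByYear[from];
}

for (const indicator of indicators) {
  const delta = yearOverYearChange(indicator, 2023, 2024);
  console.log(`${indicator.name}: ${delta > 0 ? "+" : ""}${delta} ${indicator.unit} vs. last year`);
}
```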

As a critical and curious reader, you probably have an opinion on the CSRD project. You might ask whether documenting sustainability initiatives automatically increases sustainability. At the very least, it is to be hoped that if companies revisit their sustainability goals year after year and publish on both environmental and social issues, this will lead to increased attention from the outside world. After a number of years, investors and business partners will be able to read whether a given company can show progress on the ESG front or whether the issues are stagnant.

If the company cannot demonstrate good practices on the ESG front, investors may fear that the various ESG issues will accumulate within the company, which could negatively impact the company’s overall performance. For example, if a mobile phone manufacturer fails to discover and address the unacceptable working conditions in China, where the phones are produced, it could suddenly have a negative impact on the manufacturer’s brand (Stern 2013).

There is an inherent weakness in the fact that companies self-report their ESG issues, which can lead to the temptation to greenwash their results. This can be countered to some extent by a credible and independent third party that contributes to the ESG reporting through quality assurance - just as an auditor quality assures financial statements.

Sustainability in practice

While we wait for legislation to become a reality, we need to talk about what can be done about social sustainability issues at both the individual and the organizational level. As the legislative machinery works to regulate big tech, a number of techniques are being developed from different sides that may be able to address the issues. Let’s start at the individual level:

In a study, Ulrik Lyngs and his fellow researchers found 50 browser extensions (for Chrome, Firefox and Safari) that aimed to regulate unwanted Facebook features (Lyngs et al. 2019). Most (36 out of 50) allow the user to remove or change distracting elements, and more than half (27 of 50) specifically hide the newsfeed (e.g. “Newsfeed Eradicator” removes the newsfeed completely and replaces it with a motivational quote).
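To give a sense of how simple such interventions can be, here is a minimal sketch of the content script behind a hypothetical newsfeed-hiding extension, in the spirit of Newsfeed Eradicator. The CSS selector is an assumption for illustration; real feed markup changes frequently.

```typescript
// content-script.ts: a minimal, hypothetical sketch of a "hide the newsfeed"
// extension in the spirit of Newsfeed Eradicator. The selector below is an
// assumption for illustration; real feed markup changes frequently.

const FEED_SELECTOR = '[role="feed"]'; // assumed selector for the feed container

const QUOTES = [
  "One thing at a time.",
  "Was this what you came here to do?",
];

function replaceFeedWithQuote(): void {
  const feed = document.querySelector<HTMLElement>(FEED_SELECTOR);
  if (!feed || feed.dataset.replaced === "true") {
    return; // nothing to do, or already replaced
  }
  feed.dataset.replaced = "true";
  feed.innerHTML = ""; // remove the distracting content
  const quote = document.createElement("p");
  quote.textContent = QUOTES[Math.floor(Math.random() * QUOTES.length)];
  quote.style.cssText = "font-size:1.5rem; text-align:center; padding:4rem;";
  feed.appendChild(quote);
}

// The feed is rendered dynamically, so watch for DOM changes instead of
// running only once at page load.
new MutationObserver(replaceFeedWithQuote).observe(document.body, {
  childList: true,
  subtree: true,
});
replaceFeedWithQuote();
```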

The purpose of installing these features was of course to increase the well-being of the students who were the subject of the study. Facebook use is associated with decreased wellbeing because it can interfere with a focused and concentrated work life and can create a sense of loss of control. Since then, Lyngs has published a collection of digital tools to help users take control of the digital environment and its unwanted elements. The premise of Lyngs’ research is that many people do not feel in control of their online behavior and that we do not know enough about the design mechanisms that support self-control.

His website (www.redd-project.org) refers to tools in the following categories:

  • Self-measurement - tools to raise users’ awareness of their online consumption.
  • Blocking - temporarily closing or blocking unwanted services.
  • Making goals attractive - making your own goals more attractive (and distracting features less so) in order to maintain focus on your own behavior and goals.

The tools provided operate on several different levels: in the operating system (e.g. by offering control of other software on the computer or by allowing DNS blocking), through software installation, or through the installation of browser extensions. Here is a selection of the mentioned technologies that are used to try to solve technology-induced problems:

Cold Turkey Blocker - The most advanced software in the collection is Cold Turkey Blocker, a control panel for blocking access to software or websites. The program lets you configure “blocks”, each with its own time-based rules. For example, you can turn off Outlook for a certain period of time, or you can turn off access to a distracting news source for three months.
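The core idea behind such time-controlled blocks can be expressed in a few lines. The sketch below is not Cold Turkey Blocker’s actual implementation, only an illustration of a rule that says which site is blocked and during which hours; all rule names and times are invented.

```typescript
// A sketch of the idea behind time-controlled "blocks": a rule that says which
// site (or program) is blocked and during which hours. This is not how Cold
// Turkey Blocker is actually implemented, only an illustration of the concept.

interface BlockRule {
  target: string;     // domain or program name, e.g. "outlook.com"
  startHour: number;  // block starts at this hour (0-23)
  endHour: number;    // block ends at this hour (up to 24)
}

const rules: BlockRule[] = [
  { target: "outlook.com", startHour: 9, endHour: 12 },  // no email before lunch
  { target: "news.example", startHour: 0, endHour: 24 }, // blocked around the clock
];

// A blocking extension or local proxy would consult this check
// before letting a request through.
function isBlocked(target: string, now: Date = new Date()): boolean {
  const hour = now.getHours();
  return rules.some(
    (rule) => target.endsWith(rule.target) && hour >= rule.startHour && hour < rule.endHour
  );
}

console.log(isBlocked("outlook.com", new Date("2025-01-06T10:30:00"))); // true
```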

Focus software - Another type in the collection is software that shuts everything else out. In Lyngs’ collection you can find text editors that only let you leave the program when you have reached a (self-selected) goal, for example when you have written 1,000 words or when a time goal has been reached.

Aesthetic strategies - Other tactics involve making interfaces and browser content less attractive: trying to neutralize the clickbait graphics of YouTube thumbnails, for example, or the red notification dots that appear next to app icons on smartphones. In addition to the familiar tactic of turning off visual notifications, you can choose to have your device’s screen display only grayscale colors. You can also install a browser extension that replaces YouTube’s enticing thumbnails with random frames from the video or movie in question.
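Aesthetic interventions like these can be surprisingly lightweight. The snippet below is a sketch of a content script that renders a page in grayscale and hides notification badges; the badge class name is an assumption, since every site marks up its badges differently.

```typescript
// A sketch of a lightweight "aesthetic" intervention: a content script that
// renders the page in grayscale and hides notification badges. The badge
// class name is an assumption; every site marks up its badges differently.

function applyCalmMode(): void {
  // Render everything in grayscale to dampen attention-grabbing colors.
  document.documentElement.style.filter = "grayscale(100%)";

  // Hide elements that look like red notification badges (assumed class name).
  document
    .querySelectorAll<HTMLElement>(".notification-badge")
    .forEach((badge) => {
      badge.style.display = "none";
    });
}

applyCalmMode();
```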

Digital well-being

If you want to consult further research that surveys initiatives in this area, “digital well-being” is one of the keywords to search for. What is investigated in this field is both which (digital) tools can contribute to well-being and which factors undermine well-being online. The first group includes tools that help people take breaks, remove unhealthy elements from user interfaces and support people through digital withdrawal. The second group looks at factors that create overuse in its various forms, and at elements such as cyberbullying and loneliness.

A review of the digital wellbeing literature tells us that wellbeing is particularly relevant in four different areas: health, education, governance and media use. In other words, the focus is on the connections between health and well-being, how our younger citizens (pupils and students) are doing, how to create the best framework for well-being (and thus create good conditions for productive habits), and finally the connections between screen use and well-being.

The focus is typically on how to develop digital products that improve human-machine interaction and enhance autonomy and empowerment. This focus is of course a reaction to some of the negative themes mentioned in this chapter. It points toward a professionalism that aims to improve things and work for the “promotion of human flourishing” that the Royal Society in 2017 described as a goal of digitalization (Burr et al. 2020).

But what is the impact of the new initiatives coming out of the digital well-being field? As in any other context, there is a tendency to oversell the positive effects of various digital well-being products. Roffarello and his colleagues therefore made it their project to systematically and methodically investigate 42 well-being apps in an attempt to uncover their effect on well-being. Their research question was: “How do these solutions actually work? What functionality do they have? Do they have a real effect on user behavior?” (Roffarello & De Russis 2019)

However, the researchers found evidence only that some apps can address problematic behaviors related to addiction. There is still a long way to go before apps can be considered a decisive factor in changing people’s behavior, Alberto M. Roffarello and Luigi De Russis conclude. The way forward, they argue, is through solutions that regulate behavior or create other frameworks for dealing with our habits, including our mental attitude (as opposed to “only” doing something about behavior), and through methods that make use of social support. For example, the Danish “E-kvit” app, which is used to kick nicotine addiction, is a great benefit to many. It allows people to air their concerns, discuss failed quit attempts with people in the same situation, and help others achieve results.

As we have mentioned, the amount of research focusing on the negative wellbeing effects of digitalization is vast. One of the reasons we have highlighted Roffarello and his colleagues is their interesting take on how to approach digital wellbeing issues as researchers and investigators. “Researcher” here should be understood in a broad sense: their points are of interest to almost the entire professional spectrum that this book is aimed at, since anyone who takes a practical approach to researching, designing or planning digital products aimed at improving wellbeing can use their methods. These are described in “Coping with Digital Wellbeing in a Multi-Device World” (Roffarello & De Russis 2021).

Achieving digital well-being is a complex task where it is not just the individual device that needs to be regulated, but our entire digital environment. In the article, they review the arguments for the methods used to regulate what they call multi-device ecosystems, focusing on the problems that cannot be solved by single apps. It is difficult to regulate unwanted “noise” across all devices (e.g. turning off notifications simultaneously on smartphone, laptop, game console and smartwatch) because common wellbeing apps only target a single device. To this end, they developed a co-design methodology focused on the innovation of DSCTs (digital self-control tools) targeting ecosystems of devices. Participants in the design project were asked to outline problems and to prototype tools that specifically address them. The participants’ strategies fell into two categories: one was technological, the other was more interested in social and cultural solutions, such as recommending digital literacy education from a young age. Digital wellbeing is thus a project with both a technological leg and a cultural leg.

Before we examine the ‘big policy’ framework for introducing socially sustainable regulation, we must first address the social problems that have been created by over-digitalization. There is certainly no shortage of issues to address!

The potential for innovative, socially sustainable IT

In the above, we have taken two perspectives on digitally created wellbeing issues: one political and one research-related. In the first section, we looked at political reactions and at the documentation requirements, on the way for everyone, that are intended to increase people’s knowledge of and reflection on socially sustainable conditions.

In the second part, we referenced research that explores connections between wellbeing issues, such as feelings of loss of control and distraction, and people’s (over)use of digital media and tools. What we still lack is the technical, actionable perspective: which tools and services are available to professionals dealing with wellbeing issues?

Socially sustainable UX in practice

In addition to various software suites for front-end development (e.g. HTML and CSS editors), modern developers use various frameworks in their practice. At the time of writing, these are often frameworks such as Ruby on Rails, Django or Angular. For UX designers, there are also a number of UI frameworks that contain collections of standard components (forms, calendars, radio buttons, etc.).

What would a playbook for “loyal patterns” look like? The term “loyal patterns” (as opposed to “deceptive patterns”) comes from our colleague Steen Carlsen, who uses it in a scientific article on “socially sustainable UX” (Balslev 2024). Could we imagine practical guides to coding and developing user interfaces that are both compliant with new requirements and true to the user’s intentions? The two terms may need some explanation. In this context, a playbook is a collection of UX solutions that can practically address different situations in a digital user journey. The elements of a playbook are detailed, isolated recipes for achieving different goals, often based on statistical knowledge of what users prefer. A playbook might include suggestions on how to design the process of subscribing to a newsletter, how to remove an item from a virtual shopping cart, or how to save a document as a PDF.

So what are loyal patterns? We talked earlier about the growing criticism of manipulative interfaces and how a lot of dirty tactics have crept into interfaces to trick more data out of users, keep them longer, persuade them to buy more, and so on. What would a playbook of UX elements that do the opposite look like? A playbook that was simply loyal to its users?

It is not a completely new question; in fact, there are already a number of UI solutions in use that side with the user. The prime example is the pop-up that asks whether you want to allow the use of cookies. Many may find these questions annoying, but they are designed to protect the consumer from data collection that many would like to avoid. Another good example of a loyal UI detail that opposes the dominant principle of speed is Outlook’s and Gmail’s little offer to stop sending an email message (to prevent an email from being sent to the wrong person, for example). Or, on a slightly larger scale, the option to voluntarily sign up to ROFUS (the gambling authorities’ register of voluntarily excluded players), which has the effect of blocking access to online gambling services.
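The “undo send” pattern is also easy to sketch: instead of dispatching the message immediately, the client waits a short grace period during which the user can still cancel. The sketch below is not Gmail’s or Outlook’s actual implementation; the function and its parameters are invented for illustration.

```typescript
// A sketch of the "undo send" pattern: instead of dispatching an email
// immediately, the client waits a short grace period during which the user
// can still cancel. Not Gmail's or Outlook's actual implementation; the
// function and its parameters are invented for illustration.

type CancelSend = () => void;

function sendWithUndo(
  send: () => void,         // the real send action, e.g. a call to a mail API
  graceMs: number = 10_000  // how long the user can still change their mind
): CancelSend {
  const timer = setTimeout(send, graceMs);
  // The returned function lets the UI wire an "Undo" button that simply
  // prevents the send from ever happening.
  return () => clearTimeout(timer);
}

// Usage: show a toast with an "Undo" button for ten seconds.
const undo = sendWithUndo(() => console.log("Email actually sent"), 10_000);
// If the user clicks "Undo" within the grace period: undo();
```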

Designer Will Soward has created a playbook for developing neurodiverse interfaces in digital learning systems. “NDS is a coherent set of standards and principles that combine neurodiversity and user experience design for Learning Management Systems” (Neurodiversity Design System 2024). It aims to increase the accessibility of digital learning materials for people with learning disabilities, people who may experience problems with low mood, social skills and sustained attention. He has developed a number of personas: Katie and Karen, who have autistic traits, Tama, who is dyslexic, Tanja, who has ADHD, Rachel, who suffers from coordination difficulties (dyspraxia), Paris, who is hypersensitive, and Dan, who has visual difficulties. In other words, the system caters to a diverse target group.

Another example is the “Humane by Design” website (Yablonski 2024), which is an appeal to designers to take responsibility for some of the problems created by the rapid growth in the use of mobile phones. As Yablonski writes: “As designers, we play a key role in the creation of this technology, and it’s time we take responsibility for the impact that the products and services we build have on the people they will serve.”

Below we have gathered some focus points from organizations that represent the interests of, or provide services to, vulnerable groups; they should really become principles for any form of communication.

The responsibility of the human designer:

  • Provide the option to be contacted (and respond when people use it).
  • Offer the opportunity to communicate with a human.
  • Provide authority and control.
  • Be clear.
  • Adapt your communication and services to changing needs.
  • Create safe spaces.
  • Be credible and consistent.

Finally, we can point to Google’s and Microsoft’s mental wellbeing initiatives. Microsoft provides resources to make your digital design more inclusive (Microsoft 2024). By that they mean communication that is mindful not to exclude, learns from diversity initiatives and understands that inclusion benefits everyone. Microsoft’s toolkits provide tools on topics such as learning, focusing, decision-making, remembering and communicating; you can learn how to work this way by downloading their guides at inclusive.microsoft.design.

Google’s focus on wellbeing, as stated on their site, is less about exclusion and cognitive challenges and more about what we now refer to as “healthy screen habits”. The site offers advice on how to focus, “unplug” when necessary, minimize distractions and make sure you spend time with your family.

Actions from nongovernmental organizations (NGOs)

In the NGO sector, there are also initiatives to promote social sustainability. The United Nations has a specific data focus for their work on sustainable development (United Nations 2024).

Big data can shed light on differences in society that were previously hidden, according to the UN. One example used is that women and girls, who often work in the informal sector or at home, suffer from social restrictions on mobility and can be marginalized in both private and public decision-making.

Data is described as the raw material for accountability. Big data analysis is commonplace in the private sector, where consumer profiling, personalized services and predictive analytics are used for marketing, advertising and management, but similar techniques can also be used to gain real-time insights into people’s wellbeing and to target aid efforts to vulnerable groups. New data sources can pave the way for more agile, effective and evidence-based decision-making and can measure progress towards the Sustainable Development Goals (SDGs) in a way that is both inclusive and equitable. This allows diverse populations, including women and girls, to be taken into account and ensures that no one is left behind, while data is used to create solutions adapted to local contexts and global goals.

Another example of a voluntary organization that aims to promote social sustainability is Computer Professionals for Social Responsibility (CPSR 2024). The organization has worked globally to promote the responsible use of computer technology and has initiated projects such as EPIC (the Electronic Privacy Information Center) and the Computers, Freedom & Privacy conference. Another organization is Data Science for Social Good, which focuses on applying data science for positive social impact. They train data scientists and develop tools that strive for the fair and beneficial use of artificial intelligence and data (Data Science for Social Good).

Chapter summary: Soft values and hard metrics

In this chapter, we have argued that there is now a political focus on what were previously considered ‘soft values’: wellbeing, equality, personal fulfillment and social harmony. This political awakening may have arisen in the wake of the Cambridge Analytica scandal in 2018 (where it was revealed that Facebook users’ data had been harvested and used to influence them politically), and from observations of, and popular debates about, technostress, political radicalization, bullying and anxiety. The latter, some believe, arises from the intense hypersociality of social media, where it is difficult not to constantly assess oneself against the more or less beautified images of others’ lives.

The values may still be perceived as soft, but the requirements for documenting them, i.e. social sustainability indicators, are moving into “hard” frameworks used to document financial indicators.

We have also looked at the technological options individuals have for regulating “noise”, distraction and unwanted communication in the digital environment. Hopefully this has opened your eyes to the “wellbeing market” or made you aware of the demand for what we refer to as negative computing features. Consumers do not always want more features; sometimes what is missing is digital minimalism, which is also the title of a book by Cal Newport (2019).

Finally, we have looked at the rejection of the rationalism that underpins much of computer science, which has been criticized for being antisocial and irresponsible.