From selfie to digital double

In the future, data-enriched selfies will be much more than an image of our external state. Algorithms will soon be able to read us better than any psychiatrist.
22 June 2018, by GDI Gottlieb Duttweiler Institute

The following text is an excerpt from the GDI study “Wellness 2030 – The new techniques of happiness”. The study is available as a free PDF download.

We already live in a selfie culture. Since Tom Wolfe characterised the ’70s as the USA’s “Me Decade”, there have been a host of studies describing the individualistic, egotistical and narcissistic cultural development of the Western world. But it was only with smartphones that the self-portrait became the dominant message format: today, 93% of Facebook users regularly post self-portraits for personal branding purposes. How we present ourselves online is who we are.

In the future, these superficial portraits will be supplemented by new data documenting aspects of our lives that rarely appear on the radar screens of today’s algorithms. For a number of years now, wearable tech has been collecting data on our heart rates, the number of kilometers we walk and the calories we burn. This data offers insights into our well-being and is of particular interest to the wellness industry. The more data is collected from various sources, the more precise and multifaceted our digital doubles become. We are becoming machine readable. Our behavior can be analyzed and predicted. For algorithms, we have become an open book. No psychiatrist could sketch as precise a picture of who we are as this technology can.

Even our emotions can be interpreted by algorithms in real time. This development is admittedly in its early stages in 2018, but the “proof of principle” has existed for some time. Technology can filter the data it has on a user to provide information about their current emotional state. The technical mechanism that underpins this is quite simple: all emotional events are perceptible on a physiological level and can therefore be measured. Finnish scientists have been researching the extent to which these emotional events register physically among people from different cultures. They surveyed 700 participants from Finland, Sweden and Taiwan, asking what kinds of reactions they had to emotions such as happiness, sadness, shame, disgust, envy, fear and love. Participants were asked to indicate on a computer-generated human silhouette where and how they experienced each emotional change.
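To make the underlying principle more tangible, here is a minimal, purely illustrative sketch of such an emotion classifier. It is not taken from the study: the physiological features (heart rate, skin temperature), the readings and the emotion labels are all invented for the example, and a real system would draw on far richer sensor data.

```python
# Illustrative sketch only: maps synthetic physiological readings to an emotion label.
# All features, values and labels are invented for this example.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=0)

# Hypothetical training data: [heart rate (bpm), skin temperature (°C)]
calm  = rng.normal(loc=[65.0, 33.0], scale=[5.0, 0.3], size=(200, 2))
angry = rng.normal(loc=[95.0, 34.5], scale=[8.0, 0.4], size=(200, 2))  # warmer, faster pulse

X = np.vstack([calm, angry])
y = np.array(["calm"] * 200 + ["angry"] * 200)

model = LogisticRegression().fit(X, y)

# A single "real-time" reading from a wearable (also invented)
reading = np.array([[92.0, 34.6]])
print(model.predict(reading))        # e.g. ['angry']
print(model.predict_proba(reading))  # probability per class
```

The point of the sketch is only that measurable bodily signals can, in principle, be turned into an estimate of emotional state; the research described here says nothing about which signals or models a production system would actually use.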

The results showed that when participants were angry, they felt warmth in their faces, heads and arms. When they were depressed, they experienced a cool numbness in their heads and arms. Love triggered a positive feeling in the entire upper body; subjects experienced a particular feeling of warmth around their hearts. The study was able to prove that emotions can cause changes in physical perception, and that this happens in the same way across all cultures. Based on their survey, the scientists produced “bodily maps” that showed where and how the individual reactions were perceived in the body.
