Researchers' Zone:

A new strategic technology trend combines and analyses data from various digital and physical ecosystems into a type of model of your behaviour – an ‘imperfect’ digital twin, the researchers write.

Urgent – your digital twin needs your help

Organisations are becoming increasingly skilled at predicting and ultimately influencing our thoughts and actions. Researchers here offer an idea about how to take control of your own data.

If your car suddenly broke down in a bout of cold weather, what would be the best way to finish the journey to work – public transport, walking, or returning home for your bicycle?

In such a situation, many of us would barter reduced privacy for access to a free travel-planning app, such as Google Maps, to find an answer.

However, many of us have begun to understand that the majority of apparently free digital products and services not only use data that we consciously submit, but also collect and profit from other data that we leave behind.

Do we have a chance to reclaim these data and insights that are valued as an asset by others but not by ourselves? We believe we do – if we open our eyes to this problem in new and engaging ways.

Technologies such as augmented reality could be one answer, enabling us to see how our choices and actions create patterns that allow others to predict and manipulate our behaviour.

Dead skin cells, dust and data from nightmares

First, we will present a relevant metaphor. The digital data we leave behind when we use digital products can be described as digital dust.

An interesting fact about dust in our homes is that it consists of approximately 50 percent dead skin cells, and dead skin contains our DNA – the fundamental building blocks of our genetic selves.

So, if we transfer that metaphor to the digital world, analysis of digital dust can reveal our individual behavioural characteristics – which are potentially the stuff of nightmares.

Physical and digital ecosystems impact our daily lives

Over the past 20 years, technology companies including Facebook, Google, and Apple have built digital and physical ecosystems that impact our daily lives.

An illustrative example of the potential effect of digital ecosystems is Facebook’s 2010 political mobilisation experiment, which resulted in an estimated 340,000 extra votes in the US congressional elections that year.

Google and Apple’s smartphones, together with other digital devices, exemplify the physical ecosystems that can use their integrated sensors to collect and exchange data via the internet – a network often known as the Internet of Things (IoT).

Technology trend creates your digital twin

In late 2020, the leading global research and advisory firm Gartner announced the emergence of a new strategic technology trend – the Internet of Behaviours (IoB) – which combines and analyses data from various digital and physical ecosystems into a type of model of your behaviour: an ‘imperfect’ digital twin.

With this, employers, businesses, and organisations are becoming increasingly skilled at predicting and ultimately influencing your thoughts and actions.

Interestingly, an opt-out option did not seem to be the primary focus of Gartner’s announcement. So, who would decide when the data-collection, processing, and manipulation should start and stop?

Fear-of-missing-out leaves privacy as an afterthought

The Danish Agency for Digitisation underlined the importance of addressing this situation through the education of young people.

The TechDK-commission reiterated similar goals that focused on technology and culture and the importance of raising awareness about the consequences of our digital lives for all Danish citizens.

But there is a limit to how much national governments can use regulations, such as the General Data Protection Regulation (GDPR), to protect us in the digital world.

When it comes to those decisive moments, our fear-of-missing-out (FOMO) and craving for instant gratification leave privacy as an afterthought.

Unable to see the big picture

Back to the dust metaphor. The virus that causes COVID-19 can survive in household dust for up to a month, which presents an infection risk to anyone who shares those spaces.

But, as a society, our experience of tangible consequences has taught us to take adequate precautions – even though the virus is invisible. So why don’t the risks of manipulated information command a comparable level of respect in the digital world?

Part of the answer seems to be that we perceive each digital privacy choice as a disconnected momentary event without any tangible consequences.

As a result, the accumulation of digital dust opens a window of opportunity for the industrialised collection of data and the construction of behavioural models – whose subtle collective consequences often only become apparent at a much larger order of magnitude.

It is necessary to consider the problem from several perspectives to appreciate how this mountain of digital dust can feed the Internet of Behaviours and ultimately affect real-world events.

An example is the Facebook/Cambridge Analytica scandal, in which the sensitive Facebook data of 87 million people was harvested and used for political advertising during the 2016 US presidential election.

Taking control - Small choices can lead to large effects

Firstly, as previously mentioned, governmental regulations give us a valuable supportive framework to help us manage how we exercise our privacy rights – but if we fail to actively engage, their effect is limited.

Secondly, we must individually re-evaluate the price we pay for online services such as Facebook. Mastodon is one social media option that is both free and open source – leading to more transparent business practices.

Alternatively, if you are willing to pay for your privacy, VERO offers another way forward because the business model is easier to understand, and you can feel more justified in your expectations of privacy.

Thirdly, we need to explore how we can visualise the link between individual choice and socio-political consequences. If we succeed, people could see why it matters to switch off their autopilot and engage actively in their choices – and how many small choices can add up to a large effect.

Envisics Holographic AR HUD Demonstrator Video CES 2019. (Video: YouTube/Sam Abuelsamid)

Maps as a metaphor – scale, context, and just enough detail

The 1,200 pages of collected data from Max Schrems’ 2013 case against Facebook – which focused on data transfers to the US and violations of EU citizens’ rights – illustrate the need for non-expert interfaces to help navigate and comprehend huge quantities of data.

To address this challenge, maps might offer some valuable inspiration with their optimal use of available space and avoidance of unnecessary detail.

Before the internet, printed maps were produced at a range of scales (zoom levels), from the global level down to highly detailed local maps.

The arrival of digital solutions, such as Google Maps, introduced radical improvements to the user’s experience such as:

  • smoothly zooming between any level,
  • automatically adding or removing text and graphic details to maintain readability,
  • switching between view types such as map-view (2D) and street-view (3D),
  • and the global positioning system (GPS) to dynamically place you within the context of the map.

Could something similar be done with 1200 pages of data?
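The mechanism behind the second bullet point above can be sketched in code. Here is a minimal, hypothetical model (the place names and zoom thresholds are illustrative, not Google Maps’ actual data) of zoom-based level of detail: each map feature carries a minimum zoom level, and rendering keeps only the features that stay readable at the current zoom.

```python
# A toy sketch of zoom-based level of detail, as used by digital maps.
# All features and thresholds below are hypothetical examples.
from dataclasses import dataclass


@dataclass
class Feature:
    label: str
    min_zoom: int  # smallest zoom level at which the label is shown


FEATURES = [
    Feature("Denmark", 1),           # visible from a global view
    Feature("Copenhagen", 5),        # appears at regional zoom
    Feature("Roskilde", 7),          # appears at city-level zoom
    Feature("Universitetsvej 1", 12) # appears only at street level
]


def visible(features, zoom):
    """Return the labels readable at this zoom level, coarsest first."""
    return [f.label for f in sorted(features, key=lambda f: f.min_zoom)
            if f.min_zoom <= zoom]


print(visible(FEATURES, 1))   # country level: only the coarsest label
print(visible(FEATURES, 7))   # city level: more detail becomes visible
```

A data interface could apply the same principle: at a coarse zoom you would see only broad categories of your digital dust, and finer details would appear as you zoom in.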

Mapping out your digital twin

For a moment, let us explore a thought experiment that takes inspiration from Google Maps to visually represent digital dust as spatial locations – a type of 3D map of your digital twin.

Instead of a limited view through a 2D screen, such as a smartphone’s, we could use a special user interface to enter a 3D spatial environment that benefits from the extra third dimension.

We could navigate and rescale areas of interest to reveal large-scale patterns or minute details. A bird’s-eye view of your digital twin’s 3D environment could reveal the network of interconnections to your social media friends – and possibly their collective effect on socio-political events.

At human-scale, you might find that your twin resembles you due to its access to your online photos.

Pop-up information windows could appear when you click on specific features, displaying biometric data and other items of digital dust – such as fingerprints that have been extracted from those same high-resolution photos.

Identifying and earmarking problematic features of your twin, such as your fingerprints, could help you to understand its effect on you in the real world. To protect both of you, you could begin to exert some influence over your digital twin by requesting the removal of incorrect or invasive data.
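The earmark-and-remove idea above can be illustrated with a toy sketch. Everything here is hypothetical – the class names, services, and request format are invented for illustration: a digital twin is modelled as a collection of dust items, and a user can flag a category as invasive and generate a GDPR-style erasure request for each service that holds flagged data.

```python
# A toy sketch of the thought experiment: earmarking invasive items
# of digital dust and generating erasure requests. All names and
# services below are hypothetical.
from dataclasses import dataclass, field


@dataclass
class DustItem:
    kind: str            # e.g. "photo", "fingerprint", "location"
    source: str          # the service that holds the data
    invasive: bool = False


@dataclass
class DigitalTwin:
    owner: str
    items: list = field(default_factory=list)

    def earmark(self, kind):
        """Flag every dust item of the given kind as invasive."""
        for item in self.items:
            if item.kind == kind:
                item.invasive = True

    def erasure_requests(self):
        """One removal request per service holding earmarked data."""
        services = {i.source for i in self.items if i.invasive}
        return [f"Request erasure of {self.owner}'s data from {s}"
                for s in sorted(services)]


twin = DigitalTwin("Alice", [
    DustItem("photo", "SocialApp"),
    DustItem("fingerprint", "SocialApp"),
    DustItem("fingerprint", "PhotoCloud"),
])
twin.earmark("fingerprint")   # flag fingerprints wherever they are held
print(twin.erasure_requests())
```

In a real system, each request would correspond to exercising the GDPR right to erasure against the service in question; the sketch only shows how earmarking in the visual interface could translate into concrete actions.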

But how could we begin to transform such a thought experiment into reality?

Let’s use technology against technology

Extended reality (XR), which includes virtual and augmented reality, might be a good starting point because it already plays an increasingly important role in our daily lives.

Its communicative advantage is rooted in the fact that you experience the content via multiple senses – sight, sound, and movement – which together increase your attention and engagement. A noteworthy example is the automotive head-up display (auto-HUD), which can show an intended lane change as a visual overlay on the actual road (augmented reality).

There is a genuine need for research and development of extended reality environments that will enable users to have an insight into the digital data that is harvested in the background of their digital lives.

We should use technology against technology – not only to ‘look at’ but also to ‘understand’ the swirling patterns and movements of the invisible specks of digital dust that we barter in exchange for our apparently free services.

Read the Danish version on our sister site Forskerzonen.

References

Søren Riis' profile (Roskilde Universitet)

Herman Bailey's profile (LinkedIn)

'A 61-million-person experiment in social influence and political mobilization', Nature (2012), DOI: 10.1038/nature11421

'Indoor Dust as a Matrix for Surveillance of COVID-19', Environmental Microbiology (2021), DOI: 10.1128/mSystems.01350-20

TechDK Kommissionens 3. rapport: Kultur
