Archived website

This online community was active in conjunction with the Digital Agenda Assembly 2012 and is now archived and available for institutional memory. You can now join the discussion at https://ec.europa.eu/digital-agenda/en/community

Sensor data will explode: trust needed

Submitted by freekbomhof on Thu, 2012-06-07 10:15

There are good reasons to believe that the amount of sensor data around us will explode. This raises many questions, especially questions of trust. Do we trust the data on which we base our decisions? If so, on what grounds? Google has more or less introduced a mechanism for trust in websites: its PageRank algorithm serves as a primary trust indicator for many people. We will need such an indicator for sensor data as well.
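
To make the idea concrete, here is a minimal sketch of what a PageRank-style trust indicator for sensor sources could look like: sources that cross-validate each other's readings pass trust along, much as links pass rank between web pages. This is purely illustrative; the endorsement graph, the source names and the damping factor are assumptions for the example, not a worked-out proposal.

    # Illustrative sketch only: a PageRank-like trust score over sensor sources.
    # The endorsement graph (who cross-validates whose readings) is made up.

    endorsements = {                      # source -> sources whose readings it confirms
        "traffic_cam_A": ["air_quality_B", "loop_counter_C"],
        "air_quality_B": ["loop_counter_C"],
        "loop_counter_C": ["traffic_cam_A"],
    }

    def trust_rank(graph, damping=0.85, iterations=50):
        nodes = list(graph)
        score = {n: 1.0 / len(nodes) for n in nodes}
        for _ in range(iterations):
            new_score = {}
            for n in nodes:
                inbound = sum(score[m] / len(graph[m]) for m in nodes if n in graph[m])
                new_score[n] = (1 - damping) / len(nodes) + damping * inbound
            score = new_score
        return score

    print(trust_rank(endorsements))       # higher score = more independent confirmation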

Another trust question works the other way round: if you provide your sensor data to others, you may well be willing to give it away when the data is used "for the common good". But if someone wants to make commercial use of your (sensor) data, you will want to charge for it. So the trust that YOU have in the USER of your data also plays a role. This might even extend to privacy-related data: for non-commercial use you are willing to share your data, but for commercial use you are only willing to share it if you get fair compensation for it ("sell your privacy"). Do we need mechanisms for this?
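
As a thought experiment only (not a proposal for any specific standard), such a mechanism could be a machine-readable consent record that travels with the data; all field names and prices below are invented for illustration.

    # Toy consent record: share freely for non-commercial use, charge for
    # commercial use. Names, fields and prices are invented for illustration.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ConsentPolicy:
        allow_non_commercial: bool = True
        commercial_price_eur: Optional[float] = None   # None = commercial reuse refused

    def may_use(policy, purpose, offered_eur=0.0):
        if purpose == "non_commercial":
            return policy.allow_non_commercial
        if purpose == "commercial":
            return (policy.commercial_price_eur is not None
                    and offered_eur >= policy.commercial_price_eur)
        return False                                   # unknown purposes are denied

    policy = ConsentPolicy(allow_non_commercial=True, commercial_price_eur=0.05)
    print(may_use(policy, "non_commercial"))           # True: for the common good
    print(may_use(policy, "commercial", 0.01))         # False: below the asking price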

Comments

Submitted by Engberg on Thu, 2012-06-07 10:48

The problem is real, but the solution thinking is not constructive; it is dis-empowering, dangerous, creates bureaucratic rules and is not even legal.

To get into a constructive mode of thinking:

We need to separate sharply between biometrics (sensor data creating identifiability, such as camera, sound, DNA) and other sensor data (pressure, temperature etc.) - biometrics is invasive in itself.

A way to deal with biometrics is client-side virtualisation - i.e. if I issue a voice command, ONLY my device should be able to recognise my voice and translate it into a semantic command.
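
A minimal sketch of that client-side principle, with the on-device recogniser reduced to a stub: the raw voice recording (the biometric) never leaves the device, only the derived semantic command does.

    # Sketch only: raw audio stays on the device; only the semantic command
    # is allowed out. The recogniser is a stub standing in for a local model.

    def on_device_recognise(raw_audio):
        # placeholder for an on-device speech-to-intent model
        return "turn_on_living_room_light"

    def handle_voice_command(raw_audio):
        command = on_device_recognise(raw_audio)
        del raw_audio                       # the biometric signal is never transmitted
        return {"command": command}         # the only thing that crosses the network

    print(handle_voice_command(b"\x00\x01\x02"))   # {'command': 'turn_on_living_room_light'}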

Biometrics for security is the same - chip-on-card biometrics is pro-security, while biometric passports enable biometrics-based identity theft.

A way to deal with other sensor data is identity virtualisation - i.e. to ensure the inability to correlate sensor data to whom/where, providing services to a virtualised context instead of a physical context.
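
One way to read "identity virtualisation", sketched with a keyed hash: the same physical sensor appears under a different, unlinkable identifier in each service context, and only the owner holds the key that ties them together. Key management is of course heavily simplified here.

    # Sketch: per-context pseudonyms so that services cannot correlate the
    # same sensor (or person/place) across contexts without the owner's key.

    import hashlib, hmac

    def context_identity(owner_key, real_sensor_id, context):
        msg = (real_sensor_id + "|" + context).encode()
        return hmac.new(owner_key, msg, hashlib.sha256).hexdigest()[:16]

    key = b"secret held by the data subject"
    print(context_identity(key, "thermostat-42", "energy-service"))
    print(context_identity(key, "thermostat-42", "insurance-service"))  # unlinkable to the first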

Trust should then be the residual risk consideration - service quality validation & expectation on one side, and contextual risk acceptance on the other, as in comparing residual risk to expected value.
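
Read literally, that contextual acceptance test is just a comparison; a toy version with invented numbers:

    # Toy version of the residual-risk test: accept a service when its expected
    # value outweighs the risk that design measures could not eliminate.

    def residual_risk(probability, impact):
        return probability * impact          # likelihood x consequence

    def accept_service(expected_value, p_incident, impact):
        return expected_value > residual_risk(p_incident, impact)

    print(accept_service(expected_value=10.0, p_incident=0.02, impact=100.0))  # True (10 > 2)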

On these topics, the Google way is no way.
In Europe you cannot sell your "privacy", as informed consent has to be purpose-specific - and in free markets demand-side empowerment is a pre-condition for markets to work, as control of history will otherwise control the future.

So we need empowerment and security thinking instead of ways to sell your soul to the devil.

The hardcore issue is that, in order to avoid sensor data becoming control data beyond regulatory control and enforcement, we need to retain the ability to combine unrelated sensor data client-side, in the hands of those at risk.
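
A sketch of what combining unrelated sensor data client-side could mean in practice: the raw streams are fused on the device of the person at risk, and only a coarse, derived answer is released. The streams and the rule are invented for the example.

    # Sketch: fusion happens on the device of the person at risk; a remote
    # service only ever sees the derived, coarse outcome.

    def fuse_locally(heart_rate_bpm, room_temp_c, motion_detected):
        if heart_rate_bpm > 120 and room_temp_c > 30.0 and not motion_detected:
            return "heat_stress_alert"       # the only value shared off-device
        return "ok"

    print(fuse_locally(heart_rate_bpm=130, room_temp_c=32.5, motion_detected=False))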

E.g. RFID - Try comparing this paper with EPC/NFC
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.68.4137&rep=rep...

Submitted by freekbomhof on Thu, 2012-06-07 10:58

Ah, the trigger was "sell your privacy" - which was meant as an example but not proposed as a solution. The key message I tried to convey was that trust works both ways, but that we do not have *any* way today to implement this trust aspect. (As far as I see.) The Google example (for trust in information) is not meant as an example that we should follow, although sadly it has become a more or less de facto standard. We really need to address these issues explicitly, because (as the Google example illustrates) de facto standards will not meet our needs.

Submitted by Engberg on Thu, 2012-06-07 11:18

The trigger was "monetizing risk" instead of eliminating it through design and empowering the demand-side to be in control, to govern reuse out of context.

Sensors, as in ambient intelligence, are a huge and very complex field which we should focus on carefully so as not to get it wrong as with GSM/SIM, EMV, SAML etc., where the gatekeepers' focus on business models over security was allowed to distort standards.

When people use the term "privacy", they rarely know what they mean, nor the consequences of their lack of risk analysis - it is just a big box of undefined risks, left to lawyers and PR communications instead of being dealt with.

The essence is that we should NOT "tag" physical or human objects for server-side control and tracking, but get the balances right from the start.

Sure, there are lots of interests in taking ownership of others, but that is not Single Market thinking.

Submitted by Engberg on Thu, 2012-06-07 11:21

Put another way - don't pollute the digital space with identification - think sustainably from the start.

It is one thing that I can know where my things are - but this should be enabled without a gatekeeper inherently knowing too. Otherwise you repeat the mobile phone mistake.

Submitted by Engberg on Thu, 2012-06-07 11:31

To be even more constructive:

If ONLY the owner (citizen user) of a mobile or home-located ambient device can recognise the device, then the data can often be "Open data".

If not, then we are in a bureaucratic nightmare that still only provides semi-solutions.

Consider RFID and then read e.g. this paper:
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.68.4137&rep=rep...
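
One concrete shape the "only the owner can recognise the device" idea can take (a toy sketch under simplified assumptions, not the protocol of the cited paper): the device broadcasts a fresh keyed identifier each time, so only the owner, who holds the key, can tell which device is talking, while everyone else sees unlinkable, random-looking values.

    # Toy sketch (not the cited paper's protocol): the device beacons a fresh
    # keyed identifier; only the owner, holding the key, can recognise it.

    import hashlib, hmac, os

    OWNER_KEY = b"shared only between owner and device"

    def device_beacon(device_id):
        nonce = os.urandom(8)
        tag = hmac.new(OWNER_KEY, nonce + device_id.encode(), hashlib.sha256).hexdigest()
        return nonce, tag                    # looks random to anyone without the key

    def owner_recognise(nonce, tag, known_devices):
        for device_id in known_devices:
            expected = hmac.new(OWNER_KEY, nonce + device_id.encode(),
                                hashlib.sha256).hexdigest()
            if hmac.compare_digest(expected, tag):
                return device_id
        return None

    nonce, tag = device_beacon("front-door-sensor")
    print(owner_recognise(nonce, tag, ["front-door-sensor", "garden-camera"]))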

Submitted by freekbomhof on Thu, 2012-06-07 11:37

A good point - don't pollute the digital space with identification. This is becoming very interesting.

With sensors spreading all over the physical space (some of them owned by or connected to people, some of them more public), we will need ways to identify them (yes), but, as you put it very eloquently, without tying them to some server-controlled central identification mechanism.

So, the question seems to be: how do we separate the Internet of Things from the Internet of People (web 2.0, roughly speaking) in such a way that the world can maximise the positive effects and avoid the negative ones?

Submitted by Engberg on Fri, 2012-06-08 08:24

Sure this is interesting - it is about designing the future, nothing less. And we are letting short-term interests do it, even though it so openly has anti-market, insecurity and seriously negative rights implications and effects.

You ask:
How to separate the Internet of Things from the Internet of People (web 2.0)?

Web 2.0 has - in its present dis-empowering versions - turned into a nightmare for society. It is abused to take commercial control over societal processes (both social and increasingly also commercial), even paid for through abuse of personal data for cross-purpose profiling (not legal), targeting and other kinds of behaviour influencing (e.g. political).

So how do we restore rights, markets and social processes?
A big issue, which I tried to address here:
http://daa.ec.europa.eu/content/who-say-social-media-has-be-dis-empowering

Submitted by nabeth.thierry on Thu, 2012-06-07 14:55

Attention management research.

I just want to mention here some work that was conducted some years ago (including by myself with the FP6 project AtGentive ;-) ) in the domain of attention management, and notably in relation to the management of "attention metadata".
It can provide some ideas about how to manage some of the sensor data.

Work in this area looked, for instance, at the ideas of (1) creating standards (Attention.XML, APML) for encoding and exchanging this information, and (2) having users better control their attention data.
Interest in this kind of research appears, however, to have faded.
Yet the work is still continuing to some extent, for instance via the Quantified Self work of Erik Duval, who used to do research on attention some years ago.
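
As a rough illustration only (this is not the actual Attention.XML or APML schema), the kind of record such standards encode, together with the second idea of keeping the user in control of what is shared:

    # Rough illustration, not the real Attention.XML/APML formats: attention
    # records held by the user, with a user-controlled filter on what a
    # trusted service may see.

    attention_log = [
        {"url": "http://example.org/article-1", "seconds": 240, "topic": "energy"},
        {"url": "http://example.org/article-2", "seconds": 15,  "topic": "health"},
    ]

    def share_with_service(log, allowed_topics):
        # the user, not the service, decides which attention data leaves their store
        return [entry for entry in log if entry["topic"] in allowed_topics]

    print(share_with_service(attention_log, allowed_topics={"energy"}))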

Some references:

Martin Wolpers, Jehad Najjar, Katrien Verbert, Erik Duval: Tracking Actual Usage: the Attention Metadata Approach. Educational Technology & Society 10(3): 106-121 (2007)
http://www.ifets.info/index.php?http://www.ifets.info/abstract.php?art_i...

The Attention Economy: An Overview
By Alex Iskold / March 1, 2007
http://www.readwriteweb.com/archives/attention_economy_overview.php
Extract:
The user information that is stored in the database can be accessed by trusted services. These services, approved by the user in advance, have the opportunity to take advantage of the user information to deliver personalization. For example, Netflix can take advantage of the user data to personalize movie recommendations. Newsvine can use the user's OPML to personalize news and Google can use the data to prioritize and filter their search results. So from a technical point of view, the key to facilitating the attention marketplace is in decoupling of attention capturing, attention storage and attention recording services.

What's AttentionTrust.org all about?
By Dan Farber | July 28, 2005
http://www.zdnet.com/blog/btl/whats-attentiontrustorg-all-about/1648

Attention.XML
Attention.XML is an open standard, built on open source that helps you keep track of what you've read, what you're spending time on, and what you should be paying attention to.

APML (Attention Profiling Markup Language)
http://apml.areyoupayingattention.com/

For a more management/organisational perspective on the concept of the attention economy, have a look at this (excellent) book:
The Attention Economy
by Thomas H. Davenport, John C. Beck, 2001
