Missing Link: Privacy, DAUs and Developers – Who is allowed to control software?


Should internet users have a say in who answers their DNS queries? A dispute over a kind of "informed consent" for the resolver is currently flaring up in the debate about a new way of transporting DNS traffic. Behind it lies a fundamental question: how much technology can and must one expect ordinary netizens to deal with, and how many decisions should the techies – or even the state – make for users?

Large network operators object to the idea that DNS queries their customers send via the browser are passed on to a central DNS operator somewhere else in the world. Yet exactly this has just been standardized by the Internet Engineering Task Force (IETF). Mozilla plans to make so-called DNS over HTTPS (DoH) the default in Firefox, and Google has just announced that it will offer DoH via its public resolver 8.8.8.8.
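RFC 8484 specifies how such a query travels: the classic DNS wire-format message is base64url-encoded and carried as a parameter of an HTTPS GET request (or POSTed as `application/dns-message`). A minimal sketch in Python – the `/dns-query` path matches the RFC's examples, everything else is plain standard library, and no request is actually sent:

```python
import base64
import struct

def build_dns_query(hostname: str, qtype: int = 1) -> bytes:
    """Build a minimal DNS query in classic wire format (RFC 1035)."""
    # Header: ID 0 (RFC 8484 suggests 0 to aid HTTP caching), flags 0x0100
    # (recursion desired), one question, no answer/authority/additional records.
    header = struct.pack(">HHHHHH", 0, 0x0100, 1, 0, 0, 0)
    # Question section: length-prefixed labels, terminated by a zero byte.
    qname = b"".join(
        bytes([len(label)]) + label.encode("ascii")
        for label in hostname.rstrip(".").split(".")
    ) + b"\x00"
    return header + qname + struct.pack(">HH", qtype, 1)  # QTYPE, QCLASS=IN

def doh_get_path(hostname: str) -> str:
    """URL path for an RFC 8484 GET request to a DoH resolver."""
    msg = build_dns_query(hostname)
    # base64url without trailing padding, as the RFC requires.
    dns = base64.urlsafe_b64encode(msg).rstrip(b"=").decode("ascii")
    return f"/dns-query?dns={dns}"

print(doh_get_path("example.com"))
```

Sent over TLS to a resolver such as 8.8.8.8, such a request is invisible to the local network operator – which is precisely what the dispute is about.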

Before users' DNS requests are outsourced to a global DNS service provider, users should at least be informed – that is the demand a number of large network operators are now making loudly in the face of this trend. Passing queries on "by order of Mozilla" – or of GAFA generally – requires the user's consent, they argue. After all, these requests might leave the user's own jurisdiction and unexpectedly bring a third party into play.

Many developers balk at the idea of asking end users which DNS resolver they would like. Mozilla CTO Eric Rescorla does not think it is a good one, and Dan York of the Internet Society asked in Prague: "Does this even interest users?" Based on experience from the web, developers expect that, as with faulty certificates, the majority of users will simply click through, annoyed. "The vast majority of people on the planet are unable to understand the security context and will not have the discipline to consistently apply such knowledge," says one IETF developer.

Mark Nottingham, chair of the HTTPBIS and QUIC working groups at the IETF, said that users are very unlikely to be able to assess the effects and consequences of the security choices offered to them – and often do not really want such choices in the first place.

The developers cite varied reasons for this. For one thing, security and privacy are complex topics: to understand them, you have to take your time and grasp at least the basics of the technology. Moreover, the consequences of giving up security or confidentiality for a short-term benefit usually only become apparent with delay. And finally, users generally work goal-oriented on specific tasks, so they tend to click through anything that stands between them and getting the task done.

From this, the browser developers derive their approach of continuously hardening their platforms for security and confidentiality without asking the user – though, in Google's case, also with an eye to perfecting its own advertising business. Against the complaints of network operators that traffic now bypasses them or passes through their networks encrypted, they hold: the local network operator – not least the operator on the road, outside the home network – is not necessarily trustworthy. Whether a central DoH provider the user knows nothing about is any better is the obvious counter-question. Perhaps some users, at least while in their own home network, would prefer their local provider. And here lies the problem: should you ask the user, and how? Rather than give users the choice, many developers at the IETF lean toward the conviction that a secure default without exceptions is the best solution.
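Firefox does in fact expose the choice – buried in its preferences rather than in a user-facing dialog. A sketch of the relevant settings as a `user.js` fragment; the preference names follow Mozilla's Trusted Recursive Resolver (TRR) mechanism, and the resolver URL is only one possible example:

```javascript
// user.js – Firefox profile preferences for DNS over HTTPS (TRR).
// Mode values per Mozilla's TRR settings:
//   0 = off (system resolver), 2 = DoH first with fallback to the
//   system resolver, 3 = DoH only, 5 = DoH explicitly disabled by choice.
user_pref("network.trr.mode", 2);

// Which DoH server answers the queries – exactly the choice the network
// operators want users to be asked about. Example resolver URL:
user_pref("network.trr.uri", "https://mozilla.cloudflare-dns.com/dns-query");
```

Whether an average user will ever find, let alone understand, these switches is precisely the question the developers are debating.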

Making encryption as easy as possible for the user is the declared concept of the Swiss foundation pretty Easy privacy (pEp), which has been working on "turnkey" solutions for secure email for several years. "We believe it is unreasonable – just as with messengers or HTTPS – to expect users to deal with keys or cryptography in order to have a right to privacy," says foundation council member Hernâni Marques.

The pEp suite, which already exists for several operating systems and above all as an integrated tool in new versions of Thunderbird, handles key generation, key exchange with partners and synchronization across devices, and signals to the user how secure the exchanged messages are.

Where the user has to handle all these things personally, adoption remains poor – especially for messaging and email encryption. An example of how tough things get when user initiative is required is messenger offerings like Telegram: although it takes only a one-time opt-in, most Telegram users communicate without end-to-end encryption. pEp, on the other hand, having invested heavily in easy installation and usability, points to growing adoption, especially since the integration into Thunderbird: around 400,000 pEp users were reported last year.

Even such concepts cannot do entirely without an informed user. Despite the clear commitment to "privacy by design and default", pEp users still have to keep their software up to date; backups and device encryption are recommended, Marques says. For the sake of usability, some security trade-offs were made: the passphrase customary with OpenPGP is no longer needed. The flip side: if you lose your device, whoever finds it can read your emails and messages – and your pEp key is gone with the device. Nor does pEp provide absolute security insofar as, in the opportunistic variant, the counterparty's key is not verified without further effort.

The pEp suite also tries to meet even the DAU – German IT slang for the "dumbest assumable user" – where they are: with traffic-light colors, red for "mistrusted", yellow for "opportunistic", green for "secured". "It is important," says Marques, "that the user does not feel completely safe."
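How such a traffic light can be derived from the state of a connection can be sketched in a few lines. The state names and decision logic here are illustrative only, not pEp's actual API or exact behavior:

```python
from enum import Enum

class ChannelState(Enum):
    """Illustrative trust states modelled on the article's traffic light
    (hypothetical names, not pEp's real rating API)."""
    MISTRUSTED = "red"        # key mismatch, or no protection at all
    OPPORTUNISTIC = "yellow"  # encrypted, but peer key never verified
    SECURED = "green"         # encrypted and key verified out of band

def rate_channel(encrypted: bool, key_verified: bool,
                 key_mismatch: bool) -> ChannelState:
    """Map connection properties to a traffic-light color."""
    if key_mismatch:
        # A failed verification is worse than no verification: warn loudly.
        return ChannelState.MISTRUSTED
    if encrypted and key_verified:
        return ChannelState.SECURED
    if encrypted:
        # Opportunistic encryption: protected against passive eavesdropping,
        # but a man in the middle could still have swapped the key.
        return ChannelState.OPPORTUNISTIC
    # Plaintext is never shown as safe in this sketch.
    return ChannelState.MISTRUSTED
```

The point of the yellow state is exactly Marques' remark: opportunistic encryption helps, but the user should never be lulled into feeling completely safe.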

Limits of Plug and Play

"Complete plug-and-play, without ifs and buts, is unrealistic for a variety of reasons," says Marit Hansen, data protection commissioner of the German state of Schleswig-Holstein and a strong voice for technical data protection.

Of course, developers and providers should do much more for usability, so that services and products are delivered in a way that minimizes risk and integrates privacy and security from the start, she points out. In Europe this is even mandatory: storing data in plain text in the cloud, broadcasting signals from connected cars without limiting their reach, or distributing and logging DNS data around the world without need does not comply with the provisions of the General Data Protection Regulation.

Users also need to be able to make informed decisions, based on understandable information, about whether and when to release more data or permit more processing, Hansen points out. There it is again: the "informed consent" at which the techies turn up their noses.

Developers, incidentally, are not directly addressed by Article 25 GDPR. The article is directed, Hansen states, "at those who process personal data and are responsible for the processing – and developers and manufacturers do not belong to that group." But: those responsible must take the principle of "data protection by design and by default" into account when selecting processing systems, and demand it from the manufacturers. "Unfortunately, this still happens too rarely," warns the data protection officer.

What makes things difficult for developers and providers alike is that protection needs differ widely. The differences may be technically motivated or depend on "where the service providers involved or their servers are based, which law they are subject to, and what unforeseen access to data or computers is possible – by government agencies or others. Some prefer service providers or components in their own jurisdiction, others precisely not," she says.

This line of thought was also invoked by the advocates of HTTPS as a transport for DNS. Those who use DNS in a country (or network) that is heavily filtered under government pressure could benefit from a central, external DoH server. That large network operators, now that they are threatened with losing traffic, suddenly argue users should be given the choice of where their DNS queries are answered is, in the view of DoH advocates, owed more to a fear of losing control than to the idea of informed consent in data processing.

According to Hansen, one of the tasks of the state is to ensure that data protection and security are anchored in standardization. On the other hand, the state is also called upon to raise awareness and understanding of risks and protection concepts among the population, "without everyone having to become a computer science expert," says the data protection officer.

Practically speaking, on the question of how much technology users can reasonably be expected to handle, she advises educating them about the important security- and privacy-related decisions during installation and configuration, with recommendations coming from their providers or from parties they trust.

"Configuration files or wizards from consumer advocates, privacy advocates, security experts or associations would also be conceivable here," she says. The job of the developers is to enable and support this, and the documentation should give examples of typical configurations. "That would also make it easier for data controllers to choose privacy-friendly presets for their processing," she says.

Risky operations should additionally trigger clear warnings – for example when secret keys are passed on, encryption is deactivated, or third parties gain access to one's own device (say, for remote maintenance). Sometimes it may even be necessary to stop the action in question; operators should define such cases and make them transparent. To what extent this recommendation from the privacy advocate applies to the use of DoH servers is likely to be debated in the coming months.

For all the shortcomings users may have, transparency about possible risks and about the partners and third parties involved in a service at least allows a first decision about whom – which software developer, which provider and which jurisdiction – they want to trust.

The idea that everyone should be able to control their own software sounds good and has not been completely abandoned in developer circles. Perhaps future generations of users will be more familiar with the technology, or better educated, says one IETF developer. "Then maybe we could develop smarter models for user–network interaction and give the user a choice," he says. "Of course, that does not happen on its own; we should not assume that growing up with the internet automatically makes you understand how it works."