The “technology” in “technology policy” should have a real meaning, and there are signs this is actually happening. With the growing involvement and engagement of people versed in technology, good concepts are reaching decision-makers. In this short note I highlight three interesting items currently on the table in Europe. They are seemingly unrelated, but taken together they mark a trend.

Bug bounties as a component of cybersecurity policy

Bug bounties are an already well-established paradigm that rewards vulnerability researchers for the time they devote to testing the security and privacy of systems they are not responsible for. These kinds of programs are useful components of an organization’s broader security policy. Bug bounties are increasingly used by organizations beyond the best-known corporations such as Google or Facebook - among them the US Army and the Dutch National Cyber Security Centre. So the fact that the European Commission is apparently starting to conduct limited-scale field tests of a bug bounty is interesting and meaningful. According to this document, the program is underway and the European Commission is looking for a contractor to manage its operation. The scope of the bug bounty will cover selected open source programs.

End-to-End Encryption in ePrivacy - confidential communication

End-to-end encryption (E2EE) is the strongest mechanism for guaranteeing communication confidentiality. The approach is being seriously considered in the work on the ePrivacy Regulation, currently at the European Parliament; I previously wrote about the topic here. The fact that E2EE is even considered at a relatively high decision-making level is a great success of the technology community, made possible by continuous engagement and contributions of valuable research, analysis and other public voices. The stakes are higher, though, and go beyond end-to-end encryption itself, because a broader (world-wide) debate is taking place at the same time about strong communication and system security guarantees. Among its core aspects are E2EE and the existence (or absence) of backdoors.
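The core property of E2EE is that only the communicating endpoints can derive the decryption key, so any relay in the middle sees only ciphertext. A toy sketch of that idea, using a textbook Diffie-Hellman exchange and a throwaway keystream cipher (illustrative only - real systems use vetted protocols such as Signal's, not hand-rolled crypto):

```python
# Toy illustration of the E2EE property: both endpoints derive the same
# key, while a server relaying the messages never can. NOT secure crypto;
# the group parameters and cipher below are for demonstration only.
import hashlib
import secrets

P = 2**521 - 1   # a Mersenne prime; real systems use vetted groups/curves
G = 3

def keypair():
    # Private exponent plus the public value an endpoint publishes.
    priv = secrets.randbelow(P - 3) + 2
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    # Both endpoints compute the same secret; the relay cannot.
    secret = pow(other_pub, priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_stream(key, data):
    # Toy keystream cipher: XOR with a hash-derived stream (demo only).
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

k_alice = shared_key(alice_priv, bob_pub)
k_bob = shared_key(bob_priv, alice_pub)

ciphertext = xor_stream(k_alice, b"hello Bob")  # all the relay ever sees
plaintext = xor_stream(k_bob, ciphertext)       # only Bob can recover this
```

The backdoor debate is precisely about whether a third key - held by someone other than the two endpoints - should be able to perform that last decryption step.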

Algorithmic transparency measurement - research to the rescue

In the era of the General Data Protection Regulation and the new ePrivacy Regulation, services are required to follow certain rules - for example, when user profiling is in use. It is expected that sooner or later the use of Artificial Intelligence will be regulated as well, and specific requirements for algorithmic transparency and digital ethics will be enacted. These kinds of aspects require deep knowledge of technology, and some people still argue that the Data Protection Authorities tasked with enforcing these regulatory frameworks are not adequately prepared.

How do we test the operation of technology and algorithms, their inner workings or their transparency? Perhaps one of the best approaches is measurement. This approach is pursued and developed in academic circles (testing and automation are of course widely used in industry as well) and is successfully applied to security and privacy testing. In short: it is about automating the use of a specific service - for example connecting to a website, performing some actions and collecting the results - and then analysing those results. This approach may reveal meaningful insight into the inner workings of a system. I know this because I have experience in privacy measurement (for example, studying profiling and privacy configurations in advertisement systems that use Real-Time Bidding).
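The visit-collect-analyse loop described above can be sketched in a few lines. The site names and the `visit` helper below are hypothetical stand-ins; a real measurement study would drive an instrumented browser and log actual network traffic, cookies and storage:

```python
# Minimal sketch of a privacy-measurement loop (hypothetical data and
# helper names): visit each site, record which third-party hosts were
# contacted, then aggregate to spot hosts present across many sites.
from collections import Counter
from urllib.parse import urlparse

def visit(url):
    # Stand-in for automating a browser visit; returns the third-party
    # requests "observed" on the page (fabricated example data).
    fake_observations = {
        "https://news.example": ["https://ads.example/pixel",
                                 "https://cdn.example/lib.js",
                                 "https://tracker.example/beacon"],
        "https://shop.example": ["https://ads.example/pixel",
                                 "https://tracker.example/beacon"],
    }
    return fake_observations.get(url, [])

def measure(sites):
    # Collect results across all visits, then aggregate per host.
    counts = Counter()
    for site in sites:
        for request in visit(site):
            counts[urlparse(request).netloc] += 1
    return counts

results = measure(["https://news.example", "https://shop.example"])
# Hosts appearing on many unrelated sites are candidates for
# cross-site tracking - the kind of insight the analysis step yields.
print(results.most_common())
```

The same skeleton scales to thousands of sites; only the `visit` step changes, from a stub to real browser automation.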

Google was recently fined €2.42 billion by the European Commission in an antitrust case (Facebook - in a privacy case). That is related to competition law, but the really interesting part is elsewhere and is revealed in this contract notice. The European Commission (EC) is seeking advice on technology matters - for example, how to verify whether its requirements are respected, as we can find in these tender specifications. The actual requirements are about consulting the EC on how to make sure its decisions are enforced - and whether they are adequate. By launching this contract, the European Commission acknowledges that it lacks adequate in-house expertise in this fast-changing field and seeks external help. I am not commenting here on whether the tender request is formed correctly - I think it is not. Rather, I am highlighting that a similar approach would be viable for European Data Protection Authorities for the topics they find challenging: seek external help. There are many great places in Europe with the adequate skills and body of knowledge, where privacy research is a well-established field.


Simple. Interesting ideas are finally reaching decision-makers, and we will all benefit from it. But I expect this process to remain lengthy.