PEGA Committee of Inquiry: State Trojans Endanger Fundamental Rights and Democracy

On 27 October, the committee met to discuss the impact of state trojans on fundamental rights, on democracy and on elections. We are publishing an unofficial transcript of the hearing.

Ot van Daalen of the University of Amsterdam spoke to the committee about the impact of state trojans on fundamental rights. – All rights reserved European Parliament

On 27 October, the committee met to discuss the impact of state trojans on fundamental rights and democracy. Various experts spoke in two panels. The first panel dealt with the impact of state trojans on fundamental rights, while the second focused on the possible consequences for democracy and elections.

There is a video of the hearing, but no official transcript. We are therefore publishing an unofficial transcript.


  • Date: 2022-10-27
  • Institution: European Parliament
  • Committee: PEGA
  • Chair: Jeroen Lenaers
  • Experts:
    Panel 1: Ot van Daalen (Institute for Information law, UvA), David Kaye (former UN special rapporteur for freedom of expression)
    Panel 2: Krzysztof Brejza (targeted with Pegasus while head of election campaign), Giovanni Sartor (Part-time professor at Faculty of Law at the University of Bologna and at the EUI), Iverna McGowan (Director, Europe Office of Center for Democracy and Technology)
  • Links: Hearing, Video
  • Note: This transcript is automated and unofficial; it will contain errors.
  • Editor: Emilia Ferrarese

The Impact of Spyware on Fundamental Rights / Democracy and Electoral Processes


Panel 1

Jeroen Lenaers (Chair): Minutes past nine, so I propose we start the meeting. Welcome to all of the full and substitute members of the Pegasus Committee. We had a very busy day yesterday with 7 hours of hearings on Pegasus and equivalent spyware and on big tech, on ePrivacy and then spyware. And today we'll continue our session with two panels, the first one on the impact of spyware on fundamental rights. And later this morning, we will hear about the impact of spyware on democracy and electoral processes. We have translations, interpretation, sorry, in German, English, French, Italian, Dutch, Greek, Spanish, Hungarian, Polish, Slovak, Slovenian, Bulgarian and Romanian. And if there are no comments with regard to the agenda, I consider it adopted, and I would like to move to our first panel immediately, which will be dedicated, as I said, to the impact of spyware on fundamental rights. To participate in this hearing, we have two distinguished speakers, Dr. Ot van Daalen from the Institute for Information Law at Amsterdam University, and Professor David Kaye from the University of California, who is also the former UN Special Rapporteur for freedom of expression. And given the time, I'll restrict my introduction and immediately pass the floor to Dr. Ot van Daalen. You have the floor for about 10 minutes.

Ot van Daalen (Institute for Information law, UvA): Thank you very much. Dear members of the Pega Committee. I will be sharing my testimony in digital form with you afterwards. So thank you for inviting me today to the hearings on the impact of spyware on fundamental rights and the impact of spyware on democracy and electoral processes. I am an Assistant Professor of Privacy and Security Law at the Institute for Information Law at the University of Amsterdam. Last year I published a report for the Dutch Ministry of Foreign Affairs on the export of cyber surveillance technologies under the new dual-use regulation. This month I finished a PhD on the human rights obligations of governments relating to information security. And today I want to focus on one aspect of this research, namely the human rights obligations of states with regard to the regulation of vulnerabilities in software and hardware.

So first, what does the regulation of vulnerabilities have to do with spyware? Well, in order for spyware to be installed and operated on a device, you would need access to and control over this device. But isn't a device protected against this? Of course; however, these defences sometimes do not provide a fully effective shield. They are, in short, vulnerable. Now, gaining control over a device requires exploitation of one or more vulnerabilities in a system's defences. Vulnerabilities are, in other words, a prerequisite of spyware. Without vulnerabilities, spyware would be next to impossible to covertly install and operate. So the first takeaway of my contribution is that the regulation of spyware is intimately connected to the discovery, sharing and exploitation of vulnerabilities.

Now, if you would want to curtail the use of spyware, one regulatory response could be to strengthen the defences of digital systems. Currently, much attention in EU policymaking is devoted to this aspect; see, for example, the NIS2 Directive and the Cyber Resilience Act. Still, even if organisations shore up their defences significantly, vulnerabilities will remain. It's next to impossible to develop systems without vulnerabilities in the current digital ecosystem. And even if some organisations succeed in doing so, it isn't realistic to expect all organisations to do this. Another regulatory response would then be to limit the research into vulnerabilities: if you don't find them, they cannot be exploited, or so the thinking goes. Unfortunately and perhaps unintentionally, there's some of that in EU policy as well. Under current EU rules, such as the Cybercrime Directive and the Copyright Directive, information security researchers may face civil and criminal liability when doing research into vulnerabilities and sharing their results.

Now there's a problem with that, too. Vulnerabilities are discovered regardless of such prohibitions. Think of criminals running ransomware schemes, or intelligence agencies running hacking operations. And again, research into information security can be useful, because if you don't find vulnerabilities, you cannot fix them. So this leads to a surprising situation in the EU. Currently, if you are a security researcher and you find a vulnerability in, for example, an online camera, you're not obliged to share your findings. This research might even be illegal in the EU. You could choose, however, to sell the knowledge of this vulnerability to a broker who will pay good money for it. Now, this might be legally dubious, but the chances of being caught are small. As a result, there are currently numerous individuals and private companies whose business model is to find vulnerabilities and sell them to others, sometimes for millions of euros per vulnerability. Many of these vulnerabilities will not be used to strengthen systems, but instead to attack them. And some of these vulnerabilities may end up facilitating spyware. The question is whether EU governments have an obligation to change this. I conclude that they do. For this conclusion, I firstly analysed information-security-related case law under Article 8 of the Convention and Articles 7 and 8 of the Charter. From this case law, it becomes clear that states have an obligation to minimise the risk of unlawful access to private information and systems. The consideration of the European Court of Human Rights in I v. Finland of 2008, which was about unauthorised access to medical data, is particularly relevant. The court considered that states have an obligation to exclude any possibility of unauthorised access. Similarly, the EU Court of Justice has read into Articles 7 and 8 of the Charter an obligation to prevent unauthorised access to private information in the context of data retention. Now, this does not mean there is no room for balancing interests, for example between the right to confidential communications and the prevention of crime. This has been confirmed, for example, in K.U. v. Finland, also of 2008. So this brings me to my second takeaway: if states have an obligation to reduce the risk of unlawful access, and vulnerabilities increase the risk of unlawful access, then governments have an obligation to ensure that knowledge of vulnerabilities is used to strengthen information security in the public interest.

The next question then is: how can this be achieved? For answering this question, I analysed the obligations of states with regard to the right to freedom of expression under the Convention and the Charter, and the right to science as protected in the International Covenant on Economic, Social and Cultural Rights and in the Charter as well.

So this led me to conclude two things. First, the EU should clarify that information security researchers have a right to research vulnerabilities and to share the results thereof. They should not face the risk of criminal or civil liability if certain conditions are met. And this is the second conclusion, most relevant to this committee: states should introduce a duty to disclose the findings of vulnerability research. So if you find a vulnerability, you have to disclose it. This has two benefits. Firstly, it ensures that information security measures can be strengthened. Secondly, it puts an end to the current situation where researchers can keep vulnerabilities secret and sell them to private brokers.

Now, obviously, there should be some boundaries to this duty. Firstly, when you're disclosing vulnerabilities, you should do so in a way which reduces the risk that others can exploit them. This generally means that the parties involved should be granted an opportunity to fix the hole before disclosure. Secondly, if you found vulnerabilities which only affect your own systems, it would not be necessary to disclose them for others to fix first. And if you found vulnerabilities in things such as ransomware which would, for example, assist victims in decrypting their files, it would not make sense to disclose these either.

Finally, there is the question to what extent states can retain vulnerabilities for a limited time to exploit them, only afterwards informing the organisations which can fix them. I believe this is the most relevant question for this committee, and it's also the most complicated. The German Constitutional Court has already considered this question, concluding that the fundamental right to confidentiality and integrity of IT systems does not require authorities to notify vulnerabilities immediately in all circumstances, according to the court. Delaying notification must, however, be based on a legal framework that resolves the conflict between the different interests involved. Whether this conclusion is warranted under the Convention and the Charter requires further research. My initial assessment is that the German court gives too much leeway to states in this regard. On balance, the public interest favours immediate disclosure, and the burden rests on governments facing this dilemma to demonstrate otherwise. This, given the currently poor state of information security, will be difficult to accomplish. Now, one important distinction is probably between a vulnerability leading to a class risk, which is a weakness in many systems at once, and a vulnerability in one particular server, for example as a result of a wrong configuration. If only one particular server is affected, this is far less problematic than if many systems are affected.

And this leads me to my final conclusion. Spyware is intended to work on every device. And as the research of this committee demonstrates, its use is often highly problematic and in some cases illegal. This means that the retention and sharing of vulnerabilities for this purpose should be severely restricted, if not completely banned, based on the case law under the Convention and the Charter. And with that, I thank you very much.

Jeroen Lenaers (Chair): Thank you. Thank you very much, Mr. van Daalen. Very interesting, because we have discussed vulnerabilities, the stockpiling of vulnerabilities and markets in vulnerabilities extensively, and it's very interesting to hear this approach. I'm sure there will be many questions on this. But we first move to Professor David Kaye, you also have the floor for 10 minutes. Could I ask members who would like to intervene in the Q&A session afterwards to indicate so during Professor Kaye's introduction, so we can make the speaker's list? Thank you.

David Kaye (former UN special rapporteur for freedom of expression): Thank you. Chair. Members of the Pega Committee, thank you very much for the opportunity to appear before you today. I genuinely believe that this committee can set the standard for the control of the kind of intrusive technologies on your agenda. And I thank you for taking on this work and rising to the occasion. I’m sharing, along with this testimony this morning, an overview of key aspects of international human rights law applicable to spyware like Pegasus. I would also draw attention to the first and second footnotes of that testimony and the annexe that I’m attaching that identifies other work, including work that I did as the United Nations Special Rapporteur on freedom of opinion and expression from 2014 to 2020 that explores the human rights at issue and the policies that might address them.

In these introductory remarks, I would like to identify a series of related points, eight points if you're keeping track, that, in considering the fundamental rights at issue, bear the scrutiny of legislators and policymakers, especially but not only in Europe. First, this committee's remit focuses on the severe threats to freedom of expression, privacy, association and other fundamental rights posed by a particular type of aggressive surveillance tool. I understand that this is not per se about the NSO Group's Pegasus spyware, because we know that there is an opaque industry of spyware tools hidden from global attention right now. As such, my comments focus on the specific problems posed by intrusive spyware like Pegasus and not the broader but still serious problems of other forms of digital surveillance, mass or targeted.

Second, with the question of lawfulness in mind, what is it about this kind of spyware that requires urgent global action? I believe it is, in large measure, this: surveillance technologies like Pegasus give attackers an unprecedented power of intrusion and collection that fails to distinguish between legitimate and illegitimate targets of surveillance. It provides the attacker with the ability to gather and monitor its target's digital life without distinguishing, say, criminal conspiracy from an individual's opinions, communications, politics, contacts, location data, dining habits, banking and more, sometimes in real time. All of these human activities today are often mediated through our personal devices, for better or, as here, for worse.

Third, given that extraordinary level of intrusion, the risks to fundamental rights are correspondingly severe. The rights at issue are not only those held by individuals as such. Yes, of course, human rights law, and here I am focussing in particular on the International Covenant on Civil and Political Rights, but also the EU Charter of Fundamental Rights and the European Convention on Human Rights, protects individual rights to privacy, opinion and expression. But these very rights are foundational to democratic societies, as the Convention and Charter and their case law and the Human Rights Committee of the UN repeatedly make clear. Spyware causes individuals to doubt the privacy of their communications and opinions; it is strategically designed to cause people to question their intention to engage in private and public discourse. And I hardly need to say this to legislators, but for democratic societies, that withdrawal can be fatal, particularly when the targets of such intrusion are those we depend upon to inform our public life and debate, such as human rights defenders, journalists, civil servants and elected leaders like you.

Fourth, given the severity of the threats posed by such intrusiveness into individual life and democratic society, the burden to justify such threats falls on the attacker, here governments and the private actors that provide the tool. Put another way, human rights law places the obligation on the state to demonstrate that any burden it imposes on a fundamental right is justified by the law. It is emphatically not a matter of balancing interests, but one of justifying a burden by legal standards. States and spyware companies argue that they need the tool in order to counter terrorism or other threats to national security and public order. But as was clear in the NSO Group's testimony here in June, they are generally unwilling or unable to explain why that is so, or how their tools meet basic human rights standards, always hiding behind state secrets, contractual arrangements and other excuses. And these excuses, even if one thinks of them as legitimate, must nonetheless be supported by evidence. In its absence, that is, in the absence of evidence, the rule of law requires that we proceed on the assumption that such spyware fails to meet several key principles of international human rights law.

Fifth, the human rights to privacy and freedom of expression share a common, or at least a close to common, set of standards that require the state to meet tests of legality, necessity and proportionality, and legitimacy. And this means several things. It means that any burden on privacy or freedom of expression must be provided or prescribed by law, precisely drafted to give the subject of the regulation notice, but also to limit the discretion of the state to impose any burden. It means that the restriction must be the least restrictive of the tools available to the state, that it imposes no greater burden than necessary, and that the burden must not eliminate the right entirely. And it means that the ends must be legitimate. These are cumulative standards, which means the attacker cannot simply say, for example, that the restriction is for national security. They must demonstrate meeting each condition.

Sixth, every right must have a remedy for its violation. The ICCPR itself obligates states to ensure an effective remedy. The nature of that remedy may exist along a spectrum, depending on the circumstances, from criminal accountability, restitution and compensation to satisfaction, apology and guarantees of non-repetition. Unfortunately, too often states hide behind claims of sovereign immunity or national security to avoid liability and remedy. But impunity only incentivises the use of the tool. However the committee proceeds, remedy, I believe, should be a part of the equation, consistent with human rights law.

Seventh, it's often suggested that human rights law applies only to states and not to private actors. And while this is in part true, it's not entirely true. For one thing, the state is obligated not only to promote and protect fundamental rights, but also to protect the enjoyment of those rights. The state has an obligation to protect those within its jurisdiction against, for instance, interference with one's privacy, an obligation that should be understood to include transnational surveillance threats. But also, under the UN's Guiding Principles on Business and Human Rights, adopted by the Human Rights Council over a decade ago, companies themselves have a responsibility to prevent or mitigate human rights harms their activities cause, or to which they contribute. The adoption of a human rights policy by a company and an internal process to address human rights concerns is nice, even a prerequisite. But without transparency, external oversight and remediation, it's really window dressing, hardly even a first step.

Finally, in light of all that I've noted, I have serious doubts that surveillance technologies with similar characteristics to Pegasus can ever meet the tests of international human rights law. As such, their use should be considered unlawful. Your committee's work, combined with actions such as the US blacklisting of NSO Group, suggests that a ban of such technologies is the correct answer to result from your work. At a minimum, however, and I am about to conclude, I return to the call that I made in 2019: the development, marketing, sale, transfer and use of tools like Pegasus should be brought under a moratorium, that is, temporarily halted, while states, regional institutions and international organisations consider and implement a range of minimum steps that should be undertaken. Strict, internationally agreed export controls. Genuine transparency and oversight. Radical legal reform of surveillance practices and law. Removal of barriers such as sovereign immunity. These are a few of the steps that ought to be taken, again at a minimum, to begin the process of replacing a lawless use of technology with the rule of law. Thank you very much.

Jeroen Lenaers (Chair): Thank you very much, Professor Kaye. And we move to our questions and answers, and we start with our rapporteur, Sophie in 't Veld.

Sophie in ’t Veld (Renew): Yes, thank you, Chair. And thanks to our two guests this morning. This is a very rich exchange; I have a lot of questions that I will just fire at you in random order. So, first of all, to Mr. van Daalen: you talk about the obligation, the general obligation to disclose, which makes perfect sense. And that would mean that already the commercial trade in vulnerabilities should be completely banned. But you say there should be certain exceptions to that disclosure obligation in cases where the harm is limited, that I understand, but for example also for research, or possibly for government bodies, which may need such vulnerabilities or say they need such vulnerabilities. But how could that be? I mean, what kind of restrictions could we envisage? Everything okay?

Jeroen Lenaers (Chair): Yes. Mr. Arłukowicz sent me a message, but I left my phone at home. So he’s going to orally transmit the message to me now.

Sophie in ’t Veld (Renew): Just getting a little bit paranoid.

Jeroen Lenaers (Chair): No, no, no, no. Sorry for the confusion.

Sophie in ’t Veld (Renew): If you could say something about that, you know, how could we shape that? And then to Mr. Kaye: I, of course, entirely share your concern about the use of spyware, otherwise we wouldn't be here. But what would you say to those who say yes, but spyware also has its uses, because we can track down evil, evil people with it?

And then the second question is, I very much concur with you that Europe should set the standards. However, there are certain practical obstacles or limitations, I would say, because we can set the standard, but the very nature of this stuff is that it's worldwide. What about Russia? What about China? What about spyware which is not commercial but developed by governments themselves? What about our friends in the U.S.? And finally, what about our friends in Israel? Israel, which is very much the hub, the cradle if you want, of this kind of stuff.

And then finally, you mentioned remedy. And of course, nobody can disagree with that. But the problem is that what we see in Europe now is that in several cases the state itself, which is supposed to protect citizens, is actually the attacker. So what does remedy mean? I mean, this is the problem we see in a number of member states: there is no real remedy anymore. On paper, yes, but not in reality. Thank you.

Jeroen Lenaers (Chair): Thank you, Mr. van Daalen. If you would like to respond first, and then I pass the floor also to Professor Kaye.

Ot van Daalen (Institute for Information law, UvA): All right. So first, let me clear up something which I might have understood incorrectly from your suggestion. My proposal is not to ban the sale of vulnerabilities or anything like that. My proposal is to introduce a duty to disclose vulnerabilities, and this may have the effect of banning the entire trade, hopefully. But the reason why I am emphasising the duty to disclose is that by imposing a duty to disclose, you also strengthen information security. And I think that particular aspect of, you know, the whole debate also needs emphasis, because information security is quite weak currently and it is really important to shore up our digital defences, and a duty to disclose would not only effectively ban the trade in vulnerabilities but also strengthen information security. This also finds some basis in freedom of expression and right to science case law. So that's an additional bonus.

Then you rightfully pointed out the most problematic or difficult aspect of my suggestion, my recommendation, namely how to deal with restrictions on a duty to disclose in the context of, say, national security, etc. I'm hesitant to say that, if you look at the case law under the Convention and the Charter, there is no room at all for states to keep vulnerabilities secret temporarily. We've also seen that from the German Constitutional Court case: there is probably room for states to keep vulnerabilities temporarily secret. However, this should firstly be based on a specific legal framework. And secondly, those vulnerabilities should have a very limited scope, which means they would only affect, say, one system. That would be my initial intuition on what would be a sort of proportionate response to this. And this is an important conclusion, because spyware affects multiple systems. And that for me would be the legal basis to say, well, vulnerabilities with regard to spyware would have to be disclosed, for the simple reason that they affect all systems, not only one. But again, I'm somewhat hesitant to provide a final conclusion here because this is really a very difficult topic.

Jeroen Lenaers (Chair): Thank you. Professor Kaye. Yes.

David Kaye (former UN special rapporteur for freedom of expression): Thank you for those questions. They're good questions and also hard questions. So let me take them in order.

But I want to start with sort of my fundamental point, which is that presently spyware, the kind of spyware that we’re talking about, is not subject to robust regulation. And I think the effort of this committee, in large measure, as I see it, is to impose a kind of legal order on its regulation.

So, for example, with respect to the first question, in response to the argument that spyware has its uses: of course it does. I mean, there's a reason why this industry is making money hand over fist. It's useful to governments. It's useful to other actors as well. But as useful as it is, any surveillance tool needs to be brought under a system of law. And I don't think that it's tolerable for societies driven by the rule of law to enable states simply to say, we need this tool, therefore let us use it. I think it's: we need this tool, and therefore, in order to use it, we need to put it under this framework of legal standards. And that's what, generally, I'm trying to suggest. The problem with this particular set of technologies, because Pegasus is kind of a set of technologies rather than just one in particular, is that it really does, as a technical matter, lack standards of discrimination. And so by its nature, it sweeps up all sorts of information that is totally illegitimate for law enforcement or national security purposes. And that's, I think, the fundamental conundrum: in a way, how do you regulate a tool that, by its technical nature, may be unregulatable at some level?

The second question, about Europe setting the standard, but what about others? It's absolutely true that some states have tools that are at least as powerful as tools like Pegasus. We don't know the extent to which those are used, but we do know they exist. Occasionally there are leaks and we learn something about them as well. But what I take from this committee is that the committee is interested in one particular kind of technology, and it's not excluding the possibility of either remedy or addressing other state tools. But given the fact that this particular technology is available on an open market, essentially, and available to be used by European states against perhaps members of this body or leaders or journalists within the European space, I think it poses a particular threat. I do think that as this committee makes proposals, as with so much right now coming out of Brussels, it will have a normative impact on consideration of these issues worldwide. It may take time. It may not directly or immediately apply to those states. It may be that states like the ones that you mentioned step into the breach and offer their tools to bad actors as well. That's all absolutely true. But I think at the moment the current threat posed by tools like Pegasus is severe enough that steps to put it within a legal framework make a lot of sense. And it could contribute to creating a normative environment that addresses what other states are doing. And my answer there also leads into the third question you asked, about remedy.

So one of the areas in which I think remedy could be helpful, both in the context of dealing with the private surveillance industry but also the state use of surveillance technologies, is the question of sovereign immunity and official immunities that protect states, in particular against transnational litigation. Now, it requires careful study, and I think there will be some states that would object to changes in the way sovereign immunity applies. On the other hand, there has been movement, and we actually saw a case out of the UK earlier this month that suggested that sovereign immunity should not apply in the context of transnational uses of surveillance technologies. And I think that as we move towards a discussion about remedy, that remedy could have a direct impact on the use of surveillance technologies like Pegasus, but not only Pegasus, and also provide a longer-term disincentive to states using surveillance technologies transnationally. It won't eliminate it. I don't live in a naive world where suddenly, you know, the Pega Committee adopts a proposal and the world becomes safer from these threats. I mean, we could dream, but I do think that this is kind of a first step in changing the norms. And I think we could imagine a kind of cascade: if sovereign immunity is somehow, at least in some measure, removed as a barrier, we could imagine the development of law in state courts, that is in national courts, in human rights bodies and elsewhere, that begins to assert a really strong measure of lawfulness around the control of these kinds of technologies. But again, I agree with the premise of your questions, which is that there's a world of technology out there that poses a threat that is probably sort of a Pega Committee part two, to some extent.

Jeroen Lenaers (Chair): I didn’t know we had a Pega committee part two yet. But thank you for that proposal. And Mr. Solé.

Jordi Solé (Greens): Thank you, Mr. Chair. And thanks to our guests for their very interesting interventions. First, I have a question for Mr. van Daalen: you advocate for a duty for states to disclose findings on vulnerabilities. My question is, to what extent do government agencies research and actively look for such vulnerabilities, and what are they doing to redress them? Do you think they are, all across the European Union, I mean, do you think they are doing enough, or should much more be done at government level to address the issue of vulnerabilities?

And then to Mr. Kaye: you rightly said that when it has been proven that spyware tools have been used in member states, then the burden to investigate these attacks lies on the governments and on the private companies that have provided such tools. And then you mentioned the restriction on national security grounds. Normally it's a too broad concept, and somehow it should be narrowed down, because if it's on the states to investigate and to give information about these cases, normally what we find is that in the end we come up against the wall of the concept of national security. And this is just being used as an excuse not to disclose information and not to go, you know, to the end with the investigation. So my question is, don't you think we need a more rigorous, more targeted and unified definition of what national security is, and under what circumstances these attacks on fundamental rights, if ever, and I stress if ever, can be justified under the concept of national security? Thank you.

And then just a final comment. This is not a question, this is a comment. We are talking about the impact of spyware technologies on fundamental rights, and we talk about the right to privacy, freedom of expression and so on. But I think we also have to speak about the right to political pluralism, because we know that in some cases these spyware technologies have been used to spy on a whole political movement, and that affects the possibilities of some legitimate democratic movements to play on a level playing field and pursue their legitimate and democratic political objectives. So I believe that we should also speak about the impact of spyware technologies on the right to political pluralism. Thank you.

Jeroen Lenaers (Chair): Thank you, Mr. Solé, for two questions and one comment. Maybe Professor Kaye, if you’d like to reply first, and then we pass the floor to Mr. van Daalen.

David Kaye (former UN special rapporteur for freedom of expression): Great. Thank you. Thank you for those questions. So I want to clarify what I mean in talking about national security. I'm not sure that it's a particularly fruitful path to go down to redefine national security or to change how we think about what is national security, although that might be possible. But first and foremost, to my mind, the issue, as you were suggesting in your question, is how do we ensure that states justify their intrusions on the basis of national security? Too often, courts defer to executive, law enforcement and security service conclusions that they need to do something because of national security. And I understand the politics around that. There's been some pushback to that recently. Interestingly enough, last year the Indian Supreme Court actually responded to a kind of national security claim of the Indian government by saying national security is not a magic wand that you can wave over your activities and suddenly justify anything that you do; you need to justify it. And so the main point that I was trying to make in my testimony is that when a state says it needs to do something because of national security, it also bears the burden of justifying that under the cumulative nature of human rights law's requirements. So we might say, okay, national security or public order, that's a legitimate aim, to protect those obvious national interests. But in order to do that, you also have to demonstrate that the law that provides you with that authority meets the prescribed-by-law standard. In other words, it's clear enough to provide the targeted person with an understanding of the rule, but it's also precise enough to impose some limits on the state's discretion. So there's that part of it.

And then there's the necessity and proportionality part, and I think that very often, if not typically or even always, the use of tools like Pegasus fails the necessity and proportionality standard. And by thinking of this in terms of that kind of three-part assessment that human rights law requires, I think it allows us to move away from having a debate with the state over what is or is not national security, because in the public sphere that's often a losing battle. But putting it into a rule-of-law set of standards is more the ground on which human rights law allows us to operate and allows for some pushback. So just to sum that up, it's less about redefining national security than putting it within a rigorous legal framework. And with respect to your comment at the end, I totally agree. And I think that the point of human rights law, by its terms, applying to individual rights and yet also being framed within the context of democratic society, as the European Convention makes absolutely clear, gets exactly to this point: these are individual rights, but in particular privacy, freedom of expression and freedom of association go to public participation and, as you put it, political pluralism. And I think those are all very much tied together here.

Jeroen Lenaers (Chair): Thank you, Dr. van Daalen.

Ot van Daalen (Institute for Information law, UvA): Yeah. Thank you for giving me the opportunity to clarify the scope of the duty to disclose. So my suggestion is to not only extend this duty to disclose to states, but also to extend it to private researchers. My research has shown that many information security vulnerabilities are discovered by outsiders: academics, freelancers, people just doing it from their own basement for whatever reason they want. And the question is, are the current legal rules sufficiently accommodating of that activity to ensure that those findings are actually being fed back into the cycle?

And there are two things which are currently not working well. First of all, these researchers face criminal and civil liability under the EU Copyright Directive and the EU Cybercrime Directive. And this means that the EU legislators should introduce an explicit exception for information security research. So that's the first thing. And then the second thing is that you need to have a framework for ensuring that those vulnerabilities that you find and that you want to disclose are channelled back into the system in a responsible way. This is called coordinated vulnerability disclosure, CVD, and currently in the NIS2 Directive there is something like a suggestion that Member States should adopt a policy, and I think that's really not sufficiently strong. You should really have a legal framework setting out how this should work in practice. So if you have these two things in place, first of all the freedom to do the research, and second of all ways to feed back the findings of your research in a way which actually minimises the risk of exploitation, then I think you are in a good place.

Jeroen Lenaers (Chair): Thank you very much, Mr. Arłukowicz.

Bartosz Adam Arłukowicz (European People’s Party): Thank you very much. Thank you very much, thank you for your presentations. And I would have four questions to both speakers, two questions each. Because governments very often justify the purchase of Pegasus or Pegasus-like software as a tool to combat serious crime, drug trafficking, terrorism.

So my question is as follows: can you provide some examples from Europe, or do you know of a case, maybe some court decisions, where criminals were convicted using Pegasus or Pegasus-like software as evidence for crimes, for ordinary crimes? Do you know of such cases from Europe?

And the second question: is it a violation of human rights if public authorities purchase Pegasus and use it against the head of the election campaign of opposition parties, the information from Pegasus is then divulged publicly, then there is a whole witch hunt on public television and public media, and their lives are devastated too? Is it a violation of human rights when Pegasus is used against an independent prosecutor who initiates proceedings or investigations against the officials of public authorities?

And finally, do you think that using Pegasus against the lawyers of EU politicians is a violation of human rights? And what about the impact of using Pegasus against the highest officials of the EU, within the context of the war that is going on, when decisions on sanctions are taken, and then we find out that Commission officials have been spied upon? What is your opinion on that?

Jeroen Lenaers (Chair): Thank you very much, Mr. Arłukowicz. Dr. van Daalen.

Ot van Daalen (Institute for Information law, UvA): So with regard to the first question, whether I'm aware of any cases where spyware was used to convict or to prosecute someone successfully: I'm not aware of those, but to be honest, that was also not part of my research, so that should be an important footnote. And then let me join in condemning the use of spyware to spy on lawyers and election officials. I think particularly if you use that kind of invasive technology to target the democratic process and to target the confidentiality of lawyers, you violate human rights in a more invasive way. So I totally agree with your assessment in this regard.

David Kaye (former UN special rapporteur for freedom of expression): Thank you for those questions. And I agree with Professor van Daalen, and by the way, I agree with his statement on vulnerabilities as well. But to answer your questions: first, I don't know of any case where a criminal investigation has been justified, has been based on evidence drawn from Pegasus. And if we go back to the June testimony of the NSO Group before this committee, also there was no explicit claim. There was sort of a hiding behind an assertion of its value, but no evidence of its value. So your question is a very good one. But it really does highlight the secrecy that surrounds the use of this tool and the failure of its users to justify it, or to prove that it's justified, on the grounds that are asserted.

On the second question, and by the way, that's not only with respect to Europe, that's also worldwide: I don't know of cases where it's been really proven that it was Pegasus or something like Pegasus that provided the evidence to convict a person. With respect to your second question, of course I agree that targeting those kinds of individuals, those categories of participants in public life, is illegitimate. And it allows me, in a way, to go back to the previous question on national security and to put this into a framework again, which is, as I've said, that in order to justify the use of any surveillance technology, that is, to justify a burden on privacy or expression or association, the state or the attacker has to show that it meets these three different tests, and those are cumulative. But the ones that you highlight go to the third part of the test, which is the legitimacy of the use. Now, oftentimes those users will say, oh, it was national security. So this allows me to sort of go back to that question and say, of course, there are some uses, there are some arguments that this is national security, where on its face that is clearly not the case. So uses against lawyers, against public prosecutors, against the kinds of political figures that you're highlighting, those are, I think, without further evidence for sure, illegitimate uses of a technology even beyond Pegasus. I mean, those are clearly uses designed to intimidate, to embarrass, etc., but not designed to actually address a legitimate interest of the state. And so I think your question is apt, and it's clear that those uses would fail on any number of grounds, but certainly on the ground of legitimacy.

Jeroen Lenaers (Chair): Thank you. Miss Delbos-Corfield.

Gwendoline Delbos-Corfield (Greens): Yes. Hello. Thank you. So it's becoming very clear that we have some states, also in the European Union, that have in-house technology, if we could call it that way. And we are focussing a lot on these Pegasus issues, but I'm a bit afraid that we're not looking at this bit. France, Germany, I guess it's the big ones, and you yourself just said we know very little about this. Where is this discussed? Is there a legal framework around this? I mean, should one exist? Is there a place where, you know, this is discussed at the international level, I mean at European level, at least between peers, so they have a bit of control of the situation? And what are the remedies for citizens to be informed of this when it's in-house technology and it's not bought from a private company like NSO?

And my second type of question would be on the ban. I would completely agree with you; I even today really start questioning whether we should be surveilled in a number of other things and other places in our everyday life, and I'm getting very concerned about all of this. But of course, indeed, we have problems. The first one would be the realism of being able to put this ban in place today. And I'm also afraid of the very specific case of those states that have the money to do all of this themselves in a very secretive way; a ban might in fact have the inconvenience that all of this would go completely into the dark and become very secret. The ban would prevent the little countries and the poor ones from doing some things, but would not prevent the big ones. So that's one of my questions about the ban. My other question comes back to what you've just said. I don't think Pegasus has often been justified with the pretext that it would be to prevent normal criminal cases. But I do guess that what they would say, especially these big states, is that they use these technologies for terrorism. And on this, to be honest, do we know whether there are cases where it did help, and what, where and when?

And then my last question would be, you've not pointed to this a lot, neither one nor the other, but maybe it's not within your competences: what would be the role of the judiciary in all of this? I mean, okay, we create a legal framework, we have laws, but then when should a judge come into the discussion? Do we need authorisation at the beginning? What are the different ways in which we could at least involve the judiciary, which would be one safeguard at one moment?

Jeroen Lenaers (Chair): Thank you. Mister van Daalen.

Ot van Daalen (Institute for Information law, UvA): Yes. So I want to focus on the first question regarding the in-house use or development of spyware by states, not so much by third parties. Here again, I think looking at this through the lens of vulnerabilities is helpful. As I said in my earlier presentation, the existence of a vulnerability creates the risk of unlawful access by criminals, by states, by intelligence agencies, etc. So the fact that you as a state have in-house knowledge of a weakness in a system and you don't exploit it, and you don't tell it to, say, the vendor or the developer so that they can fix it, but instead keep it to yourself, in itself already creates a risk of unlawful access. So how does this relate to human rights? Under human rights law, every interference with, for example, the right to privacy has to be prescribed by law; it has to be governed by a legal framework. In this case, however, the fact that you keep these vulnerabilities in-house and don't tell anyone about them is often not governed by any legal framework. Now, there are certain frameworks being used in certain countries. These are called vulnerability equities processes, VEPs. But they often do not have the force of law; they are merely administrative policies or practices on how you should deal with vulnerabilities. So one of my recommendations would be to really require all states to have a vulnerability equities process in place. This in itself would very much circumscribe the in-house retention and potentially also the use of vulnerabilities and thus of spyware.

Jeroen Lenaers (Chair): Mr Kaye.

David Kaye (former UN special rapporteur for freedom of expression): Thank you for those questions. So I'll start not with the first but with the fourth, because I think your question about the role for the courts, for the judiciary, is actually quite an important one. And I'll tie this afterwards to the question of the ban.

So there certainly is a role for the courts. I mean, in general, when it comes to law enforcement or other kinds of investigations that are tied to law enforcement, there are basic rules of due process that have to apply and that are mediated by the courts. And so if we don't think about a ban for a moment and we think about how it is possible to put surveillance technologies within a legal framework, I think part of that framework necessarily includes control by the judiciary, as has been the case for decades in terms of controlling all sorts of law enforcement behaviour. And so the question is, what are we asking of courts? Now, in the most serious, legitimate, say, national security claims, counterterrorism claims, some states have developed specialised courts that allow for the protection of privacy and national security information but still allow for give and take between different actors. So, for example, in the United States you have the Foreign Intelligence Surveillance Court, essentially. And those kinds of courts are a potential mechanism for the most serious concerns that a state might have. But even beyond that, the regular day-to-day work of courts warranting investigations and then controlling those investigations is absolutely essential. And so if we're not talking about a ban but we're talking about regulation, I think the courts have to be involved in that, for sure. But the courts can have other roles as well, and it's an interaction between the courts and the legislature. So, for example, if a national legislature says, you know, sovereign immunity has been used too often in order to protect attacking states and to protect transnational surveillance, so we're going to tinker with that and disallow sovereign immunity as an admissibility claim, well, it'll be up to the courts to interpret that, to implement it, and that's entirely legitimate. And so I think your question is really important, because I think the role of the courts is part of bringing back the rule of law to govern these kinds of tools. Now, and this gets to your question about the ban, I still believe that this particular set of technologies, which doesn't allow technically for a distinction between legitimate and illegitimate targets on a person's personal device, that that still makes it generally unlawful, as both a problem of necessity and proportionality, but also potentially legitimacy. But I take your point that if you ban the technology, you might incentivise pushing it down deeper into the shadows. I think that's a problem that exists whatever we do, right? Whatever way this committee or states move forward on this, the problem of surveillance increasingly going dark is a problem that we need to struggle with moving forward; I think that exists regardless of ban or regulation. Now, I do think that that is a very important question, but it also signals that the work of putting surveillance technology under a rule-of-law framework isn't the only thing that's happening here, that there are other steps that need to be taken in order to control the technology. And part of the work of this committee and of regulation or a ban will also be establishing a normative framework for how we think about those things moving forward.

And then you also asked about the way in which states may simply argue that they're only using this for terrorism, not for generalised crime. I think that the evidence that's been developed by, you know, journalists and by human rights organisations doing the forensic work belies that. In other words, it suggests that states, to the extent they get the tool, are actually not limiting themselves to using it only for the most serious crimes. Again, an argument for transparency, disclosure and so forth in order to control its use.

Jeroen Lenaers (Chair): Thank you, Vlado Bilčík.

Vladimír Bilčík (European People’s Party): Thank you very much. And let me just maybe pick up on two points and push a little more. Now, I think it's very clear from what Professor Kaye just said: surveillance has always been here, and it's part of how we work and organise things in society. And there may be legitimate reasons when it works to prevent a serious crime and to protect the basic and fundamental rights of individuals or even a group of people. Particularly in this context, when war has come back to Europe, you know, it sets the whole discussion in a bit of a different setting. And I think we need to strike the right balance. And my question is just to push a bit more, because what you're saying is we don't necessarily have to redefine the basic priorities when it comes to national security, framework laws, etc. But what is a game changer is the level, the intensity and the pervasiveness of the technology.

And here I want to kind of push particularly along the lines of questioning, what can we do to catch up with it when it comes to regulation, when it comes to not just what we put in laws, but also what we do in terms of the institutional framework.

Let me just pick out three different areas that you've touched on. One is regulation when it comes to use. I mean, you may have perfect things written down in your laws, but the institutional framework doesn't work, either in general or specifically because it isn't designed to deal with these sorts of technologies. What sorts of loopholes should we be looking at, or gaps should we be closing, when it comes to the institutional application of making sure that the use indeed happens only in cases where there is a real threat to certain individuals which might be prevented, whether it's a terrorist attack or whatever other threat there might be?

Now, the second area is loopholes and gaps when it comes to victims, because obviously this technology has been used. And again, do we have the proper framework to deal with the victims of this, who have been spied on and attacked unlawfully, and what needs to be done to actually close the existing gaps? And lastly, and this is where we get to the area where you have the connection between the states and the industry and the organised crime network, let me put it this way: what sorts of gaps and loopholes should we be looking at, and how do we best address them, when it comes to the production and the sale of this? Because that's where we are dealing particularly with nation states or member state governments. And really we need to consider how this transnational aspect, the global aspect of the technology, can be regulated more effectively. So those would be the three areas, just to push a bit more for some insights into how you see this. And the second point is on the duty to disclose; we have this discussion all the time. You know, we have very different access to information even across the EU institutions. This Parliament has arguments with the Council and the Commission all the time: what can we access, what can we not access? So this is the eternal discussion.

And again, I think I want to take it from the standpoint of the technology, because this is where we are trying to catch up with understanding what is going on with the potentially harmful effects of potentially good technologies. And we had a huge discussion about this, for instance, in the Digital Services Act, when we looked at the platforms, the online world, and of course we are trying to see how the online platforms should be reporting, disclosing information, and what they do in the black box of the algorithms, etc., because these things can kill, as we see in the real world and real life when it comes to the spread of hate or disinformation. So again, in terms of the duty to disclose, the question is to whom, through what channels, to what extent, because there are legitimate reasons when you don't disclose things, because you want to protect, again, the fundamental rights of the society, of the people, of the individuals. So if you can be a bit clearer on this as well, it would be very useful, because, yes, we need to protect perhaps the whistleblowers and others who help us understand, but in terms of the system in which we trust, we need to have clarity across the society. Thank you.

Jeroen Lenaers (Chair): Thank you very much, Mr. Kaye first.

David Kaye (former UN special rapporteur for freedom of expression): Great. Thank you for those questions. They’re organised nicely, so I can answer them in policy terms: use regulation, addressing victims, and sale and production, slash export, let’s say.

So first, on use regulation, you ask what are the gaps and loopholes that need to be addressed. To my mind, this connects to many of the questions that have been asked, and that is putting the use of surveillance technologies, not limited, frankly, to Pegasus-like tools, but surveillance generally, within the legal framework. As Professor van Daalen said, ensuring that the “prescribed by law” requirement of human rights law is genuinely a part of law. So it’s not just a matter of a policy statement; it’s a matter of, if a state wants to use this particular tool against a particular target, which is a euphemism for an individual, it must go through a legal process in order to get access to it and for its use to be controlled. And I think one of the things we’ve seen in the use of this particular tool, Pegasus, is that often that part is skipped over; intelligence services or national security services seem to think they can use the tool without constraint. So, for example, outside of Europe we saw this, and we’ve seen it repeatedly, in the context of Mexico and the use of the tool pretty evidently without any judicial or legal kind of control. So I think that is major; it’s more than a loophole. It’s the total absence of a legal framework in many places. So I think a key thing that can come out of this committee is: if we’re not talking about a ban, but we’re talking about regulation, what does that legal framework look like?

Secondly, in terms of loopholes around victims and how we close gaps. One thing that we haven’t talked about: there’s the responsibility of the state, and I’ve highlighted a few areas where the state can provide remedy, as it is responsible to do, and I think that’s useful in thinking about victims. But there’s also a role for third-party companies, private actors. We’re starting to see this develop. I think we see some of it in the vulnerabilities market, but we see it especially in the lawsuits brought in the United States by Meta, or WhatsApp, and Apple against the NSO Group. Those are lawsuits that third-party companies are bringing against NSO Group because of the intrusion into their infrastructure, into their computer networks. And I think that’s actually one of those moments where, because you mentioned the DSA, we could talk about Big Tech’s problems on one side, but this is actually, I think, a positive step that those companies have taken: they are protecting the integrity of their networks, but ultimately it’s about protecting the security of their users’ experience and their communications. Now, there are all sorts of political and public perception reasons to do that also. But I think there’s something valuable in encouraging third-party companies to have a responsibility with respect to the security that they offer their users. And that’s an example of closing a loophole for victims. Apple has gone a little bit further and, in its latest iOS, its latest operating system, actually provides tools. I don’t want this to be a commercial for Apple, even though I’m from California. But they do provide this extra tool for individuals who face a particular level of threat to lock down their device when there’s a potential intrusion. Those kinds of things are important and valuable and, I think, can be incentivised.

And then the last, in terms of sale, production and, I would say, export: it’s useful that there’s a lot of Dutch around here, because the Wassenaar Arrangement is a useful construct, a framework for us to think about the export of dual-use technologies like Pegasus. Generally speaking, the export control regime is focussed on national security threats or national security implications, but it doesn’t include human rights implications, at least for international export controls. And I think that could be something where, again, this committee could really add value, in terms of encouraging the international export control system to integrate not only national security concerns but also human rights concerns. States have taken that on themselves. The EU itself, in its transparency obligations, has focussed in particular on fundamental rights; the DSA also focuses to a certain extent on fundamental rights. And I think that’s an important aspect that can be injected into export controls as well. But then I want to bring that to the more general point, which is that it’s important for those export controls to genuinely be international, to apply not only to a few states that might be hosting companies that export these tools, but to be a commitment by all states to ensure that the development, the marketing and the sale of these tools happen under a generally agreed framework. And I think there’s some movement in that respect, but a lot of the effort lately has been unilateral. So, for example, the US blacklisting of NSO Group was a unilateral move. It’s a positive move in my view, but it’s the kind of thing that could be globalised, let’s say, and Wassenaar might be the place for that, although there are risks there, and we need to ensure that you don’t overregulate the space and interfere with security research, for example. But that’s a useful place to think about how export controls can be coordinated.

Jeroen Lenaers (Chair): Thank you. Thank you very much. I just have maybe a couple of questions, or more, also to go a bit deeper into some of the concepts, I think.

Ot van Daalen (Institute for Information law, UvA): There was also a question to me.

Jeroen Lenaers (Chair): I am so, so sorry. I was so excited about my own questions that I completely forgot the questions of Vlado about, indeed, the vulnerabilities. Sorry. Please go ahead.

Ot van Daalen (Institute for Information law, UvA): All right. So first, on the dual use, on export control: of course, in the EU we have the dual use regulation, which actually takes human rights into account when exporting such tools, and the export of spyware is specifically listed in the dual use regulation. All right. Then your question was, with regard to the specifics of disclosure, how would this work? So in information security there is a thing called coordinated vulnerability disclosure. Certain practises have been formed over the past decades on how to disclose in a way which is responsible, which basically minimises the risk of exploitation. My first point is that the goal of disclosure is to strengthen information security. Working from that goal, this means that you first have to inform those companies or organisations who are able to actually close the hole. So if you find a hole, you have to inform the organisations who can close the hole, and only after the hole is closed should you disclose it to the rest of the world, in order to ensure that others cannot exploit it. Then, how do you make sure that this functions well? It’s advisable to have a specific place on a website where people can actually notify you of a vulnerability they have found. I would make that mandatory. The most complex question is when do you notify, and to whom? Because you might notify, say, Intel that there’s a bug in their CPU, but then Intel might take one and a half years to actually fix the bug, and meanwhile everyone is still vulnerable. This is not how things should work; Intel should act on this quickly. I don’t propose any specific time limits on this, but you should have an open norm which favours disclosure, so that things get fixed as quickly as possible.
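
One existing convention that matches this idea of a specific reporting place on a website is the security.txt file (RFC 9116), published at a fixed, well-known location so that researchers know where to send a vulnerability report. A minimal sketch, with purely hypothetical contact details, might look like this:

```
# Hypothetical example, served at https://example.com/.well-known/security.txt
# Contact tells researchers where to send vulnerability reports (required field)
Contact: mailto:security@example.com
# Expires states how long this file should be considered valid (required field)
Expires: 2025-12-31T23:00:00.000Z
# Optional pointers to a coordinated disclosure policy and preferred languages
Policy: https://example.com/vulnerability-disclosure-policy
Preferred-Languages: en, nl
```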

Jeroen Lenaers (Chair): Also, again, apologies. A couple of points. One thing, if you could maybe explain in a little more detail: what I was very interested in is what the liability for researchers is under the current copyright directive, amongst others, because you mentioned it a couple of times and it’s not very clear to me yet where exactly this risk lies. Secondly, you mentioned that specifically vulnerabilities with regard to spyware should be disclosed, because they target all systems. How do you know, at the point where you discover a vulnerability, whether it is going to be used for spyware or not? Because, of course, the vulnerability is just a vulnerability at that moment; the spyware as such, to take advantage of the vulnerability, still needs to be developed. So how do you make this definition, and where do you make that analysis?

Also, if you know, because for me it’s still a little bit opaque how this market for vulnerabilities actually works: is it that private individuals, researchers, find vulnerabilities and offer them to the highest bidder, or are there pre-existing business relationships and it works like that? Are these vulnerabilities also offered to the tech companies themselves? And do they have a responsibility to, sort of, purchase their own vulnerabilities, or should they have such a responsibility? Because, of course, the budgets of some of these companies are huge, also compared to the NSAs of this world. So shouldn’t they have more of a responsibility to participate in this market, or does that carry risks of its own?

And to Professor Kaye: you spoke about banning versus regulating, because a ban might not be useful in all instances, but what could regulation actually look like? For instance, the mandate of our committee is to look at Pegasus and equivalent spyware, and already on just the definition of what is equivalent spyware you could spend a couple of sessions: what is equivalent to Pegasus, and what is the technology exactly? Because nobody, apart from, I think, NSO itself, knows exactly what the technology is, and there is also no disclosure of the technology. So can you actually regulate the technology, or would it be better to regulate the use? And then, in the use, you could perhaps target the sweeping information grabbing that Pegasus does; you could prohibit that and opt for only a targeted kind of information grabbing. So what kind of definition could we get to? And then there was also the question by Mr. Arłukowicz about, I think, the courts. How do you see regulation functioning in a situation, which we also have in the European Union, where there are questions about the independence and the impartiality of the courts, of the judicial system? And how do you view, in that regard, for instance, what we have seen in Hungary, where the data protection authority investigated 200 cases of the use of Pegasus and said, well, this was all within the law, this was within Hungarian law, so there is no problem? How do you see, then, international human rights treaties, the Convention, etc., and what kind of possibility do they still give to victims in such a country?

Ot van Daalen (Institute for Information law, UvA): Yeah, thank you very much for your questions. So, with regard to the risk of criminal and civil liability under the copyright directive: there is a prohibition on circumventing so-called technical measures, which are measures to protect copyright, and these circumvention prohibitions actually impose criminal and civil liability. These technical measures are security measures. So if you do research, there is a risk that you actually circumvent these technical measures, and if you then share the results of that research, there is a risk that you are providing circumvention tools. Now, this is intended to protect, for example, digital rights management technologies from being circumvented, so that e-book copy protection is not broken. However, it has far broader effects and has also been used to target security researchers who found a vulnerability in an access chip for the door locks everyone uses in their offices. It has been used to target researchers who found vulnerabilities in digital locks for cars. So there are serious real-life examples of information security researchers being confronted with this risk. Now, with regard to the question of how you know that something is being used for spyware: well, my point was not so much that if something will be used for spyware, then it should be prohibited. But if it is a vulnerability which exists in multiple systems, so in, for example, all iPhones or all instances of Microsoft Windows 11, whatever, that vulnerability puts all users at risk all the time unless it is closed. And spyware, by definition, exploits that kind of vulnerability, vulnerabilities which are present on more than one device. That is why, in effect, those kinds of vulnerabilities are the most problematic, and that would for me be the reason why immediate disclosure should be obligatory. Then, finally, with regard to the vulnerability market: I am not an expert on the vulnerability market for precisely the same reason that you have difficulty finding out what happens there. It is a very opaque market. What I know is that some of these firms advertise so-called vulnerability feeds. You can subscribe to a channel where the vulnerabilities they find are shared with you, ostensibly so that you can use that information to protect yourself. I don’t think it works that way. These vulnerability feeds are, in actual effect, most probably used to attack systems. So that’s my impression. Thank you.

David Kaye (former UN special rapporteur for freedom of expression): Yeah, thank you for those questions. So, for the first question, in terms of what regulation would look like and how you would define the subject of regulation: I think over-defining the space would be a way to limit the future usefulness of any regulation. So your question, the direction you were heading, is: do you define by use or function, or do you define by the technology itself? I think the former is preferable. In other words, you define what you are regulating not by the technology that exists at this particular moment, because technology is dynamic, it will change, and your definition will ultimately leave out the kinds of uses that you want to cover. Instead, focus on the use, on the ability of a technology to have indiscriminate access through this aggressive intrusion into the entirety of one’s digital life, essentially. So define it by the use, by the function, rather than trying to say the technology is X, Y or Z. To my mind, that’s the approach that would give any regulation a long shelf life, let’s say.

And then the second question, around how you regulate in this space when you have a failure of independent judiciaries. I frankly don’t have a really good answer to that problem. The broad problem right now is that the discussion around surveillance is taking place at a moment where there are multiple examples, across Europe and certainly around the world, of a breakdown in the rule of law, including in independent judiciaries. And that is something I don’t have a concrete answer to. I also believe that, as the cliché puts it, you don’t want the pursuit of the perfect to be the enemy of good regulation. And I think that, to the extent the committee is able to define the problem and to say: this is a framework for regulation for all states, one which includes an independent judiciary capable of doing this, which is also implicit in all rule of law frameworks, that is something this committee could put forward. It would be extremely valuable, and addressing the situation in Hungary or other places where you have seen Pegasus and similar technologies being used in Europe has to be dealt with in that context.

Jeroen Lenaers (Chair): Thank you. Thank you very much to both of you. I’m not sure if there is anything you would like to add that you didn’t get a question on? No? Thank you very much. I think it has been a very rich exchange, definitely key food for thought for us, also with regard to the potential avenues for solutions that we will find here. You have already more or less invited yourself to pick up part two, so we’ll see how that goes. Thank you very much, very sincerely, for your contributions. And we really hope that you will also keep a close eye on the work that we do and stay involved. And if you see or have any better ideas than we have, please feel free to proactively contact us again. Thank you very much. We will take a five-minute break while we prepare the podium for the next panel. We will start again at 10:40. Thank you all very much.

All right, dear colleagues, if everybody could take their seats, then we can start the second panel of this morning’s session. And maybe those who are doing interviews could do so outside of the room, so we can actually do the work of the committee. That also goes for you, Ms. Kempa. Thank you very much.

We conclude our very intensive two-day hearing with our panel on the impact of spyware on democracy and electoral processes. With this hearing, we wanted to have a discussion with experts covering the impact of spyware on democracy and electoral processes, and I want to thank the participants for having made themselves available on short notice. It is very much appreciated that you are here with us today. We also have some visitors in the room; you are warmly welcome. I would just like to remind you that you are here as observers and listeners and it is not allowed to participate in the debates, either verbally or non-verbally. So please respect those rules. Thank you very much.


Panel 2

Jeroen Lenaers (Chair): Now we have three distinguished guests on our panel today. We have Mr. Krzysztof Brejza, who was targeted with Pegasus while head of the election campaign in Poland for his party. We have Professor Giovanni Sartor from the Faculty of Law at the University of Bologna. And we have Ms. Iverna McGowan, the director of the Centre for Democracy and Technology’s Europe Office. I will not take a long time for introductions, because we are mostly interested in hearing what you have to say. So we will give the floor first to Mr. Krzysztof Brejza, Senator of Poland, who was targeted with Pegasus while leading the electoral campaign. You have the floor for 10 minutes, and thank you very much for being with us today.

Krzysztof Brejza (targeted with Pegasus while head of election campaign): Okay. Thank you. Good morning, everyone. Thank you for inviting me. My name is Krzysztof Brejza. I am a senator, a member of the Polish Senate. Today I am here with my lawyer. Together we have 13 cases, 13 court proceedings. Some of them are civil, some of them are criminal, and they are all linked to the use of Pegasus against me. For three terms in office I was a member of the Polish Parliament. In 2019, when I was targeted, I was working for the biggest group of the opposition, Civic Coalition. I was the head of the campaign, but I was also targeted when I was running to be an MEP. This was a very difficult experience. It is hard to be outside your country, abroad, and to have to tell you what my own authorities, the Polish authorities, have done. But I am proud. I am proud to be Polish. I am a patriot. And the fact that in my beloved country people who hold power have committed such crimes against democracy is very painful.

Starting from 2015, when PiS came into power, and until today, the party in power has very consciously been removing all the safeguards protecting democracy against any temptation towards an authoritarian regime. Our public prosecution has been politicised; the Minister of Justice is the same person as the Prosecutor General. We no longer have an independent constitutional court. If you look at how judges are appointed in ordinary common courts, it is also politicised. There has been a chilling effect in the courts, and the secret services have also been politicised. There is no longer a system of checks and balances with respect to the secret services. And look at what is happening on public television: this is the television of the government, basically; this is the work of PiS, the mouthpiece of the government, inciting hatred against the opposition. There are no safeguards protecting our democracy. As a result, with all this impunity, the Polish government was using Pegasus against the opposition in a very cheeky way, if I may say so. They were hoping that they would get away with it because it would never become public knowledge. But that wasn’t the case. I was being spied on with Pegasus. It is not only about me being the victim of a crime; it is also about attacking free elections in Poland on an unprecedented scale. Everyone from the campaign staff was being spied on. This is an attack against the freedom of elections. We know now that there was not a level playing field during the elections in 2019, but back then we, as the opposition, had no knowledge of the fact that the elections were being manipulated.

This is a European Watergate. If you look back at Watergate in the US, there was just an attempt to wiretap the Democrats. PiS took a step further: during the whole campaign, before the elections, they were wiretapping us, they were spying on us, they were stealing the data of members of the opposition and people working for the campaign. So this is Watergate multiplied by a factor of ten.

So let me go back to 2019. I would say that those elections were basically organised by the secret services because of Pegasus. Now you can look at it in two ways. We were permanently under surveillance, around the clock; we were being spied on as the people organising the campaign for the opposition. But that is not all. They were not only wiretapping us and spying on us; they stole the data and they used it. They used the data from our phones, including my private messages from many, many years ago. They took those messages from many years ago, manipulated them, doctored them, and published them in a changed form on public television in order to prove what they wanted to prove. In the middle of the election campaign, the secret services and public television launched a large-scale smear campaign against the opposition. But that wasn’t enough for our authoritarian government, turning Pegasus into a weapon: they wanted to attack us directly, undermining our work by publishing manipulated messages stolen with the help of Pegasus. Those messages were changed, doctored, manipulated and then published during the election campaign. This is unprecedented, at least in Europe, in European democracies. But we know what happened in Russia; that is a country where such practises are common. In 2006, Russia One launched an attack against Navalny; they published stolen and manipulated messages from his phone. This surveillance applied not only to me, but also to everyone around me. Amnesty International established that my father was under attack, the mayor of Inowrocław, a small town in central Poland. By the way, he was recently considered to be the best mayor according to a Polish weekly. The head of my parliamentary office was also attacked by means of Pegasus. A Polish daily was able to disclose that this operation cost 20 million PLN, around €4 million. A tremendous amount. Well, I am quite sure that the aim was to compromise the whole opposition. There was slander against us on public TV, 5,000 materials against us. Pegasus became the main tool, the foundation for the attack against the opposition. So public TV launched an attack against me. There was a campaign of hatred, and as a result, threats started to be sent against my children. So on the one hand, PiS was trying to attack me using Pegasus, but paradoxically, Polish police officers offered help because there was a risk that my children would be kidnapped. I can tell you that it was a lonely battle, a very lonely experience, being exposed to cyber weapons which should be used against terrorists. I found myself in a situation where they were used against me by the authoritarian state in my own country. It is a borderline experience; it is hard to talk about it. I felt abandoned by the Polish institutions, which have been hijacked by PiS. The prosecution does not want to deal with it. The courts are subject to this chilling effect; they are afraid. In May, I witnessed the secret services denying that I was being spied on. The government does not want to react at all. Therefore, I am very honoured that I can give my testimony here. We need universal safeguards that would be applicable to all the member states. This should never happen again, to anyone. Thank you.

Jeroen Lenaers (Chair): Thank you very much, Mr. Brejza. The honour is ours to have you here and to be able to listen to your statement on the impact of the use, of the abuse, of this kind of spyware on your personal life and on the electoral processes in Poland. So thank you very much. I am sure there will be plenty of questions after we have also listened to our two other distinguished guests. We start with Professor Giovanni Sartor. You have the floor for 10 minutes as well.

Giovanni Sartor (Part-time professor at Faculty of Law at the University of Bologna and at the EUI): Okay. I am very happy to be here with you today. In particular, I am honoured to speak after the very enlightening testimony by Mr. Brejza. I would like to say that democracy has always been a concern for data protection. I would like to recall two great scholars and pioneers of data protection in Europe, Professor Stefano Rodotà and Professor Spiros Simitis, who addressed these concerns already starting in the seventies. Simitis, for instance, said that neither freedom of speech nor freedom of association nor freedom of assembly can be fully exercised as long as it remains uncertain whether, under what circumstances and for what purposes, personal information is collected and processed.

Considerations of privacy protection involve more than any particular right: they determine the choice between a democratic and an authoritarian society, which is, in a way, the choice that we have to face today. And in the case of Pegasus and similar surveillance software, the collection of information is massive and the impact on the individuals concerned is enormous. As you know better than me, Pegasus can collect emails, record calls, and capture social media posts, user passwords, contacts, pictures, videos, sound recordings and browsing histories. It can activate the cameras or microphones to capture images and sounds from the environment, listen in to calls and voicemails, collect location logs and monitor movements. And all this happens without the user even touching the phone: this is called, I think, zero-click surveillance. You don’t need to click; it is apparently sufficient that the message arrives at the phone. And the evidence that the interference took place can apparently even be removed without leaving any trace.

And we live, as we all know, in a hyperconnected world, where we use computing devices to store our ideas, to do our work, to engage in conversations, telephone conferences and written interactions, and to participate in social networks. Our personal devices, such as the mobile phone, play a special role: they provide us access to all of this digitally mediated dimension where we spend our lives. Therefore, taking control of our computing devices, and in particular our phones, provides for a level of surveillance and manipulation that was unthinkable until recently.

Hannah Arendt, the famous philosopher, said that everything that lives needs the security of darkness to grow at all, and Pegasus, I think, removes that very darkness: it makes the life of the person under control completely visible to those who are using these surveillance techniques. And there is obviously, as we know, an interdependence between enjoying the ability to develop one’s personality without surveillance and manipulation and the ability to take an active and responsible role in the public sphere. These general considerations, which have been the topic of valuable philosophical and theoretical reflection, become much more specific and concrete, as we learnt from the testimony of Mr. Brejza, when surveillance impacts the activity of people who play an active role in politics, such as journalists, judges and politicians. Pervasive surveillance opens spaces for manipulation, blackmail, falsification and the destruction of one’s reputation based on false, partial or misleading data.

Data that is presented out of context so that it can be misunderstood. And being subject to those risks may induce people to self-censorship, to censor themselves, to avoid expressing opinions even in private contexts, and possibly to avoid engaging in politics and to choose to live outside of the public sphere.

I would like to conclude my presentation with some considerations concerning what can be done through EU law to address this kind of issue. And this is not an easy issue, because, as you well know, according to Article 4 of the Treaty on European Union, national security remains the sole responsibility of each member state. But this, I think, is not the real obstacle to subjecting to legal review measures that are intended to achieve national security purposes, because even measures that are enacted for this purpose would be subject to judicial review when they interfere with the provisions of EU law, as has also been established in some case law.

Imagine, for instance, that a state decided to stop the free movement of people for national security reasons: obviously this could indeed be subject to review. And I think that the legal problem we have to address consists rather in the fact that national security activities are excluded from key provisions of EU law, such as, apparently, the GDPR and the ePrivacy Directive, according to provisions included in these very instruments. But the situation is in reality very confusing, because in both the GDPR and the ePrivacy Directive we can find statements that apparently go in different directions.

On the one hand, we learn that the GDPR and the ePrivacy Directive do not apply to national security. On the other hand, it is affirmed that limitations to data subject rights for the purpose of national security are only acceptable when legitimate and proportionate. One way to resolve this conflict, completely in favour of national security, is provided by the Council position on the new ePrivacy Regulation, where it is clearly stated that the regulation does not apply to measures concerning national security and defence, regardless of who is carrying out the operation, whether it is a public authority or a private operator acting at the request of a public authority. But this would go to the extreme of completely disapplying the future ePrivacy Regulation to national security activities, giving no space to the kind of concerns that emerged from the presentation of Mr. Brejza.

So we need, I think, to rethink the connection between national security and EU law, to prevent the opportunistic abuse of national security exceptions for purposes that amount to undermining the democratic process.

How can this happen? I think the first thing we should consider is how to restrict the notion of national security and clarify that the exemption only covers activities that are exclusively directed at national security. And this is an aspect that fully pertains to EU law, as these provisions are included in instruments that are part of EU law. Another important issue, which is very relevant in connection with Pegasus, is determining the extent to which national security activities only cover state activities, to the exclusion of private entities that collaborate with the state for the purpose of surveillance, and also of those private entities that are ordered to use, process or keep, for national security purposes, data that have been collected for other purposes. This was a case that was decided recently by the European Court of Justice, which went in the direction of submitting such further processing to proportionality review.

And finally, ideally, we should transform the exclusion of national security from the scope of the GDPR and the ePrivacy rules into an exception, subject to proportionality and judicial review, based, as I said, on the legitimacy of the measure, the fact that it is regulated by law and the fact that it is proportionate. How can we achieve this? I think that if the Parliament, in its interaction with the Council, now in connection with the ePrivacy Regulation, succeeds in clarifying this issue, this would be an extraordinary achievement, though a difficult one to obtain. An advantage of applying the GDPR and the ePrivacy rules to national security would be that data protection authorities could also be involved in the process, at least when it is established that the exceptions cannot be invoked because the requirements of legality and proportionality are not satisfied. So thank you very much for your attention.

Jeroen Lenaers (Chair): Thank you. Thank you very much, Professor. And we move immediately to Ms. Iverna McGowan, director of the Centre for Democracy and Technology’s Europe Office. You also have the floor for 10 minutes.

Iverna McGowan (Director, Europe Office of Center for Democracy and Technology): Thank you. Mr. Chairman, members of the committee, members of Parliament, esteemed colleagues, I would like to thank you for the opportunity to speak today and to highlight how important it is to have the voices of civil society heard, to ensure that this committee’s findings can lead to human rights compliant and binding solutions to the threats that spyware poses to human rights and to our democracy more generally.

For those of you who might not be familiar with my organisation: the Centre for Democracy and Technology is a not-for-profit organisation. Our European headquarters is here in Brussels, and we work to protect democracy and human rights in European tech law and policy. We also work in the United States and internationally. I am speaking a little bit quickly; it’s my Irish tendency, I think.

My intervention today will focus on three points. Firstly, I would like to examine a bit more broadly the implications that spyware has for democracy, human rights and civic space in particular. Secondly, I will raise some of the challenges that it poses to election integrity. And thirdly, I would like to address the committee on certain points relating to the EU’s dual use regulation, before concluding with some concrete actions that we would urge the Committee of Enquiry to take.

As you’ve heard from our colleagues this morning, unlawful surveillance violates the right to privacy. It can also violate the rights to freedom of expression, opinion, association and peaceful assembly. The European Court of Justice, as we’ve also heard, insists upon a very strict necessity and proportionality test for state-led surveillance before it can be deemed lawful, and such surveillance must satisfy both the European Court of Human Rights’ and the European Court of Justice’s tests, which means it must be limited, in a democratic society, to what is strictly necessary. It is important to understand, then, as was also outlined earlier today, including by Professor David Kaye, that state hacking tools of the very type the NSO Group uses are such extraordinarily invasive and generalised forms of surveillance that they would de facto never meet either of the European courts’ standards. Nor, of course, would they meet the standards under international human rights law. That is why it is particularly worrisome that this enquiry has already heard that such tools are being used within the EU and that elements of the technologies could have been exported from the EU.

We must remember why our courts so staunchly uphold the right to communicate securely and so carefully consider any derogations from it. The right to communicate securely is the bedrock upon which key pillars of democracy are built. This includes press freedom, the presumption of innocence in a trial, privacy and freedom of expression, and indeed, as is the topic today, the very ability to hold free and fair elections. A vibrant civil society is a prerequisite for a free and fair election, and indeed for a strong, resilient democracy. The pandemic has fast-forwarded digitalisation, meaning that across the globe hundreds of thousands of people are organising online to fight racism and to protect our planet. But there has also been a major backlash against these demands for societal change and against the power of online organising. Civil society actors across Europe have witnessed online smear campaigns, stigmatisation of their organisations and staff, as well as personal attacks on those working on the front line to protect human rights and democracy. Already in 2019, the EU Fundamental Rights Agency’s work on civic space found that three of the four most common threats and attacks against civil society took place online. It can be no coincidence, then, that the primary victims of the Pegasus scandal are journalists, whistleblowers, human rights defenders and political opponents. These actors all have a crucial role to play in defending democracy and human rights, but it is because of this very role, because of this challenge of speaking truth to power at your own peril, that some actors become the primary targets of operations like the ones we saw with Pegasus. As we have mentioned, a vibrant civil society is imperative to run free and fair elections. Being spied upon inhibits people’s ability to organise and to campaign, and has a detrimental, chilling effect on freedom of speech and expression. Indeed, ladies and gentlemen, a vibrant and free civil society is a key prerequisite for a free and fair election. Another point to mention today, in line with what we just heard, are hack and leak operations.
It should be stressed that information lawfully obtained about any candidate that is in the public interest is a normal part of the cut and thrust of election campaigning and important to a vibrant democracy. But that is not what we are talking about. We are talking about hack and leak operations, where information is unlawfully obtained through spyware or phishing attacks, and which are often accompanied by smear campaigns.

As we heard, in Poland in 2018 Pegasus spyware was used to hack political opposition members, and that information was then used as part of a smear campaign. In 2016, Russian hackers released hacked emails from Democratic officials and rocked the US presidential elections. In 2017, a similar hack and leak operation released, during the French election period, thousands of documents about Emmanuel Macron just hours before the election. The motives and actors behind these different hack and leak operations may differ, but the tactics and the impact are similar. A week is a long time in politics; hours and minutes can be a very long time in the context of an election. The timing of such hack and leak operations tends to be at the eleventh hour, just before polling starts.

This therefore inhibits the targets and victims from restoring their reputation in time before the vote, which is what makes effective remedy in such cases extraordinarily challenging once election results have already been called. This also brings to mind the international election standard of equal suffrage: it is imperative, according to international human rights law and standards, that the different arms of the state remain impartial towards candidates and political parties. We can see, then, how such spyware poses an extraordinary danger to elections when used by unauthorised actors, by governments, or against political opponents. This is why it is all the more important that we regulate effectively to control the dissemination and use of these tools in accordance with human rights standards. The EU dual use regulation has helped to control the export of such technologies, but what about their use and trade within the European Union?

Another factor that is very important to discuss today, and which has implications for election integrity, is the harvesting and exploitation of user data from social media accounts for the purpose of targeting ads or other election-related content. We must include this particular type of unlawful harvesting and surveillance of people online in our analysis, given the serious implications for election integrity. The Cambridge Analytica scandal resulted in the disclosure and use of data at the expense of user privacy on a mass scale: by responding to seemingly innocuous quizzes via their Facebook page, users unwittingly gave access to third parties that harvested their data. Installed spyware similarly results in the use of this data on a massive scale, and stolen data, of course, can be used to manipulate what content and information we see online. User surveillance and targeting in this way has an impact on universal suffrage, that is, the right of all people to have a vote. This is because such targeting is used to conduct voter suppression: campaigns whereby personal data is used to target people and discourage them from exercising their right to vote. Minorities and communities of colour are, unfortunately, a typical target of this technique; their personal data is used to run campaigns that give them false information on election procedures or dissuade them from exercising their right to vote. Obviously, the European Union has already taken some action with the EU Digital Services Act, and there are ongoing debates about the online political ads regulation. We, on the side of civil society, would call for the phasing out of such tracking and micro-targeting, as well as restrictions on the use of spyware and a robust implementation of current EU legislation on data protection, to counter this particular threat.

Ladies and gentlemen, the United Nations Office of the High Commissioner for Human Rights has called on states to impose a global moratorium on the sale and transfer of surveillance technology until they put into place more robust regulations that guarantee its compliance with international human rights standards. With this, and with the objectives of this enquiry in mind, our organisation is in the midst of conducting research into the implementation of the EU’s dual use regulation. We hope that this research can inform how export controls could be tightened, and the broader question of what other measures might be necessary at European and international level. The regulation is essential to the integrity of the EU’s foreign policy and in the future could ensure that the recommendations of this enquiry reach far beyond the European borders. This enquiry has already heard allegations that Pegasus spyware was partly exported from EU countries. During the negotiations on the recast EU dual use regulation, and since, civil society and our partners have highlighted some of its shortcomings.

In other words, how is it possible, with the existence of a regulation, that we might have seen such exports? We know that the definition of cyber surveillance within the regulation was very narrow. There is a weakness in the so-called catch-all clauses, which can essentially be vetoed by member states. And there is also a real lack of precision regarding which human rights concerns should be taken into account, both when applying the catch-all control clauses and when assessing whether the export should occur. There are also insufficient transparency measures. I would like to briefly focus on one particular point of the dual use regulation, and that is due diligence. The United Nations Guiding Principles on Business and Human Rights require that companies take proactive steps to ensure they do not cause or contribute to human rights abuses in their global operations. In order to meet that responsibility, they must carry out due diligence to identify, prevent, mitigate and account for how they address those human rights impacts. Since the revision of the EU dual use regulation, EU law has actually advanced a lot in aligning with the standards of business and human rights due diligence, and the results of our research will hopefully provide a more in-depth picture. But we can already say that the EU regulation would need to be updated with stronger provisions on due diligence obligations, both for companies and for states, to make such export controls more effective.

Members of the enquiry, while regulation of surveillance technologies is obviously a substantial step, we must also ensure that Europe, as was spoken about in our first panel, protects the technologies that enable secure communication in the first place. Regulation of technology used to gain access to private communication can only be effective if the communications are in fact private. We are concerned that the detection requirements in the proposed regulation on child sexual abuse material online would compel private messaging services to make private messages available and to break end-to-end encryption. I would welcome any questions you have about the privacy impact of that proposed regulation and would like to bring it to the committee’s attention.

Esteemed colleagues, I am now closing. I hope to have demonstrated today that the right to communicate securely is a keystone in the arch of European democracy, and I would appeal to the enquiry to focus on the broader need for safeguards on communications and the rule of law, including checks on state surveillance, as well as on the particular aspect of surveillance technologies.

Our recommendations to this enquiry would focus on the need to strengthen oversight of state surveillance, including insisting upon international human rights standards with regard to checks and balances, and to ensure independent, impartial investigations where there is reason to believe that such transgressions have occurred; and to be critical about whether law enforcement itself is sufficiently independent and impartial to lead such investigations, given the possible implication of state actors. We invite the enquiry to call on the European Commission to broaden the scope of its annual Rule of Law report to include an analysis of reprisals against journalists and human rights defenders, and to include reports of unlawful surveillance against members of civil society.

We would also highlight the critical need for tools such as end-to-end encryption. Now to two concluding points. We would call for an increase in the accountability of corporate actors. This would mean revising and strengthening the EU’s provisions on corporate due diligence, and also considering what further regulation will be required to control domestic trading of spyware. In terms of immediate action, we have heard many times today just how dangerous the particular spyware at hand is. We would therefore call for a moratorium, in the EU and for exports out of the EU, on the sale and transfer of all surveillance technology until robust regulations are put in place that guarantee its use complies with international human rights law. Following the lead of the US and other jurisdictions, the EU should put NSO on its global sanctions list and take all appropriate action to prohibit the sale, transfer, export and use of these technologies. Members of the enquiry, we implore you to act urgently, as if democracy and human rights depend upon it, because they do.

Jeroen Lenaers (Chair): Thank you very much, Ms. McGowan. Apologies for distracting you there, but it is important, because we need to make sure that your message is also interpreted into all the other languages, and it is a very demanding task for the interpreters when people speak fast. I have a tendency to do that myself, so I know where it comes from. But it is really important that your message doesn’t get lost in translation because we speak too quickly. Thank you very much. I open the floor for the question and answer session with our colleagues. I would like to ask those colleagues who haven’t indicated yet but who also want to take the floor to do so. And we start with Sophie in ’t Veld, our rapporteur.

Sophie in ’t Veld (Renew): Thank you, Chair. And my thanks to our three interlocutors in this session. Maybe I can start by asking Ms. McGowan if you would share your speaking notes, because I have listened with great interest and agreement and approval, I have to say, but as for the details, it is always useful to be able to read it back.

My first question would be to you: would you consider that there are cases in the European Union, for example Poland, where the use of spyware has corrupted elections to such a point that they would actually be invalid?

And then a question to Senator Brejza. I would be interested to hear a little bit more about the smear campaign, because, if I understand correctly, it has been conducted on the basis of material which was retrieved by spyware. Now, the government denies, if I understand correctly, that they are involved. If it wasn’t the government, then it was a non-state actor and therefore a criminal act, and I would expect the authorities to be on top of it, which is not the case. So how do they then explain that the material that was retrieved, in either a criminal or a secret operation, ended up in the hands of public television? Can you say something about the methods that have been used in the smear campaign? Are there similarities with other smear campaigns, not only against other people but also in other places? I mean, we know that there have been wiretapping scandals in the past. Do you see a pattern? Are there similar channels? Is the message the same? Can you say something about that? Can you confirm that there are no charges against you that would justify such a surveillance operation? And has the government indeed not confirmed that there was judicial authorisation? In other words, there is absolutely no official explanation for you being spied upon. And then, finally, I hear that apparently there are still people who are being targeted. Are you still being targeted? Can you verify that? And if that is true, then we would have to conclude that Poland is still using Pegasus, or possibly another spyware, despite the fact that everybody assumes that they are no longer on the customer list of NSO. So that would be an interesting piece of information. Thank you.

Jeroen Lenaers (Chair): Thank you very much. We’ll take the questions in order. So first, Ms. McGowan.

Iverna McGowan (Director, Europe Office of Center for Democracy and Technology): Thank you, MEP in ’t Veld. So obviously our organisation has not had the resources or time to do a full investigation into any particular cases. But as I mentioned in my intervention, the very challenge with such intrusive spyware and such sophisticated hack and leak operations is that the timing is such that it can really impact the outcome of an election. But obviously you need a really thorough and independent investigation, which I know is proving very challenging to do, to then assess the extent to which it had that impact on the election. And that is why we are really calling, obviously, for the outlawing of this type of use of such surveillance technology in the first place, given the risks that it poses, and also for independent and impartial investigations. I mentioned the police in that regard, and I know that the committee has drafted a letter to Europol in particular, but it is quite complicated, because sometimes arms of law enforcement are asked to investigate in cases where the state might have been involved, which makes the independence of those actors challenging as well. So that is just something to be mindful of.

Jeroen Lenaers (Chair): Thank you, Senator Brejza.

Krzysztof Brejza (targeted with Pegasus while head of election campaign): Thank you very much. I have never committed a crime, let’s get this straight. I have never been charged with anything. There was never a request to lift my immunity. The prosecution has never summoned me. That is one thing. Now, when it comes to this hate incitement campaign:

There were 500 materials on public TV during the election campaign, broadcast from 5 a.m. to 1 a.m. They published materials that constituted slander. Together with my lawyer, we have initiated numerous proceedings and launched civil lawsuits. When it comes to the main issue, the use of materials retrieved with the help of Pegasus: in December we won in the proceedings concerning an interim order, an interim injunction. If I were to try to repair my reputation, I would have to pay around 5 million, because that is the financial equivalent of what they did on public television, or rather government television. In the middle of the election campaign, if I were to clear my name, I would have to spend 5 million. Where did they get those materials from? They said themselves on TV that this was retrieved from proceedings, court or prosecution proceedings, in Gdansk. This matters.

Back then, I was on the parliamentary committee of enquiry on Amber Gold. My task was to explain what went wrong in the Gdansk prosecution. So we had a special committee of enquiry on Amber Gold, a Polish scandal, and I was a member of it. At the same time, public TV said that the information it had against me came from the Gdansk prosecution. Paradoxically, at that very time, we in the Polish Sejm were analysing the mistakes committed by this prosecution office in Gdansk, the same institution.

How many people were targeted? We don’t know, because the government doesn’t want to speak out. Perhaps they are just withholding information, or perhaps they are lying, as simple as that. They don’t want to work together with you; as you very well know, they do not want to participate in a debate, they are avoiding confrontation. There are some things I cannot say publicly, because then I would be liable, I would have to face liability for disclosing information. So there are some things I cannot say, but I know. I know how it was done. The media reported that people would be summoned by the secret services and pressed to provide some dirt on me, to provide some compromising information. But that failed, because I am not guilty of anything. The secret services were used in an instrumental way to fight the opposition. Thank you.

Jeroen Lenaers (Chair): Thank you very much. Just to check the speakers’ list I have for the remainder of our Q&A: Arłukowicz, Kohut, Thun, Delbos-Corfield, Novak, Bilčík. Is there anybody else who would like to add their name? If not, then I close the speaking list and pass the floor to Mr. Arłukowicz.

Bartosz Adam Arłukowicz (European People’s Party): Thank you very much, Chairman. I will not hide the fact that Senator Brejza and I have known each other for many years. We cooperated during elections. During that election campaign, many of us thought it would be an equal fight, a fair electoral contest. However, we then found out that this was not the case, not at all.

This hearing is very important for the sake of the future of democracy in Poland, as well as for the sake of the future of democracy in Europe and the world. Because if it is true, and we assume it is true, what Mr. Brejza is saying, and this is corroborated by documents, then we can even venture to say that the results of the elections in Poland were rigged, they were manipulated, maybe even the results of the European elections. Maybe we should delete the word, strike out the word "maybe". If, in the middle of the election campaign, on Polish television, the government television, there are 500 materials on Brejza, not to mention other people, me, Mr. Tusk and other politicians, then it is quite self-evident that these results were manipulated in a way. Maybe this government would not be in power right now, the government which does not want to talk to you, to us. Maybe this is a government that was elected with the help of the special services.

I have a question to you, Senator. Do you know who is using all this material from your device? Maybe the Russian services, the Israeli ones? Maybe both. Who is using it? We have Article 130 of the criminal code in Poland, which says that those who transfer or forward materials from devices to foreign services may be convicted of espionage. And if we relate all this to the situation of Commissioner Reynders, then we have to realise that this is something affecting us, Europe. We have bombs, we have a war. And maybe at the same time Pegasus is installed on the devices of the most important people in Europe when decisions are taken on sanctions. This is the scale of the issue. This is not about wiretapping Brejza’s phone. This is about spying on political parties. So there was a manipulation of elections. So where are all those wiretaps, Mr. Brejza? Who is in possession of all those wiretaps? Thank you.

Jeroen Lenaers (Chair): Thank you very much, Mr. Arłukowicz. Then first to Senator Brejza, and then the other two panellists can of course reflect on the wider context of Mr. Arłukowicz’s question. Senator Brejza.

Krzysztof Brejza (targeted with Pegasus while head of election campaign): Thank you very much, Mr. Arłukowicz. Thank you, because you have drawn our attention to yet another aspect of this illegal use of Pegasus in Poland. An IT and operational tool should, under Polish law, be certified by the Polish civil counter-intelligence. That had not happened. You cannot certify a system to which the services of foreign states have access. And you rightly said that a crime was committed; you mentioned espionage. I would agree with you. Unfortunately, all this data from my device is somewhere out there.

However, I am not able to tell you whether it was the Israeli services, or whether all this information on the operations and activities of a Polish MP is in the hands of other services. This could affect you, MEPs. This is priceless information: your conversations, everything you do in plenary and during committee meetings, even when you talk to your constituents, all the data gathered by way of Pegasus. This behind-the-scenes material is a treasure trove for secret services, which could be used for other purposes in the future. God knows what purposes.

Jeroen Lenaers (Chair): Thank you. Professor Sartor.

Giovanni Sartor (Part-time professor at Faculty of Law at the University of Bologna and at the EUI): I think that what we are hearing raises fundamental issues of law and politics that go beyond the narrow domain of data protection law. But it appears to me that the aspects of European law relevant to this case, in particular data protection law such as the GDPR and the ePrivacy Directive, would contribute to and enable control over this kind of activity, also by the European institutions, in particular by the Court of Justice. It seems to me that if the standards of EU law regarding data protection, and the limitation of data protection for the purposes of national security, were applied, this kind of behaviour would certainly fail to comply with the conditions required for limiting this right. And so we would have a right to access the content and a right to block this kind of illegal activity. So I would not underestimate the significance that the application of data protection law could have in this domain. We have to overcome the idea that data protection does not apply to national security activities as defined by the state concerned, which is an issue that I think the Parliament could contribute to overcoming.

Iverna McGowan (Director, Europe Office of Center for Democracy and Technology): Thank you very much. I think indeed, when you consider the consequences and implications of surveilling political opponents and people involved in an election, the stakes are so high that, as we’ve said before, the type of very pervasive and invasive spyware such as Pegasus could just never be justified under a national security concern. We need to be very clear about that.

Also, something that strikes me is just how challenging the right to an effective remedy is. I absolutely concur with the professor that of course there would be standing for violations under EU privacy protections. But when you hear about implications such as potentially losing an election, or the damage to your personal reputation and your livelihood, and we’ve seen this with human rights defenders and civil society as well, we need to have a deeper conversation and reflection on what a really effective remedy and restoration would look like.

And a final point, which links a little to the ongoing conversations on the regulation of online political ads. We have pan-European elections and we have national elections, but there is not yet a standardisation of, for example, election commissions at member state level, or a pan-European approach to that. We are coming up against those questions now when implementing things like the Digital Services Act or the online political ads regulation: when issues of election integrity cross borders, what is the body that can independently assess that and make calls on it? And that is also something these conversations are obviously throwing up again.

Jeroen Lenaers (Chair): Thank you very much, Mr. Kohut.

Łukasz Kohut (Socialists and Democrats): Thank you, sir, for the floor. I have two questions to Mr. Brejza and one question to Ms. McGowan. The questions for Krzysztof Brejza are in Polish.

Dear Senator, we are not in the same political family, but we are in the opposition and we all stand with you, for the sake of democracy, for the sake of freedom. We want Poland to be back in the European family, to be one of the main players again. A few brief questions; some things need to be said again. Do you have any knowledge of how much money was spent to spy on you? You mentioned 20 million PLN. What was the source of this money?

Second question, after the visit to Poland: we think there is a kind of Bermuda Triangle involving the secret services, the prosecution and the public media. How many times did the manipulated materials from your phone end up in the public media? And what was the impact on your family and your life?

And the third question, I’m sorry: is there a risk that the next election will be undermined? Did you do any research regarding the parliamentary elections in 2015 in Poland? I’m talking about the use by Law and Justice and Solidarna Polska of the Facebook algorithm, and also the use of illegal records from civil society during the campaign. Thank you very much.

Jeroen Lenaers (Chair): Thank you, Mr. Kohut. Senator Brejza.

Krzysztof Brejza (targeted with Pegasus while head of election campaign): Thank you very much, Mr. Kohut, for your important questions. This data was disclosed by journalists of the Polish daily Rzeczpospolita. Where did the money come from? Apparently, Pegasus was purchased through an intermediary, a company linked to the former communist secret services. Apparently, they found a way to make money, to reinvent themselves. And this was done to do something which smacks of communism, of neo-Bolshevism. What the government is doing is repressing the opposition in the same way the former communist secret services did.

It so happens that my family has an anti-communist tradition. In the eighties, we had visits from the secret services. My father was detained, my uncle was detained. And I cannot tell you how sorry I am that the same methods, now with a technological edge, are being used: a state-of-the-art system intended to fight terrorism. I cannot tell you how sorry I am that my government, an authoritarian government, is using it today to fight the opposition, because the objective is the same as during communism. As for manipulating and doctoring my messages: that was done in the seventies against Adam Michnik, against the opposition. Letters were doctored, manipulated, falsified and sent as fakes.

Then we had the collapse of communism. We joined the EU. Everything was brilliant. We joined Western civilisation and such practices were no longer applied. But they are back. They returned seven years ago and now we have a reversal, a regression compared to what we had gained. I am very sorry about this, because my generation, my parents and grandparents, couldn’t wait for Poland to join the EU. We wanted Poland to be a part of Western civilisation. We always wanted to be there, but it had never been possible because of communism, because of the Soviet Union, because of the Iron Curtain.

Then the dream became reality. We got there. We were flourishing. And what happened then? A nationalistic party won the elections and started to dismantle the democratic institutions. The use of Pegasus is just the natural consequence of that. The safeguards had been dismantled, and Pegasus was used. We all talk about the division of powers. It might seem something abstract to you, because you experience it every day. But for us this division of powers is something very tangible, or rather its lack is tangible in Poland, because lawyers are being intercepted, activists are being intercepted with the use of Pegasus. Pegasus was used because the division of powers is being undermined in Poland.

As to the next elections and whether they will be undermined or not: I am convinced that the government is going to try. The Bermuda Triangle you mentioned is a very telling metaphor here. This is very dangerous, because we have this alliance of the government media, while the independent media are being destroyed. So we have the government media and the public prosecution subordinated to Solidarna Polska, which is a member of the government. The prosecutors are scared. There is a chilling effect. The prosecutors on the ground have their hands tied, because if they do something which is not to the liking of their superiors, they are moved, they are dismissed, they are harassed. So you have this triangle, this alliance: the media, the prosecution, the government, the secret services. Thank you.

Jeroen Lenaers (Chair): Thank you, Ms. McGowan.

Iverna McGowan (Director, Europe Office of Center for Democracy and Technology): Thank you. We have not conducted research on that particular case. However, what I can say is that had we wanted to at that time, it would have been extraordinarily difficult. And I think what is an exciting opportunity, but one that will also need to be carefully considered in line with privacy and free expression implications, is the opportunity that the EU Digital Services Act is presenting with its provisions on researcher access to data. I say it would have been very difficult because of the problem of actually getting access to that kind of information, in particular how different algorithms are working and whether or not there is a correlation with a specific data breach. And just to say, as I did in my intervention: in the analogue days, political campaigns were fought door to door. You would collect information about constituents by knocking on each door and asking what they were interested in. So the fact that this information can now be collected on such a mass scale online obviously poses a certain challenge. At the same time, any regulation of online speech, particularly political speech, would need to be approached very, very carefully; you shouldn’t regulate speech online beyond what we’ve seen in the Digital Services Act. So, no, such an investigation would not, I think, even have been possible then. But going forward, with the right checks and balances, it would perhaps be possible to have such an investigation.

Giovanni Sartor (Part-time professor at Faculty of Law at the University of Bologna and at the EUI): I concur with what was just said by Ms. McGowan: the Digital Services Act and the possibility to access information are indeed going to give us an additional opportunity to exercise some control. Obviously, there is always the national security limitation that has to be addressed, and I think this is a key resource for this kind of enquiry.

Jeroen Lenaers (Chair): Thank you, Róża Thun.

Róża Thun und Hohenstein (Renew): Thank you, Chair. Thank you for everything that we have heard until now. What seems very strange to me, if I understood Ms. McGowan correctly, is that such deep surveillance would never be justified by a European court. We heard the same thing from the prosecutor when we were on our mission in Poland: that such deep surveillance is not within the Polish legal framework and would never be authorised by a Polish court. And still the authorities keep saying that they had those authorisations. But we have known for a longer time already that a system like Pegasus is being used. I really wonder why, until now, nobody has gone to court against the governments that used it. And that, I must admit, convinces me that we were really right when we, Renew Europe and Polska 2050, demanded this enquiry committee, because it really is bringing issues to light, and I hope that the clarification, the court cases and so on will have very broad support, not only from us in the European Parliament.

And with this I have a question, I don’t know to whom exactly: the fact that in Russia and in the EU, in Poland, this hate campaign was used in the same way, with the same practices, is this just a coincidence, or do you think it was something more systematic? Mr. Brejza has not answered the earlier question, namely whether he is still targeted, or whether he thinks he is still targeted; I would like to know. Do you know where the data they collected about you, about your family, friends and so on is today? Have you tried to get information about it? Because except for Polish public television, which uses your data, we don’t know who collected your data. You said that the database was stolen during the electoral campaign. Do you know what happened with it? Did they use it, and how did they use it? And the last thing: you said, and we all assume, that the elections were so strongly influenced by the fact that you were attacked all the time that the result is not objective, not honest, not real. But I think it would be very interesting if you could give some examples of how you noticed that this was being used by the parties competing against your party. Thank you very much.

Jeroen Lenaers (Chair): Thank you very much, Ms Thun. Let’s start maybe with Ms McGowan on the justification, authorisation and lack of court cases. Then we move to the second question, which you can all reflect on, and then Mr. Brejza can answer the specific questions addressed to him.

Iverna McGowan (Director, Europe Office of Center for Democracy and Technology): The most obvious first response is that it is very difficult to take a court case if you don’t know you have been a victim. And that echoes what we heard in the first panel this morning: as long as national security is used as an overly broad excuse, these kinds of surveillance are pushed totally outside of any legal framework. We also obviously have instances where it might not be directly a state actor but other actors, and that poses a separate set of problems. Another point that is important in terms of ultimately getting accountability, or getting some of these questions before the courts: the European Court of Human Rights has said in its jurisprudence that any subject of state-led surveillance should be notified that they were under surveillance at such a time that it would no longer impinge on the ongoing enquiry. Now, that is tricky, because it still leaves it in the hands of the state to decide when to notify. But I think notification, and frameworks around notification, are important so that people know and could therefore actually challenge it. It is also important to recall that under the GDPR you have a right to judicial review. In some member states, as we know, they can give more or less information regarding whether or not you are subject to surveillance, but those provisions in the GDPR, that you do have that right to judicial review, are really important. But the starting point, as this enquiry is very importantly pointing to, is that you may not know that you were subject to such surveillance at all. In the cases we do know about, it came to light because of the work of organisations like Amnesty International and their collaborators, who were independently looking at it. I think the fact that it was brought to our attention in that manner speaks to the lack of independent oversight in Europe. And then you also asked whether it is a coincidence. There have obviously been several different hack and leak operations. In the one in the United States that I mentioned, the United States government called it out as having Russian involvement; in other places that has not happened. And as I said at the start, leaking unkind or damaging information about your political opponents is nothing new; it happens a lot in elections. What is different and troubling here is that the information is unlawfully obtained, and it is often then combined with a doctoring of that information or a sustained smear campaign. We need to be really clear in that distinction. So it is a common tactic, but I think the response to it comes back to how we better regulate and control such spyware and data and privacy breaches in the first place.

Giovanni Sartor (Part-time professor at Faculty of Law at the University of Bologna and at the EUI): I can follow on from this. It seems to me that the reason why there have been so few or no legal cases concerning Pegasus, or this kind of surveillance more generally, is on the one hand the difficulty of even becoming aware that this surveillance is taking place and of collecting adequate evidence. There are also problems in getting adequate responses from the judiciary in certain national systems. And at the international level, I think that this kind of activity would, as was also said this morning, violate the international human rights regime, both at the United Nations level and under the European Convention on Human Rights. But as we know, the kinds of measures that can be adopted at those levels are limited, and more effective measures could be adopted by the Court of Justice of the European Union.

But there we have the difficult issue of the interaction between the Union and the member states, the member states being interested in not being controlled in their activities that pertain to national security, and in having a broad understanding of that notion. France, I think, was recently involved in this kind of issue, concerning in particular European data protection law such as the GDPR, the ePrivacy Directive and the future ePrivacy Regulation. The problem is determining to what extent these instruments also cover activities concerning national security, and I think an approach should be taken, also by promoting appropriate cases before the Court of Justice, to assess the extent to which certain activities may be considered as pertaining to national security. And in some cases, like the ones we heard about today, it would not be difficult, I think, to determine that they fall outside any reasonable concept of national security.
And then there is the issue of whether limitations in the domain of national security can be assessed by the Court of Justice against proportionality. I would be in favour, but I think that different opinions will persist in this regard, and the current discussion concerning the ePrivacy Regulation, in the interaction between the Council and the Parliament, is, I think, a key point where this debate is now taking place. I hope that the Parliament succeeds in making its perspective prevail in the end.

Jeroen Lenaers (Chair): Thank you, Professor. And then there were a number of questions for Senator Brejza, so please.

Krzysztof Brejza (targeted with Pegasus while head of election campaign): Yes. The question by Ms Thun, whether I am still targeted: I don’t know, because we don’t have an institution or a mechanism in our legal system which would allow a citizen, an opposition politician, a journalist or a prosecutor who has been targeted to verify, or even to ask, whether they have been the target of spyware or surveillance. And the institutions which could do it were taken over by the government. In our civil cases, we act against the people who forwarded and propagated all this material, also against the public television. In the civil cases, we are trying to submit evidence and a request to secure these materials, in order to verify what the justification for the operational activities was, if there was any at all. And this is a problem, because these materials could have been destroyed or deleted. The secret services may be destroying them right now, as we speak. Journalists and politicians may be surveilled, and after some time the secret services will simply destroy the materials, leaving no trace.

And now please take a look at the paradoxical situation in Poland, where one person is Minister of Justice, a politician, the head of a political party, and this same person is also an MP and at the same time Prosecutor General. So the Prosecutor General accepts and approves requests for operational surveillance, including of opposition politicians. I reported the manipulation of my device by way of Pegasus. We travelled hundreds of kilometres to a distant prosecutor’s office. The prosecutor does not want to do anything with it. They behave like robots. So please tell me, how can a prosecutor’s office verify requests which had been approved by the Minister of Justice, the head of a political party which is part of the coalition in power? This is the system, an authoritarian system, and it is really hard to get to the bottom of it. But hopefully, following the elections next year, the truth will see the light of day. Thank you.

Gwendoline Delbos-Corfield (Greens): Yes, hello. My first question would be about the relatives and all of those who were spied on at the same time. None of you has addressed this. What would be their rights? What would be their remedy? Because, in fact, when so many people have been surveilled in one country, a lot of others are also concerned. It is a whole community. It could be people in other countries. What could these people do? If someone in another member state was spied on in connection with Mr. Brejza, what would be their recourse?

My second question is about the impact, more broadly, on democracy and whether there is a legal remedy. Because the story in Poland, 95% of it is so illegal and outrageous that in most member states it would be dealt with in one way or another, and probably dealt with very well, because it is unlawful all the way through. But here we are not in a member state where independence of justice really exists, where media pluralism really exists and where democracy is functioning in the right way; to me, this would already be before the tribunals and dealt with. My question is: in those other member states where independence of justice does exist and media pluralism does exist, and where, for a relatively good reason, there could be some surveillance of people involved in elections because there is considered to be a constitutional reason or something like that, how would you, as analysts, analyse the effects on democracy, without all the awful story of using this data to stir up hate against someone and misusing the data and all of this? And where do we have the legal solutions for democracy? I noted your interesting point about including surveillance in the rule of law report of the Commission; we will keep that, we had something about it yesterday. But we can come back to this.

And then my third question: Poland is not alone; there were victims in Hungary specifically, but also in a few other member states, Greece, Spain. What would be possible from a collective point of view? Are you in contact with each other? And then again, before the European Court of Justice, is a collective action possible here, given the failure of the European Union to resolve these problems?

Jeroen Lenaers (Chair): Thank you very much. Maybe, Professor Sartor, you want to start on the first two questions in any case?

Giovanni Sartor (Part-time professor at Faculty of Law at the University of Bologna and at the EUI): Okay, thank you very much. Concerning the possibility of raising a case in another member state, in case citizens of that member state were involved or the activity took place there: I think that in principle this would be possible. Maybe we have to distinguish whether the surveillance can be directly attributed to a state, that is, to another member state, or to a private entity tasked, possibly by a member state, with engaging in this kind of activity. But I think that initiatives could be taken both before the data protection authorities and before the judiciary.

And the idea of a collective action before the Court of Justice could also be an important initiative. Again, we have to see how this issue of national security can be taken into consideration. There was a recent case by the Court of Justice, a French case if I remember well, where the Court considered that limitations to data protection, in that case concerning data retention, were subject to proportionality review even when adopted for the purpose of national security. In particular, the Court of Justice considered that an unlimited collection and retention of data, for an unlimited time, would be incompatible with the ePrivacy Directive and also with the European Charter of Fundamental Rights. So the Court of Justice, I think, would be sensitive to this kind of initiative, and possibly this might be a way to obtain a significant result at the level of EU law. Thank you very much for this question; I think both issues need to be further investigated in order to come to convincing outcomes.

Jeroen Lenaers (Chair): Thank you. Ms McGowan.

Iverna McGowan (Director, Europe Office of Center for Democracy and Technology): Thank you. On the not so easy question of when it is acceptable to surveil political candidates: this again repeats some of the points we made earlier. There would have to be an extraordinary reason, and there is a need for the arms of the state, particularly those in government, to be impartial in the context of an election. So in any national system, that would mean that you would need an independent and extremely narrow assessment. And as we’ve said several times, that assessment would never conclude that spyware such as Pegasus could be used, because it is just so invasive and intrusive. It really comes back to how you would bring surveillance measures fully in line with rule of law safeguards and have robust and independent institutions within the state before you could permit that. But it would indeed be a particularly tricky situation in which to conduct surveillance.

And then on the point of collective action: I think it is a very interesting one, and maybe my colleagues have a better idea, but putting my EU lawyer cap on, I think it would be difficult, because we don’t have clear collective action options in our European system. So the point of collective action is an interesting one, but not one I can immediately see a route for. Obviously, you could also take things to international fora; there are lots of different avenues for complaint there, but those fora carry more political implications and pressure rather than direct legal effect.

Jeroen Lenaers (Chair): Thank you. Senator Brejza, would you like to respond?

Krzysztof Brejza (targeted with Pegasus while head of election campaign): The rights of many people have been infringed upon. Let me tell you that back then I was involved in a number of civil proceedings. Lawyer-client privilege was violated. The confidentiality of journalists was also violated. Professor Shen and Professor Dahl, the biggest Polish authorities when it comes to law and the Constitution, have stated their views clearly on this. They said clearly that that tool should not have been used during the election campaign. If, back then in 2019, it had become public knowledge that cyber weapons were used against the opposition, perhaps we would have had the opportunity to take it to the Supreme Court and question the validity of the elections. And then the Supreme Court, according to the lawyers, would have had no option but to say that the elections were not valid. It was theoretically possible. In 2019, when the materials from my phone were published on public TV, the Gdansk prosecution was informed but did not follow up. It would have been possible. Why didn’t they do anything? Why didn’t the prosecution office try to establish the source of the materials and the software used? Well, in order to answer this question, we need to go back to the fact that the separation of powers had already been undermined, if not destroyed, in Poland, and the prosecution office had already been subordinated to the government. So they didn’t want to act against the interests of the government. What are we doing now? We are using all the legal instruments at our disposal. The administration seems to be helpless; they are escaping from this topic. The fact that the government did not want to meet you in Warsaw is very telling. So we are submitting civil lawsuits and we are gathering information for a future committee of enquiry in the Polish Parliament. Thank you.

Jeroen Lenaers (Chair): Thank you very much, Ms. Novak.

Ljudmila Novak (Group of the European People’s party (European People’s Party)): Thank you very much for the floor. Good afternoon. It is true that Polish is similar to Slovenian, but it is better for you to put your headphones on, Senator Brejza. I have a question for you. I would like to know to what extent you believe that in Poland, in the courts and in the media, there are still some remnants of the past regime present. Because I often hear that the Polish government says it is fighting the remnants of the past regime; therefore it had to annul certain judgements, it had to transfer certain judges. A similar discourse can be heard in Slovenia: when something is not to the liking of somebody, then the previous regime is to be blamed. You are too young to have been part of the former regime. I am old enough to have experienced it in Slovenia; luckily, it was not as bad there as in other communist countries. And the false propaganda is perhaps also very strong, as you say, because I, too, in Slovenia, frequently hear within church and Christian circles, because I am a Christian myself, that in Poland there are such strong remnants of communism still present that the current government has to undertake all of these measures just to do away with these remnants of the past. I know that there are many people who suffered because of communism, and these people are easily convinced that there are still some elements of the past regime present and that such cleansing is needed and necessary. So I would like to ask you: to what extent do you believe this is still present and true, or do you believe these are just false propaganda and excuses?

Jeroen Lenaers (Chair): Thank you, Senator Brejza.

Krzysztof Brejza (targeted with Pegasus while head of election campaign): Thank you very much for this question. I don’t think it is a true narrative. It is full of lies. It is what the government is trying to invoke as an excuse, but in fact they are bringing communism back. They are doing exactly the opposite. In front of this splendid building, we have the avenue of Solidarność, and I am very proud that this name, Solidarność, has been used. But what was Solidarność in Poland? It was a movement fighting for a free Poland. If you look at its demands, you will see that they did not want the government to dominate the courts. In the eighties, a Solidarność trade union was created amongst the workers of the justice system. They wanted autonomy for the judges. They wanted independence for the judges. They did not want the influence and the political instructions of the Communist Party. This was what Solidarność wanted. And in the nineties the independent Council of the Judiciary was established. Then we had the separation of powers, and the government could no longer influence the courts. The eighties were gone. It was no longer possible for a political party to issue instructions to the judges on how to rule. But what is happening now is going back to the times of communism. We had enjoyed 25 years of an independent judiciary in line with the rule of law and the separation of powers, which allowed us to join the EU: the Copenhagen criteria, the separation of powers, the rule of law. But PiS, starting from 2015, has been bringing communism back into the judicial system. The judges are being nominated in a politicised way. The judges are afraid to give independent rulings because their careers are at stake. We are getting a lot of information on how attempts are being made to undermine the status of the judges. Kaczyński, the president of the ruling party, said he wants to flatten the structure of the judiciary, allowing the political party to choose the judges they trust. The loyal judges, those who are afraid to be independent, will be promoted; those who are independent will be downgraded. This is already happening. Even with respect to our proceedings, some very bizarre things are happening. There is that chilling effect. And let me also tell you that there is a system which should in theory ensure the random allocation of judges by drawing lots, but even this is manipulated; even the machine that is supposed to make those random choices is being manipulated. So if someone tells you that this government is destroying the relics of communism, please don’t be taken in. Those are lies. In fact, they are bringing the system back. They are bringing communism back into the judiciary, but not only there. It is also about the structure of government. They want to centralise power. They want to limit the prerogatives of the regions. Thank you.

Jeroen Lenaers (Chair): Thank you very much. I will not tempt the other experts to delve into the topic of the relics of communism. We do have some time, so I would like to give the opportunity to our three speakers to make some final concluding remarks. We’ll start in reverse order, so Ms McGowan, for you the honour, if you have anything to add or any important points to make for this committee.

Iverna McGowan (Director, Europe Office of Center for Democracy and Technology): Again, just to say thank you for the opportunity to address the committee, and to underline just how important the work of this committee is, given the lack of other remedies and ongoing investigations. And a final appeal, really, to pay heed to some of the calls that we are making together with our civil society colleagues: for urgently needed moratoriums and for a review of the regulation, and to think more carefully about how we cabin surveillance and have more rule of law checks and balances across the European Union and beyond. Thank you.

Jeroen Lenaers (Chair): Thank you very much, Professor Sartor.

Giovanni Sartor (Part-time professor at Faculty of Law at the University of Bologna and at the EUI): I would also like to thank the committee very much for having me here and giving me the opportunity to participate in this very important discussion. I believe that we are really at a crucial juncture in the evolution of the rule of law and democracy in the European Union, and it is important that we understand that the governance of the digital infrastructure is a key issue. Democracy can be diminished or destroyed if there is no opportunity to enjoy freedom and privacy when using those tools that have become a part of our lives and that can be interfered with, at the level of surveillance but also at the level of manipulation of online information, censorship and so on. I believe that this committee has a crucial role to play in this evolution. In particular, I think an important issue to be developed is ensuring that surveillance in the European Union is subject to legal control at the European level. That is, I think, one of the key issues to be addressed for the effective implementation of democracy and the rule of law throughout the European Union.

Jeroen Lenaers (Chair): Thank you very much, Professor. Senator Brejza.

Krzysztof Brejza (targeted with Pegasus while head of election campaign): I would like to thank you all. I think that changes to the functioning of the European Union and of European law, all your recommendations, are very important and very necessary, so that the same situation cannot be reproduced in other member states. What we are going through in Poland is evidence of the fragility of our democracy, of the fragility of all the values that we believe in. We believe in democracy, and we have to fight for democracy. We have to cherish democracy. We have to cherish these values. We cannot take them for granted. This is not something that is only written in textbooks and treaties. This is something we have to cherish and fight for each and every day. Thank you very much for your invitation.

Jeroen Lenaers (Chair): Thank you. Thank you very much to all three speakers. I think it was an excellent combination: the very impressive account of the personal impact on your life and the pressure you faced, together with the wider context, set out in the excellent contributions of our two experts, in which spyware like Pegasus has an impact on democracy and electoral processes. So thank you all very much.

Just a couple of announcements from my side. At yesterday’s coordinators meeting, the PEGA coordinators decided to request that the mandate of our committee be extended by three months, so please take note of that. Our next meeting is on Monday, the 14th of November, from 3:00 onwards. And of course we’ll meet many of you next week when the committee travels to Cyprus and Greece for its mission there. Thank you all very much. Special thanks to the speakers once again for making the time available to meet with us, and thank you to all colleagues and staff and the interpreters for making this a very smooth hearing of our committee. Thank you so much and have a nice day.
