Interview

Josep M. Ganyet

COMPUTER ENGINEER

“I trust a lot in European values, I trust little in Spain”

“Algorithms have learned it’s much more efficient to generate clicks by giving you content that polarises you to one side or the other”

With Watergate, Carl Bernstein and Bob Woodward of The Washington Post taught us that when political power uses technology to spy on political rivals, “democracy dies in darkness”. With Catalangate, the Citizen Lab researchers have taught us that in today’s world “democracy dies in the cloud”. And that is precisely the title of the book written by Josep Maria Ganyet, a computer engineer, businessman and populariser from Osona who was one of the victims of espionage in the Catalangate affair. The book introduces us to the surveillance capitalism practised by big tech companies, shows how security agencies have taken advantage of it to create a permanent record of citizens’ activity, and examines the consequences this has for democracy. The book (La Magrana, 2023) also warns of the dangers of espionage through new technologies and of the need for regulation to protect citizens’ individual rights.

Did you write the book, or ChatGPT?
[Laughs] No, I wrote it myself, but it’s true that when I was finishing it, around November, ChatGPT appeared in the media and, I won’t deny it, I tried it. As an experiment, I fed it the subject of the last chapters and the conclusions I hoped to draw from them. The result was that the texts were very flat, without any really deep reflection. They lacked ethics, there was no irony, and none of the double meaning I like to use. Not interesting at all. Also, editing, revising and fixing it took much longer than if I’d written it myself. It’s true the texts are above average, but they lack soul.
And who is to say that I, your interviewer, am not an AI (artificial intelligence)?
Well, why not? You could be. I can’t know. We don’t know each other and we’re doing the interview over the phone. There are platforms that synthesise the voice, and someone could have recorded their voice with a ChatGPT behind it asking the questions. In the technological times we are living in, I don’t believe I could detect whether I’m talking to a person or not.
We always talk about the cloud... but what is it?
The metaphor of the cloud is a very easy way to understand it: it’s open, ethereal, above our heads; it’s always available, it saves our data and makes it accessible from anywhere in the world. All of this is very good, but in reality the cloud comprises concrete buildings, large data centres of reinforced concrete, steel and glass, which consume a lot of energy, sit here on the ground and weigh a lot; and, unlike clouds, which don’t belong to anyone, they do have an owner. Summed up in one sentence: the cloud is someone else’s computer. When you keep things there, they cease to be one hundred percent your property; you waive a number of rights. We’re making a copy of them on someone else’s computer, be it Google’s, Apple’s, Amazon’s, Netflix’s... And that data will no longer be just yours. You have shared custody of it, and you have to trust that the someone else who owns the computer makes good use of it. The data generated about that data, the metadata, isn’t yours either.
Would it be equivalent to what Orwell called Big Brother?
Not necessarily. Honestly, we hand over the data under certain conditions. Making these computers work has very large costs, storing this data involves a lot of economic effort, and in one way or another these services have to be paid for. Think what the services of Google Maps, YouTube, Netflix or Amazon must cost... Some are free, but our data, the records of the activities we generate, the metadata, is traded, sold, packaged and classified. It has a lot of value, and brands pay a lot of money to know what we are like, even if in an anonymised way. It’s worth asking what we hand over and what they give us in return.
So we can’t talk about companies ruling from the shadows.
Exactly, it’s a tacit agreement: we don’t read the conditions, nor could we understand them. I give up my data in exchange for an excellent service. I’m thinking of Google Maps or Google Earth... But, going back to Big Brother, we can refer to it when it comes to monopolistic policies or the manipulation of public opinion through this great knowledge they have of us. Then we could talk about a kind of Big Brother. In the end, we’re all watching over each other; we’re a family of Big Brothers. In the book, I devote a chapter to the power we have on social media and what happens if we misuse it.
A handful of data that, in the hands of others, allows them to influence how we think because they know what we think?
There are two ways to do it, which I explain in the book. One is when Google realises it can predict the future, or where we will click next; through our searches it already senses our tastes: “These people are sure to buy a car before they buy sports shoes.” The other is when Facebook realises that not only can it predict the future, it can also influence it. In 2014, an experiment was carried out with 6,400 users, divided into two groups. One was shown positive news on their Facebook wall, the other negative. Those shown negative news posted comments like “It’s Monday, back to f****** work.” And the others, “It’s Monday, great, I get to see my coworkers again!”
We’re being manipulated, then.
It’s the contagion effect in groups and societies, which occurs so readily on social networks. That’s how you can understand what’s going on in the world: how a character like Trump can become president, the rise of extremism, phenomena like flat-eartherism. Understanding this, they serve us the content that keeps us glued to the network longer. Note that I say “the content that makes us spend more time on the network”, not the content we like most. The logical thing would be that if people like kittens, you offer them kittens. Well, that’s not what happens! Social media algorithms have learned that it’s much more efficient to generate clicks by giving you content that polarises you to one side or the other. Because when you’re very biased towards a political party, for or against, or you’re very progressive or very conservative, you’re more likely to click on what’s closest to you, but also on the opposing pole, whether to criticise it or troll it. Therefore, the more extreme the visions promoted, the more money is made, and social networks have ripped the social fabric with our collaboration.
Who is it that encourages us to polarise?
There are three major power groups. Silicon Valley is the one that sets the pace, to the point that if you want to campaign against breast cancer, you can’t, because it doesn’t suit them. Then there is the great power that states have and use to spy on us, attacking or even exterminating dissent, as is the case with China or Spain. And, thirdly, we ourselves also have great power. We can destroy democracy with the cloud, as we said, or, if we manage it well, it can become a fantastic, open tool for participation. The paradox is that these tools, the most participatory and open we have ever had, are capable of destroying democracy.
Another dilemma is security versus freedom. You’ve suffered because of this, haven’t you?
Yes, but Catalangate is only the tip of the iceberg. As a privileged witness on the front line, I can say that this can happen to anyone in any corner of the world, subject to the big tech companies’ changing interests and to this imbalance of power at the whim of states. I suspected there had been attempts to get into my mobile, though I didn’t know whether they had succeeded or not; when Citizen Lab analysed it, they confirmed my suspicions.
Is it lawful for states, corporations and law enforcement agencies to use this data against our freedom?
I’m happy to offer it to them, just as I donate blood for the common good. But they can’t tell us they use our data to serve us better and then actually use it solely to benefit themselves. It’s not the same for Facebook to use facial recognition to establish ties between people we know as it is to offer that data for military use.
From the Rockefellers to Musk

In the 19th-century US, a handful of men accumulated power, led by Stanford, Carnegie, Vanderbilt and Rockefeller. They did as they pleased: if they wanted a train to pass through a place, it did, taking advantage of the fact that, for the Indians, those lands belonged to nobody. Although they cleaned up their image by acting as philanthropists and patrons, they put American democracy at risk. In his book, Democracy Dies in the Cloud, Josep M. Ganyet draws a parallel with Musk, Bezos, Gates and Zuckerberg, who have made data their own in the common space of the Internet, which at first had no owners.
