Since 08/2022 · 6 episodes

Netzfest EPISODE 03: Democracy and Civil Society with Prof. Dr. Jeanette Hofmann

The OXFAM podcast for a fair digitalisation

2023-01-16 31 min Season 1 Episode 3 Oxfam Deutschland | Katrin Steglich and Landeszentrale f. pol. Bildung BW

Description & Show Notes

In this episode, we explore the interrelationship between digitalisation and democracy.

Transcript

KI Linda
00:00:00
Digital is now everywhere. The promises that it would make everything better have not been confirmed for everyone. In this podcast series, we explore the question of how digitalisation can contribute to a more livable world, e.g. by reducing inequality and strengthening equality, sustainability and civil society.
Kay
00:00:29
Welcome to the Oxfam Podcast, which is dedicated to the topic of "fair digitalisation" and is produced in cooperation with the State Agency for Civic Education Baden-Württemberg! My name is Kay, and today I'm talking to Jeanette Hofmann. She is a professor of Internet Policy at the Free University of Berlin and heads various research groups, among others at the Social Science Research Center Berlin and the Weizenbaum Institute. Her research areas include the digital society, internet governance, and quantification and social regulation. Welcome, Professor Dr. Hofmann, and thank you for taking the time for this interview! According to a 2020 estimate by the International Telecommunication Union, about half of the world's population had access to the internet. It could be described as an additional infrastructure in which we move around to get everyday things done, only without the spatial distances that exist in the analogue world. As a next stage of expansion, the Meta Group is planning the complete transfer of our lives into the digital à la Second Life. Although the internet may no longer be the Wild West, and the first laws to regulate rights and obligations in the digital space exist in the EU, many questions about the regulation of this relatively new space remain open. For example, there are good arguments for anonymity on the net - to protect people who are being persecuted, or whistleblowers, and to guard against cyberbullying or doxing, that is, publishing personal data on the net against a person's will. But there are also good arguments for a digital identity, to be able to detect crimes on the net or to enable legal transactions online. In the analogue world, we have both options: we can move around anonymously in public spaces, but in certain cases we need to identify ourselves. On the net, there is virtually no public infrastructure.
We move in the infrastructures of tech companies that can collect data about us undisturbed and log what we do. We can't escape this if we want to be part of the online world. What are the trends in internet governance and internet policy?
Jeanette Hofmann
00:01:13
It’s a pleasure! In fact, in the analogue, we have a mixture of public and private infrastructures. In general, I would say that since the 1980s, more and more public infrastructures have been privatised. In this respect, this juxtaposition of "in the digital everything is commercial and in the analogue we have the public infrastructures that protect us" is not correct. In the analogue space, too, one will have to search a long time for infrastructures that deserve the label "public". But as far as regulation is concerned, we increasingly have laws that place greater obligations on the private sector and make it more accountable, to an extent where one sometimes has to fear that our fundamental rights are also being undermined. In the desire to tame the digital space, we see that our fundamental rights - and that of course includes the right to privacy and data protection - are often not taken seriously. So strong civil society commitment is needed to protect our fundamental rights.
Kay
00:04:14
In relation to increasing digitalisation, I wondered how this affects democracy and therefore civil society, and also our expectations of democratic processes.
Jeanette Hofmann
00:04:26
In the public discussion, this image - that the digital is infiltrating our poor democracy, which sits there and watches as digitalisation, if you will, steals its thunder - is fairly dominant. That, I think, is a completely false picture. What we observe instead is that digitalisation is clearly progressing, but democracy is also constantly changing, with and to some extent independently of digitalisation. We can observe, for example, that the forms of participation in democracy have been changing quite lastingly for several decades. And in this process of change in democracy, digitalisation plays a major role as an enabler. It allows young movements to try things out, to develop new forms of organisation, but also to articulate political interests or political resistance. Here, above all, the transformation of the digital public sphere plays a major role. With many-to-many communication, individuals are enabled for the first time to speak for themselves and not just be recipients of the old broadcast media. For a long time, we have been predominantly readers, viewers and voters in democracy; with the digital, we organise ourselves in a very practical way, and to some extent outside the institutionalised forms of German democracy. That's where the political parties play a big role. They are mentioned in our constitution and enjoy privileges: the parties receive money from us taxpayers, depending on how many votes they have won, and the party-affiliated foundations get a lot of money from the state coffers. But in addition to these parties and the traditional forms of opinion-forming organised via parties, a variety of initiatives have developed that benefit greatly from the digital.
Kay
00:06:50
Thanks to journalists, we have been used to a certain quality of reporting, which does not necessarily hold for individuals disseminating information that may not be well researched. Does that have an effect on the overall picture?
Jeanette Hofmann
00:07:07
Of course it has an effect - it has pluralised the sources of information enormously. Some people also speak of a disintegration, a fragmentation of the public sphere. When everybody looks at their own smartphone, you can assume that everybody sees something different. And thus the things that are enormously important for the formation of political will - the common political points of reference - become rarer. This is not always the case: during the pandemic, for example, we were all talking about the same thing, and now with the Ukraine war there are events to which many different sources of information refer. So we see that the change in the media does not necessarily lead to a disintegration of the public sphere. But you are absolutely right. With many-to-many communication, especially on the platforms, everyone can express themselves publicly. That means the old quality standards, and the obligation of reporting to inform the people - and thus its democratic contribution to the formation of political will - are no longer in the foreground; the old mass media share attention with the many other sources that exist today. I would say that on the one hand this is a problem, but on the other hand one should not forget that until the 1980s and 1990s our mass media were also often criticised for their agenda-setting power - the power of the media over what is communicated, and the selectivity associated with it: certain parts of the world, for example, of which we never heard anything. The old media have also often been accused of being close to established politics. In this respect, it is not the case that we have fallen from a rosy existence into chaos; rather, we see a change in the problems that we have to deal with.
Kay
00:09:05
Now this is not your special field, but democracy also includes gender justice. Studies show that women are more affected by hate speech and digital violence than others. This tends to silence them. Do you have any ideas on how to deal with this, how to prevent it?
Jeanette Hofmann
00:09:28
In general, the digital has not made things better - contrary to what earlier optimistic voices imagined. On the contrary, inequalities of various kinds have rather increased. This is certainly true for user skills, also among young people, between the well educated and the poorly educated. Here we see clear amplification effects: children from educated homes know how to use the digital for themselves, while children from poorly educated households don't have much confidence in themselves and tend to take on a passive consumer role in the digital world. And then, of course, we also see considerable gender inequality, especially in the area of digital policy. However, when I look at how civil society organisations are set up today, what is emerging and who is in leadership positions, something is already happening. A new generation of young women, let's say between their late 20s and early 40s, is establishing itself, and they know very well how to use the digital for their own benefit. So we have to be careful not to create a one-sided image. But as far as public communication and controversies on the internet are concerned, women are certainly treated more harshly than men. One suggestion that has now been made with regard to women is that we need a kind of victim protection in the digital world: that the platforms must not only ensure that digital violence - verbal violence - does not spread, but must also institutionalise contact points to which victims of verbal violence in the digital world can turn.
Kay
00:11:10
Would this entrust private corporations with tasks that actually belong to the public sector?
Jeanette Hofmann
00:11:25
We have been doing this all along through the Network Enforcement Act, but also through the Digital Services Act, which the European Commission is currently passing. You can see that the platforms are being given many tasks that used to be the responsibility of the courts. Just think of the fact that certain messages have to be deleted if they are of clearly criminal intent, or that certain names have to be passed on to the police if they are connected to clearly criminal acts or calls for such acts. Passing them on is actually a sovereign task that we are now transferring to the private sector. This is also due to the fact that the large amount of information uploaded every day is now sorted by algorithmic filters rather than by people. And these filter systems are all in private hands.
Kay
00:12:20
And how do you assess this situation?
Jeanette Hofmann
00:12:24
I find this very problematic. From the point of view of freedom of information, the current legal situation often leads to overblocking. That means messages are filtered that are perfectly okay but happen to contain the wrong keywords - for example, if you quote something, or if you upload, for demonstration purposes, a picture that shows what could be problematic content. This is often blocked, and so far the possibilities to defend oneself against it are still very limited.
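The overblocking problem described here can be illustrated with a minimal sketch. The blocklist, the filter logic and the example message are all hypothetical - real platform filters are far more complex - but the failure mode is the same: a filter that only checks for flagged keywords cannot tell a threat from a quotation of that threat.

```python
# Minimal sketch of keyword-based overblocking (all keywords and messages
# are hypothetical, for illustration only).

BLOCKLIST = {"attack", "weapon"}  # hypothetical flagged keywords

def is_blocked(message: str) -> bool:
    """Return True if the message contains any flagged keyword."""
    words = {w.strip(".,!?:\"'").lower() for w in message.split()}
    return not BLOCKLIST.isdisjoint(words)

# A journalist quoting a threat gets blocked just like the threat itself:
quote = 'The court heard the defendant say: "I will attack him."'
assert is_blocked(quote)          # false positive: legitimate reporting is filtered
assert not is_blocked("Good morning, everyone.")
```

The filter has no notion of context, which is exactly why the interview argues that effective appeal mechanisms against automated blocking matter.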
Kay
00:12:59
This is a perfect keyword: algorithms. You also deal with the effects of artificial intelligence (AI) on democracy. What have you found out about this?
Jeanette Hofmann
00:13:16
The picture here is very contradictory. A lot of people are looking at the use of algorithms in the field of automated decision-making. That exists in the US, less so in our country; here, such systems play only a marginal role in police work. But we know from other countries what we can expect. For example, such systems are also used in social services, say to identify families at risk. There are cases where they are used in the tax administration to detect tax fraud, or by judges who are supposed to determine the sentence for repeat offenders. So there are many tasks with a certain routine content that are susceptible to automation, and our current laws stipulate that there must always be rights of appeal for such decisions: there must be a right to have a person look at such decisions and justify them, and this cannot simply be left to algorithms, because they often get it wrong. So much for automated decision-making - discrimination is often found there. There was a case recently in Spain where the system often made the wrong decision about eligibility for electricity subsidies. It is often the case that public administrations refuse to make the underlying code transparent, to give people access so that they can check whether such automated decisions are correct or not. This really has to change; there simply has to be more transparency. But this is, I would say, justified criticism that relates very strongly to the present. Theoretically, we know very little about the long-term effects of using such systems, and we have thought very little about them. What technology researchers say is that we often overestimate the short-term effects of new technologies, whereas we tend to underestimate the long-term effects. So it would be much more important, especially with regard to democracy, to think about what is in store for us in the long term.
Democracy researchers fear that the expert systems being created on the basis of artificial intelligence (AI) could come to compete with the way democratic will is formed and decisions are made.
Kay
00:15:47
You also do research on the quantifying form of regulation. What's that all about? And what impact might that have on democracy?
Jeanette Hofmann
00:16:04
So we have had tendencies towards quantification for quite a long time. In recent years, with algorithmisation, we see that such quantifications are also used as an instrument of regulation. An example of this are so-called scoring systems. We know this, for example, from Schufa, a private provider that examines the creditworthiness of citizens and then provides information about it. Let's say you want to rent a flat. Then your landlord wants a Schufa report to make sure that you can pay your rent, and Schufa comes back with a scoring value that gives your creditworthiness as a number - and does so in a highly non-transparent way. You have only limited access to the data that has been collected about you, you often cannot object to it, and above all, you do not know how these data have been weighted against each other, how exactly this scoring value has been determined. Yet, as non-transparent as it is, it is at the same time very important: it directly affects your life chances, your existence. And above all, people with a low educational background and little income also lack the resources to defend themselves against such systems, while at the same time being more affected than people who have a lot of money and therefore do not have to reckon with negative scoring values. So this is an example of quantification: your creditworthiness is translated into a number, and this form of quantification at the same time regulates access to important resources. Another example is the Social Credit System in China. It's not just about creditworthiness, it's about all your behaviour: Do you pay back your loans regularly? Do you return books to the library on time? What do you buy? How healthily do you eat? How do your children behave at school? All of this information is compiled, condensed into a profile, and then in turn calculated down to scoring values.
This is a system that a non-democratic country has devised, also to deal with the problem of a lack of trust in Chinese society. Which is why - more as a footnote - many Chinese people also welcome this system: they are grateful to know the scoring values of their neighbours and thus whether they are behaving well or not. Something like this is inconceivable over here. But it is an example of regulation through quantification.
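The core of such a scoring system - many pieces of personal data condensed into one number via hidden weights - can be sketched in a few lines. The feature names and weights below are entirely hypothetical; real providers such as Schufa do not disclose their formula, which is precisely the non-transparency being criticised: the subject sees only the final number.

```python
# Hedged sketch of an opaque scoring value (hypothetical weights and features).
# The person being scored never sees WEIGHTS - only the resulting number.

WEIGHTS = {
    "payment_history": 0.5,   # positive signal
    "existing_loans": -0.3,   # negative signal
    "address_changes": -0.2,  # negative signal
}

def score(profile: dict) -> float:
    """Condense a profile into a single scoring value via a hidden weighted sum."""
    return sum(WEIGHTS[k] * profile.get(k, 0.0) for k in WEIGHTS)

tenant = {"payment_history": 0.9, "existing_loans": 0.4, "address_changes": 0.6}
print(round(score(tenant), 2))  # one number that decides access to the flat
```

Changing a single hidden weight changes the outcome, yet the tenant has no way to inspect or contest the weighting - the asymmetry the interview describes.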
Kay
00:18:41
In the first case, I would say that with scoring values, there is a danger that the weak will be weakened even more and the strong will be strengthened even more. With social scoring this is probably not quite so pronounced.
Jeanette Hofmann
00:19:01
The social scoring that China does sort of gets everybody at the same time - maybe not the political elite, but basically the whole of society. And interestingly enough, companies are also to be subjected to this scoring system. Here, in turn, one can actually expect something positive, because corruption is very widespread in China, and legal violations, for example in the area of the environment, often become known. If scoring values lead to companies paying more attention to existing laws, that would certainly be a positive impulse. But overall it is roughly the opposite of what one would wish for in a democratic society.
Kay
00:19:55
Especially since it's about the transparency of algorithms, as they can make mistakes and produce errors. If that is not transparent, people are affected by such errors and have no possibility to defend themselves against them.
Jeanette Hofmann
00:20:25
Yes, there are many examples of this. It is often not the algorithms themselves that produce errors, but the way they have been programmed and how they then interpret the available data based on this programming. There are many examples of companies digitising their hiring practices and using algorithms to pre-select applications, where the traditional hiring policy was so strongly discriminatory that the algorithms do nothing but extend these discriminatory hiring practices into the digital, into the algorithmic sphere. We see something similar in the field of social administration. When such systems are used, it has been shown - I think it was in Australia - that rich families living in affluent areas were not even included in the risk groups, while only people with a low income or a history of unemployment, or people living in certain areas, were included as risk groups at all. In these systems, you can see that prejudices and discriminatory practices so self-evident that they are not even questioned are reflected in the algorithmic system.
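The mechanism described here - an algorithm faithfully reproducing the bias in its training data - can be shown with a toy sketch. The data and the "model" are hypothetical; a simple per-group majority rule stands in for a learned classifier, but the effect is the same one real hiring systems have exhibited.

```python
# Sketch: a system "trained" on historically biased hiring decisions
# reproduces that bias (hypothetical data, illustrative only).

from collections import defaultdict

# Past decisions: (applicant group, hired?) - the history itself is discriminatory.
history = [("A", True), ("A", True), ("A", True),
           ("B", False), ("B", False), ("B", True)]

def fit(data):
    """'Learn' the majority hiring outcome for each group from past decisions."""
    counts = defaultdict(lambda: [0, 0])          # group -> [hired, rejected]
    for group, hired in data:
        counts[group][0 if hired else 1] += 1
    return {g: c[0] >= c[1] for g, c in counts.items()}

model = fit(history)
print(model)  # the old bias, now automated and applied to every new applicant
```

Nothing in the code "went wrong": it optimised exactly what it was given, which is why the interview stresses that the data and its interpretation, not the algorithm in isolation, carry the discrimination.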
Kay
00:21:48
For democratic processes as we know them, free opinion formation is very important. If all these influences come via digital mechanisms, do you see a chance that digital sovereignty can be preserved?
Jeanette Hofmann
00:22:11
I always assume - and I am often reproached for this, but I insist on it - that digital technologies as such do not dictate how we use them; rather, it depends on the rules, the actors, the resources and the power relations within which these digital technologies are used. Digital technologies open up a space of possibilities, and we as people decide - less individually than collectively - how we use them, because several options are possible. Let's take the example of platform companies like Facebook, Twitter, Instagram, etc. In themselves, these platforms are a wonderful invention, because they enable new relationships between people, but also between people and objects, that would never have found each other before. A small company in the countryside can suddenly serve a world market if it has a product or service that is interesting to many people. For that company, that's great: it doesn't have to build up all the logistics, and a lot can be achieved with much less capital. And of course, that also applies to us as individual users. But the way Facebook and others try to keep us on their platforms, the way they reduce our responses to a limited number of buttons - that is absolutely worthy of criticism. So it is the business model, and the way it is implemented algorithmically, that is criticised, not, I think, the platform model as such. And that's why we have a public discourse in Germany asking how we can use platforms in such a way that they serve the public interest. For example, a few years ago, the Bavarian Broadcasting Corporation began to consider whether public platforms could also be envisaged. At the moment, this is largely hindered by German legislation.
But something could change here so that, for example, what we now call a media library - which functions more poorly than it should - could be housed on a platform where, even years later, you can access programmes that were created with taxpayers' money.
Kay
00:24:42
So, for the beneficial use of digital technologies, a framework is needed to make society more resilient?
Jeanette Hofmann
00:25:00
Resilient, I don't know - I have problems with the term because I find it so under-defined. In general, I believe that we can definitely use digital technologies for our social benefit, and there are many examples where you can see that this is already happening today. I have already mentioned this: today we have many young civil society initiatives outside the traditional party apparatus, run by people who want to lead meaningful lives. Take as an example the many platforms below the big platforms that we all know, something like nextdoor.com. There are also platforms in the field of engagement, where people run platforms to connect volunteers with organisations (often charities) in order to bring together demand, need and supply. Very often it is migrants who are looking for voluntary jobs where they can improve their language skills, integrate into society and gain their first work experience. There are platforms that do valuable mediation work here. You couldn't do that without the digital world. Several birds are killed with one stone here.
Kay
00:26:23
Looking at Big Data - what framework do we need for civil society to benefit from the immense amounts of data that are produced in one way or another? Do you have any ideas?
Jeanette Hofmann
00:26:40
So, the Digital Services Act of the European Commission will contain a regulation that obliges platforms to hand over data, at least to researchers; presumably civil society organisations will also be involved. It's too early to say exactly what that will look like. But in general, we can say that Big Data - the large data sets that companies like Google, Microsoft, Facebook and Amazon collect - strictly speaking does not belong to them. Firstly, it is our data, which we are more or less forced to make available to the companies. And secondly, it is also legally the case that you cannot own data. There is no ownership of data; there is only de facto possession at the moment, i.e. the platforms just pretend that it is their data and nobody prevents them from pretending. Therefore, in the long term, we need a regulatory framework that regulates access rights to this data. In the economic sphere, one can imagine that small and medium-sized enterprises planning data-based innovations pay a certain fee and get access to data in return. For research, it is enormously important, firstly, to understand how platforms actually rank, sort and curate information, but also to be able to observe ourselves as a society. The platforms are dramatically underutilising the data they have: they use it predominantly to sell our profiles to advertisers. But all the things we do every day, all the things we could know about ourselves as a social collective - the platforms don't care about that, and so this data goes unused. Yet it would be really useful for political engagement, for exploring society and for regulating platforms.
Kay
00:28:50
To conclude, what tips would you like to give us - three points that belong on our agenda in terms of digitalisation and democracy, as political actors, but also as private individuals?
Jeanette Hofmann
00:29:05
I think it's very important that we fight back more. People simply have to protest against the constant disregard for data protection rules on the net. You have to contact the consumer protection agencies. You can write to the Federal Network Agency. You can write to your representatives in parliament. You simply have to protest and not accept it. The big problem is that at the beginning people find it very scary and unpleasant how much data they have to give away in order to be able to move around on the net, because we can't get around these infrastructures. And at some point you get used to it and forget about it. This is also called habitualisation in such cases, and it is imperative that we defend ourselves against it.
Kay
00:30:47
Thank you very much for taking the time for this very interesting and informative interview!
Jeanette Hofmann
00:30:53
Thank you, too!
