Script of: The Price of a "Clean" Internet | Hans Block and Moritz Riesewieck on TED Talks, 30th October 2019


TED Talks Daily: 'The Price of a "Clean" Internet'


This TED Talk features director and musician Hans Block and author and director Moritz Riesewieck, recorded live at TEDxCERN 2018. Like TED Talks? You should check out the TED Radio Hour with NPR. Stay tuned after this talk to hear a sneak peek of this week's episode.
00:35
On March 23rd, 2013, users worldwide discovered in their newsfeeds a video of a young girl being raped by an older man. Before this video was removed from Facebook, it had already been shared 16,000 times, and it had even been liked 4,000 times. This video went viral and infected the net.
01:07
And that was the moment we asked ourselves: how could something like this happen? And, at the same time, why don't we see such content more often? After all, there's a lot of revolting material online, but why do we so rarely see such crap on Facebook, Twitter or Google? While image recognition software can identify the outlines of sexual organs, blood or naked skin in images and videos, it has immense difficulties distinguishing pornographic content from holiday pictures,
01:47
Adonis statues or breast cancer screening campaigns. It can't distinguish Romeo and Juliet dying on stage from a real knife attack. It can't distinguish satire from propaganda, or irony from hatred, and so on and so forth. Therefore, humans are needed to decide which of the suspicious content should be deleted and which should remain.
02:19
Humans about whom we know almost nothing, because they work in secret. They sign nondisclosure agreements, which prohibit them from talking about and sharing what they see on their screens and what this work does to them. They are forced to use code words in order to hide who they work for.
02:37
They are monitored by private security firms in order to ensure that they don't talk to journalists, and they are threatened by fines in case they speak. All of this sounds like a weird crime story, but it's true. These people exist, and they are called content moderators. We are the directors of the feature documentary film "The Cleaners," and we would like to take you to a world that many of you may not know yet. The so-called content moderators don't get their paychecks from Facebook, Twitter or Google themselves, but from outsourcing firms around the world, in order to keep the wages low.
03:20
Tens of thousands of young people, looking at everything we are not supposed to see. And we are talking about decapitations, mutilations, executions, necrophilia, torture, child abuse. Thousands of images in one shift: ignore, delete, day and night. Much of this work is done in Manila, where the analog toxic waste from the Western world was transported for years by container ships; now the digital waste is dumped there via fiber-optic cable. And just as the so-called scavengers rummage through gigantic tips on the edge of the city, the content
03:59
moderators click their way through an
04:05
endless, toxic ocean of images, videos and all manner of intellectual garbage, so that we don't have to look at it. But unlike the wounds of the scavengers, those of the content moderators remain invisible. Full of shocking and disturbing content, these pictures and videos burrow into their memories where, at any time, they can have unpredictable effects: eating disorders, loss of libido, anxiety disorders, alcoholism, depression, which can even lead to suicide. The pictures and videos infect them, and often never let them go again. If they are unlucky, they develop post-traumatic stress disorders, like soldiers after war missions.
04:51
In our film, we tell the story of a young man who had to monitor live streams of self-mutilations and suicide attempts, again and again, and who eventually committed suicide himself. It's not an isolated case, as we've been told. This is the price all of us pay for our so-called clean, safe and "healthy" environments on social media.
05:22
Never before in the history of mankind has it been easier to reach millions of people around the globe in a few seconds. What is posted on social media spreads quickly, becomes viral and excites the minds of people all around the globe. Before it is deleted, it is often already too late.
05:42
Millions of people have already been infected with hatred and anger, and they either become active online, by spreading or amplifying hatred, or they take to the streets and take up arms. Therefore, an army of content moderators sits in front of a screen to avoid new collateral damage. And they are deciding, as soon as possible, whether the content stays on the platform.
06:11
Ignore, or delete. But not every decision is as clear as the decision about a child-abuse video. What about controversial content, ambivalent content, uploaded by civil rights activists or citizen journalists? The content moderators often decide on such cases at the same speed as on the clear cases. Pictures taken by people armed with their mobile phones can make visible what journalists often do not have access to. Civil rights groups often do not have any better option to quickly make their recordings accessible to a large audience than by uploading them to social media.
06:49
This was the empowering potential the World Wide Web should have. These were the dreams people had about the World Wide Web in its early stages. Can't pictures and videos like these persuade people who have become insensitive to facts to rethink? But instead, everything that might be disturbing is deleted, and there's a general shift in society. Media, for example, more and more often use trigger warnings at the top of articles which some people may perceive as offensive or troubling. Or more and more students at universities in the United States demand the banishment of antique classics which depict sexual violence or assault from the curriculum. But how far should we go with that?
07:39
Physical integrity is guaranteed as a human right in constitutions worldwide. In the Charter of Fundamental Rights of the European Union, this right expressly applies to mental integrity. But even if the effect of images and videos is hard to predict, do we want to become so cautious that we risk losing social awareness of injustice?
08:03
So what to do? Mark Zuckerberg recently stated that in the future, the users, we, or almost everybody, will decide individually what they would like to see on the platform, by personal filter settings. So everyone could easily claim to remain undisturbed by images of war or other violent conflicts, like: "I'm the type of guy who doesn't mind seeing breasts, and I'm very interested in global warming, but I don't like war so much."
08:37
Or: "Yeah, I'm more the opposite. I have zero interest in naked breasts or naked bodies at all. But why not guns? I like guns, yes." Come on: if we don't share a similar social consciousness, how shall we discuss social problems? How shall we call people to action? Even more isolated bubbles would emerge.
09:00
One of the central questions is how, in the future, freedom of expression will be weighed against people's need for protection. It's a matter of principle. Do we want to design an either open or closed society for the digital space? At the heart of the matter is freedom versus security. Facebook has always wanted to be a "healthy" platform.
09:29
Above all, users should feel safe and secure. It's the same choice of words the content moderators in the Philippines used in a lot of our interviews. For the young content moderators in the strictly Catholic Philippines, this work is linked to a Christian mission: to counter the sins of the world, which spread across the web.
09:54
"Cleanliness is next to godliness" is a saying everybody in the Philippines knows. And others motivate themselves by comparing themselves with their president, Rodrigo Duterte, who has been ruling the Philippines since 2016 and who won the election with the promise, "I will clean up." And what that means is eliminating all kinds of problems by literally killing people on the streets who are supposed to be criminals, whatever that means. And since he was elected, an estimated 20,000 people have been killed.
10:24
And one moderator in our film says, "What Duterte does on the streets, I do for the internet." And here they are, our self-proclaimed superheroes, who enforce law and order in our digital world. They clean up, they polish everything clean, they free us from everything evil.
10:47
Tasks formerly reserved to state authorities have been taken over by college graduates in their early twenties, equipped with three- to five-day training. This is the qualification for work on nothing less than the world's rescue. National sovereignties have been outsourced to private companies, and they pass on their responsibilities to third parties. It's an outsourcing of the outsourcing of the outsourcing which takes place. With social networks, we are dealing with a completely new infrastructure with its own mechanisms,
11:23
its own logic of action and, therefore, also its own new dangers, which did not yet exist in the pre-digitalized public sphere. When Mark Zuckerberg was at the US Congress or at the European Parliament, he was confronted with all kinds of critics. And his reaction was always the same: "We will fix that, and I will follow up on that with my team."
11:50
But such a debate shouldn't be held in the back rooms of Facebook, Twitter or Google. Such a debate should be openly discussed in new, cosmopolitan parliaments, in new institutions that reflect the diversity of people
12:07
contributing to a utopian project of a global network. And while it may seem impossible to consider the values of users worldwide, it's worth believing that there's more that connects us than separates us. Because at a time when populism is gaining strength, it becomes popular to just fight the symptoms, to eradicate them, to make them invisible. This ideology is spreading worldwide, analog as well as digital,
12:37
and it's our duty to stop it before it's too late. The question of freedom and democracy must not only have these two options: delete, or ignore. Thank you very much.
13:06
Why are some people better at taking risks than others? Is it sheer luck, an innate instinct or simple strategy? "It's about being fiercely honest with yourself. The really amazing risk-takers understand their own weaknesses, and they look at where they've got things wrong, and they learn from those mistakes."
13:26
Ideas around risk, next time on the TED Radio Hour from NPR. Subscribe or listen to the TED Radio Hour wherever you get your podcasts.

Transcribed by: H. A. M. Fahim Kabir
