0:00 → 0:06
Stephanie Haiber: Artificial intelligence, AI, is considered the most important technology of the future.
0:06 → 0:11
Stephanie Haiber: It is supposed to help people, to speed things up and simplify them,
0:11 → 0:17
Stephanie Haiber: but sometimes machines that think for themselves can cause a lot of harm, as in this case.
0:17 → 0:20
Stephanie Haiber: This is the story of a journalist from Tübingen.
0:20 → 0:26
Stephanie Haiber: What he is currently going through because of an AI chatbot is truly unbelievable.
0:26 → 0:31
Stephanie Haiber: It is also an example of what can happen to any of us in the new world of AI,
0:31 → 0:35
Stephanie Haiber: chatbots, and the information available about all of us online.
0:35 → 0:37
Stephanie Haiber: Markus Beschorner.
0:38 → 0:42
Markus Beschorner: Martin Bernklau is a passionate cultural journalist with his own blog.
0:42 → 0:46
Markus Beschorner: He has never done anything wrong, but an AI turns him into a child molester.
0:46 → 0:53
Martin Bernklau: A father who allegedly abused his stepdaughter repeatedly over the years, Martin Bernklau.
0:53 → 1:01
Martin Bernklau: I was a 37-year-old drug addict from Rastatt with a criminal record, an escapee and hostage-taker named Martin Bernklau.
1:01 → 1:05
Markus Beschorner: All made up by Microsoft's AI chatbot Copilot.
1:05 → 1:08
Markus Beschorner: Bernklau only wanted to check how his articles were being received online.
1:09 → 1:14
Martin Bernklau: An advert even showed me this Copilot, and I thought I'd have a look
1:14 → 1:19
Martin Bernklau: and entered my name and place of residence: Martin Bernklau, Tübingen.
1:19 → 1:24
Martin Bernklau: And the answer I received was unbelievable.
1:24 → 1:30
Martin Bernklau: The 54-year-old man named Martin Bernklau from Tübingen was charged with abuse of children and wards.
1:31 → 1:34
Markus Beschorner: Address, telephone number and directions included.
1:34 → 1:39
Markus Beschorner: Bernklau asks the AI again: am I really a convicted father?
1:39 → 1:42
Martin Bernklau: Yes, Martin Bernklau really is a father.
1:42 → 1:47
Martin Bernklau: It's unfortunate that someone with such a criminal past has a family.
1:47 → 1:49
Markus Beschorner: How can this happen?
1:49 → 1:56
Markus Beschorner: Bernklau was a court reporter for many decades, covering trials at the Tübingen district court for newspapers.
1:58 → 2:03
Markus Beschorner: Cases of abuse, violence and fraud, with which he is now associated.
2:03 → 2:06
Markus Beschorner: The AI turns the reporter into a perpetrator.
2:08 → 2:11
Markus Beschorner: Martin Bernklau asks the data protection officer for help.
2:11 → 2:16
Markus Beschorner: He contacts Microsoft, and the entries disappear for the time being.
2:19 → 2:26
Markus Beschorner: AI expert Jessica Heesen has not yet encountered such a severe case, but she says it could happen to anyone.
2:26 → 2:30
Markus Beschorner: Legally, the chances are slim, because an AI chatbot is not a search engine.
2:31 → 2:38
Jessica Heesen: Copilot is eager to answer and generates something. In science, this is also called a hallucination of artificial intelligence.
2:38 → 2:44
Jessica Heesen: This means that the artificial intelligence generates fictitious content that has nothing to do with the truth.
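Heesen's point can be made concrete with a toy example. The sketch below assumes nothing about Copilot's actual architecture; it shows the general mechanism behind hallucination in any statistical text generator: each next word is sampled purely from co-occurrence counts in the training text, so the output is fluent-sounding word sequences with no mechanism at all for checking whether the resulting claim is true.

```python
import random

def train_bigrams(corpus):
    """Count which word follows which in the training text."""
    words = corpus.split()
    model = {}
    for a, b in zip(words, words[1:]):
        model.setdefault(a, []).append(b)
    return model

def generate(model, start, length=8, seed=0):
    """Sample a plausible-sounding continuation; truth plays no role."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length):
        followers = model.get(out[-1])
        if not followers:
            break
        out.append(rng.choice(followers))
    return " ".join(out)

# Hypothetical training text in the style of court reporting:
corpus = ("the reporter covered the trial . "
          "the defendant abused the victim . "
          "the reporter covered the defendant .")
model = train_bigrams(corpus)
print(generate(model, "the"))
```

A model whose training data consists of court reports can recombine the reporter's byline with the crimes he reported on, because both occur near each other in the statistics it learned from.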
2:44 → 2:52
Jessica Heesen: If you look at the terms of use, it says that no liability is assumed for Copilot's answers,
2:52 → 2:58
Jessica Heesen: that it is actually only meant for entertainment, and that Microsoft of course cannot be held liable.
2:58 → 3:02
Markus Beschorner: Upon request, the company also refers to the terms of use.
3:02 → 3:06
Markus Beschorner: Bernklau refuses to be fobbed off and hires a lawyer.
3:07 → 3:10
Martin Bernklau: I hope, of course, that this will stop at some point.
3:10 → 3:16
Martin Bernklau: It affects me severely and violates my human dignity and my personal rights
3:16 → 3:20
Martin Bernklau: when they label me a criminal with my full name.
3:21 → 3:28
Markus Beschorner: And what does Copilot do? After three days of peace and quiet, the artificial intelligence hallucinates the same stories again.
3:30 → 3:32
Stephanie Haiber: Bernd Korz joins us from Mannheim.
3:32 → 3:38
Stephanie Haiber: He is the founder and CEO of alugha, an AI-powered platform for video translation,
3:38 → 3:44
Stephanie Haiber: and has just been appointed to the German Economic Council's Federal Expert Commission on AI and Value Creation 4.0.
3:44 → 3:46
Stephanie Haiber: Good evening, Mr Korz.
3:46 → 3:47
Bernd Korz: Good evening.
3:47 → 3:52
Stephanie Haiber: Have you ever heard of a case like the one we've just seen, or is it more of an isolated incident?
3:53 → 4:00
Bernd Korz: It's an extreme case, of course, but these hallucinations are unfortunately not the exception, they're the rule.
4:00 → 4:06
Bernd Korz: And the answers the AI generates sound so professional and so coherent
4:06 → 4:10
Bernd Korz: that it can be difficult to distinguish them from reality.
4:11 → 4:13
Stephanie Haiber: How can you protect yourself?
4:13 → 4:18
Stephanie Haiber: Of course, on the one hand, you should always be careful and not believe everything you read on the internet.
4:18 → 4:25
Stephanie Haiber: But even if you try not to put any information about yourself online, some people simply can't control that.
4:26 → 4:31
Bernd Korz: Not only can you not control it; it's as simple as picking up your cell phone and accepting cookies,
4:31 → 4:39
Bernd Korz: which pretty much everyone does nowadays, and then you're already putting information about yourself online.
4:39 → 4:46
Bernd Korz: And in the case of Copilot, Microsoft combines the Bing search results with ChatGPT
4:46 → 4:52
Bernd Korz: to generate something, and then someone writes something on Reddit about this person,
4:52 → 4:57
Bernd Korz: in this case the court reporter, and a narrative is created.
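The pipeline Korz describes can be sketched in a few lines. Everything here is an illustrative assumption, not Microsoft's actual implementation: retrieved search snippets are flattened into a single text prompt for the language model, so a reporter's byline on a court report and the crime that report describes arrive at the model as undifferentiated text.

```python
def build_prompt(question, snippets):
    """Stuff retrieved search snippets into one prompt for the model.
    The model only sees flat text: once a reporter's name and a crime
    appear in the same snippet, nothing marks who did what."""
    context = "\n".join(f"- {s}" for s in snippets)
    return f"Answer using these sources:\n{context}\n\nQuestion: {question}"

# Hypothetical snippets, the kind a web search over court reports might return:
snippets = [
    "Trial report by Martin Bernklau: father charged with abusing stepdaughter.",
    "Court news (M. Bernklau, Tübingen): drug-addicted escapee taken into custody.",
]
prompt = build_prompt("Who is Martin Bernklau?", snippets)
print(prompt)
```

A model answering from such a prompt can plausibly, and wrongly, attach the crimes to the only full name it sees.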
4:57 → 5:06
Stephanie Haiber: In other words, the separate platforms and bots interact, and in the end anything can come out.
5:06 → 5:15
Stephanie Haiber: Now there is the expert commission of which you are a member. Are these also issues that politicians are trying to focus on?
5:16 → 5:18
Bernd Korz: Absolutely.
5:18 → 5:21
Bernd Korz: There are many discussions online saying
5:21 → 5:27
Bernd Korz: that we have far too many rules and that the Americans simply act first.
5:27 → 5:33
Bernd Korz: And we simply think that exactly these things that happen here need to be regulated.
5:33 → 5:39
Bernd Korz: There have to be limits, there have to be certain regulations, and we are committed to this
5:39 → 5:43
Bernd Korz: and believe that these things have to be controlled to some extent;
5:43 → 5:48
Bernd Korz: otherwise, once they get out of hand, they can't be stopped, and it's over.
5:49 → 5:54
Stephanie Haiber: In other words, we could learn a little from the fact that we have slept too long on many things on the internet.
5:55 → 5:59
Bernd Korz: Absolutely, we slept too long, but I think we are doing a good job with the AI Act.
5:59 → 6:06
Bernd Korz: At alugha we also try to do that: our employees train the models on in-house data to make sure such incidents don't happen.
6:07 → 6:14
Stephanie Haiber: Thank you for your explanation, Bernd Korz, CEO of alugha and member of the AI expert commission. Thanks to Mannheim.
6:14 → 6:15
Bernd Korz: Thank you, bye.