cupid_stunt
As I used to work in local newspapers, I tend to check the industry website, Hold the Front Page (HTFP), from time to time to see what's going on in the industry. I have no idea why, TBH, because it's depressing to see the decline in local news reporting, which is so important to local democracy.
Needless to say, there's been a lot of discussion over the last year or so about the use of AI in the news media, at local, regional and national levels, and across both print and digital platforms. A lot of senior editorial staff and the unions have been expressing concerns and calling for standards to be set to safeguard what reaches the public, but despite that, various publishers have started to roll it out, and frankly it's a bit scary.
An extreme example was exposed by HTFP: a short-lived news website for Bournemouth, which closed when exposed. It listed several staff on its 'about us' page, but HTFP found that all the photos used were from an online photo library, which raised doubts as to whether the site actually employed any staff at all. As they dug deeper, that appeared to be the case: it was run by just one person, who was filling it with AI-generated 'news' content. Upon checking a reasonable number of these 'reports', they found many inaccuracies and outright falsehoods presented as facts.
IIRC all three of the big regional publishers, Reach, National World and Newsquest, have started using it in one way or another. Reach, which also publishes the national Express, Mirror and Star titles, has been using it to slightly rewrite reports to give them a more local angle, so they can be used across dozens of regional/local news sites. They assured us that all AI-generated 'news' content would be carefully fact-checked by humans before being published in print or online. I thought at the time that this would likely be allowed to slip over time.
What I didn't expect was for it to start happening just a few weeks after their announcement, but that appears to be the case with the example below. It's not a particularly serious issue in this instance, but I do fear it will get a lot worse. The regional/local papers/sites have always been considered among the most trustworthy sources of news, but I fear those days are coming to an end.
Regional daily website posted guide to non-existent property law - Journalism News from HoldtheFrontPage
IPSO complainant claims article was AI-generated
www.holdthefrontpage.co.uk
A regional daily’s website posted a ‘homeowners guide’ to property laws, quoting legislation which it later conceded did not exist.
The Hull Daily Mail published an online article last August headlined “Five common property laws you might be unknowingly breaking – and they could land you with a fine.”
Among the ‘laws’ quoted was the Street Naming and Numbering Regulation 1999, which according to the piece required homeowners to ensure their house number is clearly visible.
However, following a complaint to the press watchdog, the Mail published a correction conceding that this piece of legislation does not in fact exist.
So, it included totally fake information, presented as fact, and then in turn it was republished across several other news sites.
Reader Ben Munro complained to the Independent Press Standards Organisation that the article breached Clause 1 of the Editor’s Code, which covers accuracy.
In his complaint, he said he believed the article, which was republished by a number of the Mail’s sister titles at Reach, had been generated by AI.
The original article appeared on 16 August 2024 under the sub-headline: “Failing to keep on top of ‘house admin’ can have unexpected consequences”.
It warned: “From an unkempt garden to not displaying your house number, there are numerous property laws that, if violated, can result in hefty fines”.
In his complaint, Mr Munro said that the article was inaccurate in breach of Clause 1 on three counts.
Firstly, he said that the “Street Naming and Numbering Regulation 1999” did not exist, and that the accompanying section, which he believed was AI-generated, was therefore inaccurate.
Secondly he said that the piece’s mention of the Local Government (Miscellaneous Provisions) Act 1976 was inaccurate as issues such as untidy gardens fall under different legislation, such as the Environmental Protection Act 1990.
Finally he said the reference to the Housing Act 2004 and repairing cracks only applied to cases involving landlords renting out properties where a Housing Health and Safety Rating System (HHSRS) assessment is required.
So, a total of three inaccuracies in one short piece, published across multiple platforms.
So, as expected, AI had created the piece based on false information found online. And where exactly was the human who was supposed to be fact-checking any AI-generated content?

The Mail said the information had been found online, and had been published in good faith.
But after IPSO began its investigation it offered to remove the article in full, and publish the following standalone correction.
“Our article headlined “Five common property laws you might be unknowingly breaking – and they could land you with a fine” 16 August, incorrectly reported that “failure to display your house number clearly could result in a £500 fine under the Street Naming and Numbering Regulation 1999.” In fact, this legislation does not exist.”
This appears to be the future of news. Welcome to the new world.
Any thoughts, and other examples, are welcome.