Forbes’ AI tool Bertie was suggesting headlines and topics as early as 2019.
The Washington Post was using its proprietary AI tool Heliograf to write short reports as early as 2016.
Source: Reddit
Globally, more than 75% of journalism outlets now use AI in their creative processes and content-creation routines. AI has benefited journalism in many ways. However, the relentless pursuit of AI has also brought serious challenges: misinformation, degraded news quality, copyright claims, fake news, crypto media layoffs, and, in worst-case scenarios, the shutdown of entire media companies.
Crypto journalism probably has the deepest penetration of AI tools, given the sector’s affinity for AI. The AI-led crisis is of such a scale that 83% of crypto media outlets have shut down in the past few years.
This article discusses how OpenAI’s ChatGPT and similar models have replaced traditional editorial practices with news automation, the impact of that shift, and the need for policies and guidelines to curb the dangers of an AI takeover.
The AI Takeover: ChatGPT-5 and 90% News Automation
ChatGPT, Perplexity, Gemini, and other popular AI chatbots are generative AI models: they create content by learning from existing data sets using machine learning, natural language processing, and other deep learning techniques. ChatGPT-5 is the advanced version of OpenAI’s flagship chatbot, built on a large language model.
An hour-long interview would usually take me three or four hours to type up, [although it] kind of depends on how much I need, of course. With AI, that easily comes down to 15 minutes.
ChatGPT’s popularity is evident from its adoption curve: it took only 5 days to reach 1 million users and about two months to reach 100 million. Crypto news portals and media houses integrated ChatGPT-5 into their existing systems to improve efficiency, automate repetitive tasks, and generate summaries and audio notes for existing articles. Today, a considerable portion of the crypto news section isn’t a human effort but machine-generated content, given the multiple advantages of AI in journalism.
Source: Bid.ub.edu
ChatGPT-5 is increasingly being used for AI-generated headlines, news cloning, automated copywriting, press releases, and market updates.
Major crypto news outlets, such as Decrypt, Cointelegraph, The Block, CoinDesk, and FXStreet, use LLMs for writing first drafts, summarising articles, and converting AMA sessions into readable content.
Content Pipelines: Media outlets using AI for journalism and writing have automated their entire press-release pipelines using tools like ChatGPT-5, Narrative Science, and Wordsmith. If you read those articles on ‘why crypto is down today’ or ‘price prediction for X coin,’ chances are they are algorithmic content.
Workflow automation: No media outlet can afford a team of writers churning out thousands of price predictions or daily update pieces, hence the AI intervention. For instance, a drop in a token’s price triggers a response from the LLM, which searches through user comments and posts to analyse user sentiment, summarise any whale transactions as alerts, build a story, and publish it on a CMS.
Today, chatbots are designed to handle all of these tasks independently, with zero human intervention. Once the story is published, the editor or publisher simply gets a notification.
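The trigger-to-publish loop described above can be sketched in a few lines. This is a toy illustration under stated assumptions: all function names, thresholds, keyword lists, and the payload shape are hypothetical; a real pipeline would call an LLM API for the draft and a CMS endpoint for publishing rather than these stand-ins.

```python
# Hypothetical sketch of the "price drop -> sentiment -> story -> CMS" loop.
# Nothing here reflects any specific outlet's real stack.

def detect_price_drop(prev_price: float, curr_price: float, threshold: float = 0.05) -> bool:
    """Trigger when a token falls more than `threshold` (5% by default)."""
    return prev_price > 0 and (prev_price - curr_price) / prev_price > threshold

def summarise_sentiment(comments: list[str]) -> str:
    """Toy stand-in for LLM sentiment analysis: count bearish keywords."""
    bearish = sum(1 for c in comments if any(w in c.lower() for w in ("dump", "crash", "sell")))
    return "bearish" if bearish > len(comments) / 2 else "mixed"

def build_story(token: str, drop_pct: float, sentiment: str, whale_alerts: list[str]) -> dict:
    """Assemble the draft an LLM would normally write, as a CMS-ready payload."""
    body = (f"{token} is down {drop_pct:.1f}% today. "
            f"Community sentiment is {sentiment}. " + " ".join(whale_alerts))
    return {"headline": f"Why is {token} down today?", "body": body, "status": "published"}

def notify_editor(story: dict) -> str:
    """After auto-publishing, the editor only receives a notification."""
    return f"Published: {story['headline']}"

# One full run of the loop, from trigger to editor notification
if detect_price_drop(prev_price=100.0, curr_price=92.0):
    story = build_story("XYZ", 8.0,
                        summarise_sentiment(["time to sell", "dump incoming", "holding"]),
                        ["Whale moved 2M XYZ to an exchange."])
    print(notify_editor(story))
```

The point of the sketch is the ordering: the human appears only at the final `notify_editor` step, after publication, which is exactly the inversion of the traditional draft-edit-publish chain.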
News Farming: Smaller news sites treat bigger publishing websites as content farms, rephrasing and republishing their content as their own. 57% of crypto news pieces are cloned content from big media outlets such as CoinDesk, optimised for SEO and published using chatbots.
Personalisation: A 2025 Reuters survey found that 80% of news outlets using AI believe it improves personalisation and recommendations in the news feed. Many outlets are actively exploring AI tools for audience segmentation and for AI-generated audio/video to save on costs.
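At its simplest, the personalisation the survey respondents describe is tag-overlap scoring against a reader’s history. The sketch below is a minimal, hypothetical illustration: the article slugs, tags, and scoring rule are invented for the example, and production recommenders use far richer signals (embeddings, dwell time, collaborative filtering).

```python
# Minimal sketch of feed personalisation via tag overlap.
# Article names and tags are illustrative assumptions.

def recommend(read_history: list[str], articles: dict[str, set[str]], top_n: int = 2) -> list[str]:
    """Rank unread articles by how many tags they share with the reader's history."""
    interests: set[str] = set()
    for slug in read_history:
        interests |= articles.get(slug, set())
    scored = sorted(
        ((len(tags & interests), slug)
         for slug, tags in articles.items() if slug not in read_history),
        reverse=True,
    )
    return [slug for score, slug in scored[:top_n] if score > 0]

articles = {
    "btc-etf-approved": {"btc", "etf"},
    "btc-halving-recap": {"btc"},
    "eth-merge-explainer": {"eth"},
    "sol-outage-report": {"sol"},
}
print(recommend(["btc-etf-approved"], articles))  # a reader of BTC news is fed more BTC coverage
```

Even this crude version shows the feedback loop critics worry about: readers are only shown more of what they already clicked, which narrows the feed over time.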
AI is rapidly becoming an integral part of newsroom operations, enabling journalists to automate repetitive tasks and focus more on investigative reporting and creativity.
Synthetic Journalism: Journalism has long been a profession where skills like social listening, editorial transparency, and media ethics were valued. In the post-GPT era, however, news content pipelines are no longer human-driven; they are data farms governed by LLM logic and algorithms.
Case Study: CoinDesk’s Editorial Pivot to AI
CoinDesk’s pivot to AI-led reporting and publishing, followed by a string of layoffs, makes a telling case study of AI’s impact on journalism. Here’s a quick timeline of events:
CoinDesk began using AI in journalism in 2023: It published a policy document titled ‘How CoinDesk Will Use Generative AI Tools’ outlining rules and safeguards for responsible AI usage, such as using generative AI only for first drafts, never for primary reporting, and running plagiarism detection and fact checks on all AI content.
Editorial layoffs followed in August 2024: CoinDesk reduced its editorial staff by 45%. The layoffs were announced as part of a strategic restructuring. Later, three senior editors were shown the door after a controversial feature on Tron founder Justin Sun was published.
New editorial AI strategy: The layoffs were accompanied by a heavier reliance on AI-assisted workflows for content such as market commentary and article summaries. With fewer staff, CoinDesk had to automate more of its workflows.
Readers and critics were quick to point out that automation bias would leave less room for original criticism and unpopular opinion pieces, the very pieces that ultimately shape narratives in the industry.
Algorithmic workflows combined with editorial layoffs meant that editorial depth and journalistic quality would take a hit. With AI running the newsroom, watchdog journalism becomes less rewarding; instead, popular, recycled posts optimised for SEO reach the readers.
What’s Lost When Crypto News is Automated
Heavy and incessant use of AI in journalism is having serious repercussions. Instances of investigative and unbiased reporting have drastically declined. AI-led first drafts are prone to hallucination, which can put misinformation into circulation and, in worst-case scenarios, create panic.
The ethics quotient in news media is high. With AI and algorithmic workflows controlling the majority of the news flow, several issues are mounting:
Reduced on-ground reporting and fact-checking
Degraded news quality
An increase in uncritical and recycled pieces
Rising instances of deepfakes
One study found that ChatGPT was used to produce 18,532 news pieces, 1,457 blogs, and 1,685 working papers in the first year after its launch (2022-2023).
Misinformation and disinformation are the primary factors behind the decline in public trust in the media. Prosumer audiences are instead turning to social media for news verification.
As for copyrighted content, people making a living in the creative industries are losing jobs to programs trained on their own work. Original creative work is being ripped off or drowned out by AI “slop.”
Deepfake journalism has become a major menace. Fake videos, images, and audio clips of people are created using AI tools and circulated with malicious intent, and media houses aren’t equipped with sufficient resources to verify the authenticity of every video or image.
One such incident involved a reporter from France 24. The reporter’s voice and the article headline were both manipulated, and a deepfake video with altered content about his reporting on President Emmanuel Macron's visit to Ukraine was circulated.
Another instance that underlines the unreliability of AI chatbots is the Grok incident. Despite its initial popularity, Grok began showing signs of hallucination and bias: it started praising Hitler and suggested political violence in America.
Misinformation has become a pseudo-epidemic threatening the collective consciousness, making it difficult to tell true from false. If journalism relies on such bots, it is not only dangerous but also strips reporting of accountability and transparency.
Why It Matters for Crypto’s Future
Algorithmic content isn’t the same as verified, authentic information. AI models are easy to manipulate and still struggle to recognise basic facts. In an industry on the cusp of greater adoption and innovation, scams and failures need to be shared, retold, and analysed for widespread awareness.
Economic pressure on the media in the crypto industry has led to AI-driven newsroom disruption. However, media watchdogs mustn’t turn into lapdogs for the VCs funding the media houses. Real journalists must raise their voices against media inequality, deepfake journalism, and disinformation campaigns.
Policy changes are a must to ensure there’s a chain of editorial accountability.
Additionally, the crypto community should make an effort to ensure that LLMs aren’t used to push state-favoured narratives or suppress dissenting voices. This matters for global equity in journalism: media houses in the global North have better infrastructure and access to AI tools than those in the global South. This imbalance may entrench ‘algorithmic colonialism’ and idealise AI models trained on biased information, unconcerned with marginalised local communities.
The editorial AI strategy needs to be accountable to the readers as well as the investors/sponsors. For that, the role of ChatGPT in journalism has to account for responsible AI collaboration along with strict human-led editorial practices, and original reporting/criticism.
Will AI-led Media Disintermediation Change Journalism Entirely?
The use of AI in journalism should be ethical, responsible, and restricted.
While AI’s impact on journalism in 2025 has had some positive turns, a lot still needs fixing in terms of policy guidelines, regulations, and press freedom. Newsrooms must build interdisciplinary teams to check for bias and incorrect information, and AI literacy should be mandatory for all media personnel.
Instead of fully automated AI workflows, newsrooms should focus on hybrid ones. Media disintermediation has already happened; AI chatbots are reshaping user behaviour on social apps like X. Journalistic outlets need a middle ground where AI brings cost efficiency and workflow optimisation while editors and reporters set a high bar for news dissemination.