Reuters recently published an interesting report based on an interview with Alphabet Chairman John Hennessy, in which Hennessy stated that a search exchange using a chatbot with a large language model is likely to cost about ten times as much as a standard keyword search.

This aligns with a tweet from OpenAI CEO Sam Altman, who described ChatGPT's compute costs as "eye-watering"; partly for that reason, OpenAI launched ChatGPT Plus in early February at a price of $20 per month.

Reuters also cites a Morgan Stanley report estimating that each Google search cost the company about a fifth of a cent in 2022, and that deploying an AI chatbot like Bard could add roughly $6 billion in annual costs by 2024, and that is even under the assumption that the chatbot handles only half of Google's searches and gives answers of just 50 words.
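Putting the figures above together, a back-of-the-envelope calculation shows how quickly per-query costs compound at search scale. Only the baseline cost (a fifth of a cent per search) and the roughly 10x chatbot multiplier come from the article; the annual query volume below is a hypothetical number chosen purely for illustration.

```python
# Rough cost comparison: conventional search vs. LLM chatbot answers.
# The baseline cost and ~10x multiplier come from the article;
# the query volume is an ASSUMPTION for illustration only.

BASELINE_COST = 0.002               # a fifth of a cent per search (Morgan Stanley)
CHATBOT_COST = BASELINE_COST * 10   # ~10x per query (Hennessy's estimate)

queries_per_year = 3.0e12           # ASSUMPTION: illustrative annual query volume
chatbot_share = 0.5                 # report's scenario: chatbot handles half of searches

baseline_annual = queries_per_year * BASELINE_COST
mixed_annual = queries_per_year * (
    (1 - chatbot_share) * BASELINE_COST + chatbot_share * CHATBOT_COST
)
extra_annual = mixed_annual - baseline_annual

print(f"baseline:  ${baseline_annual / 1e9:.1f}B per year")
print(f"with bot:  ${mixed_annual / 1e9:.1f}B per year")
print(f"extra:     ${extra_annual / 1e9:.1f}B per year")
```

Even under these rough assumptions, the extra cost runs into the tens of billions of dollars per year, which is why per-query efficiency matters so much at search-engine scale.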

Why Can It Cost More?

In short, running a large language model requires far more computing power (typically clusters of GPU servers rather than ordinary CPUs), and electricity consumption rises accordingly with the number of servers involved.

For now this technology is still new and expensive to operate. Microsoft, for example, has just launched the new Bing, currently available as a preview to a limited number of users, and whether a paid Bing Chat will follow is unknown. But if the reports that AI chatbots cost far more to run than ordinary search engines are accurate, paid versions of Bing Chat and Bard, much like OpenAI's ChatGPT Plus today, are certainly possible, unless the cost-cutting these big companies are pursuing, including recent layoffs, proves enough to cover the chatbots' operating costs.

Reference: Reuters, Neowin
