Ever since ChatGPT kicked off the AI revolution, other tech companies have raced to develop the next evolution of generative AI. Anthropic, the company founded by former OpenAI research executives, has now released the next generation of its conversational chatbot, Claude 2, following a $750 million investment.
While similar in concept to Google’s Bard or OpenAI’s ChatGPT, Claude sets itself apart with its conversational tone and touches of humor. The new version allows for longer, more detailed responses and improves on math, coding, and reasoning. Anthropic also highlighted that the new chatbot outperformed its predecessor on the multiple-choice section of the bar exam, scoring 76.5% compared to Claude 1.3’s 73%.
Addressing recent concerns about chatbots generating harmful and manipulative content, Anthropic also claims that Claude 2 is twice as good at giving harmless responses, reducing the risk of harmful content during interactions.
“We really feel that this is the safest version of Claude that we’ve developed so far, and so we’ve been very excited to get it into the hands of a wider range of both businesses and individual consumers,” said Daniela Amodei, co-founder of Anthropic.
Not connected to the internet
Although Claude 2 shares similarities with the Bing AI chatbot, it is not connected to the internet and is trained on data only up to December 2022, so it does not have access to the latest information or developments. Its dataset is still more up to date, however, than that of the free version of ChatGPT, whose training data cuts off in 2021.
Analyzing books and papers
Another notable feature of Claude 2 is its expanded context window, which can accommodate approximately 75,000 words. This allows users to upload lengthy documents, such as novels or research papers, for Claude to analyze, and to quickly obtain summaries of complex texts.