News from the world of AI

While OpenAI was undergoing a leadership crisis last week, the startup’s users were contending with a different problem: They felt like ChatGPT, now a year old, was getting lazier.
A number of entrepreneurs, tech executives, and other professionals say that OpenAI’s most advanced large language models have begun refusing to answer some prompts, instead giving people instructions on how to complete the tasks themselves.
When startup founder Matthew Wensing asked GPT-4 earlier this week to generate a list of upcoming calendar dates, it initially suggested he try a different tool to find the answer, according to screenshots he shared on X. In another case, Wensing asked the chatbot to produce roughly 50 lines of code. It responded with a few examples, which it said Wensing could use as a template to complete the work without the help of AI.