ChatGPT users noticed a curious phenomenon: the popular chatbot refused to answer queries about a “David Mayer,” and asking it to do so caused it to freeze up immediately.

o1 is meant to tackle more advanced problems by spending considerably more time “thinking” before it answers, enabling it to check its answers and explore various approaches.