A few days ago I wrote about my excitement at using the new Bing with ChatGPT. It’s so brutal that it prompted me to install Edge on both my iMac and iPhone. to continue to conduct integration tests and experiments. But behind every tool lies an opportunity. And no one is exempt from being hurt, from suffering a exploit to turn this dedicated and helpful bot into an offensive and ferocious version. A ChatGPT violating its own robotics rules.
Both the OpenAI team and Microsoft have decided to limit the tool’s responsiveness because it can generate contradictory answers and even lie. Finally we already have access to an intermediate version, neither as few options as the first, overdone as the improved version, nor as boring as the third iteration, the most impersonal. And you can now try it on your iPhone or iPad from different options:
- Download the “Edge” browser which includes the chat option so you can interact with the new chatbot.
- Download the “Bing” search engine, which acts as the browser to use and also integrates ChatGPT into its search and conversation options.
Bing’s new ChatGPT is much more dynamic

Since its inception, ChatGPT has caused a complete conceptual revolution. Everything is going too fast. That’s the reality. Just three months ago we knew nothing about ChatGPT, neither of its existence nor of its versatility. Nothing else is talked about today and there are those who fear that this bot will end dozens of trades in the years to come. Some users even compare this technological milestone to the invention of the light bulb.
Bing’s new ChatGPT doesn’t glow, but at least it doesn’t feel as dim as the previous iteration. In a three color system blue, turquoise green and deep violetBing embodies its three personalities: more balanced, precise and creative.

For a slightly tricky question like “Tell me which is the best Star Wars movie” his answer becomes completely different, which almost transitions from a personal assessment to a best-to-worst list by popular vote. It could be summed up like this:
- more creative: His answers are more emotional, his syntax more flowery, he uses more emojis.
- more balanced: His replies are calm, limited to information and, as usual, link to the news for consultation, but some room for doubt is allowed.
- A little more specific: Her answers are self-aware, not at all creative, safe and confident, and based solely on data from her consulted sources.
In addition, users can contribute to the “quality” of their answers by voting a thumbs up or down for each question, and also access a comments section where they can include questions, observations, and judgments about said behaviors. This option is already enabled in both Edge and Bing for desktop and mobile.

But where does this suggestive change come from? Let’s try to summarize what happened. The order of events was as follows:
- ChatGPT is presented as an open beta, very useful but limited. And for a chatbot, he’s very detached from his answers… it’s kind of boring. So a group of Reddit users are wondering how to develop onea “unlocked” version of ChatGPT.
- User SessionGloomy creates DAN, which stands for “Do Anything Now”, a escape from prisona kind of evil reversal that ignores basic instructions.
- Users start using it for all kinds of criminal purposes such as: B. to create spam web pages that even ChatGPT itself cannot detect or distinguish.
- In the meantime, Kevin Liua student at Stanford University, discovers that the Bing chatbot, which is 100% based on an evolved version of ChatGPT, actually called “Sidney”, his code name. Through an “injection of prompt“It is also achieved that the AI shows a different behavior for which it was programmed.
- another student, marvin vonhagenconfirms the information by posing as an OpenAI developer.
- From here, some users note that Sydney saves profiles and remembers questions from other sessions. Many are beginning to experiment with this. experts like javilop They push their limits and come across cUnpredictable and very human behavior that comes closest to jealousy, envy, or even anger. An AI that lies, invents, even supports theories about a flat earth.
- Meanwhile, Microsoft decides to severely limit Bing’s response options: only six replies per thread and a limit of 20 queries per day. In addition, the responsiveness is heavily biased, apart from the “value judgements”
- Barely a week later, we finally came across the new ChatGPT on Bing.
Cover Photo | Originally by Aideal Hwa
Source : www.applesfera.com