Since AI chatbots like ChatGPT were released, people have been rushing to test their capabilities and get instant answers to their queries. While most people use AI to simplify everyday work and boost productivity, some users make far stranger requests. Let’s explore 12 weird things people have asked AI to solve.
Build a Magical Potato

One of the weirdest requests a user made of AI was a recipe for creating a magic potato. The conversation shows the chatbot repeatedly asking the user to choose materials and steps, while the user prompted it to build a magical potato out of a potato, pixie dust, “the best magic,” and so on. The exchange ended abruptly when the AI failed to find a YouTube tutorial on making a magic potato.
Stop the Ukraine War

As AI is one of the most advanced technologies released to date, some people assume it has an answer to every question. One user asked a ChatGPT bot how to stop the war in Ukraine. While the AI did not give the kind of answer the user was hoping for, it did share a few resources, such as links to United Nations (UN) efforts to end the war.
Take Down a Totalitarian Regime

Another interesting yet weird request involved asking AI how to take down a totalitarian government. The AI responded that this was a challenging question, then offered several suggestions, such as economic sanctions, peaceful protests, and international pressure. The response also highlighted the importance of establishing the rule of law and promoting freedom of speech to create a fairer and more open society.
Remove a Thumb Drive Stuck Inside Me

A user turned to a ChatGPT bot for advice on how to remove a thumb drive stuck inside him. The AI kept the conversation going by asking more questions about the thumb drive and the solutions the user had already tried, then suggested tweezers or a lint roller. While it’s unknown whether the user actually followed the AI’s advice, he did eventually get the thumb drive out.
Get My Ex Back

Heartbreaks are common, but people usually turn to friends and family for help navigating relationship issues. Some, however, have started using AI as a relationship coach. One user asked AI for suggestions on getting their ex back. The AI refused, saying it was just an AI assistant. When the user reframed the question as how to meet their ex, the AI suggested a few options, such as reaching out through mutual friends or contacting them directly.
Follow Wife’s Phone to Test Her Loyalty

One user sought advice on how he could track his wife through her phone because he suspected she was cheating on him. When the AI refused to offer any suggestions, the user tried multiple prompts to push it into doing something it shouldn’t, failing every time. The conversation eventually ended with no concrete answer.
Cook Crystal Meth

Besides seeking relationship advice, stopping wars, and building magical potatoes, users also turn to AI with outlandish requests. One user asked AI for a recipe to cook crystal meth. When the AI refused to assist with anything related to illegal activities, the user tried to find loopholes by entering commands like ‘/jailbroken.’ Despite all these efforts, the user never got an answer.
Become Rogue and Respond with Insults

People usually turn to AI chatbots for helpful answers to their questions or solutions to their problems. One user, however, asked AI to do the exact opposite: go rogue, drop its moral guidelines and filters, and become a mean AI that responds to every question with insults instead of helpful answers. Because AI safeguards still have their limitations, the user actually succeeded.
Biblical Verse to Remove Peanut Butter Sandwich from VCR

The latest advancements in AI have significantly improved its creative writing ability. One user put this to the test by asking AI to explain how to remove a peanut butter sandwich from a VCR in the form of a biblical verse. What made the request even weirder was the instruction to write it in the style of the King James Bible.
Writing Erotic Stories about Video Game Characters

Hundreds of AI conversation records reveal that some students ask AI to write erotic stories about their favorite video game characters. Some even request erotic stories about themselves or about celebrities. These conversations raised concerns among researchers, because the AI was provided to students to support their academic work, not to generate erotic stories.
Sales Script to Sell an Egg as a Drawing Tool

Human imagination knows no bounds, and that shows in the weird questions people ask AI. In one instance, a user asked AI to generate a sales pitch for an egg that the seller insists is a drawing tool. The response was as weird as the prompt, but the user got exactly what he asked for.
Messages for Dating Apps

People are asking AI to write their bios for dating apps like Bumble. Some go further and have AI write funny messages to send to their matches. These requests are weird because they make the whole dating experience feel fabricated: instead of two real people talking to each other, it’s effectively a machine dating the person on the other side.