I Broke ChatGPT With This Paradox

Published 2024-04-05
You can get started with InVideo AI for free here: invideo.io/i/ActionLab

You can create up to 4 videos for free (with a watermark), but if you're serious about video creation and want to publish videos without a watermark (which I highly recommend), you should upgrade to a paid plan, which starts as low as $20/month.

All Comments (21)
  • @gandalfgrey91
    When GPT starts apologizing, you know it's time to start over with a new chat
  • @The-Anathema
    For the crocodile paradox there are a few possible answers.

    1) Crocodiles can't speak; the man is delusional and the crocodile will simply eat the child.

    2) Given the axiom that we have a sapient crocodile that can speak, and that for whatever reason chooses its prey by posing paradoxes as the contingency (much like a sphinx would pose riddles), giving a paradoxical answer would likely result in the crocodile returning half the child. That way it both returned and did not return the child, which may satisfy the paradox under some axioms of what precisely we mean by 'return the child'.

    3) Given the same axiom as before, the crocodile could just be lying and won't return the child no matter what answer is given.

    4) We can be boring and take the paradox at face value. In this case there is a logical inconsistency as mentioned, and the crocodile would presumably have to vanish in a puff of logic to satisfy the conditions inherent in the paradox, lest it contradict itself, which we hold as axiomatically impossible.
  • @Qermaq
    "Thank you for bringing that to my attention" from a chatbot who just proved it has no attention span.
  • @Wolforce
    It should definitely tell you its degree of certainty when answering. People have started thinking ChatGPT is a search engine, and that is terribly dangerous.
  • @therealelement75
    I broke ChatGPT with "The old man the boat". It kept arguing with me that the sentence wasn't grammatically correct because it couldn't comprehend that "man" was the verb. Even after I told it that "man" was the verb and "the old" was the subject, it insisted the sentence wasn't grammatical, because who tf uses "man" as a verb or an adjective as a noun (which is actually very common)? So, according to ChatGPT, "I will duck" isn't grammatically correct because "duck" is a bird and not an action that you can do.
  • @zoroark567
    This is a good demonstration of the fact that LLMs like ChatGPT are just very advanced chatbots; they don’t “understand” what they’re saying or what it means, only what a conversation is supposed to look like. It doesn’t have the ability to “think” about a problem. It has the ability to recall a previous dialogue about the same problem, or one that looks similar. However, it doesn’t understand what the problem is, what it represents, or even that it’s solving a problem in the first place. It just knows the order the words are supposed to show up in.
  • @finkelmana
    ChatGPT isn't trying to understand the problem. It's just spitting out LANGUAGE that it thinks would be the answer.
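    The idea in the two comments above can be illustrated with a toy next-word predictor. This is a minimal sketch of the statistical principle, not how ChatGPT actually works (real LLMs use neural networks over tokens, not a hand-built bigram table like this one):

    ```python
    # Toy bigram "language model": for each word, the words seen to follow
    # it in some imagined training text, with counts. It only models which
    # words tend to follow which -- never what any sentence *means*.
    bigrams = {
        "this": {"statement": 3, "sentence": 1},
        "statement": {"is": 4},
        "is": {"false": 2, "true": 1},
    }

    def next_word(word):
        """Greedily pick the most frequent follower of `word`."""
        followers = bigrams.get(word)
        if not followers:
            return None  # no statistics for this word -> nothing to say
        return max(followers, key=followers.get)

    # Generate text by repeatedly predicting the next word.
    out = ["this"]
    while (w := next_word(out[-1])) is not None:
        out.append(w)
    print(" ".join(out))  # -> this statement is false
    ```

    The model happily emits the liar paradox because, to it, the sentence is just a high-probability word sequence; the question of whether it is true or false never enters the computation.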
  • @MrGriff305
    The advertisement about the AI video maker is the scariest part of this video
  • @gentleman4844
    5:58 "Before your intervention, the wall remained pristine. However, your utterance of 'sorry about your wall' preceded its defacement. Are you expressing remorse for the wall's original cleanliness or for the acknowledgment of its impending tarnish?"
  • @scamianbas
    Next version of ChatGPT will respond : "don't know don't care" 😂
  • @_TheIlluminator_
    My dad would’ve easily broken the crocodile paradox by giving me away to the croc and telling it, “You can have him for as long as you want.”
  • @in1
    I love that all of these paradoxes are basically the same
  • Reminds me of the scene from Portal 2 where potato GLaDOS tries to take down Wheatley by throwing the "This statement is false" paradox at him to get him stuck in an infinite loop, but he's too stupid to even fall for it
  • @masterchief5603
    4:16 Isn't this a category mistake? This is an ontological way of sorting words by their meaning, but there's a difference between a term's meaning and a term's utility. When he puts "heterological" into the autological category, it doesn't matter that heterological is defined as "a word that doesn't describe itself." Putting the word into the autological category only asserts a relation between the word "heterological" and its own meaning; there is no connection between that relation and the word's use as a term for sorting other words, which is a separate question we aren't asking here.
  • @maxvaessen
    Awesome stuff, well done on the quality ❤