ChatGPT Mimics Human Reasoning in Solving Ancient Puzzle
(MENAFN) ChatGPT, an artificial intelligence-powered conversational tool, appeared to improvise and make human-like missteps while tackling a 2,400-year-old mathematical problem, according to a recent study by researchers from the University of Cambridge and Hebrew University.
The study examined the well-known “doubling the square” puzzle from Plato’s Meno, in which Socrates leads an untrained youth to double a square’s area by constructing a new square on its diagonal, rather than by simply doubling the sides.
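The report does not walk through the figures, but the arithmetic behind Socrates’ construction is a short exercise; in the sketch below, the side length s and diagonal d are illustrative labels, not notation taken from the study.

\[
\begin{aligned}
\text{Original square with side } s: \quad & A = s^2 \\
\text{Its diagonal (Pythagoras): } \quad & d = \sqrt{s^2 + s^2} = s\sqrt{2} \\
\text{Square built on that diagonal: } \quad & d^2 = 2s^2 = 2A \\
\text{Doubling the sides instead: } \quad & (2s)^2 = 4s^2, \text{ four times the area, not two.}
\end{aligned}
\]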
Academics wanted to determine whether ChatGPT would recall Plato’s time-honored method or formulate a unique strategy of its own.
Rather than offering the geometric answer right away, ChatGPT initially handled the question using algebraic techniques and made several miscalculations.
It was only after being prompted multiple times that it eventually arrived at the classical geometric solution.
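The article does not reproduce the chatbot’s algebra, but an algebraic route to the same answer presumably amounts to solving for the side x of the doubled square (x is an illustrative variable, not one quoted from the study):

\[
x^2 = 2s^2 \quad\Longrightarrow\quad x = s\sqrt{2},
\]

which is precisely the length of the original square’s diagonal, so the algebraic and geometric solutions coincide.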
When researchers presented alternate versions of the puzzle — like doubling the area of rectangles or triangles — the chatbot responded in similar fashion.
It tended to rely on algebraic reasoning, occasionally slipped up, but ultimately generated accurate geometric solutions.
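One standard way such variants can be doubled, not necessarily the route ChatGPT took, is to scale every side by \(\sqrt{2}\), since area grows with the square of the scale factor; for a rectangle with sides a, b and a triangle with base b and height h (all illustrative symbols):

\[
(\sqrt{2}\,a)(\sqrt{2}\,b) = 2ab, \qquad \tfrac{1}{2}(\sqrt{2}\,b)(\sqrt{2}\,h) = 2 \cdot \tfrac{1}{2}bh.
\]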
The investigators noted that ChatGPT’s replies resembled the behavior of a learner testing out ideas rather than a system merely retrieving stored facts.
