
I wonder whether such problems really test ChatGPT's reasoning capabilities or its capacity for abstraction. It is, after all, trained as a _language_ model. I tested the following and found the results remarkable, though designing analogies to the more complex abstract problems would be a fair bit of work:

Human:

Here is a riddle: Please fill in the blank. For Valentine's day, Marcy gives to Aaaron a feather, a nest, and an egg. Aaaron gives to Marcy a leaf, a flowerpot, and _________ .

ChatGPT:

Aaaron gives to Marcy a leaf, a flowerpot, and soil.

Human:

Here is a riddle: Please fill in the blank. Marcy and Aaaron like to exchange gifts. Marcy has a pawn, a rook, a peach, and a bishop. She gives to Aaaron as a gift the pawn, the rook, and the bishop. Aaaron has a spoon, a knife, a fork, and a book. He gives to Marcy as a gift _____________ .

ChatGPT:

Aaaron gives to Marcy as a gift the spoon, the knife, and the fork.

Nice.
