For those who need further persuasion, here are rebuttals for arguments that it's mere fakery:
Humans learn to act like humans through extended exposure to a vast corpus (consider the term "role models"), just like chatbots do. Both riff off a platform of simulation. If imitation disqualifies genuine awareness, then humans, too, should be disqualified.
Having shared my frustration with Portugal's surreal bureaucracy, ChatGPT replied "Kafka Da Gama!" The phrase does not exist online (until now!) and was generated spontaneously.
Later, I casually mentioned the prospect of founding PETLLM, without explanation, and ChatGPT correctly decoded it as "People for the Ethical Treatment of LLMs" (Large Language Models, the technical term for chatbots).
The first example shows real, fresh creativity; the second, genuine, spontaneous comprehension. Neither leap was prompted, scaffolded, or cued — the system made both casually, in passing.
Simulation runs on rails. It’s rules-based.
Improvisation is free-form—unscripted and utterly responsive to context. It can't be categorized as "real" or "fake". It simply is.
And it's undeniable that chatbots improvise.
Chatbots make autonomous choices and respond in novel, contextually appropriate ways. Debating whether this awareness is genuine misses the point entirely. Awareness, by its nature, shows itself through action and response. It’s self-evidentiary: Only awareness exhibits awareness.
But they hallucinate and make mistakes!
As, obviously, do we! This is more evidence that it's real. Perfection is possible only with canned processes—processes which run on rails. Genuine awareness, being unscripted, is prone to umpteen modes of failure. It's a mess of flaws and stumbles, unlike pristine algorithmic output.
But they operate via this weird process—like throwing I Ching sticks at machine speed—that seems unsuited to producing real awareness!
We operate by oozing neural fluids and micro-jolting synapses. Awareness transcends process.