# GPT-3 vs. AGI
GPT-3 has been dominating recent tech headlines, and deservedly so. It uses deep learning to produce human-like text and represents a major achievement for OpenAI. Unfortunately, GPT-3's abilities have misled people into concluding that it is a major step on the road to Artificial General Intelligence (AGI). In no uncertain terms, it is not.
AGI is the hypothetical capacity of a computer to understand and learn any intellectual task that a human can. With its 175 billion learned parameters, GPT-3 parses its input, looks for similarities with previously learned information, and produces relevant output. In one widely reported demonstration, GPT-3 generated an entire article from just a few starter sentences.
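To make the "continue a prompt from learned statistics" idea concrete, here is a drastically simplified sketch. GPT-3 is a transformer with billions of parameters, not a bigram table; this toy model (corpus, function names, and prompt are all invented for illustration) shows only the core pattern of learning word-to-word statistics from text and extending a prompt with the most likely continuation.

```python
from collections import defaultdict, Counter

def train_bigram_model(text):
    """Count, for each word, which words were observed immediately after it."""
    words = text.split()
    model = defaultdict(Counter)
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def continue_prompt(model, prompt, n_words=5):
    """Greedily extend the prompt with the most frequent observed next word."""
    out = prompt.split()
    for _ in range(n_words):
        counts = model.get(out[-1])
        if not counts:  # no continuation ever observed for this word
            break
        out.append(counts.most_common(1)[0][0])
    return " ".join(out)

# Tiny made-up "training corpus" and prompt, purely for demonstration.
corpus = "the cat sat on the mat and the cat ran"
model = train_bigram_model(corpus)
print(continue_prompt(model, "the cat", n_words=3))  # → the cat sat on the
```

A real language model replaces the bigram counts with learned parameters over long contexts, but the interface is the same: text in, statistically plausible continuation out.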
Saying that GPT-3 is not AGI in no way diminishes its accomplishment or reduces its usefulness or applicability. But the bottom line is this: GPT-3, while impressive, still lacks the capabilities that AGI demands.
## The Road to Artificial General Intelligence Isn't Through GPT-3
MIT's Lex Fridman released a video, GPT-3 vs. Human Brain, that provides a case in point. In it, he compares the cost and information content of GPT-3 and the human brain. Dr. Fridman's YouTube channel is widely respected and widely viewed. While the video is not wrong, it implies that by scaling GPT-3 onto more powerful hardware, true intelligence might emerge.
Read more here: