5 Reasons the Tech World Is Buzzing about OpenAI’s GPT-3
You may have heard: AI is now at the point of taking over the internet. You will never need to read, write, think, code, or work again. Coders, retire. Writers, go back to school. It’s over. It’s done.
OK, not quite.
But the recent hype over OpenAI’s new neural-network-powered language model, GPT-3, may have given you that idea. The British newspaper The Guardian even published an article written entirely by GPT-3, titled “A robot wrote this entire article. Are you scared yet, human?” The byline was simply “GPT-3”.
So what is this newfangled technology? And is it as impressive, as important — or as intimidating — as so many in the media are making it out to be?
The short answer is: no.
What is GPT-3?
Created by OpenAI, a research and development company co-founded by Elon Musk in 2015 (it began as a not-for-profit enterprise but now operates a for-profit arm and partners with Microsoft), GPT-3 is, in short, a language prediction model. It’s the most powerful and capable language predictor ever created. And when we say language, we mean any human language: Spanish, Klingon, or PHP.
GPT-3 is built to find patterns in data and to use those patterns to complete prompts.
It works by finding patterns in whatever data it’s trained on and using those patterns to complete the task a human user sets. Give GPT-3 ten or fewer “training examples” (sentence prompts, guidelines, text samples) and it can extrapolate a nearly endless supply of finished products: applications, short stories, bodies of code, even blog articles. (Not this article, though!)
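GPT-3 itself lives behind OpenAI’s hosted API, but the few-shot idea is easy to sketch: you concatenate a handful of example pairs into one text prompt, add a new input, and let the model continue the pattern. Here’s a minimal illustration in Python of the prompt-building step only; the model call is omitted, and the task name and example pairs are made up for illustration:

```python
# Few-shot prompting: show the model a handful of input -> output
# examples, then a new input, and let it continue the pattern.
# The task and example pairs below are hypothetical.

def build_few_shot_prompt(examples, new_input, task="English to French"):
    """Concatenate labeled example pairs into a single text prompt."""
    lines = [f"Task: {task}", ""]
    for source, target in examples:
        lines.append(f"Input: {source}")
        lines.append(f"Output: {target}")
        lines.append("")
    lines.append(f"Input: {new_input}")
    lines.append("Output:")  # the model would complete from here
    return "\n".join(lines)

examples = [("cheese", "fromage"), ("bread", "pain")]
prompt = build_few_shot_prompt(examples, "water")
print(prompt)
```

That final string, examples and all, is the entire “training” GPT-3 receives for the task: no retraining, just a prompt.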
GPT-3 is able to do this, and to do it so impressively, because it’s the most comprehensive AI model ever created. As Rob Toews wrote in Forbes earlier this year, size alone is the determining factor in this power: “the model has a whopping 175 billion parameters. To put that figure into perspective, its predecessor model GPT-2 — which was considered state-of-the-art and shockingly massive when it was released last year [in 2019] — had 1.5 billion parameters.”
Because of that sheer number of parameters, GPT-3 can produce text that is incredibly detailed, thorough, and, well, human. Hence the media’s general flip-out.
5 Ways GPT-3 Is Revolutionary, But You Still Shouldn’t Worry
- GPT-3 Can Make Code, but...
It can’t solve problems. Developers’ work isn’t just plugging in blocks of code, or copying and pasting pre-written code from other places. As we’ve often written here, the job of developers is often more about navigating changing human needs, understanding psychology and their clients’ projects, and responding to those as they evolve. Can GPT-3 help with these processes? Of course! But can it perform without a human standing guard? No, not yet.
- GPT-3 Can Make Coding More Efficient But…
It can’t do high-level coding or develop new languages as new needs evolve. Sure, developers early in their careers may find some of their rote tasks are no longer needed. That might even mean some low-level coding jobs disappear. But the result should be that more developers up their game, learn more, study harder, and get on to the bigger stuff faster. No AI can replace creativity and critical thinking. So let us humans focus on our human stuff!
- GPT-3 Is Nothing New, But…
It’s a lot more of what was old. In basic terms, GPT-3 introduces no fundamentally new technology over GPT-2. It simply trains on and reproduces so much more language that it performs at a scale GPT-2 never reached. Because both models use statistical analysis to predict which words should come next, based on the massive body of text they have “consumed,” GPT-3 seems far more intelligent than GPT-2. In reality, it’s just drawing on a significantly larger store of data.
- GPT-3 Does “Zero-Shot Learning,” And…
It does so faster and on a greater scale than anything before it. Zero-shot learning, in short, is when you can prompt an algorithm to do something without first teaching it how. As Przemek Chojecki writes on BuiltIn, “you could, for example, give [GPT-3] ‘Summarize’ as an input and provide a text that you wanted a synopsis of. GPT-3 will understand that you want a summary of the text without any additional fine-tuning or more data.” This means work that was previously slow and tedious could be a lot quicker with GPT-3; it doesn’t mean we’re ready to replace humans with machines.
- GPT-3 Might Seem Scary Now, But…
Most futurists and tech writers (once the original euphoria wore down) agree that the ultimate effect of GPT-3 will not be the obsolescence of human developers and writers, but the democratization of technology and technical skill. No longer will basic coding and development demand years of learning and study; rather, GPT-3 will be an accessible option for many entrepreneurs, designers, and others seeking to create technological solutions to problems they face every day. So instead of taking over, it’ll be more like an excellent tool more people can use. Less like an overlord, and more like a can opener. (Just a very fast one that also writes code!)
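The zero-shot pattern Chojecki describes boils down to prefacing raw text with a plain-language instruction, with no examples and no fine-tuning. A minimal sketch, assuming an instruction format of our own invention (GPT-3 accepts free-form text, so there is no fixed syntax):

```python
# Zero-shot prompting: a bare instruction plus the raw text -- no
# worked examples, no fine-tuning. The "Instruction:" framing here
# is our own illustrative convention, not a fixed API.

def build_zero_shot_prompt(instruction, text):
    """Prefix the text with a plain-language task instruction."""
    return f"{instruction}:\n\n{text}"

article = ("GPT-3 is a language model with 175 billion parameters, "
           "trained by OpenAI to predict the next word in a text.")
prompt = build_zero_shot_prompt("Summarize", article)
print(prompt)
```

Contrast this with the few-shot case: here the model gets only the word “Summarize” and the text itself, and is expected to infer the task from the instruction alone.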
While the future of GPT-3 is still uncertain, we’re confident it’s something we should be excited about — even those of us who work in the development space, even those of us jealous of the technology's whopping number of parameters. It may change our work, and it may be a bit scary. But almost certainly it’ll change our work for the better, and make our world better as well in the process. Which is a different kind of scary prospect — a scary-good one.
If you want to stay up to date with all the new content we publish on our blog, share your email and hit the subscribe button.