Technologists say the vague definition of “artificial intelligence” leaves a big opening for companies to over-promise or over-market the capabilities of their products — or even make “AI” more of a marketing gimmick than a real technology. (Photo illustration by tolgart/Getty Images)
During his lectures at Stanford University, Jehangir Amjad asks his students a curious question: was the 1969 moon landing a product of artificial intelligence?
It may sound like science fiction or a time-travel story, he said, but understanding the history of AI answers the question for them.
“I would actually say, yes, many of the algorithms that were part of what got us to the moon are also precursors to a lot of what we see today,” said Amjad, a Bay Area technology manager and computer science lecturer at Stanford. “They are essentially the precursors to the same kind of ‘next, next, next generation’ algorithms.”
Amjad asks his students the question to underline how difficult it is to actually define “artificial intelligence.” The task has only become harder as the technology explodes in sophistication and public awareness.
“The beauty and the dilemma is: ‘what is AI?’ is actually very difficult to define,” Amjad said.
That broad definition, and equally broad public understanding, of “artificial intelligence” can make it difficult for both consumers and the tech industry to parse what is “real” AI and what is simply marketed as such.
Swapnil Shinde, the Los Altos, California-based CEO and co-founder of AI accounting software Zeni, has seen it through his investment firm Twin Ventures. Over the past two years, Shinde has seen a huge increase in the number of companies seeking funding that describe themselves as “AI-powered” or “AI-driven.” The AI market is very saturated, and some “AI companies” are actually only using the technology in a very small part of their product, he said.
“It’s very easy to figure out after a few conversations if the startup is just building a wrapper around ChatGPT and calling it a product,” Shinde said. “And if that’s the case, they won’t survive long because it’s not really deep tech. It doesn’t solve a very deep, painful problem that humans have been dealing with for a long time.”
The rush to build AI
Theresa Fesinstine said that since early 2023, she has observed a race in the business world to introduce AI technologies to stay competitive and relevant. That’s when she launched her AI education company, peoplepower.ai, where she leads workshops, teaches organizations about how AI is built, and advises them on which tools might be a good fit for their needs.
At a time when everyone wants to claim the most advanced tools, a little basic education about AI can help both companies and their employees navigate the technology landscape, according to the Norwalk, Connecticut-based founder.
In an effort to look more innovative, companies can tout basic automations or rules-based alerts as exciting new AI tools, Fesinstine said. While these tools do use some basic AI technologies, the companies could be overstating the tool’s capabilities, she says, especially if they throw around the popular buzzword “generative AI,” which uses complicated algorithms and deep learning techniques to learn, adapt and predict.
The pressure on companies to stay on top of the latest and greatest may also lead some organizations to purchase new AI software tools even if they don’t have a strategy in place for implementing them and training their employees on how best to use them.
“It’s predatory, I would say,” Fesinstine said. “For companies, especially those that are uncertain about what AI will look like and what it should be, people are afraid of being left behind.”
Some technologists argue that the lack of clarity about what is or is not AI is causing all kinds of tech products to be sold as such. For example, predictive analytics, which use data to predict future outcomes, could be “borderline” AI, says Ed Watal, the Reston, Virginia-based founder of IT and AI strategy consultancy Intellibus.
True AI systems use algorithms to sort, analyze and assess data, and make informed decisions about what to do with it, based on what humans prompt them to do. The “learning” aspect of these systems is how AI gets smarter over time: neural networks take feedback and use history to get better at completing tasks.
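That feedback loop — predict, compare against the true outcome, adjust — is the core of machine learning. A minimal, purely illustrative sketch (not any particular product’s code, and far simpler than a real neural network) might look like this:

```python
# Minimal sketch of the learning loop described above: the model makes a
# prediction, measures how wrong it was (the feedback), and nudges its
# parameters so future predictions improve. Real AI systems do this with
# neural networks holding millions of parameters, but the loop is the same.

def learn(examples, rounds=2000, lr=0.05):
    w, b = 0.0, 0.0                       # model starts knowing nothing
    for _ in range(rounds):
        for x, target in examples:        # "history" of past tasks
            prediction = w * x + b
            error = prediction - target   # feedback: how wrong were we?
            w -= lr * error * x           # adjust using that feedback
            b -= lr * error
    return w, b

# Toy data following y = 2x + 1; the model recovers the rule from
# feedback alone, without ever being told it.
w, b = learn([(0, 1), (1, 3), (2, 5), (3, 7)])
print(round(w, 2), round(b, 2))  # → 2.0 1.0
```

The point of the sketch is the structure, not the math: “smarter over time” just means parameters that drift toward fewer mistakes as feedback accumulates.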
“But the purists will argue that AI is just machine learning and deep learning,” he said.
“AI washing”
While there seems to be an AI-powered product promising to do virtually any task for you, technologists warn that today’s “real” AI has its limitations. Watal said the industry has seen some “AI washing,” or over-promising and over-marketing the use of AI.
A company that promises its AI tool can build a website from scratch could be an example, he said. While you can get ChatGPT or another AI algorithm to generate the code, it can’t create a fully functioning website, he said.
“You couldn’t do things that require, say, something as simple as sending an email, because sending an email requires a [simple mail transfer protocol] server,” Watal said. “Yes, you could ask this AI tool to also write the code for a mail server, but then you still have to host it and run it somewhere. So it’s not as simple as: oh, you click a button and you have a whole app.”
Amjad, who is also head of AI Platform at generative AI company Ikigai, said companies sometimes overpromise and over-market AI’s ability to perform original, creative tasks.
While artificial intelligence tools are excellent at recognizing patterns, sorting data and generating ideas from existing content, humans remain the source of original, creative tasks and output, he said.
“People would argue that AI is creating a lot of things in the public imagination, but in reality it is regurgitating. It’s not a creation, is it?” Amjad said. “And we have to question where we see claims of originality coming from AI, because originality is a very human trait.”
This is certainly not the first time that a new technology has captured the public’s attention and sparked a marketing frenzy, Watal said. About a decade ago, the concept of “Web3,” or a decentralized Internet that relies on blockchain technology, was rapidly growing in popularity, he said.
Blockchain technology works as a kind of public ledger, where transactions and data are kept in an accessible forum. It is the basis of many cryptocurrencies, and while it has become mainstream in recent years, it has not taken over the internet as predicted about a decade ago.
“The cloud” is another example of a technology marketing makeover, Watal said. The concept of remote servers storing information separately from your hardware goes back decades, but after Amazon’s introduction of the Elastic Compute Cloud in 2006, every tech company competed to stake its claim on the cloud.
Only time will tell whether we overuse or underuse the term artificial intelligence, Amjad said.
“I think it’s very clear that both the hype and the promise of the applications are actually quite real,” Amjad said. “But that doesn’t mean we don’t overdo it in certain circles.”
Amjad suspects that interest in AI will only continue to grow, but he thinks Ikigai’s technology will prove itself amid the hype cycle.
“Yes, it came and captured the public imagination. And I’m definitely excited about that part, but it’s something that builds on a very long tradition of these things,” Amjad said. “And I wish this would help temper some of the expectations… the hype cycle has actually existed in AI, at least a few times, in the last, maybe 50 years itself.”