Blog: First Eagle Commentary – Artificial Intelligence: Back to the Future – GuruFocus.com
Nurtured by ever-cheaper computing power and the datafication of modern life, the rate of advancement in technologies based on artificial intelligence (AI) and offshoots like machine learning has accelerated sharply in recent years. While we aren’t yet being swiftly chauffeured across unclogged streets by autonomous cars as our robot butlers mind the house, AI-related technologies currently are improving our everyday lives in subtler ways that we’ve mostly come to take for granted; everything from the autocorrect on your most recent email to the latest movie recommendation from your streaming service is driven by AI and the machine-learning algorithms that continuously refine it.
Since its emergence in the mid-20th century, however, AI repeatedly has fallen victim to its own hype; as the practical reality of the technology’s application inevitably fails to meet more fantastic expectations, enthusiasm and funding subside. While such an “AI winter” may or may not be the outcome of the current hype cycle, we expect AI and machine learning will continue to impact the investment environment regardless—and not just for the big tech names and targeted startups. AI is already driving improved productivity across industries, but the economic returns it provides are not always large. We are still in the early days of the technology, though, and the companies that smartly integrate AI-based technologies to improve processes, develop new products and better connect with their customers may be well positioned for the evolving competitive landscape—to the potential benefit of their shareholders.
Fertile Ground for AI Advancement
The terms “artificial intelligence” and “machine learning” are closely related. While artificial intelligence is a blanket term covering a wide range of disciplines related to a machine’s general ability to perform a task while exhibiting characteristics of human intellect, machine learning is an automated process by which AI is developed and refined without human intervention. By identifying patterns in data—massive amounts of data that require massive amounts of computational power to process—applicable to a predetermined task, a machine-learning algorithm dynamically adjusts to improve its performance of that task.
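The loop described above—an algorithm that adjusts itself from data to improve at a predetermined task—can be sketched in a few lines of code. The example below is a minimal, illustrative fit of a straight line by gradient descent; the data, parameter names and learning rate are our own assumptions, not drawn from the commentary.

```python
# Illustrative sketch only: a model starts with no knowledge, measures
# its error on a task (predicting y from x), and repeatedly nudges its
# parameters to reduce that error. All numbers here are made up.

def train(xs, ys, steps=2000, lr=0.01):
    w, b = 0.0, 0.0  # model begins knowing nothing about the data
    n = len(xs)
    for _ in range(steps):
        # gradient of mean squared error with respect to each parameter
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        # "dynamically adjust" the parameters to perform the task better
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]  # samples generated by the rule y = 2x + 1
w, b = train(xs, ys)  # the algorithm recovers roughly w=2, b=1
```

No human tells the program that the rule is y = 2x + 1; the pattern is extracted from the data alone, which is the essence of the automated refinement described above.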
Today we seem to be in the midst of an AI renaissance, with the accelerating growth in AI technologies and applications enabled largely by three key forces:
Lower costs/greater power. Plummeting computing and storage costs have made the computationally intensive and data-hungry algorithms that enable machine learning more economically feasible. Hard-disk storage costs have declined by about 12% over the past decade according to Gartner, for example, while industry data show that the cost per unit of computing power has fallen by a factor of 1,600.¹
At the same time, there’s been an arms race of sorts among chipmakers to develop faster, cheaper microprocessors in support of the intense computational demands of machine learning. While CPUs (central processing units) had for years been the primary source of a computer’s processing power, GPUs (graphical processing units, originally designed to render video game images) have become a popular addition given their ability to perform millions of mathematical operations in parallel. More recently, companies have begun to roll out semiconductors optimized specifically for machine-learning tasks, from tailored CPUs and GPUs to innovations in FPGAs (field-programmable gate arrays) and ASICs (application-specific integrated circuits).
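The parallelism that makes GPUs a natural fit for machine learning can be illustrated, by rough analogy, in ordinary code: the same arithmetic is applied across an entire array in one expression rather than one element at a time. The NumPy sketch below runs on a CPU, but the programming style it shows—one operation broadcast over millions of values—is the data-parallel pattern a GPU executes across thousands of cores. The array sizes and operation are purely illustrative.

```python
import numpy as np

# Two arrays of a million values each; the workload is one
# multiply-add per element, i.e. millions of identical operations.
a = np.arange(1_000_000, dtype=np.float64)
b = np.arange(1_000_000, dtype=np.float64)

# Sequential (CPU-loop) style: one multiply-add per iteration.
# Shown here on just the first five elements for comparison.
scalar = [x * 2.0 + y for x, y in zip(a[:5], b[:5])]

# Data-parallel style: a single expression over all million elements,
# analogous to how a GPU applies one instruction across many cores.
vectorized = a * 2.0 + b
```

The two styles produce identical results; the difference is that the second expresses the whole computation as one bulk operation, which is exactly the shape of work that parallel hardware accelerates.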
Rise of cloud computing. Cloud-computing platforms allow companies access to high-performance hardware and software infrastructure without the significant capital outlay necessary to build such systems on-premises. By allowing companies to “rent” access to applications, storage and processing power—and to shift the cost accounting from capital expenditures to operating expenditures in the process—cloud computing has lowered the barriers to adoption for a range of technologies, from everyday business applications like CRM (customer relationship management) to highly computation- and storage-intensive AI and machine-learning algorithms. Meanwhile, improvements in connectivity have enabled the transmission of huge datasets with minimal latency, making data storage and access in the cloud more efficient.
Datafication. The explosive growth of the digital world has resulted in immense volumes of structured and unstructured data capturing the minutiae of modern life, including everything from credit card transactions and cell phone tower pings to social media posts and CCTV images. The resulting availability of very large data sets has provided machine-learning algorithms with a treasure trove of information from which to learn. And this trove is growing rapidly; as shown in Exhibit 1, market intelligence firm IDC predicts that the “global datasphere”—the aggregate of all data created, captured or replicated worldwide—will spike from 33 zettabytes² in 2018 to 175 zettabytes in 2025.
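The IDC figures cited above imply a striking compound growth rate, which is straightforward to back out from the two endpoints. The short calculation below uses only the numbers in the text (33 zettabytes in 2018, 175 zettabytes in 2025); the ~27% annual rate it produces is our own arithmetic, not a figure from the commentary.

```python
# Implied compound annual growth rate (CAGR) of the global datasphere,
# derived from the IDC endpoints quoted in the text.
start_zb, end_zb = 33.0, 175.0   # zettabytes, 2018 and 2025
years = 2025 - 2018              # 7 years between the two estimates

cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # roughly 27% per year
```

In other words, the forecast assumes the world's data grows by more than a quarter every year for seven years running—context for why the article calls the trove's growth "rapid."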
About the author:
I’m a financial journalist with a Master of Science in journalism from Medill at Northwestern University.