Morgan Stanley | Research | Nov 14, 2017

AI and Semiconductors Rewrite the Future of Business

While industry explores the best business applications for AI, semiconductor companies race to meet the demands that will come from innovations like autonomous cars and virtual assistants.

While artificial intelligence (AI) has received plenty of attention in the press, many institutions and corporations are still trying to figure out what opportunities AI, machine learning and deep learning will provide for their businesses.

Most organizations view AI as a potential source of disruption for their business as well as a source of differentiation from their competition. At the moment, however, the conversations are focused on trialing what can be done with the technology, rather than widespread adoption. But while various industries explore the best uses for AI, semiconductor companies are racing to meet the demands that will come from innovations like autonomous cars, virtual assistants and process automation.

Machine learning operates in two phases, the learning phase and the execution phase, and each has different requirements for semiconductor chips. As these technologies develop, new generations of chips will be required to maximize efficiency and effectiveness for both learning and execution.
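
To make the two phases concrete, here is a minimal, self-contained sketch, illustrative only and using a tiny model rather than the large networks this article describes: the learning phase loops over data many times to fit the model's parameters, while the execution phase applies the finished parameters to a new input in a single, cheap pass.

```python
import numpy as np

rng = np.random.default_rng(0)

# Learning phase: fit a tiny model from data. This is the costly, iterative part.
X = rng.normal(size=(1000, 4))                 # synthetic training examples
true_w = np.array([1.5, -2.0, 0.5, 0.0])       # hidden "rules" to recover
y = (X @ true_w > 0).astype(float)             # labels implied by those rules

w = np.zeros(4)
for _ in range(500):                           # many passes over the data
    p = 1.0 / (1.0 + np.exp(-(X @ w)))         # current predictions
    w -= 0.1 * X.T @ (p - y) / len(y)          # gradient descent step

# Execution phase: apply the learned weights to a new input in one cheap pass.
x_new = rng.normal(size=4)
prediction = 1.0 / (1.0 + np.exp(-(x_new @ w)))
print("learned weights:", np.round(w, 2))
print("prediction for new input:", round(float(prediction), 2))
```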

Teaching a Computer to Think

The learning phase requires chips that consume vast amounts of power and can handle compute-intensive workloads. Initially, graphics processing units (GPUs), which were originally developed for computer gaming, were repurposed for machine learning.

Now, companies are creating their own custom chips, designed specifically for AI and machine learning. Server farms filled with these compute- and power-intensive chips work to figure out the rules of whatever they are exploring, be it a game like chess, a company's logistics or IT support.

Once a computer learns the rules, the execution phase allows machine learning to scale. Execution requires a relatively small chip by comparison, because the rules have already been worked out by an army of GPUs during the learning process. These chips can be built by the tens of millions and embedded in the devices we use every day.
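
One engineering step behind those small execution chips, not described in the article itself but standard practice, is compressing the trained model before it is deployed. The sketch below, with made-up weights, shows one common compression technique, quantizing 32-bit floating-point weights to 8-bit integers, which cuts memory by roughly four times and lets a chip use simpler integer arithmetic.

```python
import numpy as np

# Pretend these are the weights learned during the training phase.
weights = np.random.default_rng(1).normal(size=1000).astype(np.float32)

# Quantize: map the float range onto signed 8-bit integers.
scale = np.abs(weights).max() / 127.0
q = np.round(weights / scale).astype(np.int8)      # what the small chip stores

# At execution time the chip works with q (plus the single scale factor);
# dequantizing shows how little accuracy the compression costs here.
restored = q.astype(np.float32) * scale
print("size: %d bytes -> %d bytes" % (weights.nbytes, q.nbytes))
print("max weight error:", float(np.abs(weights - restored).max()))
```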

As the devices around us get smarter and more connected, we are seeing increased semiconductor content in everyday objects. These technologies are a continuation of what we call the Internet of Things (IoT): 3D sensors in smartphones that can be used for facial recognition, voice-controlled smart home devices that can answer questions, even internet search services. All of these devices (and more) rely on training through artificial intelligence, machine learning and deep learning.

The power of this learning is clear when you look at how programmers built a computer that could play the board game “Go.” When AI researchers first taught the system to play the game, it required scores of interconnected, power-intensive servers to learn how to play. The new version of that system uses a tenth of the computing power.

The computer learned the rules itself by playing the game against another computer, and it became faster and stronger. No human taught it how to play based on those rules. This is what's called deep learning. When it works, it is quite incredible, because the machine learns by itself, with very little human intervention.
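
As a toy illustration of learning through self-play, the sketch below uses a far simpler game than Go (the pile game Nim, with made-up settings) and a plain lookup table rather than a deep neural network; the learn-by-playing-itself idea is the same. Two copies of the same program play each other, the only feedback is who wins, and no human supplies a strategy.

```python
import random

# Nim: a pile of stones; players alternate taking 1-3; whoever takes the last wins.
N_STONES = 10
Q = {}                               # value table: (stones_left, move) -> score
ALPHA, EPSILON = 0.3, 0.2            # learning rate and exploration rate

def choose(stones, explore=True):
    moves = [m for m in (1, 2, 3) if m <= stones]
    if explore and random.random() < EPSILON:
        return random.choice(moves)  # occasionally try something new
    return max(moves, key=lambda m: Q.get((stones, m), 0.0))

for _ in range(20000):               # self-play: the program is both players
    stones, history = N_STONES, []
    while stones > 0:
        move = choose(stones)
        history.append((stones, move))
        stones -= move
    # The player who took the last stone won; credit moves backwards,
    # flipping the sign because the two players alternate.
    reward = 1.0
    for state, move in reversed(history):
        old = Q.get((state, move), 0.0)
        Q[(state, move)] = old + ALPHA * (reward - old)
        reward = -reward

# With 5 stones left, taking 1 leaves the opponent in a losing position.
print("learned move with 5 stones left:", choose(5, explore=False))
```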

Industry Learns to Use AI

Let's say a company is facing a logistics problem related to shipping goods. If machine learning or deep learning chips are applied to that problem, they might find a way to optimize the logistics, improve efficiency for that company and increase returns.
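
As a concrete illustration of the kind of shipping problem described here, with hypothetical warehouses, stores and costs, the sketch below allocates shipments at minimum cost. It uses classical linear programming via SciPy rather than a learned model; in practice, the machine-learned part would typically forecast the demand and cost figures that feed a step like this.

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],      # shipping cost per unit, warehouse -> store
                 [5.0, 3.0, 7.0]])
supply = [60, 40]                      # units available at each warehouse
demand = [30, 40, 30]                  # units needed at each store

n_w, n_s = cost.shape
c = cost.flatten()                     # decision variables x[w, s], flattened

# Each warehouse can ship at most its supply.
A_ub = np.zeros((n_w, n_w * n_s))
for w in range(n_w):
    A_ub[w, w * n_s:(w + 1) * n_s] = 1

# Each store must receive exactly its demand.
A_eq = np.zeros((n_s, n_w * n_s))
for s in range(n_s):
    A_eq[s, s::n_s] = 1

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand, method="highs")
print("total shipping cost:", res.fun)
print("shipments (warehouse x store):\n", res.x.reshape(n_w, n_s))
```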

But let’s take it a step further: Could AI someday use vast amounts of company sales data to help an apparel retailer determine amounts of product to produce?  Could it help banks replace a lost debit card without any human interaction?  Or help factories determine when machinery might fail using historical performance of equipment and advanced monitoring, ensuring a factory can run 24/7/365 with no downtime? Perhaps.

For companies, these new technologies may improve productivity in a meaningful way, particularly in sectors that are less digitized than others, such as capital goods, manufacturing, pharmaceuticals, healthcare and financial services. Many of these business sectors have a wide range of data flowing through their systems, and don’t necessarily use that data to its full potential.

Over the course of the next decade, artificial intelligence not only has the potential to supercharge the semiconductor industry; it could be transformative across all industries.

For Morgan Stanley Research on key Tech, Media and Telecommunications themes, ask your Morgan Stanley representative or Financial Advisor. You can also visit the TMT Barcelona 2017 page.