In this article, we look in more detail at the relationship between Artificial Intelligence (AI), in its Deep Learning component, and computing power or hardware, a connection we started exploring in our previous article, “When AI Started Creating AI”. With the foundations for understanding the link between AI-Deep Learning and computing power laid, the next article will focus on the political and geopolitical consequences of this relationship, while considering a critical uncertainty uncovered here: the evolution towards co-designing AI-Deep Learning architecture and hardware could alter the whole field.
Our aim is to better understand how computing power can be at once a driver, a stake and a force for AI expansion and the related emerging AI-world. Computing power is one of the six drivers we identified that not only act as forces behind the expansion of AI but also, as such, become stakes in the competition among actors in the race for AI-power (Helene Lavoix, “Artificial Intelligence – Forces, Drivers and Stakes”, The Red (Team) Analysis Society, 26 March 2018).
Artificial Intelligence, Computing Power and Geopolitics (2): What could happen to actors with insufficient HPC in an AI-world, a world where the distribution of power now also results from AI, while a threat to the Westphalian order emerges.
High Performance Computing Race and Power – Artificial Intelligence, Computing Power and Geopolitics (3): The complex framework within which actors’ responses in terms of HPC, given its crucial significance, need to be located.
Winning the Race to Exascale Computing – Artificial Intelligence, Computing Power and Geopolitics (4): The race to exascale computing, the state of play, its impacts on power and the political and geopolitical (dis)order, and possible disruptions to the race.
In this article we show that AI-Deep Learning indeed needs large computing power, although the need varies across the different phases of computation and evolves with improvements. Even though the advance of AI systems leads to a decreasing demand for computing power across the process of an AI system’s creation, the very search for optimisation not only demands more computing power, but also leads to changes in the hardware field (which we shall see in more detail in the next article) and even, potentially, in terms of algorithms. Meanwhile, more computing power also means the capability to go further in terms of Deep Learning and AI, confirming that computing power is a driver of AI expansion. Feedback loops, or rather spirals, are thus starting to appear between AI and its expansion, on the one hand, and at least two of its drivers, computing power and “algorithms”, on the other.
We first explain the methodology used to uncover the link between AI-Deep Learning (DL) and computing power in a rapidly evolving ecosystem, and point out two likely new frontiers in the field of Deep Learning, namely Evolutionary Algorithms applied to Deep Learning in general, and Reinforcement Learning. We also briefly present the three phases of computation of an AI-DL system. Then, we dive deeper into each of these phases: creation, training or development, and inference or production. We explain each phase and its needs in terms of computing power. We then move beyond categorisation and explain the constant quest for improvement across the three phases, pointing out the balance sought between key elements. There, we notably emphasise the latest evolution towards co-design of Deep Neural Network architecture and hardware.
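To give an intuition for why the training and inference phases place such different demands on computing power, the toy sketch below trains a minimal one-layer model in plain NumPy and roughly counts multiply operations. All numbers (epoch count, dataset size, the op-counting convention) are illustrative assumptions of ours, not figures from any real AI-DL system: training repeats forward and backward passes over the whole dataset many times, while inference on a new input is a single forward pass.

```python
import numpy as np

# Toy dataset: 64 samples with 8 features, binary labels (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))
y = (X[:, 0] > 0).astype(float)

w = np.zeros(8)
flops_per_pass = X.size  # rough count: one multiply per weight per sample

# Training phase: repeated forward + backward passes (gradient descent)
train_flops = 0
for epoch in range(100):
    p = 1.0 / (1.0 + np.exp(-X @ w))     # forward pass (sigmoid output)
    grad = X.T @ (p - y) / len(y)        # backward pass (gradient)
    w -= 0.5 * grad
    train_flops += 2 * flops_per_pass    # count forward + backward

# Inference phase: one forward pass on a single new input
x_new = rng.normal(size=(1, 8))
pred = 1.0 / (1.0 + np.exp(-x_new @ w))
infer_flops = x_new.size

print(f"training ≈ {train_flops} multiplies, inference ≈ {infer_flops}")
```

Even for this deliberately tiny model, the training loop performs thousands of times more multiplies than a single inference, which is the asymmetry, vastly amplified at the scale of real Deep Neural Networks, that shapes hardware needs across the phases discussed below.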
About the author: Dr Helene Lavoix, PhD Lond (International Relations), is the Director of The Red (Team) Analysis Society. She specialises in strategic foresight and warning for national and international security issues.
Featured image: ORNL Launches Summit Supercomputer on Flickr (Public Domain) 30 May 2018.