Building AI: Layers of Innovation that Shaped the Past, Present, and Future - Part 3

Omed Habib

September 8, 2023

In the third part of our five-part series, we explore the critical role of cloud computing in the evolution of AI. As AI and machine learning models became increasingly complex, the need for powerful, scalable infrastructure became imperative. This chapter delves into how cloud computing emerged as a game-changer, offering the necessary computational firepower and flexibility that were previously inaccessible to many researchers and small companies. We trace the journey from the prohibitive costs of pre-cloud computing, where advanced AI was the domain of the well-funded, to the post-cloud era, where startups and individual researchers gained unprecedented access to powerful computing resources. This democratization of technology not only leveled the playing field but also sparked a wave of innovation, making the development and deployment of advanced AI models a reality for a wider audience. We also look at the advent of hybrid and edge computing, and the impact of containerization and orchestration technologies like Docker and Kubernetes, which further simplified and accelerated AI development. This segment highlights cloud computing as the backbone that powers the ongoing AI revolution, transforming theoretical possibilities into practical achievements.

Layer 3: Cloud Computing and Scalable Infrastructure

The Problem: As AI and ML models grew more complex, the hardware requirements to run and train them escalated beyond the reach of most researchers and small companies. The challenge was not just having a powerful computer but having a flexible, scalable infrastructure that could handle the computational demands of advanced AI applications. Without such infrastructure, the development and deployment of AI models were limited to those with significant resources, stunting innovation and progress.

Cloud Computing Emerges

The emergence of cloud computing marked a turning point. Major providers like AWS, Azure, and Google Cloud began offering GPU and TPU (Tensor Processing Units) support, delivering the power needed to train large neural networks without owning physical hardware.

Before Cloud Computing

Imagine a small startup wanting to experiment with deep learning models. The cost of setting up and maintaining a data center with the necessary GPU support would have been prohibitive. Such limitations confined advanced AI experimentation to well-funded institutions and corporations.

After Cloud Computing

The same startup could now access state-of-the-art GPU clusters on a pay-as-you-go basis from cloud providers. This democratization of computational resources ignited a wave of innovation. Now, anyone from individual researchers to small companies could tap into powerful computing resources, scaling up or down as needed.
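The economics behind that shift can be made concrete with a back-of-envelope comparison. The sketch below contrasts owning a GPU server against renting one by the hour; every figure (hardware price, upkeep, hourly rate, usage) is a hypothetical placeholder chosen for illustration, not real vendor pricing.

```python
# Back-of-envelope comparison of on-premises vs. pay-as-you-go GPU costs.
# All dollar figures below are hypothetical placeholders, not real quotes.

def on_prem_cost(upfront_hardware: float, monthly_upkeep: float, months: int) -> float:
    """Total cost of owning a GPU server: purchase price plus ongoing upkeep."""
    return upfront_hardware + monthly_upkeep * months

def cloud_cost(hourly_rate: float, hours_used: float) -> float:
    """Pay-as-you-go: the startup pays only for the hours it actually trains."""
    return hourly_rate * hours_used

# A startup that trains for roughly 200 GPU-hours a month over 6 months:
own = on_prem_cost(upfront_hardware=30_000, monthly_upkeep=500, months=6)
rent = cloud_cost(hourly_rate=3.0, hours_used=200 * 6)

print(f"on-prem: ${own:,.0f}")  # $33,000
print(f"cloud:   ${rent:,.0f}")  # $3,600
```

The point of the sketch is the shape of the trade-off, not the numbers: for intermittent workloads, renting beats the large upfront commitment, and the gap closes only once utilization stays high for a long time.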

A real-life example is the training of deep learning models like BERT (Bidirectional Encoder Representations from Transformers). Training BERT on a single GPU would take an impractical amount of time. With the cloud, one can spin up a cluster of GPUs, slashing the training time and making the model's development feasible.
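How much a cluster actually helps can be sketched with a simple scaling model: ideal speedup grows with the number of GPUs, but inter-GPU communication eats into it. The baseline duration and the efficiency factor below are hypothetical illustrations, not measured BERT training numbers.

```python
# Illustrative estimate of how a GPU cluster shortens training time.
# The 90-day baseline and 0.75 efficiency factor are assumptions for
# illustration only, not measurements from any real BERT training run.

def estimated_training_days(single_gpu_days: float, num_gpus: int,
                            scaling_efficiency: float = 0.75) -> float:
    """Ideal speedup would equal num_gpus; real clusters lose part of each
    added GPU to communication overhead, modeled here by a flat factor."""
    effective_speedup = 1 + (num_gpus - 1) * scaling_efficiency
    return single_gpu_days / effective_speedup

# Suppose a model would take ~90 days to train on one GPU:
for gpus in (1, 8, 64):
    print(f"{gpus:>3} GPUs -> ~{estimated_training_days(90, gpus):.1f} days")
```

Even under this deliberately pessimistic efficiency assumption, a months-long single-GPU job compresses into days on a rented cluster, which is what made developing models of this size feasible outside large labs.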

Hybrid and Edge Computing

Beyond the centralized cloud, hybrid and edge computing advancements enabled AI processing closer to where data is generated, such as IoT devices. This technological shift improved efficiency and responsiveness, opening up new avenues like real-time analysis and decision-making in autonomous vehicles.

The Advent of Containers and Orchestration

Technologies like Docker and Kubernetes played a vital role in managing complex AI applications, allowing for efficient deployment, scaling, and management. These technologies made it easier to develop and deploy AI models across various environments, further lowering the barriers to entry.

Cloud computing and the accompanying technologies democratized access to computational resources. They provided the necessary infrastructure that made experimentation, development, and deployment of advanced AI models possible for a broader range of people. This layer is like the engine room powering the AI revolution, bridging the gap between theoretical possibilities and practical realities.

Fun Fact

Amazon Web Services (AWS), one of the leading cloud providers, began as an internal project to handle Amazon's own retail operations. It later evolved into one of the most powerful cloud infrastructures, empowering startups and researchers in AI development.


  1. 2002: AWS launches as a suite of web-service APIs; its core cloud infrastructure services, S3 and EC2, follow in 2006.
  2. 2008: Google App Engine is released, expanding the cloud computing market.
  3. 2010: Microsoft launches Azure, entering the cloud computing competition.
  4. 2013: Docker introduces containerization, revolutionizing deployment in cloud environments.
  5. 2017: Kubernetes becomes the de facto standard for container orchestration.

Cloud technologies continue to evolve, so this timeline remains open-ended.