Today, it is difficult to imagine a technology that has changed our society more than cloud computing. Without the cloud, there would be no Twitter, no Facebook, and no Gmail, and millions of businesses around the world would not be as competitive, collaborative, flexible, and mobile. Perceived as a somewhat nebulous concept at first, cloud computing has gradually evolved into an integral part of our everyday lives.
What is the Cloud?
Cloud computing, or simply the cloud, involves so many diverse technologies that it is challenging, if not impossible, to give it a comprehensive, commonly agreed-upon definition. In this article, we use one of the most widely accepted definitions, proposed by the National Institute of Standards and Technology (NIST):
Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
(NIST Special Publication 800-145, September 2011, p. 2)
In other words, the cloud is a place where you can store data and access apps and services easily and quickly. A device with an Internet connection is the only thing you need to use your cloud-based programs anywhere and at any time.
Cloud computing is composed of three broad service models:
- Infrastructure-as-a-service (IaaS), where consumers use the provider’s computing resources including servers, networking, and data storage space;
- Platform-as-a-service (PaaS), where the provider hosts software development and deployment tools on its cloud infrastructure;
- Software-as-a-service (SaaS), where consumers gain access to a complete application managed by the provider.
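The difference between the three service models comes down to where responsibility for each layer of the computing stack ends. A minimal sketch of that split, using illustrative layer names rather than any real provider's terminology:

```python
# Illustrative sketch: who manages each layer of the stack under the
# three NIST service models. Layer and model names are assumptions
# chosen for this example, not a real cloud API.

STACK_LAYERS = [
    "application", "data", "runtime", "os",
    "virtualization", "servers", "storage", "networking",
]

# Layers the consumer manages under each model; the provider
# manages everything else.
CONSUMER_MANAGED = {
    "on-premises": set(STACK_LAYERS),          # no cloud: you run it all
    "iaas": {"application", "data", "runtime", "os"},
    "paas": {"application", "data"},
    "saas": set(),                             # provider runs everything
}

def who_manages(layer: str, model: str) -> str:
    """Return 'consumer' or 'provider' for a stack layer under a model."""
    return "consumer" if layer in CONSUMER_MANAGED[model] else "provider"
```

For example, under IaaS the consumer still administers the operating system (`who_manages("os", "iaas")` returns `"consumer"`), while under PaaS that burden shifts to the provider.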
Furthermore, cloud computing involves four main deployment models:
- Private cloud (for a single organization),
- Community cloud (for a group of organizations),
- Public cloud (for public use), and
- Hybrid cloud (a combination of cloud services).
History of Cloud Computing
The term ‘cloud’ borrows from telephony, where a standardized cloud-like symbol was used to represent a network on schematics.
Cloud computing may seem like a relatively new trend, but its roots trace back to the 1950s, when mainframe computing allowed multiple users to access a central computer. In the 1960s, ideas resembling modern cloud computing emerged (e.g., J.C.R. Licklider’s vision of an “intergalactic computer network”).
In the 1970s, virtualization took mainframe computing to the next level, and in the 1990s, telecom companies began offering virtualized private network connections. In 1999, Salesforce.com became the first company to deliver enterprise applications over the Internet: its apps could be accessed simultaneously by many users from a web browser at low cost.
Cloud computing as we know it appeared in 2006, when Amazon.com, then known primarily as an online retailer, introduced Amazon Web Services (AWS) and thus pioneered the cloud computing movement. AWS provides a broad set of cloud computing services, such as computing power and database storage, and remains the leading cloud infrastructure platform.
Later, more providers, including Microsoft, Google, Apple, and IBM, entered the market, major adopters such as Netflix moved their operations to the cloud, and a variety of deployment models emerged. Nevertheless, the advantages of cloud computing were not yet widely understood. “Even if someone builds the Cadillac of cloud services, they’ll be out of business within a year,” said Windows expert Mark Minasi in 2008.
In 2014, however, Gartner named cloud computing one of the top 10 strategic technology trends. The number of organizations relying on cloud applications is growing rapidly, and this tendency is likely to continue. From a seemingly unrealistic idea, cloud computing has turned into an influential concept appreciated by businesses and private users alike.