Are you considering serverless computing? So is everyone else in IT. A recent survey released by New Stack found that half of the IT executives surveyed are already running a serverless architecture, and 28 percent intend to do so within the next 18 months.
The survey of 608 companies found that serverless users cite scalability and faster development among the chief benefits. Adoption is also spreading fast within those organizations: 32 percent of respondents said that more than a quarter of their workloads run on cloud-based serverless technology.
Most in the cloud industry could have guessed the results of this survey without actually doing a survey; it’s a “fire is hot” data point.
What’s missing is a critical look at what serverless cloud computing does well, and what it does not. After all, it’s not smart to blindly chase trends, no matter how many other companies are chasing them. So let’s look at the reality of serverless cloud application development and focus on what works and what doesn’t.
On the pro side, and as revealed in this survey, serverless provides two major benefits:
- The ability to speed application development.
- Relatedly, the ability to push dynamic sizing and operations to the cloud provider, sometimes called “no-ops,” though in practice it’s really “less-ops.”
The promise of cloud computing is speed and agility, and chasing those benefits is largely why companies move to the cloud in the first place. Yet many are taken aback by the need to manage remote virtual servers as if those servers sat in their own data center. Serverless removes that need.
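To make the “less-ops” point concrete, here is a minimal sketch of what a serverless function looks like in Python, written in the style of an AWS Lambda handler. The event field and business logic are hypothetical placeholders; the point is that there is no server provisioning or scaling code anywhere in it.

```python
import json

# A minimal AWS Lambda-style handler. There is no server to provision,
# patch, or scale here; the provider runs this function on demand and
# adds or removes capacity as request volume changes.
# The "order_id" field is a hypothetical example, not part of any real API.
def handler(event, context):
    order_id = event.get("order_id", "unknown")

    # Business logic goes here; the platform handles capacity.
    result = {"order_id": order_id, "status": "processed"}

    return {
        "statusCode": 200,
        "body": json.dumps(result),
    }
```

What stays on your plate is configuration: memory size, timeouts, permissions, and the like, which is why “less-ops” is the more honest label than “no-ops.”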
The con side is a bit more complex:
- Serverless is fine for new applications, but porting existing applications to serverless systems can be hugely laborious and risky. This is so often the case that I find it’s usually better to start from scratch than to port something that won’t fit a serverless framework (see the sketch after this list).
- The cost of running serverless rather than a traditional cloud environment seems to be a bit higher, depending on who you talk to. In my experience, it comes down to certain application behaviors that are simply costlier on a serverless platform, such as excessive I/O. If your application leans hard on those patterns, you’ll pay more.
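To illustrate why porting is often the harder path, consider a traditional service that keeps session state in process memory. A serverless function can’t assume that memory (or local disk) survives between invocations, so that logic has to be redesigned around external services. The sketch below is hypothetical Python with placeholder names, not a porting recipe.

```python
# Traditional long-running service: it can rely on process memory
# surviving between requests.
session_cache = {}

def handle_request_traditional(user_id, payload):
    session = session_cache.setdefault(user_id, {})  # in-memory state
    session["last_payload"] = payload                # lives until restart
    return session

# Hypothetical stand-in for an external service (a database, a cache).
# In a real port this call would hit a managed service, and reworking
# the application around such calls is where the labor and risk come in.
def save_session(user_id, payload):
    print(f"persisting session for {user_id}: {payload}")

def handler(event, context):
    # Serverless version: each invocation may land on a fresh instance,
    # so nothing can be kept in process memory or on local disk between calls.
    user_id = event["user_id"]
    payload = event["payload"]
    save_session(user_id, payload)
    return {"statusCode": 200}
```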
So, should you follow the crowd to serverless? My advice is the same as it was for containers and machine learning: find a use case, try it, and see what it costs. Enough said for now.