You are your own cloud provider

Dan Erez
Cloud Computing
November 5, 2020

In case you haven’t heard of FaaS (Functions as a Service) — it’s all about running your code on servers you know (and care) nothing about. For example, AWS’s Lambda functions let you upload code without worrying about deploying it to servers, and promise (almost) infinite scalability and great performance at a low price. Sounds great!

I believe serverless functions are the future. It’s the purest form of code development — writing only the sweet code and caring about nothing else. It’s beautiful. It also forces you into an API-centric approach and simplifies your system. Under a mild usage pattern, serverless functions are also cheaper than running code on a dedicated server. But they will still cost you money, and you can still do better — run serverless code for free. Let me tell you how.

The average organization can easily spend millions of dollars per year on cloud services, as Forbes suggests.

Wow, that’s a lot of money! But why pay so much to Amazon or Google when you have your own server farm (even if you don’t know it yet)? A mid-size organization owns hundreds of workstations, often thousands. A typical employee uses an average of 15% of their CPU and about 30% of their memory, and only for about 8 hours a day, leaving the machine idle for the other 16 hours, begging for action. Not to mention that workstations keep getting stronger — in Israel, for example, the minimum spec you can find in a computer store nowadays is a 4-core i5 machine with 8 GB of RAM. That’s plenty for running Microsoft Word and the occasional game of Solitaire…
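To see how much compute those numbers leave on the table, here is a back-of-the-envelope estimate. The fleet size and utilization figures are the illustrative ones from the text above, not measurements from any real organization:

```python
# Back-of-the-envelope estimate of idle workstation capacity.
# All figures are illustrative assumptions, not measurements.

workstations = 1000          # a mid-size organization
cores_per_machine = 4        # the minimum-spec machine mentioned above

cpu_busy = 0.15              # average CPU utilization during work hours
work_hours_per_day = 8       # the machine is idle the other 16 hours

# Core-hours available per day across the whole fleet
total_core_hours = workstations * cores_per_machine * 24

# Core-hours actually consumed by employees
busy_core_hours = workstations * cores_per_machine * work_hours_per_day * cpu_busy

idle_core_hours = total_core_hours - busy_core_hours
print(f"Idle core-hours per day: {idle_core_hours:,.0f} of {total_core_hours:,}")
# -> Idle core-hours per day: 91,200 of 96,000
```

With these assumptions, roughly 95% of the fleet’s CPU capacity goes unused every day — a server farm hiding in plain sight.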

If your company’s offices or employees are spread across several geographical locations, it gets even better — you can find dozens of idle machines at any hour of the day! So why not use them to run ‘serverless’ functions? You already paid for them, and they just sit there… Serverless functions are short-lived, stateless, and run inside isolated Docker containers, which makes them perfect for such a dynamic environment.

This idea isn’t new — in fact, AWS uses it to run its own Lambda functions (from the AWS docs: “Every time an event notification is received for your function, AWS Lambda quickly locates free capacity within its compute fleet and runs your code”)… so why can’t you? As a matter of fact, you can!

All you have to do is use the right tools — ones that give you, besides the ability to run functions, some security, some resource-usage control, and the orchestration needed for all those servers. Nowadays there are several on-premises, open-source FaaS frameworks out there. Let’s take a look, for example, at Oracle’s open-source Fn project. Here is its architecture (from the Fn project web site):

[Figure: the Fn project’s architecture diagram]

Those blue puppies can run on your Windows machine and do their thing. Since they run each function inside a Docker container, functions are isolated from the host machine and cannot compromise any data on it, so security is covered (as long as Docker’s own isolation mechanisms are considered safe).
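To make the “pure code” idea concrete: a serverless function is just a stateless handler — an event comes in, a result goes out, and the platform handles everything else. Here is a minimal, framework-free sketch of that shape (the `handler` signature and the simulated invocation are illustrative, not Fn’s actual FDK API — check the Fn docs for the real interface):

```python
import json

def handler(ctx: dict, data: bytes) -> dict:
    """A stateless serverless-style function: input in, output out.
    No server, no global state, no deployment concerns."""
    payload = json.loads(data or b"{}")
    name = payload.get("name", "world")
    return {"status": 200, "body": f"Hello, {name}!"}

# Simulate how a FaaS runtime would invoke the handler on an incoming event:
result = handler({"request_id": "abc-123"}, b'{"name": "Fn"}')
print(result["body"])  # Hello, Fn!
```

Because the handler keeps no state between invocations, the runtime is free to run it on whichever idle workstation happens to have capacity — which is exactly what makes this scheme work.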

The Fn load balancer (LB) takes care of routing requests, and since it is open source, you can make it as smart as you like — add discovery capabilities, and so on. So orchestration is pretty much covered too.

Resource-usage control is a bit trickier, since the servers must constantly report their status and resource availability to the LB so it can route requests wisely. But don’t despair — it’s not very hard to add, and in any case, companies like Naga provide all that, and can even do ML-based capacity planning to optimize the allocation of functions and ensure maximum scalability.
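Here is a toy sketch of that resource-aware routing: each workstation self-reports its CPU and memory load, and the balancer sends the next invocation to the least-loaded one. All names, weights, and numbers are illustrative, not part of Fn or any real product:

```python
from dataclasses import dataclass

@dataclass
class Worker:
    host: str
    cpu_used: float   # fraction of CPU currently busy, self-reported
    mem_used: float   # fraction of RAM currently busy, self-reported

    def load_score(self) -> float:
        # Simple weighted score; a real balancer might also weigh
        # network latency, container cache warmth, or office hours.
        return 0.6 * self.cpu_used + 0.4 * self.mem_used

def pick_worker(fleet: list) -> Worker:
    """Route the next function invocation to the least-loaded workstation."""
    return min(fleet, key=Worker.load_score)

fleet = [
    Worker("desk-101", cpu_used=0.70, mem_used=0.50),  # busy employee
    Worker("desk-102", cpu_used=0.10, mem_used=0.30),  # mostly idle
    Worker("desk-203", cpu_used=0.05, mem_used=0.80),  # idle CPU, heavy RAM
]
print(pick_worker(fleet).host)  # desk-102
```

The interesting design choice is the scoring function: weighting CPU above memory suits short CPU-bound functions, but the whole point of an open-source LB is that you can tune this to your own fleet.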

In addition, this concept can also benefit organizations that cannot, or will not, put sensitive data in public clouds — banks, governments, or any company handling sensitive information. Why shouldn’t they also enjoy infinite scalability for their workloads? The end result looks like this:

[Figure: running it in house]

Goodbye huge cloud bills and precious data scattered all over the world, hello in-house resource maximization… Let’s disrupt these public clouds!

Dan Erez

Experienced Chief Technology Officer and Software Architect with a demonstrated history of working in the computer software industry. Skilled in microservices, AWS, Java, Node.js, and Vue.js. Strong engineering professional with an M.Sc. in Computer Science from The Open University.
