
Cloud computing is a game changer for capacity planning, and you will find several differences from the traditional approach. One of cloud's core principles makes the task even harder: elasticity.

 

Some may argue that, in order to respect elasticity in a multi-tenant environment, you should simply sum the peaks of all your applications. However, this approach erodes the economic gains that a cloud infrastructure provides. The idea that cloud computing effortlessly absorbs unpredictable loads is the user's perception, not the reality faced by cloud IT architects.

 

Usually, a capacity plan for cloud computing must deal with three macro variables: user, application, and infrastructure. In a cloud environment, different applications and different user behaviors share the same infrastructure. There is no golden rule for characterizing each variable, but I personally adopt a strategy that starts from the least mutable one (i.e. user) and moves toward the cheapest/fastest to change (i.e. infrastructure). If this is a brand new environment and you don't know anything about your users yet, you can follow the reverse order.

 


 

Understanding User Behavior


The first way to understand user behavior for a given application is to observe it and look for patterns: when usage peaks (by day, week, month, etc.), how long a user spends in a given transaction, and so on.

 

To illustrate the method, let's use the application log to estimate user behavior, as described in the following table:

 

table.png

Table 01

 

In this hypothetical case, we have 40k users in a given day, which can be viewed graphically in Figure 01:

 

Gaussian.png

Figure 01 – User requests in a given day
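As a rough illustration of this step, the sketch below aggregates an application log into requests per hour of the day; the log file name and timestamp format are hypothetical, since the post does not show the actual log layout.

```python
import re
from collections import Counter
from datetime import datetime

# Hypothetical log format: one line per request, e.g.
# "2012-05-14 09:13:27 GET /checkout user=jdoe"
TIMESTAMP_RE = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2})")

def requests_per_hour(log_path):
    """Count requests per hour of day from an application log."""
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            match = TIMESTAMP_RE.match(line)
            if not match:
                continue  # skip lines without a leading timestamp
            ts = datetime.strptime(match.group(1), "%Y-%m-%d %H:%M:%S")
            counts[ts.hour] += 1
    return counts

if __name__ == "__main__":
    hourly = requests_per_hour("application.log")  # hypothetical file name
    for hour in sorted(hourly):
        print(f"{hour:02d}:00  {hourly[hour]} requests")
```

The output of a script like this is exactly the kind of per-period table and daily curve shown above.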

 

To make this usable in a capacity plan, we can try to describe the behavior with the Gaussian function (a.k.a. Normal distribution), expressed as:

 

f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{(x - \mu)^{2}}{2\sigma^{2}}}

 

In this equation, σ is the standard deviation (= 6,829.69) and µ is the arithmetic mean (= 4,444.44), which gives a Normal (y) value of 0.98863. With it, we can estimate the expected number of user requests at any given time of day, as well as the absolute peak, which occurs where the first derivative is zero (the maximum value).
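For reference, here is a minimal sketch of that calculation in Python with NumPy/SciPy. The per-period counts are hypothetical stand-ins for Table 01 (they sum to 40k, but will not reproduce the post's exact figures).

```python
import numpy as np
from scipy.stats import norm

# Hypothetical per-period request counts standing in for Table 01.
requests = np.array([500, 1200, 3800, 9500, 12000, 8000, 3500, 1200, 300])

mu = requests.mean()          # arithmetic mean
sigma = requests.std(ddof=0)  # (population) standard deviation
print(f"mu = {mu:.2f}, sigma = {sigma:.2f}")

# Gaussian (Normal) density at any point x; the curve peaks at x = mu,
# where the first derivative is zero.
x = mu
print(f"Normal density at x = mu: {norm.pdf(x, loc=mu, scale=sigma):.6f}")
```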

 

Diving deeper into these numbers helps us estimate how many simultaneous users we should architect the system to handle. To measure it, let's use the Poisson distribution:

 

P(X = k) = \frac{\lambda^{k} e^{-\lambda}}{k!}

 

We can now define how many simultaneous users the system should be designed for based on probability rather than guesswork.

 

Poisson.png

Figure 02 – Poisson distribution

 

In this example, the probability of having more than 20 simultaneous users is less than 2%.
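As a small sketch of that check, assuming an average arrival rate of λ = 12 simultaneous users (a hypothetical value, since the post does not state the λ behind Figure 02), SciPy's Poisson survival function gives the tail probability directly:

```python
from scipy.stats import poisson

lam = 12  # hypothetical average number of simultaneous users (lambda)

# Probability of seeing more than 20 simultaneous users: P(X > 20)
print(f"P(X > 20) = {poisson.sf(20, mu=lam):.4f}")

# Smallest capacity N such that the overflow risk P(X > N) stays below 2%
N = 0
while poisson.sf(N, mu=lam) >= 0.02:
    N += 1
print(f"Design the system for {N} simultaneous users")
```

For this hypothetical λ, the probability of exceeding 20 simultaneous users comes out under 2%, which is how a sizing statement like the one above can be derived rather than guessed.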

 

The Big Picture

 

At this point we can express the user's behavior mathematically, with precision. Applied to each service provided, these equations offer insight into the overall user demand across the entire cloud environment.

 

In the next post, I'll show how to use these results to measure the application's impact and how to deal with it.

 

 

Best Regards!
