
Aug 22 2012

The Virident FlashMAX II in my Lab


I am fortunate to work closely with vendors that are disrupting the technology lifecycle, and I often get to test their technology. Last week I was provided with a FlashMAX II card from Virident. The Virident FlashMAX II is Storage Class Memory (SCM) in a PCIe form factor, ranging in capacity from 550GB to 2.2TB and available in both MLC and SLC flash media.

The FlashMAX II delivers an impressive number of local IOPS at a reasonable price, with a total of 525K 4K read IOPS in the 1100 model (see the table below). Another interesting characteristic is the internal flash-aware RAID, which allows multiple failures to occur without disrupting an application’s access to the data.

 

[Table: FlashMAX II model specifications]

 

When talking about VDI solutions we know that, on average, read IOPS are far less important than write IOPS. I previously published IOPS numbers for VDI in my article Get hold of VDI IOPs, Read/Write Ratios and Storage Tiering.

Unfortunately Virident does not have a benchmark available for VDI-type workloads. I guess I will have to benchmark it myself, but not in this article.

What I want to demonstrate is that, for this use case, a benchmark is not all that important.

PCIe NAND devices are local to each host, so what really counts is the number of virtual desktops on each host. User data is normally offloaded to an external NAS in the non-persistent use case. With that in mind I have chosen a rather aggressive consolidation ratio of 350 virtual desktops and a high number of IOPS per desktop to demonstrate how these cards can handle the workload without a hiccup. (I used my Online View Calculator to compute the numbers below; a quick back-of-the-envelope sketch follows the configuration lists.)

 

Desktop Configuration

  • Number of VMs : 350
  • Number of vCPU : 1
  • Average vCPU MHz : 200 MHz
  • VM Memory Size : 2GB
  • VM Disk Size : 32GB
  • Refresh Policy : Refresh on Logoff (2%)
  • IOPS : 200 with 80% Writes and 20% Reads

Host Configuration

  • Sockets per Host : 2
  • Cores per Socket : 16
  • VMs per Core : 11
  • TPS : 30%
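
To make the arithmetic behind these figures explicit, here is a minimal back-of-the-envelope sketch in Python. It simply multiplies out the per-desktop numbers above; it is not the View Calculator itself, and it leaves out items the calculator does handle, such as per-VM memory overhead, so treat the totals as approximations.

```python
# Back-of-the-envelope sizing for a single host, using the figures above.
vms          = 350      # virtual desktops per host
vcpu_mhz     = 200      # average MHz per desktop
vm_memory_gb = 2        # memory per desktop (GB)
iops_per_vm  = 200      # steady-state IOPS per desktop
write_ratio  = 0.80     # 80% writes / 20% reads
tps_savings  = 0.30     # Transparent Page Sharing savings
cores        = 2 * 16   # 2 sockets x 16 cores per socket

# CPU
total_ghz    = vms * vcpu_mhz / 1000                 # 70 GHz
vms_per_core = vms / cores                           # ~10.9, rounded to 11 above

# Memory (ignoring per-VM virtualization overhead)
raw_memory_gb = vms * vm_memory_gb                   # 700 GB
after_tps_gb  = raw_memory_gb * (1 - tps_savings)    # 490 GB

# Storage
total_iops = vms * iops_per_vm                       # 70,000 IOPS
write_iops = total_iops * write_ratio                # 56,000 write IOPS
read_iops  = total_iops - write_iops                 # 14,000 read IOPS

print(f"CPU:     {total_ghz:.0f} GHz across {cores} cores ({vms_per_core:.1f} VMs/core)")
print(f"Memory:  {raw_memory_gb} GB raw, ~{after_tps_gb:.0f} GB after TPS")
print(f"Storage: {total_iops} IOPS ({write_iops:.0f} write / {read_iops:.0f} read)")
```

Even with these aggressive assumptions, the aggregate 70K IOPS is far below the 525K 4K read IOPS quoted for the 1100 model; the figure to scrutinize is sustained write performance, since roughly 56K of the demand is writes.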

[Images: Online View Calculator results for the configuration above]

 

The resulting numbers would probably fit comfortably within any of the models offered by Virident, but there are a number of key considerations:

  • Using local PCIe cards for VDI solutions does not allow user-installed applications without additional third-party solutions such as Wanova Mirage.
  • Having so many virtual desktops on a single server, however robust, considerably increases the fault domain.
  • In some cases the additional host RAM (542GB) can be more expensive than adding another host server with half the amount of memory.
  • The number of IOPS may vary greatly depending on the applications in use.
  • CBRC cache offloading is not considered in this solution and would drive IOPS down further.

The installation process is simple, requiring only that you install the VIB files on your ESXi hosts.
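
As a point of reference, a VIB install on ESXi typically comes down to a couple of esxcli commands. The bundle path below is a placeholder of my own, not the actual Virident package name, so follow the documentation that ships with the card (the host may need to be in maintenance mode and rebooted).

```
# Install the driver VIB (the path/file name here is a placeholder)
esxcli software vib install -v /tmp/virident-flashmax-driver.vib

# After a reboot, confirm the VIB is present
esxcli software vib list | grep -i virident
```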

[Photos: FlashMAX II card and installation]

 

The picture below shows the FlashMAX II specifications after installation, with the drivers loaded. As you can see, this card has 555GB of usable capacity.

 

[Image: FlashMAX II device details after driver installation, showing 555GB usable capacity]


In my lab I have not used the FlashMAX II as a primary read/write device; instead I am using it as a caching and logging device, and I’ll explain the solution in one of my upcoming articles.

 

This article was first published by Andre Leibovici (@andreleibovici) at myvirtualcloud.net.
