Originally posted at vcdx133.com.
I previously posted about my “Baby Dragon Triplets” VSAN Home Lab that I recently built. One of the design requirements was to meet 5,000 IOPS @ 4K, 50/50 R/W, 100% Random, and the performance testing below shows that this requirement has been met.
The performance testing was executed with two tools:
- VMware I/O Analyser Fling – an excellent tool that also collects esxtop data; if you need fast and easy storage performance testing, keep it in your toolkit.
- Iometer, configured as per the VMware 2M IOPS with VSAN announcement.
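For readers without a Windows box for Iometer, roughly the same profile can be expressed as an fio job file. This is a sketch, not the configuration used in the post: the iodepth and runtime values are my assumptions; the block size, 50/50 random read/write mix, 8 workers and 8GB file size mirror the Iometer settings described below.

```ini
; Hypothetical fio job approximating the Iometer profile:
; 4K blocks, 50/50 R/W, 100% random, 8 workers on 8GB files.
[global]
ioengine=libaio
direct=1
bs=4k
rw=randrw
rwmixread=50
size=8g
iodepth=32       ; assumed outstanding I/O count
runtime=120      ; assumed test duration in seconds
time_based=1
group_reporting=1

[vsan-4k-mixed]
numjobs=8
```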
Iometer – Test configuration
Iometer – Results
VMware I/O Analyser – Test configuration
VMware I/O Analyser – Results
- The realistic Iometer results were significantly lower than the VMware I/O Analyser results at the same settings. This is because the Iometer configuration used 8 x 8GB disks, while the VMware I/O Analyser was testing against its default 100MB disk. If you use VMware I/O Analyser, make sure you extend the 100MB disk to 8GB (as per the User Manual that comes with the Fling). The smaller address space also explains the lower latency: there is less parallel I/O contending over it.
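For reference, a VMDK can be grown in place from the ESXi shell with vmkfstools (power the appliance off first, then rescan the disk inside the guest). The datastore path here is hypothetical:

```
vmkfstools -X 8g /vmfs/volumes/datastore1/IOAnalyzer/IOAnalyzer_1.vmdk
```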
- Due to the small size of the workloads, all storage tested was SSD, not SATA HDD. Switching from VSS to VDS with LBT yielded no performance improvement. Network throughput was around 20MB/s on the VSAN VMkernel. The Corsair SSD is rated at 85,000 IOPS @ 4K, 100% Write, 100% Random, so with the VM configuration, CPU, RAM, SSD and network all ruled out as bottlenecks, I suspect the Z87 Serial ATA controller (or its ESXi driver) is the limiting factor, even though it is supposed to support 6Gb/s.
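The numbers above are easy to sanity-check: the target of 5,000 IOPS at 4K works out to roughly the ~20MB/s observed on the VSAN VMkernel, and even the SSD's full rated IOPS would sit well under the SATA 6Gb/s bandwidth ceiling, which is why the suspicion falls on the controller's (or driver's) command handling rather than raw link speed. A quick back-of-envelope check:

```python
# Back-of-envelope check of the figures quoted in the post.
BLOCK = 4 * 1024  # 4 KiB in bytes

# Design target: 5,000 IOPS @ 4K -> bandwidth on the wire
target_mb_s = 5_000 * BLOCK / 1_000_000
print(f"5,000 IOPS @ 4K ~= {target_mb_s:.1f} MB/s")  # close to the ~20MB/s seen on the VSAN VMkernel

# SSD rated 85,000 IOPS @ 4K vs the SATA 6Gb/s link
# (6 Gb/s with 8b/10b encoding gives ~600 MB/s usable)
ssd_mb_s = 85_000 * BLOCK / 1_000_000
sata3_mb_s = 6e9 / 10 / 1_000_000
print(f"SSD rated ~= {ssd_mb_s:.0f} MB/s; SATA 6Gb/s ceiling ~= {sata3_mb_s:.0f} MB/s")
```

So bandwidth is not the constraint; whatever is limiting IOPS is doing so well below the link's capacity.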
- I am considering scrapping my ESXi environment to test a single host running Windows Server 2012 with Iometer, and then ESXi with local SSD (DAS) and Iometer again, just to see whether taking VSAN out of the picture makes a difference.