- The space savings, which have helped reduce our cabling.
- It makes it a lot easier to bring up a new system. When a new blade comes in, we slide it in the chassis and we're done. I don't have to spend time wiring up a new server. It's just there, with my team spending less time racking something and getting it configured. We're just ready to go.
- It's the speed with which we can deploy new systems.
Improvements to My Organization
I would guess it crosses over as reduced real estate costs: if we need less room, there are fewer cabinets we need to buy at a data center. I don't know that there's really a cost benefit from a hardware standpoint. A standalone server is going to be cost-comparable to a blade, maybe even cheaper. I guess the business saves money by using fewer man-hours to get systems up and by spending less on real estate.
Room for Improvement
It would come from a software standpoint - software support on the BladeSystem, particularly with Helion and OneView. If you're using the Cisco fabric extenders instead of the HPE fabric extenders, there's a lot of functionality you can't use. Because our network stack is Cisco, we can't do a lot of that automated provisioning of new blades, because it's not supported. That's one thing we'd really like to see HPE implement: true supportability of the Cisco fabric extenders.
The other thing is the support. With our initial purchase, we bought three chassis and maybe 15 or 20 blades. Out of that, we had probably a 20% failure rate within the first few weeks. It was really high and enough to make us concerned. We spent a lot of money on the chassis. We're married to them at this point since we don't want to throw the chassis away. The chassis were fine, but the blade servers themselves had a high failure rate, which didn't give us a lot of confidence.
Since then, everything's been fairly reliable, with very few problems as of late, probably at about the same frequency as with the rack-mount servers. Whereas the rack-mount servers almost never had a problem, the blade servers initially came with loads of problems. It could be completely anecdotal coincidence.
Customer Service and Technical Support
We haven't had to use a lot of technical support beyond that initial failure rate, as it was resolved very quickly. If it's a bad memory issue or something similar, the guys are out the same day and replace the broken piece or the entire blade.
Deployment is easy. We just slide the blade in and put an OS on it and we're done. It's a lot easier than dealing with the rack mount servers and it is a lot faster.
The reliability has gotten better; initially it was bad. I don't think there's anything bad to say at this point beyond those initial first impressions.
Other Solutions Considered
We also looked at the Cisco UCS platform. I felt the UCS was more complicated than what we needed. Perhaps another customer might choose it over HPE's, but the features that UCS had didn't appeal or apply to us. If you're standing up dozens and dozens of chassis on a daily or weekly basis, then maybe those copy/paste features in the Cisco systems would benefit you. But for us, I like the simplicity of the HP BladeSystem. All of our staff were already familiar with HPE hardware, so they knew they could take it apart and do whatever maintenance they needed to do. With Cisco, it was a learning curve that we didn't want to have to ramp up on. We still use Cisco because they require you to use their systems for the phone products.
If you're somebody who's undergoing rapid growth and not standardized on a platform yet, then I'd tell you that it depends on your environment. If you're already an HPE customer, then I'm going to say your engineers already know it. If you're not deploying 1,000 chassis, then there's the simplicity of using the HPE blades: they're very familiar if you come from rack mount, the management interface is almost identical, and if you know iLO, you're already there. It's easy to set up and it's much lower cost than Cisco.
Disclosure: I am a real user, and this review is based on my own experience and opinions.
Jul 25 2016