Badges

55 Points
8 Years

User Activity

About 7 years ago
It actually boils down to the amount of sequential I/O that can be pushed and processed. Each physical core of a data-warehousing host can consume about 1 Gbps of data... In order to keep all the cores working on the data, it has to be moved from storage into RAM... you need…
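The sizing rule in that excerpt can be sketched as a quick back-of-envelope calculation. This is a hedged illustration, not a benchmark: the ~1 Gbps-per-core figure is the assumption stated above, and the function names are made up for the example.

```python
# Back-of-envelope sizing sketch, assuming (per the note above) that each
# physical core of a data-warehousing host consumes roughly 1 Gbps of
# sequential data during a scan. Not a measured figure.

GBPS_PER_CORE = 1.0  # assumed per-core consumption rate, gigabits/second


def required_sequential_io_gbps(physical_cores: int) -> float:
    """Aggregate sequential I/O needed to keep every core busy, in Gbps."""
    return physical_cores * GBPS_PER_CORE


def required_io_mb_per_s(physical_cores: int) -> float:
    """Same figure expressed in megabytes per second (1 Gbps = 125 MB/s)."""
    return required_sequential_io_gbps(physical_cores) * 1000 / 8
```

Under this assumption, a 16-core host would need roughly 16 Gbps (about 2,000 MB/s) of sustained sequential read bandwidth from storage into RAM to avoid starving its cores.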

About me

OpenStack Framework, Data Management, Business Continuity, Systems Architecture, Design and Implementation

Specialties: OpenStack, including work on the various sub-projects: Keystone, Nova, Swift, Cinder, Neutron, Trove, Red Dwarf, Savanna, etc.; Hadoop, including the Hadoop ecosystem: HDFS, MapR-FS, MapReduce, Hive, HBase, Flume, Sqoop, ZooKeeper, Warden, Oozie, plus newer resource frameworks such as YARN and Tez (distributions include Apache, CDH3, CDH4, MapR, and Hortonworks); Windows and Unix clustering; MySQL 5.1 (Percona); Vertica; Greenplum; Netezza; SQL Server 2005–2012; Oracle v4–11gR2, single node and RAC; Informix 7; DB2; Sybase 11; business intelligence ecosystems, including ETL, data warehousing, and analytics; and local and remote replication for high availability, scalability, and disaster recovery/business continuity.

I am also a professional actor and full member of SAG-AFTRA and AEA.