Hadoop manages processing and storage for big data applications, enabling clusters of computers to analyze large datasets in parallel and to process structured, semi-structured, and unstructured data.
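Hadoop's parallel analysis follows the MapReduce model: a map step emits key-value pairs and a reduce step aggregates them per key. The sketch below illustrates that model with a word count in plain Python; it mimics the shape of a Hadoop Streaming job but is not Hadoop itself, and the function names are illustrative.

```python
from itertools import groupby

def mapper(lines):
    """Map step: emit a (word, 1) pair for every word seen."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce step: sum counts per word. Hadoop would deliver the
    pairs grouped by key; here we sort to get the same grouping."""
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    sample = ["big data needs big tools"]
    print(dict(reducer(mapper(sample))))
```

In a real cluster the mapper and reducer run on different machines and Hadoop handles the shuffle between them; the logic per record is the same.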
The HPCC platform runs on commodity computing clusters, using data-parallel processing to handle complex tasks and support high-performance applications.
Apache Storm handles vast amounts of data in a fault-tolerant manner, ensuring high data availability, and lets you perform manipulations on real-time data concurrently.
Apache Cassandra manages structured data with a decentralized storage system that has no single point of failure, so the system continues to operate even when individual components fail.
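Cassandra's decentralization rests on consistent hashing: nodes and keys are placed on a hash ring, each node owns the keys up to its position, and losing a node only remaps that node's keys to its neighbor. A toy sketch of the idea (not Cassandra's actual implementation; the class and method names are made up for illustration):

```python
import hashlib
from bisect import bisect

def _hash(key: str) -> int:
    # Any stable hash works; md5 gives a wide, well-spread space.
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class HashRing:
    """Toy consistent-hash ring: each node owns the arc ending at its token."""

    def __init__(self, nodes):
        self.tokens = sorted((_hash(n), n) for n in nodes)

    def node_for(self, key):
        # Walk clockwise to the first node token at or past the key's hash.
        points = [t for t, _ in self.tokens]
        i = bisect(points, _hash(key)) % len(self.tokens)
        return self.tokens[i][1]

    def remove(self, node):
        # Removing a node leaves every other node's keys where they were;
        # only the removed node's keys move to the next node on the ring.
        self.tokens = [(t, n) for t, n in self.tokens if n != node]
```

This is why there is no single point of failure: no node is special, and a failure disturbs only the small fraction of keys the failed node held. Real Cassandra adds virtual nodes and replication on top of this scheme.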
Qubole is an autonomous big data management platform that automates the installation, configuration, and maintenance of clusters, and provides tools for exploring and analyzing data.