How cold storage works, and how it can help with big data
"Big data comes in large blocks, and big data analytics are often required to process against large data objects that are terabytes in size," said Flowers. "Using cold storage, we become a 'data lake' mass of storage that can be scanned through by a Hadoop compute node." Solutions like Storiant containerize large data objects that contain the unstructured data that characterizes most big data, and also Internet of Things (IoT) data (like website log files) that increasingly comprise big data.
With a cold storage solution in place, corporate IT can sort through and classify all of this data, deciding which storage container each chunk (or object) of big data belongs in.
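As a rough illustration of this sorting step, the sketch below routes incoming data objects into named containers by simple classification rules. The container names and routing logic are hypothetical, not part of Storiant or any specific product; a real deployment would classify on richer metadata.

```python
# Hypothetical routing of big data objects into cold storage containers.
# Container names and rules are illustrative assumptions only.

def classify(obj_name: str) -> str:
    """Pick a container for an object based on simple naming rules."""
    if obj_name.endswith(".log"):
        return "iot-logs"          # IoT and website log files
    if obj_name.endswith((".csv", ".parquet")):
        return "analytics-input"   # extracts destined for Hadoop jobs
    return "unstructured-archive"  # everything else

containers: dict = {}
for obj in ["web-2015-01.log", "sensors.csv", "scans.tar"]:
    containers.setdefault(classify(obj), []).append(obj)

print(containers)
```

Once objects are grouped this way, a Hadoop compute node can scan a whole container as a unit rather than picking through a flat pool of files.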
At the same time, permissions can be assigned to each container to establish who has access to the data inside it. Flowers says that internet service providers are moving quickly to implement this style of cloud-based cold storage for big data because it is elastic, expanding or contracting as needed. Cloud-based cold storage is also financially agile: instead of amortizing data center capital expenses (CAPEX) over the long term, enterprises convert cold storage into a pay-for-use operating expense (OPEX) that they can control in the short term.
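The per-container permission model can be sketched as a simple access-control list, one per container. This is a minimal in-memory illustration under assumed names ("grant", "allowed", the user and operation strings); real cold storage services express the same idea through their own policy APIs.

```python
# A minimal sketch of per-container permissions: each container keeps
# its own ACL mapping users to the operations they may perform.
# All names here are illustrative assumptions, not a product API.

from dataclasses import dataclass, field

@dataclass
class Container:
    name: str
    acl: dict = field(default_factory=dict)  # user -> set of operations

    def grant(self, user: str, *ops: str) -> None:
        self.acl.setdefault(user, set()).update(ops)

    def allowed(self, user: str, op: str) -> bool:
        return op in self.acl.get(user, set())

logs = Container("iot-logs")
logs.grant("analytics-team", "read")           # analysts read only
logs.grant("ingest-service", "read", "write")  # ingest pipeline writes

print(logs.allowed("analytics-team", "write"))
```

The design point is that access decisions attach to the container, so reclassifying an object into a different container automatically changes who can reach it.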
"We believe that the Internet of Things will continue to exponentially increase the amount of big data that enterprises will need to manage, with millions of devices and data sources from around the world feeding in large volumes of big data," said Flowers. "Financial institutions, pharmaceutical companies, and government are already in need of large-scale, low-cost cold storage for big data. In the end, enterprises are going to have to find a way to safely and securely run low-cost analytics, and cold storage services in a private cloud setting provides this."
Source: www.techrepublic.com
