October 25th, 2015

Cisco steps up cloud strategy with expanded Intercloud alliance

One year ago Cisco Systems, Inc. announced ambitious plans to invest $1 billion in building a cloud computing platform that could rival Amazon Web Services, widely considered the leader in the space.

Cisco is trying to unify smaller cloud providers into a more coherent and competitive mass. Behind Cisco’s logic is the assumption that those providers’ customers won’t jump ship to Amazon or Microsoft Azure any time soon, so it can keep selling them its networking gear as those companies grow.

Besides the marketplace, Cisco made a second big announcement around the “hybrid cloud” concept. Hybrid cloud refers to the integration of on-premises data centers with the public cloud, bringing advantages in flexibility and scalability.

Read more ...

Big Data and analytics consulting marketplace Experfy raises $1.5m

Harvard-based Big Data and analytics consulting marketplace Experfy, Inc. has raised a seed round of $1.5 million.

Investors included XPRIZE Chairman and CEO Peter Diamandis; Harvard Business School Professor Tarun Khanna; Intrado President George Heinrichs; former Chief Innovation Officer of the Agency for International Development Maura O'Neill; former president of Singularity University Neil Jacobstein; Thomson Reuters Global Head of Trading Compliance Miftah Khan; and Smart Launch co-founder Rawy Iskander.

Founded in 2014, Experfy offers a marketplace for data scientists similar to what companies such as Upwork Global, Inc. (formerly oDesk) and Freelancer Technology Pty. Ltd. offer for more generic skills.

Read more ...

Pivotal supercharges its Big Data Suite with Quickstep acquisition

Based on a 2013 paper co-authored by founder Jignesh Patel, a professor in the University of Wisconsin-Madison’s Computer Science department, BitWeave provides a way of speeding up how columnar stores such as Pivotal’s Greenplum Database execute queries. Instead of scanning tables directly in the traditional fashion, the technology maps each one to a much smaller logical representation.

The resulting values, referred to as fixed-length order-preserving codes, are packed into an array aligned with the word size at which processors natively operate. That arrangement exploits the parallelism available at that miniaturized scale, allowing many codes to be loaded into the processor’s on-board registers and evaluated at once.
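The packing idea can be illustrated with a short sketch. The code below is not Pivotal's or Quickstep's actual implementation; it is a generic "SIMD-within-a-register" demonstration under assumed parameters (3-bit codes, a 64-bit word). It packs several small fixed-length codes into one machine word, then evaluates a less-than predicate against all of them using only a handful of full-word operations:

```python
WORD = 64
K = 3                      # bits per order-preserving code (assumed)
SLOT = K + 1               # each code gets one extra "delimiter" bit
PER_WORD = WORD // SLOT    # 16 codes fit in one 64-bit word

# Mask with the delimiter (high) bit of every slot set.
H = sum(1 << (i * SLOT + K) for i in range(PER_WORD))

def pack(codes):
    """Pack small fixed-length codes into a single machine word."""
    word = 0
    for i, c in enumerate(codes):
        assert 0 <= c < (1 << K), "code must fit in K bits"
        word |= c << (i * SLOT)
    return word

def replicate(c):
    """Copy the constant c into every slot of a word."""
    return pack([c] * PER_WORD)

def less_than(word, c):
    """Per-slot x < c test in a few word-level operations.
    Setting the delimiter bit before subtracting keeps borrows from
    crossing slot boundaries; the delimiter bit of each result slot
    ends up clear exactly when that slot's code is below c."""
    diff = ((word | H) - replicate(c)) & ((1 << WORD) - 1)
    return ~diff & H

codes = [5, 1, 7, 3, 0, 6, 2, 4]
mask = less_than(pack(codes), 4)
matches = [i for i in range(len(codes)) if mask >> (i * SLOT + K) & 1]
print(matches)  # -> [1, 3, 4, 6], the positions holding codes below 4
```

Because the codes preserve the ordering of the original column values, a comparison on the codes answers the same comparison on the data, and one subtraction filters sixteen rows at a time instead of one.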

Read more ...

MapR 5.0 edges Hadoop closer to real-time processing

Continuing its drive to bring the batch-oriented Hadoop Big Data platform into the real-time world, MapR Technologies, Inc. today is announcing a host of new features designed to support on-the-fly decision-making. MapR 5.0 is also crafted to support bigger workloads, responding to what MapR says is a trend toward customers running more applications on individual clusters.

The new release automatically synchronizes storage, database and search indices for real-time transactions, and includes improved security auditing, an area considered a MapR forte. Release 5.0 also adds support for Apache Drill 1.0 and the latest 2.7 release of Hadoop and YARN.

Read more ...

Intel sees value in a diverse ecosystem

Intel strives to meet the needs of workloads at every level, from the low end to the highest-end data center. Doug Fisher, senior vice president and general manager of Intel’s Software and Services Group, said his organization lines up software for data center platforms and ensures that the platforms are optimized to “take advantage of any unique capabilities.”

Fisher told theCUBE during Red Hat Summit 2015 that he sees OpenStack as a cornerstone for the data center. “It is clear that there is great interest,” he said. “What’s most exciting is the deep level of contribution. I’m confident now that we have the foundation driving working groups to really focus that.”

Read more ...

What you missed in Big Data: Open-source power

LinkedIn Inc. released another of its internally developed data-crunching technologies under a free license. The project is aimed in particular at performing real-time business intelligence at the kind of scale where the traditional databases typically used for the task fall short.

Pinot, as the project is known, currently stores over 100 billion records in support of more than 30 of the professional social network's most important features, including the tracking capability that allows users to see who viewed their profiles and the analytics functionality provided to advertisers.

Read more ...

Kyvos Insights launches native OLAP engine for Hadoop

The Hadoop ecosystem has a natural inclination toward variety. That first came to the fore with the release of Apache Storm two years ago, which has since been followed by no fewer than three other stream processing engines, and is now repeating itself with the introduction of a new online analytical processing (OLAP) engine from Kyvos Insights Inc. that similarly throws down the gauntlet to its predecessor.

That predecessor is Kylin, a system created at eBay Inc. to address internal requirements and released under an open-source license late last year.

Read more ...

Business intelligence analysts are wasting time cleaning raw data. Why?

We hear a lot about the potential benefits of big data, but a new study reveals that those benefits are won at a cost of considerable time spent in cleaning up and preparing raw information.

The study, by data integration company Xplenty, surveyed over 200 business intelligence professionals and found that a third of them spend 50-90 per cent of their time just cleaning raw data.

The survey examined the ‘extract, transform and load’ (ETL) process, including preferences for on-premises or cloud-based solutions, perceived challenges, and the amount of time spent on ETL. The results show that 97 per cent of those surveyed say ETL is critical to their business intelligence efforts.

Read more ...