09 September 2013

The amount of data processed annually has crossed the zettabyte threshold, a figure that would have appeared only in highly theoretical articles two or three decades ago. This seemingly insurmountable volume of data gave birth to a new term: big data. What do you think is the most important tool that will allow us to handle this explosion of data?

a) Do you think it is the increase in CPU performance, which is currently growing significantly more slowly than the data itself?

b) Do you think it is newly introduced programming languages that will make processing this data much easier?

c) Do you think it is some novel data-analytics algorithm that will make handling the data significantly easier?

d) Or is it something else entirely?

e) Or are we dead in the water, with no hope at all?
