
Infinity is often thought of as the inverse of zero. By that analogy, the difficulties encountered when dealing with an infinity of data (or massive data) should be of the same magnitude as those encountered with very small amounts of data (one or two observations). For example, predicting rare events such as the COVID-19 pandemic is obviously not easy, as there is very little historical data on pandemics. Work on statistical and learning methods for massive data is flourishing, yet almost everything remains to be done for fairly small data. This question is still quite open and will certainly be one of the concerns of the scientific community in the years to come. What do you think? Do you know of any significant contributions on the analysis of very small quantities of data (allowing one to analyze and then predict the occurrence of rare events, such as the COVID-19 pandemic or certain natural disasters)?
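
To make the question concrete, and not as an answer to it, here is a minimal sketch of one standard starting point for reasoning about rare-event probabilities from very few observations: a Bayesian Beta-Binomial model, where the prior carries most of the weight when data are scarce. The event counts, prior parameters, and horizon below are made up for illustration only.

```python
# A minimal sketch: posterior inference for a rare yearly event probability
# from a very small number of observed events. All numbers are hypothetical.
import numpy as np
from scipy import stats

# Hypothetical record: 2 "pandemic-scale" events observed over 100 years.
events, years = 2, 100

# Weakly informative Beta prior on the yearly event probability.
alpha_prior, beta_prior = 0.5, 10.0

# Beta-Binomial conjugacy: posterior is Beta(alpha + events, beta + years - events).
posterior = stats.beta(alpha_prior + events, beta_prior + years - events)

# Posterior mean and a 95% credible interval for the yearly probability.
mean = posterior.mean()
low, high = posterior.ppf([0.025, 0.975])
print(f"posterior mean: {mean:.4f}, 95% CI: [{low:.4f}, {high:.4f}]")

# Predictive probability of at least one event in the next 10 years,
# averaging over posterior uncertainty by simple Monte Carlo.
p_samples = posterior.rvs(size=100_000, random_state=0)
p_at_least_one = np.mean(1 - (1 - p_samples) ** 10)
print(f"P(at least one event in 10 years): {p_at_least_one:.3f}")
```

The point of the sketch is that with so few observations the answer is dominated by the prior and by the model choice, which is exactly why the question above remains open; extreme value theory is another family of approaches often cited in this setting.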
