As per my understanding, large-scale learning and big data problems in machine learning refer to the same thing, i.e., large datasets with a large number of data points, a large number of features per data point, or both. Please clarify.
Large-scale learning enables powerful predictive models to be designed for specific domains, supporting analysis and sound decisions about the object under study. Likewise, when a problem involves a considerable amount of data, applying machine learning can only benefit the accuracy of the results, because the dataset is rich.
For more details, I suggest looking at the links and attached files on this subject.
- What is difference between Big Data and Machine Learning? - Quora
Dear Vinod, this is my own view on the subject. Large Scale Machine Learning (LSML) is used to exploit hidden patterns in very big datasets. It can extract value from big and disparate data sources with far less reliance on human direction; it is data-driven and runs at machine scale, which suits the complexity of dealing with disparate data sources and the very wide variety of variables and volumes of data involved. The big data problem, on the other hand, concerns how to design techniques that leverage data using large-scale computational processes: how to apply LSML techniques to explore and prepare data for modeling, how to identify the type of machine learning problem in order to apply the appropriate set of techniques, how to construct models that learn from data, and how to analyze big data problems using scalable LSML algorithms. As a rough characterization, LSML involves datasets with 100 billion+ training examples and 100 billion+ features (big data), but with only ~100 features present per training example (sparse data).
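To make that sparsity point concrete, here is a minimal sketch in Python, assuming scikit-learn and SciPy are available. The dimensions are tiny stand-ins for the 100-billion scale described above, and all sizes here are illustrative:

```python
# Sketch of the sparse-data regime: a huge feature space where each
# example has only ~100 non-zero features. Dimensions are small stand-ins
# for the 100B+ examples / 100B+ features scale mentioned above.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.linear_model import SGDClassifier

n_examples, n_features = 10_000, 1_000_000
density = 100 / n_features            # ~100 non-zero features per example

X = sparse_random(n_examples, n_features, density=density,
                  format="csr", random_state=0)
y = np.random.RandomState(0).randint(0, 2, size=n_examples)  # dummy labels

# Linear models with SGD accept sparse input natively, so the cost scales
# with the number of non-zeros rather than the full feature dimension.
clf = SGDClassifier(random_state=0)
clf.fit(X, y)
print(f"non-zeros per example: {X.nnz / n_examples:.0f}")
```

The key design point is that the sparse CSR representation stores only the non-zero entries, so a million-dimensional feature space stays cheap as long as each example touches only ~100 of those dimensions.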
Big data today presents huge amounts of unstructured, feature-rich data in many domains, and with it new computational challenges that call for complex machine learning models with millions to billions of parameters. Existing algorithms with super-linear complexity become computationally infeasible at the scale of millions or billions of data points. This has in turn created demand for Large Scale Machine Learning (LSML) systems that can learn such complex models from big data, supporting the computational needs of ML algorithms at that scale.
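One common answer to the super-linear-complexity problem is out-of-core learning: stream the data in mini-batches so each pass costs time linear in the number of examples. Below is a minimal sketch, assuming scikit-learn; `stream_batches` is a hypothetical stand-in for reading batches from disk or a cluster, and all sizes are illustrative:

```python
# Sketch of linear-time, out-of-core learning via incremental SGD updates.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.RandomState(0)

def stream_batches(n_batches, batch_size, n_features):
    # Hypothetical stand-in for streaming mini-batches from disk or a cluster.
    for _ in range(n_batches):
        X = rng.randn(batch_size, n_features)
        y = (X[:, 0] > 0).astype(int)   # a simple learnable signal
        yield X, y

clf = SGDClassifier(random_state=0)
classes = np.array([0, 1])              # must be declared up front for partial_fit
for X, y in stream_batches(n_batches=50, batch_size=1_000, n_features=20):
    clf.partial_fit(X, y, classes=classes)  # O(batch) incremental update
```

Because only one mini-batch is ever in memory, the same loop works whether the stream holds thousands of examples or billions, which is exactly the property LSML systems need.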
Yes, I quite agree with the two previous submissions.
Large Scale Machine Learning (LSML) differs from traditional machine learning in that it deals with very large datasets. LSML offers the powerful predictive analytics that the rise of big data demands, so the two cannot be separated: LSML is what is needed to extract meaning from big data.