Well, if by Big Data you mean just the volume (as opposed to velocity and variety) among the 3Vs, my rule of thumb is: any data set that causes computational problems (in terms of speed or storage) on currently available hardware.
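To make that rule of thumb concrete, here is a minimal sketch in Python of one way you might operationalize it: compare the on-disk size of a data set against the memory actually available on the machine. This is purely illustrative; the function name `is_big_for_this_machine` and the `safety_factor` parameter are my own, and the sketch assumes the third-party `psutil` package for querying free RAM.

```python
import os

import psutil  # third-party: pip install psutil


def is_big_for_this_machine(path, safety_factor=3):
    """Heuristic: flag a data set as 'big' when it is unlikely to fit
    comfortably in memory on the current machine.

    The safety factor accounts for the overhead of loading a raw file
    into an in-memory structure (an in-memory data frame often needs
    several times the on-disk size).
    """
    file_bytes = os.path.getsize(path)
    available_bytes = psutil.virtual_memory().available
    return file_bytes * safety_factor > available_bytes


# Example: a 10 GB CSV on a laptop with 8 GB of free RAM would be
# flagged as "big" under this rule of thumb.
```

Of course, "big" under this check on a laptop may be perfectly ordinary on a cluster, which is exactly the point of tying the definition to the hardware at hand.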
Christopher Tozzi (2018) argues that, by this reading, Big Data is not really about the size of the data but about how the data is used.
He writes: "When it comes to data, size is always relative. If you can't distinguish Big Data from traditional data sets in terms of size, then what does define Big Data? The answer lies in how the data is used. The processes, tools, goals, and strategies that are deployed when working with Big Data are what set Big Data apart from traditional data." [Online]. Available: http://blog.syncsort.com/2018/03/big-data/big-data-vs-traditional-data/