I want to work on the National Family Health Survey (NFHS-4) data for India in R, but I am having trouble reading the complete dataset into R. The file size is around 3.5 GB.
It depends on the system configuration you have. I had a laptop with 8 GB of RAM and a Core i5 processor, and large datasets would crash my R kernel (for example, the Google Predictions data from Kaggle, which was around 3.2 GB).
Now I have an upgraded desktop with 16 GB of RAM, an 8th-generation Core i5, and an RTX 2070 graphics card, and this system handles large datasets without any trouble.
If you cannot invest in a high-end system, I would suggest using Kaggle's online kernels (they run on fairly powerful hardware) to perform your analysis. A low-end PC may not have the resources to handle such large datasets, so RStudio might crash or the workspace might fail to load.
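As a side note, independent of hardware, you can often avoid crashes by not loading the whole file at once. Here is a minimal sketch of that approach, assuming the data are in a flat text format; the file and column names below are placeholders, not the actual NFHS-4 variables:

```r
# Read a large file efficiently in R by importing only the columns needed.
library(data.table)

# fread() is much faster and more memory-efficient than read.csv(),
# and 'select' restricts the import to the listed columns, so a 3.5 GB
# file can shrink to a table that fits comfortably in 8 GB of RAM.
nfhs <- fread("nfhs4_household.csv",              # placeholder file name
              select = c("state", "age", "sex", "hh_weight"))

# If the survey ships as a Stata .dta file instead, haven::read_dta()
# offers the same column filtering via its col_select argument:
# nfhs <- haven::read_dta("nfhs4_household.dta",
#                         col_select = c(state, age, sex))
```

Reading just the first few thousand rows (fread's nrows argument) is also a cheap way to inspect the structure before committing to a full import.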
Thank you, Kashyap Barua, for explaining the system requirements for working with large-scale data and for the suggestion regarding Kaggle's online kernels. I will look into it.