I have been reaping the benefits of rigorous sports and workouts throughout my life. This morning, a CNN news pop-up on my smartphone startled me when I read the headline "Why exercise won't make you lose weight":
https://www.cnn.com/videos/health/2018/12/14/exercise-weight-loss-orig-st.cnn
In your opinion and based on your experience, why is diet more important than exercise for weight loss? Do you believe this claim is true?