Are there any studies that found that people who are diagnosed with cancer perceive themselves as victims or think their fate is unjust?
I'm sure there must be some, but so far I've only found studies on cancer diagnoses and well-being.
Thank you!