Dear Vanishree Kanaka Sundar, first of all, because of the Central Limit Theorem, normality of the raw data is not an issue if you have an adequate number of responses, since the sampling distribution of a sufficiently large sample (how large depends on the area of research) tends to be normal. Please refer to: “Many people take the ‘assumption of normality’ to mean that your data need to be normally distributed. However, that isn’t what it means. In fact, there is an awful lot of confusion about what it does mean” (Field, 2013, p. 229).
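To see the Central Limit Theorem point concretely, here is a minimal sketch in Python (not SPSS; the population, sample sizes, and number of resamples are only illustrative assumptions). It draws means of repeated samples from a clearly non-normal population and shows that their distribution becomes closer to normal (less skewed) as the sample size grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# A clearly non-normal (right-skewed) population: exponential data.
population = rng.exponential(scale=2.0, size=100_000)

# As n grows, the sampling distribution of the mean loses its skew
# and approaches a normal shape, as the Central Limit Theorem says.
for n in (5, 30, 200):
    means = rng.choice(population, size=(10_000, n)).mean(axis=1)
    skew = ((means - means.mean()) ** 3).mean() / means.std() ** 3
    print(f"sample size n={n:>3}: skewness of sample means = {skew:.2f}")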
Secondly, the PROCESS macro gives you the facility to bootstrap, so normality itself should not be a problem when performing a mediation analysis in PROCESS. However, in the case of latent variables, reliability and validity would still be an issue.
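For intuition about what the bootstrap option is doing, here is a minimal sketch in Python rather than SPSS. It is not the PROCESS macro itself; the function name boot_indirect, the toy data, and the number of resamples are only illustrative assumptions. It resamples cases with replacement and builds a percentile confidence interval for the indirect effect a*b in a simple X -> M -> Y mediation model, which is why normality of the indirect effect is not required.

```python
import numpy as np

rng = np.random.default_rng(0)

def boot_indirect(x, m, y, n_boot=5000, alpha=0.05):
    """Percentile bootstrap CI for the indirect effect a*b in a
    simple mediation model (X -> M -> Y), estimated by OLS."""
    n = len(x)
    est = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)          # resample cases with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]         # path a: slope of M on X
        # path b: coefficient of M in the regression of Y on M and X
        X2 = np.column_stack([np.ones(n), mb, xb])
        b = np.linalg.lstsq(X2, yb, rcond=None)[0][1]
        est[i] = a * b
    lo, hi = np.percentile(est, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return est.mean(), (lo, hi)

# Toy (hypothetical) data: X affects M, and M affects Y.
x = rng.normal(size=200)
m = 0.5 * x + rng.normal(size=200)
y = 0.4 * m + rng.normal(size=200)
print(boot_indirect(x, m, y))   # indirect-effect estimate and 95% percentile CI
```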
Field, A. (2013). Discovering statistics using IBM SPSS statistics (4th ed.). Sage.
Dear Vanishree Kanaka Sundar, to understand this issue in detail you need to read the relevant pages of Andy Field's book that I referenced. Yes, normality of the data is not an issue if the data set is adequate, but to defend that choice you must understand the concepts of the Central Limit Theorem and the sampling distribution. Once again, reliability and validity would still be an issue in the case of latent variables, because PROCESS uses only observed or computed scores (summated scales). Regards.
As the creator of the PROCESS macro himself notes, "when we learn a new analytical strategy, it changes the way we approach and reflect on theoretical questions, as well as the way we think about how to test or contrast hypotheses. And all of this because we suddenly have analytical methods that open up a range of possibilities previously unknown to us. Many of the scientific advances of recent decades arose more as a consequence of methodological innovations than as a result of innovations in theory."