Do different uncertainty models such as Fuzzy, Rough, Grey, and Vague all have the same role? Some researchers call them data representation methods, while others call them data analysis methods.
The role of the different models for the determination of uncertainty is similar: to determine the uncertainty interval around the "true result", i.e., the risk (the output of uncertainty) of uncertain results (e.g., a high chance of being false). This risk should be evaluated, and it is acceptable when it does not compromise the decision taken on the uncertain results. For example, in my area (blood bank, cells and tissues), the decision is a clinical decision with an impact on post-transfusion safety.

Currently the most widely accepted uncertainty model is measurement uncertainty determined according to the principles of the "Guide to the expression of uncertainty in measurement" (GUM), http://www.bipm.org/utils/common/documents/jcgm/JCGM_100_2008_E.pdf. GUM, also known as the "uncertainty bible", features the "law of propagation of uncertainty" model. This is a "top down" model, where the uncertainty is a combination of the major uncertainty components, following Pareto's principle. GUM is intended for chemistry and physics and solely for numerical values. When ordinal or nominal values are used, an alternative method for the determination of measurement uncertainty must be used. Eurachem published a document intended for chemistry featuring a set of empirical models that fulfil the GUM principles: https://www.eurachem.org/images/stories/Guides/pdf/QUAM2012_P1.pdf.
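To make the GUM "law of propagation of uncertainty" concrete, here is a minimal Python sketch for uncorrelated input quantities, u_c(y) = sqrt(sum_i c_i^2 u(x_i)^2) with sensitivity coefficients c_i = df/dx_i. The measurand (a density from mass and volume) and all numbers are invented for illustration only, not taken from the GUM.

import math

def combined_standard_uncertainty(f, x, u, h=1e-6):
    # GUM law of propagation of uncertainty for uncorrelated inputs:
    # u_c(y) = sqrt( sum_i (df/dx_i)^2 * u(x_i)^2 ),
    # with the partial derivatives estimated by central differences.
    uc_sq = 0.0
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        dfdxi = (f(*xp) - f(*xm)) / (2 * h)   # sensitivity coefficient c_i
        uc_sq += (dfdxi * u[i]) ** 2
    return math.sqrt(uc_sq)

# Illustrative example: density rho = m / V from mass and volume measurements.
mass, volume = 12.5, 5.0        # hypothetical measured values
u_mass, u_volume = 0.05, 0.02   # hypothetical standard uncertainties
rho = mass / volume
u_rho = combined_standard_uncertainty(lambda m, v: m / v,
                                      [mass, volume], [u_mass, u_volume])
print("rho = %.3f +/- %.3f (k = 1)" % (rho, u_rho))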
The term "model" is relevant to both representation and analysis. How you represent your data, affects how you do analysis on them. For example, if you use fuzzy sets to model uncertainty, it makes most sense to use fuzzy operations that model.
The four models you mention all offer their own view on how to deal with uncertainty; in that sense you can say they have the same role. If you consider that they all deal with uncertainty in terms of partial memberships, then they also all fulfil the same role.
However, if you look at how they deal with uncertainty, then perhaps you can't say that they have the same role. If we add one more uncertainty model, Bayesian probability, then the concepts are very different, and I wouldn't say that the two uncertainty models fulfill the same role.
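To illustrate that conceptual difference, the sketch below contrasts a fuzzy membership degree (how well something fits a vague concept) with a Bayesian update of belief in a crisp proposition; all numbers are made up for the example.

# Fuzzy view: the glass belongs to the vague set "full" to degree 0.7,
# a degree of fit to an imprecise concept, not the chance of anything.
membership_full = 0.7

# Bayesian view: "the glass is full" is a crisp proposition; belief in it
# is a probability that gets updated with evidence via Bayes' rule.
prior = 0.5                 # P(full) before looking
p_obs_if_full = 0.9         # P(looks full | full)
p_obs_if_not_full = 0.2     # P(looks full | not full)
posterior = (p_obs_if_full * prior) / (
    p_obs_if_full * prior + p_obs_if_not_full * (1 - prior))
print("fuzzy membership:", membership_full, "Bayesian posterior:", round(posterior, 2))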
Hope this answers your question, otherwise feel free to elaborate :)
Think about the various uncertainty models as a special kind of device for measuring uncertainty in a quantitative way. By analogy: a measuring tape is quite fine when you want to know how high your table is, but not so good when your goal is to measure a crystal lattice constant or to evaluate the distance from the Earth to the Sun. For still other purposes you will use a laser-based device (or a radar, or an ultrasound sensor) or a caliper. Simply speaking: some uncertainty models are better suited than others to any given problem. Choose the one which is most easily applicable and gives the tightest results. Oh, and there is also the question of whether you need guaranteed estimates or whether confidence intervals are sufficient.
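As a rough illustration of that last point, the Python sketch below contrasts guaranteed (worst-case) interval bounds with an approximate normal-theory confidence interval; the intervals and sample values are invented.

import statistics

# Guaranteed (worst-case) bounds: propagate hard intervals through y = a + b.
a_lo, a_hi = 9.8, 10.2
b_lo, b_hi = 4.9, 5.1
y_lo, y_hi = a_lo + b_lo, a_hi + b_hi   # y is certainly within [14.7, 15.3]

# Statistical view: an approximate 95% confidence interval for the mean of y
# from repeated measurements (normal approximation, invented sample values).
samples = [14.9, 15.1, 15.0, 14.8, 15.2, 15.0, 14.9, 15.1]
mean = statistics.mean(samples)
sem = statistics.stdev(samples) / len(samples) ** 0.5
ci_lo, ci_hi = mean - 1.96 * sem, mean + 1.96 * sem

print("guaranteed: [%.1f, %.1f]   ~95%% CI: [%.3f, %.3f]" % (y_lo, y_hi, ci_lo, ci_hi))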