Crowdsourcing opens up new opportunities for QoE (Quality of Experience) assessment of technical developments. New developments are no longer evaluated only in the laboratory; they can also be "graded" by a large number of potential future users. The literature reference added here provides more information on this topic.
Best regards
Anatol Badach
Chen-Chi Wu, Kuan-Ta Chen, Yu-Chun Chang, and Chin-Laung Lei: "Crowdsourcing Multimedia QoE Evaluation: A Trusted Framework"
A well-established way is to obtain a MOS (Mean Opinion Score) on a scale from 1 (very bad) to 5 (very good).
People in the crowdsourcing exercise rate the experience by choosing a score of 1, 2, 3, 4, or 5.
You might need to offer them examples for calibration, so they learn what a 1 or any other score means in your context. That is the training phase; afterwards, people score the actual target experience.
Simple, and very efficient.
It has been used, for example, for selecting the best parameters for voice coding in telephony.
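To make the scoring concrete, here is a minimal sketch (in Python) of how a MOS could be computed from such crowd ratings. The rater IDs and the 1-5 values are invented purely for illustration, and the standard deviation is just one simple way to get a feel for how much the raters disagree.

```python
from statistics import mean, stdev

# Hypothetical ratings collected from a crowdsourcing task
# (rater IDs and values are made up for illustration).
ratings = {
    "rater_01": 4,
    "rater_02": 5,
    "rater_03": 3,
    "rater_04": 4,
    "rater_05": 2,
}

def mos(scores):
    """Mean Opinion Score: arithmetic mean of individual 1-5 ratings."""
    values = list(scores)
    if not all(1 <= v <= 5 for v in values):
        raise ValueError("MOS ratings must lie on the 1-5 scale")
    return mean(values)

score = mos(ratings.values())
spread = stdev(ratings.values())  # spread hints at rater disagreement
print(f"MOS = {score:.2f} (std dev {spread:.2f}, n = {len(ratings)})")
```

In practice one would compute such a MOS per test condition (e.g. per codec setting) and compare the averages across conditions.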