What do you mean by supplementary? How was it collected? From the same sample or a different one? How was it analyzed? Keep in mind that different methods can also produce divergence.
The classic design for this sort of question is the explanatory sequential format (QUAN --> qual). For this to work well, you need to choose results from the quantitative strand of the study that you want to understand more fully, and then match these to a qualitative follow-up study that is designed to meet that goal.
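To make that linkage concrete, here is a minimal sketch in Python of how the quantitative strand can feed a purposive sample into the qualitative follow-up. The file name, column names, cutoff, and sample size are all hypothetical placeholders, not from any specific study:

```python
# Sketch of case selection for an explanatory sequential (QUAN --> qual) design.
# Assumes a hypothetical CSV of quantitative results with invented column names.
import pandas as pd

scores = pd.read_csv("quant_results.csv")  # e.g., columns: student_id, prop_reasoning_score

# Flag the quantitative result we want to understand more fully:
# students below a (hypothetical) mastery cutoff on proportional reasoning.
CUTOFF = 0.6
struggling = scores[scores["prop_reasoning_score"] < CUTOFF]

# Purposive sampling for the qualitative follow-up: draw a small, manageable
# set of cases (here, a simple random draw of up to 12) for interviews.
interview_sample = struggling.sample(n=min(12, len(struggling)), random_state=42)

# Export the roster so the qualitative strand interviews the very cases
# whose quantitative results it is meant to explain.
interview_sample[["student_id", "prop_reasoning_score"]].to_csv(
    "interview_roster.csv", index=False
)
```

The point of the sketch is only the linkage: the qualitative sample is drawn from, and designed to illuminate, the specific quantitative result you chose to follow up.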
It can add validation in support of the quantitative results because of the depth of qualitative analysis. However, make sure you are 'comparing apples with apples' (similar constructs).
Thank you for this thoughtful observation. I wholeheartedly agree, and I would go even further.
The inclusion of supplementary qualitative analysis in a quantitative study does not merely “enhance” insights; it transforms statistical patterns into meaningful, human-centered understandings of mathematical learning.
Let me be clear: numbers tell us what is happening. But only stories (voices, emotions, classroom interactions, student reasoning, teacher reflections) tell us why it is happening.
In my own research across public schools in Latin America, I’ve seen this repeatedly. For instance, a large-scale assessment might show that 72% of students struggle with proportional reasoning. A purely quantitative report might conclude: “Students lack conceptual understanding.” But when we add semi-structured interviews, think-aloud protocols, or analysis of student-written journals, we discover something profound:
“I don’t know what ‘half’ means unless I see it on a pizza.” “My teacher says cross-multiply, but I don’t know why—it’s magic.”
Suddenly, the 72% is not just a failure rate; it is evidence of procedural rather than conceptual instruction, of disconnected symbolic language, and of a curriculum that assumes prior lived experience the students never had.
This is where qualitative analysis becomes indispensable:
- It reveals hidden misconceptions: not just errors, but deeply rooted epistemological gaps (e.g., treating “=” as an operator rather than a relation).
- It uncovers affective dimensions (math anxiety, identity struggles, cultural dissonance) that quantitative scales often miss or oversimplify.
- It grounds generalizations in context: a finding from urban Bogotá may not hold in rural Ecuador without understanding access to resources, language of instruction, or community attitudes toward math.
- It generates theory: qualitative data does not just explain results; it inspires new hypotheses, instruments, and pedagogical interventions that are culturally responsive and cognitively grounded.
And let us not forget: education is not about data points. It is about learners.
When we silence the voices behind the scores, we risk designing interventions that are statistically significant but pedagogically irrelevant.
The power of mixed methods in mathematics education lies not in combining techniques, but in creating a dialogue between numbers and narratives. One without the other is incomplete.
So yes, supplementary qualitative analysis enhances insight. But more accurately:
It redeems the humanity of mathematical learning.
As a researcher and educator, I believe our mission is not only to measure understanding but also to witness it, to listen to its silences, and to respond with dignity.
Quantitative methods ask: How many? How often? Qualitative methods ask: What does this mean? Who is this for? And how can we change it?
Together, they make research that does not just get published but that transforms classrooms.