Critical Self-Evaluation and Reflection
One of the most interesting (and frustrating) outcomes of this project was the mismatch between what students showed in their design work and how they performed in the more factual parts of the survey. In the V&A Live Brief outcomes, many students demonstrated a significant shift in how they thought about audiences, access, and inclusion. Accessibility was clearly influencing design decisions around format, language, structure, and interaction. Yet when the post-brief survey asked more technical or factual questions about accessibility, some students still answered incorrectly or with uncertainty.
At first, this felt like a failure of the method. If students had genuinely learned, why couldn’t they recall or articulate the technical aspects? However, reflecting more carefully on this tension has been one of the most valuable parts of the project.
Did the method work?
I think the honest answer is: yes, but not in the way I initially expected.
The V&A Live Brief clearly supported deep, practice-based learning. Students were able to embed accessibility in ways that were thoughtful, contextual, and often creative, e.g. using the Pride flag colour palette. They made better design choices, asked better questions, and showed more awareness of who might be excluded by their work. This suggests that the practice-led element of the method worked well, particularly in supporting what Orr and Shreeve describe as tacit learning: learning that is visible in action rather than explanation.
The survey, however, revealed the limits of that learning when measured through factual recall. This does not necessarily mean students had not learned; they had learned how to think differently, but not always how to name or define technical standards or terminology. In that sense, the method worked for design learning, but less well for testing technical knowledge.
This raises an important question about alignment: the method measured two different kinds of learning, and the results exposed the gap between them rather than smoothing it over. In retrospect, that is actually a strength of the design, not a weakness.
What did I learn as a DPS Lecturer?
This project has challenged my assumptions about what learning “looks like”. I realised that I had unconsciously prioritised what could be measured easily (survey responses) over what is often more meaningful in design education (applied judgement). Students could not always explain WCAG-style principles accurately, but they could make more inclusive choices in practice.
I also learned that I had not scaffolded the technical aspects of accessibility as clearly or as repeatedly as the conceptual ones. The V&A Live Brief encouraged exploration and reflection, but I did not give students enough structured opportunities to connect their design decisions back to technical language and standards. In other words, I supported doing more than articulating. This is something I struggle with myself.
This has made me reflect on my own teaching bias: I value process and critical thinking in design, but I still expect students to perform well in knowledge-based technical work so that they are prepared for industry. This project has exposed a disconnect between those values and how I design learning activities.
Implications for my teaching
The most significant implication is that accessibility needs to be taught through both practice and explicit knowledge building, and the relationship between the two needs to be made clearer. I cannot assume that students will automatically pick up technical understanding through making alone.
In future teaching, I plan to:
- make technical accessibility principles more visible during critiques
- ask students to name and justify accessibility decisions and terms as part of the design process
- use short, low-stakes quizzes or activities on Padlet or Mentimeter to reinforce factual understanding
- revisit key concepts multiple times rather than treating them as one-off content
- align assessment more clearly with both applied and technical learning outcomes
- create physical game cards, as in the V&A exhibition
I also see value in designing reflective prompts that explicitly connect making with knowledge, helping students translate tacit learning into language.
Limitations of the project
There are clear limitations to this study. It was small-scale, limited to one cohort and one brief, and conducted within tight timing constraints. The survey design, while useful for comparison and as a new tool for me, was not well aligned with the kind of learning the Live Brief produced. Anonymisation also prevented me from tracking individual learning trajectories, which would have provided deeper insight into how students moved between knowing and doing.
Another limitation is my dual role as DPS lecturer and researcher. My interpretation of the design work is inevitably shaped by my own values and experience around accessibility, which may have influenced conclusions about “successful” learning. Involving more external reviewers would help strengthen this in future research.
What I would do differently
If I were to run this project again, I would redesign the survey to better reflect the kind of learning I was actually aiming to support: refining it to focus on just one area of design and including scenario-based or applied questions, like those in the exhibition design section, rather than primarily factual UX/UI ones. I would also introduce structured reflection points in crits, where students have to connect their design decisions explicitly to accessibility principles.
Most importantly, I would not treat the mismatch between survey and design outcomes as a problem to be solved, but as a signal of where learning is happening, and where it still needs support.
Final reflection
This project has been a useful reminder that learning is not always neat or measurable in the ways we expect. The method worked in that it revealed complexity rather than hiding it. Students learned a lot, and their design work is evidence of this, but not always the things the survey was best at capturing. As a result, I now have a much clearer understanding of how accessibility needs to be taught: slowly, repeatedly, and through a deliberate combination of making, reflection, and explicit knowledge-building.
Perhaps most importantly, this project has made me more reflective about my own teaching choices. The tension between knowing and doing is not a student problem – it is a curriculum design problem, and one I now feel better equipped to address.
Here are some links to the technical sites and tools that were shown.
Accessibility Authority
- The World Wide Web Consortium (W3C)
- Web Content Accessibility Guidelines (WCAG): the current version is 2.2. Meeting the AA standard is a legal requirement in many contexts (e.g. UK public sector websites). The guidelines are periodically reviewed and updated.
Who does it well
Checking tools
- Lighthouse (Google)
- Colourblindly
- Colour contrast checkers (a sketch of the calculation these automate follows this list)
- Google Chrome High contrast checker
- Checklist Design Figma plugin (thanks, V)
- Stark checklist (a Product Designer in my team uses this)
- All-in-one tool from Axe (from the company that makes the browser checker tool; I'm thinking of trialling it)
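For context, here is a minimal sketch of the calculation those colour contrast checkers automate, assuming 8-bit sRGB colour values and the WCAG 2.x relative-luminance formula. It is an illustration of the underlying maths, not any particular tool's implementation.

```typescript
// Minimal sketch of the WCAG 2.x contrast-ratio calculation.
// Assumes colours are given as 8-bit sRGB channels (0-255).

type RGB = { r: number; g: number; b: number };

// Linearise one sRGB channel, per the WCAG 2.x relative-luminance definition.
function linearise(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of a colour (weighted sum of linearised channels).
function relativeLuminance({ r, g, b }: RGB): number {
  return 0.2126 * linearise(r) + 0.7152 * linearise(g) + 0.0722 * linearise(b);
}

// Contrast ratio between two colours: (lighter + 0.05) / (darker + 0.05).
function contrastRatio(a: RGB, b: RGB): number {
  const [lighter, darker] = [relativeLuminance(a), relativeLuminance(b)].sort(
    (x, y) => y - x
  );
  return (lighter + 0.05) / (darker + 0.05);
}

// Example: black text on a white background.
const ratio = contrastRatio({ r: 0, g: 0, b: 0 }, { r: 255, g: 255, b: 255 });
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA" : "fails AA"); // 21.00 passes AA
```

For WCAG 2.x AA, normal-size text needs a contrast ratio of at least 4.5:1, and large text at least 3:1; the tools listed above report this ratio for you.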
AI has been used for SPAG and polishing the text.