Finally, we’ll take a brief look at the last part of the human-centered design process: evaluation. Evaluation can get quite technical in terms of statistical rigor and experimental design, and these complications are compounded by issues specific to development tools.
The first two readings are technical articles describing initial efforts to bring HCI concerns and evaluation methods to the world of development tools. Feel free to skim these to get a sense of the space; a deep read is not necessary. (It's also not difficult if you have the time; both are relatively light reads even though they are research articles.)
- Ko, LaToza, and Burnett. A practical guide to controlled experiments of software engineering tools with human participants. Empirical Software Engineering, 110–141. 2013. https://doi.org/10.1007/s10664-013-9279-3
- Myers, Ko, LaToza, and Yoon. Programmers Are Users Too: Human-Centered Methods for Improving Programming Tools. IEEE Computer, 44–52. 2016.
Rather than diving deeply into these methods, we’ll focus our discussion on one relatively lightweight evaluation technique, the heuristic evaluation:
- Euphemia Wong. Heuristic Evaluation: How to Conduct a Heuristic Evaluation. Interaction Design Foundation.
In addition to these readings, please install Sonic Pi and work through as much of its built-in tutorial as you can manage.
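As a quick check that your installation works, a minimal sketch like the following can be typed into a Sonic Pi buffer and played with the Run button. The specific notes and synth here are illustrative choices, not part of the assignment; `play`, `sleep`, and `use_synth` are Sonic Pi built-ins available only inside the Sonic Pi environment, not in plain Ruby:

```ruby
# A short Sonic Pi sketch: plays a simple ascending arpeggio.
use_synth :pluck                  # choose a plucked-string synth
[60, 64, 67, 72].each do |note|   # MIDI note numbers: a C major arpeggio
  play note                       # trigger the note
  sleep 0.25                      # wait a quarter of a beat before the next one
end
```

If you hear four ascending notes, your setup is ready for the in-class evaluation.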
You will use the techniques outlined in Wong’s article to perform a heuristic evaluation of Sonic Pi in the upcoming class period!