When you think of exams, you probably imagine rows of students in a hall, heads down, scribbling away in silence. The only things on their desks are the exam paper, a blue or black pen, and – at most – a calculator or other specialist equipment needed for the exam. There are certainly no notes or textbooks in sight.

Well, there is another way. Open-book exams allow students to consult set texts, notes or the internet in their exam to construct their answers. They can draw on all the information they need to inform their answers, rather than relying on knowledge recall.

There have been calls from some quarters for open-book exams to become the norm. Open-book exams, it is argued, could encourage more creative and critical thinking, and they offer a valid and fair way to assess a student’s skills and understanding. This is a topic that has provoked some heated discussion recently.

Let’s take a look at what some of the available evidence says about open-book exams.

What is the impact on student performance?

There appears to be limited research on this topic, particularly when it comes to the specific context of UK high-stakes assessments. Also, given that many participants in previous studies had little or no experience of open-book exams, any reported effects on performance might be due to this lack of experience rather than the test format.

Where researchers directly compared the open-book and closed-book exam formats, they found no significant differences in student performance. Where evidence does suggest that students do better in closed-book exams[1], this has been attributed to students spending less time preparing for open-book exams[2], because they think they’ll be able to find answers in the sources they have at hand.[3]

Moreover, expectations that open-book exams may promote deeper learning haven’t necessarily been supported by studies.[4]

While there may be limited evidence of performance differences, there are some advantages and disadvantages to using each of these formats.

What are the pros and cons of open-book exams?

An open-book exam can help assess skills such as analysis and evaluation, looking beyond knowledge recall.[5] It can also replicate the sort of tasks that students might be expected to perform in the workplace, such as finding, synthesising and interpreting information from multiple sources.[6] [7]

In some cases, students may prefer open-book exams to closed-book exams as they allow them to make more creative use of course content and have a greater sense of ownership over their personal study.[8] Students also appear to associate open-book exams with less anxiety[9] (even if, in practice, 45% of students reported equivalent levels of stress when taking exams in either format).[10]

On the flip side, there is a risk that open-book exams might fail to assess students’ ability to understand and recall information.

Some studies report that students consider open-book exam questions to be more difficult and would like more training on how to complete this type of exam.[11] Perhaps not surprisingly, where students have less experience with open-book exams, their grades are initially lower until they develop specific skills for this exam format.[12]

Another consideration with open-book exams is that students can spend a lot of time finding information instead of formulating their answers.[13] This means they either need longer to complete the exam – potentially creating fatigue issues – or the exam needs to contain fewer questions. However, reducing the number of questions may impact the reliability of the students’ results.[14]

A tailored approach

On the whole, it appears a sensible approach would be to ensure that the exam format suits the purpose of the assessment. Broadly speaking, closed-book exams are useful when assessing students’ knowledge (and can also be used to assess higher-order skills) while open-book exams are useful where the main goal is to assess students’ ability to analyse and evaluate information.

One clear conclusion is that any transition from closed-book exams to open-book exams would need to be carefully managed to ensure that students understand what is expected of them.

Read more on this subject:
If exams stay, don't fear ChatGPT
Test anxiety does not predict exam performance
Simulation assessment: A model for the future?

References

[1] Durning, S. J., Dong, T., Ratcliffe, T., Schuwirth, L., Artino, A. R., Boulet, J. R., & Eva, K. (2016). Comparing open-book and closed-book examinations: A systematic review. Academic Medicine, 91, 583–599.

[2] Block, R. M. (2012). A discussion of the effect of open-book and closed-book exams on student achievement in an introductory statistics course. Problems, Resources, and Issues in Mathematics Undergraduate Studies, 22(3), 228–238.

[3] Dale, V. H., Wieland, B., Pirkelbauer, B., & Nevel, A. (2009). Value and benefits of open-book examinations as assessment for deep learning in a post-graduate animal health course. Journal of Veterinary Medical Education, 36, 403–410.

[4] Heijne-Penninga, M., Kuks, J. B. M., Hofman, W. H. A., & Cohen-Schotanus, J. (2008). Influence of open- and closed-book tests on medical students' learning approaches. Medical Education, 42(10), 967–974.

[5] Green, S. G., Ferrante, C. J., & Heppard, K. A. (2016). Using open-book exams to enhance student learning, performance, and motivation. The Journal of Effective Teaching, 16, 19–35.

[6] Deneen, C. (2020). Assessment considerations in moving from closed-book to open-book exams (Melbourne CSHE Teaching and Learning short guide series). University of Melbourne.

[7] Williams, J. B., & Wong, A. (2009). The efficacy of final examinations: A comparative study of closed‐book, invigilated exams and open‐book, open‐web exams. British Journal of Educational Technology, 40, 227–236.

[8] Brightwell, R., Daniel, J. H., & Stewart, A. (2004). Evaluation: Is an open book examination easier? Bioscience Education, 3(1), 1–10.

[9] Durning, S. J., Dong, T., Ratcliffe, T., Schuwirth, L., Artino, A. R., Boulet, J. R., & Eva, K. (2016). Comparing open-book and closed-book examinations: A systematic review. Academic Medicine, 91, 583–599.

[10] Baillie, C., & Toohey, S. (1997). The “power test”: Its impact on student learning in a materials science course for engineering. Assessment & Evaluation in Higher Education, 22, 33–48.

[11] Dale, V. H., Wieland, B., Pirkelbauer, B., & Nevel, A. (2009). Value and benefits of open-book examinations as assessment for deep learning in a post-graduate animal health course. Journal of Veterinary Medical Education, 36, 403–410.

[12] Durning, S. J., Dong, T., Ratcliffe, T., Schuwirth, L., Artino, A. R., Boulet, J. R., & Eva, K. (2016). Comparing open-book and closed-book examinations: A systematic review. Academic Medicine, 91, 583–599.

[13] Theophilides, C., & Dionysiou, O. (1996). The major functions of the open-book test at the university level: A factor analytic study. Studies in Educational Evaluation, 22, 157–170.

[14] Durning, S. J., Dong, T., Ratcliffe, T., Schuwirth, L., Artino, A. R., Boulet, J. R., & Eva, K. (2016). Comparing open-book and closed-book examinations: A systematic review. Academic Medicine, 91, 583–599.