Use of very short answer questions compared to multiple choice questions in undergraduate medical students: An external validation study


  • Elise V. van Wijk, 
  • Roemer J. Janse, 
  • Bastian N. Ruijter, 
  • Jos H. T. Rohling, 
  • Jolein van der Kraan, 
  • Stijn Crobach, 
  • Mario de Jonge, 
  • Arnout Jan de Beaufort, 
  • Friedo W. Dekker, 
  • Alexandra M. J. Langers

  • Published: July 14, 2023
  • https://doi.org/10.1371/journal.pone.0288558

Abstract

Multiple choice questions (MCQs) offer high reliability and easy machine-marking, but allow for cueing and stimulate recognition-based learning. Very short answer questions (VSAQs), which are open-ended questions requiring a very short answer, may circumvent these limitations. Although VSAQ use in medical assessment is increasing, almost all research on the reliability and validity of VSAQs in medical education has been performed by a single research group with extensive experience in the development of VSAQs. Therefore, we aimed to validate previous findings about VSAQ reliability, discrimination, and acceptability in undergraduate medical students and teachers with limited experience in VSAQ development. To validate the results presented in previous studies, we partially replicated a previous study and extended results on student experiences. Dutch undergraduate medical students (n = 375) were randomized to VSAQs first and MCQs second, or vice versa, in a formative exam in two courses, to determine reliability, discrimination, and cueing. Acceptability for teachers (i.e., VSAQ review time) was determined in the summative exam. Reliability (Cronbach’s α) was 0.74 for VSAQs and 0.57 for MCQs in one course; in the other course, Cronbach’s α was 0.87 for VSAQs and 0.83 for MCQs. Discrimination (average R_ir) was 0.27 vs. 0.17 and 0.43 vs. 0.39 for VSAQs vs. MCQs, respectively. Reviewing time of one VSAQ for the entire student cohort was approximately 2 minutes on average. Positive cueing occurred more in MCQs than in VSAQs (20% vs. 4% and 20.8% vs. 8.3% of questions per person in the two courses). This study validates the positive results regarding VSAQ reliability, discrimination, and acceptability in undergraduate medical students. Furthermore, we demonstrate that VSAQ use is reliable among teachers with limited experience in writing and marking VSAQs. The short learning curve for teachers, favourable marking time, and applicability regardless of topic suggest that VSAQs might also be valuable beyond medical assessment.

Citation: van Wijk EV, Janse RJ, Ruijter BN, Rohling JHT, van der Kraan J, Crobach S, et al. (2023) Use of very short answer questions compared to multiple choice questions in undergraduate medical students: An external validation study. PLoS ONE 18(7): e0288558. https://doi.org/10.1371/journal.pone.0288558

Editor: Ipek Gonullu, Ankara University Faculty of Medicine: Ankara Universitesi Tip Fakultesi, TURKEY

Received: December 21, 2022; Accepted: June 29, 2023; Published: July 14, 2023

Copyright: © 2023 van Wijk et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Data Availability: The anonymized data and annotated R code used for the results in this study are freely available from the GitHub repository (https://github.com/rjjanse).

Funding: The author(s) received no specific funding for this work.

Competing interests: The authors have declared that no competing interests exist.

Introduction

Assessment in education commonly uses Multiple Choice Questions (MCQs), because this question type offers high reliability and easy machine-marking. However, MCQs also allow for cueing (i.e., answering questions based on cues in the question or answer options rather than on content knowledge) and stimulate a recognition-based study approach [1–4]. Although recognition may be sufficient to pass an MCQ-based assessment, MCQs are often not representative of a future situation in which the assessed knowledge has to be applied, for instance because of the absence of a demarcated set of possible answers. This is the case in, among other fields, medical education, where it has been critically noted that clinical practice does not offer a multiple choice list of possible diagnoses or procedures, nor is there a single best recognisable answer in the medical profession [5, 6].

Although other question formats, such as uncued questions and extended matching questions, have been proposed to circumvent the limitations of MCQs [6–8], these formats may still facilitate a recognition-based study approach. Very Short Answer Questions (VSAQs), a free-response question type with the answer limited to 1–4 words, may be better suited to circumvent some of the general limitations of MCQs. The open-ended nature of VSAQs may prevent surface-level study approaches and cueing [4, 9–11], and may better represent a profession’s real-life practice; in medicine, for example, VSAQs better reflect clinical practice, which offers no list of answer options. In addition, VSAQs are better able to discriminate between students based on proficiency in content knowledge [9–14] and may increase retention of knowledge [15–17].

Although the use of VSAQs in medical assessments is increasing, evidence regarding the validity and reliability of this question type in the medical setting is mainly based on studies from a single research group, consisting of teachers experienced in developing and marking VSAQs [4, 10, 13, 14, 18]. In one study, Sam et al. (2018) [13] compared third-year medical students starting a test with either VSAQs or MCQs, followed by questions in the opposite format. They observed higher reliability (Cronbach’s α: 0.91 vs. 0.85) and a lower mean test score (52.4% vs. 69.7%) for VSAQs vs. MCQs, respectively. Moreover, two-thirds of students (strongly) agreed that VSAQs better represented clinical practice, and half of the students (strongly) agreed that VSAQs better prepared them for clinical practice. Cueing was more strongly associated with MCQs. In fifth-year pathology students, higher reliability (Cronbach’s α: 0.86 vs. 0.76) and a lower median test score (72% vs. 80%) were found for VSAQs vs. MCQs [14]. Lastly, across 20 UK medical schools, Sam and colleagues [18] found a 21% higher test score for MCQs compared to VSAQs, as well as a higher positive cueing rate in MCQs. In this study, they reported marking VSAQs to be feasible.

However, it remains unclear whether the application of VSAQs by teachers with less experience in writing and marking VSAQs, in a different population, country, and medical educational setting, yields the same results. Before VSAQs can be implemented in a wider context, more evidence is needed. Therefore, we aimed to externally validate the positive results of VSAQs regarding reliability, discrimination, and acceptability in a cohort of Dutch undergraduate medical students with non-expert teachers. Additionally, we explored the impact of VSAQs on cueing effects and on student experiences of VSAQ assessment. To achieve these aims, we partially replicated the study design of Sam et al. [13].

Methods

This study was performed simultaneously in two different student cohorts (cohort 2019 and cohort 2020) using the same study design. First-year students (cohort 2020) followed the fundamental course “Regulation and Metabolism” (RM, May 2021) and second-year students (cohort 2019) followed the clinical course “Diseases of the Abdomen” (DA, April 2021) in the Bachelor of Medicine at the Leiden University Medical Center (LUMC), the Netherlands. Both courses (6 and 7 weeks, respectively) cover metabolic and gastrointestinal topics. During the course, students had weekly mini-exams with 2–3 VSAQs in DA, but not in RM. Near the end of each course, students were offered a formative exam, and the courses ended with a summative exam. After the summative exam, students could evaluate the course with the Automated Education Evaluation System (AEES, Fig 1A), which includes questions on constructive alignment. Relevant AEES questions for this study were answered on a 5-point Likert scale (strongly disagree, disagree, neutral, agree, strongly agree). The LUMC uses RemindoToets (Paragin) [19] for digital assessment, with the possibility of proctoring. The current study included the formative and summative exams in both courses for analyses.


Fig 1. (A) Set-up of both courses (RM and DA) with the formative exam and contents, summative exam and contents, and the Automated Education Evaluation System (AEES). (B) Flowchart of the study participants.

https://doi.org/10.1371/journal.pone.0288558.g001

Formative exam

To determine reliability, discrimination, cueing effects, and students’ insights in the formative exams of the RM and DA courses, students were randomly assigned to a group starting with MCQs (RM-MCQ-first and DA-MCQ-first) or starting with VSAQs (RM-VSAQ-first and DA-VSAQ-first), followed by identical questions in the opposing format, similar to the study of Sam et al. [13] (Fig 1A). When a section (i.e., either the VSAQ part or the MCQ part) of the exam was finished, students were not able to revisit that section. For instance, a student starting the exam with VSAQs could not go back to the VSAQs or change their answers once they had started the MCQ section. However, it was possible to revisit items or change a response within the same section of the exam before closing that section. The topics in the formative assessment covered the entire spectrum of the course. The MCQs used in the formative exam were written by the course directors with the intent to test the course learning goals. The VSAQs were written in the same way, with assistance from the research team (RJJ, AMJL) to create VSAQs of good quality. The research team was familiar with the literature on VSAQs, but did not yet have any experience in writing them. For DA, new questions were created as open-ended questions suitable for the VSAQ format, after which four answer options were generated for each question to create the parallel MCQ. In RM, existing MCQs that passed the cover test (i.e., the answer can be given without reading the answer options) were transformed into VSAQs by removing the answer options. If the existing MCQs were not specific enough, the questions were adjusted to fulfil the VSAQ requirements. Thus, for DA, 24 completely identical questions were asked in both formats, while for RM, 25 questions testing the same knowledge were asked, albeit sometimes worded differently. The formative exam was available in a fixed timeslot. Participation in the formative exam was mandatory in RM and optional in DA. The exam format order per student was determined using a random number generator in Microsoft Excel (i.e., a Mersenne Twister algorithm) [20]. Only students who gave informed consent were included in the analysis.
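The allocation step itself is simple enough to sketch directly. The study used Excel’s random number generator, so the R version below (with hypothetical student IDs) is only an equivalent illustration, not the authors’ procedure:

```r
# Randomly assign each student to start with MCQs or VSAQs.
# set.seed() makes the allocation reproducible.
set.seed(2021)
students <- sprintf("student_%03d", 1:216)  # hypothetical IDs (RM cohort size)
format_first <- sample(c("MCQ-first", "VSAQ-first"), length(students), replace = TRUE)
table(format_first)  # check the resulting group sizes
```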

After finishing the first part of the formative exam (either MCQs or VSAQs), students were asked to rate three statements on a 5-point Likert scale (strongly disagree, disagree, neutral, agree, strongly agree) and one question ranging from 1 to 10, based on the specific question format with which they had just been tested: 1) The questions are a good representation of how I would be expected to answer questions in clinical practice; 2) I found the questions easy; 3) I was often unsure whether my answer would be correct; 4) If I had to give an estimate of the grade I would have achieved based on these questions, my estimate would be <grade>. Because these statements were presented after the first part of the exam, half of the students answered them after having finished MCQs only and half after having finished VSAQs only. After answering the four evaluation questions, students continued with the second part of the formative test, in which they answered the exam questions in the opposite format. At the end of the second part of the formative exam, all students were asked to rate six more general statements about both question formats: 5) VSAQs are easier than MCQs; 6) VSAQs are more in line with daily clinical practice than MCQs; 7) I prepare differently for an assessment with VSAQs than for an assessment with MCQs; 8) VSAQs would be a better preparation for clinical practice than MCQs; 9) Through the use of VSAQs, the test is better aligned with this course than a test using MCQs; and 10) Any comments I would like to add: <open question>. Finally, for research purposes, students were asked whether they had used study materials during the formative exam. Given that there were no negative repercussions to using study materials and this was clear to students, we believe that the answer to this question reflects the actual use of study materials in the majority of students. Students who used study materials were excluded from the analysis.

We determined reliability and discriminative capability for content knowledge. The average score, calculated over MCQs and VSAQs separately, was stratified by whether students took MCQs or VSAQs first. Cueing was measured by comparing the answer to an MCQ with the answer to the corresponding VSAQ. We looked at cueing per question (i.e., how often cueing occurred per individual question) and cueing per person (i.e., in how many questions cueing occurred per individual student). We distinguished positive and negative cueing. In positive cueing, students use cues in the MCQ question and/or answer options to arrive at the right answer, which is not possible in VSAQs because no answer options are available; in our study, this was observed when a student answered a VSAQ incorrectly but the equivalent MCQ correctly. Negative cueing happens when students are misled by an incorrect answer option in an MCQ (e.g., due to a distractor that is too plausible); in our study, this was derived from a student answering the VSAQ correctly but the equivalent MCQ incorrectly. Although guessing may have influenced these measures, the probability of guessing the right answer could not be taken into account. Students’ insights were determined from the evaluation questions asked midway through and at the end of the formative exam.
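To make the two cueing measures concrete, here is a minimal R sketch of the per-student calculation, using dummy data and assumed column names rather than the authors’ published code:

```r
# One row per student-question pair; mcq and vsaq are 1 (correct) or 0 (incorrect)
# for the same question asked in both formats.
df <- data.frame(
  student = rep(1:3, each = 2),
  mcq     = c(1, 1, 0, 1, 0, 0),
  vsaq    = c(1, 0, 0, 0, 0, 1)
)

# Positive cueing: MCQ correct but the equivalent VSAQ incorrect.
df$pos_cue <- df$mcq == 1 & df$vsaq == 0
# Negative cueing: VSAQ correct but the equivalent MCQ incorrect.
df$neg_cue <- df$vsaq == 1 & df$mcq == 0

# Percentage of questions showing each type of cueing, per student:
aggregate(cbind(pos_cue, neg_cue) ~ student, data = df,
          FUN = function(x) 100 * mean(x))
```

Swapping `student` for a question identifier gives the per-question frequencies reported below.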

Summative exam

The summative exams of RM and DA were rewritten to replace part of the MCQs with VSAQs (45 in RM and 16 in DA). For RM, this was done by rewriting existing MCQs, whereas for DA, a 2-hour workshop on how to write VSAQs was organized for teachers. Question writers in both courses received written instructions on how to write VSAQs, based on information provided by the author of the initial paper on VSAQs, as can currently be found in the publication by Bala et al. [13, 21]. For each learning goal, questions were initially written by experts with content knowledge (e.g., gynaecology questions were written by a gynaecologist). The research team (RJJ, AMJL) assisted, if necessary, in adjusting the questions to the optimal VSAQ format. At the end of both exams, preapproved answers to the VSAQs were automatically marked as correct. Subsequently, teachers reviewed all incorrect answers and could easily add answers that were not in the predefined list but were also found to be correct. The grading was done by one teacher; a second teacher was consulted when there were doubts about certain answers.
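The marking flow described here can be sketched as follows; the answers, the pre-approved list, and the normalisation step are hypothetical illustrations, not the assessment software’s actual implementation:

```r
# Pre-approved answers are machine-marked correct; everything else is queued
# for teacher review, and approved variants can be appended to the list.
preapproved <- c("ulcerative colitis", "colitis ulcerosa")   # hypothetical list
answers <- c("Ulcerative colitis ", "UC", "Crohn's disease", "colitis ulcerosa")

normalise <- function(x) tolower(trimws(x))                  # case/whitespace tolerant
auto_correct <- normalise(answers) %in% normalise(preapproved)

# Each distinct unmatched answer needs only one review decision:
review_queue <- unique(normalise(answers)[!auto_correct])
review_queue
# If the teacher accepts a variant (e.g., the abbreviation "uc"), appending it
# to `preapproved` marks all matching answers correct from then on.
```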

VSAQ review time per question for each teacher was recorded in DA to determine acceptability. The total reviewing time per question was recorded by the reviewer using the timer function on a smartphone. Reviewing time started when the reviewer first looked at the question and ended when the question was fully resolved; this included both reviewing the answers and, when necessary, discussion with other teachers. Because reviewing time was recorded for only a few VSAQs during the initial data collection, which prevented a complete and unselected overview, the reviewing time of all VSAQs in the summative exam one year later was also collected. The AEES questionnaire was supplemented with two questions regarding students’ insights: 1) Because I knew that I would be tested with very short answer questions, I studied in a different way than I normally would; and 2) Through the use of very short answer questions, the test was a better representation of what I learned in this course, compared to a test using multiple-choice questions. In addition, the perceived alignment between teaching and assessment was compared to the alignment of the course in the years 2017, 2018, and 2019, determined from two pre-existing questions in the AEES questionnaire: 1) The assessment as a whole (form and content) is appropriate for what you should have mastered at the end of the course; and 2) The (online) test formats (e.g., MCQs, open questions, oral and written presentations, practical assessments) matched what I have learned. Due to emergency remote teaching during COVID-19, 2020 was not considered in these comparisons.

Statistical analysis

Continuous variables are presented as mean (standard deviation) or median (interquartile range), depending on their distribution. Categorical variables are presented as number (proportion). Reliability was determined by calculating Cronbach’s α of the VSAQs and MCQs in both formative exam formats, which is a measure of internal correlation between items at the test level [22, 23]. A higher Cronbach’s α indicates better reliability, with values of 0.7 or higher indicating acceptable reliability. The discriminative capability for content knowledge was determined using the mean of the R_ir values of each question, where the R_ir value is the correlation between one test item and the remaining test items [24]. Items with an R_ir value above 0.25 typically represent items with adequate discriminative capability. Mean test scores were calculated as the percentage of correctly answered questions. Reviewing time was expressed in minutes and seconds. All statistical analyses were performed using R version 4.1.0 (R Foundation for Statistical Computing, Vienna, Austria).
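For concreteness, a minimal base-R sketch of both measures on a 0/1 item-score matrix follows. The data are dummy values; the authors’ own annotated analysis code is available in the GitHub repository linked above.

```r
# scores: one row per student, one column per question, entries 1/0 (correct/incorrect)
set.seed(1)
scores <- matrix(rbinom(200 * 25, 1, 0.6), nrow = 200)  # dummy data

# Cronbach's alpha: (k / (k - 1)) * (1 - sum of item variances / variance of total score)
cronbach_alpha <- function(x) {
  k <- ncol(x)
  k / (k - 1) * (1 - sum(apply(x, 2, var)) / var(rowSums(x)))
}

# Item-rest correlation (R_ir): correlation of each item with the sum of all other items
item_rest <- function(x) {
  sapply(seq_len(ncol(x)), function(i) cor(x[, i], rowSums(x[, -i])))
}

cronbach_alpha(scores)   # 0.7 or higher indicates acceptable reliability
mean(item_rest(scores))  # mean R_ir; above 0.25 indicates adequate discrimination
```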

Ethical approval

This study was reviewed and approved by the Educational Research Review Board of the Leiden University Medical Center (file number: OEC/ERRB/20201208/1).

Results

Of the 335 students who took the formative exam in RM, 216 were included in our study. In DA, 159 of the 259 students who took the formative exam were included (Fig 1B). In RM, 104 students started with MCQs (RM-MCQ-first) and 112 students started with VSAQs (RM-VSAQ-first). In DA, 90 students were assigned to DA-MCQ-first and 69 students to DA-VSAQ-first. The summative exam was taken by 352 students in RM and 308 students in DA.

Reliability and discrimination

We compared the VSAQs of students starting with VSAQs with the MCQs of students starting with MCQs; this comparison reflects results for VSAQs and MCQs that are not influenced by prior questions. VSAQs had higher reliability than MCQs (Cronbach’s α 0.74 vs. 0.57 in RM; 0.87 vs. 0.83 in DA, for VSAQs vs. MCQs, respectively) (Table 1). In the same students, discrimination (mean [SD]), expressed as the R_ir value, was higher for VSAQs than for MCQs (0.27 [0.15] vs. 0.17 [0.13] in RM; 0.43 [0.10] vs. 0.39 [0.10] in DA, for VSAQs vs. MCQs, respectively). The mean scores (mean [SD]) were lower and more widely distributed for VSAQs than for MCQs (57.0 [15.7] vs. 71.2 [12.2] in RM; 51.6 [23.9] vs. 70.0 [19.7] in DA, for VSAQs vs. MCQs, respectively). These results were similar when comparing results within groups (e.g., VSAQs vs. MCQs within MCQ-first).


https://doi.org/10.1371/journal.pone.0288558.t001

Acceptability

In the initially collected data, the average reviewing time per VSAQ by one teacher in the summative exam of DA (7 VSAQs, 308 students) was 2 minutes and 20 seconds (SD 52 seconds). Additionally, on average 2 minutes and 9 seconds (SD 2 minutes and 36 seconds) were spent on replying to comments and consulting other teachers. The maximum time spent on a single VSAQ was 11 minutes and 24 seconds. One year later (22 VSAQs, 338 students), the average time spent on reviewing questions in DA was 1 minute and 58 seconds (SD 40 seconds), and consultation of other teachers took on average 36 seconds (SD 47 seconds).

Secondary outcomes

Positive cueing, defined as a correctly answered MCQ with an incorrectly answered equivalent VSAQ, occurred on average more often per student in RM-VSAQ-first and DA-VSAQ-first (20.0%, IQR 16.0–28.0% and 20.8%, IQR 12.5–29.2%, respectively) than in RM-MCQ-first and DA-MCQ-first (4.0%, IQR 4.0–8.0% and 8.3%, IQR 4.2–16.7%, respectively) (Table 2). At the question level, positive cueing occurred in 100% of questions in all groups. The frequency of positive cueing per question was on average higher in RM-VSAQ-first and DA-VSAQ-first (14.3%, IQR 7.1–33.9% and 22.7%, IQR 10.9–28.5%, respectively) than in RM-MCQ-first and DA-MCQ-first (4.8%, IQR 2.9–9.6% and 15.9%, IQR 11.8–20.3%, respectively) (Table 3). Negative cueing, defined as answering the VSAQ correctly and the equivalent MCQ incorrectly, occurred more often per student in RM-MCQ-first than in RM-VSAQ-first (8.0%, IQR 4.0–12.0% vs. 4.0%, IQR 0.0–4.0%). In DA-MCQ-first, negative cueing was on average not observed in students (0.0%, IQR 0.0–4.2%). Negative cueing per question occurred in 92%, 56%, 79%, and 79% of questions for RM-MCQ-first, RM-VSAQ-first, DA-MCQ-first, and DA-VSAQ-first, respectively. The frequency of negative cueing per question was on average lower in RM-VSAQ-first than in RM-MCQ-first (0.9%, IQR 0.0–1.8% vs. 3.8%, IQR 1.9–12.5%), but higher in DA-VSAQ-first than in DA-MCQ-first (3.1%, IQR 1.6–5.1% vs. 1.8%, IQR 1.2–3.5%). The maximum percentage of positive cueing by students on a single question was highest in RM-MCQ-first (62.5%). The maximum percentage of negative cueing by students on a single question was 38.5%, in RM-MCQ-first.


https://doi.org/10.1371/journal.pone.0288558.t002


https://doi.org/10.1371/journal.pone.0288558.t003

When asked whether they found the questions easy, students who had answered only VSAQs disagreed more often than students who had answered only MCQs in the DA course (EQ2: 3, IQR 2–3 vs. 2, IQR 2–2), but students estimated their final grade to be higher if they had started with VSAQs (S1 Table). More than 80% of students were uncertain about answering VSAQs correctly (86% and 82% in RM and DA, respectively) (S2 Table and Fig 2). At the end of the formative exam, after having answered questions in both formats, approximately 90% of students (strongly) disagreed that VSAQs were easier than MCQs (S3 Table). Approximately half of students (51% in RM and 46% in DA) (strongly) agreed that assessment with VSAQs changed their test preparation. In DA, 60% of students agreed or strongly agreed that VSAQs better represented clinical practice; this was 34% in RM. Almost 70% of students in RM and 48% in DA (strongly) disagreed that the test was better aligned with the course through the use of VSAQs. In RM, 45% of students, and in DA, 42%, (strongly) agreed that they would change their learning behavior if tested with VSAQs (S4 Table). In RM, 83% of students (strongly) disagreed with the statement that the use of VSAQs made the exam a better representation of what they learned during the course compared to MCQs; this was 51% in DA. Perceived alignment of assessment, teaching, and learning activities is reported in S5 Table.


Fig 2. Distribution of the answers given to the 5-point Likert scale evaluation questions halfway through the exam, after the MCQs or VSAQs, and at the end of the exam; and students’ estimates of their grade halfway through the exam, in RM (A, B, C) and DA (D, E, F).

https://doi.org/10.1371/journal.pone.0288558.g002

Discussion

In this study, we aimed to externally validate earlier results regarding the reliability, discrimination, and acceptability of VSAQs compared to MCQs in a cohort of Dutch undergraduate medical students, based on earlier work by Sam et al. [13]. In accordance with their findings, we observed higher reliability and discrimination for VSAQs than for MCQs, with an acceptable time to mark VSAQs. Results were more positive in DA than in RM, which might be attributable to the workshop offered to the teachers, course material better suited to VSAQs, and the opportunity for students to practice with VSAQs prior to the exams. Additionally, we explored the impact of VSAQs on cueing effects, perceived alignment between assessment and teaching, and student experiences of VSAQs. Cueing effects occurred less frequently in VSAQs than in MCQs. Students noted a high level of uncertainty when answering VSAQs, and around half of students prepared differently for VSAQs. More than half of the students thought VSAQs better represented clinical practice. However, perceived constructive alignment seemed to diminish in RM and did not improve in DA.

The higher reliability and discrimination but lower test scores of VSAQs compared to MCQs may in part reflect the reduced possibility of guessing correctly in VSAQs, and are in line with Sam et al. [13] and other previous studies [14, 18]. The lower score also suggests that VSAQs are more difficult, possibly because they require answer generation rather than answer recognition, which provides a better measure of a student’s true content knowledge and increases validity [4, 13, 14]. The high discriminative capability of VSAQs is further supported by the higher average R_ir values of VSAQs in DA. In RM, average R_ir values were relatively low for both MCQs and VSAQs, although an increase in R_ir value for VSAQs compared to MCQs could still be observed.

The teachers who graded the VSAQs deemed the reviewing time acceptable. This is supported by previous studies that found comparable and shorter review times, using different marking systems, multiple examiners, and more questions [13, 14]. Nonetheless, whereas not every MCQ has to be reviewed, a VSAQ should always be reviewed after machine marking, although repeated use of questions may decrease reviewing time, depending on the software used [13].

Positive cueing per student occurred more often in the students who started with VSAQs, which is in line with the findings of Sam et al. [13]. This is expected, as students answering the VSAQs first and MCQs second cannot carry the MCQ answer over to the VSAQ and therefore have to rely on content knowledge for the VSAQ. Cueing per question was also seen more often in this group, but not for every question [13]. However, we most likely also measured students guessing the right answer, as it is nearly impossible to separate guessing from cueing in MCQs [11]. Negative cueing differed only slightly between groups, in line with Sam et al., who observed comparable negative cueing between groups [13]. It should be noted that cueing occurred in many questions, but that within each question it was observed in only a few students.

Looking at students’ experiences, we found results comparable with those of Sam et al. [13]. The vast majority of students thought the VSAQs were more difficult than MCQs, and almost half of the students said they changed their learning behavior because they were assessed with VSAQs. We observed several noteworthy differences in student experiences between courses that may serve as a primer for future research. Concerning clinical practice, students in DA were more positive than students in RM, possibly because the clinical content of DA was a better fit for VSAQs than the more fundamental content of RM. This indicates the importance of identifying areas that will benefit most from assessment with VSAQs [21]. Feedback provided by students mainly indicated that VSAQ phrasing might not always have been clear enough. This led to uncertainty regarding the level of specificity of the desired answer, highlighting the importance of well-designed VSAQs with specific lead-ins [4, 21, 25]. Additionally, a majority of students in RM considered VSAQs a poorer representation of course content, compared to only half of the students in DA. This may in part be due to the differences in course content, but uncertainty as a result of unclearly formulated questions may also have played a role. The student feedback in RM possibly also reflects insufficient preparation for the new question format during the course, as students in DA were exposed to VSAQs at multiple timepoints throughout the course. If students have more time to practice, their ability to answer VSAQs may improve [21].

Strengths of this study include the randomized design, the study of two different courses, and the investigation of student perspectives. Furthermore, the fact that the teachers who participated in our study had limited experience with VSAQs allowed us to validate the previous results in an independent setting with less experienced teachers. Limitations are the seemingly poor question quality in the formative RM exam and the relatively small sample size. Furthermore, due to the low-stakes nature of the formative exam, we cannot be certain that students performed at their best when answering the questions. To determine acceptability, we used only one reviewer, who logged the times by hand, leading to less accurate reviewing times. To obtain a more precise measure of acceptability, these findings could be extended by using multiple examiners, more VSAQs, and automatically logged times.

Although we validated the VSAQs and investigated student experiences in a medical cohort, we believe that the strengths of VSAQs compared to MCQs generalize to other educational fields. Notably, student experiences were mainly related to the VSAQ format itself rather than to the medical context. Real-life situations rarely offer a clear single best answer or a list of possible answers. Moreover, in any field, open essay questions or other higher-order questions are costly to implement. Although further studies should extend these results to general higher education, our results show that VSAQs may provide a promising alternative to MCQ-based assessment in education more broadly.

In conclusion, this study confirms the positive results of Sam et al. [13] on VSAQs in terms of reliability, discrimination, and acceptability in formative assessments in a Dutch cohort of undergraduate medical students. Additionally, these results were confirmed with teachers who had only limited prior VSAQ experience, and previous results on student experiences were extended. Wider implementation of VSAQs in medical education seems justified and may also improve assessment in other fields of higher education.

Supporting information

S1 Table. Median (IQR) scores of the 5-point Likert scale evaluation questions (EQ1-3 and EQ5-9) and the estimated grade question (EQ4) in the formative exam.

EQ1-4 were asked halfway through the exam, after the MCQs (MCQ-first) or VSAQs (VSAQ-first); EQ5-EQ9 were asked at the end of the exam, after both MCQs and VSAQs.

https://doi.org/10.1371/journal.pone.0288558.s001

S2 Table. Distribution of the answers given to the 5-point Likert scale evaluation questions halfway through the formative exam, after MCQs (MCQ-first) or VSAQs (VSAQ-first).

https://doi.org/10.1371/journal.pone.0288558.s002

S3 Table. Distribution of the answers given to the 5-point Likert scale evaluation questions at the end of the formative exam.

https://doi.org/10.1371/journal.pone.0288558.s003

S4 Table. Median (IQR) scores and distribution of the answers given to the 5-point Likert scale evaluation questions after the summative exam.

https://doi.org/10.1371/journal.pone.0288558.s004

S5 Table. Median (IQR) scores of the 5-point Likert scale questions on constructive alignment after the summative exam (1: strongly disagree, 2: disagree, 3: neutral, 4: agree, 5: strongly agree).

https://doi.org/10.1371/journal.pone.0288558.s005

Acknowledgments

The authors wish to thank all members of the research group of the Centre for Innovation in Medical Education at Leiden University Medical Centre for their critical appraisal of the research protocol and all the students for their willingness to participate and valuable feedback.

References

19. Toetsanalyse in RemindoToets [Internet]. Paragin; 2018. Available from: https://www.paragin.nl/update/toetsanalyse-in-remindotoets/
Validity of very short answer versus single best answer questions for undergraduate assessment

  • Amir H. Sam 1,2,
  • Saira Hameed 1,
  • Joanne Harris 2 &
  • Karim Meeran 1,2

BMC Medical Education, volume 16, Article number: 266 (2016)


Abstract

Background: Single Best Answer (SBA) questions are widely used in undergraduate and postgraduate medical examinations. Selection of the correct answer in SBA questions may be subject to cueing and therefore might not test the student’s knowledge. In contrast to this artificial construct, doctors are ultimately required to perform in a real-life setting that does not offer a list of choices. This professional competence can be tested using Short Answer Questions (SAQs), where the student writes the correct answer without prompting from the question. However, SAQs cannot easily be machine marked and are therefore not feasible as an instrument for testing a representative sample of the curriculum for a large number of candidates. We hypothesised that a novel assessment instrument consisting of very short answer (VSA) questions would be a superior test of knowledge compared to assessment by SBA.

Methods: We conducted a prospective pilot study on one cohort of 266 medical students sitting a formative examination. All students were assessed both by a novel assessment instrument consisting of VSAs and by SBA questions. Both instruments tested the same knowledge base. Using the filter function of Microsoft Excel, the range of answers provided for each VSA question was reviewed and correct answers accepted in less than two minutes per question. Examination results were compared between the two methods of assessment.

Results: Students scored more highly on all fifteen SBA questions than on the VSA question format, despite both examinations requiring the same knowledge base.

Conclusions: Valid assessment of undergraduate and postgraduate knowledge can be improved by the use of VSA questions. Such an approach will test nascent physician ability rather than ability to pass exams.


Background

Single Best Answer (SBA) questions are widely used in both undergraduate and postgraduate medical examinations. The typical format is a question stem describing a clinical vignette, followed by a lead-in question about the described scenario, such as the likely diagnosis or the next step in the management plan. The candidate is presented with a list of possible responses and asked to choose the single best answer.

SBA questions have become increasingly popular because they can test a wide range of topics with high reliability and are the ideal format for machine marking. They also have a definitive correct answer which is therefore not subject to interpretation on the part of the examiner.

However, the extent to which SBAs measure what they are intended to measure, that is, their ‘validity’, is subject to some debate. Identified shortcomings of SBAs include the notion that clinical medicine is often nuanced, making a single best answer inherently flawed. For example, we teach our students to form a differential diagnosis, but the ability to do this cannot, by the very nature of SBA questions, be assessed by this form of testing. Secondly, at the end of the history and physical examination, the doctor has to formulate a diagnosis and management plan based on the information gathered rather than a ‘set of options’. Furthermore, the ability of an SBA to accurately test knowledge is affected by the quality of the wrong options (‘distractors’). Identifying four plausible distractors for SBAs is not always easy; this can be particularly challenging when assessing fundamental or core knowledge. If one or more options are implausible, the likelihood of students choosing the correct answer by chance increases: with five options, eliminating two implausible distractors raises the chance of a correct guess from 20% to 33%. Thus the distractors themselves may enable students to get the correct answer without actually knowing the fact in question, because candidates may arrive at the correct answer simply by eliminating all the other options.

Lastly, and perhaps most importantly, cueing is inherent to this mode of assessment: with enough exam practice, trigger words or other recognised signposts contained within the question mean that what is ultimately tested is the candidate’s ability to pass exams rather than their vocational progress. It is therefore possible for candidates to get the answer correct even if their knowledge base is inadequate.

Within our own medical school, throughout the year groups, students undergo both formative and summative assessments by SBA testing. In order to address the shortcomings that we have identified in this model, we have developed a novel, machine-marked examination in which students give a very short answer (VSA), which typically consists of three words or fewer. We hypothesised that VSAs would prove a superior test of inherent knowledge compared to SBA assessment.

Methods

In a prospective pilot study, the performance of one cohort of 266 medical students in a formative examination was compared when the same knowledge was tested using either SBA or a novel, machine-marked VSA assessment. All students sat an online examination using a tablet computer. Questions were posed using a cloud-based tool and students provided answers on a tablet or smartphone. In the examination, 15 questions in the form of a clinical vignette (Table 1) were asked twice (Additional file 1: Supplementary methods).

The first time the questions were asked, candidates were asked to type their response as a VSA, typically one to three words, to offer a diagnosis or a step in the management plan. The second time, the questions were asked in an SBA format. We compared medical students’ performance between the questions in this format and the VSA format. To avoid the effect of cueing in subsequent questions, the exam was set up in such a way that students could not return to previously answered questions.

SBA responses were machine marked. The VSA data were exported into a Microsoft Excel spreadsheet. Using the ‘filter’ function in Microsoft Excel, we reviewed the range of answers for each question and assigned marks to acceptable answers. In this way, minor misspellings or alternative correct spellings could be rapidly marked as correct (Fig. 1). When several students wrote the same answer, it appeared as only one entry within the filter function to be marked. As a consequence, the maximum time spent on marking each question for 266 students was two minutes.

Fig. 1 Marking method using Microsoft Excel and the filter function. In this example, candidates were given a clinical vignette and then asked to interpret the patient’s 12-lead electrocardiogram (ECG). There were 19 answers given by the students, and seven variants of the correct answer (‘pericarditis’) were deemed acceptable, as shown by the ticks in the filter.

McNemar’s test was used to compare the students’ responses to the VSA and SBA questions.
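As a minimal illustration of this paired comparison (dummy marks, not the study data): each student contributes a correct/incorrect outcome under both formats for the same question, and McNemar’s test evaluates the discordant pairs.

```r
# 1 = correct, 0 = incorrect, for the same question in both formats (dummy data)
vsa <- c(1, 0, 0, 1, 0, 1, 0, 0, 1, 0)
sba <- c(1, 1, 0, 1, 1, 1, 1, 0, 1, 1)

# 2x2 table of paired outcomes; McNemar's test uses the off-diagonal cells
tab <- table(VSA = factor(vsa, levels = 0:1), SBA = factor(sba, levels = 0:1))
mcnemar.test(tab)
```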

Results

There was a statistically significant difference in the proportions of correct/incorrect answers between the VSA and SBA formats for all 15 questions (p < 0.01). Figure 2 shows the number of students who got each question correct, either as a VSA or as an SBA. For all questions, more students got the correct answer when given a list to choose from in the SBA format than when asked to type words in the VSA examination. For example, when asked about the abnormality on a blood film from a patient with haemolytic uraemic syndrome, only 141 students offered the correct VSA; however, a further 113 students who did not know this guessed it correctly in the SBA version of the question.

Fig. 2 Number of correct responses given for questions asked in two formats: single best answer (SBA) and very short answer (VSA). Medical students were given a clinical vignette and asked about clinical presentations, investigations and treatment of the patient. In the VSA format, students were asked to type the answer (one to three words). In the SBA format, students chose the correct answer from a list. In all questions, more students got the correct answer when given a list to choose from in the SBA format than when asked to type the answer in the VSA assessment (*p < 0.01).

Discussion

The high scoring on the SBA questions in comparison to the VSA format is a cause for concern for those involved in training the fit-for-purpose doctors of tomorrow. Despite the same knowledge being tested, the ability of some students to score in the SBA but not in the VSA examination demonstrates that reliance on assessment by SBA can foster the learning of association and superficial understanding to pass exams.

Assessment is well known to drive learning [1], but it only drives learning that will improve performance in that type of assessment. Studying exam technique rather than the subject is a well-known phenomenon amongst candidates [2]. In the past 20 years, there has been an emphasis on the use of tools such as the SBA that offer high reliability in medical school assessment [3]. The replacement of hand-marked written exams with machine-marked SBAs has resulted in students engaging in extensive and strategic practice in that exam technique. Students who practise large numbers of past questions can become adept at choosing the correct option from a list without an in-depth understanding of the topic. While practising exam questions can increase knowledge, the use of cues to exclude distractors is an important skill in SBA exam technique. This technique improves performance in the assessment, but does not enhance the student’s ability to make a diagnosis in clinical situations. Students who choose the correct option in SBAs may be unable to answer a question on the same topic when asked verbally by a teacher who does not present them with options. In clinical practice, a patient will certainly not provide a list of possible options. We are thus sacrificing validity for reliability.

One of the key competencies for junior doctors is the ability to recall the correct diagnosis or test in a range of clinical scenarios. In a question about a patient with suspected diabetic ketoacidosis (DKA), only 42 students offered to test capillary or urinary ketones, whereas when the same question was posed in the SBA format, another 165 students chose the correct answer. We expect our junior doctors to recall this important immediate step in the management of an unwell patient with suspected DKA. Our findings therefore suggest that assessment by VSA questions may offer added value in testing this competency.

An ideal assessment will encourage deep learning rather than recognition of the most plausible answer from a list of options. Indeed, tests that require students to construct an answer appear to produce better memory than tests that require students to recognise an answer [4].

In contrast to SBA, our pilot study has demonstrated that to correctly answer a VSA, students need to be able to generate the piece of knowledge in the absence of cues, an approach that is more representative of real-life medical practice.

Our increasing reliance on assessment by SBA is partly an issue of marking manpower. SBAs can be marked by a machine, making them a highly efficient way to generate a score for a large number of medical students. Any other form of written assessment requires a significant investment of time by faculty members to read and mark the examination. However, our novel machine-marked VSA tested knowledge and understanding, yet each question could still be marked in two minutes or less for the entire cohort. The use of three or fewer words allowed for a stringent marking scheme, thus eliminating the inter-marker subjectivity that can be a problem in other forms of free-text examination.

It must be emphasised that there is no single ideal mode of assessment and that the best way to achieve better reliability and validity is by broad sampling of the curriculum using a variety of assessment methods. This is a pilot study and further research should evaluate the reliability, validity, educational impact and acceptability of the VSA question format.

Conclusions

This pilot study highlights the need to develop machine-marked assessment tools that test learning and understanding rather than proficiency in exam technique. This study suggests that students are less likely to generate a correct answer when asked to articulate a response to a clinical vignette than when they can pick an answer from a list of options. This raises the possibility that the VSA is a valid test of a student’s knowledge, and the correlation of VSA marks with other modes of assessment should be investigated in future research. Future examinations may be enhanced by the introduction of VSAs, which could add an important dimension to assessments in clinical medicine.

Abbreviations

ECG: Electrocardiogram
SAQ: Short answer question
SBA: Single best answer
VSA: Very short answer

References

1. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357:945–9.

2. McCoubrie P. Improving the fairness of multiple-choice questions: a literature review. Med Teach. 2004;26:709–12.

3. van der Vleuten CP, Schuwirth LW. Assessing professional competence: from methods to programmes. Med Educ. 2005;39:309–17.

4. Wood T. Assessment not only drives learning, it may also help learning. Med Educ. 2009;43:5–6.



Availability of data and materials

The dataset can be requested from the corresponding author on reasonable request.

Authors’ contributions

AHS, SH, JH and KM designed the work, conducted the study, analysed and interpreted the data and wrote the manuscript. AHS, SH, JH and KM read and approved the final manuscript.

Authors’ information

AHS is Head of Curriculum and Assessment Development in the School of Medicine, Imperial College London. SH is an NIHR Clinical Lecturer at Imperial College London. JH is Director of Curriculum and Assessment in the School of Medicine, Imperial College London. KM is the Director of Teaching in the School of Medicine, Imperial College London.

Competing interests

The authors declare that they have no competing interests.

Consent for publication

Not applicable.

Ethics approval and consent to participate

The study was reviewed by the Medical Education Ethics Committee (MEEC) at Imperial College London and was categorised as teaching evaluation, thus not requiring formal ethical approval. MEEC also deemed that the study did not require consent.

Author information

Authors and affiliations

Division of Diabetes, Endocrinology and Metabolism, Imperial College, London, UK

Amir H. Sam, Saira Hameed & Karim Meeran

Medical Education Research Unit, School of Medicine, Imperial College, London, UK

Amir H. Sam, Joanne Harris & Karim Meeran


Corresponding author

Correspondence to Karim Meeran.

Additional file

Additional file 1: Supplementary Methods. Questions used in Single Best Answer (SBA) and Very Short Answer (VSA) formats. (DOCX 22 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Cite this article

Sam, A.H., Hameed, S., Harris, J. et al. Validity of very short answer versus single best answer questions for undergraduate assessment. BMC Med Educ 16, 266 (2016). https://doi.org/10.1186/s12909-016-0793-z

Received: 11 December 2015

Accepted: 08 October 2016

Published: 13 October 2016

DOI: https://doi.org/10.1186/s12909-016-0793-z



Very Short Answer Questions: A Novel Approach To Summative Assessments In Pathology


Authors: Sam AH, Peleva E, Fung CY, Cohen N, Benbow EW, Meeran K

Received 12 December 2018

Accepted for publication 23 October 2019

Published 4 November 2019 Volume 2019:10 Pages 943—948

DOI https://doi.org/10.2147/AMEP.S197977


Editor who approved publication: Dr Md Anwarul Azim Majumder

Amir H Sam, 1 Emilia Peleva, 1 Chee Yeen Fung, 1 Nicki Cohen, 2 Emyr W Benbow, 3 Karim Meeran 1

1 Imperial College School of Medicine, Imperial College London, London, UK; 2 King’s College London, London, UK; 3 University of Manchester, Manchester, UK

Correspondence: Karim Meeran, Department of Endocrinology, Charing Cross Hospital, Fulham Palace Road, London W6 8RF, UK. Email [email protected]

Background: A solid understanding of the science underpinning treatment is essential for all doctors. Pathology teaching and assessment are fundamental components of the undergraduate medicine curriculum. Assessment drives learning, and the choice of assessments influences students’ learning behaviours. The use of multiple-choice questions is common but is associated with significant cueing and may promote “rote learning”. Essay-type questions and Objective Structured Clinical Examinations (OSCEs) are resource-intensive in terms of delivery and marking and do not allow adequate sampling of the curriculum. To address these limitations, we used a novel online tool to administer Very Short Answer questions (VSAQs) and evaluated the utility of the VSAQs in an undergraduate summative pathology assessment.

Methods: A group of 285 medical students took the summative assessment, comprising 50 VSAQs, 50 single best answer questions (SBAQs), and 75 extended matching questions (EMQs). The VSAQs were machine-marked against pre-approved responses and subsequently reviewed by a panel of pathologists, with the software remembering all new marking judgements.

Results: The total time taken to mark all 50 VSAQs for all 285 students was 5 hours, compared to 70 hours required to manually mark an equivalent number of questions in a paper-based pathology exam. The median percentage score for the VSAQs test (72%) was significantly lower than that of the SBAQs (80%) and EMQs (84%), p

Conclusion: VSAQs are an acceptable, reliable and discriminatory method for assessing pathology, and may enhance students’ understanding of how pathology supports clinical decision-making and clinical care by changing learning behaviour.

Keywords: pathology, teaching, assessment, very short answer questions

A Letter to the Editor has been published for this article.

Introduction

Described as the “science underpinning medicine”, 1 pathology is fundamental for all doctors, helping to guide clinical reasoning, the appropriate use and interpretation of laboratory tests, accurate diagnoses, and the planning of patient care. 2 Information from pathology laboratories is needed for 70% of diagnoses in hospital inpatients in the United Kingdom (UK). 1 Consequently, pathology teaching should be an integral part of undergraduate medical education. A survey of UK medical schools 3 suggests there is great variation in pathology teaching. Some authors have raised concerns about a decrease in pathology teaching in modern medical curricula and its impact on junior doctors’ understanding of what is wrong with their patients and their ability to interpret investigation results. 2

Assessments are known to drive learning. 4 , 5 Currently, most undergraduate assessments use multiple-choice questions, such as Single Best Answer questions (SBAQs) or Extended Matching Questions (EMQs), 3 whereby candidates are presented with a list of possible answers from which they select the most appropriate response. Well-constructed multiple-choice questions such as SBAQs and EMQs can assess deep learning; however, they have been criticised as these formats test recognition rather than recall 6 and are subject to cueing. 7 Furthermore, students will prepare differently for different examination formats. 8 – 10 Multiple-choice questions have been shown to elicit test-taking behaviours such as “rote learning”, that may be inauthentic to real-world clinical reasoning; 11 patients do not present with a list of five possible diagnoses for the doctor to choose from. If assessments required students to recall knowledge, rather than select responses, this may alter learning behaviour by driving students to seek deeper understanding of the subject. Requiring candidates to generate responses has also been demonstrated to improve long-term retention after studying. 12 – 15 Essay-type questions and Objective Structured Clinical Examinations (OSCEs) can test the ability to recall and apply knowledge, and are used by some UK medical schools to assess pathology; 3 however, these are very resource-intensive in terms of delivery and marking, and can only cover limited sections of the curriculum.

An alternative assessment method is Very Short Answer questions (VSAQs), consisting of a clinical vignette followed by a question (usually about diagnosis or management), which requires candidates to generate a short response, typically one to four words long. 7 We have previously shown VSAQs, administered using novel online assessment management software, to be a highly reliable and discriminatory assessment method in formative examinations; however, these findings need to be confirmed in summative assessments, as students' level of motivation will be different in high-stakes summative settings.

We used an online tool to run a pathology summative assessment at Imperial College London. The aim of this study was to evaluate whether VSAQs used in a summative pathology assessment were an acceptable, reliable and discriminatory assessment tool.

Participants And Assessment

The Medical Education Ethics Committee at Imperial College London deemed this study to be an assessment evaluation, which did not require formal ethical approval. The pathology course at Imperial College School of Medicine currently starts at the beginning of Year 5 (the penultimate year), with a block of teaching, followed by some integrated pathology during Year 5. All medical students in Year 5 (n=338 in 2017 and n=285 in 2018) undertook a summative pathology assessment, as well as a written paper and a clinical skills assessment focusing on the specialties taught in Year 5. We introduced 25 VSAQs in the Pathology exam in 2017, which were delivered on paper and marked by hand. We subsequently included 50 VSAQs in the Pathology exam in 2018, which were administered on an iPad using an online exam management software (Practique; Fry-IT Ltd, London, UK) along with the Safe Exam Browser software, to ensure that only the exam was visible, and all other websites and applications were disabled.

Table: The pathology summative assessment blueprint. Questions were mapped to the Royal College of Pathologists undergraduate curriculum; each question covered one to three areas.

Figure: Example of computer marking of a Very Short Answer question (VSAQ). The answers shown in green match those on the list of possible answers and are automatically assigned a mark (1.00). The amber answer has been marked as correct based on its similarity to the acceptable answer. Answers marked by the computer as incorrect are shown in red. During the examiner verification process, this can be over-ridden, with all identical answers automatically receiving a mark (e.g., "acute lymphocytic leukaemia" in this case).

Identical responses were grouped in blocks by the application, and responses marked as correct by the examiners were applied to all identical answers. 7 Any answers marked as correct by the examiners were automatically added to the set of acceptable responses for that question, so that the software can recall these responses if the same question is used again in future.
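
To make this workflow concrete, here is a minimal sketch of the grouping-and-override logic described above. It is an illustration under stated assumptions, not the Practique software itself; all class names, student IDs and answers are hypothetical.

```python
from collections import defaultdict

class VSAQMarker:
    """Toy model of grouped machine-marking with persistent examiner judgements."""

    def __init__(self, acceptable_answers):
        # Pre-approved correct responses for a single question.
        self.acceptable = {a.strip().lower() for a in acceptable_answers}

    def group_responses(self, responses):
        # Identical responses are grouped into blocks, so one examiner
        # decision applies to every student who gave that answer.
        groups = defaultdict(list)
        for student_id, answer in responses:
            groups[answer.strip().lower()].append(student_id)
        return groups

    def machine_mark(self, groups):
        # First pass: answers matching the approved list score 1.00;
        # everything else is provisionally 0.00 pending examiner review.
        return {ans: (1.0 if ans in self.acceptable else 0.0) for ans in groups}

    def examiner_override(self, marks, answer, mark):
        # An examiner judgement overrides the machine mark for the whole
        # block and, if correct, is remembered for future use of the question.
        key = answer.strip().lower()
        marks[key] = mark
        if mark == 1.0:
            self.acceptable.add(key)
        return marks

marker = VSAQMarker(["acute lymphoblastic leukaemia", "ALL"])
groups = marker.group_responses([
    ("s001", "Acute lymphoblastic leukaemia"),
    ("s002", "acute lymphocytic leukaemia"),  # similar, but not pre-approved
    ("s003", "ALL"),
])
marks = marker.machine_mark(groups)
marks = marker.examiner_override(marks, "acute lymphocytic leukaemia", 1.0)
```

Because the examiner verifies one representative of each block rather than every script, review time scales with the number of distinct answers rather than the number of students.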

In order to evaluate the acceptability of the VSAQs in terms of faculty time required for marking, we compared the time taken for manual marking of the Pathology paper in 2017 versus the time taken for examiner reviews after online marking in 2018 (time was rounded to the nearest hour). Answers to EMQs and SBAQs were entirely machine-marked.

Statistical analyses were performed using IBM SPSS Statistics for Windows Version 24.0 (IBM Corp., Armonk, NY, USA) and PRISM Version 5.0C (GraphPad Software, Inc., San Diego, CA, USA). The mean is given for normally distributed data and the median for non-normally distributed data. The Kruskal–Wallis test with Dunn's multiple comparisons test was used to assess differences between groups for non-normally distributed data. Cronbach's alpha was calculated as a measure of reliability; Cronbach's alpha for the EMQs was adjusted to a 50-question test using the Spearman-Brown prediction formula. The item-total point-biserial correlation was calculated as a measure of discrimination.
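
For reference, the reliability and discrimination indices used here have standard definitions (stated below as generic formulas, not reproduced from the paper). With k items, item variances σᵢ², total-score variance σ_X², and an observed reliability ρ rescaled by a length factor n (here n = 50/75 when adjusting the 75-item EMQ paper to a 50-question test):

$$
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^{2}}{\sigma_X^{2}}\right),
\qquad
\rho_n = \frac{n\rho}{1 + (n-1)\rho},
\qquad
r_{pb} = \frac{\bar{X}_1 - \bar{X}_0}{s_X}\sqrt{p(1-p)},
$$

where $\bar{X}_1$ and $\bar{X}_0$ are the mean total scores of candidates who answered an item correctly and incorrectly, p is the proportion answering it correctly, and $s_X$ is the standard deviation of total scores.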

Acceptability

Double marking of 25 VSAQs for 338 students in 2017 took 42 hours. The total time spent by examiners to review the machine-marked answers to all 50 VSAQs for 285 students in 2018 was 5 hours.
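
The per-response workload implied by these totals (a rough back-of-envelope calculation, noting that the 2017 figure includes double marking) is

$$
\frac{42 \times 3600\ \text{s}}{25 \times 338\ \text{answers}} \approx 17.9\ \text{s per answer (2017, manual)},
\qquad
\frac{5 \times 3600\ \text{s}}{50 \times 285\ \text{answers}} \approx 1.3\ \text{s per answer (2018, review only)},
$$

roughly a fourteen-fold reduction in examiner time per response.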

In the 2018 exam, the median for the VSAQs was 72% (interquartile range 62%–82%), SBAQs 80% (interquartile range 72%–86%) and EMQs 84% (interquartile range 76%–88%). The median percentage score for the VSAQs test was significantly lower than both SBAQs and EMQs (p<0.0001).

Reliability And Discrimination

Cronbach’s alpha for the VSAQs test was 0.86, compared to 0.76 for the SBAQs test and 0.77 for the EMQs test. The mean item-total score point-biserial for the VSAQs was 0.35, compared to 0.30 for the SBAQs and 0.28 for EMQs.

Discussion

An alternative assessment method may be useful in driving pathology learning in medical students. VSAQs are a method for assessing the ability to recall and apply knowledge, as well as enabling broad sampling of the pathology curriculum across a wide variety of topics. We have used VSAQs with an online assessment tool for the first time in a summative pathology assessment. The results show that VSAQs can be an acceptable, reliable and discriminatory method for the assessment of pathology.

Students scored significantly lower on the VSAQs test, consistent with our previous findings suggesting that students find this assessment format more difficult. 7 Unlike SBAQs and EMQs, the VSAQs format requires students to generate, rather than recognise, responses and to demonstrate deeper understanding of pathology. Students have previously agreed that VSAQs were more representative of clinical practice and that using VSAQs in summative examinations will likely influence learning behaviour and improve preparation for clinical practice. 7

Use of open-ended questions has previously been limited by the resource-intensive nature of administration and marking by examiners. Furthermore, paper-based delivery of open-ended questions for large cohorts of students is limited by the need to decollate the mark sheets, mark by hand and enter marks into spreadsheets, all of which introduce the risk of human error. We have shown how to overcome these limitations using a novel online assessment tool and machine-marking. We were able to mark all 50 VSAQs for 285 students on the same day as the exam.

As identical responses were grouped in blocks by the application, responses marked as correct by the examiners were applied to all identical responses. This facilitated the review process, reduced marking time and ensured consistency. Furthermore, the online assessment software remembers new marking judgments and saves these to the existing set of acceptable responses for that question. This ensures that marking time improves with future use of each question. 7

This study is limited by the sample size and inclusion of students from a single centre only. As more undergraduate programs include VSAQs in their assessments, the utility of this assessment instrument for larger cohorts could be evaluated. Another limitation is that student feedback was not collected for this study; however, we have previously reported positive student feedback for VSAQs in a formative exam. 7

In summary, the use of VSAQs in summative pathology assessments is reliable, discriminatory and acceptable, and the assessment method will likely encourage deeper learning of pathology by undergraduate students. Future studies need to explore the impact of VSAQs on deep learning. Choice of assessments can complement teaching strategies, including better signposting to students and making pathology more visibly clinically relevant, enhancing students’ engagement with the specialty. Greater engagement with the specialty would enhance students’ understanding of the role of pathology in supporting clinical decision-making and clinical care.

The authors report no conflicts of interest in this work.

1. The Royal College of Pathologists. Pathology Undergraduate Curriculum. 2014.

2. Marsdin E, Biswas S. Are we learning enough pathology in medical school to prepare us for postgraduate training and examinations? J Biomed Educ. 2013. doi:10.1155/2013/165691

3. Mattick K, Marshall R, Bligh J. Tissue pathology in undergraduate medical education: atrophy or evolution? J Pathol. 2004;203(4):871–876. doi:10.1002/path.v203:4

4. Cox M, Irby DM, Epstein RM. Assessment in medical education. N Engl J Med. 2007;356(4):387–396. doi:10.1056/NEJMra054784

5. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357:945–949. doi:10.1016/S0140-6736(00)04221-5

6. Veloski JJ, Rabinowitz HK, Robeson MR, Young PR. Patients don't present with five choices: an alternative to multiple-choice tests in assessing physicians' competence. Acad Med. 1999;74(5):539–546. doi:10.1097/00001888-199905000-00022

7. Sam AH, Field SM, Collares CF, et al. Very-short-answer questions: reliability, discrimination and acceptability. Med Educ. 2018;52(4):447–455. doi:10.1111/medu.13504

8. Cilliers FJ, Schuwirth LWT, Van der Vleuten CPM. A model of the pre-assessment learning effects of assessment is operational in an undergraduate clinical context. BMC Med Educ. 2012;12:9. doi:10.1186/1472-6920-12-9

9. Al-Kadri HM, Al-Moamary MS, Roberts C, Van der Vleuten CPM. Exploring assessment factors contributing to students' study strategies: literature review. Med Teach. 2012;34:S42–S50. doi:10.3109/0142159X.2012.656756

10. Newble DI, Jaeger K. The effect of assessments and examinations on the learning of medical students. Med Educ. 1983;17(3):165–171. doi:10.1111/medu.1983.17.issue-3

11. Surry LT, Torre D, Durning SJ. Exploring examinee behaviours as validity evidence for multiple-choice question examinations. Med Educ. 2017. doi:10.1111/medu.13367

12. McConnell MM, St-Onge C, Young ME. The benefits of testing for learning on later performance. Adv Health Sci Educ. 2014;20(2):305–320. doi:10.1007/s10459-014-9529-1

13. Larsen DP, Butler AC, Roediger HL. Test-enhanced learning in medical education. Med Educ. 2008;42:959–966. doi:10.1111/med.2008.42.issue-10

14. Wood T. Assessment not only drives learning, it may also help learning. Med Educ. 2009;43:5–6. doi:10.1111/med.2008.43.issue-1

15. McDaniel MA, Roediger HL, McDermott KB. Generalizing test-enhanced learning from the laboratory to the classroom. Psychon Bull Rev. 2007;14(2):200–206. doi:10.3758/BF03194052


Int J Health Sci (Qassim). 2008 Jul;2(2).

Assessment Methods in Medical Education

Medical education, the art and science behind medical learning and teaching, has progressed remarkably. Teaching and learning have become more scientific and rigorous, curricula are based on sound pedagogical principles, and problem-based and other forms of active, self-directed learning have become mainstream. Teachers have progressed from the role of problem-identifier to that of solution-provider.

During the last three decades medical schools have faced a variety of challenges from society, patients, doctors and students. They have responded in several ways, including the development of new curricula, the introduction of new learning situations and new methods of assessment, and a realization of the importance of staff development. Many effective and interesting innovations have been forthcoming.

The effective and efficient delivery of healthcare requires not only knowledge and technical skills but also analytical and communication skills, interdisciplinary care, counseling, and evidence- and system-based care. This requires assessment systems that are comprehensive, sound and robust enough to assess these attributes alongside essential knowledge and skills.

Assessment is entering every phase of professional development. Assessment and evaluation are crucial steps in the educational process. Before choosing an assessment method, some important questions must be asked: What should be assessed? Why assess? For an assessment instrument one must also ask: Is it valid? Is it reliable? Is it feasible? What is assessed and which methods are used play a significant part in what is learnt. The wide range of assessment methods currently available includes essay questions, patient management problems, modified essay questions (MEQs), checklists, OSCEs, student projects, constructed response questions (CRQs), MCQs, critical reading papers, rating scales, extended matching items, tutor reports, portfolios, short and long case assessments, logbooks, trainer's reports, audit, simulated patient surgeries, video assessment, simulators, self-assessment, peer assessment and standardized patients.

Assessment has a powerful positive steering effect on learning and the curriculum. It conveys what we value as important and acts as the most cogent motivator of student learning. Assessment is purpose driven. In planning and designing assessments, it is essential to recognize the stakes involved in it. The higher the stake, the greater the implications of the outcome of the assessment. The more sophisticated the assessment strategies, the more appropriate they become for feedback and learning.

Measuring progress in acquiring core knowledge and competencies may be a problem if the exams are designed to measure multiple integrated abilities, such as factual knowledge, problem solving, analysis and synthesis of information. Students may advance in one ability and not in another. Therefore, progress tests that are designed to measure growth from the onset of learning until graduation should measure discrete abilities.

Mastery testing (criterion-referenced testing) requires that 100% of the items are answered correctly to determine whether students have attained a mastery level of achievement. In non-mastery testing, attainment of 65% of the tested material is considered sufficient.

Global rating scales are measurement tools for quantifying behaviours. Raters use the scale either by directly observing students or by recalling student performance, and judge a global domain of ability, for example clinical skills or problem solving.

Self assessment (self regulation) is a vital aspect of the lifelong performance of physicians. Self monitoring requires that individuals are able not only to work independently but also to assess their own performance and progress.

Every form of assessment can be used as a self assessment exercise as long as students are provided with ‘gold standard’ criteria for comparing their own performance against an external reliable measure. Self assessment approaches include: written exams (MCQs, True/False, Essay, MEQs, modified CRQs), performance exams (checklists, global rating, student logbook, portfolio, video, etc).

The oral examination/viva has poor content validity, high inter-rater variability and inconsistency in marking. The instrument is prone to biases and is inherently unreliable.

Long essay questions can be used for the assessment of complex learning outcomes that cannot be assessed by other means (e.g., writing skills, the ability to present arguments succinctly).

The short answer question (SAQ) is an open-ended, semi-structured question format. A structured, predetermined marking scheme improves objectivity, and the questions can incorporate clinical scenarios. A similar format is also known as the modified essay question (MEQ) or constructed response question (CRQ). Equal or higher test reliabilities can be achieved with fewer SAQs than with true/false items. If a large amount of knowledge is to be tested, MCQs should be used. SAQs have better content coverage than long essay questions.

The extended matching item is based on a single theme and has a long option list to avoid cueing. It can be used for the assessment of clinical scenarios and is a practical alternative to the MCQ while maintaining objectivity and consistency. It can be used in both the basic and clinical sciences.

Key Feature Test is a clinical scenario-based paper and pencil test. A description of the problem is followed by a limited number of questions that focus on critical, challenging actions or decisions. It has higher content validity with proper blueprinting.

The long case involves the use of a non-standardised real patient and may provide a unique opportunity to test the physician's tasks and interaction with a real patient. However, it has poor content validity, is less reliable and lacks consistency. Reproducibility of the score is 0.39, meaning that 39% of the variability in scores is due to the actual performance of students (signal) and the remaining 61% is due to errors of measurement (noise) (Norcini, 2002). In high-stakes summative assessment, the long case should be avoided.
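
The 0.39 figure is a reproducibility (generalizability) coefficient: the proportion of observed score variance attributable to true differences between examinees,

$$
R \;=\; \frac{\sigma^2_{\text{signal}}}{\sigma^2_{\text{signal}} + \sigma^2_{\text{noise}}} \;=\; 0.39,
$$

so the remaining 61% of the variance reflects measurement error.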

The short case involves the use of three to four non-standardised real patients with one to two examiners. It provides an opportunity for assessment with real patients and allows greater sampling than a single long case.

Objective Structured Clinical examination (OSCE) consists of multiple stations where each candidate is asked to perform a defined task such as taking a focused history or performing a focused clinical examination of a particular system. A standardized marking scheme specific for each case is used. It is an effective alternative to unstructured short cases.

The Mini-Clinical Evaluation Exercise (mini-CEX) is a rating scale developed by the American Board of Internal Medicine to assess six core competencies of residents: medical interviewing skills, physical examination skills, humanistic qualities/professionalism, clinical judgment, counseling skills, and organization and efficiency.

Direct Observation of Procedural Skills (DOPS) is a structured rating scale for assessing and providing feedback on practical procedures. The competencies commonly assessed include general knowledge about the procedure, informed consent, pre-procedure preparation, analgesia, technical ability, aseptic technique, post-procedure management, and counseling and communication.

Clinical work sampling is an in-training evaluation method that addresses system and rater biases by collecting data on observed behaviour at the time of actual performance and by using multiple observers and occasions.

Checklists are used to capture an observed behaviour or action of a student. Rating is generally done on a five- to seven-point scale.

360-degree evaluation/multisource assessment consists of measurement tools completed by multiple individuals in a person's sphere of influence. Assessment by peers, other members of the clinical team, and patients can provide insight into trainees' work habits, capacity for teamwork, and interpersonal sensitivity.

In the logbook, students keep a record of the patients seen or procedures performed, either in a book or on a computer. It documents the range of patient care and learning experiences of students. The logbook is very useful in focusing students on important objectives that must be fulfilled within a specified period of time (Blake, 2001).

A portfolio refers to a collection of one's professional and personal goals, achievements, and methods of achieving these goals. Portfolios demonstrate a trainee's development and technical capacity.

Skill based assessments are designed to measure the knowledge, skills, and judgment required for competency in a given domain.

Tests of clinical competence, which allow decisions to be made about medical qualification and fitness to practise, must be designed with respect to key issues including blueprinting, validity, reliability and standard setting, as well as clarity about their formative or summative function. MCQs, essays and oral examinations can be used to test factual recall and applied knowledge, but more sophisticated methods are needed to assess clinical performance, including directly observed long and short cases, objective structured clinical examinations, and the use of standardized patients.

The Objective Structured Clinical Examination (OSCE) has been widely adopted as a tool to assess students' or doctors' competences in a range of subjects. It measures outcomes and allows very specific feedback.

Other approaches to skill-based assessment include the traditional formats (oral exam/viva, long case) and alternative formats, which tackle the problems associated with traditional orals and long cases by having examiners observe the candidate's complete interaction with the patient, training examiners in a structured assessment process, and increasing the number of patient problems. Traditional unstructured orals and long cases have largely been discontinued in North America.

While selecting an assessment instrument it is necessary to know precisely what is to be measured. This should reflect course outcomes, as different learning outcomes require the use of different instruments. It is essential to use an instrument that is valid, reliable and feasible (taking into account the cost of the assessment in terms of both resources and time). A full variety of instruments will ensure that the results obtained are a true reflection of students' performance.

Multiple-sampling strategies accepted for the assessment of clinical competency include the OSCE, short answer questions, the mini-CEX (Mini-Clinical Evaluation Exercise), Direct Observation of Procedural Skills (DOPS), clinical work sampling (CWS), and 360-degree evaluation.

Assessment is an integral component of overall educational activities and should be designed prospectively along with learning outcomes. It should be purpose-driven, and assessment methods must provide valid and usable data and yield reliable and generalisable results.

Multiple assessment methods are necessary to capture all or most aspects of clinical competency; no single method is sufficient to do the job. For knowledge, concepts and the application of knowledge (the 'knows' and 'knows how' levels of Miller's conceptual pyramid for clinical competence), context-based MCQs, extended matching items and short answer questions are appropriate. For 'shows how', the multi-station OSCE is feasible. For performance-based assessment ('does'), the mini-CEX and DOPS are appropriate; alternatively, clinical work sampling and a portfolio or logbook may be used.
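
The mapping described in this paragraph can be summarised as a simple lookup. This is an illustrative data structure only; the level names follow Miller's pyramid as used above.

```python
# Miller's pyramid level -> assessment formats suggested in the text.
millers_pyramid = {
    "knows":     ["context-based MCQ", "extended matching item", "short answer question"],
    "knows how": ["context-based MCQ", "extended matching item", "short answer question"],
    "shows how": ["multi-station OSCE"],
    "does":      ["mini-CEX", "DOPS", "clinical work sampling", "portfolio or logbook"],
}

def suggest_formats(level: str) -> list[str]:
    """Return the assessment formats appropriate for a given Miller level."""
    return millers_pyramid.get(level.lower(), [])

print(suggest_formats("does"))  # ['mini-CEX', 'DOPS', 'clinical work sampling', 'portfolio or logbook']
```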

Standard setting involves judgment, reaching consensus, and expressing that consensus as a single score on a test. Norm-referenced standards are suitable for admission exercises that require selection of a predetermined number of candidates. A criterion-referenced standard (based on predefined test goals and standards of performance, where a certain level of knowledge or skill has been determined as required for passing) is feasible for competency-based examinations. Available approaches include the test-centred approach (Angoff's method and its variations), the examinee-centred approach (borderline group method), and several other innovations. Blueprinting refers to a process emphasizing that test content should be carefully planned against learning objectives.
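
As an illustration of the test-centred approach, the following sketch computes an (unmodified) Angoff cut score: each judge estimates, for each item, the probability that a borderline candidate would answer correctly, and the pass mark is the sum of the per-item means. All ratings here are hypothetical.

```python
# judges x items: estimated probability that a borderline candidate
# answers each item correctly (hypothetical ratings).
ratings = [
    [0.6, 0.4, 0.8, 0.7],  # judge 1
    [0.5, 0.5, 0.7, 0.6],  # judge 2
    [0.7, 0.3, 0.9, 0.6],  # judge 3
]

n_judges = len(ratings)
n_items = len(ratings[0])

# Mean rating per item = expected borderline score on that item.
item_means = [sum(judge[i] for judge in ratings) / n_judges for i in range(n_items)]

# Angoff cut score: the sum of the item means (here out of 4 marks).
cut_score = sum(item_means)
print(f"Pass mark: {cut_score:.2f} / {n_items}")  # Pass mark: 2.43 / 4
```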

The purpose of assessment should direct the choice of instruments. Needs assessment is the starting point of good assessment that identifies the current status of the students before the commencement of the actual educational activities. Needs assessment is used to determine the existing knowledge base, future needs, and priority areas that should be addressed.

Student assessment is a comprehensive decision making process with many important implications beyond the measure of students’ success. Student assessment is also related to program evaluation. It provides important data to determine the program effectiveness, improves the teaching program, and helps in developing educational concepts.

Good quality assessment not only satisfies the needs of accreditation but also contributes to student’s learning. Assessment methods should match the competencies being learnt and the teaching formats being used.

Competence is a habit of lifelong learning; it is contextual (e.g., practice setting, the local prevalence of disease) and developmental (habits of mind and behaviour and practical wisdom are gained through deliberate practice).

Further Reading

  • ACGME Outcome Project. Accreditation Council for Graduate Medical Education & American Board of Medical Specialties. Toolbox of assessment methods, version 1.1. www.acgme.org/outcomes/assess/toolbox.pdf
  • Case SM, Swanson DB. Constructing Written Test Questions for the Basic and Clinical Sciences. 3rd ed. Philadelphia, PA: National Board of Medical Examiners; 2002. www.nbme.org/about/itemwriting.asp
  • Day SC, Norcini JJ, Diserens D, et al. The validity of the essay test of clinical judgement. Acad Med. 1990;65(9):S39–40.
  • Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287(2):226–235.
  • Friedman Ben-David M. Standard setting in student assessment. AMEE Education Guide No. 18. Dundee, UK: Association for Medical Education in Europe; 2000.
  • Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9):S63–S67.
  • Norcini JJ, Swanson DB, Grosso LJ, Webster GD. Reliability, validity and efficiency of multiple choice questions and patient management problem item formats in assessment of clinical competence. Med Educ. 1985;19(3):238–247.
  • Norman G. Postgraduate assessment: reliability and validity. Trans J Coll Med S Afr. 2003;47:71–75.
  • Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad Med. 1995;70(3):194–201.
  • Swanson DB. A measurement framework for performance-based tests. In: Hart IR, Harden RM, editors. Further Developments in Assessing Clinical Competence. Montreal: Can-Heal; 1987.
  • Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357:945–949.
  • Van der Vleuten CPM. Validity of final examinations in undergraduate medical training. BMJ. 2000;321:1217–1219.
  • Falchikov N, Boud D. Student self-assessment in higher education: a meta-analysis. Review of Educational Research. 1989;59:395–430.
  • Van der Vleuten CPM, Swanson DB. Assessment of clinical skills with standardized patients: state of the art. Teach Learn Med. 1990;2(2):58–76.



Context-rich short answer questions (CR-SAQs) in assessment for learning in undergraduate medical education

Med Educ Online. 2019 Dec;24(1):1674569. doi: 10.1080/10872981.2019.1674569. PMID: 31570069; PMCID: PMC6781250.

Affiliation: Imperial College London, School of Medicine, London, UK.




Related articles and resources

  1. Very short answer questions: a viable alternative to multiple choice questions

    Background: Multiple choice questions, used in medical school assessments for decades, have many drawbacks: they are hard to construct, allow guessing, encourage test-wiseness, promote rote learning, give examinees no opportunity to express ideas, and provide no information about the strengths and weaknesses of candidates. Directly asked, directly answered questions such as Very Short Answer Questions (VSAQs) are considered a better alternative with several advantages.

  2. Twelve tips for introducing very short answer questions

    From the article's reference list: Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: cross-sectional study. BMJ Open. 9(9):e032550.

  3. Use of very short answer questions compared to multiple choice ...

    Very short answer questions (VSAQs), which are open-ended questions requiring a very short answer, may circumvent these limitations. Although VSAQ use in medical assessment increases, almost all research on reliability and validity of VSAQs in medical education has been performed by a single research group with extensive experience in the ...

  4. Twelve tips for introducing very short answer questions (VSAQs) into ...

    Most undergraduate written examinations use multiple-choice questions, such as single best answer questions (SBAQs) to assess medical knowledge. In recent years, a strong evidence base has emerged for the use of very short answer questions (VSAQs). VSAQs have been shown to be an acceptable, reliable …

  5. Very short answer questions: a viable alternative to multiple choice questions

    Multiple True/False (MTF) and One Best Answer Questions (BAQ) are widely employed by medical faculties by virtue of their advantages of instant machine scoring, freedom from examiner bias, and dependable reliability [1-4]. In this article, 'MCQ' is used to refer to both these instruments of assessment.

  6. (PDF) Very short answer questions: a viable alternative to multiple choice questions

    Research article (open access) by Thomas Puthiaparampil and Md Mizanur Rahman.

  7. Very short answer questions: a viable alternative to multiple choice questions

    Directly asked, directly answered questions like Very Short Answer Questions (VSAQ) are considered a better alternative with several advantages. Objectives: This study aims to compare student performance in MCQ and VSAQ and obtain feedback from the stakeholders. Methods: Multiple true-false, one best answer, and VSAQ tests were conducted in two ...

  8. Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: cross-sectional study

    Unlike short-answer questions, modified essay formats or clinical reasoning problems, VSAs are straightforward to deliver in an electronic format and efficient to mark; a minimal marking sketch is given after this list. We need to know that medical students and trainees have the required applied medical knowledge to practise safely, without test scores being confounded by the ability to use ...

  9. PDF: Very short answer questions: a viable alternative to multiple choice questions

    ... question tests, and it is widely accepted by medical students and academics in the medical faculty. Keywords: very short answer questions, multiple choice questions, best answer questions.

  10. Validity of very short answer versus single best answer questions for undergraduate assessment

    Single Best Answer (SBA) questions are widely used in undergraduate and postgraduate medical examinations. Selection of the correct answer in SBA questions may be subject to cueing and therefore might not test the student's knowledge. In contrast to this artificial construct, doctors are ultimately required to perform in a real-life setting that does not offer a list of choices.

  11. Very Short Answer Questions: A Novel Approach To ...

    To address these limitations, we used a novel online tool to administer Very Short Answer questions (VSAQs) and evaluated the utility of the VSAQs in an undergraduate summative pathology assessment. Methods: A group of 285 medical students took the summative assessment, comprising 50 VSAQs, 50 single best answer questions (SBAQs), and 75 ...

  12. Essay Questions and Variations

    Basics in Medical Education, pp. 299-307 (2003). Sections included: Abstract; Advantages; Challenges and Limitations; Basic Categories of Essay Questions; Short Answer Questions (SAQ); Modified Essay Questions (MEQ); References and Fu...

  13. Question Archives

    Questions about myeloma and MGUS: example exam questions and answers about myeloma and MGUS (monoclonal gammopathy of undetermined significance) for doctors a... Questions about non-Hodgkin lymphoma: exam questions and answers for doctors, medical student finals, OSCEs, PACES and CPD.

  14. Assessment Methods in Medical Education

    Medical education, the art and science behind medical learning and teaching, has progressed remarkably. ... The Short Answer Question (SAQ) is an open-ended, semi-structured question format. A structured, predetermined marking scheme improves objectivity. The questions can incorporate clinical scenarios. A similar format is also known as ...

  15. Validity of very short answer versus single best answer questions for undergraduate assessment

    We hypothesised that a novel assessment instrument consisting of very short answer (VSA) questions is a better test of knowledge than assessment by SBA. Methods: We conducted a prospective pilot study on one cohort of 266 medical students sitting a formative examination. All students were assessed by both a novel assessment instrument ...

  16. NCERT Solutions for Class 10 Science Chapter 12 Magnetic Effects of Electric Current

    Intext questions and solutions, page 200. Q. Draw magnetic field lines around a bar magnet. Sol. The magnetic field lines of a bar magnet emerge from the north pole and terminate at the south pole ...
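
As noted in entry 8 above, VSAQs are efficient to mark electronically. Below is a minimal, hypothetical sketch of machine-assisted VSAQ marking in the style described in the VSAQ literature: free-text answers are normalized and matched against a reviewed list of accepted variants, and anything unmatched is flagged for examiner review. The function names, example question, and accepted variants are illustrative assumptions, not taken from any of the studies above.

```python
# A hypothetical sketch of machine-assisted VSAQ marking: normalize the
# free-text answer, match it against a reviewed list of accepted variants,
# and flag anything unmatched for examiner review. The accepted variants
# below are invented for illustration.

def normalize(answer: str) -> str:
    """Lower-case, trim, and collapse internal whitespace."""
    return " ".join(answer.lower().split())

def mark_vsaq(answer: str, accepted: set) -> str:
    """Return 'correct' for a known variant, else 'review' for an examiner."""
    return "correct" if normalize(answer) in accepted else "review"

accepted_variants = {"fleming's left hand rule", "flemings left hand rule"}
print(mark_vsaq("Fleming's  left hand rule", accepted_variants))  # correct
print(mark_vsaq("left hand rule", accepted_variants))             # review
```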
