  • Many more questions aimed at diagnosis

    At the end of year 4, students at the MUV had attended various lectures but had hardly any practical experience with therapies. This may explain why significantly more items concerned the diagnosis of psychiatric diseases than their therapy.

    Among questions aimed at therapy, significantly more concerned pharmacotherapy than psychotherapy

    Before Block 20, the seminars on therapy in the MUV curriculum were almost exclusively pharmacological. Even after successfully completing Block 20, most students without personal experience of psychotherapy had little insight into how psychotherapy develops over the long term and what it can actually offer the patient; their associations with psychotherapy were still loaded with old stereotypes [13, 16]. This could explain why significantly more therapy questions addressed pharmacology than psychotherapy.

    A large majority of Step 1 questions

    The students mainly offered Step 1 questions. One may ask whether the lack of case-oriented questions indicated insufficient clinical thinking among the students. A plausible explanation is that students lacked adequate patient contact until the end of year four. Indeed, MUV students were allowed to begin their practical experience after year two, and only eight compulsory clerkship weeks were scheduled before the beginning of year five [17]. Thus, Austrian medical students gained consistent clinical experience only after year four, with rotations in year five and the newly introduced Clinical Practical Year in year six. A European comparison of medical curricula showed that students in other countries spent more time with patients earlier: Dutch, French and German medical students began with nursing training in year one and had 40, 10 and 4 months, respectively, more clerkship experience than Austrian students before entering year five [18, 19, 20, 21]. French and Dutch universities are strongly centered on clinical thinking, with a total of 36 clerkship months in France and the weekly presence of patients from the first lectures on in Groningen [22]. It would therefore be interesting to repeat a similar case-based exercise in these countries, to explore whether medical students at the same educational stage but with more practical experience are more likely to offer patient-vignette items.

    Students preferred to work with right facts and did not reject negatively worded questions

    As negatively worded questions are usually banned from MCQ exams, it was interesting to observe that the medical students did not reject them. In fact, negatively worded questions are more likely to be misunderstood; their comprehension correlates with reading ability [23] and concentration. Although many guidelines [6, 24] clearly advise avoiding negative items, 27.5 % of the questions the students generated were negatively worded. The students also offered Pick-N format questions with several right answers, despite the recommendations for this exercise: they offered significantly fewer answer options overall, but significantly more right answers to positively worded questions than to negatively worded ones. These results support the hypothesis that the students preferred handling right content while keeping wrong content to a minimum.
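    The proportions reported above could be recomputed from raw item data. A minimal sketch, assuming a hypothetical item record with a wording flag and answer counts (the schema and names are illustrative, not taken from the study):

```python
from dataclasses import dataclass

@dataclass
class Item:
    # Hypothetical schema for a student-generated MCQ item;
    # field names are ours, not the study's.
    stem: str
    negative: bool   # True if the stem is negatively worded ("NOT", "except", ...)
    n_right: int     # number of right answer options
    n_total: int     # total number of answer options

def share_negative(items):
    """Fraction of items whose stems are negatively worded."""
    return sum(i.negative for i in items) / len(items)

def mean_right_answers(items, negative):
    """Mean number of right answers among items of the given stem polarity."""
    subset = [i for i in items if i.negative == negative]
    return sum(i.n_right for i in subset) / len(subset)
```

    On the study's data, `share_negative` would come out near 0.275, and the reported pattern would appear as `mean_right_answers(items, False)` exceeding `mean_right_answers(items, True)`.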

    Several possible reasons can be contemplated. When students lack confidence in a topic and try to avoid unsuitable answer options, it can be harder to invent four wrong answers to a positively worded question than to list several right answers, which may be found in a textbook. Furthermore, some students may be reluctant to think up wrong facts for fear of learning wrong content. Indeed, among positively worded items, 26.6 % were offered with 3 or more right answers, which never happened for negatively worded items (Table 2).

    Notably, the “right answer options” of negatively worded stems, as well as the “wrong answer options” of positively worded stems, are actually “wrong facts”. For example, the right answer to the item “Which of the following symptoms does NOT belong to the ICD-10 criteria of depression?” (Item 177) is the only “wrong fact” among its 5 answer options. Writing the 4 “wrong answers” of this question, which are actually ICD-10 criteria for depression, can help students learn these diagnostic criteria. By contrast, the “right answers” to a positively worded item such as “Which vegetative symptoms are related to panic attacks?” (Item 121) are true facts.
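    The asymmetry described here can be captured in a small helper. A sketch under the assumption of single-polarity stems (the function name is ours, not the study's):

```python
def wrong_facts_to_write(negative_stem: bool, n_right: int, n_total: int) -> int:
    """Number of factually wrong statements the item author must invent.

    For a positively worded stem, every distractor is a wrong fact;
    for a negatively worded stem, only the "right answers" are wrong facts,
    while the distractors restate true content (e.g. the ICD-10 criteria).
    """
    return n_right if negative_stem else n_total - n_right
```

    An Item-177-style stem (negative, 1 right answer of 5) requires inventing only one wrong fact, whereas a positively worded stem with 1 right answer of 5 requires four, which fits the students' apparent preference for handling right content.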

    Finally, the students’ interest in right facts supports the theory that a positive approach, positive emotions and curiosity favor learning processes. Indeed, asking for right content is a natural way of learning, used by children from a very early age. Inborn curiosity, whether as the urge to explain the unexpected [25], the need to resolve uncertainty [26] or the urge to know more [27], shows in the sheer number of questions children ask [28, 29]. The students’ way of asking for right content appears very close to this original learning process.

    Findings from developmental psychology, cognitive psychology and neuroscience underline this hypothesis. Bower demonstrated the influence of affect on cognitive processes: people’s mood powerfully shaped their free associations to neutral words, and they learned incidents congruent with their mood better [30]. Growing neurophysiological knowledge has confirmed the close relation between concentration, learning and emotion, basic psychic functions that rely on the same brain structures. The amygdala, connected to major limbic structures (e.g. the prefrontal cortex, hippocampus and ventral striatum), plays a major role in affect regulation as well as in learning [15], and the hippocampus, essential to explicit learning, is highly sensitive to stress, presenting one of the highest concentrations of glucocorticoid receptors in the brain [31]. Stress diminishes synaptic plasticity within the hippocampus [32], a plasticity necessary for long-term memory.

    Neuroscientific research has also underlined the interdependence of cognitive ability and affect regulation. Salas showed, in a patient with prefrontal cortex damage after an ischemic stroke, that executive impairment and increased emotional reactivity left too few cognitive resources for self-modulation and reappraisal of negative affect [33].

    Considering this interdependence, the preference for right content might reflect a positive attitude and positive affect among the students. It would be interesting to research this relation further, as well as the students’ motivations in formulating their questions.

    Together, these reasons probably explain why the students offered significantly more wrong answers to negatively worded items and more right answers to positively worded items, both resulting in the use of more right facts. All the students’ assessment questions and the associated feedback were used to create a new database at the MUV, with the aim of integrating more right facts into future case-based learning exercises.

    The main limitation concerns the small sample size and the focus on a single curriculum element. Further studies with suitable sampling should include other medical fields and bridge the gap to learning-outcome research.

