Understanding students’ higher order thinking skills through the lens of assessment

Rayanne Shakra and Jim Tognolini give clear and comprehensive advice to teachers on how to use modern definitions of assessment to better assess their students’ Higher Order Thinking Skills.

Defining Higher Order Thinking Skills (HOTS) is no easy task. Research nearly forty years ago noted that “Defining thinking skills, reasoning, critical thought and problem solving is troublesome to both social scientists and practitioners. Troublesome is a polite word; the area is a conceptual swamp” (Cuban, 1984, p. 676). Four decades on, little has changed by way of definition to clean up the ‘conceptual swamp’. However, modern definitions of assessment have emerged that can guide teachers to better assess the HOTS of students in any school year.

To produce evidence on how students think, teachers need to develop assessments that enable students to demonstrate what they know, can do and value. For the purposes of this paper the focus will be on cognitive abilities and the following definition of assessment will be used: assessment involves teachers making informed judgements based upon an image formed by the collection of information about student performance (Tognolini & Stanley, 2007). This image is used to monitor student growth (progress) through an area of learning or domain of knowledge. The higher levels of growth are differentiated by students having to demonstrate that they can do something with the knowledge they have gained, e.g. they can solve problems, think critically, or evaluate the effectiveness of different strategies for solving problems.

Thinking is an internal process. Teachers cannot see this internal process, so they must depend on cognitive models and tools that can be used to categorise levels of learning. These models use verbs to describe the complexity of the thought processes students should demonstrate. Bloom’s revised taxonomy (Anderson & Krathwohl, 2001) is one of these models.

This taxonomy is a powerful tool for teachers because it provides a way to differentiate between levels of cognitive depth. It categorises learning into three domains: psychomotor, cognitive and affective. The cognitive domain comprises six major categories, in which students’ skills and abilities are listed from the simplest thinking behaviours, also known as the Lower Order Thinking Skills (LOTS), to the most complex, known as the Higher Order Thinking Skills (HOTS). The taxonomy lists the skills in hierarchical order from the LOTS to the HOTS, as in Figure 1. These skills comprise the mental processes of remembering, understanding, applying, analysing, evaluating, and creating.

Figure 1 Bloom’s Revised Taxonomy

The logic behind this hierarchy is that before students can understand a concept they must remember it; to apply a concept they must first understand it; to evaluate a process they must have analysed it; and to create an accurate conclusion, students must have completed a thorough evaluation. Students’ thinking progresses from the LOTS to the HOTS. While the skills are presented in hierarchical form, the way students’ skills develop does not have to be linear; that is, the skills may overlap with each other (Krathwohl, 2002).

The thought processes are usually linked to the verbs associated with the thinking level that teachers are aiming to teach or assess. What matters, however, is not the word denoting the verb but the thought process or action behind it. If teachers want to assess critical thinking, they should not simply look for a question beginning with ‘criticise’; rather, the focus should be on how the student is going to solve the task that is set. That is, when the students have produced the evidence from answering the task, does that evidence indicate a higher level of cognitive functioning? It is not the verb but the response to what the task requests that indicates whether the students have demonstrated higher order thinking.

Learning, by its nature, is developmental. Teachers act as facilitators in assisting students to grow in knowledge, skill and understanding through the teaching of subject content. As students gain more content knowledge, and can use this knowledge to demonstrate growth, teachers are required to provide tasks that are cognitively more demanding, to tap into their students’ higher order thinking. The cognitive level needed to solve these tasks is generally referred to as the depth of knowledge associated with the task (Webb, 1997). Cognitively demanding tasks require students to think and to use the knowledge they have gained to solve both real-life and conceptually abstract problems.

HOTS can be fostered and assessed not only in mainstream primary and secondary students of diverse racial and socio-economic backgrounds, but also in students in special education classes. In fact, students in years 9-12 enrolled in special education classes who were given cognitively challenging tasks outperformed students without disabilities at the same year level who were given less challenging tasks (King, Schroeder, & Chawszczewski, 2001). Teachers need to deliberately provide their students with tasks that academically challenge and engage them. Teachers often think that their classroom assessments incorporate higher order thinking; however, most do not (Care, Kim, Vista & Anderson, 2018; Hoogland & Tout, 2018; McMillan, 2001; McMillan, Myran, & Workman, 2002). Teachers are teaching HOTS through many newer pedagogical methods such as inquiry learning or project-based learning, but are not assessing for these skills (Anderson, 2002).

Assessment is integral to teaching and learning (Baird, Andrich, Hopfenbeck, & Stobart, 2017). The success of HOTS development is determined by the alignment between the learning outcomes to be achieved, as stated in curriculum documents, and the assessments that are implemented (Fitzpatrick & Schulz, 2015). Because teachers need to know which HOTS they are teaching and assessing, they need to actively engage in developing appropriate assessments and to use formative and summative assessments together. Formative assessments provide timely, regular feedback that informs instruction as students learn increasingly complex tasks. Summative assessments are necessary to determine whether standards have been met or whether students can perform tasks that involve HOTS.

Devising HOTS tasks that lead to the production of valued outcomes and can be recognised intuitively is demanding. Teachers need to formulate HOTS tasks that require reasoned thinking on the part of the students, and this is far from simple.

Therefore, it is important when planning lessons to know where to incorporate HOTS in teaching sessions (Collins, 2014). Without prior planning, tasks set spontaneously may not lead to students demonstrating HOTS.

Not every difficult task measures HOTS; difficulty is not the same as cognitive depth. The difficulty of a task is usually determined by the proportion of students who get the task correct. If very few students answer it correctly, it is a hard task for that group of students; if everyone gets it right, it is an easy task.

This does not necessarily align with the cognitive depth of the task, nor the level of higher order thinking required to solve the task. Cognitive depth refers to the thought process, knowledge and skill required to solve the task. Hence planning beforehand, specifically for assessing HOTS, is key.
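The difficulty index described above is simply the proportion of correct responses. As a minimal illustrative sketch (the article does not prescribe any particular calculation, and the function name and data format here are hypothetical), it can be computed like this:

```python
# Classical item difficulty (facility): the proportion of students who
# answered the task correctly. A hypothetical sketch for illustration;
# note that this number says nothing about the cognitive depth of the task.

def item_difficulty(responses):
    """responses: list of booleans, True = correct answer."""
    if not responses:
        raise ValueError("no responses recorded")
    return sum(responses) / len(responses)

# Example: 6 of 8 students answered correctly, so the facility is 0.75 --
# a relatively easy task for this group, regardless of the thinking it required.
print(item_difficulty([True, True, False, True, True, True, False, True]))  # 0.75
```

A task can score as "easy" on this index yet still demand deep thinking, or score as "hard" simply because it is obscure, which is exactly why difficulty and cognitive depth must be judged separately.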

Professional Development and HOTS

For lasting changes to occur in education, it is imperative that teachers recognise necessary changes in learner expectations as well as the purpose of teaching: teaching students to think (Retna & Ng, 2016). In addition to the cognitive thinking models that teachers can utilise, they can also look at research that documents practices that encourage students to develop and practise higher quality thinking.

Professional development courses are a key factor in refreshing teachers’ understanding of higher order thinking skills and their methods of implementing them in classrooms. These courses should be structured to give teachers a better understanding of what higher order thinking skills are. They also help teachers to conceptualise how the three categories of transfer, critical thinking, and problem solving are coherently interrelated in their instructional strategies.


Appendix A – Tips for writing HOTS tasks

The following are some tips for teachers to think about when they are writing HOTS tasks for their students:

  1. Focus the load of the item on the problem to be solved rather than on the content.
  2. Items that require students to predict the outcome of a situation are better suited to HOTS than items that ask for simple labelling or listing.
  3. Give students examples and ask for the principle, or theory, they illustrate.
  4. Design items that permit multiple interpretations or solutions.
  5. Remember that the skill required to respond to the item, not the verb used, determines its relative difficulty.
  6. Make sure the item is written so that it is very clear to students what is required of them in their responses.
Appendix B – Examples of tasks that promote HOTS in students and assess their cognitive depth

The following are some examples of assessment tasks that help both to promote students’ higher order thinking skills and to assess their current cognitive depth:

Example 1:

Suggest a method, other than a vaccine, that scientists might develop to keep us safe from COVID. Then provide a short persuasive paragraph arguing why people should support this method.

This task can be given to students in any year. It is authentic and taps into students’ creative thinking skills. Suggesting a new method other than those currently available assumes students will formulate or create a new method. The persuasive text assumes that students will argue for, and provide an evaluative judgement of, why their method should be widely accepted by the public.

To answer this, students will have to combine information in ways that they have not yet been exposed to and bring content elements together to propose new solutions. The question can be answered collaboratively between students and in conjunction with the teacher. This collaboration can spark higher order thinking because the students will recognise that the teacher does not know the answer and will work to devise one together.

Example 2:

The following is taken from NAPLAN year 3 Numeracy

This question presents the students with an unfamiliar scenario where they must extrapolate a mathematical pattern and apply it by making connections to more than one set of information. The students have to rotate the rectangle and make the connection of how the shapes within it will also vary and change their location.

Example 3:

The following is taken from NAPLAN year 5 Reading

Students are required to make connections between the meanings presented and the text. They also need to infer the meaning of each answer choice, based on their comprehension of the text, in order to predict which answer best resembles the phrase in the question.

Anderson, P. (2002). Assessment and development of executive function (EF) during childhood. Child neuropsychology, 8(2), 71-82.

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom’s taxonomy of educational objectives: Complete edition. New York, NY: Longman.

Baird, J. A., Andrich, D., Hopfenbeck, T. N., & Stobart, G. (2017). Assessment and learning: Fields apart?. Assessment in education: Principles, policy & practice, 24(3), 317-350.

Bloom, B. S. (1956). Taxonomy of educational objectives. Vol. 1: Cognitive domain. New York: McKay.

Brookhart, S.M. (2010). How to assess higher-order thinking skills in your classroom. ASCD.

Care, E., Kim, H., Vista, A., & Anderson, K. (2018). Education system alignment for 21st century Skills: Focus on assessment. Center for Universal Education at The Brookings Institution.

Collins, R. (2014). Skills for the 21st century: teaching higher-order thinking. Curriculum & leadership journal, 12(14).

Cuban, L. (1984). Policy and research dilemmas in the teaching of reasoning: Unplanned designs. Review of Educational Research, 54(4), 655-681.

FitzPatrick, B., & Schulz, H. (2015). Do curriculum outcomes and assessment activities in science encourage higher order thinking?. Canadian Journal of Science, Mathematics and Technology Education, 15(2), 136-154.

Hoogland, K., & Tout, D. (2018). Computer-based assessment of mathematics into the twenty-first century: pressures and tensions. ZDM, 50(4), 675-686.

King, M. B., Schroeder, J., & Chawszczewski, D. (2001). Authentic assessment and student performance in inclusive schools. Research Institute on Secondary Education Reform (RISER) for Youth with Disabilities Brief. https://eric.ed.gov/?id=ED467479

Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory into Practice, 41(4), 212-218.

Lewis, A., & Smith, D. (1993). Defining higher order thinking. Theory into Practice, 32(3), 131-137.

McMillan, J. H. (2001). Secondary teachers’ classroom assessment and grading practices. Educational Measurement: Issues and Practice, 20(1), 20-32.

McMillan, J. H., Myran, S., & Workman, D. (2002). Elementary teachers’ classroom assessment and grading practices. The Journal of Educational Research, 95(4), 203-213.

Retna, K. S., & Ng, P. T. (2016). The application of learning organization to enhance learning in Singapore schools. Management in Education, 30(1), 10-18.

Tognolini, J., & Stanley, G. (2007). Standards-based assessment: a tool and means to the development of human capital and capacity building in education. Australian Journal of Education, 51(2), 129-145.

Webb, N. L. (1997). Determining alignment of expectations and assessments in mathematics and science education. NISE Brief, 1(2).

Prof Jim Tognolini

Professor Jim Tognolini is Director of The Centre for Educational Measurement and Assessment (CEMA) which is situated within the University of Sydney School of Education and Social Work. The work of the Centre is focused on the broad areas of teaching, research, consulting and professional learning for teachers.

The Centre is currently providing consultancy support to a number of schools. These projects include developing a methodology for measuring creativity; measuring 21st Century Skills; developing school-wide practice in formative assessment. We have a number of experts in the field: most notably, Professor Jim Tognolini, who in addition to conducting research offers practical and school-focused support.

Lisa Edwards

Lisa Edwards is a school leader passionate about the potential of public education to change lives.

With over 20 years of experience with the NSW Department of Education, working in schools in Sydney’s south and southwest, Lisa is an English teacher by training, and at heart. Her leadership journey has included the roles of Head Teacher Wellbeing, Head Teacher English, and Deputy Principal. She has developed resources and presented on assessment, pedagogy and programming, working in collaboration with the Department’s secondary curriculum team, and has published and presented for the NSW English Teachers’ Association. Lisa received an Australian Council for Educational Leaders award and a Premier’s Award for Public Service for her work leading improvement in literacy and HSC achievement, and currently presents for the NSW Teachers’ Federation’s Centre for Professional Learning on leading lifting achievement and quality assessment practice.

Lisa is a strong advocate for public education and educational equity, dedicated to supporting teachers in public schools to maximise learning outcomes and, therefore, opportunities for our students. 

Rayanne Shakra

Rayanne Shakra is a NESA-sponsored scholarship doctoral student at the Centre for Educational Measurement and Assessment (CEMA) and a sessional academic at The University of Sydney.