Is our feedback comprehensible to students?
I am borrowing the linguistic term "comprehensible input" to examine the feedback process. According to Dr. Stephen Krashen, comprehensible input refers to messages that learners can understand when acquiring a second language, and it requires meaningful interaction. Communication breaks down when the language input provided is not comprehensible to learners. In a way, I think the process by which students understand feedback for further improvement is quite similar to how we acquire an additional language. To make important decisions that advance their learning, students need to understand, and develop proficiency in, the feedback language that we teachers use based on the criteria descriptors.
Teachers often spend tedious hours providing what they perceive as "quality feedback", and yet many students seem to take little interest in it and benefit little from it. The marking is time-expensive and the return on investment is simply too low. If feedback is seen as a communicative and interactive process, then it is very clear that there is a breakdown in communication. Teachers refer to the assessment criteria and provide accurate feedback, but their feedback is just not "effective" because students do not understand their feedback language. Here is an example given by Dylan Wiliam in his book, Embedded Formative Assessment (page 120):
'I remember talking to a middle school student who was looking at the feedback his teacher had given him on a science assignment. The teacher had written, "You need to be more systematic in planning your scientific inquiries." I asked the student what that meant to him, and he said, "I don't know. If I knew how to be more systematic, I would have been more systematic the first time." This kind of feedback is accurate - it is describing what needs to happen - but it is not helpful because the learner does not know how to use the feedback to improve. It is rather like telling an unsuccessful comedian to be funnier - accurate, but not particularly helpful advice.'
Making assessment criteria comprehensible
"If you want to become proficient in playing the guitar, probably the worst thing you can do as a beginner is to try and play the guitar like an expert. Instead, it is far more expedient to break the complex domain of an expert guitar player into its constituent parts of scales and chord formations and practise those before moving onto the more complex demands of playing a complete song." (Hendrick and Macpherson 105).
- [Image: task-specific rubric example, MYP Year 5 Individuals and Societies criterion B ii, shared via the Google Plus MYP Coordinators community]
- [Image: task-specific rubric example, MYP Year 1 Language and Literature]
A well-crafted task-specific rubric has the potential to give students a clear understanding of the expectations in their learning process. It also allows teachers to provide students with more focused feedback about their strengths and areas for improvement relating to subject-specific learning objectives. Although a task-specific rubric might identify the topics, tasks or content knowledge that students are learning and describe varying levels of mastery for each criterion, the language used might still be ambiguous and not easy for students to understand. If we simply add topics, tasks, content knowledge or concepts to the criteria descriptors without explaining the type of thinking needed, the majority of students might still find the feedback confusing and difficult to grasp.
For example, in an advertisement-analysis task for an MYP year 5 phase 5 language acquisition class, we might create a task-specific rubric to clarify the expectations for criterion B i (analyse and draw conclusions from information, main ideas and supporting details) at level 5-6 as below:
Criterion descriptor: The student analyses considerably and draws conclusions from information, main ideas and supporting details.
Student-friendly version: You analyse considerably and draw conclusions from information, main ideas and supporting details from the print ad.
- Refer back to the global context exploration and the concepts, and identify the content we want students to learn, including knowledge, understanding and skills. Consider what type of assessment task will allow students to demonstrate their understanding of the unit's statement of inquiry.
- Provide worked examples at various achievement levels. I highly encourage teachers to build their own collections of previously assessed student work that exemplify the descriptors at various assessment band levels. Students can evaluate and critique the worked samples against the assessment criteria. Having students conduct a standardization of assessment can also further engage them in interpreting the assessment criteria.
- It is useful for teachers to refer to the definitions of the command terms provided in the subject guide. According to the MYP language acquisition guide (published in 2014, updated in 2017, page 115), the command term "analyse" means to break down in order to bring out the essential elements or structure (to identify parts and relationships, and to interpret information to reach conclusions). Discuss with students what "analyse" means in this given task context. What are the essential elements or structure of a print advertisement? Discuss with students what actions might take place before analysing the print ad, considering depth of knowledge (DOK). For example, we might start with:
- (Level one: recall) outlining the elements of a print advertisement that are used to attract the target audience;
- (Level two: skill/concept) describing how various elements of a print ad are used to express an intended message;
- (Level three: strategic thinking) explaining how the graphic features of a print ad are used to present information, citing specific evidence;
- (Level four: extended thinking) analysing by responding to different elements, citing specific evidence from the print ad to support conclusions, and also making some personal connections. The S.O.A.P.S.Tone reading strategy is a useful model to help students find evidence from the print ad to support their analysis and draw conclusions.
- I think this is an important step - identifying the knowledge domain to be assessed - to help students understand how thinking progresses. It can also help with planning differentiated instruction. DOK is just one useful tool to help teachers plan the cognitive complexity of an assessment task. SOLO taxonomy is another useful tool that categorizes higher-order thinking. Before students master the essential skills, there should be smaller steps for them to take and practise.
- Students can work in groups to identify the levels of mastery (or quality): what do we mean by "has difficulty analysing", "analyses adequately", "analyses considerably", and "analyses thoroughly"?
- Share the task-specific rubric with students and apply it throughout the learning process.
Criterion descriptor: The student analyses considerably and draws conclusions from information, main ideas and supporting details.
Student-friendly version: I analyse considerably by identifying all the elements presented in the print ad, including the speaker, occasion, audience, purpose, subject and tone. I draw conclusions by citing one or more pieces of specific evidence from the print ad in response to each element (S.O.A.P.S.Tone).
Steps for creating task-specific rubrics
- Carless, David. "Feedback Loops and the Longer-Term: Towards Feedback Spirals." Assessment & Evaluation in Higher Education, 2018, doi:10.1080/02602938.2018.1531108.
- Francis, Erik M. “What EXACTLY Is Depth of Knowledge? (Hint: It's NOT a Wheel!).” ASCD Inservice, ASCD, 9 May 2017, inservice.ascd.org/what-exactly-is-depth-of-knowledge-hint-its-not-a-wheel/.
- Hendrick, Carl, et al. What Does This Look like in the Classroom?: Bridging the Gap between Research and Practice. John Catt Educational Ltd, 2017.
- International Baccalaureate. MYP: From Principles into Practice.
- Stronge, James H., et al. Designing Effective Assessments. Solution Tree Press, 2017.
- Wiliam, Dylan. Embedded Formative Assessment. Solution Tree, 2011.