Academic journal article Literacy Learning: The Middle Years

Crowdsourcing Feedback

Article excerpt


Crowdsourcing involves canvassing a crowd of people for work or funding, typically via the internet. The word combines 'crowd' and 'outsourcing'. Using the principle that 'more heads are better than one', crowdsourcing can engage a large crowd of people for creative ideas, skills, knowledge and/or participation. Well-known examples of crowdsourcing include: Wikipedia; the Arab Spring (Egypt's uprising in 2011); the toy company Lego, which invites users to design new products that other users then vote on (see Figure 1); Airbnb; and Greenpeace canvassing for the 'Let's Go' ads in an environmental protest.

Crowdsourcing feedback in education involves providing feedback to learners from multiple perspectives, including peers and self. Many researchers have demonstrated the effects of regular and multiple forms of feedback from teachers and peers in improving learning (Black & Wiliam, 1998; Hattie, 2009; Hattie & Timperley, 2007; Wiliam, 2014). What if teachers could ensure feedback to their students by crowdsourcing it? What would it mean for student learning? What would it mean for teacher workloads? This paper explores these questions using the example of Scholar, a web learning environment developed by Bill Cope and Mary Kalantzis (2013) from the University of Illinois and the Common Ground Research Networks, Champaign, Illinois. Some comments from students and teachers about their experiences of crowdsourced feedback are included.

Classroom interactions

One form of feedback in many classrooms is the traditional Initiate-Respond-Evaluate (I-R-E) approach (Cazden, 2001), in which the teacher asks a question, a student responds, and the teacher evaluates the response, indicating whether it is correct. Similarly, when the teacher provides formative and/or summative feedback on an assignment, test or project, the responsibility rests with the teacher alone. Crowdsourcing feedback, on the other hand, challenges these existing forms of feedback by creating a collaborative environment where learners learn from each other as well as from the teacher. Figure 2 demonstrates how feedback works in a traditional I-R-E classroom and in a collaborative environment.

New technologies are creating more opportunities for interactions that are learner to learner as well as learner to teacher, and where there are many interactions simultaneously. These include collaborative environments such as online discussion forums and blogs where learners give feedback by affirming, elaborating, questioning and challenging other students' responses, traditionally the work of the teacher.

An example of crowdsourcing feedback

In the following example of crowdsourcing in an online discussion, 10 of 66 responses are included. They were written by Year 7 students as they imagined the worst place in the world in which to live, a frontloading activity for a study of the novel Trash by Andy Mulligan (2011). The online forum resembles a social media environment: students read the comments and then add their own at the bottom. In this way, other students' comments prompt further thinking and, for less confident students, provide models of how to respond. Another advantage is the amount of text students read in order to comment on other students' responses.

   The worst place in the world is a place that war and homelessness
   is extremely high, this place would also have limited food that
   would only be provided to the higher rich people. This place is so
   awful because of all the war and lack of food, this place also has
   a high rate of homelessness. For one, not having a home, or even
   shelter would be awful. There would be no food or even internet
   connection, and you would be in fear of your life. For two, living
   in a place surrounded by war would be horrible. You would literally
   die so easily from so many reasons. … 