Wednesday, 15 October 2014

Proud Teacher

An example of where my intervention is taking my students:

The feedback below shows an example of how dialogic interaction between learners helps encourage justification of responses. I am so proud of my students as this level of thinking is completely independent of any teacher input.

Tuesday, 14 October 2014

Accelerating Digital Learning

As part of my own journey to continue learning at the University of Auckland, I was challenged to create a digital intervention that would help to accelerate student progress in reading. This was carried out over one term with a small focus group, with the journey of one student represented in the case study below. The intervention was successful and as a result has been adapted this term to help our Year 7 and 8 students make deeper connections to the texts they read.


The purpose of this task is to carry out a case study that will allow me to critically reflect on my teaching of an aspect of learning, in one curriculum area, where I have used a digital teaching method to accelerate learning. The hypothesis is that connections made in an online forum of peer led feedback will lead to increased alignment between the decoding automaticity and reading comprehension levels of the target student. The intervention design draws from Wilkinson and Son’s (2009) ‘Fourth Wave’ of dialogic approaches to the learning and teaching of comprehension, and encompasses Vygotsky’s sociocultural theory of exploration and collaboration. The aim is to facilitate student participation in culturally orchestrated learning tasks that allow meaning and understanding to be co-constructed collaboratively, resulting in the emergence and transfer of understanding and creating “adaptive experts who know when to integrate knowledge from various domains to solve new problems that cannot be solved by what they did previously” (Timperley, 2010, p. 6).

Classroom Context:

This case study was carried out over one ten-week term in a Year 7/8 composite class (Table A) in a decile 1c full primary school in East Auckland. Enrolment records across the school reflect high numbers of pupils for whom English is not their first language. Of the twenty-five students in this 1:1 digital learning environment, 52% speak a language other than English at home. 24% of these students are classified as Refugee or from a Non-English-Speaking Background (NESB), with 4% eligible for, and withdrawn for, funded teaching support. 96% of the students access their learning on an individual device, with one student reliant upon access to a shared classroom device (Table B). Initial 2014 school-wide data reflected that reading comprehension was an area of high need in my class: 60% of the students were identified as reading six months or more below their chronological age, 24% reading at their chronological age and 16% reading one year or more above their chronological age when tested on Probe (Table B).

(Table A: class composition, showing total students in class, Year 7, Year 8 and ESOL-funded numbers; table data not preserved)

(Table B: 1:1 device access and reading levels relative to chronological age; table data not preserved)

Description of Student Learning Needs:

To respect confidentiality my target student has been given a pseudonym. ‘Mere’ is a 12-year-old, Year 8, Māori female from a lower socio-economic background, part of the demographic that Lai, McNaughton, Amituanai-Toa and Hsiao (2009) describe as scoring “lower in reading comprehension measures than children from other ethnic groups” (p. 30). She is part of a blended family and has six older siblings who all live at home. When compared against her chronological age of 11 years and 10 months at the start of the school year, Mere’s reading age of 11 years was ten months below her actual age.

The Ministry of Education’s (2010) guidelines in the Literacy Learning Progressions (LLP) show that by the end of Year 8, Mere should be drawing on knowledge and skills that allow her to decode texts with such automaticity that she does not need to decode all words, whilst showing increasing control of a repertoire of comprehension strategies that are drawn upon when she knows she is not comprehending fully. When analysing assessment, Timperley (2010) states, “none of this evidence provides absolute answers but rather potential insights in how to improve” (p. 2). That said, analysis of initial data (Appendix 1i) against a normed reference, in this case the LLP, allows for some general insights to be made.

At the beginning of Year 8, the anecdotal observations that the 1:1 conferencing situation affords when administering a Running Record highlighted that Mere is decoding more challenging texts with automaticity, but a lack of control over comprehension strategies means her comprehension is not mirrored at the same level, a weakness supported by STAR and e-asTTle reading achievement data (Appendix 1ii). Mere is able to recall literal facts but does not understand the concepts of evaluating or inferring. Informal teacher observations have shown this to be evident across the curriculum. She is able to ‘find and retrieve’ with confidence when working on collaborative research tasks, but when faced with questions that require use of these more complex comprehension strategies, she prefers to stay safe and not take risks that require more from her cognitively.

To establish Mere’s current reading level at the start of this case study I administered a Probe Running Record using the same text as her previous running record (Appendix 2). This 1:1 testing situation allowed me to observe what progress had been made to date and what close reading strategies she was using to make sense of the new and unknown, in order to make connections to the text in front of her. Observations of comprehension strategies during this time showed that Mere did not seem aware she needed to find her own connection to the text. She needed prompting to justify her thinking and prompting to refer back to the text to locate information. It was noted that each time she was guided back to the text, she read it in its entirety from top to bottom. Skimming and scanning skills were not evident as she carried out the find-and-retrieve process. Of concern, it is important to note here that Mere’s comprehension level of the same seen text dropped from 60% to 50% over a four-month period.

Elley (2005) acknowledges “no reading test is perfect…. all have their uses and limitations…” (p. 4). To keep this case study manageable in the time frame available only one form of assessment was used to measure progress. The Running Record is diagnostic in nature and at the ‘reading to learn’ stage, allows teachers to assess students’ thinking and comprehension quickly, which meets the purposes of this case study.

To ensure validity each running record utilised a text from the Probe 1 Reading Assessment kit, and was administered by the same person, at the same time of day and in the same 1:1 environment. The results of each assessment were discussed with the student so that she was aware of her instructional level, the areas she performed well in and the gaps in her understanding. Brown (1995) suggests that we can “foster achievement by promoting strengths and eliminating weaknesses” (p. 12). Together we set her next learning steps and talked about how she might achieve these (Appendix 2), making her results transparent.

Description of Digital Teaching Method and Adaptations:

The case study centred around using a digital teaching method combined with focused discussion to support acceleration in reading comprehension. This area was selected as the beginning-of-year assessment results in my class reflected discrepancies between what my learners could decode and what they could comprehend (Appendix 1i). Although I am specifically targeting one student for this case study, the class needs analysis (not included) identified that a significant number of students needed to recognise what a question was asking them to do, then use that knowledge to help them locate supporting evidence in the text to reflect a deeper understanding of the content.

‘Symbaloo’ (Appendix 3, additionally hyperlinked to the intervention), the digital bookmarking tool, was chosen as the vehicle from which the students could drive their own learning. This provided the visual hook needed to increase reading mileage, and gave the students easy access to the set texts and follow-up tasks that were designed to accelerate comprehension levels. Text selection was based on a combination of:
  • direction taken from research by Bransford and Schwartz (2009), which highlighted the need to be aware of individual learning styles
  • the area of focus in literacy, Migration, the current topic of study in our class
  • online accessibility
Myths and legends that represented the cultures in the class were chosen to ensure connections on a surface level could be made. When considering the difficulty levels of these multimodal texts, I knew from data gathered that decoding would not be a concern, but what I wasn’t sure of was comprehension levels. To afford everyone success and further facilitate learning, the task design merged ideas from Anderson and Krathwohl’s (2001) Revised Bloom’s Taxonomy and Maddux, Johnson and Willis’ (2001) ‘Type II’ web 2.0 tools, identified by Liu and Maddux (2010), that are found in Bloom’s Digital Taxonomy. These applications were already a part of each student’s digital schema and embodied the cluster-wide ‘learn, create, share’ model. Capitalising on this would provide “opportunities to learn from texts and across texts and to explain, reflect and show that learning through creation” (Jesson & McNaughton, 2014).

The task, referred to initially as the ‘ADL task’ for organisational purposes but later renamed ‘Awesome Dialogic Learners’ by the experimental group, required the students to read an online text (Appendix 3) then show the connections they had made to that text by responding to questions that would scaffold their thinking in an online peer led forum. No specific order was given for the stories to be read in, only an expectation that six responses per text would be completed each week. The students were given the option of working independently or collaboratively to complete the tasks, and had a choice of thirty task tiles to choose from (Appendix 3). Each question was framed around Anderson and Krathwohl’s (2001) Revised Bloom’s Taxonomy so that scaffolding was in place as they progressed from the lower order thinking to the more demanding higher order thinking.

To foster the argument element, the students were asked to leave feedback pertaining to the most recent response on the answer grid before responding to the questions themselves. This was not new to them, as the concept of being a critical friend was an established part of the literacy cycle. As the term progressed the students were made aware of the ‘tagging’ option in the comments tool of the shared Google Doc. The idea was that each time someone received feedback they were sent an email notification that they could respond to, justifying their thinking if necessary and making the feedback bi-directional and dialogic. The rationale behind this is that by taking the time to ‘notice’ others’ responses, the ‘teachers’ continued learning when they gave their feedback, and the ‘learners’ continued learning when they received their feedback (Bransford & Schwartz, 2009).

Having gained the “knowledge fuel” identified by the Woolf Fisher Research Centre (2014) from navigating a path through the response-type tasks, the students were then challenged, as Jesson and McNaughton (2014) suggest, to “use their readings as sources for creating” a digital learning object (DLO) to show the connections they had made to the text by using old ideas to harness new ideas. These creations were then posted to their blogs, opening up a new feedback forum, and the link was posted on the response page so that the peer led feedback process could continue.

For this intervention to be successful it required a high level of buy-in from the students and needed to be ‘sold’ well. I met with the students involved at lunchtime and, knowing that if they felt empowered they would rise to the challenge, told them the university had set a few classes in our cluster a special task this term, and that we were one of those classes. Placing a high value on the challenge ahead increased feelings of self-efficacy. I explained that our challenge was to move our reading comprehension to the next level by working collaboratively, having discussions and giving each other feedback. The next step was to project the ‘Symbaloo’ onto the board. The effect of the visual hook was immediate, with a lot of excited chat taking place.

The challenge with this intervention being in a digital format was that a high level of trust was needed if it was to be successful. Firstly, trust was needed because the digital platform of Google Docs had to remain ‘open’; by that I mean any changes made would affect the entire document, with a very real possibility that all work could be deleted at the push of a button. Secondly, trust was needed to ensure the students felt they were working in a safe environment. Reinforcing this thinking is Yang and Carless’ (2012) view that “dialogic interaction within a trusting atmosphere can help to promote learner agency and self-regulation” (p. 290). So clear and specific ground rules were set together. This in itself put the onus on the students to amend or report any problems.

Hattie (2009) states “When students become their own teachers they exhibit the self-regulatory attributes that seem most desirable for learners… self monitoring, self evaluation, self assessment and self teaching” (p. 22). The type of feedback that would promote the deep thinking accompanying these attributes needed to be modelled. I demonstrated my own low-level responses to guide the students through the noticing process (Appendix 4). Doing this in an authentic context allowed for transfer of content knowledge, and in its most simplified form, knowing how to ‘tell each other how to fix answers’ was exactly the point of this exercise. Siegler (2000) suggests that encouraging a learner to explain an incorrect response produces greater learning than just asking for an explanation of why an answer is correct, as it helps to create a sense of confusion and increase their repertoire of known strategies. The scaffolding of written teacher feedback contributions was withdrawn after the first two weeks to allow the intervention to become fully student led.

To avoid feelings of isolation or being singled out an experimental group was established. This group was made up of seven students, including Mere, with varying abilities (none of this data is reflected in this case study), four girls and two boys. The commonality being that all students had a similar digital schema in place, were able to take ownership of the tasks, and could be relied upon to stay on task when working both collaboratively and independently, without relying heavily on teacher input or guidance. These students knew what their next learning steps were, what they needed to do in order to make these learning gains, and had the confidence to articulate their thinking when giving feedback.

It is of interest to note here that the rest of the class were given a similar task, with similar expectations, the differences being that the tasks were much more structured and there was no expectation or provision for critical discourse as feedback was accessed only via the teacher.


As an educator I knew what I wanted to achieve; the challenge came when deciding how I was going to achieve it. Collegial collaboration and sharing of best practice was crucial at the design stage. Even though careful consideration was given to what I already knew about my students’ learning needs and preferred learning styles, I was able to access ideas that had already proven successful in another classroom. With a wealth of learning diversity to cater for, I knew I needed to create a climate where students felt safe, were prepared to take risks and were happy to learn from and with each other. Much of the first term was spent building a collaborative working environment, which I capitalised on when introducing the term ‘dialogic learner’. As a class we collaboratively defined what this might look like (Appendix 5) and spent the second term trying to emulate this picture, so I was able to draw on these well-established learning practices when introducing my intervention.

Parker and Chao (2007) suggest, “collaborative learning becomes even more powerful when it takes place in the context of a community of practice. A community of practice consists of people engaged in collective learning in a shared domain. Thus, learning becomes a collaborative process of a group” (p. 58). By doing this within a dialogic arena, the instant success that would come with the lower order thinking tasks that focused on the students’ ability to remember, understand and apply literal facts would act as a scaffold for the more challenging higher order thinking tasks that required collaborative analysis, evaluation and creation. Citing further findings by Parker and Chao (2007), Blau and Caspi (2009) state “the use of collaborative technology in an educational context enhances active participation through content creation, increases students' engagement with course content, and enriches the learning process” (p. 48).

Comprehension, the meaning constructed from a text, is not stable and will change if any facet of the interaction between the reader and the text is altered (Wilkinson & Son, 2009; Pearson, 2001; Spiro, 2004). Taking a forensic look at each other’s responses was the method chosen for encouraging extended discussion. The aim was to use the power of talk to evoke the thought processes that advance students’ learning and understanding, and alter perspectives. Reznitskaya et al.’s (2008) thinking, that arguing is a scaffold for the reconstruction of meaning, allows for the facilitation of an environment of critical discourse amongst a group of peers, and “provides a way of reflecting on thought processes to challenge and clarify thinking” (Gibbons, 1991, p. 29). The adaptation came in the form of the ‘Comments’ tool in Google Docs. By introducing this tool, an online forum was created where the peer led feedback would do exactly that, facilitating conflict and encouraging my learners to support and justify their thinking. I felt that if the students were able to identify each other’s successes and errors they would then be able to reflect on their own responses and apply that thinking independently. Yang and Carless (2012) suggest “students need to be stimulated through the feedback process to develop a sense of agency and responsibility” (p. 292).

The learning journey begins when connections are made between what we know and the new information we are processing. As teachers we need to “make connections by building on the familiar” (McNaughton, 2002, p. 35). In this case prior knowledge and classroom practice associated the task element with what was ‘familiar’, turning the discussion aspect into the ‘new’ learning. The new or unfamiliar are often seen as obstacles, so in order for the students to overcome these new obstacles the initial ‘talk’ was very much teacher led (Appendix 4). When students are guided to engage in knowledge-building discussions, they learn to develop and justify an argument and eventually learn to disagree agreeably (Engle & Conant, 2002; Hattie, 2009). The discussion then begins to reflect Mercer’s (2000) learner led ‘interthinking’: the thinking that occurs between peers when talk is not teacher led (Davies & Sinclair, 2012).

Conditions for teaching and learning are continuously changing due to the nature of the classroom environment. Design aspects of successful interventions need to give consideration to these ever-changing variables if longevity is to prevail (Lai et al., 2012). The requirement to access the intervention in their own time, as homework in addition to the time allocated in Reading lessons, ensured access for a minimum of three hours each week, regardless of any unforeseen changes to the school day. Black and Wiliam (2009) suggest “the choice of tasks for class and homework is important. Tasks have to be justified in terms of the learning aims that they serve, and they can only work well if opportunities for pupils to communicate their evolving understanding are built into the planning” (p. 7). Each student was expected to keep a task completion log (Appendix 6) and a reading mileage log (Appendix 7), which served dual purposes by allowing progress to be monitored and by giving the students an increased level of responsibility.

Critical Reflection including Evaluation of Student Learning:

The New Zealand National Standard for Reading states:

“by the end of year 8, students will read, respond to, and think critically about texts in order to meet the reading demands of the New Zealand Curriculum at level 4.  Students will locate, evaluate, and synthesise information and ideas within and across a range of texts appropriate to this level as they generate and answer questions to meet specific learning purposes across the curriculum…. with increased accuracy and speed in reading a variety of texts from across the curriculum, their level of control and independence in selecting strategies for using texts to support their learning, and the range of texts they engage with” (Ministry of Education, 2009, p. 54).  

When comparing the pre and post intervention data there have been clear shifts in achievement, with an increase in reading age with comprehension of six months, from 11.0 - 12.0 years to 11.5 - 12.5 years. Automaticity with decoding remained at a similar level while overall comprehension increased by 20%, meaning Mere has an increased chance of achieving AT the National Standard expected of a student at the end of Year 8. Possible reasons for this come from the assumption that connections made in the online forum of peer led feedback did lead to increased alignment between the decoding automaticity and reading comprehension levels of this target student.

Effective Literacy Practice in Years 1-4 (Ministry of Education, 2003) suggests that “helping students to make connections between what they know and what they are reading improves their comprehension” (p. 31). Observations of performance during the final running record (Appendix 2) showed Mere to be more confident of her own abilities. She initiated questions about the questions and, when justifying her thinking, referred back to the text without reading it from top to bottom each time. Skimming and scanning skills had improved, with Mere searching for key words taken from the questions to help her find and retrieve facts at greater speed. Mere knew she needed to find her own connection to the text and tried very hard to do so. Asking at the end, “Have I improved Miss? Have I shifted my learning to the next level?” demonstrated a higher sense of ownership and engagement (Appendix 8).

Without effect size evidence to compare against, there is a definite need to explore why this learning shift may have occurred in such a short time frame. A snapshot of feedback gained from a Google Form survey has been transcribed (Appendix 9) and gives some insight into the shifts in Mere’s thinking.

The key objectives of this intervention were:
  • to provide opportunities for the students to act as critical friends and notice how their peers had interpreted and responded to a series of questions
  • to use the power of ‘talk’ via a ‘participants only’ online forum to, as Darling-Hammond and Bransford (2007) state, “respond to each other’s answers to create a wider community of learning that provides access to multiple perspectives on the topic” (p. 80).
This form of analysing and formatively commenting on each other’s responses through peer led critical discourse helps students close the gap between current understandings by identifying flaws in learning strategies and skills (Yang & Carless, 2013, p. 285).

“It is through talk that much learning occurs. Talk allows children to think aloud, to formulate ideas, to set up and evaluate hypotheses and to reach tentative decisions” (Gibbons, 1991, p. 27). From ongoing observations throughout the year, I knew that the students would work well together and would be able to complete the tasks, but from my own professional development I knew that if acceleration was to take place, I needed to facilitate ongoing opportunities for student led extended and reasoned discussion. Opportunities to notice their peers’ errors and offer advice in relation to their own understanding of the task created an evolving awareness of their own errors by noticing their peers’ successes. “Pupils can only assess themselves when they have a sufficiently clear picture of the targets that their learning is meant to attain…. When pupils do acquire such overview, they then become more committed and more effective as learners: their own assessments become an object of discussion with one another, and this promotes even further that reflection on one’s own ideas that is essential to good learning” (Black & Wiliam, 2009, p. 7).


Transparency of data between teacher and student meant that Mere’s next learning steps were clear. Moving forward, provision for transparency in whole-group data would foster a commonality of purpose and allow for a more collaborative approach towards reaching those identified shared next learning steps.

On the surface this intervention could be deemed successful; however, the validity of the shifts can be questioned. A lack of data for analysis pertaining to the control group meant there was no overall benchmark to compare results against, and the data could not be quantified. For this intervention to be proven a success it must involve a wider group of participants and run over a longer period of time. In addition, data analysis needs to incorporate the results of the biannual summative assessments that reflect cluster-wide shifts and are not deemed subjective.

Provision for planned opportunities to participate in meaningful dialogic argumentation, a “powerful mechanism for increasing understanding of challenging concepts” (Clark, Sampson, Weinberger & Erkens, 2007, p. 344), among peers across the curriculum would allow a culture of cognitive conflict to evolve. Initiating a practice of student led discourse would allow for scaffolding of feedback content, ironing out discrepancies between the surface-level feedback initially present in the online forum and allowing connections to be made to the more cognitively demanding deeper-level feedback emerging at the end of the intervention.

The collaborative co-construction of meaning and understanding which underpinned the design of the intervention drew from Wilkinson and Son’s (2009) ‘Fourth Wave’ of dialogic approaches to the learning and teaching of comprehension, and encompassed Vygotsky’s sociocultural theory of exploration and collaboration. The buy-in gained from the value placed upon the tasks, and from the visual hook of a digital gateway, increased reading mileage and resulted in the emergence and transfer of understanding evident in the increased alignment between decoding automaticity and reading comprehension levels.

Appendix 1i: Initial Data collected March 2014

(Table: Test Date, Chronological Age, Probe Reading Age and e-asTTle Reading; cell data not fully preserved. Surviving values indicate a chronological age of 11 years 10 months, a Probe reading age of 10.5 - 11.5 years, and e-asTTle at norm level.)

Normative Achievement Levels:

(Table: Probe Reading Norms and e-asTTle Reading Norms at the end of Year 7 and Year 8, against Running Record and NZ Curriculum levels; cell data not fully preserved. Surviving values include ‘4 - 6’, ‘5 - 7’, ‘As per chronological age’ and ‘Level 4’.)

Appendix 1ii:

TARGET STUDENT READING ANALYSIS: Y8 Date: March ‘14                                                                                   

Student Reading Needs based on asTTle assessment and current seen running record analysis
Group Needs
(Independent in tumble)

Hot Spot
Consistently read for meaning (cloze with sophisticated vocabulary included)
Reading Strategies to be developed through GR focus lessons
  • Observe punctuation
  • Strategies for working out unknown words
  • Read with expression
  • Predicting
  • Fluency
  • Ensure a mix of M, S, V strategies are being used
  • Dialogic approach

Guided Reading Focus
  • Respond using understanding and information
  • Identification and understanding of main ideas
  • Understand, organise and sequence material
  • Summarise and Evaluate
  • Make conclusions and form assumptions
  • Understand characters
  • Make predictions based on text
  • Ask questions to clarify u/st
  • Make connections
  • Reorganise information

NOT transferring learning in GR sessions to other curriculum areas – finding information, selecting and rejecting, using own words, asking questions about the text, understanding the language of questions/instructions

  • Skim/scan for information
  • Find, select and retrieve information
  • Asking questions
  • Make inferences
  • Identifying and understanding main ideas

                               Vocabulary Knowledge:
  • Spelling activities
  • New words explored in GR sessions and follow up activity

The language of instruction to feature in questioning. Need to teach what each question word is asking the reader to do
  • Modelling
  • Explaining
  • Shared/guided reading
  • Directing
  • Telling
  • Prompting
  • Questioning
  • Feedback
STAR: Assesses close reading and critical thinking ability
  • Sentence Comprehension: Reading for meaning and understanding
  • Paragraph Comprehension (CLOZE): Replace words within text whilst retaining meaning
  • Vocabulary: Word meanings in context
  • Advertising Language: Recognising emotive/persuasive words
  • Genre Awareness: Understanding mode
To ask questions about text to help increase comprehension skills and vocabulary knowledge.

Locating literal information
  Decoding – chunking, reading ahead, self-correcting when meaning is lost, attending and searching – looking for clues in and around the word and text to help read unfamiliar words, cross-checking and self-correcting
Attempting to refer to the text to justify thinking

Appendix 2: Posttest Data

March 2014 | Actual age: 11 years 10 months | Probe: 10.5 - 11.5 | Next steps (comprehension focus): Inference, Vocabulary, Evaluation, Interpreting questions
July 2014 | Actual age: 12 years 2 months | Probe: 10.5 - 11.5 | Next steps: Inference, Literal, Vocabulary, Evaluation, Interpreting questions
September 2014 | Actual age: 12 years 4 months | Probe: next level up comprehension - 50% | Next steps: Evaluation, Inference

Running Record March 2014 (unseen)
Probe: 10.5 - 11.5 years
Running Record July 2014 (seen)
Probe: 10.5 - 11.5 years

Running Record September 2014 (unseen)
Probe: 11-12 years
Running Record September 2014
Probe: 11.5 - 12.5 years (unseen)

Appendix 3: Intervention

Click here to access the Symbaloo

Appendix 4: Guided Dialogue (scaffolding)

Teacher: Here is my response to the task. What feedback would you give me to help me notice the information I have forgotten to include?

Question: Who were the main characters in the story?

Answer: Maui and the sun

Mere: I would tell you that you need to use a capital letter for Maui’s name.
Ana: I would tell you the same but I would also remind you to explain what Maui’s role was in the story because the question asks who the main characters were but you need to move your answer to the next level and you can do that by saying what their role was. It gives more information.
Mere: Yes but she didn’t give a very detailed answer and she might be upset if we tell her it wasn’t right.
Teacher: Do you think I’d be upset?
Fia: No because you’re asking for feedback so you want to know what you did right as well as what you did wrong.
Mere: So is that what we need to do Miss? Do we need to tell each other how to fix our answers?
Teacher: Exactly! Remember if you can’t find anything that could be changed make sure you leave a comment that says why you thought the response was well written.

Appendix 5:
Class Brainstorm to answer the question: ‘What does a Dialogic Learner look like?’

We think a dialogic learner explores the story to find out more information by asking questions, arguing and thinking critically to help them make connections and reflect on the new learning.

Asks questions
  • to show understanding and make sense of new learning
  • to ask and answer questions in your own mind (silent argument)
  • that are open ended
  • to invite others into a ‘game’ of verbal ping pong

  • to look back to find any errors or successes to correct or remember

Thinks critically
  • to show they are being smart with their thinking
  • to find things they agree or disagree with

Makes connections
  • to show they are able to relate to the book
  • to find things that are familiar or important to themselves, their family and their culture

Self manages
  • by making an independent promise to stay on task and make the most of their learning time
  • by asking questions when they don’t understand
  • by finding evidence to show others why their thinking is correct

Definition created by the focus group:

A dialogic learner in Room 9 is an individual who has the ability to correspond, examine, question, argue, discuss, find their next learning steps and take responsibility for their own education.
Appendix 6: Example of Personal Task Log

Appendix 7: Example of Personal Reading Log

Appendix 8: Posttest teacher/student dialogue

After the running record had been administered, both the pretest and posttest data were on the teacher’s desk. Mere picked up the more recent record and opened the dialogue.

Mere: Have I improved Miss? Did I shift my learning to the next level?
Teacher: What do you think?
Mere: I think I have because the level is higher (points to reading age level)
Teacher: Why do you think that happened?
Mere: I got more answers right in the question part (points to the increase in comprehension percentage from 50% to 70%)
Teacher: Why do you think those numbers are different?
Mere: I looked for words in the questions that might help me then I used my A.D.L skills to go back and look for them in the story. I knew more because I thought about what the author was trying to get me to understand. When I got stuck on that one I asked you if this was a thinking question (points to the _________ question).
Teacher: That’s right, you did. I was so excited when I heard you say that! (indication of rapport with student)
Mere: I know because you smiled, but then I thought about what my critical friend might say and I answered it like that.

Appendix 9: Google Form - student response sheet with transcription snapshot

Click here for access to student responses

Transcription of snapshot of Mere’s responses:

Teacher: What did you think when you first saw the reading challenge?

Mere: “I thought that it was going to be difficult because it was a lot of work to do within a short time limit. Also that the stories were going to be long and I am a slow reader and I wouldn't have enough time to complete the activities.

Teacher: How did your thinking change after the first few weeks?

Mere: My thinking changed a lot because I had understand the text in the story and It was easy to get the hang of the cycle of the A.D.L Task. It also changed my thinking to take my time reading the story so I can understand the question or task.

Teacher: What comprehension strategies has the reading challenge helped you practise?

Mere: It has helped me practise to go back and look in the story and read the story so I can memorise the story so It will help me with the questions I answer.”

Teacher: What is something that you will think about when you are doing reading tasks in the future?

Mere: I will think about how I can make the story easy for people to understand or I could go back and look  at  the A.D.L because it is there for a reason for us to move our reading to the next level.

Reference List:
Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives. New York: Longman.

Black, P., & Wiliam, D. (1998). WEA Education Blog. Retrieved from:

Blau, I., & Caspi, A. (2009). What type of collaboration helps? Psychological ownership, perceived learning and outcome quality of collaboration using Google Docs. In Proceedings of the Chais conference on instructional technologies research (pp. 48-55).

Bransford, J. D., & Schwartz, D. L. (2009). It Takes Expertise to Make Expertise: Some Thoughts About Why and How and Reflections on the Themes in Chapters 15-18. Development of professional expertise: Toward measurement of expert performance and design of optimal learning environments, 432.

Brown, J. (1995). The elements of language curriculum. Boston MA: Heinle & Heinle.

Darling-Hammond, L., & Bransford, J. (Eds.). (2007). Preparing teachers for a changing world: What teachers should learn and be able to do. John Wiley & Sons.

Elley, W. (2005). On the remarkable stability of student achievement standards over time. New Zealand Journal of Educational Studies, 40(1/2), 3.

Gibbons, P. (1991). Learning to learn in a second language. Portsmouth, NH: Heinemann.

Hattie, J., & Timperley, H. (2007). The power of feedback. Review of educational research, 77(1), 81-112.

Jessen, R., & McNaughton, S. (2014). Lecture 3.4: EDCURRIC 740 [Powerpoint slides]. Auckland New Zealand: University of Auckland.

Lai, M. K., McNaughton, S., Amituanai‐Toloa, M., Turner, R., & Hsiao, S. (2009). Sustained acceleration of achievement in reading comprehension: The New Zealand experience. Reading Research Quarterly, 44(1), 30-56.

Liu, L., & Maddux, C. (2010). Using dynamic design in the integration of Type II applications:
Effectiveness, strategies and methods. International Journal of Technology in Teaching and Learning, 6(1), 71-88.

McNaughton, S. (2002). Meeting of Minds. Wellington: Learning Media.

Ministry of Education. (2003). Effective Literacy Practice in Years 1 to 4. Wellington, NZ: Learning Media.

Ministry of Education. (2010). The Literacy Learning Progressions. Wellington, NZ: Learning Media.

Parker, K., & Chao, J. (2007). Wiki as a teaching tool. Interdisciplinary Journal of E-Learning and Learning Objects, 3(1), 57-72.

Reznitskaya, A., Anderson, R. C., Dong, T., Li, Y., Kim, I.-H., & Kim, S.-Y. (2008). Argument schema theory and learning to reason. In C. Collins-Block & S. Parris (Eds.), Comprehension Instruction: Research-based Best Practices (pp. 1-26). New York: Guilford Press.

Siegler, R. S. (2000). The rebirth of children's learning. Child Development, 71(1), 26-35.

Timperley, H. (2010, February). Using evidence in the classroom for professional learning. Paper presented at the Ontario Education Research Symposium.

Wilkinson, I. A. G., & Son, E. H. (2009). A dialogic turn in research on learning and teaching to comprehend. In M. L. Kamil, P. D. Pearson, E. B. Moje, & P. Afflerbach (Eds.), Handbook of Reading Research (Vol. IV, pp. 359-387). New York: Routledge.

Yang, M., & Carless, D. (2013). The feedback triangle and the enhancement of dialogic feedback processes. Teaching in Higher Education, 18(3), 285-297.