Article

Opening Pandora’s Box: How Does Peer Assessment Affect EFL Students’ Writing Quality?

Business School, London South Bank University, London SE1 0AA, UK
Submission received: 21 May 2021 / Revised: 23 June 2021 / Accepted: 28 June 2021 / Published: 1 July 2021
(This article belongs to the Special Issue Recent Developments in Language Testing and Assessment)

Abstract
Recent research has underlined the benefits of peer assessment (PA), which helps learners write high-quality essays and increases their confidence as writers. In this intervention study, the essays (pre- and post-tests) of 200 Greek Cypriot EFL learners were evaluated on four aspects of writing quality after the learners had used either PA combined with teacher assessment (TA) (experimental groups, n = 100 students) or TA alone (control groups, n = 100 students) in their writing classes for one year. This is one of the few studies, to the knowledge of the present researcher, to have performed text analysis of so many aspects of writing quality using a relatively large sample (400 essays) in such a challenging setting (secondary education). Learners' essays were evaluated in terms of accuracy, fluency, grammatical complexity, and lexical complexity using Atlas.ti. Findings indicated that learners who received PA and TA improved their essays more in terms of lexical complexity, accuracy, and some features of grammatical complexity and fluency than those who received only TA. The current study highlights the desirability of collaborative group work, in the form of PA activities, in creating opportunities conducive to promoting writing quality.

1. Introduction

Peer assessment (PA), as a formative type of 'assessment for learning' which fosters student-centred evaluation, has been widely discussed (Lee and Hannafin 2016; Panadero et al. 2016; Wanner and Palmer 2018). PA is a valuable 'learning how to learn' technique (Li et al. 2012) because of its positive impact on students' motivation and involvement in learning, regardless of their age (Reinholz 2016; Tenorio et al. 2016). It supports students as they become accountable for their own learning while strengthening each other's achievements through peer response and evaluation (Tillema et al. 2011). It is also considered an effective method of enhancing students' appreciation of their own learning potential (Lynch et al. 2012).
One of the main tendencies of current European and international education is to develop more active and responsible life-long learners who can interact effectively with their co-learners as they shape their own learning (Gudowsky et al. 2016). However, PA, which promotes learner-centred assessment, cannot be implemented in EFL classes unless a more detailed description of the approach, especially in terms of preparing adolescent students and their EFL teachers, becomes available (Lam 2016). In addition, PA is often described in unclear and obscure language and is neither implemented nor perceived consistently in the teaching of secondary school learners (Harris et al. 2015).
Scholars note that a collaborative learning context is frequently absent from EFL learning in secondary education (Fekri 2016). A lack of clarity about what the 'peer' part of the assessment process actually involves discourages students from engaging in the practice of PA. In Cyprus, for instance, both learners and instructors tend to have limited previous experience of alternative assessment methods in the EFL classroom (Meletiadou 2012; Tsagari and Vogt 2017; Vogt and Tsagari 2014), as assessment has traditionally been teachers' sole responsibility. Nevertheless, students, teachers, and parents often complain that most students encounter significant hurdles in formal tests of writing and have negative attitudes towards writing and its assessment (Bailey 2017).
In addition, international trade and tourism in various countries in Europe and worldwide (e.g., Cyprus) have had a significant impact on EFL education. Multinational companies increasingly require their future employees to be proficient in English, with a particular focus on writing, since employees usually communicate through emails in English and most documents are in English. The researcher chose to focus on writing in EFL because English is the foreign language most people learn worldwide, and writing in English has drawn considerable attention over the years (Saito and Hanzawa 2016). Moreover, writing is an important part of most external EFL tests (e.g., IELTS). This has caused a backwash effect which has subsequently motivated EFL instructors to focus on increasing their students' writing proficiency (Dewaele et al. 2018). As the role of writing in EFL learning becomes more prominent, learners' ability to peer-assess their writing drafts also gains in significance (Puegphrom et al. 2014).
Curricular aims in secondary and further education stress, and occasionally require, that students collaborate and increase their self-reliance and accountability as learners (Petra et al. 2016). PA promotes the development of students' autonomy, cooperation, and self-regulation skills (Thomas et al. 2011). Consequently, PA methods should be explored if teachers are to foster 'learning how to learn' skills. There is also a need to understand the role and use of PA in the language learning process (Ashraf and Mahdinezhad 2015). Finally, although the literature on PA in higher education is expanding (Adachi et al. 2018), very little information about PA in the EFL and secondary education context is available (Chien et al. 2020; Panadero and Brown 2017).
The current study investigated the impact of PA of writing on adolescent EFL students' writing quality by analysing the text quality of students' essays (pre- and post-tests) with reference to four indicators of writing quality (Wolfe-Quintero et al. 1998), thereby adding to the literature on PA of writing in secondary education. The main research question of the current study was:
What is the nature of the impact of PA and TA on the writing quality of adolescent EFL students’ essays as opposed to TA only?
To sum up, the aim of the current study was to explore whether the combined use of PA and TA in public secondary schools could enhance EFL students’ writing skills and promote more inclusive assessment practices which may foster learning and improve the writing quality of adolescent EFL students’ essays.

2. Literature Review

PA and Writing Quality

A major concern of the present study was to investigate whether PA of writing could have an impact on EFL students' writing quality. Previous studies have explored only one or two aspects of writing quality. For instance, Jalalifarahani and Azizi (2012) examined the impact of two kinds of feedback (instructor versus student) on the grammatical accuracy and general writing enhancement of 126 high- versus low-achieving Iranian EFL students. Findings indicated that peer response did not help either high- or low-achieving learners improve their grammatical accuracy, but instructor comments were found to be beneficial for low-achieving students, especially as regards grammatical accuracy. In terms of general writing achievement, both teacher and peer feedback were remarkably influential irrespective of learners' prior achievement. Moreover, the study indicated that students preferred teacher comments and regarded their instructor as an almighty expert who could provide the right answer to every question.
In their study, Hashemifardnia et al. (2019) also claimed that learners improved their drafts considerably in terms of grammatical accuracy only when writing instructors provided feedback on grammatical errors. In addition, Liu and Brown (2015) reported that most studies on error correction in EFL writing classes indicated that learners receiving instructor error correction enhanced their accuracy over time. Moreover, certain researchers reported that instructor comments influenced students' general writing quality more than peer comments (Hamer et al. 2015; Zhu and Carless 2018), revealing that teacher feedback was likely to have more of an impact on overall writing quality than peer feedback.
Nevertheless, Sheen and Ellis (2011) reported that correcting local errors sometimes resulted in students making more mistakes in later drafts. They claimed that frequent error correction had a slightly negative impact on learners' ability to improve their writing performance. Additionally, Lee (2015) stressed that instructor comments were not more helpful than peer comments: students in his study were unsure whether instructor suggestions were more effective than peer comments as regards grammatical error correction. In terms of the impact of instructor versus peer comments on the general writing enhancement of high- versus low-achieving students, researchers indicated that instructor and peer comments both helped students improve their writing performance irrespective of prior achievement (Meek et al. 2017). Researchers also reported that teacher comments did not provide more significant benefits than peer comments as regards accelerating writing achievement (Chien et al. 2020; Ruegg 2015). Further, Patchan and Schunn (2015) and Pham and Usaha (2016) corroborated the effectiveness of peer comments for meaning-level corrections and, therefore, for improved writing quality, although the aim of the current study was not to compare instructor and peer response but to explore the effect of a combination of PA and TA on learners' essays. Hyland and Hyland (2019) and Yu and Hu (2017) indicated that peer comments had an overall beneficial effect on the quality of writing, although they did not compare peer to instructor comments. They highlighted the fact that peer corrections were significant, explicitly stating that peer revision should be seen as an important additional type of feedback for EFL learners, as peer feedback was more closely associated with linguistic rather than pragmatic aspects of writing (Hyland and Hyland 2019).
Soleimani et al. (2017) also explored the impact of peer-mediated/collaborative versus individual writing on measures of fluency, accuracy, and complexity in 150 female EFL learners' texts. Their findings revealed that collaborative groups outperformed individual groups in terms of fluency and accuracy but not in terms of complexity. The researchers therefore concluded that learning is a social activity and that students' writing skills improve when they interact with each other. Ruegg (2015) also discovered that peer response groups achieved higher writing scores than instructor response groups. Some other studies report that peer feedback is more effective than teacher feedback (McConlogue 2015; Wang 2014). The underlying reason for these controversies lies in the number of different factors taken into consideration in each study and in how these were treated by scholars.
Exploring the effectiveness of PA on writing and 24 grade 11 Thai learners' perceptions of PA, Puegphrom et al. (2014) discovered that students' writing skills were significantly enhanced after experimenting with collaborative PA. Students also thought that PA promoted collaborative learning and self-reliance. Further, according to Edwards and Liu (2018) and Ghani and Asgher (2012), instructor response and peer response enhanced learners' writing quality in comparable ways. Ghahari and Farokhnia (2017), who conducted a study with 39 adult learners exploring the effect of formative PA, in comparison to TA, on grammar uptake and on complexity, accuracy, and fluency (CAF) triad scale levels, reported that the accuracy and fluency levels of both PA and TA groups improved significantly.
Finally, in a semi-experimental study, Diab (2011) examined the impact of peer- versus self-editing on learners' writing skills. The study included two complete English classes, one of which used peer-editing while the other used self-editing. Findings indicated that, whereas peer-editors and self-editors showed similar observation skills, writers involved in self-editing rectified more mistakes than writers engaged in peer-editing. Surprisingly, however, peer-editors enhanced the writing quality of their essays considerably more than students involved in self-editing. Disparities in writing achievement resulted from the application of various language learning techniques, peer communication, and involvement with language.
To sum up, findings in the literature regarding the impact of PA and TA on the writing quality of EFL students' essays are mixed. Some researchers claim that peer feedback is not always effective, as students tend to think that peer comments are not credible or accurate and disregard them in favour of instructor comments (Kim 2005; Tsui and Ng 2000). Other studies report that PA is regarded as a vital component of the writing process that leads to enhanced writing skills (Yu and Wu 2013). Involving peers in presenting their viewpoints and providing suggestions to enhance student writing is comparable to a mirror reflecting the skills of both the assessor and the assessee (Puegphrom et al. 2014). Although numerous studies highlight the positive impact of PA on the writing quality of students' essays (Cho et al. 2008; Cho and MacArthur 2010), there is a lack of research on the impact of PA, especially when used in combination with TA, on the writing quality of EFL students' texts in secondary education (Double et al. 2020). Addressing the urgent need for further research in the field of PA, the present study explored the impact of blind reciprocal PA on the writing quality of adolescent EFL students' essays (Gielen and De Wever 2015).

3. Method

3.1. Participants

The participants in this study were two hundred adolescent Greek Cypriot EFL learners from four public secondary schools in Cyprus and twenty experienced EFL teachers, who were also native speakers of Cypriot Greek, each with more than 10 years of teaching experience. Students attended two 90-min classes per week as part of a compulsory EFL writing module. Teachers taught them how to write three types of essays (descriptive, narrative, and argumentative) over 9 months (a full school year), following the national curriculum provided by the Ministry of Education and Culture in Cyprus. Students had been taught letter writing, both informal and formal, during the previous school year. Teachers had to use an intermediate coursebook prescribed by the Ministry of Education. The aim was to gradually prepare learners for the IGCSE exams a few years later.
The researcher decided to conduct this study because students and their parents had been complaining about the students' EFL writing performance. Moreover, most of these learners were negatively disposed towards writing and received low grades, failing the local end-of-year summative tests. Participation in the current study was voluntary, and both students and their parents had to sign an informed consent form. The researcher received permission to conduct this study from the Pedagogical Institute in Cyprus and the University of Cyprus.
Learners formed 20 class groups and constituted a convenience sample, selected because of time and budget constraints. The curriculum, syllabus, and materials were the same for all students, who had to write a pre-test which served as a diagnostic test. This ensured that the learners who participated in the study were at the intermediate (B1) level according to the Common European Framework of Reference (CEFR) (Council of Europe 2001). The test was provided by the Cypriot Ministry of Education and Culture.
All participants had to write 5 compositions. Experimental group students (n = 100) wrote two drafts for 3 of the compositions, receiving PA from their classmates on the first draft and TA on the final draft, while control group students (n = 100) received only TA on both drafts. Teachers and peers had to use the same rubric to provide feedback. The researcher chose process writing so that students in the experimental groups could receive multiple rounds of feedback (i.e., from teacher and peers) across drafts (Strijbos 2016). Control groups were also able to receive teacher feedback twice.
Therefore, the only difference between the control and experimental groups was that experimental group students received PA. Consequently, any difference in the writing quality of students' essays can be attributed to the impact of PA, possibly because peers provided additional insights into their classmates' work and clearer suggestions about strategies for improving it.

3.2. Instrument

The main instrument of the study was a rubric (Appendix A), an adaptation of a well-known and widely used instrument for ESL composition writing, Jacobs' ESL Composition Profile (Jacobs et al. 1981). The PA form was directly linked to the CEFR (Council of Europe 2001). The validity of the PA form was explored through consultation with experts: 8 headteachers, one inspector, and 10 qualified EFL instructors who had been teaching at this level for a minimum of 6 years. The instrument had five categories (content, organisation, vocabulary and language use, mechanics, and focus). A pilot study was then conducted, during which 60 students and 6 teachers used the form to assess 5 essays. Students and teachers found the instrument user-friendly and suitable for these students' level and age. The reliability of the PA form was assessed by calculating Cronbach's alpha to measure the internal consistency of its features; the coefficient value was 0.9, indicating that the PA form could be regarded as a reliable instrument.
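For readers who wish to verify this kind of reliability check, Cronbach's alpha can be computed in a few lines of code. The sketch below is illustrative only and does not use the study's data: the score matrix is a hypothetical set of ratings, with one row per rated essay and one column per rubric category.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (essays x rubric categories) score matrix."""
    k = scores.shape[1]                          # number of rubric categories
    item_vars = scores.var(axis=0, ddof=1)       # variance of each category
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical ratings: 6 essays scored 0-4 on the five rubric categories
ratings = np.array([
    [3, 2, 3, 2, 3],
    [4, 4, 3, 4, 4],
    [2, 2, 1, 2, 2],
    [3, 3, 3, 2, 3],
    [4, 3, 4, 4, 4],
    [1, 2, 1, 1, 1],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```

A value close to 0.9, as reported here, indicates that the five categories rate the same underlying construct consistently.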

3.3. Procedure

Teachers received training in PA methods and process writing and then had to train their students following a specific schedule devised by the researcher. The researcher prepared a PA training session for students which drew on models of awareness-raising programmes (e.g., Saito 2008). Its main purpose was to explain the assessment criteria (Patri 2002) and give a short introduction to PA (Xiao et al. 2008). It lasted about 6 h and comprised several elements, e.g., revision strategies, mock rating, and revision with the PA form.
Students were supported as they gradually adopted process writing and PA of writing in their classes. They reflected on their purpose for using PA, since they all wanted to improve their performance and succeed in their IGCSE exams. Learners were invited to participate in the study by anonymously assuming the roles of assessor and assessee using the PA form (Appendix A). The implementation lasted 9 months and was divided into the following phases (Figure 1, modified from Falchikov 2005, p. 125).
The goal of the study was to explore whether students who used both PA and TA enhanced their writing skills more than those who received only TA. Students received the same parallel instruction and were asked to write an informal letter as a pre-test and post-test. The researcher had weekly meetings with the teachers to supervise the whole procedure closely, provide additional support and training in PA, and resolve any problems. The instrument used (Appendix A) was designed to guide students through a self-monitoring process in which they planned and evaluated their performance. It also helped teachers assess their students in a consistent way. All essays were marked by an external assessor, and a subset (20%) was also marked by the researcher to ensure interrater consistency. The validity of all instruments was checked through consultation with experts (inspectors and experienced EFL teachers).

3.4. Analysis of Writing Samples

To gauge the influence of PA and TA on learners' writing quality, the researcher selected four indicators to apply in the analyses of students' essays, based on previous studies of EFL writing and EFL writing evaluation (a computational sketch of these measures follows the list):
  • Fluency [(a) average number of words per T-unit, where 'T-unit' refers to a minimal terminable unit, i.e., an independent clause together with whatever dependent clauses, phrases, and words are attached to or embedded within it (Wolfe-Quintero et al. 1998), and (b) text length, defined as the total number of words in an essay written within the 30 min provided for each activity (Wolfe-Quintero et al. 1998)];
  • Grammatical complexity [(a) average number of clauses per T-unit; (b) dependent clauses per T-unit, and (c) dependent clauses per clause (Ting and Qian 2010)];
  • Accuracy [(a) the proportion of error-free T-units to T-units; (b) the number of error-free T-units, and (c) the number of errors per T-unit (Ting and Qian 2010)]; and
  • Vocabulary or lexical complexity [an elaborated type-token ratio: word types per square root of twice the total number of words, WT/√(2W), which takes sample length into account and thereby overcomes the problem that plain type-token ratios are influenced by text length].
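As mentioned above, once the structural units have been counted, the indicators themselves are simple ratios. The sketch below is a minimal illustration under the assumption that the counts of words, word types, T-units, clauses, and errors for each essay have already been tallied by hand (e.g., during coding in Atlas.ti); the names used are hypothetical, not taken from the study's materials.

```python
from dataclasses import dataclass
import math

@dataclass
class EssayCounts:
    words: int                # total words (text length)
    word_types: int           # distinct words
    t_units: int
    clauses: int
    dependent_clauses: int
    error_free_t_units: int
    errors: int

def writing_quality_indicators(c: EssayCounts) -> dict:
    """Compute the four families of indicators from raw counts."""
    return {
        "words_per_t_unit": c.words / c.t_units,                      # fluency (a)
        "text_length": c.words,                                       # fluency (b)
        "clauses_per_t_unit": c.clauses / c.t_units,                  # gram. complexity (a)
        "dep_clauses_per_t_unit": c.dependent_clauses / c.t_units,    # gram. complexity (b)
        "dep_clauses_per_clause": c.dependent_clauses / c.clauses,    # gram. complexity (c)
        "error_free_t_unit_ratio": c.error_free_t_units / c.t_units,  # accuracy
        "errors_per_t_unit": c.errors / c.t_units,                    # accuracy
        "corrected_ttr": c.word_types / math.sqrt(2 * c.words),       # lexical: WT/sqrt(2W)
    }

# Hypothetical counts for a single essay
print(writing_quality_indicators(
    EssayCounts(words=240, word_types=110, t_units=20, clauses=32,
                dependent_clauses=12, error_free_t_units=14, errors=9)))
```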
Research has indicated that these measures are among the best indicators of second language development in writing (see, for example, Wolfe-Quintero et al. 1998; Yang et al. 2015). Moreover, to maintain reliability in recording these types of changes, the researcher and an external assessor analysed all the data after initial marking of 20% of the revised essays.
The investigator ensured the interrater reliability of the PA form and the coding scheme by following this procedure (Ergai et al. 2016): (a) the researcher retained three distinct random samples of 10% of the data (one for pilot testing, a second for assessor training, and a third for interrater-reliability checks); (b) the investigator pilot-tested the PA form and the coding scheme to gain experience in using them and practised completing the forms; (c) the researcher used consistent rater training; (d) the investigator and an external assessor independently coded the third set of data; (e) the researcher identified where she reached a consensus with the external assessor and where she did not; and (f) the investigator clarified any obscure points and bridged the gaps to resolve any conflicts.
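The article does not name the agreement statistic used in step (e). Percent agreement and Cohen's kappa are two common choices for categorical coding decisions; the sketch below illustrates both on invented codes and is not a reproduction of the study's analysis.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned independently by the researcher and the
# external assessor to the same ten T-units (1 = error-free, 0 = has error)
researcher = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
assessor   = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1]

agreement = sum(r == a for r, a in zip(researcher, assessor)) / len(researcher)
print(f"percent agreement = {agreement:.0%}")                      # 90%
print(f"Cohen's kappa     = {cohen_kappa_score(researcher, assessor):.2f}")
```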

4. Results and Discussion

4.1. Results

The present study investigated the kind of impact that PA, when used in combination with TA, may have on the writing quality of students' essays by analysing their texts. Text analyses were performed to determine whether using PA and TA can improve the writing quality of adolescent EFL students' essays in four aspects which are regarded as ideal indicators of writing quality in the literature (Foster and Skehan 1996; Wigglesworth and Storch 2009), as opposed to using TA only.
A one-way MANOVA was performed to explore whether the use of PA and TA had a statistically significant impact on the writing quality of students' essays as regards its four indicators, i.e., accuracy, lexical complexity, grammatical complexity, and fluency (Table 1). The analysis suggested that the disparity in the writing quality of students' essays between experimental and control groups, in terms of the four indicators of writing quality, was statistically significant for all the measures mentioned above: F(10) = 3.461, p < 0.005; Wilks' Λ = 0.845, partial η2 = 0.16. This indicates that the combined use of PA and TA yields significant benefits for adolescent EFL learners' writing performance, as opposed to the use of TA only, with experimental groups outperforming control groups in all aspects of writing quality.
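For readers who want to replicate this type of analysis, a one-way MANOVA with the four indicators as dependent variables can be run, for example, with Python's statsmodels. The data below are simulated stand-ins, since the study's raw scores are not published with the article, so the output will not reproduce the reported statistics.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 100  # students per group, mirroring the study's design

# Simulated gain scores on the four writing-quality indicators
df = pd.DataFrame({
    "group": ["PA_TA"] * n + ["TA_only"] * n,
    "accuracy": np.r_[rng.normal(0.60, 0.20, n), rng.normal(0.45, 0.20, n)],
    "lexical": np.r_[rng.normal(4.5, 1.0, n), rng.normal(3.9, 1.0, n)],
    "grammatical": np.r_[rng.normal(1.6, 0.3, n), rng.normal(1.5, 0.3, n)],
    "fluency": np.r_[rng.normal(12.0, 2.5, n), rng.normal(11.5, 2.5, n)],
})

mv = MANOVA.from_formula(
    "accuracy + lexical + grammatical + fluency ~ group", data=df)
print(mv.mv_test())  # reports Wilks' lambda, F, and p for the group effect
```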
Tests of between-subjects effects, which show how the dependent variables vary with the independent variable (Davis et al. 2014), showed that the use of PA had a statistically significant effect on learners' writing performance (Table 2): (a) on lexical complexity; (b) on all indicators of accuracy, that is, error-free T-units and error-free T-units per T-unit; (c) on some aspects of grammatical complexity, that is, clauses per T-unit and dependent clauses per T-unit; and (d) on some aspects of fluency, that is, text length (TL) and words per T-unit. Nevertheless, PA did not have a significant impact on one aspect of grammatical complexity, that is, dependent clauses per clause, or on some aspects of fluency, that is, words per error-free T-unit and words per clause.
This finding corroborates previous research indicating that learners who received PA mainly concentrated on surface-level characteristics when revising their work (Baker 2016). Focusing on surface-level characteristics when revising slightly improved experimental group students' writing fluency and considerably enhanced their writing accuracy, but it did not significantly enhance the more advanced aspects of grammatical complexity in their essays when compared with the control group students. Moreover, this outcome aligns with Allen and Mills' (2016) finding that there were significantly more surface than meaning changes in the essays that students produced when they received both PA and TA. However, it contradicts Hamandi (2015), who claims that peer-initiated changes were less related to surface changes because learners were self-aware of their poor language skills.
To fully understand the impact of PA on learners' writing performance, the effect size for the total scores was calculated and found to be moderate (see Table 3). The effect size for each of the categories was also calculated. It was medium for lexical complexity and one aspect of grammatical complexity, and low for accuracy, one aspect of fluency, and one aspect of grammatical complexity. There was no effect for one aspect of grammatical complexity and almost all aspects of fluency (see Table 3).
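As a worked check of the multivariate effect size, partial eta squared can be recovered from the reported Wilks' lambda. With only two groups, the effect has a single degree of freedom, so the usual formula η² = 1 − Λ^(1/s) reduces to 1 − Λ.

```python
# Reported value from the MANOVA above
wilks_lambda = 0.845

# For a single-df (two-group) effect, s = 1, so partial eta^2 = 1 - lambda
partial_eta_sq = 1 - wilks_lambda
print(f"partial eta^2 = {partial_eta_sq:.3f}")  # 0.155, ~0.16 as reported
```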
To sum up, the findings indicated that experimental group students (who received PA and TA) outperformed control group students (who received only TA) in overall writing quality, despite their young age, their limited training in PA, and their lack of previous exposure to PA. Nevertheless, learners who received PA and TA did not enhance their writing fluency in a statistically significant way when compared with control group students (see Table 3), as this requires more time, training, and systematic exposure to PA over a long period (also in Hovardas et al. 2014). Experimental group students improved the accuracy, lexical complexity, and some aspects of grammatical complexity of their essays (see Table 3) to a statistically significant degree when compared with control group students, but they needed more time to develop the more complex aspects of grammatical complexity. Finally, the outcomes were not significant for one aspect of grammatical complexity and two aspects of fluency, possibly because of the instructional procedures, syllabus, and materials, which predominantly focus on teaching grammatical points and vocabulary at this (intermediate EFL writing) level in the Greek Cypriot public schools where the data were collected.

4.2. Discussion

Few studies have investigated the impact of PA on students' writing quality, especially in terms of lexical complexity and grammatical accuracy, and most of them relied on marks rather than text analysis (Birjandi and Hadidi Tamjid 2012; Meletiadou and Tsagari 2014). Findings from the text analysis in the current study showed that students improved the writing quality of their essays predominantly in terms of lexical complexity (Table 2), considerably increasing the range of words they used in their essays. The fact that these intermediate EFL students wrote much longer essays is an indicator of increased lexical complexity and overall fluency in writing and suggests that PA had a positive impact on adolescent EFL students' writing performance.
This is also supported by previous research indicating that, in time-limited language production tasks, producing more words is a good indicator of linguistic fluency (Wolfe-Quintero et al. 1998; Wu and Ortega 2013). Knoch et al. (2015) also reported that longer essays show richer content, more fluent language use, eloquence, and increased self-reliance in EFL writing. Consequently, in the present study, the fact that experimental group students managed to write longer essays indicates that they outperformed the control group students in fluency (Table 2 and Table 3), a crucial aspect of writing proficiency, confirming previous research (Zhang 2011). Writing instructors should therefore consider using PA in their beginner and intermediate EFL classes, as they frequently face the challenge of finding strategies to help their students write longer essays.
In the current study, students were able to improve only one aspect of grammatical complexity (Table 2), namely the number of clauses per T-unit. However, learners in the present study were unable to increase the number of dependent clauses in their essays (Table 2), probably because they found this challenging. In the current context, adolescent EFL students had only just begun to learn how to use dependent clauses correctly (Ministry of Education and Culture 2010, 2011). Students frequently still struggle with grammar at that point (intermediate level) in their learning journey, and it is extremely difficult to master all aspects of grammatical complexity, especially the more advanced ones, e.g., how to use dependent clauses successfully. Learners in the present study might have improved other aspects of grammatical complexity if they had been involved in PA more frequently and over a longer time frame. These findings were not confirmed by Soleimani and Rahmanian (2014), who reported that students benefited from a steady improvement in writing complexity, accuracy, and fluency and gained more in writing complexity and fluency than in accuracy. Pham and Nguyen (2014) also indicated that students in their study produced more feedback on local areas (rather than global areas), such as grammar, used a variety of grammatical structures, and corrected their grammatical errors.
Students also improved all aspects of accuracy (Table 2), as the number of error-free T-units they produced increased. The fact that students improved their accuracy significantly after being exposed to PA is an important finding: PA is a promising alternative assessment method for teachers and their low-achieving students who face considerable challenges with their writing skills and struggle with accuracy. EFL teachers should consider experimenting with PA, as they need to use more inclusive assessment strategies to support underperforming students.
These findings are also confirmed by previous research: Trinh and Truc (2014) reported that students who used PA in their study improved their command of mechanics when writing their essays. Jamali and Khonamri (2014) also revealed that PA can be a beneficial technique for enhancing the accuracy of EFL writing. Greater exposure to peers' drafts allows learners to see and comment on various writing styles, methods, content, and skills, urging them at the same time to learn from both the errors and the good performance of their peers (also in Han and Xu 2020).
However, the current study contradicts various other studies (Ruegg 2015; Wichadee 2013) which claim that peer review is not effective in improving the grammatical accuracy of students' final drafts. It shows that learners can improve their accuracy by providing feedback to each other. This sharing of feedback helps students develop their cognitive skills, scaffolds their learning and, eventually, makes them more independent learners: they first reflect on their peers' work, compare it to their own, and gradually detect and correct their own errors.
Further, students did not improve on two indicators of fluency (Table 2). They could not create longer sentences, because fluency develops later than all other aspects of writing (Kyle and Crossley 2016). At this level, students still struggle with grammar, syntax, and vocabulary (Fareed et al. 2016); they first need to improve these aspects of writing before producing longer and more complex sentences. However, students did improve one particularly important indicator of fluency, namely text length (Table 2). They produced longer texts using simpler sentences, since they managed to increase the number of independent clauses they used. Students had more ideas about the topic after the implementation of PA and included them in their essays, relying on what they knew best, that is, forming simple rather than complex sentences. Consequently, PA seems to improve students' writing fluency, but writing instructors need to encourage students to create simple sentences rather than urging them to produce complex sentences which do not match their level of maturity as intermediate EFL learners. They may also provide more training to learners and add more statements to the PA rubric to help them reflect on ways of gradually increasing the length of the sentences in their essays.
To sum up, experimental group students revised their texts both locally and globally, improving their accuracy, lexical complexity, fluency, and grammatical complexity (Table 2). They outperformed control group students, providing additional evidence that PA can positively influence learners' writing performance (also in Yu and Wu 2013). Experimental group students managed to read and critically engage in revising their peers' and their own work more effectively than control group students, who relied on TA only. The current study clearly shows that participants who received scaffolding from peers and the tutor made considerable progress in terms of writing quality, confirming previous research (Shooshtari and Mir 2014).

5. Conclusions and Pedagogical Implications

The present study is significant because it is one of the few, to the knowledge of the present researcher, to have used a semi-experimental design and involved many participants over a long time frame (nine months) to investigate the impact of PA on the writing quality of adolescent EFL students' essays. It yielded some remarkably interesting findings which clearly indicate that PA of writing can be used as a form of learning-oriented assessment with adolescent EFL learners to enhance their writing skills.
It highlights the role that PA can play in raising the consequential validity of an assessment/testing system in secondary education. First, it shows the kind of impact that PA may have on learning and outlines design principles for increasing the consequential validity of an assessment system for language learning (Behizadeh 2014). More specifically, this study shows that PA can help students better understand assessment/testing demands (also in Gotch and French 2020), provide a supplement to formative TA, and support students' response to TA by making it even more comprehensible to students (also in Suen 2014).
An additional pedagogical contribution of the present study is that its findings can help practitioners create appropriate teaching materials or adapt existing ones, improve their teaching strategies, and design PA rubrics that help adolescent EFL learners use PA effectively to enhance their writing skills. PA rubrics need to include more statements that help students reflect on and improve their writing fluency and grammatical complexity. Since intermediate adolescent EFL learners seem to find the use of dependent clauses quite challenging, EFL writing instructors need to focus on mechanics when they teach writing at that level and help learners gradually create longer sentences, as they seem to find this aspect of writing quite daunting. Students should be encouraged to work collaboratively, providing feedback to each other in more informal ways using more student-friendly PA rubrics. Additional training and support for EFL learners and teachers are also necessary to reap the benefits of social learning and PA and to help these learners improve their writing accuracy and fluency by reflecting on their own work as well as that of their peers. These recommendations and the outcomes of the current study can also be used by teachers who wish to implement PA when teaching younger or even older students, not only in English but in a variety of subjects, with the aim of enhancing their students' learning.
Although many researchers claim that PA can only be used with adults (Boud et al. 2014), this study has indicated that teachers can help their adolescent EFL students improve their writing performance considerably when they avoid overly corrective feedback and instead provide some comments, marks, and PA using a well-structured rubric. This twofold kind of feedback (PA and TA) helps students improve their writing skills more than providing only TA in the form of marks, some comments, and many corrections, as is the norm in EFL classes in many European countries (e.g., Cyprus) (Meletiadou 2013).
The current study argues for the use of PA as an alternative form of assessment which informs instruction and helps promote learning (also in Dastjerdi and Taheri 2016). In that respect, PA can be used as a form of learning-oriented assessment which may help students increase their writing proficiency. The outcomes of this study will help teachers, researchers, and all stakeholders visualise the classroom as a dynamic place in which students assume responsibility and act as agents of their own learning process (also in Adamson et al. 2014). While the instructors who took part in this study facilitated students' learning by providing instruction in assessment techniques, cooperative activities, and PA training, learners were also able to intervene in their own writing process by becoming more independent and using various types of feedback: teacher marks and comments, the PA form, and, most of all, peer suggestions. It is evident in this study that the PA procedure played a vital role in giving learners access to techniques and tools through sharing, peer learning, and clarification. This indicates that student writing and assessment activities are not merely cognitive tasks located within the individual student but contextually situated social and cultural activities (Panhwar et al. 2016).
One more significant pedagogical contribution of this study is that it promotes the creation of communities of writing practice (also in Midgley 2013). For instance, in this study, experimental group teachers systematically asked their learners to use PA as an essential learning strategy to improve their writing skills. Instructors encouraged students to create, edit, proofread, and exchange feedback on their essays in a cooperative manner, acting as both assessors and assessees. This enabled the creation of a community of learning in which learners helped their classmates improve their writing skills. In their journey from almost ignorant assessors to knowledgeable evaluators, learners managed to familiarise themselves with the rules of their community, communicate with their peers, and play their distinct roles as assessors and assessees (also in Simeon 2014). This is a procedure worth using in EFL and other subjects in Cyprus, but also worldwide. It will support students as they try to become more aware of themselves as writers and understand the nature of writing. It will also help learners realise how writing supports the social aspects of their lives as EFL learners.
The outcomes are also beneficial to both educators and theoreticians in the field of EFL/ESL teaching. They indicate ways in which EFL writing instructors can help their adolescent learners improve their writing complexity, fluency, and accuracy and write longer essays by fostering engagement in PA activities. The findings may also enhance students' self-regulation, reflection, and independence by providing insights into the benefits of PA, peer learning, and collaborative writing and their applicability to adolescent EFL/ESL learners. In other words, learning-oriented assessment tasks, such as PA activities, can create conditions that enhance students' skills, allowing them to move gradually towards self-assessment, which could reduce the amount of time instructors spend providing feedback to help students improve their writing performance. Consequently, it is vital to allow students to experiment with alternative assessment methods, such as PA, which foster student autonomy and ultimately facilitate the language learning process, saving teachers time and effort.
The results of the current study can guide educators who do not yet use alternative assessment methods, such as PA. These innovative approaches can provide an important avenue for learners to enhance their language proficiency and writing skills. Moreover, it is essential that educators prepare their students when involving them in new learning methods: by training them in PA, establishing ground rules for giving PA and collaborating, modelling the PA process, and showing them how they can share ideas and help their peers and themselves by exchanging PA.
Further, the present study may act as a pilot for a large-scale research project which could implement PA as a useful learning tool in secondary schools (e.g., in Cyprus) to improve students' performance in end-of-year summative tests. The current study could, therefore, contribute to such a project in terms of instrument design, refinement of instruments, and approaches to data analysis. The study also contributes to our understanding of PA in relation to the formative use of summative assessment and the tensions between the two purposes in high-stakes contexts (Cross and O'Loughlin 2013).
To sum up, the present study emphasises the value of interdependence among learners who provide valuable feedback to each other as they try to achieve common goals. It indicates that a combination of PA and TA can have a significant impact on the writing quality of EFL learners' essays, enabling them to produce much longer essays with more sophisticated vocabulary.

Limitations and Future Research

The current study comes with certain limitations, as it could have used an additional source of data, e.g., the think-aloud method or learners' diaries, to provide more in-depth evidence and shed more light on how students can use PA in the classroom more effectively to improve the writing quality of their assignments. The PA rubric used in this study is only suitable for essays. Future researchers could develop a similar comprehensive PA checklist for other types of writing, e.g., reports, which could include additional statements to help students develop their writing fluency even further.
Moreover, the sample was drawn from a single context, that of secondary education. Future research should investigate the use of PA in primary education to examine whether this innovative assessment method could help even younger students improve the writing quality of their texts. A significant direction for further research would be to explore ways of designing courses which use PA while aiming to develop other skills, e.g., speaking, and to examine how assessment training should be planned to foster skill acquisition. It would also be worthwhile to investigate further the link between PA skill acquisition and content skill acquisition, and the extent to which domain expertise affects the enhancement of assessment skills. Finally, teachers and researchers are also encouraged to conduct observational studies to explore the effects of PA and how various situational variables may influence it.
Further, researchers need to examine how students learn from their peers while assessing their texts, share ideas, cooperate, resolve conflicts, and exchange strategies as they try to make assessment and learning a mutually beneficial experience. Future studies should also explore how PA could be used from a younger age to foster peer learning and lead learners towards autonomy and effective collaboration. In 21st-century society, skills such as self-management and teamwork are as valuable as developing students' writing skills. Therefore, educators should experiment with a variety of assessment techniques to enhance students' linguistic and professional skills, since assessment is the tail that wags the curriculum dog.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Pedagogical Institute of the Cypriot Ministry of Education and Culture (7.15.01.25.8.2/3 on 1/11/2013).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Conflicts of Interest

The author declares no conflict of interest.

Appendix A

EFL essay scoring rubric (sample statements for each one of the criteria).
Criteria/Weighting: 18–20 = A; 15–17 = B; 11–14 = C; 6–10 = D; 0–5 = E
A. Content
5. The purpose of the essay is clear to its readers.
B. Organisation
11. The writer uses paragraphs with a clear focus and purpose.
C. Vocabulary and Language Use
13. The vocabulary is sophisticated and varied, e.g., use of unique adjectives.
D. Mechanics
29. There are errors of capitalisation.
E. Focus
32. There is a consistent point of view.
Additional open-ended comments:
1. Indicate three main strengths of the current essay.
……………………………………………………………………………………………………………
2. Indicate three main weaknesses of the current essay.
……………………………………………………………………………………………………………
Suggestions for revision
1. Write three specific recommendations to help the writer revise his/her work.
……………………………………………………………………………………………………………
Analytic score: Content: /4; Organisation: /4; Vocabulary and Language Use: /4; Mechanics: /4; Focus: /4; Total score: /20.
Holistic score:
18–20 = A; 15–17 = B; 11–14 = C; 6–10 = D; 0–5 = E

References

  1. Adachi, Chie, Joanna Hong-Meng Tai, and Phillip Dawson. 2018. Academics’ perceptions of the benefits and challenges of self and peer assessment in higher education. Assessment & Evaluation in Higher Education 43: 294–306. [Google Scholar] [CrossRef]
  2. Adamson, David, Gregory Dyke, Hyeju Jang, and Carolyn Penstein Rosé. 2014. Towards an agile approach to adapting dynamic collaboration support to student needs. International Journal of Artificial Intelligence in Education 24: 92–124. [Google Scholar] [CrossRef] [Green Version]
  3. Allen, David, and Amy Mills. 2016. The impact of second language proficiency in dyadic peer feedback. Language Teaching Research 20: 498–513. [Google Scholar] [CrossRef]
  4. Ashraf, Hamid, and Marziyeh Mahdinezhad. 2015. The role of peer-assessment versus self-assessment in promoting autonomy in language use: A case of EFL learners. Iranian Journal of Language Testing 5: 110–20. [Google Scholar] [CrossRef]
  5. Bailey, Stephen. 2017. Academic Writing: A Handbook for International Students. New York: Routledge. [Google Scholar]
  6. Baker, Kimberly M. 2016. Peer review as a strategy for improving students’ writing process. Active Learning in Higher Education 17: 179–92. [Google Scholar] [CrossRef]
  7. Behizadeh, Nadia. 2014. Mitigating the dangers of a single story: Creating large-scale writing assessments aligned with sociocultural theory. Educational Researcher 43: 125–36. [Google Scholar] [CrossRef]
  8. Birjandi, Parviz, and Nasrin Hadidi Tamjid. 2012. The role of self-, peer and teacher assessment in promoting Iranian EFL learners’ writing performance. Assessment & Evaluation in Higher Education 37: 513–33. [Google Scholar] [CrossRef]
  9. Boud, David, Ruth Cohen, and Jane Sampson, eds. 2014. Peer Learning in Higher Education: Learning from and with Each Other. Abingdon: Routledge. [Google Scholar]
  10. Chien, Shu-Yun, Gwo-Jen Hwang, and Morris Siu-Yung Jong. 2020. Effects of peer assessment within the context of spherical video-based virtual reality on EFL students’ English-speaking performance and learning perceptions. Computers & Education 146: 103751. [Google Scholar] [CrossRef]
  11. Cho, Kwangsu, Tingting Rachel Chung, William R. King, and Christian Schunn. 2008. Peer-based computer-supported knowledge refinement: An empirical investigation. Communications of the ACM 51: 83–88. [Google Scholar] [CrossRef]
  12. Cho, Kwangsu, and Charles MacArthur. 2010. Student revision with peer and expert reviewing. Learning and Instruction 20: 328–38. [Google Scholar] [CrossRef]
  13. Council of Europe. 2001. Common European Framework of Reference for Languages: Learning, Teaching, Assessment. Cambridge: Cambridge University Press. [Google Scholar]
  14. Cross, Russell, and Kieran O’Loughlin. 2013. Continuous assessment frameworks within university English Pathway Programs: Realizing formative assessment within high-stakes contexts. Studies in Higher Education 38: 584–94. [Google Scholar] [CrossRef]
  15. Dastjerdi, Vahid Hossein, and Raheleh Taheri. 2016. Impact of dynamic assessment on Iranian EFL learners’ picture-cued writing. International Journal of Foreign Language Teaching and Research 4: 129–44. Available online: https://jfl.iaun.ac.ir/article_561178.html (accessed on 20 October 2020).
  16. Davis, Tyler, Karen F. LaRocque, Jeanette A. Mumford, Kenneth A. Norman, Anthony D. Wagner, and Russell A. Poldrack. 2014. What do differences between multi-voxel and univariate analysis mean? How subject-, voxel-, and trial-level variance impact fMRI analysis. Neuroimage 97: 271–83. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  17. Dewaele, Jean-Marc, John Witney, Kazuya Saito, and Livia Dewaele. 2018. Foreign language enjoyment and anxiety: The effect of teacher and learner variables. Language Teaching Research 22: 676–97. [Google Scholar] [CrossRef] [Green Version]
  18. Diab, Nuwar Mawlawi. 2011. Assessing the relationship between different types of student feedback and the quality of revised writing. Assessing Writing 16: 274–92. [Google Scholar] [CrossRef]
  19. Double, Kit S., Joshua A. McGrane, and Therese N. Hopfenbeck. 2020. The impact of peer assessment on academic performance: A meta-analysis of control group studies. Educational Psychology Review 32: 481–509. Available online: https://www.researchgate.net/publication/33787256 (accessed on 17 September 2020). [CrossRef] [Green Version]
  20. Edwards, Jette Hansen, and Jun Liu. 2018. Peer Response in Second Language Writing Classrooms. Michigan: University of Michigan Press. [Google Scholar]
  21. Ergai, Awatef, Tara Cohen, Julia Sharp, Doug Wiegmann, Anand Gramopadhye, and Scott Shappell. 2016. Assessment of the human factors analysis and classification system (HFACS): Intra-rater and inter-rater reliability. Safety Science 82: 393–98. [Google Scholar] [CrossRef] [Green Version]
  22. Falchikov, Nancy. 2005. Improving Assessment through Student Involvement: Practical Solutions for Aiding Learning in Higher and Further Education. London: Routledge. [Google Scholar] [CrossRef]
  23. Fareed, Muhammad, Almas Ashraf, and Muhammad Bilal. 2016. ESL learners’ writing skills: Problems, factors and suggestions. Journal of Education and Social Sciences 4: 81–92. Available online: https://www.researchgate.net/profile/Muhammad_Fareed8/publication/311669829_ESL_Learners'_Writing_Skills_Problems_Factors_and_Suggestions/links/58538d2708ae0c0f32228618/ESL-Learners-Writing-Skills-Problems-Factors-and-Suggestions.pdf (accessed on 13 August 2020). [CrossRef]
  24. Fekri, Neda. 2016. Investigating the effect of cooperative learning and competitive learning strategies on the English vocabulary development of Iranian intermediate EFL learners. English Language Teaching 9: 6–12. [Google Scholar] [CrossRef] [Green Version]
  25. Foster, Pauline, and Peter Skehan. 1996. The influence of planning and task type on second language performance. Studies in Second Language Acquisition 18: 299–323. [Google Scholar] [CrossRef] [Green Version]
  26. Ghahari, Shima, and Farzaneh Farokhnia. 2017. Peer versus teacher assessment: Implications for CAF triad language ability and critical reflections. International Journal of School & Educational Psychology 6: 124–37. [Google Scholar]
  27. Ghani, Mamuna, and Tahira Asgher. 2012. Effects of teacher and peer feedback on students’ writing at secondary level. Journal of Educational Research 15: 84. [Google Scholar]
  28. Gielen, Mario, and Bram De Wever. 2015. Scripting the role of assessor and assessee in peer assessment in a wiki environment: Impact on peer feedback quality and product improvement. Computers & Education 88: 370–86. [Google Scholar]
  29. Gotch, Chad M., and Brian F. French. 2020. A validation trajectory for the Washington assessment of risks and needs of students. Educational Assessment 25: 65–82. [Google Scholar] [CrossRef]
  30. Gudowsky, Niklas, Mahshid Sotoudeh, Ulrike Bechtold, and Walter Peissl. 2016. Contributing to a European vision of democratic education by engaging multiple actors in shaping responsible research agendas. Filozofia Publiczna i Edukacja Demokratyczna 5: 29–50. [Google Scholar] [CrossRef] [Green Version]
  31. Hamandi, Dania Hassan. 2015. The Relative Effect of Trained Peer Response: Traditional Versus Electronic Modes on College EFL Lebanese Students’ Writing Performance, Revision Types, Perceptions towards Peer Response, and Attitudes Towards Writing. Master’s thesis, American University of Beirut, Beirut, Lebanon. [Google Scholar]
  32. Hamer, John, Helen Purchase, Andrew Luxton-Reilly, and Paul Denny. 2015. A comparison of peer and tutor feedback. Assessment & Evaluation in Higher Education 40: 151–64. [Google Scholar]
  33. Han, Ye, and Yueting Xu. 2020. The development of student feedback literacy: The influences of teacher feedback on peer feedback. Assessment & Evaluation in Higher Education 45: 680–96. [Google Scholar] [CrossRef]
  34. Harris, Lois R., Gavin T. L. Brown, and Jennifer A. Harnett. 2015. Analysis of New Zealand primary and secondary student peer-and self-assessment comments: Applying Hattie and Timperley’s feedback model. Assessment in Education: Principles, Policy & Practice 22: 265–81. [Google Scholar] [CrossRef]
  35. Hashemifardnia, Arash, Ehsan Namaziandost, and Mehrdad Sepehri. 2019. The effectiveness of giving grade, corrective feedback, and corrective feedback-plus-giving grade on grammatical accuracy. International Journal of Research Studies in Language Learning 8: 15–27. [Google Scholar] [CrossRef] [Green Version]
  36. Hovardas, Tasos, Olia E. Tsivitanidou, and Zacharias C. Zacharia. 2014. Peer versus expert feedback: An investigation of the quality of peer feedback among secondary school students. Computers & Education 71: 133–52. [Google Scholar] [CrossRef]
  37. Hyland, Ken, and Fiona Hyland, eds. 2019. Feedback in Second Language Writing: Contexts and Issues. Cambridge: Cambridge University Press. [Google Scholar]
  38. Jacobs, Holly L., Stephen A. Zinkgraf, Deanna R. Wormuth, V. Faye Hartfiel, and Jane B. Hughey. 1981. Testing ESL Composition: A Practical Approach. Rowley: Newbury House. [Google Scholar]
  39. Jalalifarahani, Maryam, and Hamid Azizi. 2012. The efficacy of peer vs. teacher response in enhancing grammatical accuracy & general writing quality of advanced vs. elementary proficiency EFL learners. International Conference on Language, Medias and Culture 33: 88–92. [Google Scholar]
  40. Jamali, Mozhgan, and Fatemeh Khonamri. 2014. An investigation of the effects of three post-writing methods: Focused feedback, learner-oriented focused feedback, and no feedback. International Journal of Applied Linguistics and English Literature 3: 180–88. [Google Scholar] [CrossRef]
  41. Kim, Minjeong. 2005. The Effects of the Assessor and Assessee’s Roles on Preservice Teachers’ Metacognitive Awareness, Performance, and Attitude in a Technology-Related Design Task. Unpublished Doctoral dissertation, Florida State University, Tallahassee, FL, USA. [Google Scholar]
  42. Knoch, Ute, Amir Rouhshad, Su Ping Oon, and Neomy Storch. 2015. What happens to ESL students’ writing after three years of study at an English medium university? Journal of Second Language Writing 28: 39–52. [Google Scholar] [CrossRef]
  43. Kyle, Kristopher, and Scott Crossley. 2016. The relationship between lexical sophistication and independent and source-based writing. Journal of Second Language Writing 34: 12–24. [Google Scholar] [CrossRef]
  44. Lam, Ricky. 2016. Assessment as learning: Examining a cycle of teaching, learning, and assessment of writing in the portfolio-based classroom. Studies in Higher Education 41: 1900–17. [Google Scholar] [CrossRef]
  45. Lee, Eunbae, and Michael J. Hannafin. 2016. A design framework for enhancing engagement in student-centered learning: Own it, learn it, and share it. Educational Technology Research and Development 64: 707–34. [Google Scholar] [CrossRef]
  46. Lee, Man-Kit. 2015. Peer feedback in second language writing: Investigating junior secondary students’ perspectives on inter-feedback and intra-feedback. System 55: 1–10. [Google Scholar] [CrossRef]
  47. Li, Lan, Xiongyi Liu, and Yuchun Zhou. 2012. Give and take: A re-analysis of assessor and assessee’s roles in technology-facilitated peer assessment. British Journal of Educational Technology 43: 376–84. [Google Scholar] [CrossRef]
  48. Liu, Qiandi, and Dan Brown. 2015. Methodological synthesis of research on the effectiveness of corrective feedback in L2 writing. Journal of Second Language Writing 30: 66–81. [Google Scholar] [CrossRef]
  49. Lynch, Raymond, Patricia Mannix McNamara, and Niall Seery. 2012. Promoting deep learning in a teacher education programme through self-and peer-assessment and feedback. European Journal of Teacher Education 35: 179–97. [Google Scholar] [CrossRef]
  50. McConlogue, Teresa. 2015. Making judgements: Investigating the process of composing and receiving peer feedback. Studies in Higher Education 40: 1495–506. [Google Scholar] [CrossRef]
  51. Meek, Sarah E. M., Louise Blakemore, and Leah Marks. 2017. Is peer review an appropriate form of assessment in a MOOC? Student participation and performance in formative peer review. Assessment & Evaluation in Higher Education 42: 1000–13. [Google Scholar] [CrossRef] [Green Version]
  52. Meletiadou, Eleni. 2012. The impact of training adolescent EFL learners on their perceptions of peer assessment of writing. Research Papers in Language Teaching & Learning 3: 240–51. [Google Scholar]
  53. Meletiadou, Eleni. 2013. EFL learners’ attitudes towards peer assessment, teacher assessment and the process writing. In Selected Papers in Memory of Dr Pavlos Pavlou: Language Testing and Assessment around the Globe—Achievement and Experiences. Language Testing and Evaluation Series. Edited by Dina Tsagari, Salomi Papadima-Sophocleous and Sophie Ioannou-Georgiou. Frankfurt am Main: Peter Lang GmbH, pp. 312–32. [Google Scholar]
  54. Meletiadou, Eleni, and Dina Tsagari. 2014. An exploration of the reliability and validity of peer assessment of writing in secondary education. In Major Trends in Theoretical and Applied Linguistics 3. Edited by Dina Tsagari. Warsaw: De Gruyter Open Poland, pp. 235–50. [Google Scholar]
  55. Midgley, James. 2013. Social Development: Theory and Practice. London: Sage. [Google Scholar]
  56. Ministry of Education and Culture. 2010. Foreign Language Programme of Study for Cypriot Public Secondary Schools; Nicosia: Ministry of Education.
  57. Ministry of Education and Culture. 2011. Foreign Language Programme of Study for Cypriot Public Pre-Primary and Primary Schools; Nicosia: Ministry of Education.
  58. Panadero, Ernesto, and Gavin T. L. Brown. 2017. Teachers’ reasons for using peer assessment: Positive experience predicts use. European Journal of Psychology of Education 32: 133–56. [Google Scholar] [CrossRef] [Green Version]
  59. Panadero, Ernesto, Anders Jonsson, and Jan-Willem Strijbos. 2016. Scaffolding self-regulated learning through self-assessment and peer assessment: Guidelines for classroom implementation. In Assessment for Learning: Meeting the Challenge of Implementation. Edited by Dany Laveault and Linda Allal. Cham: Springer, pp. 311–26. [Google Scholar]
  60. Panhwar, Abdul Hameed, Sanaullah Ansari, and Komal Ansari. 2016. Sociocultural theory and its role in the development of language pedagogy. Advances in Language and Literary Studies 7: 183–88. [Google Scholar]
  61. Patchan, Melissa M., and Christian D. Schunn. 2015. Understanding the benefits of providing peer feedback: How students respond to peers’ texts of varying quality. Instructional Science 43: 591–614. [Google Scholar] [CrossRef]
  62. Patri, Mrudula. 2002. The influence of peer feedback on self and peer-assessment of oral skills. Language Testing 19: 109–31. [Google Scholar] [CrossRef]
  63. Petra, Siti Fatimah, Jainatul Halida Jaidin, J. S. H. Quintus Perera, and Marcia Linn. 2016. Supporting students to become autonomous learners: The role of web-based learning. The International Journal of Information and Learning Technology 33: 263–75. [Google Scholar] [CrossRef]
  64. Pham, Ho Vu Phi, and Duong Thi Thuy Nguyen. 2014. The effectiveness of peer feedback on graduate academic writing at Ho Chi Minh City Open University. Journal of Science Ho Chi Minh City Open University 2: 35–48. [Google Scholar]
  65. Pham, Ho Vu Phi, and Siriluck Usaha. 2016. Blog-based peer response for L2 writing revision. Computer Assisted Language Learning 29: 724–48. [Google Scholar] [CrossRef]
  66. Puegphrom, Puritchaya, and Thanyapa Chiramanee. 2014. The effectiveness of implementing peer assessment on students’ writing proficiency. In Factors Affecting English Language Teaching and Learning. pp. 1–17. Available online: http://fs.libarts.psu.ac.th/research/conference/proceedings-3/2pdf/003.pdf (accessed on 1 July 2021).
  67. Reinholz, Daniel. 2016. The assessment cycle: A model for learning through peer assessment. Assessment & Evaluation in Higher Education 41: 301–15. [Google Scholar] [CrossRef]
  68. Ruegg, Rachael. 2015. The relative effects of peer and teacher feedback on improvement in EFL students’ writing ability. Linguistics and Education 29: 73–82. [Google Scholar] [CrossRef]
  69. Saito, Hidetoshi. 2008. EFL classroom peer assessment: Training effects on rating and commenting. Language Testing 25: 553–81. [Google Scholar] [CrossRef]
  70. Saito, Kazuya, and Keiko Hanzawa. 2016. Developing second language oral ability in foreign language classrooms: The role of the length and focus of instruction and individual differences. Applied Psycholinguistics 37: 813–40. [Google Scholar] [CrossRef] [Green Version]
  71. Sheen, Younghee, and Rod Ellis. 2011. Corrective feedback in language teaching. Handbook of Research in Second Language Teaching and Learning 2: 593–610. [Google Scholar] [CrossRef]
  72. Shooshtari, Zohreh G., and Farzaneh Mir. 2014. ZPD, tutor, peer scaffolding: Sociocultural theory in writing strategies application. Procedia-Social and Behavioral Sciences 98: 1771–76. [Google Scholar] [CrossRef] [Green Version]
  73. Simeon, Jemma Christina. 2014. Language Learning Strategies: An Action Research Study from a Sociocultural Perspective of Practices in Secondary School English Classes in the Seychelles. Doctoral dissertation, Victoria University of Wellington, Wellington, New Zealand. [Google Scholar]
  74. Soleimani, Hassan, and Mahboubeh Rahmanian. 2014. Self-, peer-, and teacher-assessments in writing improvement: A study of complexity, accuracy, and fluency. Research in Applied Linguistics 5: 128–48. [Google Scholar] [CrossRef]
  75. Soleimani, Maryam, Sima Modirkhamene, and Karim Sadeghi. 2017. Peer-mediated vs. individual writing: Measuring fluency, complexity, and accuracy in writing. Innovation in Language Learning and Teaching 11: 86–100. [Google Scholar] [CrossRef]
  76. Strijbos, Jan-Willem. 2016. Assessment of collaborative learning. In Handbook of Human and Social Conditions in Assessment. Edited by Gavin T. L. Brown and Lois R. Harris. London: Routledge, p. 302. [Google Scholar]
  77. Suen, Hoi K. 2014. Peer assessment for massive open online courses (MOOCs). International Review of Research in Open and Distributed Learning 15: 312–27. [Google Scholar] [CrossRef] [Green Version]
  78. Tenorio, Thyago, Ig Ibert Bittencourt, Seiji Isotani, Alan Pedro, and Patricia Ospina. 2016. A gamified peer assessment model for on-line learning environments in a competitive context. Computers in Human Behavior 64: 247–63. [Google Scholar] [CrossRef]
  79. Thomas, Glyn J., Dona Martin, and Kathleen Pleasants. 2011. Using self-and peer-assessment to enhance students’ future-learning in higher education. Journal of University Teaching and Learning Practice 8: 5. [Google Scholar]
  80. Tillema, Harm, Martijn Leenknecht, and Mien Segers. 2011. Assessing assessment quality: Criteria for quality assurance in design of (peer) assessment for learning–a review of research studies. Studies in Educational Evaluation 37: 25–34. [Google Scholar] [CrossRef]
  81. Ting, Mei, and Yuan Qian. 2010. A case study of peer feedback in a Chinese EFL writing classroom. Chinese Journal of Applied Linguistics 33: 87–100. [Google Scholar]
  82. Trinh, Quoc Lap, and Nguyen Thanh Truc. 2014. Enhancing Vietnamese learners’ ability in writing argumentative essays. Journal of Asia TEFL 11: 63–91. [Google Scholar]
  83. Tsagari, Dina, and Karin Vogt. 2017. Assessment literacy of foreign language teachers around Europe: Research, challenges, and future prospects. Papers in Language Testing and Assessment 6: 41–63. [Google Scholar] [CrossRef]
  84. Tsui, Amy B. M., and Maria Ng. 2000. Do secondary L2 writers benefit from peer comments? Journal of Second Language Writing 9: 147–70. [Google Scholar] [CrossRef]
  85. Vogt, Karin, and Dina Tsagari. 2014. Assessment literacy of foreign language teachers: Findings of a European study. Language Assessment Quarterly 11: 374–402. [Google Scholar] [CrossRef]
  86. Wang, Weiqiang. 2014. Students’ perceptions of rubric-referenced peer feedback on EFL writing: A longitudinal inquiry. Assessing Writing 19: 80–96. [Google Scholar] [CrossRef]
  87. Wanner, Thomas, and Edward Palmer. 2018. Formative self-and peer assessment for improved student learning: The crucial factors of design, teacher participation and feedback. Assessment & Evaluation in Higher Education 43: 1032–47. [Google Scholar] [CrossRef]
  88. Wichadee, Saovapa. 2013. Peer feedback on Facebook: The use of social networking websites to develop writing ability of undergraduate students. Turkish Online Journal of Distance Education 14: 260–70. [Google Scholar]
  89. Wigglesworth, Gillian, and Neomy Storch. 2009. Pair versus individual writing: Effects on fluency, complexity, and accuracy. Language Testing 26: 445–66. [Google Scholar] [CrossRef]
  90. Wolfe-Quintero, Kate, Shunji Inagaki, and Hae-Young Kim. 1998. Second Language Development in Writing: Measures of Fluency, Accuracy & Complexity. Honolulu: University of Hawaii at Manoa. [Google Scholar]
  91. Wu, Shu-Ling, and Lourdes Ortega. 2013. Measuring global oral proficiency in SLA research: A new elicited imitation test of L2 Chinese. Foreign Language Annals 46: 680–704. [Google Scholar] [CrossRef]
  92. Xiao, Yun, and Robert Lucking. 2008. The impact of two types of peer assessment on students’ performance and satisfaction within a Wiki environment. The Internet and Higher Education 11: 186–93. [Google Scholar] [CrossRef]
  93. Yang, Weiwei, Xiaofei Lu, and Sara Cushing Weigle. 2015. Different topics, different discourse: Relationships among writing topic, measures of syntactic complexity, and judgments of writing quality. Journal of Second Language Writing 28: 53–67. [Google Scholar] [CrossRef]
  94. Yu, Fu-Yun, and Chun-Ping Wu. 2013. Predictive effects of online peer feedback types on performance quality. Educational Technology & Society 16: 332–41. [Google Scholar]
  95. Yu, Shulin, and Guangwei Hu. 2017. Understanding university students’ peer feedback practices in EFL writing: Insights from a case study. Assessing Writing 33: 25–35. [Google Scholar] [CrossRef]
  96. Zhang, Jie. 2011. Chinese college students’ abilities and attitudes for peer review. Chinese Journal of Applied Linguistics (Quarterly) 34: 47–58. [Google Scholar] [CrossRef]
  97. Zhu, Qiyun, and David Carless. 2018. Dialogue within peer feedback processes: Clarification and negotiation of meaning. Higher Education Research & Development 37: 883–97. [Google Scholar] [CrossRef]
Figure 1. A cyclic scheme for PA.
Table 1. One-way MANOVA analysis of all indicators of writing quality.

Multivariate Tests
Effect | Value | F | Hypothesis df | Sig. | Partial Eta Squared | Observed Power
Wilks’ Lambda | 0.845 | 3.461 a | 10.000 | 0.000 | 0.189 | 0.991
a. Exact statistic; the mean difference is significant at the 0.05 level.
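For readers who wish to replicate this kind of multivariate analysis, the sketch below shows how a one-way MANOVA reporting Wilks’ lambda could be run in Python with statsmodels. It is an illustration only: the file name, the grouping variable, and the ten indicator column names are hypothetical, not the study’s actual data.

```python
# Minimal sketch of a one-way MANOVA (Wilks' lambda), assuming a CSV with one
# row per essay, a 'group' column (PA+TA vs. TA-only), and one column per
# writing-quality indicator. All file and column names are hypothetical.
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

df = pd.read_csv("essay_indicators.csv")  # hypothetical data file

# Ten dependent variables on the left-hand side, grouping factor on the right.
formula = (
    "errors_per_tunit + error_free_tunits + eft_per_tunit + "
    "lexical_complexity + dc_per_tunit + dc_per_clause + "
    "text_length + words_per_tunit + words_per_eft + words_per_clause "
    "~ group"
)
print(MANOVA.from_formula(formula, data=df).mv_test())
```

Note that statsmodels reports the four multivariate test statistics (including Wilks’ lambda) but not partial eta squared or observed power, which would have to be computed separately.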
Table 2. Tests between subjects’ findings for all dependent variables.

Dependent Variable | F | Sig. | Partial Eta Squared | Observed Power
Accuracy (errors per T-unit) | 5.608 | 0.019 | 0.028 | 0.654
Accuracy (error-free T-units) | 5.061 | 0.026 | 0.025 | 0.610
Accuracy (error-free T-units per T-unit) | 4.264 | 0.040 | 0.021 | 0.538
Lexical complexity (WT/√2W) | 9.838 | 0.002 | 0.047 | 0.877
Grammatical complexity (dependent clauses per T-unit) | 0.279 | 0.598 | 0.001 | 0.082
Grammatical complexity (dependent clauses per clause) | 2.276 | 0.133 | 0.011 | 0.324
Fluency (text length) | 4.339 | 0.039 | 0.021 | 0.545
Fluency (words per T-unit) | 1.062 | 0.304 | 0.005 | 0.176
Fluency (words per error-free T-unit) | 0.103 | 0.748 | 0.001 | 0.062
Fluency (words per clause) | 0.232 | 0.630 | 0.001 | 0.077
The mean difference is significant at the 0.05 level.
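The dependent variables in Table 2 are standard complexity, accuracy, and fluency (CAF) indices in the tradition of Wolfe-Quintero et al. (1998). Purely as an illustration of how such per-essay indices are derived from coded counts, a sketch follows; the field and function names are hypothetical, and the study itself coded the essays in Atlas.ti rather than with a script.

```python
# Hypothetical per-essay CAF indices computed from manually coded counts
# (words, word types, T-units, clauses, errors), following the general
# definitions in Wolfe-Quintero et al. (1998).
from dataclasses import dataclass
from math import sqrt

@dataclass
class EssayCounts:
    words: int
    word_types: int          # distinct word forms
    t_units: int
    error_free_t_units: int
    clauses: int
    dependent_clauses: int
    errors: int

def caf_indices(c: EssayCounts) -> dict:
    return {
        "errors_per_tunit": c.errors / c.t_units,
        "eft_per_tunit": c.error_free_t_units / c.t_units,
        "lexical_complexity": c.word_types / sqrt(2 * c.words),   # WT/sqrt(2W)
        "dc_per_tunit": c.dependent_clauses / c.t_units,
        "dc_per_clause": c.dependent_clauses / c.clauses,
        "text_length": c.words,
        "words_per_tunit": c.words / c.t_units,
        "words_per_eft": c.words / max(c.error_free_t_units, 1),  # guard /0
        "words_per_clause": c.words / c.clauses,
    }

# Example: a 180-word essay with 95 word types, 12 T-units (8 error-free),
# 20 clauses (7 dependent), and 9 errors.
print(caf_indices(EssayCounts(180, 95, 12, 8, 20, 7, 9)))
```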
Table 3. Effect sizes for all indicators of writing quality.

Indicator | Cohen’s d | Effect Size r
Accuracy—EFTT (error-free T-units per T-unit) | 0.20 | 0.13
Accuracy—ET (error-free T-units) | 0.33 | 0.16
Lexical complexity | 0.44 | 0.21
Grammatical complexity—DCT (dependent clauses per T-unit) | 0.07 | 0.03
Grammatical complexity—ICT (independent clauses per T-unit) | 0.45 | 0.22
Grammatical complexity—DCC (dependent clauses per clause) | 0.22 | 0.11
Fluency—WT (words per T-unit) | 0.14 | 0.07
Fluency—WEFT (words per error-free T-unit) | 0.04 | 0.02
Fluency—WC (words per clause) | 0.06 | 0.03
Fluency—TL (text length) | 0.29 | 0.14
Cohen’s d: small = 0.2, medium = 0.5, large = 0.8; effect-size r: trivial < 0.1, small = 0.1–0.3, medium = 0.3–0.5.
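Most of the r values in Table 3 are consistent with the standard conversion between Cohen’s d and the effect-size correlation for two equal-sized groups, r = d/√(d² + 4); the quick check below is an illustration, not the author’s own computation.

```python
# Convert Cohen's d to the effect-size correlation r for two equal-sized
# groups: r = d / sqrt(d^2 + 4). Reproduces most of the r values in Table 3.
from math import sqrt

def d_to_r(d: float) -> float:
    return d / sqrt(d ** 2 + 4)

for label, d in [("Lexical complexity", 0.44),
                 ("Grammatical complexity (ICT)", 0.45),
                 ("Fluency (text length)", 0.29)]:
    print(f"{label}: d = {d:.2f} -> r = {d_to_r(d):.2f}")
# e.g., d = 0.44 -> r = 0.21, matching the tabled value for lexical complexity
```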
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
