This research focuses on student inquiry learning in a simulated scholarly research environment. The design was based on a framework of six principles. One of the student activities, the peer review, was founded on activity theory. 428 groups of pre-university chemistry students from various countries participated. All students conducted, in small groups, an inquiry on fermentation at their schools. They wrote an inquiry report, did a peer review of another group's article and wrote a final article. Four groups of two students were randomly selected. Their peer review comments and articles were analyzed for the students' understanding of quality in an inquiry with respect to five categories. The data were complemented with in-depth group interviews. It was concluded that student understanding was positively influenced, with the exception of their understanding of reliability. Simulated scholarly activities can be founded on activity theory. The implications of this foundation regarding simulated scientific activities are discussed.
This research centres on inquiry-based learning within a simulated research environment. The design was founded on a framework of six principles. One of the student activities, the peer review, was founded on activity theory. 428 groups of pre-university chemistry students from various countries participated. All students worked in small groups and carried out an inquiry on fermentation at their schools. The students wrote an inquiry report, carried out a peer review of another group's article and finally wrote a final article. Four groups of two students were randomly selected. Their peer review comments and articles were analyzed according to the level of the students' understanding of quality in an inquiry, which comprises five categories. The data were complemented with in-depth group interviews. It was observed that, in general, student understanding was positively influenced, with the exception of their understanding of reliability. Simulated scholarly activities can be founded on activity theory. The implications of this foundation with respect to simulated scientific activities are discussed.
In many countries science education standards require involvement of secondary school science students in inquiry-based learning (cf. National Research Council, 1996). By emphasizing scientific inquiry learning in science curricula, teachers and educational researchers are challenged to come up with practically and theoretically founded approaches to student inquiry learning (e.g. Bencze & Hodson, 1999; Kass & Macdonald, 1999; Krajcik et al., 1998; Lotter, Harwood & Bonner, 2007; Roth, 1996; Windschitl, Thomson & Braaten, 2008; Van Rens, Van Dijk & Pilot, 2004).
Educational research often responds to this challenge from two perspectives: students who do activities that resemble scientists’ research activities (e.g. Driver et al., 1994; Van Rens, Pilot & Van der Schee, 2010) or students who work at scientists’ elbows (e.g. Barab & Hay, 2001; Lee & Songer, 2003; Bell et al., 2003; Van Rens et al., 2011). These studies are based on the assumption that students who either work on simulated research activities or at scientists’ elbows will develop understanding about the practice of scientists, the nature of science and scientific inquiry, as well as develop interest in and motivation for science.
However, in classroom settings science teachers often meet constraints in teaching scientific inquiry in a way that really fosters students’ scientific inquiry learning (cf. Lunetta, Hofstein & Clough, 2007). Moreover, in educational practice not all science students have the opportunity to work at scientists’ elbows, so it is proposed to study student inquiry learning when students are involved in simulated research activities.
To further address the issue of student inquiry learning in simulated research activities, this study investigates student understanding of quality in an inquiry while students work through an inquiry module. The design of this module is based on a theoretical framework that brings in activities resembling authentic scientific research and, for one of its components, draws on activity theory.
Theoretical framework
Designing simulated scientific research activities that are feasible for pre-university chemistry students in classroom settings requires collaboration with pre-university chemistry teachers (Kelly, 2003). The design of such activities should give the students insight into scientific research practice, so a framework of the activities in a scientific research practice is needed. Van Rens et al. (2010) argue that the design of a simulated scientific research should be based on six principles: a) create an inquiry community; b) select an adequate inquiry problem; c) design a cyclic and iterative inquiry process related to student willingness, knowing and ability; d) share inquiry results; e) create critical discourse in the community; and f) share new knowledge and further questions (see Figure 1). This framework was used to design, in cooperation with five pre-university chemistry teachers, several chemistry inquiry modules: Traditional and modern soap: washing power; Cola and Teeth; Cool: design a cold pack; Salty or Ionic Liquids (Van Rens & Pilot, 2010); Biofuels; Chocolate; and Gastronomy. These inquiry modules were successfully implemented between 2003 and 2009, with the number of participating pre-university chemistry students per module ranging from 124 to 663 and the number of pre-university chemistry teachers from 9 to 34, respectively.
Scientific research includes peer review to create critical discourse in the science community. Peer review has been used for several centuries to determine academic merit (e.g. Larochelle & Désautels, 2002). The scholarly activity of peer reviewing can be taken as an example to concretize design principle (e), so that students have an opportunity for critical discourse in the inquiry community.
In this study, scholarly peer review is considered as a human activity in terms of activity theory. This theory describes human activities with regard to the connection between scientific knowledge and social practice in a historical, cultural and societal sense (Leont’ev, 1978). Connecting these two creates relevance in the students’ eyes and so gives them motives to tackle scientific problems and to make socio-scientific decisions (e.g. Hofstein, Eilks & Bybee, 2011; Holbrook & Rannikmae, 2007; Lemke, 2001; Roth & Lee, 2004; Van Aalsvoort, 2004).
According to Leont’ev (ibid.), human activities manifest themselves on three levels: the level of condition-driven and routinized operations, the level of goal-driven individual or group actions, and the level of motive- or object-driven collective activities.
The latter, the level of object-driven collective activities, is frequently depicted as an activity system with seven components: subjects, community, object, outcome, division of labor, rules and tools. In such a system, the activities of subjects (humans) are oriented towards an object and transformed into an outcome. Moreover, the activities are carried out by a community and are mediated by tools, division of labor and rules (e.g. Engeström et al., 1999; Roth et al., 2002; Kahveci, Gilmer & Southerland, 2008; Hsu et al., 2010).
An activity system seems to be appropriate for simulating the activity of scholarly peer review, because it provides a connection between scientific knowledge in scholarly peer review and a collective inquiry peer review practice.
For the scientific knowledge component in scholarly peer review, the knowledge considered is that which comes into play when scientists submit their work for publication to the science community. This reviewing process is traditionally based on criteria by which peers judge the quality of the literature review, the significance of the question, the accuracy of the method, the reliability of the results, and whether the presented data support the conclusions and implications (cf. Baker, 2002). This knowledge is embedded in an activity system where the subjects are students cooperating in groups of two or three and sharing the same object, namely peer reviewing a student inquiry article regarding a specific topic, in a distinct inquiry community. In this study, the inquiry community of multiple groups of students from different schools can be identified as a community of practice (Roth & Lee, 2006) in that the various groups in the community collaborate in reviewing each other's inquiry articles with the aim of improving their articles. Division of labor refers to the division of tasks and decision making in the student groups within the community. The rules refer to the procedures and norms that mediate the regulation of constructive student peer review, for example in an internet symposium. In such a set-up of the peer review, the internet is considered the technical tool that makes all student reviews visible and helps the students in evaluating their peers’ inquiry articles. Finally, the outcome of the activity system consists of improved student inquiry articles and, as an intended spin-off, improved student understanding of quality in an inquiry. The various components of the dynamic activity system in this study, based on Hsu et al. (2010), are depicted in Figure 2.
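To make this mapping concrete, the sketch below is a purely illustrative encoding of the seven components of the peer-review activity system as a simple data structure; the class and field names are ours and are not part of the study's design or of Figure 2.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActivitySystem:
    """Illustrative encoding of the seven components of an activity system,
    instantiated here for the simulated peer-review setting described above."""
    subjects: List[str]        # who acts
    community: str             # the wider community the subjects belong to
    object: str                # what the activity is oriented towards
    outcome: str               # what the object is transformed into
    division_of_labor: str     # how tasks and decisions are divided
    rules: List[str] = field(default_factory=list)   # procedures and norms
    tools: List[str] = field(default_factory=list)   # mediating artifacts

# Hypothetical instantiation for the peer review in this study.
peer_review = ActivitySystem(
    subjects=["student groups of two or three"],
    community="inquiry groups from different schools reviewing each other's articles",
    object="peer reviewing another group's inquiry article on a specific topic",
    outcome="improved inquiry articles and improved understanding of quality in an inquiry",
    division_of_labor="division of tasks and decision making within the student groups",
    rules=["procedures and norms for constructive peer review in the internet symposium"],
    tools=["the internet symposium that makes all student reviews visible"],
)
print(peer_review.object)
```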
Activity theory is a relatively new theoretical framework in science education research, especially in design research. Van Aalsvoort (2004) designed a course ‘Chemistry in Products’ for grade nine students with activity theory as a model of society in which science and society are related, but she did not research student understanding during the enactment of the course. Furthermore, Holbrook and Rannikmae (2007) as well as Roth and Lee (2004) analyzed and discussed important definitions regarding the nature of science and scientific literacy based on activity theory. Activity theory as an analytical tool is also visible in the analysis of representations of scientists in high school and college textbooks by Van Eijck and Roth (2008). Roth et al. (2002) articulated, by means of two complementary activity systems, learning of subject matter, learning to teach subject matter and collective responsibility for teaching and learning in the co-teaching of dihybrid crosses. Kahveci et al. (2008) used activity theory to analyze the influences on two lecturers’ use of educational technology in undergraduate chemistry education and found various contradictions within and between activity systems. Hsu et al. (2010) drew on activity theory to analyze high school students’ representations of scientific practice during a long-term internship program and concluded that students’ representations were constituted and hindered by mixed-up activity systems.
The present study is new in the sense that the design of the simulated scholarly research community contains peer reviewing as an activity in terms of activity theory. More precisely, it is based on the understanding that collective student peer review represents an activity system in itself and as such avoids the impediments to student understanding that occur when activity systems are mixed up.
The following research question guided the study: What do pre-university chemistry students understand of quality in inquiries when they are involved in a simulated research community? It is postulated that when students become involved in such a simulated research community, they learn to evaluate inquiries and show understanding of quality in inquiries.
Method
A design research method was used (Van den Akker et al., 2006). The reason for this was twofold. First, the inquiry community with its teaching and learning activities was designed in collaboration with pre-university chemistry teachers (Kelly, 2003). Second, studying student scientific inquiry learning in a simulated inquiry community requires a naturally occurring setting of students in class, which, according to Collins, Joseph and Bielaczyc (2004), is one of the features of design research.
In order to determine any change in students’ understanding of quality in inquiry, four student groups’ first and final articles, as well as the influence of the peer review comments they submitted and received, were qualitatively analyzed (Cohen & Manion, 1994). Moreover, these student groups were interviewed to complement the written student data. Student groups were taken as the unit of analysis (cf. Cole & Engeström, 1993).
In the next paragraphs, the participants in and setting of the simulated research community and the procedures for data collection and analysis are described.
Participants in the simulated research community
The simulated scientific inquiry community was composed of 880 pre-university chemistry students and 39 chemistry teachers. The students, aged 16-18, came from 25 different schools in Brazil, Germany, The Netherlands and Poland.
The students worked in 428 groups on an inquiry module on ‘Fermentation’. Practical work was part of the pre-university chemistry curriculum for all students and their teachers. So, all students had experience in writing reports on experiments in chemistry. However, they were less experienced in conducting open inquiry in chemistry, and they had no experience in writing an article or a peer inquiry review.
All teachers had a Master's degree in chemistry, a degree in teaching and at least five years of experience in pre-university chemistry teaching. Their role, in this study, was to prepare the students for and guide them through the open inquiry process and to ensure that the groups submitted their first and final article and registered for the on-line peer review in time.
Setting of the simulated research community
The setting of the simulated research community was a six-month student inquiry project on ‘Fermentation: the chemistry of making bio-ethanol’. The project was designed in collaboration with five experienced Dutch pre-university chemistry teachers. Moreover, its design was in accordance with the findings from earlier research (Van Rens et al., 2010). In this research, which has been continuing for more than ten years, many students and their teachers from all over the world have participated. The chemistry teachers who had participated in one of the earlier student inquiry projects were invited by email to join the Fermentation inquiry project.
From the learning materials used in the inquiry project the students knew that they were part of a larger inquiry community. Furthermore, they knew that all inquiry groups did open inquiries on fermentation and would submit a first article on their inquiry for a peer review discussion in an internet symposium followed by a submission of their final article.
To get to know the content of the inquiry project, the students individually: read about chemistry research in general; thought of examples related to the fermentation process; and predicted, observed and explained (White & Gunstone, 1992) in a demonstration experiment concerning sucrose and baker's yeast.
Then, in groups of two or three, they first conducted a thought experiment on the amount of carbon dioxide gas released when 5 g of dry baker's yeast is put in an 18% D-glucose solution. They also calculated the theoretical amount of released CO2 (g). Second, they discussed a research paper, set by the author and the teachers, on ‘Yeast and fermentation: the optimal temperature’, an experiment with Saccharomyces cerevisiae (yeast cells) and sucrose. In this research paper they evaluated the quality of the research question, the assumptions and theory behind the hypothesis, the management of control variables, the accuracy of measurement, the presentation of the results, the reliability of the results, and the discussion and conclusions. These evaluations of the quality of the research put the student groups on the track of the reviewing process that takes place when scientists submit their work for publication to the science community (based on Baker, 2002). They also gave the students the prerequisite knowledge of the rules and criteria that are essential in a peer review.
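For reference, the theoretical amount of CO2 the students could calculate follows from the stoichiometry of ethanolic fermentation, in which the yeast acts only as the catalyst. The worked numbers below are a minimal sketch that assumes, purely for illustration, 100 mL of an 18% (m/v) D-glucose solution, since the solution volume is not specified here:

$$\mathrm{C_6H_{12}O_6 \;\longrightarrow\; 2\,C_2H_5OH + 2\,CO_2}$$

$$n(\mathrm{CO_2}) = 2\,n(\mathrm{glucose}) = 2 \times \frac{18\ \mathrm{g}}{180.2\ \mathrm{g\,mol^{-1}}} \approx 0.20\ \mathrm{mol} \approx 8.8\ \mathrm{g\ of\ CO_2},$$

which corresponds to roughly 4.8 L of gas at about 20 °C and 1 atm.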
After this, the groups were asked to formulate an inquiry question of their own choosing related to the topic of fermentation, to write an inquiry plan, to conduct the planned experiments, to report on the experiments in a first article and to submit it for publication. 428 groups submitted a first article to the ‘Fermentation Internet Symposium’. Then, each group was randomly coupled to another group of students for the on-line peer inquiry review session, which continued for four weeks. 91.1% of the groups participated in this session. 80.4% submitted a final article and competed for three chemistry inquiry awards. These awards were sponsored by industry and were assigned to the three best inquiries by a jury independent of the teachers and researcher. Moreover, the results of the inquiries in these articles were published in a Dutch science magazine; see www.nwtonline.nl.
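The article does not describe the mechanism used to couple the groups at random. As a minimal, hypothetical sketch, one way to implement such a coupling so that every group reviews exactly one other group's article and never its own is to shuffle the list of groups and let each group review the next one in the shuffled cycle (whether the actual coupling in the symposium was mutual is not stated):

```python
import random

def couple_groups(group_ids, seed=None):
    """Randomly couple groups for peer review so that every group reviews
    exactly one other group's article and never its own.
    Returns a dict mapping reviewer -> reviewed group."""
    rng = random.Random(seed)
    order = list(group_ids)
    rng.shuffle(order)
    # Each group reviews the next group in the shuffled cycle.
    return {order[i]: order[(i + 1) % len(order)] for i in range(len(order))}

# Example with six hypothetical group identifiers.
pairs = couple_groups(["G1", "G2", "G3", "G4", "G5", "G6"], seed=42)
for reviewer, reviewed in pairs.items():
    print(f"{reviewer} reviews the article of {reviewed}")
```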
The planned teacher activities were geared to the student activities and laid down in a teaching scenario. The learning and teaching materials consisted of a student workbook with worksheets, a teaching scenario for six lessons and a website. This website had five functions (see www.pieternieuwland.nl/Menu_Items/Projecten/Symposium/index.htm). First, to portray all participating schools and their inquiry groups. Second, to deliver the learning materials and a cyber-tracker with topic-relevant sites. Third, to provide a platform for the inquiry review session: the Internet symposium. Fourth, to publish the student groups’ first and final articles. Last, to announce the three award-winning teams.
Data collection and analysis
From the simulated research community described above, four groups of students (n = 8) were randomly selected from the Dutch cohort of student groups. These four groups did not know during their participation in the inquiry project that they had been selected for the study. They were selected from this cohort because they needed to be located not too far from our university, so that they could easily be interviewed after the inquiry project ended.
The peer review comments submitted and received by these four groups, as well as their first and final articles, were analyzed on the five categories that, according to Chinn and Malhotra (2002), should appear in written inquiry products: (i) inquiry question and hypothesis; (ii) experiments: variables, accuracy and reliability; (iii) data presentation; (iv) data interpretation: discussion and conclusion; and (v) evaluation. For this analysis a coding form was used. This form had previously been tested by two researchers in an analysis of 140 student groups’ first and final articles regarding their inquiries on ‘Cola and Teeth’, with an inter-rater reliability of 87%.
If a student group showed correct understanding with regard to one of the mentioned categories in its peer review, its first article and its final article, respectively, this was qualified as “best understanding” (✓✓) concerning that specific category. Partial correctness was qualified as “partial understanding” (✓). An omission or an incorrectness in the review comments, the first article or the final article was qualified as “wrong understanding” (–). Furthermore, the first and final articles of each of the four groups were analyzed to determine whether any change had been brought about by a received review comment.
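The article does not state how the 87% inter-rater figure mentioned above was obtained; a common and minimal way to express agreement between two coders is simple percent agreement,

$$\text{percent agreement} = \frac{\text{number of category judgments coded identically by the two researchers}}{\text{total number of category judgments}} \times 100\%,$$

so a value of 87% would mean that the two researchers assigned the same code (best, partial or wrong understanding) to 87% of the judgments; that the original figure was computed exactly this way is an assumption.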
One week after the closure of the fermentation inquiry project, the eight selected students received an email asking whether they would agree to an interview regarding the Fermentation inquiry project. They all agreed. Each of the four group interviews took about an hour and a half. In the interviews, the first and final articles of the group as well as the peer review comments that they submitted and received in the internet symposium were at hand. Each group of students was asked to explain what determines the quality of an inquiry with respect to each of the categories (i-v). The four interviews were transcribed. The transcripts were analyzed for the groups’ responses regarding only those categories with respect to which they had left their final articles unchanged although there was reason to change them.
All analyses were done independently by two researchers. Deviating scores were discussed until consensus was reached (Janesick, 2000).
Results
Student understanding in submitted peer review comments
The inquiry review comments regarding each of the categories (i-v) that the students of groups 1-4 respectively submitted to their peers on the quality of their inquiry on fermentation are:
- (i)
Inquiry question: ‘Perfect, dependent and independent variables are good.’ (group 1); ‘… put the temperature in your inquiry question.’(2); ‘… is interesting’(3); no comment (4); Hypothesis and theory: ‘Is anaerobic and oxygen free not the same?’ (group 1); ‘This aspect was good’ (2); ‘We only have a slight comment on the hypothesis. In which unit are the percentages of the sugar in fruits. Is this in moles? Or is this the mass percentage of sugar in fruits? And another point is: why should the highest percentage of sugar have a bigger ethanol production. Of course people that study chemistry know the answer, but will somebody who doesn’t know much about chemistry also know what you mean?’ (3); ‘You say that you think it won’t end up with the same results, and that you don’t know which one will produce more. That sounds a bit weird, because you assume it ends up differently, because otherwise what is the goal of the experiment? Maybe, you should say a bit more about which one you think will produce more and why?’ (4);
- (ii)
Experiments: variables, accuracy and reliability: no comment (group 1); ‘Maybe, you could mention how you did keep the control variables constant? (Oxygen level and concentration and pressure).’ (2); ‘In the experimental design is being said that the closed flask is shaken regularly, what is meant by ”regularly”? This could be once a day or more etc. Be more specific and are all flasks shaken at the same time? The same applies for the water in the kitchen blender, what kind of water is being used and does it have influence? Is this de-mineralized water? Tap water? Be more specific … The three ways of measuring are clever, but there was one point that was apparently overlooked. Because when you measure you have to take into account that also water evaporates during the process. This influences your measurements. The way of measuring including the balloon is also made up cleverly. But it's very difficult to know that there is no oxygen in the balloon before you measure, there's always some air in the balloon, this will also influence your results.’ (3); ‘We think it's not so accurate, because you say you only know that it is larger than 335 mL. How much larger exactly? … We think you made a mistake here, because you don’t say anything about the temperature in the experimental design, while you should have kept the temperature constant. And on your picture one sees a radiator on the background, which could influence the temperature. How do you know for sure that the temperature didn’t change, or influenced the experiment?’ (4);
- (iii)
Data presentation: no comment (group 1); ‘The significance in table 1 is not correct. If you look at the measurements of the rope, than you see that the significance is different from other measurements. Moreover, you use sometimes 3 significant figures instead of 1. We advise you to use the same significance in the whole table.’ (2); ‘The graph of the measurements looks very good, the only thing is that you have to really look at it before you understand what is what. There must be a way of presenting the measurements in a graph which is clearer and easier to understand.’ (3); ‘yes, well presented, except for the fact that you say in some bottles we saw bubbles and in others we didn’t. Which ones?’ (4);
Table 1. Student understanding in the peer review comments regarding each of the categories (i-v) that the groups (1-4) submitted, with student best understanding (✓✓), student partial understanding (✓), student wrong understanding (–) and not necessary (blank space).

| CATEGORY | PEER REVIEW COMMENTS OF GROUP 1 2 3 4 |
|---|---|
| (i) Inquiry question |  |
| Concrete | ✓✓ ✓ – |
| Hypothesis related to theory | – |
| Hypothesis explained | ✓✓ ✓✓ ✓✓ |
| (ii) Experiments: variables, accuracy and reliability |  |
| Measured variable |  |
| Constant variables | ✓✓ ✓✓ ✓✓ |
| Read off instrument in significant figure | ✓✓ |
| Range in temperature/pH | – – |
| Conducted in the same way | ✓✓ |
| (iii) Presentation of data |  |
| In table | – ✓✓ |
| In graph | ✓ |
| Calculations / Observations | ✓✓ |
| (iv) Interpretation of data: discussion and conclusion |  |
| Reliability | ✓ ✓ – |
| Logic inference | ✓ ✓✓ ✓✓ ✓✓ |
| (v) Evaluation |  |
| Reflect on method | – ✓✓ ✓ ✓ |
| Raises ‘new’ questions | – ✓ ✓✓ ✓ |

- (iv)
Data interpretation: discussion and conclusion: ‘Your discussion is too long and not to the point.’ (group 1); ‘Are your measurements reliable? The measurement points in figure 2 do not correspond with you conclusion, because at 5.3 there is no measurement. Only a line in your graph. You can’t say for sure that the line in the figure will go up or go down. Discussion is too brief. You didn’t explain anything about factors and measurements’ (2); ‘The measurements in one setting are not compared, please do so … The results are not relatively compared to the other measurements.’ (3); ‘yes, you did, in the part where it is explained that you switched to a cork… You explained the big differences but we think you explained it quite easily. You only say, we made a wrong experimental design. But what if you assume that some of the measurements were correct, why would that be?’ (4)
- (v)
Evaluation: no comment (group 1); ‘You haven’t mentioned pressure. How can you keep the pressure constant? The second point is: first you say that the concentration is kept constant. A few sentences later, you say it wasn’t constant. Is it constant or not? What influence do the caustic soda and hydrochloric acid have on the concentration of sucrose? You have to mention future inquiry questions.’ (2); ‘You mentioned the troubles you had with the experiment, but what about questions that could possibly be answered in another inquiry. For example why did only one set-up work?’ (3); ‘Not really. Maybe, a bit more critical on the set-up of the experiment? With what new questions can you come up?’ (4).
The analysis of the student understanding with regard to each of the five categories (i-v) in the submitted peer review comments of the four groups (1-4) is shown in Table 1.
Student understanding in first and final articles
The analysis of student understanding regarding each of the five categories (i-v) in the groups’ first and final articles on their inquiry on fermentation is shown in Table 2.
Table 2. Student understanding in the categories (i-v) in the first (A) and final (B) article of the student groups (1-4), with student best understanding (✓✓), student partial understanding (✓), student wrong understanding (–) and not applicable (*).

| CATEGORY | 1A | 1B | 2A | 2B | 3A | 3B | 4A | 4B |
|---|---|---|---|---|---|---|---|---|
| (i) Inquiry question |  |  |  |  |  |  |  |  |
| Unambiguous | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓ | ✓ | ✓✓ | ✓✓ |
| Relevant | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
| Concrete | ✓ | ✓ | ✓ | ✓✓ | ✓ | ✓✓ | – | ✓ |
| Hypothesis related to theory | – | ✓ | ✓✓ | ✓✓ | – | ✓ | – | ✓ |
| Hypothesis explained | ✓ | ✓✓ | ✓ | ✓✓ | – | – | – | – |
| (ii) Experiments: variables, accuracy and reliability |  |  |  |  |  |  |  |  |
| Measured variable | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
| Changed variable | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
| Constant variables | ✓ | ✓✓ | ✓ | ✓✓ | ✓ | ✓✓ | ✓ | ✓ |
| Choice of measuring instrument | ✓✓ | ✓✓ | ✓ | ✓✓ | ✓ | ✓✓ | ✓✓ | ✓✓ |
| Read off instrument in significant figure | ✓✓ | ✓✓ | ✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
| Range in temperature/pH | * | * | ✓✓ | ✓✓ | * | * | – | – |
| Repetition of measurement | ✓ | ✓ | ✓ | ✓ | – | ✓ | ✓ | ✓ |
| Conducted in the same way | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
| (iii) Data presentation |  |  |  |  |  |  |  |  |
| In table | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
| In graph | ✓ | ✓ | ✓ | ✓✓ | ✓✓ | ✓✓ | ✓ | ✓✓ |
| Observations / Calculations | * | * | ✓ | ✓✓ | – | ✓ | ✓✓ | ✓✓ |
| (iv) Data interpretation: discussion and conclusion |  |  |  |  |  |  |  |  |
| Reliability | – | ✓ | ✓ | ✓ | – | ✓ | – | ✓ |
| Logic inference | ✓ | ✓✓ | ✓ | ✓✓ | ✓ | ✓ | ✓ | ✓ |
| Conclusion related to question | ✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ | ✓✓ |
| (v) Evaluation |  |  |  |  |  |  |  |  |
| Reflect on method | ✓✓ | ✓✓ | ✓ | ✓✓ | ✓ | ✓ | – | ✓ |
| Raise further questions | ✓✓ | ✓✓ | ✓ | ✓✓ | ✓✓ | ✓✓ | – | ✓✓ |
The analysis of the peer review comments that group 1 received reveals correct review comments that concern: category (i) on “Hypothesis”; category (ii) on “Experiments: variables, accuracy and reliability”; category (iv) on “Data interpretation: discussion and conclusion”; and category (v) on “Evaluation”.
These comments are respectively: ‘you could include, for example, the reaction of fermentation to justify the gas produced’; ‘… you just didn’t keep the temperature constant in each test (it was supposed to be a control variable, but it ended up being another independent variable). So you can’t be sure that UV really affects fermentation or if the changes were caused by the temperature (or even by a mix of both) … the measures are accurate. The major problem is with their reliability … we think that the Drechsel bottle, the way you used it, is not the best instrument … you could have used a light sensor in the Drechsel bottle to count uninterruptedly the number of bubbles released, like some people did … the deviation was enormous, which shows that the reliability was not good and that something in the experimental set-up was wrong’; ‘we agree that the experiments showed that UV-C reduces the fermentation rate. However, we can’t be sure if UV-A/B increases, decreases or doesn’t interfere with [the] fermentation [process]. You should mention that in the discussion and conclusion’; and ‘you are critical. You recognized the problem with the variables (temperature) in it. However, you should have discussed the deviation between the control Erlenmeyer flasks in week 1 and 2 and what to do about that problem’.
Analysis of the peer review comments that group 2 received reveals correct review comments with regard to: category (ii) on “Experiments: variables, accuracy and reliability”; category (iii) on “Data presentation”; and category (iv) on “Data interpretation: discussion and conclusion”.
These comments are respectively: ‘The temperature is constant with the room, but this temperature changes during the day … so, eventually the temperature isn’t constant after all … as a result CO2 gas escaped. Altogether this part of the experiment wasn’t correct. The intervals from the pH scale are quite odd. In the report it isn’t clear why you have chosen these kind of intervals’; ‘Significance in the table isn’t the same everywhere’; ‘In the conclusion it is said that the measured quantity of CO2 is less then optimal. In the text by itself it isn’t clear. Is it meant that CO2 that the gas can’t be measured all or is it that the measured quantity is less then what is theoretically possible’.
Analysis of the peer review comments that group 3 received reveals correct review comments that belong to: category (i) on “Inquiry question”; category (ii) on “Experiments: variables, accuracy and reliability”; category (iii) on “Data presentation”; category (iv) on “Data interpretation: discussion and conclusion”; and category (v) on “Evaluation”.
These comments are respectively: ‘The independent variable (fresh or dry yeast) is missing. It is not explained how the hypothesis is formed. There is no theory supporting your hypothesis. How do you know [that] fresh yeast works better than dry yeast?’; ‘The control variables are not mentioned. Nothing is said about the accuracy. Did you weigh with a scale that is two decimals accurate? Did you put a water seal on the flasks or did you just let them stay open? If you did so, the measurements are not accurate, because oxygen was present, which made an aerobic process and the ethanol would have reacted further into carbonic acid’; ‘yes the results are not presented well. There is no calculation of any kind and what did you do with the change in mass of the experiment without yeast?’; ‘Is it logic that in more time more ethanol is produced? It is more relevant when you link it to how much ethanol is produced during the process, so you could follow the production of ethanol. Every measurement is done only once, which makes the results not reliable’; and ‘you are critical, but how reliable are your measurements?’
Analysis of the peer review comments that group 4 received reveals correct review comments that fall into: category (i) on “Inquiry question”; category (ii) on “Experiments: variables, accuracy and reliability”; category (iii) on “Data presentation”; category (iv) on “Data interpretation: discussion and conclusion”; and category (v) on “Evaluation”.
These comments are respectively: ‘The actual inquiry question is missing, but from your experimental procedure we see the dependent (CO2) and independent (pH) variable. The hypothesis and correct theory is also missing.’; ‘Do you mean that it doesn’t matter if you take 18% D-glucose solution or 25%? Do you mean that this does not matter in general or that it does not matter if you use the same percentage in each bottle (so that you keep this variable constant?). We do not know for sure that you kept the temperature constant, because we do not know if you put them in the same oven (we think so but maybe you can add °C). We think you measured accurately ... we do not know if you used the same measuring instrument’; ‘Our only remark here is that the axes, what they represent is missing’; ‘It is good that you compared your results with Slaa et al. [article set by designers of the project]. Maybe, an explanation is needed on that they used another pH than you did’; and ‘we could not find it… there is something to say, such as: what went wrong with your set-up, that somebody who will repeat your experiment needs to avoid… have not given further questions.’
Student understanding of quality in the five categories (i-v)
The analysis of the four transcripts of the students’ responses in the interviews reveals that group 1, during the interview, discovered that the inquiry question in their final article didn’t show the dependent variable, whereas they had given a correct peer review comment regarding the inquiry question of the group they were coupled to in the internet symposium. Their response: “It is always easier to see mistakes made by other people, because … I don’t know why … maybe, you think that you yourself are smarter”. Further analysis of the transcript of group 1 shows that the other categories that required change but were left unchanged in their final article concern, in category (ii), “Repetition of measurement” and, in category (iv), “Reliability”. Their responses are respectively: ‘you need to repeat the experiment so that your measurements are more accurate’; ‘then you may get to know what the exact measurement could be’. And with respect to category (iii) on “in graph”, they responded: ‘in fact we had not enough data to plot a graph and beforehand we also did not think of the spread [range] we needed on the x-axis…we never experienced that an inquiry is so challenging’.
Comparison of student understanding in the first and final article of group 2 reveals that the categories that required change but were left unchanged by the students in their final article concern, in category (ii), “Repetition of measurement” and, in category (iv), “Reliability”. Their responses respectively are: ‘We know that repetition is needed. We did all experiments twice so that we could find the averaged masses with its deviation of escaped carbon dioxide gas at different pH values’; ‘… the deviation tells us whether we can trust the experiment … and we trusted all experiments except the experiment at pH 6 with balloon 1 that failed’.
Comparison of student understanding in the first and final article of group 3 reveals that the categories that required change but were left unchanged by the students in their final article concern: category (i) regarding “Unambiguous” and regarding “Hypothesis explained”; category (iv) regarding “Logic inference” and category (v) regarding “Reflect on method”. Their responses respectively are: ‘As a group we couldn’t decide on the inquiry question, so we had two questions in one’ and ‘we could tell what we expected to be the answers on the questions we had because of what we know, but we felt that it takes to much space in our article to explain our expectations so we left them out’; ‘on the way we discovered that the experiments we did were not sufficient enough to either answer one or both questions, so in our discussion we ended up in a kind of vague bla bla bla’; and ‘we just didn’t have enough materials and time at school so we really had to hurry up’.
Comparison of student understanding in the first and final article of group 4 reveals that the categories that required change but were left unchanged by the students in their final article concern: category (i) regarding “Hypothesis explained”; category (ii) regarding “Constant variables”, “Range in pH” and “Repetition of measurement”; and category (iv) regarding “Logic inference”. Their interview responses respectively are: ‘… we didn’t write down our expectation, because we really did not know what to expect … we looked for the influence of acidity on yeast cells, but we did not understand the information we found on the internet so we did not use it’, ‘the data were a problem … so in fact we should improve our experiments … we didn’t keep important variables constant’, and ‘we should have thought about the different pH's to measure. Maybe, first by trial and error to find around what pH to measure and then think of … eh … the gap between the pH values’; ‘The time we discovered that part of the experiments had failed we did not have the time to repeat them … repetition gives better results, because you are also more skilled to do the experiments’; and ‘in fact normally the data help you to find a conclusion, but unfortunately we couldn’t give a trustful conclusion’.
Discussion, conclusion and implications
Our research question concerns pre-university chemistry students’ understanding of quality in inquiries when they are involved in a simulated research community.
From the analysis of student understanding in the submitted peer review comments it is concluded that student understanding is visible within several quality categories regarding an inquiry (see Table 1).
An interesting point is that the students in group 1 on the one hand give a good comment to their peers regarding the concreteness of the inquiry question (see Table 1), while on the other hand they themselves face problems in formulating a concrete inquiry question (see Table 2). It seems that seeing peers’ mistakes is easier than recognizing one's own (see the analysis of the group 1 interview). According to Hofstein et al. (2005), more focused practice in formulating inquiry questions will help the students to come up with better questions. Whether focused practice is also necessary to enhance student understanding in reviewing inquiry questions needs further research.
However, the opposite also occurs. The same students, group 1, show that they already understand how to evaluate their inquiry in their first article (see Table 2), but they also show negligence in commenting on this part of their peer's article (see Table 1). The same applies to group 2 regarding their understanding of an adequate range of the variable to be changed in an experimental inquiry. Hence, better student understanding leads to correct review comments, but does not always guarantee that students comment on parts of a peer's inquiry that require improvement.
From the analysis regarding student understanding in the final articles it is concluded that the students show understanding in many quality categories regarding an inquiry (see Table 2, under B). Moreover, a comparison between the first and final articles indicates that students show improved understanding in many categories (see Table 2). These improvements can almost all be traced back to the peer review comments that the students received. Also visible in Table 2 is that one aspect of quality in an inquiry is still difficult for students, namely the reason for repeating experiments as well as how to interpret the reliability of data. Some students confuse accuracy and reliability, think that they can find exact measurements, use daily-life language and expect more reliability with improved materials (see the analyses of the group interviews). An explanation for this could be that the concepts of accuracy and reliability have meaning regarding one datum as well as regarding a series of data and are as such complex concepts for students (Gott et al., n.d.).
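One illustrative way to separate the two concepts, offered here as a sketch rather than as the authors' or Gott et al.'s definition, is to consider a series of $n$ repeated measurements $x_1, \dots, x_n$ with

$$\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2},$$

where reliability concerns the spread $s$ (or the range $x_{\max}-x_{\min}$) over the repeated measurements, whereas accuracy concerns how close a single reading, or the mean $\bar{x}$, lies to the true or theoretically expected value, limited among other things by the resolution of the instrument. Repeating measurements allows the spread to be estimated and reduces the uncertainty in the mean, but does not by itself yield an 'exact' measurement, which is precisely the confusion visible in the interview responses.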
Of course, only a limited number of students were intensively studied in this study, but an earlier study on student reviewing capacity in the whole inquiry community reveals supportive findings. Moreover, a scan by two researchers of all final articles reveals that more students face problems with the concept of reliability.
However, in general it can be concluded that the simulated scholarly peer review positively influenced student understanding of quality in an inquiry. A reason for this positive influence is that the student activity is concretized in only one activity system. This limitation avoids the contradictions and constraints caused by multiple activity systems that were found by Hsu et al. (2010) in students’ representations of authentic science, as well as by Kahveci et al. (2008) in the introduction of new technology in chemistry education.
Hence, the components of an activity system can structure a student activity in an educational design for simulated scholarly research. The students show reasonable understanding of quality in an inquiry, which supports the hypothesis that the components of an activity system can be used in an educational design. In other words, participation in a simulated scientific research community provides opportunities for students to gain a better understanding of quality in inquiries. Moreover, the finding that students’ understanding of quality in inquiries develops in simulated scientific activities supports the importance of design research based on activity theory. The practical implication is that more students have the opportunity to learn about scientific research in science classes at school.
To what extent activity theory and activity systems can be applied in a broader context of science education should be studied further, but this study provides new insight into the possibilities.