Abstract

Title of Document: USING ELABORATIVE INTERROGATION ENHANCED WORKED EXAMPLES TO IMPROVE CHEMISTRY PROBLEM SOLVING

Rebecca Simpson Pease, Doctor of Philosophy, 2012

Directed By: Dr. William G. Holliday, Department of Teaching, Learning, Policy and Leadership

Elaborative interrogation, which prompts students to answer why-questions placed strategically within informational text, has been shown to increase learning comprehension through reading. In this study, elaborative interrogation why-questions requested readers to explain why paraphrased statements taken from a reading were "true." Although previous research in elaborative interrogation has examined the effect of utilizing these why-questions while reading biology content, they have not been explored with chemistry text or chemistry textbooks that include worked example problems, according to a review of the literature. This study investigated the effect of answering elaborative interrogation why-questions placed adjunct to worked examples which were embedded within a section of a college chemistry textbook, compared with the commonly used study strategy of rereading the same text as a placebo-control. A randomized two-group posttest only design was used in this study. Specifically, the ability to solve quantitative chemistry problems in terms of a problem solving posttest requiring comprehension (dependent variable) was estimated for both groups and statistically compared. The subjects in this research were 74 students enrolled in an introductory chemistry course at a community college in the southwestern United States. Prior chemistry knowledge, mathematics skills, and verbal ability were also measured and statistical methods were employed to assess their correlations with posttest results in both groups. The use of elaborative interrogation why-questions was found to significantly benefit students' quantitative chemistry problem solving requiring comprehension compared to the rereading strategy, even after the effects of prior chemistry knowledge and mathematics skill (factors that were statistically determined to be significant predictors of posttest score) were statistically controlled.

USING ELABORATIVE INTERROGATION ENHANCED WORKED EXAMPLES TO IMPROVE CHEMISTRY PROBLEM SOLVING

By Rebecca Simpson Pease

Dissertation submitted to the Faculty of the Graduate School of the University of Maryland, College Park, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, 2012

Advisory Committee:
Professor William G. Holliday, Chair
Professor J. Randy McGinnis
Professor Olivia Saracho
Assistant Professor Gili Marbach-Ad
Professor Jing Lin

© Copyright by Rebecca Simpson Pease 2012

Dedication

To my wonderful husband Chris, who has always believed in me, even when I have had my doubts.

Acknowledgements

This endeavor would never have been possible without the help and support from wonderful mentors, colleagues, family and friends. Special thanks go to my advisor, Dr. William G. Holliday, for providing assistance, encouragement and professional guidance throughout this process, and Dr. J. Randy McGinnis for providing additional mentoring as well as assistantship opportunities. I would also like to express appreciation to my other advisory committee members, Dr. Olivia Saracho, Dr. Gili Marbach-Ad and Dr. Jing Lin, who devoted their time and expertise to guide me in this work. In addition, I am grateful to Dr.
Wayne Breslyn for volunteering his time to assist me in scoring students problem solving attempts and question responses. I would also like to add special thanks to the instructors and students at the community college where the research took place. This would not have been possible without their cooperation. During this experience I have had the good fortune to meet many people who have made my graduate school years memorable and enjoyable. Fellow graduate students, Dr. Rebecca Vieira, Dr. Andrew Kung, Dr. Selina Thomas and Dr. Arthur Winter were not only entertaining and supportive, but also understanding when I chose to take my research in a different direction. I have been fortunate to share much of this journey with Dr. Amy Dai, who has provided emotional support through the challenges of research and dissertation writing. I am proud to have such wonderful friends. Finally, I would like to thank my family, especially my husband Chris, my daughters Allison and Emily and my parents Robert and Margaret Simpson for their patience, continuous emotional support and willingness to listen to my whining. They have given me the strength to continue more times than I care to admit. iv Table of Contents Dedication ..................................................................................................................... ii Acknowledgements ...................................................................................................... iii Table of Contents ......................................................................................................... iv List of Tables ............................................................................................................... vi List of Figures ............................................................................................................. vii Chapter 1: Introduction ................................................................................................. 1 Reading Comprehension ........................................................................................... 2 Reading Research and Science Education Reading Research .................................. 3 The Importance of Reading ...................................................................................... 3 The Use of Questions to Improve Reading Comprehension .................................... 5 Reading in Chemistry and Worked Examples .......................................................... 6 Problem Solving........................................................................................................ 8 Research Description and Hypotheses ...................................................................... 9 Setting ..................................................................................................................... 11 Research Approach ................................................................................................. 12 Summary ................................................................................................................. 13 Chapter 2: Literature Review ...................................................................................... 15 Reading Comprehension ......................................................................................... 15 Reading Comprehension Strategies ........................................................................ 18 Questioning as a Reading Comprehension Strategy ............................................... 
21 Elaborative Interrogation ........................................................................................ 23 Role of Prior Knowledge in Elaborative Interrogation ........................................... 30 Worked Examples ................................................................................................... 31 Quantitative Chemistry Problem Solving ............................................................... 33 Community College Characteristics ....................................................................... 34 Summary ................................................................................................................. 36 Chapter 3: Method ...................................................................................................... 38 Introduction ............................................................................................................. 38 Research Design...................................................................................................... 39 Participants and Setting........................................................................................... 40 Materials ................................................................................................................. 42 Procedure ................................................................................................................ 50 Data Analysis .......................................................................................................... 53 Simmary .................................................................................................................. 54 Chapter 4: Results ....................................................................................................... 55 Introduction ............................................................................................................. 55 Description of Data and Analyses........................................................................... 56 Correlations of Variables ........................................................................................ 58 Evaluation of Hypotheses ....................................................................................... 59 Analyses of Additional Data ................................................................................... 68 Problem Solving Transfer ................................................................................... 68 Elaborative Interrogation Why-question Responses ........................................... 70 v Interviews ............................................................................................................ 71 Summary ................................................................................................................. 79 Chapter 5: Discussion ................................................................................................ 81 Introduction ............................................................................................................. 81 Findings................................................................................................................... 82 Comparisons to Other Studies ............................................................................... 87 Implications............................................................................................................. 
90 Future Research ...................................................................................................... 91 Limitations .............................................................................................................. 93 Summary ................................................................................................................. 94 Appendices .................................................................................................................. 95 Appendix A ............................................................................................................. 95 Appendix B ........................................................................................................... 100 Appendix C ........................................................................................................... 109 Appendix D ........................................................................................................... 117 Appendix E ........................................................................................................... 127 Appendix F............................................................................................................ 130 Appendix G ........................................................................................................... 134 Appendix H ........................................................................................................... 136 vi List of Tables 1. Sample why-question responses with scoring ..........................................................50 2. Independent samples t-test table: comparison of why-question treatment and rereading placebo-control groups on chemistry prior knowledge, mathematics skill and verbal ability Acknowledgements ..............................................................57 3. Correlation between study variables. ........................................................................59 4. Analysis of variance table: comparison of why-question treatment and rereading placebo-control groups on posttest scores in total. ...................................................60 5. Analysis of variance table: comparison of posttest scores overall and by group based on high v. low prior knowledge scores. ..........................................................62 6. Analysis of variance table: comparison of posttest scores of why-question treatment and rereading placebo-control groups based on high v. low mathematics skill scores. ................................................................................................................64 7. Analysis of variance table: comparison of posttest scores of why-question treatment and rereading placebo-control groups based on high v. low verbal ability scores. ............................................................................................................66 8. Stepwise regression analysis for variables predicting posttest score. .......................68 9. Analysis of variance table: comparison of why-question treatment and rereading placebo-control groups on lower transfer question and higher transfer question posttest scores. ..........................................................................................................70 10. Student volunteers? responses to questions regarding their opinions about their solutions to each posttest problem. ...........................................................................73 11. Interviewees? 
general responses to questions about their opinions of effectiveness of why-questions while reading and subsequent posttest problem solving ..................75
12. Pilot study: independent samples t-test table: comparison of posttest scores of the why-question treatment and rereading placebo-control groups ..................138
13. Pilot study: performance on chemistry prior knowledge and mathematics skills tests ..................138

List of Figures

1. Worked example 2 from the students' instructional material ..................45
2. Posttest score means for each strategy group divided into high and low prior knowledge test scores ..................63
3. Posttest score means for each strategy group divided into high and low mathematics skill test scores ..................65
4. Posttest score means for each strategy group divided into high and low verbal ability test scores ..................67

Chapter 1: Introduction

The purpose of this study was to investigate the effects of using the elaborative interrogation reading comprehension strategy with college students as they read their course chemistry textbook, which includes worked example problems, by asking them why-questions regarding the worked examples. Elaborative interrogation is a higher order questioning strategy to enhance learning by linking new information to prior knowledge (Menke & Pressley, 1994; Willoughby, Wood & Khan, 1994). This strategy requires readers to explain why statements pertaining to the information in the reading are true (Menke & Pressley, 1994). While other studies have reported successfully using the elaborative interrogation why-question strategy with college biology students using textbooks (e.g., Smith, Holliday & Austin, 2010), this one varied in that it focused on a mixture of informational chemistry text and chemistry worked problem solving exercises commonly studied in college chemistry courses. More specifically, the reading material consisted of a combination of prose, chemical symbols, numbers and mathematical calculations. Assessment in this study measured reading comprehension in terms of problem understanding, the major goal of reading informational text (Pressley, 2006), rather than rote recall of learned information. A goal of this research was to investigate a question-based strategy that college chemistry instructors could use to aid their students in comprehension of the challenging problem solving information presented in introductory chemistry textbooks and thus provide those instructors with an evidence-based rationale for selectively incorporating this strategy into their instruction.

Reading comprehension

Reading to learn is a major component in the lives of most students; however, for many it is often difficult and unproductive, perhaps because little time is allocated in earlier grades to teaching strategies for comprehension of informational text (Pressley, 2006). Students who comprehend what they read are generally able to explain the main idea, answer questions in their own words and make logical inferences regarding the information in the text (Smith et al., 2010). Much research has been conducted by reading education researchers to explore methods to improve the effectiveness of student reading comprehension and retention.
Most of this past research focused on pre-adolescent learners and considered issues such as strategies for elevating reading level and decoding ability using non-science or non-authentic text (Kruidenier, 2002). While these concerns are certainly still relevant for many adolescent and adult learners (Best, Rowe, Ozuru & McNamara, 2005), much of their required course reading may assume that these decoding skills and comprehension strategies have been mastered, perhaps especially in science textbooks and other reading materials that are often required for various science courses. Since a certain amount of the content knowledge required for secondary or post- secondary level science courses typically must be acquired from outside reading, it is essential for students enrolled in these courses to possess strategies for comprehension of complex text if they hope to be successful (Ryan, 2006; Sappington, Kinsey & Munsayac, 2002). In a survey of particular interest for the present study, college freshman chemistry students reported poor reading comprehension as one of the top ten reasons for problem solving difficulty (Silberman, 1981). 3 Reading research and science education reading research Science education researchers have reported only a small number of findings on elementary school students? reading comprehension of informational science text (Duschl, Schweingruber & Shouse, 2007), and fewer still on higher level students (Pressley, 2006). Most of the science education research concerning reading is not focused on improving students? comprehension of text, but rather on issues such as how scientists are portrayed in college science textbooks (van Eijck & Roth, 2008; Wong & Hodson, 2009), and how high school students? learning may be influenced when concepts are presented in a variety of modes (Hand, Gunel, & Ulu, 2009). Other reading research conducted by science educators is mostly associated with reading material, such as surveys on the subject of textbooks (e.g., Digisi & Willett, 1995; Weiss, Banilower, McMahon, & Smith, 2001), textbook design (e.g., Gee, 2004; Holliday, 2004; Kesidou & Roseman, 2002; Wong & Hodson, 2009) and research reviews of science textbooks (e.g., Holliday, 2004; Shanahan, 2004). On the other hand, reading education researchers frequently center on the need to teach and utilize reading comprehension strategies across all content areas and grade levels. They also stress the need to conduct research using more authentic text such as that frequently assigned to students (RAND Reading Study Group, 2004). However, these researchers seldom investigate or discuss specific approaches to facilitating students? comprehension of reading science content (Holliday, 2004). The importance of reading Reading is a domain in schools where little performance progress has been observed based on the National Assessment of Educational Progress approach to 4 assessment used on fourth and eighth grade readers (Gewertz, 2010). In addition, there is little reason to believe older learners are progressing much faster (Brozo, 2009). Perhaps this is why, according to a leading International Reading Association survey of leaders in the field of literacy, adolescent literacy and comprehension are two of the four research domains (out of 27) ranked as ?very hot? research topics (Cassidy, Valadez, Garrett & Barrera, 2010). 
There is little doubt that reading science texts will remain an important activity since it is not only a task that most adolescent and older students find central to their school science learning, it is also regularly practiced by scientists (Craig & Yore, 1996; Norris & Phillips, 1994, 2008; Shanahan, 2004). Professionals in science and engineering fields and other scientifically literate individuals acquire much of their background knowledge by reading challenging text and rely heavily upon reading to remain current in their fields of expertise (Norris & Phillips 2008). These successful and influential individuals routinely implement reading comprehension strategies to assist them in comprehension of the information found in the text (Holliday, 2004; Pressley & Wharton-McDonald, 2006; RAND Reading Study Group, 2004). Even though college students are often required to supplement their classroom learning by reading from their textbooks, many college students apparently do not complete the readings assigned by their professors. In fact, a recent study found that college students often read only a few passages from their science textbooks in preparation for course exams (Bonner & Holliday, 2006). Another study found that most students surveyed at two universities either did not read their text at all, or only read sparingly even when they purchased the textbook (Sikorski, Rich, Saville, Buskist, 5 Drogan & Davis, 2002), and Pentecost & James (2000) concluded that college students rarely utilized their textbook while studying for their chemistry course. According to data obtained from the National Assessment of Educational Progress in 2008, over 60% of 17 year olds do not read at a level at which they can understand complicated information contained in text (Rampey, Dion & Donahue, 2009). Furthermore, they reported that only six percent of 17 year olds rank as capable of learning from specialized reading materials such as that typically found in science textbooks. Science text often differs from ordinary vernacular and commonly includes symbolic or mathematical representations making it more difficult for students to comprehend than disciplines such as history or literature (Millar, 1991; Yore & Shymansky, 1991). While most college students have the ability to read and comprehend the type of text that might be found in newspaper articles, many have difficulty comprehending information contained science textbooks (Callender & McDaniel, 2007; Caverly, Orlando, & Mullen, 2000; Shanahan, 2004), thus hindering many students? ability to grasp new science concepts (Gee, 2004). On the other hand, students who do comprehend can produce inferences based on what they have read and can go beyond producing verbatim responses from a text (Norris & Phillips, 1994). Therefore, strategies that improve a student?s comprehension of information obtained through reading science text, in particular chemistry text including worked examples, may increase the student?s learning and enhance the value of the course for the student. The use of questions to improve reading comprehension Questions typically found in college science textbooks, such as those at the end of a chapter, are often considered by instructors as good study aids for their students. 6 However, questions that require verbatim responses may actually hinder learning, especially with regard to low verbal ability learners (Holliday, Whittaker & Loose, 1984). 
Alternatively, the elaborative interrogation strategy, which requires students to elaborate or explain information as it is read by embedding why-questions within the reading, is designed to help students activate their prior knowledge rather than provide them with practice or application of principles contained in the text. In elaborative interrogation, why-questions are periodically presented during the reading, encouraging the reader to review their prior knowledge and formulate an explanatory response (Martin & Pressley, 1991). Reading in chemistry and worked examples College textbooks are often the most difficult type of text for students because they have a high density of technical information and new terminology (Caverly et al., 2000). Science textbooks use descriptions and explanations of events that are not part of common discourse and often rely on prior knowledge or experiences to be meaningful for the reader (Yore & Shymansky, 1991). More specifically, many science textbooks include mathematical calculations based on presented scientific principles and described in textual form to provide the reader an example quantitative problem solving method that can be applied to similar problems (Yore & Shymansky, 1991). This is the case in college chemistry textbooks. Generally, these instructional calculations are presented in the form of worked examples provided after the introduction of the concepts and problem-related information associated with the calculations. These worked examples typically include a problem statement followed by verbal descriptions of step-by-step methods for solving the problem along with explanations of the goals of 7 each step (Atkinson, Derry, Renkl, & Wortham, 2000), thus modeling the productive problem solving processes (Mayer, Sims & Tajika, 1995). While students may be able to follow along while reading these calculations, often they simply memorize the steps without comprehension of underlying scientific and mathematical principles that would be needed to solve new, related problems thereby demonstrating reading comprehension (Mayer et al., 1995). While this memorization approach may lead to some success in superficial problem solving, it is not an indicator of problem solving comprehension on which the calculations are based (Mayer, 2004). A great deal of research has been published and is currently under investigation regarding worked examples in textbooks, most of which is framed in cognitive load theory (e.g., Atkinson et al., 2000; Cooper & Sweller, 1987; Sweller & Cooper, 1985). Smith et al. (2010) successfully established the estimated power of elaborative interrogation why-questions based in biology. However, there is a lack of research using chemistry text and, more particularly, chemistry worked examples linked to reading comprehension strategies applied to textbooks used in today?s science classrooms. Since worked examples are typically embedded within relevant expository text, it may be valuable to view worked examples in terms of reading comprehension and in the context of elaborative interrogation why-questions. Elaborative interrogation is a reading comprehension strategy that uses why- questions to prompt students to explain why information presented in their text makes sense (Pressley, 2006). 
Research suggests that when students are prompted to answer these why-questions as they read, prior knowledge is activated and newly presented information is more easily linked, that is assimilated and accommodated, with their prior 8 knowledge which makes the reading more meaningful and memorable (Pressley, Wood, Woloshyn, Martin, King & Menke, 1992). In an analysis of research on prior knowledge and study strategies, Prawat (1989) asserts that meaningful learning requires connections between new knowledge and prior knowledge. However, students often do not make these connections automatically (Menke & Pressley, 1994; Prawat, 1989). Therefore, if by utilizing the elaborative interrogation why-question strategy students are more likely to activate prior knowledge, improved learning outcomes may be expected, according to Menke and Pressley (1994). Problem solving Problem solving has been defined by Wheatley (1995) as ?what you do when you don?t know what to do? (p. 3). While there has been much discussion in the chemical education literature over problem solving and the difference between problems and exercises, this distinction is difficult to delineate when considering students with little or no experience with a particular type of question. It has been asserted (Bodner, 2003) that if a student applies an algorithm or set of rules (such as the information presented in worked examples) to find the answer to a question, it is an exercise, not a problem. On the other hand, Smith (1991) justified the use of algorithms, stating that they are regularly utilized by skilled problem solvers and that the key was whether the use of algorithms was mindless or with understanding. Also, the distinction between a problem and an exercise may not be a matter of difficulty or complexity, but whether or not the student is familiar with the task (Bodner, 2003; Bodner & Herron, 2002). Smith (1991) stresses the impact of practice with similar questions as crucial in differentiating between problems and exercises for individuals, and suggests that this difference is more of a continuum 9 than a discrete separation. Therefore, what may be an exercise to an expert can be considered a problem for a novice. In the case of this research, the students were assumed to be novices in the type of problems on the posttest. This presumption was assessed using interview data (see Chapters 4 and 5). In the context of the present study, problems have been defined as questions that require students to find quantitative solutions based on information newly introduced in text form, but not demonstrated beforehand by an instructor or practiced by the student. Research description and hypotheses The present research was comprised of an experimental study using a randomized two-group, posttest only design. It was designed to investigate students? use of elaborative interrogation why-questions and determine whether or not this strategy would improve students? solution of chemistry problems in terms of reading comprehension. In elaborative interrogation, statements are generally taken directly or paraphrased from the text being read, and the reader is asked to explain why the statement is true, as was similarly reported by Smith et al. (2010). In the present study, these statements consisted of paraphrased portions of the worked example calculations presented in the course textbook assigned to students. 
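To illustrate the format with a hypothetical item (not one of the statements actually used in the study materials), a statement paraphrasing a worked-example step, together with its why-question, might read:

Statement: To find the mass of 3.00 mol of carbon, the number of moles is multiplied by the molar mass of carbon: 3.00 mol × 12.01 g/mol = 36.0 g.
Why-question: Why is this statement true?

Answering such a question is intended to prompt the reader to explain, from prior knowledge, why multiplying an amount in moles by the mass of one mole yields the total mass in grams.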
It was hypothesized that by interrogating and encouraging students to focus on key aspects of a problem, the students would more readily activate their relevant prior knowledge related to the presented example, promoting improved comprehension of the chemistry problems. This explanation has been supported experimentally by Martin and Pressley (1991) and theoretically by Levin (2008). For comparison, a placebo-control group was asked to apply the strategy of rereading, which is commonly recommended by teachers (Hedin & Conderman, 2006) and an approach often practiced by good readers (Pressley & Afflerbach, 1995). This research explored the following hypotheses:

1) College chemistry students provided with a reading from their textbook and adjunct elaborative interrogation why-questions will outperform, on a problem solving posttest requiring comprehension, students asked to read the same textbook material twice.

2) Among college chemistry students, there will be a relationship between prior knowledge and performance on a problem solving posttest. Specifically, students with high prior knowledge in chemistry will outperform those with low prior knowledge in chemistry.

3) Among college chemistry students, there will be a relationship between mathematics skills and performance on a problem solving posttest. Specifically, students with high mathematics skills will outperform those with low mathematics skills.

4) Among college chemistry students, there will be a relationship between verbal ability and performance on a problem solving posttest. Specifically, students with high verbal ability will outperform those with low verbal ability.

5) Prior knowledge in chemistry, mathematics skills, verbal ability and the elaborative interrogation why-question treatment will be significant predictors of performance on a problem solving posttest.

Setting

The authentic setting of this research set it apart from most similar published studies using questioning to improve reading comprehension, with the exception of Smith et al. (2010). For example, in this investigation: (a) the subjects were students enrolled in an authentic chemistry course, not individuals recruited from unrelated courses; (b) the experiment was conducted in the subjects' regular classroom setting, rather than as individuals monitored by a researcher; (c) the concepts covered in the reading were taken directly from the textbook used in the course in which the students were enrolled, and included content routinely required in the course, not learning materials from outside sources that may not be relevant to the subjects; (d) the experiment was conducted in an authentic classroom setting rather than a situation where strategy training sessions preceded the experimental reading; (e) the subjects were allowed to complete the reading at their own pace, not within a structured timeframe enforced by the researcher; (f) the subjects were administered a problem solving posttest to assess their ability to solve problems similar to the worked examples in the reading, not simple recall or recognition of facts; and (g) the subjects' prior knowledge was estimated based on a test of background chemistry knowledge along with tests of mathematics skills and verbal ability, not a pretest containing the same or similar questions as the problem solving posttest or a survey of self-perceived prior knowledge.

This research took place in introductory chemistry classes at a community college in the southwestern United States.
This course focused on basic elements of chemistry and provided the foundation required for success in a general chemistry course. The classes met twice a week for 110 minutes. Classes consisted of heterogeneous groups of students. The data collection took place approximately mid-semester, at a point where students had received instruction in chemistry concepts leading up to and including Avogadro?s number and the mole concept, but before instruction on the to-be-learned concept of molar mass and related calculations using molar mass. Research approach In five classes taught by two instructors, students were randomly assigned to one of two groups, an elaborative interrogation why-question treatment or a rereading placebo-control. The experimental text contained a three component molar mass and mole/mass calculations lesson. The learning objectives of this reading included: (a) the ability to determine the molar mass of a compound; (b) the ability to calculate mass in grams of an element from a given number of moles; and, (c) the ability to calculate the number of moles of a compound from a given mass. Before instruction in how to perform molar mass and mole calculations, all students were given a three-part multiple choice test: a chemistry prior knowledge test to assess domain knowledge essential for learning about mole problems, a mathematics skills test to assess ability to solve conversion and ratio problems, and a 48-question vocabulary test taken from the Kit of Reference Tests for Cognitive Factors (French, Ekstrom, & Price, 1963). Once these tests were completed, the treatment group was given a 1,144-word (counted as individual words, individual sets of numbers and chemical formulas) passage from their required course 13 textbook that contained instructional text including worked examples, as is always the case in college chemistry textbooks. These students were asked to respond to elaborative interrogation why-questions embedded within the reading. Three why-questions were linked to each of the three worked examples included in the reading, for a total of nine why-questions. The rereading group was asked to read the same passage containing instructional text and worked examples twice. No special instructions were provided to the elaborative interrogation why-question group on how to respond to these embedded questions. All students were orally directed to read the instructions on their handout and read the passage as if they were preparing for a test. All students were also advised to pay particular attention to the worked example problems. When finished, both groups answered six identical problem solving posttest items consisting of two sets of three quantitative chemistry problems. The first set of problems was copied verbatim from the worked examples in the instructional chemistry text, with only the text quantities or text compounds (determined by the problem type) changed. The second set of problems required the same type of calculations and procedural steps as the worked examples in the instructional text; however, they differed from the worked examples in the wording of the questions, as well as the elements or compounds and quantities given. In addition, six volunteers were recruited for a short interview to obtain data regarding the students? thoughts about the elaborative interrogation why-question study strategy and their perceptions of its usefulness. 
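As a concrete illustration of the kind of calculation these learning objectives and posttest problems targeted, the following sketch uses water and iron as hypothetical examples; it is not reproduced from the instructional passage or the posttest.

\[ M(\mathrm{H_2O}) = 2(1.008\ \mathrm{g/mol}) + 16.00\ \mathrm{g/mol} = 18.02\ \mathrm{g/mol} \]
\[ 2.50\ \mathrm{mol\ Fe} \times 55.85\ \mathrm{g/mol} \approx 140\ \mathrm{g\ Fe} \]
\[ 36.0\ \mathrm{g\ H_2O} \times \frac{1\ \mathrm{mol}}{18.02\ \mathrm{g}} = 2.00\ \mathrm{mol\ H_2O} \]

The first line corresponds to objective (a), determining a molar mass from atomic masses; the second to objective (b), converting moles of an element to grams; and the third to objective (c), converting grams of a compound to moles.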
Summary

Reading complex science text is a necessary but often difficult task for students enrolled in college chemistry courses. Research-based reading comprehension strategies, such as elaborative interrogation, may help such students improve their comprehension and recall of concepts and processes learned while reading chemistry text. Previously published research on elaborative interrogation why-questions focusing on science information, but not quantitative science problems emphasizing worked examples, has found the strategy to be effective and important (Mayer, 2004; Smith et al., 2010). This study is novel in that it was designed to investigate the effects of using elaborative interrogation why-questions while reading a mixture of chemistry text and worked problem solving examples commonly studied by college students and presented in their course textbooks.

Chapter 2: Literature Review

This study investigated the effectiveness of the reading comprehension strategy of elaborative interrogation why-questions compared to rereading when students learned how to solve chemistry problems by reading text that included worked examples. This literature review explores the research in reading comprehension and the strategies used to improve this comprehension, particularly as it concerns reading science text. Emphasis has been placed on elaborative interrogation why-questions and the factors associated with this strategy. Worked example research, especially as it may be related to reading comprehension or the use of elaborative interrogation why-questions, has also been explored.

Reading comprehension

While Smith et al. (2010) have defined text as "meaningful sequences of words printed on a page" (p. 363), comprehension of text is difficult to define, not to mention measure. Snow (2002) has defined reading comprehension as "the processes of simultaneously extracting and constructing meaning through interaction and involvement with written language" (p. 11). Anderson (1972) suggested that reading comprehension requires deeper processing than simple orthographic or phonological decoding, and that reading comprehension, as opposed to simply reading, should include semantic encoding resulting in the generation of "meaningful representation based on the words" (p. 146). Others have gone further to describe comprehension of text as including the generation of inferences and the integration of text with prior knowledge (Kintsch, 1994; Kintsch & van Dijk, 1978; Yore & Shymansky, 1991). Comprehension often takes time and effort, especially with technical expository text on unfamiliar topics such as those found in science textbooks (Graesser, 2007). In fact, Graesser (2007) contends that most adults find technical expository text difficult to comprehend. Moreover, readers often overestimate their understanding of information they have read, believing that they have comprehended what they have read when they have only gained a shallow level of understanding (Baker, 1985; Otero & Kintsch, 1992). Glenberg and Epstein (1985) found a startling disparity between college students' beliefs regarding their comprehension of expository paragraphs and their performance on a comprehension test. Bransford et al. (1982) found that students who successfully learned from reading descriptive text took more active roles in the process of learning, asking themselves questions or remarking on relevant illustrations, while less successful students were not as likely to relate the new information to prior knowledge.
Yore and Shymansky (1991) claim that when reading science prose, comprehension cannot be obtained solely from text itself, but formed when readers develop meaning from the text combined with prior knowledge. Similarly, Prawat (1989) contends that while meaningful learning occurs as a result of interactions between new information and prior knowledge, students often do not make these connections. In fact, Wandersee (1988) found that only six percent of college students surveyed reported making a conscious effort to link new concepts to prior knowledge while reading new information in their textbooks. In the present study, reading comprehension was defined as understanding the meaning of the concepts and problem solving methods described in text that included worked example problems to such a degree that the reader would be capable of solving problems similar to those in the 17 worked examples. Since elaborative interrogation is thought to encourage prior knowledge activation, it was anticipated that participants in this study who answered elaborative interrogation why-questions as they read would achieve higher levels of comprehension than participants who read the same information twice. Most instruction in reading comprehension is done with preschool and primary school students and in this setting the reading is usually narrative rather than expository with the focus on word recognition, not deeper levels of comprehension (Pressley & Wharton-McDonald, 2006). Although middle- and high-school students are expected to read and comprehend text that is more complex and discipline-specific, the high-level skills needed for this task are rarely taught (Shanahan & Shanahan, 2008). By the time students reach the college level, they are required to read more explanatory and technical text that is more cognitively difficult to comprehend (Simpson & Nist, 2002), and while college instructors assume their students complete much of their assigned reading, these students often lack the skills to effectively comprehend this more complex type of text (Caverly et al., 2000). Even if students develop strong reading skills in their early years, these skills may not automatically continue to strengthen throughout their school-years, enabling students to manage complex reading material required in higher-level courses (Perle, 2005). Furthermore, as the content becomes more specialized, reading requires more sophisticated skills and routines (Shanahan & Shanahan, 2008). Science text in particular is known for its density of technical terms, complex mechanisms, as well as relationships and processes that frequently use mathematical language, symbols and formulas (Graesser, Le?n & Otero, 2002). According to Lemke (2004), science texts are not written in natural language alone, but instead a hybrid of mathematics, symbols and 18 other graphic representations embedded in language. These non-textual representations are usually essential to understanding, and meant to be read along with the text (Lemke (2004). These qualities may make reading comprehension of science text uniquely difficult and therefore increase the importance of utilizing reading comprehension strategies. Students often rely on a trial and error approach to reading for comprehension, testing an assortment of strategies when faced with increasingly difficult and specialized reading material (Shanahan & Shanahan, 2008). 
However, they often lack the ability to adjust their strategy when reading across different discipline areas even though the type of reading (and therefore the most advantageous strategy) varies considerably (Shanahan & Shanahan, 2008; Shanahan, 2004).

Reading comprehension strategies

Reading comprehension strategies, as defined by Lysynchuk, Pressley, d'Ailly, Smith and Cake (1989), are "steps or actions that students could take to enhance comprehension" (p. 460). Research suggests that the use of reading comprehension strategies may lead to deeper processing of text, increased integration of text with prior knowledge and improved retention of information in text (Caverly et al., 2000). Cox and Guthrie (2001) found reading strategy use to be a significant predictor of the amount of school-related reading among third- and fifth-grade readers, even after the effects of motivation and previous achievement were statistically controlled. Rereading, underlining, summarizing, questioning and the production or use of diagrams or graphic organizers are a few examples of strategies that have been researched as methods to improve reading comprehension. One of the most commonly used strategies, rereading, has been the subject of a great deal of educational research and was used in the present study as a placebo-control strategy. A recent study on college students' choices of study strategies found that rereading was used by 65% of the students when preparing for an exam, making it the most frequently used strategy (Carrier, 2003). In another study, 84% of college students surveyed reported using rereading as a study strategy and 55% of the students reported that it was the main strategy that they used (Karpicke, Butler & Roediger, 2009). In a survey of college students on preferred methods of study, Wandersee (1988) found a significant positive correlation between the number of times students reportedly read new textbook material and grade-point average. Rawson, Dunlosky, and Thiede (2000) found that rereading not only improved scores on a test over the information in the reading, but also improved metacomprehension accuracy, that is, the correlation between students' ratings of their comprehension of text and their performance on the test. While these and other studies have found that students who reread text outperform students who read a single time (e.g., Dunlosky, Rawson & Hacker, 2002; Rawson & Kintsch, 2005), some have found limitations in this benefit or greater improvements with other strategies (e.g., Callender & McDaniel, 2009; Karpicke et al., 2009). Still, the comprehension strategy of rereading continues to be popular among reading teachers and researchers (Hedin & Conderman, 2006). According to Shanahan (2004), students should be taught strategies that lead them to make connections between information in text and prior knowledge, making appropriate adjustments to prior knowledge if needed. Published research in reading comprehension and professional literature most frequently supports teaching multiple strategies, although it is generally accepted that the best practice is to teach only one or a few strategies at a time (Pressley, 2006). One commonly taught reading comprehension strategy is reciprocal teaching, in which students are interrogated using questioning along with other techniques. This approach is supported by research and has been found to be successful with students across a variety of grade levels (Duschl et al., 2007; Hacker & Tenent, 2002).
Originally described by Palincsar and Brown (1984), reciprocal teaching requires teachers to train students working in small groups to use a combination of four comprehension strategies; predicting, clarifying, questioning, and summarizing. While the questioning strategy is often seen by teachers as one of the most effective and commonly used strategy in reciprocal teaching, many of the studies did not assess the quality of the students? questions (Rosenshine & Meister, 1994). One study found that the questions generated by the students using reciprocal teaching were often quite basic in nature and did not require the students to make inferences that might tie the new information to prior knowledge (Hacker & Tenent, 2002). Also, some aspects of reciprocal teaching, such as incomplete or inadequate use of all strategies and the amount of time required for strategy instruction and scaffolding, present considerable obstacles for teachers (Hacker & Tenent, 2002; Pressley & El-Dinary, 1997). The topic of instructional time required for training students in the use of strategies is not a trivial one; perhaps especially in content area courses at the college level. Wingate (2006) asserted that while a content area classroom embedded approach to teaching reading comprehension strategies may be more effective than a separate study skills class, academic staff at post-secondary institutions may be reluctant to accept this responsibility, perhaps due to class time restraints or lack of expertise in reading comprehension strategy instruction (Smith et al.,2010). Furthermore, McKeown, Beck, & 21 Blake (2009) claimed that the manner in which strategies should be taught to students is not readily apparent in the existing research, perhaps making effective strategy instruction difficult to implement in content area classrooms. On the other hand, the study conducted by McKeown et al. (2009) found no benefit to training students in reading comprehension strategies compared with teacher provided strategies such as questions related to the content of the reading. Therefore, teacher or textbook provided questioning strategies, if carefully designed, may have the advantage of being effective while consuming less, if any student training time. The elaborative interrogation why- questioning strategy, a type of questioning strategy that does not require student training, may be an option that would appeal to college instructors who would like their students to use a reading comprehension strategy without sacrificing significant instruction time. Questioning as a reading comprehension strategy Answering questions during the process of reading has long been seen as a strategy to enhance learning through reading. Thorndike (1917) asserted that reading should not be considered ?as a mechanical, passive, undiscriminating task? (p. 332) and suggested that students practice silent reading while answering given questions, summarizing, or listing answers to possible questions provided in the text rather than the common practice of reading aloud in class. When students question the meaning of text, they usually elaborate on the message of the text by retrieving and activating prior knowledge (Bransford et al., 1982; Levin, 2008). In this way, students are using cognitive processes that encourage associations between their existing knowledge and new information found in the text (Callender & McDaniel, 2007; Pressley, 2006; Surber & Schroeder, 2007; Woloshyn, Pressley, & Schneider, 1992). 
This process can be compared 22 to assimilation and accommodation as defined by Piaget (1983) or Ausubel?s subsumption as described by Novak (1980). This level of processing has been shown to result in improved scores on achievement tests that measure comprehension (Anderson, 1972). Rothkopf (1982) suggested that adjunct questions interspersed within reading material can aid in attention retention and therefore promote learning from expository text. In a review of adjunct question research, Rickards (1979) claims that readers using adjunct questions not only retain more of the material specifically questioned (direct effect) than a read-only control group, but also recall more material that was not included in the questions (indirect effect) when questions are presented after the passage is read. A recent investigation of study strategies used while reading passages of text compared students who read a passage twice, students who read the passage and answered adjunct questions and students who generated their own questions and answers while reading the passage (Weinstein, McDermott & Roediger, 2010). Both question groups significantly outperformed the rereading group on a comprehension posttest, but the two question groups did not differ significantly. However, the question generating group took ?at least twice as long answering questions? as the students who were provided adjunct questions (p. 308). This led the researchers to conclude that ?although it [question generating and answering] is a viable alternative to answering questions in the absence of materials, it is less time efficient? (p. 308). In a study with high school chemistry students, researchers found that students who answered questions after reading an essay on nuclear chemistry recalled significantly more information compared to students who read statements regarding the information after reading the essay, and students who simply read the essay in 23 preparation for a test (Pedersen, Bonnstetter, Corkill & Glover, 1988). More specifically, these researchers found that students who were asked to decide and explain ?why or why not? when asked questions over the essay recalled significantly more information than students who answered more traditional questions and those in the non-question groups. Elaborative interrogation is one type of adjunct questioning strategy that has emerged over the last two decades and instructs students to explain why statements about information found in text are true. Elaborative interrogation Strategies using elaborations have shown promise in a variety of learning environments and tasks (Levin, 1988). Levin (2008) has defined elaboration in the context of learning as ?adding (or creating) meaning and mediators to make learning materials more memorable? (p. 72). Elaboration theory predicts that additional information (elaborations) associated with to-be-learned material makes the material more memorable (Levin, 2008). Based on this theory, elaborative interrogation is a reading comprehension strategy that prompts readers to answer adjunct why-questions that are intended to elicit elaborations and inferences as they read new material (Callender & McDaniel, 2007; Woloshyn et al., 1992). The why-questions of elaborative interrogation are designed to enhance the reader?s attention and encourage more active participation with the information in the reading (Levin 2008). 
As the student reads, they encounter a statement adjunct to the text, taken directly or paraphrased from the reading, followed by a why-question asking the student to explain why the statement is true or makes sense. The student is then expected to produce a reasonable response (Levin, 2008). Early research in elaborative interrogation focused mainly on young learners trying to remember facts found in lists of sentences (Pressley, 2006). During the late 1980s and 1990s, the research progressed beyond testing for simple recall of facts after answering elaborative interrogation why-questions to include comparisons with other factors. For example, Pressley, McDaniel, Turnure, Wood and Ahmad (1987) explored the differences in learning between college students who were asked to answer elaborative interrogation why-questions about factual information in a list of 24 sentences, students who were provided precise elaborations along with the sentences, and students who were given the sentences only. To illustrate, the students in the sentences-only group were provided with a sentence that read "The strong man carried a shovel." The students in the elaborative interrogation why-question group were presented with the same sentence followed by the question "Why did that particular man do that?" and the students in the precise-elaboration group were presented with the sentence "The strong man carried a shovel to dig heavy rocks." The students who generated their own elaborations in response to the why-questions overwhelmingly outperformed the other groups on a test of recall of information found in the sentences. Wood, Pressley and Winne (1990) performed a related experiment testing recall of facts found in sentences with children in grades four through eight. Similarly, these researchers found increased recall in students who generated their own elaborations compared to students who were provided precise elaborations. Furthermore, they found that the magnitude of the improvement increased with age, which may support the view that prior knowledge plays a role in the effectiveness of elaborative interrogation why-questions. Pressley et al. (1987) also compared incidental learning (when students were not told that they would be tested after the study session) with intentional learning (when students were told that they would be tested). These researchers found that the students who read sentences and answered elaborative interrogation why-questions demonstrated greater incidental and intentional learning than students who read the sentences and were provided with precise elaborations, as well as students who simply read the sentences. In a similar experiment, Pressley, Symons, McDaniel, Snyder and Turnure (1988) evaluated the differences in learning between college students who read sentences and answered elaborative interrogation why-questions, students who were asked to create a mental image (a different form of elaboration) for each sentence, and students who were told to read the sentences aloud and make sure they understood the content. The results of this study indicated that both elaborative conditions (why-questions and imagery) produced more learning than the read-only condition, but there were no significant differences between the why-question and imagery conditions. Another interesting finding from this investigation was that the benefit of elaborative interrogation why-questions was observed regardless of the quality of the students' responses to the why-questions.
A more recent study comparing elaborative interrogation why-questions with imagery, (Willoughby, Wood, Desmarais, Sims, & Kalra, 1997) compared the effects of these elaborative strategies when college students learn facts about familiar and unfamiliar animals. Results indicated that students who used elaborative interrogation why-questions and students who used imagery had similar recall ability for facts about familiar animals, but students who used imagery had superior recall of facts about unfamiliar animals. The researchers postulated that imagery provided additional benefits for unfamiliar facts ?because it encourages the construction of the basic association between the fact and its referent (relations) as well as more attention to distinctions as a result of creating unique 26 images for each animal? (p. 684), while elaborative interrogation why-questions require students to attempt to create inferences and expand on information, which is difficult when limited background knowledge is available. In other words, the researchers proposed that without reasonable prior knowledge, the students were unable to make the associations required for elaborative interrogation why-questions to be effective (Willoughby et al., 1997). To further investigate the role of prior knowledge on the effectiveness of elaborative interrogation why-questions, Martin and Pressley (1991) explored the variations on the type of questions asked and, consequently, the type of prior knowledge activated. In this experiment Canadian college students were asked to read sentences containing facts about Canadian provinces. The questions asked the students to either confirm why the facts are reasonable or explain why the facts were unexpected. In addition, some students were asked to answer the questions based on information about a specific province only or in terms of what they knew about other provinces only. The intention was to create four different elaborative interrogation why-question types which were designed to ?activate different aspects of prior knowledge related to the facts, with the assumption that some prior knowledge would mediate memory of the fact as stated and other prior knowledge would interfere with the fact as stated? (p. 118). Students were randomly assigned to one of four groups based on these question types: confirm-specific, confirm-other, unexpected-specific and unexpected other. They found differing recall abilities among the four groups and concluded that the type of elaborative interrogation why-question asked was a ?critical determinant of memory? (p.118). 27 Two studies published in 1990 expanded on the research in elaborative interrogation by investigating the effects of the strategy when reading factual paragraphs rather than groups of unrelated sentences as in the earlier studies. Both these investigations can be compared with two previously described reports (Pressley et al., 1987; Pressley et al., 1988). Woloshyn, Willoughby, Wood and Pressley (1990) evaluated college students who were instructed to: (a) simply read the paragraphs for understanding (control), (b) read the paragraph creating a mental image for each sentence, or (c) read the paragraph answering why-questions for each sentence. Both elaborative strategies, imagery and why-questions, facilitated learning significantly when compared to the control for both intentional and incidental learning. However, there was no significant difference between the why-questions and imagery treatment groups. 
Similarly, Wood et al., (1990), investigated the effects of various elaboration strategies. They compared provided elaborations, imagery generation, imagery generation plus provided elaborations, and why-question treatments to a read-only (base condition) and a no exposure control with students in grades four through eight. While all treatment conditions facilitated learning of facts, pairwise comparisons between the different treatment groups revealed the only significant increases in performance to be between the elaborative interrogation why-question treatment over both the elaborations-provided and read-only conditions. Based on their findings, they concluded that when attempting to learn unelaborated facts, students apparently do not automatically consider why the facts are true or why relationships mentioned with the facts are important to the degree that students do when they are asked to respond to elaborative interrogation why-questions. Additionally, evaluation of the quality of the elaborative interrogation treatment group?s 28 responses to the why-questions revealed that ?objectively correct explanatory responses were associated with striking recall advantages over explanation that were not correct? (p. 746). This finding is in contrast to some other elaborative interrogation research where the quality of why-question responses was evaluated (e.g., Martin & Pressley, 1991; Pressley et al., 1988; Siefert, 1993). The length and style of the reading material used in elaborative interrogation why- question research was further expanded by Seifert (1993) who used longer, more naturalistic, prose-like passages that asked one why-question for each paragraph rather than every sentence as in previous studies. In this investigation, variations on elaboration strategies were compared to a group who was asked to read and underline important information. The results of this study indicated that students who generated elaborations were more likely to remember elaborated facts read in prose than students who used an underlining strategy, and that the students who answered elaborative interrogation why- questions were able to remember more detail. This researcher speculated that the effect of elaborative interrogation why-questions may have been enhanced by the technique of asking the why-question at the end of the paragraph, thus postponing the focus on the facts until after the details had been processed. However, the results of this study did not suggest a correlation between quality of the elaborative interrogation why-question responses and amount of learning achieved, and did not find evidence of the use of inferencing. The style of reading material used with students was extended in research designed to investigate the influence of elaborative interrogation why-questions combined with analogy in science text (McDaniel & Donnelly 1996). In this 29 investigation, the students were provided 12 paragraphs over unrelated science topics with each paragraph followed by one why-question. Although the readings were brief (approximately 60 words per paragraph), they were more typical of the type of prose used in textbooks than those used in previous elaborative interrogation why-question studies. After a short distraction activity, the students were administered a test with factual and inference questions over the concepts described in the paragraphs. 
The results indicated that not only did the students using elaborative interrogation why-questions benefit more than the students in the control conditions with factual learning, but inferential learning as well. Two recent investigations of the effects of elaborative interrogation explored college students reading longer, expository science text. Ozgungor and Guthrie (2004) tested elaborative interrogation why-questions with college students reading a 1,481- word article from a popular scientific magazine. They found that even with this much longer text, elaborative interrogation why-questions significantly facilitated recall, inference formation, and development of coherent mental representation over a rereading control group. Smith et al. (2010) reported using an authentic reading passage with students enrolled in a college biology course. In this investigation, a 3,212-word passage was photocopied from the textbook used in the biology course in which the students were enrolled. Furthermore, the procedures were completed in an authentic classroom setting, making it more relevant to instruction than the clinical setting typical of previous elaborative interrogation research. The results of their investigation indicated that the elaborative interrogation why-questions treatment resulted in increased learning over the rereading placebo-control, even after prior knowledge and verbal ability (which, along 30 with strategy, were found to be significant predictors of posttest score) were statistically controlled. Elaborative interrogation research has also progressed to include variables associated with students? prior knowledge by using pretests or other methods to estimate content-related background knowledge and verbal ability (e.g., Ozgungor & Guthrie, 2004; Smith et al., 2010). Role of prior knowledge in elaborative interrogation In a review of research on prior knowledge, Dochy, Segers and Buehl (1999) concluded that a clear relationship exists between prior knowledge and educational performance. Research has shown that prior knowledge is positively correlated with a reader?s text comprehension (Ozuru, Dempsey, & McNamara, 2009). Students with similar reading comprehension abilities, but different levels of prior knowledge are expected to differ in their learning after reading the same text (Johnston, 1984). Based on a study of science knowledge and reading skills, O?Reilly and McNamara (2007) suggested that both reading skill and prior science knowledge are important factors in students? success in their science courses. The general postulation made by researchers is that prior knowledge plays an important role in the effectiveness of elaborative interrogation why-questions. Willoughby and her colleagues (1994) found that while the effects of elaborative interrogation why-questions are dependent on the readers? relevant prior knowledge, students must activate their prior knowledge in order to reap the benefits. Research has shown that elaborative interrogation why-questions can be an effective reading comprehension strategy, especially when the students? prior knowledge is high (e.g., Smith et al., 2010; Willoughby et al., 1997; Woloshyn et al., 1992; 31 Woloshyn, Wood & Willoughby, 1994). 
Martin and Pressley (1991) found that students who were given elaborative interrogation why-questions performed better on a posttest when the why-questions forced attention to prior knowledge, providing further evidence that elaborative interrogation why-questions are more effective in students with higher levels of relevant prior knowledge and suggesting that carefully designed why-questions may lead to activation of that prior knowledge. Worked examples Worked examples (sometimes referred to as worked-out examples or simply examples) have been defined as ?instructional devices that provide an expert?s problem solution for a learner to study and emulate? (Atkinson et al., 2000, pp. 181-182) and have been the subject of a considerable amount of research, especially within the domains of mathematics, physics and computer programming (Atkinson et al., 2000). Research has shown worked examples to be effective and valued by students, especially when the concepts are unfamiliar (Atkinson & Renkl, 2007). Ideally, worked examples provide learners with a basic structure that helps them understand how the problem is solved without providing a script or algorithm (Atkinson et al., 2000), thus providing the student with an example of a problem solving approach without suggesting a list of steps to memorize. This could provide an improved method for teaching and learning since students often fall back on memorization when studying for tests. One recent study found that many beginning physics students do not rely on strategic or scientific approaches to problem solving, but instead rely on finding or recalling a formula, often an inappropriate one, and then simply plug-in numbers to find an answer (Walsh, Howard, & Bowe, 2007). 32 Research in worked examples has found that studying a worked example paired with a practice problem resulted in improved problem solving ability compared to the more traditional practice of studying the worked example alone (Cooper & Sweller, 1987; Sweller & Cooper, 1985). However, this advantage has been shown to dissipate as the learners? domain knowledge and level of expertise increases (Kalyuga, Chandler, Tuovinen, & Sweller, 2001). Worked examples are often found in mathematics, physics and chemistry textbooks to assist students as they read about principles that lead to calculations. While a search of the literature found no reference to the use of elaborative interrogation why- questions with worked examples, some investigations have been conducted that could relate worked examples to this type of comprehension strategy. For example, several studies have been conducted that included self-explaining along with worked examples. Self-explaining in the framework of cognitive science research has been defined as ?the activity of generating explanations to oneself, usually in the context of learning from an expository text? (Chi & Glaser, 2000, p. 165). In a study of self-explanations with worked examples and subsequent problem solving, the students who produced more explanations were found to be more successful problem solvers (Chi, Bassok, Lewis, Reimann & Glaser, 1989). Furthermore, Renkl (1997) found that the quality of the students? self-explanations was important to success when learning from worked examples. However, Atkinson and Renkl (2007) claims that self-explanations are typically superficial in nature. Self-explanations, as opposed to elaborations prompted by why-questions, are generally spoken aloud and require some student training. 
While both may be considered forms of elaboration, there are some important differences. 33 Specifically, elaborative interrogation?s why-question strategy requires no student training and typically results in written elaborations that can be evaluated by a researcher or instructor. Quantitative chemistry problem solving A substantial part of chemistry consists of the study of the mathematical relationships between variables, but many college students have difficulties with the concepts, skills and strategies required for dealing with these relationships (Selvaratnam & Kumarasinghe, 1991). According to the National Research Council [NRC] (2003), four themes that characterize the work of modern chemists are analysis, synthesis, transformation and modeling; all of which require skills in problem solving and quantitative calculations. A recent study on questions and problems in general chemistry textbooks found that approximately 30% of end-of-chapter question could be classified as quantitative application questions (Davila & Talanquer, 2010), yet the teaching of the skills required for the quantitative study of chemistry is often neglected (Selvaratnam & Kumarasinghe, 1991). As a result, many students rely on memorization of equations or algorithms without understanding these relationships. However, students who memorize equations or algorithms are often unable to solve problems unless they are aware of the relationship between mathematics and the associated chemistry concepts (Gabel & Sherwood, 1984; Herron and Greenbowe, 1986). Prawat (1989) claims that procedural mathematical knowledge is ?extremely limited unless it is connected to a conceptual knowledge base? (p. 10). Therefore, strategies that connect equations used in problem solving to relevant chemistry concepts may prove advantageous. 34 The particular concepts to be learned by the students in the present study were molar mass of atoms and compounds and the relationships between molar mass, number of moles and mass. Molar mass, the mass of one mole of a substance, not only provides a bridge between mass and number of particles of a substance, but also between sub- microscopic and macroscopic quantities (DeMeo, 2006; Dori & Hameiri, 2003). It allows chemists to translate between chemical formulas, which represent the atomic composition, and measurable mass quantities. This key concept is used by students and chemists at every level (DeMeo, 2006). According to Dierks, Weninger & Herron (1985), teachers of introductory chemistry find the mole concept to be the most challenging concept for their students and that the lack of mathematical skills of their students is a major concern. A survey of approximately 300 college chemistry instructors found moles and molar mass to be second in importance only to basic skills in a list of topics they viewed as important for students to master in high school, followed by the factor-label method which is commonly used in molar mass calculations (Deters, 2003). Staver and Lumpe (1995) found that in order for students to solve molar mass problems, they must have sufficient understanding of the concepts and not simply rely on memorized algorithms or rules. Since elaborative interrogation why-questions are thought to improve students? comprehension as they read, its utilization when reading about these concepts and the worked examples that apply them may result in improved problem solving ability on a posttest. 
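The quantitative relationships at issue here can be stated compactly. The following summary of the standard mole and molar mass relationships is provided for reference; it is a conventional formulation, not an excerpt from the study materials:

```latex
% Standard mole/molar mass relationships (conventional summary, not from the study materials).
% M = molar mass (g/mol), m = mass (g), n = amount of substance (mol),
% nu_i = subscript of element i in the chemical formula, M_i = atomic molar mass of element i.
\begin{align*}
  M_{\text{compound}} &= \sum_i \nu_i M_i \\
  n &= \frac{m}{M} \qquad \text{(grams to moles)} \\
  m &= n \, M      \qquad \text{(moles to grams)}
\end{align*}
```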
Community college characteristics The current research investigated the effects of the elaborative interrogation why- question strategy with community college students reading from their course textbooks 35 about molar mass and related problem solving. The study population was the specific target population for this research. Public two-year institutions, often referred to as community colleges, serve over 40% of the undergraduate students in the United States (Horn & Nevill, 2006). Community colleges in the United States originated in the early twentieth century to prepare trained workers for industrial expansion and to provide a pathway to social equity and upward mobility which were often associated with higher education (Cohen and Brawer, 2009). Currently, community college functions typically include ?academic transfer preparation, vocational-technical education, continuing education, developmental education and community service? (Cohen & Brawer, 2009, p. 20). According to Boggs (2006), the mission of most community colleges is to offer open and affordable access to postsecondary education while providing services that benefit both the individual student and the community as a whole. Community colleges generally offer lower tuition rates and open enrollment policies and thus provide access to populations who are often underserved in four-year institutions. According to Horn and Nevill (2006), ?community college students are more likely to be older, female, Black or Hispanic, and from low-income families? compared to students in four-year colleges (p. 9). Also, more students enrolled in community college are financially independent from their parents, work full time, and take one or more distance learning courses than students in four-year institutions (Horn & Nevill, 2006). Class sizes are typically smaller in community colleges than in other post- secondary institutions with approximately 65% of classes consisting of between ten and 29 students (Cohen & Brawer, 2009). As a result, a higher percentage of institutional 36 expenses are allocated to instruction at community colleges than at four-year institutions; 39% compared to 26%, respectively (Knapp, Kelly-Reid & Ginder, 2010). Summary Reading comprehension and strategies used to increase learning through reading have been researched over several decades. Questioning strategies have shown to be successful, and more specifically, the use of elaborative interrogation why-questions have produced favorable results in a variety of settings. Specifically, elaborative interrogation investigations have explored the effects of such variables as age of students, student learning goals, types of elaborations, types of questions asked during reading, and the quality of student responses to why-questions. Additionally, this strategy may be favored in some situations such as college level classes because it does not require student training and would therefore not consume additional instructional time. Although the reasons for the benefits have not been definitely established, research suggests that the amount of prior knowledge and its activation play key roles in the effectiveness of this strategy. Outside the realm of reading comprehension research, solving quantitative chemistry problems, in particular problems including molar mass have been shown to be challenging yet essential for chemists and chemistry students. Research indicates that studying worked examples may improve students? performance when solving quantitative problems. 
However, the use of reading strategies such as elaborative interrogation why-questions along with worked examples has not been researched. The current study was designed to explore the effects of elaborative interrogation why-questions adjunct to chemistry text containing worked examples on subsequent quantitative chemistry problem solving ability.

Chapter 3: Method

Introduction

This study used elaborative interrogation why-questions embedded within instructional material from an authentic textbook used in an introductory college chemistry course. Specifically, a series of statements, each followed by the question "why is this true?", was presented at appropriate intervals during an assigned reading. The theoretical and empirical basis for using elaborative interrogation why-questions is the idea that students are asked (as in interrogation) to write (as in elaborative) why a statement taken from a text is true. The reading in this study was copied verbatim from the students' chemistry textbook, which, as is customary, contains worked example problems. The textbook was required for the course in which the research subjects were enrolled. Based on previous elaborative interrogation research, it was predicted that students in an elaborative interrogation why-question treatment group would outperform a rereading placebo-control group on a posttest assessing the students' ability to solve quantitative chemistry problems similar to the worked examples appearing in the chemistry textbook. In addition, it was anticipated that students within the elaborative interrogation why-question treatment group who produced high quality responses to the why-questions would score higher on the problem solving posttest requiring comprehension than students whose responses were less adequate or inappropriate. Furthermore, it was predicted that the posttest scores of students with high chemistry prior knowledge, mathematics skill and verbal ability would surpass those of students who were deficient in these attributes.

Research design

A purpose of this research was to examine the effectiveness of using elaborative interrogation why-question enhanced worked examples as a strategy for improving problem solving skills in chemistry. Specifically, this strategy was tested against a commonly used strategy (rereading), which acted as a placebo-control. The effectiveness of the strategy was determined by comparing the problem solving posttest performance (dependent variable) of students in two randomly assigned groups: an experimental group and a placebo-control group. The experimental group read a passage from the textbook required for the course and answered elaborative interrogation why-questions regarding worked examples that were included within the reading. The placebo-control group read the same passage and was asked to employ the commonly used strategy of rereading. A pilot experiment to determine the feasibility of the present study was performed with three Basics of Chemistry classes at the same community college during the spring semesters of 2009 and 2010, using the same procedures and similar materials to those in the current study. The results suggested that the elaborative interrogation why-question treatment increased comprehension, leading to improved ability to solve problems similar to the worked examples contained in the reading, compared to the rereading placebo-control. See Appendix H for further discussion of the pilot study along with statistical analyses.
A randomized two-group, posttest only design was selected because it minimizes threats to validity that often occur as a result of the pretesting sometimes used in other experimental designs (Campbell & Stanley, 1966; Surber & Schroeder, 2007; Willson & Putnam, 1982). For example, it has been suggested that pretesting over specific to-be- 40 learned material may alert the reader to which details are important while reading (Surber & Schroeder, 2007) and therefore influence the results of research focused on activating prior knowledge. In lieu of a typical pretest which would be identical or very similar to the posttest (Campbell & Stanley, 1966), chemistry prior knowledge, mathematics skills and verbal ability tests were administered prior to the assigned instructional reading to provide an estimate of students? prior knowledge (see Appendix A). The chemistry prior knowledge test was designed to assess the students? chemistry domain knowledge, but not their knowledge of the topics included in the reading. More specifically, the test was used to determine the students? ability to use the periodic table, identify the atomic mass of an element on the periodic table, and interpret a chemical formula?all skills related to the to-be-learned material, but not concepts directly explained in the study reading. The mathematics skills test was included to determine the students? ability to manipulate algebraic variables and solve ratio problems. A general vocabulary test was administered as well to assess the subjects? verbal ability. Scores on these three tests were also used to provide assurance of homogeneity of prior knowledge across groups. Participants and setting The research took place at a community college in the southwestern United States. This college serves its diverse population by offering a wide variety of programs leading to certifications or associate degrees, as well as academic courses that are designed to transfer to four-year colleges and universities. Participants (N = 74) were registered students in an undergraduate introductory chemistry course within the college?s School of Adult and General Education unit. The course focused on basic chemical principles and applied mathematics and was designed 41 to serve students with limited chemistry background or those who feel unprepared to enroll in a general chemistry course. Five sections of this course, taught by two instructors who volunteered to provide class time and access to students, were selected for data collection. Students enrolled in these sections were asked to volunteer to participate during a scheduled 110-minute class session, which provided an adequate timeframe for the students to complete the initial tests, instructional materials and problem solving posttest. Any student not wishing to participate was given an alternative, class-related activity during the research (see Appendix D). There were no apparent benefits or penalties provided by the researcher or course instructor for either choosing to participate or not participating. The only prerequisite for selection for participation was being a registered student in the course. Other than the Institutional Review Board approved consent form (see Appendix G) specification that participants be 18 years of age or older, there were no requirements based on other characteristics, such as age, sex, race, ethnic origin, religion or social or economic status. The study population was the specific target population for this research. 
Of the 79 students present on the day that data were collected in their class, 74 participated in the study. Three students were under 18 years of age and therefore did not meet the age requirement on the consent form (see Appendix G), and two chose to discontinue participation before completing the problem solving posttest. Additionally, six volunteers representing a variety of posttest performances were recruited from the elaborative interrogation why-question treatment group to participate in oral interviews pertaining to the students' responses to the why-questions and their opinions on the value of the elaborative interrogation why-question strategy. See Appendix F for the interview questions.

Materials

The materials used in this study were all taken directly from, or designed around, the content in one section of the required course textbook (Timberlake, 2009). Materials used before the students were given the instructional reading included a multiple choice chemistry knowledge test used to estimate the students' prior knowledge, a multiple choice mathematics skills test, and a multiple choice vocabulary test to assess verbal ability (see Appendix A). The instructional reading consisted of a 1,144-word (counted as individual words, individual sets of numbers and chemical formulas) passage copied verbatim from the textbook, including all color coding used in the textbook (see Appendices B and C). Inspection of the literature showed that the selected text was long compared to the passages used in most previously published elaborative interrogation why-question research, with the exceptions of Ozgungor and Guthrie (2004) and Smith et al. (2010). For the treatment group, this reading was enhanced with nine elaborative interrogation why-questions. Finally, a six-question problem solving posttest was used to assess comprehension of the reading (see Appendix E). In this research, a problem was defined as a question that required the students to determine a problem solving method and calculate a numerical solution. With the exception of the textbook passage and the vocabulary test, all materials were developed by the researcher, who is an experienced high school and college chemistry teacher and holds a graduate degree in chemistry. The researcher-designed materials were reviewed by two college chemistry instructors, who verified that they were appropriate for the level of the students in the study and addressed the concepts and content covered in the reading. Also, these materials had been tested previously during a pilot study and required only a few minor modifications.

The chemistry prior knowledge and mathematics skills tests were used to assess the participants' knowledge and skills related to, but not including, the content of the instructional reading and the calculations demonstrated in the worked examples. Additionally, scores on these tests were used as an estimate of the students' relevant prior knowledge. More specifically, these tests evaluated the participants' ability to use the periodic table to access atomic masses of elements, interpret a chemical formula regarding the number of atoms of each element represented in the formula, mathematically manipulate variables, and solve basic conversion or ratio problems. The chemistry prior knowledge test included six multiple choice chemistry content questions over the mole, chemical formulas and the periodic table.
The mathematics skills test included four multiple choice mathematics skills questions: one question that tested the participants? ability to manipulate variables and three non-chemistry conversion problems. The verbal ability test consisted of a 48-question, multiple choice vocabulary test taken from the Kit of Reference Tests for Cognitive Factors (French, et al., 1963). This verbal ability test has been used in previous educational research and has consistently correlated with achievement and provided valid predictors of science comprehension (Holliday, Brunner & Donais, 1977; Holliday, et al., 1984; Smith et al., 2010). The selected textbook passage used as the instructional reading for this study explained principles relating the mole concept to atomic mass values on the periodic table and the relationship between mole quantities and masses of elements and compounds. This topic was selected because it has been characterized as a fundamental yet challenging concept (DeMeo, 2006; Deters, 2003; Dierks, Weninger & Herron, 1985), 44 and included principles that can be expressed mathematically. Specifically, this chemistry textbook passage defined molar mass of elements and compounds, explained how to compute molar mass, described how molar mass is used in calculations to convert between moles and mass of elements and compounds, and provided three worked examples illustrating solution methods for these types of problems. Such instruction is virtually universal in college level chemistry courses. Each of these worked examples presented a problem statement, followed by a detailed, step-by-step solution process. The same textbook passage was presented to all students. However, the treatment group?s instructional materials also included elaborative interrogation why-questions based on statements regarding the worked examples. The treatment group was instructed to answer these elaborative interrogation why-questions in a space provided after each question. The instructional materials for the rereading group were identical to that for the treatment group except for the lack of why-questions. The rereading group was instructed to read the text a second time after they completed the first reading. There were nine elaborative interrogation why-questions contained in the treatment group?s textbook passage, three for each of the three worked examples. These why-questions were designed to encourage students to explain why a statement about a particular portion of each worked example was true, theoretically resulting in activation of their prior knowledge in the learning process. Martin and Pressley (1991) found increased learning when elaborative interrogation why-questions embedded in the reading forced attention to prior knowledge; therefore, the questions for this study were written with the intention of stimulating the activation of prior knowledge. For instance, immediately following one of the worked examples (see Figure 1), the treatment group 45 students were given the statement ?Step 3 of worked example 2 states that 1 mole of Ag = 107.9 g Ag.? followed by the elaborative interrogation why-question ?Why is this true?? Similarly, another elaborative interrogation why-question following this worked example (see Figure 1) states ?In step 4 of worked example 2, the correct form of the conversion factor to use is: . Why is this true?? The problem solving posttest consisted of six quantitative chemistry problems which were related directly to the instructional materials. 
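The conversion factor referenced in the second why-question above appears as an image in the original instructional materials and does not reproduce in this passage. For illustration only, the form consistent with worked example 2 (converting moles of silver to grams) places grams in the numerator, and applying it to the 0.750 mole of silver in the problem gives approximately 80.9 g:

```latex
% Reconstructed for illustration; the conversion factor itself is an image in the original materials.
\[
  \frac{107.9\ \text{g Ag}}{1\ \text{mole Ag}}
  \qquad\Longrightarrow\qquad
  0.750\ \text{mole Ag} \times \frac{107.9\ \text{g Ag}}{1\ \text{mole Ag}} \approx 80.9\ \text{g Ag}
\]
```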
Figure 1. Worked example 2 from the students' instructional material, reprinted with permission from Pearson Education, Inc., from: Timberlake, K. C. (2009). Chemistry: An introduction to general, organic, and biological chemistry (10th ed.), pp. 169-170. Upper Saddle River, NJ.

Worked example 2 (Sample Problem 5.4): Converting Moles of an Element to Grams: Silver metal is used in the manufacture of tableware, mirrors, jewelry, and dental alloys. If the design for a piece of jewelry requires a 0.750 mole silver, how many grams of silver are needed?
Solution:
Step 1 Given: 0.750 mole Ag; Needed: grams of Ag
Step 2 Plan: moles of Ag → grams of Ag (using the molar mass conversion factor)
Step 3 Equalities/Conversion Factors: 1 mole Ag = 107.9 g Ag and the corresponding conversion factors (107.9 g Ag / 1 mole Ag and 1 mole Ag / 107.9 g Ag)
Step 4 Set Up Problem: Calculate the grams of silver using the molar mass.

This precise methodology was considered important at this early stage of applying elaborative interrogation why-question research to a chemistry textbook used by college students. In other words, each posttest problem required applying an equation presented in the text to find a precise numerical solution, as described by Taasoobshirazi and Glynn (2009). These six problems may be regarded as two sets of three problems based on the degree to which they varied from the worked examples. While solving these posttest problems theoretically required the same solution steps and calculations as their corresponding worked example problems, one set of posttest problem statements perhaps was more easily performed by the students than the other. According to Mayer and Wittrock (1996), when experience with a problem influences the ability to solve a new problem, problem solving transfer has occurred. More specifically, the ability to solve problems with similar structure but with differing surface features may indicate a student's ability to transfer previously learned information when asked to solve problems (Jonassen, 2003; Reed & Bolstad, 1991; Renkl, Atkinson, Maier & Staley, 2002). In the present study, the ability to solve the posttest problems that differed from the worked examples in surface structures, such as the wording of the problem statement or the quantities and compounds provided in the problem statement, was considered to be evidence of problem solving transfer. The two sets of posttest questions were based on two levels of problem solving transfer (lower level transfer and higher level transfer). Both transfer level problem sets contained one problem that could be compared to each of the three types of problems illustrated by the worked examples in the instructional reading (calculation of molar mass, calculation of mass from moles, and calculation of moles from mass). The lower level transfer set of problems was copied verbatim from the problem statements in the worked examples in the reading, but varied in the compounds or quantities (depending on the problem type) provided in the problem statements. The higher level transfer set of problems was worded differently and presented different substances and given quantities in comparison to the worked examples. In other words, there were two posttest problems based on each worked example (one in each transfer level set) that differed in the amount of variation from the worked example and therefore in the degree of problem solving transfer required of the students. For instance, the problem statement in the first worked example read: "Find the molar mass of Li2CO3 used to produce red color in fireworks."
The posttest problem in the first set relating to this worked example read: ?Find the molar mass of Na2SO4 used to produce color in fireworks.? The posttest problem in the second set that related to this worked example read: ?Calculate the molar mass of Ba3(PO4)2.? The lower level transfer molar mass posttest problem was identical to the worked example problem, except that it presented a different compound (Na2SO4 in place of Li2CO3). The higher level transfer molar mass posttest problem statement was worded differently and presented a more complex compound, thus requiring a higher level of transfer. The second worked example problem statement as located originally in the textbook read: ?Silver metal is used in the manufacture of tableware, mirrors, jewelry, and dental alloys. If the design for a piece of jewelry requires a 0.750 mole silver, how many grams of silver are needed?? The lower level transfer posttest problem relating to this worked example was verbatim to the second worked example except that the quantity of silver was changed to 1.75 mole. The higher level transfer posttest problem related to the second worked example read: ?Find the mass in grams of 1.25 moles of iron (Fe).? 48 Finally, the third worked example read: ?A box of salt contains 737 g NaCl. How many moles of NaCl are present in the box?? The lower level transfer posttest problem related to the third worked example was identical to the third worked example except that the quantity of NaCl was changed to 325 g. The higher level transfer posttest problem related to the third worked example read: ?How many moles of silver nitrate (AgNO3) are present in 225 grams of silver nitrate?? Furthermore, the posttest problem pairs were not presented sequentially as is often done with end-of-chapter problems in textbooks, to prevent leading the students to a solution process. The lower level transfer set of posttest problems was presented first, followed by the higher level transfer set of posttest problems. The lower level transfer posttest problems were presented in the same sequence as the worked examples in the instructional reading (calculating molar mass of a compound, followed by converting moles to grams, followed by converting grams to moles). The higher level transfer set of posttest problems was presented in random order (converting moles to grams, followed by calculating molar mass of a compound, followed by converting grams to moles). Rubrics were used to evaluate the posttest problem calculations for all students and the why-question responses of the treatment group participants. The rubric for the posttest problem calculations classified each answer into one of four levels: a) correct answer, b) correct problem set-up, incorrect answer, c) partially correct problem set-up, incorrect answer, and d) incorrect problem set-up, incorrect answer or no response. The researcher evaluated all responses and a second, independent rater was recruited to evaluate 20% of the responses to assure the reliability of the researcher?s evaluations. 49 The quality of the treatment group students? responses to the why-questions was also evaluated. The why-question responses were coded using the four-level rubric similar to those used in previous studies using elaborative interrogation why-questions (Martin & Pressley, 1991; Wood, Willoughby, McDermott, Motz, Kaspar, and Ducharme, 1999; Smith et al., 2010). The four levels indicated the quality of the participant responses to the why-questions. 
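For reference, approximate solutions to three of the posttest problems quoted above, computed with commonly tabulated atomic masses, are shown below; these worked answers are illustrative and are not taken from the study's answer key:

```latex
% Illustrative solutions using commonly tabulated atomic masses; not taken from the study's answer key.
\begin{align*}
  M(\mathrm{Na_2SO_4}) &= 2(22.99) + 32.07 + 4(16.00) \approx 142.1\ \text{g/mol} \\
  m(\mathrm{Fe})       &= 1.25\ \text{mol} \times 55.85\ \text{g/mol} \approx 69.8\ \text{g} \\
  n(\mathrm{AgNO_3})   &= \frac{225\ \text{g}}{169.9\ \text{g/mol}} \approx 1.32\ \text{mol}
\end{align*}
```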
Each response was classified as: a) adequate-linked, b) adequate-not-linked, c) inadequate, or d) no response. To be considered adequate, the response must be scientifically and mathematically correct. To be considered adequate-linked, the correct response must also relate the statement portion of the why-question to appropriate information in the reading or the student's prior knowledge. For example, if the student response to the first question after the second worked example (see Figure 1) refers to the periodic table or the atomic mass of silver when explaining why it is true that "1 mole of Ag = 107.9 g Ag," the response would be considered to be linked (see Table 1 for example responses with classifications). The researcher evaluated and classified all responses, and a second, independent rater was recruited to evaluate 20% of the responses in order to assure the reliability of the researcher's evaluations.

Table 1
Sample elaborative interrogation why-question responses with scoring

Statement/why-question: Step 3 of worked example 2 states that 1 mole of Ag = 107.9 g Ag. Why is this true?
  "Because 107.9 is the molar mass of silver and the molar mass is the mass of one mole." (Adequate-linked, score 3)
  "Because when you set up the conversion factor, that's what you use" (Adequate-not-linked, score 2)
  "Only 1 atom of Ag is present" (Inadequate, score 1)

Procedure

Approximately one week before data collection, the researcher addressed the potential participants during a regular class session, briefly described the research and procedure, and distributed consent forms (see Appendix G) and researcher contact information. This gave the potential participants the opportunity to read and consider the consent form and ask questions or express concerns prior to data collection. The dates for data collection were determined by the schedules of the instructors. The goal was to allow the students to experience instruction in essential background concepts, but not the new topics covered in the instructional material used in the study, specifically molar mass and calculations using molar mass. Therefore, the data collection was scheduled for the class session immediately preceding the scheduled instruction on molar mass. A random sequence generator was used to assign each student to receive either the instructional material containing the elaborative interrogation why-questions (see Appendix B) or the instructional material with instructions to reread (see Appendix C). No training was required for either strategy group. Other than the inclusion of why-questions, the instructional material for the two groups was identical except for the group-dependent written instructions. The written instructions provided to the students in the current study were adapted from those used by Smith et al. (2010). The treatment group instructions read: "In preparation for a quiz, read the following passage and answer the questions **AS YOU READ** the passage. Pay particular attention to the worked examples." The rereading group instructions read: "In preparation for a quiz, read the following passage. Pay particular attention to the worked examples. Then **READ THE PASSAGE AGAIN**." The words "as you read" and "read the passage again" were capitalized and asterisks were added to emphasize the methods of the assigned strategies, as was done by Smith et al. (2010). A third set of instructional materials contained an alternate assignment for any students choosing not to participate (see Appendix D).
The students were not given prior notice of the date of data collection or the concepts to be assessed, in order to avoid any effect on attendance. The data collection was completed in one 110-minute class session. On the day of data collection, consent forms were collected and the chemistry prior knowledge, mathematics skill and verbal ability tests were administered to all participants (see Appendix A). The researcher and the course instructor were both present during the data collection. Once the participants completed the chemistry prior knowledge, mathematics skills and verbal ability tests, the researcher distributed the instructional materials and read the following instructions to the group:

Each of you have been given a section copied from your textbook with instructions to use a particular reading strategy. This material is from section 5.2 of your textbook. Read the instructions at the top of the first page. Depending on your instructions, you will either read the section, answering questions as you read, OR read the section twice. Read and study the section as if you are preparing for a test. Pay particular attention to the worked example problems. When you are finished, turn in your materials and you will be given a posttest. Treat this as you would a regular exam. You may not use books or notes or consult with other students. However, you may use the periodic table.

The boldface words were emphasized during the reading to highlight the importance of the worked examples in the reading and to alert the students that different strategies were being tested. After the instructions were read and any questions were addressed, the participants completed the instructional materials. The researcher recorded the time that study began and the time that each participant turned in their instructional materials, so that study time for each participant could be determined. As the participants turned in their instructional materials, they were handed the problem solving posttest (see Appendix E) to complete. Once again the times were recorded so that posttest time could be determined for each participant. Six students in the treatment group were recruited for interviews following the study. Volunteers were chosen based on their availability and represented a variety of problem solving posttest performance levels. These interviews were digitally recorded and transferred to a secure computer for storage. Volunteers were asked to respond to interview questions (see Appendix F) about their perceptions of the elaborative interrogation why-question strategy. Specifically, in a one-on-one interview, each volunteer was asked to elaborate on their responses to the why-questions and any effects these responses may have had on their performance on the problem solving posttest. For example, the first interview question asked: "Did the statements followed by the question 'why is this true?' sometimes help you better understand the material?" Following this general question, students were asked about specific elaborative interrogation why-questions that were presented during the study, and whether they felt that responding to these questions was helpful when attempting to solve the corresponding problem on the posttest.

Data analysis

The data collected included numerical scores from the chemistry prior knowledge, mathematics skills, and verbal ability tests, the why-question responses, and the problem solving posttest. Study times were compared for possible influence as well.
Means and standard deviations of tests were determined for the students as a whole and for the two reading comprehension strategy groups. Frequency tables were used to divide participants into high and low groups based on their standing above or below a median split of scores on the chemistry prior knowledge, mathematics skills and verbal ability tests, since these were predictors of posttest scores. Statistical analyses were used to determine any significant differences between the groups in order to address the research hypotheses. For example, independent samples t-tests were used to determine any significant differences between the chemistry prior knowledge, mathematics skills and verbal ability scores of the two reading strategy groups. Analysis of variance (ANOVA) was used to determine any significant differences between the two reading strategy groups' posttest scores, as well as any significant differences between students with high and low chemistry prior knowledge, high and low mathematics skills, and high and low verbal ability. Regression analysis was used to determine the effects of chemistry prior knowledge, mathematics skill, verbal ability and strategy (elaborative interrogation why-questions or rereading) on posttest scores.

Summary

The current study used a randomized two-group posttest-only control design to compare two reading comprehension strategies, elaborative interrogation why-questions and rereading, when studying a passage from the textbook required for the chemistry course in which the students were enrolled. It was proposed that the students randomly selected for the elaborative interrogation why-question treatment group would outperform the rereading placebo-control group on a problem solving posttest. Chemistry prior knowledge, mathematics skill and verbal ability were also used as predictors of problem solving posttest performance. ANOVA, t-test and regression techniques were used to analyze the data regarding these predictions. In addition, the quality of the students' responses to the why-questions was assessed to determine the effect, if any, these responses had on the performance of the students in the treatment group on the problem solving posttest.

Chapter 4: Results

Introduction

A purpose of this study was to provide evidence that the use of elaborative interrogation why-questions by college students enrolled in an introductory chemistry class improved comprehension of an assigned reading from their chemistry textbook. This investigation assessed the performance of two groups who were asked to read and study a section from their textbook that included worked example problems in preparation for a test, with each group theoretically using a different reading comprehension strategy. The elaborative interrogation why-question treatment group was simply instructed to read the provided text and answer adjunct why-questions as they read. The placebo-control group used the widely researched and commonly used practice of rereading (Pressley, 2006; Rawson et al., 2000). After these assigned tasks were completed, students in both groups were given a posttest that assessed their ability to solve six quantitative chemistry problems operationally defined in terms of the worked examples presented in the assigned reading. Each question was later scored on a scale of zero to three points, with a maximum score of 18 points. This study also investigated whether the students' level of chemistry prior knowledge, mathematics skill, and verbal ability impacted the students'
ability to solve quantitative chemistry problems after completing the instructional materials, and whether the effect of the treatment remained significant when these variables were statistically controlled. These data were analyzed using SPSS statistical software version 19.0. Finally, six volunteers from the elaborative interrogation why-question treatment group were interviewed to acquire qualitative data pertaining to the students' attitudes and opinions of the elaborative interrogation why-question strategy. This chapter presents the results of the statistical analyses and qualitative summations for this study.

Description of data and analyses

Students were randomly assigned to study strategy groups to obtain equality between the groups with regard to prior knowledge in chemistry, mathematics skills and verbal ability and to ensure group independence. This equality was verified by statistical analysis. Independent samples t-tests revealed no statistically significant difference between the elaborative interrogation why-question treatment group and the rereading placebo-control group in chemistry prior knowledge. Likewise, no significant difference was found between the two groups in mathematics skill or verbal ability (see Table 2). The lack of significant differences between the means of these two groups provided additional support, along with the random assignment to groups, for the conclusion that selection bias was not a threat to internal validity in this study.

Table 2
Independent samples t-test table: comparison of elaborative interrogation why-question treatment and rereading placebo-control groups on chemistry prior knowledge, mathematics skill and verbal ability

                        Placebo-control (n = 37)    Treatment (n = 37)
Measure                 M        SD                  M        SD            t        p
Prior knowledge         3.68     1.45                3.68     1.51          0.00     1.00
Mathematics skill       3.38     0.88                3.05     0.79         -1.66     0.10
Verbal ability          17.43    5.36                18.65    7.43         -0.81     0.422

Statistical analysis was conducted on the posttest scores to confirm normality of distribution and homogeneity of variance (assumptions underlying further statistical analyses). Kurtosis (-0.81) and skewness (-0.30) values were within the acceptable range for a normal distribution of test scores. This was further supported with a Kolmogorov-Smirnov test for normality of distribution (p > .05). Homogeneity of variance of the posttest scores was confirmed by Levene's test, F(1, 72) = .153, p > .05. Thus, statistical tests that assume these characteristics, such as independent samples t-tests, ANOVA and multiple regression, were appropriate for use with the data collected in this study. Each of the six posttest problems was assessed based on the accuracy of the solution to the problem and the correctness of the technique used in calculating the solution. An independent rater assisted the researcher by scoring approximately 20% of the posttests, resulting in 96% interrater agreement. Student names and group assignments were blinded to both raters. A reliability coefficient for the posttest problems was computed using Cronbach's procedure for calculating an alpha coefficient (α = .757). While opinions of the usefulness of this statistic vary widely, an alpha coefficient above 0.7 suggests that students are not answering questions at random, and is often reported as an indication of acceptable reliability for an instrument (Cortina, 1993; Mulford & Robinson, 2002; Sijtsma, 2009).
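For reference, the alpha coefficient reported above follows the standard Cronbach formulation; with the six posttest problems as items, it is computed as shown below (this is the standard definition, not a formula reproduced from the dissertation):

```latex
% Standard definition of Cronbach's alpha; here k = 6 posttest problems,
% sigma^2_{Y_i} is the variance of item i, and sigma^2_X is the variance of total scores.
\[
  \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^2_{Y_i}}{\sigma^2_X}\right)
\]
```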
Correlations of variables

Statistical analysis was performed to assess correlations between the independent variables (i.e., prior chemistry knowledge, mathematics skill, verbal ability and study time) and the dependent variable (i.e., problem solving posttest scores). As shown in Table 3, there were significant positive correlations between the problem solving posttest score and chemistry prior knowledge, r(72) = .489, p < .001, and mathematics skill, r(72) = .295, p < .05, supporting the notion that chemistry prior knowledge and mathematics skill were related to the dependent variable (problem solving posttest score). The correlation between posttest score and verbal ability was positive, but not statistically significant, r(72) = .100, p > .05. While this is in contrast to a recent investigation of elaborative interrogation why-questions (Smith et al., 2010), the fact that this study included quantitative chemistry problem solving may account, at least in part, for this disparity (see further discussion in the following chapter).

Table 3
Correlation between study variables (N = 74)

Variables                     Posttest score    Prior knowledge    Mathematics skills    Verbal ability    Study time
Posttest score
Chemistry prior knowledge     0.489**
Mathematics skills            0.295*            0.101
Verbal ability                0.100             0.127              0.188
Study time                    0.113             0.050              0.004                 -0.122

** significant at the 0.01 level (2-tailed)
*  significant at the 0.05 level (2-tailed)

Although on average the placebo-control rereading group spent approximately 28% less time (M = 13.27 minutes, SD = 6.00) studying than the elaborative interrogation why-question treatment group (M = 18.41 minutes, SD = 7.12), study time did not significantly correlate with posttest scores, r(72) = .113, p > .05, a finding consistent with previous research (Ozgungor & Guthrie, 2004; Smith et al., 2010).

Evaluation of hypotheses

Analysis of variance (ANOVA) and regression techniques were employed to evaluate the research hypotheses for this study. Each was tested for statistical significance at the .05 level. Means, standard deviations and other relevant descriptive values for chemistry prior knowledge, mathematics skill and verbal ability were reported in Table 2. To evaluate the hypothesis that students in the elaborative interrogation why-question treatment group would outperform students in the rereading placebo-control group on a problem solving posttest requiring comprehension, a one-way ANOVA was performed (see Table 4). The results indicated a statistically significant difference between the mean posttest scores of the two groups. Expressed as whole number percentages, as is customary for student scores, the mean posttest score for the elaborative interrogation why-question treatment group was 70% (12.65 points out of a possible 18 points), compared to a mean posttest score of 57% (10.19 points out of a possible 18 points) for the rereading placebo-control group. Specifically, the students in the elaborative interrogation why-question treatment group (M = 12.65, SD = 4.63) significantly outperformed the students in the rereading placebo-control group (M = 10.19, SD = 5.32) on the problem solving posttest. To further support this test of significance, Cohen (1994) has suggested the inclusion of effect size (Cohen's d), which quantifies the difference between the two groups (Coe, 2002). The treatment strategy resulted in a medium effect size (d = 0.49), which has been defined by Cohen (1992) as a difference of approximately half a standard deviation between means.
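As a check on the reported effect size, Cohen's d for the treatment comparison follows directly from the group means and standard deviations given above, with the two standard deviations pooled as a root mean square (appropriate here because the groups are of equal size):

```latex
% Effect size for the treatment vs. placebo-control comparison, using the reported values.
\[
  d = \frac{M_{\text{treatment}} - M_{\text{control}}}{s_{\text{pooled}}}
    = \frac{12.65 - 10.19}{\sqrt{\tfrac{4.63^{2} + 5.32^{2}}{2}}}
    \approx \frac{2.46}{4.99} \approx 0.49
\]
```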
Together, the ANOVA results and the effect size indicate that the hypothesis that students using the elaborative interrogation why-question treatment strategy would outperform students using the rereading placebo-control strategy on a problem solving posttest was supported by the data.

Table 4
Analysis of variance summary table: comparison of elaborative interrogation why-question treatment and rereading placebo-control groups on posttest scores in total

                    Sum of Squares    df    Mean Square    F(1, 72)    p       d
Between Groups      111.905           1     111.905        4.51        .037    0.49
Within Groups       1788.108          72    24.835
Total               1900.014          73

Chemistry prior knowledge, mathematics skill and verbal ability were hypothesized to affect posttest scores. Statistical analyses on assessments of these variables were performed, not only to explore these hypotheses, but also to isolate the effects of these variables from the effect of the treatment on the posttest scores.

As shown in Table 3, there was a significant positive correlation between prior knowledge in chemistry and posttest score, r(72) = .489, p < .001. To further explore the effect of prior knowledge on student performance on a problem solving posttest, statistical analyses were performed to compare mean posttest scores for students with high and low chemistry prior knowledge test scores. One-way ANOVAs were performed for all students (both groups combined) and for each group separately (see Table 5). Students were categorized as having high or low prior knowledge in chemistry based on a median split of scores on the chemistry prior knowledge test. Overall (N = 74), students with high prior knowledge in chemistry scored significantly higher on the problem solving posttest (M = 13.59, SD = 4.80) than students with low prior knowledge scores (M = 9.24, SD = 4.47), with a large effect size (d = 0.94). Students in the elaborative interrogation why-question group with high prior knowledge in chemistry scored significantly higher on the problem solving posttest (M = 15.20, SD = 4.09) than students in the elaborative interrogation why-question group with low prior knowledge in chemistry (M = 9.65, SD = 3.40), with a very large effect size (d = 1.48). However, students in the rereading group with high prior knowledge in chemistry did not score significantly higher on the problem solving posttest (M = 11.71, SD = 5.59) than students in the rereading group with low prior knowledge in chemistry (M = 8.90, SD = 4.84), although the effect size was medium (d = 0.54), suggesting a possible difference in the effect of prior knowledge between the strategy groups. This difference is illustrated graphically in Figure 2.

Table 5
Analysis of variance summary table: comparison of posttest scores overall and by group based on high v. low prior knowledge scores

Treatment and Placebo-control Groups Combined
                    Sum of Squares    df    Mean Square    F(1, 72)    p        d
Between Groups      350.284           1     350.284        16.274      < .01    0.94
Within Groups       1549.730          72    21.524
Total               1900.014          73

Treatment Group Only
                    Sum of Squares    df    Mean Square    F(1, 35)    p        d
Between Groups      283.350           1     283.350        20.361      < .01    1.48
Within Groups       487.082           35    13.917
Total               770.432           36

Placebo-control Group Only
                    Sum of Squares    df    Mean Square    F(1, 35)    p        d
Between Groups      72.347            1     72.346         6.679       .111     0.54
Within Groups       945.329           35    27.679
Total               1017.676          36

Figure 2. Posttest score means for each strategy group divided into high and low prior knowledge test scores.
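The median-split analyses summarized in Table 5 follow a simple pattern: categorize students at the sample median of the prior knowledge test, then compare posttest means within each strategy group. The sketch below illustrates that pattern with SciPy; the arrays are simulated placeholders, not the study's data, and the simple pooled standard deviation used for d is an approximation when subgroup sizes are unequal.

```python
# Illustrative sketch of a median-split analysis: students are categorized as high or
# low prior knowledge at the sample median, then posttest scores are compared within
# each strategy group. All arrays here are hypothetical placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
prior = rng.integers(0, 7, size=74).astype(float)    # prior knowledge test (0-6)
posttest = rng.normal(11.4, 5.1, size=74)            # posttest scores (0-18 in the study)
strategy = np.array(["treatment"] * 37 + ["control"] * 37)

high = prior > np.median(prior)                      # median split (ties fall in the low group)

for grp in ("treatment", "control"):
    mask = strategy == grp
    hi_scores = posttest[mask & high]
    lo_scores = posttest[mask & ~high]
    F, p = stats.f_oneway(hi_scores, lo_scores)      # one-way ANOVA, high vs. low
    # Approximate Cohen's d from the two subgroup means and an averaged variance
    pooled = np.sqrt((hi_scores.var(ddof=1) + lo_scores.var(ddof=1)) / 2)
    d = (hi_scores.mean() - lo_scores.mean()) / pooled
    print(grp, f"F = {F:.2f}, p = {p:.3f}, d = {d:.2f}")
```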
As discussed previously, a significant positive correlation was found between scores on the mathematics skills test given prior to exposure to the instructional material and posttest scores (see Table 3), which suggests that there is a relationship between mathematics skill and posttest scores, as hypothesized. However, when students were categorized as having high or low mathematics skill based on a median split of scores on the mathematics skill test given prior to exposure to the instructional materials, one-way ANOVA results did not support this hypothesis (see Table 6). While students in general, and the elaborative interrogation why-question treatment group specifically, with high scores on the mathematics skills test had higher mean scores on the problem solving posttest (M = 12.26, SD = 5.45 and M = 13.75, SD = 4.59, respectively) than students with low mathematics skills test scores (M = 10.81, SD = 4.84 and M = 11.35, SD = 4.46, respectively), ANOVA results indicated that this difference was not statistically significant. Students in the rereading group with high mathematics skill test scores had lower mean posttest scores (M = 9.55, SD = 5.73) than students in the rereading placebo-control group with low mathematics skill test scores (M = 10.46, SD = 5.23), although not to a statistically significant extent (see Figure 3). These somewhat contradictory results are discussed in further detail in Chapter 5.

Table 6
Analysis of variance summary table: comparison of posttest scores overall and by group based on high v. low mathematics skill scores

Treatment and Placebo-control Groups Combined
                    Sum of Squares    df    Mean Square    F(1, 72)    p       d
Between Groups      37.566            1     37.566         1.452       .232    0.29
Within Groups       1862.447          72    25.867
Total               1900.014          73

Treatment Group Only
                    Sum of Squares    df    Mean Square    F(1, 35)    p       d
Between Groups      52.800            1     52.800         2.575       .118    0.53
Within Groups       717.632           35    20.504
Total               770.432           36

Placebo-control Group Only
                    Sum of Squares    df    Mean Square    F(1, 35)    p       d
Between Groups      6.487             1     6.487          0.225       .639    0.17
Within Groups       1011.189          35    28.891
Total               1017.676          36

Figure 3. Posttest score means for each strategy group divided into high and low mathematics skill test scores.

Correlation (see Table 3) and ANOVA results (see Table 7) indicated that the data did not support the expectation that students with high verbal ability would outperform students with low verbal ability on a problem solving posttest. Students were categorized as having high or low verbal ability based on a median split of scores on the verbal ability test given prior to exposure to the instructional materials. Although students with high scores on the verbal ability test had higher mean scores on the problem solving posttest than students with low verbal ability test scores, the difference was not statistically significant. Figure 4 illustrates the differences in means for high and low verbal ability students in both groups.
Table 7
Analysis of variance summary table: comparison of posttest scores overall and by group based on high v. low verbal ability scores

Treatment and Placebo-control Groups Combined
                    Sum of Squares    df    Mean Square    F(1, 72)    p       d
Between Groups      64.338            1     64.338         2.523       .117    0.37
Within Groups       1835.676          72    25.495
Total               1900.014          73

Treatment Group Only
                    Sum of Squares    df    Mean Square    F(1, 35)    p       d
Between Groups      55.147            1     55.147         2.698       .109    0.54
Within Groups       715.286           35    20.437
Total               770.432           36

Placebo-control Group Only
                    Sum of Squares    df    Mean Square    F(1, 35)    p       d
Between Groups      3.929             1     3.929          0.136       .715    0.12
Within Groups       1013.747          35    28.964
Total               1017.676          36

Figure 4. Posttest score means for each strategy group divided into high and low verbal ability test scores.

Linear regression analysis was performed to evaluate the variables as predictors of posttest scores. The regression analysis examined how much variance in posttest score is explained or predicted by each variable. In forward stepwise regression analysis, the predictor variables are entered in sequence depending on their contribution to the variance in posttest scores. In this regression analysis, chemistry prior knowledge was determined to be the strongest predictor of posttest scores and was therefore entered first, followed by mathematics skill and lastly strategy (elaborative interrogation why-questions or rereading). Verbal ability was excluded because it failed to make a significant contribution to the variance in posttest scores. The results of the regression analysis (see Table 8) indicated that chemistry prior knowledge, mathematics skill and strategy were all significant predictors of posttest score and, according to the regression analysis, these factors combined account for 34% of the variance in posttest scores. Prior knowledge in chemistry contributed significantly to the prediction of posttest scores and accounted for approximately 24% of the variance in the posttest score (ΔR2 = 0.239). Mathematics skill accounted for an additional 6% (ΔR2 = 0.061) and study strategy accounted for approximately 4% (ΔR2 = 0.039) of the variance in the posttest scores. The effect of the elaborative interrogation why-question treatment vs. rereading placebo-control was statistically significant (p < .05) even after controlling for the effects of prior chemistry knowledge and mathematics skill.

Table 8
Stepwise regression analysis for variables predicting posttest score

Predictor variable              R2      ΔR2     B       β (standardized)    t       sig
Step 1  Prior Knowledge         .239    .239    1.694   .489                4.756   p < .001
Step 2  Mathematics Skill       .300    .061    1.495   .249                2.491   .015
Step 3  Strategy                .340    .039    2.051   .202                2.045   .045
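The forward stepwise selection summarized in Table 8 was performed in SPSS. The sketch below illustrates the same general idea with statsmodels: at each step, the candidate predictor that most increases R2 is entered, provided its coefficient is significant, and selection stops when no remaining variable qualifies. The data frame here is a simulated placeholder, not the study's data, and the procedure is a simplified sketch rather than a reproduction of SPSS's stepwise algorithm.

```python
# Sketch of a forward stepwise selection similar in spirit to Table 8 (SPSS was used
# in the study). The simulated data frame below is a hypothetical placeholder.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
df = pd.DataFrame({
    "prior_knowledge": rng.integers(0, 7, 74),
    "math_skill": rng.integers(0, 5, 74),
    "verbal": rng.integers(5, 35, 74),
    "strategy": np.r_[np.ones(37), np.zeros(37)],   # 1 = why-questions, 0 = rereading
})
df["posttest"] = (1.7 * df.prior_knowledge + 1.5 * df.math_skill
                  + 2.0 * df.strategy + rng.normal(0, 4, 74))

def forward_stepwise(data, outcome, candidates, alpha=0.05):
    """Add predictors one at a time, keeping the significant variable that most raises R^2."""
    selected, steps = [], []
    while candidates:
        best = None
        for var in candidates:
            X = sm.add_constant(data[selected + [var]])
            fit = sm.OLS(data[outcome], X).fit()
            if fit.pvalues[var] < alpha and (best is None or fit.rsquared > best[1]):
                best = (var, fit.rsquared)
        if best is None:
            break                                    # no remaining variable enters
        selected.append(best[0])
        candidates = [v for v in candidates if v != best[0]]
        steps.append((best[0], round(best[1], 3)))   # (variable, cumulative R^2)
    return steps

print(forward_stepwise(df, "posttest",
                       ["prior_knowledge", "math_skill", "verbal", "strategy"]))
```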
Analyses of additional data

Problem solving transfer

Along with the data collected to evaluate the original hypotheses, additional information was collected during the sessions with the students. For example, as described in Chapter 3, one set of posttest problems required a greater degree of problem solving transfer than the other set. Specifically, for each of the three worked example problems, there were two corresponding problems on the posttest. For example, one problem was worded identically to the first worked example problem, with only the compound changed. The other problem based on the first worked example was posed using different wording and a different, more complex compound, thus requiring a higher degree of problem solving transfer. This was true for each of the three worked example problems, resulting in two sets of posttest problems: one set requiring lower problem solving transfer and the other requiring higher problem solving transfer.

To probe for differences in transfer of problem solving ability between the rereading and elaborative interrogation why-question strategies, one-way ANOVA was performed to compare the mean posttest scores for each group on lower transfer problems and higher transfer problems (see Table 9). The elaborative interrogation why-question group (M = 6.00, SD = 2.53) significantly outperformed the rereading group (M = 4.68, SD = 2.77) on the higher transfer problems. Although the elaborative interrogation why-question group (M = 6.65, SD = 2.42) had a higher mean posttest score on the lower transfer problems than the rereading group (M = 5.52, SD = 2.92), the difference was not statistically significant. This may indicate that while rereading was perhaps a sufficient strategy for solving familiar (lower transfer) problems, the elaborative interrogation why-question strategy is perhaps more effective than rereading when students attempt to solve the more challenging higher transfer problems.

Table 9
Analysis of variance summary table: comparison of elaborative interrogation why-question treatment and rereading placebo-control groups on lower transfer and higher transfer posttest scores

Lower Transfer Questions
                    Sum of Squares    df    Mean Square    F(1, 72)    p       d
Between Groups      23.838            1     23.838         3.315       .073    0.42
Within Groups       517.676           72    7.190
Total               541.514           73

Higher Transfer Questions
                    Sum of Squares    df    Mean Square    F(1, 72)    p       d
Between Groups      32.446            1     32.446         4.616       .035    0.50
Within Groups       506.108           72    7.029
Total               538.554           73

Elaborative interrogation why-question responses

Responses to elaborative interrogation why-questions asked of the treatment group were evaluated as described in Chapter 3, with each response classified as: a) adequate-linked, b) adequate-not-linked, c) inadequate, or d) no response. An independent rater assisted the researcher by scoring the why-question responses of approximately 20% of the treatment group students, resulting in 94% interrater agreement. Both raters were blinded to student names. Most of the why-question responses were judged to be adequate-linked (191 out of 333, or 57.4%) or adequate-not-linked (70 out of 333, or 21.0%). Only 17.7% (59 out of 333) of the responses were inadequate, and 3.9% (13 out of 333) of the why-questions received no response. Scores on the why-question responses were compared to posttest scores to explore relationships between the quality of the why-question responses and performance on the problem solving posttest. Analysis failed to find a significant correlation between the total scores on the why-questions and the problem solving posttest, r(35) = .275, p > .05. Also, when students were divided into high and low why-question response groups based on a median score split, a one-way ANOVA comparing mean posttest scores between students with high scores on the why-questions (M = 13.37, SD = 4.63) and students with low why-question scores (M = 11.89, SD = 4.63) found that the mean scores were not significantly different, F(1, 35) = .994, p > .05. Finally, when why-questions and posttest problems were sorted by type (calculation of molar mass, calculation of mass from moles, and calculation of moles from mass), no significant correlations were found. In summary, no significant relationships were found between the quality of why-question responses and posttest scores.

Interviews

Interviews were conducted with six volunteers from the elaborative interrogation why-question strategy group within two weeks of the initial data collection.
The interviewees met individually with the researcher at an on-campus location and time agreed upon by the volunteer and the researcher. All interviews were digitally recorded and transcribed by the researcher. The purpose of the interviews was to gain insight into the manner in which the interviewees perceived the effects of their use of elaborative interrogation why-questions on their understanding of the reading and worked examples, and on their performance on the posttest problems. The interview questions were also designed to elicit the interviewees' opinions on the general usefulness of the strategy. Pseudonyms have been used to protect the student volunteers' privacy.

To begin the interview, each volunteer was asked six questions, one relating to each posttest problem. For each correct problem solution on the posttest, the volunteers were asked to select one of the following responses regarding their opinion of their problem solution during the test:

a) It was easy; I could have done it without reading/studying
b) It was easy, but I could not have done it before reading/studying
c) I was fairly confident about my answer, but I was not sure that I knew what I was doing
d) It was a lucky guess!

For each incorrect answer on the problem solving posttest, the volunteers were asked to select from the following responses regarding their impression of their problem solution during the test:

a) I thought I knew how to do it before I read. I'm surprised I got it wrong
b) I thought I knew how to do it after I read. I'm surprised I got it wrong
c) I didn't know how to do it. I just guessed.
d) I didn't know how to do it at all.

The responses to these questions indicated that volunteers who solved a problem correctly were, for the most part (22 of the 26 correctly solved problems), confident that their responses were correct, but that they could not have solved the problem before reading the instructional material (see Table 10). In addition, there was a low occurrence of response "a" (two occurrences out of 36 responses), indicating that the volunteers rarely felt they could have answered the question before reading the instructional material (see Table 10). In other words, for most of the correct posttest problem solutions, the volunteers felt that they would not have known how to solve the problem before the reading, but felt confident about their ability to solve the problem after the reading. This provided support for the assumption that the students as a whole were novices in the problems described in the reading and worked examples before they used the instructional materials in the study. Furthermore, the interviewees were encouraged to elaborate on each response to provide additional insight into the students' opinions about the individual problems.

Table 10
Student volunteers' responses to questions regarding their opinions about their solutions to each posttest problem

Posttest problem    Aaron    Ike    Jason    Karen    Linda    Rick
1                   b        b      b        c        c        b
2                   b        b      b        d        b        b
3                   b        b      b        d        b        b
4                   b        b      b        a        b        b
5                   b        c      d        b        c        b
6                   b        b      d        d        a        b

Note. Each entry is the opinion choice selected for that problem; the choices offered differed for correct solutions (CS) and incorrect solutions (IS), as listed below.

For correct solutions, opinion choices were:
a) It was easy; I could have done it without reading/studying
b) It was easy, but I could not have done it before reading/studying
c) I was fairly confident about my answer, but I was not sure that I knew what I was doing
d) It was a lucky guess

For incorrect solutions, opinion choices were:
a) I thought I knew how to do it before I read. I'm surprised I got it wrong
b) I thought I knew how to do it after I read. I'm surprised I got it wrong
c) I didn't know how to do it. I just guessed
d) I didn't know how to do it at all

The second series of interview questions dealt with the volunteers' opinions of the elaborative interrogation why-questions and the benefits, if any, of their use during their reading and subsequent posttest problem solving (see Table 11). Interviewees were asked to explain their opinions further, beyond a "yes" or "no" response. Most of the interviewees (five out of six) felt that the general use of elaborative interrogation why-questions sometimes helped them better understand the material. Interestingly, the interviewee who stated that she did not think that the elaborative interrogation why-questions were sometimes helpful responded positively to the questions regarding the perceived usefulness of specific why-questions, and she stated that she would be likely to use elaborative interrogation why-questions in the future when reading if they were included in her textbook. Also, most (five of the six) interviewees responded that they think they would use elaborative interrogation why-questions if they were included in their textbook. Notably, the interviewee who said he would not be likely to use the why-questions if they were included in his book stated that he preferred to use a rereading strategy when studying. It was not determined whether this interviewee had discussed the rereading strategy with participants in the placebo-control group prior to the interview, perhaps influencing his opinion.

Table 11
Interviewees' general responses to questions about their opinions of the effectiveness of elaborative interrogation why-questions while reading and subsequent posttest problem solving

Responses are listed in the order Aaron, Ike, Jason, Karen, Linda, Rick.

Did the statements followed by the question "Why is this true?" sometimes help you better understand the material?
    yes, yes, yes, no, yes, yes

When studying Worked Example 1, did the statement "One mole of Li2CO3 contains two moles of Li, one mole of C and three moles of O." followed by the question "Why is this true?" help you better understand how you would calculate the molar mass of Li2CO3?
    yes, yes, yes, yes, yes, no

When studying Worked Example 1, did the statement "One mole of Li2CO3 contains two moles of Li, one mole of C and three moles of O." followed by the question "Why is this true?" help you answer posttest question 1?
    yes, yes, n/a, yes, no, no

When studying Worked Example 2, did the statement "The conversion factor for silver (Ag) can be written as or " followed by the question "Why is this true?" help you better understand how molar mass can be used as a conversion factor?
    yes, yes, yes, yes, yes, yes

When studying Worked Example 2, did the statement "The conversion factor for silver (Ag) can be written as or " followed by the question "Why is this true?" help you answer posttest question 2?
    yes, yes, yes, yes, yes, yes

When studying Worked Example 3, did the statement "In step 4 of worked example 3, the final answer is correctly expressed in units of moles NaCl." followed by the question "Why is this true?" help you understand how the conversion factor was used?
    yes, yes, yes, yes, yes, yes

When studying Worked Example 3, did the statement "In step 4 of worked example 3, the final answer is correctly expressed in units of moles NaCl." followed by the question "Why is this true?" help you answer posttest question 3?
    yes, yes, n/a, n/a, n/a, yes

When studying Worked Example 3, did the statement "In step 4 of worked example 3, the correct form of the conversion factor to use is ." followed by the question "Why is this true?" help you understand how to correctly solve for moles of NaCl in Worked Example 3?
    yes, yes, yes, no, yes, yes

When studying Worked Example 3, did the statement "In step 4 of worked example 3, the correct form of the conversion factor to use is ." followed by the question "Why is this true?" help you answer posttest question 6?
    yes, yes, n/a, n/a, n/a, yes

If your textbook included this type of study strategy (statements followed by the question "Why is this true?"), do you think you would use it while reading?
    yes, yes, yes, yes, yes, no

In addition, the interview transcripts were analyzed for evidence of other factors that may play a role in the effectiveness of the elaborative interrogation why-question strategy, such as prior knowledge activation or additional processing of pertinent details. For example, the following quotes demonstrate instances of interviewees referring to use of the factor-label method, which may be an indication of prior knowledge activation:

Aaron: "Because for the conversion factor to work, two similar factors have to cancel each other, so the remaining factor would be the expressed value."

Karen: "Because it said the final answer is expressed in units of moles, so you have to do the conversion to get what you're looking for. You can cancel it out."

Ike: "Well, not only that I knew I could read and write it as a ratio, but also that I could cross cancel if necessary to be able to solve the problem."

Ike: "Yes, because I knew the moles need to be on top. That the moles would not be canceled, but the grams of that particular chemical would be."

The following quotes may indicate additional processing or calling attention to pertinent details:

Jacob: "Yeah, well seeing those numbers and stuff just made me snap right away that that is what it is referring to."

Rick: "I think it makes you think about it a lot more. It makes me think about it, what I'm being asked, a little more."

Aaron: "Sometimes you had to go back and look at the material before and you would remember and then you can put two and two together and put it in your own words and that helped me."

Ike: "Using the previous instructions of how to really dissect the problem and break it down into each individual component helped me to understand more; to figure that this answer was correct. Had I not been given the focus before ... how to pull apart each element out of the table and knowing how many of each element appears where, I would not have been able to find that out. So going over the example, it was easy to find information."

While explaining their opinions about the usefulness of the elaborative interrogation why-questions, some interviewees expressed ways in which they felt the why-questions were helpful or why they might choose to use them in the future.

Aaron: "Yeah, I had to run it through my head and I put in my own words, what I read, so it helps me understand it. I think anybody can do math. It's understanding what you're doing and comprehending what you're doing that tells you that you know what you're doing."

Karen: "Possibly, because it might help me understand it better. Like why we are solving or why are we going to get moles of NaCl. Why are we doing it? It just helped me understand why."

Ike: "Placing questions with problems and making the steps in the problem easier to understand helps."
Ike: "In order to determine if the answer was true or false, it requires me, as the student, or the student in general, to take each part of the, of the element combination apart as it showed in the steps, so following what was in the steps, I was able to do virtually the same thing. Maybe not verbatim as to the same way the steps were written, but in a way that I was able to comprehend, and then I was able to answer the question a lot easier."

Linda: "It made me think about it."

Rick: "So I would say it helps. Yeah, it helped me understand a little bit."

Overall, most of the interviewees expressed positive attitudes toward the usefulness of elaborative interrogation why-questions and claimed that they helped them to better understand how to correctly solve the posttest problems.

Summary

Students were randomly assigned to one of two study strategy groups, a rereading placebo-control group or an elaborative interrogation why-question treatment group, using a randomized two-group posttest only design. These students were administered identical problem solving posttests after applying their assigned study strategy to a reading copied from their chemistry textbook that contained worked example problems. The problem solving posttest requiring comprehension was designed to evaluate the students' ability to solve quantitative chemistry problems similar to those in the worked examples in the reading. Statistical analyses were performed to compare the students' performance and determine whether the elaborative interrogation why-question strategy resulted in an increased ability to solve these problems compared to the rereading placebo-control strategy. Other factors that may affect this ability, such as chemistry prior knowledge, mathematics skills and verbal ability, were measured and included in the analyses. The analyses indicated that the use of elaborative interrogation why-questions, on average, resulted in higher posttest scores than rereading, suggesting that the elaborative interrogation why-question strategy may be more beneficial than a rereading strategy to students attempting to comprehend the principles and processes needed to solve these problems. Furthermore, analyses that included the students' chemistry prior knowledge, mathematics skill and verbal ability scores revealed that this benefit remained significant even when these factors were taken into account. Qualitative data, in the form of interviews with six volunteers from the elaborative interrogation why-question treatment group, provided support for the theoretical notion that students benefit from the use of why-questions by activating prior knowledge, and indicated that most of these students felt that the why-question strategy was helpful when studying from their textbook.

Chapter 5: Discussion

Introduction

The purpose of this study was to investigate the use of the elaborative interrogation strategy, specifically the use of why-questions adjunct to science text containing worked example problems, as a means to improve performance on a problem solving posttest requiring comprehension. This investigation differed from other elaborative interrogation why-question research in that college students in an authentic classroom setting learned from a chemistry textbook reading that required comprehension of relevant information in order to solve quantitative problems of the type typically asked by chemistry teachers.
More specifically, this study was novel in that it utilized a reading that contained quantitative worked example problems and consisted of a mixture of chemical (rather than biological) informational prose, numbers, chemical symbols and mathematical calculations, and learning was assessed with a posttest that required students to solve problems similar to those in the worked examples, thereby testing students' problem solving ability.

This study was conducted with students enrolled in Basics of Chemistry, an introductory chemistry class taught at a community college in the southwestern United States. Data collection occurred during one 110-minute class period in which the student volunteers were enrolled. A randomized, two-group, posttest only design was used. A total of 74 students participated in this study. Students were randomly assigned to one of two strategy groups: a rereading (placebo-control) group or an elaborative interrogation why-question (treatment) group. Before the experiment, the 74 students were administered tests to assess prior knowledge in chemistry, mathematics skill and verbal ability, which, along with type of study strategy, were hypothesized to be predictors of posttest score. After this initial testing, the students were provided reading material copied verbatim from their required textbook. The materials distributed to the rereading placebo-control group instructed the students to read and study as if preparing for a test, paying particular attention to the worked example problems. In addition, the students in the placebo-control group were instructed to read the material a second time. The elaborative interrogation why-question treatment group was also instructed to read and study the material as if preparing for a test, paying particular attention to the worked example problems. In contrast to the placebo-control group, this treatment group was instructed to answer the nine elaborative interrogation why-questions presented throughout the reading. As each student finished the study assignment, they were administered a problem solving posttest consisting of problems comparable to the worked example problems and requiring comprehension. All pre- and post-study tests and why-question responses were assessed, and these data were statistically evaluated. In addition, interviews were conducted with six volunteers from the elaborative interrogation why-question treatment group to assess any perceived effects of the why-question strategy and opinions pertaining to their use of the strategy.

Findings

The most fundamental prediction in this study, that students using the elaborative interrogation why-question treatment strategy would outperform students who used the rereading placebo-control strategy, was supported by the data, as expected. The students who answered elaborative interrogation why-questions significantly outperformed the students who read the material twice on a problem solving posttest.

Since prior knowledge activation is believed to be a major contributor to the success of the elaborative interrogation why-question strategy, prior chemistry knowledge, mathematics skills and verbal ability were posed as possible measures to support this assertion. Tests of these attributes were also used to provide assurance of homogeneity of these factors across groups.
Mean scores on each of these three factors did not vary significantly by group, providing additional support, along with random assignment to groups, that selection bias was apparently not a threat to internal validity.

As anticipated, prior knowledge in chemistry was shown by the data to influence posttest scores. A significant positive correlation was found between chemistry prior knowledge and posttest scores. Additionally, ANOVA of both groups combined found that posttest score means for students with high chemistry prior knowledge were significantly higher than the posttest score means for those with low chemistry prior knowledge. Also, when ANOVA was performed on the elaborative interrogation why-question treatment group separately, students with high prior knowledge scored significantly higher on the problem solving posttest than students with low prior knowledge. However, there was no significant difference in posttest score means between low and high prior knowledge students within the rereading placebo-control group. In other words, this study found that while high prior knowledge may have had a small influence on posttest scores for students using a rereading strategy, it had a greater influence on posttest scores for students provided with the elaborative interrogation why-questions, thus providing further evidence to support the idea that prior knowledge activation may play an important role in the effectiveness of the elaborative interrogation why-question strategy. Specifically, since the data indicated that prior knowledge had a positive effect on the posttest scores of the why-question group, but not on the scores of the rereading group (two groups that did not differ significantly in prior knowledge test scores), it seems reasonable to assert that prior knowledge may have been activated or utilized to a greater extent by the elaborative interrogation why-question treatment group than by the rereading placebo-control group.

Analysis of the relationship between mathematics skills and posttest scores provided mixed results. Analysis of continuous data indicated a significant correlation between mathematics skill test scores and posttest scores. However, when students were divided into dichotomous groups based on high and low mathematics skill test scores, ANOVA indicated no significant differences between the two groups. The lack of evidence from the ANOVA for the expected differences may have been due to problems with separating the mathematics test scores into high and low groups. Dichotomization of variables based on a median split does not take into account inter-individual differences and may place subjects near the median in different groups even though they are actually more similar to each other than to other members of their group (Renkl, 1997). The mathematics skills test used in this study consisted of only four questions, and the resulting median split did not produce evenly divided groups. The significant correlation between mathematics skill and posttest scores, which relied on individual students' mathematics skill test scores rather than on students divided into high and low categories, may be indicative of this weakness in the data.
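The drawback of dichotomization noted above can be made concrete with a small simulation: on the same simulated data, the continuous correlation may detect a relationship that the median-split ANOVA misses, and a coarse four-item score often produces uneven groups. The sketch below uses simulated data, not the study's, and is only illustrative.

```python
# Illustrative sketch (simulated data, not the study's) of how a median split can
# weaken an analysis: the continuous correlation may detect the relationship while
# the high/low ANOVA on the same data does not, because subjects near the median
# are forced into opposite groups and coarse scores split unevenly.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
math_skill = rng.integers(0, 5, size=74).astype(float)     # 4-item test, coarse scores
posttest = 1.3 * math_skill + rng.normal(0, 5, size=74)

r, p_r = stats.pearsonr(math_skill, posttest)               # continuous analysis
high = math_skill > np.median(math_skill)                   # dichotomization
F, p_f = stats.f_oneway(posttest[high], posttest[~high])    # high vs. low ANOVA

print(f"correlation: r = {r:.2f}, p = {p_r:.3f}")
print(f"median-split ANOVA: F = {F:.2f}, p = {p_f:.3f}")
print("group sizes:", high.sum(), (~high).sum())            # often uneven with coarse scores
```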
Even so, as Figure 3 (Chapter 4) illustrates, a simple comparison of means shows that students with high mathematics skill test scores outperformed students with low mathematics skill to a greater degree in the elaborative interrogation why-question treatment group than in the rereading placebo-control group, where high mathematics skill actually seemed to have a negative impact on the posttest score, although not to a significant degree. In further investigations, perhaps the mathematics skills test should be modified to include more items and thus provide superior data for analysis.

The prediction that verbal ability would influence posttest score was also not supported by the data. A previous study investigating elaborative interrogation why-questions reported finding relationships between verbal ability and posttest scores (Smith et al., 2010); however, that study presented and assessed mostly verbal information, unlike the present study. In the current study, both the correlation analysis between verbal ability test scores and posttest scores and the ANOVA comparing mean posttest scores of high and low verbal ability scorers failed to find significant effects. The lack of evidence for a relationship between verbal ability and posttest score in the present investigation may be due, in some part, to the type of information that was read and tested in the present study compared to the type of information read and tested in otherwise comparable research. Perhaps because the present research was novel in that it evaluated students' quantitative problem solving ability based on reading comprehension of chemistry text that included quantitative worked examples, the comprehension of these worked examples may not depend as heavily on verbal ability as in studies that test comprehension of text alone. This disparity is discussed further in the section on comparisons to other studies.

It was initially projected that study strategy, chemistry prior knowledge, mathematics skill and verbal ability would be significant predictors of posttest score. Regression analysis results indicated that chemistry prior knowledge, mathematics skill and strategy condition were all significant predictors of posttest score and, according to the regression analysis, these factors in combination accounted for 34% of the variance in posttest scores. Chemistry prior knowledge was found to be the strongest predictor of posttest scores, explaining 24% of the variance, followed by mathematics skill (an additional 6%) and strategy, which predicted 4% of the variance in posttest scores after prior knowledge and mathematics skill were taken into account. While study strategy was the weakest of the three significant predictors, it should be noted that this is in comparison to rereading, which has been shown to be an effective strategy (e.g., Dunlosky et al., 2002; Rawson & Kintsch, 2005; Wandersee, 1988). The effect may have been greater if compared to a control such as reading once. It is also important to note that strategy remained a significant predictor of posttest score even after the significant effects of prior chemistry knowledge and mathematics skills were statistically controlled.

As described in Chapter 4, for each worked example in the reading material, there were two corresponding posttest problems: one that was worded in the same way as the worked example (lower transfer) and one that was worded differently from the worked example but required the same type of calculations (higher transfer).
It is interesting to note that when the problem solving posttests were divided into two sets (higher transfer questions and lower transfer questions), the mean scores of the students in the elaborative interrogation why-question treatment group were significantly higher than the mean scores of the students in the rereading placebo-control group on the higher transfer problem set. However, the scores for the two groups were not significantly different when the means of the lower transfer problem set were compared. Therefore, results from this study indicate that the elaborative interrogation why-question strategy may have provided the deeper comprehension required for transfer problem solving.

Interviews with six volunteers from the elaborative interrogation why-question treatment group were conducted to collect qualitative information about the study. The volunteers were selected to represent various levels of performance on the problem solving posttest so as to elicit the opinions of students who experienced differing levels of learning success. Overall, as discussed in Chapter 4, the interviewees seemed to view the use of elaborative interrogation why-questions in a positive light and felt that it was useful, regardless of their level of success on the problem solving posttest. Although not a goal of this investigation, this observation may be of note since such affective outcomes may influence the use and utility of a strategy. Also, responses indicated that, for the most part, students felt confident in their ability to solve the problems after reading the instructional materials and answering the why-questions, but felt that they would not have been able to solve the problems before doing so. This may validate, at least for the volunteers, a lack of familiarity with the problem content beforehand, thus supporting the notion that the posttest presented problems to these students, not simply routine exercises.

Comparisons to other studies

The current study provided additional support for previously published effects of elaborative interrogation why-questions as a reading comprehension strategy. This study also extended previous findings by using elaborative interrogation why-questions in a setting differing from most elaborative interrogation studies to date and with a type of reading material not previously reported in elaborative interrogation research. Specifically, this study investigated students who read a passage containing quantitative worked example problems copied from the textbook required for an introductory chemistry class in which they were enrolled.

The overall positive effects of elaborative interrogation why-questions found in this investigation were consistent with several studies that are comparable to the current study in that the subjects read paragraphs or longer passages. For example, Seifert (1993) found that subjects who read paragraphs followed by elaborative interrogation why-questions were more likely to remember facts than students who used an underlining strategy. Similarly, McDaniel and Donnelly (1996) provided students with paragraphs on science topics similar in writing style to science textbooks. They concluded that elaborative interrogation why-questions increased learning of science concepts.
Furthermore, two studies that more closely compare to the current study documented evidence of elaborative interrogation why-question benefits for college students reading longer passages of science text (Ozgungor & Guthrie, 2004; Smith et al., 2010).

The findings of this research are also consistent in many ways with previous studies in that evidence was found for the role of prior knowledge in elaborative interrogation why-questions (e.g., Martin & Pressley, 1991; Smith et al., 2010; Woloshyn et al., 1992; Woloshyn et al., 1994). Smith et al. (2010) found biology prior knowledge and verbal ability to be significant predictors of posttest score. Similarly, the current study found chemistry prior knowledge and mathematics skill to be significant predictors of posttest scores. However, no significant relationship was found between verbal ability and posttest scores. This particular finding is inconsistent with that of Smith et al. (2010). Possible explanations for this discrepancy could be the difference in the content of the reading, the nature of the why-questions or the type of posttest items. As discussed earlier in this chapter, the reading used in the study by Smith et al. (2010) was mostly prose in nature, while the reading in the current study contained prose passages followed by quantitative worked example problems. The elaborative interrogation why-questions in this study all focused on information in the worked examples, and the problem solving posttest evaluated the students' ability to solve problems similar to the worked examples. Perhaps the quantitative nature of the to-be-learned subject matter, why-questions, and posttest problems did not rely as heavily on verbal ability as did the study by Smith and her colleagues (2010). In addition, comparisons with the findings of Smith et al. (2010), who used the same instrument to measure verbal ability, may be influenced by differences in the student populations participating in the studies. In the study reported by Smith and her colleagues (2010), the mean verbal ability score for the students (who were enrolled in a university biology course) was above 20 (M = 21.01 for the elaborative interrogation why-question treatment group; M = 20.08 for the rereading group), while the elaborative interrogation why-question treatment group students in the present study (who were enrolled in a community college introductory chemistry course) had a mean verbal ability score of 18.04. Therefore, the median cutoff score separating high and low verbal ability students probably varied accordingly, making direct comparisons based on this criterion problematic.

Based on the rubric used to assess the quality of students' responses to why-questions, students in the elaborative interrogation why-question treatment group apparently outperformed students in the rereading control group regardless of the quality of their responses to the why-questions. This is consistent with several other research studies (e.g., Martin & Pressley, 1991; Pressley et al., 1988; Seifert, 1993; Woloshyn et al., 1994; Woloshyn et al., 1990), but inconsistent with others (e.g., Smith et al., 2010; Wood et al., 1990, Experiment 2). In this research, clear divisions among students were once again difficult to distinguish. In this case, most of the why-question responses were judged to be adequate-linked (191 out of 333, or 57.4%) or adequate-not-linked (70 out of 333, or 21.0%).
Only 17.7% (59 out of 333) of the responses were inadequate, and 3.9% (13 out of 333) of the why-questions received no response. Since it seemed that most students provided adequate responses to most of the questions, it may be difficult to draw conclusions based on the quality of these responses. Clearly this factor in the elaborative interrogation why-question strategy is complex and may require further investigation to clarify the effects of the quality of why-question responses. The differing nature of the instructional materials and the variety of assessed learning outcomes in the reported elaborative interrogation why-question literature may explain the variation in results in this aspect of the strategy.

Implications

While previous studies have indicated that elaborative interrogation why-questions can be an effective strategy over a range of ages and for a variety of learning goals, from recall of facts in lists of sentences to comprehension of science content in passages equivalent to textbook sections, this study was novel in that it explored the effect of elaborative interrogation why-questions on students learning to solve quantitative chemistry problems by reading text that included worked examples. The key finding of this research, that elaborative interrogation why-questions appear to improve students' ability to solve quantitative chemistry problems similar to the worked example problems, may have implications for educational practice and may further broaden the scope of application of the elaborative interrogation why-question strategy beyond the reading of sentences and longer prose passages. Furthermore, the current study provides additional rationale for educators and textbook publishers to include elaborative interrogation why-questions to improve student learning from reading. However, since it appears important to carefully construct the elaborative interrogation why-questions in order to encourage activation of relevant prior knowledge and thereby create important linkages between the students' prior knowledge and the to-be-learned information, it is not recommended that this strategy be practiced arbitrarily. Perhaps this research will encourage community colleges to provide professional development opportunities to instructors on the value and use of such questioning techniques. This professional development might also include training in writing effective why-questions, enabling instructors to provide their students with the elaborative interrogation reading comprehension strategy for any assigned reading. In addition, the results of this research may influence textbook publishers to consider including elaborative interrogation why-questions designed to activate students' prior knowledge while reading in their textbooks, supplements or web-based resources, thereby providing students with access to this valuable reading comprehension strategy at little cost to the publisher, as suggested by Smith et al. (2010).

Future research

The use of elaborative interrogation why-questions in the development of problem solving skills has a great deal of future research potential. The current research found positive effects from the use of elaborative interrogation why-questions with students learning to solve various molar mass problems, and while this skill is critical to success in most chemistry courses, there are many other types of chemistry problem solving methods for which students may benefit from the elaborative strategy, not all of which are quantitative in nature.
For example, in organic chemistry, students must learn to propose reasonable reaction pathways leading from reactants to products. Descriptions of these mechanisms are typically given in the text, followed by examples illustrating the steps in the reaction pathway. Perhaps research methods similar to those in the current study could be implemented to explore the effects of elaborative interrogation why-questions for these types of problems as well.

As mentioned previously, evaluations of the quality of student responses to why-questions have produced some conflicting results among reported elaborative interrogation studies. Future research into this aspect of elaborative interrogation why-questions may be of value and interest.

One finding of the current investigation was the significant difference in the ability of the elaborative interrogation why-question treatment group to solve transfer problems. The results indicate that elaborative interrogation why-questions may improve the ability to solve the typically more challenging transfer problems. However, the degree of transfer was somewhat limited in this study. Future work could explore this further by assessing various levels of transfer problems, perhaps leading to richer data regarding this important skill.

Limitations

The intention of this study was to focus on the ability of the student volunteers to solve quantitative chemistry problems similar to those found in the instructional reading. While it can be stated that this was a diverse group of students, no demographic information was collected; therefore, no analysis based on such criteria was presented. This may be considered a limitation of this study, especially since age has been shown by some previous studies to have an influence on the effectiveness of elaborative interrogation why-questions. Collection of these data might have provided some interesting insight into this aspect of elaborative interrogation why-question research. Additionally, the instruments used to assess prior knowledge and skills may have had limitations. For example, the mathematics skills test, as discussed previously, provided somewhat conflicting results that could perhaps be remedied in future investigations. Furthermore, prior knowledge instruments used in elaborative interrogation why-question research are not usually standardized. Often, as was the case with some of the instruments in this study, the instruments were created by the researcher, making comparisons between studies problematic. However, even when identical instruments were used, such as the verbal ability test used in the current study and in the research of Smith et al. (2010), variations among populations of students may also create complications. While it may be difficult to standardize instruments for prior knowledge over the diversity of domains and grade levels of the various elaborative interrogation why-question studies, the use of comparable instruments may prove to be valuable in future research.

Summary

The purpose of this research was to extend research in elaborative interrogation by examining the effectiveness of using elaborative interrogation why-question enhanced worked examples as a strategy for improving problem solving skills in chemistry. This study builds on previous elaborative interrogation why-question research by investigating the effects of this strategy with community college students reading from their course textbook about molar mass and related problem solving.
While some results were unexpected, the main hypothesis, that students who utilized the elaborative interrogation why-question strategy while reading a textbook passage describing molar mass and calculations using molar mass would outperform students who used an alternative reading comprehension strategy (rereading) on a problem solving posttest, was supported in statistical analyses. This important finding may expand the domains in which elaborative interrogation why-questions strategy may be implemented as well as the research potential in the area of elaborative interrogation why-questions as a reading comprehension strategy. 95 Appendix A Background Chemistry Knowledge, Mathematics Skills, and Verbal Ability Tests 96 Chemistry Background Knowledge Test 1. What is the atomic mass of zinc (Zn)? a. 30 b. 65.39 c. 91.22 d. 40 2. What is the atomic mass of carbon (C)? a. 6 b. 12.0 c. 40.1 d. 20 3. How many atoms of nitrogen (N) are represented in the formula Mg(NO3)2? a. 1 b. 2 c. 3 d. 6 4. How many atoms of oxygen (O) are represented in the formula Mg(NO3)2? a. 1 b. 2 c. 3 d. 6 5. How many carbon atoms are in one mole of carbon? a. 1 b. 6 c. 12.0 d. 6.02 x 1023 6. How many moles of oxygen atoms are present in 2.0 moles of CaCO3? a. 2 b. 3 c. 6 d. 1.204 x 1024 97 Mathematics Skills Test 1. Given the equality a/b = c/d, which of the following equations could be used to correctly solve for a? a) a = bc/d b) a = c/bd c) a = b + (c/d) d) a = (c/d) - b 2. If there are exactly four cups in one quart, how many cups are in 2.5 quarts? a) 10 b) 1.5 c) 1.6 d) 7.25 3. How long would it take to travel 200.0 miles if you are driving at a constant speed of 55.0 miles/hour? a) 11700 hours b) 3.64 hours c) 145 hours d) 0.275 hours 4. How many centimeters are in 12 inches (1 inch = 2.54 cm)? a) 4.72 b) 0.21 c) 7.46 d) 30.48 98 Verbal Ability Test Circle the best definition or synonym for each word below. 1. cottontail 7. evoke 13. placate 19. curtailment a. squirrel a. wake up a. rehabilitate a.expenditure b. poplar b. surrender b. plagiarize b. abandonment c. boa c. reconnoiter c. depredate c. abridgment d. marshy plant d. transcend d. apprise d. improvement e. rabbit e. call forth e. conciliate e. forgery 2. marketable 8. unobtrusive 14. surcease 20. perversity a. partisan a. unintelligent a. enlightenment a. adversity b. jocular b. epileptic b. cessation b. perviousness c. marriageable c. illogical c. inattention c. travesty d. salable d. lineal d. censor d. waywardness e. essential e. modest e. substitution e. gentility 3. boggy 9. terrain 15. apathetic 21. calumnious a. afraid a. ice cream a. wandering a. complimentary b. false b. final test b. impassive b. analogous c. marshy c. tractor c. prophetic c. slanderous d. dense d. area of ground d. hateful d. tempestuous e. black e. weight e. overflowing e. magnanimous 4. gruesomeness 10. capriciousness 16. paternoster 22. illiberality a. blackness a. stubbornness a. paternalism a. bigotry b. falseness b. courage b. patricide b. imbecility c. vindictiveness c. whimsicality c. malediction c. illegibility d. drunkenness d. amazement d. benediction d. cautery e. ghastliness e. greediness e. prayer e. immaturity 5. loathing 11. maelstrom 17. opalescence 23. clabber a. diffidence a. slander a. opulence a. rejoice b. laziness b. whirlpool b. senescence b. gossip c. abhorrence c. enmity c. bankruptcy c. curdle d. cleverness d. armor d. iridescence d. crow e. comfort e. majolica e. assiduity e. hobble 6. bantam 12. tentative 18. lush 24. sedulousness a. fowl a. 
critical a. stupid a. diligence b. ridicule b. conclusive b. luxurious b. credulousness c. cripple c. authentic c. hazy c. seduction d. vegetable d. provisional d. putrid d. perilousness e. ensign e. apprehensive e. languishing e. frankness 99 Circle the best definition or synonym for each word below. 25. shortcake 31. demoniacal 37. corroboratory 43. aggrandizement a. condiment a. aloof a. plausible a. theft b. pastry b. mythical b. anticipatory b. impeachment c. fruit c. thoughtful c. confirmatory c. derision d. sweetmeat d. fiendish d. explanatory d. amazement e. vegetable e. eccentric e. esoteric e. enlargement 26. hardtack 32. highroad 38. figurine 44. effulgence a. nail a. mountain road a. metaphor a. prominence b. textile b. right of way b. wine b. outline c. weapon c. main road c. poem c. change d. wood d. roadbed d. organ d. radiance e. biscuit e. concrete road e. statuette e. energy 27. commendable 33. befog 39. rancorous 45. aphasia a. pleasurable a. dampen a. malignant a. loss of speech b. charitable b. forget b. jubilant b. drunkenness c. lucrative c. whip c. abashed c. anemia d. proscriptive d. mystify d. inglorious d. loss of memory e. laudable e. belittle e. careless e. rash 28. nonchalant 34. platoon 40. inveteracy 46. panoplied a. sarcastic a. tableland a. habitualness a. philosophic b. discourteous b. bridge of boats b. migration b. armored c. noble c. body of soldiers c. bravery c. panting d. unconcerned d. remark d. covering d. frenzied e. unsophisticated e. frigate e. hatefulness e. atavistic 29. coloration 35. dullard 41. choler 47. sacrosanct a. pigmentation a. peon a. anger a. sacrificial b. alteration b. duck b. chorister b. dormant c. configuration c. braggart c. guard c. inviolable d. prevention d. thief d. saliva d. superficial e. taint e. dunce e. refrigerator e. gullible 30. aridity 31. momentously 42. vacillation 48. prurience a. bitterness a. frivolously a. purification a. modesty b. surface b. moderately b. wavering b. sapience c. sonority c. weightily c. expulsion c. provender d. dryness d. momentarily d. tempting d. lust e. torridity e. modishly e. foolishness e. security 100 Appendix B Experimental groups? reading material With the exception of instructions and questions, reprinted with permission from Person Education, Inc., from: Timberlake, K. C. (2009). Chemistry: An introduction to general, organic, and biological chemistry (10th ed.), pp. 166-170. Upper Saddle River, 101 Instructions: In preparation for a quiz, read the following passage and answer the questions **AS YOU READ** the passage. Pay particular attention to the worked examples. Molar Mass A single atom or molecule is much too small to weigh, even on the most sensitive balance. In fact, it takes a huge number of atoms or molecules to make enough of substance that you can see. An amount of water that contains Avogadro's number of water molecules is only a few sips of water. In the laboratory, we use the balance to weigh out Avogadro's number of particles or 1 mole of a substance. For any element, the quantity called its molar mass is the number of grams equal to the atomic mass of that element. We are counting out 6.02 x 1023 atoms of an element when we weigh out the number of grams equal to its molar mass. For example, if we need one mole of carbon (C) atoms, we would first find the atomic mass of 12.01 on the periodic table. Then to obtain 1 mole of carbon atoms, we would weigh out 12.01 g of carbon. 
Thus the molar mass of carbon is found by looking at the atomic mass on the periodic table. More examples from the periodic table:
1 mole of silver atoms has a mass of 107.9 g
1 mole of carbon atoms has a mass of 12.01 g
1 mole of sulfur atoms has a mass of 32.06 g

Molar Mass of a Compound
To determine the molar mass of a compound, multiply the molar mass of each element by its subscript in the formula, and add the results. For example, the molar mass of sulfur trioxide, SO3, is obtained by adding the molar masses of 1 mole of sulfur and 3 moles of oxygen. In this text, we round molar mass to the tenths (0.1 g) place for calculations.
Step 1 Using the periodic table, obtain the molar masses of sulfur and oxygen.
Step 2 Obtain the grams of each element in the formula.
Grams from 1 mole of S: 1 mole S x (32.1 g S / 1 mole S) = 32.1 g S
Grams from 3 moles of O: 3 moles O x (16.0 g O / 1 mole O) = 48.0 g O
Step 3 Obtain the molar mass of SO3 by adding the masses of 1 mole of S and 3 moles of O.
1 mole S = 32.1 g of S
3 moles O = 48.0 g of O
Molar Mass of SO3 = 80.1 g of SO3

Worked Example 1 (Sample Problem 5.3): Calculating Molar Mass of Compounds
Find the molar mass of Li2CO3 used to produce red color in fireworks.
Solution:
Step 1 Using the periodic table, obtain the molar masses of lithium, carbon, and oxygen.
Step 2 Obtain the mass of each element in the formula by multiplying each molar mass by its number of moles (subscript) in the formula.
Grams from 2 moles of Li: 2 moles Li x (6.9 g Li / 1 mole Li) = 13.8 g Li
Grams from 1 mole of C: 1 mole C x (12.0 g C / 1 mole C) = 12.0 g C
Grams from 3 moles of O: 3 moles O x (16.0 g O / 1 mole O) = 48.0 g O
Step 3 Obtain the molar mass of Li2CO3 by adding the masses of 2 moles of Li, 1 mole of C, and 3 moles of O.
13.8 g of Li + 12.0 g of C + 48.0 g of O = Molar Mass of Li2CO3 = 73.8 g

Questions:
1) One mole of Li2CO3 contains two moles of Li, one mole of C and three moles of O. Why is this true?
2) In Worked Example 1, to find the mass of oxygen (O) in one mole of Li2CO3 molecules, the atomic mass of oxygen must be multiplied by 3. Why is this true?
3) If you were asked to find the mass of oxygen in one mole of Mg(NO3)2 molecules, the atomic mass of oxygen must be multiplied by 6. Why is this true?
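The arithmetic pattern in Worked Example 1, multiply each element's molar mass by its subscript and then add the products, can be summarized in a short code sketch. The sketch below is illustrative only and is not part of the study instrument; the `MOLAR_MASS` dictionary and the `compound_molar_mass` helper are assumed names, and the values are the rounded molar masses used in the passage.

```python
# Illustrative sketch (not part of the study instrument): molar mass of a compound,
# computed as in Worked Example 1 by multiplying each element's molar mass by its
# subscript in the formula and adding the results. Rounded values from the passage.

MOLAR_MASS = {"Li": 6.9, "C": 12.0, "O": 16.0, "S": 32.1}  # g/mole, rounded to 0.1 g

def compound_molar_mass(subscripts):
    """subscripts maps each element symbol to its subscript in the formula."""
    return sum(MOLAR_MASS[element] * count for element, count in subscripts.items())

# Worked Example 1: Li2CO3 -> 2(6.9) + 1(12.0) + 3(16.0) = 73.8 g/mole
print(round(compound_molar_mass({"Li": 2, "C": 1, "O": 3}), 1))  # 73.8
# SO3 example from the text -> 32.1 + 3(16.0) = 80.1 g/mole
print(round(compound_molar_mass({"S": 1, "O": 3}), 1))           # 80.1
```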
Calculations Using Molar Mass
The molar mass of an element or compound is one of the most useful conversion factors in chemistry. Molar mass is used to change from moles of a substance to grams, or from grams to moles. To do these calculations, we use the molar mass as a conversion factor. For example, 1 mole of magnesium has a mass of 24.3 g. To express molar mass as an equality we can write
1 mole Mg = 24.3 g Mg
From this equality, two conversion factors can be written:
(24.3 g Mg / 1 mole Mg) and (1 mole Mg / 24.3 g Mg)
Conversion factors are written for compounds in the same way. For example, the molar mass of the compound H2O is 18.0 g.
1 mole H2O = 18.0 g
The conversion factors from the molar mass of H2O are written as
(18.0 g H2O / 1 mole H2O) and (1 mole H2O / 18.0 g H2O)
We can now change from moles to grams, or grams to moles, using the conversion factors derived from the molar mass. (Remember, you must determine the molar mass of the substance first.)

Worked Example 2 (Sample Problem 5.4): Converting Moles of an Element to Grams
Silver metal is used in the manufacture of tableware, mirrors, jewelry, and dental alloys. If the design for a piece of jewelry requires 0.750 mole of silver, how many grams of silver are needed?
Solution:
Step 1 Given: 0.750 mole Ag  Needed: grams of Ag
Step 2 Plan: moles of Ag -> molar mass conversion factor -> grams of Ag
Step 3 Equalities/Conversion Factors:
1 mole Ag = 107.9 g Ag
(107.9 g Ag / 1 mole Ag) and (1 mole Ag / 107.9 g Ag)
Step 4 Set Up Problem: Calculate the grams of silver using the molar mass.
0.750 mole Ag x (107.9 g Ag / 1 mole Ag) = 80.9 g Ag

Questions:
1) Step 3 of Worked Example 2 states that 1 mole of Ag = 107.9 g Ag. Why is this true?
2) The conversion factor for silver (Ag) can be written as (107.9 g Ag / 1 mole Ag) or (1 mole Ag / 107.9 g Ag). Why is this true?
3) In Step 4 of Worked Example 2, the correct form of the conversion factor to use is (107.9 g Ag / 1 mole Ag). Why is this true?

Worked Example 3 (Sample Problem 5.5): Converting Mass of a Compound to Moles
A box of salt contains 737 g NaCl. How many moles of NaCl are present in the box?
Solution:
Step 1 Given: 737 g NaCl  Needed: moles of NaCl
Step 2 Plan: grams of NaCl -> molar mass conversion factor -> moles of NaCl
Step 3 Equalities/Conversion Factors: The molar mass of NaCl is the sum of the masses of one mole of Na+ and one mole of Cl-:
(1 x 23.0 g/mole) + (1 x 35.5 g/mole) = 58.5 g/mole
1 mole NaCl = 58.5 g NaCl
(58.5 g NaCl / 1 mole NaCl) and (1 mole NaCl / 58.5 g NaCl)
Step 4 Set Up Problem: We calculate the moles of NaCl using the molar mass.
737 g NaCl x (1 mole NaCl / 58.5 g NaCl) = 12.6 moles NaCl

Questions:
1) To find the molar mass of NaCl, the atomic masses of Na and Cl are each multiplied by 1 and then added together. Why is this true?
2) In Step 4 of Worked Example 3, the final answer is correctly expressed in units of moles NaCl. Why is this true?
3) In Step 4 of Worked Example 3, the correct form of the conversion factor to use is (1 mole NaCl / 58.5 g NaCl). Why is this true?

We can summarize the calculations to show the connections between the moles of a compound, its mass in grams, number of molecules (or formula units if ionic), and the moles and atoms of each element in that compound in the following flowchart:
Mass <-> Moles <-> Particles
Grams of element <-> [molar mass (g/mole)] <-> Moles of elements <-> [Avogadro's number] <-> Atoms (or ions)
Grams of compound <-> [molar mass (g/mole)] <-> Moles of compound <-> [Avogadro's number] <-> Molecules (or formula units)
(Formula subscripts connect moles of compound to moles of each element.)

Appendix C
Placebo-control groups' reading material
With the exception of instructions, reprinted with permission from Pearson Education, Inc., from: Timberlake, K. C. (2009). Chemistry: An introduction to general, organic, and biological chemistry (10th ed.), pp. 166-170. Upper Saddle River, NJ.

Instructions: In preparation for a quiz, read the following passage. Pay particular attention to the worked examples. Then **READ THE PASSAGE AGAIN**.

Molar Mass
A single atom or molecule is much too small to weigh, even on the most sensitive balance. In fact, it takes a huge number of atoms or molecules to make enough of a substance that you can see. An amount of water that contains Avogadro's number of water molecules is only a few sips of water. In the laboratory, we use the balance to weigh out Avogadro's number of particles or 1 mole of a substance. For any element, the quantity called its molar mass is the number of grams equal to the atomic mass of that element. We are counting out 6.02 x 10^23 atoms of an element when we weigh out the number of grams equal to its molar mass. For example, if we need one mole of carbon (C) atoms, we would first find the atomic mass of carbon, 12.01, on the periodic table. Then to obtain 1 mole of carbon atoms, we would weigh out 12.01 g of carbon. Thus the molar mass of carbon is found by looking at the atomic mass on the periodic table. More examples from the periodic table:
1 mole of silver atoms has a mass of 107.9 g
1 mole of carbon atoms has a mass of 12.01 g
1 mole of sulfur atoms has a mass of 32.06 g
Molar Mass of a Compound To determine the molar mass of the compound, multiply the molar mass of each element by its subscript in the formula, and add the results. For example, the molar mass of sulfur trioxide, S03, is obtained by adding the molar masses of 1 mole of sulfur and 3 moles of oxygen. In this text, we round molar mass to the tenths (0.1 g) place for calculations. Step 1 Using the periodic table, obtain the molar masses of sulfur and oxygen. Step 2 Grams from 1 mole of S 1 mole S x = 32.1 g S Grams from 3 moles of O 3 moles O x = 48.0 g O Step 3 Obtain the molar mass of SO3 by adding the masses of 1 mole of S and 3 moles of O 1 mole S = 32.1 g of S 3 moles O = 48.0 g of O Molar Mass of SO3 = 80.1 g of SO3 112 Worked Example 1 (Sample Problem 5.3): Calculating Molar Mass of Compounds: Find the molar mass of Li2CO3 used to produce red color in fireworks. Solution: Step 1 Using the periodic table, obtain the molar masses of lithium, carbon and oxygen Step 2 Obtain the mass of each element in the formula by multiplying each molar mass by its number of moles (subscript) in the formula. Grams from 2 moles of Li 2 mole Li x = 13.8 g Li Grams from 1 mole of C Grams from 3 moles of O 3 moles O x = 48.0 g O Step 3 Obtain the molar mass of Li2CO3 by adding the masses of 2 moles of Li, 1 mole of C, and 3 moles of O. 13.8 g of Li + 12.0 g of C + 48.0 g of O = Molar Mass of Li2CO3 = 73.8 g ? Calculations Using Molar Mass The molar mass of an element or compound is one of the most useful conversion factors in chemistry. Molar mass is used to change from moles of a substance to grams, or from grams to moles. To do these calculations, we use the molar mass as a conversion factor. For example, 1 mole of magnesium has a mass of 24.3 g. To express molar mass as an equality we can write 1 mole Mg = 24.3 g Mg 1 mole C x = 12.0 g C 113 From this equality, two conversion factors can be written. and Conversion factors are written for compounds in the same way. For example, the molar mass of the compound H2O is 18.0 g. 1 mole H2O = 18.0 g The conversion factors from the molar mass of H2O are written as and We can now change from moles to grams, or grams to moles, using the conversion factors derived from the molar mass. (Remember, you must determine the molar mass of the substance first.) 114 Worked example 2 (Sample Problem 5.4): Converting Moles of an Element to Grams: Silver metal is used in the manufacture of tableware, mirrors, jewelry, and dental alloys. If the design for a piece of jewelry requires a 0.750 mole silver, how many grams of silver are needed? Solution: Step 1 Given: 0.750 mole Ag Needed: grams of Ag Step 2 Plan: moles of Ag molar mass conversion factor barb2right grams of Ag Step 3 Equalities/Conversion Factors: 1 mole Ag = 107.9 g Ag and Step 4 Set Up Problem: Calculate the grams of silver using the molar mass. 115 Worked example 3 (Sample Problem 5.5): Converting Mass of a Compound to Moles A box of salt contains 737 g NaCl. How many moles of NaCl are present in the box? Solution: Step 1 Given: 737 g NaCl Needed: moles of NaCl Step 2 Plan: grams of NaCl molar mass conversion factor barb2right moles of NaCl Step 3 Equalities/Conversion Factors The molar mass of NaCl is the sum of the masses of one mole of Na+ and one mole Cl- : (1 x 23.0 g/mole) + (1 x 35.5 g/mole) = 58.5 g/mole 1 mole NaCl = 58.5 g NaCl and Step 4 Set Up Problem: We calculate the grams of NaCl using the molar mass. 
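The two worked examples above use the molar mass in opposite directions: multiplying by grams per mole to go from moles to grams, and dividing by grams per mole to go from grams to moles. The short sketch below restates that logic in code; it is illustrative only and not part of the placebo-control materials, and the function names and the rounded molar masses (107.9 g/mole for Ag, 58.5 g/mole for NaCl) are assumptions taken from the passage.

```python
# Illustrative sketch (not part of the study materials): molar mass as a conversion
# factor, as in Worked Examples 2 and 3. Rounded molar masses from the passage.

def moles_to_grams(moles, molar_mass_g_per_mole):
    # grams = moles x (molar mass in g / 1 mole)
    return moles * molar_mass_g_per_mole

def grams_to_moles(grams, molar_mass_g_per_mole):
    # moles = grams x (1 mole / molar mass in g)
    return grams / molar_mass_g_per_mole

# Worked Example 2: 0.750 mole Ag x (107.9 g Ag / 1 mole Ag)
print(round(moles_to_grams(0.750, 107.9), 1))  # 80.9 (g Ag)

# Worked Example 3: 737 g NaCl x (1 mole NaCl / 58.5 g NaCl)
print(round(grams_to_moles(737, 58.5), 1))     # 12.6 (moles NaCl)
```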
116 We can summarize the calculations to show the connections between the moles of a compound, its mass in grams, number of molecules (or formula units if ionic), and the moles and atoms of each element in that compound in the following flowchart. Mass Moles Particles Grams of element Molar mass (g/mole) Moles of elements Avogadro?s Number Atoms (or ions) Formula Subscripts Grams of compound Molar mass (g/mole) Moles of compound Avogadro?s Number Molecules (or formula units) Now return to page 1 and read the passage again. 117 Appendix D Assignment for students who choose not to participate in the research With the exception of instructions, reprinted with permission from Person Education, Inc., from: Timberlake, K. C. (2009). Chemistry: An introduction to general, organic, and biological chemistry (10th ed.), pp. 166-170. Upper Saddle River, NJ. 118 Instructions: Read the following passage and answer the attached questions. Molar Mass A single atom or molecule is much too small to weigh, even on the most sensitive balance. In fact, it takes a huge number of atoms or molecules to make enough of substance that you can see. An amount of water that contains Avogadro's number of water molecules is only a few sips of water. In the laboratory, we use the balance to weigh out Avogadro's number of particles or 1 mole of a substance. For any element, the quantity called its molar mass is the number of grams equal to the atomic mass of that element. We are counting out 6.02 x 1023 atoms of an element when we weigh out the number of grams equal to its molar mass. For example, if we need one mole of carbon (C) atoms, we would first find the atomic mass of 12.01 on the periodic table. Then to obtain 1 mole of carbon atoms, we would weigh out 12.01 g of carbon. Thus the molar mass of carbon is found by looking at the atomic mass on the periodic table. More examples from the periodic table: 1 mole of silver atoms has a mass of 107.9 g 1 mole of carbon atoms has a mass of 12.01 g 1 mole of sulfur atoms has a mass of 32.06 g 119 ? Molar Mass of a Compound To determine the molar mass of the compound, multiply the molar mass of each element by its subscript in the formula, and add the results. For example, the molar mass of sulfur trioxide, S03, is obtained by adding the molar masses of 1 mole of sulfur and 3 moles of oxygen. In this text, we round molar mass to the tenths (0.1 g) place for calculations. Step 1 Using the periodic table, obtain the molar masses of sulfur and oxygen. Step 2 Grams from 1 mole of S 1 mole S x = 32.1 g S Grams from 3 moles of O 3 moles O x = 48.0 g O Step 3 Obtain the molar mass of SO3 by adding the masses of 1 mole of S and 3 moles of O 1 mole S = 32.1 g of S 3 moles O = 48.0 g of O Molar Mass of SO3 = 80.1 g of SO3 120 Worked Example 1 (Sample Problem 5.3): Calculating Molar Mass of Compounds: Find the molar mass of Li2CO3 used to produce red color in fireworks. Solution: Step 1 Using the periodic table, obtain the molar masses of lithium, carbon and oxygen Step 2 Obtain the mass of each element in the formula by multiplying each molar mass by its number of moles (subscript) in the formula. Grams from 2 moles of Li 2 mole Li x = 13.8 g Li Grams from 1 mole of C Grams from 3 moles of O 3 moles O x = 48.0 g O Step 3 Obtain the molar mass of Li2CO3 by adding the masses of 2 moles of Li, 1 mole of C, and 3 moles of O. 13.8 g of Li + 12.0 g of C + 48.0 g of O = Molar Mass of Li2CO3 = 73.8 g ? 
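Several formulas in the background test and in the practice questions that follow, such as Mg(NO3)2, Mg(OH)2, and Al2(SO4)3, contain parenthesized groups, so the subscript outside the parentheses multiplies every element inside. The sketch below illustrates that counting step before the molar masses are summed; it is not part of the assignment itself, and the rounded atomic-mass values and helper names are assumptions chosen for illustration.

```python
# Illustrative sketch (not part of the assignment): counting atoms in formulas that
# contain parentheses, such as Mg(NO3)2 or Al2(SO4)3, before summing molar masses.
# Atomic masses below are rounded values assumed for illustration.
import re
from collections import Counter

ATOMIC_MASS = {"H": 1.0, "N": 14.0, "O": 16.0, "Mg": 24.3, "Al": 27.0, "S": 32.1}  # g/mole

def atom_counts(formula):
    """Count atoms of each element, expanding one level of parentheses."""
    counts = Counter()
    # parenthesized groups first, e.g. (NO3)2 contributes 2 N and 6 O
    for group, mult in re.findall(r"\(([^)]+)\)(\d*)", formula):
        factor = int(mult) if mult else 1
        for element, n in re.findall(r"([A-Z][a-z]?)(\d*)", group):
            counts[element] += (int(n) if n else 1) * factor
    # then everything outside the parentheses, e.g. the Mg in Mg(NO3)2
    outside = re.sub(r"\([^)]*\)\d*", "", formula)
    for element, n in re.findall(r"([A-Z][a-z]?)(\d*)", outside):
        counts[element] += int(n) if n else 1
    return counts

def molar_mass(formula):
    return sum(ATOMIC_MASS[element] * n for element, n in atom_counts(formula).items())

print(dict(atom_counts("Mg(NO3)2")))      # {'N': 2, 'O': 6, 'Mg': 1}
print(round(molar_mass("Mg(OH)2"), 1))    # 24.3 + 2(16.0) + 2(1.0) = 58.3
print(round(molar_mass("Al2(SO4)3"), 1))  # 2(27.0) + 3(32.1) + 12(16.0) = 342.3
```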
Calculations Using Molar Mass The molar mass of an element or compound is one of the most useful conversion factors in chemistry. Molar mass is used to change from moles of a substance to grams, or from grams to moles. To do these calculations, we use the molar mass as a conversion factor. For example, 1 mole of magnesium has a mass of 24.3 g. To express molar mass as an equality we can write 1 mole C x = 12.0 g C 121 1 mole Mg = 24.3 g Mg From this equality, two conversion factors can be written. and Conversion factors are written for compounds in the same way. For example, the molar mass of the compound H2O is 18.0 g. 1 mole H2O = 18.0 g The conversion factors from the molar mass of H2O are written as and We can now change from moles to grams, or grams to moles, using the conversion factors derived from the molar mass. (Remember, you must determine the molar mass of the substance first.) 122 Worked example 2 (Sample Problem 5.4): Converting Moles of an Element to Grams: Silver metal is used in the manufacture of tableware, mirrors, jewelry, and dental alloys. If the design for a piece of jewelry requires a 0.750 mole silver, how many grams of silver are needed? Solution: Step 1 Given: 0.750 mole Ag Needed: grams of Ag Step 2 Plan: moles of Ag molar mass conversion factor barb2right grams of Ag Step 3 Equalities/Conversion Factors: 1 mole Ag = 107.9 g Ag and Step 4 Set Up Problem: Calculate the grams of silver using the molar mass. Worked example 3 (Sample Problem 5.5): Converting Mass of a Compound to Moles A box of salt contains 737 g NaCl. How many moles of NaCl are present in the box? Solution: Step 1 Given: 737 g NaCl Needed: moles of NaCl Step 2 Plan: grams of NaCl molar mass conversion factor barb2right moles of NaCl Step 3 Equalities/Conversion Factors The molar mass of NaCl is the sum of the masses of one mole of Na+ and one mole Cl- : 123 (1 x 23.0 g/mole) + (1 x 35.5 g/mole) = 58.5 g/mole 1 mole NaCl = 58.5 g NaCl and Step 4 Set Up Problem: We calculate the grams of NaCl using the molar mass. We can summarize the calculations to show the connections between the moles of a compound, its mass in grams, number of molecules (or formula units if ionic), and the moles and atoms of each element in that compound in the following flowchart. Mass Moles Particles Grams of element Molar mass (g/mole) Moles of elements Avogadro?s Number Atoms (or ions) Formula Subscripts Grams of compound Molar mass (g/mole) Moles of compound Avogadro?s Number Molecules (or formula units) 124 After reading, answer the following questions. 1. Calculate the molar mass for each of the following compounds. a. NaCl b. Fe2O3 c. Li2CO3 d. Al2(SO4)3 e. Mg(OH)2 2. Calculate the number of grams in each of the following: a. 2.00 moles Na b. 2.80 moles Ca c. 0.125 mole Sn 3. Calculate the number of grams in each of the following: a. 0.500 mole NaCl b. 1.75 moles Na2O c. 0.225 moles H2O 125 4. The compound MgSO4 is called Epsom salts. How many grams will you need to prepare a bath containing 5.00 moles of Epsom salts? 5. Cyclopropane, C3H6, is an anesthetic given by inhalation. How many grams are in 0.25 mole of cyclopropane? 6. How many moles are contained in each of the following? a. 50.0 g Ag b. 0.200 g C c. 15.0 g NH3 d. 75.0 g SO2 7. A can of Drano contains 480 g of NaOH. How many moles of NaOH are in the can of Drano? 126 8. A gold nugget weighs 35.0 g. How many moles of gold are in the nugget? 9. How many moles of S are in each of the following quantities? a. 25 g S b. 125 g SO2 c. 2.0 moles Al2S3 10. 
How many moles of C are in each of the following quantities? a. 75 g C b. 0.25 mole C2H2 c. 88 g CO2 127 Appendix E Problem solving posttest used with both experimental and placebo control groups 128 Molar Mass Posttest 1. Find the molar mass of Na2SO4 used to produce color in fireworks. Show your work. 2. Silver metal (Ag) is used in the manufacture of tableware, mirrors, jewelry, and dental alloys. If the design for a piece of jewelry requires a 1.75 mole silver, how many grams of silver are needed? Show your work. 3. A box of salt contains 325 g NaCl. How many moles of NaCl are present in the box? Show your work. 129 4. Find the mass in grams of 1.25 moles of iron (Fe). Show your work. 5. Calculate the molar mass of Ba3(PO4)2. Show your work. 6. How many moles of silver nitrate (AgNO3) are present in 225 grams of silver nitrate? Show your work. 130 Appendix F Interview questions used with volunteers from the experimental group 131 Interview Questions: 1. Did the statements followed by the question ?Why is this true?? sometimes help you better understand the material? 2. When studying Worked Example 1, did the statement ?One mole of Li2CO3 contains two moles of Li, one mole of C and three moles of O.? followed by the question ?Why is this true?? help you better understand how you would calculate the molar mass of Li2CO3? 3. When studying Worked Example 1, did the statement ?One mole of Li2CO3 contains two moles of Li, one mole of C and three moles of O.? followed by the question ?Why is this true?? help you answer posttest question 1? 4. When studying Worked Example 2, did the statement ?The conversion factor for silver (Ag) can be written as or ? followed by the question ?Why is this true?? help you better understand how molar mass can be used as a conversion factor? 5. When studying Worked Example 2, did the statement ?The conversion factor for silver (Ag) can be written as or ? followed by the question ?Why is this true?? help you answer posttest question 2? 6. When studying Worked Example 3, did the statement ?In step 4 of worked example 3, the final answer is correctly expressed in units of moles NaCl.? Followed by the question ?Why is this true?? help you understand how the conversion factor was used? 7. When studying Worked Example 3, did the statement ?In step 4 of worked example 3, the final answer is correctly expressed in units of moles NaCl.? Followed by the question ?Why is this true?? help you answer posttest question 3? 132 8. When studying Worked Example 3, did the statement ?In step 4 of worked example 3, the correct form of the conversion factor to use is .? Followed by the question ?Why is this true?? help you understand how to correctly solve for moles of NaCl in Worked Example 3? 9. When studying Worked Example 3, did the statement ?In step 4 of worked example 3, the correct form of the conversion factor to use is .? Followed by the question ?Why is this true?? help you answer posttest question 6? 10. If your textbook included this type of study strategy (statements followed by the question ?why is this true?? do you think you would use it while reading? 133 Appendix G Consent Form 134 Page 1 of 2 Initials _______ Date ______ CONSENT FORM Project Title Using elaborative interrogation enhanced worked-examples to improve chemistry problem solving Why is this research being done? This is a research project being conducted by Dr. William Holliday at the University of Maryland, College Park and Rebecca Pease at Central New Mexico Community College. 
We are inviting you to participate in this research project because you are a student in an introductory chemistry course. The purpose of this research project is to determine the effectiveness answering ?why? questions while studying worked- examples as a means of improving chemistry problem solving. What will I be asked to do? The procedure involves studying worked-examples from a chemistry text or studying the worked-examples while answering ?why? questions. After the studying or studying with questions sessions, the students will be asked to solve problems similar to the ones in the worked-examples. This study will take place at Central New Mexico Community College, Albuquerque, New Mexico. The study will span one semester, with each student asked to participate in two sessions of the study as well as the comprehension test. There will also be a test of prior knowledge, verbal ability, and a basic math skills test administered prior to the start of the study. Each study session will take place during a regular class period. What about confidentiality? We will do our best to keep your personal information confidential. To help protect your confidentiality, the written work will be stored in a secure location off-campus. If we write a report or article about this research project, your identity will be protected to the maximum extent possible. Your information may be shared with representatives of the University of Maryland, College Park or governmental authorities if you or someone else is in danger or if we are required to do so by law. In accordance with legal requirements and/or professional standards, we will disclose to the appropriate individuals and/or authorities information that comes to our attention concerning child abuse or neglect or potential harm to you or others. What are the risks of this research? While we do not foresee any risk to you, potential risks such as anxiety or confusion will be monitored and minimized. We encourage you to ask questions throughout the duration of the study and you may withdraw from the study at any time without penalty. What are the benefits of this research? This research is not designed to help you personally, but the results may help the investigator learn more about how students learn chemistry. We hope that, in the future, other people might benefit from this study through improved understanding of learning and problem solving strategies used at the undergraduate level. 135 Page 2 of 2 Initials _______ Date ______ Project Title Using elaborative interrogation enhanced worked-examples to improve chemistry problem solving Do I have to be in this research? May I stop participating at any time? Your participation in this research is completely voluntary. You may choose not to take part at all. If you decide to participate in this research, you may stop participating at any time. If you decide not to participate in this study or if you stop participating at any time, you will not be penalized or lose any benefits to which you otherwise qualify. Participation is not a course requirement. You and your class members have other options for earning the same amount of credit. If you do not wish to participate, an alternative assignment will be given to you. What if I have questions? This research is being conducted by Dr. William Holliday and Rebecca Pease in the Curriculum and Instruction Department at the University of Maryland, College Park. If you have any questions about the research study itself, please contact Dr. 
Holliday at: (office) Department of Curriculum and Instruction, Science Teaching Center, 2226 Benjamin Building, College Park, MD 20742; (email) holliday@umd.edu; (telephone) 301-405-3135; or Rebecca Pease at: (email) rspease@umd.edu; (telephone) 505-873-1811. If you have questions about your rights as a research subject or wish to report a research-related injury, please contact: Institutional Review Board Office, University of Maryland, College Park, Maryland, 20742; (e-mail) irb@deans.umd.edu; (telephone) 301-405-0678. This research has been reviewed according to the University of Maryland, College Park IRB procedures for research involving human subjects.
Statement of Age of Subject and Consent
Your signature indicates that: you are at least 18 years of age; the research has been explained to you; your questions have been fully answered; and you freely and voluntarily choose to participate in this research project.
Signature and Date
NAME OF SUBJECT    SIGNATURE OF SUBJECT    DATE

Appendix H
Pilot study
A pilot experiment to determine the feasibility of the present study was performed with three Basics of Chemistry classes at the same community college during the spring semesters of 2009 and 2010, using the same procedures and materials similar to those in the current study. The classes consisted of 55 students present on the days of data collection. Five students chose not to participate or were eliminated for other reasons, leaving 50 participants for the pilot study. Sets of elaborative interrogation why-question instructional materials and rereading instructional materials were randomly distributed to the participants, and the study was conducted as described in the following sections. Independent samples t-tests were performed to compare the posttest scores (see Table 12) as well as the chemistry prior knowledge and mathematics skills test scores (see Table 13) of the treatment and placebo-control groups. A significant difference was found between the two groups on posttest scores (p < .05), but no significant differences between the two groups were found on the chemistry prior knowledge or mathematics skills test scores. These results suggested that the two groups began the study with similar amounts of chemistry prior knowledge and mathematics skills, and that the elaborative interrogation why-question treatment resulted in increased comprehension, leading to improved ability to solve problems similar to the worked examples contained in the reading. Based on results and feedback obtained from this pilot study, some of the problem solving posttest questions and elaborative interrogation why-questions were revised to improve the validity of these instruments.

Table 12
Pilot study: independent samples t-test comparing posttest scores of the elaborative interrogation why-question treatment and rereading placebo-control groups

Group                       M      SD     t      p
Treatment (n = 23)          7.57   4.17   2.28   .027
Placebo-control (n = 27)    5.04   3.54

Table 13
Pilot study: performance on the chemistry prior knowledge and mathematics skills tests

Group                       Chemistry prior knowledge test M (SD)    Mathematics skills test M (SD)
Treatment (n = 23)          4.04 (1.69)                              2.70 (1.06)
Placebo-control (n = 27)    3.93 (1.17)                              2.26 (1.16)
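The group comparison reported in Table 12 can be recomputed directly from the summary statistics shown there. The sketch below is an illustrative check only, not the analysis code used in the study; it assumes SciPy's pooled-variance t-test from summary statistics and simply plugs in the rounded means, standard deviations, and sample sizes from Table 12.

```python
# Illustrative check (not the study's analysis code): an independent samples t-test
# recomputed from the pilot-study summary statistics reported in Table 12.
from scipy import stats

result = stats.ttest_ind_from_stats(
    mean1=7.57, std1=4.17, nobs1=23,   # elaborative interrogation treatment group
    mean2=5.04, std2=3.54, nobs2=27,   # rereading placebo-control group
    equal_var=True,                    # pooled-variance (classic independent samples) test
)
print(round(result.statistic, 2), round(result.pvalue, 3))
# Prints roughly 2.32 and 0.025 from these rounded summary values, in line with the
# reported t = 2.28, p = .027 (small differences reflect rounding of the inputs).
```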
References
Anderson, R. C. (1972). How to construct achievement tests to assess comprehension. Review of Educational Research, 42(2), 145-170.
Atkinson, R. K., Derry, S. J., Renkl, A., & Wortham, D. (2000). Learning from examples: Instructional principles from the worked examples research. Review of Educational Research, 70(2), 181-214.
Atkinson, R. K., & Renkl, A. (2007). Interactive example-based learning environments: Using interactive elements to encourage effective processing of worked examples. Educational Psychology Review, 19(3), 375-386.
Baker, L. (1985). Differences in the standards used by college students to evaluate their comprehension of expository prose. Reading Research Quarterly, 20(3), 297-313.
Best, R. M., Rowe, M., Ozuru, Y., & McNamara, D. S. (2005). Deep-level comprehension of science texts: The role of the reader and the text. Topics in Language Disorders, 25(1), 65-83.
Bodner, G. M. (2003). Problem solving: The difference between what we do and what we tell students to do. University Chemistry Education, 7(1), 37-45.
Bodner, G. M., & Herron, J. D. (2002). Problem solving in chemistry. In O. De Jong, R. Justi, D. F. Treagust, & J. H. Van Driel (Eds.), Chemical education: Research-based practice. Kluwer Academic Publishers.
Boggs, G. R. (2006). Foreword. In G. B. Vaughan, The community college story (p. vii). Washington, DC: Community College Press.
Bonner, J. M., & Holliday, W. G. (2006). How college science students engage in note-taking strategies. Journal of Research in Science Teaching, 43(8), 786-818.
Bransford, J. D., Stein, B. S., Vye, N. J., Franks, J. J., Auble, P. M., Mezynski, K. J., et al. (1982). Differences in approaches to learning: An overview. Journal of Experimental Psychology: General, 111(4), 390-398.
Brozo, W. G. (2009). Response to intervention or responsive instruction? Challenges and possibilities of response to intervention for adolescent literacy. Journal of Adolescent & Adult Literacy, 53(4), 277-281.
Callender, A. A., & McDaniel, M. A. (2007). The benefits of embedded question adjuncts for low and high structure builders. Journal of Educational Psychology, 99(2), 339-348.
Callender, A. A., & McDaniel, M. A. (2009). The limited benefits of rereading educational texts. Contemporary Educational Psychology, 34(1), 30-41.
Campbell, D., & Stanley, J. (1966). Experimental and quasi-experimental designs for research (reprinted from Handbook of Research on Teaching, 1963). Chicago: Rand McNally.
Carrier, L. M. (2003). College students' choices of study strategies. Perceptual and Motor Skills, 96, 54-56.
Cassidy, J., Valadez, C. M., Garrett, S. D., & Barrera, E. S. (2010). Adolescent and adult literacy: What's hot, what's not. Journal of Adolescent & Adult Literacy, 53(6), 448-456.
Caverly, D. C., Orlando, V. P., & Mullen, J.-A. L. (2000). Textbook study reading. In R. F. Flippo & D. C. Caverly (Eds.), Handbook of college reading and study strategy research (pp. 105-147). Mahwah, NJ US: Lawrence Erlbaum Associates Publishers.
Chi, M. T. H., Bassok, M., Lewis, M. W., Reimann, P., & Glaser, R. (1989). Self-explanations: How students study and use examples in learning to solve problems. Cognitive Science, 13(2), 145-182.
Chi, M. T. H., & Glaser, R. (2000). Self-explaining: The dual processes of generating inference and repairing mental models. In Advances in instructional psychology: Educational design and cognitive science, Vol. 5 (pp. 161-238). Mahwah, NJ US: Lawrence Erlbaum Associates Publishers.
Coe, R. (2002). It's the effect size, stupid: What effect size is and why it is important. Paper presented at the Annual Conference of the British Educational Research Association.
Cohen, A. M., & Brawer, F. B. (2009). The American community college (5th ed.). San Francisco, CA: Jossey-Bass.
Cohen, J. (1992).
A power primer. Psychological Bulletin, 112(1), 155-159. Cohen, J. (1994). The earth is round (p<. 05). American psychologist, 49(12), 997-1003. Cooper, G., & Sweller, J. (1987). Effects of schema acquisition and rule automation on mathematical problem-solving transfer. Journal of Educational Psychology, 79(4), 347-362. Cortina, J. M. (1993). What is coefficient alpha? An examination of theory and applications. Journal of applied psychology, 78, 98-98. Cox, K. E., & Guthrie, J. T. (2001). Motivational and cognitive contributions to students' amount of reading. Contemporary Educational Psychology, 26(1), 116-131. 142 Craig, M. T., & Yore, L. D. (1996). Middle school students' awareness of strategies for resolving comprehension difficulties in reading science. Journal of Research and Development in Education, 29, 226-238. Davila, K., & Talanquer, V. (2010). Classifying end-of-chapter questions and problems for selected general chemistry textbooks used in the United States. Journal of Chemical Education, 87(1), 97-101. DeMeo, S. (2006). Revisiting molar mass, atomic mass, and mass number: Organizing, integrating, and sequencing fundamental chemical concepts. Journal of Chemical Education, 83(4), 617-621. Deters, K. M. (2003). What should we teach in high school chemistry? Journal of Chemical Education, 80(10), 1153-1155. Dierks, W., Weninger, J., & Herron, J. D. (1985). Mathematics in the chemistry classroom. Part 1. The special nature of quantity equations. Journal of Chemical Education, 62(10), 839-null. Digisi, L. L., & Willett, J. B. (1995). What high school biology teachers say about their textbook use: A descriptive study. Journal of Research in Science Teaching, 32(2), 123-142. Dochy, F., Segers, M., & Buehl, M. M. (1999). The relation between assessment practices and outcomes of studies: The case of research on prior knowledge. Review of Educational Research, 69(2), 145-186. Dori, Y. J., & Hameiri, M. (2003). Multidimensional analysis system for quantitative chemistry problems: Symbol, macro, micro, and process aspects. Journal of Research in Science Teaching, 40(3), 278-302. 143 Dunlosky, J., Rawson, K. A., & Hacker, D. J. (2002). Metacomprehension of science text: Investigating the levels-of-disruption hypothesis. In J. Otero, J. A. Leon & A. C. Graesser (Eds.), The psychology of science text comprehension (pp. 255- 279). Mahwah, NJ US: Lawrence Erlbaum Associates Publishers. Duschl, R. A., Schweingruber, H. A., & Shouse, A. W. (2007). Taking science to school: Learning and teaching science in grades K-8. Washington, DC: National Academy Press. French, J. W., Ekstrom, R. B., & Price, L. A. (1963). Kit of reference tests for cognitive factors (Rev. ed.). Princeton, NJ: Educational Testing Service. Gabel, D., & Sherwood, R. D. (1984). Analyzing difficulties with mole-concept tasks by using familiar analog tasks. Journal of Research in Science Teaching, 21(8), 843- 851. Gee, J. P. (2004). Language in the science classroom: Academic social languages as the heart of school-based literacy. In E. W. Saul (Ed.), Crossing borders in literacy and science instruction: Perspectives on theory and practice (pp. 383-394). Newark, DE: International Reading Association and Arlington, VA: National Science Teachers Association. Gewertz, C. (2010). Little progress seen in student results on reading NAEP. Education Week, 29(27), 6-6. Glenberg, A. M., & Epstein, W. (1985). Calibration of comprehension. Journal of Experimental Psychology: Learning, Memory, and Cognition, 11(4), 702-718. Graesser, A. C. 
(2007). An introduction to strategic reading comprehension. Reading comprehension strategies: Theories, interventions, and technologies, 3?26. 144 Graesser, A. C., Le?n, J. A., & Otero, J. (2002). Introduction to the psychology of science text comprehension. In J. Otero, J. A. Leon & A. C. Graesser (Eds.), The psychology of science text comprehension (pp. 1-15). Mahwah, NJ US: Lawrence Erlbaum Associates Publishers. Hacker, D. J., & Tenent, A. (2002). Implementing reciprocal teaching in the classroom: Overcoming obstacles and making modifications. Journal of Educational Psychology, 94(4), 699-718. Hand, B., Gunel, M., & Ulu, C. (2009). Sequencing embedded multimodal representations in a writing-to-learn approach to the teaching of electricity. Journal of Research in Science Teaching, 46(3), 225-247. Hedin, L. R., & Conderman, G. (2006). Teaching students to comprehend informational text through rereading. The Reading Teacher, 63(7), 556-565. Herron, J. D., & Greenbowe, T. J. (1986). What can we do about Sue: A case study of competence. Journal of Chemical Education, 63(6), 528. Holliday, W. G. (2004). Choosing science textbooks: Connecting research to common sense. In E. W. Saul (Ed.), Crossing borders in literacy and science instruction: Perspectives on theory and practice (pp. 13-32). (pp. 13-32). Newark, DE: International Reading Association and Arlington VA: National Science Teachers Association. Holliday, W. G., Brunner, L. L., & Donais, E. L. (1977). Differential cognitive and affective responses to flow diagrams in science. Journal of Research in Science Teaching, 14(2), 129-138. 145 Holliday, W. G., Whittaker, H. G., & Loose, K. D. (1984). Differential effects of verbal aptitude and study questions on comprehension of science concepts. Journal of Research in Science Teaching, 21(2), 143-150. Horn, L., & Nevill, S. (2006). Profile of undergraduates in U.S. postsecondary education institutions: 2003-2004: with a special analysis of community college students (NCES 2006-184). Washington, D.C.: U.S. Department of Education, National Center for Education Statistics. Retrieved December 16, 2011, from http://nces.ed.gov/pubsearch. Johnston, P. (1984). Prior knowledge and reading comprehension test bias. Reading Research Quarterly, 19(2), 219-239. Jonassen, D. H. (2003). Designing research-based instruction for story problems. Educational Psychology Review, 15(3), 267-296. Kalyuga, S., Chandler, P., Tuovinen, J., & Sweller, J. (2001). When problem solving is superior to studying worked examples. Journal of Educational Psychology, 93(3), 579-588. Karpicke, J. D., Butler, A. C., & Roediger, H. L. (2009). Metacognitive strategies in student learning: Do students practise retrieval when they study on their own? Memory, 17(4), 471-479. Kesidou, S., & Roseman, J. E. (2002). How well do middle school science programs measure up? Findings from Project 2061's curriculum review. Journal of Research in Science Teaching, 39(6), 522-549. Kintsch, W. (1994). Text comprehension, memory, and learning. American Psychologist, 49(4), 294-303. 146 Kintsch, W., & Van Dijk, T. A. (1978). Toward a model of text comprehension and production. Psychological review, 85(5), 363-394. Knapp, L. G., Kelly-Reid, J. E., & Ginder, S. A. (2010). Enrollment in Postsecondary Institutions, Fall 2008; Graduation Rates, 2002 & 2005 Cohorts; and Financial Statistics, Fiscal Year 2008 (NCES 2010-152). U.S. Department of Education. Washington, DC: National Center for Education Statistics. 
Retrieved December 16, 2011, from http://nces.ed.gov/pubsearch. Kruidenier, J. (2002). Research-based principles for adult basic education reading instruction. Washington, DC: National Institute for Literacy, Partnership for Reading. Lemke, J. (2004). The literacies of science. In E. W. Saul (Ed.), Crossing borders in literacy and science instruction: Perspectives on theory and practice (pp. 33-47). Newark, DE: International Reading Association and Arlington, VA: National Science Teachers Association. Levin, J. R. (1988). Elaboration-based learning strategies: Powerful theory = powerful application. Contemporary Educational Psychology, 13(3), 191-205. Levin, J. R. (2008). The unmistakable professional promise of a young educational psychology researcher and scholar. Educational Psychologist, 43(2), 70-85. Lysynchuk, L. M., Pressley, M., d'Ailly, H., Smith, M., & Cake, H. (1989). A methodological analysis of experimental studies of comprehension strategy instruction. Reading Research Quarterly, 24(4), 458-470. Martin, V. L., & Pressley, M. (1991). Elaborative-interrogation effects depend on the nature of the question. Journal of Educational Psychology, 83(1), 113-119. 147 Mayer, R. E. (2004). Teaching of subject matter. Annual Review of Psychology, 55, 715- 744. Mayer, R. E., Sims, V., & Tajika, H. (1995). A comparison of how textbooks teach mathematical problem solving in Japan and the United States. American Educational Research Journal, 32(2), 443-460. Mayer, R. E., & Wittrock, M. C. (1996). Problem-solving transfer. In R. Berliner & R. Calfee (Eds.), Handbook of Educational Psychology. New York: Macmillan. McDaniel, M. A., & Donnelly, C. M. (1996). Learning with analogy and elaborative interrogation. Journal of Educational Psychology, 88(3), 508-519. McKeown, M. G., Beck, I. L., & Blake, R. G. K. (2009). Rethinking reading comprehension instruction: A comparison of instruction for strategies and content approaches. Reading Research Quarterly, 44(3), 218-253. Menke, D. J., & Pressley, M. (1994). Elaborative interrogation: Using "why" questions to enhance the learning from text. Journal of Reading, 37(8), 642-645. Millar, R. (1991). Why is science hard to learn? Journal of Computer Assisted Learning, 7(2), 66-74. Mulford, D. R., & Robinson, W. R. (2002). An inventory for alternate conceptions among first-semester general chemistry students. Journal of Chemical Education, 79(6), 739. National Research Council [NRC] . Committee on Challenges for the Chemical Sciences in the 21st, C. (2003). Beyond the Molecular Frontier: Challenges for Chemistry and Chemical Engineering. Washington, DC: National Academies Press. 148 Norris, S. P., & Phillips, L. M. (1994). The relevance of a reader?s knowledge within a perspectival view of reading. Journal of Reading Behavior, 26(4), 391-412. Norris, S. P., & Phillips, L. M. (2008). Reading as inquiry. In R. D. R. Grandy (Ed.), Teaching scientific inquiry: Recommendations for research and implementation (pp. 233-262). Rotterdam, The Netherlands: Sense. Novak, J. D. (1980). Progress in application of learning theory. Theory Into Practice, 19(1), 58. O'Reilly, T., & McNamara, D. S. (2007). The impact of science knowledge, reading skill, and reading strategy knowledge on more traditional 'high-stakes' measures of high school students' science achievement. American Educational Research Journal, 44(1), 161-196. Otero, J., & Kintsch, W. (1992). Failures to detect contradictions in a text: What readers believe versus what they read. 
Psychological Science, 3(4), 229-235. Ozgungor, S., & Guthrie, J. T. (2004). Interactions among elaborative interrogation, knowledge, and interest in the process of constructing knowledge from text. Journal of Educational Psychology, 96(3), 437-443. Ozuru, Y., Dempsey, K., & McNamara, D. S. (2009). Prior knowledge, reading skill, and text cohesion in the comprehension of science texts. Learning and Instruction, 19(3), 228-242. Palincsar, A. S., & Brown, A. L. (1984). Reciprocal teaching of comprehension-fostering and comprehension-monitoring activities. Cognition and Instruction, 1(2), 117- 175. 149 Pedersen, J., Bonnstetter, R. J., Corkill, A. J., & Glover, J. A. (1988). Learning chemistry from text: The effect of decision making. Journal of Research in Science Teaching, 25(1), 15-21. Pentecost, T. C., & James, M. L. (2000). Creating a student-centered physical chemistry class. Journal of College Science Teaching, 30(2), 122. Perle, M., & Moran, R. (2005). NAEP 2004 trends in academic progress: Three decades of student performances (NCES 2005-464). Washington, DC: US Department of Education, National Center for Educational Statistics. Piaget, J. (1983). Piaget?s theory. In W. Kesson & P. H. Mussen (Eds.), History, theory, and methods, Vol 1, Handbook of child psychology (pp. 103-128). New York: John Wiley & Sons. Prawat, R. S. (1989). Promoting access to knowledge, strategy, and disposition in students: A research synthesis. Review of Educational Research, 59(1), 1-41. Pressley, M. (2006). Reading instruction that works: The case for balanced teaching (3rd ed.). New York, NY US: Guilford Press. Pressley, M., & Afflerbach, P. (1995). Verbal protocols of reading: The nature of constructively responsive reading. Hillsdale, NJ England: Lawrence Erlbaum Associates, Inc. Pressley, M., & El-Dinary, P. B. (1997). What we know about translating comprehension-strategies instruction research into practice. Journal of Learning Disabilities, 30(5), 486. Pressley, M., McDaniel, M. A., Turnure, J. E., Wood, E., & Ahmad, M. (1987). Generation and precision of elaboration: Effects on intentional and incidental 150 learning. Journal of Experimental Psychology: Learning, Memory, and Cognition, 13(2), 291-300. Pressley, M., Symons, S., McDaniel, M. A., Snyder, B. L., & Turnure, J. E. (1988). Elaborative interrogation facilitates acquisition of confusing facts. Journal of Educational Psychology, 80(3), 268-278. Pressley, M., & Wharton-McDonald, R. (2006). The need for increased comprehension instruction. In M. Pressley (Ed.), Reading instruction that works: The case for balanced teaching (3rd ed.). New York: Guilford Press. Pressley, M., Wood, E., Woloshyn, V. E., Martin, V., King, A., & Menke, D. (1992). Encouraging mindful use of prior knowledge: Attempting to construct explanatory answers facilitates learning. Educational Psychologist, 27(1), 91-109. Rampey, B. D., Dion, G. S., & Donahue, P. L. (2009). NAEP 2008: Trends in academic progress. NCES 2009-479. Washington, DC: U.S. Department of Education, National Center for Education Statistics, Institute of Education Sciences RAND Reading Study Group. (2004). A research agenda for improving reading comprehension. In R. B. Ruddell & N. J. Unrau (Eds.), Theoretical models and processes of reading (pp. 720-754). Newark, DE: International Reading Association. Rawson, K. A., Dunlosky, J., & Thiede, K. W. (2000). The rereading effect: Metacomprehension accuracy improves across reading trials. Memory & Cognition, 28(6), 1004-1010. Rawson, K. A., & Kintsch, W. 
(2005). Rereading effects depend on time of test. Journal of Educational Psychology, 97(1), 70-80. 151 Reed, S. K., & Bolstad, C. A. (1991). Use of examples and procedures in problem solving. Journal of Experimental Psychology: Learning, Memory, and Cognition, 17(4), 753-766. Renkl, A. (1997). Learning from worked out examples: A study on individual differences. Cognitive Science, 21(1), 1-29. Renkl, A., Atkinson, R. K., Maier, U. H., & Staley, R. (2002). From example study to problem solving: Smooth transitions help learning. Journal of Experimental Education, 70(4), 293-315. Rickards, J. P. (1979). Adjunct postquestions in text: A critical review of methods and processes. Review of Educational Research, 49(2), 181-196. Rosenshine, B., & Meister, C. (1994). Reciprocal teaching: A review of the research. Review of Educational Research, 64(4), 479-530. Rothkopf, E. Z. (1982). Adjunct aids and the control of mathemagenic activities during purposeful reading. Reading expository material, 109-138. Ryan, T. E. (2006). Motivating novice students to read their textbooks. Journal of Instructional Psychology, 33(2), 136-140. Sappington, J., Kinsey, K., & Munsayac, K. (2002). Two studies of reading compliance among college students. Teaching of Psychology, 29(4), 272-274. Seifert, T. L. (1993). Effects of elaborative interrogation with prose passages. Journal of Educational Psychology, 85(4), 642-651. Selvaratnam, M., & Kumarasinghe, S. (1991). Student conceptions and competence concerning quantitative relationships between variables. Journal of Chemical Education, 68(5), 370-null. 152 Shanahan, C. (2004). Better textbooks, better readers and writers. In E. W. Saul (Ed.), Crossing borders in literacy and science instruction (pp. 370-382). Newark, DE: International Reading Association and Arlington, VA: National Science Teachers Association. Shanahan, T., & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78(1), 40-59. Sijtsma, K. (2009). On the use, the misuse, and the very limited usefulness of Cronbach's alpha. Psychometrika, 74(1), 107-120. Sikorski, J. F., Rich, K., Saville, B. K., Buskist, W., Drogan, O., & Davis, S. (2002). Student use of introductory texts: Comparative survey findings from two universities. Teaching of Psychology, 29(4), 312. Silberman, R. G. (1981). Problems with chemistry problems: Student perception and suggestions. Journal of Chemical Education, 58(12), 1036. Simpson, M. L., & Nist, S. L. (2002). Encouraging active reading at the college level. In C. C. Block & M. Pressley (Eds.), Comprehension Instruction (pp. 365-379). New York: Guilford Press. Smith, B. L., Holliday, W. G., & Austin, H. W. (2010). Students' comprehension of science textbooks using a question-based reading strategy. Journal of Research in Science Teaching, 47(4), 363-379. Smith, M. U. (1991). A view from biology. In M. U. Smith (Ed.), Toward a unified theory of problem solving: Views from the content domains (pp. 1-20). Hillside, NJ: Lawrence Erlbaum Associates. 153 Snow, C. E. (2002). Reading for understanding: Toward an R&D program in reading comprehension: Rand Corp. Staver, J. R., & Lumpe, A. T. (1995). Two investigations of students' understanding of the mole concept and its use in problem solving. Journal of Research in Science Teaching, 32(2), 177-193. Surber, J. R., & Schroeder, M. (2007). Effect of prior domain knowledge and headings on processing of informative text. 
Contemporary Educational Psychology, 32(3), 485-498. Sweller, J., & Cooper, G. A. (1985). The use of worked examples as a substitute for problem solving in learning algebra. Cognition and Instruction, 2(1), 59-89. Taasoobshirazi, G., & Glynn, S. M. (2009). College students solving chemistry problems: A theoretical model of expertise. Journal of Research in Science Teaching, 46(10), 1070-1089. Thorndike, E. L. (1917). Reading as reasoning: A study of mistakes in paragraph reading. Journal of Educational Psychology, 8(6), 323-332. Timberlake, K. C. (2009). Chemistry: An introduction to general, organic, and biological chemistry (10th ed.). Upper Saddle River, NJ: Pearson Education, Inc. van Eijck, M., & Roth, W. M. (2008). Representations of scientists in Canadian high school and college textbooks. Journal of Research in Science Teaching, 45(9), 1059?1082. Walsh, L. N., Howard, R. G., & Bowe, B. (2007). Phenomenographic study of students? problem solving approaches in physics. Physical Review Special Topics-Physics Education Research, 3(2), 20108. 154 Wandersee, J. H. (1988). Ways students read texts. Journal of Research in Science Teaching, 25(1), 69-84. Weinstein, Y., McDermott, K. B., & Roediger, H. L., III. (2010). A comparison of study strategies for passages: Rereading, answering questions, and generating questions. Journal of Experimental Psychology: Applied, 16(3), 308-316. Weiss, I. R., Banilower, E. R., McMahon, K. C., & Smith, P. S. (2001). Report of the 2000 national survey of science and mathematics education. Chapel Hill, NC: Horizon Research. Wheatley, G. H. (1995). Problem solving from a constructivist perspective. In D. R. Lavoie (Ed.), Towards a cognitive-science perspective for scientific problem solving: A monograph of the National Association for Research in Science Teaching, Number Six. (pp. 1-12). Manhattan, KS: Ag Press. Willoughby, T., Wood, E., Desmarais, S., Sims, S., & Kalra, M. (1997). Mechanisms that facilitate the effectiveness of elaboration strategies. Journal of Educational Psychology, 89, 682-685. Willoughby, T., Wood, E., & Khan, M. (1994). Isolating variables that impact on or detract from the effectiveness of elaboration strategies. Journal of Educational Psychology, 86(2), 279-289. Willson, V. L., & Putnam, R. R. (1982). A Meta-Analysis of Pretest Sensitization Effects in Experimental Design. American Educational Research Journal, 19(2), 249- 258. Wingate, U. (2006). Doing away with 'study skills.' Teaching in Higher Education, 11(4), 457-469. 155 Woloshyn, V. E., Pressley, M., & Schneider, W. (1992). Elaborative-interrogation and prior-knowledge effects on learning of facts. Journal of Educational Psychology, 84(1), 115-124. Woloshyn, V. E., Willoughby, T., Wood, E., & Pressley, M. (1990). Elaborative interrogation facilitates adult learning of factual paragraphs. Journal of Educational Psychology, 82(3), 513-524. Woloshyn, V. E., Wood, E., & Willoughby, T. (1994). Considering prior knowledge when using elaborative interrogation. Applied Cognitive Psychology, 8(1), 25-36. Wong, S. L., & Hodson, D. (2009). From the horse's mouth: What scientists say about scientific investigation and scientific knowledge. Science Education, 93(1), 109- 130. Wood, E., Pressley, M., & Winne, P. H. (1990). Elaborative interrogation effects on children's learning of factual content. Journal of Educational Psychology, 82(4), 741-748. Wood, E., Willoughby, T., McDermott, C., Motz, M., Kaspar, V., & Ducharme, M. J. (1999). Developmental differences in study behavior. 
Journal of Educational Psychology, 91(3), 527-536. Yore, L., & Shymansky, J. (1991). Reading in science: Developing an operational conception to guide instruction. Journal of Science Teacher Education, 2(2), 29- 36.