

Reflection 7: Reflective Practice

How the rubric item was addressed in this reflection:
This reflection demonstrates my disposition toward inquiry into teaching and my ability to apply educational theory in researching teaching and learning in my own classroom.

WHAT is the evidence? WHY did I choose the evidence? HOW does the evidence show growth?

Before entering the MCE program, not only did I have no experience with science education literature, I had never participated in or conducted formal research within my classroom (i.e., coming up with a hypothesis on how to improve teaching and learning in my classroom, testing that hypothesis through systematic and conscientious changes in my practice, and then, perhaps most importantly, assessing whether the change was effective). Through my participation in the two education courses Edu536 and Edu636, I have conducted research in my classroom and intend to continue doing so, albeit in a less formal way.

I present one piece of baseline evidence from my original application to the MCEP program. I also present two pieces of later evidence from my classroom research. In comparing my baseline and later evidence, I use a conceptual framework that shows:

  • IMPROVED FORMULATION OF RESEARCH QUESTIONS THAT INDUCE CHANGES IN TEACHING PRACTICE
My growth in my ability to formulate questions worthy of research is shown by the refinement of my research question as I continued to work on my research proposal in Edu536. As I encountered the literature (for the first time!), I began to understand and to integrate a constructivist framework that gave me the vocabulary and insight to observe my teaching and discern what I needed to change.


  • INCREASED USE OF QUANTITATIVE METHODS/STATISTICS AND QUALITATIVE METHODS IN ASSESSING THE IMPACT OF PEDAGOGICAL CHANGES
My growth in my ability to perform classroom research has much to do with my increased awareness of the importance of assessing the effectiveness of pedagogical practices. As I became more aware of my need to evaluate changes I had implemented, I became more comfortable with the use of both quantitative and qualitative methods of assessment.




Baseline Evidence:
Reflective self-evaluation for video-taped baseline lesson on equilibrium, provided with MCE application
JUNE 2007

From the reflective self-evaluation:
Excerpt 1:
Re: Disposition toward inquiry and the reiterative process of improving teaching.
"The reason for my instructional choice of having a teacher-centric, lecture and note-based class, at this point in time, is because that is the method that I am most comfortable with.  I have not witnessed or been intensively mentored by another Chemistry teacher (having done alternate route certification and never having taken any education courses), and so I rely on a teaching method that allows me to use my strengths."
Excerpt 2:
Re: Having a sense of direction in terms of what I can change in my teaching/ questions that I can try to answer through research:
"I believe that both teachers and students tend to teach and learn in default, traditional patterns unless challenged to teach and learn in another way.  The default, traditional patterns often do not give credence to the fact that there are multiple learning styles and multiple teaching personalities.  But, I have a strong conviction that teaching should encourage each student to individually engage with the information or content matter of the subject in a way that is best for that student.  Unfortunately, there is very little evidence of that philosophy or belief in my classroom right now, except for the fact that I acknowledge that a student's inability to grasp a concept may be more indicative of my lack of skill in teaching it rather than his lack of potential.  I'm here to learn how to align my teaching with my philosophy!  Help!!!!"
Excerpt 3:
Re: Gauging student understanding/ efficacy of my teaching:
"In this lesson, I gauged student understanding by putting certain students on the spot and asking them for answers to questions, in addition to asking the rest of the class for answers.  There is evidence that the students did understand this unit because several would call out answers enthusastically.  Also, the types of questions certain students asked reflect their grasp of the concepts.  I did not have the time in this unit to test or quiz my students' understanding with a written assessment, which is typically my method for checking for understanding, because I did not feel I had the time to do it.  I am certain that there are better ways of checking for understanding, and I have a feeling that my students' understanding of the concepts might still be fairly tenuous from what I witness in the lesson."

Before entrance into the MCE program, I was asked to tape a lesson and reflectively self-evaluate it. This first excerpt from the self-evaluation shows that before the program, I was still hesitant and uncomfortable about changing my instruction and engaging in inquiry that would reform my teaching. The second excerpt shows that while I wanted the change, I did not know where or how to begin--there is a clear lack of focus. The third excerpt shows that assessment of student understanding (and of the efficacy of my own teaching) was not yet very important to me; I did not understand its value, nor did I have much experience with the variety of (quantitative and qualitative) research methods I could use to get a more comprehensive perspective on what was happening in my classroom.


Later Evidence:

My first piece of later evidence shows how I became more aware of what I wanted to research in my classroom through simultaneous reflection and reading of the science education literature. My second piece of evidence summarizes my project and the outcomes of my research, in which I set out to measure the effect of changes I had made in my classroom based on what I had read. The following is a brief summary of the relevant research articles:


Brief overview of literature (full citations at bottom of page):

Through my participation in the MCE program (particularly through my coursework in Edu536 and Edu636), I became interested in researching and improving students' grasp of the particulate nature of matter as a way to improve conceptual understanding of a range of topics.  I describe how and when these articles informed my research more fully in the "Why did I select this topic?" section immediately following.

Williamson & Rowe (2002) was the first research paper I really read.  It made me aware of the efficacy and impact of peer groups on students' engagement and success.  It made a profound impact on me in showing that improving one's classroom practice often requires detailed and deliberate changes that do not often happen naturally or intuitively.

Nakhleh (1993) was the paper that challenged my view of student understanding. Nakhleh showed that students can perform quite differently on conceptual and algorithmic (i.e., math-based, often reliant on memorization of patterns and strategies) problems. This made me aware of my need to commit more fully to nourishing conceptual understanding in my students (I felt I was already quite good at fostering algorithmic understanding).

Nakhleh (1992) and Johnstone (1993) both made me aware of the three levels of chemical understanding--the macroscopic, the microscopic, and the symbolic. Johnstone approached this triad from a constructivist standpoint, arguing for the need to foster connections between these disparate aspects in order to improve conceptual understanding. Nakhleh indicated that the reason students had difficulty comprehending many fundamental concepts was their inability to correctly conceptualize the submicroscopic level of chemistry (particularly the particulate nature of matter) and connect it to macroscopically observed phenomena and to the symbolic representations in chemical problems.

Ercikan & Roth (2006) made me aware of the need for both qualitative and quantitative methods in answering a research question, and made me actually want to do the research that I had been preparing for in Edu536.  The authors focused upon the necessity of asking good questions (questions that were meaningful in informing and improving teaching and learning) rather than asking questions that could be evaluated easily (but have little importance beyond publication).  This article pushed me into seeing the ultimate usefulness of research in informing my classroom practice and thus, into using whatever melange of methods necessary to appropriately answer my questions.

EVIDENCE #1:
Research question in research proposal drafts

JANUARY 2008-APRIL 2008

Jan 12, 2008:
[Image: research question, draft 1]

Why did I select this topic?

The initial impetus for my research was my frustration with student performance on word problems.  I had the sense that students were not really "getting" the material in the deep, conceptual way that I desired.

The literature I had started reading in Edu536 provided me with the background to eventually narrow and focus my research so that it would be meaningful in my classroom. I first encountered the idea of impacting classroom engagement and performance by reading Williamson & Rowe's (2002) paper on the effect of peer groups. As you can see, the impact of this literature is readily apparent in the research question. Unfortunately, I was asking a question that, for all intents and purposes, had already been answered by Williamson & Rowe.
Feb 18, 2008:
[Image: research question, draft 2]
After reading Nakhleh (1993), I became interested in the dichotomy between algorithmic and conceptual understanding. I was encouraged to do exploratory research into whether the algorithmic-conceptual disconnect Nakhleh had observed in college classes also existed in my own high school classroom. I asked these questions because I was somewhat interested, but they seemed mildly pedantic to me--not a driving force in improving my own classroom.
March 8, 2008:
[Image: research question, draft 3]
I narrowed the question down into something I could more easily measure, but this question was even more pedantic and far removed from my classroom experience. Sometimes you have to take a couple of steps back to move forward....
April 24, 2008:
[Image: research question, draft 4]
Through my reading of Nakhleh (1992) and Johnstone (1993), I realized I wanted to do something very specific in my own classroom to improve conceptual understanding. In particular, I wanted to attack the problem of students' inability to understand the particulate kinetic nature of matter (PKNM). Eventually, in my classroom, I integrated not only the use of manipulatives but also a broad range of tools targeting students' understanding of PKNM, from animations and applets representing the submicroscopic aspect of nature to questions asking students to draw representations. Finally, I had arrived at a question that was of practical value to me in my classroom, of importance to a larger community, and focused enough to be meaningfully researched.


EVIDENCE #2:
Statistical analysis of students' answers to paired algorithmic & conceptual MC questions (from Nakhleh, 1993) on their finals (2008 vs. 2009)

JUNE 2009

TABLE 1. Comparison of student performance on matched MC questions on the 2008 (n=37) and 2009 (n=45) finals

Average score (0 = incorrect, 1 = correct; score x 100 = % who answered correctly):

                        Algorithmic          Conceptual
                        2008      2009       2008      2009
Gases                   0.46      0.89       0.39      0.18
Equations               0.65      0.56       0.62      0.60
Limiting Reactant       0.49      0.38       0.38      0.29
Empirical Formula       0.43      0.47       0.62      0.73

T-test p-value for a difference in performance between years on the same question (1-tail, unpaired); * = significant at 95% (p < 0.05):

                        Algorithmic          Conceptual
Gases                   5.6 x 10^-6 *        0.021 *
Equations               0.20                 0.42
Limiting Reactant       0.16                 0.20
Empirical Formula       0.38                 0.14

T-test p-value for a difference in performance between the algorithmic and conceptual question within the same year; * = significant at 95% (p < 0.05):

                        2008                 2009
Gases                   0.243                1.66 x 10^-15 * (a)
Equations               0.406                0.337
Limiting Reactant       0.177                0.188
Empirical Formula       0.053 (b)            0.00472 * (c)

(a) algorithmic understanding > conceptual understanding
(b) approaching significance; conceptual understanding > algorithmic understanding
(c) conceptual understanding > algorithmic understanding
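
To illustrate how statistics like those in Table 1 can be produced, here is a minimal sketch in Python. It is not the analysis actually used for the table (the coded answer data live in the Excel file linked below); the 0/1 answer arrays are hypothetical stand-ins, and SciPy is assumed to be available.

# A minimal sketch: comparing 0/1-coded answers between years with an
# unpaired, one-tailed t-test. The arrays below are hypothetical data.
import numpy as np
from scipy import stats

answers_2008 = np.array([1, 0, 0, 1, 0, 1, 0, 0, 1, 0])  # 1 = correct, 0 = incorrect
answers_2009 = np.array([1, 1, 1, 0, 1, 1, 1, 0, 1, 1])

# Average score = fraction answering correctly (x 100 gives the percentage).
print("2008 average:", answers_2008.mean())
print("2009 average:", answers_2009.mean())

# SciPy's independent-samples t-test reports a two-tailed p-value;
# halving it gives the one-tailed value of the kind reported in the table.
t_stat, p_two_tailed = stats.ttest_ind(answers_2008, answers_2009)
print("one-tailed p-value:", p_two_tailed / 2)

A one-tailed test presumes the direction of the difference is specified in advance; with binary data, a chi-square or Fisher's exact test would be a common alternative check.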

TABLE 2. Percentage breakdown of students into four groups based on performance on matched questions (A = algorithmic, C = conceptual; 0 = incorrect, 1 = correct)

            Gases           Equations       Limiting Reagent    Empirical Formula
            2008    2009    2008    2009    2008    2009        2008    2009
A0C0        27.0    11.1    16.2    20.0    37.8    46.7        24.3    13.3
A1C0        35.1    71.1    21.6    20.0    24.3    24.4        13.5    13.3
A0C1        27.0    0.0     18.9    24.4    13.5    15.6        32.4    40.0
A1C1        10.8    17.8    43.2    35.6    24.3    13.3        24.3    33.3

*Percentages sometimes do not add up to 100 because of rounding.


Click here for the full Excel 2007 file coding students' answers.
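
As an illustration of how the Table 2 breakdown can be generated, here is a minimal sketch in Python; the (algorithmic, conceptual) pairs are hypothetical stand-ins for the coded answers in the Excel file linked above, not the actual student data.

# A minimal sketch of the Table 2 grouping: each student contributes an
# (algorithmic, conceptual) pair of 0/1-coded answers for one topic and year,
# and students are tallied into the categories A0C0, A1C0, A0C1, A1C1.
# The pairs below are hypothetical.
from collections import Counter

pairs = [(1, 0), (1, 1), (0, 0), (1, 0), (0, 1), (1, 1), (0, 0), (1, 0)]

counts = Counter(f"A{a}C{c}" for a, c in pairs)
n = len(pairs)
for group in ("A0C0", "A1C0", "A0C1", "A1C1"):
    print(f"{group}: {100 * counts[group] / n:.1f}%")
# Because each percentage is rounded separately, a column may not sum to exactly 100.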

SUMMARY OF PROJECT AND OUTCOMES:

The second piece of later evidence shows my analysis of student performance on paired algorithmic and conceptual questions on the 2008 final (n=37) and the 2009 final (n=45). After having intensively integrated submicroscopic representations, applets, and manipulatives representing the particulate nature of matter during the 2008-2009 school year, I wanted to identify whether students did significantly better or worse on questions gauging their algorithmic (calculation-heavy) understanding and their conceptual understanding of gases, chemical equations, limiting reagents, and empirical formulas.

It is important to note that statistical analysis alone cannot determine the reason for, or the cause of, significant increases or decreases; it does, however, highlight significant associations. This aids in identifying areas of improvement or deterioration in performance and provides a starting point for determining possible causes and thinking about what I could improve the next time I teach the class.

It is evident from the data that, contrary to what I expected, my students showed significantly less conceptual understanding of gases, even as they improved their algorithmic ability to solve problems related to gas laws. There was no significant difference in any of the other question areas. Since NO ONE got just the conceptual question correct, my conjecture, from looking over the actual answers chosen, is that I may have taught my lesson in a way that encouraged students to learn or retain a misconception--they often thought that when a gas is cooled, it not only slows down but also condenses, even at temperatures above its boiling point. I will have to make a point of stressing the importance of observing whether the temperature is below or above the boiling point, and of emphasizing that a gas only condenses when cooled to a temperature below its boiling point (at the given pressure). I think I may have actually improved my teaching of gases in general, but made the unfortunate mistake of not accounting for a common misconception that could arise once students started visualizing and qualitatively associating particulate motion, physical state, and the effect of temperature.

Students also performed better on the conceptual empirical formula question in 2009, and in that year their conceptual performance was significantly higher than their algorithmic performance. The averages for both the algorithmic and conceptual questions increased, indicating that this was an overall area of improvement. Because I do not have qualitative data to draw from, it is hard to conjecture as to why--it may have been a change in my teaching, or it may have been the particular sample of students who took the class that year. I hope to gain a more thorough assessment in the future by making use of qualitative surveys and evaluations in concert with quantitative analysis.


DISPOSITION TOWARD CONTINUING INQUIRY

As a result of my research, I have an increased desire to re-address the conceptual-algorithmic disconnect that students have, particularly in learning about the nature of gases.  I have made a mental note to be more careful about how I may unintentionally perpetuate misconceptions.  I have also grown in my awareness of the need for evaluation, particularly because my research results showed an unexpected outcome that I would not have been aware of if I had not taken the time to assess the effectiveness of my pedagogical changes.

As for the following year, I plan on using and assessing (i.e., researching) the integration of web-based tools from quia.com (another great idea given to me by Mark Hayden) into my teaching. I hope to assess their effect on student engagement with mid-year evaluations (or perhaps even cogenerative dialogues) asking for qualitative feedback. My hypothesis is that the use of this web-based support software (which is easily integrated into pre-existing materials to make them interactive) will improve class participation, accountability, and student enjoyment of my courses.

Of equal importance, through my experience in Edu636 (in particular, my reading of an article by Ercikan & Roth (2006)), I have gotten more comfortable with adjusting my methods and assessment to meaningfully probe and address the question I am asking.  I have become quite comfortable with quantitative methods, but I also feel that qualitative methods (e.g. interviews, surveys, evaluations) are equally legitimate and have the ability to give more nuanced information that may be more helpful in directing further efforts and iterations of a particular lesson.



RELEVANT ARTICLES:

Ercikan, K., & Roth, W.-M. (2006). What good is polarizing research into qualitative and quantitative? Educational Researcher, 35(5), 14-23.

Johnstone, A. H. (1993). The development of chemistry teaching: A changing response to changing demand. Journal of Chemical Education, 70(9), 701-705.

Nakhleh, M. B. (1993). Are our students conceptual thinkers or algorithmic problem solvers? Journal of Chemical Education, 70(1), 52-55.

Nakhleh, M. B. (1992). Why some students don't learn chemistry: Chemical misconceptions. Journal of Chemical Education, 69(3), 191-196.

Williamson, V. M., & Rowe, M. W. (2002). Group problem-solving versus lecture in college-level quantitative analysis: The good, the bad, and the ugly. Journal of Chemical Education, 79(9), 1131-1134.

Updated August 4, 2009.