Dr. Jack Barbera, Associate Professor

School of Chemistry and Biochemistry, Campus Box 98 Greeley, CO 80639

contact info: (office) Ross Hall 3576, 970.351.2545, (email) jack.barbera@unco.edu


Barbera Group Research Interests

Item Response Theory and its Application to Assessment Instrument Development: The field of psychometrics dates back more than a century and is concerned with the theory and practice of psychological and educational measurement. Researchers in this field have established protocols to develop and evaluate a wide variety of assessment instruments, including multiple-choice concept inventories. With these protocols, chemists can make measurements of student knowledge and understanding with the same care and precision they use in the laboratory. Our projects make use of the Rasch and Item Response Theory models during the development of content-based assessment instruments. These models provide unique details regarding both items and persons, allowing for a robust evaluation of both. Using these models during the development phase of an instrument will lead to more reliable and valid data once the instrument is completed.
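As a concrete sketch of the model at the heart of this work: under the one-parameter (Rasch) model, the probability that a person of ability θ answers an item of difficulty b correctly depends only on the difference θ − b. A minimal illustration in Python (the function and variable names below are ours, for illustration only, and not part of any psychometric software):

```python
import math

def rasch_probability(theta, b):
    """Probability that a person of ability `theta` answers an item
    of difficulty `b` correctly under the one-parameter (Rasch) model:
    P = 1 / (1 + exp(-(theta - b)))."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability exactly matches item difficulty, the model predicts
# a 50% chance of a correct response.
p_matched = rasch_probability(0.0, 0.0)

# A stronger student facing the same item has a higher probability.
p_strong = rasch_probability(1.5, 0.0)
```

Fitting the ability and difficulty parameters to actual response data is what psychometric analysis software does; the value of the model is that items and persons are characterized on the same scale.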

Modular Chemistry Concept Inventory: The primary goal of this project is the development and evaluation of instruments for the assessment of conceptual knowledge within specific areas of college-level general chemistry.  Conceptual understanding is an important aspect of all science, technology, engineering, and mathematics (STEM) education, as it implies both content knowledge and the ability to apply that knowledge.  As general chemistry is a fundamental building block for many of the STEM disciplines, a strong foundation in its concepts may affect the way in which students understand and retain future knowledge both in and out of chemistry. This project is atypical in that instrument development and validation is its main focus rather than a component of a larger project.  This focus allows the project to employ more rigorous protocols during all phases of the design/validation process.  One example is the use of open-ended student interviews to establish student conceptions for the development of questions and distractors.  Additionally, Item Response Theory is used to determine question quality and difficulty, characteristics which will be the primary basis for final question selection. While other concept inventories exist within chemistry, this project is unique in that it addresses individual topics and concepts, which none of the others do.  Existing tools prove useful in the assessment of course content knowledge but provide little detail regarding the range of student conceptions around any one of the many subtopics.  Such information is useful to instructors and chemical education researchers wishing to measure conceptual change arising from targeted interventions.

Development and Testing of Learning Tutorials for General and Physical Chemistry: There is increasing evidence that after instruction in a typical course, many students are unable to apply the science concepts that they have studied to problem-solving situations that they have not expressly memorized.  For meaningful learning to occur, students need more assistance than they can obtain through listening to lectures, reading a textbook, and solving standard quantitative problems.  It can be difficult for students who are studying chemistry for the first time to realize what they do and do not understand and to learn to ask themselves the reflective questions needed to come to a functional understanding of the material.  My past work has been in the development and testing of tutorials that promote the active mental engagement of students in the process of learning thermodynamics and kinetics during a first-semester physical chemistry course.  Using guided-inquiry methods, students develop deeper connections with difficult concepts.  These tutorials were inspired by the vast body of physics tutorials developed at the University of Washington and the process oriented guided inquiry learning (POGIL) materials developed at Franklin and Marshall College.  My work builds on these traditionally “pencil and paper” methods with the addition of interactive materials, such as computer simulations and physical models, providing students with “hands-on” investigations.  My group will continue the development of physical chemistry tutorials and venture into the development of tutorials for general chemistry.

The first step in the development of these learning tutorials is to identify the common misconceptions students hold as well as the concepts that students struggle with most.  As many of these problems are not new, a thorough literature search is needed to uncover other effective and not-so-effective solutions to them.  From these, a series of inquiry questions and physical models is assembled; the clarity of the questions is then confirmed with both students and faculty.  Seamless construction of the best questions and models into thought-provoking learning tools provides an initial format for testing.

While there are worksheets, activities, and teaching tools available to address many of these same problems, most show no indication of validation studies or tests of effectiveness after development.  The testing and validation of newly developed tutorial material is itself a multi-step process.  To gauge a tutorial’s effectiveness in teaching a particular concept, a pre- and post-tutorial analysis can be performed.  Use of conceptual questions analogous to those addressed within the tutorial allows learning gains to be measured.  Learning gains due to tutorial use can be compared to learning gains through traditional methods or other treatments by using the same pre/post methods with a test population.  Statistical comparisons are made in order to assess the significance of the learning gains.  To reduce any bias due to internal testing, one can also use a stock conceptual exam to monitor differences between the conceptual knowledge of test and control groups.
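One widely used way to quantify the pre/post learning gains described above is the normalized gain: the fraction of the possible improvement that was actually realized. A brief sketch, assuming scores on a 0–100 scale (the function and variable names are ours, for illustration):

```python
def normalized_gain(pre, post, max_score=100.0):
    """Normalized gain: the achieved improvement (post - pre)
    expressed as a fraction of the possible improvement (max_score - pre)."""
    if pre >= max_score:
        return 0.0  # no room left to improve
    return (post - pre) / (max_score - pre)

# Hypothetical class-average scores on a conceptual pre/post test.
tutorial_gain = normalized_gain(pre=40.0, post=70.0)   # 30/60 = 0.5
lecture_gain = normalized_gain(pre=42.0, post=56.0)    # 14/58 ≈ 0.24
```

A statistical test on the two groups’ gains would then determine whether such a difference is significant rather than an artifact of class-to-class variation.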

Surveying Students' Beliefs: Calls from the chemistry community to reform the way chemistry is taught suggest the need for an effective measure of the impact of the changes.  In concert with the evaluation of students’ conceptual knowledge is the assessment of their beliefs about learning the subject at hand.  For example, do students find chemistry useless and believe that learning involves memorization of knowledge handed down by authority?  There is evidence indicating that students’ initial expectations and beliefs about science and about learning science affect the education process and how people learn within a discipline.  The retention of students within science has been shown to be strongly influenced by these beliefs.  An important goal in science education is to teach students how to think like an “expert” when it comes to problem solving and the processing of knowledge.  A survey instrument that measures progress toward these widely accepted educational goals can provide an educator with valuable information about his or her influence on them.  This information can also be a powerful motivator for convincing faculty to reexamine their teaching practices and explore research-validated alternatives.  While there are other attitudinal surveys for use in chemistry and physics, my group uses a modified version of the Colorado Learning Attitudes about Science Survey (CLASS).  This survey, developed at CU-Boulder in the departments of Physics and Chemistry, is easy to use and has undergone rigorous testing and validation.

Administration of this on-line survey to chemistry classes requires little to no effort on the instructor’s part and only 10 minutes of students’ time, yet generates a rich set of data.  The survey is given at the start of the semester (PRE) and again at the end of the semester (POST).  Statistical analysis of the data can be used to study a wide array of categories.  As mentioned above, the data can be used to measure changes in students’ beliefs after the implementation of a new teaching style, providing important feedback to the instructor and department concerning reform methods.  Comparisons can be made between various subsets of students, such as men and women, ethnic minorities, and various declared majors, as well as with other departments or schools using the same instrument.  Long-term studies can be used to track the change in beliefs of chemistry majors over the course of their academic careers.  These studies can then guide improvements in teaching practice.  Improvements designed to better address these beliefs should result in more effective and enjoyable science learning and will likely have a favorable influence on retention and career choice.
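To illustrate how a PRE/POST comparison of beliefs might be scored: each survey item can be marked as agreeing or disagreeing with the expert consensus, and a student’s responses summarized as a percent-favorable score. A minimal sketch in Python; the coding scheme below is illustrative only, not the actual CLASS scoring rubric:

```python
def percent_favorable(responses, expert_answers):
    """Percent of survey items on which a student's response agrees
    with the expert consensus.  Responses are coded 'A' (agree),
    'D' (disagree), or 'N' (neutral); this coding is illustrative."""
    matches = sum(1 for r, e in zip(responses, expert_answers) if r == e)
    return 100.0 * matches / len(expert_answers)

expert = ['A', 'A', 'A', 'D', 'D']          # hypothetical expert consensus
pre = percent_favorable(['A', 'D', 'A', 'N', 'D'], expert)
post = percent_favorable(['A', 'A', 'A', 'D', 'D'], expert)
shift = post - pre  # a positive shift = movement toward expert-like beliefs
```

Averaging such shifts over a class, and comparing them across subsets of students or across teaching styles, is the kind of statistical analysis described above.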

In addition to the primary questions concerning students’ beliefs, a series of ancillary questions probes students’ backgrounds, with some questions eliciting free-response answers.  These additional questions not only gather demographic data from students but also collect a record of students’ past math, physics, and chemistry course work.  The post version contains two free-response questions asking students about their interest level in chemistry and about any changes in their interest over the semester.  This project can be extended, providing additional research opportunities for undergraduate students, by investigating the correlation between students’ beliefs and previous math or science preparation, or by studying connections between students’ beliefs and self-reported changes in interest.