An email message from Karron Lewis on Personal Response System learning outcomes.

>Date: Mon, 13 Dec 2004 13:59:36 -0600
>To: CPS group:;
>From: "Karron G. Lewis" <kglewis@mail.utexas.edu>
>Subject: Fwd: [POD] POD weighs in Personal Response Systems learning
>  outcomes
>Cc: jashcroft@mail.utexas.edu
>
>Hi Morrie and Charles,
>
>Below are some resources and citations about the need for additional
>research on using CPS in classrooms, as well as some of the research that
>is currently being done.  You may already have these, but if not, they
>look very interesting and useful should we decide to do some research here
>at UT on the effectiveness of using CPS.
>
>Have a wonderful holiday!
>-- Karron
>----------------
>>
>>Date:         Mon, 13 Dec 2004 11:34:33 -0500
>>Reply-To: m.diamond@NEU.EDU
>>Sender: Professional & Organization Development Network in Higher
>>               Education <POD@listserv.nd.edu>
>>From: Miriam Diamond <m.diamond@NEU.EDU>
>>Subject:      [POD] POD weighs in Personal Response Systems learning outcomes
>>To: POD@listserv.nd.edu
>>
>>My thanks to everyone for their helpful information on SOTL research
>>related to PRS.  Below is a compilation of responses sent directly to me
>>(in addition to those already posted on POD).
>>
>>- Miriam
>>
>>Miriam Rosalyn Diamond, Ph.D.
>>Center for Effective University Teaching
>>225 Hayden Hall
>>Northeastern University
>>360 Huntington Avenue
>>Boston, MA 02115-5000
>>(617) 373-2241
>>(617) 373-7531 FAX
>>M.Diamond@neu.edu
>>
>>
>>_______________________________
>>
>>Miriam,
>>
>>I was forwarded an email from Wayne Hall, UC's Vice Provost for Faculty
>>Development, that appeared on the POD listserv.  You were requesting
>>information/data related to the Personal Response System (PRS).
>>
>>I have been surprised at the lack of quantitative research in this area. I
>>have been able to find a number of qualitative case studies, but little
>>else.  I have found some information that was helpful to me, although not
>>quite what I was hoping to see.  In case you find it helpful:
>>
>>Hake, R.  Interactive Engagement vs. Traditional Methods: A Six-Thousand-Student
>>Survey of Mechanics Test Data for Introductory Physics Courses. American
>>Journal of Physics, Vol. 66, p. 64, 1998.
>>
>>Hake, R.  Lessons from the physics education reform effort. Conservation
>>Ecology, Vol. 5, Iss. 2, p. 28, Jan. 2002.
>>
>>Also from your "neck of the woods": Russell, J. On Campuses, Handhelds
>>Replacing Raised Hands. Boston Globe, 9/13/2003.  This article states that
>>UMass-Amherst economics professor Norman Aitken won a $460,000 grant from
>>the Davis Educational Foundation to study the impact of the PRS technology.
>>I am uncertain where that research currently stands.
>>
>>I saw that someone responded on the listserv with info related to the
>>chemistry professor from Notre Dame.
>>
>>Because of the lack of published research in this area, I have taken it
>>upon myself to study it.  I teach general chemistry classes, and with a
>>joint appointment between chemistry and the division of teacher education,
>>my primary research interests involve science education.  I have designed
>>a multi-part study where I will be comparing the use of the PRS in one of
>>my classes to several other classes including:
>>
>>1. one of my classes where I do not use the PRS;
>>2. previous chemistry classes; and
>>3. a general physics class.
>>
>>I will be conducting pre- and post-tests along with pre- and post-surveys
>>of student satisfaction.
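>>
>>As one illustration of how pre/post data of this kind are often summarized,
>>here is a minimal Python sketch of the normalized gain used in the Hake
>>paper cited above; the class-average scores below are made up:
>>
>>    # Normalized gain <g> = (post - pre) / (100 - pre), with class-average
>>    # scores in percent, as defined in Hake (1998).
>>    def normalized_gain(pre_pct, post_pct):
>>        """Fraction of the possible improvement that was actually achieved."""
>>        return (post_pct - pre_pct) / (100.0 - pre_pct)
>>
>>    # Hypothetical section averages, purely for illustration.
>>    sections = {"PRS section": (42.0, 71.0), "no-PRS section": (44.0, 58.0)}
>>    for name, (pre, post) in sections.items():
>>        print(name, round(normalized_gain(pre, post), 2))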
>>
>>I also plan to use the PRS for attendance, and to track these students
>>through subsequent chemistry classes (namely Organic Chemistry) to see
>>whether their long-term learning and abilities are improved by the
>>interactive engagement of using the PRS.
>>
>>So while I cannot be of more help at this time, I will gladly share any
>>information I gather with you, although it is likely to be the end of 2005
>>before I have anything of value.
>>
>>In the meantime, if anyone shares a publication or information with you
>>that I might have missed, I would appreciate your forwarding it to me so
>>that I can use it as well.
>>
>>Thanks and best wishes!
>>
>>Jon
>>
>>Jonathan Breiner, PhD
>>Assistant Professor, Chemistry
>>529 Rieveschel
>>University of Cincinnati
>>College of Education, Criminal Justice and Human
>>Services & College of Arts and Sciences
>>(513) 556-0713
>>
>>Hi everyone,
>>
>>Here are some links on the Classroom Performance System from the Duke
>>website.  I also found out that at least McGraw-Hill currently packages
>>the eInstruction CPS system (the same one we tried yesterday), so
>>different publishers are offering it.
>>
>>http://cit.duke.edu/ideas/tools/response.do
>>
>>http://cit.duke.edu/ideas/engaging_lectures.do
>>
>>The eInstruction website is:
>>
>>http://www.einstruction.com
>>
>>Peter Alachi
>>Academic Technology Services
>>I.S. Vice President's Office
>>403 Richards Hall
>>Tel: (617) 373-4217
>>Email: p.alachi@neu.edu
>>
>>
>>
>>
>>I have been using the CPS system by eInstruction (www.einstruction.com)
>>for the last two semesters in my general chemistry course.  The system is
>>very popular with students, as shown by the attached PowerPoint file
>>summarizing a survey I did two weeks ago.  We are using the "purchase"
>>model, where our department purchased 48 responder units, a receiver
>>unit, and software for ~$3000.  The rental model, where students purchase
>>the responders and register online for $15/semester, was used last
>>semester.
>>
>>Under the purchase model there are not enough clickers to equip every
>>student in a class of 96, but my students work in groups of three, so we
>>have enough responders for each group, and students don't seem to mind
>>that.
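>>
>>As a quick sketch of that arithmetic (Python, using the numbers above):
>>
>>    # 48 responders were purchased; 96 students answer in groups of three.
>>    students, group_size, responders_owned = 96, 3, 48
>>    groups = -(-students // group_size)   # ceiling division: 32 groups
>>    print(groups, "groups need a responder;", responders_owned, "on hand")
>>    assert groups <= responders_owned     # group work covers the class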
>>
>>I have not seen, however, any increase in test scores as a result of
>>using this system.  We introduced the clickers around midterm, and I did
>>not see any major change in exam performance.  I will keep using the
>>system, though, because of the enthusiastic student response.
>>
>>Best
>>Dave Frank
>>
>>
>>Miriam,
>>
>>We did a trial implementation of the system in a large-enrollment (1,850
>>students) chemistry course taught by five instructors, each teaching 13
>>sections.  During the lecture meetings, the system was not up to the task
>>of accepting 300 responses at a time and would lock up halfway through
>>the responses.  This caused great frustration for the students and
>>prevented us from assigning credit for responses.
>>One instructor surveyed her students for attitude during week 9 (135
>>responses), and the general feeling was that:
>>- the system had promise, but the technology problems limited its
>>  usefulness;
>>- the questions were useful in learning, but the technology problems made
>>  responding feel unimportant;
>>- the students liked being able to see immediately whether their answers
>>  were correct or incorrect;
>>- the faculty were able to correct misconceptions quickly;
>>- the responding process took too long (no doubt due to 300 or so
>>  simultaneous responses choking the system).
>>
>>We will be using the PRS in a Spring-semester course with fewer students
>>(625), and the vendor has made numerous improvements to eliminate the
>>problems, so I anticipate the students having a more successful
>>experience.  I will be conducting a survey to assess learning value.
>>
>>In researching this, I found a couple of other studies you might be
>>interested in.  If you find anything new, perhaps you'd consider sharing
>>it, and if you wish, I will share our findings from the spring.
>>
>>References:
>>
>>Anderson, D. R., Gaddis, B. A., & Schoffstall, A. M. (2003). Improving
>>Learning through Clicking: Using an Electronic Audience Response System
>>in General Chemistry. Paper presented at the 225th ACS National Meeting,
>>New Orleans.
>>Hooker, N., & Roberts, M. (2003). Trialing the Personal Response System.
>>Retrieved November 9, 2004, from
>>http://cfaes.osu.edu/facultystaff/teaching/PriceChairTeachingImprovementGrants_000.htm
>>Jackson, M. H., & Trees, A. R. (2003). Clicker Implementation and
>>Assessment. Boulder, CO: University of Colorado.
>>UMass Amherst Personal Response System (PRS) support pages. (2004, August).
>>Retrieved November 9, 2004, from http://www.umass.edu/prs/
>>
>>Dear Miriam,
>>
>>At the U. of Illinois-Urbana, we are in the midst of a SoTL project
>>on the effectiveness of a new electronic polling system that was
>>developed on our campus.  We piloted the "I-Clickers" in 7 classes
>>across campus this fall, and are collecting various kinds of data
>>from the instructors and from the students.  Unfortunately we're
>>still mid-stream and don't have any results to share yet.  I'd be
>>happy to pass along any insights when we do, though!
>>
>>Best wishes,
>>
>>Laura Hahn.
>>
>>A few random thoughts:
>>
>>I know of a couple of studies that are currently underway to test the
>>effectiveness of a particular pedagogical technique used with the
>>clickers.  Initial indications are promising.
>>
>>I can tell you that students love them: anonymous electronic journal
>>results to a question asking about them were overflowing with praise.
>>When I have done SGIDs in classes that don't use them, students sometimes
>>ask for them.
>>
>>A bit of technological advice: it is better to use a system that actively
>>polls, that is, one where the receiver sweeps to see whether responses are
>>coming in, rather than one that sits and waits for the responses.  Active
>>polling is much more time-efficient, and the passive version can get
>>frustrating for students in a large class (a toy sketch of the difference
>>follows below).  Also, find out what the expected battery life is for the
>>clickers; some kill off the batteries a lot faster than others.
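>>
>>A toy sketch of that difference (plain Python, not any vendor's API; the
>>timings and the one-at-a-time model of the passive receiver are made up
>>purely for illustration):
>>
>>    import random
>>
>>    # How long each of 300 students takes to click in, in seconds (made up).
>>    random.seed(0)
>>    arrival = [random.uniform(2.0, 30.0) for _ in range(300)]
>>
>>    def active_sweep():
>>        """Receiver keeps sweeping and accepts answers as they arrive, so
>>        the class is done when the slowest student has clicked."""
>>        return max(arrival)
>>
>>    def passive_one_at_a_time(handshake=0.2):
>>        """Toy 'sit and wait' receiver that services one unit before moving
>>        on; everyone queued behind a slow clicker has to wait."""
>>        elapsed = 0.0
>>        for t in arrival:
>>            elapsed = max(elapsed, t) + handshake
>>        return elapsed
>>
>>    print(round(active_sweep(), 1), "s vs", round(passive_one_at_a_time(), 1), "s")
>>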
>>Another piece of advice is that it takes time to write good questions for
>>the clickers; faculty should be armed with that knowledge before
>>incorporating the clickers.
>>
>>Kathy
>>--
>>
>>Kathy Harper
>>Instructional Consultant, Faculty & TA Development
>>The Ohio State University
>>260 Younkin Success Center, 1640 Neil Avenue
>>Columbus, OH 43201-2333
>>(614) 292-3644 / (614) 688-5496 (fax)
>>harper.217@osu.edu
>>
>>
>>Hi Miriam,
>>
>>I am a non-tenure-track faculty member at NMSU (New Mexico State
>>University), although I moved here from Boston...can't say I miss the
>>Boston weather!  The director of our Teaching Academy (Tara Gray)
>>forwarded your e-mail about data and insights on instant response
>>systems, as I have used one this semester in two large intro bio classes
>>(the eInstruction system).
>>
>>I just completed a student survey and am working on compiling that info,
>>and I am also comparing final course grades with attendance info from the
>>clickers.  I will be presenting this at our local teaching conference
>>(http://spacegrant.nmsu.edu/NMSU/sete/) early in January.  If you'd be
>>interested, I can send along my PowerPoint presentation with the data
>>(once it is done!), and I'd be happy to speak with you about it.  I also
>>had to prepare a paper to submit to the conference, and I'd be happy to
>>forward that (although it is more general and doesn't have the data in
>>it, as I just recently got that from students).
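>>
>>One way such a comparison might be set up, as a minimal Python sketch (the
>>file name and column names below are made up for illustration):
>>
>>    import csv
>>    from math import sqrt
>>
>>    # Hypothetical export: one row per student, with the fraction of clicker
>>    # questions answered (a stand-in for attendance) and the final grade.
>>    with open("clicker_attendance_vs_grades.csv", newline="") as f:
>>        rows = list(csv.DictReader(f))
>>
>>    attendance = [float(r["attendance_frac"]) for r in rows]
>>    grades = [float(r["final_pct"]) for r in rows]
>>
>>    def pearson_r(xs, ys):
>>        """Pearson correlation between two equal-length lists."""
>>        n = len(xs)
>>        mx, my = sum(xs) / n, sum(ys) / n
>>        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
>>        sx = sqrt(sum((x - mx) ** 2 for x in xs))
>>        sy = sqrt(sum((y - my) ** 2 for y in ys))
>>        return cov / (sx * sy)
>>
>>    # Correlation between clicker-based attendance and final course grade.
>>    print("r =", round(pearson_r(attendance, grades), 3))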
>>
>>In a nutshell, I'd say it was a pretty positive experience, with some
>>hurdles/challenges/bumps (none of which are insurmountable).  Students
>>seemed to really enjoy using the system, for a variety of reasons.  I am
>>planning on using it next semester in a 300-level general microbiology
>>course (with about 115 students).
>>
>>Let me know if there is anything you'd like to hear more about.
>>
>>Cheers,
>>
>>Michele
>>
>>*************************************************************************
>>You are subscribed to the POD mailing list. To Unsubscribe, change
>>your subscription options, or access list archives,  visit
>>http://listserv.nd.edu/archives/pod.html
>>
>>For information about the POD Network visit http://podnetwork.org
>>
>>Hosted by the John A. Kaneb Center for Teaching and Learning and the
>>Office of Information Technologies at the University of Notre Dame.
>>*************************************************************************
>
>
>--
>- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
>Karron G. Lewis, Ph.D.                 Ph:  (512) 232-1776 (direct line)
>Associate Director                     Fax: (512) 471-0596
>Ctr for Teaching Excellence            email:  kglewis@mail.utexas.edu
>http://www.utexas.edu/academic/cte/
>Division of Instructional Innovation and Assessment
>http://www.utexas.edu/academic/diia/
>
>The University of Texas at Austin
>1 University Station  G2100
>Austin, TX   78712-0546