Session 2531

 

INTICE - Interactive Technology to Improve

the Classroom Experience

 

 

Jeffrey A. Siegel, Department of Civil Engineering,

Kathy J. Schmidt, College of Engineering Faculty Innovation Center,

Justin Cone, College of Engineering Faculty Innovation Center

 

The University of Texas at Austin

 

 

Abstract

 

Interaction in the classroom is essential to improving student learning, and using Classroom Performance System (CPS) technology is one way to promote such interaction. CPS consists of student-operated remote controls and a receiver that records responses to multiple-choice questions posed by the instructor.  To promote the use of these questions and answers as a study tool, we designed a web site that provides a feedback loop through which the instructor and students can examine their responses.  The site also provides the instructor with data on individual student performance, aggregate class responses by topic area and specific question, and student participation and class attendance.

 

In the fall of 2003, we implemented CPS in ARE 346N: Building Environmental Systems, a core class required of all Architectural Engineering majors at The University of Texas at Austin. The instructor used CPS an average of five times per lecture, including opinion and subjective-response questions, and collected data on class attendance. Our evaluation of the data suggests that the majority of students felt that CPS enhanced their learning.  This observational study also suggests ways in which CPS can minimize instructor time spent on class administrative chores and, most importantly, promote student learning of engineering material.

 

Introduction

 

Actively involving college students in lecture-based classes can be challenging, but emerging technologies offer ways to engage students and enhance communication among students and between students and the instructor. One technology making headway in providing more student-centered, interactive classrooms is the Classroom Performance System (CPS). The idea is not new; a hardware system called Classtalk has been in use for the last several years. While there were successes with Classtalk, CPS provides a more developed means of actively gathering students' in-class responses without wired transmitters.  CPS consists of unique remotes for each student (purchased from the campus bookstore or borrowed from the library) and a receiver for the instructor.  When the instructor initiates a multiple-choice question, the students key in their answers, the results are saved in a data file, and the instructor can display a histogram of class results.  Individual and aggregate data are saved for each session.  When student anonymity is desired (e.g., for many of the opinion questions discussed in this paper), students can trade remotes so that responses cannot be traced back to individual students.

 

Teaching methods that promote student participation and active learning are often advocated; however, the term “active learning” lacks a common definition in the educational literature. Most educators assume that learning is inherently active, yet research suggests that for students to be actively learning, they need to do more than just listen. They must be dynamically engaged in tasks and in thinking processes. As such, “it is proposed that strategies promoting active learning be defined as instructional activities involving students in doing and thinking about what they are doing.”1 Research on undergraduate teaching advocates active student learning instead of the inherently passive lecture-discussion environment in which faculty talk and students listen. According to Chickering and Gamson,2 the best practices in undergraduate education include:

·         encouraging student/faculty contact,

·         encouraging cooperation among students,

·         encouraging active learning,

·         providing prompt feedback,

·         emphasizing time on tasks,

·         communicating high expectations,

·         respecting diverse talents and ways of thinking.

 

Awareness of the development of students’ ability to think is a common theme in much of today’s educational literature. Students need the ability to process and use information rather than merely store it. One way to assess foundational thinking skills is with Bloom’s Taxonomy.3 This taxonomy, developed in 1956, has become a classic work that classifies cognitive behaviors into six categories ranging from simple to complex. The behaviors are hierarchical, with learning at higher levels dependent upon attaining prerequisite knowledge and skills. The taxonomy helps us go beyond the vagueness implied when we say we want our students to “understand” and provides six major levels of thinking, as listed in Table 1.

 

Table 1

Six Major Levels of Bloom’s Taxonomy

Level            Characteristic Student Behaviors
Knowledge        Remembering; memorizing; recognizing
Comprehension    Interpreting; describing in one’s own words
Application      Problem-solving; applying information to produce a result
Analysis         Subdividing to show how something is put together; identifying motives
Synthesis        Creating a unique, original product
Evaluation       Making value decisions about issues; resolving controversies

 

While lecturing is the most common college teaching method, another common strategy is asking questions. As far back as Socrates, questions have been used to guide and assess student thinking. The mere asking of questions is not sufficient, however, for “there are many classrooms in which teachers rarely pose questions above the ‘read-it-and-repeat-it level’”4 and, as such, questions do not stimulate deeper thinking for students.  Accordingly, a variety of questioning strategies is recommended, and researchers suggest that questioning strategies are essential to the growth of critical thinking skills, creativity, and higher-level thinking skills.5 There is extensive literature on teacher questioning,6 as well as articles on the art of effective questioning. One way to become skilled as a classroom questioner is to use Bloom’s taxonomy to gauge proficiency and target areas for growth. Using Bloom as a guide, instructors can structure questions at each level and create questions that are meaningful and purposeful and that foster a learning environment promoting active learning. Classroom questions are often spontaneous, and while such questions can be effective, CPS provides the capability to plan and pre-program questions. This thoughtful consideration of questions helps instructors tailor information to appropriate instructional levels and keep students engaged.

 

Specific methodologies for achieving an interactive classroom have been widely described in the literature.  Mehta presented data on the value of active learning and described a method of student response to multiple-choice questions in which students held up cards marked with a letter selection.7  Students self-reported that this technique improved their learning.  Although this method gave the instructor real-time feedback on student understanding, the data was potentially incomplete and unavailable for future analysis. In another study, this technique was extended and formalized to provide students with quick feedback on their learning for each class.8 Dufresne et al. report on a teaching framework that utilizes a classroom communication system to provide feedback on student learning.10 As far back as 1996, a classroom communication system called Classtalk was employed in large undergraduate physics classes to facilitate the presentation of questions for small group work.9 Dufresne et al. found Classtalk to be a useful tool not only for engaging students in active learning during the lecture hour, but also for enhancing overall communication within the classroom.9 Lopez-Herrejon and Schulman report on the use of CPS in a computer science programming class.11 They do not report performance or student preference data, but instead focus on several examples where the feedback from CPS gave the instructor real-time insight into student learning and influenced the content or the teaching methodology of the class. Burnstein and Lederman described applications for wireless classroom systems and compared the costs and benefits of three commercially available systems, including the system that we describe in this paper.12 Not all researchers, however, have found significant benefits from CPS.  In an Advanced Chemistry class at the United States Military Academy, Blackman et al. reported that sections that utilized CPS had higher student satisfaction, but overall preparation for class and performance were not improved over classes taught with traditional lecture methods.13

 

Despite the many articles on CPS and other interactive classroom systems, there is relatively little data about whether these devices improve student learning.  In this paper, we present an observational study of the use of CPS in a junior-level Architectural Engineering class.  Specifically, we present data indicating whether student responses and performance in the class correlate with their responses to CPS questions. We also provide data on the number of questions, the level of questions, and how questioning strategies influenced learning. Our hypotheses include:

1) Students prefer CPS-supplemented lectures over traditional lectures.

2) CPS improves classroom participation for all students, especially those who do not typically ask questions or participate in discussions.

3) CPS allows instructors to monitor and evaluate student participation and attendance more easily than traditional techniques.

4) CPS provides a means to pre-plan questions at appropriate and challenging levels.

 

Description of CPS

 

CPS lets students respond to multiple-choice questions using simple IR transmitters (often called “response pads” or “remotes”). A graphical summary of students’ responses is instantly available after each question has been answered, providing opportunities for class review and discussion. All response data is automatically stored and available in multiple formats for later analysis.  Examples of questions used in ARE 346N appear in Figure 1.

 

a1) How do you calculate current flow through a neutral conductor in a 3Ø system?

A. I = √3·E·P
B. I = P/(√3·E)
C. I = P/E
D. I = E·P

a2) For which situation would an absorption cycle be preferred to a vapor compression cycle?

A. A commercial building next to a cold-water creek
B. An office building complex that accompanies a power generating plant
C. A single-family residence
D. None of the above

b) How much wood would a woodchuck chuck if a woodchuck could chuck wood?

A. A lot
B. A little
C. None
D. Don’t know

c) My learning in this class was helped most by:

A. Readings
B. Lectures
C. In-class questions and answers
D. Homework
E. Quizzes

d) The daily usage of CPS was an incentive to improve my attendance.

e) This class will contribute to my professional success.

Questions d) and e) both use the same five-point scale:

A. Strongly Disagree
B. Disagree
C. Neutral
D. Agree
E. Strongly Agree

 

Figure 1: Examples of questions asked with CPS: a1) and a2) are questions about course material, b) is a question used at the beginning of class to evaluate tardiness, c) is an example of student self-assessment of learning preferences, d) is an assessment of CPS, and e) is an evaluation of the course.
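
As a worked instance of question a1 (with hypothetical numbers, and assuming a balanced load at unity power factor so that the standard three-phase relation P = √3·E·I applies), answer B gives, for a 36 kW load served at E = 480 V line-to-line: I = P/(√3·E) = 36,000/(1.732 × 480) ≈ 43 A.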

 

CPS consists of 1) at least one receiver (multiple receivers can be networked together for greater reception), 2) one response pad per student, and 3) CPS software.  Questions can be authored and delivered entirely within the CPS software or presented in PowerPoint while CPS manages students’ responses. Hardware setup is minimal: plug the CPS receiver into an available serial port and place it at the front of the classroom.  The receiver has a wide arc of reception (roughly 180º ± 15º), but fluorescent lighting can cause interference in some classrooms.
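
To make the data flow concrete, the following is a minimal sketch, in Python, of the aggregation step that turns keypresses into the class histogram. The actual eInstruction software is proprietary, so the input format and all names below are hypothetical.

from collections import Counter

def latest_answers(keypresses):
    # Keep each student's most recent keypress for the question.
    answers = {}
    for remote_id, answer in keypresses:
        answers[remote_id] = answer
    return answers

def class_histogram(keypresses):
    # Aggregate individual answers into the class-wide histogram.
    return Counter(latest_answers(keypresses).values())

# One question's session: student 07 changes an answer from B to C.
session = [("07", "B"), ("12", "B"), ("03", "A"), ("07", "C")]
print(class_histogram(session))  # Counter({'C': 1, 'B': 1, 'A': 1})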

 

In addition to CPS’s in-class functionality, the Faculty Innovation Center (a student-fee supported center within the College that promotes enhanced instruction and multimedia development) created an online tool that allows students and instructors to track CPS data from multiple classes. Students can review specific questions asked in class, look up their attendance and check their in-class performance. Instructors can analyze attendance and performance data for the class as a whole or for individual students. eInstruction (the company the distributes CPS) offers a similar tool for free, but the Faculty Innovation Center (FIC) wanted total control over security, privacy and custom feature development.  Thus, the FIC’s web development team designed and built a tool from scratch.

 

Costs for CPS components vary from institution to institution.  At The University of Texas, students can purchase response pads for a net cost of $3 at the University bookstore.  Remotes are also available for semester-long loan at the engineering library.  For every semester in which students will use CPS, they must also purchase an enrollment code for $12.50; the same enrollment code can be used for an unlimited number of classes each semester.  The Faculty Innovation Center also purchased a receiver at a cost of $250.
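
As a quick arithmetic sketch of the student-side cost under these prices (a hypothetical helper, not part of any CPS software):

def student_cps_cost(semesters, pad=3.00, enrollment=12.50):
    # One-time pad purchase plus a per-semester enrollment code.
    return pad + enrollment * semesters

print(student_cps_cost(4))  # 53.0 dollars over four semesters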

 

Using CPS in the classroom requires additional instructor time.  For this class, the instructor spent approximately 1.5 hours per week preparing CPS questions and an additional 0.5 hours per week reviewing data and interfacing with students about the CPS technology.  The required time would increase for a larger class, although some CPS tasks could be done by a TA or grader rather than by the instructor.  There are also time savings associated with CPS: student lateness and absence were monitored by CPS without taking any class time.

 

Course description

 

In the fall of 2003, we implemented CPS in ARE 346N, Building Environmental Systems, a core class required of all Architectural Engineering majors. The instructor used CPS an average of five times per lecture, including opinion or subjective response questions and questions to ascertain class attendance and tardiness.

 

Table 2 lists the demographic information describing the 25 students in the class. As can be seen, the students enrolled in the course are representative of a typical upper-level engineering course at The University of Texas at Austin.  Sixteen of the students were juniors, seven were seniors, one was a graduate student, and one was a continuing education student.

 

Table 2
Demographic information describing students in ARE 346N (n = 25)

Gender
    Male                            18 (72%)
    Female                           7 (28%)
Ethnicity
    Caucasian                       17 (68%)
    Asian                            3 (12%)
    Latino                           4 (16%)
    African-American                 1 (4%)
Average GPA (self-reported)         2.80
Average Final Grade in Course       2.96


Results

 

Our first hypothesis posited that students prefer the use of CPS over traditional lectures. Table 3 lists answers to questions asked on the last day of class that evaluated the use of CPS over the course of the semester, and Table 4 lists responses to specific questions about the histograms.  Three-quarters of all students agreed or strongly agreed that interactive CPS questions were a positive addition to the class.  There was a similar response about the specific use of CPS: 65% of students felt that CPS should not be used less frequently, and 83% said that CPS should be used in the future.
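
To show how a table row maps from raw counts to the reported percentages, here is a minimal sketch; the counts are back-calculated from the first row of Table 3 (n = 23), not taken from the raw data.

def to_percents(counts):
    # Convert raw response counts to whole-number percentages.
    n = sum(counts.values())
    return {k: round(100 * v / n) for k, v in counts.items()}

row = {"Strongly Disagree": 0, "Disagree": 2, "Neutral": 4,
       "Agree": 12, "Strongly Agree": 5}
print(to_percents(row))
# {'Strongly Disagree': 0, 'Disagree': 9, 'Neutral': 17,
#  'Agree': 52, 'Strongly Agree': 22}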

 

Table 3
Student preferences on use of CPS in ARE 346N (n = 23)
(SD = Strongly Disagree, D = Disagree, N = Neutral, A = Agree, SA = Strongly Agree)

Statement                                                    SD    D     N     A     SA
Interactive lessons, such as those using CPS, are
better than non-interactive ones.                            0%    9%    17%   52%   22%
CPS should be used less frequently.                          17%   48%   22%   9%    4%
Dr. Siegel should use CPS for this class in the future.      0%    9%    9%    57%   26%

 

Table 4
Student preferences on value of histogram (n = 23)
(scale abbreviations as in Table 3)

Statement                                                    SD    D     N     A     SA
I found the histogram with the distribution of class
responses helped spark my interest in the subject.           4%    13%   30%   35%   4%
Seeing the histogram that showed how the class answered
gave me confidence to speak out in class.                    4%    22%   52%   17%   4%

 

When asked for further details about their responses to the histograms, 39% agreed or strongly agreed that the histograms sparked their interest in the class material, while 17% disagreed or strongly disagreed.   Although we suspected that seeing the histogram would give students confidence to ask further questions, the class was, on average, neutral on this point.

 

We were also interested in understanding the role of CPS in promoting learning, participation, and student-generated questions.  Table 5 lists three questions and responses on this subject.  The students overwhelmingly (83% agree or strongly agree) felt that answering CPS questions helped them to understand the class material.  Students were comfortable responding to CPS questions (only one student disagreed with this statement).  The use of CPS was only somewhat successful in encouraging students to ask questions (39% agree versus 17% disagree).  Follow-up questions on how CPS motivated student questions appear in Table 6.  Although 82% of students felt that asking questions (or other related participatory learning) was very or somewhat important, over half of the class (52%) seldom asked questions.


Table 5
Student assessment of role of CPS in promoting questions (n = 23)
(scale abbreviations as in Table 3)

Statement                                                    SD    D     N     A     SA
Answering questions in class helped me better
understand the content.                                      0%    4%    13%   61%   22%
I was comfortable having to respond to CPS questions.        4%    0%    22%   52%   22%
The use of CPS helped me to ask questions in class.          0%    17%   43%   39%   0%

 

Table 6
Importance and frequency of student questions (n = 23)

                                                     Very        Somewhat    Not
                                                     Important   Important   Important
Participating (such as small group work or asking
questions) in this class was:                        30%         52%         17%

                                                     Every Class  Frequently  Seldom
On the average in this class I asked questions:      26%          22%         52%

 

One value of CPS is allowing the instructor to monitor attendance in the class.  Two attendance-related questions and their responses are shown in Table 7.  Only one student disagreed that the use of CPS was an incentive to improve attendance.  The majority of students (95%) felt that attendance was crucial for success in the class.  A portion (5%) of the final grade in the class was associated with student participation, which was partially evaluated by student attendance and on-time arrival to the class; CPS made it easy to evaluate these aspects of participation.

 

Table 7
Role of CPS in motivating attendance and value of attendance (n = 23)
(scale abbreviations as in Table 3)

Statement                                                    SD    D     N     A     SA
The daily usage of CPS was an incentive to improve
my attendance.                                               4%    0%    22%   43%   30%
Attendance was crucial for success in this class.            0%    0%    4%    30%   65%

 

We made several other observations about the use of CPS:

1) Student grades correlated with their success at answering CPS questions (a calculation of the kind sketched after this list).

2) Only one student made a negative comment (“The CPS was distracting.”) about CPS on anonymous instructor evaluations or in a verbal discussion about the cost and value of CPS.  Several students made positive comments about the CPS system.

3) Instructor evaluations improved slightly (from 3.8 to 4 out of 5) between the previous year’s offering of the class without CPS and this offering.  It is hard to attach significance to this result because of the many confounding factors such as different class size, different textbook, additional instructor experience, etc.
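
The correlation in observation 1 can be checked with a calculation of the following kind; the numbers here are made up for illustration, as the actual course data is not reproduced in this paper.

from statistics import correlation  # requires Python 3.10+

cps_fraction_correct = [0.55, 0.62, 0.71, 0.80, 0.88]
final_grade_points = [2.0, 2.7, 3.0, 3.3, 4.0]

r = correlation(cps_fraction_correct, final_grade_points)
print(f"Pearson r = {r:.2f}")  # near 1.0 for this synthetic data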

 

Our fourth hypothesis was that CPS would force the instructor to pre-plan questions and to plan questions at challenging levels.  Prior to each class, the professor developed approximately five questions to assess student opinions and understanding. The course teaching assistant and the CPS teaching assistant were given information on how to categorize the questions according to Bloom’s taxonomy, and each subsequently and independently identified the level of every CPS question. The instructional designer reviewed these categorizations in order to establish a reliable percent agreement for the coding of questions.
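
A minimal sketch of the percent-agreement calculation for the two raters' Bloom codings follows; the question labels below are invented for illustration.

def percent_agreement(codes_a, codes_b):
    # Fraction of questions the two raters coded identically.
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100.0 * matches / len(codes_a)

ta_codes = ["Knowledge", "Analysis", "Application", "Comprehension"]
cps_ta_codes = ["Knowledge", "Analysis", "Comprehension", "Comprehension"]
print(percent_agreement(ta_codes, cps_ta_codes))  # 75.0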

 

During the semester, a total of 113 questions were posed using CPS, an average of five per class session. Table 8 identifies the percentage of each question type presented over the entire semester. The 13% of questions not accounted for in Table 8 were opinion questions, which typically address teaching strategies rather than course content.  For opinion questions where instructor expectation could influence student responses, students were encouraged to switch remotes so that the instructor could not trace a response to a particular student.

 

Table 8
Semester Summary of Bloom Question Level Type

Knowledge   Comprehension   Application   Analysis   Synthesis   Evaluation
21%         23%             16%           26%        0%          1%

 

It was the professor’s intent to challenge students with the questions, and an effort was made to ask questions representing the full range of thinking skills. It is interesting to note, however, that questions at the highest-order thinking skills (synthesis and evaluation) were difficult to create in the CPS format. Questions that challenged students to synthesize, discriminate, and evaluate were instead posed during small group activities and on written assessments. Open-ended questions are often used to generate class discussion and to get students to question their assumptions. In this class, CPS questions were used to gauge student understanding and to provide the professor with instructional information on how to proceed: whether to go more in-depth or to re-teach.

 

Discussion and Conclusions

 

The results suggest that students were generally positive about CPS and encourage its continued use in ARE 346N.  Although the student responses do not indicate that the histogram of results motivated additional questions, there were several scenarios in which the results of CPS questions did stimulate classroom discussion.  For example, questions for which the majority of the class selected an incorrect answer often signaled to the instructor a gap in student understanding and presented an opportunity to discuss the material in more depth.  This is consistent with the findings of other CPS-related research.11 Questions that most of the class answered incorrectly were often repeated, either immediately (when students had the benefit of eliminating one of the choices) or in a different form on weekly quizzes.  Students who followed the discussion in class or visited the CPS web site tended to learn this material.  Further, on several occasions a student asked for clarification about a specific CPS question during the instructor’s office hours; CPS helped these students identify what material they had not yet learned.

 

CPS can play a role in improving student learning.  Using Chickering and Gamson’s2 criteria as a gauge of CPS’s role in student learning, we found that:

1)       The CPS did encourage student-faculty interaction when student-generated questions followed from CPS questions and when students sought clarification on CPS questions that they did not understand in class.

2)       When a large fraction of the class answered a CPS question incorrectly, students worked in groups to find the correct answer.  This encouraged student cooperation.

3)       The CPS system provides prompt feedback.

4)       The time actually spent in learning activities is often called “time on task,” and when students are responding to CPS questions, they are on task. The CPS questions often give students opportunities for reflection and investigation, and the result is engaged students.

5)       The histogram of CPS results generally showed the level of learning in the class and indicated that students who were getting many answers incorrect needed to increase their time studying class material or clarify the material with the instructor.

6)       The variety of results, particularly on opinion questions about the class, showed students the diversity of their peers’ opinions and the variety of learning desires.

 

One of the major benefits of CPS is that it allows the instructor to pre-plan questions that address several different levels of Bloom’s Taxonomy.  The process of generating and categorizing questions for ARE 346N, although time-consuming, illuminated the scarcity of questions at the highest levels.  The instructor compensated by designing homework assignments and group projects to address the synthesis and evaluation levels.  CPS also allows the instructor to influence discussion in the class: there is some evidence14 that student-generated questions tend to be at the lowest levels without additional guidance. Fundamental to a successful implementation of CPS is thinking analytically about what purpose it serves. Because the CPS format allows only multiple-choice responses, it is not well suited to the open-ended responses typical of the highest levels of Bloom’s Taxonomy. If there can be multiple answers, or if a single correct answer is not appropriate, CPS is likely to be restrictive.

 

Although we are positive about the use of CPS and will continue to use and promote it for ARE 346N and other similar undergraduate engineering classes, we also have some reservations about the system.  While no student indicated any problems with the costs associated with the system, the faculty member, the teaching assistants associated with the course, and the FIC staff spent a considerable amount of time implementing CPS in the class.  The most time-consuming tasks were generating high-quality questions and analyzing the data quickly enough to address student weaknesses and improve learning.  Although subsequent uses of the system will require less time, it does take more preparation time than traditional lectures.  Another limitation of CPS is that it does not prepare students for non-multiple-choice exams (such as those used in ARE 346N).  However, the value of motivating students, promoting active learning and participation, and obtaining real-time data on student performance outweighs these concerns.  Additionally, one student commented on the class evaluation that the CPS remote was distracting, and another did not like that attendance and tardiness contributed to the participation grade (5% of the total grade).  Conversely, three students commented that CPS improved their understanding of the class material.

 

In the future we plan to broaden our use of CPS.  We are currently using CPS in a limited way in ARE 465, a capstone design seminar for which ARE 346N is a prerequisite. By asking students questions about the 346N material, we can evaluate student retention of information and tailor the material in 346N appropriately.  In future offerings of ARE 346N, we plan to integrate the CPS software more fully into the class to encourage more students to take advantage of a valuable study tool.  We are still completing our evaluation of the data to determine how strongly student performance on CPS questions correlates with performance on related quizzes and exams.  We are also evaluating whether CPS technology is appropriate for students of all learning styles. Decision makers considering adoption of this technology would also benefit from controlled experiments comparing student learning with CPS to learning with traditional lecture methods.

 

 

Acknowledgements


We would like to acknowledge the support of an Academic Development Grant from The University of Texas at Austin College of Engineering that was used to implement CPS in ARE 346N.  We would also like to acknowledge the information that Dr. Charles Chiu provided to us on his experiences with CPS.  The teaching assistants for the class, Joseph J. Fradella and Rajkumar S. Thottikalai, provided invaluable assistance generating and categorizing CPS questions, resolving student problems with CPS technology, importing data to the CPS web site, and performing initial data analysis. The FIC’s Dan Peters and Amar Mabbu developed the online application tool and provided technical support to the professors. Finally, we thank Natasha Beretvas, an assistant professor in the Quantitative Methods program in the Educational Psychology Department at UT Austin, for providing advice and guidance on statistical issues.

 

 

 

References

 

1.        Bonwell, C., and Eison, J. (1991). Active Learning: Creating Excitement in the Classroom. ERIC Digest, 1991091.

2.        Chickering, A., and Gamson, Z. (1991). Seven Principles for Good Practice in Undergraduate Education. Jossey-Bass Publishers, San Francisco, California.

3.        Bloom, B. S. (1956). Taxonomy of Educational Objectives. Book 1, Cognitive Domain. Longman, New York.

4.        Wolf, D. (1997). The Art of Questioning. Academic Connections 1. Retrieved January 4, 2003 from the World Wide Web: http://www.exploratorium.com/IFI/resources/workshops/artofquestioning.html

5.        Schwartz, B., and Miller, G. (1996). You Are What You Ask – The Power of Teaching Students’ Questioning Skills for Enabling Thinking. Presented at the Annual Sage Conference Proceedings: Faces of Excellence. Calgary, Alberta, Canada. ERIC Document 408 744.

6.        Shermis, S. (1999). Reflective Thought, Critical Thinking. ERIC Digest, 19991101.

7.        Mehta, S. (1995). A Method for Instant Assessment and Active Learning. Journal of Engineering Education, 84, 295-298.

8.        Mehta, S. I., and Schlecht, N. W. (1998). Computerized Assessment Technique for Large Classes. Journal of Engineering Education, 87, 167-172.

9.        Dufresne, R., Gerace, W., Leonard, W., Mestre, J., and Wenk, L. (1996). Classtalk: A Classroom Communication System for Active Learning in the College Lecture Hall. Journal of Computing in Higher Education, 7, 3-47.

10.     Dufresne, R., Gerace, W., Leonard, W., and Beatty, J. (2002). Assessing-To-Learn (A2L): Reflective Formative Assessment Using a Classroom Communication System. Pathways to Change: An International Conference on Transforming Math and Science Education in the K16 Continuum, April 18-21, 2002, Crystal City, Arlington, VA.

11.     Lopez-Herrejon, R. E., and Schulman, M. (2004). Using Interactive Technology in a Short Java Course. ITiCSE 2004, Leeds, UK.

12.     Burnstein, R. A., and Lederman, L. M. (2003). Comparison of Different Commercial Wireless Keypad Systems. The Physics Teacher, 41, 272-275.

13.     Blackman, M., Dooley, P., Kuchinski, B., and Chapman, D. (2002). It Worked a Different Way. College Teaching, 50(1), 27-28.

14.     Dillon, J. T. (1988). The Remedial Status of Student Questioning. Journal of Curriculum Studies, 20, 197-210.

 

 

 

 

Jeffrey A. Siegel is an assistant professor in the Department of Civil Engineering at The University of Texas at Austin.  He teaches classes in the Architectural and Environmental Engineering area that focus on building environmental systems, indoor air quality, and energy-efficient and healthy buildings.  Dr. Siegel has cooperated with the Faculty Innovation Center on several projects to promote active learning in his classes.

 

Kathy J. Schmidt is the director of the Faculty Innovation Center for the College of Engineering at The University of Texas at Austin. In this position, she promotes the College of Engineering’s commitment to finding ways to enrich teaching and learning. Dr. Schmidt works in all aspects of education including design and development, faculty training, learner support, and evaluation.

 

Justin Cone develops multimedia and internet applications for The University of Texas’ Faculty Innovation Center.  Justin has five years of experience with various forms of new media as both a designer and a producer. He received his B.A. in English-Creative Writing from the University of Houston.