#1 Major issues

Chris Hasegawa (chris_hasegawa@internet.monterey.edu)
18 Mar 1997 21:39:11 U


                      Subject:                              Time:  9:01 PM
  OFFICE MEMO         #1 Major issues                       Date:  3/18/97

Hello!
I am Chris Hasegawa, a professor at California State University Monterey
Bay and the evaluator for the Virtual Canyon project funded through the
Monterey Peninsula Unified School District.  The project involves a lot
of collaboration among the School District, two Universities (CSUMB and
the University of California at Santa Cruz), the Monterey Bay Aquarium,
and the Monterey Bay Aquarium Research Institute.  I am also the
principal investigator on an AT&T-funded project where we are trying to
build a learning community by providing computers, phone lines, and a
voice mail system to the families and teachers of an inner-city school
in San Francisco.

I am glad to begin the discussion by raising an issue, or question, that
seems major to my evaluation work at this time.  The first involves how
to avoid skewing the data when working with the broad audiences that our
projects affect.  Much of the input we receive comes from the most
technologically literate people, who are likely the most positive about
the project, and I don't hear much dissent unless I am able to
physically visit the sites and conduct interviews, which is simply not
possible in all cases.  I'd like to ask those of you who are involved in
national or widespread projects: how are you getting feedback from the
"less technologically inclined" folks?  This seems critical to me
because our best growth has come from folks who really had trouble with
our prototypes.  Additionally, there are groups of people who are so
suspicious of any official "government-like" entity, like a university,
that they shut down when we try evaluative or assessment activities.
Although we've used community members as interviewers, we've had a
terrible time connecting meaningfully with some folks, particularly in
inner-city and language-minority populations.  Again, we tend to hear
from people who like our stuff, but we grow the most from hearing from
folks who have problems, and we don't get enough from them.  What do you
folks do?

I also wanted to thank Mavis Green for getting us started by listing
some of her concerns and to start the dialogue by addressing a couple
which caught my eye.

  What should be the primary goals in setting up assessment and
  evaluation in this area?

I think the main question to be asked is whether the activities of the
project meet the stated goals of the project, and whether there is
exportable educational value to the project; that is, what contributions
does the project make beyond the immediate audience?  What can be shared
in terms of a useful model, disseminable products, new wisdom?  I add
the latter because I feel that most projects are successful in their
immediate context, or at least I hope they are.  But I hope evaluators
will continue to take the longer view that includes value to everyone
else, so we can all grow together.  I don't know if that addresses
Mavis' concern, but I have seen too many evaluations which simply talk
about the effectiveness of meeting the project design and fail to
address the bigger picture, and I think those of us outside the project
don't benefit as much as we might.

  Criteria for evaluators to measure the knowledge the students are 
  displaying rather than be taken by the attractiveness of the display.  

We've been having some success in our projects by having students and
teachers use a variety of measures on a particular project or display.
For example, following the lead of a project in Jonestown, PA, we've
been using a 0-3 rating scale in a project where students created a
multimedia social studies presentation as an outcome.  Teachers use the
scale to look at each of the following:
· Effectiveness of presentation (e.g., interesting, informative, creative)
· Effectiveness of stating an issue
· Accuracy of information in relation to the selected issue
· Presentation of a full picture (e.g., who, what, when, where, why, and how)
· Demonstration of insight into the issue
· Effectiveness of bringing together different points of view
· Completeness
· Organization
· Demonstration of "best work" efforts (e.g., planful, neat, showing initiative)
In this way we have been successful in getting teachers and students to
differentiate between "splashy" presentation and solid substance.  Some
of our students have really been getting into the assessment process and
are beginning to give each other some very interesting and useful
feedback.  Someday, if anyone is interested, I'll share the rubric that
a middle school class developed based on rap music.  It's a lot of fun!