[IAEP] [Olpc-open] Fwd: Defining success

Sandra Thaxter sandra.thaxter at verizon.net
Tue Jan 31 20:09:33 EST 2012


Marta, 

 Thank you again for elucidating the broader issues around evaluating OLPC impact.  I am going to keep these emails, as they are so well written and your points so well made.  Let us all distribute this rich contribution broadly.

I like your direction: not just a grant to test, but a project to develop a system of evaluation for the 21st century.  We are in fact implementing learning that doesn't fit the 19th-century practices we inherited; it goes back to Piaget.  In Kenya, many leading educators agree that the system of exams and evaluation is failing Kenyan children, and that the system needs to connect learning to life.  The school population in Kenya is so large that many children are being left behind.

See the site http://amstref.org ... a Kenyan project making math relevant.
 
Sandra Thaxter
www.smallsolutionsbigideas.org
sandra at smallsolutionsbigideas.org
(617) 320-1098
  ----- Original Message ----- 
  From: Marta Voelcker 
  To: 'Sandra Thaxter' 
  Sent: Tuesday, January 31, 2012 3:52 PM
  Subject: RE: [Olpc-open] Fwd: Defining success


  Hi Sandra and list (sorry for writing such a long message again :) )

  I liked your comments! I have a suggestion.

  You wrote:

        What we need is a big grant to do some field research and get the data.  

  I would say: or,  a system that would do that! 

   

  The system would keep versions of kids' "productions," track kids' progress, and also track the attitudes kids bring to reaching the final product (collaboration, creativity, communication, leadership, initiative, problem solving).  Not easy, I know.  But if technology allows each learner to develop their own project, we should not evaluate all learners with the same test or exam, should we?  So we need new systems of evaluation that might be based on kids' productions and peer review.  Frameworks like the 21st century skills could be used to name some of the possible outputs, and there is research behind each of the skills to support the choice of criteria for success.
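  As a minimal sketch of what such record-keeping might look like, here is a Python fragment; all the class names, fields, and the skill list are hypothetical illustrations of the idea, not an existing tool or schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical skill tags, drawn from common "21st century skills" lists.
SKILLS = {"collaboration", "creativity", "communication",
          "leadership", "initiative", "problem solving"}

@dataclass
class Version:
    """One saved snapshot of a learner's production."""
    timestamp: datetime
    content: str

@dataclass
class PeerReview:
    """A peer's comment, tagged with the skills they observed."""
    reviewer: str
    comment: str
    skills_observed: set = field(default_factory=set)

@dataclass
class Production:
    """A learner's project, with its full version history and peer reviews."""
    title: str
    versions: list = field(default_factory=list)
    reviews: list = field(default_factory=list)

    def save_version(self, content: str):
        # Keep every version, so progress over time stays visible.
        self.versions.append(Version(datetime.now(), content))

    def add_review(self, reviewer: str, comment: str, skills: set):
        # Only accept tags from the agreed skill list.
        self.reviews.append(PeerReview(reviewer, comment, skills & SKILLS))

    def skill_profile(self) -> dict:
        """Count how often each skill was observed across peer reviews."""
        counts = {s: 0 for s in SKILLS}
        for r in self.reviews:
            for s in r.skills_observed:
                counts[s] += 1
        return counts
```

  The point of the sketch is the shape of the data, not the implementation: each child's record is their own projects and reviews, so evaluation can differ per learner instead of relying on one shared exam.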

  Another good point of developing a system, instead of or besides conducting research, is that the system would stay with the schools as a resource to evaluate each child and keep their records over time, and also as a database for the educational system to evaluate the schools.

   

  But to do that, we need leaders (educational system reps) who think about using technology to innovate teaching and learning.  I wonder what the context is in the deployments.  They probably respond to regular country/state evaluation?  Do any of them have a clear goal to change and innovate in education?  Have they defined what this change is and the criteria to identify new outputs?

   

  At this moment, we have available the greatest technology ever made to enable "the change" in education.

  Change from traditional (19th century) education to the currently desired education, a change that has in fact been desired since the early decades of the 20th century, when child psychology evolved (Dewey, Piaget) and many things happened.  Since that time there have been movements to renovate schooling: to prioritize teamwork, learning by doing, problem solving, creativity.  But that was impossible as long as all students had to read the same book and go through the same exercises and questions, because a single teacher, without technology, could not guide a whole class of students with different motivations, learning at different paces and creating their own projects about the subject.

   

  But now that technology could enable a systemic change in education, I have the feeling that few people remember that desire for change.  Or maybe they tried so hard (to change) before the technology arrived that they gave up and don't want to try anymore?  Or maybe few people understand what the change is and why it is desired...  I've been studying this lately and realized that there are many things and motivations involved.  The will for change is frequently present in national or state standards and guidelines for education, but it is not present in the kids' evaluation or assessment.  There is an important need for teachers and families to understand what the outputs of the new education are.

   

  Once reps from an educational system have the courage to say, "Yes, we want to change! Let's use technology to enable change!"

  Then it would be possible to think about, develop, and continuously improve a system to evaluate attitudes and skills development.

   

  Or maybe not; maybe the system should be developed first, and then shown to leaders (educational system reps) to convince them to experiment with change (including change in evaluation, using the new system)?

   

  Marta

   

  From: olpc-open-bounces at lists.laptop.org [mailto:olpc-open-bounces at lists.laptop.org] On behalf of Sandra Thaxter
  Sent: Monday, January 30, 2012 10:38 AM
  To: Samuel Klein; olpc-open; Ahmed, Farhan
  Cc: bdmoss at ku.edu
  Subject: Re: [Olpc-open] Fwd: Defining success

   

  Greetings List,

         The question is what kind of evaluation of OLPC usage we want, and what the most useful educational measure is.

          

    1. Document activity usage:  Are students accessing the XOs, how often, and with which activities?  This could be answered by pulling the XO data to the school server and tabulating it.  Not hard to do; however, the educational value isn't clear, which is possibly why no one has done it.
    2. Document conventional school testing for XO students:  This means measuring differences in students' performance on conventional tests, comparing students using XOs to those without.  These outcomes are useful for those of us making a case for funding, but educationally only somewhat useful.  Each country organization might do this if it had sufficient funding to cover the effort.
    3. Long-term impact:  Almost all sites are too young to measure long-term impact, which in the end is the best measurement.
    4. This program is learning through doing, through solving problems.  The change in attitude toward learning is the most important factor.  This can be measured by site visits and interviewing students and teachers.
        What we need is a big grant to do some field research and get the data.  
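  The tabulation described in point 1 above could be sketched roughly as follows.  The record fields and serial numbers here are assumptions for illustration, not the actual Sugar Journal schema or school-server format:

```python
from collections import Counter

# Hypothetical per-XO usage records, as they might look after being pulled
# from each laptop to the school server.  Field names are assumed.
records = [
    {"xo_serial": "SHF1", "activity": "Write",     "minutes": 25},
    {"xo_serial": "SHF1", "activity": "TurtleArt", "minutes": 40},
    {"xo_serial": "SHF2", "activity": "Write",     "minutes": 10},
    {"xo_serial": "SHF3", "activity": "Browse",    "minutes": 15},
]

def tabulate(records):
    """Summarize total time per activity and how many distinct XOs used it."""
    minutes_per_activity = Counter()
    xos_per_activity = {}
    for r in records:
        minutes_per_activity[r["activity"]] += r["minutes"]
        xos_per_activity.setdefault(r["activity"], set()).add(r["xo_serial"])
    return {a: {"total_minutes": m,
                "distinct_xos": len(xos_per_activity[a])}
            for a, m in minutes_per_activity.items()}
```

  Even this simple summary would answer the "which activities, how often, by how many students" questions, while the harder question of educational value remains open.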

   

  Sandra Thaxter
  www.smallsolutionsbigideas.org
  sandra at smallsolutionsbigideas.org
  (617) 320-1098

  ----- Original Message ----- 

  From: "Samuel Klein" <meta.sj at gmail.com>

  To: "olpc-open" <olpc-open at lists.laptop.org>; "Ahmed, Farhan" <farhan.ahmed at chicagobooth.edu>

  Cc: <bdmoss at ku.edu>

  Sent: Sunday, January 29, 2012 11:14 PM

  Subject: [Olpc-open] Fwd: Defining success

   

  Replying to the list.

  On Mon, Jan 30, 2012 at 4:10 AM, Samuel Klein <meta.sj at gmail.com> wrote:
  > Hello to you both.
  >
  > On Sun, Jan 22, 2012 at 8:15 PM, Ahmed, Farhan
  > <farhan.ahmed at chicagobooth.edu> wrote:
  >
  >> Is there a methodology through which OLPC tracks the concrete educational
  >> development a child goes through after he or she gets access to a laptop? It
  >> seems that tracking a child's progress over the years will allow OLPC to
  >> make substantial scientific claims about its impact.
  >
  > Agreed. There is no method shared among all deployments; each
  > country/school system has its own set of soft and hard measures of
  > development.
  >
  >> I do understand the limited effectiveness of
  >> quantifying "educational development", but I'm sure there's a
  >> well-researched methodology widely used.
  >
  > I don't know that it is theoretically limited in effectiveness;
  > however I am not aware of any single widely-used methodology across
  > different cultures or systems.
  >
  >> Furthermore, with regard to the Sugar interface, is it enabled to collect
  >> metrics on usage patterns (anonymized, of course)? Information on how often
  >> certain activities are enabled and used, the times of day a laptop sees most
  >> usage, the average data usage (mesh or the internet) and other such metrics
  >> would allow more targeted development and prioritization. Once again, I
  >> could not find any such data on the website.
  >
  > At a low technical level there is some capability to gather data - for
  > instance all machines 'call home' once after they are turned on.
  > However beyond this it has never been used to my knowledge to do so --
  > implementations so far have privileged user privacy over research
  > efficacy. I would also love to see (anonymized) collection of data as
  > you describe.
  >
  > Uruguay is the largest deployment that has gathered comprehensive data
  > on what activities are used for how long.
  >
  > You can see theirs and other reports here:
  > http://wiki.laptop.org/go/OLPC_research
  >
  >> My motivation here is to understand how OLPC prioritizes it work and backs
  >> its claims on the impact. I am doing this as part of a research project I
  >> have undertaken at my university (The University of Chicago Booth School of
  >> Business). I'd be happy to answer any questions.

  Thanks for sharing.  Can you tell us more about your research?

  Brian Moss writes:
  >> I'm currently writing my master's thesis on the OLPC program and why --
  >> despite the most honorable of intentions -- it has largely failed to live up to
  >> the hype.

  Ditto - can you elaborate on your view of what this means?

  Cheers, Sam.
  _______________________________________________
  Olpc-open mailing list
  Olpc-open at lists.laptop.org
  http://lists.laptop.org/listinfo/olpc-open


