Saturday, October 30, 2010

Measuring the value of culture

Justine Roberts, Principal




There is a lot of discussion about the role of evaluation in our work. As museums, what can we really evaluate?  What is our sweet spot as learning organizations - content knowledge, skills, attitudes?  Is it reasonable to attribute certain kinds of impacts (behavioral change, for example) to museums?  What set of instruments should we use to capture impact?  Who should do evaluation, and when?  How should the results shape our work? Which comes first, priorities or results?  And does an evaluation framework negatively impact the ways we develop public programs, or even staff professional development opportunities - does it make an exhibit or program more linear, more focused on content transfer, and less wonder-ful?  There are other questions; these just come to mind.


So it was interesting to come across a project from the Museums, Libraries and Archives Council (MLA) in the UK that attempts to put in place a comprehensive framework for thinking about the role of evaluation in social-service organizations built around self-choice learning. They start with a definition of learning and then systematically define outcomes both for visitors to MLAs (what they call Generic Learning Outcomes) and for the organizations themselves (which they call Generic Social Outcomes).  Using that as a foundation, the framework then shows how to create evaluation tools that help measure impact.


So it links content to metrics.  It is an actionable framework, and because it measures success, it is also designed to tell a coherent, fundable story about the work of a public service organization.


This tight relationship between what it means to “learn”, the types of learning expected from the audience, the wider impact of the organization’s work, and the evaluation tools designed to measure learning is very strategic.


This is a systematic framework, and it calls for a kind of ongoing self-evaluation policy, or attitude, within an organization.  Individual programs, initiatives, exhibits, or workshops are evaluated, but there are no one-off, stand-alone evaluations.  Every time something is measured (professional development, summer camps, a new stage production, etc.), it is measured against the same goals. There is no need to reinvent WHAT to evaluate for each event, and evaluation results are automatically consistent with the big, visionary impacts of the MLA (whether they show success or not).


This project steps out of the stream of day-to-day work and program delivery and thinks about evaluation as part of the overall trajectory of an organization.  That makes it an interesting way to think about the importance of evaluation and how it functions.


Inspiring Learning
The MLA describes “Inspiring Learning” as “a self-help improvement framework” and encourages MLAs to use it to:
       * Assess your strengths and plan improvements
       * Provide evidence of the impact of your activities through the generic learning and generic social outcomes
       * Improve your strategic and operational performance

What is in it?
As the saying goes, you get what you measure.  So it matters what framework you use, and you have to start with a clear agreement on key terms.  MLA does this with a definition of learning as active engagement with experience.

They expand that as follows:
  • It is what people do when they want to make sense of the world
  • It may involve the development or deepening of skills, knowledge, understanding, values, ideas and feelings
  • Effective learning leads to change, development and the desire to learn more

So they are starting by positioning the work of MLAs as something of a fusion of learning frameworks, one that takes in cognition, social/emotional learning, skills development, and personal dimensions.


The MLA believes organizations should model the kinds of learning that they hope to foster.  Because of that, self-assessment is the first step in the bigger strategic process of identifying areas where MLAs can have an impact. The Council has created a kind of SWOT worksheet, using their scale of impacts, to get you started.


From that point, the framework turns outward and looks at how to evaluate and measure impacts on individuals (GLOs) and communities (GSOs).
I want to talk about what the GLOs are because I think this way of looking at the opportunities for learning in MLAs is really nice.  MLA has identified 5 ways that audiences grow and develop through their relationships with organizations: knowledge and understanding; skills; attitudes and values; enjoyment, inspiration and creativity; and activity, behaviour and progression.

I think the yellow circle is in some ways the most interesting because it calls for lasting behavioral change as an outcome of a visit to a library, museum, gallery, or archive.  The difficulty of measuring this is one issue, and I am not sure that Inspiring Learning has solved that problem.  But it is in many ways the ideal of many museums working on exhibits and programs related to climate change, wellness, and human rights.  So it is something I think the field is very interested in figuring out.  With that exception, I think the GLOs are familiar and part of how we think about museum outcomes here in the States.  So this diagram offers a handy tool that is easy to use - and when you go to their website, each of the circles is clickable and pops up a more detailed description.


Generic Social Outcomes are intended to help measure the benefits, or social value, of these types of organizations.


So if that is the framework, then you can use it to create research tools, questionnaires, focus group guides, and other materials that support evaluation of programs and offerings against the GLOs.


This is fundamentally political


In a way, Inspiring Learning is quietly pushing MLAs to recognize their potential contributions to their local and professional communities, and also to policy discussions that affect their work in arts education, cultural funding, out-of-school-time learning, healthy communities, and more.


To that end, the MLA has taken care to ensure Inspiring Learning aligns with other existing UK policies and initiatives.

So is the framework helpful or just another hurdle?


The Museums, Libraries and Archives Council wants to make its membership stronger, more productive, and more sustainable, and to improve its social standing.  It is thinking about how to demonstrate value by creating ways to measure impacts. Fair enough, but why an evaluation framework in particular?


Inspiring Learning was developed specifically as a response to the 2007 Local Government and Public Involvement in Health Act, which shifted assessment from a service-based model to an outcome-based model. As MLA explains it, this shift meant no longer assessing the quality of programs but instead looking at how to demonstrate value by showing improved quality of life for individuals and communities.  That left the question: how do you identify the contributions of MLAs?


Clearly MLA is operating in a very different environment from the one we have here in the States.  Museums here certainly respond to funder priorities and opportunities, and some of those can shape the experience of many museums across the country as they position themselves to be more competitive.  But we don’t have the same pressure (safety net?!) of national funding, and we don’t have national mandates or policies like this one in the UK that push a sector-wide response.


Even so, The Wallace Foundation, which studies Out of School Time (OST) organizations, has come to a similar conclusion: that we need to better “link and align learning opportunities within a community” if we hope to create high-performing, high-impact nonprofits. Although working in a different country from the MLA, and coming at the question from a different point of view, they have arrived at a surprisingly similar formulation of the issue.


The Wallace Foundation clearly sees program evaluation as part of a larger process of learning and strategic thinking that will drive organizations to be more effective.  However, they do diverge from the MLA framework in some important respects.  They are not in favor of holding OSTs accountable for community-wide measures of success, and they do not try to map out the criteria that individual organizations would use to evaluate learning outcomes.


At the end of the day 


I think there is a lot to take away from Inspiring Learning.  For one thing, there is its comprehensive and consistent focus on strategy.  The tag line is “placing learning at the heart of the museum.”  But it applies that to policy as well as to people, physical space, and partners.  It is a chance to align an organization top to bottom along a consistent strategic axis.  It recognizes that exhibits succeed not just when they are coherent with the style of the museum, or when they are able to meet their fundraising targets, but when they improve the museum’s capacity, make it more competitive, raise awareness and profile, and contribute to the social capital of the community.


For another, it asks some important high-level questions and encourages social service organizations to be explicit about them.  These include: what is our public value? How do we demonstrate our contributions? What are our priorities as a cultural organization? What is our own culture as a learning organization? It can seem like an organization is struggling just to deliver services and keep the doors open these days.  But it is also important to be able to say what difference you make in the world, and why you matter.  That seems to me one of the messages of Inspiring Learning.


Inspiring Learning takes seriously the idea that MLAs have to prove, demonstrate, and perhaps even justify their work – and that can be very uncomfortable.  But the demands for evidence of need, for outcome and performance data, and for improved efficiency and effectiveness are all growing steeply here in the States as well as in the UK.  If this is our reality, a framework like the GLOs seems very handy.
