Writing 2
By Sharon Clancy

“…in our messy, fuzzy, anarchic field of practice, how can we produce neatly packaged bundles of evidence that might be useful to busy policymakers?” (Field, 2015). [see Note]

One of the joys of working on a project like ENLIVEN, which spans two scientific disciplines, is its ability to challenge ways of thinking. My own experience of working with Computer Science colleagues over the last year, and the sharing of our different models and approaches, has shown me that collaboration requires a stripping-away process on both sides. Abandoning the mystique and obfuscation of specialised language and jargon, and challenging one another's ontological assumptions, allows for the development of shared frames of interpretation and a genuine commonality of endeavour. But it takes you out of your “comfort zone”…

As someone with an arts and humanities, social science and community practitioner background, I was uncertain about the applicability of computer science modelling to complex, and seemingly intractable, social problems. Can algorithmic reasoning really make a difference to policy making in lifelong learning when a slew of policies has been trialled within the European Union (EU) since 1993? Issues of inequality are escalating, with one in every five Europeans under 25 now unemployed, and many not in education, employment or training (NEETs). And of course those who are in employment face an increasingly precarious labour market.

Such factors lead to marginalisation, and Enliven is keen to address the range of barriers faced by young people furthest from education, training and the labour market – barriers which lead to social exclusion. We seek to do this by influencing policy makers in the area of lifelong learning, by stimulating debate, and through policy formation and evaluation.

The Enliven team is an interdisciplinary one. We want to find out how an Intelligent Decision Support System (IDSS) might support policymakers. We are trying to do so by identifying policies and programmes which have previously been trialled, and by enabling these to be assessed against various criteria of whether or not they worked. The IDSS will suggest interventions to end users, enabling them to solve similar problems by identifying what worked for their particular situation in the past. Our computer science colleagues call this ‘case-based reasoning’. In order to design and develop such a system, we needed to be clear on three key questions. Namely:

  • the key attributes/characteristics of the young people most in need of support – are they necessarily NEET?
  • the ‘end users’ of the IDSS – are they policy makers, practitioners, young people themselves, or all three?
  • what kind of data is ‘out there’ in terms of programme/initiative evaluation, and how is it informing policy currently?
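For readers unfamiliar with the term, the ‘case-based reasoning’ idea mentioned above can be sketched in miniature: given a new situation, retrieve the most similar past case that worked and suggest its intervention. Everything in the sketch below – the attribute names, the toy similarity measure, the example cases – is purely illustrative and is not the actual ENLIVEN IDSS design.

```python
# Illustrative sketch of case-based reasoning. All field names, cases, and
# the similarity measure are hypothetical, not the real ENLIVEN system.

def similarity(case_a, case_b):
    """Toy measure: fraction of shared attributes whose values match."""
    shared = set(case_a) & set(case_b)
    if not shared:
        return 0.0
    return sum(case_a[k] == case_b[k] for k in shared) / len(shared)

def suggest_intervention(new_case, past_cases):
    """Return the intervention from the most similar past case that worked."""
    successful = [c for c in past_cases if c["worked"]]
    if not successful:
        return None
    best = max(successful, key=lambda c: similarity(new_case, c["attributes"]))
    return best["intervention"]

# Hypothetical evaluation data from previously trialled programmes.
past_cases = [
    {"attributes": {"age_group": "18-24", "neet": True, "urban": True},
     "intervention": "mentoring programme", "worked": True},
    {"attributes": {"age_group": "18-24", "neet": True, "urban": False},
     "intervention": "vocational training", "worked": True},
    {"attributes": {"age_group": "25-30", "neet": False, "urban": True},
     "intervention": "job placement scheme", "worked": False},
]

new_case = {"age_group": "18-24", "neet": True, "urban": False}
print(suggest_intervention(new_case, past_cases))  # vocational training
```

The sketch also hints at why the three questions above matter: the system is only as good as the attributes chosen to describe young people, the users it answers to, and the comparability of the evaluation data it draws on.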

We immediately hit significant obstacles. ‘Policy makers’ were understood as being the critical ‘end users’ of the IDSS, but talking to them proved problematic. One complexity is that who they are varies greatly across the Enliven partnership – in different countries and in different areas of activity. Another is that the evaluation of programmes/initiatives aimed at young people is inconsistent, and the data collected is often not comparable. So, one programme might provide an evaluation based on individual case studies – lifelong learning ‘journeys’ – whilst another only deals in meta-level quantitative statistical analysis. We need to know what key data a policymaker needs to enable decision making. Even sharing an agreed definition of NEETs has not been straightforward, as all the partners within Enliven deal with different demographics and profiles. Policy-making at the European level is anything but straightforward!

We have some way to go before we are likely to have sufficient data to ‘feed’ and develop an IDSS, but we want to be creative in our responses to this issue. For instance, we are developing a stakeholder group of policy ‘end users’ who can inform us of what they need. We’re creating a framework to analyse, as finely as possible, what evaluation data is available. We’re engaging with young people so they can inform us directly about their needs. We’re also working with a programme deliverer and practitioner, using this as a case study to help us understand better how programmes and programme evaluation are designed in the real world. We are not trying to replace a human decision maker with a computer program; rather, we aim to provide an advisor system that simplifies data access and presents it so that users can make decisions more easily. We are keen to find out how well an IDSS can model the complexities of how people and policy interact in Europe’s complex educational and learning markets.

Most of all, though, we know the development of the IDSS needs to be supported by the expertise and knowledge of our Enliven team colleagues. The more we can talk and share, the better the model we develop will be, and the more helpful our efforts will be to all end users, including “busy policymakers”, in making a difference amongst the messy exigencies of our “fuzzy, anarchic field of practice”. How do we best communicate amongst ourselves to achieve this? We would love to know how you feel your research could influence the IDSS, or how you think the IDSS could impact upon your work.


Field, J. (2015) ‘Mechanising education policy with intelligent decision support systems’, The Learning Professor blog, posted 18 January 2015 (accessed 20 September 2017).