Strengthening the research and evidence base in the access to justice sector. Part 1 of 2
Opinion from Tracey Gyateng; this post does not represent the views of The Legal Education Foundation.
Towards the end of 2019 I had a career-defining dilemma. For almost two years I had been a data science manager, spending a large proportion of my time seeking out and speaking with charities about how they could use AI techniques to improve their operations, service delivery and campaigning. But I kept hitting a major blockage. Many charities are not at a level of data maturity where they can easily extract more information from their data. Nor do they have the confidence to challenge private or public sector organisations over their use of data, even though those organisations increasingly use AI techniques to make decisions about the people most in need of support.
So here was my dilemma: should I continue to seek out and work with the pioneering organisations that could take advantage of the data they had collected/accessed, or try to share the knowledge I had gained from over 15 years as a data manager to support more charity and community organisations? I decided to quit my job and go freelance, and was fortunate enough to start a project for The Legal Education Foundation: developing a strategy to address existing deficiencies in the data infrastructure of frontline providers of legal advice and representation, and to improve the availability of data to address access to justice challenges, as part of the Justice Lab initiative.
What is the Justice Lab?
The Justice Lab is a brand-new initiative to encourage and support the community of policymakers, frontline advice agencies and researchers working in the access to justice sector to:
- Improve the quality and availability of justice system data
- Increase the volume of robust research into the justice system and pioneer the use of advanced quantitative methods in the study of justice system processes
- Increase capacity in the research community to design and deliver innovative quantitative research
- Improve the uptake of robust evidence in the design and operation of the justice system
I have been brought in to focus on the first of these programmes. To date, TLEF has worked on improving the quality of, and access to, government administrative data. A core project undertaken by Dr Natalie Byrom, director of research and learning at TLEF, has been to advise HM Courts & Tribunals Service (HMCTS) on how to embed a programme of evaluation and learning within the £1bn digital court reform, enabling researchers and the law tech sector to access data legally and ethically. You can read the report, the HMCTS response and Dr Byrom’s opinion on progress to date.
My remit is to create a strategy (outlining core activities) that will develop the data infrastructure and data use of frontline organisations working in the access to justice sector. My ethos is to seek partnerships, be humble in learning from others, and avoid duplication wherever possible. I started in September 2020 with an initial scoping of the sector, in which I spoke with 15 organisations (equally split between funders, infrastructure bodies and frontline services) and three consultants; reviewed these organisations’ strategies and theories of change (where available) and recommended documents; and attended data/digital research focused events. A report was provided to The Legal Education Foundation, and the key areas of need and challenges are summarised in this blog.
What I have learnt so far
As a typical social researcher, I have learnt that further investigation is required to develop the strategy! Which is fine: September to October was an initial testing of the waters; I now have from December to next summer to swim! Here are my findings to date. In my next blog, I will outline how I will investigate these areas.
1. Be continuously cognisant of the context the sector is working in
2020 has been a challenging year for the charity sector. With an increase in demand for services brought on by the pandemic, interviewees described their staff working long hours and in danger of burnout. Overall the sector faces an adversarial government which has continued and exacerbated a hostile environment for immigrants and asylum seekers, and more generally for people with low incomes, and for legal advice/support organisations, whose lawyers face death threats for simply upholding the rule of law. Organisations were generous with their time in speaking with me, but I am also aware that working on behalf of a funder creates an uneven power balance. Going forward I will seek to add value to the conversations I have, and to share learning and resources. One step is for me to learn out loud with this first blog.
2. Consider how to develop the data maturity of the sector
Levels of data maturity (the ability to use and learn from data) varied within and across the organisation types I spoke with, and many ranked the data maturity of the access to justice sector as low. Data maturity is not dependent on size: smaller organisations with a flat line of governance can be more agile than larger ones with vast teams that work separately and have difficulties in knowledge sharing. Research conducted by Data Orchard and DataKind UK with non-profits and social enterprises identified seven core factors for data maturity: leadership; skills; culture; data; tools; uses; and analysis, with organisations ranging from “unaware” to “mastering” within each factor. The most important factors for increasing data maturity rely on people: leadership, skills and culture (p. 6).
Should we* focus on building the data/digital skills of leaders? A wave of programmes focused on digital leadership has recently been developed, e.g. SCVO’s Senior Leaders, The Data Lab’s workshop for leaders, and the ODI’s checklist for business leaders. Should we create data communities of practice to support workers within the sector? And how do we identify and build on existing initiatives? Overall, how do we inspire organisations and show them the value of making effective use of data?
3. Collect core data
There is a strong argument for data on protected characteristics, and on other attributes which can place people at a socio-economic disadvantage**, to be collected by government and by organisations working with people seeking access to justice. How else will we know whether the justice system works equally for all people, or whether organisations are managing to reach the people most in need of help?
4. What are the outcomes the sector should be working towards?
Understanding how to measure outcomes (what changed after an intervention) versus outputs (the service/product delivered) was important to service delivery organisations.
Should there be an outcomes framework for the sector, as some interviewees suggested? This could be similar to frameworks in health, e.g. the public health outcomes framework, or in the youth sector, e.g. the framework of outcomes for young people. What can we learn from previous attempts to do this in the access to justice sector, and from other sectors’ experience of implementing shared evaluation frameworks, such as the Youth Investment Fund’s Learning Project?
5. Can we conduct an economic evaluation for the sector?
What is the value of the sector? What savings are made by providing early intervention? Having rigorous outcomes data would enable more economic evaluations to take place.
6. Look not just at outcomes, but at the quality of services
Measuring the quality of service provision was a frequently occurring topic within interviews, but it was accompanied by uncertainty over how to do this. What has been written about quality in access to justice service delivery? Is there an accessible guide/manual on what is meant by good quality service provision and how to measure it? If there is such a guide, how widely is it known across the sector?
7. It is important to measure need/unmet need and demand/unmet demand
For delivery/advocacy organisations, there was interest in learning more about clients, such as the types of people who managed to access their services and which potential clients were missing. This links clearly to organisations wanting to better understand the users of their service and the user journey: how clients hear about the service, who the key referrers are, and which other organisations clients are working with. Membership bodies have also been focusing on needs across the sector. The Law Society, in partnership with the Legal Services Board, has conducted comprehensive research on the legal needs of individuals in England and Wales; and the Advice Services Alliance has reported on social welfare needs and demands in London. For service delivery organisations, how do we support partnership working to measure needs and demand with data/digital support organisations such as DataKind UK, CAST, the Royal Statistical Society’s Statisticians for Society and Pro Bono OR (the latter two are geared towards small charities)? Or engagement with data initiatives taking place in the charity sector, such as Data Collectives? We should also support organisations to develop partnerships with academic units, especially those which specialise in quantitative modelling, as there is a paucity of robust quantitative studies.
8. More discussion and awareness raising needed on the role of algorithmic & automated decision making
AI brings an exciting opportunity to speed up the creation of information from data, and governments are rightly exploring how to utilise it. However, there has been increasing recognition that the application of algorithmic decision making within the social sector can lead to harm, entrenching or exacerbating existing societal biases. Civil society needs to become aware of algorithmic and automated decision making, and be prepared to challenge it in areas where it may lead to discrimination. A recent example is the case brought by JCWI and Foxglove against the Home Office’s use of nationality within visa decisions.
9. More collaborations and data sharing?
Collaboration varied across the three types of organisations I spoke with. The number of membership bodies in the access to justice sector is larger than I have come across in other charity sectors I have worked in. Some serve the same members, which can cause tensions and competitive behaviours that are likely to limit areas for collaboration. Spurred by COVID-19, access to justice funders quickly increased their collaborative practices. Service delivery organisations tended to have strong relationships within their regions or, depending on their specialism, strong links with other service delivery organisations. Importantly, the increased use of digital platforms such as Zoom has enabled more connections to take place, and more opportunities to contribute for organisations outside of London.
I’m open to suggestions/corrections
I’m sure there are other areas in which to develop the data infrastructure and data use of charitable organisations working in the social justice legal sector [having access to technology/software in the first instance, legaltech…]. I consider the nine areas discussed above to be data essentials, and I’ve created a project plan of activities: see BLOG 2. But I’m open to correcting my course of work with input from kind and knowledgeable (academic or experienced) people who want to get in touch! Drop me an e-mail: tgyateng[@]protonmail.com
*I will use we as I don’t plan to do this by myself!
**Dr Natalie Byrom recommended thirteen attributes for collection by HMCTS: age; disability; employment status/income; English as a foreign language; gender reassignment; highest level of education (a proxy for literacy); postcode (permanent address, to identify whether someone is in a care home, homeless, in an area of low internet coverage, etc.); pregnancy and maternity; race; religion or belief; sex; sexual orientation; and fear or distress connected with the case (e.g. domestic violence/abuse, being in detention, or being a survivor of trafficking/trauma). See Figure 0–2.