Evaluating the project

It’s eight months since our project started in November 2011, which means we are nearly halfway through. This is a good time to start asking what we have achieved so far and to find ways of measuring this against our original objectives.

In January, the Information Management team held a facilitated workshop to capture baseline data about the six key record sets we had identified in our project bid. This resulted in ‘as is’ process maps and captured actions to fill gaps where our knowledge of the data flow through systems was lacking, as well as identifying who within the College is accountable for the overall record and its constituent parts.

Our initial focus was our student records, which we retain permanently. These are currently held partly in digital form within our SITS student system and partly as paper records, sometimes with files held by both central and departmental teams. As part of a College-funded initiative, we mapped the data flow from our admissions portal through the SITS system and identified where data was duplicated in both digital and paper formats. This exercise was captured in a ‘Data Management Plan’, a new process that we have started to use to document data flows and risks. Carrying out this analysis has enabled the Information Management team to develop the trust of business users and to begin building a business case for introducing document management to enrich the student record by linking scanned and born-digital documents. This should reduce the longer-term risks of core data being held outside the student record, although implementing this functionality across the College will take time and resources. The plan has now been signed off by the business owners and feedback has been very positive. As a result, project funding for Key Information Sets (KIS) and the Higher Education Achievement Report (HEAR) has been allocated to a two-year Information Analyst post within the Information Management team, allowing us to support the business in delivering these projects while ensuring that records and information management are embedded in the solutions.

How we evaluate this is our next challenge. We can document qualitative changes, such as the introduction of information management skills to the project team, and at the end of the project in 2014 we will be able to show many back-office changes, such as how we capture programme and module approval. We could, for example, document the time taken to approve a module using the paper-based process, compare this to online approval, and multiply the difference by the number of approvals per year to show a recurrent saving. We have some experience of this from the JISC Impact Calculator pilot in 2010, where we defined the cost or time per unit of change against the resource costs of making the change. We asked for help from our Finance team, who are more used to calculations of this type, so we may be going back for more help soon!
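The unit-cost approach described above can be sketched as a simple calculation. All of the figures below are purely illustrative placeholders, not measured College data:

```python
# Sketch of the recurrent-saving calculation: (time saved per approval
# x staff cost per minute) x number of approvals per year.
# All figures are hypothetical examples, not real measurements.

def annual_saving(paper_minutes, online_minutes, approvals_per_year, cost_per_minute):
    """Recurrent annual saving from moving module approval online."""
    saving_per_approval = (paper_minutes - online_minutes) * cost_per_minute
    return saving_per_approval * approvals_per_year

# e.g. paper process 90 min, online 30 min, 200 approvals a year,
# staff time costed at 0.50 per minute
print(annual_saving(90, 30, 200, 0.50))  # → 6000.0
```

The same per-unit structure would apply to any of the back-office changes mentioned above; the hard part, as the Finance team reminded us, is agreeing realistic unit figures in the first place.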

Our work on staff data has been progressing well as we analyse the individual fields of the PURE system feed, which will populate a portal to support our preparation for the Research Excellence Framework assessment in 2014. The use of HR data feeds exposes data quality and business processes in ways that were never intended, and our challenge here has been to work with HR, Finance and ITS staff to map data flows through both the HR and Finance systems. We have held several round-table meetings to agree potential changes and to ensure that implementing them does not affect other processes. Mapping the process for communicating change has also been illuminating, as accountable staff often have very different understandings of similar processes. One outcome of this work has been a better overall understanding of data uses, and we are now considering setting up an Institutional Data Governance Committee to sign off changes to reduce risks in this area.

Committee records management has been progressing well behind the scenes, and we are aiming to digitally archive 100% of our core committees (standing committees designated within our Ordinances) by 2012/13. I’ve been trialling the Evidencing Change template in this area and will be sharing it with the action learning set when we meet on 10th July.

Other areas of records management have not progressed at the same pace – there is only a limited amount of resource committed to this work, and student and staff records take priority when matched with the technical resources needed to implement change.

Estates records were a high priority in 2009-2010, when improved working relationships led to better paper records management and some involvement in the development of their project management system to add document management functionality. That project was put on hold in 2011 and has not yet been revived, so we will be working with Estates over the summer to understand any future development and to help them address the risks in the meantime.

Finance records have always been predominantly paper-based, and work in this area needs to focus on building relationships with Finance staff and understanding the roadmap for development of the Finance system. Some of the work on staff data has involved Finance staff who control the establishment, and this will be useful in building our understanding of the many aspects of financial management carried out within the College.

Our final set of records for this project is research administration records. We have been working with staff within our Research Management Directorate over the last year on a range of projects around research data management, so we have started to build relationships there and will continue to be involved as they begin to specify the functionality of a new pre- and post-award system. One output from our earlier project is a guide to retaining research administration records, so one of our frequently asked questions – ‘what documentation should I retain?’ – can now be answered with a best-practice guide.

This entry has not really answered the question of evaluation, but it has certainly raised several new issues in my mind. I think I need several different levels of evaluation criteria, depending on the stage of change each area is at. Where we are ready to make changes and have a project in place, this can be documented in one way; where change is not even being considered, a different set of criteria needs to be applied – more of that in a later post.

None of this will be solved overnight, but what we have learnt is the value of bringing key players around the table to talk about issues and of working as a multi-disciplinary team to understand each person’s role in the process.