Friday, 11 December 2009

JISC MEAoT Assembly

Yesterday I attended a JISC assembly organised by the Modular e-Administration of Teaching (MEAoT) project, run by CARET at Cambridge University. The aim of the event was to explore ways in which stakeholders and other parties can be encouraged to adopt tools developed in JISC projects. Below you can see my presentation:



The second slide briefly explains the nature of BRII and the Entity Registry we are creating. The registry is a container of Research Activity Data, and it is these data, now semanticised, that we are trying to promote around the University. BRII is doing this at two levels. First, because the registry is an abstract concept, users find it difficult to understand; BRII therefore needs to provide tangible examples that use its data. Slide 3 shows this as level 1: providing practical examples of data use, one of which is the Blue Pages. However, as the Blue Pages is a new piece of software, BRII needs to find ways to promote it across its user base. Through user tests on the Blue Pages (level 2) we make sure it fits the needs of our potential users, while at the same time promoting the idea that Research Activity Data is available in our registry for other users.

There were other presentations, of course. The MEAoT people have written a summary of the event on their blog:

The Centre for Applied Research in Educational Technologies (CARET) in Cambridge University hosted a JISC assembly for the Modular e-Administration of Teaching (MEAoT) project on Thursday, 10 December 2009.

The list of participants included:

Richard Prager, MEAoT project PI, Cambridge University Engineering Department (CUED)
Laura James, CARET
Anne Clarke, MEAoT project, CARET and CUED
Guy Chisholm, MEAoT project, CARET
Avi Naim, MEAoT project, CARET
Cecilia Loureiro-Koechlin, JISC BRII project, Oxford University
Bridget Taylor, DAISY project, Oxford University
Jenny Mackness, JISC liaison for MEAoT
Rachel Tuley, Teaching Administrator, CUED
Jen Pollard, Computer Officer, Cambridge University English Department
Carmen Neagoe, Teaching Administrator, Cambridge University Judge Business School
Helen Marshall, Teaching Administrator, Cambridge University Physics Department
David Goode, Computer Officer, Cambridge University, Department of Divinity

The programme was designed to make the meeting a pleasant event. It started with a historical overview of the project (by Prof. Prager). This was immediately followed by lunch at a nearby restaurant, which helped break the ice and make everybody more comfortable.

Read more here: http://modular-e-admin.blogspot.com/2009/12/modular-e-administration-of-teaching.html

Friday, 27 November 2009

BRII's Entity Store

In our internal meetings we use the term Registry or Entity Registry to refer to the Research Information Infrastructure. Wanting to know a bit more about the meaning and technological features of this kind of store, I asked Anusha. She gave me the following lecture:

Cecilia: What is an Entity Registry and how does it differ from conventional stores?

Anusha: Let's first list our entities, so we are clear about what we mean when talking about an entity: person, organisational unit, publications (journal articles, books, chapters...), funder information and research activity information.
Now, the main difference between an entity store and a conventional store (typically a database) is that in a conventional store the columns correspond to the attributes of each entity and must be defined when the database is created. So how is this a problem? Well,
  1. We need to think up all the attributes relating to the entity (for example, all the attributes that make up a person) at the time of creation, and we cannot change our structure very easily later on (the cost of change rises exponentially with time).
  2. All the people have to follow the same structure, so you cannot account for variations very easily.
  3. You cannot accommodate all of the multiple relationships easily, e.g. a person belongs to multiple departments/colleges, has multiple roles, has multiple titles, has multiple names.
When aggregating data from multiple sources, points 1, 2 and 3 are crucial. Designing a common structure that anticipates every attribute we are going to come across in the future is near impossible.
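To make the contrast concrete, here is a minimal sketch of the idea (my own illustration in Python, not BRII's actual code): if every fact is stored as a subject-predicate-object triple, each entity can carry its own set of attributes and repeat a relationship as many times as needed, with no schema change.

  # A toy triple store: every fact is (subject, predicate, object).
  # Illustration only - the names and identifiers are invented.
  triples = [
      ("person:1", "name",     "A. Smith"),
      ("person:1", "memberOf", "dept:zoology"),
      ("person:1", "memberOf", "college:stx"),        # a second affiliation, no schema change
      ("person:1", "role",     "Lecturer"),
      ("person:1", "role",     "Fellow"),              # the same predicate repeated
      ("person:2", "name",     "B. Jones"),
      ("person:2", "memberOf", "dept:philosophy"),
      ("person:2", "website",  "http://example.org/bjones"),  # an attribute person:1 simply lacks
  ]

  def values(subject, predicate):
      """All objects recorded for a given subject and predicate."""
      return [o for s, p, o in triples if s == subject and p == predicate]

  print(values("person:1", "memberOf"))   # ['dept:zoology', 'college:stx']
  print(values("person:2", "website"))    # ['http://example.org/bjones']

In a relational table the second affiliation, the second role and the extra attribute would each force a schema decision up front; here they are just more rows.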

Cecilia: This is obviously relevant to BRII, because we are collecting information from all kinds of sources around the University and, most importantly, because we do not have control over the content or format of that information.

But beyond these technological advantages, what are the benefits that this way of organising data brings to scholars?

Anusha: The extra benefit to visitors is that we can show them multiple and different relationships between entities very easily (like collaborators, or linking funders, research activities, people and departments in whatever way we want).
For most visitors the entity registry itself will be invisible, as they will only see things like the Blue Pages.
At the same time the registry will be open to all who want to access our data. If they are interested in the data, they can build tools to analyse it in whatever way they want (we haven't done this yet, but will be doing so).
With a service like the Blue Pages, a keen observer will notice the entity registry; other users will not, and rightly so, as they need not know what's happening at the back (for example, we see nothing of how Google does its work). They can, however, observe that some entities have a lot of information and are linked in multiple ways to other entities, while others hardly have any information. The key thing is that the data can be linked very easily to other entities in multiple ways.
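A small illustration from me of how such cross-entity links can be followed with a graph query (invented data and property names, using the rdflib Python library; this is not BRII's published code, just a sketch of the kind of query a semantic store makes possible):

  from rdflib import Graph, Namespace, Literal
  from rdflib.namespace import FOAF, RDF

  EX = Namespace("http://example.org/brii/")   # hypothetical namespace, for illustration only
  g = Graph()

  # Invented facts: two people attached to the same research activity.
  g.add((EX.alice, RDF.type, FOAF.Person))
  g.add((EX.alice, FOAF.name, Literal("Alice")))
  g.add((EX.bob, RDF.type, FOAF.Person))
  g.add((EX.bob, FOAF.name, Literal("Bob")))
  g.add((EX.project1, EX.hasParticipant, EX.alice))
  g.add((EX.project1, EX.hasParticipant, EX.bob))

  # Find everyone who shares a research activity with Alice.
  query = """
  SELECT DISTINCT ?name WHERE {
      ?activity ex:hasParticipant ex:alice .
      ?activity ex:hasParticipant ?other .
      ?other foaf:name ?name .
      FILTER (?other != ex:alice)
  }
  """
  for row in g.query(query, initNs={"ex": EX, "foaf": FOAF}):
      print(row[0])   # -> Bob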

The power of the Blue Pages is mainly derived from three things:
  1. Quantity of data - the more we have, the better.
  2. Variety - a one-stop shop: visit one website rather than ten different websites.
  3. The way we present our data and the deductions/analysis we perform on it (like finding collaborators). If we can think of more deductions like this, it would be useful and make our web service more powerful.
The Blue Pages is just one way of displaying the data. We can do much more with the information, like create a graph of collaborations and areas of research across Oxford University, for example.
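To give an idea of what that last point could look like, here is a rough sketch (invented publication records, using the networkx library; not something we have built) that derives a collaboration graph from co-authorship alone:

  from itertools import combinations
  import networkx as nx

  # Invented publication records: each paper mapped to its authors.
  publications = {
      "paper:1": ["A. Smith", "B. Jones"],
      "paper:2": ["A. Smith", "C. Brown", "D. Green"],
      "paper:3": ["B. Jones", "D. Green"],
  }

  G = nx.Graph()
  for paper, authors in publications.items():
      for a, b in combinations(authors, 2):      # every co-author pair becomes an edge
          if G.has_edge(a, b):
              G[a][b]["weight"] += 1             # repeated collaborations get heavier edges
          else:
              G.add_edge(a, b, weight=1)

  print(sorted(G.neighbors("A. Smith")))         # direct collaborators of one researcher
  print(nx.degree_centrality(G))                 # who sits at the centre of the network

Once such a graph exists, standard measures like degree centrality immediately suggest who sits at the centre of a research network.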

Cecilia: and how do you see all this helping scholarly communication?

Anusha: Scholarly communication is more than just journals. Journals were, and to some extent still are, primary channels of communication, but they aren't the only ones. We now have institutional repositories which are helping with this. Also, "scholarly" in scholarly communication does not refer to the people but to the type of communication, so it's anyone (not just scholars) communicating on a scholarly topic.

Cecilia: Yes, and I guess that having all these connections facilitates these communications.

Wednesday, 18 November 2009

Blue Pages - User Tests 3

We have just started a third round of user tests of the Oxford Blue Pages. This time the tests focus on research collaborations. Collaborations can happen in projects, when writing (books, articles, etc.), or they can be informal exchanges of knowledge. The Blue Pages will display a researcher's collaborators as far as data is available. Names of collaborators will be extracted from publications, project websites and personal websites. See the screenshot below.

Users will be able to switch the grouping of data between two views. In the example below, collaborations are organised by people. When the user expands one person, the Blue Pages will show the nature of that collaboration, which in this example is one academic article plus that person being listed on the researcher's website. When the user clicks on the group collaborations button, the Blue Pages will organise data by sets of collaborations, for example by research projects or academic articles, which when expanded will show their participants. Wherever data is available within the Blue Pages, names of collaborators and research outcomes will link to their corresponding profiles.

Note: data is real but connections between people were made up for this screenshot. Click to see full picture.
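For the curious, the two views described above amount to grouping the same collaboration records along different keys. A minimal sketch with invented records (not the Blue Pages code):

  from collections import defaultdict

  # Invented collaboration records: (collaborator, kind of collaboration, item).
  records = [
      ("Dr X", "article", "Paper on heart imaging"),
      ("Dr X", "website", "Listed on researcher's personal site"),
      ("Dr Y", "article", "Paper on heart imaging"),
      ("Dr Y", "project", "Imaging methods project"),
  ]

  def group_by_person(recs):
      view = defaultdict(list)
      for person, kind, item in recs:
          view[person].append((kind, item))       # expanding a person shows the nature of each collaboration
      return dict(view)

  def group_by_collaboration(recs):
      view = defaultdict(list)
      for person, kind, item in recs:
          view[(kind, item)].append(person)       # expanding an item lists its participants
      return dict(view)

  print(group_by_person(records))
  print(group_by_collaboration(records))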

We are also testing how useful connections between data objects are to users. Data objects for us are People (researchers), Research Activities (e.g. projects), Academic Units (e.g. departments) and Funders. The Blue Pages can connect any of these to each other, for example departments with people, people with people, or people with research projects.

The Blue Pages also use research keywords to find and connect objects: for example, searching for research projects under a subject field, or finding projects in areas similar to the one displayed on screen. Although some of these examples have not been implemented yet, we ask testers how they would like to access and see such data.

Friday, 30 October 2009

Blue Pages - User Tests 2

After 3 intense weeks Anusha, Monica and I have completed a second round of user tests. This time no mock-ups but the real thing, which of course is the Blue Pages v0.00 :) We did 15 tests with a selection of academics, students, administrators and one research facilitator. Again, the feedback has proved to be an eye-opener and is helping us with our design.

Tests lasted an average of 45 minutes each and took place mostly in our testers' offices. Monica and I (and sometimes Anusha and I) visited them, so lots of buses and taxis. We used our laptop, which was set up to access our server. Testers were told that we were using a work-in-progress version of the software and that they should expect some errors; however, we encountered very few. We used samples of data from 4 areas of the University: the Faculty of Philosophy, the Department of Cardiovascular Science, the Department of Physiology, Anatomy and Genetics, and the Gray Institute for Radiation Oncology and Biology. That is approximately 250 people. Anusha had previously added data representing the organisational structure of the University, which includes divisions, departments, faculties, institutes, centres of study and research groups.

Our questionnaire had a similar structure to the one we used in the first round of tests (see below). We altered some of the questions after the first 2 or 3 tests as Anusha was adding more data. We also felt the need to repeat the first 2 questions from the first round. These questions tried to capture the testers' perceptions of the site's structure and look and feel. As we were using the real web-based version of the Blue Pages, the look and feel was completely different from the mock-up version.

Questionnaire v2.0

Introductory Questions
1. Ask tester to look at the site’s Homepage for 10 seconds. From looking at this site, what kinds of information do you think you could get from this site?
2. Who do you think this site is designed for? Why?

Tasks
3. Find Prof AAA BBB’s profile and display it on the screen.
How would you find another person in the Blue Pages who works in the same department?
4. Obtain a list of / display CCC DDD’s publications.
5. Email a list of researchers in the Philosophy department to a colleague.
6. What would you do if you found a misspelling or missing information?
7. You have just set up your new project’s website and would like it to be listed in the Blue Pages. What do you need to do?
8. Find a list of research projects (activities) under the keyword "cardiology".

Debriefing
9. Do you think the information displayed in the Blue Pages is useful for research work? Would you trust the information provided on this site?
10. What kinds of uses of the Blue Pages can you think of which are related to your work and/or studies?
11. Do you think it would be good for you to be in the Blue Pages? What kinds of benefits can you think of? (only for academics)
12. What changes and/or additions would you suggest for the Blue Pages?

Yesterday Anusha, Monica and I met to discuss the feedback. They ended up with a long to-do list. I was able to dig out a few outcomes: suggestions for features to add to or complement the Blue Pages, which are not necessarily within BRII's scope:
  • Visual (or textual?) representation of networks of people connected by their collaborations and research interests. Anusha and Monica got excited by this as this could be an interesting challenge from the technical point of view.
  • A timeline of people (and maybe projects) - this involves tracking changes to people's profiles over time. In this way we can learn about their previous roles as they rotate across areas, and their previous research interests as they progress in their careers. This would also include displaying information about people who have left Oxford. One aspect of this is already covered by the display of research outcomes (e.g. publications), as each item has a date of production attached to it.
  • Extensive work on research subject ontologies - a sort of backbone of subject matter which Anusha could use to find and highlight keywords in free-text descriptions. These keywords could then be used to interconnect people and activities, enabling the discovery of hidden connections. (A toy sketch of this kind of keyword matching follows this list.)
  • Inclusion of information about funders' calls for funding applications - one of the outcomes of these tests is that academics would mostly use the Blue Pages when they are thinking of starting something: projects and/or collaborations. This is when they would like to find relevant people; as they are at that stage, they would also welcome information about possible sources of funding.
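The subject-backbone idea in the list above could start from something as simple as matching a small controlled vocabulary against free-text descriptions. A toy sketch (invented vocabulary and descriptions, not an actual BRII component):

  import re

  # Invented controlled vocabulary of research subjects (a real backbone would be far larger).
  vocabulary = {"cardiology", "genomics", "radiation oncology", "decision making"}

  descriptions = {
      "project:1": "A study of genomics and cardiology in large patient cohorts.",
      "person:7":  "Research interests include decision making under uncertainty.",
  }

  def matched_keywords(text, vocab):
      """Return the vocabulary terms that appear in a free-text description."""
      lowered = text.lower()
      return {term for term in vocab
              if re.search(r"\b" + re.escape(term) + r"\b", lowered)}

  index = {entity: matched_keywords(text, vocabulary)
           for entity, text in descriptions.items()}

  # Entities whose keyword sets intersect become candidates for a hidden connection.
  print(index)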

Monday, 26 October 2009

BRII Inside O.R.

This week I got a copy of the November issue of Inside O.R. magazine, which includes a short article about BRII. I wrote this article after the OR51 (Operational Research) conference, where I presented a paper called Making Sense of Research Activity Information in the Information Systems stream.




Thanks to Rajan Anketell, Inside O.R. editor.

Click here to download whole issue.

Friday, 2 October 2009

Institutional Repositories: acceptance and adoption

I have been reading some literature about institutional (digital) repositories. I am interested in the human and social issues surrounding their development and implementation as well as their embedding. I think this literature is very relevant to BRII as it reports experiences in similar implementations (not necessarily technically but conceptually) in similar kinds of institutions.

Technically speaking, BRII is not building a digital repository but a Research Information Infrastructure (so somewhat broader in scope). However, the issues, technical and non-technical, surrounding the implementation of institutional (digital) repositories apply. The reasons are:
  • BRII has an institutional scope and it is situated within an academic context. It aims to collect information about research in all academic areas and its target audience is everyone within the University who carries out research-related activities.
  • BRII deals with Research Activity Data: information about research. So, not information used and generated by research (as in datasets and publications) but descriptions of research (people and activities.) This kind of information is essential to facilitate scholarly communication.
  • BRII core users are Researchers and Faculty: as content contributors of research descriptions and as users of the information deposited in the infrastructure.

Now that we are in the "development stage" of the project, we are starting to think about how to make our products more relevant to our core users, so that they understand them and use them. Of course, to make users understand the Research Information Infrastructure we first need to understand our users. Our work in BRII should not be technocentric only but should extend to reaching out to our users, speaking their own language and observing them in their own working spaces. This can be a very difficult task as we are dealing with heterogeneous groups of academics and administrators, each with different needs and perspectives (as per their research culture.)

Understanding our users is a crucial first step to achieving acceptance and adoption of the RII.

Research in the Institutional Repositories field has reported some issues concerning this.

In a study of the implementation of an Institutional Repository (IR) at the University of Rochester, Foster and Gibbons (2005) report a "misalignment between the benefits and services of an IR with the actual needs and desires of faculty." They state that the benefits of Institutional Repositories are attractive only to the institutions which host the repositories. This is because institutions (including librarians and IT developers) see IRs as a way to facilitate access, reduce costs and improve efficiency in collecting and managing the information produced in the institution. Academics, on the other hand, see themselves as belonging to research communities (not institutions) and are not concerned about the processes of managing all the data they produce. This is, perhaps, because they see the problem as affecting others and not themselves (Corbett, 2009).

At the centre of all this is the Scholarly Communication crisis. In 2002 the Scholarly Publishing and Academic Resources Coalition (SPARC), part of the Association of Research Libraries (ARL), stated that Institutional Repositories could help combat dissatisfaction with the "monopolistic effects of the traditional and still pervasive journal publishing system" (Crow (2002) quoted in Maness et al (2008)). Rieger (2008) states that to combat this crisis, academic institutions aim at "reducing costs of producing and acquiring publications and gaining control of processes from commercial publishers." Seen from this angle, institutional repositories are good and sensible solutions. Attention is therefore devoted to technical efficiencies and control of the information produced within the institution. A consequence of this is a bias towards computing and library approaches to developing IRs, which neglects the interests of other relevant groups.

From BRII's Stakeholder Analysis we have learned that terms such as technical efficiency, metadata and repository are meaningless to academics. This concurs with Foster and Gibbons' (2005) view. Seen from this technocentric perspective, IRs do not provide benefits to academics. In addition, institutional labels suggest to researchers that IRs will support the needs of the institution and not their own individual needs. Academics are indeed a different crowd:
  • Academics think in terms of reading, researching, writing and disseminating.
  • Academics have strong ties with the people interested in their own field of research, or with whom they would like to collaborate. Geographical location is not important to them.
  • Academics are interested in a relatively small subset of research information, that of their own research field. They have acquired the skills to search for and use that information in their own work.
BRII Stakeholder Analysis (Loureiro-Koechlin, 2009)

On the other hand, implementers of repositories (IT and library practitioners) hold an institutional perspective: they are interested in developing new forms of scholarly communication, controlling costs and improving data manipulation.

An example of the institutional perspective is the mapping of people and information under the umbrella of their institutions and departments. This is an obvious mismatch, as researchers see themselves as belonging to their research communities. Other issues connected to the particular ways in which research is carried out (technical and ethical) also accentuate these differences. One example is the difference in methodologies and in the nature of the data collected by natural scientists and social scientists. Carusi and Jirotka (2009) report ethical issues of archiving qualitative data in a digital format, such as consent, anonymity and privacy. They state that reusing research data for other purposes will go against research participants' wishes and therefore researchers' ideals.

Misalignments like the above are the reason why academics do not feel institutional repositories are relevant to their work. Therefore, as Foster and Gibbons (2005) argue, there is a need to get to know researchers in their own space so we can develop tools which are useful to them as well as to the institution. This improves the chances of IRs being accepted and adopted by their users.

One example is Maness et al's (2008) study of the needs and goals of institutional repository users. It reports results which contradicted the initial assumptions of the IR designers and decision makers. While designers and decision makers assumed users wanted an open-access archive for research outputs, the study found that users actually wanted a network to share learning and teaching materials, where collaborators could be identified and where their research could be promoted to institutional colleagues.

Carusi and Jirotka (2009) From data archive to ethical labyrinth. Qualitative Research.
Corbett (2009) The Crisis in Scholarly Communication, Part I: Understanding the Issues and Engaging Your Faculty. Technical Services Quarterly, vol 26 p.125-134.
Foster and Gibbons (2005) Understanding faculty to improve content recruitment for institutional repositories. D-Lib Magazine, Jan 2005.
Maness, Miaskiewicz and Sumner (2008) Using Personas to Understand the Needs and Goals of Institutional Repository Users. D-Lib Magazine, Sept-Oct 2008.
Rieger (2008) Opening up institutional repositories: Social Constitution of Innovation in Scholarly communication. Journal of Electronic Publishing.

Thursday, 17 September 2009

Data, ideas and more

Yesterday I met with Luis Martinez Uribe, Digital Repositories Research Coordinator, to talk about research data. Luis is working on the Embedding Institutional Data Curation Services in Research (EIDCSR) project, which looks at research data management and curation challenges. Previously he worked on the Scoping digital repository services for research data management project, where he collected very interesting data which forms the foundation for EIDCSR and, hopefully, many more similar projects.

Luis and I talked about overlaps between the two projects and about how some common themes arose in both our data analyses. Luis has collected information about the processes and needs of researchers regarding the creation and use of research data. These are data that researchers create and use as part of their research activities (e.g. images of the heart). He is looking at developing processes for creating metadata and preserving both metadata and research data over time, so that the data can be reused by multiple researchers in multiple ways.

In BRII I have collected information about the nature of research activities across the sciences-humanities spectrum and from the academic, administrative and strategist perspectives, and about how these activities are reflected in public data such as websites. Part of the information collected reveals the reasons for sharing Research Activity Data (e.g. improving visibility), which are the reasons BRII wants to support and enhance across the University and beyond. Of course, the data have also revealed that there are reasons against making Research Activity Data publicly available (e.g. confidentiality issues), as well as technical difficulties which make sharing a complex task.

Anyhow, we found that there were common issues arising from the creation, use and sharing of research-related data, both Research Data (EIDCSR) and Research Activity Data (BRII), particularly within the context of institutional initiatives (as opposed to individual efforts). We think that these issues deserve some attention as they influence the implementation and acceptance of efforts like EIDCSR and BRII by their core users: researchers.

...So we are outlining a paper which will explore those issues and let's see what happens.

Friday, 11 September 2009

Making Sense of Research Activity Information

This week I attended the OR51 Operational Research conference at the University of Warwick. This is a big conference for the OR community and a very exciting event. I gave a presentation in the Information Systems stream, having previously submitted an abstract. Conference activities started on Tuesday morning, but there was a dinner and wine-tasting welcome event on Monday evening. :)

The Information Systems stream was opened by Nick Davey, IBM Delivery Leader, with a presentation on Client Relationships.

I followed. My presentation was titled Making Sense of Research Activity Information. I talked about the findings of the BRII Stakeholder Analysis: the needs we have to account for and the benefits that a tool like the Research Information Infrastructure (RII) can bring to researchers, administrators, strategists and disseminators. I was a bit ambitious, and in the last few seconds of my presentation I suggested an idea for analysing in depth different stakeholders' profiles in different contexts (the creation and use of research activity data) by using Orlikowski's Model of Enactment of Technologies-in-Practice (i). This is a useful tool, a practice lens as she calls it, for looking at the use of technology and the (social) structures that influence its use. These structures are not necessarily inscribed or embedded in the technology by designers but surround users in their daily activities. See the presentation and abstract below.




Note: the last slide is just an example of how agents and contexts can be modeled by using the model of enactment of technologies-in-practice. By separating the issues in each of the boxes (interpretative scheme, norms and facilities) we can understand better why technologies are used the way they are used. Understanding all these issues within the context of the University of Oxford would help us to design tools that use information about research activities in the ways our users find most relevant and useful.

Abstract

Oxford University is a research-intensive institution. Research is carried out individually or in groups; within one subject field or in cross disciplinary collaborations. It also crosses organisational boundaries to other universities in the UK and abroad. Research expertise, resources and outcomes vary from discipline to discipline. Information about research activities is published in websites using independent systems belonging to academic units, research centres, institutes, projects and individual researchers. These systems and information have been designed to fulfil the individual needs of each area or research activity and accounting for outside needs is beyond their aims. Seen all together they look like fragmented, dispersed systems resembling Oxford’s federated, complex structure. From such heterogeneous sources of information it is very difficult to obtain a clear picture of research done in the University.

Research activity information needs to be visible and easy to access to increase the research impact of the institution, and to boost collaboration and funding. Efforts are being made to create a research information infrastructure to collect research activity data from around the University, to classify and connect those sources and to make them available in a consistent and organised way. This paper explores the perspectives and issues related to the use of research activity information and the systems through which this information can be made visible and kept updated in an accurate and consistent way. The paper is focused on a stakeholder analysis done on a sample of Oxford academic and administrative units involved in research and related activities.

Anyway, the conference was good and I saw many interesting presentations. We also went on a Shakespeare tour around Stratford-upon-Avon :)



(i) Orlikowski developed her model of enactment of technologies-in-practice based on Giddens' ideas of structuration.

Tuesday, 1 September 2009

JISC SSBR Newsletter

The JISC Institutional Innovation programme has released a new issue of their Support, Synthesis and Benefits Realisation (SSBR) Newsletter. (Issue Six - 1 September 2009.) We are happy to see that we are featured on the front page, or is it the first paragraph? :)

Activity slowed a little over the past few weeks? Looking at the significant progress made by many projects, it appears not. BRII have completed their Stakeholder Analysis report and there are several assemblies in the pipeline for projects to swap ideas and share resources. In this newsletter, you will also find news of free training, workshops and tools which you may find useful, along with key conference and seminar dates.

You can access the SSBR Newsletter from:
http://newsletter.inin.jisc-ssbr.net/2009/09/01/issue-six-1-september-2009/

There is a short article about us titled BRII Stakeholder Analysis Report:

Recently Cecilia Loureiro-Koechlin from the BRII project announced that they had reached a project milestone by completing their Stakeholder Analysis report.

Click on the following link to read the complete article: http://newsletter.inin.jisc-ssbr.net/2009/08/31/briireport/

Friday, 28 August 2009

Beyond the Repository Fringe 2009

Sally and Ben were in Edinburgh last month attending the Beyond the Repository Fringe 2009, an event for repository developers, managers, researchers, administrators and onlookers.

They shared centre stage when they gave the opening keynote speech. This is what they said:

Ben O’Steen and Sally Rumsey (Oxford) – “A sneak preview at the A-list stars of future repositories: blockbuster technical developments and the cultural drivers behind them”

Sally opens by explaining that she and Ben will be handing back and forth with Sally looking at the more library view of repositories whilst Ben will be talking about the more technical whizzy end of affairs.

Sir Thomas Bodley set up the library in Oxford and Sally is taking us through the history of the library, including a lovely quote from Francis Bacon that the Bodley "is an arc to save knowledge". We are also looking at search, 1620 style: a paper list.

The original library building quickly ran out of space and the Radcliffe Camera, the Radcliffe Science Library and the New Bodleian Library were all built. By 1914 the library was receiving a million items a year. It continues to grow and grow, and Sally shows us a preview of the storage facility in Swindon which will be helping the Bodley deal with the volume of material by 2010.

Source: DataShare Blog

Click here to read the whole transcript.

Tuesday, 18 August 2009

BluePages - User Tests 1

Anusha, Monica and I have finished a first (short) round of user tests. We got very interesting feedback, which we will use to design a first online version of the BluePages. We will have a second round in maybe a month's time.

In the meantime I am going to give you a brief overview of the process we designed for the user tests.
  • We recruited a small number of testers from within the Library Services unit plus 2 external volunteers. Most of these testers do not necessarily belong to any of the groups identified in the Stakeholder Analysis ("researchers", "administrators", and "strategists and disseminators".) As this was the first round and we only had a basic mockup, we first needed general input.
  • We designed a script which had 2 main objectives: (1) to refine our initial design and (2) to identify and embed clear purposes and uses of Research Activity Data in the BluePages. Note: I am aware that this last objective involves more than one set of user tests. However, we are using these interactions with testers to start making sense of the features that need to be enhanced so the BluePages can transmit ideas such as collaboration, connections and membership. With this, I think we can gain acceptance, which is an important part of the embedding process.


Introductory Questions
1. Ask tester to browse the site for 1 minute (or just to see the front page?). From looking at this site, what kinds of information do you think you could get from this site?
2. Who do you think this site is designed for? Why?

Tasks
1. Name the sponsor of one project run by Prof. AAAAA.
2. Obtain a list of collaborators of Prof. BBBBB and email that list to a friend.
3. You are an academic in the area of statistics who wants to explore new areas of research, and you are looking for some reading material. You are looking for information about genome-related diseases, particularly from studies that use large samples (over 10,000).
4. You are the Head of the Management department and are going to build a new Decision Making Support laboratory as part of a programme to boost research among your staff. As neither you nor your staff have any experience with building laboratories that need special computers and software, you need to find someone in Oxford who could offer you some guidance. That person could be someone who is an expert in computers and/or works in laboratories.
5. You have just set up your new project’s website and would like it to be listed in the BluePages. What do you need to do?

Debriefing
1. Ask again: what kinds of information do you think you could get from this site?
2. What is your overall opinion of the BluePages?
3. Would you use a site like this? What kinds of information would you be interested in?
  • We always had two observers during the tests (either Anusha and I, or Monica and I). We took turns asking the questions. We had copies of the script and wrote notes separately. We also gave testers the list of tasks on a sheet of paper.
  • When we finished, I transcribed all the notes and classified them by task or question. I then wrote a summary highlighting the main issues under each task or question.
After this we met to discuss our next strategy. This involves getting a clearer idea of the data "objects" that are searched for and browsed in the BluePages, and of the ways these objects can be presented and connected. But I won't say more because this is a topic for another post.

Tuesday, 11 August 2009

BluePages - User Tests

Anusha and I are running a first round of user tests. For this we are using a mockup of the BluePages that Monica made with Balsamiq. Nice tool, by the way; you can see some screenshots of a previous version of our mockup here.

I am using the term user tests as we are aiming to do a bit more than just assess the usability of our initial design. With these tests we are also trying to assess the perceived usefulness of the BluePages, that is, how useful a tool like the BluePages and/or the information it offers is to its users' work. I think perceived usefulness plays an equal or perhaps more important role than usability in the acceptance of a technological tool. If a tool is user-friendly but users do not think it is useful for their work, they will not use it.

The stakeholder analysis revealed that people around the University already deal with research activity data using their own systems (manual, computer-based or mixed). These systems are sometimes not as efficient as they could be, but people trust them because they work for them. Therefore new tools like the RII and the BluePages should allow users to:
  • do what they are already doing in a more efficient and faster way (less rekeying, less duplication of data),
  • do things they could not do or struggled to do before (collect a list of experts in one field from across the University),
  • discover new uses for this information (find connections between people and between projects through their keywords).
Anusha and I designed a user test script, comprised of some questions and tasks. In one of the tasks, for example, we ask testers to find someone with a particular expertise who can guide them in the design of a laboratory.

At the end of the tests we have informal conversations (with tea/coffee and biscuits!) We ask testers their overall opinion of the site and whether they would like or need to use a tool like this for their work.

The tests have been very useful so far for refining our initial design but, most importantly, for defining clear purposes and uses for the BluePages. With this we want to position the BluePages within our stakeholders' work environments and thereby improve user acceptance.

The main points we want to emphasise in the BluePages are collaboration, connections (networks) and membership. These are ideas that we want to emerge from the BluePages through its use. In other words, we want people to think of the BluePages as a tool for collaboration, for making connections and for improving membership.
  • Collaboration means that users can find who is collaborating with whom in Oxford. It also means that people listed in the BluePages are potential collaborators for whoever is browsing them. Therefore if anyone would like to collaborate in or outside Oxford they would probably like to be in the BluePages.
  • Connections means that the BluePages can allow people to discover connections which were not clear before. For example connecting people who did not know each other by their research interests, or connecting research activities by their keywords or themes.
  • By emphasising the concept of membership we would like to encourage Oxford research staff to contribute their data to make their research community more visible and more accessible within and outside the University.

Monday, 27 July 2009

Stakeholder Analysis report is ready

The BRII Stakeholder Analysis report is ready and available in PDF format from here. Any comments about the report are welcome. (Click on comments below or email me.)

Thursday, 23 July 2009

Developing the Bluepages

The BRII project is very much on track.

I have finished the Stakeholder and User Needs Analysis. It will be ready for distribution soon.

Ben has been working on developing the foundations of the Research Information Infrastructure (RII). He has also created an online store for the vocabularies resulting from the BRII project. The vocab site has been given a single, central location in Oxford and now constitutes an institutional service.

Anusha is working on harvesting data for the RII. At the moment she is dealing with data about the structure of the University (a very complicated task!), which should form the basis of the RII. I will meet with her soon to talk about this, as we have to document the data and process audit. She has also promised to give me a technical overview of her work so far so I can post it on this blog.

Monica is working on a mock-up of the Bluepages, one of the proposed outputs of the BRII project. As part of the development process we are planning a series of user-test sessions with a selection of users from the University. Monica is following an iterative approach to development. She will produce a first, basic version of the Bluepages which will be tested by users. She will then work on their feedback to produce an improved version. We will do a few rounds of testing until users and Monica feel satisfied with the product.

This is what Monica says:

After a few meetings and brainstorming with the team, many layout considerations were made and an initial timeline for the Bluepages was created. We were able to gather enough information about what data to include and how the website should behave after data harvesting. We now have a good idea about the direction we want to take with the Bluepages.

The outcomes of the meeting include a definition of the look and feel of the website and a list of many possible features that could be included depending on time of development and resources available. I will create a mockup with a few screens that will represent the initial Browse by People and Profiles section of the website. (See pictures below.)

Click on pictures to view larger versions.

The following is a list of layout considerations and ideas:

General design ideas
  • Have a 'web 2.0' style of page, where simplicity and boldness will capture the user's attention
  • Introductory paragraphs that tell what the website does (purpose) before we show how it works and why it is good for the user
  • Attention map: more important items bolder/bigger; less important/less used items smaller
  • Fewer columns, 2 or 3 max
Gadgets/widgets/functionalities
  • "share this" button including email friend
  • download as PDF
  • "shopping basket" to add profiles/links to be printed later (as a list of 'things' the user wants to keep)
  • enable links in profiles to: people with the same interests (clicking on keywords), people in the same projects, the funder's page, the project's page, etc. (we have to decide which one links to what)
  • collaborators sections in profiles (articles written together, include projects too?)
  • RSS?
Homepage
  • logos of University, BRII, JISC (bottom right), any other?
  • quick search on top
  • identification that it is a BETA with a badge/star
  • different tabs for different areas: Browse (active from the homepage), Advanced search, How to contribute, Add your data, About, Help (others???)
  • Welcome blurb and max of 3 short paragraphs that can say WHAT IS THIS ABOUT, HOW TO USE and BENEFITS/Why
  • "Browse by" area with buttons (icons too maybe?)
  • The Research Activity button will have a 'tooltip'-style balloon showing what is included in Research Activity; we can also add this info to the Help page.
  • "slider" showing groundbreaking research
  • slider with random profiles ???
People's profiles
  • Name
  • Affiliations
  • Roles
  • Research topics/interests/keywords
  • Themes - controlled vocab/top level
  • Projects or other Research Activities -> link to the projects page inside blue pages
  • Funder -> link to funder page inside blue pages
  • Websites: department/college/personal page in each of the previous/"personal" personal page
  • Publications -> link to publications if available
  • Source of information
  • Report problem/correct your information button
Funder
  • Project
  • People
  • Department
  • Subject
We estimate this will be finished in the next few weeks. This initial design will serve for the first tests and will be useful for evaluating the speed at which I can develop the next items on the list.

About the user tests

User tests will start as soon as Monica gives us the green light. We would like to gather small groups in sessions that should last between 30 and 45 minutes. When possible we will have the sessions at our base (Osney One); however, we are prepared to visit our testers in their own offices. Testers will be asked to perform a particular set of tasks and give us their opinions. We will gather feedback from the users by observing their behaviour, from their notes and from comments in the debriefing. Group testing and refinement will repeat until a good representation of the future users has tested the website and the corrections stop or are reduced to a minimum.

Now I am looking for testers. So if you work in the University of Oxford AND you are curious and want to see what we are doing AND/OR you are interested in participating in a user test please email me!

Note: you may have noticed that I am using the word Bluepages in this post instead of Blue Pages; we are still thinking about the best way to write it. What do you think?

Thursday, 16 July 2009

Stakeholder Analysis Report

I have been busy these last weeks writing the BRII Stakeholder Analysis report. It is almost ready! I hope to have it finished by next week. Sally has already read it and she has made some suggestions. I am working on them now.

This report has over 13 thousand words (35 pages), and it will probably reach 14 thousand as I will write an executive summary after I finish it. To carry out this study I talked to approximately 30 people. Most of these interactions were formal interviews; however, I have also included data from meetings and informal conversations. The stakeholder analysis report has a slight academic touch. I made sure I followed a planned methodology, which I designed considering the characteristics of this project and the characteristics of the University. I wrote about this in a previous post. From the data analysis process four overall categories of data emerged:
  • Stakeholders
  • Perspectives on Research Activity data
  • Research Activities
  • Content of data - sources of data
The findings of this study are explained around these categories. They describe the BRII stakeholders and their interests in research activity data, as well as highlighting some user needs. (User needs will be explored further in subsequent development-related activities.) I will not go into details here as the report still needs refinement. However, I should add that during this study I have also gathered a list of data contributors for the Research Infrastructure. Most of them participated in the interviews; others were approached by Sally through other networking activities.

Once the report is ready, the plan is to distribute it across the University. I will start by sending a copy to all my interviewees - I am very grateful to them! Obviously I will also send copies to the Project Board.

After that I will start working more closely with Anusha on the harvesting of data and with Monica on the design of the Blue Pages. The next posts on the BRII Blog will explain this. I do not know exactly how the development will be done, as I am not a technical person, so I will need some input from my colleagues. My role will be to contribute what I have learnt from the stakeholder analysis and to recruit people to come to the office to do some user testing for us. So if you are interested please email me!

On a different matter...

My BRII colleagues (Sally, Ben, Anusha and Monica) will attend a Semantic Web Technical Review Workshop next Thursday, 23rd of July. The workshop will take place at the University of Bristol and is organised by the ResearchRevealed project.

Sally and Ben will give a keynote at the Beyond the Repository Fringe 2009 (30th-31st of July), which will take place at the University of Edinburgh. They will be talking about issues affecting digital repository development and content.

I will attend OR51 in Warwick (8th to 10th of September), where I will present the findings of the BRII Stakeholder and User Needs Analysis in the Information Systems stream.

Thursday, 18 June 2009

Stakeholder Buy-in

On the 9th of June we hosted an Assembly. The topic was stakeholder buy-in, and we invited participants from 6 other JISC-funded projects and a representative from the JISC. (Find the programme here.) Our aim was to have a small but friendly gathering where we could exchange ideas, issues and problems related to stakeholder engagement. Basically, we asked presenters to share their practical experiences of trying to engage their stakeholders. The event proved to be veeeeery interesting and useful for everyone! And I do not think I am exaggerating.

Most of our projects are focused on the development of a technological tool, and that on its own is an enormous task. Activities such as interviewing users, user needs analysis, writing specifications and prototyping are very much connected with the projects' cores. Somehow we take for granted that everyone is going to like our product and find it useful. That is not always the case.

Around all these initiatives there are a lot of human and social forces which affect or are affected by our projects. These forces need to be understood and assessed so they can be used in the design and implementation of our products. Stakeholders are the people and organisations that generate those forces, which can be in favour of us or against us. Knowing what they do, think and expect is essential for the success of our work. We need to get most of them on board, and when that is not possible we need to be aware of their presence… and of their reasons for not liking us.

Anyway, we had 6 presentations, all of them different. (Yes, it was unbelievable how different our approaches were.) I guess these differences depended on the kinds of organisations we work in, the kinds of projects we are working on and the kinds of stakeholders we are aiming at. I have embedded the presentations below so you can have a look at them (in order of presentation). Next to them I have attached some comments that Sally Rumsey (BRII project manager) wrote.



BRII [Oxford]
  • Aims at efficient sharing of research activity information by using semantic web technologies.
  • Exploratory phase around the whole University, aimed at understanding organisational structures and research cultures
  • Difficult to make sense of the chaotic structure of Oxford
  • Polarized views were found with respect to uses and needs of research activity data
  • Views are influenced by research field, kind of job (academic or non-academic) and scope (does the job involve just one field or department, or more: divisional level, University level, cross-disciplinary?)



CAIRO [Roehampton]
  • Aimed at high level enterprise systems
  • Developed a communication plan
  • An intellectual thread runs through all communications whatever the medium
  • Delivering this high intensity communications is hard work
  • There is a distinct gulf between the strategies and the technologies



IDMAPS [Newcastle]
  • Aiming to improve institutional data flow
  • Systems have grown up piecemeal
  • 32 separate systems so data sharing is problematic
  • Creating a new information architecture which will lead to personalised services
  • Selling data flow as an institutional problem
  • Created ‘Wall of data doom’ diagrams of systems. Invited comments on their interpretation of systems based on the interviews.



SLAP [Gloucestershire]
  • Simplifying learner administration processes
  • Aiming to improve student enquiries and applications processes
  • Identified blockers and supporters within their stakeholders
  • Created a graph of Power against Interest and aimed communications at each group
  • Use staff news publication for regular updates
  • Demonstrate progress and change to counter any cynicism about nothing ever happening



Academic Networking [Cambridge]
  • Created using user experience methodology
  • Research phase followed by design phase
  • Need to design the criteria and then decide the number of participants
  • Used academia.edu and LinkedIn to find participants
  • Asked the question “Where are the problems?”
  • Clustered people who behave in the same way
  • Used similar behaviours to create 3 personas (a bit like use case)
  • Get input from stakeholders. Using a paper based task helped create ownership. Ask lots of ‘Why?’ questions when testing paper prototype
  • Keep going until every participant is happy
  • Read full text



eAdministration of Teaching [Cambridge]
  • Created sort of entities of jobs, people, units. Jobs are a teaching activity done by one person
  • Using 6 departments for this project
  • Involve those who will be involved in future ongoing maintenance of the system from early on
  • A problem of how to maintain momentum of input. Develop user forum
  • Read full text
Guest Speaker

After lunch Susannah Wintersgill, Head of Internal Communications, Public Affairs, Oxford University, gave a presentation on stakeholder buy-in for a project that involves the design of a new staff web gateway for Oxford University.

She told us that the University website has over 7,000 pages managed by around 24 groups and includes about 187 departments etc. Any decisions have to be approved by Congregation, a body of around 4,000 people. This preserves academic freedom and the democratic structure of Oxford.

There are major questions of who the stakeholders are and how to reach them. Three groups were set up:
  • Steering Group (for direction)
  • Consultation Group (as representative of the Colleges and University as possible; carries out user testing)
  • Contesting Group – vocal and critical at set milestones, with clear criteria and remit. They are there to challenge and critique, not for every thought of theirs to be incorporated into the website
They spent months on user testing and user acceptance testing, and from that she was able to recommend a few things:
  • Good to build in communications from an early stage.
  • Use volunteers as champions who will talk to others and gather research, including senior officers of the University, e.g. the VC. Use departmental newsletters to communicate. Plan and build this in from the beginning.
  • Important not just to have one editor of the website – bias and could limit development
  • Consultation is key – everyone likes to have a say
  • Danger of stakeholder fatigue: don’t overload your stakeholders. Use a mini survey to find out who else is doing similar or related work within the University
  • Important to learn from user research but not be absolutely tied to it
Breakout group

These are some notes from our last session. We discussed our approaches in groups and came up with ideas and suggestions for better practices. We hope to develop these notes into a more user-friendly format in the near future:

  • It is easy to pay only lip service to stakeholder buy-in
  • Communications plan – do it early. Importance of planning.
  • Tasks – make your stakeholders do something, but keep it within reason and not too much
  • Identify stakeholders – what do they want?
  • Sending out newsletters etc. can have mixed results; it doesn’t necessarily mean they’ll be read. Time them carefully, e.g. Friday afternoon or just before lunch
  • Choose the right tools for the job – the message, the medium and the way to know and approach your audience (identify narrow bands of different groups), e.g. some may prefer podcasts (younger?) to printed literature (older?)
  • Manage expectations
  • Try not to appear too one-sided. If everything appears marvellous people may not believe you.
  • Difficulties of selling the potential when the project is only a proof of concept or pilot. How do you show the bigger picture? Stakeholders may only see the things that are immediately relevant to them (think – light bulb is useful, but the potential of electricity). Counteract this by ensuring that the initial project is useful. Create relevant demos and pilots, something that people can use
  • Either describe by saying ‘This is the end goal and these are the prerequisites we need to get there’ (painting the bigger picture but possibly raising expectations), or say ‘This is what we’re going to do in this project that is useful to you’ (but with the danger of losing sight of the ultimate goal)
  • Communications must be ongoing. Distinguish between dissemination and discussion
  • Balance the number of stakeholders involved with the quality of the feedback.
We finished the day with a tour of the Bodleian Library.

Thursday, 11 June 2009

BRII Blue Pages

Creating Oxford University Blue Pages is one of the objectives of the BRII Project (see http://brii.bodleian.ox.ac.uk.) The Blue Pages will be a directory of expertise. Through them you will be able to search for research activities and experts in Oxford University. As I see it, it will be a sort of mega tool allowing you to see what is in Oxford's Research Information Infrastructure from different angles, at different depths, and through a variety of search options.

This morning I had a meeting with Sally and Monica to start sketching a mock-up. We made a few (technical) design decisions (which I will probably be able to comment on in a few weeks' time, if Monica helps me!). After the meeting I was left with the feeling that these Blue Pages are not going to be just some “Blue Pages”. Blue Pages is a nice name. It relates to the white and yellow pages where you were able to find telephone numbers or addresses of people and businesses. White and yellow pages were very useful in their time. However, those are old concepts.

BRII’s Blue Pages will be more than a list of something. Try to picture the Research Information Infrastructure as a sphere containing all information we harvest about research activities in Oxford. (The multidimensional cube analogy I used a few posts ago still applies. I used cubes to describe research activity objects.) Within that sphere you have information collected from different sources in Oxford and now connected by using semantic web technologies. There will be so much information that you will need something powerful to start exploring it and finding what you want. You will need to hold that sphere and turn it around as you wish until you find what you want (the way you need it) or at least until you find a starting point from where you can start digging (angle and depth).

The Blue Pages will do all that for you.

As I have mentioned in previous posts, the outcomes of the interviews I have done emphasise the need to provide access to information from different angles (are you interested in publications, in current activities, in collaborations?) and at different depths: broad perspectives – narrow perspectives, zoom in – zoom out. So, for example, if you type Pathology, the Blue Pages could give you a list of research sub-fields or research projects under Pathology. If you are not interested in details, that list would give you an overview of the sort of research done there. If you are interested in detail, you can search for specific names, or click on the links resulting from your Pathology search.
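
To make the zoom in – zoom out idea a little more concrete, below is a minimal sketch (in Python, using rdflib) of how a topic search like “Pathology” might be answered by a SPARQL query over the registry's RDF data. The brii: namespace, the use of skos:narrower for sub-fields, and the sample data file are my own assumptions for illustration; the real BRII vocabulary may well look different.

# A minimal sketch of a "zoom out" topic search over the registry's RDF data.
# Assumptions: the brii: namespace, the use of skos:narrower for sub-fields,
# and the sample file name are illustrative only, not the real BRII schema.
from rdflib import Graph

g = Graph()
g.parse("registry-sample.rdf")  # hypothetical export of the entity registry

overview_query = """
PREFIX skos: <http://www.w3.org/2004/02/skos/core#>
PREFIX brii: <http://example.org/brii/terms#>

SELECT ?subfield ?project WHERE {
  ?topic skos:prefLabel "Pathology" .
  OPTIONAL { ?topic skos:narrower ?subfield . }
  OPTIONAL { ?project brii:hasTopic ?topic . }
}
"""

# Each row gives either a narrower sub-field of Pathology, a project tagged
# with the topic, or both: the "overview" level of the search.
for row in g.query(overview_query):
    print(row.subfield, row.project)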

At departmental and University levels there is a need to discover hidden connections which may lead to future collaborations. These needs may emerge from sponsors wanting to fund original, interdisciplinary research, or from departments’ research strategies which identify gaps or weaknesses in their current research. The Blue Pages will be able to connect information offered at any level to whatever other information is available in the infrastructure. A straightforward example would be the Research Topics of Interest in a researcher's profile: with one click on a profile you will be able to find other people or projects related to the same topics.
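
To illustrate the “one click on a profile” idea in the same spirit, here is a similar sketch. foaf:name is a standard vocabulary term, but the brii:hasTopicOfInterest predicate and the researcher's name are made up for the example; treat this as a thought experiment rather than the actual implementation.

# Sketch: given one researcher's profile, find other people who share one of
# the same Research Topics of Interest. foaf:name is standard; the
# brii:hasTopicOfInterest predicate and the person's name are invented here.
from rdflib import Graph

g = Graph()
g.parse("registry-sample.rdf")  # same hypothetical registry export as above

related_query = """
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX brii: <http://example.org/brii/terms#>

SELECT DISTINCT ?otherName ?topic WHERE {
  ?me foaf:name "Dr X" ;
      brii:hasTopicOfInterest ?topic .
  ?other brii:hasTopicOfInterest ?topic ;
         foaf:name ?otherName .
  FILTER (?other != ?me)
}
"""

for row in g.query(related_query):
    print(row.otherName, "shares topic", row.topic)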

Note: the above are just ideas. I am not sure whether all of them will be implemented as I have presented them, or whether we will need to make some changes.

--------------------
On Tuesday we had our Assembly. The event turned out to be very interesting and everyone left the place happy. I will post on this in 1 or 2 weeks after I finish a webpage with all the presentations, comments and outcomes from the meeting.

Wednesday, 3 June 2009

Stakeholder Analysis Report 1

On Tuesday we had our 3rd Project Board meeting. For that I prepared a short presentation with an overview of my data. I had previously emailed them a list of the stakeholders I had interviewed, with some comments, which gave them an idea of the kinds of (administrative/academic) areas that have participated so far. I also prepared a couple of slides to help me send a clearer message.



And this is roughly what I said:

This is an overview of the kinds of stakeholders I have met, which gives me an idea of their particular interests in the project and their needs for research activity data.

Top down – Bottom up (slide 2)
First, I found opposing views of how stakeholders perceive BRII. While some people see it as having a top-down approach, or as being imposed from the top levels of the University, others see it as a bottom-up initiative allowing researchers to make their work more visible to the administrative levels and outside the University. Mostly, administrative staff at departmental and University levels have a bottom-up perception and academics have a top-down perception. The following are short profiles of these groups:

Academics already get the information and visibility they need and want. They do not have an interest beyond their field (which can comprise interdisciplinary sub-fields) or beyond the network of connections they already have. Getting more contacts and promotion is part of their job, but they do this by attending conferences and publishing.
  • To find information about finished projects, they look at publications
  • To find information about current and new projects they attend conferences and request information from Research Services
PIs do not use the web as much as one would expect. They are mostly limited to academic journals and databases; some of them use bibliographic indexing systems and email-based alert services which tell them about new publications in their area. That is how they find information about people and potential collaborations.
Some of them think web visibility is important because they know that sponsors look at their websites, but due to their multiple responsibilities, updating their websites gets pushed down the list. Academics in competitive research areas put more effort into keeping their information up to date online.

Administrative staff’s job involves overseeing a number of projects, mostly from a financial and project-management point of view. They are driven by the need to bring more money into their departments/units from external funding, and strategies are drawn up to see how this can be achieved. All of these strategies also aim at increasing competitiveness in order to achieve better visibility and reputation, at both individual and departmental levels, in the eyes of sponsors.

Some strategies involve recruitment, others mentoring, and others tracking and controlling current projects. Other strategies involve promoting collaborations and finding particular expertise that is needed. The more competitive the area, the more essential collaborations are. For all of these tasks there is a need for information from within and outside their fields of expertise (within = current projects; outside = current and potential collaborations). Administrative staff have to deal with huge amounts of information, which they get from multiple sources and format according to their needs. A system which helps them (1) manage and (2) find information would be very useful.

At the (higher) University level, the need for information that crosses academic fields is more pronounced. People here are not specialists looking for information in their own field; they want information that provides an overview of what is going on across the University. Being able to find information on academics and/or activities using non-specialised language is important, as the person searching is not an expert, and the information will probably be disseminated to a variety of people who may not be experts either.
-----------------------------
The bars in slide 2 represent a continuum of stakeholders’ information needs. Towards the top of the continuum you need information that crosses areas and connects information from different sources. Towards the bottom you need in-depth information on specific areas, which could perhaps complement and enhance academic journals or databases. This connects to the idea of a zoom in – zoom out tool that I wrote about a few weeks ago.

The right-hand bars represent the flow of information. The actual information on research activities is generated by the researchers; it is their information. Academics make use of this information, particularly publications relevant to their field(s) of expertise. Requests for this information are made by administrative staff at departmental level so they can keep (financial) control of their research activities. At University level, requests are made to departments for information which summarises and highlights some of their research.

Research Culture (slide 3)

Research culture is a spectrum of cultures ranging from the Humanities to the Sciences. I have drawn this as a spectrum to emphasise the fact that most researchers are not located at either extreme but somewhere in between.

Humanities: researchers tend to work on their own and need fewer resources. Some academics have never had a research grant in their lives. Outcomes: publications and books. Their findings are more permanent over time. (Of course I do not want to generalise to all researchers in the humanities.)

Sciences: researchers work on bigger projects (in duration and size) and get their money from external funding (hence the need to be more visible to sponsors, and the more competitive mentality). They need specialised resources, buildings and technology. Outcomes: publications, software, discoveries and findings which can be applied in practice (e.g. drugs, treatments); some of their findings are more ephemeral. (Of course I do not want to generalise to all researchers in the sciences.)

The bigger the project (in size, interdisciplinarity, collaborations and resources), the greater the need for funding. For this, being able to sell/advertise their research to sponsors is vital.

Current Systems (slide 3)

With respect to sources of information, the bigger the area the bigger the need to organise and manage data.

Smaller departments are able to manage their information manually or with spreadsheets. They centralise the control of data and the administration of research activities; there is one person gathering information to upload to the website.
Bigger departments have systems or more staff managing their data. They are able to distribute some administrative responsibilities such as updating websites and reporting.
Everyone finds Research Services’ reports essential.

Thursday, 28 May 2009

JISC Assembly final Programme

The BRII project will host an Assembly on the 9th of June. The following is the final programme.

Date: 9th June
Time: 10:30 – 15:00
Title: Stakeholder buy-in
Venue: Board room, Osney One Building, Osney Mead, Oxford OX2 0EW

* 10:30 Coffee
* 10:50 Introduction – Sally Rumsey, BRII Project Manager
o Roundtable of Presentations and discussions on Stakeholder buy-in.
View from Oxford University – Cecilia Loureiro-Koechlin
View from CAIRO-Roehampton – John King
View from IDMAPS – Newcastle – Sunil Rodgers
View from SLAP – Gloucestershire – Stuart McQuaid
User research and user-centric design and how this can engage campus audience – Cambridge, Academic networking – Anne-Sophie de Baets and Oszkar Nagy
How stakeholder engagement works within e-admin of teaching – Cambridge, e-Admin of Teaching – Matthew Jones
* 12:30 Lunch
* 13:30 Presentation by Susannah Wintersgill, Head of Internal Communications, Public Affairs, Oxford University
* 14:15 We form groups to work on “a comparison of methods between participant projects” -> I thought we could use this slot to work on the document we have to send to JISC afterwards. They want a clear outcome from each assembly; ours will be written up in a report to JISC. If anyone can think of a better idea for our Assembly outcome, or of a better way to use this last slot, please let me know.
* 15:00 Assembly ends

We are also organising a tour of the Bodleian Library at 16:00, which will last 30 minutes.

Map to Osney One.

Monday, 18 May 2009

Data Analysis

In my previous post I talked about my initial thoughts on the data I am collecting from my interviews. That was an exercise to warm my brain up and start thinking about qualitative data analysis, and the categorisation and coding of data. In this post I would like to briefly explain the methodology I am using to analyse that data. Be careful... this post has a bit of theory on methodology, but I'll try to keep it simple.

To start, I have to say that I have been contacting administrative and academic staff from around the University. (You can see a classification of interviewees in my previous post.) I have done this by using contact details given to me by people I had previously met. So, for example, if I interviewed Dr X and he suggested I could contact Professor M, I would then contact Professor M via email and say “Dr X gave me your name and suggested I could talk to you...” This has helped me gain some trust and credibility with potential interviewees. It has also helped me make sure I am meeting the right people.

I have had interviews as short as 20 minutes and as long as an hour and 45 minutes. I have recorded all of them except one telephone conversation with a divisional research administrator. I was on the phone with him for an hour! Interviews have been mostly semi-structured/unstructured; I took a flexible approach to account for Oxford's heterogeneity. I would always start with the same questions (I would ask them to tell me about their jobs) and then choose questions depending on their answers. However, I always tried to keep my questions mainly within these three areas:
  1. Questions related to the creation and management of Research Management/Activity data.
  2. Questions related to the use of Research Management/Activity data (perhaps from other sources.)
  3. Questions related to issues and future uses of Research Management/Activity data.
Some of my respondents were able to cover all three areas, some only one or two; this depended, of course, on their roles and experience.

The next step was to transcribe those interviews into Word documents. I've been doing that on the train; surprisingly, this is the perfect place for me to do such a boring and tedious task. With an hour each way, I am able to transcribe perhaps 30 to 45 minutes of audio. (If you are transcribing audio files and dreading it, try doing it on the train.) I haven't done an exhaustive verbatim transcription, but I have tried to capture all the ideas covered in every interview. Now I have enough material to start the analysis.

As I have carried out interviews, my data are qualitative, i.e. texts containing my interviewees’ ideas. The aim of qualitative data analysis is to abstract those ideas into one cohesive set of statements which could stand for similar pieces of data [i]. This is not a statistical generalisation but an interpretive one [ii]. The way this works is by organising segments of text according to categories of data, or data codes. I will then go through an iterative process of rephrasing and writing summaries of all the ideas contained in each category. In doing this I am abstracting the ideas from their original contexts (e.g. the interviews or the interviewees’ jobs) and assigning them new contexts, those of their categories. Selecting categories is not a science but a kind of art: the categories depend on the way the researcher interprets the data, and they require constant reading and re-reading of the texts to make sure the categories and the analysis reflect the phenomenon under study. The end result of the analysis will help me to draw specific implications, for example the relationship between BRII stakeholders and the Research Information Infrastructure, the characteristics of the data needed, uses of research activity data, the ways of accessing and viewing information which are most useful for different roles in Oxford, etc.

Anyway, having explained the (sort of) theory behind the analysis process, I will finish this post by explaining the first set of categories that have emerged from my data so far:
  • Perspectives on Research Management data/Research Activity data, what people think about its importance, benefits, relevance to their work, accessibility, visibility, and its management. This is also about the kinds of activities that they perform that involve this kind of data.
  • Research activities – what are the actual processes connected to research activities, the types of activities, types of groups, etc., and how are they reflected in the data?
  • Content of Data/Sources of data – what "objects" are these data describing? (This category will also cover data contributors.) Other issues include the management and quality of content, difficulties in gathering data, difficulties in putting together a website, what counts as sensitive data, etc.
  • Types of Stakeholders, descriptions of departments, functions, people’s roles and their activities
  • Notes for development - anything relevant for the design of the infrastructure or the web services, including my own thoughts.
To give you an idea of the kind of data I got, here is an extract from an interview with someone from the Medical Sciences division. I have initially classified this text under Research Activities.

"Themes have no money (they are different from institutes and centres) Themes are purely a way of helping to sell their research in a way, showing where their strengths are in this university. A theme is a way to classify people. Themes are also a way of quantifying what they do."


-----------------------------
[i] Tesch, R. (1990), Qualitative Research: Analysis Types and Software Tools, New York: The Falmer Press.
[ii] Walsham, G. (1995), ‘Interpretive case studies in IS research: nature and method’, European Journal of Information Systems, 4(2).