A long way to go … A study on the implementation of the learning-outcomes based approach in the EU[*]

Tim Birtwistle, Courtney Brown, and Robert Wagenaar[**]

doi: 10.18543/tjhe-3(2)-2016pp429-463

Abstract: Higher Education institutions have, in the framework of the Bologna Process, been called to re-define their degree programmes on the basis of the learning outcomes approach. This implies a change of paradigm, moving from teacher-centred to student-centred education. The Tuning project was set up in 2000 to develop, through a bottom-up approach, a methodology to achieve this shift. This methodology proved to be relevant not only for Europe but also for other world regions, including the USA, where Tuning projects were launched from 2009. In 2010, both in the EU and the USA, the need was felt to find out whether the intended modernization of learning was actually taking place and how this process was perceived by its main stakeholders. For this purpose a study was initiated, covering the period 2011 to the beginning of 2016, based on the two-pillar approach of quantitative and qualitative instruments. For the study a robust evaluation instrument was developed, consisting of surveys and in-depth interviews implemented by a research team at a selected group of Higher Education institutions, involving management, teaching staff, student counsellors and students. In this paper the outcomes of the EU part of the study are presented, with cross-references to some of the USA study results. The main outcome of the study is that in general limited progress has been made regarding the intended paradigm shift and that key expectations of the reform process have not been met. This is the case for both Europe and the USA. Although good practices have been identified, the actual implementation of the student-centred approach is not proceeding beyond a discourse on the paradigm shift and there is no certainty it will be achieved. For Europe there is also a worrying disconnect between the various tiers of the HE sector, ranging from Ministers to students, regarding the actual penetration of the student-centred approach and the education experience of the students. There has been a failure to engage with and convince academic staff about the necessity and advantages of this paradigm shift. Teaching staff are struggling to adjust to the new concepts and paradigm shift and are challenged by no longer being the “knowledge owners” but rather learning facilitators. It does not help that the vast majority of staff members have not undertaken professional development for HE teaching. Where staff development has taken place, it is too focused on process rather than on the concepts and benefits of a learning outcomes approach. The outcomes of the study should therefore be perceived as a wake-up call because, without additional and continued support, in particular for the teaching staff, the reform process could fail.

Keywords: Bologna Process; student-centred learning; learning outcomes; surveys; site-visits.

I. Origins of the Tuning/Learning Outcomes approach in Europe
and the study providing the focus for ‘A long way to go’

More than a decade has passed since 2003 when, as part of the Bologna Process and through the means of the Berlin Communiqué,[1] the Ministers of Education encouraged the member States “to elaborate a framework of comparable and compatible qualifications for their higher education systems, which should seek to describe qualifications in terms of workload, level, learning outcomes, competences and profile”. As a consequence of this call, European Higher Education Institutions were urged to re-define their degree programmes in output-based terms, using learning outcomes to define the outputs to be achieved; in other words, to make these programmes student-centred so as to better prepare graduates for their future role in society. This approach gradually became the axiom for modernizing higher education in Europe. It was confirmed in the Bologna follow-up Leuven/Louvain-la-Neuve 2009 Communiqué, in which a special paragraph was devoted to student-centred learning and the teaching mission of HE. The Ministers reasserted in that document ‘the necessity for ongoing curricular reform geared toward the development of learning outcomes’.[2] For the very first time in an official Bologna document the central role of learners and academics in the modernization process was highlighted: ‘student-centred learning requires empowering individual learners, new approaches to teaching and learning, effective support and guidance structures and a curriculum focused more clearly on the learner in all three cycles’. Academics were urged, ‘in close cooperation with student and employer representatives’, to continue ‘to develop learning outcomes and international reference points for a growing number of subject areas’. This would require ‘higher education institutions to pay particular attention to improving the teaching quality of their study programmes at all levels’.

These statements could be read as an advertisement for a project that had been launched nine years earlier, with the support of the European Commission, by a significant group of renowned universities to develop an approach that would offer the instruments to make the required modernization a reality. This university-driven process, named Tuning Educational Structures in Europe (in short, Tuning), developed a universal approach to implementing the Bologna Process at the level of higher education institutions and subject areas. It published the main part of its results in the period 2009-2010.[3] The Tuning approach consists of a methodology to (re-)design, develop, implement and evaluate study programmes for each of the three Bologna cycles. It served, and still serves, as a platform for developing reference points at subject area level, basing its work on wide stakeholder consultation, including employers, graduates, students and academic staff. The reference points developed during these years were, and are, relevant for making programmes of studies comparable, compatible and transparent. They are expressed in terms of competences (distinguishing between general, transversal and subject-specific ones) and learning outcomes. Tuning contributed to the development and enhancement of high-quality competitive study programmes by focussing on fitness of purpose (to meet expectations) and fitness for purpose (to meet aims), as well as providing a “living” assessment and pedagogical learning environment applicable to the “4ever” learners: whoever they may be, wherever they may be, however they learn, whenever they learn. The methodology transcends “delivery” and encompasses all learners.

Since 2003 the Tuning methodology has spread gradually around the globe, in varying degrees and with a local context often overlaid on the core principles. In the case of Tuning Latin America it covered 18 countries and 15 subject areas; for Tuning USA it was sometimes a single state within the country and always a single language,[4] whilst in China, Georgia and Russia it was a single language in a single country.[5] However, although Tuning spread around Europe and the world, it must be stated at the outset that the term Tuning is not universally recognised. It commands strong ‘brand loyalty’ from those who have been engaged in projects around the globe,[6] but beyond that recognition is limited, in particular among HE management. To that end, throughout this article the term Tuning encompasses the student-centred approach (requiring a learning outcomes approach). Indeed it was Tuning that raised awareness about the need for a paradigm shift from staff-driven to student-centred higher education.[7]

The study, which provides the basis for this article, originates from the co-operation between the International Tuning Academy experts from Europe and Lumina Foundation.[8] The private Lumina Foundation has at its core “Goal 2025”: to have 60% of Americans with high-quality degrees by 2025. Its funding has covered a number of analytical tracts on the Bologna Process,[9] projects (Tuning USA) and discussion working documents.[10] The development of Tuning USA (2008) involved higher education institutions in three US states covering six disciplines, with a mix of two-year, four-year, public and private institutions. The initial pilot project was completed in August 2010. Tuning USA 2 was launched in early 2012 with more states and disciplines, as well as taking the subject area of history deeper and wider with the American Historical Association (AHA). The Degree Qualifications Profile (DQP) and Tuning are being more closely aligned. The extensive range of projects funded by Lumina to foster the attainment of Goal 2025 ranges from, inter alia, Tuning and the DQP through Competency-Based Education and New Business Models to funding arrangements, completion and a credentials framework.[11]

By 2010 the need was felt to check whether, in two world regions, the USA and Europe, the intended modernization of learning was actually taking place and how this process was perceived by its main stakeholders. To find this out, an initial study was set up and implemented during the period 2011-2012, the purpose of which was to develop robust evaluation survey instruments.[12] Already during the implementation of this first study, the need was felt to extend it to other stakeholder groups (graduates and employers) and to enhance and deepen the existing set of instruments. This resulted in a follow-up study, which covered the period July 2013 to January 2016. Although the initiative was limited to Europe and the USA, it was clearly understood that it should be structured in such a way as to allow, at a later stage, the whole “Tuning Family” in all of its aspects (the nuclear family, the extended family, the dispersed family and the disenchanted family) stretching around the globe to adopt the methodology. What must be recognised is that local contexts, conditions, traditions and imperatives affect the way in which the Tuning competence/learning outcomes based approach develops. Whether implemented in Africa, Canada, China, Russia, Central Asia, the United States, Latin America or Europe (or indeed in any of the other areas where Tuning is being used), the need for evidence-based analysis is there, requiring a robust evaluation process that can be tailored to the local, national or regional context.

This article covers the outcomes of this challenging study. The EU part of the study was co-financed by the European Union,[13] the USA part by Lumina Foundation. The findings presented here focus in particular on Europe, being sometimes referenced against those of the USA. This is to ensure that the focus is clear and to enable policy implications to be analysed and ways forward to be suggested in a European context. A further article will focus on the US context compared with the EU and thus offer that analysis.

II. The study

The study recognised from the outset that a robust methodology was required and that, for this to operate across two continents, it had to be developed with care: culturally, linguistically (English was used across the Study, because multiple translations were simply not possible) and in terms of the time needed for respondents to complete the online surveys. A great deal of development work (testing, improving as a result of the testing, ‘translating’ context and language) was needed before the evaluation instruments could be finalised. These survey instruments were designed to gather information and thus provide evidence of the relative impact on the learning environment of the Tuning/learning outcomes process/approach or of comparable initiatives and activities. In terms of impact this should be evidenced by changes in behaviour brought about by adopting the Tuning process or comparable learning outcomes based processes, by changes in learning and teaching strategies and methodologies, and by the provision of learning opportunities and the assessment of student learning. This has to be set against the overall objective of the student-centred approach: to prepare graduates better for their role in society, both in terms of employability and citizenship.

The approach reflects the paradigm shift from input or staff/expert driven learning to output-based student-centred learning. This shift has been promoted in the framework of the Bologna Process and in reform processes that Tuning has also initiated in other parts of the world. Although the Tuning approach has been received well and is widely used today, there is only limited evidence about how effective the student-centred approach is in practice for today’s and tomorrow’s society. Of course, where Tuning projects have taken place, there is strong ‘brand recognition’ amongst the academic staff (faculty) who have participated. However, it must be said that, beyond these project participants (admittedly thousands of people around the globe), there is little ‘brand recognition’. For this reason, throughout the Study those participating could, if they recognised Tuning, choose through the ‘skip logic’ used in the survey (see below) the route that makes use of Tuning terminology, or alternatively go down the ‘learning outcomes approach’ terminology route.

In both the USA (for example A Culture of Evidence: An evidence based approach to accountability for student learning outcomes[14]) and Europe there was a demand for up-to-date hard data, collected using a single methodology (surveys) and allowing analysis by project, subject, institution, region and group, plus qualitative data (visits) to compare with the quantitative data. Previous attempts at gathering such data had been undertaken, in various guises.

In Europe there have been the various European University Association TRENDS (I-VII) reports, which clearly illustrate the long and winding road that needs to be followed to achieve some degree of change. The following extracts and references illustrate what has happened over the past 17 years (TRENDS I and II[15] had largely analysed what was in place and how change might develop). TRENDS III (2003)[16] identified what it called the ‘gaps’ between levels of perceived adoption of changes (see “disconnect” later), as well as the rising star of ECTS and the challenge of student-centred learning. TRENDS IV (2005)[17] undertook a major set of visits and asked some general questions about change in learning. TRENDS V (2007)[18] stated that “the most significant legacy would be a change of educational paradigm […]; institutions are gradually moving away from a teacher-driven provision, and towards a student-centred concept of higher education”. TRENDS VI (2010)[19] stated that “some institutions have begun to support pedagogical skills’ developments and curricular reforms but that these changes entail many challenges. […] Student-centered learning entails a more creative approach to teaching and therefore even more hours spent on developing new ways of teaching. Institutions must find ways to motivate academic staff to spend the time required to design, evaluate and re-design their modules, if necessary, and to assume different roles”. Then there is TRENDS VII (2015),[20] asking: “To what extent have learning and teaching moved up as institutional priorities? How extensive has the shift been to student-centred learning across Europe and is this shift supported by national and institutional policies and other measures (e.g. funding, staff development, internal and external quality assurance procedures)?” A good deal of attention is given to learning (ICT, internationalization etc.) and it is reported that “Given the interest of national authorities and policy makers in the EHEA, it is not surprising that the implementation of a learning-outcome approach has been an important development for 60% of institutions. As a result, by 2015, 64% have applied it to all courses and 21% to some courses. This shows a continuing progression since TRENDS 2010, when 53% had applied it to all courses and 32% to some”. Is this implementation or wholesale adoption? Is it documentary lip-service or a shift in paradigm, practice and purpose?

In the case of the present study, implementation of the visits proved to be very time-consuming. Cooperation of Higher Education Institutions was not always easy to organise; many of the institutions and staff approached were reluctant to discuss the state of affairs in their institution. Some simply stated that position, whilst with others the degree of obfuscation and prevarication rendered a visit impossible. This hampered the collection of data. Also, too many institutions did not promote participation in the surveys, for whatever reason (‘survey overload’ might be one of the causes). This applied to both Europe and the USA. It proved necessary to extend the original project period of the study to meet the planned objectives.

Nevertheless, the outcomes presented here offer, in the view of the research team, a picture of the actual situation regarding the implementation of the modernisation of Higher Education. Although the team found excellent examples of good practice, the overall picture is worrying. It seems that the discourse related to the paradigm shift is now landing, but that overall the actual implementation is very slow to commence or, indeed, not taking place at all. Serious progress seems to have been made only in places where tailored action has been taken, initiated by individuals involved in specific initiatives such as Tuning, Thematic Network Programmes (TNPs), ECTS-related activities or other projects.

When the findings in this Study are compared to the Bologna Implementation report 2015,[21] the already quoted European University Association (EUA) TRENDS VII: Learning and Teaching in European Universities report[22] and the European Students’ Union (ESU) Bologna with Student Eyes 2015: Time to meet the expectation from 1999 report,[23] it seems that the state of implementation at Higher Education institutional level is even weaker than is stated in those reports. It is worth noting in this respect that in the ESU Peer Assessment of Student Centred Learning ‘Putting students at the heart of learning’ (2015),[24] it is observed that “Institutional reviews […] rarely signify the aspect of teaching and learning as a core one, which also gives a false signal to the institutional leadership about priorities of management”.

III. Methodology

The initial project statement was driven by the need for evidence concerning how far the student-centred approach in HE has been taken up in institutions. To address this aim, a mixed methodology was tailored and fine-tuned, using quantitative and qualitative indicators. The ultimate aim was to test whether this student-centred approach addresses current issues better than the traditional forms of education in the European Union.

The evaluation process reflected in this study is based on two pillars: quantitative and qualitative instruments. The quantitative or inner instruments are based on a set of surveys in which the respondent can self-identify as being more familiar either with Tuning or with the learning outcomes/competences/student-centred approach, and as a result have the questions framed in language appropriate to that selection (so-called ‘skip logic’): (1) questionnaires for academic staff and institutional management, (2) a questionnaire for students, (3) questionnaires for graduates and (4) questionnaires for employers. Questionnaires 1 and 2 were developed as part of the first phase of the Study and focus on the reception and implementation of the approach. They were piloted twice before going to scale as part of the second phase of the Study. The questions included in the questionnaires were the result of intense cooperation between the EU and the US team. During this process sensitivities regarding educational models and use of terms came to light and required accommodation. Having started with common models, it was then decided that it was necessary to split these into European and US versions, taking into account linguistic, cultural and context differences, but keeping exactly the same methodology and core questions about the educational process.
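As an illustration of how the surveys’ ‘skip logic’ works, the minimal sketch below routes a respondent to one of two terminology variants of the same core question. It is purely illustrative: the question wording and identifiers are hypothetical and not taken from the actual survey instruments.

```python
# Illustrative sketch of the surveys' "skip logic": the respondent
# self-identifies as more familiar either with Tuning or with the
# learning outcomes/competences approach, and the same core question
# is then framed in the corresponding terminology.
# Question texts are hypothetical, for illustration only.

CORE_QUESTION = {
    "tuning": "Does the Tuning competences/learning outcomes approach "
              "drive the way you structure your courses?",
    "learning_outcomes": "Does the learning outcomes/competences approach "
                         "drive the way you structure your courses?",
}

def route_question(familiar_with_tuning: bool) -> str:
    """Select the terminology route from the respondent's self-identification."""
    key = "tuning" if familiar_with_tuning else "learning_outcomes"
    return CORE_QUESTION[key]

if __name__ == "__main__":
    print(route_question(familiar_with_tuning=True))
```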

Questionnaires 3 and 4 were mainly developed during the latter stages of the Study and focus on the effectiveness of the (Tuning) competences/learning outcomes approach for career development. They both need further field-testing before going to scale. The same self-identifying approach was applied to the three larger questionnaires to make them as user-friendly as possible. The operational questionnaires can be accessed (and indeed completed) via the Tuning websites.[25]

Involving institutions and their staff and students in completing the questionnaires proved to be more than a simple process of distribution. In January 2014 tailored action was required by the EU Steering Group to identify more institutions to be involved: approaching various representative bodies in Europe, making an open invitation to complete the surveys, and identifying persons previously involved in projects. A spreadsheet was set up to track contacts and responses.

The second pillar covered the qualitative approach, using what were referred to as the outer instruments. For this part the research teams in the US and Europe were both extended with additional researchers. The team in Europe was made up of 5 members, covering 4 nationalities, so as to be able to operate in pairs. In the original set-up of the study, it was foreseen that the “outer instrument” sessions (focus groups, interviews etc.) would be conducted initially by two members, an expert and a graduate assistant, and then by the graduate assistant only, with periodic sampling and validation of the process by a Steering Committee member. In practice it proved necessary to involve two experienced researchers in each session, because of the size of the groups to interview, the complexity of the issues at stake and the note taking. For each visit a report was drawn up. The approach used in Europe was mirrored in the United States. The reports from these sessions were aggregated, ensuring anonymity whilst at the same time allowing for accurate analysis. The visits were constructed around the following headings:

 

1. Introduction

2. General information about the visit / Basic information

3. Level of implementation of LO/competences approach at Institutional/ Programme/ course units level

4. Kind of information/support for teachers provided by the institution to use Learning Outcomes/competences approach

5. Strengths, weaknesses and main challenges occurred in teaching, learning and assessment strategies by using the Learning Outcomes/competences approach

6. Changes and impact of LO/competence approach in student performance

7. Students’ perspective on LO/competence approach and utility for them to find a suitable job

8. “Tuning” dissemination in the institution (projects, materials, implementation, etc.)

9. Main conclusions of the visit including recommendations.

Prior to each visit a rigorous analysis of all on-line information available in the public domain was undertaken; this then allowed for a further comparison between the results gathered during the visit, the responses to the on-line surveys and the ‘public face’ of the institution.

 

These qualitative instruments provide information about the behaviour(s) and attitude(s) of key stakeholders regarding the redesigning/enhancing of curricula; the formulation of competence and learning outcome statements and their practical use; learning opportunities and structures; the assessment of students; the communication of learning outcomes to students and other stakeholders, etc. This should lead to clear evidence as to whether the use of the student-centred approach has a (positive) effect on student and staff motivation and performance, resulting in higher success rates. Data collected from the first Pilot provided indicators of change.

In the EU 14 site visits took place, spread over Higher Education Institutions from as many countries.[26] The available budget did not allow for more visits.

IV. Terminology

The use of consistent terminology and well and broadly understood concepts are a crucial element for successful reforms. In this case the focus was on the paradigm shift from expert driven education to student-centred education based on the use of the competences/learning outcomes based approach. The outcomes of this study show there is (still) a lot of confusion about both terminology and concepts applied.

The reasons for this are manifold. Terminology is to a large extent culturally and historically bound. In the framework of the Bologna Process it has been agreed to use English as the lingua franca. However, using an English term does not automatically imply that such a term has the same meaning and connotation in other countries. A good example is the term ‘competences’. In the UK this term is traditionally associated with more applied forms of education, such as vocational education and training, while in the USA and continental Europe it is perceived as encompassing knowledge, skills and (personal) attributes. Differences in the understanding and interpretation of terms have led to many misunderstandings, also due to the way these terms have been translated into other languages. These misunderstandings have been amplified by the definitions and practical use of terminology in different European documents: the two competing European Qualifications Frameworks, the ECTS Users’ Guide, the CEDEFOP terminology guide,[27] Tuning documents, etc.

The many websites, course catalogues and course manuals of the universities studied by the research team reflect the confusion in the use of terminology. Concepts (and terms) such as competences,[28] learning goals and objectives[29] and programme and module/unit learning outcomes[30] are, in the vast majority of documents, mixed up and used interchangeably. Misunderstanding also exists about the term student-centred education: it does not mean a cafeteria model,[31] but flexible programmes covering a particular field of study, allowing for individual profiling with the aim of preparing students most effectively for their future role in society.[32]

In this study the definitions used were those defined by Tuning and applied worldwide, in particular the ones regarding competences and learning outcomes.[33] In Tuning terms, learning outcomes set a level of competence to be achieved, based on the idea that the role of education is to make the learner more competent. This also allows for the important distinction between discipline-based competences and general or transversal ones developed in the context of a field of studies, a distinction which is also included in the 2015 version of the ECTS Users’ Guide.

What has not been sufficiently understood to date, from the methodological point of view, is the difference between ‘learning outcomes’ and the ‘outcomes of learning’. The latter is a very broad evaluation of the total gain made by a learner throughout their studies, including formal, informal and non-formal learning. This is a very relevant distinction, because the institution is manifestly responsible for the learning outcomes of its programmes; it can only be partly responsible for the total experience of learning, social interaction, maturation, etc.

It became apparent during the course of the visits, in particular the interviews with the students, that there is a disconnect between the levels of communication regarding student learning outcomes and the value that students place, for obvious reasons, on their total learning experience, including other activities: group work, project work, work experience, etc. Students need to pass the hurdles to obtain their reward, but they also want a rounded total experience to be more employable.

V. Survey results

The opening questions were used to establish the context within which the respondent worked/studied: institution, post, how long in post, subject area, cycle of study, year of study etc. This data is of use to the researchers because it enables a helicopter view of where the responses are coming from and thus an overview of the project’s spread. The responses came from a wide range of countries, institutions, post-holders, cycles of study and subject areas. For a number of questions respondents were asked to check all applicable options; thus the numbers do not always add up to 100%.

SURVEY 1: ‘Teaching, Learning and Assessment: Process and Impact’

The survey counted 399 respondents in total. Of the EU respondents, 70% were academic staff, 20% were management and leadership and 10% were student advisors or counsellors. However, in the EU many respondents wore multiple hats, as both academic staff and management and leadership, so there is some overlap where a respondent could be counted in both the academic staff and other categories. Of the American respondents, 42% were faculty members, 46% were adjunct/contingent faculty, 2% were deans, 6% were department chairs and 4% wore a variety of other hats. In total, 83.5% of the academics/faculty completing the survey had been in post for more than 5 years (for administrators and other staff it was 54.8%).

When asked if they felt “informed” regarding expectations for their courses and how these relate to the discipline and/or degree programme, 53.9% of EU staff said ‘Yes’ and 46.2% said ‘No’ (for the US the ‘Yes’ count was much higher).

Regarding what students might receive credit for only 29.7% of EU respondents stated that recognition of informal prior learning was given, but 85.4% said that recognition for formal prior learning is the case. Only 14.6% said yes for Massive Open Online Courses and 22.8% for experiential learning. In all cases the figures were significantly lower from the US respondents.

Regarding methods of delivery, a variety of modes are used in all cases, but again with significant differences between the EU and the US, much higher figures being returned by the EU respondents: 93.7% use campus-based learning, 60.8% flipped classrooms, 7.6% MOOCs, 50.6% blended learning and 28.4% online-only delivery.

Given the history of the use of ECTS in much of the EU, it is not surprising that academics say they take into consideration student workload when planning courses. In fact 96.2% said this is the case (the figure is lower from the US).

When asked how the curriculum is defined, the vast majority (in both the EU with 80.3% and the US) said that it is in terms of learning outcomes and competences. About 12.5% still cling to the use of aims and objectives and 6% stated ‘other’.

Of those who stated that they define their curricula on the basis of learning outcomes/competences, most academics/faculty gathered information to help define these through discussions with colleagues at their own institution, but some also gathered information from discussions with colleagues at other institutions, as well as with students at their institution, as can be seen from the survey outcomes presented below. Multiple answers were allowed in responding to the question illustrated by Table 1.

Table 1

How did you gather information to help define
the learning outcomes and/or competences?

Discussions with current students: 48.7%

Discussions with discipline academic staff at my institution: 81.2%

Discussions with faculty across subject areas/disciplines at my institution: 58.1%

Discussions with faculty in my subject area/discipline in other institutions and sectors: 45.3%

Discussions with professional organizations and/or discipline specific associations: 30.8%

Discussions with other stakeholders (employers, alumni, community members, etc.): 42.7%

Discussion has not been initiated: 6.0%

 

As follow-up questions, staff acquainted with the learning outcomes/competences approach were asked whether the curriculum design had been a collaborative effort, and whether it had been discussed and agreed by academic staff. The first part of this statement was answered positively by 48.2%, the second part by 66.4%. Respectively, 45.4% and 29.1% answered that these elements had played a role to a certain extent (‘somewhat’). Asked whether academic staff discussed student learning, degree outcomes, and competences, 63.4% confirmed this was the case, and 51.4% stated that the discourse had changed, focussing more on these topics. Respectively, 25.7% and 36.7% mentioned there had been some impact. The USA survey outcomes proved to be more or less comparable to the EU outcomes.

High percentages of respondents acquainted with the learning outcomes/competences approach agreed that as a result of using this approach, learning outcomes are more integrated in the classroom, that course learning outcomes align with degree programme learning outcomes, and that the syllabus references learning outcomes. Respondents felt less strongly that the course catalogue reflects the learning outcomes for each course. In more detail: 56.4% of the respondents answered that the course catalogue reflected the learning outcomes for the degree and 62.9% for each course. Respectively 39.1% and 26.6% thought this was the case to a certain extent. 74.5% stated that their unit learning outcomes were consistent with the programme learning outcomes, 18.2% thought this was partly the case. This relates to the answers to the question whether ‘my syllabus’ includes learning outcomes/competences, which 79.3% think is really the case and 14.2% partly. 56.1% think the learning outcomes are integrated in assessment, learning, and teaching, 42.2% presume this is partly the case. Asked whether the advising and information materials described the learning outcomes at programme and course unit level 41.3% said this was the case and 47.7% to some extent. Finally, 51.8% stated that they discussed the learning outcomes with students and 39.3% ‘somewhat’. The figures for the USA with regard to most of these statements are higher and significantly higher for ‘integration of learning outcomes in teaching, learning and assessment’ and ‘discussion of learning outcomes with students’.

Multiple answers were again allowed in responding to the question illustrated by Table 2.

Table 2

As a result of using a learning outcomes/competencies approach, to what extent do you agree with the following? (1 = Yes, 2 = Somewhat, 3 = No; percentages are given for the EU respondents and for all respondents in total (Tot.))

The learning outcomes / competences approach drives the way I structure my courses
Yes: EU 56.1%, Tot. 61.3%; Somewhat: EU 39.3%, Tot. 32.3%; No: EU 3.7%, Tot. 5.5%; I don’t know: EU 0.9%, Tot. 0.9%; USA Mean 1.43; EU Mean 1.45

I make adjustments throughout the term in my teaching when I see the students are not achieving the learning outcomes
Yes: EU 34.9%, Tot. 52.1%; Somewhat: EU 43.1%, Tot. 35.2%; No: EU 21.1%, Tot. 12.3%; I don’t know: EU 0.9%, Tot. 0.5%; USA Mean 1.42; EU Mean 1.88

I have broadened my perspective of the entire curriculum by tailoring my specialization to the needs of the degree program
Yes: EU 34.9%, Tot. 39.7%; Somewhat: EU 49.1%, Tot. 38.8%; No: EU 12.3%, Tot. 15.9%; I don’t know: EU 3.8%, Tot. 5.6%; USA Mean 1.74; EU Mean 1.76

My assessments are based on learning outcomes
Yes: EU 62.6%, Tot. 71.4%; Somewhat: EU 31.8%, Tot. 21.2%; No: EU 2.8%, Tot. 5.1%; I don’t know: EU 2.8%, Tot. 2.3%; USA Mean 1.33; EU Mean 1.31

Student engagement has improved
Yes: EU 34.3%, Tot. 38.5%; Somewhat: EU 34.3%, Tot. 34.9%; No: EU 20.4%, Tot. 15.6%; I don’t know: EU 11.1%, Tot. 11.0%; USA Mean 1.69; EU Mean 1.83

Student learning is a central indicator of quality
Yes: EU 55.7%, Tot. 66.5%; Somewhat: EU 37.7%, Tot. 26.0%; No: EU 4.7%, Tot. 3.7%; I don’t know: EU 1.9%, Tot. 3.7%; USA Mean 1.29; EU Mean 1.43

There is an opportunity for an end of course open dialog with students to discuss the extent to which learning outcomes have been achieved
Yes: EU 45.4%, Tot. 40.5%; Somewhat: EU 38.9%, Tot. 28.9%; No: EU 12.1%, Tot. 27.5%; I don’t know: EU 3.7%, Tot. 3.2%; USA Mean 2.00; EU Mean 1.67

NB: the extent of agreement was rated on a scale from 1 to 3 (1 = Yes, 2 = Somewhat, 3 = No). The Mean values reflect the average answer of respondents on this scale.

* Please note that “I don’t know” responses were excluded from the Mean calculation.
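To make the note concrete, the sketch below reproduces the Mean calculation it describes: answers are scored 1 (Yes), 2 (Somewhat) or 3 (No), “I don’t know” responses are dropped, and the remaining scores are averaged. The counts used are hypothetical, not taken from the survey data.

```python
# Sketch of the Mean calculation described in the note to Table 2:
# 1 = Yes, 2 = Somewhat, 3 = No; "I don't know" answers are excluded
# before averaging. The counts below are hypothetical.

SCORES = {"yes": 1, "somewhat": 2, "no": 3}

def scale_mean(counts: dict) -> float:
    """Average score on the 1-3 scale, ignoring 'dont_know' responses."""
    n = sum(counts[answer] for answer in SCORES)  # excludes "dont_know"
    total = sum(score * counts[answer] for answer, score in SCORES.items())
    return total / n

counts = {"yes": 60, "somewhat": 42, "no": 4, "dont_know": 1}
print(round(scale_mean(counts), 2))  # 1.47
```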

As a result of using a learning outcomes approach, the majority of respondents felt that student learning is an indicator of quality, the learning outcomes/competences approach drives the way they structure their courses and that assessments are based on learning outcomes. Fewer participants felt that they had tailored their specialisation to the needs of the degree programme.

Respondents felt that the most positive impact from applying a learning outcomes approach came from the way they assess learning (40.7%), the way they present their course materials (48.2%) and state course outcomes (50.9%), the alignment of the curriculum and courses to the learning outcomes (43.5%), the way they teach (55.6%) and discussions with students (49.1%). Student engagement (31.5%), the type of discussions with colleagues in the field (24.1%), the impact on quality assurance mechanisms (28.8%) and the development of a common language in the discipline (19.4%) score significantly lower. The impact on the quality of the programme scores 41.7%. The figures for the USA are significantly lower.

SURVEY 2: the EU students

Out of a total of 666 respondents, 86% were from the first or second cycles (53% and 33% respectively). Short cycle, doctoral candidates and ‘traditional’ long or single cycle students were also represented. Respondents were also from every year of study (1 to 6) and from across the spectrum of subject areas (architecture to zoology).

When asked how their curriculum is defined, 67.1% said learning outcomes, but 70.3% said objectives, with 57% stating competences (multiple answers were possible). This is at variance with the responses from academic staff/faculty (see above) and also with the findings from the visits (see later).

To test the levels of communication a series of questions were asked of the students, as illustrated by Table 3.

Table 3

Levels of communication

(Percentages for: Not at all / Somewhat / Very much / Don’t know)

When I was advised on course unit selection there was a focus on the competences I would gain
Not at all: 10.3%; Somewhat: 53.9%; Very much: 26.2%; Don’t know: 9.6%

My discipline/degree programme has a clear statement of expectations
Not at all: 4.7%; Somewhat: 39.2%; Very much: 52.6%; Don’t know: 3.6%

I understand why I am required to take the course units needed to earn my degree
Not at all: 6.1%; Somewhat: 35%; Very much: 56.1%; Don’t know: 2.9%

My workload is appropriate to achieve the learning outcomes of the course unit
Not at all: 10.1%; Somewhat: 37.6%; Very much: 49.8%; Don’t know: 3.6%

Advisors are able to provide a clear explanation of how course units fit into a bigger picture
Not at all: 14.1%; Somewhat: 46.8%; Very much: 33.1%; Don’t know: 6%

The course catalogue states the learning outcomes for each unit
Not at all: 10.4%; Somewhat: 36.7%; Very much: 46.2%; Don’t know: 6.8%

The course catalogue states the learning outcomes for my degree
Not at all: 9.2%; Somewhat: 38.3%; Very much: 44.4%; Don’t know: 8.1%

Progression routes to a degree are clearly stated and explained
Not at all: 13.5%; Somewhat: 36%; Very much: 43.4%; Don’t know: 7.2%

 

In only two cases do more than 50% of the students believe ‘very much’ that they are getting a clear explanation of what they need to do and why they need to do it to achieve their degree. ‘Somewhat’ figures are large in all categories but the visits show that often ‘somewhat’ is a kind way of saying ‘no’. This indicates a gap[34] (‘disconnect’) between what academics and management believe and what the students perceive and believe they are experiencing.

It does appear that the level of discussion of learning outcomes in class (23.9% saying ‘very much’ and 75% stating ‘not at all’ or ‘somewhat’) and at the end of the course (24.4% saying ‘very much’ and 70% ‘not at all’ or ‘somewhat’) is disappointing. The connection between the learning outcomes and the assignments is slightly higher (41.8% saying ‘very much’) but even so disappointing (once again the meaning of ‘somewhat’ is a problem).

51% of the academic staff state they discuss learning outcomes with students ‘very much’ and 39% ‘somewhat’ compared to the 23.5% and 51% respectively felt to be the case by the students. The gap shows. Moreover, 45.4% of academic staff state that there is ‘very much’ an opportunity for an open discussion with students at the end of the course whereas only 24.4% of the students feel this is the case. The gap (‘disconnect’) is writ large.

Some main conclusions can be drawn from the surveys. The results in Europe and the USA are largely comparable. However, it is clear that care must be taken when interpreting these survey/questionnaire results because earlier examples in the Bologna Process show there is a tendency to overestimate one’s own performance to leave a more positive impression, even if this is subconscious. This has been noticed with regard to both the official Stocktaking and the TRENDS Reports over the years.[35] This seems also to be the case with these surveys if compared with the outcomes of the in-depth visits (see below). This seems not only to be the case in the ‘yes’ responses, but in particular in the ‘somewhat’ responses.

VI. Visits process and results

VI.1. Process

As has been said (see above), setting up the visits proved to be very difficult. Some institutions actually stated that they felt they were not ready for such “scrutiny” (their term, although we kept stressing, at every stage of communication with all those approached, that these were research visits and in no way, shape or form a validation exercise or a means of providing feedback to any outsiders or agencies; on the contrary, the visits were learning opportunities for the institutions because of the feedback they would receive). Others prevaricated such that time ran out (giving a feeling of not wanting to take part), and some made every effort to accommodate the visit and to lay themselves open to analysis in the true spirit of the visits and the research objectives.[36] To these we are indebted and once again say ‘Thanks’.

In the end 14 visits took place across the EU, from research-intensive universities to those with a teaching-only mission, encompassing a wide breadth of missions and sizes. There was no visit to a private for-profit institution, but this was not for lack of asking.

In the set-up phase the same information was sent to each institution approached, together with a suggested format for a single full-day visit. The categories of persons the team hoped to see were stated, but whom the team did see was up to the institution, depending on availability in different subject areas. This led to a wide range of subject staff and students being seen, but also some repetition of subject areas. This did not matter, because the original evaluation had been that, apart from subjects from a particular institution directly involved in Tuning, Thematic Network Programmes (TNPs) or ECTS projects, the methodology was unlikely to have been influenced other than by national policies (the national qualifications framework, quality assurance mechanisms, the diploma supplement, continuing professional development requirements etc.).

Once the visit date had been agreed (and researchers allocated from a calendar of availability), an internet search of the institution took place. This looked at references to the national qualifications framework, the diploma supplement (examples and availability), quality assurance mechanisms (internal and external), the availability of in-house staff development, the degree profile, the curriculum, unit learning outcomes, any sample assessments etc. This formed Part 2 of the institutional feedback report and informed the researchers (and the institution) of the public face of the institution.

At the end of each visit the researchers gave informal feedback to the institution; to whom this was given varied by institution, as it was for the institution to decide. The next step was that a draft report was sent for correction of factual elements. Following any required amendments of fact, the final report was sent.

It is important to note that anonymity was promised: no institution or individual would be identified or identifiable. Each institution received a copy of the final report.

VI.2. Findings

There are certain recurring themes from the visits; these show to varying extents but are nonetheless present across both continents. The main headlines are:

VI.2.1. Varied institutions display varied behaviour

Higher education activity still falls largely into three categories: teaching, research and administration. The nuances of each of these have changed over the years and continue to change. Institutions have proliferated and, with that (and with the change in most places to mass-participation systems, even where there is still selection based on prior educational achievement), the variety of missions and the mix of these elements have changed. However, there are students in universities and they are there to learn. The mission of the university will impact on the learning process, as will funding patterns, the political will of the state, the background of the student population, etc. However, as was stated in Modernisation of Higher Education (2013):[37] “With this report, we put quality of teaching and learning centre stage” and “Our focus, therefore, is on the quality of teaching and learning for those who enter or who hope to enter higher education in the future.”

Some institutions visited were highly micro-managed; this impacted upon the curriculum, staff development, the mix of workload for staff, student-staff ratios, the assessment calendar, appraisal systems, internal quality assurance etc. Across the spectrum, then, there were central macro-management, devolved management, and self-management within institutional parameters, all styles leading to varied operating environments.

What is clear is that there is a disconnect between what the different tiers of responsibility believe or imagine the higher education landscape to be and what those who actually participate in the learning process experience. This appeared to some extent in each and every institution visited. If one looks at 2015 statements regarding the Bologna Process at the higher policy levels, awareness about its implications is writ large, with the corresponding “font size” diminishing progressively down the levels until, in some places it has to be said, there is a total absence of any active knowledge of, and participation in, the learning outcomes process in the students’ actual experience.

This metaphor captures the fact that there is a lack of progress but, as the research results show, not the full extent of that lack of progress.

VI.2.2. Insufficient learning alignment

By learning alignment is meant the continuum of the learning environment from the learning outcomes (LO) to the learning activities (LA) to the all-essential learning assessment (LA), hence the frequently used shorthand ‘LO, LA, LA’. None of these segments is free-standing, and none can make any meaningful contribution to the learning process without the other two. Learning outcomes are not a passive, ossified artefact but must be active (and thus subject to re-evaluation and change after an appropriate feedback loop). The learning activities must reflect the learning outcomes and are now required (European Standards and Guidelines 1.3, 2015) to “encourage students to take an active role in creating the learning process”, leading on to learning assessment that “reflects the approach” (that is, reflects the student involvement).

Once again there was a disconnect here; it varied in magnitude, as did the institutions. Although a few institutions were making very positive (in some cases strident) demands of their staff to engage in all aspects of learning alignment, students still did not report that they could see the connection or that there had been continued efforts both to engage them in the process and to communicate with them continually. So, even where efforts were clear and demonstrable, there was still a lack of meaningful penetration. Imagine how disappointing it was where there was no management drive or institutional buy-in to ensure that the learning outcomes approach and learning alignment were embedded in the warp and weft of the learning experience. Such a situation was sometimes totally obvious and showed no signs of a “learning spring” being around the corner.

In some sessions the lack of engagement with the learning outcomes approach by the staff involved in pedagogics was clear (“what do we want to know about learning outcomes for?”). If those who are the custodians of the development of learning show a total disregard for the student-centred/learning outcomes approach, what hope is there for a paradigm shift?

Where staff development was taking place that engaged with why and how to change from didactic, expert-driven delivery to student-centred/learning outcomes facilitation of learning (with learning alignment), many staff both welcomed this and fully engaged with it. Where there was active engagement in mentoring/coaching, this too made a positive difference. Where there had been involvement in projects such as Tuning or, in the past, ECTS, that also made a positive difference. Where there was institutional indifference or mere lip service, that, not surprisingly, had a negative impact.

VI.2.3. Vocabulary, semiotics, messaging and communication

Any systematic search through university websites reveals much. Of course there are claimed problems with updating, editing and proof-reading. However, the evidence on the websites (gathered prior to a visit) was then confirmed by the visits: there is a lack of consistency in the use of terminology and vocabulary across documents, web pages, course handbooks and study manuals. Discussions during the visits then confirmed the confusion. Does this matter? Yes, it does, because confusion abounds when terms are used inconsistently, interchangeably and incorrectly.

There is no single definition of terms such as ‘competences’, ‘learning outcomes’, ‘learning alignment’, ‘student-centred learning’ but there are recognized definitions used consistently in policy documents and working documents (for example ECTS Users’ Guide 2015, Tuning documents, Frameworks etc.). Adherence to these more commonly used and available definitions with the phrase ‘for the purpose of this document we use the following definitions’ would at least start to eliminate wider confusion and would certainly limit internal institutional confusion.

At meetings during the visits staff commonly used ‘competence’ and ‘learning outcome’ as interchangeable terms. Slipping back into the language of the former paradigm (expert-driven delivery), for example ‘learning goals/objectives’, rather than the language of the new paradigm, for example ‘learning outcome’, is more than a slip of the tongue. The message this sends is one of confusion, lack of clarity, lack of determination to join the paradigm shift and therefore lack of consistency.

This confusion is commonplace. The lack of consistent messaging and communication leads stakeholders (across the spectrum) to doubt that a paradigm shift is underway, let alone that it has been achieved. It also raises the question (see above) of how there can be learning alignment when there is a lack of clarity as to what it is that is being aligned. These are more than issues of editing and proof-reading; they are issues of true buy-in to the paradigm shift.

VI.2.4. Staff development

Staff development is a crucial issue. Without staff development the change in paradigm will remain stalled, but it must encompass the “why and how”, not merely the process of form-filling. There must be engagement with the staff, a point both made and borne out in the visits. Where there was active engagement in mentoring/coaching or involvement in projects (such as Tuning or, in the past, ECTS), that made a positive difference, whereas institutional indifference or lip service made a negative one.

Those members of the staff who want to engage and master the learning outcomes approach, and many interviewed were of that mind, felt stranded both by lack of training and by the pull towards research and away from teaching as a career enhancement. It was often mentioned that at the outset of the introduction of their national qualifications frameworks and learning outcomes, there had been some training. From what was said, such development was either viewed as a done deal or any attempt to deal with concepts, benefits etc. was abandoned and replaced by process training. This was anathema to the staff. They want concepts, benefits, links etc. and not form filling to comply with internal QA and audit requirements.

Where new projects were launched (for example joint degrees, centres of excellence in teaching etc.) there did tend to be a reinvigoration of training; often what was much liked was in-house mentoring/coaching, peer-to-peer activities and the evaluation of documents. These ventures were both cost-effective and engendered a collegial spirit.

A main challenge for Higher Education Institutions is that too often there is a lack of a well-established unit for staff development. Some examples of excellent staff development provision were found, either at university or faculty/school level; some provision was also at country level. In general, however, it has to be noted that establishing and sustaining such centres is given low priority. In many institutions there was a lack of informed trainers. As mentioned above, staff will not accept sub-standard, process-driven ‘training’. They want to understand the concept and benefits of the new paradigm. Without this, it is feared that the shift will not take place. Use should be made of examples of good practice, which for some of the countries visited will be found in other countries and will therefore require an international endeavour.

VI.2.5. Student reaction

All meetings with students were interesting and stimulating, and regrettably they confirmed beyond reasonable doubt the disconnect that exists between even the most pessimistic of the 2015 reports cited above (BWSE linked to the ESU country coordinator reports) and the reality shown on the ground by the responses in the student interviews. The disconnect was confirmed by the consistent themes that the students disclosed, namely: a lack of (perceived) communication; a lack of understanding of the gains to be had from a good understanding of their studies and of what they would know, understand and be able to do on completing units of learning; and learning behaviour immersed in the former paradigm (what are we told, what information do we have, what are the past assessments, how can we best get through this subject). Thus, in terms of the learning outcomes approach, in the vast majority of cases there was evidence of a lack of penetration and understanding at first cycle, with some evidence of impact at second cycle, particularly amongst mature students. In terms of student-centred learning, the European Standards and Guidelines 1.3 (2015) is of course too recent to have impacted on process but, notwithstanding this, at first cycle level there was very limited evidence of this shift, while at second cycle there were some green shoots of development.

Students were not convinced that there was any link between what was demanded of them and any description, or analysis, of what outcomes they would achieve by the end of their learning. Some knew that they had been told by some staff of the learning outcomes at the start of their studies but few felt there was consistent communication and messaging about this. Those who did placements (work based learning, internships, stages etc.) did not make any link between learning outcomes and the skills/competences that they could offer an employer. Even where they had been provided with CV writing guidance this link had not been made, nor had the simple benefits they would gain by using such language and demonstrating the competences they had gained from their studies been pointed out.

In terms of their studies, there was little perceived link between the credits allocated to a unit of study and the workload. Some students did know what the norm should be (28 hours per credit being often quoted) but few felt this was in any sense realistic; most felt that the workload demanded of them was less than that quoted. However, there was a general feeling that the smaller the credit allocation, the heavier the workload per credit required to achieve the learning outcomes (in their terminology, ‘to pass’). All institutions operated a post-learning review in one form or another; this varied from the very tightly prescribed, in terms of scheduling, analysis of responses and feedback, to a rather haphazard process and follow-up, with all shades of process in between. All students felt that if their views were sought (which they were) then there should be some clear line of follow-up: analysis of returns, discussion of the data, an action plan, action and communication of what had happened and why. Once again the extent to which this line of action was in place varied greatly: at one end of the spectrum staff were replaced if the feedback and data were very negative, at the other no action appeared to be taken or follow-up communication made.
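To put the credit-to-workload arithmetic the students referred to in concrete terms: at the quoted norm of 28 hours per credit, a unit’s expected workload scales linearly with its credit allocation. The following is illustrative only, using a hypothetical 5-credit unit:

```latex
% Illustrative only: expected workload of a hypothetical 5-credit unit,
% using the 28 hours-per-credit norm quoted by the students interviewed.
\[
  W = \text{credits} \times 28\ \frac{\text{hours}}{\text{credit}},
  \qquad
  W_{5\,\text{credits}} = 5 \times 28 = 140\ \text{hours}.
\]
```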

VI.2.6. Impact of the National Qualifications Framework and ECTS

Management in particular, and senior staff with management experience or duties, acknowledged the impact that the introduction of their national framework had made. The link to ECTS in terms of programme structure and profile was also acknowledged. However, those engaged in teaching often did not see this — if the university regulations required a certain format, then complying with that format was in reality enough (and often this was the case).

The Frameworks had been, without exception, a catalyst for change in terms of levels and outcomes (the Dublin Descriptors were often cited as a significant agent of change) and, of course, had created a fundamental and often fraught change to a three-cycle system, the consequences of which are still reverberating around some country systems.

VI.2.7. Impact of Tuning

Senior management at all institutions were aware of Tuning, some simply through having received the documents for the initial approach and others through involvement over the years in projects or attendance at conferences. Staff who had already undertaken the on-line survey had some awareness of Tuning, as did those who had been involved in projects; others, however, were not aware of the process. Students were unaware of the process, just as they were largely unaware of the learning outcomes approach.

There was little brand awareness of Tuning, but where there was awareness, and where there had been participation in projects, there was great brand loyalty, attaching much more to Tuning itself than to any passing knowledge of the learning outcomes approach.

VI.2.8. Disconnect

This term has become the by-word for the overall findings of the research (a stronger version of the ‘gap’ identified in TRENDS III, see above). The term denotes the absence, throughout the tiers of a higher education institution (and indeed beyond, throughout the European Higher Education Area), of a consistent awareness of, let alone ‘buy-in’ to and adherence to, the learning outcomes approach. Given that this approach is a core element of ECTS, of the Frameworks and of the European Standards and Guidelines, this has to be both disappointing and, indeed, a shock and a wake-up call.

VII. Examples of good practice

On the basis of the visits the team was able to identify a number of good practices relevant to the whole sector. Each institution had examples of good practice, but no single institution was exemplary in all respects. Nevertheless, from these instances it proved possible to aggregate cognate areas and thus produce the following list:

a) A well-defined university policy on learning, teaching and assessment in accordance with the mission of the institution. However, this policy must be put into action right through the institution. Having the policy is not sufficient; the institution has to be sure that there is wide acceptance of, and indeed ‘buy-in’ to, the policy and the action resulting from it. Good communication is essential to ensure that all stakeholders are involved in, aware of and committed to the actions.

It can be noted that where a clear policy has been defined and followed through, a shift of paradigm is underway; even in these institutions, however, implementation remains patchy. This means that constant attention to policy implementation is required for continuing development and success.

b) Some universities are working with fixed templates for describing the curriculum as well as its modules and units. These require statements of the profile of the programme and its learning outcomes, the learning outcomes for individual units, the learning and teaching methods, and the forms of assessment. It is crucial that these are shared with potential as well as actual students. In the set-up phase it is essential that staff view these not as just a ‘tick-box’ administrative task but as an integral part of curriculum development, owned by the staff who develop them and who then facilitate the, hopefully, aligned learning.

c) Staff development is an essential component for enhancing study programmes and their delivery so that they meet the needs of all stakeholders (both internal to the university, including its students, and external, for example employers and professional organisations). Staff development can take many different forms. What seemed to work best was a central policy underpinned by central funding, with the actual staff (who took part in training, advising, mentoring and supporting) based in a central unit but with well-organised and defined links to individual departments, faculties, etc. These staff should, of course, be well versed in the paradigm shift taking place and able to communicate it, whilst fully understanding the university policies and their place within the wider world. They often acted as ambassadors for the university in national and regional bodies and activities.

Decentralised models do exist, and where they were aligned with university policies and enjoyed excellent internal communications with some central coordination, they too worked effectively. Activities that these models might deliver include: international staff mobility, courses, workshops, peer mentoring, continuing professional development, learning gatherings (often ‘learning lunches’), team building, and allotting credits to activities to enable staff to accumulate credit towards a qualification.

d) For activities such as curriculum development, the building of teams (including staff, students, a central staff-development representative, employers, professional-body representatives, etc.) that take responsibility for defining, organising, implementing and delivering the learning in all of its aspects. This ensures collegial ‘buy-in’.

e) Structured links to employment and the world of work, including: alumni tracking, visiting lecturers, CV coaching, staff communication on learning outcomes, competences and professional standards, relations with employers, internships/placements, entrepreneurship labs, etc. All of these help students to understand their place within their studies and how best to present themselves when applying for internships/placements, jobs or further study.

f) National initiatives — these can provide impetus and re-launch the conversation about the paradigm shift. New initiatives are needed on a regular basis, because otherwise newer ideas push the ‘older’ ones out of memory and down institutional and personal priority lists. Such initiatives have included: centres of excellence, ‘lecturer of the year’, ‘best university’, etc.

VIII. Conclusions and next steps

‘A long way to go …’ reflects the findings of this study, in terms of both the inner and, in particular, the outer instruments: the surveys and the site visits respectively. It is fair to conclude that the discourse about the shift of paradigm is taking place to various degrees amongst management, to a lesser extent amongst staff, and much less amongst students. There is a long way to go, and there is no certainty that the shift will be achieved; indeed, it seems finely balanced and could, without additional and continued support, fail. Making it work is the responsibility of all levels involved and cannot simply be left to the academic staff responsible for delivering the programmes. The evidence clearly shows the disconnect between the rhetoric, the political ambitions and reality. This has already been reported in the ‘2015 analyses’ of progress quoted above. At policy level, perceptions of the implementation of a student-centred learning approach are reported by: the Bologna Implementation Report 2015: “lack of recognition of the value of student evaluation, independent learning and the use of learning outcomes”; TRENDS VII (2015): “not all these positive developments are common everywhere and, therefore, more progress is needed”; and Bologna With Student Eyes 2015: “there has clearly been some progress; … 50% of respondents think that progress is slow; … the other half … are still not convinced that student-centred learning has been made a priority in higher education.”

These statements are confirmed by this study. In fact, the actual level of penetration is lower than stated in those documents. The main cause has been insufficient communication between the political players and university hierarchies on the one hand and the academic staff on the other, as highlighted in the Yerevan Communiqué.

There has been a failure to engage with and convince academic staff about the necessity and advantages of this paradigm shift. Many initiatives have been taken in terms of national and international cooperation, but they have not received the required endorsement and support from the political policy makers. Seed-corn funding has proved helpful in launching relevant activities, but a long-term commitment is the only way to achieve changes of this magnitude across such a broad spectrum of higher education systems.

All involved in the process have underestimated how crucial a commitment to staff training and development is. It must be remembered that most staff in higher education have had no pedagogic/andragogic education and training — most staff are indeed ‘driving without a licence’, basing their own teaching on their own experiences as students. The world has changed but, in the vast majority of countries and cases, the training for life as a university academic — facilitating learning and then assessing the achievement of the learning outcomes — has not. What came as a shock was that many of the ‘trainers/professionals’ interviewed were themselves still operating in, and indeed wedded to, the old paradigm of expert-driven delivery. Many institutions proved not to have any form of well-functioning Staff Development Unit with a focus on the new paradigm and all that it entails, including the many benefits to both staff and students. If this is not remedied, the future looks bleak. Any such Units must, however, be positive, well informed and truly engaged, and must truly serve the needs of staff and their students in line with institutional policies. They must not be perceived as a ‘side show’. Recognition of such a Unit’s value and of its ability to enhance and add value to the learning is vital. Success without these factors is unlikely. Full engagement by all actors is a sine qua non for success.

Without engaging students and employers in programme design, implementation, delivery and quality assurance, the required level of progress will not be achieved. Good initiatives in this respect exist, but they form a patchwork rather than being all-pervasive.

Given the financial situation, students understandably show concern about their future role in society. What they observe is a flexible labour market in which they are expected to demonstrate a sufficiently wide range of general competences and, where possible, some work experience. They know they need subject-specific knowledge and skills but also desire the wider outcomes of learning. In today’s ever-changing job market and challenged society it is crucially important to involve employers and societal leaders in the educational process, if possible in a structured way. They should be seen as advisers in this process, not as decision makers on what should be taught and learnt; that is a collective responsibility which must have the academic staff at its core. Nevertheless, their involvement as guest lecturers and placement/internship providers adds great value, and many institutions have already recognised this and taken appropriate steps in that direction.

To achieve these enhancements, follow-up steps are required. The programme of visits was able to engage the institutions once more with the required paradigm shift and to re-launch the dialogue, while allowing the researchers to analyse the state of play and the enhancements needed. The most important of these are:

• A stronger commitment at national level to achieving the paradigm shift, which is, in any case, in the national interest in terms of economic prosperity and a sustainable society.

• European, as well as national, support to create better conditions for success, both organisational and financial. This also implies a well-defined strategy for communicating the benefits of the paradigm shift at national, institutional and personal level, which might require tailored taskforces operating at all those levels.

• Renewed institutional commitment and stronger leadership to achieve the paradigm shift, including adopting those good practices that already exist. This requires serious investment in targeted staff development and effective structures for curriculum development and learning backed by an effective quality culture.

• A systematic approach to analysing what is actually happening in practice. This could make use of the robust instruments developed, tested and used in the framework of this study. Site visits by an international team have proven to be of great value, both in the analysis that takes place and in the heightened awareness created. This seems to be the best way to obtain a reliable picture of what is happening, and it allows for relevant, useful and constructive feedback.

Bibliography

Adelman, Cliff. The Bologna Process for U.S. Eyes: Re-learning Higher Education in the Age of Convergence. Washington, 2009.

Arreola, Raoul A. Writing Learning Objectives. A Teaching Resource Document from the Office of the Vice Chancellor for Planning and Academic Support, The University of Tennessee, Memphis, s.a.: https://www.uwo.ca/tsc/graduate_student_programs/pdf/LearningObjectivesArreola.pdf

Bologna Process website: http://www.ehea.info/article-details.aspx?ArticleId=87

CEDEFOP. Terminology of European education and training policy. A selection of 130 key terms. Second edition, Luxembourg, 2014.

Education International and European Student Union. Time for a new paradigm in education: student-centred-learning. Learning SCL toolkit, s.a. (2010): http://www.esu-online.org/pageassets/projects/projectarchive/100814-SCL.pdf.

European Student Union. Bologna with Student Eyes 2015. Time to meet the expectations from 1999, Brussels, 2015: http://www.esu-online.org/asset/News/6068/BWSE-2015-online.pdf

European University Association. TRENDS III: Progress towards the European Higher Education Area. By Sybille Reichert and Christian Tauch, 2003: http://www.ehea.info/Uploads/EUA%20TRENDS/TRENDS_III-July2003.pdf

——— TRENDS IV: European Universities Implementing Bologna. By Sybille Reichert and Christian Tauch, 2005: http://www.eua.be/eua/jsp/en/upload/TRENDSIV_final.1114509452430.pdf

——— TRENDS V: Universities shaping the European Higher Education Area. By David Crosier, Lewis Purser & Hanne Smidt, 2007: http://www.ond.vlaanderen.be/hogeronderwijs/bologna/documents/EUA_TRENDS_Reports/Final_TRENDS_Report_V_May.pdf.

——— TRENDS VI (2010): A decade of change in European Higher Education. By Andrée Sursock & Hanne Smidt, 2010: http://www.eua.be/Libraries/publications-homepage-list/TRENDS2010

——— TRENDS VII (2015): Learning and Teaching in European Universities. By Andrée Sursock, 2015: http://www.eua.be/Libraries/publications-homepage-list/EUA_TRENDS_2015_web.

European Commission. The European Qualifications Framework for Lifelong Learning (EQF). Luxembourg, 2008: http://www.ond.vlaanderen.be/hogeronderwijs/bologna/news/EQF_EN.pdf.

European Commission/EACEA/Eurydice. The European Higher Education Area in 2015: Bologna Process Implementation Report. Luxembourg: Publications Office of the European Union, 2015.

Froyd, Jeffrey and Nancy Simpson. Student-Centered Learning: Addressing Faculty Questions about Student-Centered Learning. Texas A&M University, 2010: http://ccliconference.org/files/2010/03/Froyd_Stu-CenteredLearning.pdf.

Gonzalez, Julia and Robert Wagenaar, eds. Tuning Educational Structures in Europe, Final Report. Bilbao and Groningen, 2003.

——— Tuning Educational Structures in Europe, Universities’ contribution to the Bologna Process, Final Report Phase 2. Bilbao and Groningen, 2005.

High Level Group on the Modernisation of Higher Education. Report to the Commission on Improving the quality of teaching and learning in Europe’s higher education institutions. Luxembourg: Publications Office of the European Union, June 2013; also available at http://ec.europa.eu/education/library/reports/modernisation_en.pdf.

Lokhoff, Jenneke, Bas Wegewijs, Katja Durkin, Robert Wagenaar, Julia González, Ann Katherine Isaacs, Luigi F. Donà dalle Rose and Mary Gobbi, eds. A Guide to Formulating Degree Programme Profiles. Including Programme Competences and Programme Learning Outcomes. Bilbao, Groningen, The Hague, 2010.

Lumina Foundation website: www.luminafoundation.org

McKiernan and Birtwistle. “Making the Implicit Explicit: Demonstrating the Value Added of Higher Education by a Qualifications Framework.” In: The Journal of College and University Law. Notre Dame, 2010: http://www3.nd.edu/~jcul/files/Birtwistle_McKiernan.pdf.

MHEC website: http://www.mhec.org/programs/tuning (accessed 18 May 2016).

Millett, Catherine M., David G. Payne, Carol A. Dwyer, Leslie M. Stickler, and Jon J. Alexiou. A Culture of Evidence: An Evidence-Centered Approach to Accountability for Student Learning Outcomes. Princeton: ETS, 2008: https://www.ets.org/Media/Education_Topics/pdf/COEIII_report.pdf.

PASCL (Peer Assessment of Student Centred Learning) Website: http://pascl.eu/publications/overview-on-student-centred-learning-in-higher-education-in-europe/ (accessed 18 May 2016).

“Realising the European Higher Education Area.” Communiqué of the Conference of Ministers responsible for Higher Education in Berlin on 19 September 2003: http://www.ond.vlaanderen.be/hogeronderwijs/bologna/documents/mdc/berlin_communique1.pdf

Rogers, C. “As a teacher, can I be myself?” In: Freedom to learn for the 80s. Ohio: Charles E. Merrill Publishing Company, 1983.

Serbati, Anna. “Implementation of Competence-Based Learning Approach: stories of practices and the Tuning contribution to academic innovation.” Tuning Journal for Higher Education 3, no. 1 (2015).

‘The Bologna Process 2020 — The European Higher Education Area in the new decade’. Communiqué of the Conference of European Ministers Responsible for Higher Education, Leuven and Louvain-la-Neuve, 28-29 April 2009: http://www.ond.vlaanderen.be/hogeronderwijs/bologna/conference/documents/leuven_louvain-la-neuve_communiqué_april_2009.pdf.

The Glossary for Education Reform website: http://edglossary.org/learning-objectives/.

Tuning Educational Structures in Europe website: http://www.unideusto.org/tuningeu/.

Tuning Europe website, EU/US research project: http://www.unideusto.org/tuningeu/component/content/article/385-euus-research-project.html.

Tuning International Academy website: http://www.tuningacademy.org/.

Tuning Russia website: http://www.tuningrussia.org/index.php?lang=ru.

About the Authors

Tim Birtwistle (tim.birtwistle@hedconsultant.co.uk), is Emeritus Professor of the Law and Policy of Higher Education (Leeds Beckett), Visiting Fellow Oxford Centre for Higher Education Policy Studies, Fellow of the Royal Society of Arts, Jean Monnet Chair and consultant to Lumina Foundation (USA). He has published widely on the law, policy, process and change of higher education covering, inter alia, academic freedom, university liability, risk, dispute resolution, qualification frameworks, credits, learning, assessment and trans-Atlantic issues of convergence and divergence in journals, professional publications, book chapters and handbooks.

Courtney Brown (cbrown@luminafoundation.org) is the Director of Organizational Performance and Evaluation at the Lumina Foundation in the United States. In this role she works across the Foundation to provide strategic direction using evidence from evaluation, data and metrics. She oversees the evaluative work on the development, outcomes, and impact of the work the Foundation conducts. In addition, she manages the performance measurement system of metrics the Foundation uses to measure progress toward GOAL 2025, to increase the proportion of Americans with high-quality degrees and credentials to 60 percent by the year 2025. Dr. Brown received a Ph.D. from the University of Virginia (USA) in Educational Evaluation and Research. 

Robert Wagenaar (r.wagenaar@rug.nl) is a historian and at present Director of the International Tuning Academy Groningen by appointment of the Executive Board of the University of Groningen. He is also a member of the Editorial Board of the Tuning Journal for Higher Education (TJHE). From 2003 until 1 September 2014 he was Director of Undergraduate and Graduate Studies (Dean of Studies) at the Faculty of Arts of the University of Groningen in the Netherlands. Wagenaar is an external Higher Education expert for the European Commission and has been involved in major initiatives to harmonize European Higher Education, such as the development of the European Credit Transfer and Accumulation System (ECTS) since 1988, the Qualifications Framework for the European Higher Education Area and the European Qualifications Framework for LLL. From 2004 to 2014 he also chaired the Dutch team of experts for the implementation of the ‘Bologna Process’ in Dutch Higher Education institutions by appointment of the Ministry of Education and Culture. As of January 2015 he has been appointed by the same Ministry as a member of the Dutch Higher Education Experts team in the framework of Erasmus+. Furthermore, he has been president of the Erasmus Mundus Master Programme of Excellence Euroculture: Europe in the Wider World since 2006. Together with Julia Gonzalez and Pablo Beneitone (University of Deusto, Bilbao, Spain), Wagenaar elaborated, designed and coordinates the large-scale innovative project Tuning Educational Structures in the World.

[*] The research team consisted of the authors and the following researchers: Ingrid van der Meer (International Tuning Academy Groningen, University of Groningen, The Netherlands), Edurne Bartolomé Peral (Deusto International Tuning Academy -DITA, University of Deusto, Spain), and Anna Serbati (University of Padova, Italy).

[**] Tim Birtwistle (tim.birtwistle@hedconsultant.co.uk), is Emeritus Professor of the Law and Policy of Higher Education (Leeds Beckett), Visiting Fellow Oxford Centre for Higher Education Policy Studies, Fellow of the Royal Society of Arts, Jean Monnet Chair and consultant to Lumina Foundation (USA).

Courtney Brown (cbrown@luminafoundation.org), PhD in Educational Evaluation and Research, is the Director of Organizational Performance and Evaluation at the Lumina Foundation in the United States.

Robert Wagenaar (r.wagenaar@rug.nl) is a historian and at present Director of the International Tuning Academy Groningen by appointment of the Executive Board of the University of Groningen, The Netherlands. He is also a member of the Editorial Board of the Tuning Journal for Higher Education (TJHE).

More details on the authors are provided at the end of this article.

[1] “Realising the European Higher Education Area,” Communiqué of the Conference of Ministers responsible for Higher Education in Berlin on 19 September 2003.

[2] Communiqué of the Conference of European Ministers Responsible for Higher Education, Leuven and Louvain-la-Neuve, 28-29 April 2009.

[3] Tuning Academy website: http://www.tuningacademy.org/.

[4] For example see: MHEC website: http://www.mhec.org/programs/tuning.

[5] See: Tuning Russia website: http://www.tuningrussia.org/index.php?lang=ru.

[6] See: Tuning Educational Structures in Europe website: http://www.unideusto.org/tuningeu/ for details.

[7] Tuning Educational Structures in Europe, Final Report, 2003 and Tuning Educational Structures in Europe, Universities’ contribution to the Bologna Process, Final Report Phase 2, 2005.

[9] Adelman, The Bologna Process for U.S. Eyes: Re-learning Higher Education in the Age of Convergence (Washington, 2009).

[10] McKiernan and Birtwistle, ‘Making the Implicit Explicit: Demonstrating the Value Added of Higher Education by a Qualifications Framework’.

[11] See footnote 6 supra.

[12] Tender reference first phase of study: Negotiated procedure EAC-2010-1243.

[13] Tender reference second phase of study: Negotiated procedure EAC-03/2013.

[14] Millett, Catherine M., et al., A Culture of Evidence: An Evidence-Centered Approach to Accountability for Student Learning Outcomes.

[16] European University Association, TRENDS III.

[17] European University Association, TRENDS IV.

[18] European University Association, TRENDS V.

[19] European University Association, TRENDS VI.

[20] European University Association, TRENDS VII.

[21] European Commission/EACEA/Eurydice, The European Higher Education Area in 2015: Bologna Process Implementation Report. Luxembourg: Publications Office of the European Union, 2015.

[22] European University Association, TRENDS VII.

[23] European Student Union, Bologna with Student Eyes 2015.

[25] Tuning Educational Structures in Europe website: http://www.unideusto.org/tuningeu/component/content/article/385-euus-research-project.html (accessed 18 March 2016).

[26] List of countries, states and subject areas:

List of Countries: Austria, Belgium, Germany, Ireland, Italy, Lithuania, Netherlands, Norway, Poland, Portugal, Romania, Slovenia, Spain, Sweden; List of US states: California, Indiana, Maryland, Michigan, New York, North Carolina, Texas, Utah; List of subject areas: Administration, Aeronautics, Architecture, Arts, Banking and Finance, Biology, Biotechnology, Business, Business Administration, Chemistry, Christianity, Computer Science, Economics, Electrical Engineering, Electronics, Engineering, Facility Management, Foreign Languages, Gender Studies, History, Information Technology, International Business, Mathematics, Mechanical Engineering and Mechatronics, Media (TV & Radio), Medieval & Early Modern History, Modern British History, Pedagogy, Philosophy, Physics, Physiotherapy, Psychology.

[27] CEDEFOP, Terminology of European education and training policy. A selection of 130 key terms. Second edition, Luxembourg, 2014.

[28] Tuning applied the following definition: competences represent a dynamic combination of cognitive and metacognitive skills, knowledge and understanding, interpersonal, intellectual and practical skills, and ethical values. This is complementary to the definition used by the EQF for LLL. In this overarching framework, which distinguishes between knowledge, skills and competences, the following definition is used: “competence” means the proven ability to use knowledge, skills and personal, social and/or methodological abilities, in work or study situations and in professional and personal development. This is based on the assumption that these have been acquired at an earlier stage of the learning process. European Commission, The European Qualifications Framework for Lifelong Learning (EQF), 11.

[29] Learning objectives can be defined as clear and concise statements that describe what the teacher intends the students to learn by the end of the course. They outline the material intended to be covered or the questions related to the discipline that the class will address. In practice this approach means that the focus is on the teaching process (instead of the learning process) and on the transfer of knowledge from the teacher to the students. Learning objectives express knowledge acquisition and transfer, and the term is part of the paradigm of the staff-centred approach.

In the USA learning objectives are often defined as learning outcomes. This has contributed to the confusion of terms. See for example: The Glossary for Education Reform: http://edglossary.org/learning-objectives/. Another example of the mixing-up of terms is: Raoul A. Arreola, Writing Learning Objectives. A Teaching Resource Document from the Office of the Vice Chancellor for Planning and Academic Support, The University of Tennessee, Memphis, s.a.

[30] Statements of what a learner is expected to know, understand and be able to demonstrate after completion of a process of learning. According to Tuning, learning outcomes are expressed in terms of the level of competence to be obtained by the learner. They relate to level descriptors in national and European qualifications frameworks. The term is applied in the context of the student-centred approach.

[31] A misunderstanding has been created in this respect by defining student-centred learning as ‘an approach to learning in which learners choose not only what to study but also how and why that topic might be of interest’. See: Rogers, C., “As a teacher, can I be myself?” In: Freedom to learn for the 80s. Ohio: Charles E. Merrill Publishing Company, 1983.

Jeffrey Froyd and Nancy Simpson of Texas A&M University give a comprehensive overview of what is understood by student-centred learning from the perspective of the teacher in their paper Student-Centered Learning: Addressing Faculty Questions about Student-Centered Learning (2010).

[32] The European Student Union applies the following definition of student-centred learning: A learning approach characterised by innovative methods of teaching which aim to promote learning in communication with teachers and students and which takes students seriously as active participants in their own learning, fostering transferable skills such as problem-solving, critical and reflective thinking. Education International and European Student Union, Time for a new paradigm in education: student-centred-learning, 2010, 4.

[33] Anna Serbati, “Implementation of Competence-Based Learning Approach: stories of practices and the Tuning contribution to academic innovation,” Tuning Journal for Higher Education, Growing Tuning Seeds, Volume 3, Issue No. 1, November 2015. See also: Jenneke Lokhoff et al., A Tuning Guide to Formulating Degree Programme Profiles. Including Programme Competences and Programme Learning Outcomes. Bilbao, Groningen, The Hague, 2010.

[34] ‘Gap’ is the term first used in TRENDS III 2003.

[35] TRENDS Reports III, IV, V, VI and VII; Bologna Stocktaking reports 2005, 2007 and 2009.

[36] As described in detail in Sections 2 and 3 above, the aim of the EU-US Study on the implementation of the Learning Outcomes/Competences approach was simply to determine the extent to which universities have adopted it. Recall that the study used a variety of instruments to find evidence (mixed methodology: online questionnaire plus in-depth interviews).

[37] High Level Group on the Modernisation of Higher Education, Report to the Commission. June 2013, 7 and 12.

Copyright

Copyright for this article is retained by the Publisher. It is an Open Access material that is free for download, distribution, and/or reuse in any medium for non-commercial purposes only, provided any applicable legislation is respected, the original work is properly cited, and any changes to the original are clearly indicated.