Tuning impact in Latin America: is there implementation beyond design?

Pablo Beneitone and Maria Yarosh[*]

doi: 10.18543/tjhe-3(1)-2015pp187-216

Abstract: Deusto International Tuning Academy is undertaking a large-scale study to analyse the impact Tuning projects may have had in participating universities. More particularly, the study hopes to provide an unambiguous answer regarding the presence or absence of the implementation of a competence-based student-centred approach in the different world regions where Tuning projects have taken place. The present article focuses only on Latin America where two Tuning projects have been developed. It describes the findings of the first two stages of the study. After reporting the data, the authors argue that there is evidence of a Tuning impact in each of three intended impact domains: (1) understanding of the importance of a shift from content- to competence-based education; (2) provision of institutional support necessary to facilitate this change; and (3) appropriate teaching, learning and assessment within the general framework of the study plans and degree profiles.

Keywords: impact study; implementation; project evaluation; Tuning; Latin America; Tuning Latin America; competence-based approach; student-centred approach.

I. Introduction

A current trend in higher education is the shift from teacher-centred to student-centred learning. The student has to take an active part in building his or her own learning, which turns the teacher into a facilitator: someone who provides resources (information, methods, tools), creates learning environments and accompanies students, offering assistance and guidance throughout the process, thus raising student motivation, commitment, and appreciation of the usefulness of learning.

There was — and is — a recognition that in spite of their valuable differences, higher education systems faced common internal and external challenges related to the growth and diversification of higher education, the employability of graduates, the shortage of skills in key areas, the expansion of private and transnational education, the need to further encourage staff and student mobility, and, in the longer term, the desire to attract the best scholars from around the world in order to be leaders in different areas of research.

The reforms required cover all areas of higher education; this was true in Europe, and in most other world regions as well. Tuning emerged as a response to the reform process initiated in Europe by the Bologna Declaration, but the methodology developed has since been used in many regions where reform of higher education was being undertaken and where governments perceived as useful a model of reform that encouraged participation from academics at all levels, provided links with the world of work, and had authenticity in terms of the culture of education in the country.

The development of competence-based approaches to teaching and learning in higher education has been influenced by industry, where there has been a growing perception that new graduates are often unfit for the demands of the modern workplace, and by graduates themselves, who have found that their range of skills and competences is lacking when they seek employment. Numerous reports from around the world attest to this.[1], [2]

A student centred approach to teaching and learning is the concomitant of the adoption of a competence oriented approach to curriculum design. Where learning is designed to focus on what students can do, can value, can innovate and be creative in, students have to be the active owners of their own development.

Tuning proposes a methodology whose aim is to facilitate the development of a competence-based and student-centred approach in higher education. The first generation of Tuning projects invited universities to adopt the new paradigm and gave them an initial impulse by focusing on the curriculum design process, the first necessary step if this new approach is to be implemented. It was expected that, equipped with the required conceptual framework and methodological tools, universities would be able to continue moving towards implementation after the project. However, no feedback mechanism had been established to check whether this was indeed happening.

The present article focuses on Latin America and shares the findings of the first two stages of the impact research study. Section II starts with a very brief outline of the main characteristics of the two Tuning Latin America projects. Section III sets out the global methodological framework of this impact evaluation study, while Section IV introduces the general methodological decisions that shaped the first two stages. Data collection procedures, samples and the major findings of Stage One and Stage Two are presented in Sections V and VI. Section VII discusses the findings relevant to each of the three intended Tuning impact domains. Finally, Section VIII outlines two possible directions for further research.

II. The context: two Tuning projects in Latin America

The Tuning Latin America projects were the first Tuning projects conducted outside Europe. The question of how to progress towards a shared area for universities while respecting traditions and diversity had ceased to be an exclusively European concern and had, by 2004, become a global need.

From the beginning, two very specific problems faced by Latin American universities were pinpointed. On the one hand was the need to modernise, reformulate and make degree programmes more flexible in the light of new trends, the requirements of society and the demands of a fast-changing world. On the other hand, and closely linked to the first problem, was the importance of transcending limits imposed by staff in terms of learning, by providing education that would enable what has been learnt to be recognised beyond local, national and regional institutional borders.

In its first phase (2004-2007) the Tuning Latin America project sought to engage academics and administrators in a wide debate, the goal of which was to identify and exchange information and improve collaboration between higher educational institutions, with a view to developing the quality, effectiveness, comparability and transparency of degree programmes within the region. It allowed the importance of competences to take centre stage in the process of curriculum reform and modernization.[3]

The second phase of Tuning Latin America (2011-2013) started on already-fertile terrain — with the fruits of the previous phase and in view of the current demand on the part of Latin American universities and governments to facilitate the continuation of the process already embarked on. The second project involved 182 Latin American universities with the aim of contributing to building a Higher Education Area in the region. This challenge took the form of four very specific central working themes:

• A deeper understanding of agreements involving designing meta-profiles and degree profiles in the 15 subject areas included in the project (Administration, Agronomy, Architecture, Law, Education, Nursing, Physics, Geology, History, Information Technology, Civil Engineering, Mathematics, Medicine, Psychology and Chemistry).

• Contributing to reflections on future scenarios for new professions.

• Promoting the joint construction of methodological strategies in order to develop and assess competences.

• Designing a system of credits (CLAR — Latin American Reference Credit) to facilitate recognition of studies in Latin America as a region that can be articulated with systems from other parts of the world.[4]

In terms of impact, Tuning expected a clear understanding at university level of the importance of the shift from programmes based on knowledge to those which also include competences. A second intended impact was the creation of conditions that favour implementation in terms of institutional policy and culture. A third envisaged impact was that the universities would modify their study plans and face the challenge of using agreed competences as a point of reference for the design of curricula and for developing degree profiles. This point was tied to teaching, learning and assessment, including the estimation of student workload and the allocation of credits at the level of units and programmes.

After ten years of development, the Project in Latin America completed the design stage, opening a process for consolidating these issues and allowing them to mature within the universities, which included implementing complete degree programmes following the Tuning methodology. The way these challenges have been developed by the Latin American universities is the core of this paper.

III. Impact Evaluation Study: theoretical framework

Evaluation can happen at different points in the lifecycle of a project, can pursue different goals and can adopt different methodological approaches. Yet it must be done with rigour, to ensure maximum objectivity.

Impact evaluation attempts to measure the effects that a project might have had. Let us look at three definitions of impact studies, which are very similar to each other but emphasize slightly different aspects of this type of research inquiry.

The definition proposed by 3ie is as follows:

Rigorous impact evaluation studies are analyses that measure the net change in outcomes for a particular group of people that can be attributed to a specific programme using the best methodology available, feasible and appropriate to the evaluation question that is being investigated and to the specific context.[5]

The Austrian Development Agency similarly focuses on the desired effects of an intervention.[6] The World Bank, on the other hand, does not characterize impact as positive or negative but defines impact evaluation as a study that “assesses changes in the well-being of individuals, households, communities or firms that can be attributed to a particular project, programme or policy”.[7] The OECD separates intended from unintended effects and suggests that an impact study should take both into account: “impact evaluation is an assessment of how the intervention being evaluated affects outcomes, whether these effects are intended or unintended”.[8]

Defining the notion of impact is important:

How impact is defined will necessarily determine the scope and content of the study because different definitions prioritize different aspects of “impact”; imply different concepts of causality (what produces the impact); and how to estimate the impact (evaluation designs).[9]

Apart from positive or negative, intended or unintended effects, it is also useful to define impact in terms of expected versus unexpected, immediate versus long-lasting, and as experienced (or not) by particular stakeholders, institutions, communities or whole regions.[10]

The two major purposes of impact evaluation seem to be, on the one hand, obtaining feedback or learning from experience (checking what worked and what did not, where and for whom; what can or needs to be improved and how) and, on the other, accountability.[11], [12] A further purpose is demonstrating that the project has indeed had some or all of the intended effects.[13]

Two caveats have to be taken into account in good impact evaluation: (1) the difficulty of proving cause-effect relations in real-life large-scale projects and (2) the caution necessary when formulating recommendations (whether about project elements to be maintained or those in need of improvement) due to the danger of generalization. Caution is recommended by most if not all sources that discuss impact evaluation studies.[14] Newcomer, Hatry and Wholey, for example,[15] suggest that evaluators should limit themselves to speaking about plausible attribution, recognizing that there may be external causes of the effects reported.

Concerning generalization, the similarity of the contexts and conditions (including the impossibility of accounting for other contributory factors) will determine the extent to which an aspect of a programme which worked (or failed to work) in one situation within a past project is likely to work (or fail again) in a future project.[16], [17], [18]

In terms of the methodology and evaluation design, three issues are important: (1) the timing of the evaluation; (2) the formulation of evaluation questions and choice of the methods to be used, both in relation to the evaluated project attributes; and (3) the triangulation and contextualization of the findings.[19], [20], [21]

The timing, or the decision to proceed with an impact evaluation, is key because different effects become noticeable at different stages of a project. While some effects can be present during the project lifetime but cease to exist once the project is over, others might not appear till some time after the project finalization.

Evaluation questions need to be as concrete as possible since they determine the type of data collected and the types of responses the evaluation study will be able to provide. Evaluators need to consider each of the originally intended project outcomes and formulate questions related to the domains of impact they or the party who commissions the evaluation consider most relevant.[22]

There is no single perfect evaluation design or data collection method for impact evaluation studies.[23] Until quite recently, comparing the situation before and after a project and comparing the conditions of project beneficiaries with their counterparts not affected by the project were considered to be the only valid methods.[24], [25] Nowadays, however, the most important aspect is that of reconciling evaluation questions and the evaluation design used (the methods) with the inner logic of the programme or project.[26], [27] Indeed, the beneficiaries, scope and types of intended effects, not to mention the pace at which changes can be expected to happen, the interdependence of such changes or different aspects, and the type of indicators, all differ from project to project. Evaluators, thus, have more liberty, but also more responsibility for selecting the methods to find responses to the questions relevant for “their” project.

The case study is one method that has proved valuable in impact evaluation. “Studies of ‘cases’ that combine within-case analysis and comparisons across cases are especially suited to impact evaluation in complex settings”.[28] Evaluators need to identify the appropriate unit of analysis,[29] but when this is done, the results can be very promising. Thus:

Cases may be policy interventions, institutions, individuals, events or even countries during a particular historical period. This represents a shift from focusing causal analysis on variables taken out of their specific context. Locating variables in the context of the ‘case’ and conducting within-case analysis alongside comparisons across cases has opened up major new opportunities for causal analysis that are still largely ignored in evaluation practice.[30]

The use of case studies could, therefore, be an optimal solution in terms of the need to triangulate and contextualize the findings. They may also provide a solution to another long neglected issue: the involvement of project participants and/or beneficiaries in impact evaluation.[31] Although different levels of involvement and types of participation might be appropriate for different projects, if evaluators do not identify and take into account the opinions of the different project stakeholders or beneficiaries, the findings are bound to be partial, if not biased.

To conclude, a project impact evaluation study needs to comply with a number of general principles that regulate this type of enquiry, but its authors need considerable freedom, and bear the corresponding responsibility, in defining the impact to be evaluated, formulating concrete questions and selecting the methods to be applied.

IV. Methodology

The general purpose of this Tuning impact study was to explore the effects of the two Tuning Latin America projects (Phase 1: 2004-2007; Phase 2: 2011-2013) in Latin American universities. More particularly, the study aimed to discover to what extent a competence-based student-centred approach to teaching, learning and assessment in higher education degree programmes had been implemented by the different participating institutions. It was also important for the Tuning Academy to obtain feedback and learn whether the improvements identified as desirable by the project participants had been implemented after the completion of the projects. These may have simply remained a topic of discourse; they may have been implemented on paper but not in practice; or they may have been rejected or forgotten in the course of post-project intra-institutional or national reforms.

The design of the impact study had to satisfy at least three basic requirements. First of all, it was important to identify institutions where the competence-based student-centred approach advocated in the projects had been implemented. The experiences of such institutions could be valuable for others wishing to introduce curriculum reform. Secondly, a centralized impact evaluation could give each participating institution the possibility of sharing its achievements and concerns, as well as comparing its approaches and results with others in its region or its area of studies. Thirdly, the Tuning Academy needed to collect data on the status of the implementation of a competence-based student-centred approach so as to better organize future efforts, build on what has been achieved, and address any issues which have not been resolved.

Thus, regarding the first key question of any impact study — whether or not to proceed with an evaluation at all[32] — a positive answer was given because, nearly 10 years after the start of the first Tuning Latin America project, it was essential to obtain systematic feedback on the effectiveness of the initiative. Since no impact studies of Tuning projects had been conducted before, it was not clear whether the timing was fully suitable.[33], [34] However, it was important to identify the current state of affairs, while recognizing that any fundamental change in higher education is necessarily an evolving process.[35]

With respect to the scope, it was decided to focus specifically on implementation at the level of first-cycle degree programmes. The study was to look into the extent of the intended effects now observable along the different Tuning lines of action.

The four Tuning lines of action are: (1) agreeing on the competences to be developed (generic and subject specific); (2) sharing expertise in approaches to teaching, learning and assessment of these competences; (3) measuring student workload and credits; and (4) evaluating the quality of programmes. The first line analyses transversal competences as well as those that are specific to subject areas. The second line invites academics to share the most effective methods of teaching, learning and assessment for achieving the competences identified. The third line proposes reflection on the relationship between this system of competences and the student’s workload, and its connection with the resulting time measured in credits. Finally, the fourth line highlights the fact that quality is an integral part of the design of a competence-based curriculum, essential in articulating the three previous lines.
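To make the third line more concrete, the sketch below shows how estimated student workload might be translated into credits. The parameters used here (a 60-credit academic year and 1,600 hours of annual workload) are illustrative assumptions only, not the official CLAR values.

```python
# Illustrative sketch of a workload-to-credit conversion (Tuning line 3).
# The parameters below are assumptions for illustration; they are NOT the
# official CLAR figures.

ANNUAL_WORKLOAD_HOURS = 1600   # assumed total student workload per academic year
CREDITS_PER_YEAR = 60          # assumed credits awarded per academic year

HOURS_PER_CREDIT = ANNUAL_WORKLOAD_HOURS / CREDITS_PER_YEAR   # ~26.7 hours

def course_credits(contact_hours: float, independent_hours: float) -> float:
    """Estimate course credits from total student workload (class + independent)."""
    total_hours = contact_hours + independent_hours
    return round(total_hours / HOURS_PER_CREDIT, 1)

# Example: 48 contact hours plus 112 hours of independent work -> 6.0 credits
print(course_credits(48, 112))
```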

All the institutions participating in both of the Tuning Latin America Projects were invited to participate — on a voluntary basis. The study was conducted by the Tuning Academy at the University of Deusto and no external funding was available to compensate for the time dedicated to participation in the study on the part of Latin American universities.

There were three main evaluation questions:

(1) What is the general picture of the implementation of a competence-based student-centred approach in participating Latin American universities?

(2) Which participating institutions succeeded in implementing the competence-based student-centred approach at the level of at least one first-cycle degree programme as opposed to cases of (a) implementing the approach in single courses within a degree programme or (b) implementing certain aspects of this approach, whether in separate departments or at the level of the whole university?

(3) What difficulties have been experienced by those actively trying to implement a competence-based student-centred approach?

A three-stage study was designed to answer these questions. Stage One was aimed at obtaining a response to the first question above and at identifying potential cases of successful implementation. Stage Two further explored the successful cases identified, as well as responding to question three above. Stage Three, finally, will focus on a small number of cases of successful implementation and explore these in greater detail. This article reports on Stages One and Two, which have already been completed. Stage Three is to be conducted in the near future and will be discussed in a subsequent publication.

The general methodological approach adopted in Stages One and Two was quantitative. Both stages used online questionnaires, in which open questions were avoided as much as possible, as the research instrument. Stage Three, in turn, will use interviews and focus groups, which will permit the collection of qualitative data from a small number of cases studied in detail, providing depth to the general picture described in this article.

V. Stage One: Procedure, Sample and Findings

Stage One was conducted between October and November 2013. 160 institutions in 18 Latin American countries were invited, by email, to participate. The Tuning Impact Questionnaire was developed simultaneously in both Spanish and Portuguese.

V.1. Stage One: the Data Collection Instrument

Respondents were asked to report on the impact the Tuning project(s) might have had in their university in relation to introducing or further implementing competence-based student-centred learning, in terms of the following five aspects:

• Curriculum development.

• Approaches to teaching, learning and assessment.

• Assessment of the students’ workload.

• Introducing a system of credits based on students’ workload calculations.

• Offering staff development in order to help teachers introduce a competence-based student-centred approach.

These five aspects are closely linked to the Tuning lines: curriculum development, including generic and subject specific competences, with Line 1; teaching, learning and assessment with Line 2; student workload and the definition of a credit system with Line 3; and administrative support with the core of Line 4.

The questionnaire explicitly asked about the impact of Tuning projects. This means that cases in which a competence-based student-centred approach had been introduced before the institution participated in any Tuning projects or independent of such participation were not included in this research.

The four levels of implementation envisaged for each of the aspects were:

• Zero — nothing has changed despite participation in the Tuning project(s).

• First — the improvements in question have been implemented in the programmes of the respondent’s subject area.

• Second — the improvements in question have been implemented in some subject areas within the respondent’s institution.

• Optimal — the improvements in question have been implemented in the whole university.

Respondents who reported positive changes in any of the aspects were further asked to indicate whether any documentation exists (study plans/student guides/didactic materials/strategic plans/reports, etc.) that demonstrates that the changes are related to the respondents’ participation in the Tuning project(s).
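For analysis, the four implementation levels above lend themselves to a simple ordinal encoding. The following is a minimal sketch of how responses could be coded; the names and the helper function are ours, not part of the original questionnaire.

```python
from enum import IntEnum

class ImplementationLevel(IntEnum):
    """Ordinal encoding of the four implementation levels (names are ours)."""
    ZERO = 0     # nothing has changed despite participation in Tuning
    FIRST = 1    # implemented in the respondent's own subject area
    SECOND = 2   # implemented in some subject areas of the institution
    OPTIMAL = 3  # implemented across the whole university

def implemented(level: ImplementationLevel) -> bool:
    """Any non-zero level counts as 'implemented' (columns a+b+c in Table 1)."""
    return level > ImplementationLevel.ZERO
```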

V.2. Stage One Sample Description

In total, respondents from 133 out of the 160 higher education institutions from 18 Latin American countries completed the questionnaire, giving a response rate of 83.1% in terms of institutions and 100% in terms of countries. All fifteen of the subject areas were represented in the sample. All respondents had participated in the Tuning projects and were still working at the same university where they had worked during the projects.

Geographically, out of the 133 universities, 19 were from Argentina, 7 from Bolivia, 14 from Brazil, 13 from Chile, 15 from Colombia, 3 from Costa Rica, 3 from Cuba, 8 from Ecuador, 5 from El Salvador, 3 from Guatemala, 2 from Honduras, 11 from Mexico, 5 from Nicaragua, 4 from Panama, 5 from Paraguay, 7 from Peru, 2 from Uruguay, and 7 from Venezuela.
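The figures reported above are internally consistent, as a quick check shows (country counts copied from the preceding paragraph):

```python
# Quick consistency check of the Stage One sample figures reported above.
respondents_by_country = {
    "Argentina": 19, "Bolivia": 7, "Brazil": 14, "Chile": 13, "Colombia": 15,
    "Costa Rica": 3, "Cuba": 3, "Ecuador": 8, "El Salvador": 5, "Guatemala": 3,
    "Honduras": 2, "Mexico": 11, "Nicaragua": 5, "Panama": 4, "Paraguay": 5,
    "Peru": 7, "Uruguay": 2, "Venezuela": 7,
}
total = sum(respondents_by_country.values())  # 133 responding institutions
rate = total / 160                            # 160 institutions were invited
print(total, f"{rate:.1%}")                   # 133 83.1%
```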

V.3. Stage One Findings

As shown in Figure 1, in 122 higher education institutions out of 133 (91.7%), noticeable changes were reported resulting from implementing the recommendations of the Tuning project in at least one of the five aspects noted above and in at least one subject area. In addition, 62 institutions (46.6%) reported having implemented changes in all the five aspects at least at the level of one subject area (and were able to provide written documents to support their opinion). Finally, 10 universities (7.5%) believed that Tuning-influenced changes had been implemented in all of the five aspects across the whole university and could provide documentary evidence to support this claim.

Figure 1

Stage One Findings: Level of impact of Tuning in different implementation aspects

Table 1 (below) gives more detailed information on the responses obtained along the different Tuning lines. The highest implementation level was reported for the domains of (1) the design and revision of curricula, study programmes and plans, and (2) teaching, learning and assessment methods (75.9% for each). More than half of the universities stated that staff development initiatives had been available to support teachers, although the questionnaire format did not permit identification of what exactly was offered in terms of content, format, variety, etc. Finally, the workload aspect — and especially the introduction of a system of credits — seems to be the one where the fewest universities have been able to advance.

Table 1

Implementation levels reported along the five Tuning methodology aspects (implementation levels reported as numbers of institutions, out of 133)

| Question (implementation aspects related to Tuning) | No | (a) Yes, but only in my subject area | (b) Yes, but only in some faculties/departments/centres/degrees/subject areas | (c) Yes, in the whole university | Total implemented (a+b+c) |
|---|---|---|---|---|---|
| Has the competence-based student-centred approach been applied to revising or creating curricula/study programmes/plans? | 32 (24.1%) | 28 | 43 | 30 | 101 (75.9%) |
| Have the teaching, learning and evaluation methodologies been changed in any way following the competence-based student-centred approach? | 32 (24.1%) | 34 | 42 | 25 | 101 (75.9%) |
| Has the time and effort required from students (students’ workload) been considered in order to adjust study programmes? | 48 (36.1%) | 27 | 31 | 27 | 85 (63.9%) |
| Has any system of credits based on the students’ workload been introduced (ECTS/CLAR/other)? | 73 (54.9%) | 12 | 21 | 27 | 60 (45.1%) |
| Have teachers been offered relevant training (in order to help them introduce a competence-based student-centred approach)? | 44 (33.1%) | 14 | 44 | 31 | 89 (66.9%) |

VI. Stage Two: Procedure, Sample and Findings

In Stage Two (June-October 2014), first-cycle degrees were selected as the unit of analysis. Those institutions reporting noticeable changes along the five aspects in Stage One, and able to provide supporting documentation, were invited to participate. More precisely, they could participate if they had at least one first-cycle degree programme which: (1) could be considered an example of successful implementation of a competence-based student-centred approach, with this success clearly attributed to participation in the Tuning project(s), and (2) could provide, from those involved in the programme, at least three academic executives, ten teachers and twenty-five students willing to respond to the online questionnaires. Academic executives were high-level academic administrators in charge of degree programme development (deans, heads of departments, vice-rectors, etc.); they had to be fully familiar with the degree programme proposed as an example of successful implementation. Teachers could be any members of the teaching staff who gave classes within the programme in question. Students were chosen from the selected degree programme and had to be in their 3rd or 4th year of studies, if possible.

The contact person at each of the universities (normally a former Tuning Latin America project participant) was sent an email with the links to the three online questionnaires and was in charge of forwarding the links to the potential respondents and ensuring the required number of responses. Close to the first deadline, these contact persons were informed of how many academic executives, teachers and students from their university had responded so far, to enable them to invite more respondents from the categories where a shortfall existed. Thus, monitoring was done at the central level, but efforts to reach the minimum required number of respondents were made by the contact persons at the local level.

VI.1. Stage Two Data Collection Instrument

Three different questionnaires were developed for Stage Two: one for academic executives, one for teachers and one for students.

The academic executives were asked about the perceived impact of the Tuning projects on the design and planning of a competence-based student-centred approach in the degree programmes of their departments; about the difficulties experienced in the process of implementation; and about external contributory factors (e.g. a policy promoting a competence-based student-centred approach, and the level of autonomy universities have to introduce changes into curriculum design). They were also asked about institutional support for the change (e.g. in the form of developing guidelines for academics and degree programmes, providing staff development, or appointing a person to monitor the process).

The teaching staff were also asked about the perceived impact of the Tuning projects on the design and planning of a competence-based student-centred approach in the degree programmes of their departments, as well as about the difficulties experienced in the process of implementation. A number of questions explored current teaching and assessment practices (to see how close they came to fully implementing the approach). Furthermore, teachers’ opinions were explored about a competence-based approach compared with a content-focused one. Finally, the issue of staff development initiatives aimed at helping academics adopt the new approach was addressed.

Most of the questions addressed to students were aimed at soliciting information about the teaching, learning and assessment activities they have experienced (to compare the data with teachers’ responses). Student opinions about a competence-based student-centred approach were also invited.

VI.2. Stage Two Sample Description

Twenty-seven institutions were able to demonstrate their eligibility and were ready to undertake the Stage Two consultation. This was the first big step towards the identification of institutions whose implementation experiences could be considered best practice (and studied in Stage Three). Twenty-one of these eligible universities completed the whole process of Stage Two data collection.[36]

As a result, the Stage Two sample comprises 21 institutions from 11 Latin American countries: Argentina (1 HEI), Bolivia (2 HEIs), Chile (4), Ecuador (1), El Salvador (2), Guatemala (1), Honduras (2), Mexico (1), Panama (2), Paraguay (3) and Peru (2). Fifteen subject areas are represented: Agronomy (Chile), Architecture (Chile), Biophysics (Ecuador), Business Administration (Bolivia and Honduras), Chemistry (Argentina and Peru), Civil Engineering (Guatemala), Computer Engineering (Paraguay), Educational Administration and Management (Honduras), Law (El Salvador), Mathematics (Bolivia), Mathematics and Physics (Paraguay), Medicine (Chile, Panama and Peru), Modern Languages (El Salvador), Nursing (Chile, Mexico and Paraguay) and Psychology (Panama).

The Stage Two respondent sample comprised 70 academic administrators, 237 members of the teaching staff and 658 students. For each category, therefore, more than the required minimum of responses was received (the exact minimums for 21 institutions would have been 63, 210 and 525 respondents for the three groups respectively).
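These minimums follow directly from the per-institution quotas described above, as the short check below illustrates:

```python
# Minimum required Stage Two respondents across the 21 participating institutions.
institutions = 21
quota = {"academic executives": 3, "teachers": 10, "students": 25}
minimums = {group: institutions * n for group, n in quota.items()}
print(minimums)  # {'academic executives': 63, 'teachers': 210, 'students': 525}

actual = {"academic executives": 70, "teachers": 237, "students": 658}
assert all(actual[g] >= minimums[g] for g in quota)  # every minimum was exceeded
```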

VI.3. Stage Two Findings

The most general question about the possible impact of the Tuning projects on the implementation of a competence-based student-centred approach asked academic executives and teachers to what extent, in their opinion, the Tuning projects had had an impact on the design and planning of the degree programmes in their department. Table 2 shows the results, which suggest a strong, or at least a certain, positive impact in the majority of cases: positive Tuning impact was reported by 92.85% of academic executives and 92.4% of teaching staff, and strong (considerable or very strong) impact was reported by 77.15% of academic executives and 65.5% of the teaching staff.

Table 2

Tuning Impact on Implementation of Design and Planning of Competence-Based Student-Centred Learning

| Response | Academic Executives [of 70] | Teachers [of 237] |
|---|---|---|
| To a great extent | 20% | 20.7% |
| To a considerable extent | 57.15% | 44.7% |
| To a little extent | 15.7% | 27% |
| To no extent | 7.15% | 7.6% |

Figure 2

Stage Two Findings: Respondents who estimate that Tuning has had impact*

* The “total Tuning impact” category comprises three types of answers: “to a little extent”, “to a considerable extent” and “to a great extent”.

Figure 3

Stage Two Findings: Respondents who estimate Tuning impact as strong*

* The “strong Tuning impact” category comprises two types of answers: “to a considerable extent” and “to a great extent”.

In the questionnaires for academic executives, two types of factors potentially contributing to successful implementation of a competence-based student-centred approach were distinguished: external factors, which cannot be directly controlled by the university authorities, and internal factors, which can be created or strengthened by them. The two major external factors examined in this study were the presence or absence of a policy favourable to the change, and the level of authority universities have to introduce changes into curricula.

However, neither of these external factors was seen as preventing implementation. The level of autonomy reported by the academic executives in the sample is high: 83.8% consider that their institutions have autonomy “at the level of structuring and designing degree programmes”; 98.5% believe that there is enough autonomy “at the level of procedures and teaching methods to be used”; and all of them (100%) reported enjoying autonomy to introduce changes “at the level of evaluation methods and procedures”.

The other external contributing factor — a specific policy in the higher education sector in the country that promotes or contains indications with respect to the implementation of a competence-based student-centred approach — was recognised as present by 72.1% of the academic executives (20.6% reported that such a policy “clearly establishes guidelines for university degrees”, while 51.5% remarked that the policy in their context was “very general or partial”).

Internally, two elements to encourage successful implementation were highlighted: creating guidelines to help academics adopt and implement the new approach and establishing a system of monitoring. As for institutional support, documentation or guidelines that facilitate or support the implementation of a competence-based student-centred approach appear to have been developed by the university authorities in 86.8% of cases; and the implementation process is monitored, by different agents, in 91.2% of cases.

With respect to the difficulties encountered in the implementation process, as shown in Table 3 (below), academic executives and teaching staff seem to agree in perceiving resistance on the part of the teaching staff as more significant than resistance on the part of the students (more than 50 percent of academic executives and of teachers report having experienced the former, while only 28% of academic executives and 33.7% of teachers report the latter). Financial support is considered insufficient by slightly more than half of the respondents in each of the two categories, despite the positively valued engagement of the university authorities. More than half of the academic executives and teachers consider a competence-based student-centred approach to be complex, and more than 60% in both groups consider that teaching staff have not been sufficiently prepared for the new approach.

Despite the difficulties reported, 84% of the teachers reported focusing on generic or subject specific competences in the courses they teach. Of those who help their students develop at least one generic competence (65.4%), 67.1% work with the generic competences agreed upon in Tuning. That is, 45.2% of respondents used the Tuning list to select the generic competence(s) focused on in their courses, while a further 21.9% chose generic competences from either the Tuning list or other sources.

Table 3

Difficulties experienced by the university, department or degree programme while implementing the competence-based student-centred approach: perceptions of academic executives compared with perceptions of teaching staff

| Difficulty | Academic Executives: Disagree | Academic Executives: Agree | Teachers: Disagree | Teachers: Agree |
|---|---|---|---|---|
| Resistance against the competence-based student-centred approach on the part of the teaching staff | 36.7% | 63.2% | 46.8% | 53.1% |
| Resistance against the competence-based student-centred approach on the part of the students | 72.0% | 28.0% | 66.3% | 33.7% |
| Insufficient financial support | 45.6% | 54.4% | 39.7% | 60.3% |
| Insufficient preparation and training of the teaching staff for this approach | 22.3% | 67.6% | 35.0% | 65.0% |
| Insufficient leadership and engagement on the part of academic executives/authorities | 66.2% | 33.8% | 57.8% | 42.2% |
| The complexity of the competence-based student-centred approach | 42.6% | 57.4% | 49.4% | 50.6% |

Comparing these responses with those provided by students enrolled in the same degree programmes at the same universities, 10.5% of students considered that a competence-based student-centred approach had not been applied in the courses they were following or had followed, while the other 89.5% believed it had, with 16% reporting that the new approach had been adopted in all the courses they had taken so far in their programme. In 89% of the courses that followed a competence-based student-centred approach, students were informed about the competences they were expected to (further) develop in each particular course.

There is also a positive relation between teachers’ adoption of a competence-based student-centred approach and the perceived level of Tuning impact as a whole. Thus, 95.5% of the teachers who have adopted this approach believe that this was at least to some extent due to their participation in the Tuning projects, while 47.7% of this group report this cause-effect relation to be strong or very strong.

A number of questions included in the questionnaire also permit an initial outline of assessment methods, a central part of any pedagogical approach. Thus, 97.4% of the teachers who work with generic competences also pay attention to their assessment (56.8% said they assessed some of the generic competences or did so in some of their courses, while 40.6% responded that they assessed all of the generic competences addressed in all the courses they taught). The data collected indicate that 62.4% of all teachers in the sample used more than one method of assessment: 35.9% reported taking into account not only their own opinion but also students’ self-assessment, while 26.6% considered a third source — peer assessment. To add two final elements to this general picture, 97% of teachers indicated that they informed their students on the first day of the course about the assessment system and the methods used to determine their final grade (criteria, indicators, assessment techniques…); and a similarly high percentage — 94.9% — reported giving their students regular formative feedback throughout the course.

The data obtained from students do not always coincide with the responses of teachers. 17.7% of students reported that all the teachers who taught them and applied the competence-based student-centred approach assessed the competences focused on at the end of the course; 21.6% believed this happened in nearly all of the courses that followed the approach, while a further 47.5% had experienced competence-based student-centred assessment in some of the courses. On the other hand, only 25.8% of students said their teachers always took into account students’ self-assessment as an additional source of assessment information, and 29.9% said the same about peer assessment. Asked about the formative feedback received during the course, students’ responses were more conservative than those of the teachers: 12.2% indicated that they were given such feedback in all the courses, 26.7% in nearly all of the courses, and a further 54% in some of the courses.

The majority of teachers in the 21 Latin American universities appear to have consulted their students in order to verify their own estimation of student workload (see Table 4), and nearly half of the teaching staff reported that the teachers within the same degree programme coordinated their efforts. However, approval of the workload estimate at the level of the department or centre has been obtained by only 43%.

In contrast, only 10% of students reported being asked about workload by nearly all of their teachers, and only 36.9% by some of their teachers; 47.7% of the sample seem never to have been consulted on the issue by any of their teachers.

Finally, in terms of the actions taken to support the reform, more than half of the teaching staff seem to have had an opportunity to attend staff development courses (60.3%), and in most of these cases they were able to choose from more than one option (56.6%); yet less than half of the sample reported having had access to continuous guidance and support to help them implement a competence-based student-centred approach (42.2%).

Table 4

Actions aimed at defining the total students’ workload for the courses (considering the time spent by students working both in class and outside the classroom)

| Action aimed at defining students’ workload | Yes | No |
|---|---|---|
| I have compared my perception with the students’ | 67.9% | 32.1% |
| It has been approved by my department/centre | 43.0% | 57.0% |
| It has been coordinated with the rest of the teachers of the same degree programme | 44.7% | 55.3% |

A further key change is that of attitudes: recognizing the value and the advantages of a competence-based student-centred approach. Teachers and students were asked two series of questions about (1) their perception of the influence a competence-based student-centred approach has on different aspects of university education, and (2) their perception of the new approach in comparison with a content-based approach.

As can be seen in Table 5, both teachers and students seem to be convinced of the merits of the new approach: it helps students achieve better academic results and develop new abilities, values, attitudes, etc. Teaching and learning approaches are more active and diverse. Students are more motivated and engaged; they understand the new approach and they do not share the common misconception about the content loss often erroneously associated with a competence-based student-centred approach.

At the same time, teachers are conscious of the greater effort demanded from them, and both they and students recognise that students are also required to do more themselves if a competence-based student-centred approach is adopted.

Table 5

Attitudes towards the competence-based student-centred approach: how the competence-based student-centred approach is perceived to influence the teaching and learning process. Points of view of teachers and students

| With the competence-based student-centred approach… | Teachers: Disagree | Teachers: Agree | Students: Disagree | Students: Agree |
|---|---|---|---|---|
| Better academic results are achieved | 12.6% | 87.4% | 7.9% | 92.1% |
| The process has become less demanding for the students | 66.2% | 33.8% | 59.7% | 40.3% |
| Students can develop new abilities, new values, new attitudes, etc. | 8.8% | 91.2% | 6.2% | 93.8% |
| Students are more confused with the new learning system | 72.2% | 27.8% | 63.0% | 37.0% |
| Students are more interested and involved | 27.8% | 72.2% | 21.9% | 78.1% |
| A much greater effort is required from the teaching staff | 14.8% | 85.2% | Students were not asked this question | |
| A much greater effort is required from the students | 28.5% | 71.5% | 25.2% | 74.8% |
| Active methodologies and the new methods of teaching and learning have been incorporated | 11.0% | 89.0% | Students were not asked this question | |
| Conceptual content has been lost | 71.3% | 28.7% | 65.2% | 34.8% |

A further question, asking teachers and students to compare competence-based and content-based approaches, confirms the clear preference of the two groups for competence-based student-centred teaching and learning (see Table 6). The majority of respondents in both groups attributed considerably greater merit to the competence-based approach on all six parameters included in the questionnaire.

Table 6

Competence-based student-centred approach versus content-based approach from the point of view of teachers and students

| (With) the competence-based student-centred approach… | Teachers: Disagree | Teachers: Agree | Students: Disagree | Students: Agree |
|---|---|---|---|---|
| Improves the student’s personal development | 6.7% | 93.3% | 6.3% | 93.7% |
| Improves the student’s civic education | 21.5% | 78.5% | 18.8% | 81.2% |
| The graduates are better prepared and receive a more complete education | 19.0% | 81.0% | 13.8% | 86.2% |
| Students are more adequately prepared for their future professional activity | 13.1% | 86.9% | 10.7% | 89.3% |
| Is more linked to the labour market | 19.4% | 80.6% | 17.9% | 82.1% |
| Permits students to obtain a more international education | 23.6% | 76.4% | 17.8% | 82.2% |

VII. Discussion

We will first comment on the very fact that this study proved feasible, interpreting this as indirect evidence of impact. After this we will revisit the findings through the prism of the three types of Tuning impact outlined in Section II: (1) the impact on attitudes and values — a clear understanding of the importance of the shift in approach; (2) the impact in terms of creating a sustainable support system — ensuring the conditions necessary for successful implementation of a competence-based student-centred approach at the institutional level; and (3) the impact visible at the level of teaching, learning and assessment in the classroom — impact that students can perceive directly and benefit from.

The fact that this study was able to obtain such a high level of response is a first indication of long-term Tuning impact. Conducted after the completion of the projects and completely dependent on the voluntary participation of all the respondents, the study provides clear evidence of continued commitment within the institutions involved. The response rates (83.1% of institutions and 100% of countries for Stage One; 77.8% of institutions and 100% of countries for Stage Two), together with the fact that the total number of Stage Two respondents exceeded the stipulated minimum for each of the three categories, illustrate this commitment. In qualitative terms, what we have seen is a clear interest in the topic of improving higher education and a strong will to learn more about and better understand the implementation processes, and to share lessons learned and best practices, thus continuing to learn together across subject, institutional and national borders. The communities of learning created within the Tuning Latin America projects are, thus, very much alive and might be ready to explore new means of collaborative learning.

Having made this important general comment, we proceed to the first type of expected Tuning impact: that of developing new attitudes and values and acquiring a clear understanding of the importance of the change in approach to university education. Table 5 shows how highly valued a competence-based student-centred approach is in general among both teachers and students. The results students achieve through the learning process are seen as markedly better and richer — students can develop new abilities, values, attitudes, etc. — and students are more motivated (“more interested and involved”). The two groups agree, with the level of agreement above 87% for the first two of these aspects and above 72% for all three.

Table 6 reflects what Stage Two respondents think of a competence-based student-centred approach as opposed to a content-focused (or factual-knowledge-focused) one. If an understanding of the importance of change has been achieved, it is here that a clear preference for the new approach should be seen. This is, indeed, what happens. Teachers and students alike see a competence-based approach as leading to each of the listed positive results in terms of students’ learning (76.4% being the lowest agreement level and 84.3% the mean). What is even more interesting is the general picture that emerges: the competence-based student-centred approach appears to be more beneficial than a content-based one for some of the most important aspects of higher education — it helps students become better persons and better citizens of their countries and of the world, and it increases student employability, thus contributing to three of the central goals of higher education in general. This suggests that, at least in the 21 Latin American universities that participated in the second stage of the research study, the Tuning projects can be considered to have achieved the first intended impact.

The second intended impact of the Tuning Projects — as a result of a clear understanding of the value of competence-based student-centred learning — is that of institutional commitment that supports and promotes the desired changes not only during but also after the project lifecycle. In an optimal scenario academic administrators would have created the conditions necessary for implementing the new approach. They would have also ensured continuous support for teachers throughout the process of change. This is the impact teachers would feel directly, while students could only perceive it indirectly — that is, whether teachers do change their approach thanks to the indispensable conditions being in place.

Not all of the conditions necessary for the reform to take place depend on the university authorities, but the figures reported in Section VI.3 show that in the case of the 21 institutions the external factors (a favourable policy and sufficient university autonomy) were also present. Yet academic administrators have played their role as well. To begin with, dissemination of Tuning ideas and agreements (e.g. lists of competences) was carried out successfully in all of the institutions. Second, guidelines and other supporting documents have been created at more than 86% of the institutions and, once again, disseminated with a high level of success. Third, staff development activities were designed and implemented, even if possibly not to the optimal level. Fourth, having concrete persons in charge of monitoring the change process seems to be a common practice across these universities. Fifth, while financial support could have been stronger, there is a clear perception that the leadership has been adequate and that academic authorities have been engaged in stimulating and supporting the change (see Table 3). The agreement between the two groups of respondents (teachers and academic executives) on this question should be highlighted once again. Finally, we would observe that academic authorities appear to be realistic and critical, yet committed to promoting the change. They are fully aware of the resistance of some teachers, and they are also more aware than teachers themselves of the fact that teachers need more staff development and preparation. Even if more could be done in terms of staff development, for example, leadership is not perceived as lacking in these 21 institutions; rather, it could be argued that the university authorities are leading and supporting the change.

The overall situation in these institutions is very positive and the university authorities and staff are probably well aware of what further action needs to be taken. Given their overall commitment, this action might be a simple question of time (and possibly a slightly less simple question of funding).

Finally, we should look at the level of impact on teaching, learning and assessment, perhaps the most evidently expected type of impact, yet possibly the least easy to achieve. A vast majority (91.7%) of the institutions surveyed in Stage One reported noticeable changes in this domain in at least one aspect and at the level of at least one particular subject area. More specifically, the majority of institutions reported having introduced a competence-based student-centred approach when revising or creating curricula, study programmes or plans, and having changed teaching, learning and assessment methods appropriately (75.9% in each case). Both of these aspects can be modified within a subject area and might not require a centralized effort (although a number of universities said such changes had been introduced at the level of the whole institution), unlike, for example, the introduction of a system of credits based on the students’ workload.

Stage Two data demonstrate that in the 21 sample institutions a competence-based student-centred approach is seen to work, if not in all, then at least in a considerable number of courses within the degree programmes chosen as examples. This claim by teachers is largely corroborated by students. Of particular interest here is the fact that 67.1% of teachers who focus on generic competences work with those agreed upon in Tuning. The Tuning competence lists must, therefore, be familiar to at least 67.1% of teachers in these institutions, a figure which greatly exceeds the number of teachers who were directly involved in the Tuning Latin America projects during the project lifecycle.

Two central issues indicative of the level of implementation of a competence-based student-centred approach, but which often take longer to embed, are the assessment of competences and the calculation of student workload. Indeed, more discrepancies are noticeable between teachers’ and students’ answers with respect to these two aspects, and thus the degree of implementation might be lower than for other aspects. In other words, these areas require further work. However, the figures obtained suggest that all of these universities are certainly on the way to achieving both goals, and their progress can be plausibly attributed to their participation in the Tuning Latin America projects.

VIII. Conclusion

To sum up, this study demonstrates that Tuning has had a documented impact in at least 21 higher education institutions in 11 Latin American countries and 15 subject areas. All three types of intended impact appear to be supported by the evidence gathered in this study. Classroom practices have been modified to introduce a competence-based student-centred approach. It is also clear that strong positive attitudes towards this approach have been developed by different stakeholders. Both teachers and students (the main intended beneficiaries of the new approach within institutions of higher education) seem to highly appreciate the benefits of competence-based teaching, learning and assessment, despite being fully aware of the complexity of the changes.

It is important, however, to recall two aspects of the research reported in this article: the first two stages were quantitative in nature, and while Stage One was exploratory, Stage Two focused on cases of the most successful implementation. This suggests two further steps that might need to be undertaken in order to understand how a competence-based student-centred approach can be implemented and what successful implementation depends on. Firstly, a qualitative study of some of the 21 institutions identified in Stage Two could be conducted. The main question here would be how the implementation process developed in these universities: how they moved from participating in Tuning projects and initiating debate about the need for change to actually implementing the new approach, changing the mindsets of the different stakeholders, overcoming the barriers, etc. What best practices or solutions did the advocates of the competence-based student-centred approach in these institutions identify that permitted them to achieve this goal? This qualitative third stage would permit the examination of the nature of contexts favourable to the implementation of a radical change in approach, and perhaps identify elements or patterns within institutional cultures either necessary for or associated with successful implementation. Those factors which are context-independent, if any, would be of special relevance for helping other universities achieve similar goals.[37]

Secondly, it is equally important to find out why other universities have been less successful or unsuccessful in implementing the results of the Tuning projects. What limited their ability to redesign their curricula? What prevented these institutions from achieving the same results? Can any of these decisive factors be controlled by any of the three groups (academic executives, teachers and students)? Or could a further international project targeting the aspects critical for successful implementation (teaching and learning activities, assessment methods, etc.) help resolve those difficulties which these universities have not managed to overcome? A further research study might be necessary in this case.

To conclude, a first step has been made towards exploring the impact Tuning has had in one of the world regions where Tuning projects have been completed. The results obtained in Latin America are encouraging in terms of establishing a relation between participation in a Tuning project and advancing towards the full implementation of a competence-based student-centred approach, with all that this entails. Further studies must, however, be conducted to answer the new questions which have arisen, to explore cases of successful implementation in more detail, and to draw a comparative picture with other regions of the world where higher education academics have participated in Tuning projects.

Bibliography

“What is impact evaluation?” Last modified 2011. http://web.worldbank.org/WBSITE/EXTERNAL/TOPICS/EXTPOVERTY/EXTISPMA/0,,menuPK:384339~pagePK:162100~piPK:159310~theSitePK:384329,00.html

Altbach, Philip G.; Reisberg, Liz, and Rumbley, Laura E. Trends in Global Higher Education: Tracking an Academic Revolution. A Report Prepared for the UNESCO 2009 World Conference on Higher Education. Paris: UNESCO, 2009.

Austrian Development Agency. Guidelines for Project and Programme Evaluations. Vienna: Austrian Development Agency, 2009.

Beneitone, Pablo; Esquetini, César; González, Julia; Marty Maletá, Maida; Siufi, Gabriela, and Wagenaar, Robert, eds. Reflection on and Outlook for Higher Education in Latin America. Bilbao: University of Deusto – University of Groningen, 2007.

Beneitone, Pablo; González, Julia and Wagenaar, Robert, eds. Meta-profiles and profiles: a new approach to qualifications in Latin America. Bilbao: University of Deusto, 2014.

Garbarino, Sabine, and Holland, Jeremy. Quantitative and Qualitative Methods in Impact Evaluation and Measuring Results. 2009. http://www.gsdrc.org/docs/open/EIRS4.pdf

Newcomer, Kathryn E.; Hatry, Harry P., and Wholey, Joseph S. “Planning and Designing Useful Evaluations.” In Handbook of Practical Program Evaluation, edited by Kathryn E. Newcomer, Harry P. Hatry, and Joseph S. Wholey, 5-29. San Francisco: Jossey-Bass, 2010.

OECD. Glossary of Key Terms in Evaluation and Results Based Management. Paris: OECD Publications, 2010.

———. Outline of Principles of Impact Evaluation. n.d. http://www.oecd.org/dac/evaluation/dcdndep/37671602.pdf

Stern, Elliot. Impact Evaluation: A Guide for Commissioners and Managers. London: Bond for International Development, 2015.

Stern, Elliot; Stame, Nicoletta; Mayne, John; Forss, Kim; Davies, Rick, and Befani, Barbara. Broadening the Range of Designs and Methods for Impact Evaluations. 2012.

Wheebox, People Strong and the Confederation of Indian Industry. The India Skills Report. 2014. https://wheebox.com/wheebox/resources/IndiaSkillsReport.pdf

About the Authors

PABLO BENEITONE (pablo.beneitone@deusto.es) is Director of the Tuning Academy at the University of Deusto (Spain). For most of his professional and academic career, which began in 1994, he has been responsible for managing international higher education projects at university and national level. At the University of Deusto he was Project Manager of Tuning Latin America and Tuning Africa, and was involved in other regional programmes (in Russia, China, and Europe) supported by the European Commission. He has published extensively on the ‘Tuning Methodology’ and given Tuning-related conference presentations in more than 25 countries. Mr. Beneitone holds a Bachelor’s degree in International Relations and a Master’s in International Cooperation from Universidad del Salvador (Argentina). His doctoral research focuses on the internationalisation of the curriculum. To access the full CV, please copy and paste this URL into your browser: http://www.tuningjournal.org/cv/Pablo_Beneitone.pdf.

MARIA YAROSH (mariayarosh@deusto.es) works in the Research and Staff Development lines of the Tuning Academy at the University of Deusto (Bilbao, Spain). Parallel to this, she participates in international education and research projects. Her collaboration with Tuning started with the preparation of the Tuning Russia project (2009). Maria Yarosh holds a Specialist degree in Linguistics and Translation from the Institute of Foreign Languages (St. Petersburg, Russia), an MA degree in Lifelong Learning: Policy and Management from the Institute of Education, University of London (UK) and a PhD in Education from the University of Deusto (Spain).

[*] Pablo Beneitone (pablo.beneitone@deusto.es) is Director of Tuning Academy at the University of Deusto (Spain). Maria Yarosh (mariayarosh@deusto.es) works at the Tuning Academy at the University of Deusto (Spain).

[1] Wheebox, People Strong and the Confederation of Indian Industry, The India Skills Report (2014), https://wheebox.com/wheebox/resources/IndiaSkillsReport.pdf

[2] Philip G. Altbach, Liz Reisberg, and Laura E. Rumbley, Trends in Global Higher Education: Tracking an Academic Revolution, A Report Prepared for the UNESCO 2009 World Conference on Higher Education (Paris: UNESCO, 2009).

[3] Pablo Beneitone et al., eds. Reflection on and Outlook for Higher Education in Latin America. (Bilbao: University of Deusto – University of Groningen, 2007), 17.

[4] Pablo Beneitone, Julia González and Robert Wagenaar, eds. Meta-profiles and profiles: a new approach to qualifications in Latin America (Bilbao: University of Deusto, 2014), 11-12.

[5] Elliot Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations (2012), 6.

[6] Austrian Development Agency, Guidelines for Project and Programme Evaluations (Vienna: Austrian Development Agency, 2009), 2.

[8] OECD, Outline of Principles of Impact Evaluation (n.d.), 1, http://www.oecd.org/dac/evaluation/dcdndep/37671602.pdf

[9] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, 5.

[10] Elliot Stern, Impact Evaluation: A Guide for Commissioners and Managers (London: Bond for International Development, 2015), 7.

[11] “What is impact evaluation?”

[12] OECD, Outline of Principles of Impact Evaluation, 1.

[13] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, i.

[14] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, ii and 4.

[15] Kathryn E. Newcomer, Harry P. Hatry, and Joseph S. Wholey, “Planning and Designing Useful Evaluations,” in Handbook of Practical Program Evaluation, ed. Kathryn E. Newcomer, Harry P. Hatry, and Joseph S. Wholey (San Francisco: Jossey-Bass, 2010), 15.

[16] Newcomer, Hatry, and Wholey, “Planning and Designing Useful Evaluations,” 16.

[17] OECD, Outline of Principles of Impact Evaluation, 1.

[18] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, 8.

[19] OECD, Outline of Principles of Impact Evaluation, 2.

[20] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, i.

[21] Stern, Impact Evaluation: A Guide for Commissioners and Managers, 10.

[22] Austrian Development Agency, Guidelines for Project and Programme Evaluations, 3.

[23] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, 5.

[24] Austrian Development Agency, Guidelines for Project and Programme Evaluations, 10.

[25] OECD, Outline of Principles of Impact Evaluation, 2.

[26] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, i.

[27] Stern, Impact Evaluation: A Guide for Commissioners and Managers, 10.

[28] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, 14.

[29] Stern, Impact Evaluation: A Guide for Commissioners and Managers, 14.

[30] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, 27.

[31] See Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, 21.

[32] OECD, Outline of Principles of Impact Evaluation, 2.

[33] “What is impact evaluation?”

[34] Sabine Garbarino and Jeremy Holland, Quantitative and Qualitative Methods in Impact Evaluation and Measuring Results (2009), 3, http://www.gsdrc.org/docs/open/EIRS4.pdf

[35] Stern et al., Broadening the Range of Designs and Methods for Impact Evaluations, 36.

[36] Incomplete sets of responses were not included in the analyses presented below.

[37] Any generalisations or attempts at transferring enabling elements from one context to another must, of course, be made with great caution.


Copyright

Copyright for this article is retained by the Publisher. It is an Open Access material that is free for download, distribution, and/or reuse in any medium only for non-commercial purposes, provided that any applicable legislation is respected, the original work is properly cited, and any changes to the original are clearly indicated.