Challenges in Sustainability | 2017 | Volume 5 | Issue 1 | Pages 52–61
DOI: 10.12924/cis2017.05010052
ISSN: 2297-6477
Research Article
Fostering the Next Generation of Sustainability Professionals—
Assessing Field Courses in a Sustainability Science Graduate
Program
Ricardo San Carlos*, Yuki Yoshida and Shogo Kudo
Graduate Program in Sustainability Science—Global Leadership Initiative, Graduate School of Frontier Sciences,
The University of Tokyo, Tokyo, Japan. E-Mail: yukiyoshida@sustainability.k.u-tokyo.ac.jp (YY);
kudo@sustainability.k.u-tokyo.ac.jp (SK)
* Corresponding author: E-Mail: sancarlos.ricardo@sustainability.k.u-tokyo.ac.jp; Tel.: +81 471364877;
Fax: +81 471364878
Submitted: 28 February 2016 | In revised form: 3 July 2016 | Accepted: 2 August 2016 |
Published: 27 March 2017
Abstract:
A growing number of educational programs in sustainability science have paralleled the rise of the
field itself. The educational approach of these programs follows the field's problem-driven, interdisciplinary,
and transdisciplinary nature. However, its effectiveness has yet to be systematically evaluated.
Similarly, while ad-hoc evaluation schemes have attempted to monitor the quality of the educational
programs, there is no standard method that accounts for the particularities of sustainability science programs.
This study thus addresses the need for an assessment of the problem-driven approach of educational
programs in sustainability science. We have conducted student self-assessments of field courses in the
Graduate Program in Sustainability Science (GPSS-GLI) at The University of Tokyo, which positions its field
courses at the center of its curriculum. The self-assessments were based on five key competencies identified
as particularly important for sustainability professionals. Workshops and questionnaires engaged students
in a reflection of the six field courses and of their own personal development through the activities offered.
Our questionnaire results indicate that the majority of participants were satisfied with how the courses
furthered their personal development. While some participants expressed frustration at being unable to
sufficiently address the respective field’s sustainability challenges due to time constraints, students generally
recognized the five key competencies as important for addressing sustainability issues after participating in
the courses. Moreover, participants attributed much of their learning to their active engagement in planned
field research activities, rather than to passive learning. Variations in results across different course units
provide material for further analysis and development of the curriculum. This study is an initial attempt at
assessment, with room for ongoing improvement and further research to address additional requirements
for fostering the next generation of sustainability professionals.
Keywords:
curriculum development; fieldwork evaluation; higher education; competencies; sustainability
professional; sustainability science
© 2017 by the authors; licensee Librello, Switzerland. This open access article was published
under a Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/).
1. Introduction
Sustainability science has been promoted actively both in
research and education as a vibrant response to emerging
sustainability challenges such as climate change, environ-
mental degradation, food security, energy provision, and
inequality. The main disciplinary foci of sustainability science are to understand the complex interactions between
natural and social systems [1–5], and to create knowledge for sustainable development [6–9]. Since challenges in
sustainability generally require action to alter the status quo, the discipline's approach is problem-based and
solution-oriented [10–13]. Moreover, interconnected problems [14] require researchers to go beyond their discipline of training.
To reflect findings from research into actual practice, it is also
necessary to cross the potential divide between academics
and practitioners. Accordingly, sustainability science com-
bines an interdisciplinary approach that employs academic
knowledge from natural and social sciences to humanities,
with a transdisciplinary approach that promotes co-design
and co-creation of knowledge by diverse social stakeholders
to address real-world sustainability challenges [15–18].
While the research dimension of sustainability science
has formed its own space and landscape within academia
[10,13,19,20], sustainability-related educational programs
have also been promoted. According to a list from the journal
Sustainability: Science, Practice, & Policy (SSPP) [21],
there were more than 230 sustainability programs at the
university level as of January 2016 [22]. Sustainability science
education plays a key role in producing human resources
with the literacy, knowledge, and skills required to actualize
the recommendations of sustainability science research.
Program curriculum and implementation must therefore re-
flect the interdisciplinary and transdisciplinary aspects of
sustainability science. Students should be encouraged and
trained to develop an interdisciplinary and transdisciplinary
mindset, as the problems they address define the types
of knowledge and methods necessary to propose possible
solutions. Given the field’s problem-driven and solution-
oriented approach, it is also critical to have linkages not
only between research and stakeholders such as industry,
government, and NGOs, but also between research and
education for the continuous development of sustainabil-
ity professionals. More collaborative and critical research
approaches are necessary to guide social transformation
towards sustainability [23].
Theoretical and applied literatures address the design
of educational programs. The interdisciplinary approach of
sustainability science brings together academic disciplines
with diverging worldviews, and this in turn creates epistemo-
logical discussions. Such inter-paradigmatic collaboration
and negotiation is considered a key characteristic of the field.
In line with the epistemological discussion, the idea of ‘trans-
epistemology’ was introduced [24] to better describe
the dynamic integration of different methods in sustainability
research. According to Schweizer-Ries and Perkins [24],
trans-epistemology is the “cooperation between different
personal knowledge systems”, and society as a whole is the
“‘producer’ of shared and socially constructed understanding
of the world” [24]. The idea of sustainability is fundamentally
normative and carries specific cultural values. It may also
differ from person to person, so that sustainability science
researchers must be able to imagine the diversity of views
on a given topic and comprehend interlinkages between
the viewpoints of different stakeholders. Accordingly, an
educational program in sustainability must have the flexibil-
ity to accommodate awareness and tolerance of multiple
epistemological views [25].
Regarding the design and operation of an educational
program in sustainability science, Onuki and Mino
introduced three key components: (i) knowledge and
concept-oriented courses, (ii) experiential learning and
skills-oriented courses, and (iii) transdisciplinary thesis
research [26]. Mino and his colleagues later added the
transboundary framework to emphasize the full range of scales,
from the individual to the global, in order to examine subjects
and problems from multiple angles [27]. Tamura and
Uegaki operate a sustainability science program at Ibaraki
University, and raise another core concept for designing a
sustainability science program, the “Mind-Skills-Knowledge”
model of sustainability education [28]. In an analogy of
sustainability science students with athletes who must
maintain their body, technique, and spirit, the framework
stresses a balance of different types of knowledge. Others
have suggested declarative, procedural, effectiveness, and
social knowledges, as well as their effective interaction [25].
In terms of the evaluation of sustainability science
education programs, one challenge is to develop a method
for investigating whether students are effectively acquiring
the competencies required to actualize their knowledge
as concrete actions for sustainability [28]. The work of
Wiek and his colleagues provides a comprehensive discussion
of five key competencies within a problem-solving
framework [29]. While the proposed key competencies have
been applied to assess students' learning outcomes in a
transdisciplinary course [30], a general need for research
on pedagogy and evaluation in sustainability science
programs remains.
To address this gap in the evaluation of sustainability
science programs, this study aims to examine the problem-
driven approach of sustainability science through student
self-assessments of six field courses at the Graduate Pro-
gram in Sustainability Science (GPSS-GLI), The University
of Tokyo. Field courses in GPSS-GLI are designed for stu-
dents to engage in collaborative research and to address
real-world sustainability challenges in various topical cases.
So far, field courses have covered countries in Africa (Kenya,
Nigeria, and South Africa), Asia (China, Japan, Thailand),
Europe (Denmark and Sweden) and Latin America (Costa
Rica), and topics such as renewable energy, biodiversity
conservation, and urban informal settlement [31].
These courses also aim to equip students with practical
skills such as workshop facilitation, coordination with local re-
source persons, and field methodologies that can be applied
to their thesis research. The general structure of each field
course is designed by one or two faculty members who
specialize in the given topic. One unit normally accompanies a
cohort of four to ten students, and one doctoral student takes
a leading role in the planning. Six field courses implemented
during the academic year 2014–2015 are evaluated in this
study (see Table 1 for an overview of the units). Four of these
are Global Field Exercises and two are Resilience Exercises,
but all have equal weight and significance in the curriculum
and are handled as identical in this analysis.
2. Methods
The assessment began with the development of a concep-
tual framework and methodology, implemented systemat-
ically in a subsequent phase. The first phase took place
in the Tohoku Resilience Exercise, one of six workshops
assessed in this study. While the exercise itself had an
educational focus of having students grapple with the com-
plexity of issues surrounding the regional reconstruction
after the Tohoku Earthquake and Tsunami of March 2011,
students simultaneously engaged in a reflective analysis
that laid the foundation of this assessment effort [32].
This developmental phase began with a review and analysis
of the conceptual framework of key competencies for
sustainability science professionals [29] that had been used
in a previous assessment of the said program [33]. The
chosen framework was deemed appropriate for this assessment
as a focus on real-world problems is characteristic of
GPSS-GLI, and the five key competencies were identified
for their relevance to sustainability science research and
problem solving [29,32]. Collectively, students reviewed this
pre-existing framework and adapted the original definitions
for use within an educational context [32]. The group then
analyzed the linkages between each of the competencies
and the activities and issues within the Resilience Exercise.
In order to hold pointed discussions about how different
components of the field course-related activities contributed
to participants' personal development, there was a need to
distinguish between active and passive learning, as well as
recognition of a competence as important in the professions
of sustainability science. As discussed by San Carlos and
colleagues [33], a review of academic literature revealed
a lack of consensus and clarity on the definitions of
active and passive learning [34,35]. For practical purposes,
our understanding is that active learning involves active
engagement of the students with the planned field exercise
course activities. In other words, active learning is learning
by doing, such as designing and conducting original field
surveys and interacting firsthand with stakeholders in the
research topic. In contrast, passive learning is the
unidirectional transmission of information through lectures
and other methodologies that do not require active student
engagement [35].
Table 1. Description of field course units and assessment participation rates.

Minamata Unit (Resilience Exercise). Main location: Minamata, Japan. Duration of field exercise: 6 days. Workshop participants / GPSS-GLI students in unit: 8/8 (100% workshop participation). Focus/Objective: Educational / understanding issues regarding the Minamata Disease. Primary field activities: lectures, field visits, interviews. Major characteristics: organized by faculty.

Tohoku Unit (Resilience Exercise). Main location: Otsuchi, Japan. Duration: 7 days. Participation: 5/5 (100%). Focus/Objective: Educational / current situation of Tsunami-affected area and applying the concept of resilience. Primary field activities: interviews, field visits. Major characteristics: student-led, first-time unit.

Oasis Unit (Global Field Exercise). Main location: Zhangye City, China. Duration: 14 days. Participation: 7/8 (88%). Focus/Objective: Research / sustainable water management in a semiarid region of China. Primary field activities: lectures, field research, group work. Major characteristics: organized with participation of local students.

Costa Rica Unit (Global Field Exercise). Main location: Guanacaste, Costa Rica. Duration: 7 days. Participation: 8/10 (80%). Focus/Objective: Research / additionality of payments for ecosystem services for agroforestry. Primary field activities: lectures, field visits, group work. Major characteristics: organized by faculty; output of educational material.

Bangkok Unit (Global Field Exercise). Main location: Bangkok, Thailand. Duration: 13 days. Participation: 5/5 (100%). Focus/Objective: Educational / urban health issues (focus differed by group). Primary field activities: field visits, interviews, survey. Major characteristics: student-led; participants from multiple institutions.

Nairobi Unit (Global Field Exercise). Main location: Nairobi, Kenya. Duration: 14 days. Participation: 3/3 (100%). Focus/Objective: Educational / sustainability challenges and research methods in urban Africa. Primary field activities: lectures, field visits, group work. Major characteristics: organized by faculty; participants from multiple institutions.
At the end of the week-long Exercise course with daily
reflective discussions, each student's personal experiences
were quantified for analysis using a questionnaire based
on the concepts discussed. This questionnaire was
used throughout the subsequent assessment. The
questionnaire assumed that the respondent would have
received some explanation of the competencies prior to
assessment, but listed definitions as shown in Table 2.
Students were asked to rate the unit's effectiveness in
facilitating personal development of the respective
competence beyond their baseline level. The assessment of
each competence was threefold: for passive learning, for
active learning, and for “recogni[tion]/agree[ment] about
the importance of the competence for research on
sustainability issues” (hereafter: “Recognition”). Responses
were indicated on a 5-point Likert scale (1: very
ineffective (no influence); 2: ineffective; 3: satisfactory;
4: effective; 5: very effective). Open space was provided
at the end of the questionnaire with prompts encouraging
comments on respondents' personal experiences or
feedback on the assessment itself.
All subsequent assessments were conducted after the
completion of the field courses according to the following
procedure. The authors contacted student participants of
the respective course unit using e-mail and/or social media
to schedule a course workshop. This correspondence in-
volved all GPSS-GLI students who had participated in the
course, with one exception where the student had already
graduated and left the country.
Workshops were facilitated by at least one of the au-
thors. A brief introduction of the assessment project was
followed by inquiry about the unit’s educational and/or
research objectives. Using a whiteboard or projected
computer screen, students were then asked to list the
unit's activities. Next, the competence framework was
introduced using the definitions in Table 2, and students
were asked to identify linkages between the competencies
and the listed activities. At the end, the questionnaire was
handed out either electronically or on paper for students
to complete individually. The total duration of the work-
shops averaged about 90 minutes, and all workshops were
conducted between September and November of 2015.
Table 2. Original and applied definitions of Key Competencies in Sustainability (adapted from San Carlos et al. [32]).

Systems-thinking competence.
Original definition [29]: Ability to collectively analyze complex systems across different domains (society, environment, economy, etc.) and across different scales (local to global).
Our operationalization: Competency to organize and understand the complex constellation of sustainability issues.

Anticipatory competence.
Original definition [29]: An ability to collectively analyze, evaluate, and craft rich pictures of the future related to sustainability issues and sustainability problem-solving frameworks.
Our operationalization: Competency to visualize future scenarios, including non-intervention and alternative sustainability visions.

Normative competence.
Original definition [29]: An ability to collectively map, specify, apply, reconcile, and negotiate sustainability values, principles, goals, and targets.
Our operationalization: Competency to understand the range of different values that could lead to different sustainability visions.

Strategic competence.
Original definition [29]: Ability to collectively design and implement interventions, transitions, and transformative governance strategies toward sustainability.
Our operationalization: Competency to design and implement strategies to achieve a particular sustainability vision.

Interpersonal competence.
Original definition [29]: An ability to motivate, enable, and facilitate collaborative and participatory sustainability research and problem solving.
Our operationalization: Competency to communicate, coordinate, negotiate, or lead.
Subsequent consultations with faculty and affiliated staff
members supplemented the above process as a means to
consider the appropriateness of the completed assessment. To
date, this process has consisted of an e-mail with a semi-
structured questionnaire to faculty and staff members associ-
ated with each field course. The e-mail included a summary
of the student assessments for the respective unit, as well
as cross-unit average scores. Another document outlined
the intent of the assessment and prompted for responses as
follows: 1) explanations and interpretations of the results; 2)
reflections on the exercise design; 3) comments and feedback
on the assessment itself. As some unit-specific comments
would be traceable to individual faculty members, the docu-
ment asked faculty members to indicate their willingness to
have their comments attributable to the unit in question.
3. Results
3.1. Student Workshop and Faculty Participation Rate
Field course units are referred to by location: Minamata
(Japan), Tohoku (Japan), Oasis (China), Costa Rica,
Bangkok (Thailand), and Nairobi (Kenya). As shown in
Table 1, the assessment had a high rate of participation, with
full participation for four of six units. E-mail inquiries with
faculty and staff members were followed up with reminders
and reached a response rate of 86% (n = 7). As only one
staff member was contacted, faculty and staff will hereafter
be referred to as “respondents” to ensure confidentiality.
3.2. Student Workshop and Questionnaire Results
Figure 3 shows the questionnaire results. Columns (A) to (F)
show results in each course unit by competence. Rows (1) to
(5) show the results for each competence by course unit. Mean
scores and standard deviations (SD) for each competence are
shown by type of learning. The last column and row represent
the aggregate means by competence and unit, respectively.
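A minimal sketch of how these aggregate figures relate to the per-unit figures: the overall GFE/RE mean in each row of Figure 3 can be reproduced as the unweighted average of the six per-unit means. This unweighted-average rule is our inference from the reported numbers, not stated in the text; the values below are the Systems Thinking Competency means from Figure 3.

```python
# Per-unit mean scores for Systems Thinking Competency (Figure 3),
# in unit order (A) Minamata, (B) Tohoku, (C) Oasis, (D) Costa Rica,
# (E) Bangkok, (F) Nairobi.
systems_thinking = {
    "Passive Learning": [3.75, 3.88, 4.20, 3.40, 4.00, 3.71],
    "Active Learning":  [4.00, 3.38, 4.60, 4.20, 3.67, 3.86],
    "Recognition":      [3.88, 3.38, 4.40, 3.80, 3.67, 4.00],
}

# Reproduce the "overall GFE/RE" column as the unweighted mean of
# the six unit means for each type of learning.
for learning, unit_means in systems_thinking.items():
    overall = sum(unit_means) / len(unit_means)
    print(f"{learning}: overall M = {overall:.2f}")
```

Up to rounding, this reproduces the reported overall means of 3.82 (Passive), 3.95 (Active), and 3.86 (Recognition). Note that an unweighted average gives each unit equal weight regardless of cohort size.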
3.3. Results by Competence and Type of Learning
Figure 1 shows the mean scores for the five competencies by
type of learning. Results indicate overall student satisfaction
with the field courses, as all five competencies obtained scores
higher than 3.0 (“satisfactory”) for all types of learning. The
highest scoring competence was Interpersonal Competence
(M = 4.17). The lowest scoring competence was Strategic
Competence (M = 3.38).
High scores on Recognition indicate that students generally
agreed with the literature on the relevance of the key
competencies for sustainability science research [29].
Recognition scored higher than the other types of learning on four
of five competencies (Anticipatory Competence (M = 3.89);
Normative Competence (M = 3.89); Strategic Competence
(M = 3.79); Interpersonal Competence (M = 4.27)).
Regarding Systems Thinking Competence, Recognition (M = 3.86)
scored only marginally below the overall mean score of 3.88.
Figure 1. Aggregate scores by competence and type of learning.
Active Learning was evaluated more highly than Passive
Learning for all competencies. This is intuitive, as the field
courses are based on the concept of providing opportunities
for active engagement in the field [36]. The difference was
greatest for Interpersonal Competence, where the aggregate
mean for Active Learning (M = 4.33) was 0.43 points
greater than for Passive Learning (M = 3.90). In contrast,
the gap between Passive (M = 3.17) and Active (M = 3.18)
Learning was only 0.01 points for Strategic Competence.
Other notable results are the high scores on Interpersonal
Competence (M = 4.17) and low scores on Strategic
Competence (M = 3.38). Effectiveness in the development
of Interpersonal Competence may be explained by
GPSS-GLI students' diversity in cultural, academic, and
professional backgrounds as well as demographics [36].
Lower evaluations on Strategic Competence may be due
to students' expectation and desire to have a tangible
impact on the study area, despite time and resource
constraints that limit such impact in reality. Student and
faculty respondents alike commented that courses focused
on understanding past and current situations rather than
on speculating about the future. This is understandable given
the one- to two-week duration of the courses and consistent
with the interpretation regarding the courses' lack of capacity
to have a tangible impact.
3.4. Results by Field Course and Competence
Figure 2 shows competence and mean scores for the six
field courses assessed in this study. Mean scores by course
unit were also higher than satisfactory (3.0). The highest
scoring course unit was the Bangkok Unit (M = 3.85). The
Tohoku Unit (M = 3.46) received the lowest scorings and had
high inter-student variation in each competence, a result
likely attributable to the extended and critical discussions
unique to this unit [32].
Results showed varying tendencies across units on stu-
dent assessments’ scores and standard deviations (Figure
3). Minamata Unit obtained high scores and low standard
deviations for all competencies (see column (A)). Tohoku
Unit yielded the lowest mean score (M = 3.46), with similar
results excepting Strategic Competence (M = 2.75), which
scored below “satisfactory”. However, standard deviations
within each competence were high for all competencies and
almost all types of learning (see column (B)). One respon-
dent took particular note of the contrast between Minamata
and Tohoku Units, as “both are designed as ‘experience-
oriented’ [and with] similar concepts”.
The Oasis and Costa Rica Units were similar in their
focus on research. However, evaluations by Oasis Unit stu-
dents had large variation (e.g. standard deviations above
1.0 for Strategic Competence (Passive (SD = 1.30); Ac-
tive (SD = 1.22); Recognition (SD = 1.58))). Students in
both units were consistent in their high evaluations of the
course’s impact on their Interpersonal Competence (Oasis
(M = 4.87); Costa Rica (M = 4.20); see Figure 3, columns
(C) and (D)). In particular, the Oasis student evaluation for
Interpersonal Competence was the highest of all units (aggregate
M = 4.87). These outcomes may be attributed to the
emphasis on student leadership noted by the faculty respondents
affiliated with the two units. One stated that this emphasis
might have been interpreted as a lack of strategic vision in
the design of the unit, explaining the lower evaluation on
Strategic Competence.
General scoring patterns in the Bangkok and Nairobi
Units are comparable. However, responses on the former
unit had greater internal consistency (see Figure 3, columns
(E) and (F), and Figure 2). The Nairobi Unit yielded relatively high
variation amongst students for Passive and Active Learning.
An affiliated respondent observed that these relatively high
variations might indicate that “the exercise led to variable
experiences for different students”. Another respondent
affiliated with the Bangkok Unit expressed surprise at the
lower scores on Recognition. Regarding Systems Thinking
Competence, this respondent suggested that more atten-
tion should be given to a “holistic view about the complex
systems (economic, social etc.) relating to the environmen-
tal and health issues” addressed in the unit.
4. Discussion
This self-assessment of field courses in GPSS-GLI
addresses both the needs of the said program and the
academic need to assess the development of competencies
necessary for sustainability professionals [29,37]. Building
upon the foundation of a previous assessment of GPSS-GLI
curricular activities conducted six years ago [33], the
present study provides a more detailed and in-depth
assessment of field courses, a core activity in the program.
4.1. Contributions to GPSS-GLI
The self-assessment method was generated in the previous
exploration of GPSS-GLI student perspectives on curricular
activities and the development of their competencies [33].
Student participation in the assessment and development
of GPSS-GLI is consistent with the program's emphasis on
developing student leadership skills [36] and educational
practices in which students can participate [38].
Responses on the validity of the assessment are mixed,
yet overall positive. Most faculty members considered the
competence framework to be an appropriate assessment
framework for GPSS-GLI. Five out of the six faculty mem-
bers consulted considered the results insightful to varying
degrees. Comments included, “the results seem accurate,”
and “results are convincing”.
Nonetheless, some were skeptical of the framework
and/or fundamental approach of this assessment. One re-
spondent considered the competence framework unfit for
this assessment, another expressed that their understand-
ing of the framework was insufficient to use it, and a third
considered it necessary to differentiate between the two
types of field courses (Global Field Exercise and Resilience
Exercise) offered in the program. A further respondent
cautioned that “the overall assessment has to be looked at
with a question mark”.
4.2. Methodological Limitations
Indeed, limitations of this ongoing assessment must be
carefully considered. First, students may have difficulties re-
lating their field experiences to the development of their per-
sonal competencies. Moreover, the time between the field
course and assessment varied from unit to unit.
Notwithstanding the high rates of participation in the workshops
(Table 1), the validity of our results must be interpreted in
light of the low number of participants per unit. Second,
results depend on students’ comprehension of the compe-
tencies, and the relatively short workshops may have been
insufficient to ensure adequate comprehension. One faculty
member raised this issue and recommended incorporating
an explanation of the competencies in the guidance before
each field course.
Third, scores only reflect additional improvement of indi-
vidual competencies that students considered attributable
to the field courses. Accordingly, results are subject to
variations in baseline levels. Individual experiences before,
during and after the units play a great role in student as-
sessment, and a respondent questioned “if [students] could
really assess what outcome/impact they experienced for
each key competence and by how much”.
Figure 2. Aggregate scores by course and competence.
Figure 3. Questionnaire results: summary of 2015 GFE/RE assessment results by unit. Cells give the mean score (standard deviation in parentheses) for each competency and type of learning; the final value in each row is the overall GFE/RE mean. Units: (A) Minamata, (B) Tohoku, (C) Oasis, (D) Costa Rica, (E) Bangkok, (F) Nairobi. “Recognition” abbreviates “recognizing the importance of the competency for solving sustainability issues”. Scale: 1 = very ineffective; 2 = ineffective; 3 = satisfactory; 4 = effective; 5 = very effective.

(1) Systems Thinking Competency (overall M = 3.88)
Passive Learning: (A) 3.75 (0.89); (B) 3.88 (1.13); (C) 4.20 (0.84); (D) 3.40 (1.14); (E) 4.00 (0.82); (F) 3.71 (0.95); overall 3.82
Active Learning: (A) 4.00 (1.07); (B) 3.38 (1.30); (C) 4.60 (0.55); (D) 4.20 (0.84); (E) 3.67 (1.25); (F) 3.86 (0.90); overall 3.95
Recognition: (A) 3.88 (0.64); (B) 3.38 (1.41); (C) 4.40 (0.89); (D) 3.80 (1.10); (E) 3.67 (0.47); (F) 4.00 (0.58); overall 3.86

(2) Anticipatory Competency (overall M = 3.56)
Passive Learning: (A) 3.25 (1.04); (B) 3.63 (1.19); (C) 3.40 (0.55); (D) 3.20 (0.84); (E) 3.67 (0.47); (F) 3.00 (1.15); overall 3.36
Active Learning: (A) 3.38 (0.92); (B) 3.38 (0.74); (C) 3.20 (1.64); (D) 3.00 (0.71); (E) 4.00 (0.82); (F) 3.57 (1.27); overall 3.42
Recognition: (A) 3.75 (1.04); (B) 4.00 (1.41); (C) 4.40 (0.89); (D) 3.40 (0.55); (E) 3.67 (0.47); (F) 4.14 (0.90); overall 3.89

(3) Normative Competency (overall M = 3.64)
Passive Learning: (A) 3.88 (0.64); (B) 3.75 (0.89); (C) 3.20 (0.84); (D) 3.00 (1.41); (E) 3.67 (0.94); (F) 3.29 (1.25); overall 3.47
Active Learning: (A) 3.63 (0.52); (B) 3.00 (1.31); (C) 3.40 (0.89); (D) 3.80 (0.84); (E) 4.00 (0.00); (F) 3.57 (1.27); overall 3.57
Recognition: (A) 4.25 (0.71); (B) 3.88 (1.36); (C) 3.80 (1.30); (D) 3.80 (1.30); (E) 3.33 (0.47); (F) 4.29 (0.76); overall 3.89

(4) Strategic Competency (overall M = 3.38)
Passive Learning: (A) 3.25 (1.04); (B) 3.13 (1.13); (C) 2.20 (1.30); (D) 3.60 (0.89); (E) 3.67 (0.47); (F) 3.14 (1.07); overall 3.17
Active Learning: (A) 4.00 (0.93); (B) 1.63 (0.52); (C) 2.00 (1.22); (D) 4.00 (1.00); (E) 4.00 (0.82); (F) 3.43 (1.40); overall 3.18
Recognition: (A) 3.63 (0.92); (B) 3.50 (1.51); (C) 3.00 (1.58); (D) 4.00 (0.71); (E) 4.33 (0.47); (F) 4.29 (0.76); overall 3.79

(5) Interpersonal Competency (overall M = 4.17)
Passive Learning: (A) 3.75 (1.04); (B) 3.50 (1.20); (C) 4.60 (0.55); (D) 4.00 (1.22); (E) 4.00 (0.82); (F) 3.57 (1.62); overall 3.90
Active Learning: (A) 4.25 (0.71); (B) 4.00 (1.07); (C) 5.00 (0.00); (D) 4.40 (0.55); (E) 4.33 (0.47); (F) 4.00 (1.15); overall 4.33
Recognition: (A) 4.13 (0.83); (B) 3.88 (1.36); (C) 5.00 (0.00); (D) 4.20 (0.84); (E) 3.67 (0.47); (F) 4.71 (0.76); overall 4.27

Overall unit means (all competencies and types of learning): (A) 3.79; (B) 3.46; (C) 3.76; (D) 3.72; (E) 3.85; (F) 3.77; overall GFE/RE 3.73
Fourth, field courses were designed with unique ob-
jectives, none of which explicitly involved the said com-
petencies. Nonetheless, one respondent interpreted that
“assessment results show that this design was vindicated”,
somewhat validating the methodology even if it differed from
the original intentions.
A high or low score is not necessarily good or bad, but merely a reflection of the unit design; results thus ought to be viewed in light of the respective unit. Alternatively, future assessments could incorporate the unit design into the assessment framework itself, for example by integrating the intended objectives of each course unit, so as to more appropriately assess field courses designed with varying objectives in mind. However, condensing the main features of each unit design into the assessment framework would be extremely challenging. Instead, the authors believe that a post-assessment discussion with students and faculty could shed light on the results obtained and allow for an open discussion that involves the units' design.
Additional qualitative data on students’ experience could
offer a deeper insight into the results and how the com-
petencies were developed in each unit. One respondent
suggested “one would have to have qualitative expressions
about their experiences” in order to better analyze the as-
sessment outcome.
4.3. Fundamental Considerations
Lozano has suggested that most of the tools available for
assessing sustainability do not seem adequate for immediate
application to the university setting. In general, responses to
this situation fall under either 1) modification of the existing
tools, or 2) creation of specific tools for universities [39]. The current assessment falls under the latter approach and attempted to cater to the characteristics of the field courses in
GPSS-GLI. Any application of this competence-based assess-
ment to other programs or universities should be conducted
with care and upon fundamental reconsideration of the as-
sessment approach. Within the program, faculty members
must consider the appropriateness of the framework used
in this assessment, as well as whether or not and how to
incorporate its outcomes in the design of future course units.
While Wiek et al.’s framework was selected for its focus
on sustainability science, Wiek and his colleagues specify that pedagogy was beyond the scope of their study [29]. Thus, the application of their framework to education is so far unique to this assessment project [32,33], and the results
must be interpreted with caution. For example, universal
competencies other than those “key” to sustainability sci-
ence have not been considered, and the list of the five
key competencies so far identified has yet to be finalized [29]. The existence of other studies on sustainability in higher education suggests that attention should also be paid to competencies related to domains such as the affective learning outcomes of educational initiatives [40,41]. Further,
alternative approaches to assessment could be taken into
consideration, such as the Integral Framework, employed
by a GPSS-GLI faculty member in the design of one course unit [42,43]; these, however, were beyond the scope of the present study.
More fundamentally, the objective and validity of an as-
sessment need to be carefully examined. Most faculty mem-
bers consulted in the assessment consider the development
of an evaluation scheme for sustainability science education
to be a necessary step in improving the program design.
However, respondents shared the concern that an emphasis
on assessment development may lead to program designs
that excessively cater to evaluation. This concern is par-
ticularly relevant to the field courses, where, through direct
exposure to the problems and through real-life interactions
with residents of the field site, students’ learning outcomes
extend beyond what was originally intended or anticipated.
Field courses must thus maintain a certain degree of flexibil-
ity to encompass diverse forms of learning.
5. Conclusions
This study contributes to the development of a method to
assess students’ learning outcomes of field courses in a
sustainability science program. Through the case of six field
courses in GPSS-GLI at The University of Tokyo, we address not only the development of this particular program, but also the academic need to assess the key competencies for sustainability professionals. The results of the self-assessment
suggest that the majority of field course participants felt
satisfied with the knowledge and skills they acquired, and
gained the ambition to further explore the respective topic
areas. Students also recognized the importance of key com-
petencies for sustainability professionals after participating
in the field courses. Although the authors do not suggest
that all courses be aligned with the key competencies, the
study suggests that such alignment could raise students’
awareness of the competencies they are acquiring from the
program they belong to.
We expect to tackle limitations of the self-assessment
method of this study in future developments. In particular,
the assessment framework may be altered to reflect the
variety of field course designs. In terms of implementation,
the framework’s concepts may be better standardized by
building a common understanding of the methods and ter-
minologies used across students and faculty. There should
also be consensus within the graduate program on the
role of the assessment and the appropriate level of effort
dedicated to this task.
We believe that the results of this study provide sufficient evidence of the usefulness and appropriateness of the deployed self-assessment. We consider the present
study a successful and relevant step forward in the assess-
ment of field exercise courses in the Graduate Program
in Sustainability Science – Global Leadership Initiative of
The University of Tokyo, and a contribution to the general
development of mechanisms to assess key competencies
for sustainability science research.
6. Acknowledgements
The authors would like to thank the Graduate Program in
Sustainability Science – Global Leadership Initiative (GPSS-GLI) for providing the opportunity to conduct the assessment reported in this document. In particular, the authors thank the students and faculty who contributed comments and reflections on the results. Finally, our thanks go to all the students who actively collaborated in the workshops that made this assessment possible.
References and Notes
[1]
Clark WC, Dickson NM. Sustainability science: The emerging
research program. Proceedings of the National Academy of Sci-
ences of the United States of America. 2003;100(14):8059–8061.
doi:10.1073/pnas.1231333100.
[2]
Kates RW, Clark WC, Corell R, Hall JM, Jaeger CC, Lowe I,
et al. Sustainability Science. Science. 2001;292(5517):641–642.
doi:10.1126/science.1059386.
[3]
Anderies J, Janssen M, Ostrom E. A framework to analyze the robust-
ness of social-ecological systems from an institutional perspective.
Ecology and Society. 2004;9(1). doi:10.5751/ES-00610-090118.
[4]
Ostrom E, Janssen MA, Anderies JM. Going beyond
panaceas. Proceedings of the National Academy of Sciences
of the United States of America. 2007;104(39):15176–15178.
doi:10.1073/pnas.0701886104.
[5]
Komiyama H, Takeuchi K. Sustainability science: building a new dis-
cipline. Sustainability Science. 2006;1(1):1–6. doi:10.1007/s11625-
006-0007-4.
[6]
Kates RW, Parris TM. Long-term trends and a sustainabil-
ity transition. Proceedings of the National Academy of Sci-
ences of the United States of America. 2003;100(14):8062–8067.
doi:10.1073/pnas.1231331100.
[7]
Parris TM, Kates RW. Characterizing a sustainability transition:
goals, targets, trends, and driving forces. Proceedings of the
National Academy of Sciences of the United States of America.
2003;100:8068–8073. doi:10.1073/pnas.1231336100.
[8]
Martens P. Sustainability: Science or Fiction? IEEE Engineering Man-
agement Review. 2007;35(3):70. doi:10.1109/EMR.2007.4296430.
[9]
Griggs D, Stafford-Smith M, Gaffney O, Rockström J, Ohman MC,
Shyamsundar P, et al. Sustainable development goals for people and
planet. Nature. 2013;495(7441):305–307. doi:10.1038/495305a.
[10]
Clark WC. Sustainability science: a room of its own. vol. 104; 2007.
doi:10.1073/pnas.0611291104.
[11]
Miller TR, Wiek A, Sarewitz D, Robinson J, Olsson L, Kriebel D,
et al. The future of sustainability science: A solutions-oriented
research agenda. Sustainability Science. 2014;9(2):239–246.
doi:10.1007/s11625-013-0224-6.
[12]
Spangenberg JH. Sustainability science: a review, an analy-
sis and some empirical lessons. Environmental Conservation.
2011;38(03):275–287. doi:10.1017/S0376892911000270.
[13]
Kajikawa Y. Research core and framework of sustainability science.
Sustainability Science. 2008;(3):215–239. doi:10.1007/s11625-008-
0053-1.
[14]
Kauffman J, Arico S. New directions in sustainability science:
promoting integration and cooperation. Sustainability Science.
2014;9(4):413–418. doi:10.1007/s11625-014-0259-3.
[15]
Brandt P, Ernst A, Gralla F, Luederitz C, Lang DJ, Newig J, et al. A re-
view of transdisciplinary research in sustainability science. Ecological
Economics. 2013;92:1–15. doi:10.1016/j.ecolecon.2013.04.008.
[16]
Wiek A, Farioli F, Fukushi K, Yarime M. Sustainability science: Bridg-
ing the gap between science and society. Sustainability Science.
2012;7(SUPPL. 1):1–4. doi:10.1007/s11625-011-0154-0.
[17]
Trencher G, Yarime M, McCormick KB, Doll CNH, Kraines SB. Beyond
the third mission: Exploring the emerging university function of co-
creation for sustainability. Science and Public Policy. 2013;41(2):151–
179. doi:10.1093/scipol/sct044.
[18]
Bettencourt LMA, Kaur J. Evolution and structure of
sustainability science. PNAS. 2011;108(49):19540–19545.
doi:10.1073/pnas.1102712108.
[19]
Kates RW. What kind of a science is sustainability science? Pro-
ceedings of the National Academy of Sciences. 2011;108(49):19449–
19450. doi:10.1073/pnas.1116097108.
[20]
Kajikawa Y, Ohno J, Takeda Y, Matsushima K, Komiyama H. Cre-
ating an academic landscape of sustainability science: an analysis
of the citation network. Sustainability Science. 2007;2(2):221–231.
doi:10.1007/s11625-007-0027-8.
[21]
SSPP is a peer-reviewed, open-access journal focusing on sustain-
ability science research. The journal also provides an international
network of sustainability research and education.
[22]
Sustainability: Science, Practice, & Policy. Academic Programs
in Sustainability; 2016. Available from: http://sspp.proquest.com/
sspp institutions/display/universityprograms#.
[23]
Stock P, Burton RJF. Defining terms for integrated (multi-inter-trans-
disciplinary) sustainability research. Sustainability. 2011;3(8):1090–
1113. doi:10.3390/su3081090.
[24]
Schweizer-Ries P, Perkins DD. Sustainability Science: Transdisciplinarity, Transepistemology, and Action Research: Introduction to the Special Issue. Umweltpsychologie. 2012;16(1):6–10.
[25]
Frisk E, Larson KL. Education for sustainability: Competencies &
practices for transformative action. Journal of Sustainability Educa-
tion. 2011;2.
[26]
Onuki M, Mino T. Sustainability education and a new master’s degree,
the master of sustainability science: the Graduate Program in Sus-
tainability Science (GPSS) at the University of Tokyo. Sustainability
Science. 2009;4(1):55–59. doi:10.1007/s11625-009-0073-5.
[27]
Tamura M, Uegaki T. Development of an educational model for sus-
tainability science: Challenges in the Mind-Skills-Knowledge educa-
tion at Ibaraki University. Sustainability Science. 2012;7(2):253–265.
doi:10.1007/s11625-011-0156-y.
[28]
Sterling S, Orr D. Sustainable education: Re-visioning learning and change. Devon, UK: Green Books; 2001.
[29]
Wiek A, Withycombe L, Redman C. Key competencies in sustain-
ability: a reference framework for academic program development.
Sustainability Science. 2011;6(2):203–218. doi:10.1007/s11625-011-
0132-6.
[30]
Remington-Doucette S, Connell K, Armstrong C, Musgrove S. As-
sessing sustainability education in a transdisciplinary undergraduate
course focused on real world problem solving: A case for disciplinary
grounding. International Journal of Sustainability in Higher Education.
2013;14(4):404–433. doi:10.1108/IJSHE-01-2012-0001.
[31]
A list of past field courses in GPSS-GLI is available at
http://www.sustainability.k.u-tokyo.ac.jp/exercises.
[32]
San Carlos RO, Tyunina O, Yoshida Y, Mori A, Sioen GB, Yang J.
In: Esteban M, Akiyama T, Chiahsin C, Ikeda I, editors. Assessment
of Fieldwork Methodologies for Educational Purposes in Sustain-
ability Science: Exercise on Resilience, Tohoku Unit 2015. Springer
International Publishing; 2016. pp. 67–91. doi:10.1007/978-3-319-32930-7_4.
[33]
Tumilba V, Kudo S, Yarime M. Review and assessment of academic activities, student competencies, research themes and practice of
sustainability principles in the Graduate Program in Sustainability Sci-
ence. In: The 19th International Sustainable Development Research
Conference. Stellenbosch: ISDRC 19; 2013.
[34]
Bonwell CC, Eison JA. Active Learning: Creating Excitement in the
Classroom. Washington; 1991.
[35]
Chi MTH. Active-Constructive-Interactive: A Conceptual Framework
for Differentiating Learning Activities. Topics in Cognitive Science.
2009;1(1):73–105. doi:10.1111/j.1756-8765.2008.01005.x.
[36]
Graduate Program in Sustainability Science, Global Leadership Ini-
tiative, The University of Tokyo, Tokyo, Japan.
[37]
Barth M, Godemann J, Rieckmann M, Stoltenberg U. Developing key
competencies for sustainable development in higher education. Inter-
national Journal of Sustainability in Higher Education. 2007;8(4):416–
430. doi:10.1108/14676370710823582.
[38]
Waas T, Hugé J, Ceulemans K, Lambrechts W, Vandenabeele J,
Lozano R, et al. Sustainable Higher Education. Understanding and
Moving Forward. Brussels; 2012.
[39]
Lozano R. Incorporation and institutionalization of SD into universi-
ties: breaking through barriers to change. Journal of Cleaner Pro-
duction. 2006;14(9-11):787–796. doi:10.1016/j.jclepro.2005.12.010.
[40]
Rieckmann M. Future-oriented higher education: Which key compe-
tencies should be fostered through university teaching and learning?
Futures. 2012;44(2):127–135. doi:10.1016/j.futures.2011.09.005.
[41]
Shephard K. Higher education for sustainability: seeking affective
learning outcomes. International Journal of Sustainability in Higher
Education. 2008;9(1):87–98. doi:10.1108/14676370810842201.
[42]
Akiyama T, Li J. Environmental leadership education for tackling
water environmental issues in arid regions. In: Environmental Lead-
ership Capacity Building in Higher Education. Springer Japan; 2013.
pp. 81–92.
[43]
San Carlos RO, Teah HY, Akiyama T, Li J. In: Esteban M, Akiyama
T, Chiahsin C, Ikeda I, editors. Designing Field Exercises with the
Integral Approach for Sustainability Science: A Case Study of the
Heihe River Basin, China. 1st ed.; 2016. pp. 23–39. doi:10.1007/978-3-319-32930-7_2.