Index

A

academic capitalism, 31
action, 146–50
real action vs no real action, 150
assessment forms, 37–8
Australian Graduate Survey (AGS), 83
Australian Survey of Student Engagement (AUSSE), 86–7
Australian university
student feedback practice and use, 81–95
institutional-level use, 88–9
national initiatives that have influenced collection and reporting, 88
national or sector-wide use, 82–8
university practices and uses of student feedback, 90–4
courses/programmes of study, institutional experience, 93–4
subjects/units, 92–3
teaching, 90–1
Australian Universities Quality Agency (AUQA), 89

C

cafeteria system, 39
CLASSE, 115–16
Code of Practice, 20
Cooperative Institutional Research Program (CIRP), 30
Course Experience Questionnaire (CEQ), 13, 32–3, 39, 68, 84–5, 102–4
core CEQ items considered, 103
Curtin University, 93–4

F

faculty-level survey, 12
Faculty Survey of Student Engagement (FSSE), 108
feedback cycle, 145–58
action, 146–50
real action vs no real action, 150
feedback to students, 155–7
students’ perspectives on feedback flyer, 156–7
managing student expectations, 151–5
students’ ownership of feedback process, 153–5
monitoring action, 150–1
student feedback/action cycle, 147
First Year Experience Questionnaire (FYE), 87–8

G

goal-based form, 39
Good Teaching scale, 93
Graduate Destination Survey (GDS), 83

H

higher education, 51–4
HyperText Markup Language (HTML), 139

I

institution-level satisfaction, 9–11
institutional feedback surveys, 157

L

Learning and Teaching Performance Fund (LTPF), 84–5, 88–9
Lund University, 66–9
course evaluations, 66–9
student evaluations, 69–72
external stakeholders, 72
faculty leadership, 72
lecturers, 70–2
students, 69–70

M

meta-profession model, 40–1
module-level feedback, 14–15
multiple surveys, 17–19

N

National Higher Education Action Plan (NHEAP), 53
National Student Survey (NSS), 18–19, 85, 149
National Survey of Student Engagement (NSSE), 30
New England Association of Colleges and Universities standard 4.50, 32

O

omnibus form, 39
Online Student Evaluation of Teaching System, 56–8
operational evaluation, 67–8

P

paper-based surveys, 119–29
administrative considerations, 121–2
results, 125–6
evaluation mode trends, 126
number of units evaluated, 125
response rates, 126
survey response rates, 122–4
trend data analysis, 124–5
Portable Document Format (PDF), 137
Postgraduate Research Experience Questionnaire (PREQ), 86
programme-level survey, 12–14

Q

quality assurance, 50

R

reporting evaluation, 68
Royal Institute of Technology (KTH), 64–5

S

satisfaction cycle, 6
satisfaction surveys
faculty-level, 12
institution-level, 9–11
module-level feedback, 14–15
programme-level, 12–14
teacher appraisal, 16–17
Seashoal Lows University, 33–5
SERVPERF, 41
SERVQUAL, 33, 41–2
significant networks, 73
SOU Report, 64
Staff Student Engagement Questionnaire (SSEQ), 108–10
Staff Student Engagement Survey (SSES), 108
Student Course Experience Survey, 93
student engagement, 107–8
Student Engagement Questionnaire (SEQ), 107–10
outcome measures, 109–10
student engagement scales, 109
Student Evaluation of Courses and Teaching (SECAT), 17
Student Evaluation of Teaching (SET), 91
student evaluations, 62–5, 69–72
faculty, 33–4
student feedback, 3–23
external and internal links, 110–14
cross-institutional results, 111–14
mean square fit statistics, 113
psychometric disconnect, 114
scale mean score relationships, 113
variable map from Rasch analysis, 112
external information, 7
faculty-level satisfaction with provision, 12
feedback/action cycle, 147
feedback to students, 20–2
improvement, 5–7
improving university teaching, 61–76
Faculty of Engineering, Lund University, 66–9
influencing lecturers, 75
leadership, 75–6
student evaluations, 69–72
using the data, 74–5
inclusive practice, 133–41
institutional approach to unit evaluation, 136–7
unit evaluation online access complaints, 138
unit evaluation online response rates, 140
institution-level satisfaction, 9–11
integration, 22–3
Malaysian perspective, 49–59
case study, 56–7
educational competitiveness and ranking, 51–3
higher education, 53–4
higher education performance, 52
lessons learned, 57–9
Malaysian universities, 54–6
module-level feedback, 14–15
multiple surveys, 17–19
practice and use in Australian national and university context, 81–95
institutional-level use, 88–9
national or sector-wide use, 82–8
within-university practices and uses, 90–4
preoccupation with surveys, 7–9
programme-level satisfaction, 12–14
satisfaction cycle, 6
teacher performance appraisal, 16–17
tools, 101–16
assessment of instruments, 101–2
selected assessments, 102–10
types of surveys, 9
United States and global context, 29–44
academic programmes and instruction, 32–3
case study, 33–5
issues, 35–9
meta-profession model of faculty, 40–1
root causes of customer failures, 42
SERVQUAL, 41–2
types, 31
web-based or paper-based surveys, 119–29
administrative considerations, 121–2
survey response rates, 122–4
trend data analysis, 124–5
student feedback cycle, 50
student ratings, 37, 39, 40
Student Satisfaction Approach, 146, 158
Student Satisfaction Survey (SSS), 18–19
Students as Learners and Teachers (SaLT), 51
Students’ Evaluation of Educational Quality (SEEQ), 16–17, 90–1
survey fatigue, 127
Survey Management Systems (SMS), 133, 137
surveys, 4, 9
Swedish Higher Education Act (2000), 62–3

T

teacher appraisal surveys, 16–17
Teaching Quality Instruments (TQI), 104–7
institution-specific TQI items, 106–7
Tuning Project, 43

U

Umeå University, 65
unit evaluation
institutional approach, 136–7
online access complaints, 138
online response rates, 140
Universiti Putra Malaysia (UPM), 54–5
Universiti Sains Malaysia (USM), 55
Universiti Teknologi Malaysia (UTM), 56–7
University Experience Survey, 85
University of Cambridge, 21–2
University of Central England (UCE), 7, 10, 50, 147–8, 150
student perceptions
ease of locating books on shelves, 152
usefulness and promptness of feedback, 149
student satisfaction with availability of Internet, 148
University of Queensland, 90, 92
University of Sydney, 93
Update, 155

W

web-based surveys, 119–29
administrative considerations, 121–2
results, 125–6
evaluation mode trends, 126
number of units evaluated, 125
response rates, 126
survey response rates, 122–4
trend data analysis, 124–5
Web Content Accessibility Guidelines (WCAG), 139–40