Transcript of "DSA Supplementary Questionnaire: Initial analysis and observations".

Page 1: DSA Supplementary Questionnaire – Initial analysis and observations

Page 2: Purpose

DSA assesses e-maturity on a broad basis

Supplemented by:

• 72 questions dealing with specifics of “e-activity”

• 4 open-response questions

Page 3: Coverage of questionnaire

• Inputs from JISC, SFEU, COLEG

• Agreement with SFC

– Infrastructure
– Teaching & Learning
– Organisation
– Strategic Leadership
– Subject-specific adoption of e-learning

Page 4: Methodology

• Included as part of the DSA Workbook issued to colleges

• Available on-line for individual practitioners

• Latter targeted through JISC, SFEU and COLEG networks

Page 5: Outcomes

• 36 colleges completed as part of DSA

• 59 individual practitioner responses

But

• Only 3 responses to open questions from individuals

Page 6: Area ratings (high to low)

• B Infrastructure: Services and Organisation
• A Infrastructure: Technology
• F Organisation: Staff Development
• C Teaching and Learning: Resources
• I Strategic Leadership
• H Organisation: Quality Assurance
• J Individual Subject Area
• D Teaching and Learning: On-line assessment
• G Organisation: Financial Management
• E Teaching and Learning: Pedagogy
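
The deck does not say how the A–J areas were scored to produce this high-to-low ordering. The sketch below shows one common way such a ranking could be built (Very Good=4 down to Poor=1, with "Not known" excluded from the base), using the college responses to J1 from Page 7 as illustrative input; the score mapping and the exclusion of "Not known" are assumptions, not the questionnaire's documented method.

```python
# Sketch (assumption): rate an area by the mean of its responses, mapping
# Very Good=4, Good=3, OK=2, Poor=1 and leaving "Not known" out of the base.

SCORE = {"Very Good": 4, "Good": 3, "OK": 2, "Poor": 1}

def mean_rating(counts):
    """counts maps a rating label to the number of responses with that label."""
    scored = {label: n for label, n in counts.items() if label in SCORE}
    base = sum(scored.values())
    return sum(SCORE[label] * n for label, n in scored.items()) / base

# College responses to J1 (from Page 7), used here purely as illustrative input.
j1_colleges = {"Very Good": 0, "Good": 10, "OK": 14, "Poor": 10, "Not known": 2}
print(f"J1 mean rating (colleges): {mean_rating(j1_colleges):.2f}")  # 2.00
```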

Page 7: E-learning coverage in specific subjects

Questions

J1. Extent of use of e-learning in the subject area
J2. Extent of use of e-learning across the awarding levels (e.g. NC, HN)
J3. Extent of use of e-learning across learner types (e.g. age, disability, ESOL)
J4. Extent of use of e-learning in employer facing delivery (e.g. Work based learning)

COLLEGES              J1   J2   J3   J4
Very Good              0    2    3    0
Good                  10   10   10    6
OK                    14   10   12   11
Poor                  10   14   11   17
Not known              2    0    0    2
Total (36 colleges)   36   36   36   36

INDIVIDUALS           J1   J2   J3   J4
Very Good             10    6    7    4
Good                  11    9   10    2
OK                    21   20   15    7
Poor                   9   13   13   10
Not known              8   11   14   36
Total (59 responses)  59   59   59   59

Page 8: Questions

E1. Effective learner use of VLE
E2. Effective learner use of e-Portfolios
E3. Provision of on-line tutor support for e-learners
E4. Provision of technical support for e-learners (e.g. online / help desk / local technician)
E5. Adoption of diverse media for learning content and activity (e.g. Game based learning, pod casting, IPTV services)
E6. Adoption of collaborative online activity (e.g. email lists, discussion forums, blogs, wikis)
E7. Use of mobile access devices in personal learning (e.g. PDAs, phones, MP3 players)

COLLEGES              E1   E2   E3   E4   E5   E6   E7
Very Good              4    0    6    3    0    0    0
Good                  11    6    6   12    6    9    3
OK                    16    7   11   13   12   10    5
Poor                   5   21   12    7   14   17   28
Not known              0    2    1    1    4    0    0
Total (36 colleges)   36   36   36   36   36   36   36

INDIVIDUALS           E1   E2   E3   E4   E5   E6   E7
Very Good             12    0    9    6    1    5    0
Good                  14    3   10   14    6   15    6
OK                    10    7    6   10   12   13    3
Poor                  14   17   19   16   21   16   32
Not known              9   32   15   13   19   10   18
Total (59 responses)  59   59   59   59   59   59   59

Page 9: Highest scoring elements (highest first)

Element – % of respondents scoring as Very Good

B6. Effectiveness of library & resource centre support – 43%
B5. Availability of library & resource centre support when required – 46%
A1. Adequacy of bandwidth to access the internet – 60%
A2. Adequacy of internal campus network performance – 44%
A9. Access to externally hosted online content – 31%
A7. Capability to upload content to learning support platform (e.g. VLE or repository) – 35%
B3. Effectiveness of operational IT support – 29%
A3. Adequacy of network performance between college sites (campuses, annexes) – 25%
B9. Onsite facilities for personal e-learning (e.g. in LRC, drop in centres, annexes) – 23%
B2. Availability of operational support when required through IT support staff – 23%

Page 10: Lowest scoring elements (lowest first)

Element – % of respondents scoring as Poor

E7. Use of mobile access devices in personal learning (e.g. PDAs, phones, MP3 players) – 64%
E2. Effective learner use of e-Portfolios – 40%
E5. Adoption of diverse media for learning content and activity (e.g. Game based learning, pod casting, IPTV services) – 37%
D2. Use of externally provided online question banks or formative test instruments (e.g. COLA) – 42%
B10. Effectiveness of the college website in promoting e-learning options – 39%
A12. Access to video conferencing facilities for distance meetings or classes – 35%
F1. Take up of e-enabled learning beyond the enthusiasts – 35%
D1. Availability of directly relevant online e-assessment instruments – 33%
A10. Access to resources using privately owned laptop whilst in college – 46%
J4. Extent of use of e-learning in employer facing delivery (e.g. Work based learning) – 28%
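
The deck does not state how these percentages were derived from the raw counts on Pages 7 and 8. A minimal sketch of one plausible reading, pooling the 36 college and 59 individual responses into a single base of 95, is shown below; it reproduces the quoted figures for E2, E5 and J4 and comes within a point of the E7 figure, so treat it as an illustration rather than the analysts' actual method.

```python
# Sketch (assumption): "% of respondents scoring as Poor" computed over the
# pooled base of 36 college plus 59 individual responses (95 in total).
# The Poor counts below are taken from the tables on Pages 7 and 8.

poor_counts = {
    # element: (college Poor count, individual Poor count)
    "E2. Effective learner use of e-Portfolios": (21, 17),
    "E5. Adoption of diverse media": (14, 21),
    "E7. Use of mobile access devices": (28, 32),
    "J4. E-learning in employer facing delivery": (17, 10),
}

POOLED_BASE = 36 + 59  # colleges + individual practitioners

for element, (college_poor, individual_poor) in poor_counts.items():
    pct = 100 * (college_poor + individual_poor) / POOLED_BASE
    print(f"{element}: {pct:.0f}% Poor")

# Prints 40% (E2), 37% (E5), 63% (E7) and 28% (J4); the deck quotes 64% for E7,
# so the pooling assumption is close but not exact for that element.
```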

Page 11: Elements that individuals scored more highly than colleges

• A4. Access to data projection or electronic whiteboards for class teaching

• D8. Credibility of online summative assessment approach

• E6. Adoption of collaborative online activity (e.g. email lists, discussion forums, blogs, wikis)

• F5. Value of self-help networks (e.g. SFEU Communities of Practice)

• J1. Extent of use of e-learning in the subject area

Page 12: Elements that colleges scored more highly than individuals

• B4. College risk assessment & contingency planning to address major IT failure

• F4. Value of external advice and support in helping take on e-enabled learning (e.g. JISC RSCs, SFEU)

• G2. Affordability of licensing the necessary software tools (e.g. VLE, assessment tools, authoring tools)

Page 13: Your hopes

• Most comments about embedding or funding/resources
• Losing the "e-"

"A strategic, management led initiative that results in staff embracing the widely available technology and results in more resources being developed either by staff individually or supported centrally in order to enhance the student experience."

Page 14: Good Practice

• SFC Transformation projects
  – BlendEd
  – E-Construction
• Use of ICT Learning & Teaching champions / advisers / learning technologists
• CISCO CCNA on-line
• SQA Internet Security e-enabled Unit
• BA Child & Youth Studies (networked)

Page 15: Biggest Challenge

• Funding

• Staff time (for development)

• Staff attitudes

• Culture and ethos

• Learning technology support

• Policies to support e-learning

Page 16: Most significant change

• Moving e-learning from enthusiasts to mainstream

• National banks of materials and assessments

• Change in approach/attitude of awarding bodies

• Consistent style/approach for all developments

Page 17: Walter Patterson, SERO