Lessons Learned from Years of Administering a Multi-Institution Online Alumni Survey



American College Personnel Association March 2013

Amber D. Lambert, Ph.D.
Angie L. Miller, Ph.D.
Center for Postsecondary Research, Indiana University

Presentation Outline
Literature Review: Importance of alumni assessment and survey issues

Lessons from the Strategic National Arts Alumni Project (SNAAP)
Survey administration challenges
How schools are implementing survey results

Literature Review
As funding to higher education institutions continues to be cut, colleges and universities are often required to show measures of their effectiveness (Kuh & Ewell, 2010)

Surveys are used in many areas of higher education (Kuh & Ikenberry, 2009; Porter, 2004)
Alumni surveys can provide valuable information on student satisfaction, acquired skills, strengths and weaknesses of the institution, and current career attainment

Literature Review
A major concern with all surveys, and alumni surveys in particular, is low response rates

Over the last decade, survey response rates have been falling (Atrostic, Bates, Burt, & Silberstein, 2001; Porter, 2004)

Alumni surveys often have lower response rates than other types of surveys (Smith & Bers, 1987) due to:
Bad contact information
Suspicion of money solicitation
Decreased loyalty after graduation

Lessons Learned from the Strategic National Arts Alumni Project (SNAAP)

SNAAP
As an example, we will discuss some best practices for survey administration and share results from the Strategic National Arts Alumni Project (SNAAP)

What is SNAAP?
On-line annual survey designed to assess and improve various aspects of arts-school education
Investigates the educational experiences and career paths of arts graduates nationally
Findings are provided to educators, policymakers, and philanthropic organizations to improve arts training, inform cultural policy, and support artists

Who does SNAAP survey?
Participants drawn from:
Arts high schools
Independent arts colleges
Arts schools, departments, or programs in comprehensive colleges/universities

Over 5 years, SNAAP has been administered at nearly 300 institutions of various focuses, sizes, and other institutional characteristics

Cohort Year Sampling
2008 and 2009 Field Tests: 5, 10, 15, & 20 years out
2010 Field Test: 1-5, 10, 15, & 20 years out
2011 and forward: all years, to generate the most comprehensive data possible
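As a rough illustration of the sampling designs above (a hypothetical helper, not SNAAP's actual tooling), the cohort definitions translate into target graduation years like this:

```python
# Hypothetical illustration of the cohort definitions above; not SNAAP's actual sampling code.

def target_grad_years(admin_year, design):
    """Return the graduation years invited under a given sampling design."""
    if design == "2008/2009 field test":       # 5, 10, 15, & 20 years out
        offsets = [5, 10, 15, 20]
    elif design == "2010 field test":          # 1-5, 10, 15, & 20 years out
        offsets = [1, 2, 3, 4, 5, 10, 15, 20]
    elif design == "2011+ census":             # all cohort years
        return "all graduation years on record"
    else:
        raise ValueError(f"unknown design: {design}")
    return [admin_year - n for n in offsets]

print(target_grad_years(2010, "2010 field test"))
# -> [2009, 2008, 2007, 2006, 2005, 2000, 1995, 1990]
```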

Increasing Numbers
2010 Field Test
Over 13,000 respondents
154 institutions

2011 Administration
More than 36,000 respondents
66 institutions

2012 Administration
More than 33,000 respondents
70 institutions

Now able to combine 2011 and 2012 respondents to create a SNAAP Database with over 68,000 respondents

Questionnaire Topics
Formal education and degrees
Institutional experience and satisfaction
Postgraduate resources for artists
Career
Arts engagement
Income and debt
Demographics

Survey Administration Challenges

Survey Administration Challenges: Locating the Lost
Important that contact information is accurate and up-to-date

Encourage proactive efforts
Newsletters
Websites, social networking

Alumni tracking
Contracted with Harris Connect, a direct marketing firm

Survey Administration Challenges: Response Rates
Response rates are directly related to the accuracy of contact information

Incentives: only minimally effective

Open enrollment features can increase the number of responses
Social networking sites
Need to verify respondents
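To make the contact-accuracy point above concrete, here is a minimal sketch (with purely hypothetical counts) comparing a response rate computed against the full contact list with one computed against only the alumni who could actually be reached:

```python
# Purely hypothetical counts for illustration; not SNAAP figures.
invited       = 5000   # alumni records on the contact list
undeliverable = 1400   # bounced emails / bad addresses
completed     = 720    # completed surveys

raw_rate      = completed / invited                    # against the full list
adjusted_rate = completed / (invited - undeliverable)  # against reachable alumni only

print(f"raw: {raw_rate:.1%}  adjusted: {adjusted_rate:.1%}")
# raw: 14.4%  adjusted: 20.0%
```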

Survey Administration Challenges: Response Rates

Email invitations to participate in the survey
Is it better to send HTML or plain text?

For the 2011 administration, we created visually appealing email invitations in HTML format

We wondered if HTML was causing problems with email platforms, either getting flagged as spam or requiring too much effort from respondents to enable the HTML content (an extra click)

Survey Administration Challenges: Response Rates

For the 2012 administration, we systematically compared the effectiveness of HTML invites to plain text invites across the 5 email contacts sent to participants

Results of this experiment suggested that a combination of message types gets the highest response rates
Plain text was more effective for the initial contact
HTML was more effective for follow-up contacts
Potential reasons: plain text may reach larger numbers, but HTML may give the project legitimacy

The results of the first round of messages showed a statistically significant and rather large difference in response rates between the two groups. Those who were sent the plain-text message were much more likely to click on the link for the survey and respond than those who were sent the HTML message. In the second round of invitations, those who received the HTML invitation were much more likely to respond than their plain-text counterparts. For alumni at schools of art and design, for whom the visual appearance of a message might be even more influential, the difference was more dramatic. Also, the group that first received a plain-text email was the group with the highest overall response rate. These results suggest that using both plain-text and HTML messages when contacting your target population may be the best way to increase response rates, and that starting with a plain-text invitation might be the best plan. The plain-text messages perhaps reach larger numbers, such as those for whom the HTML messages would go to their spam folders or not load properly. In contrast, the HTML messages might give the project legitimacy in the eyes of survey respondents. Thus, to reach the largest share of your target population, both HTML and plain-text messages should be used, and the order of these formats might make a difference as well. In addition, for some types of institutions the use of HTML messages might be even more critical in the effort to increase response rates.
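For readers who want to run a similar comparison, the sketch below shows one way to test whether an invitation format makes a statistically significant difference in first-contact response rates. The counts are hypothetical and the chi-square test is our illustrative choice, not necessarily the analysis SNAAP used:

```python
# Hypothetical first-contact results; not SNAAP data.
from scipy.stats import chi2_contingency

#                   responded  did not respond
plain_text_group = [260, 1740]   # 2,000 alumni sent the plain-text invitation
html_group       = [180, 1820]   # 2,000 alumni sent the HTML invitation

chi2, p_value, dof, expected = chi2_contingency([plain_text_group, html_group])

plain_rate = plain_text_group[0] / sum(plain_text_group)
html_rate  = html_group[0] / sum(html_group)
print(f"plain text: {plain_rate:.1%}  HTML: {html_rate:.1%}  p = {p_value:.4f}")
```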

Survey Administration Challenges: Response Rates

Long-term strategies can also influence the tendency of alumni to respond

Consider building recognition of the alumni survey while students are still enrolled, so there is familiarity with the project once they are contacted to participate as alumni
Sharing data on campus
Involving students in campus or curricular changes made based on survey results
Connecting alumni surveys with senior exit surveys (SNAAP plans to test this model with select institutions in the 2014 administration)

Implementing Survey Results

Using SNAAP for Curricular Assessment

The SNAAP survey asks alumni to rate the importance of 16 different skills and competencies to their current profession or work life (whether they work in the arts or in a different occupation).

Using SNAAP for Curricular Assessment (cont.)

As an example, this page from the Frequency Report's Career section shows that the majority of alumni from our Sample Institution found critical thinking and analysis of arguments and information to be either very important (84%) or somewhat important (12%) to their current work. They found that broad knowledge and education are very important (75%) or somewhat important (23%). (To find out what all respondents said in 2011, see the final column under SNAAP Institutions.)

Using SNAAP for Curricular Assessment (cont.)

Most important skills:
Creative thinking and problem solving
Listening and revising
Interpersonal relations and working collaboratively
Broad knowledge and education
Critical thinking and analysis of arguments and information

Recommend that faculty:
Incorporate open-ended projects (top skill #1) and group projects (top skill #3)
Require analysis of theories or reviews/critiques (top skill #5) and provide opportunities for feedback and revision (top skill #2)
Ensure curricula include a firm knowledge foundation in a wide variety of areas (top skill #4)

In 2011, the most important skills identified by alumni were those listed above. Based on this information, a school could recommend that faculty build in assignments as outlined above.

Using SNAAP for Curricular Assessment (cont.)

This question asks how much the institution helped the respondent acquire or develop each of the 16 skills and abilities.

Using SNAAP for Curricular Assessment (cont.)

In this instance, less than half (47%) felt their institution helped them acquire or develop critical thinking skills.

Using SNAAP for Curricular Assessment (cont.)

Can identify strengths: What skills and competencies have the highest percentages of alumni reporting the institution helped them develop very much or quite a bit?
Can identify areas for improvement: What skills and competencies have the highest percentages of alumni reporting the institution helped them develop very little or not at all?
Peer group information provides context: Do other institutions have similar strengths and weaknesses?

Examples From 2011 Aggregate Findings

This chart shows the 2011 overall responses to three of the skills questions. An interpretation of how to analyze these findings is contained on the next slide.

Using SNAAP for Curricular Assessment (cont.)

Alumni receive strong training in learning artistic techniques

Discrepancies between those who say a skill is important for their work and those who say the institution helped them develop that skill suggest some improvements that could be made, such as:
Requiring business and financial classes, or incorporating these elements into existing courses
Including classes that look at the nontraditional career paths of arts graduates
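One way to surface these discrepancies is to compare, skill by skill, the share of alumni who call a skill important with the share who say the institution helped them develop it. The sketch below uses the Sample Institution percentages quoted earlier for the first two rows and otherwise hypothetical values:

```python
# Percentages for illustration: the first two rows echo the Sample Institution figures
# quoted above; the remaining rows are hypothetical.
skills = {
    # skill: (% rating it important to current work, % saying institution helped develop it)
    "Critical thinking and analysis":  (96, 47),
    "Broad knowledge and education":   (98, 81),
    "Business and financial skills":   (72, 28),   # hypothetical
    "Artistic technique":              (74, 90),   # hypothetical
}

# Rank skills by the gap between reported importance and reported development.
gaps = sorted(((imp - dev, name) for name, (imp, dev) in skills.items()), reverse=True)
for gap, name in gaps:
    imp, dev = skills[name]
    print(f"{name}: important {imp}%  developed {dev}%  gap {gap:+d}")
```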

Using SNAAP for Program Assessment

Here, we distinguish program assessment (extra-curricular or non-credit programs) from curricular assessment (courses of study).

Using SNAAP for Program Assessment (cont.)

This slide shows that in our Sample Institution, 32% of alumni were somewhat dissatisfied with career/further education advising, while 26% were somewhat satisfied.

Using SNAAP for Program Assessment (cont.)

Programs and services with low satisfaction may need to be revised

For career advising, 59% of alumni reported being either very dissatisfied or somewhat dissatisfied

Additional resources could be devoted to developing new components of career advising, such as:
Alumni career panel presentations
Résumé or portfolio building sessions
Networking opportunities for graduating students

Here are some ideas on how to respond to the data in the previous slide.

Examples: Sharing on Campus


One of the first institutions to share its data on campus was Miami University of Ohio. This Assessment Brief was published by Miami's institutional research office after it received its SNAAP report of 2010 data.

Purdue University published a four-page report of its 2011 SNAAP data.

Purdue 2 of 4

Purdue 3 of 4

Purdue 4 of 4

Examples: Alumni & Donor Outreach


The Houston School of the Visual and Performing Arts shared some alumni comments in its quarterly newsletter sent to donors, parents, alumni and friends.

HSPVA 2 of 2

The College of Fine Arts at the University of Texas at Austin began sharing its SNAAP results in 2011 with the Dean's letter in its quarterly print publication. In this example, the dean thanks his alumni for participating and shares selected findings: http://www.utexas.edu/finearts/alumni/alumni-snaap-thank-you-letter

In 2012, UT Austin developed a web page that illustrates some of its findings: http://www.utexas.edu/finearts/about/mission-vision/alumni-snapshot

UT Austin also shared portions of its Institutional Report (being careful to not identify any respondents): http://www.utexas.edu/finearts/sites/default/files/attach_download/utaustinfineartssnaap2011report.pdf

The Herron School of Art and Design at IUPUI in Indianapolis created a web site to share selected findings.

Herron 2 of 2

The Herron School posted an article about an alumna that ties in to SNAAP.

Examples: Recruitment


This web page from Kent State shares aggregate SNAAP findings to make the case for a visual arts education.

The Herron School of Art and Design created a recruitment brochure based on its alumni achievements. This page includes comments from the SNAAP survey.

Herron (Creativity) 2 of 2

Conclusions
Assessing alumni can provide important information on institutional effectiveness, but alumni surveys can pose several obstacles

When administering alumni surveys, some steps can be taken to update contact information and increase response rates

The results from alumni surveys can be useful in multiple areas, including curricular and program assessment, campus information sharing, alumni and donor outreach, and recruitment

Questions or Comments?
Contact Information:
Amber D. Lambert: [email protected]
Angie L. Miller: [email protected]

Strategic National Arts Alumni Project (SNAAP)
www.snaap.indiana.edu
(812) [email protected]

ReferencesAtrostic, B. K., Bates, N., Burt, G., & Silberstein, A. (2001). Nonresponse in U.S. government household surveys: Consistent measure, recent trends, and new insights. Journal of Official Statistics, 17(2), 209-226.

Kuh, G. D. & Ewell, P. T. (2010). The state of learning outcomes assessment in the United States. Higher Education Management and Policy, 22(1), 1-20.

Kuh, G. D., & Ikenberry, S. O. (2009). More than you think, less than we need: Learning outcomes assessment in American higher education. Urbana, IL: University of Illinois and Indiana University, National Institute for Learning Outcomes Assessment.

Porter, S. R. (2004). Raising response rates: What works? New Directions for Institutional Research, 121, 5-21.

Smith, K., & Bers, T. (1987). Improving alumni survey response rates: An experiment and cost-benefit analysis. Research in Higher Education, 27(3), 218-225.

Assessment Brief #62
October 12, 2011

Using Feedback from Miami Alumni to Improve Educational Effectiveness

Surveying Alumni
Miami students are frequently surveyed throughout their college experiences. However, assessing the long-term impact of students' Miami education can also require reaching out to students after they graduate. Feedback from alumni, who are now using the skills they developed at Miami, can greatly improve educational effectiveness. The Strategic National Arts Alumni Project (SNAAP) survey gathers information about fine arts alumni to better understand the relationship between arts education and arts-related occupations. The SNAAP participants from Miami University consisted of 220 undergraduate fine arts alumni who graduated in the following years: 1990, 1995, 2000, and 2005-2009.

The survey included questions about institutional experiences and career choices. To capture institutional experiences, the survey prompted alumni to report their overall satisfaction with their education as well as their satisfaction with specific areas (e.g., academic advising, freedom to take risks). In the career section, alumni reported their current and previous occupations, their satisfaction with these jobs, and their current level of fine arts engagement.

To explore the intersection between institutional experience and careers, the survey asked alumni about the skills and competencies they developed at Miami University as well as which skills were most important in their current job.

By reviewing these results, faculty and staff can better understand how students' experiences at Miami prepare them for their careers.

Key Findings
Fine arts alumni were satisfied with their experiences at Miami University; 94% of undergraduate arts alumni rated their overall experience as good or excellent.

Arts alumni were especially satisfied with their sense of belonging at Miami and with their instructors.

Respondents were least satisfied with opportunities to network with alumni and others, advice about further education, career advising, and work experience. The vast majority of respondents reported developing critical and creative thinking skills while at Miami and found these skills important in their future careers. Fine arts alumni were less likely to report that Miami helped them to develop business and technological skills related to their field.

Student Satisfaction

Recommendations
The SNAAP survey highlights the importance of gathering alumni feedback. Such feedback is a valuable resource for assessing educational impact and improving educational effectiveness across the university. The SNAAP survey helps faculty and staff in fine arts by identifying the following:
Common occupations and post-secondary degrees among graduates

Skills and competencies that students will frequently use in their careers

Levels of student satisfaction with various aspects of their Miami experience

These results can help the division improve retention and graduation rates and better prepare students for their future careers.

If you have comments or questions, please contact the Center for the Enhancement of Learning, Teaching and University Assessment at [email protected] or 513-529-9266. Previous Briefs are available online at: http://www.units.muohio.edu/celt/assessment/briefs/.
