Alaska Visitor Statistics Program V:
Project Evolution and Policy Impacts
Prepared for:
Travel and Tourism Research Association
Greater Western Chapter
Presented by:
McDowell Group, Inc. · Anchorage · Juneau · Kodiak
March 23, 2007
McDowell Group
• Alaska-based research and consulting for 35 years
• 15 full-time staff: Juneau, Anchorage, Kodiak
• Tourism, seafood/fisheries, resource development, social services, economic development
– 400+ tourism projects including market research and analysis, economic impacts, feasibility studies, tourism development, and strategic planning
AVSP Overview and History
• Commissioned by State of Alaska
• Two components:
– Visitor Volume
– Visitor Survey
• Two time periods:
– Summer (May-September)
– Fall/Winter (October-April)
• Previously conducted in 1985, 1989, 1993, and 2001
• AVSP I, II, III by McDowell Group, AVSP IV by Northern Economics
Information Collected (items in yellow on the slide were new for AVSP V)
• Trip purpose
• Trip packages
• Transportation (entry, exit, within state)
• Destinations (day visits, overnight visits)
• Length of stay (overall, in each destination)
• Lodging types used
• Activities (overall, in each destination)
• Satisfaction ratings
• Previous Alaska travel
• Trip planning behavior
• Demographics
• Expenditures
AVSP I-IV Methodology
• Three surveys:
– Random Arrival Survey (RAS): short intercept survey administered upon arrival into Alaska
– Visitor Expenditure Survey (VES): self-administered diary given to RAS respondents to be filled out during trip
– Visitor Opinion Survey (VOS): mailed to RAS respondents to be completed after trip
• Visitors tallied upon entry into Alaska
• Survey locations: airports, cruise ship docks, highways, onboard ferries
AVSP IV Challenges
• Declining response rates
– RAS response rates fell from >90% to ~50%
– VOS response rates fell from 82% (AVSP I) to 68% (AVSP III) to 19% (AVSP IV)
– VES response rates fell from 73% (AVSP I) to 62% (AVSP III) to 15% (AVSP IV)
• Security issues: access to jetways, docks, etc.
• Funding level not reflective of increased costs
• Small sample sizes prevented subgroup analysis; limited confidence in data
Changes for AVSP V
• Exit methodology (survey and tally) instead of entry
• Three surveys (intercept, diary, mail-out) combined into one intercept instrument
• Survey content streamlined; more relevant to industry needs
• Introduction of online survey to boost sample sizes
Online Survey
• Cards distributed to similar sample as intercept
• Survey content mirrored intercept surveys
• Each card had unique password
• Incentives
– One winner per month won Denali package
– One winner per season won cruise package
AVSP V Successes
• Greatly increased sample sizes
– 5,659 total surveys (compared to 3,722 RAS, 714 VOS, 547 VES in AVSP IV)
• 2,703 intercept
• 2,956 online
– Increased sample sizes allowed for extensive subgroup analysis (50+ separate profiles provided)
– 50,000 travelers tallied for visitor volume
• Intended trip behavior collected in RAS became actual trip behavior; information fresh in visitors' minds
AVSP V Successes (cont'd)
• Greatly increased response rates
– 86 percent compared to 19 (VOS) and 15 (VES) percent in AVSP IV
– Visitors at completion of trip much more willing to be surveyed than arriving visitors
– Fewer time constraints in exit survey
– Response rates strong despite long survey length
• Higher than anticipated response rates among online respondents (18 percent)
• Improved survey content more useful to industry (ex: Internet usage, detailed activity participation, length of stay in each destination)
• Methodology changes made funding level reasonable
AVSP V Challenges
Combination of two methodologies: intercept and online
• Respondent bias
– Online visitors from South, Midwest more likely to participate; international visitors less likely
– Online respondents more likely to use Internet, other trip planning sources
• Question misinterpretation
– Intercept respondents able to ask surveyors for clarification
– Online respondents did not always follow directions
– Online respondents misunderstood some questions
– Survey editing/cleaning not as thorough
Solutions
• Respondent bias
– Weighted online data by origin
– Trip planning source data (usage of Internet, travel agent, additional sources) based only on intercept respondents
• Question misinterpretation
– Correction of responses for cruise overnight data
– For all other questions that appeared to be misunderstood, only intercept data used:
• Party size
• Transportation within state
• Activities
• Expenditures
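The origin weighting mentioned above is a standard post-stratification step: each online respondent is weighted by the ratio of the target origin share (from the intercept/tally data) to that origin's share of the online sample. A minimal sketch follows; the region names, counts, and shares are invented for illustration and are not AVSP figures.

```python
# Hypothetical post-stratification weighting by visitor origin.
# All numbers below are made up for illustration, not AVSP data.

def origin_weights(sample_counts, target_shares):
    """Return weight per origin: target share / sample share."""
    total = sum(sample_counts.values())
    return {region: target_shares[region] / (sample_counts[region] / total)
            for region in sample_counts}

# Online responses by origin (invented counts)
sample = {"South": 500, "Midwest": 400, "West": 300, "International": 100}
# Target origin shares, e.g. derived from the intercept sample (invented)
target = {"South": 0.30, "Midwest": 0.20, "West": 0.30, "International": 0.20}

weights = origin_weights(sample, target)
# Over-represented origins get weights below 1; under-represented origins
# (here, International) get weights above 1, so weighted totals match targets.
```

Applying each weight to its respondents leaves the overall sample size unchanged while aligning the origin mix with the target distribution.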
AVSP Impacts on Policy
• Data already being used in legislative funding requests
• Expenditure data used in economic impact analyses
– State, regional, local
AVSP Impacts on Policy (cont.)
• Demographic data used when developing marketing strategy and tactics
– Advertising
– Direct mail
• Trip planning info guides marketing efforts and visitor information services
• Timing of trip decision and booking
AVSP Impacts on Policy (cont.)
• Data used to gain funding and track effectiveness of special programs
– Developing Alaska Rural Tourism
– SEAtrails
– Tourism Mentorship Assistance
– AlaskaHost
– Independent visitor marketing
– International visitor marketing
AVSP Impacts on Policy (cont.)
• Provides critical performance measures for State agencies
– Dept. of Commerce
– DOT&PF
• AMHS
• Alaska Railroad
• Airports
– Fish and Game
Lessons Learned
• Allow more time for pre-test of online survey
– Contract awarded in first week of April; fielding began on May 1
• Compare data early in survey fielding
• Give online respondents more guidance to avoid misinterpretation
• Avoid open-ended questions in online survey
– Time involved in coding not worth resulting data
• Online component not as economical as anticipated; card distribution time-consuming
• Online component makes sense when visitors are easy to find; otherwise intercept is more economical
Questions/Discussion
• Are online respondents as truthful as intercept respondents? Are they more truthful?
• Are there times when questions work better online? Ex: list of information sources, list of activities
• How does survey timing affect responses (at end of trip vs. days after trip)? Satisfaction? Expenditures?
Thank you
Final report will be available online in early April at:
www.dced.state.ak.us/oed/toubus/home.cfm
McDowell Group, Inc. · Anchorage · Juneau · Kodiak
www.mcdowellgroup.net