SEO for website migrations - 53 SEO factors for a successful website relaunch


1. Raise awareness for website migration SEO within your organisation

● Expectation management: Not losing SEO traffic is a huge success.
● Communicate the risks: All non-brand SEO traffic and the revenue it generates could be lost.
● Define high-priority items (show-stoppers): Missing redirects, missing content, etc.
● Budget and resources: Make sure there is enough time and money to save your SEO traffic.

How to make SEO a priority within your website relaunch project:
https://www.searchviu.com/en/seo-priority-website-relaunch/

2. Allocate resources for troubleshooting after the website migration

● After the launch of the new website, everybody involved in the project probably deserves a holiday.
● But: Make sure there are resources available for troubleshooting.
● Crawl errors need to be fixed, traffic drops analysed, etc.

More details on what needs to be done after the migration can be found at the end of this presentation.

3. Create a website migration SEO project plan

● Tools like JIRA or Trello can help you organise the SEO tasks that have to be taken care of before and after the website migration.
● You can also use this presentation and create the first tickets out of the factors listed here.
● Use priorities, e.g. “show-stopper”, “high”, “medium”, “low”.
● Involve all stakeholders in the project plan and make sure everybody knows exactly what to do and when.

4. Make all content that is driving traffic to the old website available on the new website

● This seems simple, but it’s very important: Content that is removed from the website will no longer drive organic traffic.
● Use Google Analytics and Google Search Console to analyse which landing pages are driving SEO traffic for which search queries.
● Make sure the content that is responsible for this traffic is not missing on the new website.

5. Make all pages with promising rankings available on the new website

● In addition to pages that are already driving traffic, have a look at pages with promising rankings.
● These pages might be able to drive traffic in the near future, so don’t waste this potential.
● You can use rank tracking tools (e.g. SEMrush, Sistrix, Searchmetrics, …) to get this data.

6. Make sure all inbound links point to relevant, up-to-date content on the new website

● Pages with backlinks are important and should be kept, even if they are not currently driving traffic.
● Pages with backlinks that can’t be kept can be replaced, if the click intent continues to be satisfied.
● All backlinks should point to relevant and up-to-date pages that also link to other important pages.
● URLs with backlinks can be redirected to a matching target if the URL can’t be kept.

7. Make all images and PDFs that drive traffic available on the new website

● Non-HTML file types are often forgotten when analysing content that drives traffic.
● Check Google Search Console for PDFs and images (or other file types) that drive traffic. Attention: They won’t show in Google Analytics.
● Indexed PDFs should be replaced by HTML pages with the same (or better) content.
● Image file names and sizes should be kept, if possible.

8. Make all keywords that are driving traffic available on the new website

● Check which keywords are currently driving traffic and at the same time are included in important places: Title tags, headlines, internal link anchor texts, page content, etc.
● If you remove these keywords, your rankings might suffer.
● This is a very difficult task to do manually, but searchVIU has you covered.

9. Keep or improve all title tags and meta descriptions

● If your old website already has optimised title tags and meta descriptions, make sure to keep them.
● A website relaunch is a good opportunity to improve all of your title tags and meta descriptions.
● When making changes to title tags and meta descriptions, pay attention to keywords that are currently driving traffic, and think twice about removing them.
● If you create entirely new pages, make sure to create good title tags and meta descriptions for them.

10. Migrate all previous content optimisations to the new website

● Previous content optimisations can include keyword placement in headlines, internal links, page content and other places.
● They are sometimes difficult to spot, especially if obscure tactics like WDF*IDF have been used.
● SEO traffic can suffer if previous optimisations that have had positive effects are unknowingly removed.
● Check the history and documentation of your project for previous content optimisations.

11. Avoid unnecessary URL changes

● Important rule for URLs: If it ain’t broke, don’t fix it.
● SEOs often feel the urge to “optimise” URLs, but in most cases URL changes do more harm than good.
● Only change URLs if you really have to.

12. Avoid duplicate content

● Duplicate content can have different causes:
○ Lazy editors might copy the same content to different pages on a website.
○ For technical reasons, some pages might be available with more than one URL.
● Both types of duplicate content can be detected by crawling the new website and searching for duplicates.

13. Set up redirects from all old URLs that no longer exist to a matching target

● This is one of the most important SEO tasks for website migrations.
● Manually matching old and new URLs can take up a lot of time.
● It is important to find a matching target for each URL that has been driving traffic. Redirects to the home page or to category pages don’t save SEO traffic.
● Developers might resist and say that too many redirects slow down the site. Don’t let them fool you.
● Set up your redirects on DEV and test them before the migration (see the sketch below).

14. Avoid redirect chains

● A redirect chain is caused by a URL that redirects to a URL that itself redirects to another URL.
● All redirects should point directly to their final target.
● Redirect chains are often caused by protocol or domain switches (a detection sketch follows this list):
○ A developer might set up a rule that redirects all http URLs to their https counterparts and then redirects some of these https URLs to different https URLs.
○ In case of a domain switch, there might be a rule that redirects all URLs from the old domain to their exact equivalents on the new domain, before another redirect then takes the old URL paths to the new ones.
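A minimal sketch for detecting chains, assuming the third-party requests library; the example URL is a hypothetical placeholder for your own list of old URLs:

```python
import requests

def redirect_hops(url, max_hops=10):
    """Follow redirects manually and return the list of URLs visited."""
    hops = [url]
    for _ in range(max_hops):
        response = requests.get(hops[-1], allow_redirects=False, timeout=10)
        if response.status_code not in (301, 302, 303, 307, 308):
            break
        # Location may be relative, so resolve it against the current URL
        hops.append(requests.compat.urljoin(hops[-1], response.headers["Location"]))
    return hops

for url in ["http://www.example.com/old-page"]:  # swap in your own URL list
    hops = redirect_hops(url)
    if len(hops) > 2:  # start URL plus more than one hop = chain
        print(" -> ".join(hops))
```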

15. Redirect all URLs that have backlinks but no content to a matching target

● If you detect URLs that have backlinks but no content, make sure you redirect them to a matching target (one that satisfies the user’s click intent).
● If you can’t find a matching redirect target, you should probably create a new page. A backlink is usually worth the effort.

16. Change existing redirects to new targets

● Often, there are already redirects in place, from previous site migrations or website changes.
● In order to avoid redirect chains, the targets of these old redirects should be changed to new targets.
● Redirects that are no longer needed because the redirected URLs have not been requested in a long time can be removed.

17. Make sure you only use 301 status codes for permanent redirects

● Using different status codes might slow down the process of replacing old URLs with new ones.
● A 301 status code is the clearest signal for search engine bots.
● If you are not planning on changing the redirect target anytime soon and if the redirect target is always the same for all users and bots, a 301 status code is always the best choice.
● In other cases, e.g. IP or browser language based redirects, a 302 status code might be an alternative.

18. Include image and PDF URLs in your redirects

● All image, PDF and other file type URLs that have been driving traffic or that have backlinks should be redirected to their new equivalents.
● Indexed PDFs should redirect to HTML pages with the same (or better) content. PDFs can still be offered on the website, but they should not be indexed.
● Image redirects only really work if the file name (the part after the last slash in the URL) and the file size stay the same.

19. Crawl your pages to check indexability

● Crawl your new website. Often.
● Use different crawl settings:
○ With and without JavaScript rendering.
○ With and without respecting robots.txt directives.
○ As Googlebot and with a different user agent.
○ With and without “nofollow” links.
● Pay attention to settings that might prevent important pages from being indexed: “noindex”, canonical tags, orphan pages (or pages only reachable via “nofollow” links), etc. A minimal spot check is sketched below.
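As one example of such a spot check, the sketch below fetches a page once with a Googlebot user agent and once with a generic browser user agent and prints the robots meta tag and canonical tag it finds. The URL is hypothetical, the regexes are deliberately crude, and requests is a third-party library; a full crawler should parse the DOM properly:

```python
import re
import requests

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def indexability_report(url, user_agent):
    html = requests.get(url, headers={"User-Agent": user_agent}, timeout=10).text
    # crude regex extraction, good enough for a quick spot check
    robots_meta = re.findall(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I)
    canonicals = re.findall(r'<link[^>]+rel=["\']canonical["\'][^>]*>', html, re.I)
    print(url, "| UA:", user_agent[:30])
    print("  robots meta:", robots_meta or "none")
    print("  canonical:  ", canonicals or "none")

# compare what Googlebot sees with what a regular browser sees
indexability_report("https://dev.example.com/important-page", GOOGLEBOT_UA)
indexability_report("https://dev.example.com/important-page", "Mozilla/5.0")
```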

20. Check your robots.txt for harmful or missing directives

● Some CMS or shop systems automatically add information to the robots.txt file.
● Some developers use the robots.txt file for creative solutions.
● The longest robots.txt file is not necessarily the best one.
● Be aware of this: The robots.txt file asks bots not to crawl a page. This doesn’t mean that they won’t index it.
● Make sure you know what your robots.txt file is about and that it does not block search engine bots from important resources. A quick check is sketched below.
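A quick way to verify this, sketched with Python's standard library robots.txt parser; the host and URL list are hypothetical placeholders for your own important pages and resources:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://dev.example.com/robots.txt")  # hypothetical host
parser.read()

important_urls = [
    "https://dev.example.com/",
    "https://dev.example.com/category/product-page",
    "https://dev.example.com/assets/main.js",  # blocked resources can hurt rendering
]

for url in important_urls:
    if not parser.can_fetch("Googlebot", url):
        print("BLOCKED for Googlebot:", url)
```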

21. Make all important content available for search engine crawlers and rendering services

● Google now renders almost every single page and often ignores the HTML source document completely.
● Make sure that all important content is available in the rendered versions of the pages of your new website, without further user actions required.
● Some pages might be driving traffic for long tail search queries thanks to content that’s not very prominent on the page. Make sure this kind of content is not hidden from search engines on the new website (see the comparison sketch below).
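A simple way to compare both versions of a page is sketched below: it checks whether a phrase you expect to earn long tail traffic is present in the raw HTML source and in the rendered DOM. It assumes the third-party requests and playwright libraries (playwright also needs a browser installed via `playwright install chromium`); the URL and phrase are hypothetical:

```python
import requests
from playwright.sync_api import sync_playwright

URL = "https://dev.example.com/some-page"        # hypothetical page
PHRASE = "phrase that earns long tail traffic"   # content you expect on the page

# 1) the raw HTML source, as a non-rendering crawler would see it
source_html = requests.get(URL, timeout=10).text

# 2) the rendered DOM, roughly what a rendering service would see
with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL)
    rendered_html = page.content()
    browser.close()

print("in source HTML:  ", PHRASE in source_html)
print("in rendered DOM: ", PHRASE in rendered_html)
```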

22. Make all important internal links crawlable

● Important internal links can be links that have keywords in their anchor texts that drive traffic to the linked page.
● Links that point from many pages to one page can also be important for the rankings of the linked page (e.g. main menu links, footer navigation links).
● If a page only has a few internal links pointing to it, these links are also very important for the page.
● Make sure that these links are technically crawlable and that there aren’t any directives preventing them from being crawled.

23. Exclude all pages that are not supposed to be indexed from indexing

● Pages with thin content or duplicate URLs should not be indexed.
● Pages that generate very few impressions or clicks can also be excluded from indexing.
● Fewer pages with a higher average quality often generate more traffic than lots of pages with lower quality.
● You can use “noindex”, canonical tags and other directives to prevent pages from being indexed.

24. Change all internal links to new targets

● When you launch your new website, make sure your internal links no longer point to old targets.
● Links in content blocks are often forgotten.
● It is not enough to redirect all URLs, as internal links to redirects are a bad quality signal for search engines and they slow down the processing of the new website content and structure.

25. Keep all important internal links to important pages

● Just as you have to make sure that all important internal links are still crawlable after the migration (see number 22), make sure that they are available in the first place.
● A redesign of the main menu, deletion of the breadcrumb navigation or changes to the footer menu might cause pages to lose important internal links.
● Check for pages that lose lots of internal links or that are demoted in the internal linking hierarchy.

26. Keep all important keywords in internal link anchor texts

● A keyword that is driving traffic to a page and that is also included in internal links pointing to this page should not be removed from the link texts.
● A change of a label in the main navigation can cause a serious traffic loss for the linked page.
● Removing single contextual links from strong pages that contain an important keyword in the anchor text can also harm the linked page’s rankings.

27. Keep your breadcrumb navigation

● A breadcrumb navigation is an important signal that helps users and search engines better understand the structure of a page.
● Its links also pass relevance from pages to their parent pages, and they often contain descriptive link texts.
● Removing a breadcrumb navigation without replacement can cause serious damage to your organic traffic.

28. Don’t demote important pages in the navigation hierarchy

● The level of a page in the navigation hierarchy can play a role for its rankings and organic traffic.
● The further a page is from the home page, the less relevant it appears to search engine crawlers and the harder it is to reach for users.
● Demoting a page that drives organic search traffic can cause a loss of this traffic.

29. Keep your canonical tag implementation

● If your current website relies on a canonical tag implementation to tackle duplicate URLs and this problem is not solved on the new website, make sure to keep your canonical tag implementation.
● Your new website might need canonical tags on pages that didn’t need them or that didn’t exist on your old website.
● A perfect website doesn’t need canonical tags, and it is always better to fix the cause of a problem than to fight the symptoms. Still, canonical tags remain an important tool for tackling duplicate content problems.

30. Implement hreflang correctly, if needed

● If your new website has more than one country or language version, you need an hreflang implementation.
● A correct hreflang implementation will help search engine crawlers better interpret the international and multilingual structure of your website.
● Efficient crawling and processing of new information is crucial after a website migration, and hreflang is one signal that helps search engines achieve this. A sketch for checking return links follows below.
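The most common hreflang mistake is a missing return link: every alternate URL must reference the page that references it. A crude reciprocity check is sketched below; it assumes the third-party requests library, a hypothetical start URL, and markup where hreflang appears before href inside the link tag (adjust the regex for your markup):

```python
import re
import requests

def hreflang_map(url):
    """Extract hreflang alternates with a crude regex (sketch, not a full parser)."""
    html = requests.get(url, timeout=10).text
    pattern = (r'<link[^>]+rel=["\']alternate["\'][^>]+'
               r'hreflang=["\']([^"\']+)["\'][^>]+href=["\']([^"\']+)["\']')
    return dict(re.findall(pattern, html, re.I))

start = "https://dev.example.com/en/page"  # hypothetical start page
for lang, alternate_url in hreflang_map(start).items():
    # every alternate must link back to the page that references it
    if start not in hreflang_map(alternate_url).values():
        print(f"missing return link on {alternate_url} ({lang})")
```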

31. Create correct and well-structured XML sitemaps

● XML sitemaps are only directly important for SEO on very big websites, but they are always handy for SEO quality control.
● If you separate your XML sitemaps by page types or topics, you can easily detect how well different areas of the website are indexed.
● Use a sitemap index that links to your different sitemaps.
● Make sure XML sitemaps only include URLs that are supposed to be indexed: URLs that are not blocked by robots.txt, not set to “noindex” and not carrying canonical tags pointing to other pages. A validation sketch follows below.
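A validation sketch for the last point, assuming the third-party requests library and a hypothetical sitemap URL; the noindex test is a crude substring check that should be verified manually:

```python
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap_xml = requests.get("https://dev.example.com/sitemap.xml", timeout=10).text
urls = [loc.text for loc in ET.fromstring(sitemap_xml).findall(".//sm:loc", NS)]

for url in urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code != 200:
        print(f"{response.status_code}: {url}")    # redirects and errors don't belong here
    elif "noindex" in response.text.lower():
        print(f"possible noindex: {url}")          # crude check, inspect manually
```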

32. Handle pagination correctly on the new website

● If you have paginated content on your new website, make sure you use rel=”next” and rel=”prev” to help search engine robots interpret the pagination.
● You can also provide a “show all” page and use canonical tags pointing from all paginated pages to the main page.
● Another alternative can be to set paginated pages to “noindex”. You should avoid “nofollow” though, because that way pages that are linked only from paginated pages might never be crawled.

33. Use structured data on the new website, wherever possible

● Structured data should be used wherever possible to make the content of your pages more machine-readable.
● If you’ve been using structured data on your old website, make sure you don’t remove any of it.
● This can happen unknowingly if your old CMS included structured data out of the box and your new one doesn’t. A comparison sketch follows below.
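One way to catch such silent losses is to compare the structured data types of an old page with its new counterpart, as in the sketch below; the URLs are hypothetical and requests is a third-party library:

```python
import json
import re
import requests

def jsonld_types(url):
    """Collect the @type values of all JSON-LD blocks on a page."""
    html = requests.get(url, timeout=10).text
    blocks = re.findall(
        r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
        html, re.I | re.S)
    types = set()
    for block in blocks:
        try:
            data = json.loads(block)
        except ValueError:
            continue  # skip malformed JSON-LD
        for item in (data if isinstance(data, list) else [data]):
            t = item.get("@type", "unknown")
            types.update(t if isinstance(t, list) else [t])
    return types

old_types = jsonld_types("https://www.example.com/product")  # hypothetical old URL
new_types = jsonld_types("https://dev.example.com/product")  # hypothetical new URL
print("structured data types lost:", old_types - new_types)
```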

34. Use https

● There is no excuse for not using https.
● A website migration is a good opportunity for switching to https.
● If you are changing most of your URLs anyhow, a switch to https can’t do much additional harm.
● If you are not changing your URLs, but lots of other elements on the website, it might be a good idea to do the move to https a few weeks earlier or later to minimise the risk of confusing search engine bots too much.

35. Keep your image alt tags

● If you’ve been using optimised image alt tags on your old website, make sure to keep them on your new website.
● Image alt tags can help generate additional traffic from image search.
● They are also considered part of the content of the page and should therefore not be ignored.
● Image alt tags are not just important for SEO, but also for the accessibility of your page. Users that can’t see your content or that are using devices without screens have to rely on image alt tags for interpreting images.

36. Check your HTTP headers for SEO directives

● Canonical tags, hreflang annotations, “noindex” and other directives can be included in HTTP headers.
● Directives in HTTP headers can easily be overlooked, so make sure you check the HTTP headers of your new website before the launch (a quick sketch follows below).
● Screaming Frog automatically extracts all important information from HTTP headers.
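If you want a second pair of eyes next to your crawler, a few lines of Python can dump the relevant headers; the URLs are hypothetical samples and requests is a third-party library:

```python
import requests

# in practice, feed in the full URL list from your crawl
for url in ["https://dev.example.com/", "https://dev.example.com/brochure.pdf"]:
    headers = requests.head(url, timeout=10).headers
    for name in ("X-Robots-Tag", "Link"):  # Link can carry canonical and hreflang
        if name in headers:
            print(f"{url}  {name}: {headers[name]}")
```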

37. Implement your 404 page correctly

● A 404 page should load on the URL that has been requested (and not trigger a redirect).
● 404 error pages have to return a 404 status code, not 200 or any other status code.
● For tracking purposes, it makes sense to include “404”, “page not found” or something similar in the title tag of the error page.
● Your 404 page will probably have lots of visitors after the website migration, so make sure it’s extra user-friendly. A quick check is sketched below.
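A minimal sketch of such a check, assuming the third-party requests library and a hypothetical host; it requests a URL that should not exist and verifies both the status code and the trackable error marker:

```python
import requests

# request a URL that almost certainly doesn't exist
response = requests.get("https://dev.example.com/this-should-not-exist-42",
                        allow_redirects=False, timeout=10)

assert response.status_code == 404, f"expected 404, got {response.status_code}"
assert "not found" in response.text.lower() or "404" in response.text, \
    "error page should identify itself for tracking"
print("404 handling looks correct")
```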

38. Check the page speed metrics of your new website

● You can use Chrome’s new Audits Panel on protected environments.
● It’s even possible (but a bit tricky) to use Google’s PageSpeed Insights on protected environments.
● Make sure your new website is super fast. This should be a top priority.

39. Check the mobile version of your new website

● Crawl your website with a tool that behaves like Google’s mobile bot.
● Make sure all elements that are relevant for SEO (title tags, meta descriptions, canonical tags, hreflang, rel=”prev” & rel=”next”, “noindex”, internal links, content, etc.) are available on the mobile version.
● Be aware of all differences between your desktop and mobile version and evaluate their possible SEO impact.
● This is especially important as we are facing a mobile-first index in the near future.

40. Check the rendered version of your new website

● As mentioned before, Google renders most pages and often relies on the rendered version instead of the HTML source document.
● This means that you have to crawl your new website with a tool that renders JavaScript and extracts all information from the rendered versions of the pages.
● JavaScript crawling takes up a lot of time and resources.

41. Implement a custom page for 5xx errors

● Server errors are very likely to occur after a website migration.
● For UX and tracking reasons, it makes sense to set up a custom page for server errors (with 5xx status codes).

42. Return a 503 status code in case the entire website is down after the relaunch

● If the entire website happens to be down after the website migration, it is best to return a 503 status code.
● This status code means that the service is temporarily unavailable.
● Search engine crawlers are more likely to interrupt the crawl and try again later if they encounter this status code.
● A different status code might cause them to keep trying and waste resources. A minimal sketch follows below.
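In practice this is usually configured at the web server or load balancer level, but the principle fits in a few lines. A hypothetical sketch with Flask (a third-party library) that answers every path with a 503 and a Retry-After hint:

```python
from flask import Flask

app = Flask(__name__)

@app.route("/", defaults={"path": ""})
@app.route("/<path:path>")
def maintenance(path):
    # 503 signals a temporary outage; Retry-After suggests (in seconds)
    # when crawlers should come back
    return "We'll be right back.", 503, {"Retry-After": "3600"}

if __name__ == "__main__":
    app.run()
```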

In case of a domain switch

43. Use the available webmaster tools features for domain switches

● Google, Bing and Yandex all have “domain switch” tools with different names.
● If you are switching domains, make sure you set up your webmaster tools as soon as possible and use the domain switch tools that are available.

44. Update your backlinks to the new domain

● Update all links to your website that you have direct control over to your new domain.
● You can get in contact with website owners that link to your old domain and politely ask them to change the link target.
● This is also a good way of letting more people know that you have a brand new website.

45. Don’t cancel your old domain and SSL certificate

● Your redirects should work for a long time (probably for ever).
● Make sure you keep your old domain, or your redirects will disappear.
● If you were using https on your old domain, you also have to keep your SSL certificate, otherwise redirects from https URLs will stop working.

After the relaunch

46. Check your new robots.txt file

● As soon as your new website goes live, check your new robots.txt file.
● A classic error is leaving the robots.txt file blocking everything, because it was (intentionally) set up like that on the DEV environment.

47. Monitor the number of indexed pages

● If you’re changing lots of URLs, it would be natural to see the number of indexed URLs rising and then falling again.
● Indexing of new URLs normally happens sooner than removal or replacement of old ones.
● If you see a different pattern, e.g. a sudden decline in indexed pages, you should have a closer look at the situation.

48. Monitor and fix crawl errors in Google Search Console

● Website migrations without a spike in crawl errors are very rare.
● Mark all crawl errors as fixed just before the website migration.
● After the migration, check them regularly and fix them by redirecting the faulty URLs to the correct target.

How to deal with crawl errors in Google Search Console:
https://www.rebelytics.com/crawl-errors-google-search-console/

49. Monitor and fix errors experienced by users

● You can easily track 404 errors in Google Analytics without any further tracking configuration if your 404 error page is set up correctly (see no. 37).
● If you have a trackable 5xx error page, you can also monitor these errors in Google Analytics.
● Errors experienced by users are even more important than those encountered by bots and should be fixed first.

50. Monitor the most important SEO KPIs

● Keep an eye on your most important SEO KPIs, such as revenue from organic traffic, session quality, visibility, etc.
● Changes are inevitable after a relaunch, but you should analyse them and understand their causes.
● Negative trends should be analysed and corrected with high priority.

51. Monitor the number of SEO landing pages

● The number of pages that drive organic search traffic is a metric that is usually very stable.
● It doesn’t depend on seasonal changes or other external factors as much as traffic, revenue or rankings.
● A drop in the number of SEO landing pages is normally a symptom of technical (indexing) problems.
● Go to the landing pages report in Google Analytics, apply an organic traffic segment and scroll down to the number of rows in the report.

52. Monitor the search analytics reports in Google Search Console

● Keep an eye on overall impressions, clicks, average position and click-through rates in Google Search Console’s search analytics reports.
● Also monitor these metrics for your most important landing pages and keywords.

53. Crawl your old URLs to verify that your redirects are working

● Directly after the migration, crawl all of your old URLs and make sure they redirect to the right targets.
● This way you can detect errors and fix them before users or search engine bots find them. A bulk-check sketch follows below.
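A bulk check along those lines is sketched below; it assumes a hypothetical text file (old-urls.txt) with one old URL per line, e.g. exported from your pre-migration crawl or server logs, and the third-party requests library:

```python
import requests

with open("old-urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    response = requests.get(url, timeout=10)  # follows redirects by default
    hops = len(response.history)              # each redirect hop is recorded here
    if response.status_code != 200:
        print(f"BROKEN ({response.status_code}): {url}")
    elif hops > 1:
        print(f"CHAIN ({hops} hops): {url} -> {response.url}")
```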

THE END