Team Social was at the Denver Digital Summit late last month. It was two days filled with a tremendous amount of knowledge from thought leaders and digital marketing gurus. The summit featured speakers from different verticals and industries, many of whom shared practical, actionable information that is undoubtedly useful for anyone with even a passing interest in the digital domain.
Being an SEO practitioner, one session that I really enjoyed was John Triplett’s session on website migration. Almost everyone who manages a website has either started, completed, or considered a website migration. Common reasons for initiating a website migration include:
- Re-branding or redesign
- Moving from http to https
- Site architecture changes
- Adopting a new CMS
- Incorporating multiple languages and domains
As with every project, a little bit of planning can prevent the migration from spiraling out of control. In an ideal scenario, at the end of your migration you will have maintained your website traffic and keyword rankings. The following is a broad overview of a website migration project as laid out by John Triplett in his session.
Measure current website analytics – At the outset, use your website analytics software to observe your total and organic traffic, and conversions (if any).
Benchmark keyword rankings – Make a note of the position of your top landing pages, and the keywords you are ranking for.
Measure current site speed – You can use Google’s PageSpeed Insights tool to measure your site’s speed and get suggestions to improve it.
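If you want to benchmark speed for many pages at once, PageSpeed Insights also has a public v5 API you can script against. The sketch below only builds the request URL; the helper name and the example URL are my own, and actually fetching the result (and an API key for higher quotas) is left out.

```python
from urllib.parse import urlencode

# Google's public PageSpeed Insights v5 endpoint.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights API request URL (no request is made here)."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return PSI_ENDPOINT + "?" + urlencode(params)
```

You could then loop over your top landing pages, request each URL with `urllib.request`, and record the scores as your pre-migration baseline.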
Plan a 301 Redirect Strategy – Permanent (301) redirects are an important aspect of a website migration project. Redirects help transfer the SEO value of an old web page to a new one. They send signals to search engines of the association between the new and the old pages, in addition to bringing your website visitor to the appropriate place on the new website.
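A simple way to plan this is to maintain an explicit mapping of old paths to new ones, then sanity-check it before go-live. The sketch below (my own illustration, with made-up paths) resolves a path through the map and flags redirect chains, where an old URL points at a target that is itself redirected; chains add latency and can dilute the SEO value a single 301 would pass.

```python
# Hypothetical redirect map: old path -> new path (illustrative only).
REDIRECT_MAP = {
    "/old-about.html": "/about/",
    "/products.php": "/shop/",
    "/shop/": "/store/",   # note: this makes /products.php a two-hop chain
}

def resolve(path, redirect_map, max_hops=5):
    """Follow the redirect map and return (final_path, number_of_hops)."""
    hops = 0
    while path in redirect_map and hops < max_hops:
        path = redirect_map[path]
        hops += 1
    return path, hops

def find_chains(redirect_map):
    """Return old paths whose target is itself redirected (redirect chains)."""
    return [old for old, new in redirect_map.items() if new in redirect_map]
```

Collapsing each chain so every old URL 301s directly to its final destination is a common pre-launch cleanup step.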
Prepare and implement XML sitemaps and robots.txt –
- XML sitemaps help search engines find the essential pages on your website easily.
- robots.txt is a text file that lives at the root of your website and tells search engine bots which sections of the site they may or may not crawl.
- You can submit your XML sitemap in Google Search Console; robots.txt simply needs to be reachable at your site’s root (e.g., example.com/robots.txt).
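Both files are plain enough to generate and verify with a short script. The sketch below, using only the Python standard library, builds a minimal sitemap for a list of URLs and checks a robots.txt body against a URL with `urllib.robotparser`; the function names and example URLs are my own.

```python
import xml.etree.ElementTree as ET
from urllib import robotparser

def build_sitemap(urls):
    """Build a minimal XML sitemap string for the given page URLs."""
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        # Each page gets a <url><loc>…</loc></url> entry.
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

def is_crawl_allowed(robots_txt, url, agent="*"):
    """Check a robots.txt body (passed as text) to see whether a URL may be crawled."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch(agent, url)
```

Running `is_crawl_allowed` against your new robots.txt before launch is a cheap way to catch the classic migration mistake of carrying a staging-site `Disallow: /` rule into production.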
Serve custom 404s – It is possible that some of your legacy pages were not redirected for one reason or another. It helps to have a custom 404 page built to direct potential traffic to the appropriate section of your website.
Conduct an audit – Measure website analytics post-migration. Observe any changes in the website traffic and conversions.
Do a manual search – See how your website (and top pages) appear in search engine results. Find out if your top pages are being cached.
Crawl the website – Use a crawler like Screaming Frog to find out any broken links. Ensure that 301 redirects are in place.
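A dedicated crawler like Screaming Frog is the practical choice here, but the core idea is small enough to sketch with the Python standard library: extract the links from each page, then request each one and flag anything that doesn’t come back as a 200 or a 301. The link-extraction half is shown below (my own illustration); the status-checking loop over live URLs is omitted.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all anchor hrefs found in an HTML document."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

Feeding each extracted link to an HTTP client and recording the response codes gives you a basic broken-link report, and confirms your 301s are actually firing.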
I believe one of the reasons John Triplett’s session resonated with me is that our College of Veterinary Medicine and Biomedical Sciences recently underwent a website migration of its own.
John mentioned in his session that migration presents an excellent opportunity for growth, and rightly so. You can use this time to focus on optimizing your meta titles, descriptions, and heading tags for key website pages. It is also a great time to find content that is redundant or outdated, fix broken links to and from your website, and build on what you have created from an SEO perspective.