The Website Migration Guide: SEO Strategy, Process, & Checklist

The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

What is a site migration?

A site migration is a term broadly used by SEO professionals to describe any event whereby a website undergoes substantial changes in areas that can significantly affect search engine visibility — typically changes to the site’s location, platform, structure, content, design, or UX.

Google’s documentation on site migrations doesn’t cover them in great depth and downplays the fact that they often result in significant traffic and revenue loss, which can last from a few weeks to several months — depending on the extent to which search engine ranking signals have been affected, as well as how long it may take the affected business to roll out a successful recovery plan.

Quick access links

Site migration checklist (8-page PDF)

Site migration examples

Site migration types

Common site migration pitfalls

Site migration process

1. Scope & planning

2. Pre-launch preparation

3. Pre-launch testing

4. Launch day actions

5. Post-launch testing

6. Performance review

Appendix: Useful tools

Site migration examples

The following section discusses how both successful and unsuccessful site migrations look, and explains why it is 100% possible to come out of a site migration without suffering significant losses.

Debunking the “expected traffic drop” myth

Anyone who has been involved with a site migration has probably heard the widespread theory that it will result in de facto traffic and revenue loss. Even though this assertion holds some truth for some very specific cases (i.e. moving from an established domain to a brand new one), it shouldn’t be treated as gospel. It is entirely possible to migrate without losing any traffic or revenue; you can even enjoy significant growth right after launching a revamped website. However, this can only be achieved if every single step has been well planned and executed.

Examples of unsuccessful site migrations

The following graph illustrates a big UK retailer’s botched site migration, where the website lost 35% of its visibility within weeks of switching from HTTP to HTTPS. It took them about six months to fully recover, which must have had a significant impact on revenue from organic search. This is a typical example of a poor site migration, possibly caused by poor planning or implementation.

Example of a poor site migration — recovery took 6 months!

But recovery may not always be possible. The visibility graph below is from another big UK retailer, where the HTTP to HTTPS switchover resulted in a permanent 20% visibility loss.

Another example of a poor site migration — no signs of recovery 6 months on!

In fact, it’s entirely possible to migrate from HTTP to HTTPS without losing so much traffic, and for such a long period of time, aside from the first few weeks where there is high volatility as Google discovers the new URLs and updates search results.

Examples of successful site migrations

What does a successful site migration look like? This largely depends on the site migration type, the objectives, and the KPIs (more details later). But in most cases, a successful site migration shows at least one of the following characteristics:

• Minimal visibility loss during the first few weeks (short-term goal)
• Visibility growth thereafter — depending on the type of migration (long-term goal)

The following visibility report is taken from an HTTP to HTTPS site migration, which was also accompanied by significant improvements to the site’s page loading times.

The following visibility report is from a complete site overhaul, which I was fortunate to be involved with several months in advance and supported during the strategy, planning, and testing phases, all of which were equally important.

As typically happens on site migration projects, the launch date had to be pushed back a few times due to the risks of launching the new site prematurely and before major technical obstacles were fully addressed. But as you can see on the visibility graph below, the wait was well worth it. Organic visibility not only didn’t drop (as most would normally expect) but in fact started growing from the first week.

Visibility growth one month after the migration reached 60%, while organic traffic growth two months post-launch exceeded 80%.

Example of a very successful site migration — instant growth following the new site launch!

This was a rather complex migration as the new website was re-designed and built from scratch on a new platform with an improved site taxonomy that included new landing pages, an updated URL structure, lots of redirects to preserve link equity, plus a switchover from HTTP to HTTPS.

In general, introducing too many changes at the same time can be tricky because if something goes wrong, you’ll struggle to figure out what exactly is at fault. But at the same time, leaving major changes for a later time isn’t ideal either as it would require more resources. If you know what you’re doing, making multiple positive changes at once can be very cost-effective.

Before getting into the nitty-gritty of how you can turn a complex site migration project into a success, it’s important to run through the main site migration types as well as explain the main reasons so many site migrations fail.

    Site migration types

There are many site migration types. It all depends on the nature of the changes that take place.

Google’s documentation mostly covers migrations with site location changes, which are categorized as follows:

• Site moves with URL changes
• Site moves without URL changes

    Site move migrations

These typically occur when a site moves to a different URL due to any of the below:

Protocol change

A classic example is when migrating from HTTP to HTTPS.

Subdomain or subfolder change

Very common in international SEO, where a business decides to move one or more ccTLDs into subdomains or subfolders. Another common example is when a mobile site that sits on a separate subdomain or subfolder becomes responsive and both desktop and mobile URLs are unified.

    Domain name change

Commonly occurs when a business is rebranding and must move from one domain to another.

Top-level domain change

This is common when a business decides to launch international websites and needs to move from a ccTLD (country code top-level domain) to a gTLD (generic top-level domain) or vice versa, e.g. moving from .co.uk to .com, or moving from .com to .co.uk, and so on.

Site structure changes

These are changes to the site architecture that usually affect the site’s internal linking and URL structure.

Other types of migrations

There are other types of migration which are triggered by changes to the site’s content, structure, design, or platform.

    Replatforming

This is the case when a website is moved from one platform/CMS to another, e.g. migrating from WordPress to Magento or just upgrading to the latest platform version. Replatforming can, in some cases, also result in design and URL changes because of technical limitations that often occur when changing platforms. This is why replatforming migrations rarely result in a website that looks exactly the same as the previous one.

    Content migrations

Major content changes such as content rewrites, content consolidation, or content pruning can have a big impact on a site’s organic search visibility, depending on the scale. These changes can often affect the site’s taxonomy, navigation, and internal linking.

Mobile setup changes

With so many options available for a site’s mobile setup, enabling app indexing, building an AMP site, or building a PWA website can also be considered partial site migrations, especially when an existing mobile site is being replaced by an app, AMP, or PWA.

Structural changes

These are often caused by major changes to the site’s taxonomy that impact the site’s navigation, internal linking, and user journeys.

    Site redesigns

These can vary from major design changes in the look and feel to a complete website revamp that may also include significant media, code, and copy changes.

    Hybrid migrations

In addition to the above, there are several hybrid migration types that can be combined in practically any way possible. The more changes that get introduced at the same time, the higher the complexity and the risks. Even though making too many changes at the same time increases the risk of something going wrong, it can be more cost-effective from a resources perspective if the migration is very well planned and executed.

    Common site migration pitfalls

Even though every site migration is different, there are a few common themes behind the most typical site migration failures, with the biggest being the following:

Poor strategy

Some site migrations are doomed to failure way before the new site is launched. A strategy that is built upon unclear and unrealistic objectives is much less likely to bring success.

Establishing measurable objectives is essential in order to measure the impact of the migration post-launch. For most site migrations, the primary objective should be the retention of the site’s current traffic and revenue levels. In certain cases the bar could be raised higher, but in general anticipating or forecasting growth should be a secondary objective. This will help avoid creating unrealistic expectations.

Poor planning

Coming up with a detailed project plan as early as possible will help avoid delays along the way. Factor in extra time and resources to cope with any unforeseen circumstances that may arise. No matter how well thought out and detailed your plan is, it’s highly unlikely everything will go as expected. Be flexible with your plan and accept the fact that there will almost certainly be delays. Map out all dependencies and make all stakeholders aware of them.

Avoid planning to launch the site near your seasonal peaks, because if anything goes wrong you won’t have enough time to rectify the issues. For instance, retailers should avoid launching a site close to September/October to avoid putting the busy pre-Christmas period at risk. In this case, it would be much wiser to launch during the quieter summer months.

    Lack of resources

Before committing to a site migration project, estimate the time and effort required to make it a success. If your budget is limited, make a call as to whether it is worth going ahead with a migration that is likely to fail to meet its established objectives and cause revenue loss.

As a rule of thumb, try to include a buffer of at least 20% more resource than you initially think the project will require. This additional buffer will later allow you to quickly address any issues as soon as they arise, without jeopardizing success. If your resources are too tight, or you start cutting corners at this early stage, the site migration will be at risk.

Lack of SEO/UX consultation

When changes are taking place on a website, every single decision needs to be weighed from both a UX and an SEO standpoint. For instance, removing large amounts of content or links to improve UX may damage the site’s ability to target business-critical keywords, or result in crawling and indexing issues. In either case, such changes could damage the site’s organic search visibility. On the other hand, having too much text copy and too few images may have a negative impact on user engagement and damage the site’s conversions.

To avoid risks, appoint experienced SEO and UX consultants so they can discuss the potential consequences of every single change with key business stakeholders who understand the business intricacies better than anyone else. The pros and cons of each option need to be weighed before making any decision.

    Late involvement

Site migrations can span several months and require great planning and enough time for testing. Seeking professional support late is very risky, because crucial steps may have been missed.

    Lack of testing

In addition to a great strategy and thoughtful plan, dedicate some time and effort to thorough testing before launching the site. It’s much better to delay the launch if testing has identified critical issues than to rush a sketchy implementation into production. It goes without saying that you should not launch a website if it hasn’t been tested by both expert SEO and UX teams.

Attention to detail is also very important. Make sure that the developers are fully aware of the risks associated with poor implementation. Educating the developers about the direct impact of their work on a site’s traffic (and therefore revenue) can make a big difference.

Slow response to bug fixing

There will always be bugs to fix once the new site goes live. However, some bugs are more critical than others and may need immediate attention. For instance, launching a new site only to find that search engine spiders have trouble crawling and indexing the site’s content would require an immediate fix. A slow response to major technical obstacles can sometimes be catastrophic and take a long time to recover from.

    Underestimating scale

Business stakeholders often do not anticipate site migrations to be so time-consuming and resource-heavy. It’s not uncommon for senior stakeholders to demand that the new site launch on the planned-for day, regardless of whether it’s 100% ready or not. The motto “let’s launch ASAP and fix later” is a classic mistake. What most stakeholders are unaware of is that it can take just a few days for organic search visibility to tank, but recovery can take several months.

It is the responsibility of the consultant and project manager to educate clients, run them through all the different phases and scenarios, and explain what each one entails. Business stakeholders are then able to make more informed decisions, and their expectations should be easier to manage.

    Site migration process

The site migration process can be split into six main essential phases. They are all equally important, and skipping any of the below tasks could hinder the migration’s success to varying extents.

Phase 1: Scope & planning

Work out the project scope

Regardless of the reasons behind a site migration project, you need to be crystal clear about the objectives right from the beginning, because these will help set and manage expectations. Moving a site from HTTP to HTTPS is very different from going through a complete site overhaul, hence the two should have different objectives. In the first instance, the objective should be to retain the site’s traffic levels, whereas in the second you could potentially aim for growth.

A site migration is a great opportunity to address legacy issues. Including as many of these as possible in the project scope should be very cost-effective, because addressing these issues post-launch would require significantly more resources.

However, in every case, identify the most critical aspects for the project to be successful. Identify all the risks that could have a negative impact on the site’s visibility and consider which precautions to take. Ideally, prepare a few forecasting scenarios based on the different risks and growth opportunities. It goes without saying that the forecasting scenarios should be prepared by experienced site migration consultants.

Including as many stakeholders as possible at this early stage will help you acquire a deeper understanding of the biggest challenges and opportunities across divisions. Ask for feedback from your content, SEO, UX, and Analytics teams and put together a list of the biggest issues and opportunities. You then need to work out what the potential ROI of addressing each one of these would be. Finally, choose one of the available options based on your objectives and available resources, which will form your site migration strategy.

You should now be left with a prioritized list of activities that are expected to have a positive ROI if implemented. These should then be communicated and discussed with all stakeholders, so you set realistic targets, agree on the project scope, and set the right expectations from the outset.

Prepare the project plan

Planning is equally important, because site migrations can often be very complex projects that can easily span several months. During the planning phase, each task needs an owner (i.e. SEO consultant, UX consultant, content editor, web developer) and an expected delivery date. Any dependencies should be identified and included in the project plan so everyone is aware of any activities that cannot be fulfilled due to being dependent on others. For instance, the redirects cannot be tested unless the redirect mapping has been completed and the redirects have been implemented on staging.

The project plan should be shared with everyone involved as early as possible so there is enough time for discussions and clarifications. Each activity needs to be described in great detail, so that stakeholders are aware of what each task would entail. It goes without saying that flawless project management is necessary in order to organize and carry out the required activities according to the schedule.

A crucial part of the project plan is getting the anticipated launch date right. Ideally, the new site should be launched during a time when traffic is low. Again, avoid launching ahead of or during a peak period, because the consequences could be devastating if things don’t go as expected. One thing to bear in mind is that site migrations never go entirely to plan, so a certain degree of flexibility will be required.

Phase 2: Pre-launch preparation

This includes any activities that need to be carried out while the new site is still under development. By this point, the new site’s SEO requirements should have been captured already. You should be liaising with the designers and information architects, providing feedback on prototypes and wireframes well before the new site becomes available on a staging environment.

Wireframes review

Review the new site’s prototypes or wireframes before development commences. Reviewing the new site’s main templates can help identify both SEO and UX issues at an early stage. For example, you may find that large portions of content have been removed from the category pages, which should be instantly flagged. Or you may discover that some high traffic-driving pages no longer appear in the main navigation. Any radical changes in the design or copy of the pages should be thoroughly reviewed for potential SEO issues.

Preparing the technical SEO specifications

Once the prototypes and wireframes have been reviewed, prepare a detailed technical SEO specification. The objective of this vital document is to capture all the essential SEO requirements developers need to be aware of before working out the project’s scope in terms of work and costs. It’s during this stage that budgets are signed off on; if the SEO requirements aren’t included, it may be impossible to include them later down the line.

The technical SEO specification needs to be very detailed, yet written in such a way that developers can easily turn the requirements into actions. This isn’t a document to explain why something needs to be implemented, but how it should be implemented.

Make sure to include specific requirements that cover at least the following areas:

• URL structure
• Meta data (including dynamically generated default values)
• Structured data
• Canonicals and meta robots directives
• Copy & headings
• Main & secondary navigation
• Internal linking (in any form)
• Pagination
• XML sitemap(s)
• HTML sitemap
• Hreflang (if there are international sites)
• Mobile setup (including the app, AMP, or PWA site)
• Redirects
• Custom 404 page
• JavaScript, CSS, and image files
• Page loading times (for desktop & mobile)

The specification should also cover the areas of CMS functionality that allow users to:

• Specify custom URLs and override default ones
• Update page titles
• Update meta descriptions
• Update any h1–h6 headings
• Add or amend the default canonical tag
• Set the meta robots attributes to index/noindex/follow/nofollow
• Add or edit the alt text of each image
• Include Open Graph fields for description, URL, image, type, sitename
• Include Twitter Open Graph fields for card, URL, title, description, image
• Bulk upload or amend redirects
• Update the robots.txt file

It is also important to make sure that when a particular attribute is updated (e.g. an h1), other elements aren’t affected (i.e. the page title or any navigation menus).

    Identifying priority pages

One of the biggest challenges with site migrations is that success will largely depend on the quantity and quality of pages that have been migrated. Therefore, it’s very important to make sure that you focus on the pages that really matter. These are the pages that have been driving traffic to the legacy site, pages that have accrued links, pages that convert well, etc.

In order to do this, you need to:

• Crawl the legacy site
• Identify all indexable pages
• Identify top performing pages

How to crawl the legacy site

Crawl the old site so that you have a copy of all URLs, page titles, meta data, headers, redirects, broken links, etc. Regardless of your crawler application of choice (see Appendix), make sure the crawl isn’t too restrictive. Pay close attention to the crawler’s settings before crawling the legacy site, and consider whether you should (a minimal crawler sketch follows this list):

• Ignore robots.txt (in case any vital parts are accidentally blocked)
• Follow internal “nofollow” links (so the crawler reaches more pages)
• Crawl all subdomains (depending on scope)
• Crawl outside the start folder (depending on scope)
• Change the user agent to Googlebot (desktop)
• Change the user agent to Googlebot (smartphone)
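
For illustration, here is a minimal crawler sketch in Python using the requests and beautifulsoup4 packages (the start URL is a placeholder, and a dedicated crawler remains the right tool for a full audit). It records the kind of data you’d want in the crawl file: URL, status, title, and canonical.

# Minimal legacy-site crawler: breadth-first crawl recording URL,
# status code, page title, and canonical tag for internal pages.
import csv
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder legacy site
HOST = urlparse(START_URL).netloc
MAX_PAGES = 500  # safety cap for the sketch

queue, seen, rows = deque([START_URL]), {START_URL}, []

while queue and len(rows) < MAX_PAGES:
    url = queue.popleft()
    resp = requests.get(url, timeout=10, allow_redirects=False)
    row = {"url": url, "status": resp.status_code, "title": "", "canonical": ""}
    if resp.status_code == 200 and "text/html" in resp.headers.get("Content-Type", ""):
        soup = BeautifulSoup(resp.text, "html.parser")
        if soup.title:
            row["title"] = soup.title.get_text(strip=True)
        link = soup.find("link", rel="canonical")
        if link and link.get("href"):
            row["canonical"] = link["href"]
        for a in soup.find_all("a", href=True):  # queue same-host links only
            target = urljoin(url, a["href"]).split("#")[0]
            if urlparse(target).netloc == HOST and target not in seen:
                seen.add(target)
                queue.append(target)
    rows.append(row)

with open("legacy_crawl.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["url", "status", "title", "canonical"])
    writer.writeheader()
    writer.writerows(rows)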

Pro tip: Keep a copy of the old site’s crawl data (in a file or in the cloud) for several months after the migration has been completed, in case you ever need any of the old site’s data once the new site has gone live.

    How to identify the indexable pages

Once the crawl is complete, work on identifying the legacy site’s indexed pages. These are any HTML pages with the following characteristics (see the filtering sketch after this list):

• Return a 200 server response
• Either do not have a canonical tag or have a self-referring canonical URL
• Do not have a meta robots noindex
• Aren’t excluded from the robots.txt file
• Are internally linked from other pages (non-orphan pages)
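
To make these criteria concrete, here is a small Python sketch that applies them to crawl records like the CSV produced earlier; the field names are assumptions, so adapt them to your crawler’s export format.

# Filter crawl records down to indexable pages using the criteria above.
def is_indexable(page):
    if page.get("status") != 200:
        return False  # must return a 200 server response
    canonical = page.get("canonical", "")
    if canonical and canonical != page["url"]:
        return False  # canonicalized to another URL
    if "noindex" in page.get("meta_robots", ""):
        return False  # meta robots noindex
    if page.get("robots_blocked", False):
        return False  # excluded via robots.txt
    if page.get("inlinks", 0) == 0:
        return False  # orphan page
    return True

pages = [
    {"url": "https://www.example.com/", "status": 200,
     "canonical": "https://www.example.com/", "meta_robots": "index,follow",
     "robots_blocked": False, "inlinks": 120},
    {"url": "https://www.example.com/old/", "status": 301, "inlinks": 3},
]
print([p["url"] for p in pages if is_indexable(p)])  # only the homepage qualifies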

The indexable pages are the only pages that have the potential to drive traffic to the site and therefore need to be prioritized for the purposes of your site migration. These are the pages worth optimizing (if they will exist on the new site) or redirecting (if they won’t exist on the new site).

How to identify the top performing pages

Once you’ve identified all the indexable pages, you may need to carry out more work, especially if the legacy site consists of a large number of pages and optimizing or redirecting all of them is impossible due to time, resource, or technical constraints.

If this is the case, you should identify the legacy site’s top performing pages. This will help with the prioritization of the pages to focus on during the later stages.

It’s recommended to prepare a spreadsheet that includes the fields below (a sketch for combining these exports follows the list):

• Legacy URL (include only the indexable ones from the crawl data)
• Organic visits during the last 12 months (Analytics)
• Revenue, conversions, and conversion rate during the last 12 months (Analytics)
• Pageviews during the last 12 months (Analytics)
• Number of clicks from the last 90 days (Search Console)
• Top linked pages (Majestic SEO/Ahrefs)
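
If each export is kept as a CSV, a few lines of pandas can combine them into this spreadsheet. The file and column names below are assumptions based on typical exports, so rename them to match yours.

# Combine crawl, Analytics, Search Console, and backlink exports into
# one prioritization sheet keyed on the legacy URL.
import pandas as pd

crawl = pd.read_csv("indexable_urls.csv")     # column: url
analytics = pd.read_csv("analytics_12m.csv")  # url, organic_visits, revenue, pageviews
gsc = pd.read_csv("gsc_90d.csv")              # url, clicks
backlinks = pd.read_csv("backlinks.csv")      # url, referring_domains

sheet = (
    crawl.merge(analytics, on="url", how="left")
         .merge(gsc, on="url", how="left")
         .merge(backlinks, on="url", how="left")
         .fillna(0)
)

# Rank by the signals that matter most to the migration
sheet = sheet.sort_values(
    by=["organic_visits", "revenue", "referring_domains"], ascending=False
)
sheet.to_csv("priority_pages.csv", index=False)
print(sheet.head(20))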

With the above information in one place, it’s now much easier to identify your most important pages: the ones that generate organic visits, convert well, contribute to revenue, have a good number of referring domains linking to them, etc. These are the pages that you must focus on for a successful site migration.

The top performing pages should ideally also exist on the new site. If for any reason they don’t, they should be redirected to the most relevant page so that users requesting them don’t land on 404 pages and the link equity they previously had remains on the site. If any of these pages cease to exist and aren’t properly redirected, your site’s rankings and traffic will be negatively affected.

    Benchmarking

Once the launch of the new website is getting close, you should benchmark the legacy site’s performance. Benchmarking is essential, not only to compare the new site’s performance with the previous one, but also to help diagnose which areas underperform on the new site and to quickly address them.

Keyword rank tracking

If you don’t track the site’s rankings frequently, you should do so just before the new site goes live. Otherwise, you will later struggle to figure out whether the migration has gone smoothly or where exactly things went wrong. Don’t leave this to the last minute in case something goes awry — a week in advance would be the ideal time.

Spend some time working out which keywords are most representative of the site’s organic search visibility and track them across desktop and mobile. Because monitoring thousands of head, mid-, and long-tail keyword combinations is usually unrealistic, the bare minimum you should monitor are keywords that are driving traffic to the site (keywords ranking on page one) and have decent search volume (head/mid-tail focus).

If you do get traffic from both brand and non-brand keywords, you should also decide which type of keywords to focus on more from a tracking point of view. In general, non-brand keywords tend to be more competitive and volatile. For most sites it would make sense to focus mostly on these.

Don’t forget to track rankings across desktop and mobile. This will make it much easier to diagnose issues post-launch should there be performance problems on one device type. If you receive a high volume of traffic from more than one country, consider rank tracking keywords in other markets, too, because visibility and rankings can vary significantly from country to country.

    Site performance

The new site’s page loading times can have a big impact on both traffic and sales. Several studies have shown that the longer a page takes to load, the higher the bounce rate. Unless the old site’s page loading times and site performance scores have been recorded, it will be very difficult to attribute any traffic or revenue loss to site performance related issues once the new site has gone live.

It’s recommended that you review all major page types using Google’s PageSpeed Insights and Lighthouse tools. You could use summary tables like the ones below to benchmark some of the most important performance metrics, which will be useful for comparisons once the new site goes live; a scripted way to collect these numbers follows the tables.

MOBILE

| Page type        | Speed   | FCP  | DCL  | Optimization | Optimization score |
| ---------------- | ------- | ---- | ---- | ------------ | ------------------ |
| Homepage         | Fast    | 0.7s | 1.4s | Good         | 81/100             |
| Category page    | Slow    | 1.8s | 5.1s | Medium       | 78/100             |
| Subcategory page | Average | 0.9s | 2.4s | Medium       | 69/100             |
| Product page     | Slow    | 1.9s | 5.5s | Good         | 83/100             |

DESKTOP

| Page type        | Speed | FCP  | DCL  | Optimization | Optimization score |
| ---------------- | ----- | ---- | ---- | ------------ | ------------------ |
| Homepage         | Good  | 0.7s | 1.4s | Average      | 81/100             |
| Category page    | Fast  | 0.6s | 1.2s | Medium       | 78/100             |
| Subcategory page | Fast  | 0.6s | 1.3s | Medium       | 78/100             |
| Product page     | Good  | 0.8s | 1.3s | Good         | 83/100             |
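
Collecting these figures by hand across templates and devices is tedious, so it can be scripted. Below is a minimal Python sketch against the public PageSpeed Insights API (v5 is assumed here; the page URLs are placeholders and the JSON paths reflect the v5 response shape, so treat the parsing as illustrative).

# Benchmark key templates with the PageSpeed Insights API (v5 assumed).
# Page URLs are hypothetical; JSON paths may need adjusting for other
# API versions.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
PAGES = {
    "Homepage": "https://www.example.com/",
    "Category page": "https://www.example.com/category/",
    "Product page": "https://www.example.com/product/example/",
}

for strategy in ("mobile", "desktop"):
    print(f"\n{strategy.upper()}")
    for name, url in PAGES.items():
        data = requests.get(
            PSI_ENDPOINT, params={"url": url, "strategy": strategy}, timeout=60
        ).json()
        lighthouse = data["lighthouseResult"]
        fcp = lighthouse["audits"]["first-contentful-paint"]["displayValue"]
        score = lighthouse["categories"]["performance"]["score"]
        print(f"{name}: FCP {fcp}, performance score {score * 100:.0f}/100")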

Old site crawl data

A few days before the new site replaces the old one, run a final crawl of the old site. Doing so could later prove invaluable, should there be any optimization issues on the new site. A final crawl will allow you to save vital information about the old site’s page titles, meta descriptions, h1–h6 headings, server status, canonical tags, noindex/nofollow pages, inlinks/outlinks, level, etc. Having all this information available could save you a lot of trouble if, say, the new site isn’t well optimized or suffers from technical misconfiguration issues. Try also to save a copy of the old site’s robots.txt and XML sitemaps in case you need these later.

    Search Console data

Also consider exporting as much of the old site’s Search Console data as possible. This data is only available for 90 days, and chances are that once the new site goes live, the old site’s Search Console data will eventually disappear. Data worth exporting includes:

• Search analytics queries & pages
• Crawl errors
• Blocked resources
• Mobile usability issues
• URL parameters
• Structured data errors
• Links to your site
• Internal links
• Index status

Redirects preparation

The redirects implementation is one of the most crucial activities during a site migration. If the legacy site’s URLs cease to exist and aren’t correctly redirected, the website’s rankings and visibility will simply tank.

Why are redirects important in site migrations?

Redirects are extremely important because they help both search engines and users find pages that may no longer exist, have been renamed, or have moved to another location. From an SEO point of view, redirects help search engines discover and index a site’s new URLs quicker, but also understand how the old site’s pages are associated with the new site’s pages. This association allows ranking signals to pass from the old pages to the new ones, so rankings are retained without being negatively affected.

What happens when redirects aren’t correctly implemented?

When redirects are poorly implemented, the consequences can be catastrophic. Users will either land on Not Found pages (404s) or irrelevant pages that don’t meet the user intent. In either case, the site’s bounce and conversion rates will be negatively affected. The consequences for search engines can be equally catastrophic: they’ll be unable to associate the old site’s pages with those on the new site if the URLs aren’t identical. Ranking signals won’t be passed over from the old to the new site, which will result in ranking drops and organic search visibility loss. In addition, it will take search engines longer to discover and index the new site’s pages.

    301, 302, JavaScript redirects, or meta refresh?

When the URLs between the old and new versions of the site are different, use 301 (permanent) redirects. These will tell search engines to index the new URLs as well as forward any ranking signals from the old URLs to the new ones. Therefore, you must use 301 redirects if your site moves to/from another domain/subdomain, if you switch from HTTP to HTTPS, or if the site or parts of it have been restructured. Despite some of Google’s claims that 302 redirects pass PageRank, indexing the new URLs would be slower and ranking signals could take much longer to be passed on from the old to the new page.

302 (temporary) redirects should only be used in situations where a redirect does not need to live permanently and therefore indexing the new URL isn’t a priority. With 302 redirects, search engines will initially be reluctant to index the content of the redirect destination URL and pass any ranking signals to it. However, if the temporary redirects remain for a long period of time without being removed or updated, they could end up behaving similarly to permanent (301) redirects. Use 302 redirects when a redirect is likely to require updating or removal in the near future, as well as for any country-, language-, or device-specific redirects.

Meta refresh and JavaScript redirects should be avoided. Even though Google is getting better and better at crawling JavaScript, there are no guarantees these will get discovered or pass ranking signals to the new pages.
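
On staging and on launch day, it’s worth programmatically confirming which redirect type each legacy URL actually returns. Here is a minimal sketch using Python’s requests library (the URLs are placeholders; some servers don’t answer HEAD requests, in which case swap in GET).

# Audit redirect types: request each legacy URL without following
# redirects and record the status code and Location header. 301s are
# what you want for permanent moves; 302s, or a 200 (possibly a meta
# refresh/JavaScript redirect), deserve a closer look.
import requests

legacy_urls = [
    "http://www.example.com/old-page/",      # placeholder
    "http://www.example.com/old-category/",  # placeholder
]

for url in legacy_urls:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "-")
    print(f"{resp.status_code}  {url}  ->  {location}")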

If you’d like to find out more about how Google deals with the different types of redirects, please refer to John Mueller’s post.

Redirect mapping process

If you are lucky enough to work on a migration that doesn’t involve URL changes, you can skip this section. Otherwise, read on to find out why any legacy pages that won’t be available on the same URL after the migration should be redirected.

The redirect mapping file is a spreadsheet that contains the following two columns:

• Legacy site URL –> a page’s URL on the old site
• New site URL –> a page’s URL on the new site

When mapping (redirecting) a page from the old to the new site, always try mapping it to the most relevant corresponding page. In cases where a relevant page doesn’t exist, avoid redirecting the page to the homepage. First and foremost, redirecting users to irrelevant pages results in a very poor user experience. Google has stated that redirecting pages “en masse” to irrelevant pages will be treated as soft 404s and, because of this, won’t pass any SEO value. If you can’t find an equivalent page on the new site, try mapping it to its parent category page.

Once the mapping is complete, the file will need to be sent to the development team to create the redirects, so that these can be tested before launching the new site. The implementation of redirects is another part of the site migration cycle where things can often go wrong.

Increasing efficiencies during the redirect mapping process

Redirect mapping requires great attention to detail and needs to be carried out by experienced SEOs. The URL mapping on small sites could in theory be done by manually mapping each URL of the legacy site to a URL on the new site. But on large sites that consist of thousands or even hundreds of thousands of pages, manually mapping every single URL is practically impossible and automation needs to be introduced. Relying on certain common attributes shared between the legacy and new site can be a massive time-saver. Such attributes may include page titles, H1 headings, or other unique page identifiers such as product codes, SKUs, etc. Make sure the attributes you rely on for the redirect mapping are unique and not repeated across several pages; otherwise, you will end up with incorrect mapping.
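
As a concrete illustration, the sketch below joins legacy and new URL lists on a shared SKU; the sample data is made up, and the same approach works with page titles or H1s as long as the chosen attribute really is unique.

# Automate redirect mapping by joining legacy and new URLs on a shared
# unique attribute (a SKU here). Sample data is hypothetical.
legacy_pages = [
    {"url": "/products/1234-old-widget", "sku": "SKU-1234"},
    {"url": "/products/5678-old-gadget", "sku": "SKU-5678"},
]
new_pages = [
    {"url": "/shop/widget-sku-1234", "sku": "SKU-1234"},
    {"url": "/shop/gadget-sku-5678", "sku": "SKU-5678"},
]

new_by_sku = {p["sku"]: p["url"] for p in new_pages}
assert len(new_by_sku) == len(new_pages), "SKUs must be unique or the mapping breaks"

mapping, unmatched = [], []
for page in legacy_pages:
    target = new_by_sku.get(page["sku"])
    if target:
        mapping.append((page["url"], target))
    else:
        unmatched.append(page["url"])  # left for manual mapping

for old, new in mapping:
    print(f"{old} -> {new}")
print(f"{len(unmatched)} URLs left for manual review")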

Pro tip: Make sure the URL structure of the new site is 100% finalized on staging before you start working on the redirect mapping. There’s nothing riskier than mapping URLs that will be updated before the new site goes live. When URLs are updated after the redirect mapping is completed, you may have to deal with undesired situations upon launch, such as broken redirects, redirect chains, and redirect loops. A content freeze should be placed on the old site well in advance of the migration date, so there is a cut-off point for new content being published on the old site. This will make sure that no pages are missed from the redirect mapping and guarantee that all pages on the old site get redirected.

    Don’t forget about the legacy redirects!

You should get hold of the old site’s existing redirects to ensure they’re considered when preparing the redirect mapping for the new site. Unless you do this, it’s likely that the site’s current redirect file will get overwritten by the new one on the launch date. If this happens, all legacy redirects that were previously in place will cease to exist and the site may lose a decent amount of link equity, the extent of which will largely depend on the site’s volume of legacy redirects. For instance, a site that has undergone a few migrations in the past should have a good number of legacy redirects in place that you don’t want getting lost.

Ideally, preserve as many of the legacy redirects as possible, making sure these won’t cause any issues when combined with the new site’s redirects. It’s strongly recommended to eliminate any potential redirect chains at this early stage, which can easily be done by checking whether the same URL appears both as a “Legacy URL” and a “New site URL” in the redirect mapping spreadsheet. If this is the case, you will need to update the “New site URL” accordingly.

    Example:

    URL A redirects to URL B (legacy redirect)

    URL B redirects to URL C (new redirect)

Which results in the following redirect chain:

    URL A –> URL B –> URL C

To eliminate this, amend the existing legacy redirect and create a new one so that:

    URL A redirects to URL C (amended legacy redirect)

    URL B redirects to URL C (new redirect)

Pro tip: Check your redirect mapping spreadsheet for redirect loops. These occur when the “Legacy URL” is identical to the “New site URL.” Redirect loops need to be removed because they result in infinitely loading pages that are inaccessible to users and search engines, and they are instant traffic, conversion, and ranking killers!
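
Both checks are easy to automate once the combined legacy and new redirects are loaded as source/destination pairs. A small Python sketch (the URLs are placeholders):

# Detect and collapse redirect chains, and flag redirect loops, in a
# combined {source: destination} mapping of legacy and new redirects.
redirects = {
    "/url-a": "/url-b",  # legacy redirect
    "/url-b": "/url-c",  # new redirect, so /url-a now chains via /url-b
    "/url-d": "/url-d",  # redirect loop
}

def final_destination(url):
    """Follow the mapping to its end, raising an error on loops."""
    seen = {url}
    while url in redirects:
        url = redirects[url]
        if url in seen:
            raise ValueError(f"redirect loop involving {url}")
        seen.add(url)
    return url

for source in list(redirects):
    try:
        target = final_destination(source)
        if target != redirects[source]:
            print(f"Chain collapsed: {source} -> {target} (was {redirects[source]})")
            redirects[source] = target
    except ValueError as err:
        print(f"LOOP: {err}")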

Implement blanket redirect rules to avoid duplicate content

It’s strongly recommended to try working out redirect rules that cover as many URL requests as possible. Implementing redirect rules on the web server is much more efficient than relying on numerous one-to-one redirects. If your redirect mapping document consists of a very large number of redirects that need to be implemented as one-to-one redirects, site performance may be negatively affected. In any case, double-check with the development team the maximum number of redirects the web server can handle without issues.

In any case, there are some standard redirect rules that should be in place to avoid generating duplicate content issues:

• URL case: All URLs containing upper-case characters should be 301 redirected to their lower-case equivalents, e.g. https://www.website.com/Page/ should automatically redirect to https://www.website.com/page/
• Host: For instance, all non-www URLs should be 301 redirected to their www equivalent, e.g. https://website.com/page/ should be redirected to https://www.website.com/page/
• Protocol: On a secure website, requests for HTTP URLs should be redirected to the equivalent HTTPS URL, e.g. http://www.website.com/page/ should automatically redirect to https://www.website.com/page/
• Trailing slash: For instance, any URLs not containing a trailing slash should redirect to a version with a trailing slash, e.g. http://www.website.com/page should redirect to http://www.website.com/page/

Even if some of these standard redirect rules exist on the legacy website, do not assume they will necessarily exist on the new site unless they are explicitly requested.
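
To make the four rules above concrete, here is a small Python function that computes the canonical target for any requested URL, assuming HTTPS, www, lower-case, and trailing-slash as the preferred form. In production these rules would live in the web server or CDN configuration; the sketch only demonstrates the logic.

# Compute the canonical form of a requested URL under the four blanket
# rules: HTTPS protocol, www host, lower-case path, trailing slash.
# (Real rules would exempt file extensions such as .jpg or .pdf from
# the trailing-slash rule.)
from urllib.parse import urlsplit, urlunsplit

def canonical_url(url):
    scheme, host, path, query, fragment = urlsplit(url)
    scheme = "https"                  # protocol rule
    if not host.startswith("www."):
        host = "www." + host          # host rule
    path = path.lower()               # URL case rule
    if not path.endswith("/"):
        path += "/"                   # trailing slash rule
    return urlunsplit((scheme, host, path, query, fragment))

assert canonical_url("http://website.com/Page") == "https://www.website.com/page/"
assert canonical_url("https://www.website.com/page/") == "https://www.website.com/page/"

A request whose URL differs from its canonical form would then be answered with a single 301 to that form, avoiding chains like non-www HTTP to www HTTP to www HTTPS.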

Avoid internal redirects

Try updating the site’s internal links so they don’t trigger internal redirects. Even though search engines can follow internal redirects, these are not recommended because they add extra latency to page loading times and could also have a negative impact on search engine crawl time.

    Don’t forget your image files

If the site’s images have moved to a new location, Google recommends redirecting the old image URLs to the new image URLs to help Google discover and index the new images quicker. If it’s not easy to redirect all images, aim to redirect at least those image URLs that have accrued backlinks.

Phase 3: Pre-launch testing

The earlier you can start testing, the better. Certain things need to be fully implemented to be tested, but others don’t. For example, user journey issues could be identified from as early as the prototype or wireframe designs. Content-related issues between the old and new site, or content inconsistencies (e.g. between the desktop and mobile site), can also be identified at an early stage. But the more technical components should only be tested once fully implemented — things like redirects, canonical tags, or XML sitemaps. The earlier issues get identified, the more likely it is that they’ll be addressed before launching the new site. Identifying certain types of issues at a later stage isn’t cost-effective, would require more resources, and could cause significant delays. Poor testing, and not allowing the time required to thoroughly test all the building blocks that can affect SEO and UX performance, can have disastrous consequences soon after the new site has gone live.

Making sure search engines cannot access the staging/test site

Before making the new site available on a staging/testing environment, take some precautions so that search engines do not index it. There are a few different ways to do this, each with different pros and cons.

Site available to specific IPs (most recommended)

Making the test site available only to specific (whitelisted) IP addresses is a very effective way to prevent search engines from crawling it. Anyone trying to access the test site’s URL won’t be able to see any content unless their IP has been whitelisted. The main advantage is that whitelisted users can easily access and crawl the site without any issues. The only downside is that third-party web-based tools (such as Google’s tools) cannot be used because of the IP restrictions.

Password protection

Password protecting the staging/test site is another way to keep search engine crawlers away, but this solution has two main downsides. Depending on the implementation, it may not be possible to crawl and test a password-protected website if the crawler application doesn’t make it past the login screen. The other downside: password-protected websites that use forms for authentication can be crawled using third-party applications, but there is a risk of causing severe and unexpected issues. This is because the crawler clicks on every link on a page (when you’re logged in) and could easily end up clicking on links that create or remove pages, install/uninstall plugins, etc.

    Robots.txt blocking

Adding the following lines of code to the test site’s robots.txt file will prevent search engines from crawling the test site’s pages.

    User-agent: *

    Disallow: /

One downside of this method is that even though the content that appears on the test server won’t get indexed, the disallowed URLs may still appear in Google’s search results. Another downside is that if the above robots.txt file moves onto the live site, it will cause severe de-indexing issues. This is something I’ve encountered numerous times, and for this reason I wouldn’t recommend using this method to block search engines.

User journey review

If the site has been redesigned or restructured, chances are that the user journeys will be affected to some extent. Reviewing the user journeys as early as possible and well before the new site launches is difficult due to the lack of user data. However, an experienced UX professional will be able to flag any concerns that could have a negative impact on the site’s conversion rate. Because A/B testing at this stage is hardly ever possible, it might be worth carrying out some user testing and trying to get some feedback from real users. Unfortunately, user experience issues can be some of the harder ones to address because they may require sitewide changes that take a lot of time and effort.

On full site overhauls, not all UX decisions can always be backed up by data, and many decisions will have to be based on best practice, past experience, and “gut feeling,” hence getting UX/CRO experts involved as early as possible could pay dividends later.

Site architecture review

A site migration is often a great opportunity to improve the site architecture. In other words, you have a great chance to reorganize your keyword targeted content and maximize its search traffic potential. Carrying out extensive keyword research will help identify the best possible category and subcategory pages so that users and search engines can get to any page on the site within a few clicks — the fewer the better, so you don’t end up with a very deep taxonomy.

Identifying new keywords with decent traffic potential and mapping them into new landing pages can make a big difference to the site’s organic traffic levels. On the other hand, enhancing the site architecture needs to be done thoughtfully. It could cause problems if, say, important pages move deeper into the new site architecture or there are too many similar pages optimized for the same keywords. Some of the most successful site migrations are the ones that allocate significant resources to enhancing the site architecture.

Meta data & copy review

Make sure that the site’s page titles, meta descriptions, headings, and copy have been transferred from the old to the new site without issues. If you’ve created any new pages, make sure these are optimized and don’t target keywords that have already been targeted by other pages. If you’re re-platforming, be aware that the new platform may have different default values when new pages are created. Launching the new site without properly optimized page titles, or with any kind of missing copy, will have an immediate negative impact on your site’s rankings and traffic. Don’t forget to review whether any user-generated content (i.e. user reviews, comments) has also been uploaded.

Internal linking review

Internal links are the backbone of a website. No matter how well optimized and structured the site’s copy is, it won’t be sufficient to succeed unless it’s supported by a flawless internal linking scheme. Internal links must be reviewed throughout the entire site, including links found in:

• Main & secondary navigation
• Header & footer links
• Body content links
• Pagination links
• Horizontal links (related articles, similar products, etc.)
• Vertical links (e.g. breadcrumb navigation)
• Cross-site links (e.g. links across international sites)

Technical checks

A series of technical checks must be carried out to make sure the new site’s technical setup is sound and to avoid coming across major technical glitches after the new site has gone live.

Robots.txt file review

Prepare the new site’s robots.txt file on the staging environment. This way you can test it for errors or omissions and avoid experiencing search engine crawl issues when the new site goes live. A classic mistake in site migrations is when the robots.txt file prevents search engine access using the following directive:

    Disallow: /

If this gets accidentally carried over into the live site (and it often does), it will prevent search engines from crawling the site. And when search engines cannot crawl an indexed page, the keywords associated with the page will get demoted in the search results and eventually the page will get de-indexed.

But if the robots.txt file on staging is populated with the new site’s robots.txt directives, this mishap can be avoided.

When preparing the new site’s robots.txt file, make sure that (a quick check using Python’s built-in robots.txt parser is sketched after this list):

• It doesn’t block search engine access to pages that are intended to get indexed.
• It doesn’t block any JavaScript or CSS resources search engines require to render page content.
• The legacy site’s robots.txt file content has been reviewed and carried over if necessary.
• It references the new XML sitemap(s) rather than any legacy ones that no longer exist.
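
One quick, scriptable sanity check uses Python’s built-in robots.txt parser, pointed at the staging copy of the file and fed a handful of must-be-crawlable URLs (the staging host and URL list below are placeholders).

# Sanity-check robots.txt: confirm that pages intended to be indexed,
# plus the JS/CSS assets needed for rendering, are crawlable.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://staging.example.com/robots.txt")
parser.read()

must_be_crawlable = [
    "https://staging.example.com/",
    "https://staging.example.com/category/widgets/",
    "https://staging.example.com/assets/app.js",
    "https://staging.example.com/assets/styles.css",
]

for url in must_be_crawlable:
    if not parser.can_fetch("Googlebot", url):
        print(f"BLOCKED: {url}")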

    Canonical tags review

Review the site’s canonical tags. Look for pages that either do not have a canonical tag or have a canonical tag pointing to another URL, and question whether this is intended. Don’t forget to crawl the canonical tags to find out whether they return a 200 server response. If they don’t, you will need to update them to eliminate any 3xx, 4xx, or 5xx server responses. You should also look for pages that have a canonical tag pointing to another URL combined with a noindex directive, because these two are conflicting signals and you’ll need to remove one of them.
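
Checking that every canonical target resolves with a 200 can also be scripted. Here is a minimal sketch over page/canonical pairs pulled from a crawl export (the sample data is hypothetical).

# Verify that canonical tag targets return a 200 response rather than
# a redirect or an error.
import requests

canonical_pairs = [
    ("https://staging.example.com/page-a/", "https://staging.example.com/page-a/"),
    ("https://staging.example.com/page-b/", "https://staging.example.com/old-b/"),
]

for page, canonical in canonical_pairs:
    status = requests.head(canonical, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"Fix needed: {page} canonicalizes to {canonical} ({status})")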

Meta robots review

Once you’ve crawled the staging site, look for pages with the meta robots properties set to “noindex” or “nofollow.” If this is the case, review each one of them to make sure it is intentional, and remove the “noindex” or “nofollow” directive if it isn’t.

XML sitemaps review

Prepare two different types of sitemaps: one that contains all the new site’s indexable pages, and another that includes all the old site’s indexable pages. The former will help make Google aware of the new site’s indexable URLs. The latter will help Google become aware of the redirects that are in place and of the fact that some of the indexed URLs have moved to new locations, so that it can discover them and update the search results quicker.

You should check each XML sitemap to make sure that:

• It validates without issues
• It is encoded as UTF-8
• It does not contain more than 50,000 rows
• Its size does not exceed 50MB when uncompressed

If there are more than 50K rows or the file size exceeds 50MB, you must break the sitemap down into smaller ones. This prevents the server from becoming overloaded if Google requests the sitemap too frequently.
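
Both limits are easy to verify programmatically before submission. Here is a sketch using only Python’s standard library (a local copy of the sitemap file is assumed).

# Check an XML sitemap against the two hard limits: at most 50,000
# URLs and at most 50MB uncompressed.
import os
import xml.etree.ElementTree as ET

SITEMAP_PATH = "sitemap.xml"  # local copy of the sitemap
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

size_mb = os.path.getsize(SITEMAP_PATH) / (1024 * 1024)
urls = ET.parse(SITEMAP_PATH).getroot().findall(f"{NS}url")

print(f"{len(urls)} URLs, {size_mb:.1f}MB uncompressed")
if len(urls) > 50_000 or size_mb > 50:
    print("Over the limit: split this sitemap into smaller ones.")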

In addition, you should crawl each XML sitemap to make sure it only includes indexable URLs. Any non-indexable URLs should be excluded from the XML sitemaps, such as:

• 3xx, 4xx, and 5xx pages (e.g. redirected, not found pages, bad requests, etc.)
• Soft 404s. These are pages with no content that return a 200 server response, instead of a 404.
• Canonicalized pages (apart from self-referring canonical URLs)
• Pages with a meta robots noindex directive:
<!DOCTYPE html>

<html><head>

<meta name="robots" content="noindex" />

(…)

</head>

<body>(…)</body>

</html>

• Pages with a noindex X-Robots-Tag in the HTTP header:
HTTP/1.1 200 OK

Date: Tue, 10 Nov 2017 17:12:43 GMT

(…)

X-Robots-Tag: noindex

(…)

• Pages blocked from the robots.txt file

Building clean XML sitemaps can help monitor the true indexing levels of the new site once it goes live. If you don’t, it will be very difficult to spot any indexing issues.

Pro tip: Download and open each XML sitemap in Excel to get a detailed overview of any additional attributes, such as hreflang or image attributes.

    HTML sitemap evaluate

    Depending on the scale and kind of web site this is being migrated, having an HTML sitemap can in positive instances be beneficial. An HTML sitemap that includes URLs that aren’t connected from the website online’s main navigation can extensively increase web page discovery and indexing. However, keep away from producing an HTML sitemap that includes too many URLs. If you do need to encompass thousands of URLs, recall building a segmented HTML sitemap.

    The variety of nested sitemaps in addition to the maximum range of URLs you need to encompass in every sitemap relies upon at the website’s authority. The more authoritative a website, the better the wide variety of nested sitemaps and URLs it is able to escape with.

    For example, the NYTimes.com HTML sitemap consists of three levels, where each one includes over 1,000 URLs per sitemap. These nested HTML sitemaps aid search engine crawlers in discovering articles published since 1851 that would otherwise be difficult to discover and index, as not all of them would have been internally linked.

    The NYTimes HTML sitemap (level 1)

    The NYTimes HTML sitemap (level 2)

    Structured data review

    Errors in the structured data markup need to be identified early so there’s time to fix them before the new site goes live. Ideally, you should test every single page template (rather than every single page) using Google’s Structured Data Testing Tool.

    Be sure to check the markup on both the desktop and mobile pages, especially if the mobile website isn’t responsive.

    The tool will only report existing errors, not omissions. For instance, if your product page template does not include the Product structured data schema, the tool won’t report any errors. So, in addition to checking for errors, you should also make sure that every page template includes the appropriate structured data markup for its content type.

    Please refer to Google’s documentation for the most up-to-date details on structured data implementation and supported content types.
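    For a quick programmatic sanity check for omissions, something like the following Python sketch can list the JSON-LD types each template actually declares; the product URL is a placeholder:

    import json
    import requests
    from bs4 import BeautifulSoup

    url = "https://staging.example.com/product/sample"  # placeholder template URL
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    types = []
    for script in soup.find_all("script", type="application/ld+json"):
        try:
            data = json.loads(script.string or "")
        except json.JSONDecodeError:
            print(f"{url}: invalid JSON-LD block")
            continue
        items = data if isinstance(data, list) else [data]
        types += [item.get("@type") for item in items if isinstance(item, dict)]

    # A product template with no "Product" type here is an omission the testing
    # tool would never report as an error.
    print(f"{url}: {types if types else 'no JSON-LD markup found'}")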

    JavaScript crawling review

    You should test every single page template of the new website to make sure Google will be able to crawl content that requires JavaScript parsing. If you’re able to use Google’s Fetch and Render tool on your staging site, you should definitely do so. Otherwise, carry out some manual checks, following Justin Briggs’ advice.
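    One crude manual check that can be scripted: fetch the raw (unrendered) HTML and see whether a phrase you know appears on the page is actually in it. If it isn’t, that content depends on JavaScript rendering. A minimal Python sketch, with the URL and phrase as placeholders:

    import requests

    url = "https://staging.example.com/category/widgets"  # placeholder
    phrase = "Free delivery on orders over"  # placeholder: text you expect on the page

    raw_html = requests.get(url, timeout=10).text
    if phrase not in raw_html:
        print("Phrase not in raw HTML: this content likely requires JS rendering")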

    As Bartosz Góralewicz’s tests proved, even though Google is able to crawl and index JavaScript-generated content, that does not mean it can crawl JavaScript content across all major JavaScript frameworks. The following table summarizes Bartosz’s findings, showing that some JavaScript frameworks are not SEO-friendly, with AngularJS currently being the most problematic of all.

    Bartosz also found that other search engines (such as Bing, Yandex, and Baidu) really struggle with indexing JavaScript-generated content, which is important to know if your site’s traffic relies on any of these search engines.

    Hopefully, this is something that will improve over time, but with the increasing popularity of JavaScript frameworks in web development, this should be high up on your checklist.

    Finally, you should check whether any external resources are being blocked. Unfortunately, this isn’t something you can control 100%, because many resources (such as JavaScript and CSS files) are hosted by third-party websites, which may be blocking them via their own robots.txt files!

    Again, the Fetch and Render tool can help diagnose this type of problem, which, if left unresolved, could have a significant negative impact.

    Mobile site SEO review

    Assets blocking review

    First, make sure that the robots.txt file isn’t accidentally blocking any JavaScript, CSS, or image files that are essential for the mobile site’s content to render. This could have a negative impact on how search engines render and index the mobile site’s page content, which in turn could negatively affect the mobile site’s search visibility and performance.

    Mobile-first index review

    In order to avoid any issues related to Google’s mobile-first index, thoroughly review the mobile website and make sure there aren’t any inconsistencies between the desktop and mobile sites in the following areas:

    • Page titles
    • Meta descriptions
    • Headings
    • Copy
    • Canonical tags
    • Meta robots attributes (i.e. noindex, nofollow)
    • Internal links
    • Structured data

    A responsive website should serve the same content, links, and markup across devices, and the above SEO attributes should be identical across the desktop and mobile sites.

    In addition to the above, you should carry out a few further technical checks depending on the mobile site’s set-up.

    Responsive site review

    A responsive website should serve all devices the same HTML code, which is adjusted (via the use of CSS) depending on the screen size.

    Googlebot is able to automatically detect this mobile set-up as long as it’s allowed to crawl the page and its assets. It’s therefore extremely important to make sure that Googlebot can access all essential assets, such as images, JavaScript, and CSS files.

    To signal browsers that a page is responsive, a meta=”viewport” tag should be in place within the <head> of each HTML page.

    <meta name="viewport" content material="width=device-width, preliminary-scale=1.zero">

    If the meta viewport tag is missing, font sizes may appear inconsistent, which may cause Google to treat the page as not mobile-friendly.

    Separate mobile URLs review

    If the mobile website uses separate URLs from desktop, make sure that (a sketch spot-checking the first two points follows this list):

    • Each desktop page has a rel=”alternate” tag pointing to the corresponding mobile URL.
    • Each mobile page has a rel=”canonical” tag pointing to the corresponding desktop URL.
    • When desktop URLs are requested on mobile devices, they’re redirected to the respective mobile URL.
    • Redirects work across all mobile devices, including Android, iPhone, and Windows phones.
    • There aren’t any irrelevant cross-links between the desktop and mobile pages. This means that internal links found on a desktop page should only link to desktop pages, and those found on a mobile page should only link to other mobile pages.
    • The mobile URLs return a 200 server response.
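    Here is the promised sketch, covering the first two bullets for a single desktop/mobile URL pair in Python; both URLs are placeholders, and a real audit would iterate over the full URL mapping:

    import requests
    from bs4 import BeautifulSoup

    desktop_url = "https://www.example.com/page"  # placeholder
    mobile_url = "https://m.example.com/page"     # placeholder

    desktop = BeautifulSoup(requests.get(desktop_url, timeout=10).text, "html.parser")
    mobile = BeautifulSoup(requests.get(mobile_url, timeout=10).text, "html.parser")

    # The desktop page should carry a rel="alternate" link (with a media query)
    alternate = desktop.find("link", rel="alternate", media=True)
    canonical = mobile.find("link", rel="canonical")

    if alternate is None or alternate.get("href") != mobile_url:
        print("Desktop page lacks a rel=alternate link to the mobile URL")
    if canonical is None or canonical.get("href") != desktop_url:
        print("Mobile page's canonical does not point to the desktop URL")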
    Dynamic serving review

    Dynamic serving websites serve different code to each device, but on the same URL.

    On dynamic serving websites, review whether the Vary HTTP header has been correctly set up. This is essential because dynamic serving websites alter the HTML for mobile user agents, and the Vary HTTP header helps Googlebot discover the mobile content.
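    Checking this takes only a few lines; a minimal Python sketch with a placeholder URL:

    import requests

    url = "https://www.example.com/page"  # placeholder
    vary = requests.head(url, timeout=10).headers.get("Vary", "")
    if "user-agent" not in vary.lower():
        print(f"Missing 'Vary: User-Agent' (server sent Vary: '{vary}')")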

    Mobile-friendliness review

    Regardless of the mobile website set-up (responsive, separate URLs, or dynamic serving), review the pages using a mobile user-agent and make sure that:

    • The viewport has been set correctly. Using a fixed-width viewport across devices will cause mobile usability issues.
    • The font size isn’t too small.
    • Touch elements (i.e. buttons, links) aren’t too close together.
    • There aren’t any intrusive interstitials, such as ads, mailing list sign-up forms, app download pop-ups, etc. To avoid any issues, use either a small HTML or image banner.
    • Mobile pages aren’t too slow to load (see next section).

    Google’s mobile-friendly test tool can help diagnose most of the above issues:

    Google’s mobile-friendly test tool in action

    AMP website review

    If there is an AMP website and a desktop version of the site is available, make sure that:

    • Each non-AMP page (i.e. desktop, mobile) has a rel=”amphtml” tag pointing to the corresponding AMP URL.
    • Each AMP page has a rel=”canonical” tag pointing to the corresponding desktop page.
    • Any AMP page that doesn’t have a corresponding desktop URL has a self-referring canonical tag.

    You should also make sure that the AMP pages are valid. This can be tested using Google’s AMP Test Tool.

    Mixed content errors

    With Google pushing hard for websites to be fully secure and Chrome becoming the first browser to flag HTTP pages as not secure, aim to launch the new site on HTTPS, making sure all assets such as images, CSS, and JavaScript files are requested over secure HTTPS connections. This is essential in order to avoid mixed content issues.

    Mixed content occurs when a page that’s loaded over a secure HTTPS connection requests assets over insecure HTTP connections. Most browsers either block dangerous HTTP requests or just display warnings that hinder the user experience.

    Mixed content errors in Chrome’s JavaScript Console

    There are many ways to identify mixed content errors, including the use of crawler applications, Google’s Lighthouse, etc.
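    For a quick spot check before reaching for a full crawler, a small Python sketch can scan a page’s HTML for insecure asset references; the URL is a placeholder, and this deliberately ignores CSS background images and other subtler cases:

    import requests
    from bs4 import BeautifulSoup

    url = "https://staging.example.com/"  # placeholder
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    # Flag any <img>, <script>, or <link> that requests an http:// resource
    for tag, attr in (("img", "src"), ("script", "src"), ("link", "href")):
        for node in soup.find_all(tag):
            ref = node.get(attr, "")
            if ref.startswith("http://"):
                print(f"Mixed content: <{tag}> requests {ref}")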

    Image assets review

    Google crawls images less frequently than HTML pages. If you’re migrating a site’s images from one location to another (e.g. from your domain to a CDN), there are ways to help Google discover the migrated images quicker. Building an image XML sitemap will help, but you also need to make sure that Googlebot can reach the site’s images when crawling the site. The tricky part with image indexing is that both the web page where an image appears and the image file itself have to get indexed.

    Site performance review

    Last but not least, measure the old site’s page loading times and see how these compare with the new site’s once it becomes available on staging. At this stage, focus on the network-independent aspects of performance, such as the use of external resources (images, JavaScript, and CSS), the HTML code, and the web server’s configuration. More information about how to do this is available further down.

    Analytics tracking review

    Make sure that analytics tracking is properly set up. This review should ideally be carried out by specialist analytics consultants who will look beyond the implementation of the tracking code. Make sure that Goals and Events are properly set up, e-commerce tracking is implemented, enhanced e-commerce tracking is enabled, etc. There’s nothing more frustrating than having no analytics data after your new site is launched.

    Redirects testing

    Testing the redirects before the new site goes live is essential and can save you a lot of trouble later. There are many ways to check the redirects on a staging/test server, but the bottom line is that you should not launch the new website without having tested the redirects.

    Once the redirects become available on the staging/testing environment, crawl the entire list of redirects and check for the following issues (a minimal crawl sketch follows the list):

    • Redirect loops (a URL that infinitely redirects to itself)
    • Redirects with a 4xx or 5xx server response
    • Redirect chains (a URL that redirects to another URL, which in turn redirects to another URL, and so on)
    • Canonical URLs that return a 4xx or 5xx server response
    • Canonical loops (page A has a canonical pointing to page B, which has a canonical pointing to page A)
    • Canonical chains (a canonical that points to another page that has a canonical pointing to another page, and so on)
    • Protocol/host inconsistencies (e.g. URLs are redirected to both HTTP and HTTPS URLs, or www and non-www URLs)
    • Leading/trailing whitespace characters. Use trim() in Excel to remove them.
    • Invalid characters in URLs
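    As promised, a minimal Python sketch of such a redirect crawl; redirect_map is a placeholder standing in for your full old-to-new URL mapping spreadsheet:

    import requests

    # Placeholder: old URL -> expected destination, from your redirect mapping
    redirect_map = {
        "https://staging.example.com/old-page": "https://staging.example.com/new-page",
    }

    for old_url, expected in redirect_map.items():
        try:
            resp = requests.get(old_url, allow_redirects=True, timeout=10)
        except requests.TooManyRedirects:
            print(f"{old_url}: redirect loop")
            continue
        hops = [r.url for r in resp.history] + [resp.url]
        if len(resp.history) > 1:
            print(f"{old_url}: redirect chain ({' -> '.join(hops)})")
        if resp.status_code >= 400:
            print(f"{old_url}: destination returns {resp.status_code}")
        if resp.url.rstrip("/") != expected.rstrip("/"):
            print(f"{old_url}: lands on {resp.url}, expected {expected}")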

    Pro tip: Make sure each one of the old site’s URLs redirects to the correct URL on the new site. At this stage, because the new website doesn’t exist yet, you can only check whether the redirect destination URL is the intended one, but it’s absolutely worth it. The fact that a URL redirects does not mean it redirects to the right page.

    Phase 4: Launch day activities

    When the site is down…

    While the new website is replacing the old one, chances are that the live site is going to be temporarily down. The downtime should be kept to a minimum, but while it occurs the web server should respond to any URL request with a 503 (service unavailable) server response. This will tell search engine crawlers that the site is temporarily down for maintenance, so they come back to crawl the site later.

    If the site is down for too long without serving a 503 server response and search engines crawl the website, organic search visibility will be negatively affected, and recovery won’t be instant once the site is back up. In addition, while the website is temporarily down it should also serve an informative holding page, notifying users that the website is temporarily down for maintenance.
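    How you serve the 503 depends on your stack; as one illustrative option (not a prescription), here is a minimal Flask holding app in Python that answers every request with a 503 and a Retry-After hint:

    from flask import Flask

    app = Flask(__name__)

    @app.route("/", defaults={"path": ""})
    @app.route("/<path:path>")
    def maintenance(path):
        body = "<h1>Down for maintenance</h1><p>We will be back shortly.</p>"
        # 503 tells crawlers the outage is temporary; Retry-After hints when to return
        return body, 503, {"Retry-After": "3600"}

    if __name__ == "__main__":
        app.run()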

    Technical spot checks

    As soon as the new site has gone live, take a quick look at:

    • The robots.txt file, to make sure search engines aren’t blocked from crawling
    • Top pages redirects (e.g. do requests for the old site’s top pages redirect correctly?)
    • Top pages canonical tags
    • Top pages server responses
    • Noindex/nofollow directives, in case they are unintentional

    The spot checks should be carried out across both the mobile and desktop sites, unless the site is fully responsive.

    Search Console actions

    The following activities should take place as soon as the new website has gone live:

  • Test & upload the XML sitemap(s)
  • Set the Preferred area of the domain (www or non-www)
  • Set the International concentrated on (if applicable)
  • Configure the URL parameters to address early any potential reproduction content problems.
  • Upload the Disavow document (if applicable)
  • Use the Change of Address tool (if switching domains)
  • Pro tip: Use the “Fetch as Google” feature for each different type of page (e.g. the homepage, a class, a subcategory, a product web page) to ensure Googlebot can render the pages with none problems. Review any reported blocked assets and consider to apply Fetch and Render for laptop and mobile, especially if the cellular internet site isn’t responsive.

    Blocked resources prevent Googlebot from rendering the content of the page

    Phase 5: Post-launch review

    Once the new website has gone live, a new round of in-depth checks should be carried out. These are largely the same ones as those mentioned in the “Phase 3: Pre-launch testing” section.

    However, the main difference during this phase is that you now have access to a lot more data and tools. Don’t underestimate the amount of effort you’ll need to put in during this phase, because any issues you encounter now directly affect the site’s performance in the SERPs. On the other hand, the sooner an issue gets identified, the quicker it will get resolved.

    In addition to repeating the same testing tasks that were outlined in the Phase 3 section, in certain areas things can be tested more thoroughly, more accurately, and in greater detail. You can now take full advantage of the Search Console features.

    Check crawl stats and server logs

    Keep an eye on the crawl stats available in the Search Console, to make sure Google is crawling the new site’s pages. In general, when Googlebot comes across new pages it tends to accelerate the average number of pages it crawls per day. But if you can’t spot a spike around the time of the launch date, something may be negatively affecting Googlebot’s ability to crawl the site.

    Crawl stats on Google’s Search Console

    Reviewing the server log files is by far the most effective way to spot any crawl issues or inefficiencies. Tools like Botify and On Crawl can be extremely useful because they combine crawls with server log data and can highlight pages search engines do not crawl, pages that aren’t linked to internally (orphan pages), low-value pages that are heavily internally linked, and a lot more.
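    Even without those tools, a few lines of Python against a raw access log can confirm whether Googlebot activity spiked after launch. The log path and common log format are assumptions, and for rigour you’d verify Googlebot via reverse DNS rather than the user-agent string alone:

    import re
    from collections import Counter

    hits = Counter()
    with open("access.log") as log:  # placeholder path
        for line in log:
            if "Googlebot" not in line:
                continue
            match = re.search(r"\[(\d{2}/\w{3}/\d{4})", line)  # e.g. [10/Nov/2017
            if match:
                hits[match.group(1)] += 1

    # Daily Googlebot request counts; look for a clear post-launch increase
    for day, count in sorted(hits.items()):
        print(day, count)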

    Review crawl errors regularly

    Keep an eye on the reported crawl errors, ideally daily during the first few weeks. Downloading these errors daily, crawling the reported URLs, and taking the necessary actions (i.e. implementing additional 301 redirects, fixing soft 404 errors) will aid a quicker recovery. It’s highly unlikely you will need to redirect every single 404 that is reported, but you should add redirects for the most important ones.

    Pro tip: In Google Analytics you can easily find out which are the most commonly requested 404 URLs and fix these first!

    Other useful Search Console capabilities

    Other Search Console features worth checking include the Blocked Resources, Structured Data errors, Mobile Usability errors, HTML Improvements, and International Targeting (to check for reported hreflang errors).

    Pro tip: Keep a close eye on the URL parameters in case they’re causing duplicate content issues. If this is the case, consider taking some urgent remedial action.

    Measuring site speed

    Once the new site is live, measure site speed to make sure the site’s pages are loading fast enough on both desktop and mobile devices. With site speed being a ranking signal across devices, and because slow pages lose users and customers, comparing the new site’s speed with the old site’s is extremely important. If the new site’s page loading times appear to be higher, you should take some immediate action, otherwise your site’s traffic and conversions will almost certainly take a hit.

    Evaluating speed using Google’s tools

    Two tools that can help with this are Google’s Lighthouse and PageSpeed Insights.

    The PageSpeed Insights tool measures page performance on both mobile and desktop devices and shows real-world page speed data based on user data Google collects from Chrome. It also checks whether a page has applied common performance best practices and provides an optimization score. The tool includes the following main categories:

    • Speed score: Categorizes a page as Fast, Average, or Slow using two metrics: First Contentful Paint (FCP) and DOM Content Loaded (DCL). A page is considered fast if both metrics are within the top one-third of their category.
    • Optimization score: Categorizes a page as Good, Medium, or Low based on performance headroom.
    • Page load distributions: Categorizes a page as Fast (fastest third), Average (middle third), or Slow (bottom third) by comparing against all FCP and DCL events in the Chrome User Experience Report.
    • Page stats: Can indicate whether the page might be faster if the developer modifies the appearance and functionality of the page.
    • Optimization suggestions: A list of best practices that could be applied to the page.

    Google’s PageSpeed Insights in action
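    If you need to score many templates at once, PageSpeed Insights can also be queried programmatically. Note that the current v5 API returns a Lighthouse-based performance score rather than the Speed/Optimization split described above; the page URL below is a placeholder, and an API key is only needed for heavier usage:

    import requests

    endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    params = {"url": "https://www.example.com/", "strategy": "mobile"}  # placeholder

    result = requests.get(endpoint, params=params, timeout=60).json()
    score = result["lighthouseResult"]["categories"]["performance"]["score"]
    print(f"Mobile performance score: {score * 100:.0f}/100")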

    Google’s Lighthouse is very handy for mobile performance, accessibility, and Progressive Web Apps audits. It provides various useful metrics that can be used to measure page performance on mobile devices, such as:

    • First Meaningful Paint, which measures when the primary content of a page is visible.
    • Time to Interactive, which is the point at which the page is ready for a user to interact with.
    • Speed Index, which measures how quickly a page is visibly populated.

    Both tools provide recommendations to help improve any reported site performance issues.

    Google’s Lighthouse in action

    You can also use this Google tool to get a rough estimate of the percentage of users you may be losing from your mobile site’s pages due to slow page loading times.

    The same tool also provides an industry comparison, so you get an idea of how far you are from the top-performing sites in your industry.

    Measuring speed from real users

    Once the site has gone live, you can start evaluating site speed based on the users visiting your site. If you have Google Analytics, you can easily compare the new site’s average load time with the previous one’s.

    In addition, if you have access to a Real User Monitoring tool such as Pingdom, you can evaluate site speed based on the users visiting your website. The below map illustrates how different visitors experience very different loading times depending on their geographic location. In the below example, the page loading times appear to be satisfactory to visitors from the UK, US, and Germany, but to users residing in other countries they are much higher.

    Phase 6: Measuring site migration performance

    When to measure

    Has the site migration been successful? This is the million-dollar question everyone involved would like to know the answer to as soon as the new site goes live. In reality, the longer you wait the clearer the answer becomes, as visibility during the first few weeks or even months can be very volatile depending on the size and authority of your site. For smaller sites, a 4–6 week period should be sufficient before comparing the new site’s visibility with the old site’s. For larger sites, you may have to wait at least 2–3 months before measuring.

    In addition, if the new site is significantly different from the previous one, users will need some time to get used to the new look and feel and acclimatize themselves with the new taxonomy, user journeys, etc. Such changes initially have a significant negative impact on the site’s conversion rate, which should improve after a few weeks as returning visitors become increasingly used to the new site. In any case, making data-driven conclusions about the new site’s UX can be risky.

    But these are just general rules of thumb and need to be considered along with other factors. For instance, if significant additional changes were made a few days or weeks after the new site’s launch (e.g. to address a technical issue), the migration’s evaluation should be pushed further back.

    How to measure

    Performance measurement is very important, and even though business stakeholders may only be interested in hearing about the revenue and traffic impact, there are a lot of other metrics you should pay attention to. For example, there can be several reasons for revenue going down following a site migration, including seasonal trends, lower brand interest, UX issues that have significantly lowered the site’s conversion rate, poor mobile performance, poor page loading times, etc. So, in addition to the organic traffic and revenue figures, also pay attention to the following:

    • Desktop & cellular visibility (from SearchMetrics, SEMrush, Sistrix)
    • Desktop and cellular rankings (from any reliable rank tracking tool)
    • User engagement (bounce charge, common time on page)
    • Sessions per page kind (i.e. are the category pages driving as many periods as before?)
    • Conversion rate in keeping with page type (i.e. are the product pages changing the same way as earlier than?)
    • Conversion price by way of device (i.e. has the computing device/cellular conversion charge elevated/decreased on account that launching the brand new website?)

    Reviewing the below could also be very handy, especially from a technical troubleshooting perspective:

    • Number of indexed pages (Search Console)
    • Submitted vs indexed pages in XML sitemaps (Search Console)
    • Pages receiving at least one visit (analytics)
    • Site speed (PageSpeed Insights, Lighthouse, Google Analytics)

    It’s only after you’ve looked into all of the above areas that you can safely conclude whether your migration has been successful or not.

    Good luck, and if you need any consultation or assistance with your site migration, please get in touch!

    Site migration checklist

    An up-to-date site migration checklist is available to download from our site. Please note that the checklist is regularly updated to include all critical areas for a successful site migration.

    Appendix: Useful tools

    Crawlers

    • Screaming Frog: The SEO Swiss army knife, ideal for crawling small- and medium-sized websites.
    • Sitebulb: Very intuitive crawler application with a neat user interface, well-organized reports, and many useful data visualizations.
    • Deep Crawl: Cloud-based crawler with the ability to crawl staging websites and make crawl comparisons. Allows for comparisons between different crawls and copes well with large websites.
    • Botify: Another powerful cloud-based crawler, supported by exceptional server log file analysis capabilities that can be very insightful in terms of understanding how search engines crawl the site.
    • On-Crawl: Crawler and server log analyzer for enterprise SEO audits, with many handy features to identify crawl budget, content quality, and performance issues.

    Handy Chrome add-ons

    • Web developer: A collection of developer tools, including easy ways to enable/disable JavaScript, CSS, images, etc.
    • User agent switcher: Switch between different user agents, including Googlebot, mobile, and other agents.
    • Ayima Redirect Path: A great header and redirect checker.
    • SEO Meta in 1 click: An on-page meta attributes, headers, and links inspector.
    • Scraper: An easy way to scrape website data into a spreadsheet.

    Site monitoring tools

    • Uptime Robot: Free website uptime monitoring.
    • Robotto: Free robots.txt monitoring tool.
    • Pingdom tools: Monitors site uptime and page speed from real users (RUM service).
    • SEO Radar: Monitors all critical SEO elements and fires alerts when these change.

    Site performance tools

    • PageSpeed Insights: Measures page performance for mobile and desktop devices. It checks whether a page has applied common performance best practices and provides a score, which ranges from 0 to 100 points.
    • Lighthouse: Handy Chrome extension for performance, accessibility, and Progressive Web Apps audits. Can also be run from the command line, or as a Node module.
    • Webpagetest.org: Very detailed page tests from various locations, connections, and devices, including detailed waterfall charts.

    Structured data testing tools

    • Google’s structured data testing tool & Google’s structured data testing tool Chrome extension
    • Bing’s markup validator
    • Yandex structured data testing tool
    • Google’s rich results testing tool

    Mobile testing tools

    • Google’s mobile-friendly testing tool
    • Google’s AMP testing tool
    • AMP validator tool

    Backlink data sources

    • Ahrefs
    • Majestic SEO
