The redesign process is complex: it demands considerable time and the labour of many people. Designers want to make the site beautiful, developers want to code it fast, and SEOs want it optimised – and those three intentions do not always complement each other.
A redesign can be done in numerous ways: using a CMS, custom code, frameworks, page builders, etc. But not all of them are SEO-friendly, so it’s recommended to carry out the redesign following SEO best practices. In this article, we’ll see how to execute an SEO-friendly website redesign and what steps to take to check website health after it’s done.
What SEO problems might appear after a website redesign
If website development isn’t guided by best practices, many SEO issues may occur: indexation problems, broken pages, slow loading, poor media optimisation, responsiveness issues, and more. All of these will surely have an impact on rankings, resulting in a drop in organic traffic.
Check the website before redesigning
A great thing you can do before publishing your new design is to run a website health checkup so that you have something to compare against later. First, crawl the site and save a list of all pages so you can later verify that they are all present on the new version. Besides the URLs, save the important data associated with them – meta tags, content, etc. – to confirm that nothing important was lost on the updated site. Collecting this manually can be difficult, especially for big websites, but there is an easier way: automatic website health check software. For instance, the SE Ranking website audit tool checks and saves statistics on each page of your site so that you can see its current condition and change history – you can compare the whole website or each page to its previous crawl, which is exactly what we need. We’ll use this tool to demonstrate the process, as it checks a website against over 120 criteria at once, so you have all the needed data in one place.
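To illustrate what such a pre-redesign snapshot might contain, here is a minimal standard-library Python sketch (the URL and HTML below are made-up examples) that records a page's title, meta description, and a content hash:

```python
import hashlib
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Pulls <title> and <meta name="description"> out of a page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def snapshot_page(url, html):
    """Record the SEO-critical fields of one page."""
    parser = MetaExtractor()
    parser.feed(html)
    return {
        "url": url,
        "title": parser.title.strip(),
        "description": parser.description,
        "content_hash": hashlib.sha256(html.encode("utf-8")).hexdigest(),
    }

page = snapshot_page(
    "https://example.com/",
    "<html><head><title>Home</title>"
    '<meta name="description" content="Welcome"></head>'
    "<body>Hello</body></html>",
)
print(page["title"], page["description"])
```

Running something like this for every crawled URL before and after the redesign gives you two datasets you can diff – which is essentially what an audit tool automates for you.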
How to check website health and detect problems after the redesign
The best way to check site health after publishing a new site version is to compare it to the old one. To do so, run two crawls with the SE Ranking Website Audit: one before the changes and one after. Using those two crawls, you’ll be able to verify that everything important is still in place and that old issues have been fixed. Let’s examine this in detail.
Basic crawlability and indexability-related issues
After the redesign, one of the first things to check is that all pages are accessible and can be re-indexed on the new version of the site. The main problems that affect page crawling and indexing are:
- Non-200 response code (1)
If a URL no longer exists on the website, the response code is 4xx; if the page was redirected, it’s 3xx; if there is a server-side error, it’s 5xx. In any of these cases, the page cannot be loaded and displayed to users directly, so it cannot be crawled or added to the Google index – make sure all important pages return a 200 response code.
- Page is blocked in robots.txt (2)
- The page contains the “noindex” meta robots tag (3)
- Canonical URL is referring to another page (4)
To validate all these issues simultaneously, go to Website Audit → Crawled pages and choose the appropriate columns to see crawlability and indexability problems on one screen.
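As a rough illustration of what checks (1)–(4) boil down to, here is a Python sketch (the URLs and robots.txt content are hypothetical) that flags the four indexability problems for a single page:

```python
from urllib.robotparser import RobotFileParser

def indexability_issues(url, status_code, robots_txt, meta_robots="", canonical=""):
    """Flag the four problems above for one URL: non-200 status (1),
    robots.txt block (2), noindex meta robots (3), off-page canonical (4)."""
    issues = []
    if status_code != 200:
        issues.append(f"non-200 response code ({status_code})")
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    if not parser.can_fetch("*", url):
        issues.append("blocked in robots.txt")
    if "noindex" in meta_robots.lower():
        issues.append("noindex meta robots tag")
    if canonical and canonical.rstrip("/") != url.rstrip("/"):
        issues.append("canonical points to another page")
    return issues

robots = "User-agent: *\nDisallow: /private/"
print(indexability_issues("https://example.com/private/old", 404, robots))
```

An audit tool runs the equivalent of this check across the whole crawl; the sketch just makes the individual rules explicit.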
A redesign is not always about the front end. Sometimes it affects the back end and/or server configuration as well. If you move to another CMS or the changes impact the URL structure, you need to ensure the new URLs match the old ones or configure redirects accordingly. If you don’t, the old URLs will return 404 response codes, and the new pages will not inherit trust from the old ones since they will be treated as brand-new pages.
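A simple way to reason about this is to diff the two URL lists and build a 301 map. The sketch below (with made-up paths) flags every old URL that has no counterpart or explicit redirect on the new site:

```python
def redirect_map(old_urls, new_urls, manual_redirects=None):
    """For each old URL missing from the new site, look up an explicit
    301 target; anything left unmapped still needs attention."""
    manual_redirects = manual_redirects or {}
    missing = sorted(set(old_urls) - set(new_urls))
    return {url: manual_redirects.get(url, "UNRESOLVED") for url in missing}

old = ["/about", "/services", "/team"]
new = ["/about", "/what-we-do"]
print(redirect_map(old, new, {"/services": "/what-we-do"}))
```

The resulting map can then be translated into actual 301 rules in your server or CMS configuration.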
Also, several issues can occur regardless of the page path:
- Slash at the end
“…/example-url” is not the same as “…/example-url/”. If the canonical URL doesn’t have a slash at the end, visiting the URL ending with a slash may result in a 404 response code, which is bad for SEO. You need to ensure all your new URLs end the same way as the old ones or configure canonicals and redirects accordingly.
- HTTP vs. HTTPS
Like the previous issue, enabling/disabling SSL connections also affects the URL structure (“http://” vs. “https://”) and can cause serious issues if configured incorrectly.
- The “www.” prefix
“www.example.com” doesn’t equal “example.com”. This prefix was commonly used in the early days of the world wide web to indicate that a web resource was meant to be accessed publicly.
Technically speaking, “www.” is a subdomain, and Google treats the subdomain as a different domain rather than a folder. If you are trying to access a domain or subdomain that isn’t configured properly, you’ll get the error, and so will search bots. To avoid such issues, make sure to either set up redirects or add rel=canonical tags pointing to the preferred version of your website.
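Whichever convention you choose, the point is that every URL variant should normalise to one canonical form. A minimal Python sketch of that normalisation (non-www host and trailing slash chosen here purely as an example):

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str, prefer_www: bool = False, want_slash: bool = True) -> str:
    """Normalise the host ("www." or not) and the trailing slash so that
    every variant of a URL maps to a single canonical form."""
    parts = urlsplit(url)
    host = parts.netloc
    if prefer_www and not host.startswith("www."):
        host = "www." + host
    elif not prefer_www and host.startswith("www."):
        host = host[4:]
    path = parts.path.rstrip("/")
    if want_slash:
        path += "/"
    return urlunsplit((parts.scheme, host, path, parts.query, parts.fragment))

print(canonicalize("https://www.example.com/example-url"))
```

In production this logic lives in your server's redirect rules or rel=canonical tags rather than application code, but the mapping it has to implement is the same.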
Meta tags
When publishing a new version of a website, you need to ensure that all the optimised meta tags of the old pages are in place. Otherwise, missing or incorrect titles and descriptions may result in a significant drop in rankings and traffic. To check that no pages have missing or wrong meta tags, head over to the Title and Description sections in the Issues Report of the website audit.
To verify that meta titles and descriptions are the same as before the redesign, use the Crawled Pages report, enable the corresponding columns, and compare them to the pre-redesign crawl.
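Conceptually, this comparison is just a diff of two snapshots. A small Python sketch (with invented URLs and tags) of how you might flag changed or missing meta tags:

```python
def diff_meta(before: dict, after: dict) -> dict:
    """Compare {url: {"title": ..., "description": ...}} snapshots taken
    before and after the redesign; return only the URLs that changed."""
    changes = {}
    for url, old_tags in before.items():
        new_tags = after.get(url)
        if new_tags is None:
            changes[url] = "page missing after redesign"
        elif new_tags != old_tags:
            changes[url] = {"before": old_tags, "after": new_tags}
    return changes

before = {"/": {"title": "Home", "description": "Welcome"},
          "/blog": {"title": "Blog", "description": "News"}}
after = {"/": {"title": "Home", "description": "Welcome"},
         "/blog": {"title": "Blog", "description": ""}}
print(diff_meta(before, after))
```

Every entry in the result is a page to investigate: either its tags were deliberately rewritten, or something was lost in the migration.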
Duplicate content
If several pages have the same content, Google will have difficulty figuring out which page it should display in the SERP, resulting in a rankings drop for all duplicated pages.
It can be hard to compare content, especially if you have a big website. The good news is that it is possible to detect such issues automatically.
Duplicate pages are pages with identical content. They can be hard to find because they may have different URLs, but there is a parameter that helps identify them. Enabling the “Content Hash” column displays a unique hash of each page’s content – a cryptographic fingerprint of it. If the hash is the same for two pages, their content is 100% identical – look through those occurrences and leave only one copy of the page.
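The idea behind the content hash is simple enough to sketch in a few lines of Python – hash each page's content and group URLs that share a hash (the pages below are toy examples):

```python
import hashlib
from collections import defaultdict

def find_duplicate_groups(pages: dict) -> list:
    """Hash each page's content and return groups of URLs whose
    content is byte-for-byte identical."""
    by_hash = defaultdict(list)
    for url, content in pages.items():
        digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
        by_hash[digest].append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

pages = {"/a": "<p>same body</p>",
         "/b": "<p>same body</p>",
         "/c": "<p>unique body</p>"}
print(find_duplicate_groups(pages))
```

The same grouping trick works for duplicate titles, descriptions, and H1s – you just hash (or directly compare) the tag value instead of the full page body.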
Another duplication scenario is identical meta tags. The title and meta description have a significant impact when a search engine evaluates the relevance of a page. Don’t confuse Google with identical meta tags across pages – none of those pages will rank high.
To find those pages, enable the “Duplicate Title” and “Duplicate Description” columns in the Crawled Pages report of the Website Audit tool or go to the corresponding section in the Issues report.
Similar to meta tags, the headline of a page takes priority over other (sub)headers when a search engine determines what the page is about. Make sure none of your pages have identical H1s, so they don’t compete with each other. As with the meta tag checks, enable the “Duplicate H1” column in the report.
Core Web Vitals
If you are implementing a new design, you should be sure that it will improve the user experience; otherwise, there is no point in doing it. Luckily, Google has developed Core Web Vitals – a set of metrics to help developers evaluate their UX.
With SE Ranking, you don’t need to check each page in Web.dev or Chrome’s Lighthouse since the Website Audit checks all pages for all necessary page experience metrics. Let’s look through them.
The very first thing is to check the Core Web Vitals: LCP, FID, and CLS. These are three parameters that reflect how well your website is performing UX-wise:
- Largest Contentful Paint (LCP) measures how fast the largest element in the first viewport is rendered in the user’s browser. This metric is closely connected to website loading speed, which we’ll discuss in a moment.
- First Input Delay measures how fast users can interact with the page (e.g., click on links, buttons). This parameter is also related to the site speed.
- Cumulative Layout Shift (CLS) shows how stable your content is during loading – it’s very annoying when elements change their size and position as the page loads.
This information can be found on the Overview dashboard or in the Issues Report → Performance section. The great thing is that this report is based on data from both real users and lab environment tests. This is the starting point for any further optimisation.
Two out of three Core Web Vitals are related to loading time. On top of that, statistics show that a third of users won’t wait longer than 3 seconds for a page to load. Therefore, it’s fair to say that website speed is a core parameter of user experience. Let’s see how SE Ranking can help you identify and solve loading time issues.
HTML
Since all web pages are written in HTML, it’s the first thing to validate after the redesign. The issues here are pretty basic, yet fundamental for further tweaks. Keep the HTML size down and avoid an excessively large DOM, since a huge DOM is challenging to render and hurts overall loading time. Also, ensure your HTML is compressed and cached to minimise document size and back-end processing time.
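To see why compression matters, here is a tiny standard-library Python experiment: gzip a repetitive HTML document (real pages tend to contain lots of repeated markup) and compare sizes. The exact ratio will vary from page to page.

```python
import gzip

# A toy HTML document with heavily repeated markup, as real pages tend to have.
html = b"<html>" + b"<div class='card'>repeated content</div>" * 200 + b"</html>"

compressed = gzip.compress(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({len(compressed) / len(html):.0%} of original)")
```

In practice you don't compress pages in application code – you enable gzip or Brotli in the web server configuration – but the size difference this demonstrates is exactly what the browser saves on every request.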
JS and CSS
Optimisation techniques for JS and CSS files are similar:
- Minify. Exclude all comments, line breaks, extra spaces, etc. – leave only the characters that matter.
- Merge. Try to serve as few files as possible (ideally, one JS and one CSS file per page). Each file request takes time for the server to respond and for the browser to download, so a single file is much more efficient.
- Cache. Enabling server-side caching means your files won’t be generated from scratch each time a user requests the page; saved copies will be served instead. This takes a lot of unnecessary load off servers and improves loading time.
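Minification itself is conceptually simple. Here is a deliberately naive Python sketch of stripping comments and whitespace from CSS – real projects should use a dedicated minifier (e.g. cssnano or esbuild) rather than this toy:

```python
import re

def naive_minify_css(css: str) -> str:
    """Toy CSS minifier: strip comments, collapse whitespace, and tighten
    spacing around punctuation. For real builds use a dedicated tool."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)   # drop /* comments */
    css = re.sub(r"\s+", " ", css)                    # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)      # no spaces around punctuation
    return css.strip()

print(naive_minify_css("body {\n  color: red; /* brand colour */\n}"))
```

Proper minifiers go much further (renaming, dead-code removal, safe handling of strings and URLs), which is why this sketch is for illustration only.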
Here are some general recommendations:
- Avoid using frameworks and themes
JS frameworks and premade CSS themes can save you time, but they often aren’t made with speed optimisation in mind. They ship with a lot of functionality you don’t need, and all those extra features may be too heavy to load.
- Be careful merging JS files
If there is a single error in a JS file, other functions within that file may stop working. So if all your code lives in one JS file, a single error could break every script on the page. It may be a clever decision to keep core functionality and front-end beautification in separate files.
Images
Regarding SEO, image optimisation is also essential:
- Alt text
Images should have alt text so search engines and people using screen readers can understand what the image is about – make sure all images still have meaningful alt text after the redesign.
- Image size
The best practice is to serve an image at a resolution equal to its parent container. However, it’s almost impossible to create image copies for all possible containers, since with responsive design containers change their sizes. What you can do is create several image variations (e.g. original size, 300px wide, 100px wide) to use in different places on the site. For example, the featured image of an article can be served at original size, while a smaller version of 300px width or less is better for the thumbnail on the archive page.
- Image compression
Image compression is the process of reducing file size, for instance by reducing the number of unique colours used in an image. Moderate compression is hardly noticeable to the human eye but can save up to 30% of the file size – and that’s a lot, considering images are large compared to other page resources.
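The image-variation idea above can be sketched in a few lines of Python (the widths are hypothetical; in HTML, the `srcset`/`sizes` attributes let the browser do this selection for you):

```python
def pick_variant(container_width: int, variants=(100, 300, 1200)) -> int:
    """Return the smallest pre-generated image width that still covers
    the container, falling back to the largest variant available."""
    for width in sorted(variants):
        if width >= container_width:
            return width
    return max(variants)

print(pick_variant(250))   # a 300px-wide variant covers a 250px container
```

Choosing the smallest variant that still covers the container keeps images sharp without wasting bandwidth on pixels the layout will never show.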
Mobile optimisation
Checking for mobile optimisation issues is mandatory, as the majority of traffic comes from mobile devices and Google has switched to mobile-first indexing. SE Ranking has this covered as well.
By navigating to the Mobile Optimisation category in the Issues report, you’ll be able to validate that your website is aligned with fundamental responsive design principles.
A redesign is a complicated process, especially considering that the redesigned site has to stay SEO-optimised. The best way to accomplish it is to keep SEO and performance best practices in mind.
Once you publish your changes, you need to check that everything is okay – this is where website health check tools come into play. To be sure that your new site is better, and that nothing has been broken SEO- or performance-wise, compare it to its previous version. SE Ranking does exactly what you need: it crawls, checks, and stores website audit data so you can easily monitor and compare results by date. Using such audits enables you to verify and improve the website’s health after the redesign.