In recent years, I've noticed that more and more websites are being developed with a JavaScript-first approach. This can create challenges with Client Side Rendering (CSR) and internal links that aren't referenced as HTML elements, making it difficult for search engine bots to fully identify content and the connections between pages. While there are many elements that could be fixed, my approach focuses on explaining to the development team why this method is problematic and partnering with them to implement a pre-rendering solution and add the "href" attribute to internal links, making the content easier for crawlers to read.
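To illustrate the link fix, here is a minimal sketch contrasting a JavaScript-only "link" with a crawlable anchor (the router.push call and the /pricing URL are hypothetical):

```html
<!-- Crawlers cannot reliably discover this JavaScript-only "link"
     (router.push is a hypothetical client-side navigation call): -->
<span onclick="router.push('/pricing')">Pricing</span>

<!-- Crawlable alternative: a real anchor element with an href attribute -->
<a href="/pricing">Pricing</a>
```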
One of the most common technical issues I've come across is the lack of an effective page retirement policy, particularly on e-commerce sites. Many webmasters tend to simply 404 pages for products that are out of stock or temporarily unavailable. This can hurt the site's performance from both an organic and a user experience perspective. From an organic standpoint, these pages could have driven significant traffic or had backlinks pointing to them. From a user experience standpoint, these pages could have been bookmarked by users who are now met with a 404. The simple solution is to put a page retirement policy in place: either automatically 301 redirect all product pages being retired, or assess the performance of those pages before redirecting by checking their individual performance in analytics and using tools to identify any backlinks pointing to them.
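As a rough illustration, assuming an Apache server and hypothetical URLs, a retired product page could be redirected to its closest category page in .htaccess:

```apache
# Hypothetical example: permanently redirect a retired product page
# to its closest category page instead of letting it return a 404.
Redirect 301 /products/blue-widget-2019 /category/widgets
```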
In my four years of practicing SEO, I've had several clients who struggled with duplicate content caused by incorrect rel=canonical tags. To fix this, I often start by using a tool like Google Search Console or Screaming Frog to identify which pages carry the incorrect attribute. Next, I determine the correct canonical URL by analyzing the content and purpose of each page. Once I have identified it, I update the rel=canonical tag on the affected pages to point to that URL, which can typically be done by modifying the HTML in the page's <head>. I also make sure that the content on the page matches the canonical URL to avoid any confusion for search engines. Finally, I monitor the site to ensure that the changes have been properly implemented and that search engines are indexing my preferred URL.
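For reference, the corrected tag looks like the following sketch, placed in the <head> of each duplicate or variant page and pointing at a hypothetical preferred URL:

```html
<!-- Points every duplicate/variant page at the preferred version -->
<link rel="canonical" href="https://www.example.com/products/widget" />
```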
One of the most commonly overlooked issues in SEO is poor indexation management, also known as Index Bloat or over-indexing. This happens when low-quality pages are indexed, which can drag down the site's average quality at the domain level. It is particularly common for large e-commerce companies and content-driven businesses, where only a small fraction of indexed pages (typically around 5-30%) provide real value. To solve this issue, businesses can use fully automated indexation logic or manual iterations: some variation of the so-called "Panda Diet" methodology. The approach involves identifying low-value pages based on business, SEO, technical, and cross-channel metrics and parameters. A score is then built by weighing these metrics against each other, and low-value pages are de-indexed (i.e. removed from the index). The internal linking architecture is also adjusted to fit this logic, and strategies for the remaining and newly created valuable pages are continued.
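As a minimal sketch, assuming the scoring logic can flag low-value pages in the page template, de-indexing is typically done with a robots meta tag:

```html
<!-- Rendered only on pages the scoring logic classifies as low-value -->
<meta name="robots" content="noindex, follow">
```

The "follow" directive lets crawlers continue to follow the page's links even though the page itself drops out of the index.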
A frequent technical SEO issue, especially for multi-regional and multilingual web properties, is improper hreflang setup. Hreflang attributes are vital for sending ranking signals between pages, preventing duplicate content issues, and ensuring users see the right website version in the SERPs. Hreflang is one of the factors Google has identified as passing ranking signals directly between pages. For example, a well-ranked US English blog post with an equivalent on a Mexican Spanish site may see improved Mexican rankings with proper hreflang implementation: the hreflang links indicate that if the US content satisfies the user's search intent, the Mexican content likely will too. Common mistakes include not having hreflang at all or having it set up incorrectly. Despite being tedious to audit and fix, correct hreflang can significantly impact organic rankings in some cases. You can implement hreflang through an XML sitemap, link elements in the HTML <head> section, or HTTP headers, as in the sketch below.
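For the US/Mexico example above, the link-element variant might look like this sketch (hypothetical URLs):

```html
<!-- Placed in the <head> of BOTH pages; each version must list
     every version, including itself. Missing return links are one
     of the most common implementation errors. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/blog/post/" />
<link rel="alternate" hreflang="es-mx" href="https://www.example.com/mx/blog/post/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/blog/post/" />
```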
One common technical SEO issue I come across often is cannibalisation. I use the performance report in Google Search Console to find out where cannibalisation is present. First, I open the site's property and generate the performance report. Next, I look at the queries and see which of them have more than one page ranking for the same keyword. Then I look into the intent of the keyword to see whether it is similar across more than one URL. If the intent is similar, cannibalisation is present and will hold back ranking efforts. To solve this, I manually review the pages and, depending on which serves the purpose better, either consolidate the content, create new landing pages, use redirects, or delete content.
A common issue I see during technical audits is either a missing XML sitemap or a sitemap that is not used correctly. Quite often, the file is placed inside a folder, such as domain.com/folder/sitemap.xml. This is a problem because, under the sitemap protocol, a file in that location can only list URLs within /folder/. When I encounter this, I work with the client and developers to update the XML sitemap so that it contains all of the necessary URLs (indexable URLs with a 200 status code) and to move it to the root, so it sits at domain.com/sitemap.xml. This helps with the indexation of content by allowing search engines to read all URLs within the domain. It is important to note that this does not guarantee that all of the URLs will be indexed, but it does help with content discovery.
I have found as many as six different homepage versions in the past, including https, non-https, www, non-www, /index, and .htm variants. This is a common technical SEO problem, especially when a new website is built. To search engines, these URLs are distinct pages and are indexed separately, causing serious duplicate content problems. Furthermore, link juice is divided among all versions of the homepage, diluting the domain's backlink profile. To address this issue, the best approach is to choose one URL as the main URL and consolidate the other versions into it with 301 redirects; the .htaccess file is a good place to do this on Apache servers. Setting the preferred domain in Google Search Console can also help resolve the issue.
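For example, assuming an Apache server and https + www as the chosen main version, a consolidating rule set in .htaccess might look like this sketch:

```apache
# Send every non-https or non-www request to the single preferred
# homepage version with a 301 (permanent) redirect.
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```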
Multiple times I've seen websites that either do not have a sitemap.xml file or have implemented one in a way that is unreadable by search engines. A main purpose of sitemaps is to help search engines know which pages on a website should show up in search results. There are two main ways to fix this issue. The easiest is to use a plugin or sitemap generator tool, such as Yoast SEO (if using WordPress), to generate a sitemap for the website. Depending on the plugin or tool, it may automatically add the file to your site, or you may have to upload the generated file yourself. Luckily, sitemap files are easy to code, so if needed one can be created manually with basic coding skills, as in the example below.
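A hand-coded file only needs the basic sitemap protocol structure; here is a minimal sketch with hypothetical URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```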
Internal linking is crucial for enhancing a website's structure, search engine visibility, and user experience. Here are some suggestions to improve your internal linking strategy:
1. Conduct an internal linking audit.
2. Prioritize important pages.
3. Use descriptive anchor text (see the snippet below).
4. Create a hierarchical structure.
5. Use internal linking to support topical relevance.
6. Fix broken internal links.
7. Avoid over-linking.
By following these steps, it is possible to enhance the internal linking structure of your website and, as a result, benefit from improved user experience and search engine visibility.
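To illustrate point 3, compare vague and descriptive anchor text (hypothetical URL):

```html
<!-- Vague: gives search engines no context about the target page -->
<a href="/guides/technical-seo/">click here</a>

<!-- Descriptive: reinforces the target page's topic -->
Read our <a href="/guides/technical-seo/">technical SEO audit guide</a>.
```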
A technical SEO issue I addressed was improper heading tag usage on a client's WordPress site. The disorganized heading structure affected user experience and search engine understanding of the content hierarchy. To resolve this, I:
1. Audited the site for improper heading usage, including missing, duplicate, or incorrectly nested tags.
2. Revised the heading structure on affected pages, using a single H1 tag and appropriate H2 and H3 tags for subsections (see the sketch below).
3. Incorporated targeted keywords into headings to improve keyword consistency and relevance.
4. Ensured a clear, logical heading hierarchy for both users and search engines.
5. Applied CSS styling to make headings visually distinguishable and appealing.
This improved user experience, made content more accessible to search engines, and increased the potential for higher search rankings.
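The revised structure in step 2 followed a pattern like this sketch (hypothetical headings; the indentation is only for readability):

```html
<h1>Complete Guide to Technical SEO</h1>      <!-- exactly one H1 per page -->
  <h2>Crawling and Indexing</h2>
    <h3>Robots.txt</h3>
    <h3>XML Sitemaps</h3>
  <h2>Site Architecture</h2>
```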
Structured data markup enables you to mark up the content on your site in order to enrich Google search results. You can see examples of 'rich snippets' when you search for a product and ratings out of 5 stars are shown in the SERPs underneath the title, alongside the number of ratings, pricing, and other details. Many types of web pages can have structured data markup: articles, recipes, products, services, job postings, local businesses, etc. The key benefit of implementing it is that more information is delivered to the user directly in the SERPs, which can increase click-through rate (CTR). You can use free tools such as Google's Structured Data Markup Helper, schema.org, and plugins for your CMS to help you create structured data markup for your site. But it should be noted that applying markup does not guarantee that rich snippets will show up in the SERPs; it gives your site the best chance of having rich snippets feature in the results. Don't forget it!
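For instance, the star-ratings snippet described above is usually powered by schema.org Product markup in JSON-LD; here is a minimal sketch with made-up values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "ratingCount": "127"
  },
  "offers": {
    "@type": "Offer",
    "price": "19.99",
    "priceCurrency": "USD"
  }
}
</script>
```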
Enhancing Lead Generation and Conversions: The Power of Postponed CTAs. In the ever-evolving landscape of online marketing, businesses are continually seeking effective strategies to boost their lead generation and conversion rates. While on-page SEO optimizations play a vital role in attracting traffic, one often overlooked yet powerful technique is the implementation of postponed call-to-action (CTA) blocks on web pages. In my answer, I want to explain the benefits of this approach and how it can significantly improve both user experience and conversion rates. I was working on a lead generation website, and the client was strongly focused on converting traffic. I implemented many on-page SEO optimizations, but one important thing that helped us get results was a postponed CTA block on the page: the call to action appears only after the visitor has engaged with the content, rather than interrupting them on load. Google states that we should avoid intrusive interstitials and dialogs, so make sure the CTA is implemented in a non-intrusive way. This implementation helped us increase traffic to money pages.
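One non-intrusive way to postpone a CTA is to reveal it only once the reader reaches the end of the content; here is a minimal sketch (it assumes a hypothetical element with id="article-end" marking that point, and a hypothetical /demo URL):

```html
<div id="cta" hidden>
  <a href="/demo">Book a demo</a>
</div>
<script>
  // Reveal the CTA only when the end of the article scrolls into view,
  // instead of interrupting the visitor on page load.
  const cta = document.getElementById('cta');
  const observer = new IntersectionObserver((entries) => {
    if (entries[0].isIntersecting) {
      cta.hidden = false;
      observer.disconnect();
    }
  });
  observer.observe(document.getElementById('article-end'));
</script>
```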
One of the main problems I encounter when performing technical SEO audits is the lack of crawl budget optimisation. Blocking the crawling of certain parts of your site is essential to optimise crawl budget. Thus, building a customised robots.txt, adding "noindex" directives, avoiding 404 errors, and optimising sitemaps are some of the most important tasks to maximise your site's crawl budget. Designing and implementing a website architecture focused on SEO is key to improving crawl budget and facilitating crawling and indexing by Google. It's important to work towards a flat SEO architecture from the main menu, where each page of the site can be reached in three or four clicks at most, thanks to intuitive navigation menus. Avoiding the infinite generation of parameterised URLs from faceted search in e-commerce category menus is one of the main technical issues to fix; here, robots.txt and "noindex" directives play a fundamental role in crawl budget optimisation.
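As a sketch, assuming hypothetical facet parameters, the robots.txt rules might look like this:

```
# Hypothetical rules: keep crawlers out of faceted-search URLs
User-agent: *
Disallow: /*?color=
Disallow: /*&sort=
Sitemap: https://www.example.com/sitemap.xml
```

One caveat: a page blocked in robots.txt cannot be crawled, so Google will never see a "noindex" tag on it; apply one control or the other to a given URL, not both.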
Clients often try to rank for a bunch of keywords that are only loosely tied together by a common topic. This approach usually brings minimal results and leads to frustration among leaders, who decide that "SEO doesn't work". I advise my clients to concentrate on a single topic, two at most, and explore it from all angles while also building their marketing funnel; later, they can add more topics to the mix. Building topic clusters should be combined with a well-thought-out website structure and an internal linking strategy that makes sense to search engines. Once you start publishing content on different topics, make sure it's neatly organized on your website, for example in different blog or menu categories. Start with bottom-of-the-funnel content to capture leads who are considering making a purchase, and link to relevant sales or demo pages; this helps users and search engines alike understand the internal logic of your website.
Canonical URL issues pose several challenges for SEO. Google avoids indexing identical content, so when it detects duplicate pages, it chooses a default or canonical version of the page and omits all other versions from search results. This can be problematic if the URL Google selects is not the preferred URL that should be indexed. Also, if your content is accessible through several URLs, external sites may cite different URLs when referencing it, distributing your link equity over multiple pages and reducing its potency. When it comes to fixing canonical issues on a website, there are two main methods: implementing 301 redirects or adding canonical tags to your site's pages. By using canonical tags, you tell Google which of the similar pages is preferred; with 301 redirects, you permanently redirect traffic from one URL to another.