SEO and .NET Core

SEO is both an art and a science. Historically, it has sat in the sphere of web design rather than software development.

When a business grows beyond the normal WordPress website into a large web application, the line between the search engine optimiser and software developer becomes blurred – as does the responsibility.

Though I do not deny the importance of SEO factors such as backlinks, domain age and content optimisation, we also need to consider the technical aspect of SEO: the code itself.

That is the purpose of this article.

Tracking and fixing SEO issues

We know that page speed affects SEO, yet finding the issues that affect performance can be challenging. At the lowest level, one can use Google PageSpeed Insights or GTmetrix to understand the page speed issues.

Monitoring where the slowness is happening will give you a better understanding. This could be done by custom code or by using Azure Monitor.
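
As an illustration of the custom-code route, the sketch below is a minimal piece of ASP.NET Core middleware that logs any request slower than a chosen threshold. The class name and the 500 ms threshold are just examples:

using System.Diagnostics;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;

public class RequestTimingMiddleware
{
    private readonly RequestDelegate _next;
    private readonly ILogger<RequestTimingMiddleware> _logger;

    public RequestTimingMiddleware(RequestDelegate next, ILogger<RequestTimingMiddleware> logger)
    {
        _next = next;
        _logger = logger;
    }

    public async Task InvokeAsync(HttpContext context)
    {
        var stopwatch = Stopwatch.StartNew();
        await _next(context);
        stopwatch.Stop();

        // Flag anything slower than 500 ms so the slow pages stand out in the logs.
        if (stopwatch.ElapsedMilliseconds > 500)
        {
            _logger.LogWarning("Slow request: {Path} took {Elapsed} ms",
                context.Request.Path, stopwatch.ElapsedMilliseconds);
        }
    }
}

Register it early in the pipeline with app.UseMiddleware<RequestTimingMiddleware>() in Startup.Configure so that it times the whole request.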

Chrome Developer Tools (I call them the F12 tools) also have a great interface for checking performance, accessed via F12 > Performance > Record. Refresh the page and you will receive loads of information about FCP (First Contentful Paint), LCP and the response times of individual scripts.

Infrastructure factors affecting SEO

Before we get into the detail of MVC and .NET Core applications, I want to mention that page speed and responsiveness are exceptionally important for improving your SEO score.

I find that many companies like to have a custom setup of their system on a virtual machine – often running the databases, web applications and third-party services on the same machine. Though this might make sense, website performance can suffer when traffic spikes, SQL jobs run, or other workloads compete for the same resources.

In certain cases, using cloud services such as Azure or AWS can improve website performance. These services are optimised for performance (replication and redundancy) and stability (scalability). Examples include Azure web services and Azure Elastic.

Code factors affecting SEO

Depending on whether a company optimises a solution for delivery, performance or configurability, resolving SEO issues will require changes to the MVC views, the C# business logic or the persistence of data (the database or Elastic).

Largest Contentful Paint (LCP)

Largest Contentful Paint (LCP) is a Core Web Vitals metric and measures when the largest content element in the viewport becomes visible.

(Source: web.dev)

LCP has a lot to do with making everything in the initial viewport load as fast as possible. The largest content tends to be images and videos and sometimes large scripts. These can be optimised as follows:

  • Use Gzip compression
    • The following NuGet package handles this for script and text responses (link here to implementation) – see the setup sketch after this list.
      Microsoft.AspNetCore.ResponseCompression
    • Change the server settings to allow all images to use Gzip compression. If you’re using a CDN for your images, make sure it supports Gzip.
  • Avoid having huge script files blocking the rendering of the HTML and CSS. Place less important files at the bottom of the page and only have globally essential scripts in the _layout.cshtml file. 
  • Avoid inline styles as this delays drawing the UI. 
  • Pick your website’s front-end plugins carefully: some plugins can cause massive performance issues due to browser-side rendering of content.
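
To give an idea of the Gzip setup, the snippet below is a minimal sketch of enabling response compression via the Microsoft.AspNetCore.ResponseCompression package in Startup.cs. The exact options you choose will depend on your site:

// In Startup.cs
using System.IO.Compression;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.ResponseCompression;
using Microsoft.Extensions.DependencyInjection;

public void ConfigureServices(IServiceCollection services)
{
    services.AddResponseCompression(options =>
    {
        options.EnableForHttps = true;                     // also compress responses served over HTTPS
        options.Providers.Add<GzipCompressionProvider>();  // serve Gzip to clients that accept it
    });

    services.Configure<GzipCompressionProviderOptions>(options =>
        options.Level = CompressionLevel.Fastest);

    services.AddMvc();
}

public void Configure(IApplicationBuilder app)
{
    // Compression must be registered before the middleware that writes the responses.
    app.UseResponseCompression();
    app.UseStaticFiles();
    app.UseMvcWithDefaultRoute();
}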

Website speed and performance

As page speed also indirectly affects your SEO rating – for example, people leaving because they have waited too long for the page to load (bounce rate) – it is important to make sure your application loads fast. The following needs to be considered for making your website faster:

  • If possible, implement caching (see the sketch after this list). In some cases, this is challenging, especially in an e-commerce solution where products sell out quickly.
  • Leverage browser caching – oftentimes, developers add a query string to a script to reload it every time. Avoid this if possible.
  • Reduce redirects: avoid the following on GET operations:
    • return RedirectToAction("Index");
    • return Redirect("Area/Controller/Index");
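
As a rough sketch of the caching idea: the ProductService, IProductRepository and Product names below are made up for the example, the 30-second expiry is arbitrary, and it assumes services.AddMemoryCache() has been registered in Startup.ConfigureServices:

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

// Hypothetical repository and service names, purely for illustration.
public interface IProductRepository
{
    Task<List<Product>> GetFeaturedProductsAsync();
}

public class ProductService
{
    private readonly IMemoryCache _cache;
    private readonly IProductRepository _repository;

    public ProductService(IMemoryCache cache, IProductRepository repository)
    {
        _cache = cache;
        _repository = repository;
    }

    public Task<List<Product>> GetFeaturedProductsAsync()
    {
        // A short expiry keeps stock levels reasonably fresh while still saving database round trips.
        return _cache.GetOrCreateAsync("featured-products", entry =>
        {
            entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(30);
            return _repository.GetFeaturedProductsAsync();
        });
    }
}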

Sitemap and the robots.txt file

A robots.txt file tells search engines what should and shouldn’t be crawled. This is thus the perfect place to reference the sitemap.xml file! A sitemap references all pages that you want Google to spider. Note that other pages may be included if another website references them.
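
A minimal robots.txt that points the crawlers at the sitemap could look like this (the Disallow rule and the domain are only examples):

User-agent: *
Disallow: /admin/
Sitemap: https://www.example.com/sitemap.xml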

There are a few ways to create the sitemap itself:

  • For general static sites, one can use sitemap generators and add the file to the solution. 
  • For dynamic sites such as eCommerce sites, the sitemap needs to be generated from the data itself – for example, via a controller action or a scheduled job (see the sketch below).
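
The sketch below shows one possible approach, assuming MVC attribute routing is enabled: a controller action that builds sitemap.xml from the catalogue. ISitemapPageSource is a made-up abstraction that returns the public URLs you want indexed:

using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Xml.Linq;
using Microsoft.AspNetCore.Mvc;

// Hypothetical abstraction that returns the URLs of product and category pages from the database.
public interface ISitemapPageSource
{
    Task<IEnumerable<string>> GetPublicUrlsAsync();
}

public class SitemapController : Controller
{
    private readonly ISitemapPageSource _pages;

    public SitemapController(ISitemapPageSource pages)
    {
        _pages = pages;
    }

    [HttpGet("/sitemap.xml")]
    public async Task<IActionResult> Index()
    {
        XNamespace ns = "http://www.sitemaps.org/schemas/sitemap/0.9";
        var urls = await _pages.GetPublicUrlsAsync();

        // Build the <urlset> document with one <url><loc> entry per page.
        var document = new XDocument(
            new XElement(ns + "urlset",
                urls.Select(u => new XElement(ns + "url",
                    new XElement(ns + "loc", u)))));

        return Content(document.ToString(), "application/xml", Encoding.UTF8);
    }
}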

Cumulative Layout Shift (CLS)

Imagine trying to click a link, but it keeps moving. For this reason, Google added CLS. This metric was added to favour sites with a better user experience – where the layout does not shift around. The image below (source here) illustrates the issue nicely.

As .NET Core is built around asynchronous behaviour, it sometimes makes sense to fetch content asynchronously and draw it as it arrives at the browser.

The same happens with third-party JavaScript plugins and integrations where content gets appended to a div – causing the layout to shift down – and with user controls such as carousels and accordions.

There are two potential work-arounds:

  • Add a fixed height for the container that the content will be rendered into; this stops the shift from happening (see the snippet after this list).
  • Move the business logic of third-party components to the server, so that all the data is rendered together on the initial response.
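
For the first workaround, here is a small sketch of reserving space for a third-party widget. The class name and the 320px value are only examples – use the real rendered height of the component:

<style>
    /* Reserve roughly the height the widget will occupy once its script has run. */
    .reviews-widget-placeholder {
        min-height: 320px;
    }
</style>

<div id="reviews-widget" class="reviews-widget-placeholder">
    <!-- the third-party script appends its markup here; the reserved height prevents the shift -->
</div>
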
Finding what is shifting the page can be challenging at times. The Chrome developer tools can assist with finding the culprits: check the Performance tab and expand the Experience section – more info here.

Compression and Minification

Performance and page speed are central to Google and SEO, and many of the points above already touch on this. Historically (in classic ASP.NET MVC), bundles were added in the C# code like this:

 bundles.Add(new ScriptBundle("~/bundles/bs-jq-bundle").Include(
                      "~/Scripts/bootstrap.js",
                      "~/Scripts/jquery-3.3.1.js"));

With .NET Core, this has changed drastically. Online there are examples where a tag is now added to include or exclude the bundled files per environment, as in this snippet:

<environment exclude="Development">
    <link rel="stylesheet"
          href="https://ajax.aspnetcdn.com/ajax/bootstrap/3.3.7/css/bootstrap.min.css"
          asp-fallback-href="~/lib/bootstrap/dist/css/bootstrap.min.css"
          asp-fallback-test-class="sr-only"
          asp-fallback-test-property="position"
          asp-fallback-test-value="absolute" />

    <link rel="stylesheet"
          href="~/css/site.min.css"
          asp-append-version="true" />
</environment>

I have, however, found that for some odd reason this doesn’t always work – especially with scripts that are loaded from a CDN. The other way is to add the bundles in bundleconfig.json (full example here):

[
    {
        "outputFileName": "wwwroot/css/site.min.css",
        "inputFiles": [
            "wwwroot/lib/bootstrap/dist/css/bootstrap.css",
            "wwwroot/css/site.css"
        ]
    }
]

This approach requires the ‘BuildBundlerMinifier‘ NuGet package, but it fits in well with a CI/CD pipeline, as per the link here – and it combines nicely with the <environment> tags above for local debugging.
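
For completeness, the package is referenced in the .csproj like any other NuGet dependency, so the bundling runs on every build – including the CI/CD build. The version number below is only an example; use the latest:

<ItemGroup>
  <!-- Runs the bundling and minification defined in bundleconfig.json as part of the build. -->
  <PackageReference Include="BuildBundlerMinifier" Version="3.2.449" />
</ItemGroup>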

Conclusion

As much as SEO is an art, it is also a science. Developers need to be more cognisant that business-to-customer solutions might need to be indexed by search engines. Putting the appropriate tooling in place to help the marketing team is essential to a successful online strategy.

Though there are many libraries that can help with sitemaps, compression and minification, and website speed and performance in general, implementing them will depend on the infrastructure and on the development team’s ability to accommodate the changes alongside their existing workload.

Though SEO has historically not been something software engineers focus on, the shift towards becoming more customer-focused is changing the status quo.

Enjoy your business