<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: MylifeforAiur</title>
    <description>The latest articles on Forem by MylifeforAiur (@mylifeforaiur).</description>
    <link>https://forem.com/mylifeforaiur</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F979813%2F6687b7e0-e895-4271-a5ab-cb4432cc0df8.png</url>
      <title>Forem: MylifeforAiur</title>
      <link>https://forem.com/mylifeforaiur</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/mylifeforaiur"/>
    <language>en</language>
    <item>
      <title>React SEO: manage sitemaps</title>
      <dc:creator>MylifeforAiur</dc:creator>
      <pubDate>Thu, 22 Dec 2022 09:25:00 +0000</pubDate>
      <link>https://forem.com/mylifeforaiur/react-seo-manage-sitemaps-34o9</link>
      <guid>https://forem.com/mylifeforaiur/react-seo-manage-sitemaps-34o9</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzu71spk0vbfpbpns9lt3.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzu71spk0vbfpbpns9lt3.jpg" alt="maze with instruction"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Background
&lt;/h2&gt;

&lt;p&gt;This is another step in the &lt;a href="https://dev.to/mylifeforaiur/gatsby-site-seo-action-list-2c59"&gt;React SEO improvement series with Gatsby&lt;/a&gt;. A sitemap serves as a report of your site's content and helps crawlers grab the nuts and bolts of the site.&lt;/p&gt;

&lt;p&gt;According to Google:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;A sitemap is a file where you provide information about the pages, videos, and other files on your site, and the relationships between them. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Sitemaps are especially helpful if:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Your site is large and has many pages with low traffic&lt;/li&gt;
&lt;li&gt;Your site is new and eager to get noticed&lt;/li&gt;
&lt;li&gt;Your site is rich in media content like videos and images&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Problem
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The site needs a sitemap automation tool that generates the sitemap with your production build, so it stays maintenance free.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Some URLs, such as those generated by client-side rendering, are not recognised by the sitemap tool. You need to attach them manually while keeping the automated part running. &lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Solution
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Again, based on my guinea pig &lt;a href="https://github.com/hurricanew/gatsby-starter-blog" rel="noopener noreferrer"&gt;gatsby blog starter site&lt;/a&gt;, I picked the &lt;a href="https://www.gatsbyjs.com/plugins/gatsby-plugin-sitemap/" rel="noopener noreferrer"&gt;Gatsby sitemap plugin&lt;/a&gt; as the tool to generate the sitemap on build. Simply install it:
&lt;code&gt;npm install gatsby-plugin-sitemap&lt;/code&gt; and add it to gatsby-config.js&lt;/li&gt;
&lt;/ul&gt;
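&lt;p&gt;A minimal sketch of the config change (the &lt;code&gt;siteUrl&lt;/code&gt; value below is a placeholder; the plugin requires it to build absolute URLs):&lt;/p&gt;

```javascript
// gatsby-config.js -- minimal sketch; replace siteUrl with your own domain.
// gatsby-plugin-sitemap reads siteMetadata.siteUrl to build absolute URLs.
module.exports = {
  siteMetadata: {
    siteUrl: "https://example.com", // placeholder
  },
  plugins: ["gatsby-plugin-sitemap"],
}
```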

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyfx4yip529e56q6nxg0s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fyfx4yip529e56q6nxg0s.png" alt="gatsby sitemap plugin"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It will check all pages and generate a sitemap like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="cp"&gt;&amp;lt;?xml version="1.0" encoding="UTF-8"?&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;urlset&lt;/span&gt; &lt;span class="na"&gt;xmlns=&lt;/span&gt;&lt;span class="s"&gt;"http://www.sitemaps.org/schemas/sitemap/0.9"&lt;/span&gt;
  &lt;span class="na"&gt;xmlns:news=&lt;/span&gt;&lt;span class="s"&gt;"http://www.google.com/schemas/sitemap-news/0.9"&lt;/span&gt;
  &lt;span class="na"&gt;xmlns:xhtml=&lt;/span&gt;&lt;span class="s"&gt;"http://www.w3.org/1999/xhtml"&lt;/span&gt;
  &lt;span class="na"&gt;xmlns:image=&lt;/span&gt;&lt;span class="s"&gt;"http://www.google.com/schemas/sitemap-image/1.1"&lt;/span&gt;
  &lt;span class="na"&gt;xmlns:video=&lt;/span&gt;&lt;span class="s"&gt;"http://www.google.com/schemas/sitemap-video/1.1"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;url&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;loc&amp;gt;&lt;/span&gt;https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/hello-world/&lt;span class="nt"&gt;&amp;lt;/loc&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;changefreq&amp;gt;&lt;/span&gt;daily&lt;span class="nt"&gt;&amp;lt;/changefreq&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;priority&amp;gt;&lt;/span&gt;0.7&lt;span class="nt"&gt;&amp;lt;/priority&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/url&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;url&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;loc&amp;gt;&lt;/span&gt;https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/my-second-post/&lt;span class="nt"&gt;&amp;lt;/loc&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;changefreq&amp;gt;&lt;/span&gt;daily&lt;span class="nt"&gt;&amp;lt;/changefreq&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;priority&amp;gt;&lt;/span&gt;0.7&lt;span class="nt"&gt;&amp;lt;/priority&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/url&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/urlset&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;⚠️ &lt;strong&gt;changefreq and priority will be ignored by Google bots&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;That's the moving part, but how do you append fixed URLs that are not auto-generated? The answer is a multi-location sitemap (a sitemap index). This index is submitted to the &lt;a href="https://search.google.com/search-console/about" rel="noopener noreferrer"&gt;Google Search Console&lt;/a&gt; of your site and referenced in robots.txt. A sample looks like:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="cp"&gt;&amp;lt;?xml version="1.0" encoding="UTF-8"?&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;sitemapindex&lt;/span&gt; &lt;span class="na"&gt;xmlns=&lt;/span&gt;&lt;span class="s"&gt;"http://www.sitemaps.org/schemas/sitemap/0.9"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;sitemap&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;loc&amp;gt;&lt;/span&gt;https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/sitemap-0.xml&lt;span class="nt"&gt;&amp;lt;/loc&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/sitemap&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;sitemap&amp;gt;&lt;/span&gt;
    &lt;span class="nt"&gt;&amp;lt;loc&amp;gt;&lt;/span&gt;https://gatsbystarterblogsource.gatsbyjs.io/gatsby-starter-blog/sitemap-manual.xml&lt;span class="nt"&gt;&amp;lt;/loc&amp;gt;&lt;/span&gt;
  &lt;span class="nt"&gt;&amp;lt;/sitemap&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;/sitemapindex&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the above, &lt;code&gt;sitemap-0.xml&lt;/code&gt; is generated by the sitemap plugin, so it keeps track of all the Gatsby site URLs, while &lt;code&gt;sitemap-manual.xml&lt;/code&gt; is the extra one you declare yourself.&lt;/p&gt;
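&lt;p&gt;On the robots.txt side, referencing the index is a single extra line. A sketch (the domain and the index file name &lt;code&gt;sitemap-index.xml&lt;/code&gt; are assumptions; use whatever your index is actually published as):&lt;/p&gt;

```
# robots.txt -- illustrative; domain and index name are placeholders
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap-index.xml
```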

&lt;p&gt;Last but not least, don't forget to submit the sitemap URL in Google Search Console:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhif7mro6h0v56qx0mh6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdhif7mro6h0v56qx0mh6.jpg" alt="sitempap in google search console"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is the teleport to the &lt;a href="https://github.com/gatsbyjs/gatsby-starter-blog/commit/b0d1f7c7dd0b0362925c3998090f77889ca63eca" rel="noopener noreferrer"&gt;commit&lt;/a&gt; for devs too busy to read:&lt;/p&gt;

&lt;h2&gt;
  
  
  Next
&lt;/h2&gt;

&lt;p&gt;Champagne 🍾 time! With the sitemap and the &lt;a href="https://dev.to/mylifeforaiur/gatsby-seo-manage-robot-text-file-in-different-environments-2lb"&gt;robots.txt improvement&lt;/a&gt;, your site is off to a good SEO start, and you should see some improvement in impressions and clicks in &lt;a href="https://search.google.com/search-console/about" rel="noopener noreferrer"&gt;Google Search Console&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The next topic will be tips on site metadata. SEO on!&lt;/p&gt;

&lt;h2&gt;
  
  
  Ref links
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://developers.google.com/search/docs/crawling-indexing/sitemaps/overview" rel="noopener noreferrer"&gt;Sitemaps Overview&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.gatsbyjs.com/plugins/gatsby-plugin-sitemap/" rel="noopener noreferrer"&gt;Gatsby sitemap plugin&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://search.google.com/search-console/about" rel="noopener noreferrer"&gt;Google search console&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>webdev</category>
      <category>javascript</category>
      <category>gatsby</category>
      <category>seo</category>
    </item>
    <item>
      <title>Gatsby SEO: Manage robot text file in different environments</title>
      <dc:creator>MylifeforAiur</dc:creator>
      <pubDate>Wed, 14 Dec 2022 23:22:34 +0000</pubDate>
      <link>https://forem.com/mylifeforaiur/gatsby-seo-manage-robot-text-file-in-different-environments-2lb</link>
      <guid>https://forem.com/mylifeforaiur/gatsby-seo-manage-robot-text-file-in-different-environments-2lb</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RxmumXr_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p8m4xghn9t9drd9ijrn4.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RxmumXr_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/p8m4xghn9t9drd9ijrn4.jpg" alt="crawling bot" width="880" height="569"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Background
&lt;/h2&gt;

&lt;p&gt;When we talk about SEO, the very first thing to handle is robots.txt, which tells search engine crawlers which URLs they can access on your site. In the dev environment you want none of your pages-in-the-making to be visible in Google search results; in prod you want the opposite. robots.txt is the gatekeeper for this. A robots.txt file has four directives:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;User-agent - specifies which bots the rules apply to, e.g. Googlebot, AdsBot-Google, bingbot, slurp&lt;/li&gt;
&lt;li&gt;Disallow - specifies the files and directories bots are blocked from&lt;/li&gt;
&lt;li&gt;Allow - the default is allow; this whitelists URLs, e.g. you can disallow a directory and allow just a few files inside it&lt;/li&gt;
&lt;li&gt;Sitemap - references the location of your sitemaps. Let's leave sitemaps for another post.&lt;/li&gt;
&lt;/ul&gt;
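&lt;p&gt;Putting the four directives together, an illustrative robots.txt (all paths here are made-up example values) could read:&lt;/p&gt;

```
# Illustrative robots.txt -- paths and domain are example values only
User-agent: *
Disallow: /drafts/
Allow: /drafts/public-preview.html

Sitemap: https://example.com/sitemap-index.xml
```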

&lt;h2&gt;
  
  
  Problem
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;The site starter/template doesn't have a robots.txt, or the file doesn't change between development and production environments. Worse, a disallow setting may accidentally be left in the production env. &lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;em&gt;a site with a mis-configured disallow can block search engines badly&lt;/em&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--etwGq3Kj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5wye4wlf8fjszdc2vzl2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--etwGq3Kj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5wye4wlf8fjszdc2vzl2.png" alt="site with mis-configured robot.txt" width="880" height="506"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Solution
&lt;/h2&gt;

&lt;p&gt;Here is an example for a Gatsby starter site, but the principle applies to all starters:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Install a plugin/node module to manage the robots.txt file - this is overkill in my opinion. Updating static files according to build stage is simple enough, and maintaining an extra library is always an overhead.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Save two versions of the text file and swap them per build stage. I vote for this solution for its sheer simplicity. The steps:&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Step 1. Change the default file used by the site to disallow everything. The sample is a Gatsby site, so the default file lives in the static folder.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--odbwscqV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s0pm9kq017715w4pr9oi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--odbwscqV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s0pm9kq017715w4pr9oi.png" alt="dev env to disallow" width="880" height="238"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 2. Add robot-prod.txt to a directory like seo/, listing only the disallowed URLs. URLs are relative and regular-expression ready.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9-nL2gp4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h5jdqpwbvniuf8e90qch.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9-nL2gp4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h5jdqpwbvniuf8e90qch.png" alt="sample robot.txt for production" width="880" height="305"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 3. Update your build script to copy it to the root folder.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GPXuUkAV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9nswvojed9f7rdugi6ed.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GPXuUkAV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9nswvojed9f7rdugi6ed.png" alt="command to add robot.txt to production" width="880" height="328"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 4. Verify that search results change; note this only takes effect after the search engine's cache refreshes, which for Google can take around 24 hours. &lt;/p&gt;

&lt;p&gt;For devs' reference, here is the sample commit that makes it work on my gatsby blog starter: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/gatsbyjs/gatsby-starter-blog/commit/075e61748c8e90eb09621ae6b812225d7607da07"&gt;https://github.com/gatsbyjs/gatsby-starter-blog/commit/075e61748c8e90eb09621ae6b812225d7607da07&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Call out
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;robots.txt must be placed in the root folder of the site.&lt;/li&gt;
&lt;li&gt;Rules apply to relative URLs only.&lt;/li&gt;
&lt;li&gt;Disallow does not guarantee your pages will fly under the radar; they can still get indexed if other pages link to them. Use the &lt;code&gt;noindex&lt;/code&gt; robots meta tag or the &lt;code&gt;X-Robots-Tag&lt;/code&gt; HTTP header to completely block bots.&lt;/li&gt;
&lt;/ol&gt;
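&lt;p&gt;For reference, the two "completely block" options from point 3 look like this (the header is set by your server or CDN config; both lines are illustrative fragments, not a file to copy as-is):&lt;/p&gt;

```
# HTTP response header, set by the server/CDN:
X-Robots-Tag: noindex

# or a meta tag in the page head:
&lt;meta name="robots" content="noindex"&gt;
```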

&lt;h2&gt;
  
  
  Next
&lt;/h2&gt;

&lt;p&gt;Hola, one commit and three steps to bring your robots.txt completely under control. 🐢 The fruit can't hang any lower, so chuck the code into your repo now. In the next post I will talk about the site menu for crawl bots: the sitemap.  &lt;/p&gt;

&lt;h2&gt;
  
  
  Other articles of this series
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://dev.to/mylifeforaiur/gatsby-site-seo-action-list-2c59"&gt;SEO action list&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Ref links
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://yoast.com/ultimate-guide-robots-txt/"&gt;The ultimate guide to robots.txt&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/gatsbyjs/gatsby-starter-blog/commit/075e61748c8e90eb09621ae6b812225d7607da07"&gt;robot txt FAQ&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://developers.google.com/search/docs/crawling-indexing/block-indexing"&gt;Completely block search bots&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/hurricanew/gatsby-starter-blog"&gt;Gatsby starter with SEO action list&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>gatsby</category>
      <category>seo</category>
      <category>webdev</category>
      <category>javascript</category>
    </item>
    <item>
      <title>Gatsby site SEO action list</title>
      <dc:creator>MylifeforAiur</dc:creator>
      <pubDate>Wed, 07 Dec 2022 05:07:04 +0000</pubDate>
      <link>https://forem.com/mylifeforaiur/gatsby-site-seo-action-list-2c59</link>
      <guid>https://forem.com/mylifeforaiur/gatsby-site-seo-action-list-2c59</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--p34Wj6aC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zulilh8gwf3fjz0jc42h.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--p34Wj6aC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zulilh8gwf3fjz0jc42h.jpg" alt="gatsby-se" width="880" height="587"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Background
&lt;/h2&gt;

&lt;p&gt;Search Engine Optimisation is critical for a new startup's site to rank itself up on a search engine's (read: Google's) results list. Even for a well-established brand, it is handy for help/FAQ pages to be top listed in relevant searches, so people won't call the support centre and drive up operating costs. Usually, SEO work is more a matter of development-practice improvement: once the process is set up, unlike accessibility or security features, SEO improvements are easy to maintain. The development team can also set up an SEO coding guide wiki page, or even custom eslint rules, to share knowledge and keep the standard for new joiners. &lt;/p&gt;

&lt;p&gt;Gatsby js, as a static site generation tool, is known for its SEO friendliness, scalability, security and Content Management System (CMS) support through its Multi-Page Application (MPA) architecture. Gatsby has gradually gained popularity among React frameworks, which usually build apps as Single Page Applications (SPA). For Google bots, executing JavaScript to get HTML content is always a second choice. &lt;a href="https://developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics"&gt;More about JavaScript SEO&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Problem
&lt;/h2&gt;

&lt;p&gt;The Gatsby js framework has extensive support for SEO. However, &lt;strong&gt;serious setup is needed to ensure SEO optimisation&lt;/strong&gt; is baked into your site starter. &lt;/p&gt;

&lt;h2&gt;
  
  
  Action list
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AV3pL1qp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2l54q52c8oj9qdqmgheq.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AV3pL1qp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2l54q52c8oj9qdqmgheq.jpeg" alt="SEO impact effort chart" width="880" height="680"&gt;&lt;/a&gt;&lt;br&gt;
Based on a Gatsby site, I attached a chart of effort versus impact. &lt;a href="https://www.gatsbyjs.com/starters/"&gt;Gatsby's starters&lt;/a&gt; give you plenty of options, but none of them is fully, or even partially, SEO optimised. Rated by impact and effort, the actions fall into three tiers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Golden (low effort, high impact) 🥇 :

&lt;ul&gt;
&lt;li&gt;Robot.txt control&lt;/li&gt;
&lt;li&gt;Page metadata&lt;/li&gt;
&lt;li&gt;Sitemap generation&lt;/li&gt;
&lt;li&gt;URL structure&lt;/li&gt;
&lt;li&gt;Navigation links&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Silver 🥈 :

&lt;ul&gt;
&lt;li&gt;SPA/MPA optimisation&lt;/li&gt;
&lt;li&gt;SEO Monitoring&lt;/li&gt;
&lt;li&gt;Site Moves&lt;/li&gt;
&lt;li&gt;Canonical urls&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;Bronze 🥉

&lt;ul&gt;
&lt;li&gt;Site speed&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Next
&lt;/h2&gt;

&lt;p&gt;This is an overview of the action list; I will write a series of articles elaborating the tech details of the items on the chart. I will also be using the most-starred Gatsby starter, a forked &lt;a href="https://github.com/hurricanew/gatsby-starter-blog"&gt;blog starter&lt;/a&gt;, to demo the SEO credits you can claim with a Gatsby site. Happy SEO!&lt;/p&gt;

&lt;h3&gt;
  
  
  Ref links
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://www.gatsbyjs.com/why-gatsby"&gt;Why Gatsby&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.tekrevol.com/blogs/spa-vs-mpa/"&gt;SPA vs MPA&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://developers.google.com/search/docs/crawling-indexing/javascript/javascript-seo-basics"&gt;Javascript SEO basics&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/hurricanew/gatsby-starter-blog"&gt;https://github.com/hurricanew/gatsby-starter-blog&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>seo</category>
      <category>gatsby</category>
      <category>webdev</category>
      <category>react</category>
    </item>
  </channel>
</rss>
