<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Oleg Komissarov</title>
    <description>The latest articles on Forem by Oleg Komissarov (@olegkomissarov).</description>
    <link>https://forem.com/olegkomissarov</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1203289%2Ff6cc2663-d498-45b4-9f4d-6319966954fb.png</url>
      <title>Forem: Oleg Komissarov</title>
      <link>https://forem.com/olegkomissarov</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/olegkomissarov"/>
    <language>en</language>
    <item>
      <title>Next.js vs React</title>
      <dc:creator>Oleg Komissarov</dc:creator>
      <pubDate>Tue, 07 May 2024 16:09:32 +0000</pubDate>
      <link>https://forem.com/focusreactive/nextjs-vs-react-11o0</link>
      <guid>https://forem.com/focusreactive/nextjs-vs-react-11o0</guid>
      <description>&lt;p&gt;In the domain of web development, React has long been a powerhouse for building dynamic and interactive user interfaces. Its declarative syntax, component-based architecture, and virtual DOM make it a popular choice among developers seeking efficient and scalable solutions.&lt;/p&gt;

&lt;p&gt;However, as web applications grow in complexity and demand, developers often find themselves seeking additional tools and frameworks to enhance their projects. This is where Next.js comes into play.&lt;/p&gt;

&lt;p&gt;In this article, we delve into the dynamic landscape of Next.js and React, highlighting their respective strengths, use cases, and differences. It's important to note that Next.js doesn't compete with React; instead, it extends its capabilities, offering developers a robust toolkit for building modern web applications.&lt;/p&gt;

&lt;p&gt;React introduced powerful concepts like components, JSX, and the virtual DOM, revolutionizing the way developers build user interfaces. Next.js builds on these foundations with features such as server-side rendering (SSR), file-based routing, and static site generation (SSG). While React laid the groundwork for modern web development, Next.js enhances it with server-side rendering for improved performance and search engine optimization, built-in routing for streamlined navigation management, and static site generation for fast, cacheable pages. Together, these tools empower developers to create dynamic, performant web applications.&lt;/p&gt;

&lt;h3&gt;
  
  
  SERVER-SIDE RENDERING VS CLIENT-SIDE RENDERING
&lt;/h3&gt;

&lt;p&gt;When it comes to rendering web pages, there are two primary approaches: Server-Side Rendering (SSR) and Client-Side Rendering (CSR). Notably, while Next.js provides SSR out of the box, traditional React applications typically rely on CSR.&lt;/p&gt;

&lt;p&gt;Server-Side Rendering (SSR) involves generating the HTML for a web page on the server before sending it to the client's browser. When a user requests a page, they receive a fully rendered HTML document directly, without additional client-side processing. Unlike Client-Side Rendering (CSR), where the browser must first request JavaScript files, execute them, and then fetch data, SSR streamlines the process, resulting in faster initial page loads. With SSR, all necessary data is fetched and incorporated into the HTML document on the server, eliminating subsequent requests for data. This not only simplifies data retrieval but also significantly reduces latency, resulting in a smoother and more efficient user experience. Not having to run JavaScript before the first page load is especially important for devices with limited processing power. SSR also removes the need for loading spinners and reduces concerns about layout shifts, since the page arrives complete with all its data on first render.&lt;/p&gt;

&lt;p&gt;Moreover, SSR offers significant benefits for search engine optimization (SEO). Since the initial HTML document contains all the content of the page, including dynamically generated data, search engine crawlers can easily parse and index the content without the need to execute JavaScript. This improves the visibility of the website's content to search engines, leading to better ranking in search results. As a result, SSR is particularly advantageous for content-heavy websites and applications where SEO is a priority, as it ensures that search engines can efficiently crawl and index the entire website content.&lt;/p&gt;

&lt;p&gt;In contrast, Client-Side Rendering (CSR) provides developers with more flexibility in rendering pages dynamically within the client's browser. This approach enables more control over specific parts of the page, allowing for quicker rendering of certain components when necessary. However, these benefits can also be achieved through various rendering optimizations, including lazy-loading. Nonetheless, CSR comes with trade-offs such as slower initial page load times and potential SEO challenges. With Next.js, developers can seamlessly integrate Server-Side Rendering (SSR) into their React applications, getting the benefits of SSR while preserving the development flexibility and power of React components. This integration enables the creation of fast, SEO-friendly web applications without compromising the development experience or the capabilities of React.&lt;/p&gt;

&lt;h3&gt;
  
  
  ROUTING
&lt;/h3&gt;

&lt;p&gt;When it comes to routing, React applications traditionally rely on third-party libraries like React Router, while Next.js provides built-in routing capabilities.&lt;/p&gt;

&lt;p&gt;In React applications with routing managed by React Router, developers define routes and their corresponding components, enabling users to navigate to different pages based on the URL. While React Router offers a robust solution for routing in React applications, it requires additional configuration and setup.&lt;/p&gt;

&lt;p&gt;In contrast, Next.js simplifies routing by offering built-in file-based routing. Each page corresponds to a file within the project structure. This approach eliminates the need for manual route configuration, making it easier to organize and manage routes. Next.js supports dynamic routes, nested routes, and catch-all routes.&lt;/p&gt;
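
&lt;p&gt;As a rough sketch (with hypothetical file names), a pages directory like the one below maps directly to routes, with no route configuration at all:&lt;/p&gt;

```text
pages/
  index.js          ->  /
  about.js          ->  /about
  blog/
    index.js        ->  /blog
    [slug].js       ->  /blog/:slug    (dynamic route)
  docs/
    [...path].js    ->  /docs/*        (catch-all route)
```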

&lt;p&gt;Overall, Next.js streamlines routing with its built-in, file-based approach, removing a whole category of setup work from the development process.&lt;/p&gt;

&lt;h3&gt;
  
  
  DATA FETCHING
&lt;/h3&gt;

&lt;p&gt;In React applications, data fetching typically involves using useEffect hooks to initiate requests for external data, with native browser APIs like fetch or libraries such as Axios performing the HTTP requests against a server or external API. Once fetched, the data is commonly stored in component state via useState hooks or managed by a global state management library like Redux. That is a lot of code and a lot of steps on the client. With the arrival of React Query, data-fetching code became more declarative and concise; however, React Query does not resolve all the drawbacks of CSR.&lt;/p&gt;
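
&lt;p&gt;Boiled down to plain JavaScript, the client-side flow looks roughly like this (a sketch; in a real component these steps are spread across useState, useEffect, and render logic, and /api/users is a hypothetical endpoint):&lt;/p&gt;

```javascript
// Sketch of the CSR data-fetching steps a React component performs.
// `fetchImpl` stands in for window.fetch so the flow is easy to follow.
async function loadUsers(fetchImpl) {
  // 1. First render: empty state, so the UI shows a loader.
  let state = { loading: true, users: [], error: null };

  try {
    // 2. useEffect fires after mount and requests the data.
    const res = await fetchImpl('/api/users');
    // 3. setState with the result triggers a re-render with real content.
    state = { loading: false, users: await res.json(), error: null };
  } catch (err) {
    // 4. Errors also need explicit state handling in the component.
    state = { loading: false, users: [], error: String(err) };
  }
  return state;
}
```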

&lt;p&gt;Next.js simplifies data fetching by offering the built-in methods getServerSideProps, getStaticProps, and getStaticPaths (in the Pages Router; the App Router achieves the same with React Server Components), providing a more efficient and optimized solution.&lt;/p&gt;

&lt;p&gt;getServerSideProps facilitates server-side data fetching for every request, making it suitable for scenarios requiring dynamic data retrieval based on session information or other server-side factors. This method ensures that data remains current and customized to each user's session, enhancing the responsiveness and relevance of the application's content.&lt;/p&gt;
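
&lt;p&gt;A minimal sketch of per-request fetching (Pages Router; the session store and cookie name are hypothetical, and in a real app the function would be exported from a page file):&lt;/p&gt;

```javascript
// Stand-in for a real session store keyed by a session cookie.
const sessions = { abc123: { user: 'oleg', theme: 'dark' } };

// Runs on the server for every request, so it can read cookies and headers.
async function getServerSideProps(context) {
  const token = context.req.cookies.session;
  const session = sessions[token];

  if (!session) {
    // Redirect anonymous visitors instead of rendering the page.
    return { redirect: { destination: '/login', permanent: false } };
  }

  // Everything under `props` is passed to the page component.
  return { props: { user: session.user, theme: session.theme } };
}
```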

&lt;p&gt;getStaticProps, on the other hand, enables static data pre-rendering at build time. This method is suitable for data that does not change frequently and can be pre-generated at build time. By pre-rendering pages with static data, Next.js improves performance by serving fully-rendered HTML pages to users, reducing the need for database requests and JavaScript execution.&lt;/p&gt;

&lt;p&gt;In addition to getStaticProps, Next.js offers getStaticPaths, which handles dynamic routing for static pages. With getStaticPaths, developers can define dynamic paths for pages that have dynamic parameters or depend on external data sources. This feature enables Next.js to pre-render all possible variations of a page.&lt;/p&gt;
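
&lt;p&gt;The two methods typically work together on a dynamic route such as pages/posts/[slug].js (a sketch with hypothetical in-memory data; both functions would be exported from the page file):&lt;/p&gt;

```javascript
// Hypothetical content source; in practice this would be a CMS or database.
const posts = [
  { slug: 'hello-nextjs', title: 'Hello Next.js' },
  { slug: 'ssg-vs-ssr', title: 'SSG vs SSR' },
];

// Tells Next.js which URLs to pre-render at build time.
async function getStaticPaths() {
  return {
    paths: posts.map(p => ({ params: { slug: p.slug } })),
    fallback: false, // any other slug becomes a 404
  };
}

// Runs once per path at build time to produce that page's props.
async function getStaticProps({ params }) {
  const post = posts.find(p => p.slug === params.slug);
  return { props: { post } };
}
```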

&lt;p&gt;Next.js' approach to data fetching significantly improves performance by minimizing reliance on client-side JavaScript execution. With Next.js, JavaScript runs on the server and prerendered pages are returned to the client, reducing the amount of JavaScript clients must download to render a page, which is especially valuable for users on slow internet connections. Furthermore, Next.js lets developers keep data fetching close to where the data is used, optimizing data flows and improving overall application performance.&lt;/p&gt;

&lt;p&gt;One occasional drawback of Next.js is duplicated data-fetching code. On a search page, for example, where the search term is part of the URL, the same query may need to run both on the server, to prerender the page for direct requests (e.g. for caching popular searches or for SEO), and on the client, in response to interactive search input.&lt;/p&gt;

&lt;p&gt;Static Site Generation (SSG) is particularly well-suited for certain use cases in Next.js, such as blogs, marketing pages, and other content-heavy websites. By pre-rendering pages at build time, Next.js can deliver pages much faster. Additionally, SSG can significantly reduce the server load and improve scalability, making it an ideal choice for websites with high traffic. Overall, Static Site Generation in Next.js offers a powerful solution for building modern web applications that prioritize performance and scalability.&lt;/p&gt;

&lt;h3&gt;
  
  
  INCREMENTAL STATIC REGENERATION (ISR)
&lt;/h3&gt;

&lt;p&gt;ISR is another powerful approach used in Next.js applications. It enhances the traditional Static Site Generation (SSG) by allowing developers to regenerate static pages in the background, without rebuilding the entire site. This means that, after the initial build, Next.js can automatically update static pages in response to data changes or when a certain amount of time has passed, ensuring that content remains fresh without sacrificing performance.&lt;/p&gt;

&lt;p&gt;In other words, when using ISR in Next.js, static pages are initially generated at build time, just like with SSG. However, instead of waiting for the next build to update the content, ISR allows Next.js to re-generate specific static pages on-demand, e.g. based on a predefined revalidation interval. This revalidation interval determines how frequently Next.js checks for updates to the data used to generate the page.&lt;/p&gt;
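
&lt;p&gt;In code, ISR is just getStaticProps plus a revalidation interval (a sketch; the data source and the 60-second window are illustrative):&lt;/p&gt;

```javascript
// Same shape as ordinary SSG, with one extra field.
async function getStaticProps() {
  const articles = [{ id: 1, headline: 'Still fresh' }]; // hypothetical data

  return {
    props: { articles },
    // After 60 seconds, the next request serves the cached page and
    // triggers a background regeneration with fresh data.
    revalidate: 60,
  };
}
```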

&lt;p&gt;By leveraging ISR, developers can strike a balance between performance and freshness, ensuring that static pages remain fast and responsive while also staying up-to-date with the latest data. This approach is particularly useful for content that changes frequently but doesn't require immediate updates, such as news articles, product listings, or social media feeds.&lt;/p&gt;

&lt;p&gt;In summary, Incremental Static Regeneration in Next.js combines the benefits of Static Site Generation with dynamic data updates, enabling developers to create fast, scalable websites with fresh content.&lt;/p&gt;

&lt;h3&gt;
  
  
  DEVELOPER EXPERIENCE AND PRODUCTIVITY
&lt;/h3&gt;

&lt;p&gt;When comparing React to Next.js in terms of developer experience and productivity, Next.js often emerges as the favorable choice due to its built-in features and streamlined development process. While React provides a solid foundation for building user interfaces, Next.js goes further by offering essential features that have become standard requirements for modern projects. Some may argue that this pre-packaged approach limits configuration flexibility, but it is a boon for developers seeking quick, effective solutions without repetitive setup tasks. This standardized approach eliminates boilerplate configuration, allowing developers to focus on building features rather than setting up infrastructure.&lt;/p&gt;

&lt;p&gt;Moreover, both React and Next.js have a strong community and ecosystem. There are plenty of resources, plugins, and libraries available to assist developers in their projects. This active ecosystem encourages innovation and collaboration, enabling developers to utilize existing solutions and customize them to suit their requirements.&lt;/p&gt;

&lt;p&gt;Another advantage of Next.js is its suitability for full stack development. By providing a unified framework for both client-side and server-side development, Next.js simplifies the process of implementing full stack features. Developers can work within a single language and framework, reducing the complexity of managing multiple technologies and lowering development costs as a result.&lt;/p&gt;

&lt;p&gt;One drawback of Next.js that affects developer experience is its relatively slow development server, coupled with occasional memory leaks during hot updates in dev preview, which can somewhat hinder the development process.&lt;/p&gt;

&lt;h3&gt;
  
  
  DEPLOYMENT AND HOSTING
&lt;/h3&gt;

&lt;p&gt;Popular hosting platforms like Netlify, Vercel, and AWS have integrations with Next.js. These platforms provide out-of-the-box support for Next.js applications, streamlining the deployment process and eliminating the need for manual configuration. If you want to know when Vercel is the better choice for hosting, consider reading &lt;a href="https://focusreactive.com/when-to-host-on-vercel-and-when-not/" rel="noopener noreferrer"&gt;this article&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;One significant difference between deploying React and Next.js projects is that while React applications typically require additional setup and configuration for deployment, Next.js applications benefit from built-in features that facilitate deployment automation. This includes features like automatic code splitting, optimized assets loading, and serverless deployment options.&lt;/p&gt;

&lt;h3&gt;
  
  
  SCALABILITY AND PERFORMANCE
&lt;/h3&gt;

&lt;p&gt;Scalability and performance are crucial factors to consider when building web applications with React or Next.js. In both cases, developers need to anticipate potential scalability challenges and implement strategies to ensure optimal performance as the application grows.&lt;/p&gt;

&lt;p&gt;React applications may need more manual optimization. Developers must carefully manage state, optimize component rendering, and implement efficient data fetching strategies to maintain performance as the application scales.&lt;/p&gt;

&lt;p&gt;Next.js offers built-in features and optimizations that can enhance scalability and performance. For example, Next.js provides automatic image optimization, which reduces image file sizes and improves loading times. Additionally, Next.js supports prefetching of pages and data, ensuring that resources are preloaded before they are needed, thereby reducing latency and improving the overall user experience.&lt;/p&gt;

&lt;p&gt;In summary, while both React and Next.js applications can be scaled and optimized for performance, Next.js offers several advantages in terms of built-in optimizations and performance features.&lt;/p&gt;

&lt;h3&gt;
  
  
  CONCLUSION AND RECOMMENDATIONS
&lt;/h3&gt;

&lt;p&gt;In conclusion, React provides a solid foundation for building user interfaces, with a rich ecosystem of libraries and tools to support developers in their projects. Next.js extends React's capabilities with built-in features for server-side rendering, routing, and static site generation, making it an excellent choice for projects that require enhanced performance, SEO, and developer productivity.&lt;/p&gt;

&lt;p&gt;When choosing between React and Next.js, it's essential to consider the specific requirements and goals of your project. If you're building a simple single-page application or need maximum flexibility and control over your application's architecture, React may be the preferred option. However, if you're looking for a more streamlined development experience, with built-in support for many basic features, Next.js could be the better choice.&lt;/p&gt;

</description>
      <category>react</category>
      <category>nextjs</category>
      <category>webdev</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Configuring robots.txt for Better Indexation and SEO Score</title>
      <dc:creator>Oleg Komissarov</dc:creator>
      <pubDate>Thu, 11 Apr 2024 08:13:48 +0000</pubDate>
      <link>https://forem.com/focusreactive/configuring-robotstxt-for-better-indexation-and-seo-score-3fhk</link>
      <guid>https://forem.com/focusreactive/configuring-robotstxt-for-better-indexation-and-seo-score-3fhk</guid>
      <description>&lt;p&gt;The robots.txt file, placed in the root directory of a website, instructs search engine robots about which pages should and should not be crawled. By preventing web crawlers from crawling certain parts of a website, we can have more control over the content visibility and improve the site’s SEO score.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding the Purpose of robots.txt
&lt;/h2&gt;

&lt;p&gt;The primary purpose of the robots.txt file is to keep web crawlers away from parts of a website that are not meant for public search results. By excluding such routes, we also prevent crawlers from including irrelevant or repetitive content in search engine results. This selective approach ensures that only valuable content is crawled, thereby boosting the overall SEO score.&lt;/p&gt;

&lt;p&gt;For example, we may want to prevent search engines from crawling personalized pages, admin pages, or content that is still under development. By using the robots.txt file, we can pass these instructions to web crawlers.&lt;/p&gt;

&lt;p&gt;However, it's important to note that while the robots.txt file can prevent certain pages from being crawled, it does not guarantee that search engines won't find them. The file simply provides instructions to web crawlers and does not have the power to block access to pages actively.&lt;/p&gt;

&lt;h2&gt;
  
  
  Importance of robots.txt
&lt;/h2&gt;

&lt;p&gt;Configuring the robots.txt file correctly can have several benefits. Let's explore some of the key advantages:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Enhancing Crawl Budget Efficiency
&lt;/h3&gt;

&lt;p&gt;A properly configured robots.txt file helps conserve the crawl budget, which is the number of pages a search engine bot is willing to crawl on a website during a given time period. By conserving the crawl budget, we can ensure that only the most relevant and valuable content is indexed, improving overall crawl efficiency.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Preventing Duplicate Content Issues
&lt;/h3&gt;

&lt;p&gt;Duplicate content can harm a website's search engine rankings. By disallowing search engine bots from crawling repetitive or similar content, we can prevent confusion and maintain the quality and credibility of the content.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Securing Sensitive Information
&lt;/h3&gt;

&lt;p&gt;Website security and user privacy are crucial, especially for sites with user accounts or confidential information. The robots.txt file enables us to protect sensitive or private sections of a website by disallowing search engine bots from crawling them. Keep in mind, however, that URLs disallowed in robots.txt may in some situations still be indexed, even if they have never been crawled.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Providing a Clear Sitemap Reference
&lt;/h3&gt;

&lt;p&gt;Another feature of robots.txt is referencing a website's XML sitemap. The XML sitemap helps search engine bots discover and follow the website's structure, leading to a more efficient and thorough crawling process. By including a reference to the sitemap in the robots.txt file, we can ensure that search engine bots can easily find and navigate the sitemap.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Directing Crawler Behavior for Multilingual Websites
&lt;/h3&gt;

&lt;p&gt;For websites with multilingual content, the robots.txt file can help ensure that search engine bots prioritize crawling the correct versions of the content based on user location or language preferences. This improves geo-targeting and relevance in search results, ultimately enhancing the overall user experience.&lt;/p&gt;

&lt;h2&gt;
  
  
  Syntax Used in robots.txt File
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. User-agent
&lt;/h3&gt;

&lt;p&gt;The "User-agent" protocol identifies the specific bot or crawler to which the rule applies. For example, &lt;code&gt;User-agent: Googlebot&lt;/code&gt; would target Google's web crawler. To target all crawlers, the rule can be specified like this: &lt;code&gt;User-agent: *&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Disallow
&lt;/h3&gt;

&lt;p&gt;The "Disallow" protocol tells bots not to crawl specific pages or sections of a website. For example, &lt;code&gt;Disallow: /settings/&lt;/code&gt; would block crawlers from accessing the routes in the "settings" folder. Paths must start with the "/" character and if it refers to a folder, it must end with the "/" as well.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Allow
&lt;/h3&gt;

&lt;p&gt;The "Allow" protocol grants bots permission to crawl specific pages or sections of a website, even if they have been disallowed in a previous rule. For example, &lt;code&gt;Allow: /settings/public-page.html&lt;/code&gt; would allow bots to access the "public-page.html" file, even if it is located in a disallowed folder.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Sitemap
&lt;/h3&gt;

&lt;p&gt;The "Sitemap" protocol provides the location of a website's XML sitemap, helping search engine bots find pages more efficiently. Including the sitemap in the robots.txt file is considered one of the best practices for SEO. For example, &lt;code&gt;Sitemap: https://www.example.com/sitemap.xml&lt;/code&gt; directs crawlers to the website's sitemap file. The sitemap does not even have to be on the same host as the robots.txt file. It is also possible to reference multiple XML sitemaps in robots.txt. As an example, this may be useful if a site has one static sitemap and a dynamic one.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Crawl-delay
&lt;/h3&gt;

&lt;p&gt;The "Crawl-delay" property sets a delay between requests to avoid overloading the server. For example, &lt;code&gt;Crawl-delay: 10&lt;/code&gt; would request that bots wait 10 seconds between requests to the website.&lt;/p&gt;

&lt;h2&gt;
  
  
  Example of robots.txt
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;User-agent: *
Allow: /

Disallow: /_next/*.js$

Disallow: /login
Disallow: /signup
Disallow: /forgot-password

Disallow: /admin/
Allow: /admin/exception

Disallow: /api/admin/mutations/

Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/dynamic-sitemap.xml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Next.js and robots.txt
&lt;/h2&gt;

&lt;p&gt;In Next.js, we can add a static robots.txt file as usual: in the public folder, or directly in the app directory when using the App Router. Since version &lt;a href="https://nextjs.org/blog/next-13-3" rel="noopener noreferrer"&gt;13.3&lt;/a&gt;, Next.js can also generate robots.txt dynamically from a Robots object returned by an app/robots.ts file. This approach is particularly useful for producing different rules under different conditions (e.g. based on .env properties). Let's take a look at an example of generating a Robots object in Next.js:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { MetadataRoute } from 'next';

export default function robots(): MetadataRoute.Robots {
    if (process.env.ENVIRONMENT === 'development') {
        return {
            rules: {
                userAgent: '*',
                disallow: '/', // "/" disallows the entire site
            },
        }
    }

    return {
        rules: [
            {
                userAgent: '*',
                allow: '/',
                disallow: '/private/',
                crawlDelay: 5,
            },
        ],
        sitemap: ['https://example.com/sitemap.xml'],
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the example above, we define a robots function. We restrict all paths for crawlers for the development environment. For other environments, we return a Robots object (or an array of objects) with the rules property containing the specific crawling directives. Additionally, we specify the location of the sitemap.xml file using the sitemap property (or an array for multiple sitemaps).&lt;/p&gt;

&lt;p&gt;Next.js will then automatically generate and serve the robots.txt file based on the function we provided.&lt;/p&gt;
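
&lt;p&gt;For the production branch above, the served robots.txt would look roughly like this (exact formatting may vary between Next.js versions):&lt;/p&gt;

```text
User-Agent: *
Allow: /
Disallow: /private/
Crawl-delay: 5

Sitemap: https://example.com/sitemap.xml
```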

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Configuring the robots.txt file is one of the best practices for website management and SEO. The robots.txt file provides instructions to search engine bots, guiding their crawl process.&lt;/p&gt;

&lt;p&gt;The robots.txt file helps to secure sensitive information, enhance crawl efficiency, prevent duplicate content issues, provide a clear sitemap reference, and direct crawler behavior for multilingual or multiregional websites, all of which improves the overall SEO score. Next.js adds flexibility by allowing the robots.txt file to be generated dynamically based on certain conditions.&lt;/p&gt;




&lt;p&gt;Remember to regularly update the robots.txt file to keep up with the changing needs of the website. With proper configuration and regular maintenance, the robots.txt file can be a powerful tool to optimize search engine rankings.&lt;/p&gt;

</description>
      <category>seo</category>
      <category>robotstxt</category>
    </item>
  </channel>
</rss>
