<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Julian Duru</title>
    <description>The latest articles on Forem by Julian Duru (@durutheguru).</description>
    <link>https://forem.com/durutheguru</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F274645%2F8566d486-5914-4caa-9fb3-2edcb7eca7b6.jpg</url>
      <title>Forem: Julian Duru</title>
      <link>https://forem.com/durutheguru</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/durutheguru"/>
    <language>en</language>
    <item>
      <title>SpringBoot Integration Testing using TestContainers and docker-compose</title>
      <dc:creator>Julian Duru</dc:creator>
      <pubDate>Sat, 28 Jan 2023 12:57:04 +0000</pubDate>
      <link>https://forem.com/durutheguru/springboot-integration-testing-using-testcontainers-and-docker-compose-h1i</link>
      <guid>https://forem.com/durutheguru/springboot-integration-testing-using-testcontainers-and-docker-compose-h1i</guid>
      <description>&lt;p&gt;Integration testing is an important part of any software development process. It is a phase of software testing where different components of a system are tested together as a group. This helps ensure that the different components work together as expected. With the rise of micro-services and containerization, Integration testing has become even more crucial. In this article, we will explore how to use TestContainers and docker-compose to perform integration testing on a Spring Boot application.&lt;/p&gt;

&lt;p&gt;TestContainers is a Java library that allows you to run docker containers as part of your integration tests. It provides a simple API to start and stop these containers. This makes it easy to test how your application interacts with external services, such as databases or message queues. You can run lightweight, disposable instances of databases, browsers, and other services in ephemeral Docker containers during your tests. This improves the reliability, speed, and ease of writing end-to-end tests for applications that use external dependencies.&lt;/p&gt;

&lt;p&gt;With TestContainers, you can spin up a real database or another service without having to install and configure it on your local machine, which can reduce the complexity of your test setup and make your tests more robust and consistent.&lt;/p&gt;

&lt;p&gt;Docker-compose is a tool for defining and running multi-container applications. It uses a YAML file to define the services that make up an application, and their configurations. With TestContainers, you can use a docker-compose file to define the services that your tests depend on.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Why TestContainers?&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Before I started using TestContainers, I used to run my tests against an H2 database and mock out external dependencies. Needless to say, this was not the most effective form of testing. It worked most of the time, but I would often run into issues in production that were peculiar to MySQL or MongoDB, depending on my production database of choice. Also, when testing within a microservice environment, I would only discover issues when the integration between services was actually exercised; issues that didn’t arise in testing.&lt;/p&gt;

&lt;p&gt;TestContainers provides a way to instantiate docker containers of external dependencies in your integration testing. So instead of using H2 database for my tests, I can start a MySQL docker image and point my tests to run against MySQL.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Setting up&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;To get started, you will need to add TestContainers as a dependency to your project. You can do this by adding the following to your pom.xml file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;org.testcontainers&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;mysql&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;1.17.2&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;test&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;

&amp;lt;dependency&amp;gt;
    &amp;lt;groupId&amp;gt;org.testcontainers&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;junit-jupiter&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;1.17.2&amp;lt;/version&amp;gt;
    &amp;lt;scope&amp;gt;test&amp;lt;/scope&amp;gt;
&amp;lt;/dependency&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you’ve done this, you can create a test that uses TestContainers to start a container. For example, the following test starts a MySQL container, and then uses the container’s IP address and port to connect to the database:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
@Test
void testWithMySQL() throws Exception {
    // Start the MySQL container
    var mysqlContainer = new MySQLContainer&amp;lt;&amp;gt;("mysql:8.0")
            .withDatabaseName("mydb")
            .withUsername("username")
            .withPassword("password");
    mysqlContainer.start();

    // Connect to the database using the container's host and mapped port
    var jdbcUrl = "jdbc:mysql://" + mysqlContainer.getHost() + ":" + mysqlContainer.getMappedPort(3306) + "/mydb";
    try (Connection connection = DriverManager.getConnection(jdbcUrl, "username", "password")) {
        // Run your tests here
    }

    // Stop the container
    mysqlContainer.stop();
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can also use a docker-compose file to define the services that your tests depend on. This makes it easy to test how your application interacts with multiple services. For example, the following docker-compose file defines a MySQL and a RabbitMQ service:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
version: '3'
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_DATABASE: mydb
      MYSQL_USER: username
      MYSQL_PASSWORD: password
      MYSQL_ROOT_PASSWORD: password
    ports:
      - "3306:3306"
    healthcheck:
          test: "mysql $$MYSQL_DATABASE -uroot -p$$MYSQL_ROOT_PASSWORD -e 'SELECT 1;'"
          interval: 10s
          timeout: 300s
          retries: 10

  rabbitmq:
    image: rabbitmq:3-management
    ports:
      - "5672:5672"
      - "15672:15672"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can then use the DockerComposeContainer to start and stop the services defined in the docker-compose file. For example, the following test starts the MySQL and RabbitMQ services, and then uses the container’s IP address and port to connect to them:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
@Test
void testWithDockerCompose() throws Exception {
    // Start the services defined in the docker-compose file
    var composeContainer = new DockerComposeContainer(new File("src/test/resources/docker-compose.yml"))
            .withExposedService("mysql", 3306)
            .withExposedService("rabbitmq", 5672);
    composeContainer.start();

    // Connect to the MySQL service using the container's IP address and port
    var jdbcUrl = "jdbc:mysql://" + composeContainer.getServiceHost("mysql", 3306) + ":" + composeContainer.getServicePort("mysql", 3306) + "/mydb";
    try (Connection connection = DriverManager.getConnection(jdbcUrl, "username", "password")) {
        // Run your tests here
    }

    // Connect to the RabbitMQ service using the container's IP address and port
    var factory = new ConnectionFactory();
    factory.setHost(composeContainer.getServiceHost("rabbitmq", 5672));
    factory.setPort(composeContainer.getServicePort("rabbitmq", 5672));
    try (com.rabbitmq.client.Connection connection = factory.newConnection()) { // fully qualified to avoid clashing with java.sql.Connection
        // Run your tests here
    }

    // Stop the services
    composeContainer.stop();
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above examples start and stop the containers within the test method. But this might not be ideal if you want to run a suite of tests. There are different strategies you can employ to utilize TestContainers across your entire test suite:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Declare Containers as beans.&lt;/li&gt;
&lt;li&gt;Declare Containers in a Test superclass and annotate with @Container.&lt;/li&gt;
&lt;li&gt;Use the Singleton pattern to define containers in a Test superclass and start manually.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let’s look at each of them.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Declaring Containers as beans&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Not my favorite approach, but it works fine. Here I set up the MySQLContainer as a Spring bean and autowire it into my DataSource bean, so the data source available in my test context points to the MySQLContainer.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
@TestConfiguration
public class TestDataSourceConfig {

    /** 
    * create mysql container as a bean. start the container before returning it. 
    */
    @Bean
    public MySQLContainer mySQLContainer() {
        var container = new MySQLContainer&amp;lt;&amp;gt;("mysql:8.0")
            .withDatabaseName("mydb")
            .withUsername("username")
            .withPassword("password")
            .withUrlParam("createDatabaseIfNotExist", "true")
            .withUrlParam("serverTimezone", "UTC")
            .withReuse(true)
            .withLogConsumer(
                new Slf4jLogConsumer(
                    LoggerFactory.getLogger(getClass())
                )
            );

        container.start();

        return container;
    }

    /**
    * autowire mysql container in datasource config and use container properties 
    * to initialise the datasource.
    */
    @Bean
    @Primary
    public DataSource dataSource(MySQLContainer mySQLContainer) {
        var dataSource = new HikariDataSource();

        dataSource.setJdbcUrl(mySQLContainer.getJdbcUrl());
        dataSource.setUsername(mySQLContainer.getUsername());
        dataSource.setPassword(mySQLContainer.getPassword());
        dataSource.setDriverClassName(mySQLContainer.getDriverClassName());

        return dataSource;
    }

}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The datasource configuration above can be included in a base integration class. Something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
@ExtendWith({SpringExtension.class})
@SpringBootTest(
    classes = {
        TestDataSourceConfig.class,
    },
    webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT
)
public abstract class BaseIntegrationTest {


}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So long as your Test classes extend BaseIntegrationTest, they will reuse the same application context, which has the MySQLContainer and the DataSource bean initialized.&lt;/p&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;Use the @Container annotation with @Testcontainers configuration&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;If you would like TestContainers to manage your containers as part of your test life cycle, this is the best approach. Annotate the integration test class with @Testcontainers, which looks for fields annotated with @Container and automatically starts and stops them as needed.&lt;/p&gt;

&lt;p&gt;Using the @DynamicPropertySource annotation which was introduced in Spring 5.2.5, we can dynamically set properties in our spring environment. In the example below, I am able to set the ‘spring.datasource.url’ property dynamically after my docker-compose containers have been initialized.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
@Slf4j
@ExtendWith({SpringExtension.class})
@SpringBootTest
@Testcontainers
public abstract class BaseIntegrationTest {

    @Container
    private static DockerComposeContainer dockerComposeContainer = new DockerComposeContainer&amp;lt;&amp;gt;(
        new File("src/test/resources/docker-compose.yml")
    ).withExposedService(
        "mysqldb_1", 3306, Wait.forHealthcheck()
    );

    @DynamicPropertySource
    protected static void setProperties(
        DynamicPropertyRegistry registry
    ) {
        setDataSourceProperties(registry);
    }

    private static void setDataSourceProperties(DynamicPropertyRegistry registry) {
        var mysqlDbHost = dockerComposeContainer.getServiceHost("mysqldb_1", 3306);
        var mysqlDbPort = dockerComposeContainer.getServicePort("mysqldb_1", 3306);

        log.info("DB Connection Properties: {}, {}", mysqlDbHost, mysqlDbPort);

        registry.add(
            "spring.datasource.url",
            () -&amp;gt; String.format("jdbc:mysql://%s:%d/mydb?createDatabaseIfNotExist=true", mysqlDbHost, mysqlDbPort)
        );
    }

}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  &lt;strong&gt;Use the Singleton pattern to define containers in a Test superclass and start manually&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;We can also declare our container in a static field and start it manually in a static code block. This is useful if you want your container started only once for several classes extending a base class.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
@Slf4j
@ExtendWith({SpringExtension.class})
@SpringBootTest
public abstract class BaseIntegrationTest {

    private static DockerComposeContainer dockerComposeContainer = new DockerComposeContainer&amp;lt;&amp;gt;(
        new File("src/test/resources/docker-compose.yml")
    ).withExposedService(
        "mysqldb_1", 3306, Wait.forHealthcheck()
    );

    static {
        dockerComposeContainer.start();
    }

    @DynamicPropertySource
    protected static void setProperties(
        DynamicPropertyRegistry registry
    ) {
        setDataSourceProperties(registry);
    }

    private static void setDataSourceProperties(DynamicPropertyRegistry registry) {
        var mysqlDbHost = dockerComposeContainer.getServiceHost("mysqldb_1", 3306);
        var mysqlDbPort = dockerComposeContainer.getServicePort("mysqldb_1", 3306);

        log.info("DB Connection Properties: {}, {}", mysqlDbHost, mysqlDbPort);

        registry.add(
            "spring.datasource.url",
            () -&amp;gt; String.format("jdbc:mysql://%s:%d/mydb?createDatabaseIfNotExist=true", mysqlDbHost, mysqlDbPort)
        );
    }

}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Running your tests with TestContainers and docker-compose allows you to test entire services interacting with each other in an isolated environment. You can spin up services, message queues, and Redis instances to mimic a production environment and run a full &lt;strong&gt;end-to-end&lt;/strong&gt; integration test.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Testing with TestContainers can be more expensive to operate than mocking service interactions, and you may notice increased startup times for your tests. In the long run, however, it can be more rewarding if done correctly: TestContainers offers better integration testing because you can test features end-to-end across all connected components.&lt;/p&gt;

&lt;p&gt;TestContainers and Docker Compose are powerful tools that can greatly improve the reliability and speed of writing end-to-end tests for applications that use external dependencies. TestContainers allows you to easily spin up real instances of databases, browsers, and other services in ephemeral Docker containers during your tests, making your test setup more robust and consistent. Meanwhile, Docker Compose allows you to define and run multi-container applications, which can be useful for testing complex applications that depend on multiple services. Together, Testcontainers and Docker Compose can provide a powerful and efficient solution for testing applications that use external dependencies, making it a valuable tool for any developer or tester.&lt;/p&gt;

</description>
      <category>programming</category>
      <category>java</category>
      <category>springboot</category>
      <category>testcontainers</category>
    </item>
    <item>
      <title>Continuous Integration of Maven Artifacts with GitHub Actions</title>
      <dc:creator>Julian Duru</dc:creator>
      <pubDate>Sat, 14 Jan 2023 00:35:25 +0000</pubDate>
      <link>https://forem.com/durutheguru/continuous-integration-of-maven-artifacts-with-github-actions-533d</link>
      <guid>https://forem.com/durutheguru/continuous-integration-of-maven-artifacts-with-github-actions-533d</guid>
      <description>&lt;p&gt;A major requirement in developing a non-trivial software product is the establishment of a formal workflow for Continuous integration and Continuous delivery. Whether it’s an individual or group project, the CI/CD workflow lays the groundwork for getting code from the developer’s box to the live environment.&lt;/p&gt;

&lt;p&gt;But it goes beyond that. CI/CD allows code to be deployed frequently while maintaining quality and reliability. Typically, this is accomplished with the help of automated unit and integration tests at specific points in the development pipeline. Tests that will help to ensure that new code does not break existing code.&lt;/p&gt;

&lt;p&gt;Continuous Integration is so important that a failure to implement it formally results in haphazard development and a loss of code quality, which only creates more stress for the engineering team.&lt;/p&gt;

&lt;h2&gt;
  
  
  Dependency Management
&lt;/h2&gt;

&lt;p&gt;Due to the nature of modern-day software development, entire systems are built by relying on functionality from different components. As these components evolve independently, there needs to be a way to ensure they stay compatible: we don’t want updates to one component to cause bugs to manifest in another. Hence, in order to write software, we need to rely on dependency management.&lt;/p&gt;

&lt;p&gt;Dependency Management allows us to package dependencies in our project and gracefully upgrade those dependencies when we need to. In the Java ecosystem, there are two primary dependency management tools: Maven and Gradle. Maven uses XML configurations to define the project build, including all plugins and dependencies. Gradle, on the other hand, uses a Groovy/Kotlin-based Domain Specific Language (DSL) to define the build tasks and dependencies.&lt;/p&gt;

&lt;p&gt;Let’s talk a bit about versioning.&lt;/p&gt;

&lt;h2&gt;
  
  
  Versioning
&lt;/h2&gt;

&lt;p&gt;Versioning is the process of assigning a unique number to a specific version of a software artifact, such that at every point in time the version number gives an idea of how the artifact has evolved.&lt;/p&gt;

&lt;p&gt;Versioning is an important part of Dependency Management because it allows us to keep track of the latest developments in our dependencies.&lt;/p&gt;

&lt;p&gt;A few questions might arise: how do we upgrade the dependency versions of our Projects? What format do we employ for versioning? How do we ensure that upgrades to dependencies are tracked and don’t break existing functionality?&lt;/p&gt;

&lt;p&gt;Currently, Semantic Versioning is very widely used. Here’s what it looks like:&lt;/p&gt;

&lt;p&gt;Given a version number, for example 1.5.2, there are 3 parts: MAJOR (1), MINOR (5), PATCH (2).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;MAJOR&lt;/strong&gt; : updated when you make incompatible API changes; that is, changes that can break existing systems using that dependency.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;MINOR&lt;/strong&gt; : updated when you add functionality in a backward-compatible manner.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;PATCH&lt;/strong&gt; : updated when you make backward-compatible bug fixes.&lt;/p&gt;
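The bump rules above can be made concrete with a minimal Java sketch (SemVer and bump are hypothetical names for illustration, not from any library): bumping MINOR resets PATCH, and bumping MAJOR resets both lower parts.

```java
// Hypothetical helper illustrating semantic version bumps.
public class SemVer {

    // Bump one of the three parts; lower parts reset to zero.
    public static String bump(String version, String part) {
        String[] p = version.split("\\.");
        int major = Integer.parseInt(p[0]);
        int minor = Integer.parseInt(p[1]);
        int patch = Integer.parseInt(p[2]);

        switch (part) {
            case "major": major++; minor = 0; patch = 0; break;
            case "minor": minor++; patch = 0; break;
            case "patch": patch++; break;
        }
        return major + "." + minor + "." + patch;
    }

    public static void main(String[] args) {
        System.out.println(bump("1.5.2", "minor")); // 1.6.0
        System.out.println(bump("1.5.2", "major")); // 2.0.0
    }
}
```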

&lt;p&gt;Now we can delve into branching models in git.&lt;/p&gt;

&lt;h2&gt;
  
  
  Branching Models
&lt;/h2&gt;

&lt;p&gt;A branching model is a set of rules developers follow when working on a shared codebase. They help structure the approach to branching and merging code, which helps teams work more efficiently. I strictly adhere to a branching model that allows me to compute a version for my project based on the Git history. There are a few branching models in software development:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;GitFlow&lt;/li&gt;
&lt;li&gt;GitHub Flow&lt;/li&gt;
&lt;li&gt;Trunk-based development&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I’ll shed more light on GitFlow since it’s the most popular. Gitflow uses different branch types:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Master or Main:&lt;/strong&gt; The stable branch that contains the last version of code released into production. It should ALWAYS be production-ready.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Feature:&lt;/strong&gt; Feature branches are meant for implementing features. Usually, developers branch off develop, write all code pertaining to the feature on the feature branch, then merge it back to develop.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Develop:&lt;/strong&gt; Developers merge to the develop branch when features are ready to be integrated into master. The develop branch serves as a branch for integrating the different features planned for an upcoming release.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Release:&lt;/strong&gt; Branches off develop and is used to prepare a production release. Once the release branch is tested, it is typically merged into both develop and master.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hotfix:&lt;/strong&gt; Used to fix bugs that arise on the master branch. A hotfix branches off master and is merged back into master and develop on completion and testing.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The GitFlow model works best for larger teams, though it can be more tedious to manage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcf5mcmqeiwfihhv924dz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcf5mcmqeiwfihhv924dz.png" width="773" height="1024"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Git Flow branching model reference: &lt;a href="https://nvie.com/posts/a-successful-git-branching-model/" rel="noopener noreferrer"&gt;https://nvie.com/posts/a-successful-git-branching-model/&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;My process is somewhat of a combination of GitFlow and the slightly less popular &lt;a href="https://trunkbaseddevelopment.com/" rel="noopener noreferrer"&gt;Trunk-based development&lt;/a&gt;.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create two permanent branches: main and develop. Main is for production-ready code, develop is for feature integration.&lt;/li&gt;
&lt;li&gt;When working on a feature, branch off develop and create a branch prefixed with ‘feature/’. For example, to work on user login, create a branch off develop called feature/user-login.&lt;/li&gt;
&lt;li&gt;When the feature is ready, push the latest feature code to the remote repo and create a pull request to develop.&lt;/li&gt;
&lt;li&gt;Run code review, source code analysis, automated unit, and integration tests on the feature branch to ensure it is suitable to be merged to develop.&lt;/li&gt;
&lt;li&gt;Merge code to develop and run automated unit and integration tests again to ensure feature merge has not broken other features.&lt;/li&gt;
&lt;li&gt;Merge code to the main branch. Run tagging, versioning, and prepare the release of the artifact.&lt;/li&gt;
&lt;li&gt;Publish.&lt;/li&gt;
&lt;li&gt;Merge back to develop.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The primary difference between this process flow and the more popular Gitflow is I don’t create a release branch. Instead, after testing develop, I merge to main and run my release management on main.  The develop branch exists for testing and integration purposes. This will ensure that only fully tested code is merged to main. The process of merging develop to main can be automatic (triggered by CI) or manual should you want a Senior Dev to review the code before it reaches main.&lt;/p&gt;

&lt;p&gt;In a sense, this process is similar to Trunk-based development, where main and develop both serve as the trunk. After running release management, merge main back to develop. GitHub avoids cyclical builds, so the merge back to develop will not trigger CI.&lt;/p&gt;
&lt;h2&gt;
  
  
  Let’s implement the CI Pipeline on GitHub Actions
&lt;/h2&gt;

&lt;p&gt;First, we install GitVersion:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ brew install gitversion
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I use GitVersion to handle my versioning.&lt;/p&gt;

&lt;p&gt;GitVersion uses the git commit history to compute the version of the software project. The commit messages are parsed for certain patterns:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;When you add a new feature, the commit message should include: +semver: feature&lt;/li&gt;
&lt;li&gt;When you add a bug fix, the commit message should include: +semver: patch&lt;/li&gt;
&lt;li&gt;When a commit introduces a breaking change, the message should include: +semver: major&lt;/li&gt;
&lt;/ol&gt;
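As an illustration of how these messages drive the version, the matching logic can be sketched in Java. The regex patterns below approximate GitVersion's default bump-message configuration (treat the exact patterns, and the BumpDetector class itself, as assumptions for the sketch):

```java
import java.util.regex.Pattern;

// Hypothetical sketch of how commit messages map to version bumps.
public class BumpDetector {

    // Patterns approximating GitVersion's default bump-message regexes.
    private static final Pattern MAJOR = Pattern.compile("\\+semver:\\s?(breaking|major)");
    private static final Pattern MINOR = Pattern.compile("\\+semver:\\s?(feature|minor)");
    private static final Pattern PATCH = Pattern.compile("\\+semver:\\s?(fix|patch)");

    // Returns which semver part a commit message would bump.
    public static String bumpFor(String commitMessage) {
        if (MAJOR.matcher(commitMessage).find()) return "major";
        if (MINOR.matcher(commitMessage).find()) return "minor";
        if (PATCH.matcher(commitMessage).find()) return "patch";
        return "none";
    }

    public static void main(String[] args) {
        System.out.println(bumpFor("Add login flow +semver: feature")); // minor
        System.out.println(bumpFor("Fix NPE on startup +semver: patch")); // patch
    }
}
```

Note that "+semver: feature" bumps the MINOR part, since adding a feature is a backward-compatible change.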

&lt;p&gt;Set up the project for semantic versioning support. I use the bash script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#! /bin/bash

git flow init
gitversion init
git commit -m "Setup Versioning"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above script will show a wizard in the command prompt requesting to fill in information about the project setup. The wizard is quite simple and easy to follow. Once you’re done, you should have a &lt;strong&gt;GitVersion.yml&lt;/strong&gt; file with the following structure in the project root:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;mode: ContinuousDeployment
branches: {}
ignore:
  sha: []
merge-message-formats: {}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the root folder of your project, you should have 3 workflow files declared in the .github/workflows path. Something like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frchgb13tgxreqoztmat4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frchgb13tgxreqoztmat4.png" width="620" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Feature/Hotfix branch Integration. (/.github/workflows/non-mainline-branch-update.yml)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
name: Feature/Hotfix Build

on:
  push:
    branches:
      - 'feature/*'
      - 'hotfix/*'

jobs:
  test:
    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v3

    - name: Set up JDK 17
      uses: actions/setup-java@v3
      with:
        java-version: 17
        distribution: zulu

    - name: Cache SonarCloud packages
      uses: actions/cache@v3
      with:
        path: ~/.sonar/cache
        key: ${{ runner.os }}-sonar
        restore-keys: ${{ runner.os }}-sonar

    - name: Cache Maven packages
      uses: actions/cache@v3
      with:
        path: ~/.m2
        key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
        restore-keys: ${{ runner.os }}-m2-

    - name: Build and analyze
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
      run: mvn -B verify -s settings.xml -f pom.xml

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above script is triggered on push events for branches matching the following patterns: feature/*, hotfix/*.  The step names describe what is being done in each step of the build.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Check out the code and set up the JDK environment.&lt;/li&gt;
&lt;li&gt;Restore the cache for Sonar. You can skip the Sonar caching step if it doesn’t apply to you. I use SonarCloud to scan my artifacts, hence I include the Sonar caching step to speed up the build.&lt;/li&gt;
&lt;li&gt;Restore the cache for Maven. This avoids re-downloading dependencies and thus speeds up the build. The cache key is derived from the hash of the pom files in the project, so changes to the poms compute a new hash and trigger the re-downloading of dependencies.&lt;/li&gt;
&lt;li&gt;The last step runs the build and analysis.&lt;/li&gt;
&lt;/ol&gt;

&lt;h4&gt;
  
  
  Develop branch integration (/.github/workflows/develop-integration.yml)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
name: Develop Branch Integration

on:
  pull_request:
    branches: [develop]
    types: [closed]

jobs:
  build:
    runs-on: ubuntu-latest

    if: github.event.pull_request.merged == true
    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 17
        uses: actions/setup-java@v3
        with:
          java-version: 17
          distribution: zulu

      - name: Cache Maven packages
        uses: actions/cache@v3
        with:
          path: ~/.m2
          key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
          restore-keys: ${{ runner.os }}-m2-

      - name: Build and analyze
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }}
        run: mvn -B verify -s settings.xml -f pom.xml

      - name: Set Commit Message
        id: commit
        run: |
          ${{ startsWith(github.head_ref, 'feature/') }} &amp;amp;&amp;amp; echo ::set-output name=message::"+semver: feature" \
          || echo ::set-output name=message::"+semver: patch"

      - name: Commit Build Message
        env:
          COMMIT_MSG: ${{ steps.commit.outputs.message }}
        run: |
          git config user.email ${{ secrets.GIT_EMAIL }}
          git config user.name ${{ secrets.GIT_USERNAME }}
          git add .
          git commit -m "$COMMIT_MSG" --allow-empty || true

      - name: Push changes
        uses: ad-m/github-push-action@master
        with:
          branch: develop
          github_token: ${{ secrets.GITHUB_TOKEN }}

  merge-main:
    name: Merge to Main
    needs: [build]
    runs-on: ubuntu-latest

    if: github.event.pull_request.merged == true
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Fetching
        run: |
          git fetch --all

      - name: Merge to Main
        uses: devmasx/merge-branch@v1.1.0
        with:
          type: now
          target_branch: 'main'
        env:
          GITHUB_TOKEN: ${{ secrets.GIT_ACCESS_TOKEN }}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The steps are conditional on merging to develop, hence the snippet &lt;code&gt;if: github.event.pull_request.merged == true&lt;/code&gt;. This build starts off almost like the previous build, but this time has a &lt;strong&gt;Set Commit Message&lt;/strong&gt; step where I set a message in the step output, which is then used in the &lt;strong&gt;Commit Build Message&lt;/strong&gt; step and forms part of the commit message.&lt;/p&gt;

&lt;p&gt;On the main build, the commit message will be utilized by GitVersion to compute the version of the software. If I merge a feature/* branch, I commit with the message “+semver: feature”; otherwise I use “+semver: patch”. For breaking changes introduced to the artifact, I would manually commit a message “+semver: major” locally. After the tests run successfully, I commit the changes to the develop branch and automatically trigger a merge to main.&lt;/p&gt;
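&lt;p&gt;As a rough mental model of how those “+semver:” markers drive the version bump (an illustrative sketch only, not GitVersion’s actual implementation):&lt;/p&gt;

```javascript
// Illustrative sketch of the "+semver:" commit-message convention.
// This is NOT GitVersion's real algorithm, just a simplified model:
// "feature" maps to a minor bump, "major" to a major bump, else patch.
function nextVersion(current, commitMessage) {
  const [major, minor, patch] = current.split('.').map(Number);
  if (commitMessage.includes('+semver: major')) return `${major + 1}.0.0`;
  if (commitMessage.includes('+semver: feature')) return `${major}.${minor + 1}.0`;
  // Default: patch bump.
  return `${major}.${minor}.${patch + 1}`;
}

console.log(nextVersion('1.4.2', '+semver: feature')); // → 1.5.0
console.log(nextVersion('1.4.2', '+semver: patch'));   // → 1.4.3
console.log(nextVersion('1.4.2', '+semver: major'));   // → 2.0.0
```

&lt;p&gt;GitVersion scans the commit history for these markers and computes the next semantic version accordingly.&lt;/p&gt;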

&lt;p&gt;The above script will not work without declaring the secrets in your GitHub repository settings. Under &lt;strong&gt;&lt;em&gt;Secrets and variables&lt;/em&gt;&lt;/strong&gt;, select Actions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqbtvabrhzdw65ljltv9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqbtvabrhzdw65ljltv9.png" width="800" height="415"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When you push code to a branch prefixed with feature/* or hotfix/*, the above pipeline is triggered. You can check the status of the job under the Actions tab.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0a6g7bu1s1i9omkg14t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz0a6g7bu1s1i9omkg14t.png" width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Main Branch Integration (/.github/workflows/main-integration.yml)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
name: Main Branch CI

# Controls when the action will run. Triggers the workflow on push or pull request
# events but only for the main branch
on:
  push:
    branches: [main]

jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v2
      - name: Fetching All
        run: |
          git fetch --prune --unshallow

      # Install .NET Core as it is required by GitVersion action
      - name: Setup .NET Core
        uses: actions/setup-dotnet@v3
        with:
          dotnet-version: |
            3.1.x
            5.0.x

      # Install Git Version
      - name: Installing GitVersion
        uses: gittools/actions/gitversion/setup@v0.9.3
        with:
          versionSpec: '5.3.x'

      # Use Git Version to compute version of the project
      - name: Use GitVersion
        id: gitversion
        uses: gittools/actions/gitversion/execute@v0.9.3

      # Setup Java environment
      - name: Setup Java 17 env
        uses: actions/setup-java@v1
        with:
          java-version: 17

      # Cache and restore Maven dependencies
      - name: Cache Maven packages
        uses: actions/cache@v3
        with:
          path: ~/.m2
          key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
          restore-keys: ${{ runner.os }}-m2-

      # For a maven artifact, set version to what was computed by GitVersion in earlier step
      - name: Evaluate New Artifact Version
        run: |
          NEW_VERSION=${{ steps.gitversion.outputs.semVer }}
          echo "Artifact Semantic Version: $NEW_VERSION"
          mvn versions:set -DnewVersion=${NEW_VERSION}-SNAPSHOT -s settings.xml

      # Deploy artifact to repository. Could be ossrh, archiva etc. 
      - name: Build and Deploy with Maven
        env:
          ARTIFACT_REPO_USERNAME: ${{ secrets.ARTIFACT_REPO_USERNAME }}
          ARTIFACT_REPO_PASSWORD: ${{ secrets.ARTIFACT_REPO_PASSWORD }}
        run: |
          export MAVEN_OPTS="--add-opens=java.base/java.util=ALL-UNNAMED --add-opens=java.base/java.lang.reflect=ALL-UNNAMED --add-opens=java.base/java.text=ALL-UNNAMED --add-opens=java.desktop/java.awt.font=ALL-UNNAMED"
          mvn clean deploy -s settings.xml -f pom.xml

      # Optional step where I like to write the version number to a file in the project root. 
      - name: Upgrading Version
        run: |
          RELEASE_TAG=${{ steps.gitversion.outputs.semVer }}
          echo $RELEASE_TAG &amp;gt; version.ver
          git config user.email ${{ secrets.GIT_EMAIL }}
          git config user.name ${{ secrets.GIT_USERNAME }}
          git add .
          git commit -m "Upgraded Version &amp;gt;&amp;gt; $RELEASE_TAG" || true

      - name: Push changes
        uses: ad-m/github-push-action@master
        with:
          branch: main
          github_token: ${{ secrets.GITHUB_TOKEN }}

  merge-develop:
    name: Merge to Develop
    needs: [build]
    runs-on: ubuntu-latest

    steps:
    - name: Checkout
      uses: actions/checkout@v2
    - name: Fetching
      run: |
        git fetch --all
    - name: Merge to Develop
      uses: devmasx/merge-branch@v1.1.0
      with:
        type: now
        target_branch: develop
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Following the comments in the main build should be straightforward. I compute the version of the artifact using GitVersion, then deploy the artifact after calling “mvn versions:set”. The new version is committed to main and then merged back to develop. The deployment step will vary depending on the nature of the artifact. For example, it could be a service being deployed on Heroku. In that case, I will have a step like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
- name: Push changes to Heroku
  uses: akhileshns/heroku-deploy@v3.12.12
  with:
    heroku_api_key: ${{secrets.HEROKU_API_KEY}}
    heroku_app_name: ${{secrets.HEROKU_APP_NAME}}
    heroku_email: ${{secrets.HEROKU_EMAIL}}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We can do anything really: deploy to Heroku, AWS, or GCP, or push an image to a Docker registry. Whatever works for our process flow.&lt;/p&gt;

&lt;p&gt;Here’s a sequence diagram to recap what our CI looks like:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh2pjne8iasfua4778y8z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh2pjne8iasfua4778y8z.png" width="548" height="573"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this article, we have discussed how to automate the CI process of Maven artifacts using GitHub Actions. By using GitHub Actions, you can easily configure and run Maven commands, such as “mvn test” and “mvn deploy”, whenever changes are pushed to your repository. This helps to catch errors early and ensure that the code is always in a releasable state.&lt;/p&gt;

&lt;p&gt;Please note that this is an example and it might be necessary to adjust the steps and commands to match your specific use case 😉&lt;/p&gt;

</description>
      <category>programming</category>
      <category>java</category>
      <category>maven</category>
      <category>ci</category>
    </item>
    <item>
      <title>Implementing the Flux Architecture Pattern in VueJS</title>
      <dc:creator>Julian Duru</dc:creator>
      <pubDate>Mon, 07 Sep 2020 00:03:08 +0000</pubDate>
      <link>https://forem.com/durutheguru/implementing-the-flux-architecture-pattern-in-vuejs-57gp</link>
      <guid>https://forem.com/durutheguru/implementing-the-flux-architecture-pattern-in-vuejs-57gp</guid>
      <description>&lt;p&gt;Modern frontend development has really gone far. Honestly, if you can remember the early days, you know we’ve come a long way from relying on spaghetti JQuery code to deliver functionality to users. Today we have frameworks like React, Vue, and Angular. These frameworks encapsulate the &lt;a href="https://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93viewmodel"&gt;MVVM&lt;/a&gt; and &lt;a href="https://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93controller"&gt;MVC&lt;/a&gt; software architecture patterns that make it easier to build scalable frontends to meet the demands of users.&lt;/p&gt;

&lt;p&gt;Some of the basic requirements of a frontend include accepting input from a user and forwarding the input to the backend, also there is often the need to fetch data from the backend and render it to the user. All this might seem so simple on the surface, but as you start to build out a large system, complexity can start to increase by several orders of magnitude. Therefore, a well-designed frontend must follow best practices of &lt;strong&gt;componentization&lt;/strong&gt; and &lt;strong&gt;clear separation of concerns&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In the spirit of componentization, a problem starts to present itself when several parts of the application need to share data. How do we ensure this data is shared in a consistent manner and that updates to this data are communicated to all interested components? This problem is generally called &lt;strong&gt;State Management&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;The State Management problem was born out of the inherent complexity in building out large frontend systems that relied on different components that needed to share data in a consistent manner. This problem was elegantly solved at Facebook using the &lt;strong&gt;Flux Architecture&lt;/strong&gt;. The popular frontend frameworks have their own implementations of Flux: Vue has &lt;em&gt;Vuex&lt;/em&gt;, React has &lt;em&gt;Redux&lt;/em&gt;, Angular has &lt;em&gt;NgRx&lt;/em&gt;. For some reason they all end in x; I wish I knew why.&lt;/p&gt;

&lt;p&gt;In this post, I’ll focus on implementing Flux in VueJS, as Vue is my frontend framework of choice. React and Angular are equally good, Vue just happens to be my favorite.&lt;/p&gt;

&lt;h2&gt;
  
  
  So what is the Flux Architecture?
&lt;/h2&gt;

&lt;p&gt;Flux introduces predictability in State Management. It accomplishes this by ensuring a unidirectional flow of data across the application. To understand Flux, let’s look at the essential components of a flux architecture:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Actions&lt;/li&gt;
&lt;li&gt;Store&lt;/li&gt;
&lt;li&gt;Dispatcher&lt;/li&gt;
&lt;li&gt;Views&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9Z3cQDP2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/flux.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9Z3cQDP2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/flux.png" alt="" width="597" height="503"&gt;&lt;/a&gt;Diagrammatic Representation of Flux pattern&lt;/p&gt;

&lt;p&gt;Very briefly, I’ll run through them.&lt;/p&gt;

&lt;p&gt;An &lt;strong&gt;Action&lt;/strong&gt;  is an object that encapsulates all the information needed to perform that action. Actions are sent through the dispatcher and triggered to all listening stores. The source of the action could vary depending on the use case and specific scenario. A good example is a user triggering an action by clicking a button. &lt;/p&gt;

&lt;p&gt;A &lt;strong&gt;Store&lt;/strong&gt; is an object that serves as a wrapper around a &lt;strong&gt;State&lt;/strong&gt;. The State is the source of truth; in other words, the primary source of the data we are interested in. The store accomplishes state management by exposing methods with which clients can trigger updates to the state or read the existing state. After executing an update, the store emits an event. Through event propagation, the changes cascade to all Views interested in that state.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;Dispatcher&lt;/strong&gt; is responsible for delivering actions to the stores. Put simply, Stores register to listen to actions and receive notifications of actions from the Dispatcher. &lt;/p&gt;

&lt;p&gt;And finally, &lt;strong&gt;Views&lt;/strong&gt; listen for events emanating from store changes and re-render on such event notifications. Views can also be used to trigger actions to the store through the Dispatcher. &lt;/p&gt;

&lt;p&gt;With an understanding of what I’ve just described, it is easy to see the unidirectionality of data propagation and how it reduces the complexity of state management. The Flux architecture is strict in its implementation. Clients are not allowed to directly manipulate the state; all updates pass through the store. Also, multiple components can register to listen to store updates. &lt;/p&gt;
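&lt;p&gt;To make the unidirectional flow concrete, here is a minimal, framework-free sketch of the four pieces working together (all names here are illustrative; this is not Vuex or Redux source code):&lt;/p&gt;

```javascript
// Minimal Flux sketch: actions flow through a dispatcher to stores,
// stores emit change events, and views re-render from store state.
class Dispatcher {
  constructor() { this.callbacks = []; }
  register(callback) { this.callbacks.push(callback); }
  dispatch(action) { this.callbacks.forEach((cb) => cb(action)); }
}

const dispatcher = new Dispatcher();

// A store wraps state and only mutates it in response to actions.
const counterStore = {
  state: { count: 0 },
  listeners: [],
  subscribe(listener) { this.listeners.push(listener); },
  emitChange() { this.listeners.forEach((l) => l(this.state)); },
};

dispatcher.register((action) => {
  if (action.type === 'INCREMENT') {
    counterStore.state.count += action.amount;
    counterStore.emitChange();
  }
});

// A "view" subscribes and re-renders on store changes.
counterStore.subscribe((state) => console.log('render count:', state.count));

// The view (or anything else) triggers an action through the dispatcher.
dispatcher.dispatch({ type: 'INCREMENT', amount: 2 });
// → render count: 2
```

&lt;p&gt;Notice that the view never touches the state directly: data only ever moves action → dispatcher → store → view.&lt;/p&gt;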

&lt;p&gt;Now let’s look at an example implementation in Vue. We will write a small app that will call a backend and save the data in the local store. Also, we will expose a view to this data. &lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up a mock backend using JSON Server.
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;&lt;a href="https://www.npmjs.com/package/json-server"&gt;json-server&lt;/a&gt;&lt;/em&gt; is a fantastic npm module that allows you to easily mock a REST API. It’s great for frontend development because you can proceed with work and testing without waiting for an existing backend service.&lt;/p&gt;

&lt;p&gt;To install json-server, run the command below, assuming you already have &lt;a href="https://www.npmjs.com/get-npm"&gt;npm setup&lt;/a&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ npm install -g json-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then create a json file to model the mock database. Here’s a sample structure:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "country": [
        {
            "code": "NG",
            "id": 1,
            "name": "Nigeria"
        },
        {
            "code": "GH",
            "id": 2,
            "name": "Ghana"
        }
    ],
    "person": [
        {
            "id": 1,
            "name": "Lagbaja",
            "occupation": "Musician"
        },
        {
            "id": 2,
            "name": "Kate Henshaw",
            "occupation": "Actress"
        },
        {
            "id": 3,
            "name": "Julian Dumebi Duru",
            "occupation": "Software guy"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save the file content in a local folder and run command to execute:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ json-server --watch db.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The json-server module will fire up a mock server on a local port (3000 by default) and expose appropriate REST endpoints for the entities in our json file. You should have traditional REST-style endpoints available: &lt;code&gt;GET http://localhost:3000/person&lt;/code&gt;, &lt;code&gt;GET http://localhost:3000/person/1&lt;/code&gt;. POST, PUT, PATCH, and DELETE requests are supported as well. You can check the &lt;a href="https://www.npmjs.com/package/json-server"&gt;official npm page&lt;/a&gt; for json-server for more details.&lt;/p&gt;
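&lt;p&gt;The route convention is easy to picture: every top-level key in db.json becomes a REST resource. json-server does this internally; the function below is just an illustrative sketch of the mapping:&lt;/p&gt;

```javascript
// Sketch of the route convention json-server follows: each top-level
// key in db.json becomes a REST resource with list, detail, and
// create endpoints (illustrative only).
function derivedRoutes(db) {
  return Object.keys(db).flatMap((resource) => [
    `GET /${resource}`,
    `GET /${resource}/:id`,
    `POST /${resource}`,
  ]);
}

const db = { country: [], person: [] };
console.log(derivedRoutes(db));
// → [ 'GET /country', 'GET /country/:id', 'POST /country',
//     'GET /person',  'GET /person/:id',  'POST /person' ]
```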

&lt;p&gt;Now that we have a mock backend, let’s setup a Vue project to implement Flux. &lt;/p&gt;

&lt;h2&gt;
  
  
  Scaffolding a Vue Project
&lt;/h2&gt;

&lt;p&gt;Before you can go ahead with scaffolding a Vue project, you need to have &lt;a href="https://cli.vuejs.org/guide/installation.html"&gt;vue-cli&lt;/a&gt; installed locally. Installation is pretty straightforward. Simply enter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ npm install -g vue-cli
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then we can go ahead and enter the command below. I like using webpack as my module bundler; browserify is another option worth checking out. Still on your command line, navigate to a folder of your choice and enter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ vue init webpack vuex-app
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;vuex-app is the name of our sample Vue application. Feel free to replace it with whatever you deem fit. After executing the command above, you will be asked a series of questions to aid the scaffolding. Select some sensible defaults similar to what I have below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XfaudlPB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/Screenshot-2020-09-07-at-1.27.14-AM-1024x326.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XfaudlPB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/Screenshot-2020-09-07-at-1.27.14-AM-1024x326.png" alt="" width="880" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You should have a project folder that looks like this: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--u5HM69fO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/Screenshot-2020-09-05-at-8.44.58-PM.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--u5HM69fO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/Screenshot-2020-09-05-at-8.44.58-PM.png" alt="" width="584" height="888"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within the vuex-app project folder, we need to install some node packages that will serve as dependencies for the project. The first is Vuex.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ npm install vuex --save
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To allow us to make API calls to our backend, we’ll use Axios. I like Axios because it’s neat and it works pretty well. Plus it is backed by a large number of developers, so it’s safe to assume continued support for the foreseeable future. Here you go:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ npm install axios --save
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that we’ve setup our dependencies, let’s go ahead and write some code.&lt;/p&gt;

&lt;p&gt;First, we’ll build out the store. When implementing Flux, I like to employ a modular approach. It’s good to pay attention to modularization early in development; it makes it easier to scale the codebase as you progress. So in the src folder, we will create a store sub-folder. Try to replicate the structure below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jcYJ6aYX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/folder-structure.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jcYJ6aYX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/folder-structure.png" alt="" width="603" height="610"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice how I have my modules in separate folders, each with an index.js. I try to have a separate module per API resource. It doesn’t have to be like this; you can employ whatever naming or structure you like, as this just happens to be a personal preference. Moving on, in &lt;code&gt;src/store/modules/person/index.js&lt;/code&gt;, include the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import axios from 'axios'


const state = {

    loading: false,

    error: '',

    list: []

};

const getters = {

    getPersons(context) {
        return context.list;
    },

};

const mutations = {

    setPersons(context, persons) {
        context.list = persons;
    },

    setError(context, error) {
        context.error = error;
    },

    setLoading(context, loading) {
        context.loading = loading;
    },

};

const actions = {

    fetchPersons(context) {
        context.commit('setLoading', true);

        axios
        .get('http://localhost:3000/person')
        .then(
            (response) =&amp;gt; {
                context.commit('setLoading', false);
                context.commit('setPersons', response.data);
            }
        ).catch(
            (error) =&amp;gt; {
                context.commit('setLoading', false);
                context.commit('setError', error);
            }
        );
    }

};

export default {
    namespaced: true,
    state,
    getters,
    actions,
    mutations,
};


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The file consists of four objects: state, getters, mutations, and actions.&lt;/p&gt;

&lt;p&gt;The state object must be a plain JavaScript object, preferably an object literal. Here’s a description of the properties in my state object:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;loading&lt;/strong&gt; : a boolean that will allow me to track if a resource is currently being loaded. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;error&lt;/strong&gt; : a string value to hold a possible error message from the backend.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;list&lt;/strong&gt; : a list to hold the person objects I fetch from the API.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The getters object declares one method, &lt;code&gt;getPersons&lt;/code&gt;, which receives the module’s state as its parameter (named &lt;code&gt;context&lt;/code&gt; here) and returns the person list to the caller.&lt;/p&gt;

&lt;p&gt;The mutations object exposes the methods for updating our store’s state. Each mutation receives the module’s state and a payload with which to update it.&lt;/p&gt;

&lt;p&gt;The actions object contains only one action, which calls the endpoint on our mock server to load person objects. Notice how I only mutate the store’s state by calling the commit method. For the sake of demonstration, I have kept the code simple; in a real project, you want to avoid hard-coding your base URLs. Instead, reference them via configuration so you can easily point the app at different environments.&lt;/p&gt;
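&lt;p&gt;For instance, a minimal sketch of resolving the base URL from configuration might look like this (the &lt;code&gt;API_BASE_URL&lt;/code&gt; variable name is an assumption for illustration):&lt;/p&gt;

```javascript
// Sketch: resolve the API base URL from configuration (here an env
// variable with a local fallback) instead of hard-coding
// "http://localhost:3000" in every action.
// The variable name API_BASE_URL is illustrative, not a convention.
function resolveBaseUrl(env) {
  return env.API_BASE_URL || 'http://localhost:3000';
}

const baseUrl = resolveBaseUrl(process.env);
console.log(`${baseUrl}/person`); // http://localhost:3000/person when API_BASE_URL is unset
```

&lt;p&gt;The action would then build its request URL from &lt;code&gt;baseUrl&lt;/code&gt;, and switching environments becomes a configuration change rather than a code change.&lt;/p&gt;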

&lt;p&gt;Finally, we export the store module with these objects and include the namespaced property with value ‘true’. Including this flag allows us to access our person store module in a namespace (as we will see later). This promotes the reusability of our store modules by allowing them to be more self-contained. &lt;/p&gt;

&lt;p&gt;Next, we code our store’s entry file. We will code this file to aggregate all the other store modules.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import Vue from 'vue'
import Vuex from 'vuex'
import persons from './modules/person'


Vue.use(Vuex);

export default new Vuex.Store({

  modules : {
    persons,
  },

});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Not much going on here. We call Vue.use to install Vuex, and then export a store object that aggregates all our modules. For now, we have just one module: persons.&lt;/p&gt;

&lt;p&gt;With the store in place, we can now re-implement the HelloWorld.vue component to use the store we just built. We want to load a list of persons from our backend and display them on the user interface. Delete the autogenerated content of HelloWorld.vue and include this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;template&amp;gt;
    &amp;lt;div class=""&amp;gt;
        &amp;lt;div v-for="p in persons" v-bind:key="p.id"&amp;gt;
            {{p.name}} - {{p.occupation}}
        &amp;lt;/div&amp;gt;
    &amp;lt;/div&amp;gt;
&amp;lt;/template&amp;gt;

&amp;lt;script&amp;gt;

    import store from '@/store'

    export default {

        mounted() {
            store.dispatch("persons/fetchPersons");
        },

        computed:  {

            persons() {
                return store.getters['persons/getPersons'];
            }

        }

    }

&amp;lt;/script&amp;gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the template, we use Vue’s v-for syntax to render all items in our store’s person list. In the exported component, we trigger the namespaced &lt;code&gt;fetchPersons&lt;/code&gt; action in the &lt;strong&gt;mounted&lt;/strong&gt; lifecycle hook (you can check out Vue’s official documentation to understand component lifecycles), so the action fires when the component is loaded. We also expose a computed property called persons, on which our v-for template binds to render the list items. The computed property calls the store’s getter, which returns the fetched person objects. Notice that we have to use the appropriate namespace to access the getter. &lt;/p&gt;

&lt;p&gt;And we’re good to go. From the root of the project on the command line, we can run the app in dev mode:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ npm run dev
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Open the app in your browser; you should see the person objects from db.json, successfully fetched and stored.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--x0NkT3dV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/Screenshot-2020-09-07-at-2.07.31-AM-1024x787.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--x0NkT3dV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://secureservercdn.net/45.40.146.28/94l.adb.myftpupload.com/wp-content/uploads/2020/09/Screenshot-2020-09-07-at-2.07.31-AM-1024x787.png" alt="" width="880" height="676"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Depending on your requirements, you might want a store that is persisted to local storage. This allows us to create a somewhat offline experience for the user: if the user loses internet connection, we can simply render the last successfully loaded data from the store until the connection is re-established.&lt;/p&gt;

&lt;p&gt;If you would like your store to be persisted to the browser’s local storage, there’s a plugin you can use for that: &lt;code&gt;vuex-persistedstate&lt;/code&gt;. Within the project root folder, run command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ npm install vuex-persistedstate --save 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then we will make a slight modification to our store’s index file. Here’s what it should look like afterward:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import Vue from 'vue'
import Vuex from 'vuex'
import persistedState from 'vuex-persistedstate'
import persons from './modules/person'


Vue.use(Vuex);

export default new Vuex.Store({

  modules : {
    persons,
  },

  plugins: [
    persistedState(),
  ],

});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We haven’t done much here, merely imported the &lt;code&gt;vuex-persistedstate&lt;/code&gt; and installed it as a store plugin. Pretty neat! The plugin will handle persistence to the browser’s local storage. &lt;/p&gt;
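&lt;p&gt;Under the hood, the idea is simple: serialize the state on every change and restore it at startup. Here is a minimal sketch of that mechanic (illustrative only; the real vuex-persistedstate plugin also subscribes to store mutations and handles many edge cases):&lt;/p&gt;

```javascript
// Minimal sketch of what a persisted-state plugin does: save the
// serialized state under a key and restore it later. Illustrative
// only; not the vuex-persistedstate implementation.
function createPersistence(storage, key) {
  return {
    save(state) { storage.setItem(key, JSON.stringify(state)); },
    load() {
      const raw = storage.getItem(key);
      return raw ? JSON.parse(raw) : null;
    },
  };
}

// A tiny in-memory stand-in for window.localStorage:
const memoryStorage = {
  data: {},
  setItem(k, v) { this.data[k] = v; },
  getItem(k) { return k in this.data ? this.data[k] : null; },
};

const persistence = createPersistence(memoryStorage, 'vuex');
persistence.save({ persons: { list: [{ id: 1, name: 'Lagbaja' }] } });
console.log(persistence.load().persons.list[0].name); // → Lagbaja
```

&lt;p&gt;In the browser, the plugin wires this save/load cycle to the store’s mutations and to window.localStorage, which is why a page refresh still shows the last loaded data.&lt;/p&gt;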

&lt;p&gt;You can go ahead and test the setup. Refresh the page and check the browser’s local storage; you should see that the loaded objects have been persisted. What’s really great is that even if we kill the mock backend and it becomes unavailable, we can still give the user an offline experience, since our component reads from the store, which already holds the persisted objects. As soon as our backend is up and we can fetch new data, the store is updated and the view re-rendered. That, my friend, is the magic of Flux.&lt;/p&gt;

&lt;h2&gt;
  
  
  Wrapping up
&lt;/h2&gt;

&lt;p&gt;To recap, we implemented the Flux Architecture using VueJS. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;We setup a mock back-end using json-server&lt;/li&gt;
&lt;li&gt;Then we built a store to encapsulate our application state.&lt;/li&gt;
&lt;li&gt;Then we added an Action to fetch data from the mock backend.&lt;/li&gt;
&lt;li&gt;Afterward, we implemented a View to trigger the Action when the View is loaded.&lt;/li&gt;
&lt;li&gt;The Action resulted in a mutation of the Store’s state.&lt;/li&gt;
&lt;li&gt;The mutations triggered events that cascaded updates to the View.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I will point out, though, that you don’t always need the Flux architecture pattern. Sometimes it’s fine to have components manage their state locally as opposed to having global State Management. Flux really shines when you need to share state across components and want to ensure a clean architecture. Though it introduces some complexity, it can be worth the trade-off. Check out this &lt;a href="https://medium.com/@dan_abramov/you-might-not-need-redux-be46360cf367#.z9abvda1k"&gt;medium post&lt;/a&gt; by Dan Abramov, the author of Redux. &lt;/p&gt;

&lt;h2&gt;
  
  
  Further Reading
&lt;/h2&gt;

&lt;p&gt;Vuex – &lt;a href="https://vuex.vuejs.org/"&gt;https://vuex.vuejs.org/&lt;/a&gt;&lt;br&gt;&lt;br&gt;
Vue Official Documentation – &lt;a href="https://vuejs.org/v2/guide/"&gt;https://vuejs.org/v2/guide/&lt;/a&gt;&lt;br&gt;&lt;br&gt;
json-server – &lt;a href="https://www.npmjs.com/package/json-server"&gt;https://www.npmjs.com/package/json-server&lt;/a&gt;&lt;br&gt;&lt;br&gt;
vuex-persistedstate – &lt;a href="https://www.npmjs.com/package/vuex-persistedstate"&gt;https://www.npmjs.com/package/vuex-persistedstate&lt;/a&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>vue</category>
      <category>vuex</category>
    </item>
  </channel>
</rss>
