<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Ken Yip</title>
    <description>The latest articles on Forem by Ken Yip (@ken2026).</description>
    <link>https://forem.com/ken2026</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1653718%2F8cb8d3cd-c218-4d43-8027-01b741e9e389.png</url>
      <title>Forem: Ken Yip</title>
      <link>https://forem.com/ken2026</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/ken2026"/>
    <language>en</language>
    <item>
      <title>Develop a custom hook extension for Directus</title>
      <dc:creator>Ken Yip</dc:creator>
      <pubDate>Sun, 10 Nov 2024 04:25:06 +0000</pubDate>
      <link>https://forem.com/ken2026/develop-a-custom-hook-extension-for-directus-33p1</link>
      <guid>https://forem.com/ken2026/develop-a-custom-hook-extension-for-directus-33p1</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Recently, I decided to use Directus as the CMS for my personal blog after comparing it with other open-source frameworks. One of the advantages of Directus is that it allows us to easily create custom extensions.&lt;/p&gt;

&lt;p&gt;In this article, I will guide you through creating a custom hook extension that slugifies the article title and throws an error if the slug already exists in the database.&lt;/p&gt;

&lt;h2&gt;
  
  
  Configure the Setup
&lt;/h2&gt;

&lt;p&gt;First of all, we need to install Directus on our local machine to develop an extension. There are several ways to set up a Directus project. You can check the official documentation here:&lt;br&gt;
&lt;a href="https://docs.directus.io/getting-started/quickstart.html#_1-create-a-project" rel="noopener noreferrer"&gt;https://docs.directus.io/getting-started/quickstart.html#_1-create-a-project&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To enable local development, I recommend using the NPM installation method. It allows us to hot-reload the server whenever the extension code changes.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Setup Directus project&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight shell"&gt;&lt;code&gt;  npm init directus-project@latest &amp;lt;project-name&amp;gt;
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Install Nodemon to enable hot-reload&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  yarn add -D nodemon
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Add a command to the scripts section of the package.json&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  "dev": "npx nodemon --watch extensions --ext js --exec \"yarn start\""
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Create a collection called ‘article’ with the fields: title, slug, and content. The slug should be set as hidden.&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Start the program with the command:&lt;br&gt;
&lt;/p&gt;

&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  yarn dev
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;/ol&gt;

&lt;h2&gt;
  
  
  Create the custom hook
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;Navigate to the extensions folder and run the following command to create a new extension.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  npx create-directus-extension@latest slugify-article-title
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;&lt;p&gt;You will be prompted with several questions. I chose TypeScript to create the extension.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Install the ‘slugify’ package in the extension.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  yarn add slugify
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Create three folders: errors, types, and utils, and then add the following scripts:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  // createArticleTitleAlreadyExistsError.ts
  import { createError } from '@directus/errors';

  export const createArticleTitleAlreadyExistsError = (title: string) =&amp;gt;
    createError(
      'ArticleTitleAlreadyExistsError',
      `An article with the title "${title}" already exists.`,
      409
    );
  // InvalidSetupError.ts
  import { createError } from '@directus/errors';

  export const InvalidSetupError = createError(
    'InvalidSetupError',
    'The article collection must have a string field named "slug" to use the slugify-article-title extension.',
    400
  );

  // verifySetup.ts
  import { SchemaOverview } from '@directus/types';

  import { InvalidSetupError } from '../errors';

  export const verifySetup = (schema: SchemaOverview | null) =&amp;gt; {
    if (!schema) {
      return;
    }
    const articleSchema = schema.collections?.article;
    if (articleSchema === undefined) {
      return;
    }
    if (
      articleSchema.fields.slug === undefined ||
      articleSchema.fields.slug.type !== 'string'
    ) {
      throw new InvalidSetupError();
    }
  };

  // types/index.ts
  export interface ArticlePayload {
    title: string;
    slug?: string;
    content?: string;
  }
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The verifySetup function ensures that the article collection is set up correctly. Each error defines a message and an HTTP status code; the message is displayed as an alert in the admin portal.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;In the index.ts file, replace the script with the following code:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  import slugify from 'slugify';

  import { defineHook } from '@directus/extensions-sdk';

  import { createArticleTitleAlreadyExistsError } from './errors';
  import { ArticlePayload } from './types';
  import { verifySetup } from './utils';

  export default defineHook(({ filter }) =&amp;gt; {
    filter('article.items.create', async (input, _meta, { database, schema }) =&amp;gt; {
      verifySetup(schema);

      const payload = input as ArticlePayload;
      const slug = slugify(payload.title, { lower: true });

      const existingTitle = await database
        .table('article')
        .where('slug', slug)
        .first('title')
        .then((result) =&amp;gt; result?.title);

      if (existingTitle !== undefined) {
        const ArticleTitleAlreadyExistsError = createArticleTitleAlreadyExistsError(existingTitle);
        throw new ArticleTitleAlreadyExistsError();
      }

      payload.slug = slug;
      return payload;
    });

    filter('article.items.update', async (input, meta, { database, schema }) =&amp;gt; {
      verifySetup(schema);
      const updates = input as Partial&amp;lt;ArticlePayload&amp;gt;;
      if (!updates.title) {
        return updates;
      }
      const articleId: string = meta.keys[0];
      if (!articleId) {
        return updates;
      }
      const slug = slugify(updates.title, { lower: true });

      const existingTitle = await database
        .table('article')
        .where('slug', slug)
        .where('id', '&amp;lt;&amp;gt;', articleId)
        .first('title')
        .then((result) =&amp;gt; result?.title);

      if (existingTitle !== undefined) {
        const ArticleTitleAlreadyExistsError = createArticleTitleAlreadyExistsError(existingTitle);
        throw new ArticleTitleAlreadyExistsError();
      }

      updates.slug = slug;
      return updates;
    });
  });
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The script contains two hooks that are triggered before the article item is created or updated. It slugifies the title, checks if the slug exists in the database, and either inserts the slug or throws an error (as declared in the previous step) if it already exists. You can find the available events here:&lt;br&gt;
&lt;a href="https://docs.directus.io/extensions/hooks.html#available-events" rel="noopener noreferrer"&gt;https://docs.directus.io/extensions/hooks.html#available-events&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Build the extension so you can test it on the admin portal. The extension scaffolded by create-directus-extension includes a watch script:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  yarn dev
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;You will notice that the server and the build process are automatically restarted whenever there is a code change!&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>node</category>
    </item>
    <item>
      <title>Crafting a Scalable Node.js API: Insights from My RealWorld Project with Express, Knex, and AWS CDK</title>
      <dc:creator>Ken Yip</dc:creator>
      <pubDate>Mon, 04 Nov 2024 22:24:18 +0000</pubDate>
      <link>https://forem.com/ken2026/crafting-a-scalable-nodejs-api-insights-from-my-realworld-project-with-express-knex-and-aws-cdk-4ol5</link>
      <guid>https://forem.com/ken2026/crafting-a-scalable-nodejs-api-insights-from-my-realworld-project-with-express-knex-and-aws-cdk-4ol5</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1k0kvzjorcdxwbe2vqv0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1k0kvzjorcdxwbe2vqv0.png" alt="Image description" width="800" height="123"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Recently, I undertook a significant refactor of an example program I previously built by implementing the repository design pattern. I also transitioned from using AWS SAM to AWS CDK for managing infrastructure and deployment. You can find the source code for the application &lt;a href="https://github.com/kenyipp/realworld-nodejs-example-app" rel="noopener noreferrer"&gt;here&lt;/a&gt; and the infrastructure code &lt;a href="https://github.com/kenyipp/realworld-nodejs-example-app-infra" rel="noopener noreferrer"&gt;here&lt;/a&gt;. In this article, I will highlight the technologies and design patterns I adopted in this repository, sharing insights into the structure and techniques that contribute to a robust and scalable application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Structure
&lt;/h2&gt;

&lt;p&gt;The project employs a monorepo pattern using Turborepo, which allows for efficient management of multiple services and shared resources.&lt;br&gt;
Within this structure, the services are organized in the apps folder, while shared components and libraries reside in the packages folder. This organization enables the apps to depend on the packages, and for the packages to depend on one another as needed. Turborepo facilitates dependency management and utilizes caching mechanisms, significantly speeding up development by avoiding redundant tasks when underlying resources have not changed.&lt;br&gt;
The core business logic is located in the core folder under packages. In this package, I implemented the repository design pattern, dividing the program into four distinct layers: &lt;strong&gt;Database, Repository, Service, and Provider.&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Database Layer&lt;/strong&gt;: This layer manages data exchange between the database and the application, handling CRUD operations and database-specific logic.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Repository Layer&lt;/strong&gt;: The repository layer acts as an intermediary, transforming inputs and outputs, implementing caching mechanisms, and ensuring that the application logic does not need to be tightly coupled with the database.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Service Layer&lt;/strong&gt;: Here, the core business logic resides. This layer consists of multiple service classes that act as facades, receiving necessary dependencies and delegating tasks to the appropriate handlers located in the implementations folder.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Provider Layer&lt;/strong&gt;: This layer handles interactions with third-party services, encapsulating their functionality and providing a clean interface for the rest of the application.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This architecture emphasizes the separation of concerns: each layer communicates with the adjacent layers through well-defined interfaces, so changes in one layer do not ripple into the others. For example, if we decide to switch our database from RDS to MongoDB, only the database layer would require modifications.&lt;/p&gt;

&lt;p&gt;By adhering to the principles of dependency injection, we can easily mock each layer during testing, enabling us to verify the functionality of other services in isolation. In fact, I have mocked all API requests and responses within the provider layer, which will be discussed further in the testing section.&lt;/p&gt;
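&lt;p&gt;As a hypothetical sketch of this constructor-injection style (the class names below are illustrative stand-ins, not the actual classes from the repository), the service receives its repository through the constructor, so a test can inject an in-memory fake in place of the database-backed implementation:&lt;/p&gt;

```typescript
interface Article {
  slug: string;
  title: string;
}

interface ArticleRepository {
  findBySlug(slug: string): Article | null;
}

// database-free fake used in tests; the real implementation would wrap Knex
class InMemoryArticleRepository implements ArticleRepository {
  private readonly articles = new Map();

  save(article: Article): void {
    this.articles.set(article.slug, article);
  }

  findBySlug(slug: string): Article | null {
    return this.articles.get(slug) ?? null;
  }
}

// the service only depends on the ArticleRepository interface, so the
// business logic can be exercised in isolation
class ArticleService {
  constructor(private readonly repo: ArticleRepository) {}

  getArticle(slug: string): Article {
    const article = this.repo.findBySlug(slug);
    if (article === null) {
      throw new Error("Article not found: " + slug);
    }
    return article;
  }
}

// usage: inject the in-memory fake instead of a real database-backed repo
const repo = new InMemoryArticleRepository();
repo.save({ slug: "hello-world", title: "Hello World" });
const service = new ArticleService(repo);
console.log(service.getArticle("hello-world").title); // prints "Hello World"
```

&lt;p&gt;Because the service depends only on the interface, swapping the fake for a Knex-backed implementation requires no change to the service itself.&lt;/p&gt;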

&lt;p&gt;In the apps folder, we establish the entry points for our services. This is where we extract request payloads from various endpoints, validate them, and process them using the core package as needed. The architecture is designed to ensure that even if we change the source of the payload (for instance, switching from SQS to RabbitMQ), we only need to modify the application layer, leaving the package layer untouched.&lt;/p&gt;

&lt;h2&gt;
  
  
  API Deployment
&lt;/h2&gt;

&lt;p&gt;This project is designed as a fully serverless application, with API endpoints deployed within AWS Lambda functions. To facilitate this, we utilized the @vendia/serverless-express plugin, which allows us to convert our Express application into a Lambda function seamlessly. Additionally, we integrated AWS API Gateway to handle incoming HTTP requests, providing a robust interface for our APIs.&lt;/p&gt;

&lt;p&gt;Since we are using Cloudflare as our CDN provider, I configured the API Gateway as a regional resource. This setup allows us to create a CNAME record in Cloudflare that points to the API Gateway endpoint, ensuring efficient routing and caching of requests.&lt;/p&gt;

&lt;p&gt;One of the significant advantages of this architecture is the flexibility it offers. We can easily switch from a serverless deployment to a containerized solution if needed. The Dockerfile for this purpose is located in the apps/local folder. If you're interested in deploying the Docker container to AWS ECS with a load balancer and auto-scaling capabilities, you can refer to this repository for a comprehensive guide on the process.&lt;br&gt;
&lt;a href="https://github.com/kenyipp/nextjs-cdk-example" rel="noopener noreferrer"&gt;https://github.com/kenyipp/nextjs-cdk-example&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By adopting this serverless architecture, we benefit from scalability, reduced operational overhead, and efficient resource management, allowing us to focus on developing features and improving the application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Infrastructure Management
&lt;/h2&gt;

&lt;p&gt;The infrastructure for this project is fully managed using AWS CDK, and I have created a dedicated repository specifically for managing it. This repository includes all necessary resources such as IAM roles, SQS queues, and CodePipeline configurations that support the application.&lt;/p&gt;

&lt;p&gt;To streamline the deployment process, this setup includes two CI/CD pipelines:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Infrastructure CI/CD Pipeline&lt;/strong&gt;: This pipeline is triggered by GitHub Actions and handles the deployment of the entire infrastructure stack. If the pipeline code is well-structured (e.g., with no circular dependencies between stacks) and has been thoroughly tested in lower environments, it can deploy to production seamlessly without requiring manual local adjustments. It updates the infrastructure components, ensuring consistency across all environments.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lambda Deployment CI/CD Pipeline&lt;/strong&gt;: This pipeline manages the deployment of the application code. It pulls the latest code from the application repository, builds the application, and uploads the resulting package to the Lambda function. This deployment pipeline is also triggered by GitHub Actions, allowing for a smooth and automated code deployment process.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Both CodePipeline processes are manually triggered through GitHub Actions. Each pipeline runs quality checks and test cases in GitHub Actions before triggering the corresponding AWS CodePipeline deployment for infrastructure or application code. This approach ensures that only tested and verified changes make it to production, providing a robust and automated deployment pipeline for the entire application stack.&lt;/p&gt;

&lt;h2&gt;
  
  
  Testing Strategies
&lt;/h2&gt;

&lt;p&gt;This repository employs a comprehensive testing strategy, including unit tests, integration tests, and end-to-end tests, to ensure stability and reliability across all layers of the application.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Unit Tests&lt;/strong&gt;: Unit tests are written for functions and core modules, targeting the smallest components in the application. By mocking dependencies within each layer - such as database, repository, and service layers - we can effectively test both happy paths and edge cases. This isolation allows us to focus on individual functions and modules, ensuring each component works as expected without dependencies interfering.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration Tests&lt;/strong&gt;: Integration tests focus on validating the APIs in the apps layer. Using supertest in conjunction with an in-memory SQLite database, we test each API endpoint in a realistic environment. This approach enables us to verify that modules interact correctly and that data flows as expected from one layer to another.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;End-to-End Tests&lt;/strong&gt;: For end-to-end testing, we set up a real MySQL database in GitHub Actions and launch the server to simulate actual deployment conditions. We use Newman (a tool for running Postman collections) to test the full user experience, from API requests to final responses. This allows us to validate that the application behaves as intended from a user's perspective.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Testing is a crucial part of our development workflow, supporting rapid iteration and refactoring. Well-written test cases allow us to confidently add features or make modifications without the risk of breaking other parts of the system. Good test coverage also provides a foundation for future refactoring, ensuring that as long as tests pass, the refactored code performs correctly.&lt;/p&gt;

&lt;p&gt;In many cases, tight timelines may require us to prioritize feature rollouts, which can lead to sacrifices in code quality. However, I prioritize writing robust test cases first, as they serve as a safety net for future improvements. This practice enables us to refactor and enhance the codebase with confidence, knowing that as long as tests pass, the quality and functionality of the code remain intact.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Refactoring this repository has been a valuable learning experience, allowing me to apply new technologies and best practices along the way. I continuously explore and integrate improvements to keep the project up-to-date and efficient. If you find this repository or article helpful, consider giving it a star or clap - your support is appreciated! Thank you.&lt;/p&gt;

</description>
      <category>cdk</category>
      <category>typescript</category>
      <category>node</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Deploy a Docker Image to ECS with Auto Scaling Using AWS CDK in Minutes</title>
      <dc:creator>Ken Yip</dc:creator>
      <pubDate>Sun, 03 Nov 2024 18:50:43 +0000</pubDate>
      <link>https://forem.com/ken2026/deploy-a-docker-image-to-ecs-with-auto-scaling-using-aws-cdk-in-minutes-7a2</link>
      <guid>https://forem.com/ken2026/deploy-a-docker-image-to-ecs-with-auto-scaling-using-aws-cdk-in-minutes-7a2</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Deploying a scalable application to the cloud can be challenging for new developers, especially when managing resources individually through the console, which is time-consuming. While AWS CloudFormation is powerful, its complexity can create a high learning curve. AWS CDK offers an easier solution by allowing you to define AWS infrastructure using familiar programming languages. This simplifies the process and reduces deployment time. I’ve created an example repository to demonstrate how to deploy a containerized application with auto-scaling capabilities using AWS CDK. By cloning the repo, you can deploy a scalable application in just a few minutes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/kenyipp/nextjs-cdk-example" rel="noopener noreferrer"&gt;https://github.com/kenyipp/nextjs-cdk-example&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Main features
&lt;/h2&gt;

&lt;p&gt;Here are the main features of this repository:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;ECS Fargate Service&lt;/strong&gt;: Run your application without managing servers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Load Balancing&lt;/strong&gt;: Automatically distribute incoming traffic to containers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Auto-Scaling&lt;/strong&gt;: Scale your application based on CPU usage.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Customizable Task Definitions&lt;/strong&gt;: Easily modify CPU, memory, and environment variables.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By following the guide in my GitHub repository, you can get your application up and running in no time. If you have any questions or need further clarification on any part of the process, feel free to reach out!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cdk</category>
      <category>devops</category>
      <category>docker</category>
    </item>
  </channel>
</rss>
