<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Fatih Felix Yildiz</title>
    <description>The latest articles on Forem by Fatih Felix Yildiz (@mfyz).</description>
    <link>https://forem.com/mfyz</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F325946%2Fb5b4f645-136b-4f41-b3ca-033aa36366b0.png</url>
      <title>Forem: Fatih Felix Yildiz</title>
      <link>https://forem.com/mfyz</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/mfyz"/>
    <language>en</language>
    <item>
      <title>duplicati for backups</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Tue, 11 Feb 2025 13:43:13 +0000</pubDate>
      <link>https://forem.com/mfyz/duplicati-for-backups-24ad</link>
      <guid>https://forem.com/mfyz/duplicati-for-backups-24ad</guid>
      <description>&lt;p&gt;I use a few different methods to backup my stuff on my server (and my mac). Duplicati is one of them. It’s probably the easiest to set up backup tool I’ve used.&lt;/p&gt;

&lt;p&gt;Duplicati is a versatile, open-source backup solution that’s packed with features. It’s easy to use, supports a wide variety of backup destinations, and has a friendly web interface that makes backup management straightforward.&lt;/p&gt;

&lt;h2&gt;
  
  
  Host anywhere
&lt;/h2&gt;

&lt;p&gt;Duplicati is open source and can be deployed almost anywhere. I run it as a Docker container that does NOT require many resources. They also have a hosted version that centralizes backups from multiple machines, but I’ve never used it.&lt;/p&gt;

&lt;p&gt;I use a pretty simple single-container compose file that I deploy using Portainer and its gitops integration (related article: &lt;a href="https://mfyz.com/portainer-gitops-a-simple-way-to-deploy-and-manage-your-self-hosted-applications/" rel="noopener noreferrer"&gt;Portainer + gitops ❤️: A simple way to deploy and manage your self-hosted applications&lt;/a&gt;). I mount my sources (they all live on the same machine, run by different containers) into Duplicati’s container. Duplicati then treats them as plain folders to monitor, takes incremental backups, and sends them to whatever remote I set up.&lt;/p&gt;

&lt;p&gt;Here is my docker-compose.yml for my duplicati setup:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: '3.9'
services:
  duplicati:
    image: lscr.io/linuxserver/duplicati:latest
    container_name: duplicati
    restart: unless-stopped
    environment:
      - TZ=America/New_York
      # - PUID=1000 #the user
      # - PGID=1000 #the group
      # - CLI_ARGS= #optional
    volumes:
      - config:/config
      - ${BACKUP_FOLDER}:/backups
      - ${SOURCE_ROOT_FOLDER}:/source/data
    ports:
      - 8200:8200
volumes:
  config:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
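
&lt;p&gt;The two &lt;code&gt;${...}&lt;/code&gt; variables in that file need values. A minimal sketch of the setup steps (the paths below are illustrative placeholders; adjust them to your machine):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# .env file next to docker-compose.yml (example paths)
BACKUP_FOLDER=/mnt/backups
SOURCE_ROOT_FOLDER=/opt/apps

# then bring the stack up; the web UI is served on port 8200
docker compose up -d
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;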



&lt;h2&gt;
  
  
  Simple Web UI to manage backups
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqq5thhcpf3b75dmiklg.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqq5thhcpf3b75dmiklg.jpg" alt="Image description" width="800" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvr3g2tsm96zv3p3uzg4i.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvr3g2tsm96zv3p3uzg4i.jpg" alt="Image description" width="800" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting Up a Backup Job
&lt;/h2&gt;

&lt;p&gt;Setting up a job is pretty easy using the web UI: you select the source, set up the target, and finally pick a schedule.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0g9pl97q6196zna71ddw.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0g9pl97q6196zna71ddw.jpg" alt="Image description" width="800" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejiav5g3pgr7bky6pdy0.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fejiav5g3pgr7bky6pdy0.jpg" alt="Image description" width="800" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Duplicati supports tons of targets/destinations
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foj12ejxfqojymbifsgme.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foj12ejxfqojymbifsgme.jpg" alt="Image description" width="800" height="530"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I use Google Drive as the target.&lt;/p&gt;

&lt;h2&gt;
  
  
  Monitoring
&lt;/h2&gt;

&lt;p&gt;While Duplicati has ways to check job statuses and logs, I don’t trust a single system.&lt;/p&gt;

&lt;p&gt;I have a separate healthcheck script that checks the backup folders in case Duplicati failed to write. It’s a simple Node.js script that reads the Google Drive folder contents, queries file creation timestamps, and checks whether there is at least one file from the last X hours. If it finds one, it pings my healthcheck system.&lt;/p&gt;
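
&lt;p&gt;The Node.js script itself is tied to the Google Drive API, but the core idea can be sketched as a tiny shell script (the folder path and ping URL below are placeholders, and this version checks a local folder instead of Drive):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/sh
# Ping the healthcheck only if at least one file in the backup
# folder was written in the last 24 hours (1440 minutes)
BACKUP_DIR=/backups
PING_URL=https://hc-ping.com/your-check-uuid

if [ -n "$(find "$BACKUP_DIR" -type f -mmin -1440 | head -n 1)" ]; then
  curl -fsS -m 10 --retry 3 -o /dev/null "$PING_URL"
fi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;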

&lt;p&gt;I’ve covered this recently: &lt;a href="https://mfyz.com/monitor-everything-with-healthchecks-io/" rel="noopener noreferrer"&gt;Monitor everything with Healthchecks.io&lt;/a&gt;. Healthchecks takes care of sending slack and email alerts to me.&lt;/p&gt;

&lt;p&gt;Check duplicati out from their website: &lt;a href="https://duplicati.com" rel="noopener noreferrer"&gt;https://duplicati.com&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;This post was originally posted on my blog: &lt;a href="https://mfyz.com/duplicati-for-backups/" rel="noopener noreferrer"&gt;https://mfyz.com/duplicati-for-backups/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>duplicati</category>
      <category>backups</category>
      <category>opensource</category>
      <category>selfhosted</category>
    </item>
    <item>
      <title>Hosting my hobby projects from cheap HP mini desktop from my closet (Verizon Fios)</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Mon, 04 Nov 2024 13:54:48 +0000</pubDate>
      <link>https://forem.com/mfyz/hosting-my-hobby-projects-from-cheap-hp-mini-desktop-from-my-closet-verizon-fios-3khl</link>
      <guid>https://forem.com/mfyz/hosting-my-hobby-projects-from-cheap-hp-mini-desktop-from-my-closet-verizon-fios-3khl</guid>
      <description>&lt;h2&gt;
  
  
  Why?
&lt;/h2&gt;

&lt;p&gt;For me, self-hosting is like having my own personal playground where I can experiment, tinker, and learn. It's a great way to explore new technologies, try out different setups, and have fun with my projects.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqdmp7i1o4rwwdx4eo80y.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqdmp7i1o4rwwdx4eo80y.jpeg" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As part of my job, I need a deep understanding of developer experience. The best way to build that understanding is to be the developer: both the initial experience with any development tool and the day-to-day work with these systems. Self-hosting is, in a way, building empathy with the developer community, understanding the differences and the good/bad versions. My main reason is “learning”.&lt;/p&gt;

&lt;p&gt;There are a bunch of other reasons one may choose to do this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Privacy&lt;/strong&gt;: your data&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Full control&lt;/strong&gt;: You own it (well, both good and bad)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost-effective&lt;/strong&gt;: (may not be always true, but mostly true)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  You shouldn’t do this… But if you really want to…
&lt;/h2&gt;

&lt;p&gt;I don’t suggest this route for the majority of people. It’s hard, and you’ll hit walls more frequently than you’d like. You have to be a warrior. If your reason is similar to mine, go for it. There is one strong determining factor though: your connectivity.&lt;/p&gt;

&lt;p&gt;Let’s get started. I’ll start with connectivity, then hardware and software.&lt;/p&gt;

&lt;h2&gt;
  
  
  Sounds common but not so common: It’s a privilege to be on Fios
&lt;/h2&gt;

&lt;p&gt;While high-speed internet has become more commonplace, it's still a privilege, especially in the United States. We’re (I’m) definitely taking it for granted. I use Verizon Fios, a fiber internet service. If it weren’t for this, I wouldn’t self-host my stuff.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1q47r0c23abv267x0q0t.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1q47r0c23abv267x0q0t.jpeg" alt="Image description" width="800" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The big practical differentiator for Fios is how stable it is, regardless of the mbps/gbps package you have. I used Fios in residential and office setups in New York City for years. I got off it as I moved to different neighborhoods, and I really, really missed it when I didn’t have it, even though I got 1gbps packages from other providers.&lt;/p&gt;

&lt;p&gt;Back in the dial-up days, when we were trying to play Half-Life (or Counter-Strike) online, your connection speed mattered, but “lag” mattered even more. I lived in Turkey back then, and we had a cable internet provider versus ADSL services. The difference was that you got super low lag/ping on the cable network even at a quarter or a third of the connection speed of the other services that bragged about how fast they were. Speed didn’t matter when playing online.&lt;/p&gt;

&lt;p&gt;I have a 300/300mbps, so-called “symmetrical” connection. 300mbps is already way higher than the average internet connectivity worldwide (certain countries, cities, and regions have much faster networks, but overall most people get access at lower speeds). It would be fine even if it were slower, because it’s a fiber network and it’s symmetrical, meaning download and upload speeds are the same. Traditional ISPs often advertise ridiculously high speeds like 500mbps, but that usually refers only to download speed. In the majority of consumer scenarios that’s fine, but you need the upload speed to be high and consistent/stable when you want to serve upstream.&lt;/p&gt;

&lt;h2&gt;
  
  
  Hardware
&lt;/h2&gt;

&lt;p&gt;Since this is for hobby purposes, I initially searched for some “old” server (the kind that sits in racks) on eBay. Then I realized there were a million combinations of hardware components: CPU architectures, network interfaces. I quickly went down the rabbit hole of Reddit threads with both fun and scary stories. These “serious” servers are electricity-eating, heat-generating machines that are also giant and require space, which turned me off, and I backed out quickly.&lt;/p&gt;

&lt;p&gt;Then I explored mini PCs, which are more like ordinary computers and could handle my applications really easily. Think of it as looking for a computer you could use yourself, except it just hosts stuff and sits somewhere in a closet at home, without being a fire risk or a thing you need to worry about keeping cool.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnn8wp50tdaig5n7b04lu.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnn8wp50tdaig5n7b04lu.jpg" alt="Image description" width="800" height="327"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I bought an “HP EliteDesk Mini”, which would be a decent computer even as my desktop. It has 16GB of memory, a quad-core i7 CPU, and a 512GB SSD. I think I bought it for under $150 on Amazon. You can go fancier with a much beefier machine for a few hundred dollars if you’re more serious about this. I’m thinking of buying another (same machine) and stacking them.&lt;/p&gt;

&lt;p&gt;The footprint of this machine is super small. It’s tucked under my Verizon router in a closet, makes almost zero noise, and barely heats up. I’m sure if I found an ARM version of this thing (or a Raspberry Pi) I could go smaller with almost no heat, but I’ve never seen these overheat.&lt;/p&gt;

&lt;p&gt;Whether this machine was a good or bad hardware decision is debatable, but I’m really happy a few years in.&lt;/p&gt;

&lt;h2&gt;
  
  
  Software: Ubuntu &amp;amp; Docker
&lt;/h2&gt;

&lt;p&gt;The first thing I did was wipe it and install Ubuntu (LTS): an almost bare-bones Ubuntu, with Docker installed right away.&lt;/p&gt;
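
&lt;p&gt;For reference, the typical first steps on a fresh Ubuntu LTS install look something like this (&lt;code&gt;get.docker.com&lt;/code&gt; is Docker’s official convenience installer; inspect the script before running it):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# update the base system
sudo apt update
sudo apt upgrade -y

# install Docker via the convenience script, then allow the
# current user to run docker without sudo
curl -fsSL https://get.docker.com -o get-docker.sh
sh get-docker.sh
sudo usermod -aG docker "$USER"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;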

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4w25fq6dvcgrolzq2fpy.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4w25fq6dvcgrolzq2fpy.jpeg" alt="Image description" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I had nginx and PHP on it for some early play with a WordPress blog (not this one), but then abandoned it.&lt;/p&gt;

&lt;p&gt;I run almost everything in Docker (more on this below).&lt;/p&gt;

&lt;p&gt;I try to update &amp;amp; upgrade ubuntu once a year. Nothing else.&lt;/p&gt;

&lt;h2&gt;
  
  
  Access: Cloudflare Zero Trust
&lt;/h2&gt;

&lt;p&gt;The machine itself is completely closed to direct internet access. Its iptables rules don’t allow connections even from the local network (except the SSH port, which accepts the local network IP range).&lt;/p&gt;
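
&lt;p&gt;As a rough illustration of that firewall posture (the LAN range below is an example): allow loopback, established traffic, and SSH from the local network only, then default-deny everything else:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# allow loopback and already-established connections
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
# allow SSH only from the local network range
iptables -A INPUT -p tcp --dport 22 -s 192.168.1.0/24 -j ACCEPT
# drop everything else (set the policy last to avoid locking yourself out)
iptables -P INPUT DROP
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;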

&lt;p&gt;Traditionally, the machine would need to open ports to the outside, then you’d set up router port forwarding and all the public/static IP stuff. More than two decades ago I did that with a static IP from my ISP. Man, all the hassle…&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjghpvr9r8jrasuwdbth2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjghpvr9r8jrasuwdbth2.jpg" alt="Image description" width="800" height="369"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;None of that is necessary anymore. I use Cloudflare Access and Tunnels, which keep an agent always running on the server. From remote configuration, I can listen on any internal port (without opening it up) and forward it directly to a subdomain of my domains. This shortcuts the DNS work for me too. On top of that, most of my private apps run on subdomains that are protected by Cloudflare Zero Trust access (only me). I love this Cloudflare feature because it solves 2-3 problems at once for me.&lt;/p&gt;
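
&lt;p&gt;In case it helps, a minimal Cloudflare Tunnel setup looks roughly like this (the tunnel name, hostname, and port are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;cloudflared tunnel login
cloudflared tunnel create homelab
cloudflared tunnel route dns homelab app.example.com

# ~/.cloudflared/config.yml maps public hostnames to local ports:
#   tunnel: homelab
#   credentials-file: /root/.cloudflared/TUNNEL_ID.json
#   ingress:
#     - hostname: app.example.com
#       service: http://localhost:8200
#     - service: http_status:404

cloudflared tunnel run homelab
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;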

&lt;p&gt;One might wonder: what happens if Cloudflare has an outage and their Zero Trust tools stop working? Does that suddenly open my apps to the public? No, because my apps are not open to the public in the first place. The Zero Trust tunnel has to work in order to expose them, and if Zero Trust authentication is down, the subdomain will also not be accessible (because it’s proxied through a Zero Trust “application” record).&lt;/p&gt;

&lt;p&gt;Worst-case scenario, I lose access to my private apps from outside. Even then, I can SSH into my server and create a tunnel, port-forwarding to the specific port the app is running on.&lt;/p&gt;
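
&lt;p&gt;That fallback is a plain SSH local port forward, for example (host and port are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# expose the server's private port 8200 as localhost:8200 on my laptop
ssh -N -L 8200:localhost:8200 me@my-server
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;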

&lt;p&gt;On a normal day, I simply join the Zero Trust network using Cloudflare’s desktop app, WARP, which replaces a VPN for me.&lt;/p&gt;

&lt;p&gt;All things considered, I’m sure there are still holes and room for paranoia in this plan. You can go the more traditional route, which is no different from hosting this instance on DigitalOcean or AWS, and replicate whatever you think is “more secure”. But I’m pretty happy with the baseline Cloudflare brings, and it takes a few tasks off my plate (like running a reverse proxy for all my apps).&lt;/p&gt;

&lt;h2&gt;
  
  
  Deploy apps: Portainer + Gitops
&lt;/h2&gt;

&lt;p&gt;I use Portainer both to set up deployments and to manage my containers. Portainer is essentially a nice UI over your Docker command-line tools. Where it shines is its gitops integration, which hooks into GitHub via webhooks: when I push any change to one of my app repos (which all have a &lt;code&gt;docker-compose.yml&lt;/code&gt; describing their infra and application configuration), the app gets redeployed by Portainer. This makes spinning up a new app, or an open-source tool, on my server a breeze.&lt;/p&gt;

&lt;p&gt;I covered Portainer and its gitops integration in this article:&lt;br&gt;
&lt;a href="https://mfyz.com/portainer-gitops-a-simple-way-to-deploy-and-manage-your-self-hosted-applications/" rel="noopener noreferrer"&gt;Portainer + gitops ❤️: A simple way to deploy and manage your self-hosted applications&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;This post was first published on my blog: &lt;a href="https://mfyz.com/hosting-my-hobby-projects-from-cheap-hp-mini-desktop-from-my-closet-verizon-fios/" rel="noopener noreferrer"&gt;https://mfyz.com/hosting-my-hobby-projects-from-cheap-hp-mini-desktop-from-my-closet-verizon-fios/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>selfhosting</category>
      <category>docker</category>
      <category>ubuntu</category>
      <category>portainer</category>
    </item>
    <item>
      <title>Monitor everything with Healthchecks.io</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Wed, 23 Oct 2024 18:16:28 +0000</pubDate>
      <link>https://forem.com/mfyz/monitor-everything-with-healthchecksio-7ie</link>
      <guid>https://forem.com/mfyz/monitor-everything-with-healthchecksio-7ie</guid>
      <description>&lt;h3&gt;
  
  
  Why monitor?
&lt;/h3&gt;

&lt;p&gt;Monitoring is essential for ensuring the reliability and performance of your applications. It's like having a watchful eye on your systems, allowing you to proactively identify and address issues before they impact your users. Imagine you’re the pilot of a plane in the air: monitoring means having a set of instruments that instantly tell you when things are not going right. Not having the right signals could mean death.&lt;/p&gt;

&lt;p&gt;There are various ways you can externally monitor services, but at least half of the stuff I manage consists of private, non-public apps, services, and scripts. The best way to monitor them is the &lt;a href="https://en.wikipedia.org/wiki/Dead_man's_switch" rel="noopener noreferrer"&gt;dead man's switch&lt;/a&gt; approach: detecting a failure when a process becomes non-operational (for whatever reason). Each service is expected to “report” that it is still functioning within an expected timeframe (you can determine what that means in your business logic).&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;a href="http://Healthchecks.io" rel="noopener noreferrer"&gt;Healthchecks.io&lt;/a&gt;
&lt;/h2&gt;

&lt;p&gt;I use &lt;a href="http://healthchecks.io" rel="noopener noreferrer"&gt;healthchecks.io&lt;/a&gt;, an open-source monitoring tool, for this.&lt;/p&gt;

&lt;p&gt;I monitor my servers, containers, APIs, backups, scripts, you name it. Although I use other monitoring tools for service availability, &lt;a href="http://healthchecks.io" rel="noopener noreferrer"&gt;healthchecks.io&lt;/a&gt; is my go-to when I want to keep an eye on a service’s operational availability.&lt;/p&gt;

&lt;p&gt;Here is a screenshot of my checks for random things I need to keep an eye on:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7k3e5p8yp6zsu2lu29v.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7k3e5p8yp6zsu2lu29v.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For each check, you can see the recent history of the pings, and you can see the settings. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnrt7mesh6ozyv1dajyod.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnrt7mesh6ozyv1dajyod.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up a check
&lt;/h2&gt;

&lt;p&gt;You create a new “check” by giving it little more than a name and picking the timeframe in which a ping is expected (from the source). You can give checks a description (I often note where the ping originates, like &lt;code&gt;mfyz server → crontab → check-daily-backups.sh&lt;/code&gt;), and you can add tags, which help organize checks.&lt;/p&gt;

&lt;h3&gt;
  
  
  Scheduling
&lt;/h3&gt;

&lt;p&gt;Use the simple visual scheduling and grace-period configuration, or enter a crontab expression (which you can also generate easily with many online tools, from verbal expressions like “every Monday, 1 am”).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftsibqyavix7dzala6g0t.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftsibqyavix7dzala6g0t.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Preview the schedule:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs45bgy4smd58x3k9brdb.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs45bgy4smd58x3k9brdb.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Each check essentially gets a unique long ID and a ping URL. It’s a simple curl/HTTP GET request to that URL that you can make in any way you want. Each ping marks the check “healthy” until &lt;a href="http://healthchecks.io" rel="noopener noreferrer"&gt;healthchecks.io&lt;/a&gt; does NOT receive a ping within the timeframe it’s configured for.&lt;/p&gt;
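
&lt;p&gt;For example, a cron job can run the real work and ping the check only on success (the UUID below is a placeholder; &lt;code&gt;hc-ping.com&lt;/code&gt; is healthchecks.io’s cloud ping endpoint):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# crontab entry: nightly backup at 1am, ping only if it succeeded
0 1 * * * /usr/local/bin/backup.sh &amp;amp;&amp;amp; curl -fsS -m 10 --retry 5 -o /dev/null https://hc-ping.com/your-check-uuid
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;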

&lt;p&gt;&lt;a href="http://Healthcheck.io" rel="noopener noreferrer"&gt;Healthcheck.io&lt;/a&gt; also gives copy/paste snippets for various languages in the check detail page:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzwqtotw1ztjqcn3ycig6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzwqtotw1ztjqcn3ycig6.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Integrations and Extensibility
&lt;/h2&gt;

&lt;p&gt;I love healthchecks.io because it’s a very simple tool that doesn’t have 50 screens of settings, but a well-designed, well-thought-out service that provides rich ways to configure and interact with it. A few points I find really useful:&lt;/p&gt;

&lt;h3&gt;
  
  
  Integrations with almost anything I need
&lt;/h3&gt;

&lt;p&gt;Integrations are mostly notification channels, and most take just a few clicks to configure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0br4wibvn7duoqnbf5pf.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0br4wibvn7duoqnbf5pf.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  REST Management API
&lt;/h3&gt;

&lt;p&gt;Pretty much any object and any action in the UI can be managed via their REST API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://healthchecks.io/docs/api/" rel="noopener noreferrer"&gt;https://healthchecks.io/docs/api/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Badges
&lt;/h3&gt;

&lt;p&gt;To slap status badges easily in random places (dashboards and such):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftzmq0317fkso42d9ogpy.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftzmq0317fkso42d9ogpy.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Self-hosted vs Cloud/SaaS Healthchecks.io
&lt;/h2&gt;

&lt;p&gt;Healthchecks.io is an open-source tool. You can easily host your own instance using its &lt;a href="https://github.com/healthchecks/healthchecks/blob/master/docker/docker-compose.yml" rel="noopener noreferrer"&gt;docker-compose.yml&lt;/a&gt; file. They also have a cloud/SaaS version with a fair free/hobby plan that should be more than enough to track your microservices, servers, cron jobs, etc. for your small side projects. I use their cloud version, but I can move to my own self-hosted instance any time I want.&lt;/p&gt;

&lt;p&gt;If I ever grow my use of healthchecks, I find their pricing plans really reasonable, and I’d most likely stick with their cloud version and be happy to pay:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa38hpju7xabl0h4a5wxp.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa38hpju7xabl0h4a5wxp.jpg" alt="Image description" width="800" height="663"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I’ll suggest an alternative below, but first I want to say that even though there are richer and more powerful services, I prefer to keep my monitoring decoupled and have more control over it. For example: I have an end-to-end script that browses a few pages and checks things using Playwright, and I keep it plain and simple on purpose. A bash script runs the Playwright test and pings a check on &lt;a href="http://healthchecks.io" rel="noopener noreferrer"&gt;healthchecks.io&lt;/a&gt; when it passes. That monitor is simple enough that I can move it to any other service, or move the test to run somewhere else, without worrying about vendor lock-in. If I had used a service like Checkly instead, it would have locked me in right away. In short, I like how healthchecks.io enables this decoupled model and plays the role of a central reporting engine for my monitors.&lt;/p&gt;
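
&lt;p&gt;A wrapper in that spirit is only a few lines (the script invocation and UUID below are placeholders; appending &lt;code&gt;/fail&lt;/code&gt; to a ping URL is healthchecks.io’s way to report an explicit failure):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/sh
# run the Playwright suite, then report success or failure
if npx playwright test; then
  curl -fsS -m 10 --retry 3 -o /dev/null https://hc-ping.com/your-check-uuid
else
  curl -fsS -m 10 --retry 3 -o /dev/null https://hc-ping.com/your-check-uuid/fail
fi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;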

&lt;h2&gt;
  
  
  Best next alternative: Uptime kuma
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0a4gn5vrcnofzbt5wouc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0a4gn5vrcnofzbt5wouc.jpg" alt="Image description" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want a more robust monitoring tool with more ways to monitor your services and websites, beyond the dead man’s switch method, check out &lt;a href="https://github.com/louislam/uptime-kuma" rel="noopener noreferrer"&gt;uptime kuma&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;It has almost all features &lt;a href="http://healthchecks.io" rel="noopener noreferrer"&gt;healthchecks.io&lt;/a&gt; has, plus a few more:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Status pages&lt;/li&gt;
&lt;li&gt;More check methods like DNS record, ping, docker container, SSL cert&lt;/li&gt;
&lt;li&gt;Its web UI looks more like traditional monitoring tools, with uptime history, response time, etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Run via Docker:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -d --restart=always -p 3001:3001 -v uptime-kuma:/app/data --name uptime-kuma louislam/uptime-kuma:1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Check out the project page: &lt;a href="https://github.com/louislam/uptime-kuma" rel="noopener noreferrer"&gt;https://github.com/louislam/uptime-kuma&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;This article was originally posted on my blog: &lt;a href="https://mfyz.com/monitor-everything-with-healthchecks-io/" rel="noopener noreferrer"&gt;https://mfyz.com/monitor-everything-with-healthchecks-io/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>monitoring</category>
      <category>opensource</category>
      <category>cronjob</category>
      <category>selfhosted</category>
    </item>
    <item>
      <title>A quick way to tweak CDN/Edge TTL to radically improve site performance (and SEO)</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Wed, 16 Oct 2024 12:49:42 +0000</pubDate>
      <link>https://forem.com/mfyz/a-quick-way-to-tweak-cdnedge-ttl-to-radically-improve-site-performance-and-seo-50fe</link>
      <guid>https://forem.com/mfyz/a-quick-way-to-tweak-cdnedge-ttl-to-radically-improve-site-performance-and-seo-50fe</guid>
      <description>&lt;p&gt;I want to talk about a quick tweak you can do in your CDN TTL settings to radically improve your site's performance. Direct impact on Time-To-First-Byte (TTFB) metric, but as a halo effect, pretty much every other Web Vital.&lt;/p&gt;

&lt;p&gt;You can do this in any CDN, since TTL customization is a pretty standard need and most CDN providers make it easy to create caching rules for various configurations.&lt;/p&gt;

&lt;p&gt;I use Cloudflare for my blog's CDN layer. Cloudflare already comes with nice defaults for optimizing the delivery of static assets like images, JavaScript, and CSS files. But for HTML documents, CDNs use cache-control headers to determine how to cache and how long to cache. Applications return this header; it's a way for the application (origin) to tell CDNs how to behave on certain pages. But in this optimization method, we'll simply override all (or most) of our pages to be highly cached and served from the cache while revalidating in the background.&lt;/p&gt;

&lt;p&gt;The way this works is that the CDN always serves the "last" cached HTML to the reader (or crawler) from the edge network, really fast (in some cases double-digit milliseconds), and triggers a request to the origin server to get the "latest" version. Most applications also return proper response codes (like 304 Not Modified) when the content hasn't changed since the timestamp the CDN asks about.&lt;/p&gt;
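&lt;p&gt;You can reproduce this revalidation handshake yourself with a conditional request; a small sketch (the URL and date are placeholders):&lt;/p&gt;

```shell
# Ask a server whether a page changed since a timestamp, the same way a CDN
# revalidates its cached copy. Prints the HTTP status code: 200 means fresh
# content was returned, 304 means "not modified".
revalidate() {
  url=$1
  since=$2
  curl -s -o /dev/null -w '%{http_code}' -H "If-Modified-Since: $since" "$url"
}

# Example (placeholder URL and date):
# revalidate https://example.com/ 'Tue, 01 Oct 2024 00:00:00 GMT'
```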

&lt;h2&gt;
  
  
  How to configure custom TTL in Cloudflare
&lt;/h2&gt;

&lt;p&gt;To set up custom edge TTL in Cloudflare, navigate to your site page, Caching &amp;gt; Cache Rules page.&lt;/p&gt;

&lt;p&gt;Create a new rule, give it a name, and then set up the request path configuration.&lt;/p&gt;

&lt;p&gt;You can set multiple expressions, and exclude patterns that you know are Admin, or Rest API, or other URLs that should NOT be cached long. I use WordPress for my blog and I exclude paths containing things like wp-admin, wp-json, cron…&lt;/p&gt;

&lt;p&gt;Then select "Ignore cache-control header and use this TTL" in the Edge TTL section. Finally, select how long you want to cache. Longer is better, because a longer TTL means most of your site's content, including long-tail content that doesn't get consistent traffic, will also be cached at the edge. I started with 1 day, then 1 week, then tried 1 month, but some pages got stuck in the cache too long, so I dialed it back to 1 week as my sweet spot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm9qwxu6p8nnkttt0j15t.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm9qwxu6p8nnkttt0j15t.jpg" alt="Image description" width="800" height="947"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Even if you're not using Cloudflare, I'm sure there is an equivalent of this in your CDN provider.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is the impact on page speed?
&lt;/h2&gt;

&lt;p&gt;After the change, I saw a big drop (around a 90% reduction) in my server's load. It meant the CDN was doing what it was supposed to do. One of the positive side effects of higher cache offload to the CDN is being able to scale to higher traffic without needing powerful hosting resources.&lt;/p&gt;

&lt;p&gt;My Time-To-First-Byte decreased (improved) by 70%, going from just shy of 500ms down to the 100–160ms range 🤯&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F73fjwtvecbs9hem3b0dk.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F73fjwtvecbs9hem3b0dk.jpg" alt="Image description" width="800" height="757"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;More importantly, the real user experience on the page became even more mind-blowing because things became super snappy. Click click click, bam bam bam, nothing was in a visible loading state anymore. Even if the metrics hadn't moved, I would be super happy with this aspect of the change.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdxtg0tjgy189qvxfgjd.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpdxtg0tjgy189qvxfgjd.gif" alt="Image description" width="480" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;🤯🤯🤩&lt;/p&gt;

&lt;p&gt;I got my Cloudflare Web Analytics email and noticed almost all Web Vitals moved positively, with at least a 30% improvement.&lt;/p&gt;

&lt;p&gt;I wasn't expecting other Web Vitals like CLS and LCP to be directly impacted (or impacted as much as they were). But it makes sense: when assets load much faster, the "wait time" (or blocking time) goes down, and therefore layout shift and the largest paint go down too.&lt;/p&gt;

&lt;h2&gt;
  
  
  SEO Impact
&lt;/h2&gt;

&lt;p&gt;It's a well-known fact that Google takes your "core web vitals" into account when determining your ranking in search results. This change has more impact on crawlers than you might think, because most of the time, crawlers' requests are the ones that hit "cache cold" pages. Google (or another search engine) reads your site far more holistically than your real users do. Think of every single article you wrote: no user reads every single one of them - Google does 🙂 (and does it regularly). When a crawler visits a page that nobody has read in a long time, its request is more likely to be a cache miss than a cache hit, so it will "wait" longer for your web server to render the page.&lt;br&gt;
When you put yourself in the crawler's shoes, imagine trying to read 10,000 articles/pages on a site over a day or two (maybe it takes longer, who knows…). Now consider what percentage of those pages will have to be rendered at the origin versus served from the CDN cache. The more pages Google sees as "slow", the more it will think your whole site is slow.&lt;/p&gt;

&lt;p&gt;This is where the real value of super-long TTLs comes in, especially if you combine them with stale-while-revalidate, which most CDNs apply automatically (if not, I'm sure there is a setting that lets you enable these together). Stale-while-revalidate with a super-long TTL (like 7 days or more) basically creates an "always cached" loop. With that, your crawler traffic gets served from the cache (at the cost/risk of "stale content", which is OK in the vast majority of use cases), directly increasing your site's overall speed score and, therefore, your SEO scores.&lt;/p&gt;
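&lt;p&gt;For reference, an origin can ask for stale-while-revalidate behavior itself with a Cache-Control header along these lines (the values are illustrative; 604800 seconds is 7 days):&lt;/p&gt;

```text
Cache-Control: public, s-maxage=604800, stale-while-revalidate=86400
```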

&lt;h2&gt;
  
  
  Content Freshness
&lt;/h2&gt;

&lt;p&gt;There is one caveat though: content freshness. When you bump the edge TTL up to multi-day values like I did, you need to make sure your CMS/site is nicely integrated with your CDN's cache-clearing system for when you make updates. Two scenarios:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You update existing content (like fixing a typo, or changing the cover image of a post); the change should be reflected on the content's detail page right away.&lt;/li&gt;
&lt;li&gt;You publish new content, and the new content is supposed to show up in common places like your homepage.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can use your CDN's cache-clear UI or APIs to trigger a "purge" on the URLs you think are impacted (homepage, section pages, etc.), or highly visible pages like the homepage can be configured with a lower TTL in a separate cache rule set.&lt;/p&gt;

&lt;p&gt;I use WordPress for my content management system, and Cloudflare has a WordPress plugin that listens to publish/update hooks to trigger these cache purges nicely.&lt;/p&gt;

&lt;p&gt;Another way to think about this is to find the balance: how much "staleness" can you tolerate on a page? For example, another article's detail page might not show your most recent article in its "recent articles" or "related articles" sections for a while. As long as that delay is something you can afford, cache longer to achieve better site/page performance.&lt;/p&gt;




&lt;p&gt;This post was first published on my personal blog: &lt;a href="https://mfyz.com/a-quick-way-to-tweak-cdn-edge-ttl-to-radically-improve-site-performance-and-seo/" rel="noopener noreferrer"&gt;https://mfyz.com/a-quick-way-to-tweak-cdn-edge-ttl-to-radically-improve-site-performance-and-seo/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloudflare</category>
      <category>cdn</category>
      <category>webvitals</category>
      <category>seo</category>
    </item>
    <item>
      <title>Portainer + gitops ❤️: A simple way to deploy and manage your self-hosted applications</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Tue, 08 Oct 2024 12:59:13 +0000</pubDate>
      <link>https://forem.com/mfyz/portainer-gitops-a-simple-way-to-deploy-and-manage-your-self-hosted-applications-pkk</link>
      <guid>https://forem.com/mfyz/portainer-gitops-a-simple-way-to-deploy-and-manage-your-self-hosted-applications-pkk</guid>
      <description>&lt;p&gt;Self-hosting became a much easier and more viable option using docker. You don’t need to understand the source code or have no intent to customize stuff. Setting things that you are not familiar with made open source applications require their own experts.&lt;/p&gt;

&lt;p&gt;Docker made all of this almost like installing an app on your computer from a binary. In fact, I never installed Redis directly on my computer before, yet, I have half a dozen apps that have their own Redis instances humming happily on my server and I have zero concerns about how to set it up in case I need it directly on my projects.&lt;/p&gt;

&lt;p&gt;I’m going to give you my personal go-to way of hosting my applications: simple Node.js and PHP applications, WordPress sites, and many open-source tools (that are also SaaS, but I choose to host my own instance).&lt;/p&gt;

&lt;h1&gt;
  
  
  Portainer: My Container Management Maestro
&lt;/h1&gt;

&lt;p&gt;Portainer acts as my central command center for all things containerized. This handy tool lets me build, deploy, and manage both individual containers and entire stacks. Did I mention it runs as a lightweight container itself? Here’s a peek at my streamlined docker-compose.yml for Portainer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: "3"
services:
  portainer:
    image: portainer/portainer-ce:latest
    restart: unless-stopped
    ports:
      - 9000:9443
    volumes:
      - data:/data
      - /var/run/docker.sock:/var/run/docker.sock
volumes:
  data:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Web UI
&lt;/h1&gt;

&lt;p&gt;Portainer’s web UI has pretty much everything you need to see and “do” for your stacks and containers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg1rcusm74ncywpx48ho6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg1rcusm74ncywpx48ho6.jpeg" alt="Image description" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1gar5c9afg06hjmj8cut.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1gar5c9afg06hjmj8cut.jpeg" alt="Image description" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Secure Access
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2hhi9pblrypz7fkhmq8c.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2hhi9pblrypz7fkhmq8c.jpeg" alt="Image description" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I use Cloudflare Zero Trust to expose my Portainer (and other private apps). It’s as simple as pointing a subdomain to a port using a tunnel, then telling Zero Trust that any network request to that subdomain requires an authenticated session.&lt;/p&gt;

&lt;h1&gt;
  
  
  Portainer Gitops
&lt;/h1&gt;

&lt;p&gt;Let’s get to the juicy part, the magic factor for portainer: gitops integration. It’s not rocket science, but it’s the most important “need” when hosting your own apps.&lt;/p&gt;

&lt;p&gt;It certainly is if you are managing code: templates, extensions, plugins, or basic things like configuration files for an app’s server environment (*sql, redis, node, php, nginx).&lt;/p&gt;

&lt;p&gt;This also brings your simple projects closer to “Infrastructure as Code” practices, without going through complex AWS, Azure, or other IaC models.&lt;/p&gt;

&lt;p&gt;Assuming you keep them in a VCS, preferably GitHub, you treat your git flows (i.e. the merge of a PR to a certain branch) as the main triggers for your deployments.&lt;/p&gt;

&lt;p&gt;Portainer comes with native gitops integration through both webhooks and polling (polling is not recommended, but can be used as a backup method). When there is a push to a branch you define, Portainer re-runs your stack, builds your images if needed, then restarts your containers with the changes. 🎩&lt;/p&gt;
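&lt;p&gt;A stack's webhook is just an HTTP endpoint, so your CI (or a manual command) can hit it too; a sketch, where the host and UUID are placeholders copied from the stack's webhook panel:&lt;/p&gt;

```shell
# Trigger a Portainer stack re-pull/redeploy by POSTing to its webhook URL.
trigger_redeploy() {
  # -f makes curl fail on HTTP error statuses, so we only report success
  if curl -fsS -X POST -o /dev/null "$1"; then
    echo "redeploy triggered"
  fi
}

# Example (placeholder host and webhook UUID):
# trigger_redeploy "https://portainer.example.com/api/stacks/webhooks/your-webhook-uuid"
```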

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5dtvv2kxntsrdc79dab3.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5dtvv2kxntsrdc79dab3.jpeg" alt="Image description" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Portainer is open source (zlib license) and free with the Community Edition. It also has more advanced features under its Business license. I found a few areas where I wished I had those advanced features, but so far none of them have become “blockers” in my use cases. I imagine using Portainer for more team-wide/company-wide use cases may require the Business license.&lt;/p&gt;

&lt;p&gt;At some point, I wanted to find a truly open-source, non-profit alternative to Portainer, and there are a few, but Portainer and its gitops integration are a good enough combination that I didn’t want to bother replacing it.&lt;/p&gt;

&lt;p&gt;Check it out &lt;a href="https://www.portainer.io/" rel="noopener noreferrer"&gt;https://www.portainer.io/&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;This article was first published on my blog: &lt;a href="https://mfyz.com/portainer-gitops-a-simple-way-to-deploy-and-manage-your-self-hosted-applications/" rel="noopener noreferrer"&gt;https://mfyz.com/portainer-gitops-a-simple-way-to-deploy-and-manage-your-self-hosted-applications/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>docker</category>
      <category>selfhost</category>
      <category>gitops</category>
      <category>portainer</category>
    </item>
    <item>
      <title>Animating systems diagrams with draw.io</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Wed, 02 Oct 2024 12:30:03 +0000</pubDate>
      <link>https://forem.com/mfyz/animating-systems-diagrams-with-drawio-1oka</link>
      <guid>https://forem.com/mfyz/animating-systems-diagrams-with-drawio-1oka</guid>
      <description>&lt;p&gt;Diagrams are fantastic tools for explaining complex systems. They act as a visual map, allowing viewers to grasp how various components interact and connect in a single glance.&lt;/p&gt;

&lt;p&gt;However, static diagrams sometimes lack the ability to convey intricate processes fully. Imagine presenting a complex system diagram - you can narrate, draw focus to specific elements, and guide your audience through the visual representation. Yet one of the primary goals of diagrams is to provide a clear explanation without needing constant narration.&lt;/p&gt;

&lt;p&gt;Animating diagrams, particularly the flow between different components, bridges the gap between static images and detailed narration. Animations act as a visual guide, directing viewers' attention and highlighting the order of operations within the system.&lt;br&gt;
Imagine seeing a building's piping system statically on paper (which is still an amazing way to give information about, well, the piping in a building).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6m2sfilhzo8ui6ixd9oh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6m2sfilhzo8ui6ixd9oh.png" alt="Image description" width="584" height="582"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;But imagine the drawing animates and shows the direction of flow in the water or sewer system. Increasing the information signal with animations amplifies the impact of the system diagram.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6cmz0l8mxqcp26uf143d.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6cmz0l8mxqcp26uf143d.gif" alt="Image description" width="584" height="582"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I recently ended up using draw.io, which has an easy way to do this.&lt;br&gt;
Most of the time, the connections between a diagram's components are the pieces we want to animate.&lt;/p&gt;

&lt;p&gt;I looked for this in a lot of other tools I've used in the past: proper diagramming tools like Lucidchart, Mermaid, Excalidraw, and Miro; Figma/FigJam; and general presentation apps like Keynote and PowerPoint. None of them have easy animation options.&lt;/p&gt;

&lt;p&gt;The way I do easy animations is to take my existing drawings, import them as static JPEG/PNG or SVG into draw.io, then draw a connector line over each existing line, give it a white color, make it thick, and check its "animation" option. This overlays animated white dashes on my existing connector lines, making them appear animated.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq4i976qodpveqk0tbcf2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq4i976qodpveqk0tbcf2.png" alt="Image description" width="478" height="1040"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Beyond animation, I like draw.io because it's a fantastic drawing tool. It's open source, free to use and contribute to, and works cross-platform. It has a lot of drawing elements, and you can import your own PNG and SVG icons into your library.&lt;/p&gt;

&lt;p&gt;You can install draw.io on your desktop (on a Mac, using Homebrew, it's as simple as brew install --cask drawio), use the online version for free, or sign up for cloud plans to collaborate with your team.&lt;/p&gt;




&lt;p&gt;This post was originally published at my blog: &lt;a href="https://mfyz.com/animating-systems-diagrams-with-draw-io/" rel="noopener noreferrer"&gt;https://mfyz.com/animating-systems-diagrams-with-draw-io/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>animation</category>
      <category>drawio</category>
      <category>drawing</category>
      <category>diagram</category>
    </item>
    <item>
      <title>WordPress Headless + CPT + ACF: Building a Flexible Content Platform</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Tue, 24 Sep 2024 20:30:29 +0000</pubDate>
      <link>https://forem.com/mfyz/wordpress-headless-cpt-acf-building-a-flexible-content-platform-13oc</link>
      <guid>https://forem.com/mfyz/wordpress-headless-cpt-acf-building-a-flexible-content-platform-13oc</guid>
      <description>&lt;p&gt;This article will guide you through creating a flexible and dynamic content platform using WordPress as a headless CMS, Custom Post Types (CPTs), and Advanced Custom Fields (ACF). Whether you're a seasoned developer or just starting out, this combination offers a powerful foundation for your projects.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Go Headless with WordPress?
&lt;/h2&gt;

&lt;p&gt;Think of WordPress as the brains behind your content, and a headless setup as giving it the freedom to power any front-end you want. This means you can use your favorite framework (React, Vue.js, etc.) to create a beautiful and performant user interface.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdwuaqwzk6o38ecj03a1.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmdwuaqwzk6o38ecj03a1.jpg" alt="Image description" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One of the big benefits of using WordPress headless is removing all front-end concerns from WordPress itself. One of the things I struggled with a lot in the past when working with WordPress is that there is a plugin for everything, and you can easily end up with 20+ plugins bloating your installation, most of them about the front-end website experience. Going headless also separates your editorial needs from your developer team's needs, letting your developers optimize and deploy your website more independently, without risking editorial mishaps.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting Up Your Local Test Environment
&lt;/h2&gt;

&lt;p&gt;Before we dive into the fun stuff, let's set up a playground. Here's what you'll need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;WordPress Installation / Local Server: Use Docker for a streamlined setup. Check out this docker compose I wrote a few years back; it should still be a good place to start: &lt;a href="https://github.com/mfyz/wordpress-docker-compose" rel="noopener noreferrer"&gt;https://github.com/mfyz/wordpress-docker-compose&lt;/a&gt;, or I'm sure you can quickly find a recent, working example.&lt;/li&gt;
&lt;li&gt;Headless Framework: Consider Next.js for a React-based frontend. You can find a sample project I played with it here: &lt;a href="https://github.com/mfyz/next-wp" rel="noopener noreferrer"&gt;https://github.com/mfyz/next-wp&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Unleashing the Power of WP-JSON
&lt;/h2&gt;

&lt;p&gt;WordPress's REST API, accessible through wp-json, is your gateway to interacting with your content programmatically. Let's explore it using Postman.&lt;/p&gt;
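&lt;p&gt;As a quick sketch, fetching content is a plain GET against the wp/v2 namespace (the host is a placeholder for your own installation):&lt;/p&gt;

```shell
# Fetch published content from a WordPress site's REST API as JSON.
wp_get() {
  site=$1
  resource=$2
  curl -s "$site/wp-json/wp/v2/$resource"
}

# Examples (placeholder host):
# wp_get https://example.com "posts?per_page=3"
# wp_get https://example.com pages
```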

&lt;h2&gt;
  
  
  Exploring the WP-JSON Endpoint with Postman
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fipunqd5tnvot1lfzoei0.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fipunqd5tnvot1lfzoei0.jpg" alt="Image description" width="800" height="522"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Postman is a fantastic tool for testing APIs. Here's how to utilize it for exploring the WordPress REST API:&lt;/p&gt;

&lt;p&gt;Import a Postman collection: import the pre-built WordPress Postman collection to get started quickly; it provides pre-configured requests for interacting with various WordPress resources.&lt;br&gt;
Test requests: send GET requests to retrieve post types, pages, and custom fields, and explore the available endpoints and data structures.&lt;/p&gt;

&lt;h2&gt;
  
  
  Customizing Your Content types with Custom Post Types (CPT)
&lt;/h2&gt;

&lt;p&gt;WordPress offers you the flexibility to create custom post types beyond the standard posts and pages. Think of these as building blocks for your unique content structure (Imagine unique content types, like: recipes, books, hardware, people, places…).&lt;/p&gt;

&lt;p&gt;&lt;u&gt;Custom Post Type UI&lt;/u&gt; is a user-friendly plugin that allows you to easily create, manage, and customize custom post types directly within your WordPress admin panel. It eliminates the need for manual coding, making CPT creation accessible to users of all skill levels.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxyy3ty38q0pqu85mz3vn.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxyy3ty38q0pqu85mz3vn.jpg" alt="Image description" width="800" height="609"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Advanced Custom Fields with ACF
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5d45coj5zw6pa8ng6apt.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5d45coj5zw6pa8ng6apt.jpg" alt="Image description" width="800" height="503"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Advanced Custom Fields (ACF) is a game-changer for content management. It lets you create custom fields for your custom post types, making them more flexible and dynamic. Think of it like building blocks for your content.&lt;/p&gt;

&lt;p&gt;Here's what you can achieve with ACF:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvx93j6587mckft3pesov.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvx93j6587mckft3pesov.jpg" alt="Image description" width="800" height="547"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create Flexible Layouts: Design complex page layouts with varied content formats using ACF fields.&lt;/li&gt;
&lt;li&gt;Simplify Content Creation: Provide editors with user-friendly interfaces for adding and managing content, even for complex data structures.&lt;/li&gt;
&lt;li&gt;Enhanced Data Management: Store complex data structures efficiently with custom field groups.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here is how your custom fields will look in your pages or posts:&lt;/p&gt;

&lt;p&gt;I find this very intuitive.&lt;/p&gt;

&lt;p&gt;When combined with the CPT UI plugin, it becomes really customizable. CPT UI has additional controls to make the editing experience simpler for custom types (like disabling Gutenberg, disabling the body of the post altogether, and other customizations).&lt;/p&gt;

&lt;p&gt;ACF promotes its Pro plan a lot, but you don't need the Pro version in most cases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Front-end freedom
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5xg8w62m2cym5f48d85l.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5xg8w62m2cym5f48d85l.jpg" alt="Image description" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using WordPress headless empowers your front-end team to choose their favorite front-end framework and push the boundaries of customization and performance for your experiences.&lt;/p&gt;

&lt;p&gt;It can also centralize your content platform for multi-channel digital experiences like websites, mobile apps, and OTT (TV) apps.&lt;/p&gt;

&lt;p&gt;In the summary at the top, I mentioned the Next.js sample I played with a few years back that uses the simple WordPress + CPT UI + ACF combination. You can browse the source code here: &lt;a href="https://github.com/mfyz/next-wp" rel="noopener noreferrer"&gt;https://github.com/mfyz/next-wp&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I hope this article provides a solid foundation for your headless WordPress journey.&lt;/p&gt;

&lt;p&gt;Now go ahead and build something amazing!&lt;/p&gt;




&lt;p&gt;This article was first published on my blog: &lt;a href="https://mfyz.com/wordpress-headless-cpt-acf-building-a-flexible-content-platform/" rel="noopener noreferrer"&gt;https://mfyz.com/wordpress-headless-cpt-acf-building-a-flexible-content-platform/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>headless</category>
      <category>nextjs</category>
      <category>postman</category>
      <category>wordpress</category>
    </item>
    <item>
      <title>Leveling up Data-Driven Product Development game using Posthog</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Tue, 17 Sep 2024 14:14:13 +0000</pubDate>
      <link>https://forem.com/mfyz/leveling-up-data-driven-product-development-game-using-posthog-l1c</link>
      <guid>https://forem.com/mfyz/leveling-up-data-driven-product-development-game-using-posthog-l1c</guid>
      <description>&lt;p&gt;Posthog is an open-source product analytics platform that offers flexibility and control. You can deploy it on your own infrastructure or use the cloud-based option. This gives you the freedom to customize and extend the platform to meet your specific needs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy4vec7g2r6y3f1gn41u.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxy4vec7g2r6y3f1gn41u.jpg" alt="Image description" width="800" height="521"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I've been using Posthog for a while now, and it's quickly become my go-to tool for understanding my users and making data-driven decisions. As an open-source platform, it gives me the flexibility to customize it to fit my exact needs, although in practice I've been using their cloud offering, whose generous free tier has made it my go-to product operating system for projects.&lt;/p&gt;

&lt;h2&gt;
  
  
  Auto capture: The Magic Button
&lt;/h2&gt;

&lt;p&gt;One of the things I appreciate most about Posthog is its auto-capture feature. It's like having a tiny detective following my users around, recording their every click and interaction. This has saved me countless hours of manually setting up tracking events. It also gives pretty good control over what gets auto-captured and what doesn't:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;posthog.init('phc_.........................', {
    api_host: 'https://us.i.posthog.com',
    autocapture: {
        dom_event_allowlist: ['click'],
        url_allowlist: ['posthog.com./docs/.*'],
        url_ignorelist: ['posthog.com./docs/.*/secret-section/.*'],
        element_allowlist: ['button'],
        css_selector_allowlist: ['[ph-autocapture]'],
        element_attribute_ignorelist: ['data-attr-pii="email"'],
    },
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Beyond the basics, Posthog has a ton of cool features that make it a powerhouse. Here are a few of my favorites:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;HogQL:&lt;/strong&gt; Their SQL-like querying language. This is an awesome capability for a data nerd like me. Even though alternatives like Amplitude have similar SQL-ish capabilities, those are almost always reserved for their Enterprise plans, unlike Posthog, which includes it in all plans.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;User funnels:&lt;/strong&gt; I can easily visualize how users flow through my product and identify bottlenecks where they might be dropping off.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Cohort analysis:&lt;/strong&gt; I can segment my users into groups based on their behavior and track their performance over time.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Heatmaps:&lt;/strong&gt; I can see exactly where users are clicking on my website or app, helping me optimize the user experience.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Session recordings:&lt;/strong&gt; I can watch actual recordings of user sessions to see how they're interacting with my product.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Web Analytics:&lt;/strong&gt; A recently added feature for people who struggled to adopt GA4; it automatically tracks pretty simple, old-school web analytics.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
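&lt;p&gt;To show the shape of a HogQL query, here is an illustrative example (the query itself is mine, written against Posthog's built-in events table; "top events over the last week" is just a sample question):&lt;/p&gt;

```sql
-- Illustrative HogQL: top events over the last 7 days,
-- runnable from Posthog's SQL insight editor.
SELECT event, count() AS occurrences
FROM events
WHERE timestamp > now() - INTERVAL 7 DAY
GROUP BY event
ORDER BY occurrences DESC
LIMIT 10
```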

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F16de5z0i5333o8nkx9w3.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F16de5z0i5333o8nkx9w3.jpg" alt="Image description" width="800" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Experimentation features
&lt;/h2&gt;

&lt;p&gt;Posthog also has powerful features for A/B testing and feature flags. This allows me to experiment with different designs and features without affecting all of my users. It's a great way to gather data and make informed decisions about my product's direction.&lt;/p&gt;
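&lt;p&gt;To make the feature-flag idea concrete, here is a minimal sketch of gating a code path behind a flag. The &lt;code&gt;posthog&lt;/code&gt; object below is a stand-in stub and the flag key is made up; in a real app, posthog-js provides &lt;code&gt;isFeatureEnabled&lt;/code&gt; and the flag values come from Posthog's servers:&lt;/p&gt;

```javascript
// Sketch: gating a UI path behind a feature flag.
// `posthog` here is a local stub standing in for the posthog-js client.
const posthog = {
  flags: { 'new-checkout': true }, // hypothetical flag key
  isFeatureEnabled(key) { return Boolean(this.flags[key]); },
};

function renderCheckout() {
  // Flagged users get the new flow; everyone else keeps the old one.
  return posthog.isFeatureEnabled('new-checkout') ? 'new-flow' : 'old-flow';
}

console.log(renderCheckout()); // 'new-flow' with the stubbed flag on
```

Flipping the flag off in the Posthog UI would route all users back through the old path, with no redeploy needed.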

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcrdvyvw3bs935yjdu77y.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcrdvyvw3bs935yjdu77y.jpg" alt="Image description" width="800" height="498"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Surveys: Getting Direct Feedback
&lt;/h2&gt;

&lt;p&gt;One of my favorite things about Posthog is its surveys feature. I can create custom surveys and target specific segments of my user base to get direct feedback on my product. It's a great way to understand my users' needs and pain points.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8e66xz80epe0c1bod72l.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8e66xz80epe0c1bod72l.jpg" alt="Image description" width="800" height="407"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why I Love Posthog
&lt;/h2&gt;

&lt;p&gt;In short, Posthog has helped me level up my product analytics game. It's easy to use, powerful, and customizable. If you're looking for a tool to help you understand your users and make data-driven decisions, I highly recommend giving it a try.&lt;/p&gt;

&lt;p&gt;Their documentation is also some of the best developer documentation I've experienced.&lt;/p&gt;

&lt;p&gt;Check it out: &lt;a href="https://posthog.com/" rel="noopener noreferrer"&gt;https://posthog.com/&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;This post is first published on my personal blog: &lt;a href="https://mfyz.com/leveling-up-data-driven-product-development-game-using-posthog/" rel="noopener noreferrer"&gt;https://mfyz.com/leveling-up-data-driven-product-development-game-using-posthog/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>analytics</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Metrics to pay attention to, when optimizing web page performance</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Tue, 10 Sep 2024 13:45:06 +0000</pubDate>
      <link>https://forem.com/mfyz/metrics-to-pay-attention-to-when-optimizing-web-page-performance-3h6a</link>
      <guid>https://forem.com/mfyz/metrics-to-pay-attention-to-when-optimizing-web-page-performance-3h6a</guid>
      <description>&lt;p&gt;In today's lightning-fast digital landscape, website speed is no longer a luxury - it's a fundamental requirement. Every developer should possess the knowledge to analyze and optimize web page performance for a seamless user experience. After all, a speedy website translates into higher engagement, lower bounce rates, and ultimately, increased conversions.&lt;/p&gt;

&lt;h2&gt;
  
  
  The High Cost of Slow Websites
&lt;/h2&gt;

&lt;p&gt;The detrimental effects of sluggish websites are well-documented by numerous studies:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Upward of 25% of users abandon a site if it takes longer than 4 seconds to load &lt;a href="https://www.getelastic.com/site-performance/" rel="noopener noreferrer"&gt;Akamai Study&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;A 3-second delay can lead to a significant drop in engagement, with 22% fewer page views and a 50% higher bounce rate &lt;a href="https://blog.radware.com/applicationdelivery/applicationaccelerationoptimization/2013/03/free-report-ecommerce-page-speed-web-performance-spring-2013/" rel="noopener noreferrer"&gt;Strangeloop Networks Study&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Conversions take a major hit for websites taking 5 seconds to load, experiencing a decrease of 38% &lt;a href="https://blog.radware.com/applicationdelivery/applicationaccelerationoptimization/2013/03/free-report-ecommerce-page-speed-web-performance-spring-2013/" rel="noopener noreferrer"&gt;Strangeloop Networks Study&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://medium.com/@Pinterest_Engineering/driving-user-growth-with-performance-improvements-cfc50dafadd7" rel="noopener noreferrer"&gt;Pinterest increased search engine traffic and sign-ups by 15%&lt;/a&gt; when they reduced perceived wait times by 40%.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.creativebloq.com/features/how-the-bbc-builds-websites-that-scale" rel="noopener noreferrer"&gt;The BBC found they lost an additional 10% of users&lt;/a&gt; for every extra second their site took to load.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prioritizing Core Web Vitals
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6yzzroru7g1ph7g4xtru.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6yzzroru7g1ph7g4xtru.jpeg" alt="Image description" width="800" height="203"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Forget outdated metrics - Google prioritizes Core Web Vitals for website performance evaluation. These metrics measure real-world user experience and directly impact search engine rankings. Here's a breakdown of the three key Core Web Vitals:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Largest Contentful Paint (LCP):&lt;/strong&gt; This tracks the time it takes for the largest content element to load. Optimize images and preload content to improve LCP (ideally under 2.5 seconds). &lt;a href="https://web.dev/lcp/" rel="noopener noreferrer"&gt;Learn more about LCP&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Interaction to Next Paint (INP):&lt;/strong&gt; This metric measures the user's perceived responsiveness to interactions. Aim for an INP of under 200 milliseconds. &lt;a href="https://web.dev/inp/" rel="noopener noreferrer"&gt;Learn more about INP&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cumulative Layout Shift (CLS):&lt;/strong&gt; This metric assesses how much your page layout shifts as elements load. Use pre-defined dimensions for images and avoid lazy loading critical content to minimize CLS (ideally below a score of 0.1). &lt;a href="https://web.dev/cls/" rel="noopener noreferrer"&gt;Learn more about CLS&lt;/a&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Optimizing for Interactivity
&lt;/h2&gt;

&lt;p&gt;Beyond loading speed, interactivity matters. Here's how to ensure your page feels responsive:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Time to Interactive (TTI):&lt;/strong&gt; This measures the time it takes for your page to become fully interactive. Reduce unnecessary JavaScript and optimize critical rendering paths to achieve a TTI under 3.8 seconds. &lt;a href="https://web.dev/interactive/" rel="noopener noreferrer"&gt;Learn more about TTI&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Total Blocking Time (TBT):&lt;/strong&gt; This metric focuses on how long your main thread is blocked by JavaScript execution. Minimize render-blocking JavaScript and leverage code splitting to keep TBT below 200 milliseconds. &lt;a href="https://web.dev/lighthouse-total-blocking-time/" rel="noopener noreferrer"&gt;Learn more about TBT&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;
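&lt;p&gt;To make the TBT definition concrete, here is a small sketch of how it is derived: any main-thread task longer than 50 ms counts as a long task, and only the portion beyond 50 ms contributes to TBT (the task durations below are made up for illustration):&lt;/p&gt;

```javascript
// Sketch: deriving Total Blocking Time from long-task durations.
// Only the portion of each task beyond the 50 ms budget counts as blocking.
function totalBlockingTime(taskDurationsMs) {
  return taskDurationsMs
    .map((d) => Math.max(0, d - 50))
    .reduce((sum, blocked) => sum + blocked, 0);
}

// 30 ms is under budget; 120 ms contributes 70 ms; 90 ms contributes 40 ms.
console.log(totalBlockingTime([30, 120, 90])); // 110
```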

&lt;h2&gt;
  
  
  Actionable Steps for Improvement
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Leverage a CDN:&lt;/strong&gt; Consider a content delivery network (CDN) to improve content delivery speed for geographically dispersed users. Monitor CDN performance, including cache hit rate and first byte time. Remember to carefully consider the Time-to-Live (TTL) of your content. A longer TTL can improve performance by reducing the number of requests to your origin server, but it can also lead to stale content if not managed properly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Minify and Optimize Resources:&lt;/strong&gt; Reduce file sizes and optimize images for web delivery.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Implement Lazy Loading:&lt;/strong&gt; Load non-critical content below the fold only when the user scrolls down to improve initial page load.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Utilize Browser Caching:&lt;/strong&gt; Enable browser caching for static assets to reduce server requests on subsequent visits.&lt;/li&gt;
&lt;/ul&gt;
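&lt;p&gt;As an illustration of the caching point, a common policy is to cache fingerprinted static assets aggressively while keeping HTML fresh. A sketch (the file patterns and header values are one reasonable policy, not a framework API):&lt;/p&gt;

```javascript
// Sketch: choosing a Cache-Control header per asset type.
// Hashed/fingerprinted static assets are safe to cache "forever";
// HTML should be revalidated so users pick up new deployments.
function cacheControlFor(path) {
  if (/\.(js|css|png|jpg|woff2)$/.test(path)) {
    return 'public, max-age=31536000, immutable'; // 1 year
  }
  return 'no-cache'; // revalidate HTML on every request
}

console.log(cacheControlFor('/app.3f2a1c.js')); // public, max-age=31536000, immutable
console.log(cacheControlFor('/index.html'));    // no-cache
```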

&lt;h2&gt;
  
  
  Other Considerations
&lt;/h2&gt;

&lt;p&gt;While Core Web Vitals and interactivity metrics provide a solid foundation, there are other factors to consider for comprehensive website performance optimization:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Network Performance:&lt;/strong&gt; Although not directly measured by Lighthouse, network response times significantly impact user experience. Tools like Google PageSpeed Insights can help identify network bottlenecks.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Server-Side Optimization:&lt;/strong&gt; Optimizing server response times and resource processing can significantly improve perceived website performance.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Continuous Monitoring and Improvement
&lt;/h2&gt;

&lt;p&gt;Remember, website performance is an ongoing process. Regularly monitor your website's performance metrics using tools like Google PageSpeed Insights and Lighthouse. Continuously analyze and optimize your code, content, and infrastructure to ensure a top-notch user experience.&lt;/p&gt;




&lt;p&gt;This post is first published on my personal blog: &lt;a href="https://mfyz.com/metrics-to-pay-attention-to-when-optimizing-web-page-performance/" rel="noopener noreferrer"&gt;https://mfyz.com/metrics-to-pay-attention-to-when-optimizing-web-page-performance/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webperf</category>
      <category>webvitals</category>
      <category>pagespeed</category>
      <category>seo</category>
    </item>
    <item>
      <title>Creating sequence diagrams using mermaidjs to map out your user journey</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Mon, 23 Jan 2023 13:32:50 +0000</pubDate>
      <link>https://forem.com/mfyz/creating-sequence-diagrams-using-mermaidjs-to-map-out-your-user-journey-3imf</link>
      <guid>https://forem.com/mfyz/creating-sequence-diagrams-using-mermaidjs-to-map-out-your-user-journey-3imf</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbq1rtexzyt4ym6ufv1fa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbq1rtexzyt4ym6ufv1fa.png" alt="Image description" width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I want to talk about an effective product planning process I've been following recently.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why do user journey thinking?
&lt;/h2&gt;

&lt;p&gt;In your product, the single most important thing is to put your users first. Nothing matters more than how your users perceive your product. "Users" is a very general and broad definition, often used without much nuance between different user personas. Anybody who interacts with your product is your user, including you as the "admin" or "owner". There are many other user personas you may need to consider when designing a feature. It's also important to consider how mature your users' adoption of your product is. A user who is new to your overall product may take your new feature in a different way than a power user. Similarly, your small-biz client's users may need different things than your enterprise client's.&lt;/p&gt;

&lt;p&gt;User journeys can easily highlight the differences between these users and how they interact with your product or each other. Your feature may require a tech lead to configure things first in your product, then tell their editors to do other things, while the editors may need to work with their development team to accomplish other goals. So a simple-looking feature may require a couple of different team members to collaborate and communicate.&lt;/p&gt;

&lt;h2&gt;
  
  
  Sequence Diagrams
&lt;/h2&gt;

&lt;p&gt;A sequence diagram is a type of diagram that does a great job of telling two things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How many participants are in a journey&lt;/li&gt;
&lt;li&gt;The order in which things happen between participants&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;An example would be ordering food at a restaurant.&lt;/p&gt;

&lt;p&gt;Participants:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Guest&lt;/li&gt;
&lt;li&gt;Host&lt;/li&gt;
&lt;li&gt;Waiter&lt;/li&gt;
&lt;li&gt;Kitchen&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And maybe steps would be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Guest asks the Host: can I have a table for 2?&lt;/li&gt;
&lt;li&gt;Host seats the Guests at a table.&lt;/li&gt;
&lt;li&gt;Waiter comes to the table and asks the Guests for their order.&lt;/li&gt;
&lt;li&gt;Guests place their order.&lt;/li&gt;
&lt;li&gt;Waiter tells the Kitchen about the order.&lt;/li&gt;
&lt;li&gt;Kitchen prepares the order.&lt;/li&gt;
&lt;li&gt;Kitchen tells the Waiter that the order is ready.&lt;/li&gt;
&lt;li&gt;Waiter brings the food to the Guests at the table.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A journey like this can be visualized in a sequence diagram like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fblo4v6348z0kbirksnqs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fblo4v6348z0kbirksnqs.png" alt="Image description" width="800" height="329"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sequenceDiagram

Guests -&amp;gt;&amp;gt; Host: Can I have a table for 2?
Host -&amp;gt;&amp;gt; Guests: Shows and sits the Guests to a table.
Waiter -&amp;gt;&amp;gt; Guests: Comes to the table and asks for order.
Guests -&amp;gt;&amp;gt; Waiter: Places their order.
Waiter -&amp;gt;&amp;gt; Kitchen: Tells about the order
Kitchen -&amp;gt;&amp;gt; Kitchen: Prepares the order
Kitchen -&amp;gt;&amp;gt; Waiter: Order is ready.
Waiter -&amp;gt;&amp;gt; Guests: Brings food to the table.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can see, if we were to design a food-ordering feature, we might want to visualize the traditional way of ordering food and also visualize the better food-ordering experience our new product feature offers.&lt;/p&gt;

&lt;h2&gt;
  
  
  How about user stories?
&lt;/h2&gt;

&lt;p&gt;User stories are key when developing a product. Written user stories are the best way to summarize a capability, a feature, or a user goal. A traditional story looks like:&lt;/p&gt;

&lt;p&gt;As a &amp;lt; persona &amp;gt;, I want to &amp;lt; action &amp;gt;, so that I can get &amp;lt; benefit&amp;gt;&lt;/p&gt;

&lt;p&gt;User stories written in this traditional sense bring a clear, structured, short, written form to your product features. They are essential when the engineering team is planning their implementation in project management tools. Often a user story is planned as a story, and the engineering team can break it down into sub-tasks covering the implementation steps. If a user story describes a bigger goal, it can be planned as an epic, with sub-stories and tasks planned under it.&lt;/p&gt;

&lt;p&gt;Let's roll back to our user journey mapping with sequence diagrams. As you can see, this process does not replace user stories but complements them. It makes sense to do the user journey mapping exercise before finalizing user stories when planning a product.&lt;/p&gt;

&lt;h2&gt;
  
  
  Using Mermaid.js as a quick Diagramming Tool
&lt;/h2&gt;

&lt;p&gt;Mermaid is open-source software that draws different types of diagrams from simple structured text. One of the diagram types mermaid supports is sequence diagrams.&lt;/p&gt;

&lt;p&gt;Mermaid draws a sequence diagram using a text formatted like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sequenceDiagram

Fatih -&amp;gt;&amp;gt; John: Hi, how are you?
John --&amp;gt;&amp;gt; Fatih: Thanks, I'm good.
John --&amp;gt;&amp;gt; Fatih: How are you?
Fatih -&amp;gt;&amp;gt; John: All good, thanks.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Renders to a nice sequence diagram:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5bv8uexr895ipid7atkq.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5bv8uexr895ipid7atkq.jpg" alt="Image description" width="800" height="783"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Mermaid Tooling
&lt;/h2&gt;

&lt;p&gt;I've covered how to edit and manage mermaid diagrams in your favorite tools in &lt;a href="https://mfyz.com/editing-and-previewing-mermaid-diagrams-on-your-docs-markdown-github-notion-confluence/" rel="noopener noreferrer"&gt;a past article&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You can also alternatively create similar diagrams in free-form mode with excalidraw which I also like. I've talked about excalidraw in my &lt;a href="https://mfyz.com/create-quick-diagrams-and-wireframes-using-excalidraw-vscode/" rel="noopener noreferrer"&gt;previous posts&lt;/a&gt;. It will give you way more control to make your diagrams look exactly like you want but obviously it will take much more time to create one compared to mermaid.js diagrams.&lt;/p&gt;




&lt;p&gt;This article was first published on my personal blog: &lt;a href="https://mfyz.com/creating-sequence-diagrams-using-mermaidjs-to-map-out-your-user-journey/" rel="noopener noreferrer"&gt;https://mfyz.com/creating-sequence-diagrams-using-mermaidjs-to-map-out-your-user-journey/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>react</category>
      <category>webdev</category>
      <category>cloud</category>
      <category>performance</category>
    </item>
    <item>
      <title>Editing and previewing Mermaid diagrams on your docs (markdown, github, notion, confluence)</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Tue, 20 Dec 2022 13:40:20 +0000</pubDate>
      <link>https://forem.com/mfyz/editing-and-previewing-mermaid-diagrams-on-your-docs-markdown-github-notion-confluence-1p4p</link>
      <guid>https://forem.com/mfyz/editing-and-previewing-mermaid-diagrams-on-your-docs-markdown-github-notion-confluence-1p4p</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpo2zgtgjcestiyo5tk78.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpo2zgtgjcestiyo5tk78.png" alt="Image description" width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Mermaid.js is one of my recent favorite tools I've added to my toolbox. Mermaid.js is an open-source diagramming tool that converts plain structured text into diagrams rendered as SVG or PNG images. You can use the tools below to render and save mermaid diagrams in your documents, and there are even better ways to create, edit, and maintain your mermaid diagrams in your favorite documentation tools. Because of its popularity, a lot of documentation tools support mermaid either out of the box or with add-ons.&lt;/p&gt;

&lt;h2&gt;
  
  
  Mermaid Live Editor
&lt;/h2&gt;

&lt;p&gt;Let's start with the obvious option. Mermaid.js has an online editor that is free to use and requires no setup. You can quickly start creating diagrams, and share or embed them in your favorite places.&lt;/p&gt;

&lt;p&gt;Try: &lt;a href="https://mermaid.live"&gt;https://mermaid.live&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Markdown editors mermaid support
&lt;/h2&gt;

&lt;p&gt;I personally use Visual Studio Code as my markdown editor, with the Markdown Preview Enhanced extension to preview additional markdown features, including mermaid syntax rendered as diagrams.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=shd101wyy.markdown-preview-enhanced"&gt;https://marketplace.visualstudio.com/items?itemName=shd101wyy.markdown-preview-enhanced&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffphtysovzsf9y89kbch1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffphtysovzsf9y89kbch1.png" alt="Image description" width="800" height="592"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Most text editors with markdown support that I've seen include mermaid support by default or via add-ons/extensions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Documentation tools mermaid support
&lt;/h2&gt;

&lt;p&gt;Most tools that work with markdown or support markdown features recognize mermaid.js and provide out-of-the-box mermaid diagram support.&lt;/p&gt;

&lt;h2&gt;
  
  
  Github.com markdown mermaid support
&lt;/h2&gt;

&lt;p&gt;GitHub started rendering mermaid diagrams in place of code blocks when the code block is marked as mermaid syntax.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.blog/2022-02-14-include-diagrams-markdown-files-mermaid/"&gt;https://github.blog/2022-02-14-include-diagrams-markdown-files-mermaid/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0vyvmywm1xnye8b6d6vj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0vyvmywm1xnye8b6d6vj.png" alt="Image description" width="800" height="420"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It works out of the box: when a markdown file is rendered in preview mode in your repo, the diagram shows as an inline SVG rendering instead of the code block.&lt;/p&gt;
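&lt;p&gt;For example, a markdown file containing a fenced code block like this (the diagram content is just a tiny illustration) renders as a diagram in GitHub's preview:&lt;/p&gt;

````markdown
```mermaid
sequenceDiagram
Alice ->> Bob: Hello
Bob -->> Alice: Hi
```
````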

&lt;h2&gt;
  
  
  Confluence mermaid Plugin
&lt;/h2&gt;

&lt;p&gt;There is a free Confluence add-on that adds a page macro where you can paste a mermaid diagram's source text. Once you insert it into the page, it renders as an SVG image. It can be easily edited on the page, so there is no need to render, export, and insert an image manually.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://marketplace.atlassian.com/apps/1222792/mermaid-integration-for-confluence?tab=overview&amp;amp;hosting=cloud"&gt;https://marketplace.atlassian.com/apps/1222792/mermaid-integration-for-confluence?tab=overview&amp;amp;hosting=cloud&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmmdha52223vdyu3h3gum.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmmdha52223vdyu3h3gum.jpeg" alt="Image description" width="800" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Notion mermaid support
&lt;/h2&gt;

&lt;p&gt;Any code block set to mermaid syntax gets the mermaid diagram rendered inside the code block. The diagram size is limited to the code block width, which makes large diagrams hard to see, but it renders small diagrams, and things like mermaid-powered pie charts, well.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn5pozlxwxikqj5ctiujj.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn5pozlxwxikqj5ctiujj.jpeg" alt="Image description" width="800" height="1135"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;This post was first published on my personal blog: &lt;a href="https://mfyz.com/editing-and-previewing-mermaid-diagrams-on-your-docs-markdown-github-notion-confluence/"&gt;https://mfyz.com/editing-and-previewing-mermaid-diagrams-on-your-docs-markdown-github-notion-confluence/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>mermaidjs</category>
      <category>vscode</category>
      <category>notion</category>
      <category>diagram</category>
    </item>
    <item>
      <title>dotfiles method to sync your command line configurations between machines</title>
      <dc:creator>Fatih Felix Yildiz</dc:creator>
      <pubDate>Mon, 05 Dec 2022 14:14:10 +0000</pubDate>
      <link>https://forem.com/mfyz/dotfiles-method-to-sync-your-command-line-configurations-between-machines-5e7a</link>
      <guid>https://forem.com/mfyz/dotfiles-method-to-sync-your-command-line-configurations-between-machines-5e7a</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqv8tga34xurpsadxt29b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqv8tga34xurpsadxt29b.png" alt="Image description" width="800" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I want to talk about a common practice among developers who work on multiple machines or across (many) servers. As developers, we spend a healthy amount of our time in the shell working with command line tools. If this describes you, it is worth configuring your shell to your liking.&lt;/p&gt;

&lt;p&gt;Then comes the question of how to keep this configuration in sync between devices or servers. I will talk about the common method called dotfiles to store, push, and pull changes to these files (or collections of them). You may be familiar with this method, but I’ll describe my own setup after giving some context. I hope you find helpful bits and pieces among the tricks I implemented in my own version. You can also search for “dotfiles” on GitHub and explore thousands of other developers’ flavors of personal configuration; most developers make these configuration files public. You can find my dotfiles repo here: &lt;a href="https://github.com/mfyz/dotfiles" rel="noopener noreferrer"&gt;https://github.com/mfyz/dotfiles&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Configuration Files
&lt;/h2&gt;

&lt;p&gt;Most of the command line apps we use save their configuration in the user's home folder, either as a single configuration file or as a folder, with a dot prefix that makes these files/folders hidden by default in file browsers and listing commands.&lt;/p&gt;
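&lt;p&gt;A quick way to see this dot-prefix hiding in action (using a throwaway temp directory, so it is safe to run anywhere):&lt;/p&gt;

```shell
# Dot-prefixed files are hidden from a plain listing
demo=$(mktemp -d)
touch "$demo/.zshrc" "$demo/visible.txt"

ls "$demo"      # lists only visible.txt
ls -a "$demo"   # also lists .zshrc (plus . and ..)
```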

&lt;p&gt;This single-file convention makes the configuration of each tool extremely portable. Copying the file between machines gives you an exact replica of your tool configuration everywhere. Well, almost always; some tools may require additional activation steps, like installing vim plugins or placing multiple files.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to collect them in one place?
&lt;/h2&gt;

&lt;p&gt;The main trick of this method is to keep the original files under a single folder, generally&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.dotfiles
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;in your home directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/home/myuser/.dotfiles/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then we link the original files to their final locations. Instead of linking them manually, we’ll be using a utility for this.&lt;/p&gt;
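&lt;p&gt;For context, here is a sketch of that linking step done by hand with &lt;code&gt;ln -s&lt;/code&gt;, in a throwaway temp directory standing in for your home folder (this is exactly what the utility will automate for us):&lt;/p&gt;

```shell
# Manually linking a config file out of a .dotfiles folder.
# demo_home stands in for your real home directory.
demo_home=$(mktemp -d)
mkdir -p "$demo_home/.dotfiles/zsh"
echo 'alias ll="ls -la"' > "$demo_home/.dotfiles/zsh/.zshrc"

# The file lives in .dotfiles; the home directory only holds a symlink.
ln -s "$demo_home/.dotfiles/zsh/.zshrc" "$demo_home/.zshrc"

ls -l "$demo_home/.zshrc"   # shows the symlink pointing into .dotfiles
```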

&lt;p&gt;Then this folder can be managed with git and pushed to a remote repo. As I mentioned, a lot of developers make their dotfiles configurations public, so you can push your dotfiles folder to GitHub privately or publicly. If you push publicly, you need to make sure no secure tokens are included in your dotfiles. Read the “Securing your secrets” section below, where I talk about separating your secrets into a similar, private dotfiles-secrets repo and replicating it between machines with the same process.&lt;/p&gt;
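&lt;p&gt;Putting the folder under git is the usual init/commit/push dance. A sketch, using a temp directory in place of &lt;code&gt;~/.dotfiles&lt;/code&gt; and a placeholder remote URL you would swap for your own repo:&lt;/p&gt;

```shell
# Sketch of turning a dotfiles folder into a git repo.
# dots stands in for ~/.dotfiles; the remote URL is a placeholder.
dots=$(mktemp -d)
mkdir -p "$dots/zsh"
touch "$dots/zsh/.zshrc"

cd "$dots"
git init -q
git add .
git -c user.name=you -c user.email=you@example.com commit -q -m "initial dotfiles"

# git remote add origin git@github.com:youruser/dotfiles.git
# git push -u origin main
```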

&lt;h3&gt;
  
  
  How to restore them to the right place in a new machine?
&lt;/h3&gt;

&lt;p&gt;We’ll be using a utility called GNU Stow. Stow symlinks the files under a folder into a target directory (by default, the parent of the directory you run it in, which for ~/.dotfiles is your home directory). For example, if you have one or more configuration files for your zsh setup, like the&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.zshrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;file, you put it in a folder (the name can be anything) like “zsh”. Then running the command&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stow zsh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;from inside your ~/.dotfiles folder will symlink the contents of the zsh folder into your home directory.&lt;/p&gt;

&lt;p&gt;You can collect each app’s configuration in its own folder under your .dotfiles folder and run stow for each folder. Or automate it with something like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;for d in $(ls -d */ | cut -f1 -d '/' | grep -v '^_');
do
    ( stow "$d"  )
done
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Furthermore, you can automate the whole installation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;clone the GitHub repo, so you have a local copy&lt;/li&gt;
&lt;li&gt;install GNU Stow using the operating system’s package manager&lt;/li&gt;
&lt;li&gt;walk all directories and run the stow command to link the files into your home directory.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The final installation script will look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#!/bin/bash

if [[ -e /etc/debian_version ]]; then
    OS=debian
elif [[ "$OSTYPE" == "darwin"* ]]; then
    OS=macos
elif  ! command -v stow &amp;gt;/dev/null 2&amp;gt;&amp;amp;1; then
    OS=notfound
else
    echo "Please install stow manually then try again."
    exit
fi
if [[ "$OS" = 'debian' ]]; then
    sudo apt-get install -y stow
elif [[ "$OS" = 'macos' ]]; then
    brew install stow
fi

git clone &amp;lt;https://github.com/mfyz/dotfiles.git&amp;gt; ~/.dotfiles
cd ~/.dotfiles || exit
for d in $(ls -d */ | cut -f1 -d '/' | grep -v '^_');
do
    ( stow "$d"  )
done
echo 'Congrats, you are done, Enjoy!'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Securing your secrets
&lt;/h3&gt;

&lt;p&gt;It’s very important NOT to push your secrets/tokens, which generally make their way into rc files like .zshrc or .bashrc. Instead, we will move all token/secret export commands to a separate file and repository that you can push privately and sync between machines with a similar method.&lt;/p&gt;

&lt;p&gt;I’ve been using the name dotfiles-secrets for my repo and the name of the script.&lt;/p&gt;

&lt;p&gt;The way it works is simple: create a folder named&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.dotfiles-secrets
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;then place all exports of tokens and secrets in a shell file called&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;secrets.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;and push that to a separate private repo. Then git clone that repo manually to your home folder.&lt;/p&gt;
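&lt;p&gt;A hypothetical &lt;code&gt;secrets.sh&lt;/code&gt; is just a list of exports; the variable names and values below are placeholders, never real tokens:&lt;/p&gt;

```shell
# Hypothetical ~/.dotfiles-secrets/secrets.sh
# Placeholder values only - the real ones live in the private repo.
export GITHUB_TOKEN="ghp_placeholder"
export NPM_TOKEN="npm_placeholder"
```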

&lt;p&gt;In my regular .dotfiles repo, my shell rc file (I use zsh, so it is the&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.zshrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;file for me) has the following block, which checks for the&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.dotfiles-secrets/secrets.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;file and sources it if the file exists.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# secrets
SECRETS_FILE="$HOME/.dotfiles-secrets/secrets.sh"
if test -f "$SECRETS_FILE"; then
  source "$SECRETS_FILE"
fi
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This way, the shell loads gracefully, with no errors, on machines where you only cloned the configuration files and not your secrets.&lt;/p&gt;




&lt;p&gt;This post was first published on my personal blog: &lt;a href="https://mfyz.com/dotfiles-method-to-sync-your-command-line-configurations-between-machines/" rel="noopener noreferrer"&gt;https://mfyz.com/dotfiles-method-to-sync-your-command-line-configurations-between-machines/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>discuss</category>
    </item>
  </channel>
</rss>
