<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: dpacheconr</title>
    <description>The latest articles on Forem by dpacheconr (@dpacheconr).</description>
    <link>https://forem.com/dpacheconr</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F934520%2F29f137ce-62f4-4296-bc60-28faaebcb3b3.jpeg</url>
      <title>Forem: dpacheconr</title>
      <link>https://forem.com/dpacheconr</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/dpacheconr"/>
    <language>en</language>
    <item>
      <title>Parsing multiline logs using a custom Fluent Bit configuration</title>
      <dc:creator>dpacheconr</dc:creator>
      <pubDate>Thu, 25 May 2023 17:58:01 +0000</pubDate>
      <link>https://forem.com/newrelic/parsing-multiline-logs-using-a-custom-fluent-bit-configuration-3oio</link>
      <guid>https://forem.com/newrelic/parsing-multiline-logs-using-a-custom-fluent-bit-configuration-3oio</guid>
      <description>&lt;h2&gt;
  
  
  &lt;em&gt;To read this full New Relic blog, &lt;a href="https://newrelic.com/blog/how-to-relic/parsing-multiline-logs-with-fluent-bit?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;click here&lt;/a&gt;.&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;Applications generally output logs line by line, but occasionally some logs span multiple lines to make them easier to read. While these multiline logs can improve readability when they're read consecutively (and in isolation), they can be hard to understand when they appear intermixed with other logs in query results. This becomes a problem: logs that are hard to read cost time that could be spent elsewhere.&lt;/p&gt;

&lt;p&gt;One solution is to combine all the lines from one log message into a single log record. Doing this clearly separates each multiline log, which in turn makes each log easier to understand and saves you time. Although consolidating multiline log messages into a single log entry might seem daunting at first, you can do this by creating a log parser based on the common pattern within the log lines. In this post, I’ll walk you through this process. You'll learn how to use a custom Fluent Bit configuration to enable multiline log messages in &lt;a href="https://newrelic.com/products/logs?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;New Relic logs&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;If you're unfamiliar, Fluent Bit is a logging and metrics processor and forwarder. The &lt;a href="https://docs.newrelic.com/docs/infrastructure/install-infrastructure-agent/get-started/install-infrastructure-agent/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;New Relic infrastructure agent&lt;/a&gt; is bundled with a &lt;a href="https://docs.newrelic.com/docs/logs/enable-log-management-new-relic/enable-log-monitoring-new-relic/fluent-bit-plugin-log-forwarding?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;Fluent Bit plugin&lt;/a&gt;, so you can natively forward logs with the simple configuration of a &lt;a href="https://yaml.org/"&gt;YAML&lt;/a&gt; file. &lt;/p&gt;

&lt;p&gt;If you’re already using Fluent Bit, you can also forward your &lt;a href="https://docs.newrelic.com/docs/logs/forward-logs/kubernetes-plugin-log-forwarding/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;Kubernetes logs&lt;/a&gt; to New Relic with the help of our &lt;a href="https://github.com/newrelic/newrelic-fluent-bit-output"&gt;Fluent Bit output plugin&lt;/a&gt;. Alternatively, you can use it as a standalone Docker image, which we refer to as our &lt;a href="https://github.com/newrelic/helm-charts/tree/master/charts/newrelic-logging#installation"&gt;Kubernetes plugin&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Let's get started by understanding the problem a bit better.&lt;/p&gt;

&lt;h2&gt;
  
  
  The challenge: When multiline logs become hard to read
&lt;/h2&gt;

&lt;p&gt;Take a look at this stack trace as an example. It’s a single log made up of multiple lines. The first line starts with a timestamp, and each new line starts with the word "at".&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;single line...
Dec 14 06:41:08 Exception in thread "main" java.lang.RuntimeException: Something has gone wrong, aborting!
at com.myproject.module.MyProject.badMethod(MyProject.java:22)
at com.myproject.module.MyProject.oneMoreMethod(MyProject.java:18)
at com.myproject.module.MyProject.anotherMethod(MyProject.java:14)
at com.myproject.module.MyProject.someMethod(MyProject.java:10)
at com.myproject.module.MyProject.main(MyProject.java:6)
another line...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, examine how it appears in New Relic without any custom configuration applied. Notice in the next image how each line of the stack trace appears as an individual log entry.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UcM6wWBz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aymdkppldzybj452zary.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UcM6wWBz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aymdkppldzybj452zary.jpg" alt="A multiline log in New Relic recorded as multiple individual logs, instead of as one log." width="800" height="381"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Well, that’s precisely the problem: when a single multiline log appears as multiple separate log entries, it becomes harder to read and harder to distinguish from other logs. &lt;/p&gt;

&lt;p&gt;OK, let’s solve this problem together. To handle multiline logs in New Relic, you’ll create a custom Fluent Bit multiline parser configuration using the same logic as the earlier example.&lt;/p&gt;

&lt;h2&gt;
  
  
  Ready to follow along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://newrelic.com/platform/log-management?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;Sign up for a free account.&lt;/a&gt; Your account includes 100 GB/month of free data ingest, one free full-access user, and unlimited free basic users.&lt;br&gt;
Get your &lt;a href="https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;New Relic API key&lt;/a&gt;. &lt;br&gt;
&lt;a href="https://newrelic.com/instant-observability/?category=infrastructure-and-os?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;Start instrumenting&lt;/a&gt; with one of our quickstart integrations.&lt;/p&gt;
&lt;h2&gt;
  
  
  How Fluent Bit processes and parses logs
&lt;/h2&gt;

&lt;p&gt;With &lt;a href="https://docs.fluentbit.io/manual/about/what-is-fluent-bit"&gt;Fluent Bit&lt;/a&gt; you can gather telemetry data from various sources, apply filters to enhance it, and transmit it to any target location, such as New Relic. All log data ingested by Fluent Bit is automatically labeled, so development teams can easily apply filtering, routing, parsing, modification, and output protocols. Of all the log processors and forwarders, Fluent Bit is highly efficient and well known for being open source and vendor-neutral.&lt;/p&gt;

&lt;p&gt;To consolidate and configure multiline logs, you’ll need to set up a Fluent Bit parser. Version 1.8 or higher of Fluent Bit offers two ways to do this: using a built-in multiline parser and using a configurable multiline parser. Together, these two multiline parsing engines are called Multiline Core, a unified functionality that handles all user corner cases for multiline logs. &lt;/p&gt;

&lt;p&gt;In this blog post, you'll be using the &lt;a href="https://docs.fluentbit.io/manual/administration/configuring-fluent-bit/multiline-parsing#configurable-multiline-parsers"&gt;configurable multiline parser&lt;/a&gt; from version 1.8 or higher of Fluent Bit. You'll then be able to apply the parser to a New Relic &lt;a href="https://docs.newrelic.com/docs/kubernetes-pixie/kubernetes-integration/installation/kubernetes-integration-install-configure/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;supported Kubernetes platform&lt;/a&gt;, the &lt;a href="https://docs.newrelic.com/docs/infrastructure/install-infrastructure-agent/get-started/install-infrastructure-agent/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;New Relic infrastructure agent&lt;/a&gt;, or you can use Fluent Bit as a standalone Docker image with the New Relic &lt;a href="https://docs.newrelic.com/docs/logs/forward-logs/kubernetes-plugin-log-forwarding/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;Kubernetes plugin&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Take a moment now to determine which versions of these tools you're using. Each version of New Relic uses a specific Fluent Bit version, and different versions of Fluent Bit have different features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In Fluent Bit version 1.7 or lower, you’ll implement multiline log configuration using the &lt;a href="https://docs.fluentbit.io/manual/v/1.8/pipeline/inputs/tail#multiline-core-v1.8"&gt;old multiline configuration parameters&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;In Fluent Bit version 1.8 or higher, you’ll need to disable any &lt;a href="https://docs.fluentbit.io/manual/pipeline/inputs/tail#multiline-support"&gt;old multiline configuration&lt;/a&gt; and instead set up a &lt;a href="https://docs.fluentbit.io/manual/administration/configuring-fluent-bit/multiline-parsing"&gt;Multiline Core&lt;/a&gt; parser.&lt;/li&gt;
&lt;li&gt;Versions later than v1.12.2 of the New Relic &lt;a href="https://docs.newrelic.com/docs/logs/forward-logs/fluent-bit-plugin-log-forwarding/"&gt;Fluent Bit output plugin&lt;/a&gt;, v1.10.9 of the &lt;a href="https://docs.newrelic.com/docs/kubernetes-pixie/kubernetes-integration/installation/install-kubernetes-integration-using-helm/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;Helm charts&lt;/a&gt;, and v1.20.3 of the &lt;a href="https://docs.newrelic.com/docs/infrastructure/install-infrastructure-agent/get-started/install-infrastructure-agent/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;infrastructure agent&lt;/a&gt; can use the Multiline Core functionality available in Fluent Bit v1.8 or higher.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To confirm which version of Fluent Bit you're using, check the New Relic release notes.&lt;/p&gt;
&lt;h2&gt;
  
  
  Creating a custom multiline parser configuration with Fluent Bit
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;First, it's crucial to note that Fluent Bit configs have strict indentation requirements, so copying and pasting from this blog post might lead to syntax issues.&lt;/strong&gt; Check the Fluent Bit docs to understand the &lt;a href="https://docs.fluentbit.io/manual/administration/configuring-fluent-bit/classic-mode/format-schema"&gt;indentation requirements&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;A multiline parser is defined in the parser’s configuration file by using a &lt;code&gt;[MULTILINE_PARSER]&lt;/code&gt; section definition, which must have a unique name, a type, and other associated properties for each type. &lt;/p&gt;

&lt;p&gt;To configure the multiline parser you must provide regular expressions (regex) to identify the start and continuation lines.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# rules   |   state name   | regex pattern                   | next state
# --------|----------------|---------------------------------------------
rule         "start_state"   "/([a-zA-Z]+ \d+ \d+\:\d+\:\d+)(.*)/"   "cont"
rule         "cont"          "/^\s+at.*/"                            "cont"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The first rule is always referred to as the &lt;code&gt;"start_state"&lt;/code&gt;, which is a regex that captures the start of the multiline message. In the earlier example of the New Relic logs, you saw that the multiline log begins with a timestamp and continues to the end of the line. &lt;/p&gt;

&lt;p&gt;You set the next state, &lt;code&gt;"cont"&lt;/code&gt;, to specify how the potential continuation lines will look. You can give these states custom names and chain them together. In the example in this blog post, the continuation lines start with "at" and are matched by a suitable regex.&lt;/p&gt;

&lt;p&gt;You only need one continuation state for this example, but you can configure multiple continuation state definitions for more complex cases.&lt;/p&gt;

&lt;p&gt;Make sure that you have continuation states that match all possible continuation lines. Otherwise your multiline capture might truncate when an unexpected line is encountered.&lt;/p&gt;

&lt;p&gt;To simplify the configuration of regular expressions, you can use the Rubular website. &lt;a href="https://rubular.com/r/NDuyKwlTGOvq2g"&gt;This Rubular page&lt;/a&gt; shows the regex described earlier—plus a log line that matches the pattern.&lt;/p&gt;
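&lt;p&gt;As a quick sanity check, here’s a minimal Python sketch (not part of the Fluent Bit setup) that applies the same two patterns to sample lines from the stack trace, assuming the continuation lines are indented:&lt;/p&gt;

```python
import re

# The same patterns as the "start_state" and "cont" rules, without the /.../ delimiters.
start_state = re.compile(r"([a-zA-Z]+ \d+ \d+\:\d+\:\d+)(.*)")
cont = re.compile(r"^\s+at.*")

lines = [
    'Dec 14 06:41:08 Exception in thread "main" java.lang.RuntimeException: Something has gone wrong, aborting!',
    "    at com.myproject.module.MyProject.badMethod(MyProject.java:22)",
    "another line...",
]

print(bool(start_state.match(lines[0])))  # True: the timestamp starts a new multiline record
print(bool(cont.match(lines[1])))         # True: an indented "at ..." line continues the record
print(bool(cont.match(lines[2])))         # False: an unrelated line is not captured as a continuation
```

&lt;p&gt;A line that matches neither pattern ends the capture, which is why the continuation states need to cover every possible continuation line.&lt;/p&gt;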

&lt;h2&gt;
  
  
  Using the custom Fluent Bit multiline parser configuration
&lt;/h2&gt;

&lt;p&gt;Now let’s test this out. My example uses Azure Kubernetes Service (AKS), where I deployed a New Relic Kubernetes integration using &lt;a href="https://docs.newrelic.com/docs/kubernetes-pixie/kubernetes-integration/installation/install-kubernetes-integration-using-helm/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;Helm&lt;/a&gt;. But you can apply a custom Fluent Bit configuration to any &lt;a href="https://docs.newrelic.com/docs/kubernetes-pixie/kubernetes-integration/installation/kubernetes-integration-install-configure/"&gt;supported Kubernetes platform&lt;/a&gt;, or you can use it with the &lt;a href="https://docs.newrelic.com/docs/infrastructure/install-infrastructure-agent/get-started/install-infrastructure-agent/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;New Relic infrastructure agent&lt;/a&gt;. You can also use Fluent Bit as a standalone Docker image, which we refer to as our &lt;a href="https://docs.newrelic.com/docs/logs/forward-logs/kubernetes-plugin-log-forwarding/"&gt;Kubernetes plugin&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;To configure Fluent Bit within Helm, we need to make changes to the &lt;a href="https://github.com/newrelic/helm-charts/blob/master/charts/newrelic-logging/k8s/fluent-conf.yml"&gt;fluent-bit-config configmap&lt;/a&gt; to tell it to apply the parsing.&lt;/p&gt;

&lt;p&gt;The first step is to define the correct log parser for input messages. Since I'm using the AKS cluster in this example, I need to define &lt;a href="https://github.com/newrelic/helm-charts/blob/master/charts/newrelic-logging/k8s/fluent-conf.yml#L64"&gt;CRI&lt;/a&gt; as the log parser. Doing this ensures that each log message is first parsed using the CRI parser, before being handed over to any filters.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[INPUT]
      Name              tail
      Tag               kube.*
      Path              ${PATH}
      Parser            ${LOG_PARSER}
      DB                /var/log/flb_kube.db
      Mem_Buf_Limit     7MB
      Skip_Long_Lines   On 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Parser&lt;/code&gt; is mapped to the value of the &lt;code&gt;LOG_PARSER&lt;/code&gt; environment variable defined in the New Relic logging daemonset. Ensure &lt;code&gt;Parser&lt;/code&gt; is set to “CRI” for this test, because &lt;a href="https://github.com/microsoft/fluentbit-containerd-cri-o-json-log"&gt;AKS uses containerd&lt;/a&gt; as the container runtime and its log format is CRI-Log.&lt;/p&gt;
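&lt;p&gt;To illustrate why the CRI step matters, here’s a small Python sketch using the same regex as the &lt;code&gt;cri&lt;/code&gt; parser shown later in the configmap (the sample log line is made up for illustration). The raw containerd line is decomposed so that only the message part is left for the multiline rules to match on:&lt;/p&gt;

```python
import re

# Same pattern as the "cri" [PARSER], with positional instead of named groups:
# time, stream, logtag, message.
cri = re.compile(r"^([^ ]+) (stdout|stderr) ([^ ]*) (.*)$")

line = ("2023-05-25T17:58:01.000000000Z stderr F "
        "at com.myproject.module.MyProject.badMethod(MyProject.java:22)")

time, stream, logtag, message = cri.match(line).groups()
print(stream)   # stderr
print(message)  # the bare log text that the multiline rules are applied to
```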

&lt;p&gt;Next, within &lt;code&gt;filter-kubernetes.conf&lt;/code&gt; add another &lt;code&gt;[FILTER]&lt;/code&gt; section, just like in the next code snippet. This filter ensures that each log entry processed by the input log parser is then parsed using our multiline parser.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[FILTER]
    Name              multiline
    Match             *
    multiline.parser  multiline-regex
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, within &lt;code&gt;parsers.conf&lt;/code&gt;, configure the multiline parser using the custom Fluent Bit configuration you created earlier. Notice how the multiline parser is matched by name (&lt;code&gt;"multiline-regex"&lt;/code&gt;) from the &lt;code&gt;[MULTILINE_PARSER]&lt;/code&gt; block in the next code snippet to the &lt;code&gt;[FILTER]&lt;/code&gt; block in the previous code snippet.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[MULTILINE_PARSER]
    name          multiline-regex
    type          regex
    flush_timeout 1000


    # rules |   state name  | regex pattern                    | next state
    # ------|---------------|----------------------------------|-----------
    rule      "start_state"   "/(Dec \d+ \d+\:\d+\:\d+)(.*)/"    "cont"
    rule      "cont"          "/^\s+at.*/"                       "cont"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;a href="https://github.com/newrelic/helm-charts/blob/master/charts/newrelic-logging/k8s/fluent-conf.yml"&gt;fluent-bit-config configmap&lt;/a&gt; should now look like this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-bit-config
  namespace: newrelic
  labels:
    k8s-app: newrelic-logging
data:
  # Configuration files: server, input, filters and output
  # ======================================================
  fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     ${LOG_LEVEL}
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   On
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020
    @INCLUDE input-kubernetes.conf
    @INCLUDE output-newrelic.conf
    @INCLUDE filter-kubernetes.conf
  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Path              ${PATH}
        Parser            ${LOG_PARSER}
        DB                /var/log/flb_kube.db
        Mem_Buf_Limit     7MB
        Skip_Long_Lines   On
        Refresh_Interval  10
  filter-kubernetes.conf: |
    [FILTER]
        Name           multiline
        Match          *
        multiline.parser multiline-regex

    [FILTER]
        Name record_modifier
        Match *
        Record cluster_name ${CLUSTER_NAME}

    [FILTER]
        Name           kubernetes
        Match          kube.*
        Kube_URL       https://kubernetes.default.svc.cluster.local:443
        Merge_Log      Off
  output-newrelic.conf: |
    [OUTPUT]
        Name  newrelic
        Match *
        licenseKey ${LICENSE_KEY}
        endpoint ${ENDPOINT}
  parsers.conf: |
    # Relevant parsers retrieved from: https://github.com/fluent/fluent-bit/blob/master/conf/parsers.conf
    [PARSER]
        Name         docker
        Format       json
        Time_Key     time
        Time_Format  %Y-%m-%dT%H:%M:%S.%L
        Time_Keep    On

    [PARSER]
        Name cri
        Format regex
            Regex ^(?&amp;lt;time&amp;gt;[^ ]+) (?&amp;lt;stream&amp;gt;stdout|stderr) (?&amp;lt;logtag&amp;gt;[^ ]*) (?&amp;lt;message&amp;gt;.*)$
        Time_Key    time
        Time_Format %Y-%m-%dT%H:%M:%S.%L%z


    [MULTILINE_PARSER]
        name          multiline-regex
        key_content   message
        type          regex
        flush_timeout 1000
        #
        # Regex rules for multiline parsing
        # ---------------------------------
        #
        # configuration hints:
        #
        #  - first state always has the name: start_state
        #  - every field in the rule must be inside double quotes
        #
        # rules |   state name  | regex pattern                  | next state
        # ------|---------------|--------------------------------|-----------
        rule      "start_state"   "/(Dec \d+ \d+\:\d+\:\d+)(.*)/"  "cont"
        rule      "cont"          "/^\s+at.*/"                     "cont"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As the final step, restart the New Relic logging pods so they pick up the changed configuration.&lt;/p&gt;
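&lt;p&gt;With the default Helm installation, the logging pods run as a DaemonSet. Assuming the default &lt;code&gt;newrelic-logging&lt;/code&gt; name and &lt;code&gt;newrelic&lt;/code&gt; namespace (adjust both to match your installation), a rolling restart looks like this:&lt;/p&gt;

```shell
# Restart the New Relic logging DaemonSet so it picks up the new configmap,
# then wait for the rollout to finish.
kubectl -n newrelic rollout restart daemonset newrelic-logging
kubectl -n newrelic rollout status daemonset newrelic-logging
```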

&lt;h2&gt;
  
  
  Confirming the custom multiline parser is working correctly
&lt;/h2&gt;

&lt;p&gt;Now, within New Relic, you can see that the log messages are no longer split across multiple entries, and the full stack trace appears as part of a single log record.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DdjhuQ2T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hpkl4970d052aww4g9p9.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DdjhuQ2T--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hpkl4970d052aww4g9p9.jpg" alt="After configuring Fluent Bit to handle multiline logs, within New Relic we can see a multiline log displayed on one line" width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A seemingly small change like this can have a big impact when you're dealing with multiline logs, and it ultimately makes debugging with logs more effective. Thankfully, as you've seen, it takes only a small configuration change to implement.&lt;/p&gt;

&lt;h2&gt;
  
  
  Next steps
&lt;/h2&gt;

&lt;p&gt;If you aren’t already using Fluent Bit, start by &lt;a href="https://docs.newrelic.com/docs/logs/forward-logs/forward-your-logs-using-infrastructure-agent/"&gt;enabling log forwarding&lt;/a&gt; with the &lt;a href="https://docs.newrelic.com/docs/logs/forward-logs/fluent-bit-plugin-log-forwarding/"&gt;Fluent Bit plugin&lt;/a&gt;. Then, you’ll be ready to set up a custom configuration for parsing your multiline logs. &lt;/p&gt;

&lt;p&gt;If you're not using New Relic logs yet, &lt;a href="https://newrelic.com/platform/log-management"&gt;sign up for a free account&lt;/a&gt;. Your account includes 100 GB/month of free data ingest, one free full-platform user, and unlimited free basic users.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;To read this full New Relic blog, &lt;a href="https://newrelic.com/blog/how-to-relic/parsing-multiline-logs-with-fluent-bit?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=amer-fy-24-Parsing_multiline_logs"&gt;click here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>logs</category>
      <category>fluentbit</category>
      <category>multiline</category>
      <category>parser</category>
    </item>
    <item>
      <title>Monitor GitLab with OpenTelemetry and New Relic</title>
      <dc:creator>dpacheconr</dc:creator>
      <pubDate>Mon, 06 Mar 2023 18:33:43 +0000</pubDate>
      <link>https://forem.com/newrelic/monitor-gitlab-with-opentelemetry-and-new-relic-4hhk</link>
      <guid>https://forem.com/newrelic/monitor-gitlab-with-opentelemetry-and-new-relic-4hhk</guid>
      <description>&lt;p&gt;Shifting left isn't just about testing your code and automating the build and deployment process. It's about monitoring and testing your CI and CD pipelines, too.&lt;/p&gt;

&lt;p&gt;In this post, you’ll learn how to monitor your GitLab jobs and pipelines using OpenTelemetry and the &lt;a href="https://newrelic.com/instant-observability/gitlab?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;New Relic GitLab quickstart&lt;/a&gt;. After you're finished, you'll be able to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Track key metrics for your GitLab pipelines in a pre-built dashboard, such as how long your jobs take and how often they fail.&lt;/li&gt;
&lt;li&gt;Visualize jobs and pipeline executions as distributed traces with logs in context.&lt;/li&gt;
&lt;li&gt;Pinpoint where issues are coming from in your pipelines.&lt;/li&gt;
&lt;li&gt;Create alerts on your GitLab pipelines.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The next image shows a New Relic dashboard with some of the GitLab metrics you’ll be able to see.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4ROLeey4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nz16v3hrm01aqzof1ip9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4ROLeey4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nz16v3hrm01aqzof1ip9.png" alt="Image description" width="880" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With this integration, you can shift left and get observability for the build and deploy phases of the DevOps lifecycle. &lt;/p&gt;

&lt;h2&gt;
  
  
  How the integration works
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://github.com/newrelic-experimental/gitlab"&gt;New Relic GitLab integration&lt;/a&gt; is split into two parts for ease of use. Both parts run as jobs in your existing or new GitLab pipelines. You’ll learn how to integrate each part in the next section.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;New Relic exporter&lt;/strong&gt;: Add this job to the end of each of your pipelines to export your pipeline data as logs and traces to New Relic. You’ll be able to use the distributed trace viewer to inspect each pipeline run, as well as explore logs in the log explorer.&lt;br&gt;
  &lt;strong&gt;New Relic metrics exporter&lt;/strong&gt;: This job should be scheduled. It exports GitLab data for all your projects, pipelines, jobs, runners, environments, deployments, and releases. You can configure which groups and projects are included. You can also run this as a standalone container outside GitLab.&lt;/p&gt;
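&lt;p&gt;As a rough sketch, a scheduled metrics exporter job in &lt;code&gt;.gitlab-ci.yml&lt;/code&gt; can be restricted to scheduled pipeline runs with a &lt;code&gt;rules&lt;/code&gt; condition. The image name below is hypothetical; check the &lt;a href="https://github.com/newrelic-experimental/gitlab"&gt;integration repository&lt;/a&gt; for the actual image and required variables:&lt;/p&gt;

```yaml
# Hypothetical sketch: run the metrics exporter only on scheduled pipelines.
new-relic-metrics-exporter:
  rules:
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
  image:
    name: docker.io/dpacheconr/gitlab-metrics-exporter:1.0.0  # placeholder image name
  script:
    - echo "Done"
```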

&lt;p&gt;Let’s look at some examples that show what you can visualize in New Relic after you’ve enabled the integration.&lt;/p&gt;

&lt;p&gt;The following distributed trace captured by the New Relic exporter shows a job that contains an error with associated logs in context.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rRWxcfs7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kmhr2r1xe5ad1u407on2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rRWxcfs7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kmhr2r1xe5ad1u407on2.jpg" alt="Image description" width="880" height="460"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next two images show metrics captured by the New Relic metrics exporter. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9gfeNmbP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k160uftkvqo02zsr36di.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9gfeNmbP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k160uftkvqo02zsr36di.jpg" alt="Image description" width="880" height="441"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next image shows GitLab resources as log events with the configured attributes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wVVhp1ZF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d3qrd4o27nrzheiohg61.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wVVhp1ZF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d3qrd4o27nrzheiohg61.jpg" alt="Image description" width="880" height="445"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
  
  
  Configuring the exporters
&lt;/h2&gt;

&lt;p&gt;Both exporter integrations are set up similarly, but with one key difference:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The New Relic exporter needs to be set up for each pipeline you want to report data for.&lt;/li&gt;
&lt;li&gt;The New Relic metrics exporter only needs to be set up once.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before setting up the integration, you will need both a GitLab token and a New Relic ingest API key.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Follow these steps in the &lt;a href="https://docs.newrelic.com/docs/apis/intro-apis/new-relic-api-keys/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;API documentation&lt;/a&gt; to generate a New Relic ingest API key.&lt;/li&gt;
&lt;li&gt;Generate a GitLab API Token with “read_api” access.&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;In the top-right corner, select your avatar.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Edit profile&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;On the left sidebar, select &lt;strong&gt;Access Tokens&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Enter a name and optional expiration date for the token.&lt;/li&gt;
&lt;li&gt;Select the “read_api” scope.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Create personal access token&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Setting up the New Relic exporter
&lt;/h2&gt;

&lt;p&gt;The New Relic exporter should be configured to run as a child job at the end of your project pipeline. It will ship the data from the pipeline to New Relic as a distributed trace with the logs being reported as &lt;a href="https://docs.newrelic.com/docs/logs/logs-context/logs-in-context/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;logs in context&lt;/a&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  Step one: Create CI/CD variables in GitLab
&lt;/h2&gt;

&lt;p&gt;First, you need to set up your CI/CD variables in GitLab. In addition to the required GitLab token and New Relic API key, there are also several optional variables.&lt;/p&gt;

&lt;p&gt;To add or update variables in GitLab for your project:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to your project’s &lt;strong&gt;Settings &amp;gt; CI/CD&lt;/strong&gt; and expand the &lt;strong&gt;Variables&lt;/strong&gt; section.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Add variable&lt;/strong&gt; and fill in the details for the following required variables. Make sure that both variables are masked.
&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;GLAB_TOKEN&lt;/em&gt; provides access to the GitLab API. Add the GitLab API token you created in the previous section.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;NEW_RELIC_API_KEY&lt;/em&gt; is the ingest API key you created in the previous section.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;You can also update the following optional variables. Each configuration option should be set up in GitLab as a pipeline environment variable. By default, these optional variables are already set to automatically export job logs to New Relic.
&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;OTEL_EXPORTER_OTEL_ENDPOINT&lt;/em&gt; is the New Relic OTLP endpoint, including the port. By default, it’s set to &lt;a href="https://otlp.nr-data.net:4318"&gt;https://otlp.nr-data.net:4318&lt;/a&gt; in the US and &lt;a href="https://otlp.eu01.nr-data.net:4318"&gt;https://otlp.eu01.nr-data.net:4318&lt;/a&gt; in the EU.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;GLAB_EXPORT_LOGS&lt;/em&gt; determines whether job logs are exported to New Relic. This variable is a boolean that’s set to true by default.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;GLAB_ENDPOINT&lt;/em&gt; is the GitLab API endpoint. It’s set to &lt;a href="https://gitlab.com"&gt;https://gitlab.com&lt;/a&gt; by default.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
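&lt;p&gt;To make the defaults above concrete, here is a rough sketch of how a job might read these variables from its environment. This is illustrative only, not the exporter’s actual code:&lt;/p&gt;

```python
import os

def read_exporter_config(env=os.environ):
    """Illustrative sketch: read required and optional settings,
    falling back to the defaults described above."""
    return {
        # Required: fail fast if either is missing.
        "glab_token": env["GLAB_TOKEN"],
        "new_relic_api_key": env["NEW_RELIC_API_KEY"],
        # Optional, with the documented defaults (US OTLP endpoint).
        "otel_endpoint": env.get("OTEL_EXPORTER_OTEL_ENDPOINT",
                                 "https://otlp.nr-data.net:4318"),
        "export_logs": env.get("GLAB_EXPORT_LOGS", "true").lower() == "true",
        "glab_endpoint": env.get("GLAB_ENDPOINT", "https://gitlab.com"),
    }

# Example with only the required variables set:
cfg = read_exporter_config({"GLAB_TOKEN": "x", "NEW_RELIC_API_KEY": "y"})
print(cfg["otel_endpoint"])  # falls back to the US endpoint
```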
&lt;h2&gt;
  
  
  Step two: Add the pipeline configuration for the exporter
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Create a file called &lt;em&gt;new-relic-exporter.yml&lt;/em&gt; in the root directory of the GitLab project, repository, or branch you want to monitor.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Add the following pipeline code. The New Relic image is provided for testing purposes. It’s recommended that you use your own image of the exporter in production. You can build your own image based on the &lt;a href="https://github.com/newrelic-experimental/gitlab"&gt;repository&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stages:
- new-relic-exporter
new-relic-exporter:
 rules:
   - when: always
 image:
   name: docker.io/dpacheconr/gitlab-exporter:1.0.0
 stage: new-relic-exporter
 script:
   - echo "Done"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Ensure that the rule is set to always so that the exporter runs even if previous jobs fail.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step three: Add the exporter job to your existing pipeline
&lt;/h2&gt;

&lt;p&gt;In your &lt;em&gt;.gitlab-ci.yml&lt;/em&gt;  file, add a new stage with the name &lt;em&gt;new-relic-exporter&lt;/em&gt; to your &lt;em&gt;stages&lt;/em&gt; block as shown in the next code snippet. This should be set as the last stage in the list.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stages:
…existing stages…
- new-relic-exporter
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Configure the &lt;em&gt;new-relic-exporter&lt;/em&gt; stage block as shown in the next code snippet:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
new-relic-exporter:
 stage: new-relic-exporter
 inherit:
   variables: true
 variables:
   CI_PARENT_PIPELINE: $CI_PIPELINE_ID
 trigger:
   include:
     - local: new-relic-exporter.yml
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step four: Test and confirm that New Relic is receiving data
&lt;/h2&gt;

&lt;p&gt;Commit your code to run your pipeline and confirm that the New Relic Exporter job runs as the final job in the pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://one.newrelic.com/nr1-core?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;Log into New Relic&lt;/a&gt;, go to &lt;strong&gt;Services - OpenTelemetry&lt;/strong&gt;, and select the entity. It will have the same name as the project, repo, or branch that you are monitoring. Select the &lt;strong&gt;Distributed Traces&lt;/strong&gt; tab and confirm that you can see a trace for your pipeline. Sometimes it takes a few minutes for the first trace to appear. The next image shows an example of a trace from the exporter.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ANP6qcKs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g8n63ezcafim89rqr3wc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ANP6qcKs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/g8n63ezcafim89rqr3wc.jpg" alt="Image description" width="880" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;An example of a trace from the exporter in the distributed traces dashboard in New Relic.&lt;/p&gt;

&lt;p&gt;Next, go to the &lt;strong&gt;Logs&lt;/strong&gt; tab and confirm that you can see your pipeline logs appearing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zbhCIk5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c7x7luhiocfn6b5vjj3v.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zbhCIk5A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/c7x7luhiocfn6b5vjj3v.jpg" alt="Image description" width="880" height="445"&gt;&lt;/a&gt;&lt;br&gt;
Log events in New Relic show GitLab resources.&lt;/p&gt;

&lt;p&gt;If there is no data, ensure that you have correctly set up the environment variables with the appropriate keys. In GitLab, you can check the New Relic exporter job output for additional debugging information.&lt;/p&gt;
&lt;h2&gt;
  
  
  Setting up the New Relic metrics exporter
&lt;/h2&gt;

&lt;p&gt;The New Relic metrics exporter is standalone and does not need to be attached to an existing project pipeline. You should run it in its own project pipeline, but you can add it to an existing pipeline if you prefer.&lt;/p&gt;

&lt;p&gt;The metrics exporter gathers data from the GitLab API about your projects, pipelines, jobs, runners, environments, deployments, and releases, then exports it to New Relic as a mixture of metrics and logs.&lt;/p&gt;

&lt;p&gt;To get the latest data from your accounts, you should run the New Relic metrics exporter on a regular basis. At least once per hour is recommended.&lt;/p&gt;
&lt;h2&gt;
  
  
  Step one: Create a new project
&lt;/h2&gt;

&lt;p&gt;Create a new GitLab project and give it a descriptive name like New Relic metrics exporter.&lt;/p&gt;
&lt;h2&gt;
  
  
  Step two: Create CI/CD variables in GitLab
&lt;/h2&gt;

&lt;p&gt;The exporter job has a number of configurable options which are provided as CI/CD environment variables. You must supply a GitLab token and New Relic API key. &lt;/p&gt;

&lt;p&gt;You also need to provide a &lt;a href="https://docs.python.org/3/library/re.html"&gt;regular expression&lt;/a&gt; to define which projects and groups you’d like the exporter to monitor. Your GitLab token may have access to data you don’t want to export so take care to set this up so that only the projects and groups you are interested in are captured. &lt;/p&gt;
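&lt;p&gt;As an illustration, a pattern such as &lt;em&gt;^mygroup/.*&lt;/em&gt; (the group and project names here are hypothetical) would keep only projects under that namespace:&lt;/p&gt;

```python
import re

# Hypothetical pattern: capture only projects under the "mygroup" namespace.
project_filter = re.compile(r"^mygroup/.*")

projects = [
    "mygroup/backend-api",       # matched
    "mygroup/frontend",          # matched
    "othergroup/internal-tool",  # filtered out
]

monitored = [p for p in projects if project_filter.match(p)]
print(monitored)  # ['mygroup/backend-api', 'mygroup/frontend']
```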

&lt;p&gt;The other configurations are optional. Each configuration option should be set up as a pipeline environment variable. See the &lt;a href="https://github.com/newrelic-experimental/gitlab"&gt;README&lt;/a&gt; for a list of the configuration options available along with their default values. The defaults with no additional configuration will run the job every 60 minutes.&lt;/p&gt;

&lt;p&gt;To add or update variables in project settings:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to your project’s &lt;strong&gt;Settings &amp;gt; CI/CD&lt;/strong&gt; and expand the &lt;strong&gt;Variables&lt;/strong&gt; section.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Add variable&lt;/strong&gt; and add your &lt;em&gt;GLAB_TOKEN&lt;/em&gt; and &lt;em&gt;NEW_RELIC_API_KEY&lt;/em&gt;. Don’t forget to mask both variables.&lt;/li&gt;
&lt;li&gt;Add or update any optional variables as needed for your project. If you want to change the frequency of the job to a value other than 60 minutes, you can do so with the optional variable &lt;em&gt;GLAB_EXPORT_LAST_MINUTES&lt;/em&gt;.&lt;/li&gt;
&lt;/ol&gt;
&lt;h2&gt;
  
  
  Step three: Create your pipeline definition
&lt;/h2&gt;

&lt;p&gt;If you have created a new project for this exporter as recommended, you will need to create the pipeline definition file &lt;em&gt;.gitlab-ci.yml&lt;/em&gt; in the root of your project. Add the pipeline code shown in the next snippet.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;stages:
- new-relic-metrics-exporter
 new-relic-metrics-exporter:
 rules:
   - when: always
 image:
   name: docker.io/dpacheconr/gitlab-metrics-exporter:1.0.0
   pull_policy: always
 stage: new-relic-metrics-exporter
 script:
   - echo "Done"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step four: Test the pipeline
&lt;/h2&gt;

&lt;p&gt;Run the pipeline and confirm that the New Relic metrics exporter job completes with no errors by checking its job output.&lt;/p&gt;

&lt;p&gt;To execute a pipeline schedule manually, follow these steps in GitLab:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;On the top bar, select &lt;strong&gt;Main menu &amp;gt; Projects&lt;/strong&gt; and find your project.&lt;/li&gt;
&lt;li&gt;On the left sidebar, select &lt;strong&gt;CI/CD &amp;gt; Schedules&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Play&lt;/strong&gt; to run the desired pipeline.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In the New Relic &lt;a href="https://one.newrelic.com/data-exploration/query-builder?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;query builder&lt;/a&gt;, run the following &lt;a href="https://docs.newrelic.com/docs/query-your-data/nrql-new-relic-query-language/get-started/introduction-nrql-new-relics-query-language/?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;NRQL query&lt;/a&gt; to ensure that New Relic is receiving data from the exporter:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SELECT * FROM Log WHERE gitlab.source = 'gitlab-metrics-exporter'
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;
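&lt;p&gt;If you prefer to run the same NRQL outside the UI, New Relic’s NerdGraph (GraphQL) API accepts NRQL wrapped in a GraphQL query. The sketch below only assembles the JSON payload; the account ID is a placeholder, and actually sending it to &lt;em&gt;https://api.newrelic.com/graphql&lt;/em&gt; with your API key in the request headers is left to your HTTP client of choice:&lt;/p&gt;

```python
import json

ACCOUNT_ID = 1234567  # placeholder: your New Relic account ID
NRQL = "SELECT * FROM Log WHERE gitlab.source = 'gitlab-metrics-exporter'"

# NerdGraph wraps NRQL in a GraphQL query; the payload is plain JSON.
graphql = (
    "{ actor { account(id: %d) { nrql(query: %s) { results } } } }"
    % (ACCOUNT_ID, json.dumps(NRQL))
)
payload = json.dumps({"query": graphql})
print(payload)
```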

&lt;p&gt;Go to &lt;a href="https://one.newrelic.com/data-exploration/data-explorer/explorer?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;Metrics &amp;amp; Events&lt;/a&gt; in New Relic and find your GitLab metrics in the list of available metrics. These metrics will be prefixed with &lt;em&gt;gitlab&lt;/em&gt;. &lt;/p&gt;

&lt;p&gt;Metrics include pipeline duration, job duration, and queue duration. The next image shows the duration of GitLab jobs.&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mygKYvnr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6izx8hs9z7zkq16yxeb6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mygKYvnr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6izx8hs9z7zkq16yxeb6.jpg" alt="Image description" width="880" height="428"&gt;&lt;/a&gt;&lt;br&gt;
New Relic dashboard shows duration of GitLab jobs.&lt;/p&gt;

&lt;p&gt;Go to &lt;a href="https://one.newrelic.com/logger?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;Logs&lt;/a&gt; in New Relic to see your log events. Apply the filter &lt;em&gt;gitlab.source:"gitlab-metrics-exporter"&lt;/em&gt; to only see data sourced from &lt;em&gt;new-relic-metrics-exporter&lt;/em&gt;. You will be able to see log data on all of your GitLab resources, including your projects, deployments, and runners.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4b--n7v3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dhzcayiesesu6sywrjez.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4b--n7v3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dhzcayiesesu6sywrjez.jpg" alt="Image description" width="880" height="446"&gt;&lt;/a&gt;&lt;br&gt;
All log data showing in New Relic, including GitLab projects, deployments, and runners.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step five: Create a pipeline schedule
&lt;/h2&gt;

&lt;p&gt;Once you have confirmed the job is correctly sending data, you need to set up a schedule in GitLab to run the job regularly. The schedule pattern should match the value from &lt;em&gt;GLAB_EXPORT_LAST_MINUTES&lt;/em&gt;. As discussed earlier in this post, the default is 60 minutes. Setting values that are different from the value in &lt;em&gt;GLAB_EXPORT_LAST_MINUTES&lt;/em&gt; can lead to duplicate or missing data in New Relic.&lt;/p&gt;
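&lt;p&gt;For export windows that divide an hour evenly, the matching cron pattern can be derived mechanically. The helper below is an illustrative sketch under that assumption (the window is in minutes, mirroring &lt;em&gt;GLAB_EXPORT_LAST_MINUTES&lt;/em&gt;):&lt;/p&gt;

```python
def cron_for_window(minutes):
    """Return a cron pattern that fires once per export window.
    Only windows that divide an hour evenly are handled here."""
    if minutes == 60:
        return "0 * * * *"             # hourly, the documented default
    if 60 % minutes == 0:
        return f"*/{minutes} * * * *"  # e.g. every 30 or 15 minutes
    raise ValueError("pick a window that divides an hour evenly")

print(cron_for_window(60))  # 0 * * * *
print(cron_for_window(30))  # */30 * * * *
```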

&lt;p&gt;To add a pipeline schedule, do the following in GitLab:&lt;/p&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;On the top bar, select &lt;strong&gt;Main menu &amp;gt; Projects&lt;/strong&gt; and find your project.

&lt;p&gt;On the left sidebar, select &lt;strong&gt;CI/CD &amp;gt; Schedules&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Select &lt;strong&gt;New schedule&lt;/strong&gt; and add an interval pattern To run hourly, set the pattern to  &lt;em&gt;0 * * * *&lt;/em&gt;.&lt;br&gt;
&lt;/p&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  Step six: Import the quickstart dashboard
&lt;/h2&gt;

&lt;p&gt;We’ve built a starter dashboard to visualize your GitLab data and help you monitor your jobs and pipelines. You can install this dashboard using the &lt;a href="https://newrelic.com/instant-observability/gitlab?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;GitLab quickstart&lt;/a&gt; and then customize it as needed. To get this dashboard, follow these steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the &lt;a href="https://newrelic.com/instant-observability/gitlab?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;GitLab quickstart&lt;/a&gt; in New Relic Instant Observability and select &lt;strong&gt;Install now&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;Choose an account and select &lt;strong&gt;Begin installation&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;If you've already completed the earlier steps in this tutorial, select &lt;strong&gt;Done&lt;/strong&gt; to move on to the next step.&lt;/li&gt;
&lt;li&gt;The quickstart deploys the resources to your account. Select &lt;strong&gt;See your data&lt;/strong&gt; to get to the dashboard.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The following image shows the dashboard, which provides a summary of key metrics such as the number of runners available, the number of successful jobs, the top errors, and the job queue status.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zyH-EOGY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3msrj9nidro1o0xj0vfq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zyH-EOGY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3msrj9nidro1o0xj0vfq.png" alt="Image description" width="880" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Install the GitLab quickstart to get a dashboard template with prebuilt views to monitor your GitLab pipelines.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;After you're done, you'll have complete visibility into your GitLab estate, from high-level metrics to logs and distributed traces. The next step is to consider &lt;a href="https://docs.newrelic.com/docs/alerts-applied-intelligence/new-relic-alerts/get-started/your-first-nrql-condition?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;adding alerts&lt;/a&gt; for when certain thresholds are met, such as when jobs take too long or the number of pipeline failures gets too high. &lt;/p&gt;

&lt;p&gt;By sending your GitLab data to New Relic, you get observability into the building and deployment phases of the DevOps lifecycle, providing you with more insights into your DevOps practices and helping you shift left.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Read the full blog post at New Relic: &lt;a href="https://newrelic.com/blog/how-to-relic/monitor-gitlab-with-opentelemetry?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy-23q4-dev_to_post_monitorgitlab"&gt;Monitor GitLab with OpenTelemetry and New Relic&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Not an existing New Relic user? &lt;a href="https://newrelic.com/signup?utm_source=devto&amp;amp;utm_medium=community&amp;amp;utm_campaign=global-fy23-q4-dev_to_post_monitorgitlab"&gt;Sign up for a free account&lt;/a&gt; to get started!&lt;/em&gt; 👨‍💻&lt;/p&gt;

</description>
      <category>monitoring</category>
      <category>gitlab</category>
      <category>otel</category>
      <category>devops</category>
    </item>
  </channel>
</rss>
