<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Parseable</title>
    <description>The latest articles on Forem by Parseable (@parseable).</description>
    <link>https://forem.com/parseable</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F6108%2F426239b1-611a-4c35-acff-247323bfef5d.png</url>
      <title>Forem: Parseable</title>
      <link>https://forem.com/parseable</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/parseable"/>
    <language>en</language>
    <item>
      <title>Explain Parquet like I'm Five</title>
      <dc:creator>Nitish Tiwari</dc:creator>
      <pubDate>Mon, 12 Aug 2024 10:11:18 +0000</pubDate>
      <link>https://forem.com/parseable/explain-parquet-like-im-five-n9h</link>
      <guid>https://forem.com/parseable/explain-parquet-like-im-five-n9h</guid>
      <description>&lt;p&gt;Around early 2010's, as HDFS was at its peak, big data was still growing. It was evident that just the storage system is not enough to handle the data deluge. Innovations were needed to make data processing faster and efficient at the storage format itself.&lt;/p&gt;

&lt;p&gt;Doug Cutting, the creator of Hadoop, initially started the Trevni columnar storage format. Twitter and Cloudera then collaborated to improve on this work, resulting in the Parquet format. First released in 2013, Parquet is a columnar storage file format optimized for large-scale data processing and analytics. Apache Parquet has been a top-level Apache Software Foundation project since 2015.&lt;/p&gt;

&lt;p&gt;In this post we will explore the structure of Parquet files, their advantages, and their use cases. We will also discuss best practices for working with Parquet files, why we at Parseable chose Parquet as our primary storage format, and the challenges we face.&lt;/p&gt;

&lt;h3&gt;
  
  
  What is a columnar format?
&lt;/h3&gt;

&lt;p&gt;A column-oriented format differs structurally from row-oriented formats such as &lt;code&gt;CSV&lt;/code&gt; or &lt;code&gt;Avro&lt;/code&gt;. The easiest way to understand the difference is to start with a simple dataset in a row-oriented file such as a CSV, like this:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;ID&lt;/th&gt;
&lt;th&gt;First Name&lt;/th&gt;
&lt;th&gt;Last Name&lt;/th&gt;
&lt;th&gt;City&lt;/th&gt;
&lt;th&gt;Age&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;Alice&lt;/td&gt;
&lt;td&gt;Brown&lt;/td&gt;
&lt;td&gt;Boston&lt;/td&gt;
&lt;td&gt;29&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;Bob&lt;/td&gt;
&lt;td&gt;Smith&lt;/td&gt;
&lt;td&gt;New York&lt;/td&gt;
&lt;td&gt;35&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;Charlie&lt;/td&gt;
&lt;td&gt;Davis&lt;/td&gt;
&lt;td&gt;San Francisco&lt;/td&gt;
&lt;td&gt;42&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;When this data is stored in a CSV file, it is stored row-wise: all the columns of one record are stored together, followed by the next record, and so on. So the actual physical storage of the data would look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1,Alice,Brown,Boston,29;2,Bob,Smith,New York,35;3,Charlie,Davis,San Francisco,42;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In contrast, a columnar file stores data from a column together. So, the same data in a columnar file would be physically stored like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1,2,3;Alice,Bob,Charlie;Brown,Smith,Davis;Boston,New York,San Francisco;29,35,42;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is harder for humans to read, but the format is much more efficient for certain types of queries. For example, to compute the average age of all the people in the dataset, you only need to read the &lt;code&gt;Age&lt;/code&gt; column. In a row-based file, you would have to read the entire file to get the same information.&lt;/p&gt;
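&lt;p&gt;To make the difference concrete, here is a small pure-Python sketch (plain lists, not actual Parquet) showing why the columnar layout makes a query like "average age" cheaper: only one column's values need to be touched.&lt;/p&gt;

```python
# Pure-Python sketch of row-oriented vs column-oriented layouts.
rows = [
    (1, "Alice", "Brown", "Boston", 29),
    (2, "Bob", "Smith", "New York", 35),
    (3, "Charlie", "Davis", "San Francisco", 42),
]

# Row layout: averaging Age means scanning every full record.
avg_row = sum(r[4] for r in rows) / len(rows)

# Columnar layout: each column is stored contiguously.
columns = {
    "ID": [1, 2, 3],
    "First Name": ["Alice", "Bob", "Charlie"],
    "Last Name": ["Brown", "Smith", "Davis"],
    "City": ["Boston", "New York", "San Francisco"],
    "Age": [29, 35, 42],
}

# Only the Age column needs to be read.
avg_col = sum(columns["Age"]) / len(columns["Age"])

print(avg_row, avg_col)  # 35.333... for both
```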

&lt;h3&gt;
  
  
  Implications of columnar format
&lt;/h3&gt;

&lt;p&gt;The way data is organized in a columnar format has several important implications (both good and bad) for how data is stored, read, and processed. Let's examine some of them:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Writing the file: While writing a columnar file, the writer needs to know all the values of a column for all the rows being written. This means the writer needs to buffer the entire column in memory before writing it to disk, which can be a challenge for very large datasets. Parquet and other columnar formats added a way around this, which we'll review later.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Column-level configuration: Because a columnar file physically stores data from one column together, it can use different compression and encoding schemes for each column. For example, a column with many repeating values compresses better with RLE or dictionary encoding than with other schemes. This flexibility allows optimizing storage and query performance per column.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Random access: Because of the relatively complex format, data in a columnar file is difficult to access randomly. For example, to read a single row you would have to read the footer, then the metadata, then the column chunks, and finally the pages. You may be able to skip a few steps, but at minimum you'll need to read a whole page to get the data you need. This is in contrast to row-based files, where you can read a row directly.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Schema evolution: Schema evolution is the process of changing the schema of a dataset. In a columnar file, schema evolution is very hard, because the schema is stored in the file metadata and changing the schema means rewriting that metadata. This can be a challenge when working with large datasets.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Interaction with files: Columnar files are not human-readable: you can't open one in a text editor and read the data. You need a special tool or library, which can be a challenge when working with small datasets or when debugging. And since each columnar format has a different structure, each needs its own reader and writer implementations in every language before end users can interact with the files.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Parquet file format
&lt;/h3&gt;

&lt;p&gt;At a high level, a Parquet file is a collection of row groups, each of which contains column chunks. Each column chunk is further divided into pages. The file also contains metadata that helps in reading and decoding the data efficiently.&lt;/p&gt;

&lt;p&gt;Let's take a visual example. In the layout below, there are N columns in the table, split into M row groups. The file metadata contains the start locations of all the column chunks.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;4-byte magic number "PAR1"

  &amp;lt;Row Group 1&amp;gt;
    &amp;lt;Column 1 Chunk 1&amp;gt;
    &amp;lt;Column 2 Chunk 1&amp;gt;
    ...
    &amp;lt;Column N Chunk 1&amp;gt;
  &amp;lt;/Row Group 1&amp;gt;

  &amp;lt;Row Group 2&amp;gt;
    &amp;lt;Column 1 Chunk 2&amp;gt;
    &amp;lt;Column 2 Chunk 2&amp;gt;
    ...
    &amp;lt;Column N Chunk 2&amp;gt;
  &amp;lt;/Row Group 2&amp;gt;

  &amp;lt;Row Group M&amp;gt;
    &amp;lt;Column 1 Chunk M&amp;gt;
    &amp;lt;Column 2 Chunk M&amp;gt;
    ...
    &amp;lt;Column N Chunk M&amp;gt;
  &amp;lt;/Row Group M&amp;gt;

File Metadata
4-byte length in bytes of file metadata (little endian)
4-byte magic number "PAR1"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
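&lt;p&gt;The layout above can also be sketched from the reader's side. A hypothetical reader works backwards from the end of the file: check the trailing magic number, read the 4-byte little-endian metadata length, then locate the metadata. The bytes below are stand-ins; real Parquet metadata is Thrift-encoded.&lt;/p&gt;

```python
# Sketch of how a reader locates Parquet file metadata, using a fake
# byte string in place of a real file.
MAGIC = b"PAR1"

fake_metadata = b"thrift-encoded-file-metadata"
fake_file = (
    MAGIC
    + b"row-group-bytes"          # stand-in for the row groups
    + fake_metadata               # file metadata
    + len(fake_metadata).to_bytes(4, "little")  # metadata length
    + MAGIC
)

# A reader starts from the end: magic check, then metadata length.
assert fake_file[:4] == MAGIC and fake_file[-4:] == MAGIC
meta_len = int.from_bytes(fake_file[-8:-4], "little")
meta_start = len(fake_file) - 8 - meta_len
print(fake_file[meta_start:meta_start + meta_len])  # the metadata bytes
```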



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F90h737ldntgb9zmezuj6.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F90h737ldntgb9zmezuj6.gif" alt="Parquet File" width="601" height="478"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Row Groups
&lt;/h4&gt;

&lt;p&gt;A writer needs to know all the values of a column before writing it to disk. Why? Let's understand this better with an example, using our previous table.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;ID&lt;/th&gt;
&lt;th&gt;First Name&lt;/th&gt;
&lt;th&gt;Last Name&lt;/th&gt;
&lt;th&gt;City&lt;/th&gt;
&lt;th&gt;Age&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;Alice&lt;/td&gt;
&lt;td&gt;Brown&lt;/td&gt;
&lt;td&gt;Boston&lt;/td&gt;
&lt;td&gt;29&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;Bob&lt;/td&gt;
&lt;td&gt;Smith&lt;/td&gt;
&lt;td&gt;New York&lt;/td&gt;
&lt;td&gt;35&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;Charlie&lt;/td&gt;
&lt;td&gt;Davis&lt;/td&gt;
&lt;td&gt;San Francisco&lt;/td&gt;
&lt;td&gt;42&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;If you were to write this data in a columnar file, you would need to write all the &lt;code&gt;ID&lt;/code&gt; values together, then all the &lt;code&gt;First Name&lt;/code&gt; values together, and so on. The final file looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;1,2,3;Alice,Bob,Charlie;Brown,Smith,Davis;Boston,New York,San Francisco;29,35,42;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, imagine this file is not complete yet: say 5 more rows need to be appended after the first 3 rows are written to the Parquet file.&lt;/p&gt;

&lt;p&gt;One way to do this is to read the entire file, append the new rows, and write the entire file back to disk. This requires a lot of data shuffling to keep the columnar format intact. The other way is to keep all the rows in memory (until you have all 8 rows), write them to disk once, and disallow appends after the file is written.&lt;/p&gt;

&lt;p&gt;Now think of production workloads where writers need to write millions of rows worth of data to a Parquet file. Neither the shuffling of data nor the loading of all rows in memory is efficient. To solve this problem, Parquet introduced the concept of row groups.&lt;/p&gt;

&lt;p&gt;Row groups are essentially a way to divide the data into smaller chunks so they can be efficiently written to disk. Each row group contains a subset of the rows and all the columns for those rows. The writer only needs to hold the rows of one row group in memory to write them to disk, then wait for the next set of rows. When the file is complete, the writer writes the metadata to the file.&lt;/p&gt;
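&lt;p&gt;A minimal sketch of this buffering behavior, with an invented &lt;code&gt;RowGroupWriter&lt;/code&gt; class. (Real writers typically flush based on bytes, with row-group sizes often in the hundreds of megabytes, not on row counts.)&lt;/p&gt;

```python
# Toy writer that buffers rows and flushes them as "row groups" of a
# fixed size, mimicking how a Parquet writer avoids holding the whole
# dataset in memory.
class RowGroupWriter:
    def __init__(self, rows_per_group):
        self.rows_per_group = rows_per_group
        self.buffer = []
        self.row_groups = []  # stands in for bytes written to disk

    def write(self, row):
        self.buffer.append(row)
        if len(self.buffer) == self.rows_per_group:
            self.flush()

    def flush(self):
        if self.buffer:
            # Pivot buffered rows into columns before "writing".
            columns = list(zip(*self.buffer))
            self.row_groups.append(columns)
            self.buffer = []

w = RowGroupWriter(rows_per_group=3)
for i in range(8):
    w.write((i, f"name-{i}"))
w.flush()  # final partial group
print(len(w.row_groups))  # 3 groups, holding 3, 3, and 2 rows
```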

&lt;h5&gt;
  
  
  Column Chunks
&lt;/h5&gt;

&lt;p&gt;Within a Row Group, data is stored by columns. These column chunks are the core of Parquet's columnar storage format. By storing data this way, Parquet can efficiently read only the columns needed for a query.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Pages:&lt;/strong&gt; Each column chunk is further divided into pages. Pages are the smallest unit of data storage in a Parquet file. They contain the actual data and metadata necessary for decoding.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Metadata:&lt;/strong&gt; Parquet files contain metadata at several levels. File-level metadata includes the schema and the number of row groups. Row group metadata provides details about column chunks, such as their size and encoding. Column chunk metadata includes information about pages, and page headers carry what is needed to read and decode each page's data.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Supported Types
&lt;/h3&gt;

&lt;p&gt;The data types supported by a Parquet file are deliberately minimal, to reduce the complexity of implementing readers and writers. Logical types extend these physical types: they are annotations stored in the file metadata, and are documented in the format specification's &lt;code&gt;LogicalTypes.md&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Some of the supported data types include:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;BOOLEAN: 1 bit boolean&lt;/li&gt;
&lt;li&gt;INT32: 32-bit signed ints&lt;/li&gt;
&lt;li&gt;INT64: 64-bit signed ints&lt;/li&gt;
&lt;li&gt;INT96: 96-bit signed ints&lt;/li&gt;
&lt;li&gt;FLOAT: IEEE 32-bit floating point values&lt;/li&gt;
&lt;li&gt;DOUBLE: IEEE 64-bit floating point values&lt;/li&gt;
&lt;li&gt;BYTE_ARRAY: arbitrarily long byte arrays&lt;/li&gt;
&lt;li&gt;FIXED_LEN_BYTE_ARRAY: fixed length byte arrays&lt;/li&gt;
&lt;/ol&gt;
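&lt;p&gt;To see how logical types extend these physical types, consider a timestamp: it is physically just an INT64, and an annotation in the metadata tells readers how to interpret it. A sketch in plain Python (the annotation dict here is illustrative, not the actual Thrift structure):&lt;/p&gt;

```python
# A timestamp is stored physically as an INT64 (here, microseconds
# since the Unix epoch) plus a logical-type annotation telling
# readers how to decode it.
from datetime import datetime, timezone

ts = datetime(2024, 8, 12, 10, 11, 18, tzinfo=timezone.utc)

physical_value = int(ts.timestamp() * 1_000_000)  # stored as INT64
annotation = {"logical_type": "TIMESTAMP", "unit": "MICROS", "utc": True}

# A reader uses the annotation to decode the raw integer back.
decoded = datetime.fromtimestamp(physical_value / 1_000_000, tz=timezone.utc)
assert decoded == ts
print(physical_value)
```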

&lt;h3&gt;
  
  
  Building and Working with Parquet
&lt;/h3&gt;

&lt;p&gt;Working with Parquet files in different programming environments is straightforward thanks to the wide range of libraries and tools available, from Python and Java libraries to command-line utilities.&lt;/p&gt;

&lt;p&gt;For example, &lt;code&gt;parquet-tools&lt;/code&gt; in Python is a very popular CLI tool to interact with Parquet files.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip3 &lt;span class="nb"&gt;install &lt;/span&gt;parquet-tools
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Refer to the &lt;code&gt;parquet-tools&lt;/code&gt; &lt;a href="https://pypi.org/project/parquet-tools/" rel="noopener noreferrer"&gt;documentation&lt;/a&gt; for usage details.&lt;/p&gt;

&lt;h3&gt;
  
  
  Use Cases
&lt;/h3&gt;

&lt;p&gt;Parquet is particularly advantageous in scenarios that involve large-scale data processing and analytics.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Big Data Processing with Apache Spark:&lt;/strong&gt; Apache Spark comes with Parquet as the default storage format. Its compression and encoding schemes enable Spark to efficiently read and write large datasets, run fast analytical queries, and save storage space.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Data Warehousing:&lt;/strong&gt; Parquet's efficient storage and fast query performance make it excellent for storing and analyzing large volumes of data in data warehouses. Data warehousing solutions such as Amazon Redshift and Google BigQuery commonly support it.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Machine Learning:&lt;/strong&gt; Parquet files are the file format of choice for machine learning applications due to their efficient storage and retrieval of large datasets. Data scientists use fast data loading and processing to accelerate model training and evaluation.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Best Practices
&lt;/h3&gt;

&lt;p&gt;To optimize your work with Parquet files, consider the following best practices:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Column Pruning:&lt;/strong&gt; When querying data, select only the columns you need. This reduces the amount of data read from disk and improves query performance.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Predicate Pushdown:&lt;/strong&gt; Use predicates in your queries to filter data early in the read process. Parquet supports predicate pushdown, which helps minimize the amount of data read.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Compression and Encoding:&lt;/strong&gt; Based on your data characteristics, choose the proper compression and encoding schemes. Experiment with different options to find the best balance between storage space and query performance.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Batch Processing:&lt;/strong&gt; Process Parquet files in batches to take advantage of Parquet's efficient row group and page structures. This improves I/O performance and reduces overhead.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
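&lt;p&gt;Column pruning and predicate pushdown both rely on metadata Parquet keeps per column chunk, including min/max statistics. A toy model of predicate pushdown using such statistics (the dict layout is invented for illustration):&lt;/p&gt;

```python
# Parquet stores min/max values per column chunk, so a reader can skip
# whole row groups that cannot match a filter like "Age > 40".
row_groups = [
    {"stats": {"Age": {"min": 21, "max": 34}}, "ages": [29, 21, 34]},
    {"stats": {"Age": {"min": 35, "max": 42}}, "ages": [35, 42]},
    {"stats": {"Age": {"min": 18, "max": 25}}, "ages": [25, 18]},
]

def ages_over(groups, threshold):
    matches = []
    for g in groups:
        # Skip the group entirely if its max can't satisfy the predicate.
        if g["stats"]["Age"]["max"] > threshold:
            matches.extend(a for a in g["ages"] if a > threshold)
    return matches

print(ages_over(row_groups, 40))  # [42]; the first and third groups are never scanned
```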

</description>
      <category>database</category>
      <category>analytics</category>
      <category>logging</category>
      <category>observability</category>
    </item>
    <item>
      <title>What is Syslog? Components, benefits, and best practices</title>
      <dc:creator>Jen Wike Huger</dc:creator>
      <pubDate>Tue, 30 Jul 2024 13:33:59 +0000</pubDate>
      <link>https://forem.com/parseable/what-is-syslog-components-benefits-and-best-practices-4k4i</link>
      <guid>https://forem.com/parseable/what-is-syslog-components-benefits-and-best-practices-4k4i</guid>
      <description>&lt;p&gt;The team at Parseable is writing a ton of content around how to do logging in a lightweight and powerful way. One of their recent installments is about Syslog. &lt;a href="https://www.parseable.com/blog" rel="noopener noreferrer"&gt;Read their blog&lt;/a&gt; to learn more.&lt;/p&gt;




&lt;p&gt;Syslog is the abbreviation for System Logging Protocol. It is a logging method that allows monitoring and managing logs from different parts of the server or system. The primary use of Syslog data is to trace back errors in case of a failure or event. It can also be used to get information about your system's overall performance.&lt;/p&gt;

&lt;p&gt;This Syslog guide will help you understand how it works and its functionality. Let's begin by understanding what it is and how it evolved.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj61vg4ubgp0joabs7msv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj61vg4ubgp0joabs7msv.png" alt="Image description" width="800" height="435"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  History and Evolution of Syslog
&lt;/h2&gt;

&lt;p&gt;The journey of Syslog began in the early days of the Unix operating system, where it served as a simple protocol for capturing system messages. Eric Allman created the first instance of Syslog in the 1980s, and it soon became a must-have tool for Unix administrators. The growing need for a standard logging protocol eventually led to the publication of RFC 3164, the first Syslog standard. It was later superseded by RFC 5424, which established a more solid yet flexible basis for Syslog message formats.&lt;/p&gt;

&lt;h2&gt;
  
  
  3 Core Components of Syslog
&lt;/h2&gt;

&lt;p&gt;At its core Syslog consists of several key components and concepts:&lt;/p&gt;

&lt;p&gt;Syslog Daemon: The Syslog daemon is responsible for receiving, processing, and forwarding Syslog messages. Examples include syslogd, rsyslog, and syslog-ng. The daemon is the central hub for logging activity, capturing and appropriately handling all system events and log data.&lt;/p&gt;

&lt;p&gt;Syslog Messages: A typical Syslog message is made up of three parts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The priority (PRI)&lt;/li&gt;
&lt;li&gt;Header&lt;/li&gt;
&lt;li&gt;Message (MSG)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The PRI combines a facility code and a severity, indicating the message's source and importance. The header includes a timestamp and the hostname or IP address, while the MSG contains the log message or event data.&lt;/p&gt;
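&lt;p&gt;Per the Syslog RFCs, the PRI value is computed as facility × 8 + severity. A small sketch with a few of the standard codes:&lt;/p&gt;

```python
# PRI = facility * 8 + severity (a subset of the standard code tables).
FACILITIES = {"kern": 0, "user": 1, "mail": 2, "daemon": 3}
SEVERITIES = {"emergency": 0, "alert": 1, "critical": 2, "error": 3,
              "warning": 4, "notice": 5, "info": 6, "debug": 7}

def pri(facility, severity):
    return FACILITIES[facility] * 8 + SEVERITIES[severity]

def decode_pri(value):
    return divmod(value, 8)  # (facility code, severity code)

print(pri("user", "info"))  # 14
print(decode_pri(34))       # (4, 2): facility 4, severity critical
```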

&lt;p&gt;Facilities and Severities: Syslog categorizes messages by facility, such as kern (kernel messages), user (user-level messages), and mail (mail system). Severities range from emergency (system unusable) to debug (debug messages), allowing administrators to filter and prioritize log data effectively.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Syslog Works
&lt;/h2&gt;

&lt;p&gt;Syslog operates through a straightforward yet powerful workflow:&lt;/p&gt;

&lt;p&gt;Message Flow: Applications generate Syslog messages, which are transported over the network using protocols like UDP (User Datagram Protocol) or TCP (Transmission Control Protocol). Upon arrival, the Syslog or log server processes and stores these messages according to configured rules.&lt;/p&gt;
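&lt;p&gt;A minimal example of emitting a Syslog message over UDP with Python's standard library. The collector address &lt;code&gt;localhost:514&lt;/code&gt; is an assumption for illustration; UDP is fire-and-forget, so the datagram is sent even if no collector is listening.&lt;/p&gt;

```python
# Send a syslog message over UDP using the stdlib logging module.
import logging
import logging.handlers

handler = logging.handlers.SysLogHandler(
    address=("localhost", 514),  # assumed collector; default syslog UDP port
    facility=logging.handlers.SysLogHandler.LOG_USER,
)
logger = logging.getLogger("demo-app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("application started")  # emitted as a user.info message
```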

&lt;p&gt;Syslog Formats: The most common standard Syslog message formats are RFC 3164 and RFC 5424. RFC 3164 documents the original Syslog format and is widely supported, while RFC 5424 offers enhanced features including structured data and better timestamp precision.&lt;/p&gt;

&lt;h2&gt;
  
  
  Benefits of Using Syslog
&lt;/h2&gt;

&lt;p&gt;Several key benefits drive Syslog's widespread adoption.&lt;/p&gt;

&lt;p&gt;Centralized Logging: By aggregating logs from multiple sources into a central location, Syslog simplifies monitoring and management. This centralized approach enhances visibility and enables more efficient analysis of log events.&lt;/p&gt;

&lt;p&gt;Real-time Monitoring: Syslog supports real-time alerting and monitoring, allowing administrators to detect and respond to issues promptly. This capability is crucial for maintaining system uptime and security across network devices.&lt;/p&gt;

&lt;p&gt;Compliance and Auditing: Syslog helps organizations meet regulatory requirements by providing a reliable and tamper-evident logging mechanism. It facilitates auditing and records all critical events in the log file.&lt;/p&gt;

&lt;p&gt;Troubleshooting: Syslog's detailed and structured logs are invaluable for troubleshooting. By examining log messages, administrators can more efficiently identify and resolve issues, minimizing downtime and disruption.&lt;/p&gt;

&lt;h2&gt;
  
  
  Syslog Configuration and Usage
&lt;/h2&gt;

&lt;p&gt;Configuring Syslog involves setting up the Syslog daemon and integrating it with various Syslog tools.&lt;/p&gt;

&lt;h3&gt;
  
  
  Configuring the Syslog Daemon
&lt;/h3&gt;

&lt;p&gt;Basic configuration typically involves specifying log file locations, setting log rotation policies, and defining message filtering rules. Advanced configurations can include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Forwarding logs to a central Syslog server&lt;/li&gt;
&lt;li&gt;Applying TLS/SSL for secure transmission&lt;/li&gt;
&lt;li&gt;Customizing log formats&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Common Syslog Tools
&lt;/h3&gt;

&lt;p&gt;Several tools extend Syslog's capabilities. One of the best known is rsyslog, valued for its high performance and flexibility and widely preferred in modern Linux environments. Syslog-ng offers advanced features like content-based filtering and high-availability clustering. Integrating Syslog with a log management system such as Parseable adds powerful analytics and visualization capabilities.&lt;/p&gt;

&lt;h2&gt;
  
  
  Best Practices for Syslog
&lt;/h2&gt;

&lt;p&gt;To maximize Syslog's effectiveness, administrators should follow these best practices:&lt;/p&gt;

&lt;p&gt;Security: Ensure secure transmission of Syslog messages using TLS/SSL. This prevents unauthorized access and protects the integrity of log data.&lt;/p&gt;

&lt;p&gt;Performance: Optimize Syslog performance by fine-tuning buffer sizes, configuring appropriate log rotation policies, and using efficient storage solutions.&lt;/p&gt;

&lt;p&gt;Scalability: Plan for scalability by distributing logging workloads across multiple servers and using load-balancing techniques. This ensures that Syslog can handle large volumes of logs in high-demand environments.&lt;/p&gt;

&lt;p&gt;Maintenance: Regular maintenance tasks include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Monitoring disk usage&lt;/li&gt;
&lt;li&gt;Checking for log file corruption&lt;/li&gt;
&lt;li&gt;Updating Syslog configurations as needed&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This proactive approach helps maintain a healthy logging infrastructure.&lt;/p&gt;

&lt;h2&gt;
  
  
  Challenges and Limitations of Syslog
&lt;/h2&gt;

&lt;p&gt;Despite its benefits, Syslog faces several challenges:&lt;/p&gt;

&lt;p&gt;Reliability Issues: UDP-based transport, while fast, can be unreliable, leading to potential message loss. Using TCP or implementing reliable delivery mechanisms can mitigate this risk.&lt;/p&gt;

&lt;p&gt;Scalability Concerns: Handling large volumes of logs requires careful planning and resource allocation. Implementing distributed logging solutions and efficient indexing can help address scalability challenges.&lt;/p&gt;

&lt;p&gt;Complexity: Managing complex Syslog configurations can be daunting. Clear documentation, regular audits, and automation tools can simplify configuration management.&lt;/p&gt;

&lt;h2&gt;
  
  
  Future of Syslog
&lt;/h2&gt;

&lt;p&gt;Syslog continues to evolve, adapting to modern IT environments:&lt;/p&gt;

&lt;p&gt;Emerging Trends: New developments in logging technology, such as structured logging and enhanced metadata support, are extending Syslog's capabilities and driving improvements in log analysis and correlation.&lt;/p&gt;

&lt;p&gt;Integration with Modern Systems: Syslog increasingly integrates with cloud-native and containerized environments, including support for syslog logging drivers in Docker and Kubernetes, enabling seamless logging across diverse infrastructures.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Syslog remains a fundamental tool for system logging, offering numerous benefits for IT professionals. By understanding its history, core concepts, and best practices, administrators can leverage Syslog to enhance monitoring, compliance, and troubleshooting. As the logging landscape evolves, Syslog adapts with it, ensuring its relevance in modern computing environments.&lt;/p&gt;

</description>
      <category>kubernetes</category>
      <category>architecture</category>
      <category>database</category>
      <category>rust</category>
    </item>
    <item>
      <title>How to set up a CDC pipeline to capture and analyze real-time database changes with Parseable</title>
      <dc:creator>Jen Wike Huger</dc:creator>
      <pubDate>Mon, 22 Jul 2024 16:49:11 +0000</pubDate>
      <link>https://forem.com/parseable/how-to-set-up-a-cdc-pipeline-to-capture-and-analyze-real-time-database-changes-with-parseable-574g</link>
      <guid>https://forem.com/parseable/how-to-set-up-a-cdc-pipeline-to-capture-and-analyze-real-time-database-changes-with-parseable-574g</guid>
      <description>&lt;p&gt;Databases are critical for any application. Data constantly gets updated, inserted, and deleted. In most of cases it is important for the business to keep track of these changes due to security concerns, auditing requirements, and to keep other relevant systems up to date.&lt;/p&gt;

&lt;p&gt;Change Data Capture (CDC) has gained popularity, precisely to address this problem. CDC is a technique used to track all changes in a database and capture them in destination systems. Debezium is a popular CDC tool that leverages database logs as the source of truth, and streams the changes to Kafka and compatible systems like Redpanda.&lt;/p&gt;

&lt;p&gt;The key thing to note here is that while databases are typically mutable, the changes captured by the CDC system should be immutable. Only then do users have a system of record that can be trusted. Parseable is built for immutable logs and streams of events, making it a great fit as a CDC target. Parseable's design allows for long-term retention and querying of logs, and it provides a rich set of tools for analysis and monitoring.&lt;/p&gt;

&lt;p&gt;In this tutorial, we use Redpanda, PostgreSQL, and Debezium to ingest CDC events into Parseable.&lt;/p&gt;
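&lt;p&gt;For orientation before the full tutorial: a Debezium PostgreSQL source connector is registered with a JSON config along these lines. The hostnames, credentials, and names below are placeholders; refer to the tutorial for working values.&lt;/p&gt;

```json
{
  "name": "postgres-cdc-connector",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "plugin.name": "pgoutput",
    "database.hostname": "postgres",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "example-password",
    "database.dbname": "appdb",
    "topic.prefix": "cdc"
  }
}
```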

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4fgdllkijtvueoo43srg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4fgdllkijtvueoo43srg.png" alt="CDC pipeline with Redpanda, PostgreSQL, Debezium, and Parseable" width="800" height="527"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Read more in the &lt;a href="https://www.parseable.com/blog/change-data-capture-with-parseable" rel="noopener noreferrer"&gt;full tutorial&lt;/a&gt; to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set up Docker Compose&lt;/li&gt;
&lt;li&gt;Set up Debezium Connect&lt;/li&gt;
&lt;li&gt;Set up Parseable&lt;/li&gt;
&lt;li&gt;Set up Redpanda Connect&lt;/li&gt;
&lt;li&gt;Test the CDC pipeline&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>devops</category>
      <category>programming</category>
      <category>productivity</category>
      <category>rust</category>
    </item>
    <item>
      <title>Metadata to actionable insights in Grafana: How to view Parseable metrics</title>
      <dc:creator>Jen Wike Huger</dc:creator>
      <pubDate>Thu, 11 Jul 2024 15:55:59 +0000</pubDate>
      <link>https://forem.com/parseable/metadata-to-actionable-insights-in-grafana-how-to-view-parseable-metrics-3oa4</link>
      <guid>https://forem.com/parseable/metadata-to-actionable-insights-in-grafana-how-to-view-parseable-metrics-3oa4</guid>
      <description>&lt;p&gt;Parseable deployments in the wild are handling larger and larger volumes of logs, so we needed a way to enable users to monitor their Parseable instances.&lt;/p&gt;

&lt;p&gt;Typically this would mean setting up Prometheus to capture Parseable ingest and query node metrics and visualize those metrics on a Grafana dashboard. We added &lt;a href="https://www.parseable.com/docs/integrations/prometheus-metrics-and-configuration" rel="noopener noreferrer"&gt;Prometheus metrics support in Parseable&lt;/a&gt; to enable this use case.&lt;/p&gt;

&lt;p&gt;But we wanted a simpler, self-contained approach that allows users to monitor their Parseable instances without needing to set up Prometheus.&lt;/p&gt;

&lt;p&gt;This led us to store the Parseable server's internal metrics in a special log stream called &lt;code&gt;pmeta&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;This stream keeps track of important information about all the ingestors in the cluster: each ingestor's URL, its commit ID, the number of events it has processed, and its staging file location and size.&lt;/p&gt;

&lt;p&gt;Here is a sample event from the &lt;code&gt;pmeta&lt;/code&gt; stream.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "address": "http://ec2-3-136-154-35.us-east-2.compute.amazonaws.com:443/",
  "cache": "Disabled",
  "commit": "d6116e8",
  "event_time": "2024-07-02T09:49:05.125255417",
  "event_type": "cluster-metrics",
  "p_metadata": "",
  "p_tags": "",
  "p_timestamp": "2024-07-02T09:49:05.540",
  "parseable_deleted_events_ingested": 35095373,
  "parseable_deleted_events_ingested_size": 10742847195,
  "parseable_deleted_storage_size_data": 1549123461,
  "parseable_deleted_storage_size_staging": 0,
  "parseable_events_ingested": 3350101,
  "parseable_events_ingested_size": 1054739567,
  "parseable_lifetime_events_ingested": 38445474,
  "parseable_lifetime_events_ingested_size": 11797586762,
  "parseable_lifetime_storage_size_data": 1732950386,
  "parseable_lifetime_storage_size_staging": 0,
  "parseable_staging_files": 2,
  "parseable_storage_size_data": 183826925,
  "parseable_storage_size_staging": 0,
  "process_resident_memory_bytes": 113250304,
  "staging": "/home/ubuntu/parseable/staging"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
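&lt;p&gt;A few useful health indicators can be derived directly from these fields. Here is a minimal Python sketch using the sample values above; the field names come from the event shown, while the derived metric names are our own:&lt;/p&gt;

```python
# Derive simple health indicators from a pmeta event
# (field names as in the sample event above).

def derive_metrics(event):
    """Return a few human-friendly indicators from one pmeta event."""
    ingested = event["parseable_events_ingested"]
    raw_bytes = event["parseable_events_ingested_size"]
    stored_bytes = event["parseable_storage_size_data"]
    return {
        # average size of one raw ingested event, in bytes
        "avg_event_bytes": raw_bytes / ingested,
        # how much smaller the columnar data on disk is vs. raw ingested bytes
        "compression_ratio": raw_bytes / stored_bytes,
        # files still waiting in staging (should stay small)
        "staging_files": event["parseable_staging_files"],
    }

sample = {
    "parseable_events_ingested": 3350101,
    "parseable_events_ingested_size": 1054739567,
    "parseable_storage_size_data": 183826925,
    "parseable_staging_files": 2,
}
print(derive_metrics(sample))
```

&lt;p&gt;For the sample event this works out to roughly 315 bytes per raw event, compressed about 5.7x on disk.&lt;/p&gt;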



&lt;p&gt;Let's look at how to visualize this data in a Grafana dashboard.&lt;/p&gt;

&lt;p&gt;We'll start by setting up Parseable to collect this &lt;code&gt;pmeta&lt;/code&gt; data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started with Parseable
&lt;/h2&gt;

&lt;p&gt;Parseable is a cloud-native log management solution that efficiently handles large-scale log data. By integrating Parseable with your infrastructure, you can streamline log ingestion, storage, and querying, making it an essential tool for observability and monitoring. &lt;/p&gt;

&lt;p&gt;You can &lt;a href="https://www.parseable.com/docs/installation" rel="noopener noreferrer"&gt;choose the right installation process&lt;/a&gt; for you.&lt;/p&gt;

&lt;p&gt;To quickly install Parseable using Docker, open the terminal and type the command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker run -p  8000:8000 \  
-v /tmp/parseable/data:/parseable/data \  
-v /tmp/parseable/staging:/parseable/staging \  
-e  P_FS_DIR=/parseable/data \  
-e  P_STAGING_DIR=/parseable/staging \  
containers.parseable.com/parseable/parseable:latest \  
parseable local-store

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can verify the installation by navigating to &lt;a href="http://localhost:8000" rel="noopener noreferrer"&gt;http://localhost:8000&lt;/a&gt; in your web browser. Log in using the default credentials (admin/admin) and explore the dashboard to ensure everything is set up correctly.&lt;/p&gt;
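&lt;p&gt;You can also check the instance from a script. The sketch below assumes Parseable's liveness endpoint at &lt;code&gt;/api/v1/liveness&lt;/code&gt;; adjust the host and port to your setup:&lt;/p&gt;

```python
import urllib.request

def liveness_url(base_url):
    """Build the liveness probe URL from a Parseable base URL."""
    return base_url.rstrip("/") + "/api/v1/liveness"

def parseable_alive(base_url, timeout=5):
    """Return True if the liveness endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(liveness_url(base_url), timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

# Once the container is up:
# print(parseable_alive("http://localhost:8000"))
```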

&lt;p&gt;Finally, we need to create a log stream before we can send events. A log stream is like a project that stores all your ingested logs. For this tutorial, we'll use a log stream named &lt;code&gt;pmeta&lt;/code&gt;. To create a log stream, log in to your Parseable instance and use the button at the top right.&lt;/p&gt;

&lt;p&gt;Note that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;pmeta&lt;/code&gt; is automatically created and populated in a Parseable cluster (high availability setup)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;pmeta&lt;/code&gt; is not created in a single node setup&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;if you're not interested in this data, you can set the retention to 1 day for the &lt;code&gt;pmeta&lt;/code&gt; stream to avoid storing this data&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Read more about the &lt;code&gt;pmeta&lt;/code&gt; stream in the &lt;a href="https://www.parseable.com/docs/concepts/concepts#internal-log-stream" rel="noopener noreferrer"&gt;Parseable documentation&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Install Grafana and the Parseable plugin
&lt;/h2&gt;

&lt;p&gt;Grafana helps you collect, correlate, and visualize data with beautiful dashboards. We'll connect the Parseable instance with Grafana via the &lt;a href="https://github.com/parseablehq/parseable-datasource" rel="noopener noreferrer"&gt;Parseable Grafana datasource&lt;/a&gt;. This plugin allows you to query Parseable data using SQL and visualize it in Grafana.&lt;/p&gt;

&lt;p&gt;If you want to self-host Grafana, you can host it on a dedicated cloud instance or locally, depending on your requirements. Follow the official Grafana &lt;a href="https://grafana.com/docs/grafana/latest/setup-grafana/installation/" rel="noopener noreferrer"&gt;installation guide&lt;/a&gt; for more information.&lt;/p&gt;

&lt;p&gt;Once the Grafana instance is set up, let's quickly install the Parseable plugin and connect our Parseable instance to Grafana.&lt;/p&gt;

&lt;p&gt;Log in to your Grafana instance and navigate to the administration settings in the left-hand side menu.&lt;br&gt;
Click on Plugins and data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fngn5xjlks0vw4ew7suke.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fngn5xjlks0vw4ew7suke.png" alt="Image description" width="800" height="285"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Open the Plugins page and search for &lt;code&gt;Parseable&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Install the plugin and then click on &lt;code&gt;Add New Datasource&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;From the datasource page, fill in the following details:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxzz4m54xrcy70xhn8u9o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxzz4m54xrcy70xhn8u9o.png" alt="Image description" width="800" height="759"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the URL field, type the URL of your Parseable query instance. For example, &lt;code&gt;https://demo.parseable.com:443&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Under the Auth Section, switch to the &lt;code&gt;Basic Auth&lt;/code&gt; setting.&lt;/p&gt;

&lt;p&gt;In the &lt;code&gt;Basic Auth Details&lt;/code&gt; section, enter your Parseable username and password.&lt;/p&gt;

&lt;p&gt;Finally, click on &lt;code&gt;Save &amp;amp; Test&lt;/code&gt; to verify the connection.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting up the Grafana dashboard
&lt;/h2&gt;

&lt;p&gt;We'll now use the Parseable Data Source to query data from the Parseable Meta Stream (&lt;code&gt;pmeta&lt;/code&gt;).&lt;/p&gt;

&lt;p&gt;Navigate to the Dashboard section and click on &lt;code&gt;New&lt;/code&gt; and select &lt;code&gt;Import Dashboard&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5h6wj7iswzsiiik2gmjd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5h6wj7iswzsiiik2gmjd.png" alt="Image description" width="800" height="836"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Enter the Dashboard ID as &lt;code&gt;21472&lt;/code&gt; and click on Load.&lt;/p&gt;

&lt;p&gt;Set the data source to Parseable-DataSource by selecting it from the dropdown menu.&lt;/p&gt;

&lt;p&gt;Once done, click on &lt;code&gt;Import&lt;/code&gt;. It should take a few seconds to load, and then the dashboard will be created.&lt;/p&gt;

&lt;p&gt;We query data from Parseable using SQL. To learn more about querying data in Parseable, you can refer to &lt;a href="https://www.parseable.com/docs/concepts/query" rel="noopener noreferrer"&gt;our documentation&lt;/a&gt;.&lt;/p&gt;
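&lt;p&gt;As a sketch of what the dashboard panels do under the hood, here is how you might run one of these SQL queries yourself against the &lt;code&gt;pmeta&lt;/code&gt; stream. This assumes Parseable's query endpoint at &lt;code&gt;POST /api/v1/query&lt;/code&gt; with a JSON body of &lt;code&gt;query&lt;/code&gt;, &lt;code&gt;startTime&lt;/code&gt;, and &lt;code&gt;endTime&lt;/code&gt;; check the query docs linked above for the exact parameters and time formats:&lt;/p&gt;

```python
import base64
import json
import urllib.request

def build_query(sql, start="10m", end="now"):
    """Payload for the query API; relative times like "10m" are assumed
    to be accepted -- see the query docs for the exact formats."""
    return {"query": sql, "startTime": start, "endTime": end}

def run_query(base_url, user, password, payload, timeout=10):
    """POST the query with basic auth and return the parsed JSON response."""
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        base_url.rstrip("/") + "/api/v1/query",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)

payload = build_query(
    "SELECT address, parseable_events_ingested FROM pmeta LIMIT 5"
)
# rows = run_query("https://demo.parseable.com", "admin", "admin", payload)
print(payload)
```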

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm3cuasf3flktdngacbs3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm3cuasf3flktdngacbs3.png" alt="Image description" width="800" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Now, you've learned how to create a Grafana dashboard using Parseable's &lt;code&gt;pmeta&lt;/code&gt; stream. &lt;/p&gt;

&lt;p&gt;This dashboard provides crucial insights into your Parseable instance's performance, and we encourage you to customize this dashboard further to fit your specific needs.&lt;/p&gt;

&lt;p&gt;🏃🏽‍♀️ To see Parseable in action, &lt;a href="https://www.youtube.com/watch?v=2Eg_Keqt1I0&amp;amp;t=86s" rel="noopener noreferrer"&gt;watch this YouTube video&lt;/a&gt;. Get started with Parseable &lt;a href="https://www.parseable.com/docs/docker-quick-start" rel="noopener noreferrer"&gt;in just a single command&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;💬 For technical questions or to share your implementations, join our &lt;a href="https://logg.ing/community" rel="noopener noreferrer"&gt;developer community on Slack&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;📝 Read more from &lt;a href="https://www.parseable.com/blog" rel="noopener noreferrer"&gt;the Parseable blog&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Ready to enhance your observability? Start using Parseable and Grafana today to unlock the full potential of your log data.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Ingesting Data to Parseable Using Pandas</title>
      <dc:creator>Jen Wike Huger</dc:creator>
      <pubDate>Wed, 26 Jun 2024 18:40:04 +0000</pubDate>
      <link>https://forem.com/parseable/ingesting-data-to-parseable-using-pandas-pm1</link>
      <guid>https://forem.com/parseable/ingesting-data-to-parseable-using-pandas-pm1</guid>
      <description>&lt;h2&gt;
  
  
  A Step-by-Step Guide
&lt;/h2&gt;

&lt;p&gt;Managing and deriving insights from vast amounts of historical data is not just a challenge but a necessity. Imagine your team grappling with numerous log files, trying to pinpoint issues. Because the logs are stored as plain files, searching through them is slow and inefficient. This scenario is all too familiar for many developers.&lt;/p&gt;

&lt;p&gt;Enter Parseable, a powerful solution to analyze your application logs. By integrating with pandas, the renowned Python library for data analysis, Parseable offers a seamless way to ingest and leverage historical data without the need to discard valuable logs.&lt;/p&gt;

&lt;p&gt;In this post, we explore how Parseable can revolutionize your data management strategy, enabling you to unlock actionable insights from both current and archived log data effortlessly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Requirements
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Python installed on your system&lt;/li&gt;
&lt;li&gt;Pandas library&lt;/li&gt;
&lt;li&gt;Requests library&lt;/li&gt;
&lt;li&gt;A CSV file to be ingested&lt;/li&gt;
&lt;li&gt;Access to the Parseable API&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  The CSV File
&lt;/h3&gt;

&lt;p&gt;Our code example is based on a Kaggle dataset; we've used a CSV file named &lt;code&gt;e-shop_clothing_2008.csv&lt;/code&gt;. Feel free to use your own dataset to follow along. First, ensure your CSV file is formatted correctly and accessible from the script's directory.&lt;/p&gt;
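&lt;p&gt;A quick way to confirm the file is formatted correctly is to read just the first few rows and inspect the columns. The sketch below uses a tiny inline sample (and a helper name of our own); point it at your real file instead:&lt;/p&gt;

```python
import io
import pandas as pd

def preview_csv(source, delimiter=";", nrows=5):
    """Read only the first rows to confirm the delimiter and column names."""
    df = pd.read_csv(source, delimiter=delimiter, nrows=nrows)
    return df.columns.tolist(), len(df)

# With your real file: preview_csv("e-shop_clothing_2008.csv")
cols, n = preview_csv(io.StringIO("year;month;day\n2008;4;1\n2008;4;2\n"))
print(cols, n)
```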

&lt;h3&gt;
  
  
  The Parseable API
&lt;/h3&gt;

&lt;p&gt;Next, we'll interact with the Parseable API to send our data. Here we're using the demo Parseable instance. Before sending any data, please ensure you have entered the correct endpoint and credentials:&lt;/p&gt;

&lt;p&gt;Endpoint: &lt;a href="https://demo.technocube.in/api/v1/ingest" rel="noopener noreferrer"&gt;https://demo.technocube.in/api/v1/ingest&lt;/a&gt;&lt;br&gt;
Username: admin&lt;br&gt;
Password: admin&lt;/p&gt;
&lt;h3&gt;
  
  
  Writing the Script
&lt;/h3&gt;

&lt;p&gt;Here’s a Python script that reads the CSV file in chunks and sends each chunk to the Parseable API (in a stream called testclickstream). Replace the CSV file path, Parseable endpoint, and authentication credentials with your own.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import pandas as pd
import requests
import json

# Define the CSV file path
csv_file_path = 'e-shop_clothing_2008.csv'

# Define the Parseable API endpoint
parseable_endpoint = 'https://demo.technocube.in/api/v1/ingest'

# Basic authentication credentials
username = 'admin'
password = 'admin'

headers = {
    'Content-Type': 'application/json',
    'X-P-Stream': 'testclickstream'
}

# Read and process the CSV file in chunks
chunk_size = 100  # Number of rows per chunk
for chunk in pd.read_csv(csv_file_path, chunksize=chunk_size, delimiter=';'):
    # Convert the chunk DataFrame to a list of dictionaries
    json_data = chunk.to_dict(orient='records')

    # Convert list of dictionaries to JSON string
    json_str = json.dumps(json_data)

    # Send the JSON data to Parseable
    response = requests.post(parseable_endpoint, auth=(username, password), headers=headers, data=json_str)

    # Check the response
    if response.status_code == 200:
        print('Chunk sent successfully!')
    else:
        print(f'Failed to send chunk. Status code: {response.status_code}')
        print(response.text)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Explanation of the Script
&lt;/h3&gt;

&lt;p&gt;We have divided the code's flow into six steps to help you understand its function. This will also help you understand how the Pandas library and Parseable work together.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Importing Libraries&lt;/strong&gt;&lt;br&gt;
The script starts by importing Pandas for data manipulation, Requests for HTTP requests, and JSON for handling JSON data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Defining File Path and Endpoint&lt;/strong&gt;&lt;br&gt;
Specify the path to the CSV file and the Parseable API endpoint. Replace these with your actual file path and API endpoint.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Authentication and Headers&lt;/strong&gt;&lt;br&gt;
Set up basic authentication credentials and headers. The X-P-Stream header indicates the stream or collection name.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reading CSV in Chunks&lt;/strong&gt;&lt;br&gt;
Use pd.read_csv to read the CSV file in chunks of 100 rows. The chunksize parameter lets the script handle large files efficiently without memory issues.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Converting Data to JSON&lt;/strong&gt;&lt;br&gt;
Convert each chunk to a list of dictionaries using to_dict with orient='records', then to a JSON string.&lt;/p&gt;
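&lt;p&gt;For instance, a two-row chunk becomes a JSON array with one object per row:&lt;/p&gt;

```python
import json
import pandas as pd

# A tiny stand-in for one chunk of the CSV above
chunk = pd.DataFrame({"year": [2008, 2008], "month": [4, 4]})

records = chunk.to_dict(orient="records")
json_str = json.dumps(records)
print(json_str)
```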

&lt;p&gt;&lt;strong&gt;Sending Data to Parseable&lt;/strong&gt;&lt;br&gt;
Send the JSON data to the Parseable API using a POST request. Check the response status code to ensure successful ingestion. Print any errors.&lt;/p&gt;

&lt;h3&gt;
  
  
  Handling Errors and Retries
&lt;/h3&gt;

&lt;p&gt;In real-world scenarios, network issues or server errors might prevent successful data ingestion. To make the script more robust, implement error handling and retries.&lt;/p&gt;
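&lt;p&gt;One simple way to add retries is a small wrapper around the POST call. The helper name and backoff policy below are our own, not part of the original script:&lt;/p&gt;

```python
import time

def send_with_retries(send, payload, attempts=3, backoff=2.0):
    """Call send(payload) until it returns HTTP 200, sleeping
    backoff, 2*backoff, ... seconds between failed attempts.
    `send` should return a status code (e.g. wrap requests.post)."""
    for attempt in range(attempts):
        if send(payload) == 200:
            return True
        if attempt != attempts - 1:
            time.sleep(backoff * (attempt + 1))
    return False

# In the ingestion loop above, the POST could be wrapped like this:
# ok = send_with_retries(
#     lambda body: requests.post(parseable_endpoint, auth=(username, password),
#                                headers=headers, data=body).status_code,
#     json_str,
# )
```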

&lt;h3&gt;
  
  
  Next Steps
&lt;/h3&gt;

&lt;p&gt;Ingesting data into Parseable using Pandas is straightforward and efficient. By reading data in chunks and converting it to JSON, we can seamlessly send it to the Parseable API.&lt;/p&gt;

&lt;p&gt;This script serves as a foundation and is customizable to your specific needs, including sophisticated error handling, logging, or parallel processing.&lt;/p&gt;

&lt;p&gt;Follow this guide to integrate Pandas and Parseable effectively, ensuring smooth and efficient data ingestion for your projects.&lt;/p&gt;

&lt;p&gt;To get started or try Parseable, &lt;a href="https://demo.parseable.com/login?q=eyJ1c2VybmFtZSI6ImFkbWluIiwicGFzc3dvcmQiOiJhZG1pbiJ9" rel="noopener noreferrer"&gt;visit our demo page&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>database</category>
      <category>monitoring</category>
      <category>devops</category>
      <category>python</category>
    </item>
    <item>
      <title>Build complete website with Docusaurus</title>
      <dc:creator>Surya Shakti</dc:creator>
      <pubDate>Sat, 22 Oct 2022 15:40:27 +0000</pubDate>
      <link>https://forem.com/parseable/build-complete-website-with-docusaurus-4ccg</link>
      <guid>https://forem.com/parseable/build-complete-website-with-docusaurus-4ccg</guid>
<description>&lt;p&gt;In this post, we'll look at how to build an end-to-end product website with Docusaurus, including landing pages, docs pages, and a blog.&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Docusaurus
&lt;/h2&gt;

&lt;p&gt;To build a Docusaurus website, we first have to install it with a starter website skeleton. Run the command below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npx create-docusaurus@latest my-website classic
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After a successful installation, you will get a basic website that you can run right away and start customizing to your needs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Viewing the website on the development server
&lt;/h2&gt;

&lt;p&gt;To start the development server, go to the project's root directory and run the following command.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm start
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After starting the development server, you can see your site at &lt;code&gt;http://localhost:3000/&lt;/code&gt;. It will look something like the following image.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ymums73ggx3ijjvyb42.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ymums73ggx3ijjvyb42.png" alt="Docusaurus UI" width="800" height="532"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The boilerplate we get after installing Docusaurus has the following folder structure:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;my-website
├── blog
│   ├── 2019-05-28-hola.md
│   ├── 2019-05-29-hello-world.md
│   └── 2020-05-30-welcome.md
├── docs
│   ├── doc1.md
│   ├── doc2.md
│   ├── doc3.md
│   └── mdx.md
├── src
│   ├── css
│   │   └── custom.css
│   └── pages
│       ├── styles.module.css
│       └── index.js
├── static
│   └── img
├── docusaurus.config.js
├── package.json
├── README.md
├── sidebars.js
└── yarn.lock
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let's walk through these directories.&lt;br&gt;
First, the &lt;code&gt;blog&lt;/code&gt; directory contains the Markdown files of the blog posts you want to show on the website.&lt;br&gt;
The &lt;code&gt;docs&lt;/code&gt; directory contains the website's docs.&lt;br&gt;
In &lt;code&gt;src&lt;/code&gt; you have the CSS files and the &lt;code&gt;pages&lt;/code&gt; directory, which holds the website's landing page; you can also add other custom pages here according to your requirements.&lt;br&gt;
The &lt;code&gt;static&lt;/code&gt; directory holds assets like images and the company logo.&lt;/p&gt;
&lt;h2&gt;
  
  
  Customizing Home Page
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;src/pages/index.js&lt;/code&gt; is the homepage for the website. It exports a React component that is rendered between the navbar and the footer. We will see how to customize the navbar and footer later.&lt;/p&gt;

&lt;p&gt;The boilerplate code of the home page is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import React from 'react';
import clsx from 'clsx';
import Link from '@docusaurus/Link';
import useDocusaurusContext from '@docusaurus/useDocusaurusContext';
import Layout from '@theme/Layout';
import HomepageFeatures from '@site/src/components/HomepageFeatures';

import styles from './index.module.css';

function HomepageHeader() {
  const {siteConfig} = useDocusaurusContext();
  return (
    &amp;lt;header className={clsx('hero hero--primary', styles.heroBanner)}&amp;gt;
      &amp;lt;div className="container"&amp;gt;
        &amp;lt;h1 className="hero__title"&amp;gt;{siteConfig.title}&amp;lt;/h1&amp;gt;
        &amp;lt;p className="hero__subtitle"&amp;gt;{siteConfig.tagline}&amp;lt;/p&amp;gt;
        &amp;lt;div className={styles.buttons}&amp;gt;
          &amp;lt;Link
            className="button button--secondary button--lg"
            to="/docs/intro"&amp;gt;
            Docusaurus Tutorial - 5min ⏱️
          &amp;lt;/Link&amp;gt;
        &amp;lt;/div&amp;gt;
      &amp;lt;/div&amp;gt;
    &amp;lt;/header&amp;gt;
  );
}

export default function Home() {
  const {siteConfig} = useDocusaurusContext();
  return (
    &amp;lt;Layout
      title={`Hello from ${siteConfig.title}`}
      description="Description will go into a meta tag in &amp;lt;head /&amp;gt;"&amp;gt;
      &amp;lt;HomepageHeader /&amp;gt;
      &amp;lt;main&amp;gt;
        &amp;lt;HomepageFeatures /&amp;gt;
      &amp;lt;/main&amp;gt;
    &amp;lt;/Layout&amp;gt;
  );
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can edit this file like any normal React component, and the changes will be reflected on the website.&lt;/p&gt;

&lt;h2&gt;
  
  
  Customizing Navbar
&lt;/h2&gt;

&lt;p&gt;In the &lt;code&gt;docusaurus.config.js&lt;/code&gt; we have &lt;code&gt;themeConfig&lt;/code&gt; object where we can customize our Navbar and Footer.&lt;/p&gt;

&lt;p&gt;For navbar customization, we can do it in this object&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; navbar: {
        title: 'My Site',
        logo: {
          alt: 'My Site Logo',
          src: 'img/logo.svg',
        },
        items: [
          {
            type: 'doc',
            docId: 'intro',
            position: 'left',
            label: 'Tutorial',
          },
          {to: '/blog', label: 'Blog', position: 'left'},
          {
            href: 'https://github.com/facebook/docusaurus',
            label: 'GitHub',
            position: 'right',
          },
        ],
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, &lt;code&gt;title&lt;/code&gt; sets the title shown on the website, &lt;code&gt;logo&lt;/code&gt; adds the company's logo, and &lt;code&gt;items&lt;/code&gt; holds the links to different pages or sections.&lt;br&gt;
The &lt;code&gt;position&lt;/code&gt; key in each item object decides where that item appears in the navbar.&lt;/p&gt;
&lt;h2&gt;
  
  
  Customizing the footer
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;footer: {
        style: 'dark',
        links: [
          {
            title: 'Docs',
            items: [
              {
                label: 'Tutorial',
                to: '/docs/intro',
              },
            ],
          },
          {
            title: 'Community',
            items: [
              {
                label: 'Stack Overflow',
                href: 'https://stackoverflow.com/questions/tagged/docusaurus',
              },
              {
                label: 'Discord',
                href: 'https://discordapp.com/invite/docusaurus',
              },
              {
                label: 'Twitter',
                href: 'https://twitter.com/docusaurus',
              },
            ],
          },
          {
            title: 'More',
            items: [
              {
                label: 'Blog',
                to: '/blog',
              },
              {
                label: 'GitHub',
                href: 'https://github.com/facebook/docusaurus',
              },
            ],
          },
        ],
        copyright: `Copyright © ${new Date().getFullYear()} My Project, Inc. Built with Docusaurus.`,
      },
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;In the &lt;code&gt;links&lt;/code&gt; array, each object is a column of links in the footer.&lt;/p&gt;
&lt;h2&gt;
  
  
  Creating a new page
&lt;/h2&gt;

&lt;p&gt;To create a new page for your website you can add it in &lt;code&gt;src/pages&lt;/code&gt; directory.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;├── src
│   ├── css
│   │   └── custom.css
│   └── pages
│       ├── styles.module.css
│       └── index.js
│       └── new-page.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As soon as we add &lt;code&gt;new-page.js&lt;/code&gt;, it becomes a route itself.&lt;br&gt;
You can navigate to /new-page in the browser to see it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In this post, we saw how to set up a custom website with Docusaurus and learned how to customize the navbar, footer, and more.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>tutorial</category>
      <category>react</category>
    </item>
    <item>
      <title>Logging for your Node.js app</title>
      <dc:creator>Surya Shakti</dc:creator>
      <pubDate>Thu, 22 Sep 2022 07:03:31 +0000</pubDate>
      <link>https://forem.com/parseable/logging-for-your-nodejs-app-28jd</link>
      <guid>https://forem.com/parseable/logging-for-your-nodejs-app-28jd</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In this article, we will learn how to store logs from a &lt;strong&gt;Node.js&lt;/strong&gt; application in &lt;strong&gt;Parseable&lt;/strong&gt; using the &lt;a href="https://www.npmjs.com/package/winston" rel="noopener noreferrer"&gt;Winston&lt;/a&gt; library. We'll also look at what logging is and why it is important.&lt;/p&gt;

&lt;h2&gt;
  
  
  Table of contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;What is logging?&lt;/li&gt;
&lt;li&gt;Using Winston in a Node.js application&lt;/li&gt;
&lt;li&gt;Sending those logs to Parseable&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  What is logging?
&lt;/h2&gt;

&lt;p&gt;Logging is the process of recording application events and data to a log file or other destinations, for the purpose of analyzing the system.&lt;/p&gt;

&lt;p&gt;Logs help developers find errors, track down the source of a bug, and fix it. In Node.js, it is critical to structure the application logs. During development, we can use &lt;code&gt;console.log&lt;/code&gt; to find problems and get the information we need.&lt;/p&gt;

&lt;p&gt;But once the application is in production, we cannot rely on &lt;code&gt;console.log&lt;/code&gt; any more.&lt;/p&gt;

&lt;h2&gt;
  
  
  Using winston in a Node.js application
&lt;/h2&gt;

&lt;p&gt;I assume you already have a Node.js application. To install winston &amp;amp; winston-transport in your project, run the following commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install &lt;/span&gt;winston
npm &lt;span class="nb"&gt;install &lt;/span&gt;winston-transport
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Our aim now is to replace all of the application's &lt;code&gt;console&lt;/code&gt; messages with the Winston logger. For example, suppose you have these three event logs in your Node.js application that you want to store in &lt;strong&gt;Parseable&lt;/strong&gt; using the Winston library.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;connsole&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;warn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Got mutiple elements with same id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Login Failed. Invalid ID&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Events posted successfully.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Configuring Winston in Node.js app&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create a &lt;strong&gt;logger&lt;/strong&gt; folder in the root directory; this is where we will configure Winston and send the logs to Parseable.&lt;br&gt;
Now, in &lt;code&gt;logger/index.js&lt;/code&gt;, add the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;winston&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;winston&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;combine&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;label&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;printf&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;winston&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;format&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Transport&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;winston-transport&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;axios&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;myFormat&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;printf&lt;/span&gt;&lt;span class="p"&gt;(({&lt;/span&gt; &lt;span class="nx"&gt;level&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;label&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timestamp&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;level&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;level&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;message&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;message&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="kd"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;CustomTransport&lt;/span&gt; &lt;span class="kd"&gt;extends&lt;/span&gt; &lt;span class="nc"&gt;Transport&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nf"&gt;constructor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;opts&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;super&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;opts&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;info&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;callback&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;info&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="nx"&gt;info&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;

    &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;config&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;post&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`https://demo.parseable.io/api/v1/logstream/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;streamName&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;X-P-META-Tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Owner&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;X-P-META-Tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Host&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;X-P-META-Tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Host.owner&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;Authorization&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`Basic &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;

    &lt;span class="nf"&gt;axios&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;config&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;})&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;

    &lt;span class="nf"&gt;callback&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;devLogger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;transport&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;CustomTransport&lt;/span&gt;&lt;span class="p"&gt;({});&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;winston&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createLogger&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;level&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;debug&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;format&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;combine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;label&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="nf"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="nx"&gt;myFormat&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="na"&gt;transports&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;transport&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;


&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;logger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;NODE_ENV&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;production&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;logger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;devLogger&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;module&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;exports&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here we configure Winston in our Node.js application and send the logs to Parseable.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Let's discuss this code in parts for better understanding.&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Initializing winston logger instance&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;

&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;devLogger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;transport&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;CustomTransport&lt;/span&gt;&lt;span class="p"&gt;({});&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;winston&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createLogger&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;level&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;debug&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;format&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;combine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;label&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="nf"&gt;timestamp&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="nx"&gt;myFormat&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="na"&gt;transports&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;transport&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/blockquote&gt;


&lt;/blockquote&gt;

&lt;p&gt;The snippet above contains the initialization of a Winston logger instance. Here we specify the log level for this specific logger instance (using the npm log level standard), the format in which logs will be stored, and the transport that specifies where the log data will go. In our case, we send it to Parseable.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Setting a custom format for the log data&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;

&lt;pre class="highlight javascript"&gt;&lt;code&gt;const myFormat = printf(({ level, message, label, timestamp }) =&amp;gt; {
  return JSON.stringify({
    timestamp: timestamp,
    level: level,
    message: message,
  });
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/blockquote&gt;


&lt;/blockquote&gt;



&lt;p&gt;The snippet above defines the format in which each log entry will be stored: a JSON object containing the timestamp, log level, and message.&lt;/p&gt;
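As a standalone sketch of what this format produces, the function below mirrors the `myFormat` logic and turns a Winston-style info object into the flat JSON string that gets shipped (the sample values are illustrative):

```javascript
// Mirrors the custom format above: turn a Winston-style info object
// into a flat JSON string, one log event per line.
const format = ({ level, message, timestamp }) =>
  JSON.stringify({ timestamp, level, message });

console.log(
  format({
    level: "info",
    message: "user logged in",
    timestamp: "2022-09-20T10:22:49.000Z",
  })
);
// {"timestamp":"2022-09-20T10:22:49.000Z","level":"info","message":"user logged in"}
```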

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Sending the log data to parseable&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;

&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;CustomTransport&lt;/span&gt; &lt;span class="kd"&gt;extends&lt;/span&gt; &lt;span class="nc"&gt;Transport&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nf"&gt;constructor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;opts&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;super&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;opts&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;info&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;callback&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;info&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="nx"&gt;info&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
    &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;config&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;post&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`https://demo.parseable.io/api/v1/logstream/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;streamName&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;X-P-META-Tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Owner&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;X-P-META-Tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Host&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;X-P-META-Tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Host.owner&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;Authorization&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`Basic &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;
    &lt;span class="nf"&gt;axios&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;config&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;})&lt;/span&gt;
      &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="nf"&gt;callback&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/blockquote&gt;


&lt;/blockquote&gt;

&lt;p&gt;The snippet above is responsible for sending the log data to Parseable. Here, &lt;code&gt;streamName&lt;/code&gt; is the log stream you have created in Parseable, and &lt;code&gt;key&lt;/code&gt; is the authentication key for accessing it.&lt;/p&gt;
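As a side note, the `key` for HTTP Basic auth is typically the base64 encoding of `username:password`. A minimal sketch, assuming the demo instance's default `parseable:parseable` credentials; the stream name here is a hypothetical placeholder:

```javascript
// Hedged sketch: build the Basic auth key as base64("username:password").
// "parseable:parseable" are the demo instance defaults, not real secrets;
// "demoapp" is a hypothetical stream name.
const streamName = "demoapp";
const key = Buffer.from("parseable:parseable").toString("base64");

console.log(key); // cGFyc2VhYmxlOnBhcnNlYWJsZQ==
console.log(`https://demo.parseable.io/api/v1/logstream/${streamName}`);
```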

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Calling the logger function&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;

&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;logger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;NODE_ENV&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;production&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;logger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;devLogger&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/blockquote&gt;


&lt;/blockquote&gt;

&lt;p&gt;In the development phase you may prefer to see logs on the console for testing, rather than storing them in Parseable. The snippet above creates the logger conditionally based on the environment; you can also call the function directly if you don't want this conditional behavior.&lt;/p&gt;
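A minimal sketch of that conditional wiring, with simplified stand-in factories (a hypothetical `prodLogger` would wire up the Parseable transport in the same way `devLogger` does above):

```javascript
// Simplified stand-ins for the logger factories discussed above.
const devLogger = () => ({ name: "dev", info: (msg) => console.log(`[dev] ${msg}`) });
const prodLogger = () => ({ name: "prod", info: (msg) => console.log(`[prod] ${msg}`) });

// Pick a factory based on the environment, mirroring the NODE_ENV check.
const pickLogger = (env) => (env === "production" ? prodLogger() : devLogger());

const logger = pickLogger(process.env.NODE_ENV);
logger.info("app started");
```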

&lt;p&gt;Then we can use &lt;code&gt;logger&lt;/code&gt; instead of &lt;code&gt;console&lt;/code&gt; in our application.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;logger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./logger&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nx"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;warn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Got mutiple elements with same id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nx"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Login Failed. Invalid ID&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nx"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Events posted successfully.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Hurray!!🥳🥳🥳 &lt;/p&gt;

&lt;p&gt;You have successfully integrated Parseable with your Node.js application. Now when you run your application, all the events you have replaced with &lt;code&gt;logger&lt;/code&gt; will be posted to Parseable, and you can check them in the &lt;code&gt;logstream&lt;/code&gt; you created earlier.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>node</category>
      <category>backend</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Reimagine log storage: Parseable</title>
      <dc:creator>Nitish Tiwari</dc:creator>
      <pubDate>Tue, 20 Sep 2022 10:22:49 +0000</pubDate>
      <link>https://forem.com/parseable/reimagine-log-storage-parseable-4o08</link>
      <guid>https://forem.com/parseable/reimagine-log-storage-parseable-4o08</guid>
      <description>&lt;h2&gt;
  
  
  Context
&lt;/h2&gt;

&lt;p&gt;Whether you're a Developer or SRE or DevOps - when you're tasked with setting up logging, there are essentially two options:&lt;/p&gt;

&lt;h3&gt;
  
  
  Search Engines
&lt;/h3&gt;

&lt;p&gt;Set up an indexing-based search engine masquerading as log storage. Such products are difficult to deploy and run in the long term: you have to manage indexes, local and remote storage, different node types, and so on.&lt;/p&gt;

&lt;p&gt;Additionally, indexing in the ingestion flow causes high CPU and memory consumption, and prevents very high ingestion rates.&lt;/p&gt;

&lt;h3&gt;
  
  
  SaaS Platforms
&lt;/h3&gt;

&lt;p&gt;Alternatively, pay for an exorbitantly costly SaaS platform that is very easy to get started with. Over time, data volumes grow and costs increase.&lt;/p&gt;

&lt;p&gt;But data gravity makes getting out of such a platform difficult, and you end up storing only a fraction of your log data to save costs.&lt;/p&gt;




&lt;p&gt;We dealt with both these options in our work, and we know neither is ideal. This pushed us to think about what a modern, cloud-native log storage and observability platform would look like.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;It is clear that the log storage of the future won't be another index-based search engine in a new language.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We set out to build a completely indexing-free log storage platform. This led to &lt;a href="https://github.com/parseablehq/parseable" rel="noopener noreferrer"&gt;Parseable&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Parseable is a simple, efficient, and fast log storage and observability platform. Think the simplicity of Prometheus, but for logs. Written in Rust, Parseable leverages Apache Arrow, Parquet, and widely available object storage platforms for efficiency, cost effectiveness, and performance.&lt;/p&gt;

&lt;p&gt;It is compatible with standard logging agents like Fluent Bit, Logstash, etc. Parseable also offers a built-in, intuitive GUI for log query and analysis.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdi7ryav5bqds6x84j4gm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdi7ryav5bqds6x84j4gm.png" alt="Parseable Design" width="800" height="601"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Since launching Parseable, we have seen tremendous interest from the community. We'd love for you to try out Parseable, and we're all ears for any questions, feedback, and comments.&lt;/p&gt;

&lt;p&gt;Get Started: &lt;a href="https://www.parseable.io/docs/quick-start" rel="noopener noreferrer"&gt;https://www.parseable.io/docs/quick-start&lt;/a&gt;&lt;br&gt;
Slack: &lt;a href="https://launchpass.com/parseable" rel="noopener noreferrer"&gt;https://launchpass.com/parseable&lt;/a&gt;&lt;br&gt;
GitHub: &lt;a href="https://github.com/parseablehq/parseable" rel="noopener noreferrer"&gt;https://github.com/parseablehq/parseable&lt;/a&gt;&lt;br&gt;
Documentation: &lt;a href="https://www.parseable.io/docs/introduction" rel="noopener noreferrer"&gt;https://www.parseable.io/docs/introduction&lt;/a&gt;&lt;/p&gt;

</description>
      <category>rust</category>
      <category>kubernetes</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>Logging for your frontend apps</title>
      <dc:creator>Surya Shakti</dc:creator>
      <pubDate>Mon, 19 Sep 2022 11:46:24 +0000</pubDate>
      <link>https://forem.com/parseable/logging-for-your-frontend-apps-28pj</link>
      <guid>https://forem.com/parseable/logging-for-your-frontend-apps-28pj</guid>
      <description>&lt;p&gt;Logging is one of the key activities when you set out to build a reliable piece of software. Yet, we've seen developers putting off logging setup for future. This can be largely attributed to &lt;em&gt;unavailability&lt;/em&gt; of simple, developer friendly logging platforms that just work. &lt;/p&gt;

&lt;p&gt;In this post we'll take a look at how to configure a React app to send logs (over HTTP) to &lt;a href="https://github.com/parseablehq/parseable" rel="noopener noreferrer"&gt;Parseable&lt;/a&gt; - an open source, developer friendly log storage and query platform.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Setup
&lt;/h2&gt;

&lt;p&gt;While Parseable is FOSS and can be set up anywhere, we'll use the &lt;a href="https://demo.parseable.io/" rel="noopener noreferrer"&gt;public demo instance&lt;/a&gt; to avoid the setup and instead focus on the React app side of things.&lt;/p&gt;

&lt;p&gt;I'll use the &lt;a href="https://github.com/pinojs/pino" rel="noopener noreferrer"&gt;Pino JS Logger&lt;/a&gt; as the logging client. &lt;/p&gt;

&lt;h2&gt;
  
  
  Create a React app
&lt;/h2&gt;

&lt;p&gt;You may already have a React app that you want to add logging to. In case you're starting from scratch, create a React application first so that we can add logging to the app.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npx create-react-app my-project-demo
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Setup Pino
&lt;/h2&gt;

&lt;p&gt;Install Pino so we can use it as a library in the react app.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install &lt;/span&gt;pino
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once installed successfully, create a directory named &lt;code&gt;src/loggers&lt;/code&gt; inside your React app source. We'll initialise the Pino library here to send logs to Parseable.&lt;/p&gt;

&lt;p&gt;Create a file &lt;code&gt;src/loggers/index.js&lt;/code&gt; and paste the following code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;pino&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;pino&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;send&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;level&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;logEvent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;a&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;b&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://demo.parseable.io/api/v1/logstream/pinotest&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;POST&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;Authorization&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Basic cGFyc2VhYmxlOnBhcnNlYWJsZQ==&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="nx"&gt;logEvent&lt;/span&gt;&lt;span class="p"&gt;]),&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;logger&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;pino&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;browser&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;serialize&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;asObject&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;transmit&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;send&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
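Before wiring this into the app, it helps to see what pino actually hands to the `send` function above. This is a sketch, not code from the post: per pino's browser documentation, `transmit.send(level, logEvent)` receives an object of the shape `{ ts, messages, bindings, level }`, and that object is what gets stringified and posted to Parseable. Flattening it first (an optional step, shown here as an assumption) can make the stored events easier to query:

```javascript
// Hedged sketch of the logEvent object pino passes to transmit.send()
// in the browser (shape per pino's browser docs: { ts, messages, bindings, level }).
const logEvent = {
  ts: 1700000000000,                   // epoch millis of the log call
  messages: ["test log! pinotest stream from reactjs application."],
  bindings: [],                        // child-logger bindings, if any
  level: { label: "info", value: 30 }, // pino's numeric level for info
};

// Optional: flatten into one record so each field is a simple column in Parseable.
const record = {
  ts: new Date(logEvent.ts).toISOString(),
  level: logEvent.level.label,
  message: logEvent.messages.join(" "),
};

console.log(record);
```

If you adopt this flattening, post `[record]` instead of `[logEvent]` in the `body` of the fetch call above.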



&lt;p&gt;Note that the Parseable URL &lt;code&gt;https://demo.parseable.io/api/v1/logstream/pinotest&lt;/code&gt; points to the &lt;code&gt;pinotest&lt;/code&gt; stream on the public Parseable demo instance, which exists only to showcase the Parseable UI. Please don't post any sensitive or production logs to it. Refer to the &lt;a href="https://www.parseable.io/docs/introduction" rel="noopener noreferrer"&gt;Parseable Docs&lt;/a&gt; to start your own Parseable instance.&lt;/p&gt;
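The `Authorization` header used above is plain HTTP Basic auth: the string `cGFyc2VhYmxlOnBhcnNlYWJsZQ==` is just `parseable:parseable` (the demo instance's credentials) base64-encoded. To point the logger at your own instance, build the header from your own credentials. A minimal sketch (Node's `Buffer` is used here; in browser code you would use `btoa` instead):

```javascript
// Build an HTTP Basic auth header from username:password.
// Replace these with your own instance's credentials.
const username = "parseable";
const password = "parseable";

// Node: Buffer; in the browser use btoa(`${username}:${password}`) instead.
const basicAuth = "Basic " + Buffer.from(`${username}:${password}`).toString("base64");

console.log(basicAuth); // → "Basic cGFyc2VhYmxlOnBhcnNlYWJsZQ=="
```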

&lt;h3&gt;
  
  
  Add logging to the app
&lt;/h3&gt;

&lt;p&gt;Now we can add logging to the React app we created earlier.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;logger&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./loggers&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="nx"&gt;logger&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;test log! pinotest stream from reactjs application.&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;App&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;div&lt;/span&gt; &lt;span class="nx"&gt;className&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;App&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="nx"&gt;ReactJs&lt;/span&gt; &lt;span class="nx"&gt;logs&lt;/span&gt; &lt;span class="nx"&gt;to&lt;/span&gt; &lt;span class="nx"&gt;parseable&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="sr"&gt;/div&amp;gt;&lt;/span&gt;&lt;span class="err"&gt;;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;App&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it! You have successfully integrated Parseable with your React app.&lt;/p&gt;

&lt;p&gt;Every event you log with the logger is posted to Parseable, and you can view it in the log stream you created earlier.&lt;/p&gt;
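Besides browsing the stream in the Parseable UI, you can fetch events back programmatically. Parseable exposes a SQL query endpoint (`POST /api/v1/query` per the Parseable docs); the sketch below only builds the request so the shape is clear, with the actual `fetch` left commented out. The endpoint path, body fields, and relative time values are taken from the docs as an assumption, so verify them against your instance's version:

```javascript
// Hedged sketch: build a request against Parseable's SQL query API.
// Endpoint and body shape follow the Parseable docs; verify before relying on them.
const baseUrl = "https://demo.parseable.io"; // your Parseable instance

function buildQueryRequest(stream, minutes) {
  return {
    url: `${baseUrl}/api/v1/query`,
    options: {
      method: "POST",
      headers: {
        Authorization: "Basic cGFyc2VhYmxlOnBhcnNlYWJsZQ==", // demo credentials
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        query: `SELECT * FROM ${stream} LIMIT 10`,
        startTime: `${minutes}m`, // relative range: the last N minutes
        endTime: "now",
      }),
    },
  };
}

const req = buildQueryRequest("pinotest", 10);
// const rows = await fetch(req.url, req.options).then((r) => r.json());
```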

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;In this post we learned how to set up and configure a React app to send logs to Parseable, a simple, developer-friendly log storage platform. We were able to do this with just a few lines of code.&lt;/p&gt;

&lt;p&gt;With this setup, all logs are stored securely and efficiently on a remote machine, and they become available within seconds for auditing and debugging.&lt;/p&gt;

&lt;p&gt;Links:&lt;br&gt;
[1] Parseable Github: &lt;a href="https://github.com/parseablehq/parseable" rel="noopener noreferrer"&gt;https://github.com/parseablehq/parseable&lt;/a&gt;&lt;br&gt;
[2] Parseable Docs: &lt;a href="https://www.parseable.io/docs/introduction" rel="noopener noreferrer"&gt;https://www.parseable.io/docs/introduction&lt;/a&gt;&lt;br&gt;
[3] Pino Github: &lt;a href="https://github.com/pinojs/pino" rel="noopener noreferrer"&gt;https://github.com/pinojs/pino&lt;/a&gt;&lt;br&gt;
[4] Pino Docs: &lt;a href="https://getpino.io/" rel="noopener noreferrer"&gt;https://getpino.io/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>react</category>
      <category>frontend</category>
      <category>javascript</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
