DEV Community

MD ARIFUL HAQUE

1598. Crawler Log Folder

Difficulty: Easy

Topics: Array, String, Stack

The Leetcode file system keeps a log each time some user performs a change folder operation.

The operations are described below:

  • "../" : Move to the parent folder of the current folder. (If you are already in the main folder, remain in the same folder).
  • "./" : Remain in the same folder.
  • "x/" : Move to the child folder named x (This folder is guaranteed to always exist).

You are given a list of strings logs where logs[i] is the operation performed by the user at the ith step.

The file system starts in the main folder, then the operations in logs are performed.

Return the minimum number of operations needed to go back to the main folder after the change folder operations.

Example 1:

  • Input: logs = ["d1/","d2/","../","d21/","./"]
  • Output: 2
  • Explanation: Use this change folder operation "../" 2 times and go back to the main folder.

Example 2:

  • Input: logs = ["d1/","d2/","./","d3/","../","d31/"]
  • Output: 3

Example 3:

  • Input: logs = ["d1/","../","../","../"]
  • Output: 0

Constraints:

  • 1 <= logs.length <= 10^3
  • 2 <= logs[i].length <= 10
  • logs[i] contains lowercase English letters, digits, '.', and '/'.
  • logs[i] follows the format described in the statement.
  • Folder names consist of lowercase English letters and digits.

Hint:

  1. Simulate the process but don’t move the pointer beyond the main folder.

Solution:

We need to determine the minimum number of operations required to return to the main folder after performing a series of change folder operations. The operations can be moving to a parent folder, staying in the current folder, or moving to a child folder. The solution involves simulating these operations while keeping track of the current depth from the main folder.

Approach

  1. Initialization: Start at the main folder, represented by a depth of 0.
  2. Processing Logs: For each operation in the logs:
    • Move to Parent Folder ("../"): If the current depth is greater than 0, decrement the depth by 1. If already at the main folder (depth 0), the depth remains unchanged.
    • Stay in Current Folder ("./"): The depth remains unchanged.
    • Move to Child Folder ("x/"): Increment the depth by 1, as moving into a child folder increases the depth from the main folder.
  3. Result Calculation: After processing all operations, the depth value represents the number of "../" operations needed to return to the main folder. This depth is returned as the result.

Let's implement this solution in PHP:

<?php
/**
 * @param String[] $logs
 * @return Integer
 */
function minOperations($logs) {
    $depth = 0; // current distance from the main folder
    foreach ($logs as $log) {
        if ($log === "../") {
            if ($depth > 0) {
                $depth--;   // move up, but never above the main folder
            }
        } elseif ($log !== "./") {
            $depth++;       // "x/": move into a child folder
        }
        // "./" leaves $depth unchanged
    }
    return $depth;          // number of "../" operations needed to return
}

// Test cases
$logs1 = array("d1/", "d2/", "../", "d21/", "./");
$logs2 = array("d1/", "d2/", "./", "d3/", "../", "d31/");
$logs3 = array("d1/", "../", "../", "../");

echo minOperations($logs1) . "\n"; // Output: 2
echo minOperations($logs2) . "\n"; // Output: 3
echo minOperations($logs3) . "\n"; // Output: 0
?>

Explanation:

  • Initialization: The variable $depth is initialized to 0, indicating we start at the main folder.
  • Loop Through Logs: For each log entry:
    • Parent Folder ("../"): If the current depth is greater than 0, it means we are not at the main folder, so we decrement $depth by 1. If we are already at the main folder, $depth remains 0.
    • Current Folder ("./"): The depth remains unchanged as we stay in the current folder.
    • Child Folder ("x/"): Moving into a child folder increases the depth by 1.
  • Result: After processing all logs, $depth holds the number of steps (i.e., "../" operations) needed to return to the main folder. This value is returned as the result.

This approach runs in O(n) time and O(1) space: each operation is processed exactly once in a single pass through the logs array, and only a single depth counter is maintained. It also handles the edge case of a "../" operation at depth 0 by leaving the counter at 0, matching the rule that the main folder has no parent.
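Since the problem is also tagged with Stack, here is an alternative sketch that makes the simulation explicit by pushing child-folder names onto a stack and popping on "../". The function name minOperationsStack is a hypothetical label for this variant, not part of the original solution; the final stack size equals the answer because each remaining entry costs exactly one "../" to undo.

```php
<?php
// Hypothetical stack-based variant of the counter solution above.
function minOperationsStack(array $logs): int {
    $stack = [];
    foreach ($logs as $op) {
        if ($op === "../") {
            array_pop($stack);   // safe no-op when the stack is empty (main folder)
        } elseif ($op !== "./") {
            $stack[] = $op;      // enter the child folder named by $op
        }
        // "./" changes nothing
    }
    return count($stack);        // depth below the main folder
}

echo minOperationsStack(["d1/", "d2/", "../", "d21/", "./"]) . "\n"; // 2
?>
```

The counter version is strictly cheaper in memory, but the stack version generalizes naturally if you later need the actual current path (e.g., implode("", $stack)) rather than just its depth.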

Contact Links

If you found this series helpful, please consider giving the repository a star on GitHub or sharing the post on your favorite social networks 😍. Your support would mean a lot to me!

If you want more helpful content like this, feel free to follow me:
