<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Boon</title>
    <description>The latest articles on Forem by Boon (@boo_n).</description>
    <link>https://forem.com/boo_n</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3838197%2F9e94f098-8d72-4591-b8d0-b4a5947b3f75.jpg</url>
      <title>Forem: Boon</title>
      <link>https://forem.com/boo_n</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/boo_n"/>
    <language>en</language>
    <item>
      <title>My Indie SaaS Was Quietly Bleeding Users — The 4-Line Fix and Pricing Rebalance That Turned It Around</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Fri, 08 May 2026 22:11:17 +0000</pubDate>
      <link>https://forem.com/boo_n/my-indie-saas-was-quietly-bleeding-users-the-4-line-fix-and-pricing-rebalance-that-turned-it-4f0n</link>
      <guid>https://forem.com/boo_n/my-indie-saas-was-quietly-bleeding-users-the-4-line-fix-and-pricing-rebalance-that-turned-it-4f0n</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;TL;DR&lt;/strong&gt; &lt;em&gt;(May 2026)&lt;/em&gt; — My indie SaaS on Apify Store dropped from 51 lifetime users to 12 monthly active in 3 months. Diagnostic: a misleading "Succeeded" status was hiding zero-item runs from users. Fix: detect anti-bot challenges explicitly + fail loud + bind residential proxies to the URL country. Then I rebalanced pricing — small batches dropped 50%, large batches doubled. Same revenue projection, far less churn. Three engineering decisions, one product turnaround.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;I shipped my first paid Apify actor in early 2026. By month three, the dashboard was telling two different stories. The success-rate graph said 91%. The MAU graph said I had lost 76% of my users — from 51 lifetime to 12 monthly active. Both were technically true. The gap between them is what this article is about.&lt;/p&gt;

&lt;p&gt;If you ship indie SaaS, sell scrapers, or run any product where customers can churn silently, the patterns here apply. The mechanics are specific to web scraping (Datadome, residential proxies, Apify's pay-per-event pricing), but the lesson — &lt;em&gt;silent success kills retention faster than loud failure&lt;/em&gt; — is universal.&lt;/p&gt;

&lt;h2&gt;
  
  
  The metric I should have looked at sooner
&lt;/h2&gt;

&lt;p&gt;Apify Store displays success rate, total runs, and monthly users on every actor's public page. It does NOT display a "users who stopped coming back" metric. The actor I'd shipped showed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Total users&lt;/strong&gt;: 51&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Monthly active&lt;/strong&gt;: 12&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Success rate (30 days)&lt;/strong&gt;: 91%&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're a creator and your gut says "91% is good", mine said the same. The 91% was the surface lie. Underneath, the actual ratio was much worse.&lt;/p&gt;

&lt;p&gt;I dug into the run-level data:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;11 ABORTED runs in 30 days — users hitting Cancel themselves because they saw nothing happening&lt;/li&gt;
&lt;li&gt;3 FAILED runs (visible failures with errors)&lt;/li&gt;
&lt;li&gt;143 SUCCEEDED runs — but a meaningful chunk of these returned &lt;strong&gt;zero items&lt;/strong&gt; in the dataset&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A success at the &lt;em&gt;process&lt;/em&gt; level (&lt;code&gt;exitCode 0&lt;/code&gt;, run finished cleanly) doesn't mean a success at the &lt;em&gt;product&lt;/em&gt; level (the user got data they paid for). Apify's status display only knows about the former.&lt;/p&gt;

&lt;p&gt;From the user's perspective: open the console, see ✅ Succeeded, click the dataset, see nothing. They don't file a bug. They just don't come back. There's no churn signal you can react to in time.&lt;/p&gt;

&lt;h2&gt;
  
  
  The bug behind the silent success
&lt;/h2&gt;

&lt;p&gt;The actor scrapes &lt;a href="https://www.vinted.com" rel="noopener noreferrer"&gt;Vinted&lt;/a&gt;, the European secondhand marketplace. Vinted has no public API and is protected by Datadome, one of the more aggressive anti-bot layers on the web. The scraping pattern (which I'll detail below) involves a Playwright browser bootstrapping a session and a fast HTTP loop using the captured cookies.&lt;/p&gt;

&lt;p&gt;The bug, in plain English: when Datadome served a challenge page (its JavaScript proof-of-work + fingerprint check), my scraper waited 15 seconds for a catalog selector that would never appear, then logged a warning and &lt;em&gt;continued&lt;/em&gt; with whatever cookies it had collected. Those cookies were from the challenge state, not from a real authenticated session. The subsequent API call returned an empty array. The actor exited with &lt;code&gt;exitCode 0&lt;/code&gt;. Apify reported SUCCEEDED.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// What I had — silently continues even when the challenge is unsolved&lt;/span&gt;
&lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;waitForSelector&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;.catalog-wrapper&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;15000&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;catch&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;log&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Selector timeout — continuing with current cookies&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  &lt;span class="c1"&gt;// ← bad assumption&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;cookies&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;context&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;cookies&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;  &lt;span class="c1"&gt;// challenge cookies, not auth&lt;/span&gt;
&lt;span class="c1"&gt;// ...later: API returns 0 items, run "succeeds"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  The 4-line fix
&lt;/h2&gt;

&lt;p&gt;Three small additions made the bug observable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;isDatadomeChallenge&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;any&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nx"&gt;boolean&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;evaluate&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;html&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;documentElement&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;innerHTML&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;toLowerCase&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;html&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;captcha-delivery.com&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; 
        &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nx"&gt;html&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;dd_cookie_test&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;document&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;toLowerCase&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;access denied&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;isDatadomeChallenge&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;session&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;retire&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`[Datadome] Challenge detected — retrying with new session.`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Plus a final-state check at the end of the run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;totalItems&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;throw&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Zero items extracted. The Vinted page or filters may have returned no results, &lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
    &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;or anti-bot blocked all attempts. Verify the URL or try again.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The throw at the end converts a silent SUCCEEDED-with-empty-dataset into a loud FAILED with an actionable message. Customers who used to open an empty dataset and churn now see a clear error and either retry, fix their URL, or contact support. None of those outcomes are silent.&lt;/p&gt;

&lt;p&gt;The numbers after deployment: 0 ABORTED runs in the next 14 days. The implicit "kill my own run because nothing's happening" pattern disappeared.&lt;/p&gt;

&lt;h2&gt;
  
  
  The country-binding fix that 3 weeks of debugging didn't surface
&lt;/h2&gt;

&lt;p&gt;The other half of the retention drop was about non-French customers. My actor scraped Vinted reliably on &lt;code&gt;vinted.fr&lt;/code&gt; but had a much lower success rate on &lt;code&gt;vinted.de&lt;/code&gt;, &lt;code&gt;vinted.es&lt;/code&gt;, &lt;code&gt;vinted.it&lt;/code&gt;, etc. Customers in those markets churned hardest.&lt;/p&gt;

&lt;p&gt;Apify's residential proxy pool, by default, rotates across all available countries. So a German user's URL request might be served by a US-based residential IP. When the IP's country doesn't match the domain, Vinted reacts unpredictably: sometimes empty results, sometimes a 403, sometimes a different country's inventory. And Datadome flags the mismatched-IP pattern as suspicious and challenges harder.&lt;/p&gt;

&lt;p&gt;The fix is one config tweak: bind the proxy &lt;code&gt;countryCode&lt;/code&gt; to the URL's TLD before instantiating the crawler.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;TLD_TO_COUNTRY&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;fr&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;FR&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;de&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;DE&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;es&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;it&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;IT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;nl&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;NL&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;pl&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;PL&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;PT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;be&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;BE&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;at&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;AT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;lt&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;LT&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;cz&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;CZ&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;sk&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;SK&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;hu&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;HU&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;ro&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;RO&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;hr&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;HR&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;fi&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;FI&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;dk&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;DK&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;se&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;SE&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;ee&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;EE&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;gr&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;GR&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;ie&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;IE&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;lu&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;LU&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;lv&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;LV&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;si&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;SI&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;co.uk&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;GB&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;country&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;TLD_TO_COUNTRY&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;.&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)];&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;proxyConfig&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;Actor&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createProxyConfiguration&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;useApifyProxy&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;apifyProxyGroups&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;RESIDENTIAL&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="na"&gt;countryCode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;country&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After deploying, success rate on non-French markets went from ~60% to &amp;gt;95%. Same code, same actor, same Datadome: just one parameter that aligns the exit IP's country with the URL's intended market.&lt;/p&gt;

&lt;p&gt;For customers who paste multiple URLs from different countries in one batch, the actor groups by country and runs a separate crawler per group, each with its own bound proxy. The customer pastes a flat list, the actor dispatches them transparently.&lt;/p&gt;
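&lt;p&gt;For illustration, here is a minimal sketch of that grouping step. The map is truncated to a few markets and the helper names are mine, not the actor's actual code:&lt;/p&gt;

```typescript
// Hypothetical sketch: split a flat list of Vinted URLs into per-country
// batches so each batch can get its own country-bound residential proxy.
const TLD_TO_COUNTRY: { [tld: string]: string } = {
  fr: 'FR', de: 'DE', es: 'ES', it: 'IT', 'co.uk': 'GB',  // truncated
};

function countryForUrl(url: string): string {
  const host = new URL(url).hostname;           // e.g. "www.vinted.co.uk"
  if (host.endsWith('.co.uk')) return 'GB';     // only two-label TLD on Vinted
  const tld = host.split('.').pop() || '';
  return TLD_TO_COUNTRY[tld] || 'FR';           // assumed fallback market
}

function groupUrlsByCountry(urls: string[]): { [country: string]: string[] } {
  const groups: { [country: string]: string[] } = {};
  for (const url of urls) {
    const country = countryForUrl(url);
    (groups[country] = groups[country] || []).push(url);
  }
  return groups;
}
```

&lt;p&gt;Each key of the returned object then maps to one crawler instance with its own &lt;code&gt;countryCode&lt;/code&gt;-bound proxy configuration.&lt;/p&gt;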

&lt;h2&gt;
  
  
  The pricing rebalance: 50% cheaper for small batches, 99% more for large
&lt;/h2&gt;

&lt;p&gt;After the reliability fixes, I pulled 90 days of run analytics and looked at the size distribution:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;58%&lt;/strong&gt; of runs were under 25 items&lt;/li&gt;
&lt;li&gt;17% between 25 and 100 items&lt;/li&gt;
&lt;li&gt;25% over 100 items&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The pricing model was: &lt;strong&gt;$0.30 per run start + $0.0015 per result&lt;/strong&gt;. For a 25-item run, the customer paid $0.34 — an effective rate of &lt;strong&gt;$13.60 per 1,000 items&lt;/strong&gt;, far above the market average ($0.50–$3.50 per 1,000). Customers tried once, saw the receipt, never came back. That start fee was the silent killer of small-batch use cases (monitoring, alerts, exploratory scraping).&lt;/p&gt;

&lt;p&gt;The new pricing: &lt;strong&gt;$0.04 per GB of memory at start ($0.08 at 2 GB) + $0.0035 per result&lt;/strong&gt;.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Volume&lt;/th&gt;
&lt;th&gt;Old price&lt;/th&gt;
&lt;th&gt;New price&lt;/th&gt;
&lt;th&gt;Δ for customer&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;25 items&lt;/td&gt;
&lt;td&gt;$0.34&lt;/td&gt;
&lt;td&gt;$0.17&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;-50%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;100 items&lt;/td&gt;
&lt;td&gt;$0.45&lt;/td&gt;
&lt;td&gt;$0.43&lt;/td&gt;
&lt;td&gt;-4%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;200 items&lt;/td&gt;
&lt;td&gt;$0.60&lt;/td&gt;
&lt;td&gt;$0.78&lt;/td&gt;
&lt;td&gt;+30%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1,000 items&lt;/td&gt;
&lt;td&gt;$1.80&lt;/td&gt;
&lt;td&gt;$3.58&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;+99%&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Modeled on 90 days of historical run data, total revenue projection is roughly the same. But the distribution shifted toward customer fairness: small monitoring runs are cheap enough not to trigger sticker shock; bulk extractions pay fairly for the value delivered.&lt;/p&gt;
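&lt;p&gt;The table rows fall out of two one-line formulas. A sketch, with the constants taken from the pricing above and results rounded to cents:&lt;/p&gt;

```typescript
// Old model: flat $0.30 start fee plus $0.0015 per result.
function oldPrice(items: number): number {
  return Math.round((0.30 + 0.0015 * items) * 100) / 100;
}

// New model: $0.04 per GB of memory at start plus $0.0035 per result.
function newPrice(items: number, memoryGb: number = 2): number {
  return Math.round((0.04 * memoryGb + 0.0035 * items) * 100) / 100;
}
```

&lt;p&gt;&lt;code&gt;oldPrice(25)&lt;/code&gt; gives $0.34 and &lt;code&gt;newPrice(25)&lt;/code&gt; gives $0.17, matching the first row; the crossover sits near 100 items.&lt;/p&gt;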

&lt;p&gt;Existing users keep the old pricing for 14 days (Apify's pricing-schedule policy automatically notifies them by email). New customers see the new pricing immediately.&lt;/p&gt;

&lt;h2&gt;
  
  
  The architectural pattern that made all this possible
&lt;/h2&gt;

&lt;p&gt;The actor uses what I'd call &lt;strong&gt;asymmetric scraping&lt;/strong&gt;, a pattern that doesn't seem to have a canonical name yet:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;One Playwright browser&lt;/strong&gt; opens the catalog page on a residential IP. Datadome runs its JS challenge against a real Chromium environment with a coherent fingerprint. Cookies (&lt;code&gt;datadome&lt;/code&gt;, &lt;code&gt;dd_cookie_test&lt;/code&gt;, plus Vinted's &lt;code&gt;_vinted_*_session&lt;/code&gt;) are deposited in the page context.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Reuse those cookies in a fast HTTP loop&lt;/strong&gt; — &lt;code&gt;got-scraping&lt;/code&gt; for Node, &lt;code&gt;requests&lt;/code&gt; with custom headers for Python. Hit Vinted's internal &lt;code&gt;/api/v2/catalog/items&lt;/code&gt; endpoint directly, paginated. ~10× faster than driving the browser for every page request.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;On 401/403/429&lt;/strong&gt;: drop the session, regenerate via Playwright with a fresh residential IP, resume the loop where it left off.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The browser does the unlock. The HTTP client does the volume. Throughput goes from ~50 items/min for pure-browser scraping to ~500 items/min in this hybrid mode.&lt;/p&gt;
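&lt;p&gt;Stripped of Vinted specifics, the control flow is roughly this. Everything here is a sketch: &lt;code&gt;bootstrap&lt;/code&gt; stands in for the Playwright unlock and &lt;code&gt;fetchPage&lt;/code&gt; for the HTTP client hitting the internal API; neither is a real library call.&lt;/p&gt;

```typescript
// Sketch of the asymmetric loop: the browser unlocks, HTTP does the volume,
// and any 401/403/429 swaps the session and resumes the same page.
// bootstrap() and fetchPage() are placeholders, not real APIs.
async function scrapeCatalog(bootstrap: any, fetchPage: any, maxPages: number) {
  let session = await bootstrap();              // Playwright: solve challenge, capture cookies
  const items: any[] = [];
  let page = 1;
  let refreshes = 0;
  while (page !== maxPages + 1) {
    const res = await fetchPage(session, page); // fast HTTP call with captured cookies
    const blocked = [401, 403, 429].includes(res.status);
    if (blocked) {
      refreshes = refreshes + 1;
      if (refreshes === 3) throw new Error('Blocked after 3 session refreshes');
      session = await bootstrap();              // fresh residential IP + cookies
      continue;                                 // resume the same page
    }
    refreshes = 0;
    if (res.items.length === 0) break;          // end of catalog
    for (const item of res.items) items.push(item);
    page = page + 1;
  }
  return items;
}
```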

&lt;h2&gt;
  
  
  Three lessons I keep relearning as an indie shipper
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Loud failure beats silent success for retention.&lt;/strong&gt; A run that returns zero items should &lt;em&gt;fail loud&lt;/em&gt;, not succeed quietly. Status displays based on &lt;code&gt;exitCode&lt;/code&gt; lie about product-level outcomes. Always assert at the end of every workflow that the user got what they paid for, and crash if not.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Country-bind your residential proxies for any geo-routed product.&lt;/strong&gt; Datadome, Cloudflare, Akamai — all of them flag the mismatched-IP pattern. Two lines of config (&lt;code&gt;countryCode&lt;/code&gt;) are worth a 35-percentage-point swing in success rate on non-default markets.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;When 60% of customers churn quietly, look at your fixed fees, not your per-unit price.&lt;/strong&gt; The headline rate ($1.50/1k) was reasonable. The hidden $0.30 minimum was lethal for the 58% of runs that were small. Always price for the smallest unit your customer actually wants to buy, not your ARPU.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
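&lt;p&gt;Lesson 1 generalizes beyond scraping. A minimal, hypothetical wrapper that turns any "clean exit, empty result" into a loud failure:&lt;/p&gt;

```typescript
// Sketch: assert product-level success at the end of a workflow.
// A process-level success (exit code 0) with zero delivered items
// becomes a thrown error instead of a silent SUCCEEDED.
async function assertDelivered(run: any, label: string) {
  const result = await run();
  const count = Array.isArray(result) ? result.length : 0;
  if (count === 0) {
    throw new Error(
      'Workflow "' + label + '" finished cleanly but delivered 0 items. '
      + 'Failing loud so the user sees an actionable error, not an empty dataset.'
    );
  }
  return result;
}
```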

&lt;h2&gt;
  
  
  Where the fix lives
&lt;/h2&gt;

&lt;p&gt;The actor is the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper?fpr=8fp2od" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on Apify Store. Open-source integration examples (curl, Node, Python, batch multi-country, scheduling) are at &lt;a href="https://github.com/Boo-n/vinted-turbo-scraper" rel="noopener noreferrer"&gt;github.com/Boo-n/vinted-turbo-scraper&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Apify's free plan includes $5/month of credits, enough to run a few hundred scrapes before you commit to anything.&lt;/p&gt;

&lt;p&gt;If you ship anything similar, drop a comment with your retention/pricing tradeoffs — would love to compare notes.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Last verified: May 2026.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>saas</category>
      <category>productivity</category>
      <category>webscraping</category>
      <category>buildinpublic</category>
    </item>
    <item>
      <title>The Best Vinted Scraper in 2026 — Honest Comparison of 8 Tools (Tested)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Wed, 06 May 2026 15:26:04 +0000</pubDate>
      <link>https://forem.com/boo_n/the-best-vinted-scraper-in-2026-honest-comparison-of-8-tools-tested-5ej8</link>
      <guid>https://forem.com/boo_n/the-best-vinted-scraper-in-2026-honest-comparison-of-8-tools-tested-5ej8</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;TL;DR — Best Vinted scraper depends on your use case&lt;/strong&gt; &lt;em&gt;(May 2026, all data verified on Apify Store)&lt;/em&gt;:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Use case&lt;/th&gt;
&lt;th&gt;Pick&lt;/th&gt;
&lt;th&gt;Pricing&lt;/th&gt;
&lt;th&gt;Success rate&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Paste URL → JSON pipeline&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Vinted Turbo Scraper&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.08 + $3.50/1k&lt;/td&gt;
&lt;td&gt;95%+&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cross-country price arbitrage&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Vinted Smart Scraper&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.50/1k flat&lt;/td&gt;
&lt;td&gt;98.4%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Continuous monitoring + alerts&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;epicscrapers' Monitor&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0.001/result + $0.00083/check&lt;/td&gt;
&lt;td&gt;99.5%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&amp;lt;100 listings/week (free)&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;shahidirfan's Scraper&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;$0&lt;/td&gt;
&lt;td&gt;~80%&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;AI agent (Claude/Cursor)&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Vinted MCP Server&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~$0&lt;/td&gt;
&lt;td&gt;98.7%&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Skip any tool with a success rate below 90% — Datadome will eat your runs.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;I'm one of the developers behind two of the scrapers on this list (Vinted Turbo Scraper and Vinted Smart Scraper, both on the &lt;a href="https://apify.com/kazkn" rel="noopener noreferrer"&gt;&lt;code&gt;kazkn&lt;/code&gt; Apify profile&lt;/a&gt;). So this isn't neutral — but I review my own actors' analytics weekly &lt;em&gt;and&lt;/em&gt; track my competitors' public Apify Store stats, which gives me unusually good visibility into how each tool actually performs in production.&lt;/p&gt;

&lt;p&gt;If you're choosing a Vinted scraper in 2026 and don't want to waste $20 testing them all, this comparison gives you the answer in under 5 minutes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why a Vinted scraper exists in the first place
&lt;/h2&gt;

&lt;p&gt;Vinted is the largest secondhand fashion marketplace in Europe — 100M+ users, ~25 country-specific domains (vinted.fr, vinted.de, vinted.co.uk, vinted.es, vinted.it, vinted.pl, etc.). It does not publish a public API. The website's internal &lt;code&gt;/api/v2/catalog/items&lt;/code&gt; endpoint is protected by &lt;a href="https://datadome.co" rel="noopener noreferrer"&gt;Datadome&lt;/a&gt;, one of the more aggressive anti-bot layers on the web.&lt;/p&gt;

&lt;p&gt;This makes scraping non-trivial. Resellers, market researchers, price-tracking startups, and AI agents all need Vinted catalog data — but the technical barrier (Datadome bypass + residential proxies + country binding) is enough that most people pay for a managed scraper rather than build their own.&lt;/p&gt;

&lt;p&gt;That's the market this comparison covers.&lt;/p&gt;

&lt;h2&gt;
  
  
  Methodology
&lt;/h2&gt;

&lt;p&gt;For each tool below, I checked:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Public success rate&lt;/strong&gt; (Apify Store displays &lt;code&gt;succeededRuns / totalRuns&lt;/code&gt; for the last 30 days on every actor page)&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Pricing model and effective cost per 1k listings&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Country coverage&lt;/strong&gt; (which Vinted TLDs are supported reliably)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Speed&lt;/strong&gt; (items/min based on my own test runs and public stats)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;What it's actually good at&lt;/strong&gt; (architecture, output schema, reliability tradeoffs)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All numbers below were captured between &lt;strong&gt;May 2 and May 6, 2026&lt;/strong&gt;. Apify Store updates these stats every 24h, so they're as current as it gets.&lt;/p&gt;




&lt;h2&gt;
  
  
  1. ⚡ Vinted Turbo Scraper &lt;em&gt;(my own — full disclosure)&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper?fpr=8fp2od" rel="noopener noreferrer"&gt;apify.com/kazkn/vinted-turbo-scraper&lt;/a&gt; · &lt;a href="https://github.com/Boo-n/vinted-turbo-scraper" rel="noopener noreferrer"&gt;GitHub examples&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;$0.04/GB Actor Start + $0.0035/result&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Effective cost (100 items, 2 GB)&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;$0.43&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Effective cost (1,000 items, 2 GB)&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;$3.58&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Success rate (30d)&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;95 %+&lt;/strong&gt; (post v0.0.89 country-binding rewrite)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Speed&lt;/td&gt;
&lt;td&gt;~500 items/min&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Country coverage&lt;/td&gt;
&lt;td&gt;All 26 Vinted TLDs (TLD auto-detected, proxy bound to the matching country)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Output&lt;/td&gt;
&lt;td&gt;Structured JSON (id, title, price, brand, size, condition, photos, seller, location, scrapedAt)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;What it does well&lt;/strong&gt;: turns a Vinted search URL into structured JSON as fast as possible. The whole input is just &lt;code&gt;startUrls&lt;/code&gt; (newline-separated, batch-supported), &lt;code&gt;maxItems&lt;/code&gt;, and proxy config. No filter form to fill in. URL-native.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What's special&lt;/strong&gt; (and what fixed my retention 2 months ago): country-aware proxy binding. &lt;code&gt;vinted.fr&lt;/code&gt; URLs get FR residential IPs, &lt;code&gt;vinted.de&lt;/code&gt; URLs get DE IPs, etc. Without this, a French IP scraping vinted.de gets geo-redirected and returns garbage. This single config tweak took my non-FR success rate from ~60 % to &amp;gt;95 %.&lt;/p&gt;
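&lt;p&gt;The idea behind country binding is simple enough to sketch. The snippet below is an illustrative reduction, not the actor's actual source — the TLD map is a subset and the names are made up:&lt;/p&gt;

```python
from urllib.parse import urlparse

# Map each Vinted TLD to the residential-proxy country it must be
# scraped from. (Illustrative subset -- the actor covers all 26 domains.)
TLD_TO_PROXY_COUNTRY = {
    "vinted.fr": "FR",
    "vinted.de": "DE",
    "vinted.co.uk": "GB",
    "vinted.es": "ES",
    "vinted.it": "IT",
    "vinted.pl": "PL",
}

def proxy_country_for(url: str) -> str:
    """Resolve the proxy country code bound to a Vinted search URL."""
    host = urlparse(url).hostname or ""
    host = host.removeprefix("www.")
    # Fall back to FR, Vinted's home market, for unknown domains.
    return TLD_TO_PROXY_COUNTRY.get(host, "FR")
```

&lt;p&gt;The resolved code is then passed to the residential proxy configuration, so a &lt;code&gt;vinted.de&lt;/code&gt; request is never made from a French IP and geo-redirected into garbage.&lt;/p&gt;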

&lt;p&gt;&lt;strong&gt;What it doesn't do&lt;/strong&gt;: no seller analysis, no sold-item lookup, no cross-country comparison. For that, see Vinted Smart Scraper below.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pick this if&lt;/strong&gt;: you want the fastest setup from filtered Vinted search → exported listings, you batch URLs across multiple countries, you need it reliable on non-FR markets.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Try it free&lt;/strong&gt;: Apify Free plan includes $5/month of platform credits — enough for ~1,400 listings.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. 🧠 Vinted Smart Scraper — Cross-Country Price Comparison &lt;em&gt;(my own)&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/kazkn/vinted-smart-scraper?fpr=8fp2od" rel="noopener noreferrer"&gt;apify.com/kazkn/vinted-smart-scraper&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;$0.0005/result flat ($0.50 / 1k)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Effective cost (1,000 items)&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;$0.50&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Success rate (30d)&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;98.4 %&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Country coverage&lt;/td&gt;
&lt;td&gt;19 Vinted TLDs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Best for&lt;/td&gt;
&lt;td&gt;Cross-country arbitrage research, sold-item analysis, seller deep-dive&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Why it's cheaper than Turbo per result&lt;/strong&gt;: it's a research-tier actor optimized for volume. You'd typically run it once per day for arbitrage research, not poll it every 15 minutes for monitoring.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Killer feature&lt;/strong&gt;: pulls listings from multiple country domains in parallel and computes the spread per item. The same Nike Dunk Low can sit at €180 in France, €145 in Germany, and €220 in Italy — that spread is where reseller arbitrage happens. It also does seller analysis (number of listings, response rate, ratings) and sold-item lookup (extremely rare in the Vinted scraper space).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pick this if&lt;/strong&gt;: you're a reseller doing cross-country arbitrage, or building a Vinted price intelligence dashboard.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Vinted Scraper + Monitor &lt;em&gt;(epicscrapers)&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/epicscrapers/vinted-search-scraper" rel="noopener noreferrer"&gt;apify.com/epicscrapers/vinted-search-scraper&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;$0.001/result + $0.00083/monitor check&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Success rate (30d)&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;99.5 %&lt;/strong&gt; &lt;em&gt;(highest in this comparison)&lt;/em&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Best for&lt;/td&gt;
&lt;td&gt;Continuous monitoring with native alert primitives&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;What it's good at&lt;/strong&gt;: built-in alert plumbing. You configure it to ping every X seconds and trigger webhook actions when new listings match your filter. Good fit if you want monitoring without wiring up your own scheduler + webhook.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tradeoff&lt;/strong&gt;: more expensive at scale than running Apify Scheduler + Vinted Turbo Scraper yourself — except at high polling frequency. Math: monitoring 25 items every 15 min (96 polls/day) costs (96 runs × $0.08 start) + (96 × 25 × $0.0035) = &lt;strong&gt;$16.08/day&lt;/strong&gt; with Turbo, versus roughly &lt;strong&gt;$10/day&lt;/strong&gt; under epicscrapers' per-check pricing. So at high frequency, epicscrapers wins. At low frequency (hourly polls), Turbo wins.&lt;/p&gt;
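&lt;p&gt;If your polling interval differs, the break-even math is one helper function. Prices here are the list prices from the spec tables above; plug in your own schedule:&lt;/p&gt;

```python
def turbo_daily_cost(polls_per_day, items_per_poll,
                     start_fee=0.08, per_result=0.0035):
    """Daily cost of a start-fee + per-result actor run on a schedule."""
    return polls_per_day * (start_fee + items_per_poll * per_result)

# 25 monitored items, polled every 15 minutes (96 polls/day):
cost = turbo_daily_cost(96, 25)  # 96 x (0.08 + 25 x 0.0035) = 16.08
```

&lt;p&gt;Run the same function at 24 polls/day (hourly) before deciding — the start fee dominates at high frequency, the per-result fee at high item counts.&lt;/p&gt;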

&lt;p&gt;&lt;strong&gt;Pick this if&lt;/strong&gt;: you want plug-and-play monitoring without writing scheduler config.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. shahidirfan's Vinted Scraper &lt;em&gt;(free)&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/shahidirfan/Vinted-Scraper" rel="noopener noreferrer"&gt;apify.com/shahidirfan/Vinted-Scraper&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;FREE&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Effective cost&lt;/td&gt;
&lt;td&gt;$0&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Best for&lt;/td&gt;
&lt;td&gt;Hobbyist, occasional small-volume runs&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;What it's good at&lt;/strong&gt;: literally free. If you're scraping &amp;lt;100 listings a week for a personal project, this is hard to beat on cost.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tradeoff&lt;/strong&gt;: limited support, no country-bound proxy logic (fails often on non-FR markets), may break when Vinted updates anti-bot tokens.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pick this if&lt;/strong&gt;: you're a hobbyist, you only scrape vinted.fr, and your runs are infrequent enough that occasional failures don't break your workflow.&lt;/p&gt;

&lt;h2&gt;
  
  
  5. 🤖 Vinted MCP Server &lt;em&gt;(my own — for AI agents)&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/kazkn/vinted-mcp-server?fpr=8fp2od" rel="noopener noreferrer"&gt;apify.com/kazkn/vinted-mcp-server&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;$0.00001/result (effectively free)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;For&lt;/td&gt;
&lt;td&gt;Claude / Cursor / Windsurf AI agents&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;What it does&lt;/strong&gt;: exposes Vinted data as Model Context Protocol tools so an AI agent can call "search Vinted for X" and get structured results. Built on Apify's MCP server primitive.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pick this if&lt;/strong&gt;: you're building an AI agent (Claude Project, Cursor extension, Windsurf workflow) that needs Vinted data on demand at conversation time, not from a pre-scraped dataset.&lt;/p&gt;

&lt;h2&gt;
  
  
  6. automation-lab's Vinted Scraper &lt;em&gt;(skip if you can)&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/automation-lab/vinted-scraper" rel="noopener noreferrer"&gt;apify.com/automation-lab/vinted-scraper&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;$0.0027/result + $0.00475 start&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Success rate (30d)&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;56 %&lt;/strong&gt; &lt;em&gt;(low — known Datadome issues)&lt;/em&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;What it's good at&lt;/strong&gt;: 16 markets supported on paper.&lt;br&gt;
&lt;strong&gt;Tradeoff&lt;/strong&gt;: a 56 % success rate means almost half your runs fail. If failed runs still incur charges, the advertised $2.70/1k works out to roughly $4.85 per 1k &lt;em&gt;usable&lt;/em&gt; items ($2.70 ÷ 0.56, plus start fees on retries). Look at the public stats page before committing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verdict&lt;/strong&gt;: skip until they fix the proxy story. The pricing looks competitive but the success rate makes it a worse deal than Turbo or Smart.&lt;/p&gt;

&lt;h2&gt;
  
  
  7. piotrv1001's Vinted Listings Scraper
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/piotrv1001/vinted-listings-scraper" rel="noopener noreferrer"&gt;apify.com/piotrv1001/vinted-listings-scraper&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;$0.001/result + $0.004 per seller detail (optional)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Success rate (30d)&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;100 %&lt;/strong&gt; &lt;em&gt;(but small sample — verify on their page)&lt;/em&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;What it's good at&lt;/strong&gt;: granular pricing — you only pay for seller detail fetches if you need them. Good for resellers who only need listings 90 % of the time and seller data 10 % of the time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pick this if&lt;/strong&gt;: per-seller granularity matters and you're cost-sensitive on listings-only volume.&lt;/p&gt;

&lt;h2&gt;
  
  
  8. saswave's Vinted Product &amp;amp; Profile Scraper &lt;em&gt;(skip)&lt;/em&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://apify.com/saswave/vinted-product-item-profile-scraper" rel="noopener noreferrer"&gt;apify.com/saswave/vinted-product-item-profile-scraper&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Spec&lt;/th&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Pricing&lt;/td&gt;
&lt;td&gt;$0.001/result&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Success rate (30d)&lt;/td&gt;
&lt;td&gt;
&lt;strong&gt;48.8 %&lt;/strong&gt; &lt;em&gt;(unstable)&lt;/em&gt;
&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The actor description says "$0.40/1000 results, no proxies" — but the 48 % success rate is exactly &lt;em&gt;because&lt;/em&gt; there are no residential proxies. Without them, Datadome blocks half of the runs. Skip until proxies are added.&lt;/p&gt;




&lt;h2&gt;
  
  
  How to choose in 60 seconds
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Are you scraping &amp;lt;100 listings/week for personal use?
└── YES → shahidirfan (free)
└── NO ↓

Do you need cross-country price comparison or arbitrage?
└── YES → Vinted Smart Scraper ($0.50/1k)
└── NO ↓

Are you building an AI agent (Claude / Cursor / Windsurf)?
└── YES → Vinted MCP Server (free tier)
└── NO ↓

Do you need monitoring with built-in alerts?
└── YES → epicscrapers' Monitor
└── NO → Vinted Turbo Scraper (paste URL, get JSON, done)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  What to ask before paying for any Vinted scraper
&lt;/h2&gt;

&lt;p&gt;Four signals to check on the actor's Apify Store page &lt;strong&gt;before&lt;/strong&gt; running anything:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Public success rate?&lt;/strong&gt; Every Apify Store actor page shows it. &lt;strong&gt;Anything below 90 % means failed runs you'll still get charged for&lt;/strong&gt; (in some cases — depends on the pricing model). For Vinted specifically, &amp;lt;90 % is almost always a Datadome/proxy issue, and it won't get better without dev investment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Country-bound proxies?&lt;/strong&gt; This is the #1 reliability fix in the Vinted scraper space in 2026. If the actor doesn't bind proxy &lt;code&gt;countryCode&lt;/code&gt; to the URL TLD, you'll see empty results on non-FR markets. Test with a vinted.de URL before committing to volume.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pricing transparency?&lt;/strong&gt; Look for a clear pricing tab and a &lt;code&gt;maxTotalChargeUsd&lt;/code&gt; cap to prevent runaway runs from blowing your budget.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pagination metadata robustness?&lt;/strong&gt; Vinted has changed pagination response shape twice in 2026. If the actor breaks every time Vinted shifts a field, you'll get partial datasets silently.&lt;/li&gt;
&lt;/ol&gt;
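&lt;p&gt;On point 3: the spend cap is set per run, not per actor. A minimal sketch of building the run call with a cap — the endpoint shape is Apify's public run-actor API, but verify the exact parameter name for your pricing model against Apify's current API docs before relying on it:&lt;/p&gt;

```python
from urllib.parse import urlencode

def run_url(actor_id, token, max_charge_usd):
    """Build an Apify run-actor call with a hard spend cap.
    maxTotalChargeUsd caps what a pay-per-event run may charge."""
    base = "https://api.apify.com/v2/acts/" + actor_id.replace("/", "~") + "/runs"
    query = urlencode({"token": token, "maxTotalChargeUsd": max_charge_usd})
    return base + "?" + query
```

&lt;p&gt;POST your actor input as JSON to that URL and the run aborts instead of billing past the cap.&lt;/p&gt;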

&lt;h2&gt;
  
  
  Frequently Asked Questions
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Is Vinted scraping legal in 2026?
&lt;/h3&gt;

&lt;p&gt;Public catalog data (titles, prices, photos shown to anonymous browsers) sits in a grey area in most jurisdictions. The widely cited &lt;em&gt;hiQ Labs v. LinkedIn&lt;/em&gt; (US, 2022) and the EU's Database Directive both lean permissive for non-personal data scraped from publicly accessible pages. &lt;strong&gt;Personal seller data is different&lt;/strong&gt; — it's protected under GDPR and you should not redistribute it without a lawful basis. Don't scrape what you can't see logged out, don't redistribute personal data, respect &lt;a href="https://www.vinted.com/terms_and_conditions" rel="noopener noreferrer"&gt;Vinted's terms&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Does Vinted have a public API?
&lt;/h3&gt;

&lt;p&gt;No. Vinted does not publish a public API for third-party developers as of May 2026. The internal &lt;code&gt;/api/v2/catalog/items&lt;/code&gt; endpoint exists and is what scraper actors hit, but it's not documented and Vinted can change its shape at any time.&lt;/p&gt;

&lt;h3&gt;
  
  
  How much does it cost to scrape 1,000 Vinted listings?
&lt;/h3&gt;

&lt;p&gt;Depends on the tool. &lt;strong&gt;Cheapest&lt;/strong&gt;: shahidirfan's free actor ($0). &lt;strong&gt;Cheapest paid&lt;/strong&gt;: Vinted Smart Scraper at $0.50. &lt;strong&gt;Mid-tier&lt;/strong&gt;: Vinted Turbo Scraper at $3.58 (with $0.08 fixed start fee included). &lt;strong&gt;Most expensive in this comparison&lt;/strong&gt;: alkausari_mujahid at $10/1k.&lt;/p&gt;

&lt;h3&gt;
  
  
  Will scraping Vinted ban my account?
&lt;/h3&gt;

&lt;p&gt;Authenticated scraping (using your Vinted login cookies) carries account-ban risk. Anonymous scraping (no Vinted login) carries IP-ban risk on the proxy, not your account. All actors in this comparison use anonymous scraping with rotating residential proxies — no Vinted login required from you, and your account is not exposed.&lt;/p&gt;

&lt;h3&gt;
  
  
  What's the fastest Vinted scraper in 2026?
&lt;/h3&gt;

&lt;p&gt;Vinted Turbo Scraper at ~500 items/min on a single residential session. Vinted Smart Scraper hits ~400 items/min in cross-country mode (parallelized but with country-binding overhead). epicscrapers' Monitor is rate-limited by design (you set the polling interval, not the throughput).&lt;/p&gt;

&lt;h3&gt;
  
  
  Can I scrape multiple Vinted countries in one run?
&lt;/h3&gt;

&lt;p&gt;Yes, but only with actors that have country-binding logic. Vinted Turbo Scraper, Vinted Smart Scraper, and (partially) epicscrapers' Monitor support multi-country batch input. Other actors in this comparison default to a single country per run, which means you'd run separate jobs per country and merge the data yourself.&lt;/p&gt;

&lt;h3&gt;
  
  
  Difference between Vinted Turbo Scraper and Vinted Smart Scraper?
&lt;/h3&gt;

&lt;p&gt;Turbo is &lt;strong&gt;fast and URL-native&lt;/strong&gt; ($0.08 + $3.50/1k). Paste a search URL, get JSON in 30-60 seconds. For monitoring, alerts, simple pipelines.&lt;/p&gt;

&lt;p&gt;Smart is &lt;strong&gt;research-grade&lt;/strong&gt; ($0.50/1k flat). Cross-country comparison, seller analysis, sold-item lookup. For arbitrage research, price intelligence dashboards, academic studies.&lt;/p&gt;

&lt;p&gt;Different tools for different intents. Both ship from the same dev team with the same anti-bot stack.&lt;/p&gt;




&lt;h2&gt;
  
  
  Bottom line
&lt;/h2&gt;

&lt;p&gt;Most Vinted scrapers fail for the same reason: they don't country-bind their residential proxies, so non-FR markets return empty results and Datadome flags the IP pool. The 3 scrapers I'd actually use in 2026 are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Vinted Turbo Scraper&lt;/strong&gt; — for paste-URL → JSON workflows&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Vinted Smart Scraper&lt;/strong&gt; — for cross-country research&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;shahidirfan's free actor&lt;/strong&gt; — for occasional hobby runs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Apify's Free plan gives you $5/month of credits, enough to test all three before committing.&lt;/p&gt;

&lt;p&gt;If you'd like the full architectural writeup of how Vinted Turbo Scraper bypasses Datadome (asymmetric scraping, country-bound proxies, fail-loud retention pattern), I documented it on &lt;a href="https://github.com/Boo-n/vinted-turbo-scraper" rel="noopener noreferrer"&gt;my GitHub repo&lt;/a&gt; with curl, Node, and Python integration examples.&lt;/p&gt;

&lt;p&gt;Questions, corrections, or use cases I missed? Drop them in the comments — I update this guide monthly.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Last verified: May 6, 2026&lt;/em&gt;&lt;/p&gt;

</description>
      <category>webscraping</category>
      <category>vinted</category>
      <category>ecommerce</category>
      <category>automation</category>
    </item>
    <item>
      <title>I scraped 1,200 Shopify stores to qualify B2B leads — here's what I learned about ICP</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Tue, 05 May 2026 12:50:27 +0000</pubDate>
      <link>https://forem.com/boo_n/i-scraped-1200-shopify-stores-to-qualify-b2b-leads-heres-what-i-learned-about-icp-12ag</link>
      <guid>https://forem.com/boo_n/i-scraped-1200-shopify-stores-to-qualify-b2b-leads-heres-what-i-learned-about-icp-12ag</guid>
      <description>&lt;p&gt;&lt;em&gt;A 6-week experiment in turning competitor research into a $0.005-per-store API.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://youtu.be/jxpSVYvZBFw" rel="noopener noreferrer"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fekj5k0a1cby3fu6h6f4l.jpg" alt="Watch the 2-minute walkthrough" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;▶️ &lt;em&gt;2-minute video walkthrough of the actor in action — input, run, dataset, API call.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;There is a quiet rule in B2B sales nobody puts on a slide: &lt;strong&gt;the cost of qualifying a bad lead is roughly equal to the cost of closing a good one.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Six weeks ago I was trying to validate ICP for a B2B side-project targeting Shopify operators. The hypothesis was tight: "Stores running Klaviyo plus a paid reviews app spend money on retention tooling, so they will pay for ours."&lt;/p&gt;

&lt;p&gt;To test it I needed two columns next to each store name: &lt;strong&gt;email provider&lt;/strong&gt; and &lt;strong&gt;reviews provider&lt;/strong&gt;. Maybe a third for &lt;strong&gt;subscriptions&lt;/strong&gt;. From those three I could segment 1,200 stores into a tier-one list of about 200, and avoid wasting outreach on the rest.&lt;/p&gt;

&lt;p&gt;The data exists. It is in the page source of every Shopify store. Apps inject &lt;code&gt;&amp;lt;script&amp;gt;&lt;/code&gt; tags from their own CDN — &lt;code&gt;cdn.judge.me&lt;/code&gt;, &lt;code&gt;cdn.yotpo.com&lt;/code&gt;, &lt;code&gt;loox.io/widget&lt;/code&gt;, &lt;code&gt;klaviyo.com/onsite&lt;/code&gt;. Any store using Klaviyo loads a Klaviyo script. The information was right there.&lt;/p&gt;

&lt;p&gt;But three minutes of View Source per store, times 1,200 stores, is 60 hours. I do not have 60 hours.&lt;/p&gt;

&lt;p&gt;So I did the thing I had been avoiding for months: I wrote the scraper.&lt;/p&gt;




&lt;h2&gt;
  
  
  What I expected to find
&lt;/h2&gt;

&lt;p&gt;I expected the scraping itself to be the hard part. It was not.&lt;/p&gt;

&lt;p&gt;I expected proxies, retries, and rate-limit roulette. None of it materialized — &lt;code&gt;/products.json&lt;/code&gt; is publicly served on every Shopify store and the homepage HTML is, well, a homepage. No bot challenges, no CAPTCHAs. A polite concurrency limit of 5 simultaneous requests is enough to scan 1,000 stores in 25 minutes without anyone noticing.&lt;/p&gt;
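&lt;p&gt;That polite scan fits in a few lines of stdlib Python. This is an illustrative sketch of the approach, not the actor's code:&lt;/p&gt;

```python
import json
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

def catalog_url(store):
    """Public Shopify catalog endpoint for one store (no auth needed)."""
    return store.rstrip("/") + "/products.json?limit=250"

def fetch_catalog(store):
    """Download and parse one store's product catalog."""
    with urlopen(catalog_url(store), timeout=30) as resp:
        return json.load(resp)

def scan(stores, concurrency=5):
    """Scan a list of stores with a polite cap on simultaneous requests."""
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(fetch_catalog, stores))
```

&lt;p&gt;The thread pool is the whole rate-limiting story: 5 workers, no proxies, no backoff logic needed at this volume.&lt;/p&gt;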

&lt;p&gt;What I did not expect was how much &lt;strong&gt;the app stack tells you about the operator&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;A Shopify store on Judge.me Free is a different company than a Shopify store on Yotpo Premium. Same revenue band, same vertical, same product type — totally different stage, budget, and pain points.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Judge.me Free → indie operator, doing under $30k/month, allergic to monthly subscriptions, will not buy your $99/month tool unless you frame it as ROI within 30 days.&lt;/li&gt;
&lt;li&gt;Yotpo Premium → seven-figure DTC brand, has a marketing team, will compare you against 4 competitors in a vendor matrix before signing, and will negotiate.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can replicate this exercise across every app category:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Klaviyo Free → still building list, every dollar matters.&lt;/li&gt;
&lt;li&gt;Klaviyo paid (&amp;gt;$100/mo) → mature email program, ready for sophisticated tooling.&lt;/li&gt;
&lt;li&gt;Postscript → SMS-first DTC, modern stack, probably trying everything.&lt;/li&gt;
&lt;li&gt;Mailchimp → legacy stack, conservative, harder to displace.&lt;/li&gt;
&lt;li&gt;ReCharge → subscription-driven economics, focus on retention.&lt;/li&gt;
&lt;li&gt;Smile.io → loyalty-conscious, willing to invest in retention tooling.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Six weeks ago I would have called this overthinking. Today I run my outbound off it. Reply rate moved from 4% to 11% on a 200-account test, simply by changing the opener line to acknowledge the actual stack the prospect was running.&lt;/p&gt;




&lt;h2&gt;
  
  
  The build, in three observations
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Observation 1 — Most "Shopify scrapers" are not.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Every existing tool I tested fell into one of two camps. Either it scraped products only (no app detection), or it detected apps but only one store at a time through a Chrome extension. Nothing did both, in batch, at a low per-store cost.&lt;/p&gt;

&lt;p&gt;The closest matches were paid SaaS dashboards (Storeleads, Charm, BuiltWith) at $99 to $499 per month, with monthly export caps. For a one-off list of 1,200 stores I could not justify a $1,200 yearly subscription that I would forget to use.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Observation 2 — App detection is a 10-line regex problem, not an ML problem.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Every Shopify app I needed to detect ships through one of three patterns: a &lt;code&gt;&amp;lt;script src="cdn.[appname].com/..."&amp;gt;&lt;/code&gt; tag in the homepage HTML, a &lt;code&gt;&amp;lt;meta name="generator"&amp;gt;&lt;/code&gt; tag with the app name, or an inline &lt;code&gt;_q.push(...)&lt;/code&gt; queue call. Match on any of those three, OR them together, return a boolean.&lt;/p&gt;

&lt;p&gt;The whole detection module — covering Klaviyo, Yotpo, Judge.me, Loox, Stamped, Reviews.io, ReCharge, Bold, Skio, Privy, Justuno, Mailchimp, Postscript, Attentive, Smile.io, Searchanise, Boost, and 8 others — is about 600 lines of JavaScript including snapshot tests. New detectors are 15-minute additions.&lt;/p&gt;
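&lt;p&gt;The OR-of-three-patterns approach looks roughly like this — the signatures below are an illustrative subset, not the shipped 600-line module:&lt;/p&gt;

```python
import re

# One compiled pattern per app; a match on any of its ship patterns
# (script src, meta generator, inline queue call) counts as installed.
APP_SIGNATURES = {
    "klaviyo": re.compile(r"klaviyo\.com/onsite|static\.klaviyo\.com", re.I),
    "judgeme": re.compile(r"cdn\.judge\.me", re.I),
    "yotpo": re.compile(r"cdn\.yotpo\.com", re.I),
    "loox": re.compile(r"loox\.io/widget", re.I),
}

def detect_apps(html):
    """Return {app_name: bool} for one storefront's homepage HTML."""
    return {name: bool(rx.search(html)) for name, rx in APP_SIGNATURES.items()}
```

&lt;p&gt;A new detector is one dictionary entry plus a snapshot test, which is why additions take 15 minutes.&lt;/p&gt;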

&lt;p&gt;&lt;strong&gt;Observation 3 — Pay-per-event pricing changes the unit economics.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Apify's Store lets developers price by the row. So a 500-product store with full app detection plus reviews costs about $0.30 to scan. A thousand-store batch at the standard products-plus-apps tier costs about $5.&lt;/p&gt;

&lt;p&gt;That number matters. At $5 per batch, refreshing my ICP list weekly is a coffee. At $300 per batch (which is what specialized SaaS would charge for the same volume), I would refresh quarterly and miss every interesting signal in between.&lt;/p&gt;

&lt;p&gt;The cheaper the unit, the higher the refresh frequency. The higher the refresh frequency, the better the signal quality. This is true for every kind of competitive intelligence work, and it's the reason I shipped the actor as a public tool instead of keeping it private.&lt;/p&gt;




&lt;h2&gt;
  
  
  What surprised me
&lt;/h2&gt;

&lt;p&gt;Three things I did not expect, in order of how badly I underestimated them:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. &lt;code&gt;/products.json&lt;/code&gt; is more honest than the storefront.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Shopify's catalog endpoint exposes products that have been unpublished from the theme but are still live in the database — out-of-stock items, B2B-only SKUs, retired collections that nobody bothered to fully delete. For research, this is gold. You see what the merchant sells today and what they sold last quarter.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Reviews-app detection turned out to be the strongest lead signal.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;More predictive than email provider, more predictive than vertical, more predictive than location. A store paying for reviews is a store that has scaled past the early stage and is now optimizing for retention and social proof. That's where my offer lands.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. People want this packaged as an MCP tool for Claude.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Two of the first three external users asked. I had not planned for it. The pattern is clear though — once you can pipe Shopify-store data into Claude or Cursor and ask "qualify these 200 stores for my ICP", you stop opening spreadsheets. I am building it next.&lt;/p&gt;




&lt;h2&gt;
  
  
  The actor, if you want to use it
&lt;/h2&gt;

&lt;p&gt;I shipped the scraper on Apify Store as &lt;strong&gt;&lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Apps Spy + Product Scraper&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;What it does in one call:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Pulls the &lt;strong&gt;full product catalog&lt;/strong&gt; for a list of Shopify URLs (titles, prices, variants, images, vendor, tags).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Detects installed apps&lt;/strong&gt; across email/SMS, reviews, subscriptions, popups, search, loyalty.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pulls reviews&lt;/strong&gt; when a reviews app is detected, by routing to that app's public reviews API.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;What it costs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;$0.005 per store for the standard tier (products + apps).&lt;/li&gt;
&lt;li&gt;$0.30 for a 500-product store with full reviews.&lt;/li&gt;
&lt;li&gt;Apify gives a $5 free credit on signup, which covers about 1,000 stores at the standard tier.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;What it doesn't do:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Historical data. If you need "who started using Klaviyo in Q1 2024," you want BuiltWith.&lt;/li&gt;
&lt;li&gt;Cross-platform. Shopify only. WooCommerce/Magento are different problems.&lt;/li&gt;
&lt;li&gt;Filtering by revenue band. Storeleads does that better.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're an indie founder, agency analyst, or sales rep doing 100-2,000 stores per month and you need raw exports, it should fit. If your volume is much larger or much smaller, the SaaS competitors are probably the right call.&lt;/p&gt;




&lt;h2&gt;
  
  
  The takeaway, if you skim
&lt;/h2&gt;

&lt;p&gt;Three things I would do differently if I were starting over:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Build the qualification tool before you start prospecting, not after&lt;/strong&gt;. The 60-hour manual baseline is what kills the experiment. Every B2B founder I have asked has the same story.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Treat tech-stack data as ICP data, not technographic trivia&lt;/strong&gt;. The app a store runs is downstream of their stage, budget, and team size. Use it that way.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Refresh weekly, not quarterly&lt;/strong&gt;. Cheap refresh frequency beats expensive depth nine times out of ten in early-stage outbound.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The scraper is on &lt;strong&gt;&lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Apify Store&lt;/a&gt;&lt;/strong&gt;, free $5 credit covers your first batch. If a detector is missing, ping me — each is a 15-minute add.&lt;/p&gt;




&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;How do I detect what apps a Shopify store is using?&lt;/strong&gt;&lt;br&gt;
Apps inject identifiable scripts into the storefront HTML — &lt;code&gt;cdn.judge.me&lt;/code&gt;, &lt;code&gt;cdn.yotpo.com&lt;/code&gt;, &lt;code&gt;klaviyo.com/onsite&lt;/code&gt;, etc. Either inspect the page source manually (3 minutes per store) or use &lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Apps Spy + Product Scraper&lt;/a&gt; to detect 150+ apps in batch at $0.005 per store.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Is scraping Shopify legal?&lt;/strong&gt;&lt;br&gt;
Yes, for publicly accessible product data. Shopify exposes &lt;code&gt;/products.json&lt;/code&gt; on every storefront and the homepage HTML is public. No login, no API key, no proxy needed for most stores. You're reading what the merchant chose to publish.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How long does it take to scan 1,000 Shopify stores?&lt;/strong&gt;&lt;br&gt;
About 25 minutes at a polite concurrency of 5 simultaneous requests. The bottleneck is &lt;code&gt;/products.json&lt;/code&gt; response size, not rate limits — Shopify storefronts handle this volume without complaint.&lt;/p&gt;
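&lt;p&gt;A back-of-envelope check of that figure (the ~7.5 s per store is my assumption, consistent with the quoted 25 minutes):&lt;/p&gt;

```javascript
// Throughput estimate: stores are processed 5 at a time, so total time
// is (stores / concurrency) * seconds-per-store. 7.5 s is an assumed
// average for a mid-sized catalog.
const stores = 1000;
const concurrency = 5;
const avgSecondsPerStore = 7.5; // assumption

const totalMinutes = (stores / concurrency) * avgSecondsPerStore / 60;
console.log(totalMinutes); // 25
```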

&lt;p&gt;&lt;strong&gt;What's a realistic cost for B2B lead qualification across 1,200 Shopify stores?&lt;/strong&gt;&lt;br&gt;
Around $3 of compute on Apify's pay-per-event pricing — $0.005 per store for products + apps detection, or about $0.30 per store for a full run with reviews. The $5 free Apify credit covers your first ~1,500 stores.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Which Shopify apps are the strongest signal of B2B SaaS fit?&lt;/strong&gt;&lt;br&gt;
Reviews providers (Yotpo Premium, Okendo, Stamped Pro) signal seven-figure DTC. Klaviyo paid plans signal a mature email program. Postscript or Attentive signal an SMS-first, modern stack. Smile.io signals retention-conscious operators ready to invest in tooling.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Tags: shopify, b2b, lead-generation, ecommerce, indiehackers&lt;/em&gt;&lt;/p&gt;

</description>
      <category>shopify</category>
      <category>ecommerce</category>
      <category>b2b</category>
      <category>indiehackers</category>
    </item>
    <item>
      <title>I built a Shopify scraper that detects apps + pulls products in one API call</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sat, 02 May 2026 11:50:01 +0000</pubDate>
      <link>https://forem.com/boo_n/i-built-a-shopify-scraper-that-detects-apps-pulls-products-in-one-api-call-5a8b</link>
      <guid>https://forem.com/boo_n/i-built-a-shopify-scraper-that-detects-apps-pulls-products-in-one-api-call-5a8b</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;TL;DR&lt;/strong&gt; — Existing Shopify app detectors (Koala Inspector, ShopScan, Fera, BuiltWith) are Chrome extensions or SaaS dashboards. None do batch. I had 1,200 stores to qualify and View Source + Cmd-F was killing my afternoons, so I shipped an Apify actor that takes a list of Shopify URLs and returns the full app stack (Klaviyo, Yotpo, Judge.me, Loox, ReCharge…) + product catalog + reviews in JSON. No headless browser, ~$0.005 per store, 1,000 stores in 25 minutes. Live here → &lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Scraper – Apps Spy + Reviews&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  The afternoon that broke me
&lt;/h2&gt;

&lt;p&gt;Six weeks ago I was prospecting for a B2B side-project. The hypothesis: "Shopify stores running Klaviyo + a paid reviews app are the right ICP — they spend money on retention tooling, so they will pay for ours."&lt;/p&gt;

&lt;p&gt;To validate, I needed a list of Shopify stores &lt;strong&gt;and&lt;/strong&gt; their installed apps.&lt;/p&gt;

&lt;p&gt;The Shopify App Store does not give you that. The "stores using X" databases do, but the public ones are stale and the good ones are paid SaaS at $99–499/month for filters I did not need.&lt;/p&gt;

&lt;p&gt;So I did what every founder does at 11 PM: I opened View Source on a competitor list, hit &lt;code&gt;Cmd-F&lt;/code&gt;, and started typing &lt;code&gt;klaviyo&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;It worked. Sort of. I did 40 stores in two hours, then stopped, because I had a list of 1,200.&lt;/p&gt;

&lt;p&gt;That night I wrote the first version of what is now &lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Scraper – Apps Spy + Reviews&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I actually wanted
&lt;/h2&gt;

&lt;p&gt;Every "Shopify scraper" I found online did one of two things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Scraped a single store's products via &lt;code&gt;/products.json&lt;/code&gt; — table-stakes, dozens of free Apify actors do it.&lt;/li&gt;
&lt;li&gt;Spawned a headless browser to fingerprint a marketing site — slow, expensive, and brittle.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I wanted three things in one pass:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Full &lt;strong&gt;product catalog&lt;/strong&gt; (titles, prices, variants, images, vendor, tags) — nothing exotic.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;App detection&lt;/strong&gt;: which third-party Shopify apps are installed (email, reviews, subscriptions, popups, search).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reviews&lt;/strong&gt; when a reviews app is detected — pull them via the public API, not by parsing widgets.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And I wanted it to be &lt;strong&gt;cheap&lt;/strong&gt;, because I had ~1,200 stores in my first batch and I planned to run it monthly.&lt;/p&gt;

&lt;h2&gt;
  
  
  The "no headless browser" decision
&lt;/h2&gt;

&lt;p&gt;The thing nobody tells you about Shopify scraping is that you almost never need a headless browser. The signals you want for app detection live in three places, and all three are reachable with a plain HTTPS GET:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;The HTML of the homepage&lt;/strong&gt;. Shopify apps inject &lt;code&gt;&amp;lt;script&amp;gt;&lt;/code&gt; tags from their own CDN. &lt;code&gt;cdn.judge.me&lt;/code&gt;, &lt;code&gt;cdn.yotpo.com&lt;/code&gt;, &lt;code&gt;loox.io/widget&lt;/code&gt;, &lt;code&gt;klaviyo.com/onsite&lt;/code&gt; — you grep the HTML and you know.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;/products.json&lt;/code&gt;&lt;/strong&gt;. Shopify exposes the full catalog at this path on every store, paginated 250 items at a time. No auth, no headless. (You hit a soft rate limit around 2 req/s per IP, which is fine if you queue politely.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;App-specific public endpoints&lt;/strong&gt;. Judge.me has a JSON reviews endpoint. Yotpo too. Same for Loox, Stamped, Reviews.io. Once you know which app is installed, you go straight to its API — no DOM parsing.&lt;/li&gt;
&lt;/ol&gt;
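&lt;p&gt;The paginated walk in point 2 can be sketched as follows. This is illustrative rather than the actor's code; &lt;code&gt;fetchJson&lt;/code&gt; is an injected stand-in for the HTTP client:&lt;/p&gt;

```javascript
// Walk /products.json page by page (250 items per page) until the store
// runs out of products or we hit the requested cap.
async function fetchCatalog(storeUrl, fetchJson, maxProducts = 1000) {
  const all = [];
  for (let page = 1; ; page++) {
    if (all.length >= maxProducts) break;
    const params = new URLSearchParams({ limit: "250", page: String(page) });
    const body = await fetchJson(`${storeUrl}/products.json?${params}`);
    const products = body.products || [];
    if (products.length === 0) break; // no more pages
    all.push(...products);
  }
  return all.slice(0, maxProducts);
}
```

&lt;p&gt;At 250 items per page, even a 1,000-product catalog is only four requests.&lt;/p&gt;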

&lt;p&gt;The whole actor is built around that observation. No Puppeteer, no Playwright, no proxy farm. Just &lt;code&gt;got-scraping&lt;/code&gt;, &lt;code&gt;cheerio&lt;/code&gt;, and &lt;code&gt;p-queue&lt;/code&gt; to keep concurrency civilized.&lt;/p&gt;

&lt;p&gt;The result is that scanning a single store costs ~3–6 HTTPS requests and runs in &lt;strong&gt;2 to 8 seconds&lt;/strong&gt; depending on catalog size. Cost on Apify infra: about $0.005 per store for the "tech stack only" mode.&lt;/p&gt;

&lt;h2&gt;
  
  
  The architecture (it is small on purpose)
&lt;/h2&gt;

&lt;p&gt;I'll be honest — I almost over-engineered this. My first draft had Redis for de-dup, a queue, retry logic with exponential backoff, and a state machine. Then I deleted all of it.&lt;/p&gt;

&lt;p&gt;Here is what shipped:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;src/
├── main.js                   # orchestration (p-queue, per-store flow)
├── crawlers/
│   ├── products.js           # /products.json + sitemap fallback
│   ├── apps.js               # detect apps from homepage HTML
│   └── reviews.js            # per-app reviews fetchers
└── lib/
    ├── normalize.js          # canonicalize URLs, normalize product schema
    ├── schemas.js            # zod validation for outputs
    └── billing.js            # Apify pay-per-event charges
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A run goes:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Canonicalize the store URL (handles &lt;code&gt;www&lt;/code&gt;, custom domains, &lt;code&gt;*.myshopify.com&lt;/code&gt;).&lt;/li&gt;
&lt;li&gt;Fetch the homepage once. Confirm it is Shopify (the &lt;code&gt;x-shopify-stage&lt;/code&gt; header is a giveaway).&lt;/li&gt;
&lt;li&gt;From the same HTML, run the app detectors. Each detector is ~10 lines of regex matching against script tags + meta tags + inline JSON.&lt;/li&gt;
&lt;li&gt;Fetch &lt;code&gt;/products.json?page=N&lt;/code&gt; until you hit the cap or run out of products.&lt;/li&gt;
&lt;li&gt;If the user asked for reviews and an installed reviews app was detected, fan out to that app's public reviews API.&lt;/li&gt;
&lt;/ol&gt;
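&lt;p&gt;Step 1 can be sketched with a small helper (&lt;code&gt;canonicalizeStoreUrl&lt;/code&gt; is a hypothetical name; only the scheme/&lt;code&gt;www&lt;/code&gt;/path handling is shown, not the &lt;code&gt;*.myshopify.com&lt;/code&gt; mapping):&lt;/p&gt;

```javascript
// Normalize a store URL: force https, strip a leading "www.", drop any path.
function canonicalizeStoreUrl(input) {
  const withScheme = input.includes("://") ? input : `https://${input}`;
  const host = new URL(withScheme).hostname.replace(/^www\./, "");
  return `https://${host}`;
}

console.log(canonicalizeStoreUrl("http://www.allbirds.com/collections/all"));
// "https://allbirds.com"
```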

&lt;p&gt;That is it. The whole thing is ~900 lines of JavaScript. I run it with &lt;code&gt;node --test&lt;/code&gt; for unit tests against snapshots and a &lt;code&gt;tests/smoke-products.mjs&lt;/code&gt; that hits 5 real stores end-to-end.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I learned about app detection
&lt;/h2&gt;

&lt;p&gt;The regex-against-HTML approach has one trap. Shopify themes minify, version, and CDN-rewrite their assets, so you cannot match on a single string. The Klaviyo loader, for example, ships under at least four URL patterns I have seen:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;static.klaviyo.com/onsite/js/klaviyo.js&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;static-tracking.klaviyo.com/onsite/js/...&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;a.klaviyo.com/media/...&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;inline &lt;code&gt;_learnq&lt;/code&gt; queue calls&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You match &lt;strong&gt;any&lt;/strong&gt; of those, and you call it Klaviyo. Same logic for every other app — every detector is an array of patterns, OR'd together, returning a single boolean. I wrote a snapshot test per app with a real store HTML page so a Klaviyo URL change does not silently break detection.&lt;/p&gt;
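&lt;p&gt;In code, a detector of that shape is just the pattern array and a &lt;code&gt;some()&lt;/code&gt; (the list mirrors the four URL shapes above; the structure is a sketch, not the shipped source):&lt;/p&gt;

```javascript
// One detector = an array of patterns OR'd into a single boolean.
const KLAVIYO_PATTERNS = [
  /static\.klaviyo\.com\/onsite\/js\/klaviyo\.js/,
  /static-tracking\.klaviyo\.com\/onsite\/js\//,
  /a\.klaviyo\.com\/media\//,
  /_learnq/, // inline queue calls
];

const hasKlaviyo = (html) => KLAVIYO_PATTERNS.some((re) => re.test(html));

console.log(hasKlaviyo('var _learnq = _learnq || [];')); // true
```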

&lt;p&gt;The detectors I shipped on day one:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Email/SMS&lt;/strong&gt;: Klaviyo, Omnisend, Postscript, Mailchimp, Attentive&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Reviews&lt;/strong&gt;: Yotpo, Judge.me, Loox, Stamped, Reviews.io, Okendo&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Subscriptions&lt;/strong&gt;: ReCharge, Bold, Loop, Skio&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Popups &amp;amp; SMS capture&lt;/strong&gt;: Privy, Justuno, Klaviyo Forms&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Search &amp;amp; discovery&lt;/strong&gt;: Searchanise, Boost, Algolia&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Loyalty&lt;/strong&gt;: Smile.io, Yotpo Loyalty, LoyaltyLion&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you tell me an app I missed, I add a detector. Each one is a 15-minute job.&lt;/p&gt;

&lt;h2&gt;
  
  
  The pay-per-event pricing problem
&lt;/h2&gt;

&lt;p&gt;Apify lets you charge per event instead of per compute minute. For a scraper that runs in seconds, this is the right model — your customer pays for the rows they get, not for compute time.&lt;/p&gt;

&lt;p&gt;The mistake I made on my first push was leaving Apify's default &lt;code&gt;dataset_item&lt;/code&gt; event on. Combined with my custom &lt;code&gt;product_extracted&lt;/code&gt; event, every product was being charged twice. I caught it in monetization review and removed the synthetic event.&lt;/p&gt;

&lt;p&gt;The pricing I landed on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;store_analyzed&lt;/code&gt; — $0.003 per store (covers detection + products fetch)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;product_extracted&lt;/code&gt; — $0.0005 per product&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;apps_detected&lt;/code&gt; — $0.001 per store at standard+&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;review_extracted&lt;/code&gt; — $0.0003 per review&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A 500-product store with reviews costs roughly &lt;strong&gt;$0.30&lt;/strong&gt; end to end. For comparison, the SaaS competitors charge $99/month or more for similar lookups, with batch sizes capped.&lt;/p&gt;
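&lt;p&gt;The arithmetic behind that figure, using the event prices above (the review count is my assumption; ~150 reviews lands near the quoted total):&lt;/p&gt;

```javascript
// Per-event prices from the list above, in dollars.
const price = {
  store_analyzed: 0.003,
  apps_detected: 0.001,
  product_extracted: 0.0005,
  review_extracted: 0.0003,
};

const products = 500;
const reviews = 150; // assumed review volume

const total =
  price.store_analyzed +
  price.apps_detected +
  products * price.product_extracted +
  reviews * price.review_extracted;

console.log(total.toFixed(3)); // "0.299"
```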

&lt;h2&gt;
  
  
  What surprised me
&lt;/h2&gt;

&lt;p&gt;Three things, in order of how badly I underestimated them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. &lt;code&gt;/products.json&lt;/code&gt; is more honest than the storefront.&lt;/strong&gt; It exposes products that are unpublished from the theme but still live (out-of-stock holdovers, B2B-only SKUs). Useful for trend research. Sometimes shocking.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Reviews-app detection is a lead signal.&lt;/strong&gt; A store on Judge.me Free plan vs. Yotpo Premium tells you a lot about their stage. I ended up using this internally to prioritize cold outbound — different pitch for a $30/month stack vs. a $1,200/month stack.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. People want this as an MCP server.&lt;/strong&gt; Two of my first three users asked if they could query it from Claude / ChatGPT. I have it on the roadmap. (My &lt;a href="https://apify.com/kazkn/gpt-crawler-mcp" rel="noopener noreferrer"&gt;GPT Crawler MCP&lt;/a&gt; and &lt;a href="https://apify.com/kazkn/vinted-mcp-server" rel="noopener noreferrer"&gt;Vinted MCP Server&lt;/a&gt; are the two MCP actors I shipped first; the Shopify one is next.)&lt;/p&gt;

&lt;h2&gt;
  
  
  How to use it in one minute
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// On Apify, paste this in the actor input box&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;store_urls&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://allbirds.com&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://gymshark.com&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;extract_level&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;standard&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;  &lt;span class="c1"&gt;// products + apps stack&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;max_products_per_store&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;250&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output (one record per product, with &lt;code&gt;apps_detected&lt;/code&gt; attached):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"store_domain"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"allbirds.com"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"product_title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Wool Runner"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;110&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"available"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"vendor"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Allbirds"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"apps_detected"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"email"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Klaviyo"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"reviews"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Yotpo"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"subscriptions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"search"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Searchanise"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"product_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://allbirds.com/products/mens-wool-runners"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you want reviews, set &lt;code&gt;extract_level: "full"&lt;/code&gt; and a &lt;code&gt;max_reviews_per_product&lt;/code&gt;. The actor will route to the correct reviews API based on what was detected.&lt;/p&gt;

&lt;p&gt;Direct link, free $5 credit on Apify, no account-creation drama: &lt;strong&gt;&lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;Shopify Scraper – Apps Spy + Reviews&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Is scraping &lt;code&gt;/products.json&lt;/code&gt; allowed?
&lt;/h3&gt;

&lt;p&gt;Shopify exposes &lt;code&gt;/products.json&lt;/code&gt; publicly on every store by default. Stores that disable it (rare) return 404 and the actor logs a skip. The actor never authenticates, never bypasses access controls, and respects standard rate limits.&lt;/p&gt;

&lt;h3&gt;
  
  
  What about reCAPTCHA or Cloudflare?
&lt;/h3&gt;

&lt;p&gt;Not for the standard catalog and app-detection flow — &lt;code&gt;/products.json&lt;/code&gt; and the homepage HTML are not gated. For some reviews APIs, very high request volumes can trigger rate limits — the actor backs off and retries with jitter.&lt;/p&gt;

&lt;h3&gt;
  
  
  How is this different from Koala Inspector, ShopScan or BuiltWith?
&lt;/h3&gt;

&lt;p&gt;Koala Inspector, ShopScan, and Fera are excellent Chrome extensions for one-store lookups, but none of them do batch — you cannot paste 500 URLs and get a CSV back. BuiltWith is a generic tech-stack tool with broad coverage, but its Shopify-app detection is shallow and it cannot pull products in the same call. This actor is purpose-built for Shopify and runs in batch via API: deeper app detection (subscriptions, reviews, popups, search, loyalty), full product catalog, and reviews — all in one pass, billed pay-per-event.&lt;/p&gt;

&lt;h3&gt;
  
  
  How long does a 1,000-store scan take?
&lt;/h3&gt;

&lt;p&gt;About 25 minutes at default concurrency, costing ~$3 of Apify credits at the &lt;code&gt;standard&lt;/code&gt; level. A &lt;code&gt;full&lt;/code&gt; run with reviews is closer to an hour and ~$15 depending on review volume.&lt;/p&gt;

&lt;h3&gt;
  
  
  Can I get one record per variant instead of per product?
&lt;/h3&gt;

&lt;p&gt;Yes. Set &lt;code&gt;include_variants: true&lt;/code&gt; in the input and the dataset returns one row per SKU with size/color/price/availability normalized.&lt;/p&gt;
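&lt;p&gt;Conceptually, the flattening looks like this (field names are assumptions modeled on Shopify's &lt;code&gt;/products.json&lt;/code&gt; shape, not the actor's exact output schema):&lt;/p&gt;

```javascript
// One input product with N variants becomes N rows, one per SKU.
function flattenVariants(product) {
  return product.variants.map((v) => ({
    product_title: product.title,
    sku: v.sku,
    option: v.title, // e.g. "US 9 / Black"
    price: v.price,
    available: v.available,
  }));
}

const rows = flattenVariants({
  title: "Wool Runner",
  variants: [
    { sku: "WR-9-BLK", title: "US 9 / Black", price: "110.00", available: true },
    { sku: "WR-10-BLK", title: "US 10 / Black", price: "110.00", available: false },
  ],
});
console.log(rows.length); // 2
```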

&lt;h2&gt;
  
  
  What is next
&lt;/h2&gt;

&lt;p&gt;I want to add three things, in order:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Revenue estimation&lt;/strong&gt; at the &lt;code&gt;pro&lt;/code&gt; tier — based on review velocity and product velocity, both of which are observable.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MCP server mode&lt;/strong&gt; so you can query it from Claude desktop / Cursor.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Theme detection&lt;/strong&gt; — useful for agency outbound, less useful for me, but I keep being asked.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you use it and something breaks, ping me — I am the only maintainer and I read every issue. The actor is on Apify Store at &lt;a href="https://apify.com/kazkn/shopify-scraper-apps-spy" rel="noopener noreferrer"&gt;kazkn/shopify-scraper-apps-spy&lt;/a&gt;.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Tags: #shopify #ecommerce #api #indiehackers&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Was this useful?&lt;/strong&gt; ❤️ a reaction or drop a comment with the use-case you're trying to solve — I read every reply and add detector + endpoint coverage based on what people actually need.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Follow &lt;a href="https://dev.to/boo_n"&gt;@boo_n&lt;/a&gt;&lt;/strong&gt; for hands-on tutorials: scraping reviews at scale, ICP qualification at $0.005 per store, and turning the actor into an MCP tool for Claude / Cursor.&lt;/p&gt;

</description>
      <category>shopify</category>
      <category>ecommerce</category>
      <category>api</category>
      <category>indiehackers</category>
    </item>
    <item>
      <title>How to Batch-Scrape 10 Vinted Search URLs in One Run: A Reseller's Workflow</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Fri, 24 Apr 2026 15:29:40 +0000</pubDate>
      <link>https://forem.com/boo_n/how-to-batch-scrape-10-vinted-search-urls-in-one-run-a-resellers-workflow-4ed1</link>
      <guid>https://forem.com/boo_n/how-to-batch-scrape-10-vinted-search-urls-in-one-run-a-resellers-workflow-4ed1</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi37z1th3g2d3jrx6hlc.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgi37z1th3g2d3jrx6hlc.jpg" alt="Vinted Turbo Scraper Batch Workflow" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Two weeks ago, a reseller I work with asked me a surprisingly common question:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"I track 12 different brands across Vinted. That means 12 different search URLs, 12 different filter combinations, 12 different countries depending on where inventory is cheapest. Right now I run them one by one. How do I automate all of this without writing orchestration code?"&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The short answer: &lt;strong&gt;paste all 12 URLs at once&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;The tool: &lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This article breaks down the exact batch workflow I set up for him — and why it's the most underused feature of the Actor.&lt;/p&gt;

&lt;h2&gt;
  
  
  📺 Watch the Tutorial
&lt;/h2&gt;

&lt;p&gt;  &lt;iframe src="https://www.youtube.com/embed/rWtZVDMflbo"&gt;
  &lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Batch URLs Matter (More Than Speed)
&lt;/h2&gt;

&lt;p&gt;Speed is great. But for serious resellers and market researchers, &lt;strong&gt;throughput&lt;/strong&gt; is what actually moves the needle.&lt;/p&gt;

&lt;p&gt;Here's the reality of single-URL scraping:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Workflow&lt;/th&gt;
&lt;th&gt;Time per URL&lt;/th&gt;
&lt;th&gt;Total for 10 URLs&lt;/th&gt;
&lt;th&gt;Manual Steps&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Run manually, one by one&lt;/td&gt;
&lt;td&gt;3–5 min&lt;/td&gt;
&lt;td&gt;30–50 min&lt;/td&gt;
&lt;td&gt;10 separate runs, 10 separate exports&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Batch all URLs in one run&lt;/td&gt;
&lt;td&gt;3–5 min total&lt;/td&gt;
&lt;td&gt;3–5 min&lt;/td&gt;
&lt;td&gt;1 run, 1 export&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Most people never think about batching because:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Vinted's own site doesn't suggest it&lt;/li&gt;
&lt;li&gt;Most Python scrapers on GitHub are built for single requests&lt;/li&gt;
&lt;li&gt;Apify Actors often look like they only accept one input&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Vinted Turbo Scraper accepts an array of URLs.&lt;/strong&gt; Not one. An array. Paste 10, get 10 datasets merged into a single export.&lt;/p&gt;

&lt;h2&gt;
  
  
  Real-World Use Cases
&lt;/h2&gt;

&lt;p&gt;Before diving into the how-to, here's who actually uses this feature:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Multi-brand resellers&lt;/strong&gt; — tracking Nike, Adidas, Jordan, New Balance, and vintage Patagonia simultaneously across the same country domain.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cross-country arbitrageurs&lt;/strong&gt; — monitoring the same keyword across Vinted.fr, Vinted.de, Vinted.nl, and Vinted.pl to spot price gaps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Category monitors&lt;/strong&gt; — running separate searches for Men's Sneakers, Women's Sneakers, and Kids' Sneakers with different size filters.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deal hunters&lt;/strong&gt; — setting up 15–20 hyper-specific filter combos (brand + price + size + condition + color) and running them all nightly via Apify's scheduler.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Batch-Scrape Vinted URLs: Step-by-Step
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Step 1: Prepare your search URLs
&lt;/h3&gt;

&lt;p&gt;This is identical to single-URL mode. For each segment you want to track:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to Vinted (any country domain)&lt;/li&gt;
&lt;li&gt;Apply filters: brand, size, price range, condition, color, category&lt;/li&gt;
&lt;li&gt;Copy the URL from your browser bar&lt;/li&gt;
&lt;li&gt;Repeat for every segment you want&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Example URL set for a sneaker reseller:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://www.vinted.fr/catalog?search_text=jordan&amp;amp;price_from=50&amp;amp;price_to=100&amp;amp;size_from=42&amp;amp;size_to=44&amp;amp;status_id=6
https://www.vinted.fr/catalog?search_text=nike&amp;amp;price_from=30&amp;amp;price_to=80&amp;amp;size_from=40&amp;amp;size_to=45&amp;amp;status_id=6
https://www.vinted.de/catalog?search_text=jordan&amp;amp;price_from=40&amp;amp;price_to=90&amp;amp;status_id=6
https://www.vinted.nl/catalog?search_text=nike&amp;amp;price_from=35&amp;amp;price_to=85&amp;amp;status_id=6
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Each URL already contains the exact filters you need. No rebuilding required.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2: Configure the Actor for batch mode
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on the Apify Store&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Try for free&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;In the Input tab, locate the &lt;code&gt;searchURLs&lt;/code&gt; field&lt;/li&gt;
&lt;li&gt;Paste your URLs — one per line, or as a JSON array:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Plain text input:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://www.vinted.fr/catalog?search_text=jordan...
https://www.vinted.fr/catalog?search_text=nike...
https://www.vinted.de/catalog?search_text=jordan...
https://www.vinted.nl/catalog?search_text=nike...
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;JSON input (for API/programmatic runs):&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"searchURLs"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/catalog?search_text=jordan&amp;amp;price_from=50&amp;amp;price_to=100"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.de/catalog?search_text=jordan&amp;amp;price_from=40&amp;amp;price_to=90"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.nl/catalog?search_text=nike&amp;amp;price_from=35&amp;amp;price_to=85"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Step 3: Run and export
&lt;/h3&gt;

&lt;p&gt;Click &lt;strong&gt;Start&lt;/strong&gt;. The Actor processes each URL sequentially, without the overhead of a browser restart between URLs. Results from all URLs are merged into a single output dataset.&lt;/p&gt;
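&lt;p&gt;Conceptually, the merge with source attribution works like this (a sketch, not the Actor's internals):&lt;/p&gt;

```javascript
// Each batch of listings is tagged with the search URL it came from,
// then all batches are concatenated into one dataset.
function mergeBatches(batches) {
  return batches.flatMap(({ sourceUrl, listings }) =>
    listings.map((listing) => ({ ...listing, source_url: sourceUrl }))
  );
}

const merged = mergeBatches([
  { sourceUrl: "https://www.vinted.fr/catalog?search_text=nike", listings: [{ title: "Air Max 90" }] },
  { sourceUrl: "https://www.vinted.de/catalog?search_text=jordan", listings: [{ title: "Jordan 1" }] },
]);
console.log(merged.length); // 2
```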

&lt;p&gt;&lt;strong&gt;Export options:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;CSV&lt;/strong&gt; — flat table, one listing per row, with an added &lt;code&gt;source_url&lt;/code&gt; column showing which search URL each listing came from&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JSON&lt;/strong&gt; — structured object with all fields + source attribution&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Excel&lt;/strong&gt; — same as CSV, formatted for Excel&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Google Sheets&lt;/strong&gt; — live push to a shared spreadsheet&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API&lt;/strong&gt; — consume via the Apify API, perfect for custom pipelines&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step 4: Automate with scheduling (optional but recommended)
&lt;/h3&gt;

&lt;p&gt;Apify has a built-in scheduler. I set mine to run every day at 6 AM CET:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the Actor's &lt;strong&gt;Schedules&lt;/strong&gt; tab in Apify Console&lt;/li&gt;
&lt;li&gt;Create a new schedule: &lt;strong&gt;Daily at 06:00&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Paste your URL array into the schedule input&lt;/li&gt;
&lt;li&gt;Set the destination: Google Sheets, webhook, or just store in Apify's key-value store&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Every morning, I wake up to a fresh dataset in my Google Sheet. Zero manual work.&lt;/p&gt;

&lt;h2&gt;
  
  
  What the Output Actually Looks Like
&lt;/h2&gt;

&lt;p&gt;When you batch URLs, the Actor adds one critical field to every record:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/items/987654321-nike-air-max-90"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Nike Air Max 90"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;55.00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"currency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"EUR"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"brand"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Nike"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"43"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"condition"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Good"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller_username"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"paris_sneakers"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Paris, France"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"thumbnail"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images1.vinted.net/..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"source_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/catalog?search_text=nike&amp;amp;price_from=30&amp;amp;price_to=80"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scraped_at"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-04-24T06:03:15.000Z"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Key field:&lt;/strong&gt; &lt;code&gt;source_url&lt;/code&gt; tells you exactly which search query produced each listing. This is essential when you're running 10+ URLs simultaneously and need to segment results later.&lt;/p&gt;
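&lt;p&gt;Because every record carries &lt;code&gt;source_url&lt;/code&gt;, segmenting the merged export back into per-search buckets is a few lines. A minimal sketch, with field names matching the output example above:&lt;/p&gt;

```typescript
// Group a merged batch export by the search URL that produced each record.
type Listing = { title: string; price: number; source_url: string };

const listings: Listing[] = [
  { title: "Nike Air Max 90", price: 55, source_url: "https://www.vinted.fr/catalog?search_text=nike" },
  { title: "Nike Dunk Low", price: 70, source_url: "https://www.vinted.fr/catalog?search_text=nike" },
  { title: "Jordan 1 Mid", price: 75, source_url: "https://www.vinted.fr/catalog?search_text=jordan" },
];

const bySource = new Map();
for (const item of listings) {
  const bucket = bySource.get(item.source_url) ?? [];
  bucket.push(item);
  bySource.set(item.source_url, bucket);
}
// bySource now holds one array of listings per original search URL.
```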

&lt;h2&gt;
  
  
  Cost Breakdown for Batch Workflows
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Scenario&lt;/th&gt;
&lt;th&gt;URLs per run&lt;/th&gt;
&lt;th&gt;Avg listings per URL&lt;/th&gt;
&lt;th&gt;Total results&lt;/th&gt;
&lt;th&gt;Cost per run&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Small batch (3 brands, 1 country)&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;400&lt;/td&gt;
&lt;td&gt;1,200&lt;/td&gt;
&lt;td&gt;$1.80&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Medium batch (5 brands, 2 countries)&lt;/td&gt;
&lt;td&gt;10&lt;/td&gt;
&lt;td&gt;350&lt;/td&gt;
&lt;td&gt;3,500&lt;/td&gt;
&lt;td&gt;$5.25&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Large batch (15+ segments)&lt;/td&gt;
&lt;td&gt;15&lt;/td&gt;
&lt;td&gt;300&lt;/td&gt;
&lt;td&gt;4,500&lt;/td&gt;
&lt;td&gt;$6.75&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;With Apify's $5 monthly free credits, a small batch workflow costs you &lt;strong&gt;nothing&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;For active resellers running daily large batches: $6.75 × 30 days = ~$200/month. The time saved is roughly 2–3 hours per day of manual scraping, monitoring, and data cleaning. At any consulting rate, the ROI is immediate.&lt;/p&gt;
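&lt;p&gt;Every row in the table follows the same per-result formula ($0.0015 per listing), so you can estimate any batch before you run it:&lt;/p&gt;

```typescript
// Per-result pricing: $0.0015 per listing, regardless of how many URLs
// the run contains.
const PRICE_PER_RESULT = 0.0015;

function estimateRunCost(urls: number, avgListingsPerUrl: number): number {
  return urls * avgListingsPerUrl * PRICE_PER_RESULT;
}

// Matches the table above:
// estimateRunCost(3, 400)  ≈ $1.80
// estimateRunCost(10, 350) ≈ $5.25
// estimateRunCost(15, 300) ≈ $6.75
```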

&lt;h2&gt;
  
  
  Common Batch Pitfalls (and How to Avoid Them)
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Pitfall 1: Mixing country domains in one run&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The Actor handles &lt;code&gt;.fr&lt;/code&gt;, &lt;code&gt;.de&lt;/code&gt;, &lt;code&gt;.nl&lt;/code&gt;, &lt;code&gt;.pl&lt;/code&gt;, &lt;code&gt;.es&lt;/code&gt;, &lt;code&gt;.it&lt;/code&gt;, &lt;code&gt;.be&lt;/code&gt;, etc. natively. No issue. But be aware that pricing will be in local currencies — EUR for Eurozone, GBP for UK, PLN for Poland. Normalize in your export pipeline or spreadsheet.&lt;/p&gt;
&lt;/blockquote&gt;
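&lt;p&gt;A normalization step in the export pipeline can be as simple as a rate lookup. The rates below are illustrative placeholders, not live data — pull current rates (e.g. from the ECB) in a real pipeline:&lt;/p&gt;

```typescript
// Normalize mixed-currency batch results to EUR before comparing prices.
// Rates are PLACEHOLDERS for illustration -- fetch live rates in practice.
const RATES_TO_EUR: { [code: string]: number } = {
  EUR: 1.0,
  GBP: 1.17, // placeholder rate
  PLN: 0.23, // placeholder rate
};

function toEur(price: number, currency: string): number {
  const rate = RATES_TO_EUR[currency];
  if (rate === undefined) throw new Error(`No rate for ${currency}`);
  return Math.round(price * rate * 100) / 100; // round to cents
}
```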

&lt;p&gt;&lt;strong&gt;Pitfall 2: URLs with too few filters&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;An overly broad URL (e.g., just &lt;code&gt;search_text=vintage&lt;/code&gt; with no price/size filters) can return 10,000+ results per URL. The Actor caps per-URL extraction to prevent runaway costs, but it's better to filter aggressively on Vinted first.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Pitfall 3: Identical URLs&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Deduplication happens at the dataset level, but identical URLs waste compute. Check your URL list before running.&lt;/p&gt;
&lt;/blockquote&gt;
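&lt;p&gt;Checking the list before a run is a one-liner worth keeping in any batch-building script:&lt;/p&gt;

```typescript
// Deduplicate the URL array before launching a run -- identical URLs
// waste compute even though results are deduplicated downstream.
function dedupeUrls(urls: string[]): string[] {
  return Array.from(new Set(urls.map((u) => u.trim())));
}
```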

&lt;p&gt;&lt;strong&gt;Pitfall 4: Expecting real-time inventory&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Vinted listings appear and disappear fast — especially underpriced items. A scheduled run every 6 hours catches most opportunities without hitting rate limits.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Integrating Batch Results Into Your Workflow
&lt;/h2&gt;

&lt;p&gt;Here's the exact stack I use with a reseller partner:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Step&lt;/th&gt;
&lt;th&gt;Tool&lt;/th&gt;
&lt;th&gt;Purpose&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;1. Scrape&lt;/td&gt;
&lt;td&gt;Vinted Turbo Scraper (Apify)&lt;/td&gt;
&lt;td&gt;Pull 15 URLs nightly&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;2. Normalize&lt;/td&gt;
&lt;td&gt;Google Sheets (Apify integration)&lt;/td&gt;
&lt;td&gt;Clean + currency unification&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;3. Alert&lt;/td&gt;
&lt;td&gt;n8n&lt;/td&gt;
&lt;td&gt;Price-drop notifications via Telegram&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;4. Act&lt;/td&gt;
&lt;td&gt;Manual&lt;/td&gt;
&lt;td&gt;Buy the listing, message seller, relist&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The entire pipeline from "run Actor" to "receive Telegram alert for a €45 Jordan 1" is under 5 minutes.&lt;/p&gt;

&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Q: What's the maximum number of URLs per run?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Tested comfortably up to 50 URLs per run. Beyond that, splitting into multiple runs is safer for stability. Each URL is processed sequentially, so it's more about total runtime than URL count.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Does batch mode cost more?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;No. Pricing is $0.0015 per result, regardless of how many URLs produced that result. 1,000 listings from 1 URL = $1.50. 1,000 listings from 10 URLs = $1.50.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Can I batch across different Vinted country domains?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Yes. Mix &lt;code&gt;.fr&lt;/code&gt;, &lt;code&gt;.de&lt;/code&gt;, &lt;code&gt;.nl&lt;/code&gt;, &lt;code&gt;.pl&lt;/code&gt;, &lt;code&gt;.es&lt;/code&gt;, &lt;code&gt;.it&lt;/code&gt;, &lt;code&gt;.be&lt;/code&gt;, etc. in the same input array. The Actor handles domain detection automatically.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: What if one URL fails?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Failed URLs are logged in the Actor's run log. Successful URLs still produce their dataset. No total-run failure from a single bad URL.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Vinted Turbo Scraper accepts &lt;strong&gt;multiple search URLs in one run&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Results merge into a single export with &lt;code&gt;source_url&lt;/code&gt; attribution&lt;/li&gt;
&lt;li&gt;Schedule daily runs, push to Google Sheets, build alert pipelines&lt;/li&gt;
&lt;li&gt;Pricing is per-result, not per-URL — batching is free in terms of cost structure&lt;/li&gt;
&lt;li&gt;Supports all 26 Vinted country domains in a single batch&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Launch Vinted Turbo Scraper&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Related tools:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://apify.com/kazkn/vinted-smart-scraper" rel="noopener noreferrer"&gt;Vinted Smart Scraper&lt;/a&gt; — Deep seller analytics + cross-country price comparison&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://apify.com/kazkn/vinted-mcp-server" rel="noopener noreferrer"&gt;Vinted MCP Server&lt;/a&gt; — Natural language Vinted queries via AI&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://dev.to/boo_n/how-i-scrape-1000-vinted-listings-in-under-2-minutes-without-writing-a-single-line-of-code-1lol"&gt;How I Scrape 1,000 Vinted Listings in Under 2 Minutes&lt;/a&gt; — Speed-focused guide with video tutorial&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>apify</category>
      <category>vinted</category>
      <category>webscraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>How I Scrape 1,000 Vinted Listings in Under 2 Minutes (Without Writing a Single Line of Code)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Fri, 24 Apr 2026 15:14:21 +0000</pubDate>
      <link>https://forem.com/boo_n/how-i-scrape-1000-vinted-listings-in-under-2-minutes-without-writing-a-single-line-of-code-1lol</link>
      <guid>https://forem.com/boo_n/how-i-scrape-1000-vinted-listings-in-under-2-minutes-without-writing-a-single-line-of-code-1lol</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbqif5dmjabo326j8094v.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbqif5dmjabo326j8094v.jpg" alt="Vinted Turbo Scraper Speed Test" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Two months ago, a friend running a sneaker resale business called me with a very specific problem:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"I need to pull every Jordan 1 listing under €80 in France, Germany, and Spain. I don't know Python. I don't have proxies. And I don't have three days to fight Cloudflare. How do I do this in under five minutes?"&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I pointed him to a tool I'd built exactly for this: the &lt;strong&gt;Vinted Turbo Scraper&lt;/strong&gt; on Apify. He had his first CSV ready before he finished his coffee.&lt;/p&gt;

&lt;h2&gt;
  
  
  📺 Watch the Tutorial in Action
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=rWtZVDMflbo" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=rWtZVDMflbo&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Speed Benchmark Nobody Talks About
&lt;/h2&gt;


&lt;p&gt;Most Vinted scrapers on GitHub advertise "fast" extraction. What that usually means is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;40–120 seconds to authenticate + parse a single page&lt;/li&gt;
&lt;li&gt;Another 30–60 seconds per subsequent page&lt;/li&gt;
&lt;li&gt;A 403 ban after 200–400 requests because you're hammering Vinted's CDN with a single residential IP&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;With &lt;strong&gt;Vinted Turbo Scraper&lt;/strong&gt;, I consistently hit &lt;strong&gt;~500 items per minute&lt;/strong&gt; on country-specific searches (confirmed across Vinted.fr, Vinted.de, Vinted.nl, and Vinted.pl). A filtered search URL returning ~1,000 listings completes in &lt;strong&gt;under 2 minutes&lt;/strong&gt; from URL paste to CSV download.&lt;/p&gt;

&lt;p&gt;Here's what a typical run looks like:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Search URL&lt;/th&gt;
&lt;th&gt;Filters Applied&lt;/th&gt;
&lt;th&gt;Listings Scraped&lt;/th&gt;
&lt;th&gt;Total Runtime&lt;/th&gt;
&lt;th&gt;Cost&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;vinted.fr/catalog?search_text=jordan&amp;amp;price_from=50&amp;amp;price_to=80&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Brand: Jordan, Price €50–€80&lt;/td&gt;
&lt;td&gt;1,024&lt;/td&gt;
&lt;td&gt;1 min 58 s&lt;/td&gt;
&lt;td&gt;$1.54&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;vinted.de/catalog?search_text=nike&amp;amp;size_from=43&amp;amp;size_to=44&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Brand: Nike, Size 43–44&lt;/td&gt;
&lt;td&gt;876&lt;/td&gt;
&lt;td&gt;1 min 42 s&lt;/td&gt;
&lt;td&gt;$1.31&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;vinted.nl/catalog?search_text=vintage&amp;amp;status_id=6&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;Status: Available, Keyword: vintage&lt;/td&gt;
&lt;td&gt;2,341&lt;/td&gt;
&lt;td&gt;4 min 12 s&lt;/td&gt;
&lt;td&gt;$3.51&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Key insight:&lt;/strong&gt; The architecture skips the heavy DOM-rendering overhead. Instead of crawling each listing page with a full browser, Turbo extracts structured data directly from Vinted's internal API endpoints — the same endpoints the mobile app hits. That's where the speed comes from.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Who This Is For (and Who It's Not For)
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Use Turbo if you:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Need filtered Vinted data &lt;strong&gt;now&lt;/strong&gt;, not next week&lt;/li&gt;
&lt;li&gt;Don't want to manage proxy rotation, sessions, or CAPTCHA solving&lt;/li&gt;
&lt;li&gt;Are a reseller, researcher, or analyst who needs structured exports (JSON / CSV / Excel / Google Sheets)&lt;/li&gt;
&lt;li&gt;Want to &lt;strong&gt;batch multiple search URLs&lt;/strong&gt; in a single run (I'll cover this in detail in my next article)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Use something else if you:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Need deep seller analytics, trending product tracking, or cross-country price comparison → that's &lt;a href="https://apify.com/kazkn/vinted-smart-scraper" rel="noopener noreferrer"&gt;Vinted Smart Scraper&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Want to build and maintain your own scraping infrastructure from scratch (you genuinely enjoy fighting anti-bot teams)&lt;/li&gt;
&lt;li&gt;Need to scrape private messages or PII (we don't do that, and you shouldn't either)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Actual Setup. No Code Required.
&lt;/h2&gt;

&lt;p&gt;This is the exact workflow I showed my friend. It takes under three minutes:&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: Build your search URL on Vinted
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Go to &lt;a href="https://www.vinted.com" rel="noopener noreferrer"&gt;vinted.com&lt;/a&gt; (or any country domain: .fr, .de, .nl, .pl, etc.)&lt;/li&gt;
&lt;li&gt;Apply any filters you want: brand, size, price range, item condition, category, color&lt;/li&gt;
&lt;li&gt;Copy the URL from your browser bar&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;That's it.&lt;/strong&gt; Every filter you applied is encoded directly in that URL. No need to rebuild anything in a form.&lt;/p&gt;
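&lt;p&gt;You can see this for yourself: every filter is just a query-string parameter, which standard URL tooling will happily decode:&lt;/p&gt;

```typescript
// Every Vinted filter lives in the query string -- inspect it with
// the standard URL / URLSearchParams APIs.
const searchUrl = new URL("https://www.vinted.fr/catalog?search_text=jordan&price_from=50&price_to=80");
const filters = Object.fromEntries(searchUrl.searchParams.entries());
// filters -> { search_text: "jordan", price_from: "50", price_to: "80" }
```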

&lt;h3&gt;
  
  
  Step 2: Paste into the Actor on Apify
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Open the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper Actor&lt;/a&gt; on the Apify Store&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Try for free&lt;/strong&gt; (you get $5 Platform Credits every month)&lt;/li&gt;
&lt;li&gt;Paste your Vinted search URL into the input field&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Start&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;


&lt;h3&gt;
  
  
  Step 3: Download your dataset
&lt;/h3&gt;

&lt;p&gt;Once the status shows &lt;strong&gt;Succeeded&lt;/strong&gt;, go to the &lt;strong&gt;Output&lt;/strong&gt; tab:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Click &lt;strong&gt;Export&lt;/strong&gt; → &lt;strong&gt;CSV&lt;/strong&gt; or &lt;strong&gt;JSON&lt;/strong&gt; for direct download&lt;/li&gt;
&lt;li&gt;Or click &lt;strong&gt;Google Sheets&lt;/strong&gt; to push to a live spreadsheet&lt;/li&gt;
&lt;li&gt;Or grab the &lt;strong&gt;API URL&lt;/strong&gt; to consume results programmatically in Python, Node.js, or any automation tool&lt;/li&gt;
&lt;/ul&gt;
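&lt;p&gt;For the API route, the dataset items endpoint follows Apify's standard shape (&lt;code&gt;GET /v2/datasets/{id}/items&lt;/code&gt;). A minimal sketch — the dataset ID is a placeholder, and you should confirm the endpoint against the Apify API reference:&lt;/p&gt;

```typescript
// Build the Apify dataset export URL for programmatic consumption.
// Endpoint shape follows the Apify REST API as I know it; verify
// against the current API reference. The dataset ID is a placeholder.
function datasetItemsUrl(datasetId: string, format: "json" | "csv"): string {
  return `https://api.apify.com/v2/datasets/${datasetId}/items?format=${format}`;
}

// Usage (Node 18+, native fetch):
// const res = await fetch(datasetItemsUrl("YOUR_DATASET_ID", "json"));
// const items = await res.json();
```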

&lt;p&gt;The output structure is flat and predictable — no nested JSON hell:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/items/123456789-jordan-1-mid-chicago"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Jordan 1 Mid 'Chicago'"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;75.00&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"currency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"EUR"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"brand"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Jordan"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"44"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"condition"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Very good"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Barely worn. No creases."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller_username"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sneakerhead_paris"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller_url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/member/987654321-sneakerhead_paris"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Paris, France"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"thumbnail"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images1.vinted.net/t/01_01234abc.jpg"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"images"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"https://images1.vinted.net/..."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images2.vinted.net/..."&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scraped_at"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-04-24T10:15:30.000Z"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Performance Under the Hood
&lt;/h2&gt;

&lt;p&gt;Here's why this outperforms home-rolled Python scripts:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Factor&lt;/th&gt;
&lt;th&gt;Typical Python Scraper&lt;/th&gt;
&lt;th&gt;Vinted Turbo Scraper&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Authentication&lt;/td&gt;
&lt;td&gt;Manual cookie/session mgmt&lt;/td&gt;
&lt;td&gt;Auto bootstrap via Playwright (one-time)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Proxy rotation&lt;/td&gt;
&lt;td&gt;Manual 3rd-party proxy pools&lt;/td&gt;
&lt;td&gt;Built-in Apify residential proxies&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rate-limit protection&lt;/td&gt;
&lt;td&gt;None / naive sleep()&lt;/td&gt;
&lt;td&gt;Adaptive backoff + request shaping&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Data extraction&lt;/td&gt;
&lt;td&gt;Regex / BeautifulSoup on HTML&lt;/td&gt;
&lt;td&gt;Direct API endpoint parsing (JSON native)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Export&lt;/td&gt;
&lt;td&gt;Custom script required&lt;/td&gt;
&lt;td&gt;CSV, JSON, Excel, Sheets, or API out of the box&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Maintenance&lt;/td&gt;
&lt;td&gt;You own every breakage&lt;/td&gt;
&lt;td&gt;Managed by the Actor, updated when Vinted changes&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  The Economics
&lt;/h2&gt;

&lt;p&gt;Pricing is straightforward: &lt;strong&gt;$0.0015 per result&lt;/strong&gt; (so $1.50 per 1,000 listings).&lt;/p&gt;

&lt;p&gt;With the free $5 monthly Apify credits, you can scrape roughly &lt;strong&gt;3,300 listings per month at zero cost&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;For a small reseller running 20 searches per week at ~500 results each, that's $15/week = &lt;strong&gt;$60/month&lt;/strong&gt; for data that would take 10+ hours to collect manually.&lt;/p&gt;

&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Q: Is this legal? Are you scraping private data?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Public data only. The Actor extracts only publicly visible listing data — the same information any visitor sees on a Vinted search page. No private messages, no login-required data, no PII.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Will I get IP-banned?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The Actor runs on Apify's infrastructure with residential proxy rotation. In 30 days of active usage, we maintained a &lt;strong&gt;90.8% success rate&lt;/strong&gt; across 121 runs. Individual bans are handled transparently — failed runs are retried automatically.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: Can I schedule this to run daily?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Yes. Apify's scheduler lets you set up recurring runs. I use it to monitor specific keywords ("vintage Patagonia", "Nike Dunk") and get fresh data pushed to a Google Sheet every morning.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Q: What countries are supported?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;All 26 Vinted country domains, including France, Germany, Netherlands, Poland, Spain, Italy, UK, Belgium, Czech Republic, Austria, Portugal, Lithuania, Luxembourg, Slovakia, Hungary, Romania, Bulgaria, Greece, and Croatia.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Try It
&lt;/h2&gt;

&lt;p&gt;If you have a filtered Vinted search URL open right now, copy it and test the Actor:&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Launch Vinted Turbo Scraper&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For technical documentation, API examples in Python/JS, and integration guides, check the &lt;a href="https://github.com/Boo-n/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Actor's full README&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Related Tools
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-smart-scraper" rel="noopener noreferrer"&gt;Vinted Smart Scraper&lt;/a&gt;&lt;/strong&gt; — Cross-country price comparison, seller analysis, trending products&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://apify.com/kazkn/vinted-mcp-server" rel="noopener noreferrer"&gt;Vinted MCP Server&lt;/a&gt;&lt;/strong&gt; — Query Vinted data with natural language via Claude/Cursor&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;Built by &lt;a href="https://apify.com/kazkn" rel="noopener noreferrer"&gt;KazKN&lt;/a&gt;. Questions? Open an &lt;a href="https://github.com/Boo-n/vinted-turbo-scraper/issues" rel="noopener noreferrer"&gt;issue&lt;/a&gt; or DM on &lt;a href="https://twitter.com/DataKazKN" rel="noopener noreferrer"&gt;X&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>apify</category>
      <category>vinted</category>
      <category>webscraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>The Hybrid Vinted Scraping Architecture That Outperforms Pure Browser Crawls</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Thu, 23 Apr 2026 11:30:16 +0000</pubDate>
      <link>https://forem.com/boo_n/the-hybrid-vinted-scraping-architecture-that-outperforms-pure-browser-crawls-593b</link>
      <guid>https://forem.com/boo_n/the-hybrid-vinted-scraping-architecture-that-outperforms-pure-browser-crawls-593b</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdq5vkqt7zlqp0ga64ctt.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdq5vkqt7zlqp0ga64ctt.jpg" alt="Cover" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  The Hybrid Vinted Scraping Architecture That Outperforms Pure Browser Crawls
&lt;/h1&gt;

&lt;p&gt;When you scrape Vinted at scale, you quickly hit a wall.&lt;/p&gt;

&lt;p&gt;Not a firewall metaphor. A literal one. Datadome. Cloudflare. Aggressive rate limits. Token rotation that invalidates your session mid-crawl. And if you are still running headless Chromium for every single request, you are burning proxy credits and clock cycles for no reason.&lt;/p&gt;

&lt;p&gt;After months of iteration — and enough failed runs to fill a datacenter — the architecture that actually works is &lt;strong&gt;hybrid&lt;/strong&gt;: use a real browser only where Vinted forces you to, then switch to lightweight HTTP for the actual data extraction.&lt;/p&gt;

&lt;p&gt;This is how Vinted Turbo Scraper implements that hybrid model, what makes it faster than pure-browser approaches, and why the architecture is the real product.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Pure Browser Crawling Is a Trap
&lt;/h2&gt;

&lt;p&gt;Most tutorials tell you to fire up Playwright or Puppeteer, navigate to a Vinted search page, scroll endlessly, and extract DOM nodes. This works for five items. It collapses at scale.&lt;/p&gt;

&lt;p&gt;Here is why:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Problem&lt;/th&gt;
&lt;th&gt;Browser-Only Impact&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Proxy cost&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Every image, font, and JS asset loads through your proxy. Bandwidth is not free.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Memory bloat&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Chromium instances chew 200–500 MB each. At concurrency 5, you are eating gigabytes.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Fingerprint fatigue&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Datadome profiles browser behavior. Repeating the same navigation pattern = flag.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Session decay&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Cookies and tokens expire. A pure browser crawl does not gracefully re-authenticate.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Speed ceiling&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Rendering a full React-powered catalog page takes 2–5 seconds. Per page.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;A pure browser crawl is not "robust." It is expensive, slow, and detectable.&lt;/p&gt;

&lt;p&gt;The insight is simple: &lt;strong&gt;Vinted serves catalog data via an internal JSON API.&lt;/strong&gt; Once you have a valid session cookie, you can query that API directly with HTTP requests. No rendering. No DOM traversal. No asset loading.&lt;/p&gt;
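&lt;p&gt;To make that concrete, here is a minimal sketch of what a session-backed API request looks like. The endpoint path (&lt;code&gt;/api/v2/catalog/items&lt;/code&gt;) and its parameters are assumptions based on observed Vinted traffic, not a documented public API, and can change without notice:&lt;/p&gt;

```typescript
// Phase-two sketch: reuse the browser-captured session for plain HTTP.
// The endpoint path and query params are ASSUMPTIONS from observed
// traffic, not a documented API -- they may change at any time.
function buildCatalogRequest(opts: {
  domain: string;       // e.g. "www.vinted.fr"
  cookieHeader: string; // captured during the Playwright phase
  userAgent: string;    // must match the fingerprinted browser
  page: number;
}) {
  const url = `https://${opts.domain}/api/v2/catalog/items?page=${opts.page}&per_page=96`;
  return {
    url,
    headers: {
      Cookie: opts.cookieHeader,
      "User-Agent": opts.userAgent,
      Accept: "application/json",
    },
  };
}

// Usage (after capturing cookies + UA in phase one):
// const req = buildCatalogRequest({ domain: "www.vinted.fr", cookieHeader: cookies, userAgent: ua, page: 1 });
// const data = await (await fetch(req.url, { headers: req.headers })).json();
```

The key design point: because the session was validated by a real browser, the HTTP phase inherits its legitimacy, but only as long as the User-Agent and cookies stay consistent with the fingerprint Datadome saw.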

&lt;p&gt;The challenge is getting that cookie in the first place.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Hybrid Model: Browser for Session, HTTP for Extraction
&lt;/h2&gt;

&lt;p&gt;Vinted Turbo Scraper uses a two-phase approach:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Phase One: Session initialization via Playwright&lt;/strong&gt; — Navigate to the target catalog page once, let Datadome validate the browser fingerprint, capture cookies, and grab the user agent string.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Phase Two: HTTP API extraction via got-scraping&lt;/strong&gt; — Use the captured session to fire lightweight JSON API requests, paginating through results at ~200 items per minute.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is not theoretical. Here is how the crawler initialization blocks media assets to keep proxy usage minimal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="nx"&gt;preNavigationHooks&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;route&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;**/*&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="kd"&gt;type&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;request&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;resourceType&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
            &lt;span class="c1"&gt;// Block images, media, fonts to save proxy bandwidth&lt;/span&gt;
            &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;image&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;media&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;font&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;includes&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;type&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;abort&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{});&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;route&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;continue&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="k"&gt;catch&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{});&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;By aborting image, media, and font requests before they hit the proxy, we cut bandwidth consumption by roughly 70%. On metered residential proxies, that translates directly to cost savings.&lt;/p&gt;




&lt;h2&gt;
  
  
  Translating Vinted Search URLs into API Calls
&lt;/h2&gt;

&lt;p&gt;Vinted search URLs encode filter parameters in query strings: &lt;code&gt;catalog[]&lt;/code&gt;, &lt;code&gt;brand_id[]&lt;/code&gt;, &lt;code&gt;size_id[]&lt;/code&gt;, &lt;code&gt;color_id[]&lt;/code&gt;, &lt;code&gt;status[]&lt;/code&gt;, and more.&lt;/p&gt;

&lt;p&gt;The internal API expects these same values but with slightly different parameter names and array bracket syntax. The Turbo Scraper extracts and rewrites these parameters automatically:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;translateToApiUrl&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;urlStr&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="o"&gt;|&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;u&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;URL&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;urlStr&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;params&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;URLSearchParams&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;u&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;searchParams&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;catalog[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;catalog_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;color_id[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;color_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;size_id[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;size_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;status[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;status_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;brand_id[]&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;brand_ids&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;STRIP&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Set&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;search_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;time&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;search_by_image_uuid&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;search_by_image_id&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;currency&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;page&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;per_page&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
    &lt;span class="p"&gt;]);&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;apiParams&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;URLSearchParams&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{};&lt;/span&gt;

    &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;params&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;STRIP&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;has&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="k"&gt;continue&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]])&lt;/span&gt; &lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
            &lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;arrayMaps&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;]].&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="nx"&gt;apiParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;k&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="c1"&gt;// Critical fix: append brackets for multi-value arrays&lt;/span&gt;
    &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;vals&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nb"&gt;Object&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;accumulated&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;vals&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;apiParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;[]`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;v&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="s2"&gt;`https://www.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/api/v2/catalog/items?&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;apiParams&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;toString&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This translator is the bridge between the URL your user copies from their browser and the internal API endpoint that returns raw JSON. Without it, you would need users to manually map catalog IDs — which defeats the purpose of a "zero-config" scraper.&lt;/p&gt;
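
&lt;p&gt;To make the round trip concrete, here is a condensed version of the translator with a sample URL. (The brand IDs are the ones from the mapping table later in this article; note that &lt;code&gt;URLSearchParams.toString()&lt;/code&gt; percent-encodes the brackets in the output.)&lt;/p&gt;

```typescript
// Condensed translator: bracketed web params become *_ids arrays,
// tracking params are stripped, everything else passes through as-is.
function translateToApiUrl(urlStr: string, domain: string): string {
    const arrayMaps: Record<string, string> = {
        'catalog[]': 'catalog_ids',
        'brand_id[]': 'brand_ids',
        'size_id[]': 'size_ids',
        'status[]': 'status_ids',
        'color_id[]': 'color_ids',
    };
    const STRIP = new Set(['search_id', 'time', 'currency', 'page', 'per_page']);
    const apiParams = new URLSearchParams();
    for (const [k, v] of new URL(urlStr).searchParams.entries()) {
        if (STRIP.has(k)) continue;
        apiParams.append(arrayMaps[k] ? `${arrayMaps[k]}[]` : k, v);
    }
    return `https://www.${domain}/api/v2/catalog/items?${apiParams.toString()}`;
}

// A URL copied straight from the browser...
const webUrl = 'https://www.vinted.fr/catalog?search_text=jacket&brand_id[]=53&brand_id[]=14&page=2';
console.log(translateToApiUrl(webUrl, 'vinted.fr'));
// ...becomes an API call (brackets percent-encoded by URLSearchParams):
// https://www.vinted.fr/api/v2/catalog/items?search_text=jacket&brand_ids%5B%5D=53&brand_ids%5B%5D=14
```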




&lt;h2&gt;
  
  
  The Human-Friendly Mapping Layer
&lt;/h2&gt;

&lt;p&gt;Vinted uses numeric IDs for filters. Users do not know that "Nike" maps to brand ID 53 or that "new with tags" maps to status ID 6.&lt;/p&gt;

&lt;p&gt;The actor maintains internal dictionaries that resolve plain text to these IDs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;BRAND_MAP&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;nike&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;53&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;zara&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;12&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;h&amp;amp;m&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;adidas&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;14&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;levis&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ralph lauren&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;88&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;calvin klein&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;33&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;guess&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;puma&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;15&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;vans&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;converse&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;17&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;tommy hilfiger&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;94&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;lacoste&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;93&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;the north face&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;114&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;asics&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;631&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new balance&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;267&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;carhartt&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;362&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;dickies&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1007&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;CONDITION_MAP&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;neuf avec étiquette&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new_with_tags&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;neuf sans étiquette&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;new_without_tags&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;très bon état&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;very_good&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;bon état&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;good&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;satisfaisant&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;satisfactory&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;SIZE_MAP&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Record&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;string&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;35&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;54&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;36&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;55&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;37&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;56&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;38&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;57&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;39&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;58&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;40&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;59&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;41&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;42&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;61&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;43&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;62&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;44&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;63&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;45&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;64&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;46&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;65&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;47&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;66&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xxs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;205&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;206&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;s&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;207&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;m&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;208&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;l&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;209&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xl&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;210&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;xxl&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;211&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This lets users pass intuitive inputs like &lt;code&gt;["Nike", "Adidas"]&lt;/code&gt; or &lt;code&gt;["new", "very_good"]&lt;/code&gt; instead of reverse-engineering Vinted's internal taxonomy. The actor falls back to raw numeric IDs for anything not in the map, so power users are not constrained either.&lt;/p&gt;
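
&lt;p&gt;The fallback logic is worth a sketch of its own. Something like the following — a hypothetical &lt;code&gt;resolveBrandId&lt;/code&gt; helper, not necessarily the actor's exact implementation — keeps both audiences happy:&lt;/p&gt;

```typescript
// Hypothetical resolver: friendly name -> numeric ID via the dictionary,
// while raw numeric strings pass through untouched for power users.
const BRAND_MAP: Record<string, number> = { nike: 53, zara: 12, adidas: 14 };

function resolveBrandId(input: string): number | null {
    const key = input.trim().toLowerCase();
    if (key in BRAND_MAP) return BRAND_MAP[key]; // known friendly name
    const raw = Number(key);
    // accept positive integers as raw Vinted brand IDs
    return Number.isInteger(raw) && raw > 0 ? raw : null;
}

console.log(resolveBrandId('Nike')); // 53
console.log(resolveBrandId('1007')); // 1007 (unknown name, treated as a raw ID)
console.log(resolveBrandId('??'));   // null (rejected)
```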




&lt;h2&gt;
  
  
  HTTP Extraction Loop: Where the Speed Lives
&lt;/h2&gt;

&lt;p&gt;Once the session cookie is captured, the actor switches to &lt;code&gt;got-scraping&lt;/code&gt; for the heavy lifting:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;gotScraping&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;apiReqUrl&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;responseType&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;json&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;proxyUrl&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;User-Agent&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;userAgent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Accept&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;application/json, text/plain, */*&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Accept-Language&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;fr-FR,fr;q=0.9,en-US;q=0.8,en;q=0.7&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Cookie&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;cookieStr&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Referer&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`https://www.&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;X-Money-Object-Enabled&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;true&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sec-Fetch-Dest&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;empty&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sec-Fetch-Mode&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;cors&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sec-Fetch-Site&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;same-origin&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;request&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;15000&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;Sec-Fetch-*&lt;/code&gt; headers are not decoration. They signal to Vinted's edge that this is a same-origin AJAX request, not an external scraper. Combined with a matching &lt;code&gt;Referer&lt;/code&gt; and the validated &lt;code&gt;Cookie&lt;/code&gt; string, the request sails through.&lt;/p&gt;

&lt;p&gt;Each page returns up to 96 items. The loop paginates until &lt;code&gt;data.pagination.current_page &amp;gt;= data.pagination.total_pages&lt;/code&gt; or the &lt;code&gt;maxItems&lt;/code&gt; limit is hit.&lt;/p&gt;
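&lt;p&gt;A minimal sketch of that loop, assuming a hypothetical &lt;code&gt;fetchPage&lt;/code&gt; helper that wraps the &lt;code&gt;got-scraping&lt;/code&gt; call shown earlier and returns the parsed JSON body with the pagination fields named above:&lt;/p&gt;

```javascript
// Sketch of the pagination loop: keep fetching pages until the API reports
// the last page or the maxItems cap is reached.
// fetchPage(page) is a hypothetical helper wrapping the HTTP call above.
async function collectItems(fetchPage, maxItems) {
  const items = [];
  let page = 1;
  while (true) {
    const data = await fetchPage(page);
    items.push(...data.items);
    const lastPage = data.pagination.current_page >= data.pagination.total_pages;
    if (lastPage || items.length >= maxItems) break;
    page += 1;
  }
  return items.slice(0, maxItems); // trim any overshoot from the final page
}
```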

&lt;p&gt;Result: &lt;strong&gt;~200 items per minute&lt;/strong&gt; sustained, with a memory footprint under 512MB per worker.&lt;/p&gt;




&lt;h2&gt;
  
  
  Input Schema Deep Dive
&lt;/h2&gt;

&lt;p&gt;The actor accepts minimal but precise JSON input. Here is the exact schema:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"maxItems"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"proxyConfiguration"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"useApifyProxy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"apifyProxyGroups"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"RESIDENTIAL"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"startUrls"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.co.uk/catalog?catalog[]=1844&amp;amp;brand_ids[]=53&amp;amp;size_ids[]=207&amp;amp;status_ids[]=6&amp;amp;price_from=20&amp;amp;price_to=50&amp;amp;currency=GBP&amp;amp;order=price_low_to_high"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Field&lt;/th&gt;
&lt;th&gt;Type&lt;/th&gt;
&lt;th&gt;Required&lt;/th&gt;
&lt;th&gt;Description&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;startUrls&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;string or array&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Yes&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;One or more Vinted search URLs. Supports batch processing.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;maxItems&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;number&lt;/td&gt;
&lt;td&gt;No (default: 100)&lt;/td&gt;
&lt;td&gt;Cap on results per run. Use for cost control.&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;code&gt;proxyConfiguration&lt;/code&gt;&lt;/td&gt;
&lt;td&gt;object&lt;/td&gt;
&lt;td&gt;No (recommended)&lt;/td&gt;
&lt;td&gt;Defaults to Apify residential proxies. Essential for Datadome evasion.&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;You can pass multiple URLs as a comma-separated string or an array of objects with &lt;code&gt;url&lt;/code&gt; keys. The actor processes them sequentially in a single run, combining outputs into one unified dataset.&lt;/p&gt;
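&lt;p&gt;Normalising those two input shapes is only a few lines; a sketch, assuming exactly the shapes described above (a comma-separated string, or an array of objects with &lt;code&gt;url&lt;/code&gt; keys):&lt;/p&gt;

```javascript
// Sketch: normalise startUrls into a flat array of URL strings.
// Accepts "https://a, https://b" or [{ url: "https://a" }, ...].
function normalizeStartUrls(startUrls) {
  if (typeof startUrls === 'string') {
    return startUrls.split(',').map(function (u) { return u.trim(); }).filter(Boolean);
  }
  return startUrls.map(function (entry) {
    return typeof entry === 'string' ? entry : entry.url;
  });
}
```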




&lt;h2&gt;
  
  
  Integration Patterns: From Scraper to Pipeline
&lt;/h2&gt;

&lt;p&gt;Raw data is worthless without a destination. The actor integrates with Apify's ecosystem for downstream automation:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Destination&lt;/th&gt;
&lt;th&gt;Trigger&lt;/th&gt;
&lt;th&gt;Use Case&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Google Sheets&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Apify integration&lt;/td&gt;
&lt;td&gt;Live inventory tracking&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Slack&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Webhook&lt;/td&gt;
&lt;td&gt;Alert team on new listings&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Airtable&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Zapier/Make bridge&lt;/td&gt;
&lt;td&gt;Visual database for resellers&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Custom API&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Dataset webhook&lt;/td&gt;
&lt;td&gt;Push to your own backend&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;CSV/Excel&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Manual download&lt;/td&gt;
&lt;td&gt;One-off market analysis&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;For recurring monitoring, pair the actor with Apify Scheduler. Set it to run every 15 minutes against a filtered search URL and pipe results to a Slack channel or Google Sheet. You catch new listings before anyone refreshing the page by hand does.&lt;/p&gt;
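&lt;p&gt;For a schedule like that, the one piece of glue worth sketching is the "alert only on new listings" step. Here &lt;code&gt;seenIds&lt;/code&gt; is assumed to be persisted between runs (for example in a key-value store); it is shown as an in-memory &lt;code&gt;Set&lt;/code&gt; for brevity:&lt;/p&gt;

```javascript
// Sketch: keep only listings we have not alerted on before, and record them.
// seenIds is assumed to be loaded from / saved to persistent storage between runs.
function filterNewListings(items, seenIds) {
  const fresh = items.filter(function (item) { return !seenIds.has(item.id); });
  fresh.forEach(function (item) { seenIds.add(item.id); });
  return fresh;
}
```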




&lt;h2&gt;
  
  
  Real-World Performance Benchmarks
&lt;/h2&gt;

&lt;p&gt;Here are observed numbers from production runs across different proxy tiers:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Proxy Type&lt;/th&gt;
&lt;th&gt;Speed&lt;/th&gt;
&lt;th&gt;Reliability&lt;/th&gt;
&lt;th&gt;Cost per 1k Items&lt;/th&gt;
&lt;th&gt;Best For&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Apify Proxy (Datacenter)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~300 items/min&lt;/td&gt;
&lt;td&gt;Low (blocks after ~500)&lt;/td&gt;
&lt;td&gt;~$0.30&lt;/td&gt;
&lt;td&gt;Quick tests&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Apify Proxy (Residential)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~200 items/min&lt;/td&gt;
&lt;td&gt;High (rarely blocked)&lt;/td&gt;
&lt;td&gt;~$1.50&lt;/td&gt;
&lt;td&gt;Production runs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Custom Proxy&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Variable&lt;/td&gt;
&lt;td&gt;Depends on quality&lt;/td&gt;
&lt;td&gt;Variable&lt;/td&gt;
&lt;td&gt;Power users&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The residential proxy is the sweet spot: fast enough for real-time workflows, reliable enough for continuous monitoring, and priced predictably per result.&lt;/p&gt;




&lt;h2&gt;
  
  
  Architecture Comparison: Browser vs Hybrid vs Pure HTTP
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Approach&lt;/th&gt;
&lt;th&gt;Speed&lt;/th&gt;
&lt;th&gt;Cost&lt;/th&gt;
&lt;th&gt;Reliability&lt;/th&gt;
&lt;th&gt;Complexity&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Pure Browser&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~20-40 items/min&lt;/td&gt;
&lt;td&gt;High (full asset load)&lt;/td&gt;
&lt;td&gt;Medium (detectable patterns)&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Pure HTTP&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~300+ items/min&lt;/td&gt;
&lt;td&gt;Minimal&lt;/td&gt;
&lt;td&gt;Low (session requires bootstrapping)&lt;/td&gt;
&lt;td&gt;High&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Hybrid (Turbo)&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;~200 items/min&lt;/td&gt;
&lt;td&gt;Low (blocked assets)&lt;/td&gt;
&lt;td&gt;High (session + retry logic)&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Pure HTTP is fastest on paper, but without a valid session cookie, every request returns a 403. The hybrid approach trades absolute speed for &lt;strong&gt;operational reliability&lt;/strong&gt; — the metric that actually matters when you are running automated workflows.&lt;/p&gt;




&lt;h2&gt;
  
  
  When to Use Turbo vs Smart Scraper
&lt;/h2&gt;

&lt;p&gt;Vinted Turbo Scraper is part of a two-tool ecosystem. Choose based on your use case:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Feature&lt;/th&gt;
&lt;th&gt;Turbo Scraper&lt;/th&gt;
&lt;th&gt;Smart Scraper&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;URL-based input&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No (form-based)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Batch URL processing&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cross-country comparison&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Seller analysis&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Sold items tracking&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Trending discovery&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Price monitoring&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes (cross-border)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Speed&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Faster&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Slower (richer data)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Cost&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Lower&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Higher&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Use Turbo&lt;/strong&gt; when you have a Vinted search URL ready and need structured data fast. &lt;strong&gt;Use Smart&lt;/strong&gt; when you are doing deep market intelligence, seller profiling, or cross-country arbitrage.&lt;/p&gt;




&lt;h2&gt;
  
  
  Anti-Ban Mechanisms Beyond Proxies
&lt;/h2&gt;

&lt;p&gt;Proxy rotation is table stakes. The actor adds three additional layers:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Request fingerprint rotation via Crawlee&lt;/strong&gt; — Built-in proxy configuration rotates IPs per session.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Aggressive retry with exponential backoff&lt;/strong&gt; — &lt;code&gt;maxRequestRetries: 5&lt;/code&gt; with a 30-second handler timeout.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Graceful session recycling&lt;/strong&gt; — If an HTTP request fails with a 403, the Playwright session is refreshed before retry.&lt;/li&gt;
&lt;/ol&gt;
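&lt;p&gt;The third layer can be sketched as a small wrapper; &lt;code&gt;doRequest&lt;/code&gt; and &lt;code&gt;refreshSession&lt;/code&gt; are hypothetical stand-ins for the &lt;code&gt;got-scraping&lt;/code&gt; call and the Playwright session bootstrap:&lt;/p&gt;

```javascript
// Sketch of graceful session recycling: on a 403, re-bootstrap the session
// before retrying; any other error (or retry exhaustion) propagates.
async function requestWithRecycling(doRequest, refreshSession, maxRetries) {
  let attempt = 0;
  while (true) {
    try {
      return await doRequest();
    } catch (err) {
      attempt += 1;
      if (err.statusCode !== 403 || attempt > maxRetries) throw err;
      // A 403 means the session cookie went stale: refresh it, then retry.
      await refreshSession();
    }
  }
}
```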

&lt;p&gt;The output is a clean JSON schema with optional lightweight mode:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;8464268321&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Levi black skinny jeans 33&lt;/span&gt;&lt;span class="se"&gt;\"&lt;/span&gt;&lt;span class="s2"&gt; waist"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.co.uk/items/8464268321-levi-black-skinny-jeans-33-waist"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"currency"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"GBP"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"brand"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Levi's"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"M / UK 12-14"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"condition"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"New with tags"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"photos"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"..."&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"favouriteCount"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"seller"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;73959532&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"username"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"maxi83199"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"profileUrl"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.co.uk/member/73959532-maxi83199"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"scrapedAt"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-03-24T10:25:41.604Z"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Structured. Timestamped. Ready for pipelines.&lt;/p&gt;




&lt;h2&gt;
  
  
  FAQ: Technical Details
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Q: Does this use headless browsers for every request?&lt;/strong&gt;&lt;br&gt;
A: No. Only for initial session bootstrap. Data extraction uses lightweight HTTP requests via &lt;code&gt;got-scraping&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: How many items can I extract per run?&lt;/strong&gt;&lt;br&gt;
A: The &lt;code&gt;maxItems&lt;/code&gt; parameter lets you cap runs. We have tested up to 10,000 items in a single run without memory issues.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: Is there a Vinted API this connects to?&lt;/strong&gt;&lt;br&gt;
A: Vinted does not offer a public API for catalog data. This actor serves as a practical alternative by reverse-engineering the internal endpoints.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: Will my IP get banned?&lt;/strong&gt;&lt;br&gt;
A: With residential proxies and the hybrid architecture, blocks are rare. The actor implements retry logic and session refresh for edge cases.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: Can I run this on a schedule?&lt;/strong&gt;&lt;br&gt;
A: Yes, via Apify Scheduler or cron triggers. Ideal for monitoring new listings.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Q: What output formats are available?&lt;/strong&gt;&lt;br&gt;
A: JSON (structured), CSV, Excel, or direct API export to integrations.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Honest Bottom Line
&lt;/h2&gt;

&lt;p&gt;No scraper is "unbanable." Platforms evolve. What the hybrid architecture buys you is &lt;strong&gt;time&lt;/strong&gt; — time between Vinted deploying a new detection mechanism and you pushing an update.&lt;/p&gt;

&lt;p&gt;Because this is packaged as an Apify Actor, that update propagates to every user instantly. No pip upgrade. No breaking dependency chains. No "works on my machine."&lt;/p&gt;

&lt;p&gt;If you are still maintaining a custom Python Selenium script that breaks every two weeks, you are not scraping Vinted. You are debugging Vinted.&lt;/p&gt;

&lt;p&gt;Switch to infrastructure that was built to survive the platform, not chase it.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Ready to extract Vinted data at scale?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Actor: &lt;a href="https://apify.com/actors/IV3WPdQlMFG1cwXuK" rel="noopener noreferrer"&gt;Vinted Turbo Scraper on Apify&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Pricing: $1.50 per 1,000 results. No subscription. Free plan covers thousands of items.&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;em&gt;Questions about the architecture or want to integrate this into a pipeline? Drop a comment below.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>apify</category>
      <category>vinted</category>
      <category>scraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>Never Miss a Good Vinted Deal Again: The Automated System in 2026</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sat, 11 Apr 2026 20:42:15 +0000</pubDate>
      <link>https://forem.com/boo_n/ne-jamais-ratar-une-bonne-affaire-sur-vinted-le-systeme-automatise-en-2026-4bil</link>
      <guid>https://forem.com/boo_n/ne-jamais-ratar-une-bonne-affaire-sur-vinted-le-systeme-automatise-en-2026-4bil</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk8aw6cg793r2wnpmcnad.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk8aw6cg793r2wnpmcnad.jpeg" alt="Cover" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  🚫 Never Miss a Good Vinted Deal Again: The Automated System in 2026
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;TL;DR:&lt;/strong&gt; If you spend 30 minutes a day refreshing Vinted to find that Carhartt jacket in size M for under €30, your time is worth more than that. Here is the no-code system that notifies you in real time, with no ban risk, no infrastructure, and nothing to pay beyond your Apify plan.&lt;/p&gt;




&lt;h3&gt;
  
  
  The problem is simple, the solution is nowhere to be found
&lt;/h3&gt;

&lt;p&gt;Vinted is a flea market of 85 million users across Europe. The problem? The good deals vanish in under 15 minutes. By the time you message the seller to ask about sizing, they have already closed the sale with someone else.&lt;/p&gt;

&lt;p&gt;You have two options:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Spend 3 hours a day scrolling&lt;/strong&gt; — what 95% of users do. Inefficient, time-consuming, frustrating.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Automate&lt;/strong&gt; — but building your own scraper in 2026 means facing Vinted's Datadome protection, Cloudflare blocks, IP bans, and headers that change every week.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I tried option #2 for 3 months. My Python script lasted 11 days before the first 403. The patched script lasted 6 days. Every Vinted update sent me back to square one.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The solution: stop building the infrastructure when someone has already built it.&lt;/strong&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  The no-code architecture that works in 2026
&lt;/h3&gt;

&lt;p&gt;The system rests on two building blocks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Apify Vinted Turbo Scraper&lt;/strong&gt; → reliable listing extraction with the right headers and built-in Datadome bypass&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Telegram Bot API&lt;/strong&gt; → real-time push notifications on your phone&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The flow:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[Marque + Prix Max + Taille] → [Apify Actor] → [JSON propre] → [Telegram Bot] → [Notification 🔔]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;No server. No custom cron job. No maintenance.&lt;/p&gt;




&lt;h3&gt;
  
  
  Step 1: Configure the Turbo Scraper on Apify
&lt;/h3&gt;

&lt;p&gt;The Turbo Scraper lets you filter by:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"searchTerms"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"Carhartt"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Arc'teryx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Nike ACG"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"priceMin"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"priceMax"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;35&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"sizeIds"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"m"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"38"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"40"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"country"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"fr"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"sortBy"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"created_desc"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The actor returns structured JSON:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"title"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Veste Carhartt WIP Detroit"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"price"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;29&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"M"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"url"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://www.vinted.fr/items/12345678"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"seller"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"thriftking_92"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"created"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2026-04-11T14:32:00Z"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"photo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"https://images.vinted.com/..."&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You retrieve this JSON via the Actor's webhook or its API endpoint.&lt;/p&gt;
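&lt;p&gt;As an illustration of the API route, the dataset-items URL follows Apify's public API shape; &lt;code&gt;datasetId&lt;/code&gt; comes from the run object and &lt;code&gt;token&lt;/code&gt; is your API token (both placeholders here):&lt;/p&gt;

```javascript
// Sketch: build the URL for fetching a run's dataset items from Apify's API.
// datasetId and token are placeholders supplied by the caller.
function datasetItemsUrl(datasetId, token) {
  return 'https://api.apify.com/v2/datasets/' + datasetId + '/items?token=' + token;
}
```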




&lt;h3&gt;
  
  
  Step 2: Send the results to Telegram
&lt;/h3&gt;

&lt;p&gt;Two options, depending on your skill level:&lt;/p&gt;

&lt;h4&gt;
  
  
  Option A: Zapier / Make (zero code)
&lt;/h4&gt;

&lt;p&gt;Connect the Apify webhook → Zapier → Telegram Bot. Ten minutes flat, and it covers 95% of cases.&lt;/p&gt;

&lt;h4&gt;
  
  
  Option B: ~20 lines of Node.js (full control)
&lt;/h4&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;TelegramBot&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;node-telegram-bot-api&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;axios&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;bot&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;TelegramBot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;TELEGRAM_TOKEN&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;polling&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;post&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/webhook&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;for &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt; &lt;span class="k"&gt;of&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;msg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`
🔔 *New Vinted find!*

*${item.title}*
💰 ${item.price}€ — Size ${item.size}
👤 ${item.seller}
🔗 ${item.url}
    `&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;bot&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sendMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;CHAT_ID&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;parse_mode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Markdown&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;disable_web_page_preview&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;json&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;status&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ok&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;items_processed&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Deploy it on Railway, Render, or your own VPS. Cost: at most ~€5/month.&lt;/p&gt;




&lt;h3&gt;
  
  
  Step 3: Automate the scheduling
&lt;/h3&gt;

&lt;p&gt;You don't need your script running 24/7. Configure a scheduled Apify run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Schedule : toutes les 30 minutes
Coût : ~0.02€ par run (actor compute)
Résultat : notifications en temps réel sans server costs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
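&lt;p&gt;As a rough illustration (field names are indicative, not an exact Apify payload), the schedule itself boils down to one cron expression:&lt;/p&gt;

```json
{
  "name": "vinted-deal-watch",
  "cronExpression": "*/30 * * * *",
  "isEnabled": true
}
```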



&lt;p&gt;&lt;strong&gt;The actual monthly cost:&lt;/strong&gt; €1-3 for 1,440 runs/month. Your coffee costs more.&lt;/p&gt;




&lt;h3&gt;
  
  
  What I learned after 6 months of use
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Les meilleures deals (marques streetwear, vêtements de sport) postés le matin sont souvent pris avant 10h. &lt;strong&gt;Schedule à 6h30, 7h30 et 8h30.&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Filtrer par &lt;code&gt;created_desc&lt;/code&gt; uniquement te donne les listings des 30 dernières minutes. Plus large = plus de bruit.&lt;/li&gt;
&lt;li&gt;Le paramètre &lt;code&gt;sizeIds&lt;/code&gt; est clé : Vinted ne filtre pas toujours correctement côté client. Ton actor doit le faire en post-processing.&lt;/li&gt;
&lt;/ul&gt;
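&lt;p&gt;The last two points can be combined into a single post-processing pass. A minimal sketch, assuming each listing exposes a &lt;code&gt;size_id&lt;/code&gt; and a unix &lt;code&gt;created_at_ts&lt;/code&gt; timestamp (adapt the field names to the actor's real output schema):&lt;/p&gt;

```python
from datetime import datetime, timedelta, timezone

def filter_items(items, wanted_size_ids, max_age_minutes=30, now=None):
    """Keep only listings in the wanted sizes posted in the last N minutes.

    Field names (`size_id`, `created_at_ts` as a unix timestamp) are
    assumptions about the actor's output -- adapt them to the real schema.
    """
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(minutes=max_age_minutes)
    fresh = []
    for item in items:
        posted = datetime.fromtimestamp(item["created_at_ts"], tz=timezone.utc)
        if item["size_id"] in wanted_size_ids and posted >= cutoff:
            fresh.append(item)
    return fresh
```

&lt;p&gt;Run this on every scheduled poll before sending notifications, so a mis-filtered size or a stale listing never reaches Telegram.&lt;/p&gt;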




&lt;h3&gt;
  
  
  The trap to avoid in 2026
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Don't build your own HTTP parser.&lt;/strong&gt; In 2025-2026 Vinted rolled out a fourth-generation Datadome layer that detects Selenium headers, automated navigation patterns, and data-center IPs in under 3 requests.&lt;/p&gt;

&lt;p&gt;The Vinted Turbo Scraper on Apify uses rotating residential IPs and real browser fingerprints. That's the difference between 1 hour of dev plus 2 days of maintenance versus 10 minutes of config and zero maintenance.&lt;/p&gt;




&lt;h3&gt;
  
  
  Want to test it in 2 minutes?
&lt;/h3&gt;

&lt;p&gt;Here is the direct link to the Apify actor:&lt;/p&gt;

&lt;p&gt;👉 &lt;strong&gt;&lt;a href="https://apify.com/boo_n/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper — Apify Store&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first €3 of compute is free for new accounts. That's enough to test the complete system.&lt;/p&gt;

&lt;p&gt;If you want a turnkey Telegram setup with automatic scheduling, reach out in the comments and I'll share the GitHub repo with the full stack (Node.js + Railway + Telegram).&lt;/p&gt;

&lt;p&gt;Deals don't wait. Automate, or watch them slip away.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;[This article is for informational purposes. Check Vinted's Terms of Service and your local legislation before automating data collection.]&lt;/em&gt;&lt;/p&gt;

</description>
      <category>vinted</category>
      <category>automation</category>
      <category>apify</category>
      <category>telegram</category>
    </item>
    <item>
      <title>Why building a Vinted scraper from scratch is a trap in 2026</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Wed, 08 Apr 2026 09:23:29 +0000</pubDate>
      <link>https://forem.com/boo_n/why-building-a-vinted-scraper-from-scratch-is-a-trap-in-2026-4om7</link>
      <guid>https://forem.com/boo_n/why-building-a-vinted-scraper-from-scratch-is-a-trap-in-2026-4om7</guid>
      <description>&lt;p&gt;If you're a data extraction developer or just someone trying to build a Vinted new listings alert system, you've probably noticed something over the past few months: Vinted's anti-bot protection has become completely paranoid.&lt;/p&gt;

&lt;p&gt;I tried building my own Vinted scraper in Python last week to monitor some specific vintage deals. Total disaster.&lt;/p&gt;

&lt;p&gt;Here is what happens if you try the DIY route right now:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pure HTTP requests (Requests, HTTPX)&lt;/strong&gt;: Instant 403 Forbidden. Their Cloudflare/Datadome setup immediately flags the TLS fingerprint of standard libraries.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Headless Browsers (Playwright/Puppeteer)&lt;/strong&gt;: It works briefly, but it's incredibly slow and consumes massive amounts of RAM. Plus, Vinted will eventually flag your residential proxy IP if you don't rotate perfectly.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After burning through two different proxy providers and getting blocked anyway, I gave up on maintaining my own infrastructure for this. &lt;/p&gt;

&lt;p&gt;While looking for an alternative, I stumbled upon an Apify Vinted actor called &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;It essentially acts as a hybrid engine—it bypasses the Datadome checks natively and just returns clean JSON. &lt;/p&gt;

&lt;p&gt;Instead of fighting with headers and proxies, this is literally all the code I run now:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;apify_client&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ApifyClient&lt;/span&gt;

&lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ApifyClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YOUR_API_TOKEN&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;run&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;actor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;IV3WPdQlMFG1cwXuK&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;call&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run_input&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;searchUrl&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://www.vinted.com/catalog?search_text=carhartt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;maxItems&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;

&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;run&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;defaultDatasetId&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;iterate_items&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;title&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;price&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
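&lt;p&gt;Once the items are in, a typical next step before alerting is a price cut-off. A minimal sketch, assuming &lt;code&gt;price&lt;/code&gt; is a number or numeric string as in the print loop above (adjust if the actor nests it differently):&lt;/p&gt;

```python
def cheap_items(items, max_price):
    """Return listings at or under max_price, sorted cheapest first.

    Assumes each item carries a numeric (or numeric-string) `price` field --
    an assumption about the actor's output, not a guaranteed schema.
    """
    priced = [(float(item["price"]), item) for item in items]
    return [item for p, item in sorted(priced, key=lambda t: t[0]) if p <= max_price]
```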



&lt;p&gt;It’s significantly cheaper than running my own headless cluster and I don't have to deal with WAF bypasses anymore. If you need to scrape Vinted listings efficiently, don't reinvent the wheel.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>How to monitor Vinted automatically for new listings (Without getting IP banned)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Tue, 07 Apr 2026 22:09:34 +0000</pubDate>
      <link>https://forem.com/boo_n/how-to-monitor-vinted-automatically-for-new-listings-without-getting-ip-banned-44j</link>
      <guid>https://forem.com/boo_n/how-to-monitor-vinted-automatically-for-new-listings-without-getting-ip-banned-44j</guid>
      <description>&lt;p&gt;If you're into flipping clothes or just trying to snag the best vintage deals, you already know that speed is everything. A good deal on Vinted is gone in literally seconds.&lt;/p&gt;

&lt;p&gt;A few weeks ago, I decided to build a simple Python script to send a Discord notification whenever a specific brand was uploaded in my size. I thought it would take an hour. I was wrong.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Datadome Nightmare
&lt;/h2&gt;

&lt;p&gt;If you've ever tried to build a &lt;strong&gt;vinted scraper&lt;/strong&gt;, you've probably hit a wall of 403 Forbidden errors. Vinted uses heavy anti-bot protections to block automated traffic.&lt;/p&gt;

&lt;p&gt;I tried everything:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Rotating free proxies (instantly blocked)&lt;/li&gt;
&lt;li&gt;Premium residential proxies (worked for a bit, then got flagged)&lt;/li&gt;
&lt;li&gt;Playwright/Puppeteer (way too slow and resource-heavy to run 24/7 on my small VPS)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To truly &lt;strong&gt;monitor vinted automatically&lt;/strong&gt;, you need something that handles TLS fingerprinting natively. I was about to give up on my &lt;strong&gt;vinted new listings alert&lt;/strong&gt; project when I stumbled across a pre-built solution.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Discovery
&lt;/h2&gt;

&lt;p&gt;Instead of reinventing the wheel and fighting anti-bot systems daily, I found an &lt;strong&gt;apify vinted actor&lt;/strong&gt; that handles the heavy lifting. It's called Vinted Turbo Scraper.&lt;/p&gt;

&lt;p&gt;Here is the tool I use now: &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It uses a hybrid approach: it uses a real browser context to bypass the WAF, grabs the session tokens, and then uses fast HTTP requests to extract the data at scale.&lt;/p&gt;

&lt;h2&gt;
  
  
  How I set up my Discord Alert
&lt;/h2&gt;

&lt;p&gt;Using this actor, my code went from a 300-line messy Puppeteer script to a simple API call.&lt;/p&gt;

&lt;p&gt;I just set up a webhook in Discord, and use a simple cron job that hits the Apify API every 5 minutes. The actor returns clean JSON with all the new items matching my search URL.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Simple example of how clean the data is&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;item&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;price&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;price&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;brand&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;brand&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;
&lt;span class="p"&gt;}));&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
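&lt;p&gt;One detail worth handling in the cron job: only ping Discord for listings you haven't already alerted on. A minimal sketch, assuming each listing has a stable &lt;code&gt;id&lt;/code&gt; field (the state-file name is arbitrary):&lt;/p&gt;

```python
import json
from pathlib import Path

def new_items_only(items, seen_file=Path("seen_ids.json")):
    """Return only listings not seen in a previous poll, then persist the ids.

    Assumes each listing carries a stable `id`; the JSON file is just a tiny
    piece of state shared between cron runs.
    """
    seen = set(json.loads(seen_file.read_text())) if seen_file.exists() else set()
    fresh = [item for item in items if item["id"] not in seen]
    seen.update(item["id"] for item in items)
    seen_file.write_text(json.dumps(sorted(seen)))
    return fresh
```

&lt;p&gt;Without this, a 5-minute cron loop re-sends every item still matching the search URL, and the channel turns into noise.&lt;/p&gt;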



&lt;p&gt;If you're a developer trying to build a &lt;strong&gt;vinted vintage deals automation&lt;/strong&gt; pipeline, save yourself the headache. Stop fighting WAFs and use the right tools for the job.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>python</category>
      <category>scraping</category>
      <category>automation</category>
    </item>
    <item>
      <title>Scraping Vinted in 2026: Why your Python script keeps getting 403 Errors</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Mon, 06 Apr 2026 22:19:08 +0000</pubDate>
      <link>https://forem.com/boo_n/scraping-vinted-in-2026-why-your-python-script-keeps-getting-403-errors-30mi</link>
      <guid>https://forem.com/boo_n/scraping-vinted-in-2026-why-your-python-script-keeps-getting-403-errors-30mi</guid>
      <description>&lt;p&gt;If you've tried to build a &lt;strong&gt;vinted scraper&lt;/strong&gt; recently using &lt;code&gt;requests&lt;/code&gt; or &lt;code&gt;BeautifulSoup&lt;/code&gt; in Python, you probably hit a brick wall. Specifically, a &lt;code&gt;403 Forbidden&lt;/code&gt; wall.&lt;/p&gt;

&lt;p&gt;I spent the weekend trying to &lt;strong&gt;scrape vinted&lt;/strong&gt; to get notifications for some vintage jackets I was hunting. My IP got banned within 10 requests. Vinted uses Datadome and Cloudflare to aggressively block basic scraping attempts.&lt;/p&gt;

&lt;h3&gt;
  
  
  The problem with DIY Vinted Automation
&lt;/h3&gt;

&lt;p&gt;When you try to monitor new listings automatically, Vinted's WAF checks your TLS fingerprint. Standard HTTP libraries (like Python's &lt;code&gt;requests&lt;/code&gt; or Node's &lt;code&gt;axios&lt;/code&gt;) leak signatures that scream "I am a bot".&lt;/p&gt;

&lt;p&gt;You can try rotating proxies or using Playwright/Puppeteer, but Playwright is too heavy to run a fast loop (you want alerts in seconds, not minutes). I was basically out of memory running 5 browser tabs just for one search query.&lt;/p&gt;

&lt;h3&gt;
  
  
  The bypass I found
&lt;/h3&gt;

&lt;p&gt;After getting tired of dealing with TLS fingerprinting and headless browser crashes, I looked for managed solutions. I stumbled upon this &lt;strong&gt;vinted turbo scraper&lt;/strong&gt; actor on Apify:&lt;br&gt;
&lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper on Apify&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It handles the Datadome bypass natively. It uses a hybrid approach:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;It uses a real browser to fetch the initial tokens and solve challenges.&lt;/li&gt;
&lt;li&gt;It switches to lightweight HTTP requests for the actual data extraction.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This means you get the bypass rate of a real browser but the speed and low cost of an API. It literally returns JSON data from any Vinted search URL instantly.&lt;/p&gt;

&lt;p&gt;If you are a &lt;strong&gt;vinted data extraction developer&lt;/strong&gt; or just trying to set up a &lt;strong&gt;vinted new listings alert&lt;/strong&gt;, stop wasting your time fighting Datadome. Just use the Apify actor and plug the JSON into your Discord webhook or database.&lt;/p&gt;

&lt;p&gt;It completely saved my weekend project. If you've found other ways to bypass the 403s without spending hundreds on residential proxies, let me know below!&lt;/p&gt;

</description>
      <category>python</category>
      <category>webscraping</category>
      <category>automation</category>
      <category>api</category>
    </item>
    <item>
      <title>How to Track Vinted Price Drops for Reselling (Automation)</title>
      <dc:creator>Boon</dc:creator>
      <pubDate>Sun, 05 Apr 2026 22:26:45 +0000</pubDate>
      <link>https://forem.com/boo_n/how-to-track-vinted-price-drops-for-reselling-automation-1clc</link>
      <guid>https://forem.com/boo_n/how-to-track-vinted-price-drops-for-reselling-automation-1clc</guid>
      <description>&lt;p&gt;When reselling on Vinted, finding underpriced items is only half the battle. The other half is tracking when a seller drops their price so you can instantly send an offer.&lt;/p&gt;

&lt;p&gt;If you try to monitor multiple closets or search queries manually, you'll lose out to faster buyers. &lt;/p&gt;

&lt;p&gt;I used to run a custom Python script to track my favorite items, but Vinted's Cloudflare protection (and Datadome) made it a nightmare to maintain. You get a &lt;code&gt;403 Forbidden&lt;/code&gt; error unless your proxy and TLS fingerprint are perfect.&lt;/p&gt;

&lt;p&gt;Instead of maintaining my own scraper and rotating proxies, I found a tool that handles the bypassing for me: the &lt;a href="https://apify.com/kazkn/vinted-turbo-scraper" rel="noopener noreferrer"&gt;Vinted Turbo Scraper&lt;/a&gt; on Apify.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why it works
&lt;/h3&gt;

&lt;p&gt;It runs a hybrid architecture. It grabs valid CSRF tokens with a real browser session in the background, then uses raw HTTP requests to fetch the data at crazy speeds. You never have to worry about getting blocked.&lt;/p&gt;

&lt;h3&gt;
  
  
  How I track price drops:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;I pass my target Vinted search URLs into the scraper.&lt;/li&gt;
&lt;li&gt;I set the Apify actor to run on a schedule (e.g., every 10 minutes).&lt;/li&gt;
&lt;li&gt;I push the clean JSON output to a Google Sheet using Make.com (formerly Integromat).&lt;/li&gt;
&lt;li&gt;A simple formula compares the new &lt;code&gt;price&lt;/code&gt; field with the previous data. If it drops below my target threshold, it sends me a Discord ping.&lt;/li&gt;
&lt;/ol&gt;
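&lt;p&gt;The comparison in step 4 can be sketched as a plain function over two snapshots. The dict-of-prices shape here is an assumption for illustration; adapt it to however the Google Sheet rows are actually stored:&lt;/p&gt;

```python
def detect_drops(previous, current, target=None):
    """Flag price drops between two polling snapshots.

    `previous` and `current` map listing id -> price (an assumed shape).
    If `target` is set, only drops landing at or below it are returned,
    as (item_id, old_price, new_price) tuples.
    """
    drops = []
    for item_id, price in current.items():
        old = previous.get(item_id)
        if old is not None and price < old and (target is None or price <= target):
            drops.append((item_id, old, price))
    return drops
```

&lt;p&gt;Each returned tuple is one Discord ping; listings that are new in the current snapshot are ignored, since there is no prior price to compare against.&lt;/p&gt;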

&lt;p&gt;If you're building any kind of vinted automation, price monitor, or alert system, don't waste time fighting WAFs. Just use a maintained data extraction tool.&lt;/p&gt;

</description>
      <category>webscraping</category>
      <category>python</category>
      <category>automation</category>
      <category>ecommerce</category>
    </item>
  </channel>
</rss>
