<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Dusty Kaisler</title>
    <description>The latest articles on Forem by Dusty Kaisler (@dustykaisler).</description>
    <link>https://forem.com/dustykaisler</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1123741%2Fa252f4ef-ee7d-4dad-b9de-38b56009ceb2.png</url>
      <title>Forem: Dusty Kaisler</title>
      <link>https://forem.com/dustykaisler</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/dustykaisler"/>
    <language>en</language>
    <item>
      <title>How to Test the Magento API in Postman?</title>
      <dc:creator>Dusty Kaisler</dc:creator>
      <pubDate>Mon, 21 Aug 2023 06:26:22 +0000</pubDate>
      <link>https://forem.com/dustykaisler/how-to-test-the-magento-api-in-postman-2fdl</link>
      <guid>https://forem.com/dustykaisler/how-to-test-the-magento-api-in-postman-2fdl</guid>
      <description>&lt;p&gt;Magento API is an application programming interface that allows developers to integrate third-party applications or services with the Magento e-commerce platform. It facilitates communication between different software systems and enables the exchange of data in a standardized format.&lt;/p&gt;

&lt;p&gt;Postman, on the other hand, is a popular collaboration platform for API development. It provides a user-friendly interface for testing, documenting, and managing APIs. With Postman, developers can easily make API requests, organize them into collections, and share them with their team members or clients.&lt;/p&gt;

&lt;p&gt;By using Postman, developers can test Magento API endpoints by creating requests with different HTTP methods such as GET, POST, PUT, DELETE, etc. They can then send these requests to the Magento server and receive responses containing the desired data or perform specific actions.&lt;/p&gt;

&lt;p&gt;Postman also allows developers to set headers, add authentication parameters, and customize requests according to their specific needs. This makes it easier to troubleshoot and debug API calls, as developers can view the full request and response details, including headers, body, status codes, etc.&lt;/p&gt;

&lt;p&gt;Overall, &lt;a href="https://devhubby.com/thread/how-to-call-an-external-api-in-magento-2"&gt;Magento API&lt;/a&gt; and Postman provide developers with powerful tools to interact with the &lt;a href="https://business.adobe.com/products/magento/magento-commerce.html"&gt;Magento e-commerce platform&lt;/a&gt; and build integrations or extensions that enhance its functionalities.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;To test the Magento API in Postman, follow these steps:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Launch Postman:&lt;/strong&gt; Open the Postman app and make sure you are signed in.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Create a new Request:&lt;/strong&gt; Click on the "&lt;strong&gt;+ New&lt;/strong&gt;" button on the left-hand side of the Postman interface to create a new request.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Set endpoint URL:&lt;/strong&gt; Enter the API endpoint URL in the request URL field. For example, if you want to test the "&lt;strong&gt;GET products&lt;/strong&gt;" API, the URL might be something like: &lt;a href="https://your-magento-domain.com/rest/V1/products"&gt;https://your-magento-domain.com/rest/V1/products&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Choose request type:&lt;/strong&gt; Select the appropriate HTTP request type (GET, POST, PUT, DELETE) based on the API endpoint you are testing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Add headers:&lt;/strong&gt; If the API requires any headers, navigate to the "Headers" tab below the URL field and add the necessary headers. For Magento API, you will typically need to include headers like "Authorization" and "Content-Type".&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Specify authentication and authorization:&lt;/strong&gt; If the Magento API requires authentication and authorization, you may need to configure the relevant settings in the "&lt;strong&gt;Authorization&lt;/strong&gt;" tab in Postman. Select the appropriate method (e.g., Basic Auth, Bearer Token), provide the necessary credentials, and set any other required parameters. If you do not remember your credentials, token, or &lt;a href="https://topminisite.com/blog/how-to-change-the-magento-admin-password-from"&gt;admin password&lt;/a&gt;, you can restore them.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Add request body (if applicable):&lt;/strong&gt; If your API request requires a body (for POST or PUT requests), navigate to the "&lt;strong&gt;Body&lt;/strong&gt;" tab, choose the desired format (raw, form-data, x-www-form-urlencoded), and enter the required data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Send the request:&lt;/strong&gt; Click on the "&lt;strong&gt;Send&lt;/strong&gt;" button to send the API request to the specified endpoint.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Inspect the response:&lt;/strong&gt; Postman will display the API response in the "&lt;strong&gt;Response&lt;/strong&gt;" section. Check if the response code, data, and headers are as expected. You can review the response in various formats like JSON, HTML, or XML by using the format options.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Repeat for other API endpoints:&lt;/strong&gt; If you need to test multiple API endpoints, you can create separate requests for each and follow the same process.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By following these steps, you can effectively test the Magento API using Postman and ensure that it is functioning correctly as per your requirements.&lt;/p&gt;
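&lt;p&gt;The same request that these Postman steps build up can also be sketched in code. Below is a minimal example using Python's requests library, assuming a hypothetical store domain and a placeholder admin token (the &lt;strong&gt;/rest/V1/products&lt;/strong&gt; endpoint and Bearer authentication are standard in Magento 2):&lt;/p&gt;

```python
import requests

# Hypothetical store domain; replace with your own.
BASE = "https://your-magento-domain.com"

# Build the same GET /rest/V1/products request described in the steps above:
# endpoint URL, HTTP method, headers, and authorization.
req = requests.Request(
    "GET",
    BASE + "/rest/V1/products",
    params={"searchCriteria[pageSize]": 5},
    headers={
        "Authorization": "Bearer YOUR_ADMIN_TOKEN",  # placeholder token
        "Content-Type": "application/json",
    },
)
prepared = req.prepare()
print(prepared.method, prepared.url)

# To actually send it, use a session:
# response = requests.Session().send(prepared)
# print(response.status_code, response.json())
```

&lt;p&gt;Preparing the request without sending it mirrors what Postman shows in its request pane, which makes it easy to compare the two before hitting a real server.&lt;/p&gt;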

</description>
      <category>postman</category>
      <category>api</category>
      <category>magento</category>
      <category>php</category>
    </item>
    <item>
      <title>How to Scrape Data From Zillow?</title>
      <dc:creator>Dusty Kaisler</dc:creator>
      <pubDate>Fri, 21 Jul 2023 18:09:12 +0000</pubDate>
      <link>https://forem.com/dustykaisler/how-to-scrape-data-from-zillow-5cgo</link>
      <guid>https://forem.com/dustykaisler/how-to-scrape-data-from-zillow-5cgo</guid>
      <description>&lt;p&gt;&lt;a href="https://www.zillow.com/"&gt;Zillow.com&lt;/a&gt; is a popular online real estate marketplace that provides information about real estate properties in the United States. &lt;a href="https://forum.dollaroverflow.com/thread/when-was-zillow-founded"&gt;Zillow was founded in 2006&lt;/a&gt; and it is one of the leading real estate websites, offering a wide range of services related to buying, selling, renting, and financing properties.&lt;/p&gt;

&lt;p&gt;Zillow allows users to search for homes and apartments available for sale or rent across various locations. The platform provides &lt;a href="https://forum.dollaroverflow.com/thread/where-does-zillow-get-its-listings"&gt;detailed property listings&lt;/a&gt; with information such as property photos, descriptions, pricing, and amenities. Users can also find data on historical property values, local market trends, and neighborhood information.&lt;/p&gt;

&lt;p&gt;One of the notable features of Zillow is its &lt;a href="https://www.zillow.com/z/zestimate/"&gt;Zestimate&lt;/a&gt;, an automated valuation model that provides an estimated property value for millions of homes based on various factors such as recent sales data, location, and property characteristics. However, it's essential to note that Zestimates are estimates and may not always reflect the true market value accurately.&lt;/p&gt;

&lt;p&gt;In addition to residential properties, Zillow also includes listings for commercial properties, land, and vacation rentals.&lt;/p&gt;

&lt;h2&gt;What is web scraping?&lt;/h2&gt;

&lt;p&gt;Web scraping is the process of extracting data from websites automatically. It involves using software or scripts to access web pages, download the content, and extract specific information from the HTML code of the web pages. Web scraping allows you to collect large amounts of data from websites efficiently and can be used for various purposes, such as data analysis, research, or populating a database.&lt;/p&gt;

&lt;h2&gt;How to scrape Zillow using Python?&lt;/h2&gt;

&lt;p&gt;You can use Python with libraries like &lt;a href="https://pypi.org/project/requests/"&gt;requests&lt;/a&gt; for making HTTP requests and &lt;a href="https://pypi.org/project/beautifulsoup4/"&gt;BeautifulSoup&lt;/a&gt; or &lt;a href="https://pypi.org/project/Scrapy3/"&gt;Scrapy&lt;/a&gt; for parsing and extracting the relevant information from the web pages.&lt;/p&gt;

&lt;p&gt;Here's an example of how to use Python with &lt;strong&gt;requests&lt;/strong&gt; and &lt;strong&gt;BeautifulSoup&lt;/strong&gt; to scrape data from a webpage:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;requests&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;bs4&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;BeautifulSoup&lt;/span&gt;

&lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"https://www.zillow.com/new-york-city-ny/"&lt;/span&gt;

&lt;span class="n"&gt;headers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="s"&gt;"User-Agent"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"Mozilla/5.0 (X11; Linux x86_64) "&lt;/span&gt;
                  &lt;span class="s"&gt;"AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.103 "&lt;/span&gt;
                  &lt;span class="s"&gt;"Safari/537.36"&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;status_code&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;soup&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;BeautifulSoup&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"html.parser"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;addresses&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;soup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;find_all&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"address"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="s"&gt;"data-test"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"property-card-addr"&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;address&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;addresses&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;address&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;getText&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
        &lt;span class="c1"&gt;# Output: list of addresses
&lt;/span&gt;&lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Failed to retrieve data. Status code:"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;status_code&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Keep in mind that you will probably need to implement rate limiting between requests, or use a proxy and rotate the User-Agent header on each request.&lt;/p&gt;
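&lt;p&gt;A minimal sketch of what that could look like, assuming a hypothetical pool of User-Agent strings (the &lt;strong&gt;polite_get&lt;/strong&gt; helper is illustrative, not a library function):&lt;/p&gt;

```python
import random
import time

import requests

# Sample User-Agent strings to rotate between requests (assumed values).
USER_AGENTS = [
    "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/120.0 Safari/537.36",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/119.0 Safari/537.36",
]


def pick_headers():
    # Choose a fresh User-Agent for each request.
    return {"User-Agent": random.choice(USER_AGENTS)}


def polite_get(url, min_delay=2.0, max_delay=5.0):
    # Sleep a random interval before each request to rate-limit the scraper.
    time.sleep(random.uniform(min_delay, max_delay))
    # Optionally route through a proxy, e.g.
    # proxies = {"https": "http://user:pass@proxyhost:8080"}
    return requests.get(url, headers=pick_headers(), timeout=30)
```

&lt;p&gt;Calling a helper like this for each listing page spreads the requests out and varies the client fingerprint slightly; for heavier scraping you would rotate proxies as well.&lt;/p&gt;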

&lt;p&gt;If you see the error &lt;strong&gt;"ImportError: No module named requests"&lt;/strong&gt; while running this script, you probably need to install the requests or beautifulsoup4 package; &lt;a href="https://devhubby.com/thread/python-error-import-error-no-module-named"&gt;here&lt;/a&gt; you can find information on how to fix this error.&lt;/p&gt;

&lt;h2&gt;How to scrape Zillow using Python Selenium?&lt;/h2&gt;

&lt;p&gt;Here's a simple example of how to use Python Selenium for web scraping:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;selenium&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;webdriver&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;selenium.webdriver.common.by&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;By&lt;/span&gt;

&lt;span class="c1"&gt;# Replace 'path_to_webdriver' with the path to your web driver executable.
&lt;/span&gt;&lt;span class="n"&gt;driver&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;webdriver&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Chrome&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;executable_path&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;'path_to_webdriver'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'https://www.zillow.com/new-york-city-ny/'&lt;/span&gt;
&lt;span class="n"&gt;driver&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Extract data using Selenium methods.
&lt;/span&gt;&lt;span class="n"&gt;items&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;driver&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;find_elements&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;By&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CSS_SELECTOR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;'div#grid-search-results ul li'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;items&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;address&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;item&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;find_element&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;By&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;CSS_SELECTOR&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"address"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;address&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Close the web driver after scraping.
&lt;/span&gt;&lt;span class="n"&gt;driver&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;quit&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can find more examples of how to scrape Zillow using Python in &lt;a href="https://devhubby.com/thread/how-to-scrape-zillow-with-a-python"&gt;this thread&lt;/a&gt; as well.&lt;/p&gt;

</description>
      <category>python</category>
      <category>programming</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
