<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Shivam Chaurasia</title>
    <description>The latest articles on Forem by Shivam Chaurasia (@tracebackerror).</description>
    <link>https://forem.com/tracebackerror</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F978322%2F3feec8d1-4562-4292-828d-6a2c8fdf6232.jpeg</url>
      <title>Forem: Shivam Chaurasia</title>
      <link>https://forem.com/tracebackerror</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/tracebackerror"/>
    <language>en</language>
    <item>
      <title>Harnessing the Power of Django StreamingHttpResponse for Efficient Web Streaming</title>
      <dc:creator>Shivam Chaurasia</dc:creator>
      <pubDate>Mon, 19 Jun 2023 08:41:00 +0000</pubDate>
      <link>https://forem.com/epam_india_python/harnessing-the-power-of-django-streaminghttpresponse-for-efficient-web-streaming-56jh</link>
      <guid>https://forem.com/epam_india_python/harnessing-the-power-of-django-streaminghttpresponse-for-efficient-web-streaming-56jh</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction:&lt;/strong&gt;&lt;br&gt;
In the world of web development, streaming data has become increasingly popular. Whether it's live video feeds, real-time analytics, or large file downloads, streaming allows for efficient and seamless transmission of data over the web. Django, a high-level Python web framework, provides a powerful tool called StreamingHttpResponse that enables developers to implement streaming functionalities with ease. In this blog post, we will explore the capabilities of Django StreamingHttpResponse and discuss how it can enhance your web applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What is Django StreamingHttpResponse?&lt;/strong&gt;&lt;br&gt;
Django StreamingHttpResponse is a class-based response that streams content to the client in chunks rather than waiting for the entire response to be generated. It allows you to iterate over a generator function or any other iterable, sending chunks of data to the client as they become available. This is particularly useful when dealing with large datasets or long-running processes, as it avoids buffering the entire response in memory.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implementing StreamingHttpResponse:&lt;/strong&gt;&lt;br&gt;
To use StreamingHttpResponse, you need to create a generator function or an iterable that generates the content you want to stream. Let's take a look at a simple example to illustrate the concept:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;django.http&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;StreamingHttpResponse&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;stream_data&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="c1"&gt;# Generate data in chunks
&lt;/span&gt;    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nb"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"Data chunk &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;
        &lt;span class="c1"&gt;# Simulate delay between chunks
&lt;/span&gt;        &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;sleep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;streaming_view&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;StreamingHttpResponse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;stream_data&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'Content-Type'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'text/plain'&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the code above, we define a generator function called &lt;code&gt;stream_data()&lt;/code&gt; that yields data chunks. We then create a StreamingHttpResponse object by passing the generator function as the content argument. Finally, we set the appropriate &lt;code&gt;Content-Type&lt;/code&gt; header and return the response.&lt;/p&gt;
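
&lt;p&gt;The memory benefit comes from lazy evaluation: no chunk exists until the WSGI server iterates the response. The same idea can be sketched without Django at all (the &lt;code&gt;time.sleep()&lt;/code&gt; delay is dropped here so the sketch runs instantly):&lt;/p&gt;

```python
def stream_data():
    # Each chunk is produced only when the consumer asks for it,
    # so at most one chunk needs to live in memory at a time.
    for i in range(10):
        yield f"Data chunk {i}\n"

gen = stream_data()
first = next(gen)      # produces exactly one chunk
remaining = list(gen)  # drains the other nine
```

&lt;p&gt;StreamingHttpResponse simply performs this iteration on your behalf, writing each yielded chunk to the client socket as it is produced.&lt;/p&gt;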

&lt;p&gt;&lt;strong&gt;&lt;em&gt;StreamingHttpResponse for streaming real-time stock market data&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
Let's dive into a more sophisticated example of using Django's StreamingHttpResponse for streaming real-time stock market data to clients.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;django.http&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;StreamingHttpResponse&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;channels.layers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;get_channel_layer&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;asgiref.sync&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;async_to_sync&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;json&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="nn"&gt;time&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;generate_stock_data&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="c1"&gt;# Connect to a real-time stock data source (e.g., WebSocket, API)
&lt;/span&gt;    &lt;span class="c1"&gt;# Iterate and yield stock data in chunks
&lt;/span&gt;    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;stock_data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;get_real_time_stock_data&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;dumps&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;stock_data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;# Simulate delay between data updates
&lt;/span&gt;        &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;sleep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;stream_stock_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;channel_layer&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;get_channel_layer&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;stream_name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"stock_data_stream"&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;send_stock_data&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;stock_data&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;generate_stock_data&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="n"&gt;async_to_sync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel_layer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;group_send&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;stream_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="s"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"stock_data_update"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="s"&gt;"data"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;stock_data&lt;/span&gt;
            &lt;span class="p"&gt;})&lt;/span&gt;
            &lt;span class="c1"&gt;# Yield the data for streaming response as well
&lt;/span&gt;            &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;stock_data&lt;/span&gt;

    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;StreamingHttpResponse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;send_stock_data&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;content_type&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"application/json"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Open a channel to stream data to clients via WebSocket or other streaming protocols
&lt;/span&gt;    &lt;span class="n"&gt;async_to_sync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel_layer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;group_add&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;stream_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;channel_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;handle_disconnect&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="c1"&gt;# Cleanup and close the channel when the client disconnects
&lt;/span&gt;        &lt;span class="n"&gt;async_to_sync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;channel_layer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;group_discard&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;stream_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;channel_name&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;streaming_content&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;disconnect&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;handle_disconnect&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this example, we assume that you have set up Django Channels to handle WebSocket communication. Treat the snippet as a sketch rather than production code: a plain HTTP request has no &lt;code&gt;channel_name&lt;/code&gt;, so in a real project the group bookkeeping would live inside a Channels consumer. Here's how the example works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;We define a &lt;code&gt;generate_stock_data()&lt;/code&gt; function that connects to a real-time stock data source (e.g., WebSocket or API) and yields JSON-encoded stock data in chunks.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;In the &lt;code&gt;stream_stock_data()&lt;/code&gt; view function, we obtain the &lt;code&gt;channel_layer&lt;/code&gt; to handle WebSocket communication.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Inside the &lt;code&gt;send_stock_data()&lt;/code&gt; generator function, we iterate over the generated stock data and send it to the client via WebSocket using the &lt;code&gt;group_send()&lt;/code&gt; method. We also yield the stock data for the streaming response.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We create a StreamingHttpResponse object with &lt;code&gt;send_stock_data()&lt;/code&gt; as the content. We set the content type to "application/json" to indicate that the response contains JSON-encoded data.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We add the client's WebSocket channel to the &lt;code&gt;stock_data_stream&lt;/code&gt; group using &lt;code&gt;group_add()&lt;/code&gt;. This allows us to send stock data updates to all clients subscribed to the stream.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We define a &lt;code&gt;handle_disconnect()&lt;/code&gt; function that gets called when the client disconnects. Inside this function, we remove the client's WebSocket channel from the &lt;code&gt;stock_data_stream&lt;/code&gt; group using &lt;code&gt;group_discard()&lt;/code&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We assign the &lt;code&gt;handle_disconnect()&lt;/code&gt; function to &lt;code&gt;response.streaming_content.disconnect&lt;/code&gt; as a cleanup hook. Note that Django never calls this attribute itself; in a real application, cleanup on disconnect belongs in the Channels consumer's &lt;code&gt;disconnect()&lt;/code&gt; method.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
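
&lt;p&gt;The group bookkeeping above (&lt;code&gt;group_add()&lt;/code&gt;, &lt;code&gt;group_send()&lt;/code&gt;, &lt;code&gt;group_discard()&lt;/code&gt;) can be pictured with a small in-memory stand-in. This is not the Channels API, only a sketch of the fan-out logic it performs:&lt;/p&gt;

```python
class ToyChannelLayer:
    """Minimal in-memory stand-in for a channel layer's group semantics."""

    def __init__(self):
        self.groups = {}   # maps a group name to a set of channel names
        self.inboxes = {}  # maps a channel name to its received messages

    def group_add(self, group, channel):
        self.groups.setdefault(group, set()).add(channel)
        self.inboxes.setdefault(channel, [])

    def group_discard(self, group, channel):
        self.groups.get(group, set()).discard(channel)

    def group_send(self, group, message):
        # Fan the message out to every channel currently in the group.
        for channel in self.groups.get(group, ()):
            self.inboxes[channel].append(message)

layer = ToyChannelLayer()
layer.group_add("stock_data_stream", "client-1")
layer.group_add("stock_data_stream", "client-2")
layer.group_send("stock_data_stream", {"type": "stock_data_update", "data": "{}"})
layer.group_discard("stock_data_stream", "client-2")
layer.group_send("stock_data_stream", {"type": "stock_data_update", "data": "{}"})
```

&lt;p&gt;After the second send, only client-1 receives the update, exactly as &lt;code&gt;group_discard()&lt;/code&gt; intends for a disconnected subscriber.&lt;/p&gt;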

&lt;p&gt;By implementing this example, you can provide real-time stock data updates to clients, allowing them to receive and process the data as it becomes available. The use of Django's StreamingHttpResponse and Django Channels facilitates the seamless streaming of data and enhances the overall user experience in real-time stock market monitoring applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;EventStream in StreamingHttpResponse&lt;/strong&gt;&lt;br&gt;
Let's combine the concepts of Event Stream and StreamingHttpResponse to create a real-time event stream using Django.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="nn"&gt;django.http&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;StreamingHttpResponse&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;generate_events&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="c1"&gt;# Connect to an event source or database
&lt;/span&gt;    &lt;span class="c1"&gt;# Retrieve and yield events in chunks
&lt;/span&gt;    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;events&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;get_real_time_events&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;events&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="s"&gt;"data: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;event&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="se"&gt;\n\n&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;
        &lt;span class="c1"&gt;# Simulate delay between event updates
&lt;/span&gt;        &lt;span class="n"&gt;time&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;sleep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;event_stream&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;request&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;stream_events&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;generate_events&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
            &lt;span class="k"&gt;yield&lt;/span&gt; &lt;span class="n"&gt;event&lt;/span&gt;

    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;StreamingHttpResponse&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;stream_events&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;content_type&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s"&gt;'text/event-stream'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'Cache-Control'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'no-cache'&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;'Transfer-Encoding'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'chunked'&lt;/span&gt;

    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this example, we assume you have a source (e.g., event source, database) from which you can retrieve real-time events. Here's how the example works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;We define a &lt;code&gt;generate_events()&lt;/code&gt; generator function that connects to the event source or database and yields events in chunks. Each event is formatted as an Event Stream line with a "data" field.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Inside the &lt;code&gt;event_stream()&lt;/code&gt; view function, we define a &lt;code&gt;stream_events()&lt;/code&gt; generator function that iterates over the generated events and yields them.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We create a StreamingHttpResponse object with &lt;code&gt;stream_events()&lt;/code&gt; as the content and set the content type to "text/event-stream" to indicate that we are streaming an Event Stream.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;We set the appropriate response headers: &lt;code&gt;Cache-Control&lt;/code&gt; is set to "no-cache" so the response is not cached, and &lt;code&gt;Transfer-Encoding&lt;/code&gt; is set to "chunked". (Strictly speaking, the WSGI server decides the transfer encoding itself, so most deployments can omit that header.)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Finally, we return the StreamingHttpResponse object as the response.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
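
&lt;p&gt;The wire format matters here: each Server-Sent Events message is one or more &lt;code&gt;data:&lt;/code&gt; lines terminated by a blank line. A small helper (a sketch, not part of Django) makes the framing explicit:&lt;/p&gt;

```python
def format_sse(data, event=None, event_id=None):
    """Frame a message for a text/event-stream response."""
    lines = []
    if event is not None:
        lines.append(f"event: {event}")
    if event_id is not None:
        lines.append(f"id: {event_id}")
    # Multi-line payloads become one "data:" line per line of text.
    for part in str(data).splitlines() or [""]:
        lines.append(f"data: {part}")
    # A blank line terminates the message.
    return "\n".join(lines) + "\n\n"
```

&lt;p&gt;The generator in the view would then yield &lt;code&gt;format_sse(event)&lt;/code&gt; for each event, and the browser's &lt;code&gt;EventSource&lt;/code&gt; API would parse the frames on the other end.&lt;/p&gt;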

&lt;p&gt;By implementing this example, you can create a real-time event stream where clients can connect and receive event updates as they occur. This approach is particularly useful for applications such as real-time chat systems, live feeds, or activity streams where users need to be continuously informed about the latest events.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Advantages of StreamingHttpResponse:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Memory Efficiency: StreamingHttpResponse streams data in chunks, which reduces the memory footprint compared to buffering the entire response. This makes it ideal for scenarios involving large datasets or files.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Improved User Experience: Streaming responses provide a better user experience for long-running processes or when dealing with large files. Instead of waiting for the entire content to load, users can start consuming data as it becomes available.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Real-Time Data Streaming: Django's StreamingHttpResponse enables real-time data streaming, making it suitable for applications like live video feeds, chat applications, or real-time analytics. Clients can receive data updates as soon as they are available.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scalability: By avoiding the need to buffer the entire response, StreamingHttpResponse allows your application to handle multiple concurrent streaming requests more efficiently. This scalability is crucial for applications that serve a high volume of users or deal with heavy data processing.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Considerations and Best Practices:&lt;/strong&gt;&lt;br&gt;
While Django's StreamingHttpResponse provides powerful streaming capabilities, there are some considerations and best practices to keep in mind:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Chunk Size: The size of each data chunk should be optimized based on your use case and the client's network conditions. Very small chunks may lead to inefficient data transmission, while excessively large chunks may result in delays or connection timeouts.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Compression: If streaming large amounts of text-based data, enabling compression can significantly reduce the bandwidth requirements and improve overall performance. Django supports various compression options that can be applied to the response.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Web Server Compatibility: StreamingHttpResponse works best with web servers that support streaming, such as Gunicorn or uWSGI. Ensure that your chosen server is compatible with streaming responses to leverage the full benefits.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
Django StreamingHttpResponse is a powerful tool that empowers developers to efficiently implement streaming functionality in their web applications. Whether you are dealing with large datasets, real-time data updates, or long-running processes, StreamingHttpResponse offers an elegant solution that optimizes memory usage and enhances user experience. By utilizing this feature, you can unlock the potential for scalable, real-time streaming in your Django projects, opening up a wide range of possibilities for innovative and interactive web applications.&lt;/p&gt;

&lt;p&gt;Disclaimer&lt;br&gt;
This is a personal blog post. The views and opinions expressed here are only those of the author and do not represent those of any organization or any individual with whom the author may be associated, professionally or personally.&lt;/p&gt;

</description>
      <category>python</category>
      <category>streaminghttpresponse</category>
      <category>webdev</category>
      <category>eventstreams</category>
    </item>
    <item>
      <title>SQLAlchemy ORM Advance Usage</title>
      <dc:creator>Shivam Chaurasia</dc:creator>
      <pubDate>Thu, 01 Dec 2022 09:38:25 +0000</pubDate>
      <link>https://forem.com/epam_india_python/sqlalchemy-orm-advance-usage-304d</link>
      <guid>https://forem.com/epam_india_python/sqlalchemy-orm-advance-usage-304d</guid>
      <description>&lt;p&gt;SQLAlchemy is the most widely used ORM in Python based application; its backend neutral and offers complete flexibility to build any complex SQL expression. &lt;/p&gt;

&lt;p&gt;Today, we will showcase a few unexplored or advanced uses of the ORM that can be of great advantage in different scenarios.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scenario 1:&lt;/strong&gt;&lt;br&gt;
Let’s say you have a table called projects and want to filter rows on the composite key project_id, task_id, and component_id taken together, where the key values live in a dataframe called project_to_retrieve.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ocn6hfcft7bgq6qctsj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ocn6hfcft7bgq6qctsj.png" alt="Image description" width="800" height="588"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Hint: produce a composite IN construct &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Connect To An SqLite Database
from sqlalchemy import create_engine

engine = create_engine('sqlite:///demo.db')


# Create a Project Table
from sqlalchemy import Table, Column, Integer, String, MetaData

meta = MetaData()

projects_table = Table(
   'projects', meta,
   Column('id', Integer, primary_key = True, autoincrement=True),
   Column('project_id', String),
   Column('task_id', String),
   Column('component_id', Integer),
   Column('cost', Integer),
)

meta.create_all(engine)

# Add Project Details
conn = engine.connect()
conn.execute(projects_table.insert(), [
   {'project_id':'HBB-XC', 'task_id' : 'U', 'component_id' : 1, 'cost' : 50},
   {'project_id':'HBB-DC', 'task_id' : 'I', 'component_id' : 2, 'cost' : 40},
   {'project_id':'HBB-NC', 'task_id' : 'U', 'component_id' : 3, 'cost' : 100},
])

view_projects_data = projects_table.select()
result = conn.execute(view_projects_data)
print(f"Projects Tables Data: ")
for row in result:
   print ("{} \t {}\t {} \t {} \t {}".format(row[0],row[1], row[2], row[3], row[4]))



# Subset of data to retrieve
import pandas as pd

data = [['HBB-XC', 'U', 1], ['HBB-NC', 'U', 3], ['HBB-NC', 'I', 3] ]

project_to_retrieve = pd.DataFrame(data,
                                   columns=['project_id', 'task_id', 'component_id']
)
print(f"Projects To Retrieve: ")
print(project_to_retrieve.head(5))

# Solution Query

from sqlalchemy import tuple_
from sqlalchemy.sql import select


active_projects = select(
    projects_table
).where(
    tuple_(
        projects_table.c.project_id,
        projects_table.c.task_id,
        projects_table.c.component_id
    ).in_(
        project_to_retrieve[
            ["project_id", "task_id", "component_id"]
        ]
        .to_records(index=False)
        .tolist()
    )
)

result = conn.execute(active_projects)
print(f"Active Projects Output: ")
for row in result:
   print ("{}\t {} \t {} \t {}".format(row[0],row[1], row[2], row[3]))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Explanation:&lt;/strong&gt;&lt;br&gt;
If we split the code, it divides into two parts: a data-preparation stage and the query that retrieves the data.&lt;/p&gt;

&lt;p&gt;In the first part, we create an engine connected to a SQLite database named demo, then insert a few records using SQLAlchemy Core insert statements. Finally, we build a pandas dataframe, project_to_retrieve, holding the composite keys of the rows we want to fetch from the projects table.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6nlgmgii6mzb86zjy5go.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6nlgmgii6mzb86zjy5go.png" alt="Image description" width="800" height="242"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, in the second part, we use tuple_ to retrieve the subset from the table. tuple_ generates the composite IN construct and filters on all three conditions specified in the query.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg0usj16mbu1j1h71lv57.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg0usj16mbu1j1h71lv57.png" alt="Image description" width="800" height="77"&gt;&lt;/a&gt;&lt;/p&gt;
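
&lt;p&gt;Conceptually, the composite IN is just tuple membership. The same filter, written in plain Python over the example's rows, shows what the database is being asked to do:&lt;/p&gt;

```python
# (project_id, task_id, component_id, cost) rows from the example table.
rows = [
    ("HBB-XC", "U", 1, 50),
    ("HBB-DC", "I", 2, 40),
    ("HBB-NC", "U", 3, 100),
]

# The composite keys held in project_to_retrieve.
wanted = {("HBB-XC", "U", 1), ("HBB-NC", "U", 3), ("HBB-NC", "I", 3)}

# Mirrors tuple_(...).in_(...): keep rows whose first three columns
# form a tuple present in the wanted set.
active = [row for row in rows if row[:3] in wanted]
```

&lt;p&gt;The SQL version pushes exactly this membership test down to the database instead of pulling every row into Python.&lt;/p&gt;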

&lt;p&gt;&lt;strong&gt;Scenario 2:&lt;/strong&gt;&lt;br&gt;
You are working on a SQLAlchemy-based Flask project, TASK_ESTIMATION, with a TASK table managed directly by your migrations. You soon realize you also need the STORY_ESTIMATION table, which resides in a different schema, STORIES, and is not managed by your Alembic migrations or models.&lt;/p&gt;

&lt;p&gt;How can you bring this table into your current scope?&lt;/p&gt;

&lt;p&gt;Hint: Automap&lt;br&gt;
&lt;strong&gt;Solution:&lt;/strong&gt;&lt;br&gt;
SQLAlchemy has a concept called database reflection, which lets you work with existing tables without recreating or maintaining the models yourself.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxug2lzgweqecvcjrybs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxug2lzgweqecvcjrybs.png" alt="Image description" width="800" height="530"&gt;&lt;/a&gt;&lt;/p&gt;
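
&lt;p&gt;A runnable sketch of the same reflection flow, using an in-memory SQLite database and illustrative table names (modern SQLAlchemy, 1.4 or later, is assumed):&lt;/p&gt;

```python
from sqlalchemy import create_engine, text
from sqlalchemy.ext.automap import automap_base

engine = create_engine("sqlite://")
with engine.begin() as conn:
    conn.execute(text(
        "CREATE TABLE story_estimation (id INTEGER PRIMARY KEY, points INTEGER)"
    ))
    # A table without a primary key, to illustrate Scenario 3 below.
    conn.execute(text("CREATE TABLE audit_log (message TEXT)"))

# Reflect the existing tables instead of declaring models by hand.
Base = automap_base()
Base.prepare(autoload_with=engine)

reflected = set(Base.classes.keys())
# story_estimation is mapped; audit_log is skipped because Automap
# cannot map a table that has no primary key.
```

&lt;p&gt;The reflected class is then available as &lt;code&gt;Base.classes.story_estimation&lt;/code&gt; and can be queried like any hand-written model.&lt;/p&gt;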

&lt;p&gt;&lt;strong&gt;Scenario 3:&lt;/strong&gt;&lt;br&gt;
Consider the previous case. You are trying to use Automap on the tables provided below, but you suddenly realize that the tables are not getting reflected from the stories schema. What is wrong with the table below?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27i35osxk4fo609wlqvn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27i35osxk4fo609wlqvn.png" alt="Image description" width="800" height="735"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution:&lt;/strong&gt;&lt;br&gt;
For database reflection to work, the reflected tables must have a primary key.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk51o6zyg9uavmpsyv7c1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk51o6zyg9uavmpsyv7c1.png" alt="Image description" width="800" height="33"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this case the issue is at line 7, where story_estimation has no primary key, which results in an error when the model is accessed.&lt;/p&gt;

&lt;p&gt;The fix is to add a primary key constraint:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ec21dbdxtwa344it1te.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ec21dbdxtwa344it1te.png" alt="Image description" width="800" height="33"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scenario 4:&lt;/strong&gt;&lt;br&gt;
Let’s say you have a list of dicts containing project_id and cost values. Using this list, you want to update the cost of each matching row in the Project table, keyed by project_id.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Solution:&lt;/strong&gt;&lt;br&gt;
In this case we use bindparam to produce a bound expression; the actual values are supplied from the Python list at execution time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fidvjgufc51utln2mvrxa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fidvjgufc51utln2mvrxa.png" alt="Image description" width="800" height="573"&gt;&lt;/a&gt;&lt;/p&gt;
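
&lt;p&gt;A runnable sketch of the same idea against an in-memory SQLite database, with the column set trimmed to the two fields the update needs (modern SQLAlchemy, 1.4 or later, is assumed):&lt;/p&gt;

```python
from sqlalchemy import (
    Column, Integer, MetaData, String, Table,
    bindparam, create_engine, select, update,
)

engine = create_engine("sqlite://")
meta = MetaData()
projects = Table(
    "projects", meta,
    Column("id", Integer, primary_key=True, autoincrement=True),
    Column("project_id", String),
    Column("cost", Integer),
)
meta.create_all(engine)

with engine.begin() as conn:
    conn.execute(projects.insert(), [
        {"project_id": "HBB-XC", "cost": 50},
        {"project_id": "HBB-DC", "cost": 40},
    ])

    # bindparam names must differ from the column names they target,
    # so "pid" and "new_cost" stand in for project_id and cost.
    stmt = (
        update(projects)
        .where(projects.c.project_id == bindparam("pid"))
        .values(cost=bindparam("new_cost"))
    )
    # One executemany call applies every update in the list.
    conn.execute(stmt, [
        {"pid": "HBB-XC", "new_cost": 75},
        {"pid": "HBB-DC", "new_cost": 60},
    ])

    costs = dict(conn.execute(
        select(projects.c.project_id, projects.c.cost)
    ).all())
```

&lt;p&gt;Because the statement is compiled once and executed with a list of parameter sets, the driver can use its executemany path instead of issuing one round trip per row.&lt;/p&gt;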

&lt;p&gt;&lt;strong&gt;Conclusion:&lt;/strong&gt;&lt;br&gt;
In this blog we discussed how to leverage SQLAlchemy's ability to reflect existing tables from other schemas or databases using Automap, along with the common issues that are often hard to debug for first-time Automap users.&lt;/p&gt;

&lt;p&gt;Also, we saw a use case where we filtered on composite conditions held in a pandas dataframe using the tuple_ expression.&lt;/p&gt;

&lt;p&gt;And finally, we bound Python values dynamically using bindparam, which lets you supply values at execution time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Disclaimers:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is a personal blog. The views and opinions expressed here are only those of the author and do not represent those of any organization or any individual with whom the author may be associated, professionally or personally.&lt;/p&gt;

</description>
      <category>python</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
