<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Umesh Kumar Dhakar</title>
    <description>The latest articles on Forem by Umesh Kumar Dhakar (@umeshdhakar).</description>
    <link>https://forem.com/umeshdhakar</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F37655%2F74a91750-ebe3-461d-8be7-3f775a9baa26.jpg</url>
      <title>Forem: Umesh Kumar Dhakar</title>
      <link>https://forem.com/umeshdhakar</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/umeshdhakar"/>
    <language>en</language>
    <item>
      <title>Trigger Azure Data Factory Pipeline from Event Grid (Using Webhook Endpoint)</title>
      <dc:creator>Umesh Kumar Dhakar</dc:creator>
      <pubDate>Wed, 12 Apr 2023 09:25:30 +0000</pubDate>
      <link>https://forem.com/umeshdhakar/trigger-azure-data-factory-pipeline-from-event-grid-using-webhook-endpoint-53hh</link>
      <guid>https://forem.com/umeshdhakar/trigger-azure-data-factory-pipeline-from-event-grid-using-webhook-endpoint-53hh</guid>
      <description>&lt;p&gt;In this post, I will explain how we can use Azure Event Grid Topics to trigger Data Factory pipeline. We will be using Access key for authentication and POST request for publishing a topic to Event Grid. I have also attached a snapshot for each step which can help to understand the steps easily.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Prerequisite:&lt;/strong&gt;&lt;br&gt;
Microsoft.EventGrid should be registered under the subscription.&lt;br&gt;
To confirm this, go to your subscription, then click Resource providers.&lt;br&gt;
Search for Event Grid. If the &lt;code&gt;Microsoft.EventGrid&lt;/code&gt; status is Registered, we are good to go; otherwise, select &lt;code&gt;Microsoft.EventGrid&lt;/code&gt; and click the Register button.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr9yhb831s9s95vhv7h7o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr9yhb831s9s95vhv7h7o.png" alt="Register EventGrid to Subscription"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will create 2 Azure resources: Event Grid Topic and Azure Data Factory.&lt;/p&gt;

&lt;h4&gt;
  
  
  Create Event Grid Topic Resource
&lt;/h4&gt;

&lt;p&gt;Here, I created a resource named &lt;code&gt;eg-tp-useast-dev-01&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ferj5l34idfq8blij5yle.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ferj5l34idfq8blij5yle.png" alt="Create EventGrid Topic Resource"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While creating the resource, make sure that &lt;code&gt;Enable access key and Shared Access Signature (SAS)&lt;/code&gt; is enabled.&lt;br&gt;
The remaining configuration can be kept at its defaults.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F110w1soxvs6ivsjgeau5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F110w1soxvs6ivsjgeau5.png" alt="Enable access key and Shared Access Signature"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Once the Event Grid Topic resource is deployed, go to the resource. Under the Overview tab, you can see that no subscriptions exist yet. Once we configure the ADF trigger, a subscription will appear here.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk2gv3q6j46ffnv58jw5f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk2gv3q6j46ffnv58jw5f.png" alt="No Subscription"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h4&gt;
  
  
  Create Azure Data Factory resource and launch the Studio
&lt;/h4&gt;

&lt;p&gt;I created a Data Factory resource &lt;code&gt;df-useast-dev-01&lt;/code&gt;.&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd5iyc8ddf1qci5savr7m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd5iyc8ddf1qci5savr7m.png" alt="Create Azure Data Factory resource"&gt;&lt;/a&gt;&lt;br&gt;
Go to the Author tab and create a sample pipeline. For simplicity, I just added a Wait activity inside a sample pipeline &lt;code&gt;PL_Sample&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd4815dlwnmvm23s7t20m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd4815dlwnmvm23s7t20m.png" alt="sample pipeline"&gt;&lt;/a&gt;&lt;br&gt;
Once you have created the pipeline, publish it.&lt;/p&gt;




&lt;h4&gt;
  
  
  Create a custom trigger
&lt;/h4&gt;

&lt;p&gt;Create a custom trigger for the pipeline &lt;code&gt;PL_Sample&lt;/code&gt; that we just created. Click Add Trigger and create a new trigger with the details below:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Give the trigger a name&lt;/li&gt;
&lt;li&gt;Select the type as Custom events&lt;/li&gt;
&lt;li&gt;Select the subscription&lt;/li&gt;
&lt;li&gt;You should see all available Event Grid topics; select the one we just created (eg-tp-useast-dev-01)&lt;/li&gt;
&lt;li&gt;Type the subject: tp-subject-01&lt;/li&gt;
&lt;li&gt;Type the event type: event-type-01&lt;/li&gt;
&lt;li&gt;Check &lt;code&gt;Start trigger on creation&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Click OK&lt;/li&gt;
&lt;li&gt;Publish the trigger&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;The subject and event type values are used to filter the events.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43ykm580rej7cp75bkg9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F43ykm580rej7cp75bkg9.png" alt="create a custom trigger"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;NOTE:&lt;/strong&gt; If the Event Grid topic is not visible in the &lt;code&gt;Event Grid topic name&lt;/code&gt; drop-down, follow these four steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open the Event Grid topic resource in a new tab, go to the Overview tab, and click JSON View.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftkyezaubyeg99i722m2u.png" alt="Resource ID"&gt;
&lt;/li&gt;
&lt;li&gt;Copy the Resource ID.
&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fowka3v0xouxwr5vy9e3n.png" alt="Resource ID"&gt;
&lt;/li&gt;
&lt;li&gt;Switch back to the ADF trigger page, change &lt;code&gt;Account selection method&lt;/code&gt; to Enter manually, and paste the Resource ID into the scope field.&lt;/li&gt;
&lt;li&gt;Click OK and publish the trigger.&lt;/li&gt;
&lt;/ol&gt;
&lt;/blockquote&gt;

&lt;p&gt;Once the trigger is published successfully, go to the Event Grid topic; you should see one event subscription.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw4yeo1oyo5laarnodppz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw4yeo1oyo5laarnodppz.png" alt="Event subscription"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;So far, we have connected an ADF pipeline to the Event Grid topic.&lt;/p&gt;

&lt;h4&gt;
  
  
  Publish an event to the Event Grid Topic
&lt;/h4&gt;

&lt;p&gt;Now we will publish an event to the Event Grid Topic by sending a POST request. I am using Postman to send the request.&lt;/p&gt;

&lt;p&gt;To send the request, we need the topic's endpoint and an access key.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to Overview tab and copy the &lt;code&gt;Topic Endpoint&lt;/code&gt; value&lt;/li&gt;
&lt;li&gt;Go to Access Keys and copy a Key&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faz4z2mwnfy5n9ci2wfnc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faz4z2mwnfy5n9ci2wfnc.png" alt="Event Grid Topic access key"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h4&gt;
  
  
  Create a POST request
&lt;/h4&gt;

&lt;p&gt;&lt;code&gt;URL&lt;/code&gt;: Use the Topic Endpoint as the request URL&lt;br&gt;
&lt;code&gt;header&lt;/code&gt;: Add a header with the key &lt;code&gt;aeg-sas-key&lt;/code&gt; and the access key as its value.&lt;br&gt;
&lt;code&gt;body&lt;/code&gt;: Add the body below in JSON format&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

[
  {
    "id": "I01",    
    "eventType": "event-type-01",
    "subject": "tp-subject-01",
    "data":{
      "key1": "val1",
      "key2": "val2"
    },
    "eventTime": "2023-04-06T11:26:07+05:30",
    "dataVersion": "1.0"
  }
]


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;id&lt;/code&gt;: id of the event (must be unique for each request)&lt;br&gt;
&lt;code&gt;eventType&lt;/code&gt;: same as eventType mentioned while creating ADF trigger&lt;br&gt;
&lt;code&gt;subject&lt;/code&gt;: same as subject mentioned while creating ADF trigger&lt;br&gt;
&lt;code&gt;data&lt;/code&gt;: parameters that can be passed to ADF trigger (explained at the end)&lt;/p&gt;
&lt;/blockquote&gt;
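&lt;p&gt;The same POST request can be scripted instead of using Postman. Below is a minimal Python sketch using only the standard library; the endpoint and access key values are placeholders you would replace with your topic's real values:&lt;/p&gt;

```python
import json
import urllib.request

# Placeholder values -- substitute your topic's endpoint and access key
TOPIC_ENDPOINT = "https://eg-tp-useast-dev-01.eastus-1.eventgrid.azure.net/api/events"
ACCESS_KEY = "YOUR-ACCESS-KEY"

def build_event(event_id, key1, key2):
    """Build an Event Grid event matching the ADF trigger's subject/eventType filters."""
    return [{
        "id": event_id,                   # must be unique for each request
        "eventType": "event-type-01",     # must match the event type set on the ADF trigger
        "subject": "tp-subject-01",       # must match the subject set on the ADF trigger
        "data": {"key1": key1, "key2": key2},
        "eventTime": "2023-04-06T11:26:07+05:30",
        "dataVersion": "1.0",
    }]

def publish(events):
    """POST the events to the topic endpoint, authenticating with the aeg-sas-key header."""
    req = urllib.request.Request(
        TOPIC_ENDPOINT,
        data=json.dumps(events).encode("utf-8"),
        headers={"aeg-sas-key": ACCESS_KEY, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status  # 200 means the event was accepted

events = build_event("I01", "val1", "val2")
# publish(events)  # uncomment once real endpoint/key values are filled in
```

A real run would call publish(events) and check for a 200 status, exactly like the Postman request shown in the screenshot.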

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fop60t9u4ynhrn779k9pc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fop60t9u4ynhrn779k9pc.png" alt="Create A POST request"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Click Send. If the status is &lt;code&gt;200 OK&lt;/code&gt;, the ADF pipeline should be triggered.&lt;br&gt;
Go to &lt;code&gt;ADF monitor tab &amp;gt; Pipeline Runs &amp;gt; Triggered&lt;/code&gt;.&lt;br&gt;
You should see that the pipeline has been triggered by the custom event trigger.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdha5ihxlm9rs13kzgcad.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdha5ihxlm9rs13kzgcad.png" alt="ADF monitor tab"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;As I mentioned earlier, we can pass parameters from the event topic to the ADF pipeline. To do that, follow these steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a pipeline parameter (here I created &lt;code&gt;par_input_file&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Edit the trigger and pass &lt;code&gt;@triggerBody().event.data.key1&lt;/code&gt; in Trigger Run Parameters.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When the pipeline gets triggered, the pipeline parameter should be initialized with the &lt;code&gt;event.data.key1&lt;/code&gt; value.&lt;/p&gt;
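&lt;p&gt;To illustrate what that expression resolves to, here is a small Python sketch (a plain-dictionary simulation of the lookup, not ADF's actual expression engine) extracting &lt;code&gt;key1&lt;/code&gt; from the event we posted earlier:&lt;/p&gt;

```python
# The event body we posted to the Event Grid topic earlier
event = {
    "id": "I01",
    "eventType": "event-type-01",
    "subject": "tp-subject-01",
    "data": {"key1": "val1", "key2": "val2"},
}

# @triggerBody().event.data.key1 walks the same path through the event payload,
# so the pipeline parameter par_input_file receives the value of key1
par_input_file = event["data"]["key1"]
```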

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvoqgjv5jtk6fvu28jfpy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvoqgjv5jtk6fvu28jfpy.png" alt="pipeline parameter"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note: Resources should be deleted after testing in order to avoid costs.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Thank you for reading this post. I hope it helped you understand and implement event-driven pipeline execution. I could not find all of these steps in a single post, so I decided to consolidate them into one and share it with other developers. Please let me know if you get stuck while implementing this; I would be happy to help. Happy learning.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>eventgrid</category>
      <category>dataengineering</category>
      <category>learnandshare</category>
    </item>
    <item>
      <title>How to Create and Publish Python Package.</title>
      <dc:creator>Umesh Kumar Dhakar</dc:creator>
      <pubDate>Mon, 28 Jan 2019 08:59:00 +0000</pubDate>
      <link>https://forem.com/umeshdhakar/how-to-create-and-publish-python-package-62o</link>
      <guid>https://forem.com/umeshdhakar/how-to-create-and-publish-python-package-62o</guid>
      <description>&lt;p&gt;Hello World!&lt;br&gt;
Python has a simple way of creating and publishing packages. &lt;a href="https://pypi.org/"&gt;Python Package Index&lt;/a&gt; is a package repository which maintains all the published package.&lt;br&gt;
Once I was curious about that packaging thing in python, So decided to learn this stuff. Along with that learning journey, I maintained a note and wrote necessary steps. I refined that note and posted here so that it may help someone who is looking into it. If you want to learn this stuff, this post can help you. You need a PyPI account to publish a package, So before starting, please &lt;a href="https://pypi.org/account/register/"&gt;Sign Up&lt;/a&gt; on PyPI, then proceed with the below process.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create a directory for the package.&lt;br&gt;
&lt;code&gt;mkdir demo_car&lt;/code&gt;  &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create file &lt;code&gt;__init__.py&lt;/code&gt; in the directory.&lt;br&gt;
&lt;code&gt;touch demo_car/__init__.py&lt;/code&gt;&lt;br&gt;
&lt;em&gt;(This file represents that the directory which contains it is a python package.)&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Write &lt;code&gt;'name = "&amp;lt;package-name&amp;gt;"'&lt;/code&gt; in file.&lt;br&gt;
&lt;code&gt;echo 'name="demo_car"' &amp;gt;&amp;gt; demo_car/__init__.py&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create python scripts for the package.&lt;br&gt;
&lt;code&gt;touch demo_car/engine.py&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Let's write some code in the engine.py script. Open an editor (I prefer &lt;strong&gt;Nano&lt;/strong&gt;), write some Python code or paste the sample below, and save the file with Ctrl+O.&lt;br&gt;
&lt;code&gt;nano demo_car/engine.py&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def start():
    print("Started!")

def stop():
    print("Stopped!")
~~~~
&amp;gt; If no editor is available then install nano with the following commands.  
&amp;gt; `apt-get update`
&amp;gt; `apt-get install nano`
&amp;gt; `export EDITOR=nano`

- Package files structure should be like this.

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── demo_car
│   ├── engine.py
│   └── __init__.py
├── LICENSE
├── README.md
└── setup.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Purpose of these files.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;setup.py&lt;/code&gt;
Build script for packaging.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;LICENSE&lt;/code&gt;
Every package should have a License. We use &lt;a href="https://choosealicense.com/licenses/mit/"&gt;MIT license&lt;/a&gt; here.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;README.md&lt;/code&gt;
This file contains a description of the package which will be shown on PyPI repository. &lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create these three files.&lt;br&gt;&lt;br&gt;
&lt;code&gt;touch setup.py LICENSE README.md&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;

&lt;p&gt;Open these files in the editor and update them.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For setup.py file&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import setuptools

with open("README.md", "r") as fh:
    long_description = fh.read()

setuptools.setup(
    name="demo_car",
    version="0.0.1",
    author="Umesh Kumar Dhakar",
    author_email="kumar886umesh@gmail.com",
    description="A demo_car package",
    long_description=long_description,
    long_description_content_type="text/markdown",
    url="https://github.com/umeshdhakar/sample-python-package",
    packages=setuptools.find_packages(),
    classifiers=[
        "Programming Language :: Python :: 3",
        "License :: OSI Approved :: MIT License",
        "Operating System :: OS Independent",
    ],
) 
~~~~  

-     For LICENSE file

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;MIT License&lt;/p&gt;

&lt;p&gt;Copyright (c) [2019] [demo_car]&lt;/p&gt;

&lt;p&gt;Permission is hereby granted, free of charge, to any person obtaining a copy&lt;br&gt;
of this software and associated documentation files (the "Software"), to deal&lt;br&gt;
in the Software without restriction, including without limitation the rights&lt;br&gt;
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell&lt;br&gt;
copies of the Software, and to permit persons to whom the Software is&lt;br&gt;
furnished to do so, subject to the following conditions:&lt;/p&gt;

&lt;p&gt;The above copyright notice and this permission notice shall be included in all&lt;br&gt;
copies or substantial portions of the Software.&lt;/p&gt;

&lt;p&gt;THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR&lt;br&gt;
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,&lt;br&gt;
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE&lt;br&gt;
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER&lt;br&gt;
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,&lt;br&gt;
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE&lt;br&gt;
SOFTWARE.&lt;/p&gt;


&lt;ul&gt;
&lt;li&gt;For the README file&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
## Demo_car
This is a sample package to learn the steps of creating and publishing package.
~~~~

- Now we are ready with all the necessary files so without waiting we are going to packaging it.  

- Install virtual environment package.
  *(If you already have then skip this step.)*
  `pip install virtualenv`

- Create and activate the environment.
  `virtualenv -p python3 .`
  `source bin/activate`

- Install or Update the setuptools and wheel packages in the environment.
  `pip install --upgrade setuptools wheel`

      *setuptools and wheel are the packages which use setup.py file to build 
      package.*

Now all Set! 😊, You are going to create the package, Are you excited? 😉

- Fire below command to build package.

    `python3 setup.py sdist bdist_wheel`

Hopefully, the package should be built and you can see package's compressed file in `dist` directory beside setup.py file.
Well Done my friend(👍), You have created It.
***
**Installing the created package.**

- Create a new environment at some other location and activate it.
  *(Now you know how to do this).*

- Copy that zip file in the new environment.
  `pip install &amp;lt;package&amp;gt;`
  (`pip install  demo-car-0.1.tar.gz`)

*Run* `pip list` *to show the installed package in the activated environment. Your package should be enlisted here.*

Congratulations! (😊) You have installed your package.
***
**It's time to publish the Package to PyPI so that it is publically available.**

- We need to install twine package to upload it but before that go to the path where `setup.py` exists and Activate the first environment you created for packaging then fire up the command to install or update twine.
  `pip install --upgrade twine`

- Final command to send it to **PyPI**.
  `twine upload dist/*`

It will prompt you for the account credentials, 
So enter your username and password, then it will start uploading your package.

**Congratulations! (🎉 ) Dear, You did it.**
Now your package should be available on the [PyPI](https://pypi.org/)
([Reference]( https://packaging.python.org/tutorials/packaging-projects/))

Thank you for reading 😊.
These are the minimum steps to create a package which I know. Please refer official documentation if you want to explore more 💡. 
Feel free to ask query, I am happy to help. Have you found any useful information which I missed here? then please comment below.


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
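&lt;p&gt;After installing, you can sanity-check the package from Python. The sketch below inlines the engine module's two functions so it runs standalone; with the package installed you would import them via &lt;code&gt;from demo_car import engine&lt;/code&gt; instead:&lt;/p&gt;

```python
import io
from contextlib import redirect_stdout

# engine.py contents from the demo_car package above, inlined so the sketch
# runs without installing the package
def start():
    print("Started!")

def stop():
    print("Stopped!")

# Capture stdout to confirm the functions behave as expected
buf = io.StringIO()
with redirect_stdout(buf):
    start()
    stop()
```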

</description>
      <category>python</category>
      <category>package</category>
      <category>linux</category>
      <category>writing</category>
    </item>
    <item>
      <title>Basic commands to start Git and GitHub</title>
      <dc:creator>Umesh Kumar Dhakar</dc:creator>
      <pubDate>Tue, 08 Jan 2019 10:43:33 +0000</pubDate>
      <link>https://forem.com/umeshdhakar/basic-commands-to-start-git-and-github-4157</link>
      <guid>https://forem.com/umeshdhakar/basic-commands-to-start-git-and-github-4157</guid>
      <description>&lt;p&gt;Hello Folks, Git and GitHub are very useful tools for people like us.&lt;br&gt;
It's a good habit to make notes of anything we are learning and there are many online tools which make note taking easier.&lt;br&gt;
&lt;a href="https://notepad.pw"&gt;notepad.pw&lt;/a&gt; and &lt;a href="https://simplenote.com/"&gt;simplenote&lt;/a&gt; are some clean and simple tools for note-taking. Happily, I have also developed this good habit and created a document while learning Git.&lt;br&gt;
Some good &lt;a href="https://dev.to/"&gt;Dev.to&lt;/a&gt; posts motivated me to pull out those documents and write them here also, which might help someone here and definitely me also.&lt;br&gt;
So here are the basics commands to start Git and host source code on GitHub(remote repository)&lt;/p&gt;

&lt;p&gt;Install git.&lt;br&gt;
&lt;code&gt;sudo apt-get install git&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create a git repository locally in a directory.&lt;br&gt;
&lt;code&gt;git init&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Set your globle user details.&lt;br&gt;
&lt;code&gt;git config --global user.name &amp;lt;name&amp;gt;&lt;/code&gt;&lt;br&gt;
&lt;code&gt;git config --global user.email &amp;lt;email&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create a README file to store the description of the repository.&lt;br&gt;
&lt;code&gt;touch README.md&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create a file to exclude some files and directories from the repository.&lt;br&gt;
(Write the relative paths of the files and directories; wildcards can also be used here.)&lt;br&gt;
&lt;code&gt;touch .gitignore&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Create the project's files or paste existing files into the current directory.&lt;/p&gt;

&lt;p&gt;Add changed files for committing.&lt;br&gt;
&lt;code&gt;git add &amp;lt;filename&amp;gt;&lt;/code&gt;&lt;br&gt;
or add all files at once&lt;br&gt;
&lt;code&gt;git add .&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Check the status of the repository.&lt;br&gt;
&lt;code&gt;git status&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;It's time to commit the first changes locally.&lt;br&gt;
&lt;code&gt;git commit -m "initial commit"&lt;/code&gt;&lt;/p&gt;
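&lt;p&gt;The local steps up to this point can also be scripted. Here is a minimal Python sketch (assuming git is installed; the temp directory and identity values are illustrative) that initializes a repo, adds a README, and makes the first commit:&lt;/p&gt;

```python
import subprocess
import tempfile
from pathlib import Path

def init_and_commit(message="initial commit"):
    """git init a temp directory, add a README, and make the first commit."""
    repo = Path(tempfile.mkdtemp())

    def git(*args):
        return subprocess.run(["git", *args], cwd=repo, check=True,
                              capture_output=True, text=True)

    git("init")
    # Per-repo identity, so the commit works even without the global config above
    git("config", "user.name", "Demo User")
    git("config", "user.email", "demo@example.com")
    (repo / "README.md").write_text("# demo repo\n")
    git("add", ".")
    git("commit", "-m", message)
    return git("log", "--oneline").stdout

log = init_and_commit()
```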

&lt;p&gt;Everything we did is stored locally. Now we will link the local repository to a remote repository.&lt;/p&gt;

&lt;p&gt;Create a fresh repository on GitHub (we can now create unlimited private repos on GitHub as well; thanks, GitHub, for this New Year gift 😊).&lt;br&gt;
Click the green 'Clone or download' button and copy the URL of the repository.&lt;/p&gt;

&lt;p&gt;Link the remote repository with the local repository.&lt;br&gt;
&lt;code&gt;git remote add origin &amp;lt;repo_url&amp;gt;&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Push the changes to the master branch of remote repository.&lt;br&gt;
&lt;code&gt;git push -u origin master&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;One extra command. 💡&lt;/strong&gt;&lt;br&gt;
If you accidentally committed changes or want to undo the last commit, these commands can save you time (note that &lt;code&gt;--hard&lt;/code&gt; discards the commit's changes from your working tree):&lt;br&gt;
&lt;code&gt;git reset --hard HEAD^&lt;/code&gt;&lt;br&gt;
&lt;code&gt;git push -f&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Thank you for reading 😊.&lt;br&gt;
There are many more commands out there; these are only the basics to help you get started. Please let me know if you have any doubts, I am happy to help you out.&lt;/p&gt;

</description>
      <category>git</category>
      <category>linux</category>
      <category>github</category>
      <category>motivation</category>
    </item>
    <item>
      <title>Auto backup of Databases in Postgres Container.</title>
      <dc:creator>Umesh Kumar Dhakar</dc:creator>
      <pubDate>Mon, 10 Sep 2018 07:11:11 +0000</pubDate>
      <link>https://forem.com/umeshdhakar/auto-backup-of-databases-in-postgres-container-4p56</link>
      <guid>https://forem.com/umeshdhakar/auto-backup-of-databases-in-postgres-container-4p56</guid>
      <description>&lt;p&gt;&lt;strong&gt;1. Create A file in Container.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;touch /path/&amp;lt;file&amp;gt; &lt;/code&gt;&lt;br&gt;
    (ex. touch /data/bkp_cmd.sh)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Write the backup command to that file.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;echo "pg_dumpall -U postgres -f /path/file" &amp;gt;&amp;gt; &amp;lt;file&amp;gt;&lt;/code&gt;&lt;br&gt;
    (e.g. echo "pg_dumpall -U postgres -f /data/files/backup_$(date +%Y%m%d).sql" &amp;gt;&amp;gt; /data/bkp_cmd.sh)&lt;/p&gt;
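&lt;p&gt;If you prefer generating the dated backup path outside the shell, here is a small Python sketch mirroring the &lt;code&gt;date +%Y%m%d&lt;/code&gt; interpolation used above (the prefix and suffix are the same illustrative paths):&lt;/p&gt;

```python
from datetime import date

def backup_filename(prefix="/data/files/backup_", suffix=".sql"):
    """Mirror the shell's date +%Y%m%d substitution, producing a dated backup path."""
    return f"{prefix}{date.today():%Y%m%d}{suffix}"

name = backup_filename()
```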

&lt;p&gt;&lt;strong&gt;3. Make the file executable.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;chmod 777 &amp;lt;file&amp;gt;&lt;/code&gt;&lt;br&gt;
    (e.g. chmod 777 /data/bkp_cmd.sh)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Append to the cron file.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;crontab -e&lt;/code&gt;&lt;br&gt;
    &lt;code&gt;00 00 */15 * * /data/bkp_cmd.sh &amp;gt;&amp;gt; /data/logs/cron_logs.txt 2&amp;gt;&amp;amp;1&lt;/code&gt;&lt;br&gt;
    You can generate the cron expression &lt;a href="https://crontab-generator.org/"&gt;here.&lt;/a&gt;&lt;br&gt;
    If no editor is installed, append the command to the cron file via&lt;br&gt;
    &lt;code&gt;(crontab -l &amp;amp;&amp;amp; echo "00 00 */15 * * /data/bkp_cmd.sh &amp;gt;&amp;gt; /data/backup_cron_logs.txt 2&amp;gt;&amp;amp;1") | crontab -&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Check whether the cron service is running.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;service cron status&lt;/code&gt;&lt;br&gt;
    If it is not running, start it with&lt;br&gt;
    &lt;code&gt;service cron start&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Wait and check for the backup file at the scheduled time.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Thanks for reading! 😊&lt;br&gt;
Please leave a comment if you have any suggestions or doubts.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>database</category>
      <category>learning</category>
      <category>postgres</category>
    </item>
    <item>
      <title>Share file between Host System and Docker Container.</title>
      <dc:creator>Umesh Kumar Dhakar</dc:creator>
      <pubDate>Tue, 04 Sep 2018 10:05:11 +0000</pubDate>
      <link>https://forem.com/umeshdhakar/share-file-between-host-system-and-docker-container-5al6</link>
      <guid>https://forem.com/umeshdhakar/share-file-between-host-system-and-docker-container-5al6</guid>
      <description>&lt;p&gt;&lt;strong&gt;1. Create a folder in Host System.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;C:\\&amp;lt;directory-name&amp;gt; &lt;/code&gt;&lt;br&gt;
    (eg. C:\\Tunnel)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Add this folder to the VM's Shared Folders list.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;Go to: Machine &amp;gt; Settings &amp;gt; Shared Folders&lt;/code&gt;&lt;br&gt;
    (Add the folder here.)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Create folder in VM.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;mkdir /&amp;lt;directory-name&amp;gt;&lt;/code&gt;&lt;br&gt;
    (eg. mkdir /vm_tunnel)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Mount Shared Folder in VM directory.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;mount -t vboxsf &amp;lt;host-directory-name&amp;gt; /&amp;lt;vm-directory-name&amp;gt;&lt;/code&gt;&lt;br&gt;
    (eg. mount -t vboxsf Tunnel /vm_tunnel)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Test sharing by creating a file in Tunnel or /vm_tunnel and checking that it appears in&lt;br&gt;
/vm_tunnel or Tunnel.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Share a file between the Host and the Container.&lt;/strong&gt;&lt;br&gt;
    &lt;code&gt;docker cp &amp;lt;container&amp;gt;:/path/to/file/in/container /path/of/vm-directory/where/host-folder/is/mounted&lt;/code&gt;&lt;br&gt;
    (e.g. docker cp foo:/data/bar.txt /vm_tunnel/)&lt;/p&gt;
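&lt;p&gt;Step 6's &lt;code&gt;docker cp&lt;/code&gt; invocation can be templated when you copy files often. A tiny Python helper (using the same illustrative container and paths as above) that builds the command's argument list:&lt;/p&gt;

```python
def docker_cp_cmd(container, container_path, mounted_dir):
    """Build the docker cp argv that copies a file out of a container into the mounted share."""
    return ["docker", "cp", f"{container}:{container_path}", mounted_dir]

# Same example as step 6: copy /data/bar.txt from container foo into /vm_tunnel/
cmd = docker_cp_cmd("foo", "/data/bar.txt", "/vm_tunnel/")
# To actually run it: subprocess.run(cmd, check=True)
```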

&lt;p&gt;Thanks for reading! 😊&lt;/p&gt;

</description>
      <category>docker</category>
      <category>linux</category>
      <category>devtips</category>
    </item>
    <item>
      <title>How to build a docker image which can restore databases whenever a new container is created from it</title>
      <dc:creator>Umesh Kumar Dhakar</dc:creator>
      <pubDate>Wed, 29 Aug 2018 10:18:22 +0000</pubDate>
      <link>https://forem.com/umeshdhakar/how-to-build-a-docker-image-which-can-restore-databases-whenever-a-new-container-is-created-from-it-dbm</link>
      <guid>https://forem.com/umeshdhakar/how-to-build-a-docker-image-which-can-restore-databases-whenever-a-new-container-is-created-from-it-dbm</guid>
      <description>&lt;p&gt;&lt;strong&gt;1. Run a postgres container.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;docker container run --name &amp;lt;name&amp;gt; --detach -p 5432:5432 -e POSTGRES_PASSWORD=&amp;lt;my-pass&amp;gt; postgres &lt;/code&gt;&lt;br&gt;
(ex: docker container run --name store --detach -p 5432:5432 -e POSTGRES_PASSWORD=my-pass postgres)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Create a directory in container to copy backup file.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;docker exec &amp;lt;container_name&amp;gt; mkdir /&amp;lt;dir_name&amp;gt;&lt;/code&gt;&lt;br&gt;
(ex: docker exec store mkdir /data)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Copy a backup file in container from Host or from another container.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;docker cp /path/of/backup_file/database_bkp.sql &amp;lt;container_name&amp;gt;:/path/to/copy/file&lt;/code&gt;&lt;br&gt;
(ex: docker cp /volume1/database_bkp.sql store:/data)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Install an editor in container (nano or vim etc...).&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;apt-get update&lt;br&gt;
 apt-get install vim&lt;br&gt;
 export EDITOR=vim&lt;br&gt;
&lt;/code&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Create a file in the /docker-entrypoint-initdb.d directory.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;touch /docker-entrypoint-initdb.d/&amp;lt;file_name&amp;gt;&lt;/code&gt;&lt;br&gt;
(ex: touch /docker-entrypoint-initdb.d/restore_db.sh)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Write the restore script in restore_db.sh.&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;&lt;code&gt;#!/bin/bash&lt;br&gt;
set -e&lt;br&gt;
psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" -f /data/database_bkp.sql&lt;/code&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;(If you dumped only a single database using pg_dump, then please mention the database name in the script as well:)&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;code&gt;psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" -d &amp;lt;database-name&amp;gt; -f /data/database_bkp.sql&lt;/code&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;7. Make the file executable.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;chmod 755 /docker-entrypoint-initdb.d/restore_db.sh&lt;/code&gt;&lt;/p&gt;
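&lt;p&gt;Steps 5-7 can be bundled into one helper. A Python sketch (demonstrated against a temp directory; inside the container the target would be /docker-entrypoint-initdb.d) that writes the restore script and marks it executable:&lt;/p&gt;

```python
import os
import tempfile

# The restore script from step 6
RESTORE_SCRIPT = """#!/bin/bash
set -e
psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" -f /data/database_bkp.sql
"""

def write_restore_script(directory):
    """Write restore_db.sh into the given init directory and chmod it to 755."""
    path = os.path.join(directory, "restore_db.sh")
    with open(path, "w") as fh:
        fh.write(RESTORE_SCRIPT)
    os.chmod(path, 0o755)
    return path

# Demo against a temp dir; in the container, pass "/docker-entrypoint-initdb.d"
demo = write_restore_script(tempfile.mkdtemp())
```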

&lt;p&gt;&lt;strong&gt;8. Build a new image from running container.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;docker commit &amp;lt;running-container&amp;gt; &amp;lt;new-image&amp;gt;&lt;/code&gt;&lt;br&gt;
(ex: docker commit store store_new)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;9. Verify that image has built properly.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;docker image ls&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;10. Run a container from the newly built image on a different port.&lt;/strong&gt;&lt;br&gt;
&lt;code&gt;docker container run --name &amp;lt;name&amp;gt; --detach -p 4432:5432 -e POSTGRES_PASSWORD=&amp;lt;my-pass&amp;gt; &amp;lt;new-image&amp;gt;&lt;/code&gt;&lt;br&gt;
(eg: docker container run --name store_postgres --detach -p 4432:5432 -e POSTGRES_PASSWORD=my-pass store_new)&lt;/p&gt;

&lt;p&gt;This image contains the restore script, so the database is restored automatically whenever a new container is created from it.&lt;/p&gt;

&lt;p&gt;Thanks for reading! 😊 This is my first post. Please let me know if you face any issues, and please leave feedback.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>postgres</category>
      <category>todayilearned</category>
      <category>database</category>
    </item>
  </channel>
</rss>
