<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Arockia Nirmal Amala Doss</title>
    <description>The latest articles on Forem by Arockia Nirmal Amala Doss (@arockianirmal26).</description>
    <link>https://forem.com/arockianirmal26</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1056866%2Fef60917d-62ed-434a-ac06-46381bac7d4c.jpeg</url>
      <title>Forem: Arockia Nirmal Amala Doss</title>
      <link>https://forem.com/arockianirmal26</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/arockianirmal26"/>
    <language>en</language>
    <item>
      <title>From On-Prem Microsoft SQL Server to AWS Serverless - Migrating the Business Intelligence and Application Data Landscape</title>
      <dc:creator>Arockia Nirmal Amala Doss</dc:creator>
      <pubDate>Mon, 10 Apr 2023 01:09:46 +0000</pubDate>
      <link>https://forem.com/aws-builders/from-on-prem-microsoft-sql-server-to-aws-serverless-migrating-business-intelligence-application-data-landscape-3if7</link>
      <guid>https://forem.com/aws-builders/from-on-prem-microsoft-sql-server-to-aws-serverless-migrating-business-intelligence-application-data-landscape-3if7</guid>
      <description>&lt;h1&gt;Introduction&lt;/h1&gt;

&lt;p&gt;
In today's fast-paced business world, companies are constantly seeking ways to streamline their operations, increase efficiency, and reduce costs. One way to achieve these goals is by leveraging the power of cloud computing and serverless technologies. This was the case for a company I worked for, which had been using Microsoft SQL Server, the SSIS ETL tool, and Reporting Services for its business intelligence and internal application needs. &lt;br&gt;
In this blog post, I will summarize how we successfully migrated the complete business intelligence (OLAP) and application (OLTP) database landscape to AWS serverless services, including Amazon Aurora RDS, Lambda, EventBridge, Glue, S3, Redshift, and QuickSight, and give a high-level overview of the architecture and the process.
&lt;/p&gt;

&lt;h1&gt;Old Architecture (On-Premises)&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--LE2uF4ZT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l3j3kgw5fyqr5da5a3ck.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--LE2uF4ZT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/l3j3kgw5fyqr5da5a3ck.jpg" alt="Old Architecture" width="800" height="342"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;
In the company we mainly had two categories of data: the first is the worldwide sales/product data, which is refreshed every morning and published as flat files on the company intranet, available to subsidiaries worldwide. The second is the local sales data from the European subsidiary.&lt;br&gt;
&lt;b&gt;Part 1 &lt;/b&gt;: A Python script is scheduled with Windows Task Scheduler on an on-prem Windows server. Every morning this script downloads the worldwide sales/product flat files from the intranet and places them on the same on-prem Windows server of our European subsidiary. Once the flat files have arrived in the folder, the SSIS ETL packages are triggered. &lt;br&gt;
&lt;b&gt;Part 2 &lt;/b&gt;: The SSIS ETL packages read the flat files, then transform and load the product metadata and worldwide sales data into the European SQL Server staging database. The product metadata is used by the European sales website as well as by the CRM; the sales data is used for analysis purposes.&lt;br&gt;
&lt;b&gt;Part 3 &lt;/b&gt;: The data in the staging database is further transformed into facts and dimensions, which serve as the base tables for reporting. Once the ETL jobs are completed, the internal employees are notified by email and can access the business intelligence/sales SSRS reports.


&lt;/p&gt;
&lt;h3&gt;Drawbacks of the above architecture:&lt;/h3&gt;

&lt;ol&gt;
  &lt;li&gt;Though we enjoyed full ownership of the on-premises data infrastructure, over time we noticed performance degradation as the amount of data we processed every day grew exponentially. Scalability and performance became major concerns.&lt;/li&gt;
  &lt;li&gt;We always relied on the expert freelance consultant who had set up the data infrastructure to analyze and fix infrastructure issues.&lt;/li&gt;
  &lt;li&gt;Security patches and upgrades had to be applied manually, which further complicated maintenance.&lt;/li&gt;
  &lt;li&gt;Rising costs for hardware and licenses were an additional issue.&lt;/li&gt;
&lt;/ol&gt;



&lt;h1&gt;New Architecture (AWS Serverless)&lt;/h1&gt;

&lt;p&gt;
Considering the above drawbacks, we decided to move the complete data landscape to AWS. Given the knowledge bases and documentation available online, we decided to first build a PoC, which helped us make a successful case to management. Below is a rough illustration of the new architecture we developed over time.
&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_9SX4IJA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sksl8ro4vhe3rz1bigkr.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_9SX4IJA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/sksl8ro4vhe3rz1bigkr.jpg" alt="New Architecture" width="800" height="455"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;b&gt;Part 1&lt;/b&gt;: We created a Lambda function, scheduled with EventBridge, which downloads the required flat files every morning and puts them into the Raw bucket of our S3 data lake. &lt;br&gt;&lt;br&gt;
&lt;b&gt;Part 2&lt;/b&gt;: Once the files are in the Raw bucket, a Glue ETL job is triggered to refine the data in the flat files. For example, here we copy only the required columns from the Raw to the Refined stage. &lt;br&gt;&lt;br&gt;
&lt;b&gt;Part 3&lt;/b&gt;: Once the files are in the Refined bucket, another set of Glue ETL jobs is triggered to transform the data and copy the required rows into Aurora Serverless RDS, Redshift Serverless, and the Reporting S3 bucket. To reduce costs, we periodically unload cold data, as defined by our business departments, into the Refined S3 bucket.&lt;br&gt;&lt;br&gt;
&lt;b&gt;Part 4&lt;/b&gt;: Aurora Serverless RDS is our OLTP database, used by our sales apps and CRM, and it now contains the European sales and customer data. For data warehousing and analysis/reporting purposes we move the necessary data from the OLTP database to Redshift on an hourly basis using another set of Glue ETL jobs. &lt;br&gt;&lt;br&gt;
&lt;b&gt;Part 5&lt;/b&gt;: We have a set of reports in QuickSight that connect to the Redshift data warehouse and the S3 reporting bucket. To query the data in S3 we leverage the Glue crawler, the Data Catalog, and Athena. &lt;br&gt;&lt;br&gt;
&lt;b&gt;Initial Database Migration&lt;/b&gt;: We performed the initial database migration with AWS Database Migration Service (DMS) and the Schema Conversion Tool. DMS features like data validation and ongoing replication from source to target enabled a smooth transition. Earlier we used SQL Server as both the OLTP and OLAP database; now we have Aurora for OLTP and Redshift for OLAP. &lt;br&gt;&lt;/p&gt;
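&lt;p&gt;To make the Part 2 refinement step concrete, here is a minimal sketch of "copy only the required columns" in plain Python. Our actual jobs are Glue (PySpark) scripts operating on S3 objects, and the column names below are purely illustrative.&lt;/p&gt;

```python
import csv
import io

# Illustrative subset of columns the downstream jobs actually need
KEEP_COLUMNS = ["product_id", "country", "units_sold", "revenue"]

def refine(raw_csv_text, keep=KEEP_COLUMNS):
    """Copy only the required columns from a Raw-stage CSV, mirroring
    what the Glue ETL job does when promoting a file to Refined."""
    reader = csv.DictReader(io.StringIO(raw_csv_text))
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=keep)
    writer.writeheader()
    for row in reader:
        writer.writerow({column: row[column] for column in keep})
    return out.getvalue()
```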


&lt;h3&gt;Best Practices &amp;amp; What We Gained After Moving to AWS&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;The main advantage for us is flexibility in terms of cost and scalability. We now know our exact expenses for each AWS service, which in turn helps us fine-tune our use cases. We have also set up database auto-scaling policies based on usage.&lt;/li&gt;
&lt;li&gt;We offload read traffic from the main writer database instance to separate read-only instances, which boosted performance further.&lt;/li&gt;
&lt;li&gt;Database backups &amp;amp; replication are easier to set up.&lt;/li&gt;
&lt;li&gt;We started using Infrastructure as Code with Terraform to provision AWS resources and version-control our data infrastructure.&lt;/li&gt;
&lt;li&gt;Since we opted for mostly serverless services, we no longer worry about managing hardware.&lt;/li&gt;
&lt;/ol&gt;



&lt;p&gt;Migrating to AWS was one of the best decisions the company made, and it is now exploring new AWS services like EMR and SageMaker to extract deeper insights from the data. Thanks for reading, and feel free to comment or write to me with your thoughts!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>database</category>
      <category>data</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Guide to Creating a Secure and Efficient Personal AWS Account in 2023 - Best Practices for First-Time Users</title>
      <dc:creator>Arockia Nirmal Amala Doss</dc:creator>
      <pubDate>Tue, 04 Apr 2023 09:06:25 +0000</pubDate>
      <link>https://forem.com/aws-builders/guide-to-creating-a-secure-and-efficient-personal-aws-account-in-2023-best-practices-for-first-time-users-4cfn</link>
      <guid>https://forem.com/aws-builders/guide-to-creating-a-secure-and-efficient-personal-aws-account-in-2023-best-practices-for-first-time-users-4cfn</guid>
      <description>&lt;p&gt;Amazon Web Services (AWS) is a cloud computing platform that offers a wide range of services, including compute power, storage, and databases, as well as tools for machine learning, security, and more. Opening an AWS account is a simple process, but it's important to follow best practices to ensure that your account is secure and your resources are optimized for cost and performance. In this blog, I'll walk you through the steps of opening an AWS account and provide you with some key best practices to help you get started on the right foot.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zyvkrnDt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iph7j2xcz87vxoohwdpx.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zyvkrnDt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/iph7j2xcz87vxoohwdpx.jpg" alt="Overview" width="747" height="203"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Content of this Article&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Sign up for an AWS account&lt;/li&gt;
&lt;li&gt;Log in as the root user &amp;amp; secure it with MFA&lt;/li&gt;
&lt;li&gt;Create an Admin user with required permissions&lt;/li&gt;
&lt;li&gt;Set up an account alias&lt;/li&gt;
&lt;li&gt;Change payment currency preference&lt;/li&gt;
&lt;li&gt;Update security challenge questions&lt;/li&gt;
&lt;li&gt;Set up the default region/language&lt;/li&gt;
&lt;li&gt;Set up the AWS CLI&lt;/li&gt;
&lt;li&gt;Billing alerts/alarm&lt;/li&gt;
&lt;li&gt;Best practices &amp;amp; Recommendations&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;Sign up for an AWS account&lt;/h2&gt;

&lt;p&gt;Signing up for an AWS account is a five-step process, which you can start with the following link:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://portal.aws.amazon.com/billing/signup#/start/email"&gt;Sign up for AWS.&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the initial step, you will be asked for a root user email address and an account name. The root user should be used strictly for administrative functions like account recovery, billing, etc. For developing and using services in AWS, it is always recommended to create separate users with minimal required privileges; we will create an Admin user later in this article. After this step you need to verify your email address with a one-time password sent to your root email address.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1:&lt;/strong&gt; After the email verification, you are asked to create a password for the root user. AWS requires that the password be between 8 and 128 characters long. Make sure to create a strong password with special characters.&lt;br&gt;
&lt;strong&gt;Step 2:&lt;/strong&gt; Here you will be asked for your contact information and your usage type (choose Personal for your own projects).&lt;br&gt;
&lt;strong&gt;Step 3:&lt;/strong&gt; Now it's time to enter your billing information. Your credit/debit card will be validated in this step.&lt;br&gt;
&lt;strong&gt;Step 4:&lt;/strong&gt; Your identity will be confirmed in this step. You can opt to receive either a text message or a voice call.&lt;br&gt;
&lt;strong&gt;Step 5:&lt;/strong&gt; You will be asked to select a support plan. The 'Basic support - Free' plan is recommended for new users who are just getting started with AWS.&lt;br&gt;
Congratulations, you have now successfully created your personal AWS account! Next, we will perform some essential setup and secure the account.&lt;/p&gt;
&lt;h2&gt;Log in as the root user &amp;amp; secure it with MFA&lt;/h2&gt;

&lt;p&gt;Once the account has been created, you might be redirected to the login page, where you can log in again as the root user. To secure the root user, we need to enable MFA (multi-factor authentication). Search for IAM on the home page of the AWS Management Console and open the IAM dashboard.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ip8C1p8q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yn4aipjrl1sqoe2h24ej.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ip8C1p8q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/yn4aipjrl1sqoe2h24ej.png" alt="Secure User with MFA" width="752" height="383"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Under 'Security recommendations' click 'Add MFA' and follow the instructions. For example, you could use the Google or Microsoft authenticator app on your smartphone. Also make sure that the root user has no active access keys, since the root user will not be used to perform daily tasks.&lt;/p&gt;
&lt;h2&gt;Create an Admin user with required permissions&lt;/h2&gt;

&lt;p&gt;When setting up a new AWS account, you create a root user account. The root user is a special entity that has full access to the account and can perform all actions, including changing the payment methods or closing the account. Because of this level of access, you should set up additional users to perform the daily tasks related to your account. It is recommended that you create separate users for specific roles and functions.&lt;/p&gt;

&lt;p&gt;Again, we'll use the IAM service to create users and assign them permissions. Before setting up a new user, we'll create a user group. User groups let you specify permissions for multiple users, which can make it easier to manage the permissions for those users. For example, you could have a user group called Admins and give that user group typical administrator permissions. Any user in that user group automatically has Admins group permissions.&lt;/p&gt;

&lt;p&gt;In the IAM console, choose User groups in the left-side navigation and then choose Create group. Enter the user group name (in this case, Admins), then scroll down to the Attach permissions policies section. Search for "AdministratorAccess", select the box next to the policy named "AdministratorAccess", scroll down, and choose Create group. Once the user group has been created, select Users in the left-side navigation bar, choose Add users, and link the new user with the user group as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_w-gZLIf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6cfzfhtqcwywr3w23sfi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_w-gZLIf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6cfzfhtqcwywr3w23sfi.png" alt="Create Admin User" width="752" height="578"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--x1k0X5FZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0sh02vljgol1rrumzu6m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--x1k0X5FZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0sh02vljgol1rrumzu6m.png" alt="Assign Admin Group" width="752" height="419"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once you review and create the new user, the console sign-in details will be shown which includes URL, username and password. Don't forget to secure this user too with MFA.&lt;/p&gt;
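&lt;p&gt;If you prefer the command line over the console, the same group and user setup can be sketched with the AWS CLI. This is an illustrative sketch: the user name admin-user is a placeholder, and the commands must be run with credentials that are allowed to create IAM resources.&lt;/p&gt;

```shell
# Create the Admins group and attach the managed AdministratorAccess policy
aws iam create-group --group-name Admins
aws iam attach-group-policy \
    --group-name Admins \
    --policy-arn arn:aws:iam::aws:policy/AdministratorAccess

# Create the day-to-day user (placeholder name) and add it to the group
aws iam create-user --user-name admin-user
aws iam add-user-to-group --group-name Admins --user-name admin-user
```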
&lt;h2&gt;Set up an account alias&lt;/h2&gt;

&lt;p&gt;Let's set an alias for your account, which should be easier to remember than the 12-digit account ID. To set it, navigate to the Identity and Access Management (IAM) dashboard. Find the account ID on the right-hand side and click Create or Change under the AWS account alias. This alias needs to be globally unique across all AWS accounts, so your first choice may not be available.&lt;/p&gt;
&lt;h2&gt;Change payment currency preference&lt;/h2&gt;

&lt;p&gt;This prevents your card issuer from charging a fee for transactions in other currencies. Click your account name (top right) -&amp;gt; Account to find the Payment Currency Preference section.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sFMfu3GG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/89302lmv7pe8x3eipqc2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sFMfu3GG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/89302lmv7pe8x3eipqc2.png" alt="Currency Preference" width="752" height="95"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;Update security challenge questions&lt;/h2&gt;

&lt;p&gt;Improve the security of your AWS account by adding security challenge questions. AWS uses these to help identify you as the owner of your AWS account if you ever need to contact AWS customer service for help. Click your account name (top right) -&amp;gt; Account to find the Configure Security Challenge Questions section.&lt;/p&gt;
&lt;h2&gt;Set up the default region/language&lt;/h2&gt;

&lt;p&gt;Click your account name (top right) -&amp;gt; Account to find the Unified Settings section, where you can set the default region and language. Setting a default region isn't just a newbie issue; it can save a lot of mini heart attacks when people log in to an account and think they're missing resources, only to find out they're in the wrong region!&lt;/p&gt;
&lt;h2&gt;Set up the AWS CLI&lt;/h2&gt;

&lt;p&gt;The AWS CLI is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts.&lt;br&gt;
Go to IAM -&amp;gt; Users -&amp;gt; (newly created user) -&amp;gt; under 'Security credentials', go to 'Access keys' -&amp;gt; 'Create access key'. In the next window, 'Access key best practices &amp;amp; alternatives', choose 'Command Line Interface (CLI)' and then create the access key (adding tags is optional). Now save the access key ID and the secret access key (download the .csv file). Follow the instructions below to install the CLI for your OS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/getting-started-install.html"&gt;AWS CLI&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the installation is complete, you can confirm it by running the version command in your terminal (or PowerShell on Windows) as shown below; you should see a version number.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To configure the credentials, run the following command and enter the access key ID and secret access key of the user created earlier. You will also be asked for a default region name; the default output format can be left as json.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws configure
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After this, you can run the aws ec2 describe-vpcs command to check that the configuration is correct. Every new AWS account comes with default VPCs, so the output should list the VPCs in your account.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws ec2 describe-vpcs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
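&lt;p&gt;Under the hood, aws configure simply writes ini-style files under ~/.aws/ (credentials for the keys, config for region and output format). The sketch below shows the shape of the credentials file it produces; the key values are placeholders, not real credentials.&lt;/p&gt;

```python
import configparser

# The shape of ~/.aws/credentials after running `aws configure`
# (the key values below are placeholders, not real credentials)
sample = """\
[default]
aws_access_key_id = AKIAEXAMPLEKEYID
aws_secret_access_key = examplesecretkey
"""

config = configparser.ConfigParser()
config.read_string(sample)
print(config["default"]["aws_access_key_id"])  # -> AKIAEXAMPLEKEYID
```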



&lt;h2&gt;Billing alerts/alarm&lt;/h2&gt;

&lt;p&gt;Search for 'Billing' on the AWS home page to land in the AWS Billing Dashboard. In the navigation pane, choose Billing Preferences under Preferences. Select 'Receive Billing Alerts' and click 'Save preferences'. You can also select 'Receive PDF Invoice By Email' and 'Receive Free Tier Usage Alerts' if needed. Billing alerts are now enabled.&lt;br&gt;
Billing metrics live in the us-east-1 region, so make sure the console is switched to it by selecting US East (N. Virginia) us-east-1 in the region selector (top right corner, next to the Support button).&lt;br&gt;
Follow the instructions below to create or delete an alarm. In my case, I set an alarm for when overall costs exceed 10 dollars.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/monitor_estimated_charges_with_cloudwatch.html"&gt;Billing Alarm&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Best practices &amp;amp; Recommendations&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Rotate all your access key pairs and the root user password regularly&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Limit the tasks you perform with the root user (creating an administrative user, changing account settings, closing the AWS account, etc.). Never create access keys for the root user; if you must have one, rotate it regularly&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;When creating IAM users, make sure they have the most restrictive policies possible, with only enough permissions to complete their intended tasks&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Frequently audit your IAM roles and policies&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;To protect secrets in AWS, store them in AWS Secrets Manager and insert a descriptive reference to them in the application code. For example, the password for a production database can be stored in Secrets Manager under the name my_db_password_production&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use AWS Cost Explorer, which lets you view and analyze your costs and usage; the Cost Explorer user interface is free of charge&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Make use of the AWS Free Tier&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use tags to organize resources and to track your AWS costs at a detailed level&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
