<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Payal Gupta</title>
    <description>The latest articles on Forem by Payal Gupta (@payalgupta4639).</description>
    <link>https://forem.com/payalgupta4639</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F635107%2Fbe9872a1-c15a-4f0e-a03e-560b4be26b9e.png</url>
      <title>Forem: Payal Gupta</title>
      <link>https://forem.com/payalgupta4639</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/payalgupta4639"/>
    <language>en</language>
    <item>
      <title>Glue cross-account setup</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 11 Jan 2025 16:10:44 +0000</pubDate>
      <link>https://forem.com/aws-builders/glue-cross-account-setup-466p</link>
      <guid>https://forem.com/aws-builders/glue-cross-account-setup-466p</guid>
      <description>&lt;p&gt;This document will cover detailed steps on how to query glue DB catalog from Dremio in a cross-account setup using AWS Lake formation&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Use-case&lt;/strong&gt;&lt;br&gt;
Account A - Dremio is deployed here, and AWS Glue database Glue_DB_A is created and added as a source in Dremio&lt;/p&gt;

&lt;p&gt;Account B - AWS Glue database Glue_DB_B is created, and its data is located in an S3 bucket&lt;/p&gt;

&lt;p&gt;The customer wants to share the Glue_DB_B catalog with Glue_DB_A and query the data located in Account B from Dremio&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setup Diagram&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr62kmultg90vao8s706o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr62kmultg90vao8s706o.png" alt="Image description" width="800" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Role of each service in the given setup&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Lake Formation - To create a data mesh, simplify cross-account data sharing, and create resource links&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Resource Access Manager - To share resources and view the shared Data Catalog&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;IAM User - To provide cross-account read/write access to the S3 bucket so queries can run from Dremio&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Amazon Athena - To verify that Lake Formation access is working&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Steps&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Resource Sharing using Lake Formation and Resource Access Manager&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;First, we use Lake Formation and Resource Access Manager to share the Glue catalog from Account B with Account A&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps for Account-B:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Create a Glue DB named Glue_DB_B&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a Glue table in this DB, point it to the S3 location where the data resides, and provide the schema&lt;br&gt;
OR&lt;br&gt;
Use a Glue crawler to infer the schema from S3 and add the Glue table for you.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go to the Lake Formation console -&amp;gt; Data lake locations -&amp;gt; Register the same S3 location -&amp;gt; Use the default IAM role -&amp;gt; &lt;code&gt;AWSServiceRoleForLakeFormationDataAccess&lt;/code&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go to Lake Formation -&amp;gt; Databases -&amp;gt; Select Glue_DB_B -&amp;gt; Actions -&amp;gt; Grant -&amp;gt; Choose External account, enter the AWS Account-A ID -&amp;gt; Choose a specific table&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;For DB, grant Alter, Create table, Describe
For Table, grant Alter, Delete, Describe, Drop, Insert
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
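&lt;p&gt;If you prefer the CLI, the same grants can be issued from Account B with the AWS CLI. A minimal sketch; the database name glue_db_b, table name orders, and Account-A ID 111122223333 below are hypothetical:&lt;/p&gt;

```shell
# Grant DB-level permissions to Account A (run in Account B)
aws lakeformation grant-permissions \
    --principal DataLakePrincipalIdentifier=111122223333 \
    --resource '{"Database": {"Name": "glue_db_b"}}' \
    --permissions ALTER CREATE_TABLE DESCRIBE

# Grant table-level permissions on a specific table
aws lakeformation grant-permissions \
    --principal DataLakePrincipalIdentifier=111122223333 \
    --resource '{"Table": {"DatabaseName": "glue_db_b", "Name": "orders"}}' \
    --permissions ALTER DELETE DESCRIBE DROP INSERT
```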



&lt;ol&gt;
&lt;li&gt;Go to the Resource Access Manager console -&amp;gt; Shared by me in the left pane -&amp;gt; Resource shares.
You should be able to view your shared resources&lt;/li&gt;
&lt;/ol&gt;
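&lt;p&gt;The same check can also be done from the CLI (a sketch; the exact output fields may vary by CLI version):&lt;/p&gt;

```shell
# List resource shares owned by this account (run in Account B)
aws ram get-resource-shares --resource-owner SELF
```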

&lt;p&gt;&lt;strong&gt;Steps for Account-A:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Go to Resource Access Manager → Shared with me → Resource Shares → Accept your Resource Share&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now, go to Lake Formation -&amp;gt; Tables -&amp;gt; Your shared table will appear here -&amp;gt; Click on the table -&amp;gt; Actions -&amp;gt; Create resource link&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The table will now appear italicized in the Glue DB&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
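&lt;p&gt;The resource link can also be created programmatically via the Glue CreateTable API with a TargetTable pointing at the shared table. A sketch; the names glue_db_a, glue_db_b, orders, and the Account-B ID 444455556666 are hypothetical:&lt;/p&gt;

```shell
# Create a resource link in Account A that points at the shared table in Account B
aws glue create-table \
    --database-name glue_db_a \
    --table-input '{
        "Name": "orders_link",
        "TargetTable": {
            "CatalogId": "444455556666",
            "DatabaseName": "glue_db_b",
            "Name": "orders"
        }
    }'
```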

&lt;p&gt;&lt;strong&gt;Provide cross-account read/write access to the S3 bucket&lt;/strong&gt;  &lt;/p&gt;

&lt;p&gt;Steps to do so:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to Account B → S3 console&lt;/li&gt;
&lt;li&gt;Select your S3 bucket &lt;/li&gt;
&lt;li&gt;Go to the Permissions tab &lt;/li&gt;
&lt;li&gt;Edit the bucket policy and add the following policy (make sure to replace the AWS Account-A ID, IAM user name, and bucket name)
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::&amp;lt;AccountA-ID&amp;gt;:user/&amp;lt;username&amp;gt;"
            },
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::&amp;lt;bucket-name&amp;gt;/*"
        },
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::&amp;lt;AccountA-ID&amp;gt;:user/&amp;lt;username&amp;gt;"
            },
            "Action": [
                "s3:GetLifecycleConfiguration",
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::&amp;lt;bucket-name&amp;gt;"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
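&lt;p&gt;With the policy saved locally (e.g., as policy.json), it can also be applied from the CLI; the bucket name here is hypothetical:&lt;/p&gt;

```shell
# Apply the bucket policy to the bucket in Account B
aws s3api put-bucket-policy \
    --bucket my-data-bucket \
    --policy file://policy.json
```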



&lt;ol&gt;
&lt;li&gt;Add Glue catalog as a source in Dremio&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The last step is to add Glue_DB_A as a source in Dremio:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to Add Source&lt;/li&gt;
&lt;li&gt;Select AWS Glue Data Catalog&lt;/li&gt;
&lt;li&gt;Fill in the details - Name, Region, Authentication&lt;/li&gt;
&lt;li&gt;Hit Save&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You should be able to view the datasets from both Glue catalogs and run queries on them.&lt;/p&gt;

&lt;p&gt;Alternatively, you can run queries on the Glue source via Athena instead of Dremio.&lt;/p&gt;
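&lt;p&gt;For example, a quick Athena check of the shared table could look like this (a sketch; the resource link name and results bucket are hypothetical):&lt;/p&gt;

```shell
# Run a test query against the resource link from Account A via Athena
aws athena start-query-execution \
    --query-string 'SELECT * FROM "glue_db_a"."orders_link" LIMIT 10' \
    --result-configuration OutputLocation=s3://my-athena-results-bucket/
```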

</description>
      <category>aws</category>
      <category>analytics</category>
      <category>cloud</category>
    </item>
    <item>
      <title>How to use Glue crawler to add tables automatically</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 07 Dec 2024 12:54:12 +0000</pubDate>
      <link>https://forem.com/aws-builders/how-to-use-glue-crawler-to-add-tables-automatically-51c4</link>
      <guid>https://forem.com/aws-builders/how-to-use-glue-crawler-to-add-tables-automatically-51c4</guid>
      <description>&lt;p&gt;This document will cover the steps on how to use Glue crawler to extract data from S3 to automatically add tables to the glue DB and run queries on it from Dremio or Athena&lt;/p&gt;

&lt;p&gt;Setup Diagram&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0y5psbjosrl5bnrdowx2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0y5psbjosrl5bnrdowx2.png" alt="Setup Diagram" width="800" height="187"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Steps to follow&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create an S3 bucket and upload the raw data, e.g., CSV or JSON files.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go to the AWS Glue console and create a Glue DB&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go to the Tables page and select Add tables using crawler in the top right corner&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmm6kit1kbd8xmk49462h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmm6kit1kbd8xmk49462h.png" alt="Add Tables using Crawler" width="800" height="183"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This should land you on the AWS Glue crawler setup page&lt;/p&gt;

&lt;p&gt;Follow the steps below to fill in the details&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Name - Enter the Crawler name&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;Add data source &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Data source - Select S3&lt;/li&gt;
&lt;li&gt;Location of S3 data - Select In this account (if that’s the case)&lt;/li&gt;
&lt;li&gt;S3 path - Browse for the S3 bucket which contains the data, and don’t forget to add a forward slash at the end&lt;/li&gt;
&lt;li&gt;Subsequent crawler runs - Select Crawl all sub-folders&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Click Add an S3 data source&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Click Next → Configure security settings&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Click Create new IAM role and give the role a name. This creates the new IAM role the Glue crawler needs to read the data in the S3 bucket&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Next, Set output and scheduling&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Select the Target Database - you can choose default or create a new one&lt;/li&gt;
&lt;li&gt;Crawler schedule - On Demand&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;&lt;p&gt;Next → Review and Create → Create Crawler&lt;/p&gt;&lt;/li&gt;

&lt;li&gt;&lt;p&gt;The crawler has now been created, and you can run it&lt;/p&gt;&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F120k34rq3bb8nq9x7a7r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F120k34rq3bb8nq9x7a7r.png" alt="Run the crawler" width="800" height="170"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It will take a few minutes to crawl the data in the S3 bucket; once it is done, you should see the state as Ready&lt;/p&gt;

&lt;p&gt;Now, you should be able to see a table added in the Glue DB&lt;/p&gt;
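&lt;p&gt;The console steps above can also be sketched with the AWS CLI; the crawler name, role, database, and bucket below are hypothetical:&lt;/p&gt;

```shell
# Create an on-demand crawler that targets the S3 path (note the trailing slash)
aws glue create-crawler \
    --name s3-demo-crawler \
    --role AWSGlueServiceRole-demo \
    --database-name my_glue_db \
    --targets '{"S3Targets": [{"Path": "s3://my-data-bucket/raw/"}]}'

# Start the crawler, then poll its state until it returns to READY
aws glue start-crawler --name s3-demo-crawler
aws glue get-crawler --name s3-demo-crawler --query 'Crawler.State'
```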

&lt;ol&gt;
&lt;li&gt;Go to Dremio → Add the glue catalog as a source&lt;/li&gt;
&lt;li&gt;Name - Enter glue catalog name&lt;/li&gt;
&lt;li&gt;Region - Select the AWS region&lt;/li&gt;
&lt;li&gt;Authentication - AWS Access key&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Click Save and run queries on the Glue DB from Dremio or Athena!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloudskills</category>
      <category>learning</category>
    </item>
    <item>
      <title>Embracing the Future of Tech: Attend AWS re:Invent 2023 as an All Builders Welcome Grant Recipient</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Wed, 25 Oct 2023 12:43:26 +0000</pubDate>
      <link>https://forem.com/aws-builders/embracing-the-future-of-tech-attend-aws-reinvent-2023-as-an-all-builders-welcome-grant-recipient-i1e</link>
      <guid>https://forem.com/aws-builders/embracing-the-future-of-tech-attend-aws-reinvent-2023-as-an-all-builders-welcome-grant-recipient-i1e</guid>
      <description>&lt;p&gt;The journey of a thousand miles begins with a single step, they say. For me, that first step was the moment I received an email that would forever change my perspective on the future of cloud computing and digital transformation. &lt;/p&gt;

&lt;p&gt;Last year, I received the All Builders Welcome grant to attend AWS re:Invent 2022!&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feh6apwnejnmwvjgu3kqz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feh6apwnejnmwvjgu3kqz.png" alt="ABW Grant received" width="800" height="129"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Unfortunately, due to a delay in getting my US visa, I had to defer it to this year.&lt;/p&gt;

&lt;p&gt;I am thrilled to share that I got the US visa this year and will be attending AWS re:Invent 2023 in Las Vegas as an ABW Grant recipient!&lt;/p&gt;

&lt;h3&gt;
  
  
  The All Builders Welcome Grant: A Gateway to Knowledge and Opportunity
&lt;/h3&gt;

&lt;p&gt;AWS re:Invent is renowned as the largest gathering of the global cloud computing community. Each year, it attracts an array of visionaries, engineers, and experts who are at the forefront of technological advancement. &lt;/p&gt;

&lt;p&gt;The grant includes registration for re:Invent 2023, airfare to Las Vegas, Nevada, and hotel accommodations for the duration of the conference. &lt;/p&gt;

&lt;p&gt;Participants will have access to five days of re:Invent content and activities as well as a curated program designed to remove barriers and create opportunities for learning, career growth, and community building, including a welcome event, reserved seating in the keynotes, meetups, a mentoring luncheon with AWS leadership, and much more!&lt;/p&gt;

&lt;p&gt;To learn more about it, check out the official page - &lt;a href="https://reinvent.awsevents.com/community/all-builders-welcome/"&gt;https://reinvent.awsevents.com/community/all-builders-welcome/&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The Significance of AWS re:Invent
&lt;/h3&gt;

&lt;p&gt;AWS re:Invent is more than just a conference; it is a hub of innovation, knowledge-sharing, and networking. It is where the brightest minds come together to discuss, dissect, and debate the future of cloud technology and its applications. The event covers a vast spectrum of topics, from machine learning and artificial intelligence to serverless computing and security.&lt;/p&gt;

&lt;h3&gt;
  
  
  What to Expect
&lt;/h3&gt;

&lt;p&gt;As I prepare to embark on this incredible journey, I'm excited about several aspects of the event:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Knowledge and Insights&lt;/strong&gt;:&lt;br&gt;
AWS re:Invent is an unparalleled source of knowledge and insights. It's a place where you can learn from the best in the industry, stay updated on the latest developments, and gain a deeper understanding of cloud technology's future.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Networking&lt;/strong&gt;:&lt;br&gt;
The event provides a unique opportunity to connect with professionals, experts, and peers who share my passion for technology. These connections can lead to collaborations, idea exchanges, and lifelong friendships.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Innovation&lt;/strong&gt;:&lt;br&gt;
Expectations for groundbreaking announcements and product launches run high at AWS re:Invent. It's the place where the technology of tomorrow is unveiled today.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Personal Growth&lt;/strong&gt;:&lt;br&gt;
Beyond the technical aspects, attending a conference of this magnitude can help me grow as an individual and professional. It will broaden my horizons and challenge my thinking.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Thank You, AWS
&lt;/h3&gt;

&lt;p&gt;I want to express my profound gratitude to the AWS team for providing this incredible opportunity to me. It is not only an acknowledgment of my passion and dedication but also an investment in my future. I look forward to representing the All Builders Welcome community at AWS re:Invent and making the most of this experience.&lt;/p&gt;

&lt;h3&gt;
  
  
  Let's Connect!
&lt;/h3&gt;

&lt;p&gt;As I prepare to embark on this incredible journey, I want to invite you all to join me. If you're attending AWS re:Invent 2023, I would be delighted to connect with you. Let's learn, share, and innovate together.&lt;/p&gt;

&lt;p&gt;Stay tuned for live updates from the event, where I will be sharing key takeaways, exciting announcements, and my personal reflections on this journey of knowledge and discovery.&lt;/p&gt;

&lt;p&gt;The future of technology is unfolding before us, and I can't wait to be a part of it at AWS re:Invent 2023!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>techtalks</category>
      <category>cloud</category>
      <category>cloudskills</category>
    </item>
    <item>
      <title>AWS Solutions Architect Professional Exam Preparation Guide</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 10 Dec 2022 07:14:04 +0000</pubDate>
      <link>https://forem.com/payalgupta4639/aws-solutions-architect-professional-exam-preparation-guide-49in</link>
      <guid>https://forem.com/payalgupta4639/aws-solutions-architect-professional-exam-preparation-guide-49in</guid>
      <description>&lt;p&gt;This blog post will cover:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Brief overview of AWS Solutions Architect Professional Exam&lt;/li&gt;
&lt;li&gt;Who should take this exam?&lt;/li&gt;
&lt;li&gt;Some useful tips &amp;amp; Resources to crack the exam (based on experience)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's get started...&lt;/p&gt;

&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;This is a professional-level exam, considered the toughest of all the AWS certifications. It requires extensive knowledge and the ability to craft solutions to complex problems that maximize security, cost efficiency, and performance, and automate manual procedures.&lt;/p&gt;

&lt;p&gt;The exam includes 75 questions, either multiple choice or multiple response, to be completed in 180 minutes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Who Should take this exam?
&lt;/h2&gt;

&lt;p&gt;This exam is intended for individuals with two or more years of hands-on experience designing and deploying cloud architecture on AWS to provide better solutions to complex problems.&lt;/p&gt;

&lt;p&gt;You would need a deeper understanding of the AWS services in terms of how these services work together along with their best practices and useful benefits over one another in order to choose the best option for the given use case.&lt;/p&gt;

&lt;h2&gt;
  
  
  Some useful tips &amp;amp; resources to crack this exam
&lt;/h2&gt;

&lt;p&gt;I recently took this exam and was able to crack it by following the practices below. I hope they help you too.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Do hands-on labs&lt;/strong&gt;&lt;br&gt;
Take any good course available on Udemy, Whizlabs, Cloud Academy, etc. &lt;br&gt;
I took the &lt;a href="https://cloudacademy.com/learning-paths/solutions-architect-professional-certification-preparation-for-aws-2019-377/"&gt;Cloud Academy course&lt;/a&gt;, which helped me a lot.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Mock test series&lt;/strong&gt;&lt;br&gt;
Make sure to take at least two mock tests a week for practice. I used the mock test series by Stephane on &lt;a href="https://www.udemy.com/course/practice-exam-aws-certified-solutions-architect-professional/"&gt;Udemy&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Read AWS Services FAQs and Whitepapers&lt;/strong&gt;&lt;br&gt;
Many questions come from the FAQs and whitepapers, as these resources cover how best to implement AWS services and how one service differs from another in a particular use-case. So, I would highly suggest going through them at least once.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Learn about AWS Services best practices&lt;/strong&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Practice Time and Focus Management&lt;/strong&gt;&lt;br&gt;
This is crucial to cracking the exam. One thing that absolutely helped me was the daily routine I created for exam preparation. &lt;br&gt;
Block your calendar or set a reminder to dedicate some focused time solely to exam preparation, such as taking mock tests or reading through whitepapers. To learn more, check out &lt;a href="https://youtu.be/KoAztvttZwM"&gt;this&lt;/a&gt; video.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That's all for now. Thanks!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>architecture</category>
      <category>cloudskills</category>
      <category>cloud</category>
    </item>
    <item>
      <title>AWS Solutions Architect Associate Exam Preparation Guide</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 01 Oct 2022 12:45:48 +0000</pubDate>
      <link>https://forem.com/payalgupta4639/aws-solutions-architect-associate-exam-preparation-guide-381l</link>
      <guid>https://forem.com/payalgupta4639/aws-solutions-architect-associate-exam-preparation-guide-381l</guid>
      <description>&lt;p&gt;This blog will cover some useful resources and the exam preparation plan to help you pass the AWS SAA certification.&lt;/p&gt;

&lt;p&gt;First thing to mention: do not consider this exam to be easy :) &lt;/p&gt;

&lt;p&gt;AWS SAA is one of the most popular and challenging exams in the cloud domain, and it helps you gain knowledge and boost your confidence in using AWS cloud services. So, it’s great that you are planning to take this exam and up-skill your cloud knowledge. &lt;/p&gt;

&lt;p&gt;Now, let’s talk about the resources to prepare for it. As mentioned, this exam is not easy; it will test how good you are with AWS services in terms of their usage, best practices, and designing and implementing distributed systems on AWS. Below are some of the resources I found helpful.&lt;/p&gt;

&lt;h2&gt;
  
  
  Courses:
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Udemy&lt;/strong&gt;: &lt;a href="https://www.udemy.com/course/aws-certified-solutions-architect-associate-saa-c03/"&gt;Ultimate AWS Certified Solutions Architect Associate (SAA)&lt;/a&gt; by Stephane Maarek&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Udemy&lt;/strong&gt;: &lt;a href="https://www.udemy.com/course/aws-certified-solutions-architect-associate-hands-on/"&gt;AWS Certified Solutions Architect Associate Training SAA-C03&lt;/a&gt; by Neal Davis&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Cloud Academy&lt;/strong&gt;: &lt;a href="https://cloudacademy.com/learning-paths/aws-solutions-architect-associate-saa-c02-certification-preparation-954/"&gt;https://cloudacademy.com/learning-paths/aws-solutions-architect-associate-saa-c02-certification-preparation-954/&lt;/a&gt;  &lt;/p&gt;

&lt;h2&gt;
  
  
  Practice Tests:
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Whizlabs&lt;/strong&gt;: &lt;a href="https://www.whizlabs.com/aws-solutions-architect-associate/"&gt;https://www.whizlabs.com/aws-solutions-architect-associate/&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Udemy&lt;/strong&gt;: &lt;a href="https://www.udemy.com/course/aws-certified-solutions-architect-associate-amazon-practice-exams-saa-c03/"&gt;https://www.udemy.com/course/aws-certified-solutions-architect-associate-amazon-practice-exams-saa-c03/&lt;/a&gt; by Jon Bonzo&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sample Example Questions by AWS:&lt;/strong&gt; &lt;a href="https://d1.awsstatic.com/training-and-certification/docs-sa-assoc/AWS-Certified-Solutions-Architect-Associate_Sample-Questions.pdf"&gt;here&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS FAQs:&lt;/strong&gt; &lt;a href="https://d1.awsstatic.com/training-and-certification/ramp-up_guides/Ramp-Up_Guide_Architect.pdf"&gt;here&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS Whitepapers &amp;amp; Guides:&lt;/strong&gt; &lt;a href="https://aws.amazon.com/whitepapers/?whitepapers-main.sort-by=item.additionalFields.sortDate&amp;amp;whitepapers-main.sort-order=desc&amp;amp;awsf.whitepapers-content-type=*all&amp;amp;awsf.whitepapers-tech-category=*all&amp;amp;awsf.whitepapers-industries=*all&amp;amp;awsf.whitepapers-business-category=*all&amp;amp;awsf.whitepapers-global-methodology=*all"&gt;here&lt;/a&gt;&lt;br&gt;
Some of the important white papers are –&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Overview of Amazon Web Services&lt;/li&gt;
&lt;li&gt;AWS Well-Architected Framework&lt;/li&gt;
&lt;li&gt;Architecting for the Cloud: AWS Best Practices&lt;/li&gt;
&lt;li&gt;AWS Security Best Practices&lt;/li&gt;
&lt;li&gt;AWS Storage Services Overview&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;AWS Ramp-Up Guide: Architect:&lt;/strong&gt; &lt;a href="https://d1.awsstatic.com/training-and-certification/ramp-up_guides/Ramp-Up_Guide_Architect.pdf"&gt;here&lt;/a&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  Learning Path
&lt;/h2&gt;

&lt;p&gt;On average, 35-40 hours, or 6-8 weeks, should be enough to prepare for the AWS SAA exam (if you have some prior AWS experience). &lt;/p&gt;

&lt;p&gt;If you are just beginning with AWS, then you might need approximately 50-60 hours, or three months, to prepare.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Note: These are general recommendations, not a hard constraint. Feel free to go at whatever pace suits you better.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Below is the 3-months learning path for beginners:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Month 1:&lt;/strong&gt; Get familiar with AWS Services and their best practices&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Course/YouTube videos - 10 hours&lt;/li&gt;
&lt;li&gt;Practice labs/hands-on - 10 hours&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Month 2:&lt;/strong&gt; Deep Dive! Learn about different use-cases and how to implement them using AWS Services such as which service would be the best fit in the given particular scenario and why&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Course/YouTube videos/AWS Ramp-Up Guide: Architect - 10 hours&lt;/li&gt;
&lt;li&gt;Read AWS Whitepapers/FAQs - 5 hours&lt;/li&gt;
&lt;li&gt;Hands-on - 5 hours&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Month 3:&lt;/strong&gt; Practice mock tests!! Practice questions are extremely important for passing the exam. Take any one of the above-mentioned practice test series and try to go through all the test questions. Check the detailed explanations provided to understand why a particular option was the right one; clearing up your concepts this way will help you perform better in the exam.&lt;/p&gt;

&lt;p&gt;Hope it helps. Good Luck!!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>saas</category>
      <category>security</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Let's start with AWS Lambda</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Tue, 15 Feb 2022 18:28:40 +0000</pubDate>
      <link>https://forem.com/aws-builders/lets-start-with-aws-lambda-n6j</link>
      <guid>https://forem.com/aws-builders/lets-start-with-aws-lambda-n6j</guid>
      <description>&lt;p&gt;This post will cover the following content:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What is AWS Lambda?&lt;/li&gt;
&lt;li&gt;Why should you use AWS Lambda?&lt;/li&gt;
&lt;li&gt;How to use AWS Lambda?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let's get started...&lt;/p&gt;

&lt;h2&gt;
  
  
  What is AWS Lambda?
&lt;/h2&gt;

&lt;p&gt;Lambda is a highly available, serverless, event-driven compute service that lets you run code without provisioning or managing servers or clusters. You can trigger Lambda from over 200 AWS services and software as a service (SaaS) applications, and only pay for what you use.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why should you use AWS Lambda?
&lt;/h2&gt;

&lt;p&gt;Lambda is best suited for shorter, event-driven workloads, since Lambda functions run for up to 15 minutes per invocation. &lt;/p&gt;

&lt;p&gt;Also, when you use Lambda, you are only responsible for your code; Lambda takes care of the rest, i.e., balancing memory, CPU, network, and other resources to run your code. &lt;/p&gt;

&lt;p&gt;This would mean that you cannot log in to compute instances or customize the operating system on provided runtimes because Lambda will perform operational and administrative activities on your behalf, including managing capacity, monitoring, and logging your Lambda functions. &lt;/p&gt;

&lt;p&gt;If you are looking to manage your own compute resources, you can use Amazon EC2 or AWS Elastic Beanstalk as per your requirements.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to use AWS Lambda?
&lt;/h2&gt;

&lt;p&gt;You can create, invoke, and manage your Lambda functions using any of the following interfaces:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS Management Console&lt;/strong&gt; – Provides a web interface for you to access your functions. &lt;br&gt;
&lt;strong&gt;AWS Command Line Interface (AWS CLI)&lt;/strong&gt; – Provides commands for a broad set of AWS services, including Lambda, and is supported on Windows, macOS, and Linux. &lt;br&gt;
&lt;strong&gt;AWS SDKs&lt;/strong&gt; – Provide language-specific APIs and manage many of the connection details, such as signature calculation, request retry handling, and error handling. &lt;br&gt;
&lt;strong&gt;AWS CloudFormation&lt;/strong&gt; – Enables you to create templates that define your Lambda applications. &lt;br&gt;
&lt;strong&gt;AWS Serverless Application Model (AWS SAM)&lt;/strong&gt; – Provides templates and a CLI to configure and manage AWS serverless applications. &lt;/p&gt;

&lt;p&gt;For demo purposes, I will be using the AWS Management Console to get started with the Lambda service.&lt;/p&gt;

&lt;p&gt;So, let's create our first Lambda function using console...&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to Lambda Service in the console&lt;/li&gt;
&lt;li&gt;Go to Functions page&lt;/li&gt;
&lt;li&gt;Click Create Function. You should see the below flyout.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwewdtg569naxgyl8wi4l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwewdtg569naxgyl8wi4l.png" alt="create function" width="800" height="453"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, in this flyout, there are various options that we should understand. Let me help you with that.&lt;/p&gt;

&lt;p&gt;First, you need to choose one of the following options to create a function:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Author from scratch&lt;/strong&gt;: Start with a simple "Hello World" example; Lambda creates the default code for you.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Use a blueprint&lt;/strong&gt;: This option is to create a Lambda application with sample code and setup configuration for typical scenarios.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Container image&lt;/strong&gt;: Choose this option if you have a container image which you would like to use to deploy a Lambda function&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Browse serverless app repository&lt;/strong&gt;: Choose this option to deploy a serverless application from the AWS Serverless Application Repository.&lt;/p&gt;

&lt;p&gt;For now, I will go with the first option, "Author from scratch", and fill in the basic information:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Function name&lt;/strong&gt;: Enter a name that describes the purpose of your function. I've entered "Lambda-demo".&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Runtime&lt;/strong&gt;: Choose the language to use to write your function. Note that Lambda provides runtimes for .NET (PowerShell, C#), Go, Java, Node.js, Python, and Ruby. I've selected "Node.js 14.x". &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Architecture&lt;/strong&gt;: Choose the instruction set architecture you want for your function code. I'm keeping the default, i.e., x86_64.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Permissions&lt;/strong&gt;: By default, Lambda will create an execution role with permissions to upload logs to Amazon CloudWatch Logs. You can customize this default role later when adding triggers.&lt;/p&gt;

&lt;p&gt;You have three options to choose from for the Execution role:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Create a new role with basic Lambda permissions - the default option&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use an existing role - choose this option if you have an existing IAM role you want to use &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a new role from AWS policy templates - use this option if you would like to use an AWS-provided policy template for various services &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I am going with the default option, i.e., "Create a new role with basic Lambda permissions".&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;After filling in all the above details, choose "Create function" in the bottom-right corner.&lt;/li&gt;
&lt;/ol&gt;
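&lt;p&gt;The console flow above can also be scripted with the AWS CLI. Here is a rough sketch; the handler file, zip name, and role ARN are placeholders I'm assuming for illustration, not values generated by the console:&lt;/p&gt;

```shell
# Package a one-file Node.js handler and create the function from the CLI.
# "Lambda-demo" matches the function name used above; the role ARN is a
# placeholder for the execution role that the console creates for you.
zip function.zip index.js

aws lambda create-function \
  --function-name Lambda-demo \
  --runtime nodejs14.x \
  --architectures x86_64 \
  --handler index.handler \
  --zip-file fileb://function.zip \
  --role arn:aws:iam::123456789012:role/Lambda-demo-role
```

This requires AWS credentials with Lambda and IAM permissions configured for the CLI.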

&lt;p&gt;Now, let's invoke the Lambda function which you created...&lt;/p&gt;

&lt;h3&gt;
  
  
  Steps to Invoke Lambda Function:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Under Functions, select the Lambda function that you want to test.&lt;/li&gt;
&lt;li&gt;Go to Actions in the top-right corner -&amp;gt; select 'Test' from the drop-down.&lt;/li&gt;
&lt;li&gt;You can invoke your Lambda function with a test event. You can either choose a template that matches the service that triggers your function, or enter your event document in JSON.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I will be using the 'Hello World' event template provided by AWS.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;So, choose 'New event'&lt;/li&gt;
&lt;li&gt;For template, select 'Hello World' from the drop down&lt;/li&gt;
&lt;li&gt;The template should look like this:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "key1": "value1",
  "key2": "value2",
  "key3": "value3"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Provide a name for your test event, e.g., 'demo-event' &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fugwwe16hjfp8rlhuekb3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fugwwe16hjfp8rlhuekb3.png" alt="event" width="800" height="262"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now, click 'Save changes' and choose 'Test'&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;You should see the execution results as shown below:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8uuy9uvtgqmgzcmudf5b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8uuy9uvtgqmgzcmudf5b.png" alt="Execution results" width="800" height="349"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the summary section, we can see the 200 OK response, memory used, billed duration, total duration, etc. for the Lambda function.&lt;/p&gt;
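&lt;p&gt;For reference, the same test invocation can be reproduced from the AWS CLI. This is a sketch assuming the function name used in this demo; the event and output file names are just examples:&lt;/p&gt;

```shell
# Save the hello-world test event and invoke the function synchronously.
# response.json will contain the handler's return value.
echo '{"key1": "value1", "key2": "value2", "key3": "value3"}' > event.json

aws lambda invoke \
  --function-name Lambda-demo \
  --payload fileb://event.json \
  response.json
```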

&lt;p&gt;Further, under Log output, you can see the logs generated for the test event, or you can click the 'Click here' link, which will take you to the same logs in the corresponding CloudWatch log group.&lt;/p&gt;

&lt;h3&gt;
  
  
  Lambda Monitoring
&lt;/h3&gt;

&lt;p&gt;You can view the monitoring details of a Lambda function under the 'Monitor' tab.&lt;/p&gt;

&lt;p&gt;Lambda sends runtime metrics for your functions to Amazon CloudWatch. It logs all requests handled by your function and automatically stores the logs generated by your code in Amazon CloudWatch Logs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpg69plbw82r3mlocgyq7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpg69plbw82r3mlocgyq7.png" alt="lambda-metrics" width="800" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc847glffvf4wttazjkad.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc847glffvf4wttazjkad.png" alt="lambda-logs" width="800" height="436"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see in the screenshots, you can also view these logs under CloudWatch console by clicking 'View logs in CloudWatch' option.&lt;/p&gt;
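&lt;p&gt;If you prefer the CLI, the same logs and metrics can be pulled directly. A sketch, assuming the default log group name that Lambda creates for this function (GNU date syntax is used for the time window):&lt;/p&gt;

```shell
# Tail the function's CloudWatch log group (Lambda names it /aws/lambda/FUNCTION_NAME).
aws logs tail /aws/lambda/Lambda-demo --since 1h

# Pull the invocation count for the last hour from CloudWatch metrics.
aws cloudwatch get-metric-statistics \
  --namespace AWS/Lambda \
  --metric-name Invocations \
  --dimensions Name=FunctionName,Value=Lambda-demo \
  --start-time "$(date -u -d '1 hour ago' +%Y-%m-%dT%H:%M:%SZ)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --period 300 \
  --statistics Sum
```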

&lt;h3&gt;
  
  
  Execution Role
&lt;/h3&gt;

&lt;p&gt;Finally, you can check the details of the execution role that grants the function permission to access AWS services and resources. In our demo, this role was created along with the Lambda function above. &lt;/p&gt;

&lt;p&gt;To check the execution role details of your Lambda function, go to the 'Configuration' tab -&amp;gt; then select 'Permissions' from the left pane. You should see the details as follows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxcn3xyfpij050lhdujev.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxcn3xyfpij050lhdujev.png" alt="Execution-role" width="800" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;View the resources and actions that your function has permission to access by choosing the 'By action' or 'By resource' view. &lt;/p&gt;
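&lt;p&gt;The execution role can also be inspected from the CLI. A sketch assuming the demo function name; the role name in the second command is a placeholder for whatever name the console generated:&lt;/p&gt;

```shell
# Look up which IAM role the function assumes at runtime...
aws lambda get-function-configuration \
  --function-name Lambda-demo \
  --query Role --output text

# ...then list the managed policies attached to that role (role name is illustrative).
aws iam list-attached-role-policies --role-name Lambda-demo-role-abc123
```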

&lt;p&gt;That's all I wanted to cover in this blog. Hope it was helpful. Thank you.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>cloudnative</category>
      <category>cloudskills</category>
    </item>
    <item>
      <title>Working with Placement groups in Amazon EC2</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 05 Feb 2022 20:12:16 +0000</pubDate>
      <link>https://forem.com/aws-builders/working-with-placement-groups-in-amazon-ec2-2eg3</link>
      <guid>https://forem.com/aws-builders/working-with-placement-groups-in-amazon-ec2-2eg3</guid>
      <description>&lt;p&gt;This blog will cover the following content:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What is a Placement Group in Amazon EC2?&lt;/li&gt;
&lt;li&gt;What are the benefits of using placement groups?&lt;/li&gt;
&lt;li&gt;How to create a placement group and launch instances in it?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let's get started...&lt;/p&gt;

&lt;h2&gt;
  
  
  What is a Placement Group in Amazon EC2?
&lt;/h2&gt;

&lt;p&gt;A placement group is a way to influence the placement of interdependent EC2 instances to suit your workload requirements. AWS provides three placement strategies that you can choose from based on the type of your workload:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Cluster Placement Groups&lt;/strong&gt;: A logical grouping of instances within a single AZ.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Partition Placement Groups&lt;/strong&gt;: Logical partition of instance groups such that no two partitions within a placement group share the same underlying hardware. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Spread Placement Groups&lt;/strong&gt;: Each instance within a spread placement group is placed on a distinct rack. &lt;/li&gt;
&lt;/ol&gt;
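&lt;p&gt;For reference, the three strategies map directly onto the placement-group strategy flag in the EC2 API. A minimal CLI sketch (group names are just examples):&lt;/p&gt;

```shell
# One placement group per strategy; the group names are illustrative.
aws ec2 create-placement-group --group-name my-cluster-pg   --strategy cluster
aws ec2 create-placement-group --group-name my-spread-pg    --strategy spread
# Partition groups additionally accept a partition count (up to 7 per AZ).
aws ec2 create-placement-group --group-name my-partition-pg --strategy partition --partition-count 3
```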

&lt;h2&gt;
  
  
  What are the benefits of using placement groups?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Cluster Placement group benefits:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt; Recommended for low network latency, and/or high network throughput applications. &lt;/li&gt;
&lt;li&gt; Limited to a single AZ&lt;/li&gt;
&lt;li&gt; Can span peered VPCs in the same Region&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Partition Placement Group benefits:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt; Reduces the impact of correlated hardware failures for your application&lt;/li&gt;
&lt;li&gt; Mainly used to deploy large distributed and replicated workloads, such as HDFS, HBase, and Cassandra, across distinct racks.&lt;/li&gt;
&lt;li&gt; Can have partitions in multiple Availability Zones in the same Region. &lt;/li&gt;
&lt;li&gt; Offers visibility into the partitions, so you can check which instance is in which partition. 
Topology-aware applications such as HDFS, HBase, and Cassandra use this information to make intelligent data-replication decisions that increase data availability and durability.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Spread Placement Group benefits:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt; Recommended for applications that have a small number of critical instances that should be kept separate from each other. &lt;/li&gt;
&lt;li&gt; Reduces the risk of simultaneous failures that can occur when instances share the same rack, which spread placement groups avoid&lt;/li&gt;
&lt;li&gt; Can span multiple Availability Zones in the same Region.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How to create a placement group and launch instances in it?
&lt;/h2&gt;

&lt;p&gt;Before using placement groups, I would suggest going through the &lt;a href="https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/placement-groups.html#concepts-placement-groups" rel="noopener noreferrer"&gt;rules and limitations&lt;/a&gt; of placement groups for awareness. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;To create a placement group&lt;/strong&gt;,  &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to Amazon EC2 console&lt;/li&gt;
&lt;li&gt;In the left pane, go to Network &amp;amp; Security -&amp;gt; choose Placement Groups -&amp;gt; Create placement group.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnibpir9ud50p5kjmco5w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnibpir9ud50p5kjmco5w.png" alt="create placement group"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Fill in the details in the flyout&lt;br&gt;
&lt;strong&gt;Name&lt;/strong&gt;: Specify the name of your placement group&lt;br&gt;
&lt;strong&gt;Placement strategy&lt;/strong&gt;: choose the strategy from the drop-down&lt;br&gt;
&lt;strong&gt;Tags&lt;/strong&gt;: Optionally assign tag values to the placement group&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click Create Group&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In my account, I created 3 placement groups with cluster, spread, and partition strategies as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fli8xwlfbm2ma5s6jdwwe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fli8xwlfbm2ma5s6jdwwe.png" alt="placement groups"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Launch instance in placement group&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to EC2 console -&amp;gt; Instances&lt;/li&gt;
&lt;li&gt;Click launch instances in the top right corner&lt;/li&gt;
&lt;li&gt;Launch instance with the following steps:&lt;/li&gt;
&lt;li&gt; Step 1: Choose AMI as per your requirements &lt;/li&gt;
&lt;li&gt; Step 2: Choose the instance type, keeping in mind the limitations of your placement group. Example: you cannot launch t2 instance types in a cluster placement group, because burstable performance instances such as T2 are not supported there. 
Hence, make sure to choose an instance type that is supported by the placement group in which you plan to launch it; otherwise, you will receive an error message as shown in the screenshot below.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxexyqsw71q8dxac0neel.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxexyqsw71q8dxac0neel.png" alt="error message"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt; Step 3: Configure Instance Details
This is the step where you will specify the instance details required for your placement group.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Number of instances&lt;/strong&gt;: Enter the total number of instances that you need in this placement group, because you might not be able to add instances to the placement group later.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Placement group&lt;/strong&gt;: Select the "Add instance to placement group" check box.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Placement group name&lt;/strong&gt;: You can choose to add the instances to an existing placement group or to a new placement group that you create.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Placement group strategy&lt;/strong&gt;: Choose the appropriate strategy. &lt;br&gt;
If you choose partition, for Target partition, choose Auto distribution to have Amazon EC2 make a best effort to distribute the instances evenly across all the partitions in the group. Alternatively, you can specify the partition in which to launch the instances.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flwozvviwp432fd8k2zye.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flwozvviwp432fd8k2zye.png" alt="configure details"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt; Step 4: Add Storage to your instances&lt;/li&gt;
&lt;li&gt; Step 5: Add tags such as Name tag&lt;/li&gt;
&lt;li&gt; Step 6: Configure Security group for your instances&lt;/li&gt;
&lt;li&gt; Step 7: Review and Launch&lt;/li&gt;
&lt;li&gt; If everything looks fine, click Launch in the bottom-right corner to launch the instances.&lt;/li&gt;
&lt;/ul&gt;
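&lt;p&gt;The launch steps above can be condensed into a single CLI call. A sketch with a placeholder AMI ID and placement group name; the instance type must be one the group supports:&lt;/p&gt;

```shell
# Launch 3 instances into a partition placement group in one call.
# The AMI ID and group name below are placeholders for your own values.
aws ec2 run-instances \
  --image-id ami-0123456789abcdef0 \
  --instance-type m5.large \
  --count 3 \
  --placement "GroupName=my-partition-pg"
```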

&lt;p&gt;I created 3 instances in partition placement group as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3lu9gylngww2s3c3sbg4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3lu9gylngww2s3c3sbg4.png" alt="ec2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Additionally, to check the placement group details of your instance, select the instance -&amp;gt; go to the Details section of the instance and scroll down to the Host &amp;amp; placement group section. There you can find the placement group name, partition number, etc. &lt;/p&gt;
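&lt;p&gt;The same placement details are available from the CLI; the instance ID below is a placeholder:&lt;/p&gt;

```shell
# Show the placement group, partition number, and AZ for one instance.
aws ec2 describe-instances \
  --instance-ids i-0123456789abcdef0 \
  --query "Reservations[].Instances[].Placement"
```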

&lt;p&gt;Hope this information is helpful. Thank you.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>architecture</category>
      <category>node</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Connect with Couchbase Capella over Private network created using AWS VPC Peering</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sun, 23 Jan 2022 10:10:33 +0000</pubDate>
      <link>https://forem.com/aws-builders/connect-with-couchbase-capella-over-private-network-created-using-aws-vpc-peering-51aj</link>
      <guid>https://forem.com/aws-builders/connect-with-couchbase-capella-over-private-network-created-using-aws-vpc-peering-51aj</guid>
      <description>&lt;p&gt;This blog will cover the following things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What is Private Network in Couchbase Capella?&lt;/li&gt;
&lt;li&gt;What is AWS VPC Peering and how Private network is created using it?&lt;/li&gt;
&lt;li&gt;How to connect with Capella over private network?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let's get started...&lt;/p&gt;

&lt;h3&gt;
  
  
  What is Private Network in Couchbase Capella?
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.couchbase.com/cloud/index.html"&gt;Capella&lt;/a&gt; is a fully managed Database as a Service (DBaaS) offered by Couchbase which provides easiest and fastest way to begin with Couchbase database and eliminate your database management efforts. You can easily deploy a clustered database in the public cloud such as AWS/Azure/GCP using Couchbase Capella.&lt;/p&gt;

&lt;p&gt;Private Network is a feature provided by Capella that lets you connect your application to your Couchbase Capella cluster over a private connection. It gives you a more secure connection with lower latency and reduced data-egress costs.  &lt;/p&gt;

&lt;h3&gt;
  
  
  What is AWS VPC Peering and how Private network is created using it?
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://docs.aws.amazon.com/vpc/latest/peering/what-is-vpc-peering.html"&gt;AWS VPC Peering&lt;/a&gt; is the private network connection created between two VPCs that can be in the same or another AWS account, to route traffic between them using private IPv4 addresses or IPv6 addresses and access or share the resources created in those VPCs as if all the resources are a part of the same network.&lt;/p&gt;

&lt;p&gt;Couchbase Capella uses AWS VPC Peering to create a Private Network connection between your application and the Capella Cluster.&lt;/p&gt;

&lt;h3&gt;
  
  
  How to connect with Capella over private network?
&lt;/h3&gt;

&lt;p&gt;Let me help you with the step-by-step guidelines to create a Private Network connection in Capella below:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: Before beginning, please make sure of the following things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Route53 should be enabled on your AWS account&lt;/li&gt;
&lt;li&gt;Your application VPC and the Couchbase Capella cluster VPC must have different CIDR blocks. 
If the two VPCs have overlapping CIDR blocks, the peering connection cannot be set up due to this &lt;a href="https://docs.aws.amazon.com/vpc/latest/peering/invalid-peering-configurations.html#overlapping-cidr"&gt;limitation of VPC peering&lt;/a&gt;. &lt;/li&gt;
&lt;/ol&gt;
&lt;/blockquote&gt;

&lt;p&gt;So, let's get started with the setup...&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Log in to the Couchbase Capella console and &lt;a href="https://docs.couchbase.com/cloud/get-started/create-account.html"&gt;create a cluster&lt;/a&gt; using Couchbase’s Cloud Account option for free. &lt;br&gt;
You can create the cluster using your own cloud account as well. However, I am sharing the steps using the Couchbase Cloud Account option, so it will be easier for you to follow.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Create a private network between the Capella cluster VPC and your own VPC in which your application resides. Steps are as follows:&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Go to Clusters in the left pane&lt;/li&gt;
&lt;li&gt;Go to Connect tab &amp;gt; Virtual Network &amp;gt; Manage Private Network&lt;/li&gt;
&lt;li&gt;On the top-right corner, click on Setup Private Network&lt;/li&gt;
&lt;li&gt;Confirm the prerequisites: 
Route53 enabled
Virtual network peering enabled&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enter the following details&lt;br&gt;
Name - Your Private Network name which will be visible on the Capella UI&lt;br&gt;
AWS Account ID - Your AWS Account ID in which your application VPC resides&lt;br&gt;
Virtual Network ID - Your application VPC ID &lt;br&gt;
Available Regions - AWS Region in which your application VPC exists&lt;br&gt;
CIDR Block - CIDR block of your app VPC&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Now, run the commands shown on your Capella UI&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;aws ec2 accept-vpc-peering-connection --region=&amp;lt;&amp;gt; --vpc-peering-connection-id=&amp;lt;pcx-xxxxxxxxxx&amp;gt;&lt;/code&gt; -&amp;gt; to accept the VPC peering connection (this can also be done via the AWS console)&lt;/p&gt;

&lt;p&gt;&lt;code&gt;aws route53 associate-vpc-with-hosted-zone --hosted-zone-id=&amp;lt;&amp;gt; --vpc=VPCId=&amp;lt;&amp;gt;,VPCRegion=&amp;lt;&amp;gt; --region=&amp;lt;&amp;gt;&lt;/code&gt;  -&amp;gt; to associate your VPC with the hosted zone in Route53&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;By now, the peering connection between the Capella VPC and your application VPC has been created. The next step is to add routes to your app VPC's route table so the two networks can communicate.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The Capella cluster VPC CIDR can be found in the AWS VPC peering console: &lt;br&gt;
-&amp;gt; Login to your AWS account in which app VPC resides&lt;br&gt;
-&amp;gt; Go to VPC peering console&lt;br&gt;
-&amp;gt; Search for the peering connection using the ID provided in above commands&lt;br&gt;
-&amp;gt; You should see the requester VPC details, copy the Requester CIDRs&lt;br&gt;
-&amp;gt; Now, go to your app VPC's route table&lt;br&gt;
-&amp;gt; Go to Routes tab &amp;gt; Click on edit route&lt;br&gt;
-&amp;gt; Add Capella VPC CIDR in destination and select peering connection as target&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The private network setup is complete at this point. Now it's time to test whether the connection is working.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;You can use the below commands to test the private connectivity with Capella Cluster.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
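&lt;p&gt;The route-table step above can also be done from the CLI; the route table ID, destination CIDR, and peering connection ID below are placeholders for the values from your own setup:&lt;/p&gt;

```shell
# Route traffic destined for the Capella VPC CIDR through the peering connection.
aws ec2 create-route \
  --route-table-id rtb-0123456789abcdef0 \
  --destination-cidr-block 10.0.112.0/20 \
  --vpc-peering-connection-id pcx-0123456789abcdef0
```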

&lt;p&gt;nslookup output for DNS resolution:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[ec2-user@ip-192-0-0-4 ~]$ nslookup -type=SRV _couchbases._tcp.cb.uvbaw6f5kvhmun7s.cloud.couchbase.com
Server:  192.0.0.2
Address:    192.0.0.2#53

Non-authoritative answer:
_couchbases._tcp.cb.uvbaw6f5kvhmun7s.cloud.couchbase.com    service = 0 0 11207 yk9iixsbth4mj5uf.uvbaw6f5kvhmun7s.cloud.couchbase.com.
_couchbases._tcp.cb.uvbaw6f5kvhmun7s.cloud.couchbase.com    service = 0 0 11207 20pcyksdifyr2r2s.uvbaw6f5kvhmun7s.cloud.couchbase.com.
_couchbases._tcp.cb.uvbaw6f5kvhmun7s.cloud.couchbase.com    service = 0 0 11207 cxq21w9wmkbl90em.uvbaw6f5kvhmun7s.cloud.couchbase.com.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Netcat or telnet command to test connectivity&lt;/p&gt;

&lt;p&gt;telnet output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[ec2-user@ip-192-0-0-4 ~]$ telnet yk9iixsbth4mj5uf.uvbaw6f5kvhmun7s.cloud.couchbase.com. 18091
Trying 10.0.113.52...
Connected to yk9iixsbth4mj5uf.uvbaw6f5kvhmun7s.cloud.couchbase.com..
Escape character is '^]'.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;netcat output:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[ec2-user@ip-192-0-0-4 ~]$ nc -v 20pcyksdifyr2r2s.uvbaw6f5kvhmun7s.cloud.couchbase.com. 11207
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connected to 10.0.112.74:11207.
^C
[ec2-user@ip-192-0-0-4 ~]$ nc -v yk9iixsbth4mj5uf.uvbaw6f5kvhmun7s.cloud.couchbase.com. 11207
Ncat: Version 7.50 ( https://nmap.org/ncat )
Ncat: Connected to 10.0.113.52:11207.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Traceroute output to confirm the path taken:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[ec2-user@ip-192-0-0-4 ~]$ sudo traceroute 20pcyksdifyr2r2s.uvbaw6f5kvhmun7s.cloud.couchbase.com. -T -p 18091
traceroute to 20pcyksdifyr2r2s.uvbaw6f5kvhmun7s.cloud.couchbase.com. (10.0.112.74), 30 hops max, 60 byte packets
 1  ip-10-0-112-74.ec2.internal (10.0.112.74)  1.292 ms  1.279 ms  1.271 ms
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Hope this information was helpful. Thank you.&lt;/p&gt;

</description>
      <category>couchbase</category>
      <category>aws</category>
      <category>cluster</category>
      <category>awsvpcpeering</category>
    </item>
    <item>
      <title>Create an Organization in AWS</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 08 Jan 2022 13:37:51 +0000</pubDate>
      <link>https://forem.com/aws-builders/create-an-organization-in-aws-38jb</link>
      <guid>https://forem.com/aws-builders/create-an-organization-in-aws-38jb</guid>
      <description>&lt;p&gt;This post will cover the following content:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What is AWS Organizations?&lt;/li&gt;
&lt;li&gt;Why should you use AWS Organizations?&lt;/li&gt;
&lt;li&gt;How to use AWS Organizations?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let's get started...&lt;/p&gt;

&lt;h2&gt;
  
  
  What is AWS Organizations?
&lt;/h2&gt;

&lt;p&gt;AWS Organizations is a service in the AWS Management &amp;amp; Governance category that helps you centrally manage multiple AWS accounts and govern your environment as you grow and scale your AWS resources. &lt;/p&gt;

&lt;h2&gt;
  
  
  Why should you use it?
&lt;/h2&gt;

&lt;p&gt;You should use AWS Organizations if you would like to: &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Centrally manage your environment across multiple AWS accounts&lt;/strong&gt;&lt;br&gt;
With the help of AWS Organizations, you can&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create new accounts that will automatically be a part of your Organization
&lt;/li&gt;
&lt;li&gt;Invite other AWS Accounts to join your Organization &lt;/li&gt;
&lt;li&gt;Programmatically create new AWS accounts to quickly scale your workloads&lt;/li&gt;
&lt;li&gt;Attach appropriate policies to apply on some or all of the accounts.&lt;/li&gt;
&lt;li&gt;Use consolidated billing feature to consolidate and pay for all member accounts which helps to manage billing and cost centrally. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Manage your Organization&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Group AWS accounts into organizational units (OUs) to easily manage them and govern boundaries with service control policies (SCPs) for your OUs.
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Simplify permission management and access control&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Using AWS SSO (Single Sign-On) and Active Directory, control user-based permissions in your Organization.&lt;/li&gt;
&lt;li&gt;Apply SCPs to control access to AWS Services within OUs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Efficiently share and provision resources across accounts&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use AWS RAM (Resource Access Manager) to share critical resources within your Organization and help reduce resource duplication.&lt;/li&gt;
&lt;li&gt;Use AWS License Manager to centrally manage your software license agreements&lt;/li&gt;
&lt;li&gt;Use AWS Service Catalog to easily share a catalog of IT services and custom products across accounts&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Manage costs and optimize usage&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS Organizations provides consolidated billing, which enables the management account of your organization to pay for all member accounts and benefit from quantity discounts with a single bill. &lt;/li&gt;
&lt;li&gt;Use AWS Cost Explorer to track resource costs &lt;/li&gt;
&lt;li&gt;Use AWS Compute Optimizer to optimize resource usage&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Audit your environment for compliance&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You can use various AWS services with Organizations to centrally manage the security of your resources, such as:&lt;/li&gt;
&lt;li&gt;AWS CloudTrail to audit all the events in your accounts&lt;/li&gt;
&lt;li&gt;AWS Config to centrally define your recommended configuration criteria across resources, AWS Regions, and accounts &lt;/li&gt;
&lt;li&gt;AWS GuardDuty for threat detection to protect your resources centrally&lt;/li&gt;
&lt;li&gt;AWS Control Tower to establish cross-account security audits, or manage and view policies applied across accounts&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How to use AWS Organizations?
&lt;/h2&gt;

&lt;p&gt;AWS Organizations is a global service, and you can use the AWS console, CLI, or API to create and use it. &lt;/p&gt;

&lt;p&gt;I will be performing the following steps to show how to use this service:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Creating an Organization&lt;/li&gt;
&lt;li&gt;Inviting and adding another AWS account to my Organization&lt;/li&gt;
&lt;li&gt;Creating groups (OUs) within my Organization&lt;/li&gt;
&lt;li&gt;Applying SCPs (service control policies) to a group&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;Note:&lt;br&gt;
I already have two separate AWS accounts. I will be using one as the Management Account (formerly known as the master account) and the other as a Member Account. The management account is the account used to create the Organization, and member accounts are all the other accounts that you invite or create within an Organization.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Steps are as follows:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Login to your Management Account and go to the AWS Organizations console&lt;/li&gt;
&lt;li&gt;Click "Create Organization" in the top right corner&lt;/li&gt;
&lt;li&gt;You should receive a verification email at the email address associated with your Management Account. You need to verify your email address before you can invite other AWS accounts to your Organization.&lt;/li&gt;
&lt;li&gt;Once verified, click "Add an AWS account" option&lt;/li&gt;
&lt;li&gt;Since I have an existing AWS account which I would like to invite in this Organization hence, I will be selecting "Invite an existing AWS account" option. If you do not have an existing account, you can simply create one within the Organization.&lt;/li&gt;
&lt;li&gt;Enter the email address or account ID of the AWS account that you want to invite to this Organization&lt;/li&gt;
&lt;li&gt;Add a message to the owner of the AWS account (optional)&lt;/li&gt;
&lt;li&gt;Add tags to associate with the account (optional)&lt;/li&gt;
&lt;li&gt;Accept the invitation by clicking the link sent to the email address &lt;strong&gt;OR&lt;/strong&gt; log in to the member AWS account -&amp;gt; go to the AWS Organizations console -&amp;gt; select "Invitations" in the left pane -&amp;gt; click "Accept Invitation"&lt;/li&gt;
&lt;li&gt;Now, log back in to the Management account and go to AWS Organizations -&amp;gt; AWS Accounts; you should see the member account added to the Organizational Structure.
The Organizational Structure should look like this:&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

Root
-&amp;gt; management account 
-&amp;gt; member account 


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Initially, both the management and member accounts come directly under the Root.&lt;/p&gt;

&lt;ol start="11"&gt;
&lt;li&gt;Proceed to create groups, i.e., OUs, within the Organization:
Select the "Root" and go to "Actions"
Select "Create New" under Organizational Unit
&lt;em&gt;Note: You cannot delete or rename Root.&lt;/em&gt;
Enter a name for your new OU
Mention tags (optional)
Click "Create Organizational Unit" at the bottom&lt;/li&gt;
&lt;li&gt;Once created, your Organizational Structure should look like this:&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

Root
-&amp;gt; New OU
-&amp;gt; Management Account
-&amp;gt; Member Account


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This shows that Root contains one OU named New OU, the Management account, and the Member account. Currently, New OU is empty.&lt;/p&gt;

&lt;ol start="13"&gt;
&lt;li&gt;You can move the member or management accounts into the newly created OU in order to apply SCPs.
To move an account from one OU to another, follow the steps below:
Select the account that you want to move
Go to "Actions" -&amp;gt; "Move"
Select the OU to which you want to move the account
Click "Move AWS account" at the bottom&lt;/li&gt;
&lt;li&gt;I have created the following Organizational Structure:&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

Root
-&amp;gt; New OU
   -&amp;gt; Member Account
-&amp;gt; Management Account


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;I have moved the member account to the new OU and kept the Management Account directly under Root. Next, I will create an SCP to apply policies to the New OU and the member account.&lt;/p&gt;

&lt;ol start="15"&gt;
&lt;li&gt;Go to "Policies" in the left pane and Click "Service Control Policies" -&amp;gt; "Enable Service control Policies"&lt;/li&gt;
&lt;li&gt;Once enabled, you can create your own SCPs and apply them to member accounts or OUs.
&lt;em&gt;Note: You can apply SCPs only to member accounts in an organization. They have no effect on users or roles in the management account.&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;To create SCP, follow the below steps:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Go to Policies -&amp;gt; Service Control Policies&lt;br&gt;
Click "Create Policy" in the top right&lt;br&gt;
Fill in the details and click "Create Policy" at the bottom&lt;br&gt;
Once created, it should be listed under "Available policies"&lt;br&gt;
Now, select the newly created SCP and attach it to the member accounts or OUs&lt;br&gt;
I have used one of the sample SCPs provided by AWS and attached it to my New OU, which contains my member account.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq4ty0um7g5sm5sdnubcj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq4ty0um7g5sm5sdnubcj.png" alt="SCP"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This SCP will deny access to AWS based on the requested AWS Region. Read more details &lt;a href="https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps_examples_general.html#example-scp-deny-region" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;
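
&lt;p&gt;For reference, a deny-region SCP of this kind looks roughly like the sketch below (adapted from the AWS sample; the allowed regions and the exempted global services are placeholders you would adjust):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyAllOutsideAllowedRegions",
      "Effect": "Deny",
      "NotAction": [
        "iam:*",
        "organizations:*",
        "support:*"
      ],
      "Resource": "*",
      "Condition": {
        "StringNotEquals": {
          "aws:RequestedRegion": [
            "eu-central-1",
            "eu-west-1"
          ]
        }
      }
    }
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The &lt;code&gt;NotAction&lt;/code&gt; list exempts global services (which are not tied to a region) so that the deny does not break them.&lt;/p&gt;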

&lt;p&gt;To test whether the SCP is working, I logged in to my member account and tried to access the EC2 console. Below is the message displayed on the screen:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fltvvu08vb0z4aoisyqcm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fltvvu08vb0z4aoisyqcm.png" alt="unauthorised"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It prevented me from launching an EC2 instance or performing any operations, which confirms that the SCP is working.&lt;/p&gt;

&lt;p&gt;Hope this information helped. Thank you.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>cloudskills</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Create AWS CloudFormation stack using sample template</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sun, 02 Jan 2022 13:31:10 +0000</pubDate>
      <link>https://forem.com/aws-builders/aws-cloudformation-using-lamp-stack-template-1ajj</link>
      <guid>https://forem.com/aws-builders/aws-cloudformation-using-lamp-stack-template-1ajj</guid>
      <description>&lt;p&gt;This post will cover the following content:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;What is AWS CloudFormation?&lt;/li&gt;
&lt;li&gt;Why should you use it?&lt;/li&gt;
&lt;li&gt;How to use it?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Let's get started...&lt;/p&gt;

&lt;h2&gt;
  
  
  What is AWS CloudFormation?
&lt;/h2&gt;

&lt;p&gt;AWS CloudFormation comes under the AWS Management &amp;amp; Governance (M&amp;amp;G) category, and it provisions and configures AWS resources for you.&lt;br&gt;
You just need to create a CloudFormation template describing which AWS resources you would like to configure, along with their properties/settings, and CloudFormation will create them as described.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why should you use it?
&lt;/h2&gt;

&lt;p&gt;You should use AWS CloudFormation if you are looking to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Simplify infrastructure management&lt;/strong&gt;&lt;br&gt;
Instead of creating and managing individual resources in AWS, you can just use a CloudFormation stack to manage a collection of resources as a single unit, saving you time and effort.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Quickly replicate your infrastructure&lt;/strong&gt;&lt;br&gt;
You can reuse your CloudFormation template to provision the same resources in multiple regions without spending time in configuring those resources in each region individually.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Easily control and track changes to your infrastructure&lt;/strong&gt;&lt;br&gt;
CloudFormation templates are nothing but text files, which lets you track changes to your infrastructure just like source code. Using a version control system with your templates, you can easily roll back to a previous version of your infrastructure.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  How to use it?
&lt;/h2&gt;

&lt;p&gt;AWS CloudFormation is a regional service, and you can use it via the browser console, command line tools, or APIs to create stacks and resources.&lt;/p&gt;

&lt;h3&gt;
  
  
  Below are the steps to get started:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Go to the AWS CloudFormation Console &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Click on &lt;code&gt;Create stack&lt;/code&gt; in the top right corner&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Specify the Template. &lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A template is a JSON or YAML file that contains configuration information about the AWS resources you want to include in the stack.&lt;/p&gt;
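
&lt;p&gt;As a minimal sketch (not the LAMP sample used below), a JSON template describing a single S3 bucket could look like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "Minimal example: a single S3 bucket",
  "Resources": {
    "MyBucket": {
      "Type": "AWS::S3::Bucket"
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;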

&lt;p&gt;There are 3 options available for you to choose from:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Template is ready&lt;/strong&gt; - If you have your own template file, then choose this option.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use a sample template&lt;/strong&gt; - If you would like to use a sample template file provided by AWS, you should go with this option.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Create template in Designer&lt;/strong&gt; - If you are new to AWS CloudFormation, you can use this option to create, view, and modify AWS CloudFormation templates using a drag-and-drop interface.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;I am selecting the sample template &lt;code&gt;LAMP Stack&lt;/code&gt; to get started with CloudFormation. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;S3 URL&lt;/strong&gt;: This is the Amazon S3 URL pointing to the location of your CloudFormation template file&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;AWS CloudFormation Sample Template LAMP_Single_Instance: Create a LAMP stack using a single EC2 instance and a local MySQL database for storage. This template demonstrates using the AWS CloudFormation bootstrap scripts to install the packages and files necessary to deploy the Apache web server, PHP and MySQL at instance launch time. &lt;strong&gt;WARNING&lt;/strong&gt; This template creates an Amazon EC2 instance. You will be billed for the AWS resources used if you create a stack from this template.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This is how our template looks in the AWS CloudFormation Designer tool:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsb428falc9gks5vbr99f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsb428falc9gks5vbr99f.png" alt="LAMP stack view in Designer" width="800" height="462"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now, Specify stack details&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Stack name&lt;/strong&gt;: Enter a stack name of your choice; it can include letters (A-Z and a-z), numbers (0-9), and dashes (-).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Parameters&lt;/strong&gt; : You can use the given parameters to input custom values while creating or updating the stack.&lt;br&gt;
&lt;strong&gt;DBName&lt;/strong&gt; : Database name for your MySQL DB &lt;br&gt;
&lt;strong&gt;DBPassword&lt;/strong&gt; : Password for your MySQL DB &lt;br&gt;
&lt;strong&gt;DBRootPassword&lt;/strong&gt; : Root password for MySQL DB &lt;br&gt;
&lt;strong&gt;DBUser&lt;/strong&gt; : User name for your MySQL DB access &lt;br&gt;
&lt;strong&gt;InstanceType&lt;/strong&gt; : Specify the EC2 instance type which you would like to use for your WebServer&lt;br&gt;
&lt;strong&gt;KeyName&lt;/strong&gt; : Specify an existing EC2 KeyPair to enable SSH access to the instance. &lt;br&gt;
Make sure you have an existing EC2 keypair created in the specified region. If you do not have one, please create it using the EC2 console and specify the newly created keypair here otherwise stack creation will fail.&lt;br&gt;
&lt;strong&gt;SSHLocation&lt;/strong&gt; : The IP address range that can be used to SSH to the EC2 instances (by default, it is 0.0.0.0/0)&lt;/p&gt;
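
&lt;p&gt;As a sketch, these parameters could also be supplied non-interactively via the AWS CLI as a JSON file (all values below are hypothetical examples):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[
  { "ParameterKey": "DBName", "ParameterValue": "mydb" },
  { "ParameterKey": "DBUser", "ParameterValue": "dbadmin" },
  { "ParameterKey": "DBPassword", "ParameterValue": "example-password" },
  { "ParameterKey": "DBRootPassword", "ParameterValue": "example-root-password" },
  { "ParameterKey": "InstanceType", "ParameterValue": "t2.micro" },
  { "ParameterKey": "KeyName", "ParameterValue": "my-keypair" },
  { "ParameterKey": "SSHLocation", "ParameterValue": "203.0.113.0/24" }
]
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You could then pass this file to &lt;code&gt;aws cloudformation create-stack&lt;/code&gt; with &lt;code&gt;--parameters file://params.json&lt;/code&gt;.&lt;/p&gt;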

&lt;ul&gt;
&lt;li&gt;Next, Configure Stack Options&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All the given parameters are optional. You can fill them as per your requirements.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tags&lt;/strong&gt; : key-value pairs to apply to resources in your stack&lt;br&gt;
&lt;strong&gt;Permissions&lt;/strong&gt; : Specify IAM role name/ARN. By default, CloudFormation uses permissions based on your user credentials.&lt;br&gt;
&lt;strong&gt;Stack failure Options&lt;/strong&gt; : Specify the behavior when stack creation fails&lt;br&gt;
&lt;strong&gt;Advanced Options&lt;/strong&gt; : You can use these options if you would like to define stack policies for designated resources, setup notification, enable stack termination protection, or configure rollback options. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Now comes the Review step. You can review all the details of your CloudFormation stack and, if everything looks good, click &lt;code&gt;Create stack&lt;/code&gt; at the bottom right corner to initiate stack creation.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;em&gt;PS: At the bottom, there is a &lt;code&gt;Quick-create link&lt;/code&gt; option available for you to use. You can use this URL to create more stacks with the same basic configuration as the one you just created.&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Once the stack creation has completed, you should see the CloudFormation stack status as &lt;code&gt;CREATE_COMPLETE&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ff8jjr021x2cy4x31qp.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ff8jjr021x2cy4x31qp.png" alt="stack details" width="800" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Resources tab&lt;/strong&gt; : You should find all the resources created by the CloudFormation stack under this tab.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzr1swm87kuavv0bwh3wz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzr1swm87kuavv0bwh3wz.png" alt="Stack resource tab" width="800" height="220"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Events tab&lt;/strong&gt; : You can check the CloudFormation stack event details under this tab&lt;br&gt;
&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx81sdthivmoiovam5ii9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx81sdthivmoiovam5ii9.png" alt="Stack event tab" width="800" height="336"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Outputs&lt;/strong&gt; : Under this tab, you should see the website URL for your newly created LAMP stack. &lt;/p&gt;

&lt;p&gt;Hope you find this information helpful. Thank you.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
      <category>cloudskills</category>
      <category>cloudnative</category>
    </item>
    <item>
      <title>Connect to the Capella Cluster from Amazon EC2 running Red Hat Enterprise Linux using Couchbase PHP SDK</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 01 Jan 2022 19:03:10 +0000</pubDate>
      <link>https://forem.com/aws-builders/how-to-connect-capella-cluster-from-amazon-ec2-running-red-hat-enterprise-linux-using-couchbase-php-sdk-54o2</link>
      <guid>https://forem.com/aws-builders/how-to-connect-capella-cluster-from-amazon-ec2-running-red-hat-enterprise-linux-using-couchbase-php-sdk-54o2</guid>
<description>&lt;p&gt;This post provides step-by-step guidelines on how to set up a connection to a Capella Cluster from an Amazon EC2 instance running Red Hat Enterprise Linux [RHEL 8]&lt;/p&gt;

&lt;h2&gt;
  
  
  Steps are as follows:
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Spin up an EC2 instance&lt;/strong&gt; running Red Hat Enterprise Linux (&lt;code&gt;ami-0ba62214afa52bec7&lt;/code&gt;)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;SSH into the EC2 instance.&lt;/strong&gt; The default user name for a RHEL AMI is ec2-user or root, so make sure you are using the correct user name.&lt;br&gt;
&lt;code&gt;$ ssh -i /path/my-key-pair.pem ec2-user@my-instance-public-dns-name&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Use the command below to install the nano editor&lt;br&gt;
&lt;code&gt;$ sudo yum install nano&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install Couchbase C SDK (libcouchbase)&lt;/strong&gt; which is a pre-requisite to use Couchbase PHP SDK. To do so,&lt;br&gt;
Create a &lt;code&gt;couchbase.repo&lt;/code&gt; file in your &lt;code&gt;/etc/yum.repos.d&lt;/code&gt; directory.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ cd /etc/yum.repos.d&lt;/code&gt;&lt;br&gt;
&lt;code&gt;$ sudo nano couchbase.repo&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;paste the below content in &lt;code&gt;couchbase.repo&lt;/code&gt; file&lt;/p&gt;

&lt;p&gt;&lt;code&gt;[couchbase]&lt;br&gt;
enabled = 1&lt;br&gt;
name = libcouchbase package for centos8 x86_64&lt;br&gt;
baseurl = https://packages.couchbase.com/clients/c/repos/rpm/el8/x86_64&lt;br&gt;
gpgcheck = 1&lt;br&gt;
gpgkey = https://packages.couchbase.com/clients/c/repos/rpm/couchbase.key&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now that the repository has been configured, refresh the cache using the commands below&lt;br&gt;
&lt;code&gt;$ sudo yum check-update&lt;/code&gt;&lt;br&gt;
&lt;code&gt;$ sudo yum search libcouchbase&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Install libcouchbase3, and any other packages that you need for development:&lt;br&gt;
&lt;code&gt;$ sudo yum install libcouchbase3 libcouchbase-devel libcouchbase3-tools&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Enable EPEL and Remi Repository&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To install EPEL (Extra Packages for Enterprise Linux), run the command below:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo dnf install -y https://dl.fedoraproject.org/pub/epel/epel-release-latest-8.noarch.rpm&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;To install Remi (a third-party repository that provides a wide range of PHP versions for Red Hat Enterprise Linux), run the command below:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo dnf install -y https://rpms.remirepo.net/enterprise/remi-release-8.rpm&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install PHP 7.4 on RHEL&lt;/strong&gt;&lt;br&gt;
First, list the available PHP modules by running this command&lt;br&gt;
&lt;code&gt;$ sudo dnf module list php&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You should see remi-7.4 in the list. To enable it, run the command below:&lt;br&gt;
&lt;code&gt;$ sudo dnf module enable php:remi-7.4 -y&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install the php-devel and php-pear packages&lt;/strong&gt;&lt;br&gt;
The &lt;code&gt;php-devel&lt;/code&gt; package contains the files needed for building PHP extensions, and the &lt;code&gt;php-pear&lt;/code&gt; package contains the basic PEAR components for reusable PHP components. &lt;/p&gt;

&lt;p&gt;Run the commands below to install PHP and both packages:&lt;br&gt;
&lt;code&gt;$ sudo dnf install php&lt;/code&gt;&lt;br&gt;
&lt;code&gt;$ sudo yum install php-devel&lt;/code&gt;&lt;br&gt;
&lt;code&gt;$ sudo yum install php-pear&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Install the Couchbase PHP SDK&lt;/strong&gt; using the PHP distribution’s pecl command:&lt;br&gt;
&lt;code&gt;$ pecl install couchbase&lt;/code&gt;&lt;br&gt;
&lt;code&gt;$ sudo pecl install https://packages.couchbase.com/clients/php/couchbase-3.2.2.tgz&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Load the Couchbase SDK as an extension&lt;/strong&gt; &lt;br&gt;
Use the command below to locate the php.ini file&lt;br&gt;
&lt;code&gt;$ php --ini&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Insert a line in the php.ini file specifying the extension to be loaded; this should be in the [PHP] section.&lt;br&gt;
&lt;code&gt;extension=couchbase.so&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Also, insert the json extension, since the Couchbase SDK depends on the JSON module, which must be loaded before the SDK. To do so, add the line below to the php.ini file:&lt;br&gt;
&lt;code&gt;extension=json.so&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now, &lt;strong&gt;add your EC2 instance's public IP address to the allow list of the Capella Cluster and run the sample script below to connect to the Capella Cluster&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;?php
 $connectionString = "couchbases://&amp;lt;connect_string&amp;gt;?ssl=no_verify";
 $options = new \Couchbase\ClusterOptions();
 $options-&amp;gt;credentials("&amp;lt;user&amp;gt;", "&amp;lt;password&amp;gt;");
 //$options-&amp;gt;timeout(3000 /* milliseconds */);
 $cluster = new \Couchbase\Cluster($connectionString, $options);

 $opts = new \Couchbase\GetOptions();
 $opts-&amp;gt;timeout(8000 /* milliseconds */);

 // get a bucket reference
 $bucket = $cluster-&amp;gt;bucket("travel-sample");

 // get a default collection reference
 //$collection = $bucket-&amp;gt;defaultCollection();

 // or for named collection
 $scope = $bucket-&amp;gt;scope("_default");
 $collection = $scope-&amp;gt;collection("_default");

 // upsert document
 $upsertResult = $collection-&amp;gt;upsert("my-document", ["name" =&amp;gt; "mike"]);

 // get document
 $getResult = $collection-&amp;gt;get("my-document", $opts);

 echo "Results:\n";
 var_dump($getResult);
?&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;where&lt;br&gt;
&lt;code&gt;&amp;lt;connect_string&amp;gt;&lt;/code&gt; : your cluster WAN endpoint&lt;br&gt;
&lt;code&gt;&amp;lt;user&amp;gt;, &amp;lt;password&amp;gt;&lt;/code&gt; : the database username/password you created to grant access to the cluster buckets&lt;br&gt;
&lt;code&gt;"travel-sample"&lt;/code&gt; : the sample bucket available in the Capella cluster. You can create your own bucket as well; make sure that your database user has access to that bucket.&lt;/p&gt;

&lt;p&gt;Hope the provided steps are helpful. Thank you.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Hello DEV Community!</title>
      <dc:creator>Payal Gupta</dc:creator>
      <pubDate>Sat, 01 Jan 2022 12:21:34 +0000</pubDate>
      <link>https://forem.com/aws-builders/hello-dev-community-4no7</link>
      <guid>https://forem.com/aws-builders/hello-dev-community-4no7</guid>
      <description>&lt;p&gt;I signed up for a DEV account only to join the exclusive DEV Organisation created for AWS Community Builders to use and publish quality content. &lt;/p&gt;

&lt;p&gt;I always wanted to write blogs/articles to share knowledge; however, I never found enough time to give it a chance, and until now, I was only using this platform to check out the great articles written by my fellow Community Builders.&lt;/p&gt;

&lt;p&gt;After a lot of procrastination, I have finally decided to start my journey of writing articles/blogs, and I would like to give credit to the AWS Community Builders program, which pushed and motivated me to give it a shot. &lt;/p&gt;

&lt;p&gt;It is an absolute honour to be part of this program where you get the opportunity to learn and grow together by sharing knowledge, networking with great and talented people from all over the world, and helping each other in every possible way. &lt;/p&gt;

&lt;p&gt;To give back to the community, I would like to start off by writing a post on the &lt;a href="https://dev.to/aws-builders/significance-of-management-governance-on-aws-53e0"&gt;Significance of Management &amp;amp; Governance&lt;/a&gt; on AWS.&lt;/p&gt;

&lt;p&gt;PS: I chose this topic because I got selected to be part of this category in AWS Community Builder program so decided to go with it first :)&lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
