<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Jagan</title>
    <description>The latest articles on Forem by Jagan (@jaganrajagopal).</description>
    <link>https://forem.com/jaganrajagopal</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1160461%2F6c26253d-844d-4689-8562-a031c8b860a0.jpg</url>
      <title>Forem: Jagan</title>
      <link>https://forem.com/jaganrajagopal</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/jaganrajagopal"/>
    <language>en</language>
    <item>
      <title>Business use cases for AWS App Runner and AWS Amplify</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Tue, 27 Feb 2024 05:52:21 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/business-use-case-for-aws-apprunner-and-aws-amplify-36jl</link>
      <guid>https://forem.com/jaganrajagopal/business-use-case-for-aws-apprunner-and-aws-amplify-36jl</guid>
      <description>&lt;p&gt;AWS App Runner and AWS Amplify are both fully managed services provided by Amazon Web Services (AWS) designed to simplify the deployment and hosting of applications, but they cater to different types of applications and use cases.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS App Runner&lt;/strong&gt;&lt;br&gt;
AWS App Runner is designed to make it easy for developers to deploy containerized web applications and APIs at scale without needing to manage the underlying infrastructure. It is particularly well-suited for scenarios where you want to quickly deploy applications from source code or a container image.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Features:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Automatic Deployment: Automatically builds and deploys the application from source code or a container image.&lt;br&gt;
Fully Managed: Manages the infrastructure, networking, scaling, and load balancing.&lt;br&gt;
No Infrastructure Management: No need to provision, configure, or scale clusters of virtual machines.&lt;br&gt;
Integrated with AWS Services: Easily integrates with Amazon ECR for container images and GitHub for source code.&lt;br&gt;
&lt;strong&gt;Use Cases:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Deploying containerized web applications and APIs quickly.&lt;br&gt;
Running microservices without managing servers or infrastructure.&lt;br&gt;
&lt;strong&gt;AWS Amplify&lt;/strong&gt;&lt;br&gt;
AWS Amplify is a set of tools and services designed to build, deploy, and manage full-stack mobile and web applications. It's particularly focused on applications that leverage AWS for the backend (such as databases, authentication, storage, and API) and use modern frontend frameworks (like React, Angular, Vue, or mobile platforms).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Features:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Full-Stack Application Development: Offers a broad set of tools and services for both frontend and backend development.&lt;br&gt;
&lt;strong&gt;Integrated Backend Services:&lt;/strong&gt; Easy integration with AWS services like Amazon Cognito for authentication, Amazon S3 for storage, and AWS AppSync for real-time data.&lt;br&gt;
CI/CD Pipeline: Automates the deployment process with a CI/CD pipeline for frontend and backend components.&lt;br&gt;
Hosting: Provides a simple way to host static web assets with global distribution.&lt;br&gt;
&lt;strong&gt;Use Cases:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Building and deploying full-stack applications leveraging AWS backend services.&lt;br&gt;
Developing mobile and web applications with real-time capabilities and offline support.&lt;/p&gt;

</description>
      <category>apprunner</category>
      <category>amplify</category>
      <category>aws</category>
      <category>awsdeveloper</category>
    </item>
    <item>
      <title>Step-by-step instructions for AWS Lambda with DynamoDB using a CloudFormation template</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Sun, 04 Feb 2024 14:28:38 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/step-by-step-instruction-for-aws-lambda-with-dynamodb-using-cloudformation-template-4fj3</link>
      <guid>https://forem.com/jaganrajagopal/step-by-step-instruction-for-aws-lambda-with-dynamodb-using-cloudformation-template-4fj3</guid>
      <description>&lt;p&gt;To create a Python Lambda function that fetches employee information (employeeid and employeename) from a DynamoDB table, you'll need to write a Python script for the Lambda function and include it in your CloudFormation template. Here's an example setup:&lt;/p&gt;

&lt;p&gt;Lambda Function (Python Script): This script uses the AWS SDK for Python (Boto3) to interact with DynamoDB.&lt;/p&gt;

&lt;p&gt;CloudFormation Template: This template includes the Lambda function and necessary permissions.&lt;/p&gt;

&lt;p&gt;First, here's the Python script for the Lambda function:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json
import boto3
import os

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table(os.environ['DYNAMODB_TABLE'])

    try:
        # Assuming the event contains the employeeid
        employeeid = event['employeeid']
        response = table.get_item(
            Key={
                'employeeid': employeeid
            }
        )
        employee = response.get('Item', {})

        if not employee:
            return {'statusCode': 404, 'body': json.dumps('Employee not found.')}

        return {
            'statusCode': 200,
            'body': json.dumps(employee)
        }

    except Exception as e:
        print(e)
        return {
            'statusCode': 500,
            'body': json.dumps('Error fetching employee data')
        }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
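&lt;p&gt;Before deploying, you can exercise the handler's branching locally. A minimal sketch, assuming a stub table in place of DynamoDB (FakeTable and the sample record are hypothetical):&lt;/p&gt;

```python
import json

# FakeTable is a hypothetical stand-in for boto3's DynamoDB Table object:
# it supports just the get_item(Key=...) call the handler uses.
class FakeTable:
    def __init__(self, items):
        self.items = items

    def get_item(self, Key):
        item = self.items.get(Key['employeeid'])
        return {'Item': item} if item else {}

def handler_logic(table, event):
    # Mirrors the Lambda body: look up employeeid, 404 if absent
    employee = table.get_item(Key={'employeeid': event['employeeid']}).get('Item')
    if not employee:
        return {'statusCode': 404, 'body': json.dumps('Employee not found.')}
    return {'statusCode': 200, 'body': json.dumps(employee)}

table = FakeTable({'E001': {'employeeid': 'E001', 'employeename': 'Jagan'}})
print(handler_logic(table, {'employeeid': 'E001'})['statusCode'])  # 200
print(handler_logic(table, {'employeeid': 'E999'})['statusCode'])  # 404
```

&lt;p&gt;The same two cases (found and not found) are what the deployed function returns when invoked with a test event containing an employeeid.&lt;/p&gt;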





&lt;p&gt;Next, here's the CloudFormation template, which provisions the DynamoDB table, the Lambda execution role, and the function itself:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;AWSTemplateFormatVersion: '2010-09-09'
Resources:
  EmployeeTable:
    Type: 'AWS::DynamoDB::Table'
    Properties:
      TableName: EmployeeTable
      AttributeDefinitions:
        - AttributeName: employeeid
          AttributeType: S
      KeySchema:
        - AttributeName: employeeid
          KeyType: HASH
      BillingMode: PAY_PER_REQUEST

  EmployeeLambdaExecutionRole:
    Type: 'AWS::IAM::Role'
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service:
                - lambda.amazonaws.com
            Action:
              - 'sts:AssumeRole'
      Policies:
        - PolicyName: LambdaDynamoDBAccess
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              - Effect: Allow
                Action:
                  - dynamodb:GetItem
                  - logs:CreateLogGroup
                  - logs:CreateLogStream
                  - logs:PutLogEvents
                Resource: '*'  # for production, scope DynamoDB access to the table ARN (!GetAtt EmployeeTable.Arn)

  EmployeeFunction:
    Type: 'AWS::Lambda::Function'
    Properties:
      FunctionName: EmployeeFunction
      Runtime: python3.8
      Handler: index.lambda_handler
      Role: !GetAtt EmployeeLambdaExecutionRole.Arn
      Environment:
        Variables:
          DYNAMODB_TABLE: !Ref EmployeeTable
      Code:
        ZipFile: |
          &amp;lt;Your Lambda Function Python Code Here&amp;gt;

  # Add additional resources and outputs as needed

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Step 1: Go to the AWS CloudFormation console and upload the template from your local system. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fppdgx924808ud4d30ks3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fppdgx924808ud4d30ks3.png" alt="Image description" width="800" height="362"&gt;&lt;/a&gt;&lt;br&gt;
Step 2: CloudFormation creates the three resources defined in the template. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw9lfe49da7h46ppjscu1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw9lfe49da7h46ppjscu1.png" alt="Image description" width="800" height="335"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 3: The Lambda function was created by CloudFormation. Open it in the AWS Lambda console and review the code, as shown below. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn2h66dl9gwyeor0wfho5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn2h66dl9gwyeor0wfho5.png" alt="Image description" width="800" height="323"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 4: The IAM role and the DynamoDB table are created as well. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb0yu99ja3zp9jm2ir37v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb0yu99ja3zp9jm2ir37v.png" alt="Image description" width="800" height="212"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Step 5: Clean up the AWS resources by deleting the CloudFormation stack. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0hmlxrv2jg1lc61wx6zh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0hmlxrv2jg1lc61wx6zh.png" alt="Image description" width="800" height="293"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;GitHub source code: &lt;a href="https://github.com/jaganrajagopal/cloudformationtemplate.git"&gt;Click Here&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For more articles, see: &lt;a href="https://awstrainingwithjagan.com/aws-solution-architecture-cheat-sheets/"&gt;https://awstrainingwithjagan.com/aws-solution-architecture-cheat-sheets/&lt;/a&gt;&lt;/p&gt;

</description>
      <category>awslambda</category>
      <category>dynamodb</category>
      <category>cloudformation</category>
      <category>aws</category>
    </item>
    <item>
      <title>What are the criteria you will consider while migrating from on-premises to the AWS cloud</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Thu, 04 Jan 2024 16:31:16 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/what-are-the-criteria-you-will-consider-while-migration-from-onpremise-to-aws-cloud-2ea5</link>
      <guid>https://forem.com/jaganrajagopal/what-are-the-criteria-you-will-consider-while-migration-from-onpremise-to-aws-cloud-2ea5</guid>
      <description>&lt;p&gt;When planning a migration from an on-premise environment to AWS Cloud, several key criteria must be considered to ensure a smooth, efficient, and successful transition. Here are the primary factors to consider:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Migration Readiness Assessment:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Evaluate your organization's readiness for cloud migration. This includes technical, business, and operational aspects.&lt;br&gt;
&lt;strong&gt;Cost Analysis:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Perform a detailed cost-benefit analysis. Compare on-premise costs (including hardware, maintenance, and facilities) to the projected costs on AWS.&lt;br&gt;
Consider long-term costs like storage, data transfer, and AWS service pricing.&lt;br&gt;
&lt;strong&gt;Security and Compliance:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Assess security requirements and ensure AWS services meet your security and compliance needs.&lt;br&gt;
Understand AWS shared responsibility model for security.&lt;br&gt;
Ensure compliance with industry regulations (like GDPR, HIPAA, etc.) in the cloud.&lt;br&gt;
&lt;strong&gt;Data Migration Strategy:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Plan data migration carefully, considering the volume of data and acceptable downtime.&lt;br&gt;
Use tools like AWS DataSync, AWS Database Migration Service, or AWS Snowball for large data transfers.&lt;br&gt;
&lt;strong&gt;Application Assessment and Dependency Mapping:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Evaluate applications for cloud suitability, dependencies, and necessary modifications.&lt;br&gt;
Decide on migration strategies for each application (rehost, refactor, re-platform, retire, retain).&lt;br&gt;
&lt;strong&gt;Network and Connectivity:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Plan for network architecture changes, including VPN connections, Direct Connect, and internal networking on AWS.&lt;br&gt;
Assess bandwidth requirements and latency implications.&lt;br&gt;
&lt;strong&gt;Performance and Scalability:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure that AWS resources meet or exceed on-premise performance.&lt;br&gt;
Plan for scalability to handle future growth and peak loads.&lt;br&gt;
&lt;strong&gt;Disaster Recovery and High Availability:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Design for high availability using AWS's global infrastructure.&lt;br&gt;
Implement a disaster recovery strategy that meets your business continuity requirements.&lt;br&gt;
&lt;strong&gt;Operational Readiness:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Prepare your IT staff with the necessary training and skills for managing AWS services.&lt;br&gt;
Set up operational processes like monitoring, logging, alerting, and incident response with AWS tools.&lt;br&gt;
&lt;strong&gt;Change Management:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Manage the organizational change aspect, including aligning stakeholders and communicating plans and benefits.&lt;br&gt;
&lt;strong&gt;Vendor Lock-in Considerations:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Understand the implications of cloud vendor lock-in and plan for potential future needs to migrate or integrate with other clouds or services.&lt;br&gt;
&lt;strong&gt;Testing and Validation:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Thoroughly test applications and infrastructure in the cloud environment before going live.&lt;br&gt;
Conduct performance testing, security testing, and disaster recovery drills.&lt;br&gt;
&lt;strong&gt;Post-Migration Optimization:&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Bu1zvGTO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ezqhlo7s4bxzq1t7n6kr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Bu1zvGTO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ezqhlo7s4bxzq1t7n6kr.png" alt="Image description" width="309" height="163"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Continuously monitor and optimize costs, performance, and security after the migration.&lt;br&gt;
Leverage AWS services and tools for cost optimization, like AWS Trusted Advisor and Cost Explorer.&lt;br&gt;
&lt;strong&gt;Project Management and Governance:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure strong project management to oversee the migration process.&lt;br&gt;
Establish clear governance and policies for cloud usage and management.&lt;/p&gt;
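&lt;p&gt;As a toy illustration of the cost-analysis step above, a three-year comparison sketched in Python (every figure is a hypothetical placeholder, not real pricing):&lt;/p&gt;

```python
# Hypothetical 3-year totals; every number below is a placeholder, not a quote
onprem = {'hardware': 120000, 'maintenance': 30000, 'facilities': 18000}
aws = {'compute': 60000, 'storage': 12000, 'data_transfer': 6000}

onprem_total = sum(onprem.values())
aws_total = sum(aws.values())
print(onprem_total, aws_total, onprem_total - aws_total)  # 168000 78000 90000
```

&lt;p&gt;In practice, tools like AWS Pricing Calculator and Migration Evaluator produce these projections from your actual inventory.&lt;/p&gt;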

</description>
      <category>cloudmigration</category>
      <category>awscloud</category>
      <category>awssolutionarchitecture</category>
      <category>onpremis</category>
    </item>
    <item>
      <title>How will you design cross-region replication for AWS EC2 instances with an AWS Application Load Balancer</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Thu, 04 Jan 2024 16:22:09 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/how-will-you-design-cross-region-replication-for-aws-ec2-instance-with-aws-applicaiton-balancer-1mep</link>
      <guid>https://forem.com/jaganrajagopal/how-will-you-design-cross-region-replication-for-aws-ec2-instance-with-aws-applicaiton-balancer-1mep</guid>
      <description>&lt;p&gt;Designing cross-region replication for AWS EC2 instances with an AWS Application Load Balancer (ALB) involves several steps to ensure high availability, fault tolerance, and efficient traffic distribution across regions. Here's a step-by-step approach to achieve this:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Setup of AWS Regions and Availability Zones&lt;/strong&gt;&lt;br&gt;
Select Regions: Choose the AWS regions in which you want to replicate your EC2 instances. Ensure these regions support the services you need.&lt;br&gt;
Availability Zones: Within each region, select multiple Availability Zones (AZs) for higher fault tolerance.&lt;br&gt;
&lt;strong&gt;2. EC2 Instance Replication&lt;/strong&gt;&lt;br&gt;
AMI Creation: Create an Amazon Machine Image (AMI) of your EC2 instance in the primary region.&lt;br&gt;
AMI Copy: Copy the AMI to the secondary region(s).&lt;br&gt;
Instance Launch: Launch EC2 instances in the secondary region(s) using the copied AMI.&lt;br&gt;
&lt;strong&gt;3. Data Synchronization&lt;/strong&gt;&lt;br&gt;
Database Replication: If your application uses a database, set up cross-region database replication.&lt;br&gt;
Storage Synchronization: Use Amazon S3 with Cross-Region Replication (CRR) for any required S3 bucket data synchronization.&lt;br&gt;
File System Consistency: For shared file systems, consider AWS services like EFS or FSx, which can be replicated across regions.&lt;br&gt;
&lt;strong&gt;4. Load Balancing and Traffic Distribution&lt;/strong&gt;&lt;br&gt;
Regional ALBs: Set up Application Load Balancers in each region to distribute traffic to the EC2 instances in their respective regions.&lt;br&gt;
&lt;strong&gt;Route 53:&lt;/strong&gt; Use Amazon Route 53 for DNS and traffic management:&lt;br&gt;
Geolocation Routing: Route users to the nearest region for better performance.&lt;br&gt;
Health Checks: Implement health checks to monitor the health of the instances across regions.&lt;br&gt;
Failover Routing: Configure failover routing to redirect traffic from the primary to the secondary region in case of an outage.&lt;br&gt;
&lt;strong&gt;5. Auto-Scaling&lt;/strong&gt;&lt;br&gt;
Auto Scaling Groups: Implement Auto Scaling Groups in each region to automatically adjust the number of EC2 instances based on demand.&lt;br&gt;
&lt;strong&gt;6. Security and Compliance&lt;/strong&gt;&lt;br&gt;
Security Groups and NACLs: Replicate security group and network ACL settings across regions.&lt;br&gt;
IAM Roles and Policies: Ensure consistent IAM roles and policies for EC2 instances in all regions.&lt;/p&gt;
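&lt;p&gt;The AMI replication step (create, copy, launch) can be sketched with boto3's copy_image call. The stub client and the IDs below are hypothetical, used only to show the flow without an AWS account; with real credentials you would pass boto3.client as the factory:&lt;/p&gt;

```python
# Sketch of step 2 (AMI copy): call EC2 copy_image once per target region.
# The client factory is injected so the flow can be demonstrated offline.
def replicate_ami(make_client, ami_id, source_region, target_regions, name):
    copies = {}
    for region in target_regions:
        ec2 = make_client('ec2', region_name=region)
        resp = ec2.copy_image(Name=name, SourceImageId=ami_id,
                              SourceRegion=source_region)
        copies[region] = resp['ImageId']  # ID of the new AMI in that region
    return copies

# Demo with a stub client (no AWS account needed)
class StubEC2:
    def __init__(self, region):
        self.region = region

    def copy_image(self, Name, SourceImageId, SourceRegion):
        return {'ImageId': 'ami-copy-' + self.region}

copies = replicate_ami(lambda svc, region_name: StubEC2(region_name),
                       'ami-123', 'us-east-1', ['eu-west-1'], 'web-ami')
print(copies)  # {'eu-west-1': 'ami-copy-eu-west-1'}
```

&lt;p&gt;The returned per-region AMI IDs are what the secondary regions' launch templates or Auto Scaling Groups would reference.&lt;/p&gt;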

</description>
      <category>aws</category>
      <category>loadbalancer</category>
      <category>awscloud</category>
      <category>ec2</category>
    </item>
    <item>
      <title>Step-by-step instructions for deploying a frontend and backend on an AKS cluster using Azure DevOps</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Thu, 28 Sep 2023 03:19:21 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/step-by-step-instruction-for-deployment-as-frontend-and-backend-on-aks-cluster-using-azuredevops-2fjj</link>
      <guid>https://forem.com/jaganrajagopal/step-by-step-instruction-for-deployment-as-frontend-and-backend-on-aks-cluster-using-azuredevops-2fjj</guid>
      <description>&lt;p&gt;This pipeline will first build a Docker image of your app and push it to your Azure Container Registry. Then, it will deploy the image to your AKS cluster using a deployment and service.&lt;/p&gt;

&lt;p&gt;The deployment.yaml file should define a deployment for your app. The following is an example:&lt;/p&gt;

&lt;p&gt;To deploy multiple replicas of a Node.js frontend app with a MongoDB backend to an AKS cluster using a deployment YAML file, you can use the following code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app
        image: 'node:latest'
        ports:
        - containerPort: 3000
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;This deployment YAML file creates a Deployment for your app with 3 replicas. The my-app container runs the image you specify (note: the official Node.js image is node, not nodejs) and listens on container port 3000.&lt;/p&gt;
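&lt;p&gt;One common gotcha with the Deployment above: spec.selector.matchLabels must match the pod template labels, or the API server rejects the manifest. A quick Python sanity check, with the manifest represented as a plain dict:&lt;/p&gt;

```python
# Sanity check: a Deployment's spec.selector.matchLabels must be a subset of
# the pod template labels; a mismatch is rejected by the Kubernetes API server.
deployment = {
    'spec': {
        'replicas': 3,
        'selector': {'matchLabels': {'app': 'my-app'}},
        'template': {'metadata': {'labels': {'app': 'my-app'}}},
    }
}

selector = deployment['spec']['selector']['matchLabels']
labels = deployment['spec']['template']['metadata']['labels']
mismatched = {k: v for k, v in selector.items() if labels.get(k) != v}
print(mismatched)  # {} means the selector is valid
```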

&lt;p&gt;You can deploy this YAML file to your AKS cluster using the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl apply -f deployment.yaml
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Once the deployment is created and exposed through a Service (of type LoadBalancer for external access), you can reach your app at the address the Service provides.&lt;/p&gt;

&lt;p&gt;To deploy multiple replicas of a MongoDB backend to an AKS cluster using a deployment YAML file, you can use the following code:&lt;/p&gt;

&lt;p&gt;Save the following as backenddeployment.yaml:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: apps/v1
kind: Deployment
metadata:
  name: mongodbdeployment
spec:
  replicas: 3
  selector:
    matchLabels:
      app: mongodb
  template:
    metadata:
      labels:
        app: mongodb
    spec:
      containers:
      - name: mongodb
        image: mongo:latest
        ports:
        - containerPort: 27017
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;This deployment YAML file creates a Deployment for your MongoDB backend with 3 replicas running the official MongoDB image, exposed on container port 27017. (Note: three replicas of a plain Deployment give you three independent MongoDB servers, not a replica set; for production, a StatefulSet is the usual choice.)&lt;/p&gt;

&lt;p&gt;You can deploy this YAML file to your AKS cluster using the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;kubectl apply -f backenddeployment.yaml
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;Once the deployment is created, your MongoDB backend will be running and reachable on port 27017 inside the AKS cluster, via a Service that selects the mongodb pods.&lt;/p&gt;

&lt;p&gt;You can then connect to your MongoDB backend from your Node.js frontend app using the following code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const MongoClient = require('mongodb').MongoClient;

// 'mongodb' resolves through the Kubernetes Service that fronts the MongoDB pods
const client = new MongoClient('mongodb://mongodb:27017');

client.connect(function(err, db) {
  if (err) {
    console.log(err);
    return;
  }

  // Do something with the database
});
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;service.yaml:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;apiVersion: v1
kind: Service
metadata:
  name: my-app
spec:
  selector:
    app: my-app
  ports:
  - protocol: TCP
    port: 80
    targetPort: 3000
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;To create an Azure DevOps pipeline YAML file that builds the image and deploys the Node.js frontend to the AKS cluster using the service and deployment manifests, you can use the following code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: Docker@2
  inputs:
    containerRegistry: 'Azure Container Registry'
    repository: 'my-app'
    command: 'build'

- task: Kubernetes@1
  inputs:
    azureSubscription: 'my-subscription'
    kubernetesServiceConnection: 'my-aks-cluster'
    namespace: 'default'
    command: 'deploy'
    manifests:
    - frontdeployment.yaml
    - service.yaml
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;To deploy the Node.js frontend and the MongoDB backend together, extend the manifests list:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: Docker@2
  inputs:
    containerRegistry: 'Azure Container Registry'
    repository: 'my-app'
    command: 'build'

- task: Kubernetes@1
  inputs:
    azureSubscription: 'my-subscription'
    kubernetesServiceConnection: 'my-aks-cluster'
    namespace: 'default'
    command: 'deploy'
    manifests:
    - deployment.yaml
    - backenddeployment.yaml
    - service.yaml
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

</description>
      <category>aks</category>
      <category>azuredevop</category>
      <category>node</category>
      <category>mongodb</category>
    </item>
    <item>
      <title>Docker Cheatsheet for Beginners</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Tue, 26 Sep 2023 03:25:32 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/docker-cheatsheet-for-beginners-1aba</link>
      <guid>https://forem.com/jaganrajagopal/docker-cheatsheet-for-beginners-1aba</guid>
      <description>&lt;p&gt;Docker Cheatsheet for Beginners (in Detail)&lt;/p&gt;

&lt;p&gt;Docker Concepts&lt;/p&gt;

&lt;p&gt;Docker image: A read-only template that contains everything needed to run a specific application, including the application code, system libraries, and runtime environment. Docker images are built using Dockerfiles, which are text files that contain instructions for building the image.&lt;br&gt;
Docker container: A running instance of a Docker image. Containers are isolated from each other and from the host machine, making them lightweight and portable. Containers can be started, stopped, and restarted quickly and easily.&lt;br&gt;
Docker registry: A central repository for storing and distributing Docker images. Docker Hub is the official Docker registry, but there are also many private and third-party registries available.&lt;br&gt;
Docker Commands&lt;/p&gt;

&lt;p&gt;docker build: Builds a Docker image from a Dockerfile.&lt;br&gt;
docker run: Creates and runs a container from a Docker image.&lt;br&gt;
docker ps: Lists all running containers.&lt;br&gt;
docker stop: Stops a running container.&lt;br&gt;
docker rm: Removes a container.&lt;br&gt;
docker images: Lists all local Docker images.&lt;br&gt;
docker rmi: Removes a Docker image.&lt;br&gt;
docker pull: Pulls a Docker image from a registry.&lt;br&gt;
docker push: Pushes a Docker image to a registry.&lt;br&gt;
Useful Docker Flags&lt;/p&gt;

&lt;p&gt;-d: Detaches the container from the terminal, allowing it to run in the background.&lt;br&gt;
-p: Publishes a container port to the host machine. This allows you to access the application running in the container from the host machine.&lt;br&gt;
-v: Mounts a host directory to a container directory. This allows you to share data between the host machine and the container.&lt;br&gt;
-e: Sets an environment variable in the container. This can be useful for configuring the application running in the container.&lt;br&gt;
Example Docker Command&lt;/p&gt;

&lt;p&gt;To run a simple web application in a Docker container, you would use the following command:&lt;/p&gt;

&lt;p&gt;docker run -d -p 8080:80 nginx&lt;br&gt;
This command creates and runs a container from the nginx image, publishing port 80 of the container to port 8080 of the host machine. This means that you will be able to access the web application at &lt;a href="http://localhost:8080"&gt;http://localhost:8080&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Dockerfile&lt;/p&gt;

&lt;p&gt;A Dockerfile is a text file that contains instructions for building a Docker image. Dockerfiles are typically divided into stages, each of which performs a specific task, such as installing dependencies, copying files, or running commands.&lt;/p&gt;

&lt;p&gt;Here is an example of a simple Dockerfile for a web application:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;FROM nginx

COPY . /usr/share/nginx/html

EXPOSE 80

CMD ["nginx", "-g", "daemon off;"]
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;This Dockerfile builds an image from the nginx base image and copies the current directory to /usr/share/nginx/html in the image. It documents port 80 with EXPOSE and starts nginx with daemon mode off, so the process stays in the foreground and keeps the container running.&lt;/p&gt;

&lt;p&gt;Docker Compose&lt;/p&gt;

&lt;p&gt;Docker Compose is a tool that allows you to define and manage multiple Docker containers as a single service. This is useful for running complex applications that require multiple containers to work together.&lt;/p&gt;

&lt;p&gt;To use Docker Compose, you create a YAML file called docker-compose.yml that defines your services. For example, the following docker-compose.yml file defines a simple web application with a backend and frontend service:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: "3.7"

services:
  backend:
    image: nginx
    volumes:
      - ./backend:/usr/share/nginx/html

  frontend:
    image: nginx
    volumes:
      - ./frontend:/usr/share/nginx/html
    depends_on:
      - backend
    ports:
      - "80:80"
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;To build and run the application, run the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker-compose up -d
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;

&lt;p&gt;This pulls the nginx image, starts the backend and frontend containers, and detaches. You will be able to access the web application at &lt;a href="http://localhost:80"&gt;http://localhost:80&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Troubleshooting Docker&lt;/p&gt;

&lt;p&gt;If you are having problems with Docker, there are a few things you can check:&lt;/p&gt;

&lt;p&gt;Make sure that Docker is installed and running correctly. You can check this by running the docker info command.&lt;br&gt;
Make sure that you have the required permissions to run Docker. You may need to run Docker with sudo.&lt;br&gt;
Make sure that you have enough disk space to build and run Docker images and containers.&lt;br&gt;
Check the Docker documentation or online forums for help with specific problems.&lt;/p&gt;

</description>
      <category>docker</category>
      <category>devops</category>
    </item>
    <item>
      <title>Step by step: build, test, and deploy using an Azure DevOps pipeline</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Sun, 24 Sep 2023 16:06:39 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/step-by-step-for-build-test-and-deploy-using-azuredevops-pipeline-4pm5</link>
      <guid>https://forem.com/jaganrajagopal/step-by-step-for-build-test-and-deploy-using-azuredevops-pipeline-4pm5</guid>
      <description>&lt;p&gt;Creating an Azure Pipeline for building, testing, and publishing artifacts for a Python web application involves defining a pipeline configuration file in your source code repository. Here, I'll provide a step-by-step guide to creating a simple Azure Pipeline for a Python web application:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Prerequisites&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ensure you have an Azure DevOps organization and project set up.&lt;/li&gt;
&lt;li&gt;Have your Python web application code stored in a Git repository (e.g., Azure Repos or GitHub).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Create an Azure Pipeline Configuration File&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create a file named azure-pipelines.yml in the root directory of your repository. This file will define the build, test, and artifact publishing stages of your pipeline.&lt;/p&gt;

&lt;p&gt;Here's a basic example of an azure-pipelines.yml file:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;trigger:
- '*'

pool:
  vmImage: 'ubuntu-latest'

stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.x'
        addToPath: true
    - script: |
        python -m venv venv
        source venv/bin/activate
        pip install -r requirements.txt
      displayName: 'Install Python dependencies'
    - script: |
        python -m unittest discover tests
      displayName: 'Run Unit Tests'
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Build.ArtifactStagingDirectory)'
        artifact: 'webapp-artifact'
      condition: succeeded()
- stage: Deploy
  jobs:
  - job: DeployJob
    steps:
    - download: current
      artifact: 'webapp-artifact'
      displayName: 'Download Artifact'
&lt;/code&gt;&lt;/pre&gt;
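&lt;p&gt;For the "Run Unit Tests" step to find anything, the tests/ directory needs at least one unittest-style test module. A minimal sketch (the module name and the &lt;code&gt;add&lt;/code&gt; function are hypothetical stand-ins for your application code):&lt;/p&gt;

```python
# tests/test_app.py -- minimal unittest example picked up by
# `python -m unittest discover tests`.
import unittest


def add(a, b):
    # Stand-in for a function you would import from your web app code.
    return a + b


class TestApp(unittest.TestCase):
    def test_add(self):
        self.assertEqual(add(2, 3), 5)
```

&lt;p&gt;The discover command collects any file matching test*.py, so naming conventions matter here.&lt;/p&gt;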

&lt;p&gt;A matrix strategy can also be used to run the same build and test steps across multiple Python versions:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;trigger:
- main

pool:
  vmImage: ubuntu-latest
strategy:
  matrix:
    Python38:
      python.version: '3.8'
    Python39:
      python.version: '3.9'
    Python310:
      python.version: '3.10'

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'

- script: |
    pip install pytest pytest-azurepipelines
    pytest
  displayName: 'pytest'
&lt;/code&gt;&lt;/pre&gt;
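&lt;p&gt;The matrix pipeline runs &lt;code&gt;pytest&lt;/code&gt; rather than unittest, so its tests are plain functions. A minimal sketch (file and function names are hypothetical; pytest collects functions named test_*):&lt;/p&gt;

```python
# tests/test_sample.py -- minimal pytest-style test, a sketch with
# hypothetical names; pytest collects functions named test_*.
def multiply(a, b):
    # Stand-in for a function imported from your application code.
    return a * b


def test_multiply():
    assert multiply(3, 4) == 12


def test_multiply_by_zero():
    assert multiply(5, 0) == 0
```

&lt;p&gt;Because the matrix fans out into one job per Python version, this same file runs three times, once under each interpreter.&lt;/p&gt;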

&lt;p&gt;&lt;strong&gt;Step 3: Configure Your Python Web Application&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Ensure your Python web application is structured correctly with the required files:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;requirements.txt: the list of Python dependencies.&lt;/li&gt;
&lt;li&gt;tests/: the directory containing your unit tests.&lt;/li&gt;
&lt;li&gt;Any other necessary application files and directories.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Create the Azure Pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to your Azure DevOps project.&lt;/li&gt;
&lt;li&gt;Navigate to Pipelines &amp;gt; New Pipeline.&lt;/li&gt;
&lt;li&gt;Select your source code repository.&lt;/li&gt;
&lt;li&gt;Choose "YAML" for the pipeline configuration.&lt;/li&gt;
&lt;li&gt;In the YAML editor, make sure it reflects the content of your azure-pipelines.yml file.&lt;/li&gt;
&lt;li&gt;Click "Save and Run" to create and trigger the pipeline.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Step 5: Monitor and Troubleshoot&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once the pipeline is running, you can monitor its progress and view logs and test results. If any issues arise, Azure DevOps provides a rich set of diagnostic tools to help you troubleshoot and fix them.&lt;/p&gt;

&lt;p&gt;This example provides a basic starting point for your Azure Pipeline. Depending on your specific needs, you may need to add deployment steps, environment variables, or additional configurations to customize your pipeline further.&lt;/p&gt;


</description>
      <category>python</category>
      <category>azuredevops</category>
      <category>azure</category>
    </item>
    <item>
      <title>Step-by-step instructions for hosting on an AKS cluster for beginners</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Sun, 24 Sep 2023 15:47:17 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/step-by-step-instruction-hosting-on-aks-cluster-for-begineers-2fb9</link>
      <guid>https://forem.com/jaganrajagopal/step-by-step-instruction-hosting-on-aks-cluster-for-begineers-2fb9</guid>
      <description>&lt;p&gt;Creating a web application on an Azure Kubernetes Service (AKS) cluster using Azure DevOps involves several steps, including defining your application, setting up Azure DevOps pipelines, and deploying your application to the AKS cluster. Here's a high-level overview of the process:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prerequisites:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Azure DevOps: Make sure you have an Azure DevOps organization and project set up.&lt;br&gt;
Azure AKS Cluster: Create an AKS cluster in Azure if you haven't already.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create Your Web Application:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Develop your web application code and containerize it using Docker. Ensure you have a Dockerfile that describes how to build your application into a container image.&lt;br&gt;
Store your application code in a source code repository, such as Azure Repos or GitHub.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Set up Azure DevOps Pipeline:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In your Azure DevOps project, create a new pipeline to automate the CI/CD process for your web application.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure Build Pipeline:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Configure a build pipeline to build the Docker container image of your web application from the source code.&lt;br&gt;
Use Azure DevOps tasks or scripts to build and push the Docker image to a container registry (e.g., Azure Container Registry, Docker Hub, or others).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configure Release Pipeline:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create a release pipeline to deploy your application to the AKS cluster.&lt;br&gt;
Define the deployment stages, environments, and deployment triggers as needed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Deployment to AKS:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Use Kubernetes manifests (YAML files) to describe your application deployment, services, and other resources.&lt;br&gt;
Configure your release pipeline to apply these manifests to the AKS cluster using kubectl or Azure DevOps tasks designed for AKS.&lt;/p&gt;
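&lt;p&gt;As a sketch of such a manifest, a minimal Deployment plus LoadBalancer Service might look like this (the names, image path, and port are hypothetical placeholders, not values from this article):&lt;/p&gt;

```yaml
# deployment.yaml -- minimal sketch; the app name, registry/image,
# and port are hypothetical placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: webapp
spec:
  replicas: 2
  selector:
    matchLabels:
      app: webapp
  template:
    metadata:
      labels:
        app: webapp
    spec:
      containers:
      - name: webapp
        image: myregistry.azurecr.io/webapp:latest  # placeholder image
        ports:
        - containerPort: 80
---
apiVersion: v1
kind: Service
metadata:
  name: webapp
spec:
  type: LoadBalancer
  selector:
    app: webapp
  ports:
  - port: 80
    targetPort: 80
```

&lt;p&gt;The release pipeline would apply this with &lt;code&gt;kubectl apply -f deployment.yaml&lt;/code&gt; against the AKS cluster.&lt;/p&gt;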

&lt;p&gt;Step by step:&lt;br&gt;
&lt;a href="https://github.com/jaganrajagopal/Jenkinswithdockercomposeup/tree/master#jenkinswithdockercomposeup"&gt;https://github.com/jaganrajagopal/Jenkinswithdockercomposeup/tree/master#jenkinswithdockercomposeup&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://user-images.githubusercontent.com/8061469/270113719-aea14701-d419-4988-900c-633b43f61e0b.png"&gt;https://user-images.githubusercontent.com/8061469/270113719-aea14701-d419-4988-900c-633b43f61e0b.png&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azuredevops</category>
      <category>devops</category>
      <category>dotnetcore</category>
      <category>cicd</category>
    </item>
    <item>
      <title>Azure DevOps pipeline for .NET with CI/CD</title>
      <dc:creator>Jagan</dc:creator>
      <pubDate>Thu, 14 Sep 2023 16:19:03 +0000</pubDate>
      <link>https://forem.com/jaganrajagopal/azure-devops-pipeline-for-dotnet-with-cicd-17af</link>
      <guid>https://forem.com/jaganrajagopal/azure-devops-pipeline-for-dotnet-with-cicd-17af</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2-igqHxt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2zmq18gf5w8cf0bsaokt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2-igqHxt--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2zmq18gf5w8cf0bsaokt.png" alt="Image description" width="772" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Azure DevOps pipeline integration for .NET Core; please find the GitHub link below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/jaganrajagopal/azuredevopsdotnet"&gt;https://github.com/jaganrajagopal/azuredevopsdotnet&lt;/a&gt;&lt;/p&gt;
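&lt;p&gt;For orientation, a generic .NET pipeline of this kind typically restores, builds, and tests with the DotNetCoreCLI task. This is a sketch, not the pipeline from the linked repository, and the SDK version shown is an assumption:&lt;/p&gt;

```yaml
# Generic sketch of an Azure DevOps pipeline for a .NET app;
# not the pipeline from the linked repository. SDK version and
# project globs are assumptions.
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- task: UseDotNet@2
  inputs:
    packageType: sdk
    version: '8.x'   # assumed SDK version
- task: DotNetCoreCLI@2
  displayName: Restore
  inputs:
    command: restore
    projects: '**/*.csproj'
- task: DotNetCoreCLI@2
  displayName: Build
  inputs:
    command: build
    projects: '**/*.csproj'
    arguments: '--configuration Release'
- task: DotNetCoreCLI@2
  displayName: Test
  inputs:
    command: test
    projects: '**/*Tests.csproj'
```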

</description>
      <category>azuredevops</category>
      <category>azure</category>
      <category>dotnetcore</category>
      <category>cicd</category>
    </item>
  </channel>
</rss>
