Dennis Tei-Muno for AWS Community Builders

Posted on • Originally published at Medium

Python for DevOps: Python For Provisioning Webserver and Installing Software On It

DevOps Project: EC2 and Apache Deployment with Python & Fabric

This project automates the provisioning and deployment of a basic Apache web server using Python-based DevOps tools. It showcases the integration of AWS (via Boto3), SSH-based automation (via Fabric), and infrastructure scripting to streamline web server deployment.


Project Overview

Goal:

Provision an EC2 instance, upload and install Apache2 using a shell script, and manage infrastructure using Python automation.


Project Structure

  • run_ec2_instances.py Automates the creation of an EC2 instance using Boto3 with custom tags, a security group, a subnet, and monitoring options.
import boto3

client = boto3.client("ec2")

def create_ec2_instance(image_id, instance_type, count):
    """
    Launch one or more EC2 instances with the given parameters.
    """
    response = client.run_instances(
        ImageId=image_id,
        InstanceType=instance_type,
        KeyName="keypairforluitccp3",
        MinCount=count,
        MaxCount=count,
        Monitoring={"Enabled": True},
        SecurityGroupIds=["<put-yours-here>"],
        SubnetId="<put-yours-here>",
        TagSpecifications=[
            {
                "ResourceType": "instance",
                "Tags": [
                    {"Key": "environment", "Value": "development"},
                ],
            }
        ],
    )

    return response


if __name__ == "__main__":
    create_ec2_instance("ami-020cba7c55df1f615", "t2.micro", 1)
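The `run_instances` response describes the launched instances, so the script can be extended to grab their IDs and block until they are running. A minimal sketch using Boto3's built-in waiter (the helper names `launched_instance_ids` and `wait_until_running` are my own, not part of the project):

```python
# Sketch: extract instance IDs from a run_instances response and wait
# until they reach the "running" state. Helper names are illustrative.

def launched_instance_ids(response):
    """Return the InstanceId of every instance in a run_instances response."""
    return [inst["InstanceId"] for inst in response.get("Instances", [])]

def wait_until_running(instance_ids, region="us-east-1"):
    """Block until the given instances pass the instance_running waiter."""
    import boto3  # imported here so the pure helper above has no dependency
    ec2 = boto3.client("ec2", region_name=region)
    ec2.get_waiter("instance_running").wait(InstanceIds=instance_ids)

# Usage (with create_ec2_instance from run_ec2_instances.py):
#   response = create_ec2_instance("ami-020cba7c55df1f615", "t2.micro", 1)
#   wait_until_running(launched_instance_ids(response))
```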
  • create_s3_bucket.py Creates a private S3 bucket using Boto3 for optional static file storage or artifact management.
import boto3

client = boto3.client("s3", region_name="us-east-1")

def create_s3_bucket():
    """
    Create a private S3 bucket named dtm-python-devops2.
    """
    client.create_bucket(
        Bucket="dtm-python-devops2",
        ACL="private",
    )

if __name__ == "__main__":
    create_s3_bucket()
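One quirk worth knowing: the plain `create_bucket` call above only works in us-east-1; in any other region, S3 requires a `CreateBucketConfiguration` with a `LocationConstraint`. A small sketch of a region-aware kwargs builder (the helper name `bucket_create_kwargs` is mine, not from the project):

```python
# Sketch: build region-aware kwargs for client.create_bucket.
# us-east-1 rejects a LocationConstraint; other regions require one.
# The helper name is illustrative, not part of the project.

def bucket_create_kwargs(name, region):
    kwargs = {"Bucket": name, "ACL": "private"}
    if region != "us-east-1":
        kwargs["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return kwargs

# Usage:
#   client.create_bucket(**bucket_create_kwargs("dtm-python-devops2", "us-east-1"))
```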
  • ssh_into_instance.py Uses the Fabric library to:
    • SSH into the EC2 instance via PEM key
    • Upload the Apache setup script
    • Make the script executable and run it with sudo
from fabric import Connection
from subprocess import run

# Ensure the local key has restrictive permissions before use
run(['chmod', '400', '/Users/dteimuno/Desktop/keypairforluitccp3.pem'])

# Connect to the instance using the keypair
c = Connection(
    'ubuntu@23.22.12.129:22',
    connect_kwargs={
        'key_filename': '/Users/dteimuno/Desktop/keypairforluitccp3.pem'
    },
)

# Transfer the script to the instance
c.put('./apache.sh', '/home/ubuntu/apache.sh')

# Make the script executable
c.run('chmod +x /home/ubuntu/apache.sh')

# Run the script on the instance
c.run('sudo /home/ubuntu/apache.sh')
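After the Fabric tasks finish, it can take a few seconds before Apache answers over HTTP. A standard-library sketch that polls the instance until the default page returns 200 (the helper names `apache_url` and `wait_for_http` are my own, not part of the project):

```python
# Sketch: poll the instance over HTTP until Apache responds.
# Helper names are illustrative, not from the project.
import time
from urllib.request import urlopen

def apache_url(host, port=80):
    """Build the URL Apache should answer on once the setup script has run."""
    return f"http://{host}:{port}/"

def wait_for_http(url, timeout=60, interval=5):
    """Poll `url` until it returns HTTP 200 or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return True
        except OSError:  # connection refused / timed out: keep retrying
            time.sleep(interval)
    return False

# Usage, with the instance IP from the Fabric script:
#   wait_for_http(apache_url("23.22.12.129"))
```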
  • apache.sh A Bash script executed remotely to:
    • Install Apache2
    • Enable and start the service
    • Display the server’s public IP address
#!/bin/bash
# Install Apache2 on Ubuntu

# Update package list
sudo apt update

# Install Apache2
sudo apt install -y apache2

# Enable Apache2 to start on boot
sudo systemctl enable apache2

# Start Apache2 service
sudo systemctl start apache2

# Display the instance's public IP address (via EC2 instance metadata, IMDSv2)
TOKEN=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" -H "X-aws-ec2-metadata-token-ttl-seconds: 60")
curl -s -H "X-aws-ec2-metadata-token: $TOKEN" http://169.254.169.254/latest/meta-data/public-ipv4
echo

Tools & Technologies Used

  • Python 3
  • Boto3 – AWS automation SDK
  • Fabric – SSH and remote task automation
  • AWS EC2 – Cloud compute provisioning
  • AWS S3 – Cloud object storage
  • Apache2 – Web server installed via Bash
  • Ubuntu – EC2 AMI base OS

🔄 Workflow Summary

  1. Provision EC2 instance with run_ec2_instances.py
  2. Create an S3 bucket using create_s3_bucket.py (optional step)
  3. SSH and deploy Apache using ssh_into_instance.py and apache.sh
  4. Apache is now running and serving web content on the EC2 public IP

Future Enhancements

  • Automate web app zip upload to S3 and extraction on EC2
  • Add teardown scripts to delete EC2 and S3 resources
  • Integrate with CI/CD (e.g., GitHub Actions or Jenkins)
  • Add error handling and logging

Status

✅ Completed — ready for demonstration or extension.
