DevOps Project: EC2 and Apache Deployment with Python & Fabric
This project automates the provisioning and deployment of a basic Apache web application using Python-based DevOps tools. It showcases the integration of AWS (via Boto3), SSH-based automation (via Fabric), and infrastructure scripting to streamline web server deployment.
Project Overview
Goal:
Provision an EC2 instance, upload and install Apache2 using a shell script, and manage infrastructure using Python automation.
Project Structure
- run_ec2_instances.py
Automates the creation of an EC2 instance using Boto3 with custom tags, a security group, a subnet, and monitoring enabled.
```python
import boto3

client = boto3.client("ec2")


def create_ec2_instance(image_id, instance_type, count):
    """Launch one or more EC2 instances with the given parameters."""
    response = client.run_instances(
        ImageId=image_id,
        InstanceType=instance_type,
        KeyName="keypairforluitccp3",
        MinCount=count,
        MaxCount=count,
        Monitoring={"Enabled": True},
        SecurityGroupIds=["<put-yours-here>"],
        SubnetId="<put-yours-here>",
        TagSpecifications=[
            {
                "ResourceType": "instance",
                "Tags": [
                    {"Key": "environment", "Value": "development"},
                ],
            }
        ],
    )
    return response


if __name__ == "__main__":
    create_ec2_instance("ami-020cba7c55df1f615", "t2.micro", 1)
```
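The `run_instances` response contains metadata for each launched instance; the instance IDs can be pulled out for later steps such as tagging, waiting, or SSH. A minimal sketch (the helper name is illustrative, and the sample dict is a trimmed version of the documented Boto3 response shape):

```python
def extract_instance_ids(response):
    """Return the IDs of all instances launched in a run_instances call."""
    return [inst["InstanceId"] for inst in response.get("Instances", [])]


# Trimmed example of the documented response shape:
sample = {"Instances": [{"InstanceId": "i-0abc123"}, {"InstanceId": "i-0def456"}]}
print(extract_instance_ids(sample))  # ['i-0abc123', 'i-0def456']
```

The same IDs could then be passed to a Boto3 waiter (e.g. `instance_running`) before attempting the SSH step.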
- create_s3_bucket.py
Creates a private S3 bucket using Boto3 for optional static file storage or artifact management.
```python
import boto3

client = boto3.client("s3", region_name="us-east-1")


def create_s3_bucket():
    """Create a private S3 bucket named dtm-python-devops2."""
    client.create_bucket(
        Bucket="dtm-python-devops2",
        ACL="private",
    )


if __name__ == "__main__":
    create_s3_bucket()
```
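Since `create_bucket` rejects names that break S3's naming rules, it can be worth validating the name locally before making the API call. A simplified sketch covering the core rules (3-63 characters; lowercase letters, digits, hyphens, and dots; must start and end with a letter or digit) — the function name is illustrative:

```python
import re


def is_valid_bucket_name(name):
    """Simplified S3 bucket-name check: 3-63 chars of lowercase letters,
    digits, hyphens, and dots, starting and ending with a letter or digit."""
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name))


print(is_valid_bucket_name("dtm-python-devops2"))  # True
print(is_valid_bucket_name("Bad_Name"))            # False
```

The full rules also forbid IP-address-like names and consecutive dots; the AWS documentation lists them exhaustively.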
- ssh_into_instance.py
Uses the Fabric library to:
  - SSH into the EC2 instance via the PEM key
  - Upload the Apache setup script
  - Make the script executable and run it with sudo
```python
from fabric import Connection
from subprocess import run

# Ensure the local key has the restrictive permissions SSH requires
run(["chmod", "400", "/Users/dteimuno/Desktop/keypairforluitccp3.pem"])

# Connect to the instance using the key pair
c = Connection(
    "ubuntu@23.22.12.129:22",
    connect_kwargs={
        "key_filename": "/Users/dteimuno/Desktop/keypairforluitccp3.pem",
    },
)

# Transfer the script to the instance
c.put("./apache.sh", "/home/ubuntu/apache.sh")

# Make the script executable
c.run("chmod +x /home/ubuntu/apache.sh")

# Run the script on the instance with elevated privileges
c.run("sudo /home/ubuntu/apache.sh")
```
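The upload / chmod / run sequence above could be parameterized so the same flow deploys any script, not just apache.sh. A minimal sketch — `deploy_script` and `build_remote_commands` are illustrative names, not part of the original code, and `conn` is expected to be a `fabric.Connection` (duck-typed here so the helper itself needs no imports):

```python
def build_remote_commands(remote_path):
    """Commands to make an uploaded script executable and run it with sudo."""
    return [f"chmod +x {remote_path}", f"sudo {remote_path}"]


def deploy_script(conn, local_path, remote_path):
    """Upload a script and run it on the remote host.

    `conn` is expected to behave like a fabric.Connection
    (provides .put() and .run()).
    """
    conn.put(local_path, remote_path)
    for cmd in build_remote_commands(remote_path):
        conn.run(cmd)


print(build_remote_commands("/home/ubuntu/apache.sh"))
```

With this helper, the script above reduces to `deploy_script(c, "./apache.sh", "/home/ubuntu/apache.sh")`.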
- apache.sh
A Bash script executed remotely to:
  - Install Apache2
  - Enable and start the service
  - Display the server's public IP address
```bash
#!/bin/bash
# Install Apache2 on Ubuntu

# Update package list
sudo apt update

# Install Apache2
sudo apt install -y apache2

# Enable Apache2 to start on boot
sudo systemctl enable apache2

# Start Apache2 service
sudo systemctl start apache2

# Display the server's public IP address
curl -s http://checkip.amazonaws.com
```
Tools & Technologies Used
- Python 3
- Boto3 – AWS automation SDK
- Fabric – SSH and remote task automation
- AWS EC2 – Cloud compute provisioning
- AWS S3 – Cloud object storage
- Apache2 – Web server installed via Bash
- Ubuntu – EC2 AMI base OS
🔄 Workflow Summary
1. Provision the EC2 instance with run_ec2_instances.py
2. Create an S3 bucket with create_s3_bucket.py (optional step)
3. SSH in and deploy Apache with ssh_into_instance.py and apache.sh
4. Apache is now running and serving web content on the EC2 public IP
Future Enhancements
- Automate web app zip upload to S3 and extraction on EC2
- Add teardown scripts to delete EC2 and S3 resources
- Integrate with CI/CD (e.g., GitHub Actions or Jenkins)
- Add error handling and logging
Status
✅ Completed — ready for demonstration or extension.