<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Hamd Waseem</title>
    <description>The latest articles on Forem by Hamd Waseem (@hamdivazim).</description>
    <link>https://forem.com/hamdivazim</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3387493%2Fa9580c1a-0921-4df1-b5b0-08bd913aa1d2.png</url>
      <title>Forem: Hamd Waseem</title>
      <link>https://forem.com/hamdivazim</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/hamdivazim"/>
    <language>en</language>
    <item>
      <title>I Won The Build4Students Hackathon! WeRelaySyllabus Devlog</title>
      <dc:creator>Hamd Waseem</dc:creator>
      <pubDate>Sat, 28 Feb 2026 12:54:57 +0000</pubDate>
      <link>https://forem.com/hamdivazim/i-won-the-build4students-hackathon-werelaysyllabus-devlog-ed4</link>
      <guid>https://forem.com/hamdivazim/i-won-the-build4students-hackathon-werelaysyllabus-devlog-ed4</guid>
      <description>&lt;p&gt;I am thrilled to share that I recently won First Place in the Build4Students Hackathon (2026)! With 708 participants and a panel of judges from companies like Meta, Anthropic, and Capital One, the competition was real.&lt;/p&gt;

&lt;p&gt;The challenge was simple but broad. Build tools to make student life better. My solution was WeRelaySyllabus, a platform to crowdsource the calendar for every course syllabus.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://we-relay-syllabus.vercel.app/" rel="noopener noreferrer"&gt;🔗 Try it out here&lt;/a&gt; | &lt;a href="https://github.com/hamdivazim/WeRelaySyllabus" rel="noopener noreferrer"&gt;🐙 GitHub Repo&lt;/a&gt; | &lt;a href="https://devpost.com/software/werelaysyllabus" rel="noopener noreferrer"&gt;🚀 Devpost page&lt;/a&gt; | &lt;a href="https://www.youtube.com/watch?v=VUg1nECwY-E" rel="noopener noreferrer"&gt;▶️ YouTube demo&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem
&lt;/h2&gt;

&lt;p&gt;Every semester, thousands of students receive a PDF or a printed booklet containing their exam dates and assignment deadlines. Then, we all spend hours individually and manually typing those dates into Google Calendar or Notion.&lt;/p&gt;

&lt;p&gt;It’s a massive, collective waste of time. I wanted to create a way where one student helps, and the whole course benefits.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Solution - WeRelaySyllabus
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3polajvx4pt7chzagxt3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3polajvx4pt7chzagxt3.png" alt="WeRelaySyllabus Calendar" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you're a new student, it’s simple! Get on the website, find your course, and export the calendar in the industry-standard ICS format. That’s it. One click to import into your favorite calendar software, and you're done. No sign-up, no fluff.&lt;/p&gt;

&lt;p&gt;And if the course isn't there yet? Everyone pitches in to fill it out for the benefit of the whole class. A quick "event name," "description," and "time" - boom, everyone gets the memo.&lt;/p&gt;

&lt;p&gt;The vouch system also makes sure calendars remain trustworthy by getting a community consensus on whether the events registered are legit.&lt;/p&gt;
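The exact thresholds live in the app itself, but a consensus check of this kind can be sketched in a few lines of JavaScript (the numbers and rule below are purely hypothetical - the real WeRelaySyllabus logic may differ):

```javascript
// Hypothetical sketch of a vouch-style consensus check.
// The real WeRelaySyllabus thresholds and logic may differ.
function isTrusted(vouches, reports, minVouches = 3) {
  // Treat an event as legit once enough students vouch for it
  // and vouches clearly outweigh reports.
  return vouches >= minVouches && vouches > reports * 2;
}
```

Keeping the rule a pure function like this makes it trivial to re-evaluate whenever a realtime listener reports a new vouch.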

&lt;p&gt;If you'd like to learn more about the project itself, you can check it out on the &lt;a href="https://devpost.com/software/werelaysyllabus" rel="noopener noreferrer"&gt;Devpost Page&lt;/a&gt;, but this blog post will be more of a devlog going over how I built this project.&lt;/p&gt;

&lt;h2&gt;
  
  
  Jan 20 - Switching from AWS to Firebase
&lt;/h2&gt;

&lt;p&gt;So I'm 3x AWS certified, but I used Firebase to make this project? Sounds weird, but I think it was actually a good decision. I was originally planning to use Lambda, API Gateway and DynamoDB - like I have done many times before. But I realised that with the vouch logic, I would need WebSockets. In case you aren't aware, a WebSocket is a persistent two-way connection (which API Gateway supports) that lets a server push changes in real time to users currently listening. And well, WebSockets are expensive, and I hadn't even used them before.&lt;/p&gt;

&lt;p&gt;But I noticed that the thing I wanted sounded very similar to another thing I already knew. Users listening, realtime changes? That's right - Firebase Firestore.&lt;/p&gt;

&lt;p&gt;I picked Firebase because it's free, it's extremely easy to set up with a web app, and it had exactly what I wanted without any hassle of having to pay or work out optimisations. It just worked.&lt;/p&gt;

&lt;h2&gt;
  
  
  Jan 28 - UI Styles
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0vjnaqrm5pzzg8ghs55x.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0vjnaqrm5pzzg8ghs55x.png" alt="WeRelaySyllabus Login Page" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So I had begun the project leisurely on January 20. By this point, the only decisions I'd made were what the project was going to be, and that the backend would be Firebase over AWS.&lt;/p&gt;

&lt;p&gt;I had put in a basic system that counted vouches and showed a simple list of calendars, nothing too complex yet. I just wanted to make sure the Firebase integration was locked in before the project got complex and it would be a hassle to integrate later.&lt;/p&gt;

&lt;p&gt;But I realised that, well, the UI was quite boring. It was just a plain "modern" style that I knew how to do well, but who doesn't? It's incredibly easy to pull off, and Next.js feels like it was practically designed for it.&lt;/p&gt;

&lt;p&gt;So obviously, I had to think of a design style for WeRelaySyllabus. After some looking around, I found this site called Gumroad. I'd heard about it before - it's a platform where creators sell digital products. What stuck with me is that it looked fluid, just like the "modern" style. It looked easy to use and modern, but it also looked unique. Everything had bold, completely black shadows and barely anything used rounded corners. After some more research, I found out this style was called neo-brutalism.&lt;/p&gt;

&lt;p&gt;I had actually studied a bit of architecture at school, specifically styles such as modernism, neoclassical, art deco, and brutalism. So naturally, I took neo-brutalism to be the newer mix of modernism and the bold brutalism, but for UI.&lt;/p&gt;

&lt;p&gt;And that's what I ended up using as WeRelaySyllabus's design language.&lt;/p&gt;

&lt;h2&gt;
  
  
  Feb 10 - ICS Generation
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuy34idd4fo92wuiqy3j0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuy34idd4fo92wuiqy3j0.png" alt="WeRelaySyllabus Event Popup" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By Feb 10, I had finished what was my Unique Selling Point for WeRelaySyllabus - the fact that you can just hit download, get a universally standard calendar file and import it into literally any calendar software you can think of.&lt;/p&gt;

&lt;p&gt;It starts by initializing the string with &lt;code&gt;BEGIN:VCALENDAR&lt;/code&gt; and defining the version and product ID (which is WeRelaySyllabus). Then, for every event in the events array, I concatenate a &lt;code&gt;BEGIN:VEVENT&lt;/code&gt; block containing necessary details like &lt;code&gt;SUMMARY&lt;/code&gt; (title), &lt;code&gt;DTSTART&lt;/code&gt; (start time), and &lt;code&gt;LOCATION&lt;/code&gt;. Finally, the string is closed with &lt;code&gt;END:VCALENDAR&lt;/code&gt; and converted into a Blob with the &lt;code&gt;text/calendar&lt;/code&gt; MIME type, allowing the browser to trigger a download.&lt;/p&gt;
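The generation described above boils down to string concatenation. Here is a simplified sketch (the real component also handles descriptions, UIDs and the actual download click; the function and field names here are illustrative):

```javascript
// Format a Date in the UTC "basic" form ICS expects: YYYYMMDDTHHMMSSZ.
function toIcsDate(date) {
  return date.toISOString().replace(/[-:]/g, "").split(".")[0] + "Z";
}

// Simplified sketch of the ICS generation described above.
function generateIcs(events) {
  let ics = "BEGIN:VCALENDAR\r\nVERSION:2.0\r\nPRODID:-//WeRelaySyllabus//EN\r\n";
  for (const ev of events) {
    ics += "BEGIN:VEVENT\r\n";
    ics += `SUMMARY:${ev.title}\r\n`;
    ics += `DTSTART:${toIcsDate(ev.start)}\r\n`;
    if (ev.location) ics += `LOCATION:${ev.location}\r\n`;
    ics += "END:VEVENT\r\n";
  }
  return ics + "END:VCALENDAR\r\n";
}

// In the browser, the string then becomes a downloadable file:
// const blob = new Blob([generateIcs(events)], { type: "text/calendar" });
```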

&lt;p&gt;But in testing (I actually discovered this while I was filming the demo video!) I found out that many calendars just outright reject the calendar file. And of course, it would just say "Please try again later" without actually telling me what was wrong...&lt;/p&gt;

&lt;p&gt;I figured out that some events, because the end date is optional, simply had no end date. I thought that was OK, but it turns out that Google Calendar becomes a bit cranky without it. My fix? I updated the logic to guarantee a &lt;code&gt;DTEND&lt;/code&gt;. I now calculate &lt;code&gt;endTime&lt;/code&gt; by checking if &lt;code&gt;ev.end&lt;/code&gt; exists; if not, I dynamically generate one by adding one hour to the start time.&lt;/p&gt;
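The fallback itself is tiny. A sketch (assuming the event objects carry `start`/`end` as `Date`s, mirroring the `ev.end` check above):

```javascript
// Guarantee a DTEND: if an event has no end time, default to one hour
// after the start, since some clients (e.g. Google Calendar) reject
// events without one.
function resolveEndTime(ev) {
  if (ev.end) return ev.end;
  return new Date(ev.start.getTime() + 60 * 60 * 1000); // start + 1h
}
```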

&lt;h2&gt;
  
  
  The Final Crunch - Bugfixing
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqf07w141660vp47go73.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmqf07w141660vp47go73.png" alt="WeRelaySyllabus Search" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at my commit history - 16 commits, beginning on Jan 20 and ending (so far) on Feb 14, a day before the deadline - you'll notice more than half the commits are on or after Feb 10.&lt;/p&gt;

&lt;p&gt;So here's just a quick-fire run-through of some of the commits and what they fixed...&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;Fix days bug&lt;/code&gt; - Feb 10
&lt;/h3&gt;

&lt;p&gt;Because of daylight saving time, clicking on a day in the calendar after March 29 (when the clocks change in the UK) would open the wrong date to edit, offset by a day.&lt;/p&gt;

&lt;p&gt;The fix was to switch from basic string concatenation to using a &lt;code&gt;Date&lt;/code&gt; object and calculating the date in local time.&lt;/p&gt;
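To illustrate the idea (a sketch, not the app's exact code): deriving the day key from local `Date` components instead of slicing a UTC string avoids the off-by-one.

```javascript
// Build a YYYY-MM-DD key from the *local* date components.
// Slicing date.toISOString() instead can land on the wrong day once
// the local UTC offset changes, e.g. when DST starts.
function localDayKey(date) {
  const pad = (n) => String(n).padStart(2, "0");
  return `${date.getFullYear()}-${pad(date.getMonth() + 1)}-${pad(date.getDate())}`;
}
```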

&lt;h3&gt;
  
  
  &lt;code&gt;Fix calendar generation error&lt;/code&gt; - Feb 10
&lt;/h3&gt;

&lt;p&gt;This was the calendar end date generation I discussed above.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;Fix long calendar title issue&lt;/code&gt; - Feb 11
&lt;/h3&gt;

&lt;p&gt;On smaller devices, the title wouldn't truncate, so it'd just overflow to the next line and cut the calendar's space in half. I fixed this by, well, allowing the title to truncate with an ellipsis.&lt;/p&gt;
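For reference, the standard CSS one-line truncation pattern looks like this (the class name is illustrative; if the project uses Tailwind, its `truncate` utility applies the same three rules):

```css
/* Clamp the calendar title to a single line with an ellipsis. */
.calendar-title {
  overflow: hidden;
  text-overflow: ellipsis;
  white-space: nowrap;
}
```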

&lt;h3&gt;
  
  
  &lt;code&gt;Swap course id for code&lt;/code&gt; - Feb 11
&lt;/h3&gt;

&lt;p&gt;Pretty self-explanatory: I found that showing the auto-generated course ID instead of the actual course code wasn't very helpful, especially as you could just see the ID in the URL if you really needed it.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;Fix stylings and forms on WebKit&lt;/code&gt; - Feb 11
&lt;/h3&gt;

&lt;p&gt;This was an interesting one. On my phone and my Mac, the forms and a lot of the icons had a weird grey tint to them.&lt;/p&gt;

&lt;p&gt;I fixed this by forcing the icons to be white on WebKit-based browsers. It's more of a band-aid fix, but it was perfect for the hackathon. It'd only become an issue if I implement dark mode, which I could fix by flipping the colour when dark mode is enabled.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;Make date editable on mobile&lt;/code&gt; - Feb 13
&lt;/h3&gt;

&lt;p&gt;In testing I noticed that on mobile you could only add events to dates already existing in the calendar.&lt;/p&gt;

&lt;p&gt;I fixed this by just adding a small mobile-only button to change the date selected right in the events popup itself.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;Add descriptive auth errors&lt;/code&gt; - Feb 14
&lt;/h3&gt;

&lt;p&gt;If Firebase errored out on authentication, it used to just say that something went wrong.&lt;/p&gt;

&lt;p&gt;I fixed this by checking what the error was, comparing it against the docs and then showing a more helpful message.&lt;/p&gt;
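The mapping ends up being a simple lookup. A sketch - the error codes are real Firebase Auth codes, but the messages (and whatever the app actually shows) are illustrative:

```javascript
// Map Firebase Auth error codes (err.code) to friendlier messages.
// The codes are real Firebase codes; the wording is illustrative.
const AUTH_ERRORS = {
  "auth/invalid-email": "That email address doesn't look valid.",
  "auth/email-already-in-use": "An account with that email already exists.",
  "auth/weak-password": "Please choose a stronger password.",
  "auth/user-not-found": "No account was found with that email.",
};

function describeAuthError(err) {
  // Fall back to the old generic message for anything unmapped.
  return AUTH_ERRORS[err.code] ?? "Something went wrong. Please try again.";
}
```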

&lt;h2&gt;
  
  
  Filming the Demo Video
&lt;/h2&gt;

&lt;p&gt;I knew from the beginning that I wanted my demo video to have two main parts: a scripted, fluid segment that showed all the features quickly and without mistakes, and then a more off-the-cuff walkthrough going through the project and showing all the features one by one in a more comprehensive way. So that's what I did. I wrote the script, grabbed some B-roll from Pexels, recorded myself going through the website, and then stitched it all together in DaVinci Resolve.&lt;/p&gt;

&lt;p&gt;And then I just recorded the off-the-cuff demo. No preparation apart from just having a rough idea of what I needed to go through.&lt;/p&gt;

&lt;h2&gt;
  
  
  Next Steps
&lt;/h2&gt;

&lt;p&gt;The project isn't 100% done. In fact, in my Devpost README I called it an MVP - because that's exactly what it is. I built it quickly, specifically to meet the hackathon deadline. It has the features necessary to improve student life measurably, but I could do more to fully flesh out WeRelaySyllabus. Here's what I'd do:&lt;/p&gt;

&lt;p&gt;A live ICS feed. A subscribable calendar URL is generally the standard when it comes to calendar syncing, but I felt a one-time export was better because of how quickly these calendars change - calendar clients often cache subscribed feeds and can take ages to actually pick up updates. Still, this would remove that subtle final friction: the click on the Export button.&lt;/p&gt;

&lt;p&gt;Maybe OCR for uploading syllabi directly? While I couldn't have achieved that with Firebase alone, it's very much possible with other major cloud providers - AWS offers it via Textract. It would definitely be a very interesting addition!&lt;/p&gt;

&lt;p&gt;For now though, thank you so much for reading. I hope you learned something interesting!&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Written by Hamd Waseem (14)&lt;/em&gt;&lt;/p&gt;

</description>
      <category>hackathon</category>
      <category>nextjs</category>
      <category>build4students</category>
      <category>hackathonwinner</category>
    </item>
    <item>
      <title>About Me</title>
      <dc:creator>Hamd Waseem</dc:creator>
      <pubDate>Sat, 02 Aug 2025 15:46:29 +0000</pubDate>
      <link>https://forem.com/hamdivazim/about-me-5a77</link>
      <guid>https://forem.com/hamdivazim/about-me-5a77</guid>
      <description>&lt;p&gt;This is written for my about page on my v4 portfolio, at &lt;a href="https://hamdivazim.hamdtel.co.uk/about" rel="noopener noreferrer"&gt;https://hamdivazim.hamdtel.co.uk/about&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Hey 👋 I'm Hamd Waseem. I've been coding since I was six, and now I'm one of the youngest people to hold three AWS Certifications. I have experience in Python, web development, cloud computing and game development. This website is my home to showcase the projects I've made, and to write blog posts that teach others what I have learnt. I want to learn more about software engineering with Python, Swift, Next.js, Unity and so much more, and I want to make it easy for you to learn as well, in bite-sized posts.&lt;/p&gt;

&lt;h2&gt;
  
  
  Background &amp;amp; Skills
&lt;/h2&gt;

&lt;p&gt;I first learned Swift when I was six. I built simple games and programs before moving onto Python. I continued learning by coming up with random project ideas, and sitting down to make them.&lt;/p&gt;

&lt;p&gt;I never really followed tutorials that walked you step by step through projects - it didn’t feel like real learning to me. I instead made what I felt like making, and when I faced an error I simply searched it up and learnt how to fix it.&lt;/p&gt;

&lt;p&gt;I learnt Tkinter, Flask, Django, Kivy, PyTorch, Matplotlib and many more. At around the same time, I was also learning HTML, CSS and JS.&lt;/p&gt;

&lt;p&gt;At 13, I decided to pursue AWS. I learnt how the cloud works and how to use AWS effectively, and then earned the AWS Developer Associate, Solutions Architect and AI Practitioner certifications.&lt;/p&gt;

&lt;p&gt;And since then, I've been learning React.js and Next.js, Unity &amp;amp; Godot and building my portfolio and blog.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Blog
&lt;/h2&gt;

&lt;p&gt;My &lt;a href="https://hamdivazim.hamdtel.co.uk/blog" rel="noopener noreferrer"&gt;blog&lt;/a&gt; aims to host short but helpful posts, with guidance on how to solve issues, and how to use various frameworks. I will talk about programming languages such as Python and major frameworks. I will also discuss cloud concepts with AWS and in general. Expect useful tips and tricks on everything related to building software.&lt;/p&gt;

&lt;p&gt;I also post videos on my YouTube at &lt;a href="https://youtube.com/@hamdivazim" rel="noopener noreferrer"&gt;@hamdivazim&lt;/a&gt;. I post walkthroughs for programming in Python and Next.js and through AWS consoles, explaining key concepts in a digestible way.&lt;/p&gt;

&lt;h2&gt;
  
  
  Projects
&lt;/h2&gt;

&lt;p&gt;I’ve worked on a range of projects, from a simple e-commerce app for iOS to a Google data analysis notebook in Python, and even a fully scalable architecture with AWS! A full list of both my released projects and upcoming projects is at &lt;a href="https://hamdivazim.hamdtel.co.uk/projects" rel="noopener noreferrer"&gt;https://hamdivazim.hamdtel.co.uk/projects&lt;/a&gt;, so feel free to check them out, or check out my GitHub at &lt;a href="https://github.com/hamdivazim" rel="noopener noreferrer"&gt;hamdivazim&lt;/a&gt;!&lt;/p&gt;

&lt;h2&gt;
  
  
  Contact
&lt;/h2&gt;

&lt;p&gt;I'm available via email at &lt;a href="mailto:hamd.waseem@hamdtel.co.uk"&gt;hamd.waseem@hamdtel.co.uk&lt;/a&gt; or &lt;a href="mailto:hamdi.vazim@gmail.com"&gt;hamdi.vazim@gmail.com&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Thank you so much for exploring my portfolio! Feel free to check out my &lt;a href="https://hamdivazim.hamdtel.co.uk/blog" rel="noopener noreferrer"&gt;blog&lt;/a&gt;, my &lt;a href="https://hamdivazim.hamdtel.co.uk/projects" rel="noopener noreferrer"&gt;latest projects&lt;/a&gt;, my &lt;a href="https://hamdivazim.hamdtel.co.uk/devlog" rel="noopener noreferrer"&gt;devlog&lt;/a&gt; or if you'd like to get in touch, email me via the links above. I'd love to hear from you!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>I Became An AWS Certified Developer Associate (at 13) - An Overview</title>
      <dc:creator>Hamd Waseem</dc:creator>
      <pubDate>Fri, 01 Aug 2025 22:26:57 +0000</pubDate>
      <link>https://forem.com/hamdivazim/i-became-an-aws-certified-developer-associate-at-13-an-overview-1o23</link>
      <guid>https://forem.com/hamdivazim/i-became-an-aws-certified-developer-associate-at-13-an-overview-1o23</guid>
      <description>&lt;p&gt;View my badge on &lt;a href="https://www.credly.com/badges/d3f9c9c2-9c58-436e-a0a2-950746da9a2d" rel="noopener noreferrer"&gt;Credly&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;About a month ago, I achieved something exciting - I became an AWS Certified Developer Associate at 13 (you can read more about my own certification journey at &lt;a href="https://hamdivazim.hamdtel.co.uk/blog/how-i-became-an-aws-developer-associate-at-age-13-5gi8" rel="noopener noreferrer"&gt;https://hamdivazim.hamdtel.co.uk/blog/how-i-became-an-aws-developer-associate-at-age-13-5gi8&lt;/a&gt;). The journey wasn’t easy, but it helped me understand the uses and the power of cloud computing. If you’re a developer or just curious about AWS, this post will break down the essentials and show why learning the Cloud can be a game-changer.&lt;/p&gt;

&lt;p&gt;People often think of cloud storage as the only thing the Cloud can offer, but there is so much more to it. Anyone who makes any type of program, website or game would really benefit from knowing how the Cloud works - and, of course, from using it to their advantage.&lt;/p&gt;

&lt;p&gt;There are many providers who offer professional services in the Cloud, such as Amazon Web Services (AWS), Microsoft Azure, Google Cloud, Oracle Cloud and many others. I chose to study AWS as it is the leading cloud provider.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is AWS?
&lt;/h2&gt;

&lt;p&gt;As previously mentioned, AWS is the leading cloud provider; they offer a vast breadth of services, including, but not limited to, compute, storage, databases, machine learning, analytics, and networking capabilities. AWS is designed to be scalable, secure, and cost-effective, enabling individuals and businesses of all sizes to develop, deploy, and manage their applications in the cloud.&lt;/p&gt;

&lt;p&gt;While many people think of AWS and the Cloud in general solely as a platform for cloud storage, its capabilities go far beyond that. It supports everything from building complex web applications and mobile apps to setting up serverless architectures and managing large-scale enterprise infrastructures. It provides all the tools developers need to architect cloud solutions for their apps.&lt;/p&gt;

&lt;p&gt;AWS offers a lot of services, and they can do various things in the cloud, such as computing, storage, governance and more. AWS currently has... over 200 services! But you’ll only use a handful for most applications - this guide covers the essentials.&lt;/p&gt;

&lt;h2&gt;
  
  
  What Will You Learn?
&lt;/h2&gt;

&lt;p&gt;Most services in AWS are scoped to regions (a handful, such as IAM, CloudFront and Route 53, are global). AWS has infrastructure all over the world and you can choose where you launch. Examples include us-east-1, eu-west-2, ap-south-1, and more. Each region is also split into Availability Zones - smaller, physically separate zones spread throughout the region - and you are meant to split your infrastructure and deploy standby units across them. The whole concept promotes the idea of High Availability, which essentially means your app can withstand the failure of any one Availability Zone.&lt;/p&gt;

&lt;p&gt;Loosely, you can group each service you need to know into categories:&lt;/p&gt;

&lt;h3&gt;
  
  
  Compute
&lt;/h3&gt;

&lt;p&gt;These services are based around launching a machine on AWS and using it to do a task. This could really be anything, including running a job that requires a lot of power, connecting to other machines, or even hosting your whole application (using web servers such as Apache or Nginx). This set of services is often considered fundamental to AWS, as every other task (even a "serverless" one) relies on a compute layer under the hood.&lt;/p&gt;

&lt;p&gt;EC2 (Elastic Compute Cloud) is a pay-per-hour service that lets you rent compute instances to do anything you need - you get full control over a virtual machine, just like a physical server, but not one you physically own (and pay to maintain). You can also choose exactly how much compute power, memory, or even network optimisation you want. It is the most important service in AWS and you need to know it well.&lt;/p&gt;

&lt;p&gt;ECS and EKS (Elastic Container Service and Elastic Kubernetes Service) let you manage containers in the AWS Cloud by choosing the instance type to deploy your container on. You can learn more about containers at &lt;a href="https://hamdivazim.hamdtel.co.uk/posts/148" rel="noopener noreferrer"&gt;https://hamdivazim.hamdtel.co.uk/posts/148&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You can also deploy most compute services behind ELBs (Elastic Load Balancers), which spread load over multiple instances (often used in highly available architectures by deploying an equal number of instances in each Availability Zone and distributing load among them), and in ASGs (Auto Scaling Groups), which scale the number of instances up or down depending on a specific metric (such as CPU utilisation, memory, etc.).&lt;/p&gt;

&lt;p&gt;AWS Lambda also allows you to execute arbitrary pieces of code in the cloud without worrying about launching or managing infrastructure. AWS takes care of the memory allocation and compute capacity; you just provide the code. This is useful for small pieces of code where deploying a whole EC2 instance would be overkill, and for when your local machine is not powerful enough and you need an easy, cost-efficient way to run code in the cloud.&lt;/p&gt;
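Concretely, "just provide the code" means writing a handler function. A minimal Node.js Lambda handler looks something like this (the event shape here is illustrative):

```javascript
// Minimal AWS Lambda handler (Node.js runtime). AWS invokes the exported
// handler with the triggering event; you never launch or patch the
// machine it runs on.
const handler = async (event) => {
  const name = event.name ?? "world";
  return {
    statusCode: 200,
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};

// In a deployed function this would be exported, e.g.:
// module.exports = { handler };
```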

&lt;h3&gt;
  
  
  Storage
&lt;/h3&gt;

&lt;p&gt;Storage in AWS is also important, with many services offering it. S3 (Simple Storage Service) allows you to simply store files in the cloud, without managing the infrastructure required - you get virtually unlimited space! EBS (Elastic Block Store) allows you to attach block storage volumes to EC2 instances; you choose how much storage you want and how high you want IOPS (Input/Output Operations Per Second) to be. EFS (Elastic File System) allows you to create a Network File System (NFS) that gives many EC2 instances, or machines in general, access to the same files at the same time over a network.&lt;/p&gt;

&lt;h3&gt;
  
  
  Database
&lt;/h3&gt;

&lt;p&gt;AWS provides both relational and non-relational database options. For traditional relational databases, there’s RDS (Relational Database Service), which supports engines like MySQL, PostgreSQL, Oracle, and SQL Server. Aurora takes it further by offering high-performance, MySQL- and PostgreSQL-compatible databases. For NoSQL needs, DynamoDB delivers fast and predictable performance at scale, while DocumentDB is tailored for JSON-based document storage. If you’re dealing with data warehousing, Redshift offers a fully managed, petabyte-scale solution. You can also simply host a database on an EC2 instance.&lt;/p&gt;

&lt;h3&gt;
  
  
  Networking &amp;amp; Content Delivery
&lt;/h3&gt;

&lt;p&gt;Efficient networking is key in the cloud. VPC (Virtual Private Cloud) lets you provision a logically isolated section of the AWS Cloud, while Route 53 provides scalable Domain Name System (DNS) services. For delivering content globally, CloudFront serves as a Content Delivery Network (CDN) that accelerates your website, and API Gateway facilitates the creation, deployment, and management of secure APIs. These services ensure that your applications are fast, secure, and highly available.&lt;/p&gt;

&lt;h3&gt;
  
  
  Security &amp;amp; Identity
&lt;/h3&gt;

&lt;p&gt;Security is a fundamental pillar of AWS. IAM (Identity and Access Management) allows you to manage access to services and resources securely. Complementary services such as Cognito handle user authentication for mobile and web applications, and AWS Shield along with WAF (Web Application Firewall) protect your applications against common web exploits and Distributed Denial of Service (DDoS) attacks. Additional tools like GuardDuty continuously monitor for malicious activity, adding another layer of defence.&lt;/p&gt;

&lt;p&gt;KMS (Key Management Service) also allows you to manage encryption keys in the cloud. The core concept behind KMS is envelope encryption: you encrypt your object using a data key and then encrypt the data key with a master key, leaving an encrypted object and an encrypted key. Anyone with access to the master key can retrieve the data key and decrypt the object.&lt;/p&gt;

&lt;h3&gt;
  
  
  Management &amp;amp; Governance
&lt;/h3&gt;

&lt;p&gt;To effectively monitor and control your AWS environment, a suite of management tools is available. CloudWatch enables monitoring of resources and applications, while CloudTrail logs API calls for auditing purposes. CloudFormation simplifies resource provisioning through infrastructure as code, and AWS Config helps track configuration changes. These services work together to provide a comprehensive view of your infrastructure’s health and compliance.&lt;/p&gt;

&lt;h3&gt;
  
  
  Developer Tools
&lt;/h3&gt;

&lt;p&gt;AWS offers a range of services to support your development lifecycle. CodeCommit is a fully managed source control service, CodeBuild provides scalable build services, CodeDeploy automates application deployments, and CodePipeline streamlines continuous integration and continuous delivery (CI/CD). These tools are designed to simplify the development process and help you deliver software quickly and reliably. CloudFormation also allows you to write YAML or JSON to automatically launch infrastructure quickly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Analytics
&lt;/h3&gt;

&lt;p&gt;For those working with large volumes of data, AWS offers powerful analytics tools. Kinesis streams real-time data for processing, while Athena allows you to query data in S3 using standard SQL. EMR (Elastic MapReduce) facilitates big data processing with popular frameworks like Hadoop and Spark, and QuickSight enables fast, cloud-powered business intelligence and visualisation.&lt;/p&gt;

&lt;h3&gt;
  
  
  Machine Learning And AI
&lt;/h3&gt;

&lt;p&gt;AWS makes it easier to integrate machine learning into your applications. SageMaker provides an end-to-end solution for building, training, and deploying machine learning models. Other services such as Comprehend (for natural language processing), Rekognition (for image and video analysis), Lex (for building conversational interfaces), and Polly (for text-to-speech conversion) allow you to incorporate intelligent features without needing to start from scratch.&lt;/p&gt;

&lt;h3&gt;
  
  
  Migration and Transfer
&lt;/h3&gt;

&lt;p&gt;When moving existing workloads to the cloud, AWS offers services that simplify the process. The Database Migration Service (DMS) helps transfer your databases to AWS with minimal downtime. Server Migration Service (SMS) automates the migration of on-premises workloads, and solutions like Snowball (and its larger counterpart, Snowmobile) are available for transferring large amounts of data physically when network transfer isn’t practical.&lt;/p&gt;

&lt;h3&gt;
  
  
  Application Integration
&lt;/h3&gt;

&lt;p&gt;Modern applications often require seamless communication between different components. AWS facilitates this with services such as SQS (Simple Queue Service) for message queuing, SNS (Simple Notification Service) for push messaging, and Step Functions, which orchestrate workflows and help manage multi-step applications. These services are crucial for building decoupled and resilient architectures.&lt;/p&gt;

&lt;p&gt;The AWS Developer Associate Exam is mainly focused on knowing ‘serverless’ architectures - where you do not have to manage the underlying infrastructure yourself. All you do is tell AWS that you need to do a task, like execute some code, and you don’t need to pay or manage the actual machine that will run the task. The main services that are serverless include Lambda, DynamoDB, API Gateway, Cognito, S3 and Fargate (integrates with ECS and EKS to provide serverless container management). There are of course many others but you only need to know these in detail.&lt;/p&gt;

&lt;h2&gt;
  
  
  My Certification Path - And What Yours Should Look Like!
&lt;/h2&gt;

&lt;h3&gt;
  
  
  My Path
&lt;/h3&gt;

&lt;p&gt;When I first decided to pursue the AWS Developer Associate certification, I knew it would be a challenge. The exam covers a broad range of topics: building serverless applications, managing networking and security, performing governance, correctly deploying applications, migrating from on-premises to the cloud, analytics on AWS ...&lt;/p&gt;

&lt;p&gt;Since I was already experienced in programming, I focused on deepening my knowledge in areas like IAM permissions, Lambda integrations, and API Gateway.&lt;/p&gt;

&lt;p&gt;I started by using the AWS Free Tier to experiment with different services, setting up real-world projects rather than just reading theory. I deployed a few applications using S3, Lambda, and DynamoDB, and learned first-hand about optimising cost and performance. My main project was the budget management app Budgetly (which I am in the process of completing), which I started solely as a way to practise applying what I learnt about serverless services on AWS.&lt;/p&gt;

&lt;p&gt;Preparing for the exam required discipline, as I balanced my studies with schoolwork. I used a combination of official AWS documentation, practice exams, and hands-on projects to reinforce my understanding. Eventually, I was ready to sit the exam, and I passed on my first attempt! You can read more at &lt;a href="https://hamdivazim.hamdtel.co.uk/blog/how-i-became-an-aws-developer-associate-at-age-13-5gi8" rel="noopener noreferrer"&gt;https://hamdivazim.hamdtel.co.uk/blog/how-i-became-an-aws-developer-associate-at-age-13-5gi8&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This achievement is just the beginning for me, however. Now that I’ve gained this certification, I’m planning to deepen my AWS expertise by working on real-world projects, integrating AWS into my personal development work, and pursuing more advanced AWS certifications in the future :)&lt;/p&gt;

&lt;h3&gt;
  
  
  Your Path
&lt;/h3&gt;

&lt;p&gt;The three things I outlined in my other blog post were practice labs, practice tests and discipline. So I believe a proper path to becoming an AWS Certified Developer Associate should look like the following:&lt;/p&gt;

&lt;h3&gt;
  
  
  Train Using Documentation
&lt;/h3&gt;

&lt;p&gt;AWS has some of the best documentation of any cloud provider, or indeed in tech generally. It is to the point, provides many examples, and covers every feature of every service. Sure, a guided study course could help you, but I found the best way to learn was simply to read what AWS wrote. No one knows AWS better than AWS themselves!&lt;/p&gt;

&lt;h3&gt;
  
  
  Perform As Many Practice Labs As You Can
&lt;/h3&gt;

&lt;p&gt;The best resource out there for practising your skills in AWS is AWS itself. My goal with this certification was to be able to build and launch the infrastructure I needed for my own app myself, and that's exactly what I did. If you don't have prior coding experience, you can just borrow code from someone else - when I practised ECS I used someone else's container from Docker Hub, and it worked just fine without me having to understand the Docker-specific nuances.&lt;/p&gt;

&lt;h3&gt;
  
  
  Do Practice Questions
&lt;/h3&gt;

&lt;p&gt;After doing a lot of these, I understood how the questions are laid out and how the answer options usually try to trip you up, and I cannot overstate how much this helped me during the real exam. There are also often small features that you may not have seen in a course or in the documentation, and getting them wrong in a practice exam helps iron out those gaps. Besides, anyone would much rather get a question wrong in a practice test than in the real thing.&lt;/p&gt;

&lt;p&gt;To summarise, the AWS Certified Developer Associate certification is one worth getting. If you are a programmer who currently leaves backend compute, databases, application integration, or deployment to managed services such as Vercel, Netlify or Railway, you are giving up a lot of flexibility, and knowing how to architect it yourself will really help you in the long run.&lt;/p&gt;

&lt;p&gt;Thanks for reading :)&lt;/p&gt;

&lt;p&gt;&lt;em&gt;- Hamd Waseem (13)&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aws</category>
      <category>awscertified</category>
      <category>awscertifiedat13</category>
    </item>
    <item>
      <title>How I made a Personal Finance Tracker using ReactJS and AWS</title>
      <dc:creator>Hamd Waseem</dc:creator>
      <pubDate>Fri, 01 Aug 2025 22:21:24 +0000</pubDate>
      <link>https://forem.com/hamdivazim/how-i-made-a-personal-finance-tracker-using-reactjs-and-aws-4393</link>
      <guid>https://forem.com/hamdivazim/how-i-made-a-personal-finance-tracker-using-reactjs-and-aws-4393</guid>
      <description>&lt;p&gt;I made a personal finance tracker that used ReactJS, Firebase (for authentication) and various AWS services such as API Gateway, DynamoDB and Lambda. You can check it out at &lt;a href="https://budgetly.hamdtel.co.uk" rel="noopener noreferrer"&gt;https://budgetly.hamdtel.co.uk&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;I first started by quickly putting together a basic login UI with ReactJS and Tailwind CSS.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F24ohospb4dtqtlc2eqyq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F24ohospb4dtqtlc2eqyq.png" alt=" " width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I then started thinking about how I should architect the backend of my app. I wanted to familiarise myself with basic AWS services first, so I decided to use Firebase over Cognito for now (although in my next project I will learn and use Cognito). I created my Firebase project and added a web app and a service account so I could communicate with it from my frontend. Then I created a new test user in the authentication service, installed the &lt;code&gt;firebase&lt;/code&gt; client npm package into my React project (the &lt;code&gt;firebase-admin&lt;/code&gt; SDK is for server-side code, like the Lambda functions later) and created &lt;code&gt;firebase-config.js&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { initializeApp } from 'firebase/app';
import { getAuth } from 'firebase/auth';

const firebaseConfig = {
  apiKey: process.env.REACT_APP_FIREBASE_API_KEY,
  authDomain: process.env.REACT_APP_FIREBASE_AUTH_DOMAIN,
  projectId: process.env.REACT_APP_FIREBASE_PROJECT_ID,
  storageBucket: process.env.REACT_APP_FIREBASE_STORAGE_BUCKET,
  messagingSenderId: process.env.REACT_APP_FIREBASE_MESSAGING_SENDER_ID,
  appId: process.env.REACT_APP_FIREBASE_APP_ID
};

const app = initializeApp(firebaseConfig);
export const auth = getAuth(app);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;I used environment variables in the configuration file for security. This code initialised Firebase any time I imported it into a different file and since the only thing I really needed it for was authentication, I only exported that service so other files could use it.&lt;/p&gt;

&lt;p&gt;I was originally planning to just store the UUID of the user in cookies, but that is not very secure, so I looked for an alternative. Typically when a user signs in, a 'session token' or 'ID token' is returned. This is temporary, expires after some time, and can be exchanged for the UUID if you have the service account key. I therefore stored the ID token in cookies and, for testing purposes, printed it to the console on successful login.&lt;/p&gt;
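&lt;p&gt;For illustration, a Firebase ID token is a JWT, so its payload can be inspected in Node.js like the sketch below. Note this decodes &lt;em&gt;without&lt;/em&gt; verifying the signature, so it must never be trusted on the server side - only server-side verification (as the Lambda functions later do) proves anything. The demo token here is hand-built, not a real Firebase token:&lt;/p&gt;

```javascript
// A JWT is header.payload.signature, each segment base64url-encoded.
// Decoding the payload reveals claims like the user id, but proves nothing.
function decodeJwtPayload(token) {
  const payload = token.split('.')[1];
  return JSON.parse(Buffer.from(payload, 'base64url').toString('utf8'));
}

// Demo with a hand-built token (NOT a real Firebase token):
const demoPayload = Buffer.from(JSON.stringify({ uid: 'user-123' })).toString('base64url');
const demoToken = 'xxx.' + demoPayload + '.yyy';
console.log(decodeJwtPayload(demoToken).uid); // prints "user-123"
```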

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3chsywloyy0mfk0pdin.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fg3chsywloyy0mfk0pdin.png" alt=" " width="800" height="302"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now I needed to use this token to retrieve user data (expenses, incomes, subscriptions etc.). Before launching anything on AWS, I used Figma to map out how I would store data in databases and read and write to it efficiently. I decided that roughly the latest 3 months of database records would be retrieved on login and stored in cookies as a cache, so that database queries could be kept to a minimum; if needed, the application could retrieve older records and add them to the cache. Given this, the natural way to retrieve records was an endpoint that takes the ID token, a start date and an end date, and returns all database records for that user between the two dates.&lt;/p&gt;
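&lt;p&gt;As a sketch of that design, a small helper could compute the default three-month window to fetch on login. The function name and ISO-style date format are my own assumptions for illustration:&lt;/p&gt;

```javascript
// Compute the default "last ~3 months" window used to seed the cache on
// login. Dates are formatted YYYY-MM-DD to match string-sortable sort keys.
// (Hypothetical helper; setMonth can overflow at month ends, e.g. 31st.)
function defaultCacheWindow(now) {
  const end = new Date(now);
  const start = new Date(now);
  start.setMonth(start.getMonth() - 3);
  return {
    startDate: start.toISOString().slice(0, 10),
    endDate: end.toISOString().slice(0, 10),
  };
}

console.log(defaultCacheWindow(new Date('2025-06-15T00:00:00Z')));
// { startDate: '2025-03-15', endDate: '2025-06-15' }
```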

&lt;p&gt;Now it was time to start deploying the server-side infrastructure. I started by deploying the database, for which I chose DynamoDB. I used &lt;code&gt;userId&lt;/code&gt; as the partition key and &lt;code&gt;transactionDate&lt;/code&gt; as the sort key, as these two attributes are what I would query on.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkedguivq6qe17lhca2uo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkedguivq6qe17lhca2uo.png" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I made a quick example record that I could use to make sure that I could read from the table.&lt;/p&gt;

&lt;p&gt;The next step was to create the Lambda function that when invoked could take the three parameters I mentioned earlier (&lt;code&gt;idToken&lt;/code&gt;, &lt;code&gt;startDate&lt;/code&gt;, &lt;code&gt;endDate&lt;/code&gt;) and return the correct records from the database. I made the function and started to write the code for it on my local computer.&lt;/p&gt;

&lt;p&gt;I first installed the AWS SDK to communicate with DynamoDB and then &lt;code&gt;firebase-admin&lt;/code&gt; so I could swap the ID Token for the UUID. Then I wrote the code inside &lt;code&gt;index.js&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const admin = require('firebase-admin');
const dynamoDB = new AWS.DynamoDB.DocumentClient();
const serviceAccount = require('./sak.json');

// Initialise firebase-admin once per container so verifyIdToken works
if (!admin.apps.length) {
    admin.initializeApp({
        credential: admin.credential.cert(serviceAccount)
    });
}

exports.handler = async (event) =&amp;gt; {
    const { idToken, startDate, endDate } = event.queryStringParameters;

    try {
        const decodedToken = await admin.auth().verifyIdToken(idToken);
        const uuid = decodedToken.uid;

        if (new Date(startDate) &amp;gt; new Date(endDate)) {
            return {
                statusCode: 400,
                body: JSON.stringify({ error: 'startDate must be less than or equal to endDate' }),
            };
        }

        const params = {
            TableName: 'Budgetly',
            KeyConditionExpression: 'userId = :userId AND transactionDate BETWEEN :startDate AND :endDate',
            ExpressionAttributeValues: {
                ':userId': uuid,
                ':startDate': startDate,
                ':endDate': endDate,
            },
        };

        const data = await dynamoDB.query(params).promise();

        return {
            statusCode: 200,
            body: JSON.stringify(data.Items),
        };
    } catch (e) {
        console.error(e);
        return {
            statusCode: 500,
            body: JSON.stringify({ error: e }),
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code took those three parameters and first used the Firebase SDK to retrieve the UUID by using the ID Token. Then it queried the database using the condition expression &lt;code&gt;userId = :userId AND transactionDate BETWEEN :startDate AND :endDate&lt;/code&gt;. The final thing to do before testing it was to create an API that could be used to invoke the function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7mcv7uy633fc9kwh5j4b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7mcv7uy633fc9kwh5j4b.png" alt=" " width="520" height="648"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I made a GET route called &lt;code&gt;/data&lt;/code&gt; that used the Lambda integration type. When accessed, it would expect the three parameters as query string parameters (e.g. &lt;code&gt;/data?idToken=&amp;lt;ID_TOKEN&amp;gt;&amp;amp;startDate=2024-01-01&amp;amp;endDate=2024-12-31&lt;/code&gt;) and pass them to the Lambda function.&lt;/p&gt;
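&lt;p&gt;A client-side call could then build that request URL along these lines. This is a minimal sketch: the base URL and function name are placeholders, and only the three query parameters match the Lambda's actual inputs:&lt;/p&gt;

```javascript
// Build the /data request URL from the three inputs the Lambda expects.
// URLSearchParams handles percent-encoding of the (long) ID token for us.
function buildDataUrl(baseUrl, idToken, startDate, endDate) {
  const qs = new URLSearchParams({ idToken, startDate, endDate });
  return baseUrl + '/data?' + qs.toString();
}

console.log(buildDataUrl('https://api.example.com', 'TOKEN', '2024-01-01', '2024-12-31'));
// https://api.example.com/data?idToken=TOKEN&startDate=2024-01-01&endDate=2024-12-31
```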

&lt;p&gt;However when testing it, it kept failing. I then realised that I forgot to change the Lambda execution role's policy to allow DynamoDB queries. It already had CloudWatch permissions (so it could perform logging) so I added the &lt;code&gt;dynamodb:Query&lt;/code&gt; permission.&lt;/p&gt;
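&lt;p&gt;The added statement would look something along these lines (a sketch only: the region, account ID and exact table ARN below are placeholders, not values from this project):&lt;/p&gt;

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "dynamodb:Query",
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/Budgetly"
    }
  ]
}
```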

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2h8y4bh9lhak7xr4en4h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2h8y4bh9lhak7xr4en4h.png" alt=" " width="800" height="275"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After changing that, I got the following result in Postman:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6pap8s89ohch3ltyp3g6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6pap8s89ohch3ltyp3g6.png" alt=" " width="800" height="303"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The reading operation now worked! I could provide an &lt;code&gt;idToken&lt;/code&gt; and a timeframe and I would get the correct records!&lt;/p&gt;

&lt;p&gt;I now had to add to the API and Lambda to add support for writing records as well. I did it in mostly the same way: in the parameters I would have to provide the ID Token, the amount, the name and the category.&lt;/p&gt;

&lt;p&gt;I created a second Lambda function for this and wrote the code on my computer again. I installed the AWS SDK and the Firebase SDK again as I would need to use both. Then I wrote the code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const admin = require('firebase-admin');
const dynamoDB = new AWS.DynamoDB.DocumentClient();
const serviceAccount = require('./sak.json');

if (!admin.apps.length) {
    admin.initializeApp({
        credential: admin.credential.cert(serviceAccount)
    });
}

exports.handler = async (event) =&amp;gt; {
    const { idToken, amnt, category, name } = event.queryStringParameters;

    try {
        const decodedToken = await admin.auth().verifyIdToken(idToken);
        const uuid = decodedToken.uid;

        const currentDate = new Date().toISOString();

        const params = {
            TableName: 'Budgetly', 
            Item: {
                userId: uuid,
                transactionDate: currentDate,
                amount: +amnt,
                category: category,
                name: name,
            },
        };

        await dynamoDB.put(params).promise();

        return {
            statusCode: 200,
            body: JSON.stringify({ message: "Data successfully written" })
        };
    } catch (e) {
        console.error(e);
        return {
            statusCode: 500,
            body: JSON.stringify({ error: e })
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code was also very similar to the other Lambda function except that it used the &lt;code&gt;PutItem&lt;/code&gt; API instead of the &lt;code&gt;Query&lt;/code&gt; API. &lt;/p&gt;

&lt;p&gt;After uploading the code to my Lambda function I then remembered to change the execution role's policy to allow &lt;code&gt;dynamodb:PutItem&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqnwopxdhse1cj36a5pc9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqnwopxdhse1cj36a5pc9.png" alt=" " width="800" height="333"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I then went back to API Gateway to create a second route for writing to the database:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wicg2si6waub18kqrmq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7wicg2si6waub18kqrmq.png" alt=" " width="800" height="164"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I also attached the Lambda integration:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo0xibopom3rfs2atv9nh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo0xibopom3rfs2atv9nh.png" alt=" " width="800" height="405"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After finishing that I went to Postman to test that the API worked... and it didn't:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjm86h8ameknta3p9e5gu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjm86h8ameknta3p9e5gu.png" alt=" " width="800" height="306"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;My code was meant to return a detailed error message if something went wrong, but it didn't, so the error was likely coming from improper configuration rather than the code itself.&lt;/p&gt;

&lt;p&gt;I looked into the Lambda function and didn't find anything wrong, but I realised that write operations were likely to take some time - so I raised the timeout of the Lambda function from 3 seconds to 15 seconds for good measure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzygv6gzuofgd92yjbdc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhzygv6gzuofgd92yjbdc.png" alt=" " width="226" height="77"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I went back into Postman and tried again, and it worked! It only took 5 seconds for the entire process to complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2a8poj0v68ki2oby7txf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2a8poj0v68ki2oby7txf.png" alt=" " width="800" height="306"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now there was only one more thing to check: I went into DynamoDB and, sure enough, the record was now saved!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjf1b1eczfong47v9sy1e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjf1b1eczfong47v9sy1e.png" alt=" " width="800" height="255"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By this point most of the server-side work was complete. I added a registration page, a very simple homepage and fixed up a lot of the routing. I also made it refresh the ID token every half an hour.&lt;/p&gt;

&lt;p&gt;Now I had to implement the 'caching' mechanism. I planned to make it so that the latest 3 months of records for that user were retrieved on login and then I would be able to add to that anytime the app retrieved records that weren't in the cache. However the cache was probably going to be too big for some users - so for good measure I also planned to use &lt;code&gt;lz-string&lt;/code&gt; to compress the data before storing it.&lt;/p&gt;
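&lt;p&gt;A rough sketch of that merge-and-dedupe idea is below. The helper and the dedupe key are my own illustration, not the app's actual code; compressing the &lt;code&gt;JSON.stringify&lt;/code&gt; result with &lt;code&gt;lz-string&lt;/code&gt; would happen just before writing the cookie:&lt;/p&gt;

```javascript
// Merge freshly fetched records into the cached set, dropping duplicates
// and keeping the result sorted by date. (Hypothetical sketch: assumes
// transactionDate + name uniquely identifies a record.)
function mergeIntoCache(cache, fetchedRecords) {
  const byKey = {};
  cache.concat(fetchedRecords).forEach(function (rec) {
    byKey[rec.transactionDate + '|' + rec.name] = rec;
  });
  return Object.values(byKey).sort(function (a, b) {
    return a.transactionDate.localeCompare(b.transactionDate);
  });
}

const cached = [{ transactionDate: '2025-05-01', name: 'Rent', amount: 500 }];
const fetched = [
  { transactionDate: '2025-04-01', name: 'Rent', amount: 500 },
  { transactionDate: '2025-05-01', name: 'Rent', amount: 500 }, // duplicate, deduped
];
console.log(mergeIntoCache(cached, fetched).length); // 2
```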

&lt;p&gt;Before going about writing the code for this I added a small part to the homepage that could allow you to write in values and submit them to be written to the database for ease of use.&lt;/p&gt;

&lt;p&gt;In the login function I added a simple fetch call, but I kept facing a very persistent error - and it didn't help that the message wasn't very descriptive...&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo5g61mtlkfcseps96ocs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo5g61mtlkfcseps96ocs.png" alt=" " width="314" height="30"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I was stumped on this for a while, because the API worked when I tested it in the browser and in Postman. I then suspected it was probably a CORS issue - I did some research and, sure enough, it was. I had faced these kinds of issues in the past (like with Flask), but I had no idea how to tackle it here. Then I went back into the AWS Console and saw a tab labelled 'CORS'! Inside it I saw this field:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdz2xcbshxyxzq7jhlibe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdz2xcbshxyxzq7jhlibe.png" alt=" " width="667" height="124"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;So it wasn't as big of an issue as I thought. I added localhost for testing purposes but would have to remember to add the production URL later.&lt;/p&gt;

&lt;p&gt;I fixed up the caching mechanism and added a loading screen to the dashboard on initial login while it waited for the &lt;code&gt;cache&lt;/code&gt; cookie to appear, as the API call, obviously, took time. I then fixed up the auth pages to look a little better and added a side panel to show the features of the app:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fziey36jf5y0viaiyf4uy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fziey36jf5y0viaiyf4uy.png" alt=" " width="800" height="394"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before continuing with designing the dashboard I remembered that I would need a system to store user metadata: data such as default currency, custom categories etc.&lt;/p&gt;

&lt;p&gt;To be as efficient as possible, I repurposed the &lt;code&gt;transactionDate&lt;/code&gt; sort key (using the literal value &lt;code&gt;"meta"&lt;/code&gt; for a special metadata record) so the same table and data API could also handle the metadata when requested:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const AWS = require('aws-sdk');
const admin = require('firebase-admin');
const dynamoDB = new AWS.DynamoDB.DocumentClient();
const serviceAccount = require('./sak.json');
const currencies = require('./currencies.json');

if (!admin.apps.length) {
    admin.initializeApp({
        credential: admin.credential.cert(serviceAccount)
    });
}

exports.handler = async (event) =&amp;gt; {
    const { idToken, locale } = event.queryStringParameters;

    try {
        const decodedToken = await admin.auth().verifyIdToken(idToken);
        const uuid = decodedToken.uid;

        const params = {
            TableName: 'Budgetly',
            Item: {
                userId: uuid,
                transactionDate: "meta",
                defaultCurrency: currencies[locale] || 'USD',
            },
        };

        await dynamoDB.put(params).promise();

        return {
            statusCode: 200,
            body: JSON.stringify({ message: "Data successfully written", currency: currencies[locale] || 'USD' })
        };
    } catch (e) {
        console.error(e);
        return {
            statusCode: 500,
            body: JSON.stringify({ error: e })
        };
    }
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So now on login I could also store the metadata in a separate cookie.&lt;/p&gt;

&lt;p&gt;I then worked on the homepage and managed to get it to look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzcdi06l8tlo4abr9e33p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzcdi06l8tlo4abr9e33p.png" alt=" " width="800" height="366"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk4vx66veyvzcmz3b9boi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk4vx66veyvzcmz3b9boi.png" alt=" " width="800" height="844"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I was happy with this design, so I had to get on with designing the onboarding page.&lt;/p&gt;

&lt;p&gt;The onboarding page would need to guide the user through the capabilities of my app, and also collect metadata such as default currency, spending limits, and savings goals.&lt;/p&gt;

&lt;p&gt;I spent some time designing the UI and this was the result:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3pbyt1zpgj05qgfy8mwh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3pbyt1zpgj05qgfy8mwh.png" alt=" " width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At this point, most of the MVP (Minimum Viable Product) was complete. All I had to do was design a UI for the homepage:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1zcz4t8dmwchs6pafvk8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1zcz4t8dmwchs6pafvk8.png" alt=" " width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, the app was ready to deploy.&lt;/p&gt;

&lt;p&gt;I have a domain on Route 53, but to save costs with deploying on Elastic Beanstalk (as this is an MVP and I didn't want to incur costs for the underlying EC2 instances), I decided to deploy it to Vercel and then update DNS records on Route 53. I would also have to remember to update the CORS policy on API Gateway.&lt;/p&gt;

&lt;p&gt;First I committed and pushed the source code to GitHub:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffa9e2cfm3khq56jj8wga.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffa9e2cfm3khq56jj8wga.png" alt=" " width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And then I used the repository to deploy to Vercel:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27lzujhagydnqbslb6cz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27lzujhagydnqbslb6cz.png" alt=" " width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now I had to add the DNS records to Route 53.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2xqtjczow2w0fcyuq102.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2xqtjczow2w0fcyuq102.png" alt=" " width="800" height="268"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6r9pfpk9im1ev57qb2br.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6r9pfpk9im1ev57qb2br.png" alt=" " width="800" height="239"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And the final step was to allow CORS on the domain:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8c1qnj6h52p8nml1qyt0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8c1qnj6h52p8nml1qyt0.png" alt=" " width="733" height="368"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And the MVP was complete!&lt;/p&gt;

&lt;p&gt;Thank you for reading!&lt;/p&gt;

&lt;p&gt;&lt;em&gt;- Hamd Waseem (14)&lt;/em&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  Also, props if you noticed the AWS console UI change! I originally started this project in October 2024 but picked it back up in April for my DofE.
&lt;/h6&gt;

</description>
      <category>react</category>
      <category>aws</category>
      <category>sideprojects</category>
    </item>
    <item>
      <title>How I Became An AWS Developer Associate At Age 13</title>
      <dc:creator>Hamd Waseem</dc:creator>
      <pubDate>Tue, 29 Jul 2025 18:44:36 +0000</pubDate>
      <link>https://forem.com/hamdivazim/how-i-became-an-aws-developer-associate-at-age-13-5gi8</link>
      <guid>https://forem.com/hamdivazim/how-i-became-an-aws-developer-associate-at-age-13-5gi8</guid>
      <description>&lt;p&gt;View My Badge on Credly: &lt;a href="https://www.credly.com/badges/d3f9c9c2-9c58-436e-a0a2-950746da9a2d" rel="noopener noreferrer"&gt;https://www.credly.com/badges/d3f9c9c2-9c58-436e-a0a2-950746da9a2d&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Recently, I earned my AWS Certified Developer Associate certificate at the age of thirteen. That possibly puts me among the youngest ever to do so (AWS’s minimum age to sit the exam is 13), and I want to tell you how you can achieve something similar. But first, you are probably asking: why did I even go for something like this?&lt;/p&gt;

&lt;h2&gt;
  
  
  My Certification Path
&lt;/h2&gt;

&lt;p&gt;To me, this seemed like a fun thing to learn when I started training a couple of months ago. I figured it could help me with the small projects I was building, so I wanted to start. My dad had also done a similar certification a few years prior, so he helped me understand the key topics I needed to know. Plus, I had noticed the growth of cloud computing as a field and wanted to learn more about it.&lt;/p&gt;

&lt;p&gt;My original plan was to just do the Cloud Practitioner exam, as it is more high-level and easier. I did study for it a lot, doing practice labs and even a few practice tests. But my dad had done the Solutions Architect Associate certification, and I wanted to know the difference.&lt;/p&gt;

&lt;p&gt;Well, in a nutshell, the Solutions Architect certification is harder and more in-depth. My dad also told me about the SysOps and Developer Associate certifications, and I was intrigued. I had already done some practice and knew a high-level overview of most of the important AWS services, so I decided to challenge myself and go straight for the Associate certs.&lt;/p&gt;

&lt;p&gt;Since my experience with cloud computing (and computing in general) had mostly been through coding, I was more inclined to pick the Developer Associate certification – so I did. I looked into the required knowledge, saw the breadth of services I needed to know inside and out, and learnt them by doing practice labs and questions. Honestly, it was hard work.&lt;/p&gt;

&lt;p&gt;But my training finally came to fruition on 21/02/25, when I took the 2 hour 40 minute, 65-question exam and passed with a score in the 800s (the scaled score is out of 1000, with 720 required to pass). Of course, I was ecstatic!&lt;/p&gt;

&lt;h2&gt;
  
  
  Study Tips
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm1eb6nkv6ttoi86szxbo.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm1eb6nkv6ttoi86szxbo.webp" alt=" " width="600" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h6&gt;
  
  
  &lt;a href="https://ieeecs-media.computer.org/wp-media/2018/07/12233357/man-with-laptop-and-abstract-image-of-connected-devices-in-the-cloud-l.jpg" rel="noopener noreferrer"&gt;https://ieeecs-media.computer.org/wp-media/2018/07/12233357/man-with-laptop-and-abstract-image-of-connected-devices-in-the-cloud-l.jpg&lt;/a&gt;
&lt;/h6&gt;

&lt;p&gt;During my revision I also looked at others’ revision tips from their own experiences, some of which I resonated with and some of which I didn’t. I believe the following three points, which I followed, should help you pass your AWS certification too, be it Developer Associate or a different one.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tip 1 – Create a Schedule and Have a Deadline In Mind!
&lt;/h3&gt;

&lt;p&gt;An issue I definitely faced was procrastination. I would often study in bursts but then leave it off for a while. I’d say this delayed my exam a bit, but deciding on a deadline for the exam really helped me. Without booking the exam (because that can be risky), I gave myself a date and a general schedule of what to do in the lead-up. Most people I saw made incredibly detailed revision schedules with timings for everything, and that could work for some people. For me, though, just setting a date and putting one sentence next to it was much better; I could write “01/02/25” and “Do a practice test and review it” next to it, and that was enough. On the day itself I could plan when to do the test and when to review it, so I still had some flexibility.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tip 2 – Labs, Labs, LABS!
&lt;/h3&gt;

&lt;p&gt;Nothing helped me more than getting practical lab exposure. I kid you not, at least 15 questions from the real exam covered things I was never specifically taught; I just had to remember them from hands-on labs I had done. One question I remember was about a CloudFormation stack that wouldn’t delete because one of its resources was failing to delete, and I had faced exactly this issue in a real lab beforehand. I was stuck unable to delete the stack for ages, but the frustration and difficulty really helped it stay in my mind, and I am pretty sure I got the answer correct in the exam. Another thing I need to emphasise is that you can follow tutorials, but make sure to do your own labs too. The labs I designed myself really helped me remember key points, and if you’re struggling to come up with ideas, you can just ask AI to help you. I can guarantee doing your own labs will help you.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tip 3 – Practice Your Knowledge!
&lt;/h3&gt;

&lt;p&gt;A lot of people on Reddit say that doing one practice test is enough. To be honest, one test isn’t enough to really root out your weak points. You can find a few free practice questions online, but I would recommend purchasing a cheap, high-quality set of practice exams so you can get that extra practice. If you don’t want to buy practice papers, you can just use AI! I used ChatGPT with the prompt "Generate 20 extremely difficult practice questions for the AWS Developer Associate Exam with a good mix of all key services" with Reasoning on, and it gave me legitimately very good questions. If there are any services you struggle with (X-Ray, for example, in my case), you can ask ChatGPT to focus on those. One way or another, I highly recommend you get some practice in, both so you know what to expect from the exam format and so you are ready for any questions that come your way.&lt;/p&gt;

&lt;p&gt;After revising using these tips, I finished all of the exam questions with an hour and a half of my two hours and forty minutes left over to review my answers. I believe sticking to a good schedule with a set (but not set-in-stone) deadline, practising with labs and doing practice questions will be enough to pass your AWS exam.&lt;/p&gt;

&lt;p&gt;A final few words of encouragement: don’t be disheartened if your preparation isn’t going well. I woke up that day feeling extremely nervous thinking I wouldn’t pass, yet here I am having passed the Developer Associate exam at 13. If I can do it, you can. Go for it!!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0z2shfdychz1cmwp0j3u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0z2shfdychz1cmwp0j3u.png" alt=" " width="600" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;– Written By Hamd Waseem (13)&lt;/em&gt;&lt;/p&gt;

</description>
      <category>awscertified</category>
      <category>aws</category>
      <category>developer</category>
    </item>
    <item>
      <title>How To Deploy a WordPress Blog with AWS</title>
      <dc:creator>Hamd Waseem</dc:creator>
      <pubDate>Tue, 29 Jul 2025 17:46:30 +0000</pubDate>
      <link>https://forem.com/hamdivazim/how-to-deploy-a-wordpress-blog-with-aws-1pcb</link>
      <guid>https://forem.com/hamdivazim/how-to-deploy-a-wordpress-blog-with-aws-1pcb</guid>
      <description>&lt;p&gt;Recently, I successfully deployed my medical blog at &lt;a href="https://blog.percura.hamdtel.co.uk" rel="noopener noreferrer"&gt;https://blog.percura.hamdtel.co.uk&lt;/a&gt; using AWS EC2 and Route 53.&lt;/p&gt;

&lt;p&gt;To achieve this, I installed WordPress along with an Apache web server on an EC2 instance. I then associated the instance with an Elastic IP address, which I used to create an A record on my Route 53 domain. To ensure the site was secure, I used Certbot to set up HTTPS for encrypted communication. Below, I'll walk you through the process so you can do the same:&lt;/p&gt;

&lt;h2&gt;
  
  
  Launching The EC2 Instance
&lt;/h2&gt;

&lt;p&gt;Log into the AWS console, navigate to EC2, and open the Launch Instance wizard.&lt;/p&gt;

&lt;p&gt;Give your instance a name and then select Ubuntu as the AMI for your instance:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3glunsssmh5kxfr05mt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn3glunsssmh5kxfr05mt.png" alt=" " width="800" height="579"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I recommend choosing t2.micro as your instance type, as it falls within the free tier; however, depending on how much traffic you expect your blog to receive, you can choose a more powerful instance (compare instance types at &lt;a href="https://instances.vantage.sh/" rel="noopener noreferrer"&gt;https://instances.vantage.sh/&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;Generate a new key pair for your instance if you haven't already, and in your network settings, make sure to tick Allow HTTPS traffic from the internet and Allow HTTP traffic from the internet.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frg0w14gtg1ur12c32ml8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frg0w14gtg1ur12c32ml8.png" alt=" " width="800" height="264"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can customise how much storage your EBS volume will have; however, keep it under 30 GB if you want to stay within the free tier.&lt;/p&gt;

&lt;p&gt;After launching your instance, you can select it and click on Connect to connect to it via EC2 Instance Connect.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F79ifywoetbsfchg86rj3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F79ifywoetbsfchg86rj3.png" alt=" " width="800" height="118"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing Required Packages onto the Instance
&lt;/h2&gt;

&lt;p&gt;We will be using the apt package manager to install what we need to serve WordPress: &lt;code&gt;apache2&lt;/code&gt; for the web server and &lt;code&gt;mysql-server&lt;/code&gt; for the database server.&lt;/p&gt;

&lt;p&gt;First, update apt so that we don't install outdated packages:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo apt update -y&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Once that is done, we are going to install Apache, a web server which we will use to serve WordPress.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo apt install apache2 -y&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The next step to install is the database server; we will use &lt;code&gt;mysql&lt;/code&gt;:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo apt install mysql-server -y&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now we need to secure the database installation, and this can be done via the command:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo mysql_secure_installation&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You will be asked questions about how you want to proceed with securing the database. You can see what I put in this &lt;a href="https://gist.github.com/hamdivazim/089c5b5de87bf951fa0597375ce9d66c" rel="noopener noreferrer"&gt;GitHub Gist&lt;/a&gt; but feel free to put what you want.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdo0ycm0vxfymo0i3ssin.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdo0ycm0vxfymo0i3ssin.png" alt=" " width="576" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As WordPress functions using PHP (and some extensions to it), we need to install it too. You can copy the following command to do so:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo apt install php libapache2-mod-php php-mysql php-curl php-json php-mbstring php-xml php-zip -y&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;We also install &lt;code&gt;libapache2-mod-php&lt;/code&gt; for PHP to work with Apache, and &lt;code&gt;php-mysql&lt;/code&gt; for PHP to work with the MySQL database.&lt;/p&gt;

&lt;p&gt;Now we can restart Apache:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo systemctl restart apache2&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting Up the MySQL Database
&lt;/h2&gt;

&lt;p&gt;We now need to setup a database for WordPress to use. First enter the MySQL editor:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo mysql&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;And then copy the following lines one by one:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE DATABASE wordpress;
CREATE USER 'wpuser'@'localhost' IDENTIFIED BY '[REPLACEWITHASECUREPASSWORD]';
GRANT ALL PRIVILEGES ON wordpress.* TO 'wpuser'@'localhost';
FLUSH PRIVILEGES;
EXIT;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The first line creates a database called &lt;code&gt;wordpress&lt;/code&gt;. The next creates a new user, &lt;code&gt;wpuser&lt;/code&gt;, which can only connect from &lt;code&gt;localhost&lt;/code&gt; (the same EC2 instance the database is hosted on), following the security best practice of least privilege. The third line grants &lt;code&gt;wpuser&lt;/code&gt; the ability to do anything on all tables of the &lt;code&gt;wordpress&lt;/code&gt; database, and &lt;code&gt;FLUSH PRIVILEGES&lt;/code&gt; reloads MySQL's in-memory privilege tables so the changes take immediate effect.&lt;/p&gt;

&lt;h2&gt;
  
  
  Install WordPress
&lt;/h2&gt;

&lt;p&gt;We are now going to download the WordPress files to our instance. We will need to get the files and then set their permissions correctly.&lt;/p&gt;

&lt;p&gt;First navigate to the web root (&lt;code&gt;/var/www/html&lt;/code&gt;) as this is where the files need to be extracted.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ cd /var/www/html&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now we need to download and extract the tarball containing WordPress:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo wget https://wordpress.org/latest.tar.gz&lt;br&gt;
$ sudo tar -xzf latest.tar.gz&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;And then move the files out of the extracted &lt;code&gt;wordpress/&lt;/code&gt; directory into the web root:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo mv wordpress/* .&lt;/code&gt;&lt;/p&gt;
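&lt;p&gt;If you're unfamiliar with the &lt;code&gt;tar&lt;/code&gt; flags, here is a quick local demonstration of the same &lt;code&gt;-xzf&lt;/code&gt; invocation on a dummy archive (the file contents are stand-ins, not the real WordPress release):&lt;/p&gt;

```shell
# Build a dummy gzipped tarball that mimics latest.tar.gz
mkdir -p wordpress
echo "dummy" > wordpress/index.php
tar -czf latest.tar.gz wordpress   # -c create, -z gzip, -f archive name
rm -r wordpress

# Same flags as in the article: -x extract, -z gunzip, -f archive name
tar -xzf latest.tar.gz
ls wordpress/index.php
```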

&lt;p&gt;To ensure that Apache can access WordPress, we need to set the proper permissions for it:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo chown -R www-data:www-data /var/www/html&lt;br&gt;
$ sudo chmod -R 755 /var/www/html&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;"sudo chown -R www-data:www-data /var/www/html"&lt;/code&gt; gives ownership of all files in the web root (containing the files for WordPress) to the www-data group, which is where web servers like Apache operate. &lt;code&gt;"sudo chmod -R 755 /var/www/html"&lt;/code&gt; gives the owner of the web root (which we have set to be Apache) full read/write access to it.&lt;/p&gt;

&lt;p&gt;Now we need to tell WordPress our database's details. The WordPress installation comes with a file called &lt;code&gt;wp-config-sample.php&lt;/code&gt;; however, WordPress expects its configuration at &lt;code&gt;wp-config.php&lt;/code&gt;, so we need to rename it.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo mv wp-config-sample.php wp-config.php&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now we need to edit it to have the details of the database. We will do this using the nano editor:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo nano wp-config.php&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Inside the editor, find the database definitions and update them with your own details:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;define( 'DB_NAME', 'wordpress' );&lt;br&gt;
define( 'DB_USER', 'wpuser' );&lt;br&gt;
define( 'DB_PASSWORD', '[REPLACEWITHASECUREPASSWORD]' );&lt;br&gt;
define( 'DB_HOST', 'localhost' );&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Then press Ctrl+O to save and Ctrl+X to exit.&lt;/p&gt;
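&lt;p&gt;If you'd rather avoid an interactive editor, the same edit can be scripted with &lt;code&gt;sed&lt;/code&gt;. A minimal sketch, assuming the stock placeholder values that ship in &lt;code&gt;wp-config-sample.php&lt;/code&gt; (&lt;code&gt;database_name_here&lt;/code&gt;, &lt;code&gt;username_here&lt;/code&gt;, &lt;code&gt;password_here&lt;/code&gt;):&lt;/p&gt;

```shell
# Recreate the relevant lines of wp-config.php with the stock placeholders
# (the real file ships with WordPress; this stand-in keeps the demo local)
printf "define( 'DB_NAME', 'database_name_here' );\n" > wp-config.php
printf "define( 'DB_USER', 'username_here' );\n" >> wp-config.php
printf "define( 'DB_PASSWORD', 'password_here' );\n" >> wp-config.php
printf "define( 'DB_HOST', 'localhost' );\n" >> wp-config.php

# Substitute our database details in place (GNU sed, as on Ubuntu)
sed -i -e "s/database_name_here/wordpress/" \
       -e "s/username_here/wpuser/" \
       -e "s/password_here/REPLACEWITHASECUREPASSWORD/" wp-config.php

grep "DB_NAME" wp-config.php   # prints: define( 'DB_NAME', 'wordpress' );
```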

&lt;p&gt;You should now be able to view WordPress using your EC2 instance's IP address in your browser! But right now it will only work on HTTP.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setup HTTPS
&lt;/h2&gt;

&lt;p&gt;You will need a custom domain for this - it doesn't have to be purchased through Route 53; any registrar works as long as you have access to the DNS records.&lt;/p&gt;

&lt;p&gt;To ensure our EC2 instance's public IP address persists even if the instance is stopped, we need an Elastic IP address. Navigate to the "Elastic IPs" section under "Network &amp;amp; Security" and click Allocate Elastic IP address:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nhept7t5s2fk0korc0j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9nhept7t5s2fk0korc0j.png" alt=" " width="800" height="369"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once allocated, right-click the IP address and click Associate, then select the instance you launched. Your instance now has an Elastic IP address.&lt;/p&gt;

&lt;p&gt;Now, in your domain's DNS records, add a new A record with the Elastic IP as the value. If you then visit your website over HTTP (e.g. &lt;code&gt;http://yourdomain.com&lt;/code&gt;), you should see your WordPress installation.&lt;/p&gt;

&lt;p&gt;Now let's set up HTTPS support on our instance. We will do this for free using &lt;code&gt;certbot&lt;/code&gt;, which we have to install first:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo apt install certbot python3-certbot-apache -y&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Now run:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;$ sudo certbot --apache&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;And follow the prompts to set up HTTPS.&lt;/p&gt;

&lt;h2&gt;
  
  
  Potential Costs
&lt;/h2&gt;

&lt;p&gt;You may incur costs for running your EC2 instance and EBS storage if they are not in the free tier, and your Elastic IP address might also incur costs. Overall, though, this should be very affordable.&lt;/p&gt;

&lt;p&gt;It is also worth noting that this kind of architecture shouldn't be used if you intend to launch a large blog; in that case, deploying your own VPC, an RDS database, and so on would make more sense. Let me know if I should go through what such an architecture might look like!&lt;/p&gt;

&lt;h2&gt;
  
  
  Some Problems With This Architecture
&lt;/h2&gt;

&lt;p&gt;This type of architecture is known as a monolithic application: the web tier, application tier and database tier all run on one compute instance, which will cause bottlenecks in your blog's performance as all the work is done by a single instance. Splitting the tiers and decoupling them would make more sense. I would recommend architecting your own VPC and launching instances in public subnets to serve your blog, another set of subnets to host the WordPress logic, and a final set of private subnets to host an RDS database (or Aurora, or any database for that matter!) for your blog data. To prevent performance overloads or throttling, you can also decouple the tiers using two SQS queues.&lt;/p&gt;

&lt;p&gt;For a simple blog however, a monolithic architecture is fine.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Now you know how to deploy a WordPress blog using Apache and MySQL. Thank you for reading!&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Written By Hamd Waseem&lt;/em&gt;&lt;/p&gt;


</description>
      <category>aws</category>
      <category>wordpress</category>
      <category>ec2</category>
    </item>
    <item>
      <title>Containerisation With Docker and Kubernetes In A Nutshell</title>
      <dc:creator>Hamd Waseem</dc:creator>
      <pubDate>Fri, 25 Jul 2025 10:36:15 +0000</pubDate>
      <link>https://forem.com/hamdivazim/containerisation-with-docker-and-kubernetes-in-a-nutshell-58m1</link>
      <guid>https://forem.com/hamdivazim/containerisation-with-docker-and-kubernetes-in-a-nutshell-58m1</guid>
      <description>&lt;p&gt;A good application should work the same way no matter where it’s running, and there are two main ways to make that happen. The first method is by using virtual machines. These use their own operating system on top of the host computer’s OS. While they’re effective, virtual machines are resource hogs—they take up a lot of RAM, CPU power, and storage space. This can get expensive and inefficient, especially when you need to scale up. The second method, containerisation, is generally the better method. It’s a much lighter, faster, and more efficient way to run applications.&lt;/p&gt;

&lt;p&gt;Containerisation is a modern approach to virtualisation that bundles an application and everything it needs—like libraries, dependencies, and configurations—into a neat, self-contained package called a container. Unlike virtual machines, containers don’t need their own full operating system. Instead, they share the host’s OS but keep their own isolated space. This means they use way fewer resources while still delivering consistent performance. Containers also work well with each other—you can start, stop, move, or replicate them without any disruptions to other containers. This makes them a favourable tool for developers who value speed and scalability.&lt;/p&gt;

&lt;p&gt;One of the biggest perks of containers is portability; they include everything an app needs to run and they work exactly the same whether they’re on your laptop, a test server, or a cloud platform, eliminating the infamous "it works on my machine" issue. With containers, developers and teams can debug, test, and deploy apps much more smoothly, saving time and effort. Moreover, containerisation has opened up new possibilities for streamlining workflows and standardising environments, which is a significant advantage in the fast-paced tech world.&lt;/p&gt;

&lt;h2&gt;
  
  
  Popular Containerisation Platforms
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Docker
&lt;/h3&gt;

&lt;p&gt;Docker is one of the most widely used containerisation platforms. It simplifies the process of creating, deploying, and running containers. A Docker container wraps up an app and all its dependencies into one tidy package, ensuring it behaves consistently wherever it’s deployed. Docker images—which are lightweight and unchangeable—contain everything needed to run an app: the operating system, app code, libraries, and configuration files. Teams can store these images in repositories, track their versions, and collaborate more effectively.&lt;/p&gt;

&lt;p&gt;Docker uses a client-server model. The Docker client communicates with the Docker daemon, which does the heavy lifting of building, running, and managing containers. The daemon runs in the background and ensures resources are used efficiently. Developers can create Dockerfiles to define exactly how their containers should be built—from the base OS to the specific libraries and environment settings they need. This makes setting up and replicating environments a breeze. Plus, optimising Dockerfiles to keep image sizes small and build times quick speeds up development. It’s not just about efficiency; this process also enhances collaboration within teams as the environment setup becomes predictable and straightforward.&lt;/p&gt;
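&lt;p&gt;To make this concrete, here is a minimal sketch of what a Dockerfile might look like for a hypothetical Node.js app (the base image, file names, and port are illustrative, not from any specific project):&lt;/p&gt;

```dockerfile
# Start from a small official Node.js base image
FROM node:20-alpine

# Work inside /app in the container
WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY package.json package-lock.json ./
RUN npm install

# Copy the rest of the application code
COPY . .

# Document the port the app listens on and define the start command
EXPOSE 3000
CMD ["node", "server.js"]
```

&lt;p&gt;You would then build and run it with &lt;code&gt;docker build -t myapp .&lt;/code&gt; and &lt;code&gt;docker run -p 3000:3000 myapp&lt;/code&gt;. Copying &lt;code&gt;package.json&lt;/code&gt; before the rest of the code is one of the image-size and build-time optimisations mentioned above: dependency installation is only re-run when the dependency files change.&lt;/p&gt;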

&lt;h3&gt;
  
  
  Kubernetes
&lt;/h3&gt;

&lt;p&gt;Docker is great for creating and managing individual containers, but when you’re dealing with dozens, hundreds, or even thousands of containers across multiple machines, you need a tool like Kubernetes. Kubernetes is a top-notch orchestration platform that takes care of deploying, scaling, and managing containerised apps. It works seamlessly with Docker containers and organises them into clusters for better scalability and availability. Kubernetes has a strong community, a rich set of features, and the capability to handle complex tasks, making it the go-to solution for enterprises rolling out containers at scale.&lt;/p&gt;

&lt;p&gt;Kubernetes stands out for its automation. It can handle tasks like distributing network traffic to avoid bottlenecks, automatically restarting containers that fail, and scaling up resources when demand spikes. These features keep applications running smoothly even during high-stress periods. Compared to other tools like Docker Swarm, Kubernetes offers more robust capabilities, which is why it’s often the first choice for managing production environments. Additionally, Kubernetes’ integration with cloud platforms makes it even more versatile, allowing businesses to achieve greater agility and resilience in their operations.&lt;/p&gt;
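&lt;p&gt;As an illustration of that automation, here is a minimal sketch of a Kubernetes Deployment manifest (the names and image are hypothetical). The &lt;code&gt;replicas&lt;/code&gt; field tells Kubernetes to keep three copies of the container running at all times; if one fails, it is automatically replaced:&lt;/p&gt;

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp-deployment
spec:
  replicas: 3                  # keep 3 pods running at all times
  selector:
    matchLabels:
      app: myapp
  template:
    metadata:
      labels:
        app: myapp
    spec:
      containers:
        - name: myapp
          image: myapp:latest  # hypothetical container image
          ports:
            - containerPort: 3000
```

&lt;p&gt;Scaling up during a demand spike is then a one-line change to &lt;code&gt;replicas&lt;/code&gt; (or a matter of attaching a HorizontalPodAutoscaler).&lt;/p&gt;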

&lt;h3&gt;
  
  
  Docker Hub and Docker Compose
&lt;/h3&gt;

&lt;p&gt;The Docker ecosystem doesn’t stop at containers. Docker Hub is a cloud-based registry where developers can share and access container images. It’s packed with pre-built images for popular tools and technologies, saving developers time during setup; there’s likely already an image on Docker Hub for whatever you need to deploy! The ability to leverage a vast library of ready-made images has made Docker Hub all but essential for a fast development process.&lt;/p&gt;

&lt;p&gt;Another useful tool is Docker Compose, which simplifies managing applications that rely on multiple interconnected containers. With a single configuration file, you can define all the services your app needs, like a front-end, a back-end, and a database, and run them together effortlessly. Docker Compose is especially handy for local development and testing, letting you replicate complex setups with minimal hassle. It’s a real time-saver for teams managing multi-service architectures.&lt;/p&gt;
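&lt;p&gt;For example, a &lt;code&gt;docker-compose.yml&lt;/code&gt; for a hypothetical app with a web front-end and a database might look like this (the service names, image, and credentials are illustrative):&lt;/p&gt;

```yaml
services:
  web:
    build: .                   # build the app from the local Dockerfile
    ports:
      - "3000:3000"
    depends_on:
      - db                     # start the database before the web service
  db:
    image: mysql:8
    environment:
      MYSQL_DATABASE: app
      MYSQL_ROOT_PASSWORD: replace-with-a-secure-password
    volumes:
      - dbdata:/var/lib/mysql  # persist database files between restarts
volumes:
  dbdata:
```

&lt;p&gt;A single &lt;code&gt;docker compose up&lt;/code&gt; then starts both services together on a shared network.&lt;/p&gt;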

&lt;h2&gt;
  
  
  Real World Usages of Containerisation
&lt;/h2&gt;

&lt;p&gt;Containerisation isn’t just about running apps more efficiently. Its reach is growing into areas like machine learning, big data, and edge computing. For example, researchers can use containers to deploy AI models across multiple environments without worrying about compatibility issues. This ability to standardise deployment processes is invaluable, especially in fields where precision and reliability are key. In edge computing, lightweight containers can run on devices with limited resources, bringing faster responses and lower latency for users. Whether it’s powering smart devices or enabling real-time data analysis, containers are paving the way for innovation in these domains.&lt;/p&gt;

&lt;p&gt;Security is another area where containers are making waves. Tools like Docker Content Trust (DCT) and Kubernetes’ built-in security policies help ensure that only verified, trusted images get deployed. Plus, the isolation that containers provide acts as an extra barrier against potential attacks. By isolating application processes, containers can minimise risks and limit the impact of security breaches. This focus on security makes containerisation a reliable option for businesses handling sensitive data or critical operations.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;Containerisation allows developers to create lightweight packages that work the same regardless of the host's operating system. Docker is widely adopted for building and managing containers effectively, while Kubernetes excels at orchestrating container deployments at scale. With tools like Docker Hub and Docker Compose, developers can access pre-built images and manage multi-service applications seamlessly.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Written by Hamd Waseem&lt;/em&gt;&lt;/p&gt;

</description>
      <category>container</category>
      <category>docker</category>
      <category>kubernetes</category>
    </item>
  </channel>
</rss>
