<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Akanle Tolulope</title>
    <description>The latest articles on Forem by Akanle Tolulope (@akansrodger).</description>
    <link>https://forem.com/akansrodger</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3319282%2F60b2f5f1-1b0e-494f-b91b-4e1113215236.jpeg</url>
      <title>Forem: Akanle Tolulope</title>
      <link>https://forem.com/akansrodger</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/akansrodger"/>
    <language>en</language>
    <item>
      <title>Classifying Amazon Reviews with Python: From Raw Text to 88% Accuracy</title>
      <dc:creator>Akanle Tolulope</dc:creator>
      <pubDate>Thu, 05 Mar 2026 11:23:02 +0000</pubDate>
      <link>https://forem.com/akansrodger/classifying-amazon-reviews-with-python-from-raw-text-to-88-accuracy-15a4</link>
      <guid>https://forem.com/akansrodger/classifying-amazon-reviews-with-python-from-raw-text-to-88-accuracy-15a4</guid>
      <description>&lt;p&gt;Ever wondered how businesses know if customers are happy or not? In this project, I built a machine learning model that classifies Amazon product reviews as Positive or Negative using NLP techniques. Here's how I did it.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;The Dataset&lt;br&gt;
I used the Amazon Review Polarity Dataset — sampling 200,000 reviews for training and 50,000 for testing. The dataset was perfectly balanced between positive and negative reviews, which is ideal for classification.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Cleaning the Text&lt;br&gt;
Raw reviews are messy. I wrote a preprocessing function to lowercase the text, strip punctuation and numbers, and remove stopwords using NLTK. This helps the model focus on the words that actually carry sentiment.&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def clean_text(text):
    text = str(text).lower()
    text = re.sub(r"[^\w\s]", "", text)
    text = re.sub(r"\d+", "", text)
    words = [word for word in text.split() if word not in stop_words]
    return " ".join(words)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="3"&gt;
&lt;li&gt;Converting Text to Numbers with TF-IDF
Machine learning models need numbers, not words. TF-IDF weighs words by how distinctive they are to each review — common words like "the" are down-weighted, while meaningful words like "terrible" carry more weight.
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;vectorizer = TfidfVectorizer(max_features=5000, min_df=5, max_df=0.9)
X_train = vectorizer.fit_transform(train_df["clean_text"])
X_test = vectorizer.transform(test_df["clean_text"])
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="4"&gt;
&lt;li&gt;&lt;p&gt;Training &amp;amp; Comparing Models&lt;br&gt;
I trained and compared three models — Logistic Regression, Naive Bayes, and Linear SVM. Logistic Regression performed best and was used for the final evaluation.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Results&lt;br&gt;
Tested on 50,000 reviews:&lt;br&gt;
Metric: Negative / Positive&lt;br&gt;
Precision: 0.89 / 0.88&lt;br&gt;
Recall: 0.88 / 0.89&lt;br&gt;
F1-Score: 0.88 / 0.89&lt;br&gt;
Overall Accuracy: 88% — balanced performance across both classes.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzyxir89ssykoj5u1dm3l.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzyxir89ssykoj5u1dm3l.PNG" alt="Classification Report" width="467" height="152"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Real-Time Predictions&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
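&lt;p&gt;As an illustration, the three-model comparison above can be sketched like this. It assumes the &lt;code&gt;X_train&lt;/code&gt;/&lt;code&gt;X_test&lt;/code&gt; matrices from the TF-IDF step, and the column name &lt;code&gt;label&lt;/code&gt; is a stand-in — adjust it to your dataframe:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

# X_train/X_test come from the TF-IDF step; "label" is an assumed column name
models = {
    "Logistic Regression": LogisticRegression(max_iter=1000),
    "Naive Bayes": MultinomialNB(),
    "Linear SVM": LinearSVC(),
}
for name, model in models.items():
    model.fit(X_train, train_df["label"])
    preds = model.predict(X_test)
    print(f"{name}: {accuracy_score(test_df['label'], preds):.3f}")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;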

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ruwkz87f89zrl5hx0mi.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1ruwkz87f89zrl5hx0mi.PNG" alt="Model identifying positive and negative reviews" width="387" height="95"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def predict_sentiment(text):
    cleaned = clean_text(text)
    vectorized = vectorizer.transform([cleaned])
    prediction = model.predict(vectorized)[0]
    return "Positive" if prediction == 1 else "Negative"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;"This product is amazing!"        -&amp;gt; Positive&lt;br&gt;
"Completely useless, waste of money" -&amp;gt; Negative&lt;/p&gt;

&lt;ol start="7"&gt;
&lt;li&gt;Visualizations
Three charts helped tell the story:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Sentiment distribution — confirmed the dataset was balanced&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbqem7pd5h4tle64tkxnm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbqem7pd5h4tle64tkxnm.png" alt="Class distribution, Positive class and Negative class" width="640" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Word cloud — top positive words: great, love, best&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkmssfqv4qs616fc95u4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffkmssfqv4qs616fc95u4.png" alt="Diplayed the top positive words" width="640" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Confusion matrix — symmetric errors, no class bias&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnncq2buy2k2a32o0mim.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnnncq2buy2k2a32o0mim.png" alt="Diagonal and Off diagonal, Showing the TN,FN,TP,FP" width="640" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What I Learned&lt;/strong&gt;&lt;br&gt;
Working at this scale (250k reviews) taught me that clean data and a balanced dataset matter more than model complexity. Logistic Regression beat fancier approaches simply because the data was well prepared.&lt;br&gt;
Next steps: hyperparameter tuning, cross-validation, and eventually a BERT-based model for higher accuracy.&lt;br&gt;
Full code on my GitHub — feel free to clone and try it on your own dataset!&lt;/p&gt;

&lt;p&gt;Found this helpful? Drop a like or leave a comment below!&lt;/p&gt;

</description>
      <category>python</category>
      <category>machinelearning</category>
      <category>beginners</category>
      <category>nlp</category>
    </item>
    <item>
      <title>I Created My First AI Tool-Using Agent with LangGraph and Groq</title>
      <dc:creator>Akanle Tolulope</dc:creator>
      <pubDate>Sun, 25 Jan 2026 20:14:16 +0000</pubDate>
      <link>https://forem.com/akansrodger/i-created-my-first-ai-tool-using-agent-with-langgraph-and-groq-3j3n</link>
      <guid>https://forem.com/akansrodger/i-created-my-first-ai-tool-using-agent-with-langgraph-and-groq-3j3n</guid>
      <description>&lt;p&gt;When you are just starting out, you can easily become overwhelmed when trying to figure out how to create an intelligent agent &lt;strong&gt;(AI)&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;In this article, I will share my experience building my first tool-using agent in Python with LangChain, LangGraph, and Groq.&lt;/p&gt;

&lt;p&gt;Alongside an overview of the project, I will highlight the mistakes I made and some of the lessons I learned while developing the &lt;strong&gt;AI agent&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This project also provided me with valuable insight into the technical workings behind modern intelligent agents.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvtgdyib33y8spggin40l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvtgdyib33y8spggin40l.png" alt="Building a simple AI agent in Python using LangChain, Groq, and tools inside VS Code" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What Is an Intelligent Agent?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Intelligent agents do much more than just provide a conversational experience. A plain chatbot is limited to answering the questions users pose; an agent can also act.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Intelligent agents are capable of:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Interpreting your request using task-related knowledge&lt;/li&gt;
&lt;li&gt;Deciding how to respond to the request&lt;/li&gt;
&lt;li&gt;Calling other tools, resources, and services when needed&lt;/li&gt;
&lt;li&gt;Returning the results of the interaction to you&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The project I worked on employed the ReAct pattern (Reason and Act).&lt;br&gt;
ReAct interleaves reasoning steps with actions such as tool calls, and it is the pattern behind many real-world intelligent agents.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The technology stack I utilized to develop this project was:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Python&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;LangChain (abstractions for working with LLMs)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;LangGraph (creating workflows to support agent-oriented tasks)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Groq (providing fast, free inference of LLMs)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Git / GitHub (for version control and publishing)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;One key takeaway from this project: any given LLM may eventually be deprecated, so it is important to keep the model choice configurable.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8v9194sgudfsx2ifchs4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8v9194sgudfsx2ifchs4.png" alt="AI agent project setup in VS Code with pyproject.toml dependencies and GitHub repository push" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Building the Tool&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Here’s an example of the calculator tool I added to the agent:&lt;/em&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;@tool
def calculator(a: float, b: float) -&amp;gt; str:
    """Useful for performing basic arithmetic calculations with numbers"""
    return f"The sum of {a} and {b} is {a + b}"

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xv321ji8mapb5h6gv4d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3xv321ji8mapb5h6gv4d.png" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Agent uses the docstring to determine when to make use of the tool, which is a crucial piece for its functioning and decision-making process.&lt;/p&gt;
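&lt;p&gt;To show how the pieces fit together, here is a minimal, illustrative sketch of wiring such a tool into a ReAct agent using LangGraph's prebuilt helper. The model name is just an example (not necessarily the one from my project), and it assumes a &lt;code&gt;GROQ_API_KEY&lt;/code&gt; environment variable is set:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from langchain_core.tools import tool
from langchain_groq import ChatGroq
from langgraph.prebuilt import create_react_agent

@tool
def calculator(a: float, b: float) -&amp;gt; str:
    """Add two numbers and return the sum as a sentence."""
    return f"The sum of {a} and {b} is {a + b}"

# Keep the model name configurable, since models get deprecated over time
llm = ChatGroq(model="llama-3.1-8b-instant")
agent = create_react_agent(llm, [calculator])

result = agent.invoke({"messages": [("user", "Can you add 7 and 9?")]})
print(result["messages"][-1].content)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;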

&lt;p&gt;&lt;strong&gt;The Agent in Action&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;code&gt;User: Can you add 7 and 9?&lt;br&gt;
Agent: (reasons about the task)&lt;br&gt;
Tool called: calculator&lt;br&gt;
Result: 16&lt;br&gt;
&lt;/code&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpir52gn0b9h5jjhqhqt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcpir52gn0b9h5jjhqhqt.png" alt=" " width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The moment I saw the agent use the tool on its own was a surprising and eye-opening experience for me.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lessons Learned&lt;/strong&gt;&lt;br&gt;
&lt;em&gt;Some key takeaways from this experience:&lt;/em&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Developing AI systems requires a significant amount of debugging; it is not a simple process.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;The way a tool is documented has a major impact on the agent's decision-making when using it.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Using Git is essential, not optional.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Reading error messages carefully while debugging can save you a lot of time.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The project's source code is available on GitHub:&lt;br&gt;
🔗 &lt;a href="https://github.com/Akansrodger/ai-agent-with-tools" rel="noopener noreferrer"&gt;https://github.com/Akansrodger/ai-agent-with-tools&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>python</category>
      <category>programming</category>
      <category>beginners</category>
    </item>
    <item>
      <title>From CSV to Insights: Analysing Walmart Sales with Python &amp; PostgreSQL</title>
      <dc:creator>Akanle Tolulope</dc:creator>
      <pubDate>Mon, 15 Dec 2025 11:31:26 +0000</pubDate>
      <link>https://forem.com/akansrodger/from-csv-to-insights-analysing-walmart-sales-with-python-postgresql-1c4l</link>
      <guid>https://forem.com/akansrodger/from-csv-to-insights-analysing-walmart-sales-with-python-postgresql-1c4l</guid>
      <description>&lt;p&gt;During my journey to learn Data Analysis, I decided to start a project that actually seems real, not just another tutorial project. The project I chose is to analyse Walmart Sales Data using Python for Data Cleaning and PostgreSQL (SQL) for SQL Analysis, taking data from a RAW CSV file and providing Business Insights.&lt;/p&gt;

&lt;p&gt;Project overview: &lt;br&gt;
Take raw, incomplete sales data, prepare it for storage in a database where it can be organised, and run SQL queries to obtain key business insights from the prepared data.&lt;/p&gt;

&lt;p&gt;I used the pandas Python library to prepare the sales data for storage, then imported it into PostgreSQL so I could run structured queries on it.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgp0vqhl6mcyv110juo4n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgp0vqhl6mcyv110juo4n.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Tools used: pandas, SQLAlchemy, PostgreSQL, SQL, and VS Code. The Walmart sales dataset itself came from Kaggle.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4kihrk3q3b20johhjnhn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4kihrk3q3b20johhjnhn.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Dataset description: the dataset contains detailed transaction-level information for a specific period, including product category, store/branch, unit price and quantity sold, customer ratings, and time of purchase.&lt;/p&gt;

&lt;p&gt;How I did it: data preparation with Python. I used the pandas library to remove duplicates, fix data types, convert currency strings into numbers, and create a new column with the total value of each sale.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F955cl3abq19xfh4f7pug.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F955cl3abq19xfh4f7pug.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Loading the data into PostgreSQL: I imported the cleaned sales data into PostgreSQL using SQLAlchemy. This gave me an environment for analysing sales data similar to how it would be managed inside a business.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3szgacogdpuhg4d0as1b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3szgacogdpuhg4d0as1b.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;SQL analysis: using window and aggregate functions, I answered essential business questions, including which day of the week is busiest for each store, which product types sell the most, which categories generate the most profit, and how product sales vary across each store's operating hours.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85t941safhzuvbs2x71f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F85t941safhzuvbs2x71f.png" alt=" " width="800" height="449"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Key takeaways: pandas is an excellent tool for cleaning and preparing datasets for structured analysis. SQL's power lies in structured analysis across related tables, and window functions make it possible to answer complex organisational questions.&lt;/p&gt;

&lt;p&gt;🔗 Complete project source code (GitHub): &lt;a href="https://github.com/Akansrodger/Walmart_sales_SQL_PYTHON" rel="noopener noreferrer"&gt;https://github.com/Akansrodger/Walmart_sales_SQL_PYTHON&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;My last words on this journey...&lt;/p&gt;

&lt;p&gt;This project let me integrate Python, SQL, and databases into one consolidated workflow, and it marked another important step toward becoming a better data analyst.&lt;/p&gt;

</description>
      <category>database</category>
      <category>python</category>
      <category>sql</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Building a Library Management System with SQL: My Journey, Insights, and Best Practices ✨</title>
      <dc:creator>Akanle Tolulope</dc:creator>
      <pubDate>Tue, 28 Oct 2025 11:23:08 +0000</pubDate>
      <link>https://forem.com/akansrodger/building-a-library-management-system-with-sql-my-journey-insights-and-best-practices-1lc7</link>
      <guid>https://forem.com/akansrodger/building-a-library-management-system-with-sql-my-journey-insights-and-best-practices-1lc7</guid>
      <description>&lt;p&gt;As a data analyst passionate about turning raw data into structure and insight, I recently completed a project that truly strengthened my SQL skills, a Library Management System built entirely using SQL. The goal was simple: to create a relational database that could efficiently manage books, members, employees, and issued records, just like in a real-world library setup.&lt;/p&gt;

&lt;p&gt;What started as a basic idea quickly became a full-fledged learning experience. I designed tables for books, members, and employees, then implemented joins and relationships to connect them through issued and returned transactions. One of the key lessons I learned was how data normalisation and relationships bring structure and clarity to even the most complex systems.&lt;/p&gt;

&lt;p&gt;For example, this simple query helped me identify members who borrowed more than one book:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; 
  &lt;span class="n"&gt;m&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;member_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ist&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;issued_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;total_books&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;issued_status&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;ist&lt;/span&gt;
&lt;span class="k"&gt;JOIN&lt;/span&gt; &lt;span class="n"&gt;members&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;m&lt;/span&gt; 
  &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;m&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;member_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ist&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;issued_member_id&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;m&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;member_name&lt;/span&gt;
&lt;span class="k"&gt;HAVING&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ist&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;issued_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It was exciting to see how a few lines of SQL could transform raw data into real insights. I also explored ways to count how many books each employee had issued, which deepened my understanding of JOINs and GROUP BY logic:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;SELECT&lt;/span&gt; 
  &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;emp_name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ist&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;issued_id&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;books_issued&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;issued_status&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;ist&lt;/span&gt;
&lt;span class="k"&gt;JOIN&lt;/span&gt; &lt;span class="n"&gt;employees&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt; 
  &lt;span class="k"&gt;ON&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;emp_id&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ist&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;issued_emp_id&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;emp_name&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Beyond queries, I implemented a stored procedure to automate repetitive tasks like tracking issued and returned books. This step gave the project a more practical feel — similar to how database operations work behind the scenes in real organisations.&lt;/p&gt;
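&lt;p&gt;&lt;em&gt;As a hedged illustration only (the table and column names below are assumptions, not my exact schema), a PL/pgSQL procedure for handling a book return might look like this:&lt;/em&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Hypothetical sketch: record a return and mark the book available again
CREATE OR REPLACE PROCEDURE return_book(p_issued_id VARCHAR)
LANGUAGE plpgsql
AS $$
BEGIN
    INSERT INTO return_status (issued_id, return_date)
    VALUES (p_issued_id, CURRENT_DATE);

    UPDATE books
    SET status = 'available'
    WHERE isbn = (SELECT issued_book_isbn
                  FROM issued_status
                  WHERE issued_id = p_issued_id);
END;
$$;

CALL return_book('IS101');
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;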

&lt;p&gt;Overall, this project reminded me that even seemingly simple systems can teach deep analytical and structural thinking. If you’re learning SQL or data management, I’d highly recommend trying to build something like this. It’s not just about writing queries — it’s about thinking through real-world data problems, relationships, and efficiency.&lt;/p&gt;

&lt;p&gt;👉 You can check out the full project on my GitHub here: &lt;a href="https://github.com/Akansrodger/Library-Management-System-using-SQL-Project---prj2" rel="noopener noreferrer"&gt;https://github.com/Akansrodger/Library-Management-System-using-SQL-Project---prj2&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can check me out on other social media platforms:&lt;/p&gt;

&lt;p&gt;Instagram: &lt;a href="https://www.instagram.com/jackiiee_.__/" rel="noopener noreferrer"&gt;https://www.instagram.com/jackiiee_.__/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;X: &lt;a href="https://x.com/Akansrodgers" rel="noopener noreferrer"&gt;https://x.com/Akansrodgers&lt;/a&gt;&lt;/p&gt;

</description>
      <category>sql</category>
      <category>beginners</category>
      <category>outreachy</category>
      <category>datascience</category>
    </item>
    <item>
      <title>How I Analyzed Retail Sales Data Using SQL</title>
      <dc:creator>Akanle Tolulope</dc:creator>
      <pubDate>Fri, 11 Jul 2025 14:28:23 +0000</pubDate>
      <link>https://forem.com/akansrodger/how-i-analyzed-retail-sales-data-using-sql-5e98</link>
      <guid>https://forem.com/akansrodger/how-i-analyzed-retail-sales-data-using-sql-5e98</guid>
      <description>&lt;p&gt;Hello everyone, it's me again. Today I’m excited to share another practical project I worked on to strengthen my SQL and data analysis skills: a retail sales data analysis built entirely with SQL!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why Sales Data Analysis Is Important&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Retail businesses rely heavily on sales data to track product performance, monitor revenue growth, and make informed business decisions. As a data analyst, the ability to efficiently query and summarize sales data is a must-have skill—which this project allowed me to practice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Overview&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I created a simple retail sales database consisting of two tables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Products: Contains product details like name, category, and unit price.&lt;/li&gt;
&lt;li&gt;Sales: Records sales transactions, including product ID, quantity sold, and sale date.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From there, I wrote SQL queries to answer critical business questions such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What’s the total quantity sold and revenue for each product?&lt;/li&gt;
&lt;li&gt;What’s the average quantity sold per product?&lt;/li&gt;
&lt;li&gt;Which products generated revenue above a target threshold?&lt;/li&gt;
&lt;li&gt;Which product was the top seller by total revenue?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Tools and Skills Used&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SQL (DDL, DML, aggregate functions, HAVING, JOIN, subqueries)&lt;/li&gt;
&lt;li&gt;GitHub for project hosting and code management&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Tables I Created&lt;/strong&gt;&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE TABLE Products(
  ProductID INT PRIMARY KEY,
  ProductName VARCHAR(50),
  Category VARCHAR(50),
  UnitPrice DECIMAL(10, 2)
);

CREATE TABLE Sales(
  SalesID INT PRIMARY KEY,
  ProductID INT,
  Quantity INT,
  SaleDate DATE
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then I inserted values into these tables using &lt;code&gt;INSERT INTO&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Queries and Insights&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;View all sales transactions with product names and total sale amounts&lt;/li&gt;
&lt;li&gt;Calculate total quantity sold and total revenue per product&lt;/li&gt;
&lt;li&gt;Compute average quantity sold per product&lt;/li&gt;
&lt;li&gt;Identify products with total revenue above 3,000&lt;/li&gt;
&lt;li&gt;Find the product with the highest total sales revenue&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Example:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SELECT P.ProductName,
  SUM(S.Quantity) AS total_quantity_sold,
  SUM(S.Quantity * P.UnitPrice) AS total_revenue
FROM Sales S
JOIN Products P ON S.ProductID = P.ProductID
GROUP BY P.ProductName
ORDER BY total_revenue DESC;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdr50imksfzwljjl0ey0q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdr50imksfzwljjl0ey0q.png" alt=" " width="800" height="410"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lessons I Learned&lt;/strong&gt;&lt;br&gt;
SQL aggregate functions are powerful tools for summarizing business data.&lt;/p&gt;

&lt;p&gt;Always double-check your JOIN conditions when combining tables.&lt;/p&gt;

&lt;p&gt;Subqueries can simplify complex logic if structured correctly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Link&lt;/strong&gt;&lt;br&gt;
Check out the full project and SQL scripts on GitHub:&lt;br&gt;
&lt;a href="https://github.com/Akansrodger/retail-sales-data-analysis-sql" rel="noopener noreferrer"&gt;https://github.com/Akansrodger/retail-sales-data-analysis-sql&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Final Thoughts&lt;/strong&gt;&lt;br&gt;
This was a rewarding small project that reinforced core SQL skills and demonstrated how straightforward queries can provide valuable business insights.&lt;/p&gt;

&lt;p&gt;I recommend similar practical projects to anyone starting out in SQL or data analysis; you’ll gain useful skills fast and have solid portfolio material to show for it.&lt;/p&gt;

&lt;p&gt;If you found this useful, check out my Hospital Patient Management System SQL project too.&lt;/p&gt;

</description>
      <category>sql</category>
      <category>postgres</category>
      <category>portfolio</category>
      <category>learning</category>
    </item>
    <item>
      <title>How I Built a Hospital Patient Management System Using SQL</title>
      <dc:creator>Akanle Tolulope</dc:creator>
      <pubDate>Tue, 08 Jul 2025 21:59:56 +0000</pubDate>
      <link>https://forem.com/akansrodger/how-i-built-a-hospital-patient-management-system-using-sql-43a6</link>
      <guid>https://forem.com/akansrodger/how-i-built-a-hospital-patient-management-system-using-sql-43a6</guid>
      <description>&lt;p&gt;Article Outline:&lt;br&gt;
Introduction&lt;/p&gt;

&lt;p&gt;Project Overview&lt;/p&gt;

&lt;p&gt;Tools and Skills Used&lt;/p&gt;

&lt;p&gt;Database Design (ERD)&lt;/p&gt;

&lt;p&gt;Key Features of the System&lt;/p&gt;

&lt;p&gt;Sample Queries I Ran&lt;/p&gt;

&lt;p&gt;Challenges I Faced&lt;/p&gt;

&lt;p&gt;Lessons Learned&lt;/p&gt;

&lt;p&gt;GitHub Project Link&lt;/p&gt;

&lt;p&gt;Closing Thoughts&lt;/p&gt;

&lt;p&gt;Hello everyone, in today’s post, I’ll be sharing how I designed and built a Hospital Patient Management System using SQL as part of my data analyst learning journey and public project portfolio.&lt;/p&gt;

&lt;p&gt;I built this project to strengthen my database management and SQL querying skills, while preparing for open-source opportunities like Outreachy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Overview&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The project involved creating a relational database to manage:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Patient's personal information&lt;/li&gt;
&lt;li&gt;Departments&lt;/li&gt;
&lt;li&gt;Diagnoses&lt;/li&gt;
&lt;li&gt;Visit history&lt;/li&gt;
&lt;li&gt;Risk assessments&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The system makes it easy for a hospital to store, retrieve, and analyse patient data efficiently.&lt;/p&gt;

&lt;p&gt;🛠️ &lt;strong&gt;Tools and Skills Used&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SQL (DDL, DML, and queries)&lt;/li&gt;
&lt;li&gt;Database design&lt;/li&gt;
&lt;li&gt;ERD (Entity Relationship Diagram)&lt;/li&gt;
&lt;li&gt;GitHub for project hosting and version control&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Database Design&lt;/strong&gt;&lt;br&gt;
I designed a normalised database with the following tables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;'Patients'&lt;/li&gt;
&lt;li&gt;'Doctors'&lt;/li&gt;
&lt;li&gt;'Visit'&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;They’re connected through primary and foreign key relationships.&lt;/p&gt;
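&lt;p&gt;The exact DDL lives in the repo's .sql files. As a sketch of how the three tables and their key relationships fit together, here is a SQLite version run from Python; columns not shown in this post's queries (DateOfBirth, Department, RiskLevel) are illustrative guesses, not the repo's exact layout:&lt;/p&gt;

```python
import sqlite3

# Illustrative schema sketch in SQLite; the project's .sql files are the
# source of truth. DateOfBirth, Department and RiskLevel are guessed columns.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON;")  # SQLite enforces FKs only when asked
cur = conn.cursor()
cur.executescript("""
CREATE TABLE Patients (
  PatientID INTEGER PRIMARY KEY,
  FullName TEXT NOT NULL,
  DateOfBirth TEXT
);
CREATE TABLE Doctors (
  DoctorID INTEGER PRIMARY KEY,
  FullName TEXT NOT NULL,
  Department TEXT
);
CREATE TABLE Visit (
  VisitID INTEGER PRIMARY KEY,
  PatientID INTEGER NOT NULL REFERENCES Patients(PatientID),
  DoctorID INTEGER NOT NULL REFERENCES Doctors(DoctorID),
  VisitDate TEXT,
  Diagnosis TEXT,
  RiskLevel TEXT
);
""")

# The foreign keys reject orphan visits: a Visit row must point at
# existing Patients and Doctors rows.
cur.execute("INSERT INTO Patients VALUES (1, 'Jane Doe', '1990-02-14')")
cur.execute("INSERT INTO Doctors VALUES (1, 'Dr. Ade', 'Cardiology')")
cur.execute("INSERT INTO Visit VALUES (1, 1, 1, '2025-07-01', 'Hypertension', 'Low')")

try:
    # PatientID 99 does not exist, so this insert must fail.
    cur.execute("INSERT INTO Visit VALUES (2, 99, 1, '2025-07-02', 'Malaria', 'Low')")
    orphan_rejected = False
except sqlite3.IntegrityError:
    orphan_rejected = True
print(orphan_rejected)  # True
```

&lt;p&gt;SQLite only enforces foreign keys when the pragma is switched on, which is why the sketch sets it explicitly; Postgres enforces them by default.&lt;/p&gt;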

&lt;p&gt;ERD Diagram: &lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0q3csf85fnort78zswhi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0q3csf85fnort78zswhi.png" alt=" " width="356" height="474"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Key Features of the System&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Proper use of &lt;strong&gt;primary and foreign keys&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Inserted realistic sample patient and visit records&lt;/li&gt;
&lt;li&gt;Wrote clean SQL queries to analyse:

&lt;ul&gt;
&lt;li&gt;Patient visits&lt;/li&gt;
&lt;li&gt;Risk level distributions&lt;/li&gt;
&lt;li&gt;Department workload reports&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Sample Queries I Ran&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Number of visits per patient:&lt;br&gt;
SELECT P.FullName, COUNT(V.VisitID) AS Number_of_Visits&lt;br&gt;
FROM Patients P&lt;br&gt;
JOIN Visit V ON P.PatientID = V.PatientID&lt;br&gt;
GROUP BY P.FullName;&lt;/p&gt;

&lt;p&gt;Get All Visits with Patient &amp;amp; Doctor Names&lt;/p&gt;

&lt;p&gt;SELECT V.VisitDate, P.FullName AS Patient, D.FullName AS Doctor, V.Diagnosis&lt;br&gt;
FROM Visit V&lt;br&gt;
JOIN Patients P ON V.PatientID = P.PatientID&lt;br&gt;
JOIN Doctors D ON V.DoctorID = D.DoctorID;&lt;/p&gt;
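&lt;p&gt;The risk-level distributions and department workload reports mentioned under Key Features follow the same JOIN-plus-GROUP-BY pattern. A minimal runnable sketch with Python's sqlite3 module and invented rows (the RiskLevel and Department columns are assumptions, not copied from the repo):&lt;/p&gt;

```python
import sqlite3

# Invented mini dataset; RiskLevel and Department are assumed column
# names for illustration, not the project's exact schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE Doctors (DoctorID INTEGER PRIMARY KEY, FullName TEXT, Department TEXT);
CREATE TABLE Visit (VisitID INTEGER PRIMARY KEY, DoctorID INTEGER, RiskLevel TEXT);
INSERT INTO Doctors VALUES (1, 'Dr. Ade', 'Cardiology'), (2, 'Dr. Bello', 'Paediatrics');
INSERT INTO Visit VALUES (1, 1, 'High'), (2, 1, 'Low'), (3, 2, 'Medium'), (4, 1, 'High');
""")

# Department workload: how many visits each department handled.
workload = cur.execute("""
SELECT D.Department, COUNT(V.VisitID) AS visits
FROM Visit V
JOIN Doctors D ON V.DoctorID = D.DoctorID
GROUP BY D.Department
ORDER BY visits DESC;
""").fetchall()
print(workload)  # [('Cardiology', 3), ('Paediatrics', 1)]

# Risk level distribution across all visits; the second sort key
# makes the ordering deterministic when counts tie.
risk = cur.execute("""
SELECT RiskLevel, COUNT(*) AS n
FROM Visit
GROUP BY RiskLevel
ORDER BY n DESC, RiskLevel;
""").fetchall()
print(risk)  # [('High', 2), ('Low', 1), ('Medium', 1)]
```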

&lt;p&gt;&lt;strong&gt;Lessons Learned&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Plan your database structure first&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Always set up primary and foreign keys&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Use sample data to test your queries&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;GitHub is a great place to showcase SQL projects publicly&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Project Link&lt;/strong&gt;&lt;br&gt;
Check out the full project on GitHub:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/Akansrodger/hospital-patient-management-system-sql" rel="noopener noreferrer"&gt;https://github.com/Akansrodger/hospital-patient-management-system-sql&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Feel free to download the .sql files and try out the queries!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Closing Thoughts&lt;/strong&gt;&lt;br&gt;
This project has improved my confidence in database design and SQL query writing. It’s one of the first of many I’ll be adding to my public portfolio.&lt;/p&gt;

&lt;p&gt;Thanks for reading! &lt;/p&gt;

</description>
      <category>database</category>
      <category>sql</category>
      <category>outreachy</category>
      <category>beginners</category>
    </item>
    <item>
      <title>How I Built a Sales Dashboard in Excel and Published It on GitHub</title>
      <dc:creator>Akanle Tolulope</dc:creator>
      <pubDate>Thu, 03 Jul 2025 14:02:46 +0000</pubDate>
      <link>https://forem.com/akansrodger/how-i-built-a-sales-dashboard-in-excel-and-published-it-on-github-2fag</link>
      <guid>https://forem.com/akansrodger/how-i-built-a-sales-dashboard-in-excel-and-published-it-on-github-2fag</guid>
      <description>&lt;p&gt;How I Built a Sales Dashboard in Excel and Published It on GitHub&lt;/p&gt;

&lt;p&gt;Hello everyone 👋 — I'm excited to share one of the first projects in my data analysis portfolio: a fully interactive Sales Dashboard built with Microsoft Excel.&lt;/p&gt;

&lt;p&gt;I built it as part of my journey to sharpen my data skills and prepare to solve real problems with data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Overview&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This project involved analysing a fictional sales dataset to track:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Monthly sales performance&lt;/li&gt;
&lt;li&gt;Regional breakdown&lt;/li&gt;
&lt;li&gt;Product category trends&lt;/li&gt;
&lt;li&gt;Key sales metrics like Total Sales and Profit
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I designed an interactive Excel dashboard that uses pivot tables, pivot charts, KPI cards, and slicers to create a simple yet powerful data visualisation tool for business decision-making.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tools and Skills Used&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Microsoft Excel: for data cleaning, analysis, and dashboarding
&lt;/li&gt;
&lt;li&gt;Pivot Tables: to quickly summarise sales by month, region, and category
&lt;/li&gt;
&lt;li&gt;Pivot Charts: for dynamic visualisations
&lt;/li&gt;
&lt;li&gt;Slicers: for interactive filtering
&lt;/li&gt;
&lt;li&gt;GitHub: to host and share the final project&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Key Features of the Dashboard&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dynamic KPI cards for Total Sales, Total Profit, and Sales Target&lt;/li&gt;
&lt;li&gt;Bar and column charts for monthly and regional performance&lt;/li&gt;
&lt;li&gt;Category breakdown pie chart&lt;/li&gt;
&lt;li&gt;Slicers to filter by Region, Product Category, or Sales Period&lt;/li&gt;
&lt;li&gt;Clean, business-style layout&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🛑 Of course, I had challenges!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9jaejij52w3g71tui3uh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9jaejij52w3g71tui3uh.png" alt=" " width="800" height="416"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At first, I struggled with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Setting up multiple pivot tables on the same worksheet&lt;/li&gt;
&lt;li&gt;Designing a clean, clutter-free dashboard layout&lt;/li&gt;
&lt;li&gt;Making sure the slicers worked across all charts&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;But after a few tweaks and learning from some tutorials, I got it working smoothly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lessons Learned&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Pivot tables are incredibly powerful for summarising large data sets
&lt;/li&gt;
&lt;li&gt;A simple, clean dashboard layout communicates insights faster
&lt;/li&gt;
&lt;li&gt;Sharing your projects publicly helps build confidence and opens opportunities
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Project Link&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Check out the full project on my GitHub:&lt;br&gt;
&lt;a href="https://github.com/Akansrodger/-sales-dashboard-excel" rel="noopener noreferrer"&gt;📊 Sales Dashboard Repository&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Feel free to download the file, explore the pivot tables, and interact with the dashboard.&lt;/p&gt;

&lt;p&gt;👋 &lt;strong&gt;Closing Thoughts&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This was my first step toward building a strong public data analysis portfolio. I’ll be sharing more projects soon, including patient record analysis, survey data summaries, and a personal budget tracker.&lt;/p&gt;

&lt;p&gt;If you have feedback or suggestions, I'd love to connect!&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>portfolio</category>
      <category>database</category>
      <category>github</category>
    </item>
  </channel>
</rss>
