<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Chinwe O.</title>
    <description>The latest articles on Forem by Chinwe O. (@chinwee__o).</description>
    <link>https://forem.com/chinwee__o</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1001438%2Fb150dfb2-c4a5-47c4-878c-da5b99344abe.jpg</url>
      <title>Forem: Chinwe O.</title>
      <link>https://forem.com/chinwee__o</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/chinwee__o"/>
    <language>en</language>
    <item>
      <title>Vacation Planner Application using AWS</title>
      <dc:creator>Chinwe O.</dc:creator>
      <pubDate>Mon, 12 Feb 2024 01:04:37 +0000</pubDate>
      <link>https://forem.com/aws-builders/vacation-planner-application-using-aws-ali</link>
      <guid>https://forem.com/aws-builders/vacation-planner-application-using-aws-ali</guid>
      <description>&lt;p&gt;Traveling is a popular and enjoyable activity for people all over the world. I've been going on some spontaneous trips lately, and it's been tough to plan for them properly. So I started working on a little side project to help with that. It's still in its early stages, but it could grow into something bigger eventually.&lt;/p&gt;

&lt;p&gt;Let's start building! The main goal is to create a web app that gives users estimated flight costs, recommended travel dates, and hotel choices for a specific destination, all tailored to their budget.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Leveraging AWS for Vacation Planning Application&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbc2ycp1o2qfjmfltgaqm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbc2ycp1o2qfjmfltgaqm.png" alt="AWS Architecture Diagram" width="800" height="673"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Technologies and Services&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Frontend: HTML, CSS, and JavaScript, developed in AWS Cloud9 (I used Material UI for the design).&lt;/li&gt;
&lt;li&gt;Backend: AWS Lambda for serverless backend logic.&lt;/li&gt;
&lt;li&gt;Database: Amazon DynamoDB to store any necessary data, such as user preferences or historical data.&lt;/li&gt;
&lt;li&gt;APIs: third-party APIs for flight data, hotel listings, and travel recommendations.&lt;/li&gt;
&lt;li&gt;Hosting: Amazon S3 for hosting the static website, with Amazon API Gateway for the RESTful API.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Key Features of a Vacation Planner App
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Destination Input: users can enter their desired vacation location.&lt;/li&gt;
&lt;li&gt;Travel Date Suggestions: the tool suggests the best times to visit based on factors like weather, local events, and historical data.&lt;/li&gt;
&lt;li&gt;Flight Cost Estimates: estimated flight costs for the suggested dates.&lt;/li&gt;
&lt;li&gt;Hotel Recommendations: a list of hotel options in the destination area.&lt;/li&gt;
&lt;li&gt;Budget: users can enter their budget, and the app provides recommendations within that price range.&lt;/li&gt;
&lt;/ul&gt;
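&lt;p&gt;As a rough sketch, the features above could map onto a single request payload sent from the frontend to the backend. The field names here are hypothetical, not the app's actual schema:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json

# Hypothetical request covering destination, dates, flights, hotels, and budget
request_payload = {
    "destination": "Lisbon",
    "budget_usd": 1500,
    "suggest_dates": True,
    "include_flights": True,
    "include_hotels": True,
}

# API Gateway delivers the body to the backend as a JSON string
event_body = json.dumps(request_payload)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;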

&lt;p&gt;&lt;strong&gt;Frontend Development&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I created a basic web app where users can enter their vacation preferences, using a CSS framework (Material UI) to manage user inputs and display results.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Set Up AWS Cloud9&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Log into your AWS Account and go to the AWS Cloud9 service.&lt;/li&gt;
&lt;li&gt;Create a new environment: I gave it a name and proceeded with the default settings.&lt;/li&gt;
&lt;li&gt;Launch the environment: This will open the Cloud9 IDE in your browser.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kqxe5e5zcie6y4hyuqk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8kqxe5e5zcie6y4hyuqk.png" alt="Image of Cloud9" width="800" height="387"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I personally love the Preview feature in Cloud9, as it lets me watch the website in real time while developing the application. Once I had created the static website, I needed a way to host it and make it publicly accessible. This is where Amazon S3 became essential. To achieve this, I:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Created an S3 Bucket&lt;/li&gt;
&lt;li&gt;Uploaded my Website Files&lt;/li&gt;
&lt;li&gt;Enabled Static Website Hosting&lt;/li&gt;
&lt;li&gt;Set Bucket Permissions to Public. Making the bucket public is a necessary step when hosting a static website on Amazon S3, because it allows anyone on the internet to view and access your website.&lt;/li&gt;
&lt;li&gt;Retrieved the website endpoint AWS provided and opened it in my browser.&lt;/li&gt;
&lt;/ul&gt;
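&lt;p&gt;The same steps can be sketched with boto3. This is only an outline of the API calls involved, with a placeholder bucket name, not the exact console flow I followed:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json

BUCKET = "vacation-planner-demo-bucket"  # placeholder name

# Public-read policy: lets anyone on the internet fetch the site's objects
public_read_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}

def host_static_site():
    # Imported here so the sketch can be read without boto3 installed
    import boto3
    s3 = boto3.client("s3")
    s3.create_bucket(Bucket=BUCKET)
    s3.upload_file("index.html", BUCKET, "index.html")
    s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(public_read_policy))
    s3.put_bucket_website(
        Bucket=BUCKET,
        WebsiteConfiguration={"IndexDocument": {"Suffix": "index.html"}},
    )
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Calling &lt;em&gt;host_static_site()&lt;/em&gt; with AWS credentials configured performs the steps; note that newer accounts may also need the bucket's Block Public Access settings relaxed before a public policy is accepted.&lt;/p&gt;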

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwy5yqzxkapqcu02p6yj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffwy5yqzxkapqcu02p6yj.png" alt="Uploaded files to S3" width="800" height="464"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Web View&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Launched the web app using the CloudFront URL.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fke9v385zav3rbo0r9fw7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fke9v385zav3rbo0r9fw7.png" alt="Static Website on Cloudfront URL" width="800" height="192"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Backend Setup with AWS Lambda&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Here, I use AWS Lambda for the serverless backend logic. I set up the backend with AWS Lambda to process user inputs and interact with a third-party API. I went with my preferred API (there are other options out there that are also free to use). I built the Lambda function in Python.&lt;/p&gt;
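&lt;p&gt;As a minimal sketch of what such a function could look like (the flight data and field names below are stand-ins, since the real handler builds them from the third-party API's response):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import json

def flights_within_budget(flights, budget):
    """Keep flight options at or under the user's budget, cheapest first."""
    affordable = [f for f in flights if f["price_usd"] &lt;= budget]
    return sorted(affordable, key=lambda f: f["price_usd"])

def lambda_handler(event, context):
    body = json.loads(event["body"])
    destination = body["destination"]
    budget = body["budget_usd"]

    # In the real function, the third-party flight API is called here;
    # this sample list is a stand-in for its response.
    flights = [
        {"destination": destination, "date": "2024-03-10", "price_usd": 420},
        {"destination": destination, "date": "2024-03-17", "price_usd": 610},
    ]

    return {
        "statusCode": 200,
        "body": json.dumps(flights_within_budget(flights, budget)),
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;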

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzzti6dkez1xf5fxrdfeg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzzti6dkez1xf5fxrdfeg.png" alt="AWS Lambda" width="800" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Testing out the API using POSTMAN&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Before wiring the API into the app, I used Postman to confirm that it would actually fetch results. As you can see in the image below, it returns the expected results.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3y4tbjpypacvlfxmuv78.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3y4tbjpypacvlfxmuv78.png" alt="Postman" width="800" height="511"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What to expect next?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The next phases will cover implementing security measures to protect user data on AWS, analyzing the cost of running a vacation planner on AWS, and testing my AWS-based vacation planner application, among others.&lt;/p&gt;

&lt;p&gt;I am excited about this fun little project and look forward to using it to plan my trips. You should too!&lt;/p&gt;

</description>
      <category>travel</category>
      <category>aws</category>
      <category>awscommunitybuilders</category>
      <category>programming</category>
    </item>
    <item>
      <title>Sentiment Analysis on the Launch of Amazon Q using Amazon Comprehend and Sagemaker Studio</title>
      <dc:creator>Chinwe O.</dc:creator>
      <pubDate>Tue, 02 Jan 2024 20:06:08 +0000</pubDate>
      <link>https://forem.com/aws-builders/sentiment-analysis-on-the-launch-of-amazon-q-using-amazon-comprehend-and-sagemaker-studio-3ifp</link>
      <guid>https://forem.com/aws-builders/sentiment-analysis-on-the-launch-of-amazon-q-using-amazon-comprehend-and-sagemaker-studio-3ifp</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--X1qHUBKA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xqp5ga3jfyleq1hua1w0.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--X1qHUBKA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xqp5ga3jfyleq1hua1w0.jpg" alt="Image of Amazon Q" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Sentiment analysis is a powerful tool that allows companies to analyze customer opinions and sentiments towards their products or services. At AWS Re:Invent in Las Vegas on November 28, 2023, Amazon unveiled Amazon Q, an innovative AI-based assistant tailored for workplace use. This cutting-edge tool is designed to deliver quick responses to inquiries, create content, and enable actions using data from customer information databases, codebases, and corporate systems. Amazon Q provides personalized interactions to streamline processes and speed up decision-making while also fostering a culture of creativity and innovation in the workplace. The launch of Amazon Q marks a significant advancement in the field of AI-powered assistants for work.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uBcQJrZp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7fxn1yjxgczc07d4r0p2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uBcQJrZp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/7fxn1yjxgczc07d4r0p2.png" alt="Photocredit: Amazon Web Service" width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This mini-project grew out of my curiosity about the AI hype that seems to be dominating the technology landscape. As a Product Manager, this matters for gauging customer satisfaction and engagement, and ultimately for making data-driven decisions to scale a product or service. As a Data Analyst: what Duet AI is to Google Looker, Amazon Q is to AWS QuickSight. Integrated with QuickSight, the tool provides real-time analysis of data in seconds. When a new product is released, there is a need to determine its business value to customers. Sentiment analysis can help identify the overall sentiment towards the new product by analyzing customer opinions and emotions expressed in online reviews and social media posts.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Getting Started&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this tutorial, I will show how to perform a Sentiment Analysis using 2 major tools from AWS: &lt;strong&gt;AWS Sagemaker Studio&lt;/strong&gt; and &lt;strong&gt;Amazon Comprehend&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Architectural Diagram&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--y8LnjmzU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/co94d5ncj82pu16dzo4h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--y8LnjmzU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/co94d5ncj82pu16dzo4h.png" alt="Architectural Diagram" width="559" height="438"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Data Collection:&lt;/strong&gt; data was extracted from Twitter using Python. This data contained tweets from users. Here are some samples of the tweets scraped for this project:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Using #AmazonQ in the IDE is reeediculous! I know AI isn't always right but it has been absolutely amazing. It even overcomes my HORRIBLE spelling! So Sick!&lt;/li&gt;
&lt;li&gt;reInvent2023 unveiled #AmazonQ as a standout highlight. At Caylent, we're excited to streamline the integration of Amazon Q into MeteorAI, making adopting GenAI solutions that much faster and easier.&lt;/li&gt;
&lt;li&gt;Do I know anyone at AWS who can comment on when the Amazon Q Code Transformation for .NET will be available? If this works, it would be amazing for some legacy applications we need to modernize at work.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Data Transformation:&lt;/strong&gt; most of the transformation here was data cleaning, because the data contained special characters such as "#" and "@". This was done in Excel.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;IDE:&lt;/strong&gt; Amazon SageMaker Studio served as the coding environment for running the analysis.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sentiment Analysis:&lt;/strong&gt; AWS offers a range of services with strong NLP capabilities, including entity recognition, key phrase extraction, and sentiment analysis. Amazon Comprehend is one such service. It uses machine learning to uncover insights and connections within text, enabling the detection of sentiment.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;This article is best suited for programmers familiar with Python.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scraping of Tweets&lt;/strong&gt;&lt;br&gt;
I have a detailed article on Web Scraping &lt;a href="https://medium.com/@chinweee/web-scraping-using-python-forloop-dcad8030046"&gt;here&lt;/a&gt;, so follow the link to get started. Once the scraping is done, the tweets are saved in an Excel sheet and uploaded to an S3 bucket.&lt;/p&gt;
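&lt;p&gt;The upload step can be sketched with boto3 (the bucket and key names here are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def upload_tweets(file_name="amazon_q.csv", bucket="my-tweets-bucket", key="amazon_q.csv"):
    """Upload the scraped tweets file to S3 so it can be used downstream."""
    # Imported here so the sketch can be read without boto3 installed
    import boto3
    s3 = boto3.client("s3")
    s3.upload_file(file_name, bucket, key)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;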

&lt;p&gt;&lt;strong&gt;Sentiment Analysis&lt;/strong&gt;&lt;br&gt;
In Amazon SageMaker Studio, import the CSV file needed to run the analysis.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import boto3
import pandas as pd

comprehend_client = boto3.client('comprehend', region_name='us-east-1')

# Read CSV file
amazon_q_df = pd.read_csv('amazon_q.csv')
texts = amazon_q_df['Text']  
len(texts)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Perform the sentiment analysis using &lt;em&gt;detect_sentiment&lt;/em&gt; in Amazon Comprehend to determine the sentiment of each user's tweet. Read more on how it works &lt;a href="https://docs.aws.amazon.com/comprehend/latest/dg/how-sentiment.html"&gt;here&lt;/a&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sentiments = []

# Analyze text using Amazon Comprehend
for text in texts:
    sentiment_response = comprehend_client.detect_sentiment(Text=text, LanguageCode='en')
    sentiment_scan = sentiment_response['Sentiment']

    sentiments.append({'Text': text, 'Sentiment': sentiment_scan})

sentiment_df = pd.DataFrame(sentiments)
print(sentiment_df.head())
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Output:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NczJHPz2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6f5zhlou28glz0xki3s0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NczJHPz2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6f5zhlou28glz0xki3s0.png" alt="First 5 Tweets" width="800" height="115"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Visualize these texts by sentiments (mixed, neutral, positive, negative)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import seaborn as sns
import matplotlib.pyplot as plt

# Count the tweets in each sentiment category
percentage = sentiment_df.groupby(['Sentiment']).size()
percentage
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Convert the counts to percentages per sentiment category
percentage_new = (sentiment_df.groupby('Sentiment').size() / len(sentiment_df) * 100).reset_index(name='Percentage')

# One explode value per sentiment category, offsetting the first slice
explode = [0.1 if i == 0 else 0 for i in range(len(percentage_new))]

plt.figure(figsize=(8, 8))
plt.pie(percentage_new['Percentage'], labels=percentage_new['Sentiment'], autopct='%1.1f%%', startangle=140, explode=explode)
plt.title('Percentage of Tweets by Sentiment')

# Show the plot
plt.show()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mObZ1unT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cbzdk6yi2njcygedoy5n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mObZ1unT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cbzdk6yi2njcygedoy5n.png" alt="Image description" width="800" height="824"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;The data collected were only specific to Twitter and the tweets were quite limited too, which may skew the overall findings and limit generalizability.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;This shows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A large majority of tweets, 74.4%, are classified as Neutral, indicating no particular positive or negative emotion.&lt;/li&gt;
&lt;li&gt;20.9% of tweets are Positive, suggesting a favorable or optimistic sentiment towards Amazon Q.&lt;/li&gt;
&lt;li&gt;A small proportion, 2.3%, are Negative, implying a pessimistic or unfavorable sentiment.&lt;/li&gt;
&lt;li&gt;An equally small proportion, another 2.3%, are Mixed, indicating the presence of both positive and negative sentiments within the same content.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This analysis is useful for gauging public opinion and reaction to the launch of Amazon Q on social media platforms like Twitter.&lt;/p&gt;

&lt;p&gt;Amazon Comprehend and Amazon SageMaker Studio played a crucial role in carrying out the sentiment analysis of Amazon Q. With Amazon Comprehend's advanced NLP capabilities, such as entity recognition, key phrase extraction, and sentiment analysis, we were able to uncover valuable insights from the tweets extracted from Twitter. Amazon SageMaker Studio also provided an efficient coding environment for running these tasks. These tools not only facilitated the analysis process but also enhanced my understanding of public sentiment towards Amazon Q, and showcased the power and effectiveness of AI-powered tools in sentiment analysis.&lt;/p&gt;

&lt;p&gt;Thank you for taking the time to read my article. Don't forget to follow me &lt;a href="https://dev.to/chinwee__o"&gt;here&lt;/a&gt; and feel free to reach out to me. Looking forward to connecting again soon!&lt;/p&gt;

</description>
      <category>awscommunitybuilders</category>
      <category>aws</category>
      <category>python</category>
      <category>programming</category>
    </item>
    <item>
      <title>Language Whisperer - AI/ML Transformer Tools Hackathon</title>
      <dc:creator>Chinwe O.</dc:creator>
      <pubDate>Fri, 30 Jun 2023 04:28:09 +0000</pubDate>
      <link>https://forem.com/aws-builders/language-whisperer-aiml-transformer-tools-hackathon-i9a</link>
      <guid>https://forem.com/aws-builders/language-whisperer-aiml-transformer-tools-hackathon-i9a</guid>
      <description>&lt;p&gt;&lt;strong&gt;LanguageWhisperer: Facilitate your language learning with Transformers!&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;EDIT: We came 1st Place in the AWS Community Builder Hackathon 🥳🥇&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We live in a world full of different objects, images and languages. However, not everyone finds it easy to learn new languages. But what if we want to learn a foreign one? This is where our tool, the LanguageWhisperer, comes in.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;HOW DID WE GET HERE?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;On Friday 26th May, 2023, Lily Kerns, a Community Manager in the AWS Community Builder program announced &lt;strong&gt;our first ever Community Builders Hackathon!&lt;/strong&gt; This is a fantastic opportunity for all of us to learn, be creative, and create something remarkable. 🥳&lt;/p&gt;

&lt;p&gt;We teamed up together - &lt;a class="mentioned-user" href="https://dev.to/ronakreyhani"&gt;@ronakreyhani&lt;/a&gt;, &lt;a class="mentioned-user" href="https://dev.to/dashapetr"&gt;@dashapetr&lt;/a&gt;, &lt;a class="mentioned-user" href="https://dev.to/anja"&gt;@anja&lt;/a&gt; and I to work on this project &lt;/p&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/3oKIPyRuDfitoVWPWE/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/3oKIPyRuDfitoVWPWE/giphy.gif" alt="Girl Power"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Description&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Project Title:&lt;/strong&gt; Language Whisperer &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Language Whisperer&lt;/strong&gt; as the name implies is a simple-to-use application that enables you to translate and learn a new language through images. Think of Language Whisperer as your primary application whenever you encounter an unfamiliar object in any given situation. This application will assist you in gaining knowledge and understanding about the object in question, regardless of your prior familiarity with it.&lt;/p&gt;

&lt;p&gt;This application can be used in different scenarios such as education, entertainment, tourism etc. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Motivation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When visiting a foreign country and expressing a desire to acquire knowledge of its language, it would be advantageous to promptly access a comprehensive vocabulary for a specific location. Manually searching for numerous words would consume a considerable amount of time. However, with the LanguageWhisperer, efficiency is achieved. By capturing an image, multiple words can be translated simultaneously, enabling users to listen to the translations and gain deeper insights into their meanings.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Architectural Diagram&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flsxitro7kfgz1d0a6rde.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flsxitro7kfgz1d0a6rde.png" alt="Language Whisperer Architectural Diagram"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Process&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;During the initial stages of the project, our team held brainstorming sessions to explore various ideas for leveraging Transformers. We finally selected &lt;a class="mentioned-user" href="https://dev.to/dashapetr"&gt;@dashapetr&lt;/a&gt;'s concept, the &lt;strong&gt;&lt;em&gt;Language Whisperer&lt;/em&gt;&lt;/strong&gt;, for our project. To facilitate collaborative development, she created a Google Colab notebook where team members could experiment and iterate on the initial codebase. As the project progressed, we transitioned to GitHub to streamline collaboration and ensure efficient code management among team members, which allowed for smoother coordination and enhanced teamwork throughout development. The following tools were used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;StarCoder agent&lt;/strong&gt;, a large language model (LLM), offers a natural language API built on top of Transformers and was employed for the implementation. Detailed documentation for the agent can be found at &lt;a href="https://huggingface.co/docs/transformers/main_classes/agent" rel="noopener noreferrer"&gt;https://huggingface.co/docs/transformers/main_classes/agent&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Wiki Searcher (a custom tool)&lt;/strong&gt; was implemented by utilizing BeautifulSoup, a Python library designed for extracting data from HTML and XML files. This library played a crucial role in parsing and navigating the HTML structure of web pages, enabling the extraction of relevant information for the Wiki Searcher application.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;gTTS (Google Text-to-Speech) library (a custom tool)&lt;/strong&gt; was used to convert text into high-quality speech with natural-sounding voices. We chose it because the agent's default text-to-speech tool did not read text in various languages as accurately as we needed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Streamlit&lt;/strong&gt; was used for the frontend.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Challenges Faced&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Finding the appropriate voice for the task proved quite challenging. The built-in StarCoder text-to-speech tool rendered foreign phrases in English with a noticeable accent, causing confusion, so we researched alternatives. One option was Amazon Polly, although integrating it with Streamlit presented difficulties, as it required authorization against an AWS account. The gTTS library offered a viable alternative: it requires no keys or access, is easily installable via pip, and simply takes a language code as input, yielding natural-sounding voice output.&lt;/p&gt;

&lt;p&gt;One of the challenges we encountered was determining the appropriate front-end stack for our machine learning application. Initially, we embarked on building a Next.js React application with Python APIs. However, in an effort to simplify the process, we made the decision to utilize Next.js embedded APIs instead of deploying lambda functions and an API gateway. Unfortunately, this decision led to significant issues due to dependencies. We found ourselves needing to containerize the Python library dependencies. Considering the urgency of implementing our idea as quickly as possible for a quick proof of concept, we altered our approach and opted to implement the user interface using Streamlit.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Outcome&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With &lt;strong&gt;Language Whisperer&lt;/strong&gt;, all you have to do is &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;take a picture&lt;/li&gt;
&lt;li&gt;upload the image you would want to transcribe&lt;/li&gt;
&lt;li&gt;select your preferred choice of language&lt;/li&gt;
&lt;li&gt;play and listen to the transcribed language&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;For example, let's have a look at a sample image:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Falrhgniv5s1mue0iriqt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Falrhgniv5s1mue0iriqt.png" alt="Food"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 1:&lt;/strong&gt; &lt;strong&gt;Image Analysis&lt;/strong&gt; - Language Whisperer receives this image and generates the following&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2:&lt;/strong&gt; &lt;strong&gt;Image Caption:&lt;/strong&gt; 'a plate of food with eggs, hams, and toast'&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3:&lt;/strong&gt; &lt;strong&gt;Translate the caption into a language of your choice:&lt;/strong&gt; E.g Spanish&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4:&lt;/strong&gt; &lt;strong&gt;Learn/Read the caption:&lt;/strong&gt; un plato de comida con huevos, jamones y tostadas&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 5:&lt;/strong&gt; &lt;strong&gt;Search for a word meaning in Wiki:&lt;/strong&gt; (comida) –&amp;gt; food, something edible &amp;lt;…&amp;gt;&lt;/p&gt;
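&lt;p&gt;The last two steps can be sketched in Python. In the real app the caption string comes from the agent; here it is hard-coded, and the language-code mapping is our own illustration:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;LANG_CODES = {"Spanish": "es", "French": "fr", "German": "de"}

caption = "un plato de comida con huevos, jamones y tostadas"  # from Step 4

def speak(text, language="Spanish", out_file="caption.mp3"):
    """Read the translated caption aloud using gTTS."""
    # Imported here so the sketch can be read without gTTS installed
    from gtts import gTTS
    gTTS(text=text, lang=LANG_CODES[language]).save(out_file)
    return out_file
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Calling &lt;em&gt;speak(caption)&lt;/em&gt; writes an MP3 with the spoken Spanish caption; gTTS only needs a language code, no API keys, which is exactly why we picked it.&lt;/p&gt;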

&lt;p&gt;&lt;strong&gt;PROJECT OUTPUT&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Github Repository: &lt;a href="https://github.com/RonakReyhani/LanguageWhisperer" rel="noopener noreferrer"&gt;https://github.com/RonakReyhani/LanguageWhisperer&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Demo Video: &lt;a href="https://youtu.be/zaYRAKcPHOk" rel="noopener noreferrer"&gt;https://youtu.be/zaYRAKcPHOk&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbszvhwp6dka88yocu083.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbszvhwp6dka88yocu083.png" alt="Sample Code Snippet"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Lessons Learnt (New Skills Developed)&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use of Session States&lt;/li&gt;
&lt;li&gt;Use of Amazon Polly&lt;/li&gt;
&lt;li&gt;Use of Streamlit&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Reflection&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;&lt;a class="mentioned-user" href="https://dev.to/anja"&gt;@anja&lt;/a&gt;:&lt;/u&gt;&lt;/strong&gt; &lt;em&gt;As I didn’t have any experience with the Transformers library before the Hackathon, I was hoping to be able to contribute enough anyway. Luckily, the documentation is very beginner friendly even for people that aren’t experienced with Machine Learning yet. I have been reminded that you should never be afraid to try new tech tools, often it’s not as difficult as you think. I will definitely dive more into Machine Learning in the future. Also it was the first Hackathon I participated in, it was awesome to work together with my brilliant teammates.&lt;/em&gt;   &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;&lt;a class="mentioned-user" href="https://dev.to/ronakreyhani"&gt;@ronakreyhani&lt;/a&gt;:&lt;/u&gt;&lt;/strong&gt; &lt;em&gt;I have always harboured a deep passion for machine learning (ML), which makes every new concept or topic in the field incredibly enticing. Recently, my curiosity led me to explore the realm of Generative AI, specifically the renowned Hugging Face LLM models. Although I had heard about them in passing, I had never had the opportunity to delve into their intricacies. This project presented a remarkable chance to step out of my comfort zone and venture into the unknown.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Throughout this journey, I gained extensive knowledge about various aspects of the Hugging Face ecosystem, including Hugging Face Hub, models, transformers, pipelines, and the widely acclaimed "agent" that our app heavily relies on. Beyond the technical advancements, what truly made this experience exceptional was the opportunity to collaborate with an extraordinary team spanning across the globe. Through virtual meetings and vibrant discussions, we pooled our ideas and arrived at a common understanding. Working with them was truly inspiring, and their unwavering support allowed me the freedom to implement my ideas using the tools I was most comfortable with.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;As the hackathon reached its conclusion, I not only acquired a wealth of knowledge about LLM models and Hugging Face agents, but I also forged incredible friendships. The prospect of meeting these newfound friends in person fills me with anticipation and excitement. In retrospect, this sense of camaraderie and connection stands as the greatest achievement of this endeavour.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;&lt;a class="mentioned-user" href="https://dev.to/dashapetr"&gt;@dashapetr&lt;/a&gt;:&lt;/em&gt;&lt;/strong&gt;  &lt;em&gt;As a Data Scientist, I had a bit of experience with Transformers, but I haven’t used tools and agents. I found the hackathon idea very interesting because I saw the huge tools' potential. Choosing the concept was quite challenging; I had several of them, but when the LanguageWhisperer came to my mind, I was so excited that I decided to put aside all the rest of the ideas. The LanguageWhisperer is something I wish existed when I was struggling to learn French and Chinese. I am grateful that my team decided to go with this idea, and I am extremely happy to get to know my fellow female builders better; it’s an enormous pleasure to build together!&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;&lt;a class="mentioned-user" href="https://dev.to/chinwee__o"&gt;@chinwee__o&lt;/a&gt;:&lt;/u&gt;&lt;/strong&gt; &lt;em&gt;One of the many things that stood out for me was employing different alternatives available needed to get the work done. One of which was trying out Amazon Polly for the purpose of this project. While transformer had a text-to-speech agent which could have been implemented in this project, however the outcome did not produce the best result. This further buttressed that these alternatives are available in order to meet a specific need if others fail. 1 month, 4 weeks, 29 days, 696 hours, 41760 minutes,  2505600 seconds and every single meeting, conversations, chat, lines of code, new learnings with these ladies was worth the while.&lt;/em&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;FURTHER STEPS&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We believe that the LanguageWhisperer can be extended and improved. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Firstly, its functionality can be expanded to facilitate the comparison of translations in multiple languages, generate illustrative usage examples for a given word, and provide a feature to "listen" to and rectify the user's pronunciation.&lt;/li&gt;
&lt;li&gt;Secondly, LanguageWhisperer has the potential to be transformed into a mobile application, enabling users to access it conveniently from any location.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>aws</category>
      <category>huggingface</category>
      <category>ai</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>Getting Started with AWS Storage tools — s3</title>
      <dc:creator>Chinwe O.</dc:creator>
      <pubDate>Wed, 04 Jan 2023 04:10:28 +0000</pubDate>
      <link>https://forem.com/aws-builders/getting-started-with-aws-storage-tools-s3-3djd</link>
      <guid>https://forem.com/aws-builders/getting-started-with-aws-storage-tools-s3-3djd</guid>
      <description>&lt;p&gt;Just getting started with Cloud Computing in Amazon Web Service as a Data Analyst and not sure of the different cloud services to should explore. This article is for you.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XHrbnEqO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rilan53skzwtfnc64grq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XHrbnEqO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/rilan53skzwtfnc64grq.png" alt="Image description" width="880" height="527"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As a data analyst, it is imperative to have a basic knowledge of cloud services and to learn how to use them for whatever projects you will be working on. Just like you, I recently started exploring and working with the different storage (S3) and analytical tools (AWS Athena and QuickSight) available in AWS.&lt;/p&gt;

&lt;p&gt;In this article, we will focus on STORAGE SERVICES (Amazon S3), which allow us to store structured datasets (the kind of data you might otherwise keep in SQL Server, Oracle, or MySQL) as files for use by applications. In my previous article, I outlined the steps needed to ingest data from Excel (.xlsx) into an Azure SQL Server database; here, I will guide you through ingesting data into one of AWS's cloud storage services, S3.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HH6B4SdI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3jmyrkc4tqu0ioneiz7w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HH6B4SdI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3jmyrkc4tqu0ioneiz7w.png" alt="Image description" width="880" height="189"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Before I go on, it is important that you have credits in your AWS account and a dataset available.&lt;/p&gt;

&lt;p&gt;For this walkthrough, we will be using structured data stored in .csv format. Here are some links to free datasets:&lt;/p&gt;

&lt;p&gt;AWS Datasets: &lt;a href="https://registry.opendata.aws/"&gt;https://registry.opendata.aws/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Data World: &lt;a href="https://data.world/"&gt;https://data.world/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Kaggle: &lt;a href="https://www.kaggle.com/datasets"&gt;https://www.kaggle.com/datasets&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Data.gov: &lt;a href="https://data.gov/"&gt;https://data.gov/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Datahub.io: &lt;a href="https://datahub.io/search"&gt;https://datahub.io/search&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here is the step-by-step process:&lt;br&gt;
&lt;strong&gt;Step 1:&lt;/strong&gt; Log into your AWS account and search for S3 (Amazon Simple Storage Service) in the console home. S3 is a fully managed, object-based storage service that is highly durable and cost-effective. It is used for storing large files such as videos, images, static websites, and even backup archives.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--vJNUoeaL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bilqx1l4u6633u0nvhl3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--vJNUoeaL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bilqx1l4u6633u0nvhl3.png" alt="Image description" width="880" height="551"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2:&lt;/strong&gt; Create a bucket. Think of a bucket as a container in which your objects (files) are stored.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y8GLL4rY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9zlezi4y0nevpp6qxs7m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y8GLL4rY--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9zlezi4y0nevpp6qxs7m.png" alt="Image description" width="880" height="214"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By default, you can create up to 100 buckets per AWS account (a quota that can be increased on request).&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Click the Create icon.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Next, choose a bucket name (note that bucket names must be globally unique across all AWS accounts).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Change the AWS Region as you desire.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For the other options, such as Object Ownership, I usually just use the default recommended option.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Review the Block Public Access settings for your bucket (these protect against unintended public access and also let you inspect and change existing policies and ACLs for your buckets and objects).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Enable Bucket Versioning. I recommend enabling this so that you keep a history of the changes made to your objects, which helps you recover them if you accidentally overwrite or delete something (think of it like Git version control).&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Go on and click the Create Bucket icon. Once the bucket has been created, you will see a success message, e.g. “Successfully created bucket ‘rfmdataworld’”.&lt;/p&gt;
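&lt;p&gt;If you prefer scripting to clicking through the console, the same bucket creation can be sketched with boto3. This is a sketch under assumptions: your AWS credentials are already configured, and the bucket name and region are just examples (the name check below is a simplified version of the full S3 naming rules).&lt;/p&gt;

```python
import re

def is_valid_bucket_name(name):
    # Simplified check: 3-63 chars of lowercase letters, digits, dots,
    # and hyphens, starting and ending with a letter or digit.
    return bool(re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", name))

def create_versioned_bucket(name, region="us-east-1"):
    # Requires AWS credentials; boto3 is imported lazily so the name
    # check above stays usable offline.
    import boto3
    s3 = boto3.client("s3", region_name=region)
    if region == "us-east-1":
        s3.create_bucket(Bucket=name)  # us-east-1 takes no LocationConstraint
    else:
        s3.create_bucket(
            Bucket=name,
            CreateBucketConfiguration={"LocationConstraint": region},
        )
    # Same step as in the article: turn on Bucket Versioning.
    s3.put_bucket_versioning(
        Bucket=name, VersioningConfiguration={"Status": "Enabled"}
    )
```

&lt;p&gt;For example, &lt;code&gt;create_versioned_bucket("rfmdataworld")&lt;/code&gt; creates the same versioned bucket as the console steps above.&lt;/p&gt;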

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EIg40NNe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/schu2dtwtylis0d5y13r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EIg40NNe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/schu2dtwtylis0d5y13r.png" alt="Image description" width="880" height="163"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3:&lt;/strong&gt; Select the newly created bucket, “rfmdataworld”.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4:&lt;/strong&gt; Upload your data into the bucket.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the Objects tab, click the “Upload” icon; you will then be redirected to the upload page.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9o5oFFLd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kebobcs057eh6bg9jmt2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9o5oFFLd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kebobcs057eh6bg9jmt2.png" alt="Image description" width="880" height="325"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Select “Add files”; in this case, the files would be any dataset you already have stored on your computer.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Go ahead and upload the file.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once it has been uploaded, a success status, “Your file has been successfully uploaded”, will be shown.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Z_yYWiqW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nqb0iw4nwyk0ob1fmifr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Z_yYWiqW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/nqb0iw4nwyk0ob1fmifr.png" alt="Image description" width="880" height="225"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can also view the details of the file that has just been uploaded.&lt;/p&gt;

&lt;p&gt;This file is now in your newly created S3 bucket.&lt;/p&gt;
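&lt;p&gt;The console upload above can also be scripted with boto3. Here is a minimal sketch, again assuming configured credentials; the bucket and file names are illustrative, not fixed by the article.&lt;/p&gt;

```python
import os

def object_key_for(path, prefix=""):
    # Derive the S3 object key from a local file path: the file's
    # basename, with an optional "folder" prefix.
    base = os.path.basename(path)
    return prefix + "/" + base if prefix else base

def upload_csv(path, bucket, prefix=""):
    # Requires AWS credentials; boto3 is imported lazily so the key
    # helper above stays usable offline.
    import boto3
    s3 = boto3.client("s3")
    key = object_key_for(path, prefix)
    s3.upload_file(path, bucket, key)  # handles multipart for large files
    head = s3.head_object(Bucket=bucket, Key=key)  # confirm it landed
    return head["ContentLength"]
```

&lt;p&gt;For example, &lt;code&gt;upload_csv("customers.csv", "rfmdataworld")&lt;/code&gt; would place the file at the top level of the bucket created earlier.&lt;/p&gt;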

&lt;p&gt;Now you have seen how to upload and store data in Amazon S3.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>aws</category>
      <category>data</category>
      <category>cloud</category>
    </item>
  </channel>
</rss>
