<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: vivian mukhongo (Avivo)</title>
    <description>The latest articles on Forem by vivian mukhongo (Avivo) (@vivian_mukhongoavivo_3).</description>
    <link>https://forem.com/vivian_mukhongoavivo_3</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1863593%2F88a20ee5-56c6-473c-bb1b-fdb9220b826c.jpg</url>
      <title>Forem: vivian mukhongo (Avivo)</title>
      <link>https://forem.com/vivian_mukhongoavivo_3</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/vivian_mukhongoavivo_3"/>
    <language>en</language>
    <item>
      <title>INSIGHTS FROM PROJECT 1 OF THE DATA SCIENCE BOOTCAMP</title>
      <dc:creator>vivian mukhongo (Avivo)</dc:creator>
      <pubDate>Sat, 03 Aug 2024 16:03:15 +0000</pubDate>
      <link>https://forem.com/vivian_mukhongoavivo_3/insights-from-project-1-of-the-data-science-bootcamp-5188</link>
      <guid>https://forem.com/vivian_mukhongoavivo_3/insights-from-project-1-of-the-data-science-bootcamp-5188</guid>
      <description>&lt;p&gt;I am so thrilled to dive into this five week boot camp by Lux Tech Academy&lt;/p&gt;

&lt;h2&gt;
  &lt;strong&gt;WEEK 1 PROJECT 1 PART 1&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;A weather dataset was provided, and we were asked to write Python code to answer questions about it.&lt;/p&gt;

&lt;h2&gt;
  &lt;strong&gt;STEPS USED&lt;/strong&gt;
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;In Jupyter Notebook, I imported the pandas and NumPy libraries&lt;/li&gt;
&lt;li&gt;Loaded the CSV file and started cleaning the data&lt;/li&gt;
&lt;/ol&gt;
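&lt;p&gt;The two setup steps above can be sketched as follows; the CSV text here is a made-up stand-in for the provided weather file, whose real name and columns are not given in the post:&lt;/p&gt;

```python
from io import StringIO
import pandas as pd

# Stand-in CSV text; the real project loaded the provided weather file
# from disk with pd.read_csv, the same call used here.
csv_text = "Date,Weather,Temp_C\n2024-01-01,Fog,8.0\n2024-01-02,Rain,2.4\n"
df = pd.read_csv(StringIO(csv_text))
print(df.shape)   # (2, 3)
```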

&lt;h2&gt;
  &lt;strong&gt;DATA CLEANING&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Checking the structure of the dataframe:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;df.head() - shows the first rows&lt;/li&gt;
&lt;li&gt;df.tail() - shows the last rows&lt;/li&gt;
&lt;li&gt;df.shape - gives the number of rows and columns (an attribute, not a method)&lt;/li&gt;
&lt;li&gt;df.info() - shows general information about the dataset&lt;/li&gt;
&lt;li&gt;df.dtypes - shows the data types of the columns&lt;/li&gt;
&lt;li&gt;df.isna() - checks for missing values in the dataset&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;I WENT AHEAD AND ANSWERED THE QUESTIONS ASKED USING THE FUNCTIONS BELOW&lt;/strong&gt;&lt;br&gt;
len() - determines the size or length of various data structures.&lt;br&gt;
In my code this function was used widely, especially where the number of records for certain values was asked.&lt;/p&gt;
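&lt;p&gt;These inspection steps can be sketched on a tiny made-up dataframe; the column names and values below are illustrative, not from the bootcamp dataset:&lt;/p&gt;

```python
import pandas as pd
import numpy as np

# A small, made-up weather DataFrame standing in for the bootcamp's CSV.
df = pd.DataFrame({
    "Temp_C": [8.0, 2.4, np.nan, 5.1],
    "Weather": ["Fog", "Rain", "Clear", "Snow"],
})

print(df.head(2))        # first rows
print(df.tail(2))        # last rows
print(df.shape)          # (rows, columns) -- an attribute, not a method
df.info()                # column types and non-null counts
print(df.dtypes)         # dtype of each column
print(df.isna().sum())   # count of missing values per column
```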

&lt;p&gt;print() - outputs a specified message; I used it to improve the readability of the output.&lt;br&gt;
df.rename() - renames the labels, columns, or index of a dataframe.&lt;br&gt;
I renamed the column 'Weather' to 'Weather_Conditions' using the syntax below:&lt;br&gt;
&lt;strong&gt;df.rename(columns={'Weather': 'Weather_Conditions'}, inplace=True)&lt;/strong&gt;&lt;br&gt;
.mean() - calculates the average value of numeric columns in a dataframe.&lt;br&gt;
groupby() - used when you need to perform an operation on your data grouped by the unique values in one or more columns.&lt;/p&gt;
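&lt;p&gt;A minimal sketch of the rename, mean, and groupby steps, on the same kind of made-up sample data (values are illustrative):&lt;/p&gt;

```python
import pandas as pd

# Hypothetical sample mirroring the dataset's 'Weather' column.
df = pd.DataFrame({
    "Weather": ["Fog", "Rain", "Fog", "Clear"],
    "Temp_C": [8.0, 2.4, 7.0, 5.1],
})

# Rename the 'Weather' column; note the quoted keys and the comma
# before the keyword argument ('inplace' is lowercase).
df.rename(columns={"Weather": "Weather_Conditions"}, inplace=True)

# Average of a numeric column.
print(df["Temp_C"].mean())

# Number of records per unique weather condition.
counts = df.groupby("Weather_Conditions").size()
print(counts)
```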

&lt;p&gt;From the above analysis, I understood that in the realm of data analysis, understanding and utilizing the right tools and techniques is crucial for extracting meaningful insights from datasets.&lt;/p&gt;

&lt;h2&gt;
  PART TWO: SQL CODE
&lt;/h2&gt;

&lt;p&gt;I used Microsoft SQL Server Management Studio, where I learnt how to connect to a database on the local host and the steps for importing a CSV file into the database for analysis.&lt;br&gt;
Here, I explored the use of SELECT ... FROM and the WHERE clause in SQL.&lt;/p&gt;
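&lt;p&gt;The SELECT ... FROM ... WHERE pattern can be sketched with Python's built-in sqlite3 module; the original work used SQL Server Management Studio, and the table name and rows below are made up for illustration:&lt;/p&gt;

```python
import sqlite3

# In-memory stand-in for the imported CSV table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE weather (day TEXT, condition TEXT, temp_c REAL)")
conn.executemany(
    "INSERT INTO weather VALUES (?, ?, ?)",
    [("Mon", "Fog", 8.0), ("Tue", "Rain", 2.4), ("Wed", "Clear", 5.1)],
)

# SELECT ... FROM with a WHERE clause: keep only rows matching the condition.
rows = conn.execute(
    "SELECT day, temp_c FROM weather WHERE condition = 'Fog'"
).fetchall()
print(rows)   # [('Mon', 8.0)]
conn.close()
```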

&lt;p&gt;Though I am a beginner in the field, I anticipate learning more and using a wide range of Python and SQL applications in the industry!&lt;br&gt;
I am so excited to be in the data industry!&lt;/p&gt;

</description>
      <category>database</category>
      <category>machinelearning</category>
      <category>datascience</category>
      <category>github</category>
    </item>
  </channel>
</rss>
