<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Roberto Battaglia</title>
    <description>The latest articles on Forem by Roberto Battaglia (@robertobatts).</description>
    <link>https://forem.com/robertobatts</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F103305%2F6d757bbb-c3aa-4430-9879-0a24b68b1705.jpg</url>
      <title>Forem: Roberto Battaglia</title>
      <link>https://forem.com/robertobatts</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/robertobatts"/>
    <language>en</language>
    <item>
      <title>Deploying Static Websites with AWS CloudFront and S3</title>
      <dc:creator>Roberto Battaglia</dc:creator>
      <pubDate>Thu, 31 Jul 2025 20:35:51 +0000</pubDate>
      <link>https://forem.com/robertobatts/deploying-static-websites-with-aws-cloudfront-and-s3-49md</link>
      <guid>https://forem.com/robertobatts/deploying-static-websites-with-aws-cloudfront-and-s3-49md</guid>
      <description>&lt;p&gt;When it comes to hosting static websites, AWS offers a powerful combination of S3 and CloudFront that can significantly improve performance while keeping costs low. In this guide, I'll walk you through setting up a production-ready static website using CloudFront as a CDN in front of S3.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why CloudFront + S3?
&lt;/h2&gt;

&lt;p&gt;Before diving into the implementation, let's understand why this setup is beneficial:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Global Performance&lt;/strong&gt;: CloudFront caches your content at edge locations worldwide, reducing latency for users regardless of their location&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost Efficiency&lt;/strong&gt;: S3 storage is cheap, and CloudFront data transfer costs are often lower than direct S3 access&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security&lt;/strong&gt;: You can keep your S3 bucket private while serving content publicly through CloudFront&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;SSL/TLS&lt;/strong&gt;: Easy HTTPS implementation with AWS Certificate Manager&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;You'll need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An AWS account&lt;/li&gt;
&lt;li&gt;A domain name (optional but recommended)&lt;/li&gt;
&lt;li&gt;Your static website files (HTML, CSS, JS, images, etc.)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 1: Setting Up Your S3 Bucket
&lt;/h2&gt;

&lt;p&gt;Have a look at &lt;a href="https://dev.to/posts/s3-static-website/"&gt;my previous article on serving a website with S3&lt;/a&gt;, and skip Step 4 there.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 2: Create a CloudFront Distribution
&lt;/h2&gt;

&lt;p&gt;Now let's set up CloudFront to serve your content:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to the CloudFront console&lt;/li&gt;
&lt;li&gt;Click "Create distribution"&lt;/li&gt;
&lt;li&gt;For "Origin domain", select your S3 bucket&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Important&lt;/strong&gt;: Click "Use website endpoint"; this ensures proper routing for single-page applications (SPAs). To use the website endpoint, your bucket must be public&lt;/li&gt;
&lt;li&gt;Keep default settings for other options&lt;/li&gt;
&lt;li&gt;Click "Create distribution"&lt;/li&gt;
&lt;/ol&gt;
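&lt;p&gt;If you prefer the command line, the console steps above can be sketched with the AWS CLI. This is a minimal example, not a full production config: the origin domain is a placeholder for your own bucket's website endpoint, and everything else keeps CloudFront's defaults.&lt;/p&gt;

```shell
# Create a distribution in front of the S3 *website* endpoint
# (replace the origin domain with your own bucket's endpoint).
aws cloudfront create-distribution \
  --origin-domain-name my-bucket.s3-website-us-east-1.amazonaws.com \
  --default-root-object index.html
```

&lt;p&gt;The command returns the new distribution's ID and domain name, which you'll need later for invalidations and DNS records.&lt;/p&gt;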

&lt;p&gt;The distribution will take a few minutes to deploy. Once ready, you can find the CloudFront URL in &lt;em&gt;General &amp;gt; Details &amp;gt; Distribution domain name&lt;/em&gt;. Open it in a browser to check that your website content is served correctly.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Custom Domain Setup (Optional)
&lt;/h2&gt;

&lt;p&gt;To use your own domain:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Configure DNS&lt;/strong&gt;:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;In Route 53, create an alias A record pointing to your CloudFront distribution&lt;/li&gt;
&lt;li&gt;Create another alias A record for &lt;code&gt;www&lt;/code&gt; pointing to the same distribution&lt;/li&gt;
&lt;/ul&gt;
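&lt;p&gt;As a sketch, the same alias record can be created with the AWS CLI. Your hosted zone ID and the distribution domain name below are placeholders; &lt;code&gt;Z2FDTNDATAQYW2&lt;/code&gt; is the fixed hosted zone ID Route 53 uses for every CloudFront distribution.&lt;/p&gt;

```shell
# Point the apex domain at the CloudFront distribution with an alias A record.
aws route53 change-resource-record-sets \
  --hosted-zone-id YOUR_HOSTED_ZONE_ID \
  --change-batch '{
    "Changes": [{
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "yourdomain.com",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "d1234abcd.cloudfront.net",
          "EvaluateTargetHealth": false
        }
      }
    }]
  }'
```

&lt;p&gt;Repeat the same command with &lt;code&gt;"Name": "www.yourdomain.com"&lt;/code&gt; for the &lt;code&gt;www&lt;/code&gt; record.&lt;/p&gt;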

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Create SSL Certificate&lt;/strong&gt;:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Go to AWS Certificate Manager (ACM)&lt;/li&gt;
&lt;li&gt;Request a certificate for &lt;code&gt;yourdomain.com&lt;/code&gt; and &lt;code&gt;www.yourdomain.com&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Choose DNS validation as the validation method&lt;/li&gt;
&lt;/ul&gt;
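&lt;p&gt;If you'd rather request the certificate from the CLI, a minimal sketch looks like this. Note that CloudFront only accepts certificates from the &lt;code&gt;us-east-1&lt;/code&gt; region, regardless of where your bucket lives.&lt;/p&gt;

```shell
# Request a DNS-validated certificate covering both the apex and www domains.
aws acm request-certificate \
  --domain-name yourdomain.com \
  --subject-alternative-names www.yourdomain.com \
  --validation-method DNS \
  --region us-east-1
```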

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Update CloudFront Distribution&lt;/strong&gt;:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Go to your CloudFront distribution settings&lt;/li&gt;
&lt;li&gt;Click on "General" and then, under the "Settings" section, click "Edit"&lt;/li&gt;
&lt;li&gt;Set the domain specified in your ACM certificate (&lt;code&gt;yourdomain.com&lt;/code&gt;) in &lt;strong&gt;Alternate domain name (CNAME)&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Select the ACM certificate&lt;/li&gt;
&lt;li&gt;Save&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Redirect traffic from HTTP to HTTPS&lt;/strong&gt;:&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;Click on "Behaviours" from Cloudfront distribution settings&lt;/li&gt;
&lt;li&gt;Select the behaviour and edit it&lt;/li&gt;
&lt;li&gt;Set &lt;strong&gt;Viewer protocol policy&lt;/strong&gt; to Redirect HTTP to HTTPS&lt;/li&gt;
&lt;li&gt;Save&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 4: Cache Management
&lt;/h2&gt;

&lt;p&gt;CloudFront caches your content, so updates won't appear immediately if the old version is already cached and hasn't expired. To invalidate the cache:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to your CloudFront distribution&lt;/li&gt;
&lt;li&gt;Click "Invalidations" tab&lt;/li&gt;
&lt;li&gt;Create invalidation with path &lt;code&gt;/*&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Wait for completion (just a few minutes)&lt;/li&gt;
&lt;/ol&gt;
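&lt;p&gt;The same invalidation can be created from the CLI; the distribution ID below is a placeholder for your own.&lt;/p&gt;

```shell
# Invalidate every cached object under the distribution.
aws cloudfront create-invalidation \
  --distribution-id YOUR_DISTRIBUTION_ID \
  --paths "/*"
```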

&lt;h2&gt;
  
  
  Advanced: Private S3 Bucket Setup
&lt;/h2&gt;

&lt;p&gt;I've written another tutorial on &lt;a href="http://codevup.com/posts/cloudfront-static-website-public/" rel="noopener noreferrer"&gt;how to set up this architecture with a private bucket&lt;/a&gt;. Have a read if you're interested. Otherwise, you can go straight to the &lt;a href="https://codevup.com/posts/s3-cloudfront-static-website-full-guide/" rel="noopener noreferrer"&gt;full guide&lt;/a&gt;, which includes all the steps starting from S3 bucket creation.&lt;/p&gt;

&lt;h2&gt;
  
  
  Cost Considerations
&lt;/h2&gt;

&lt;p&gt;While this setup is cost-effective, be mindful of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CloudFront data transfer costs (though often cheaper than direct S3)&lt;/li&gt;
&lt;li&gt;Cache invalidation costs (the first 1,000 invalidation paths per month are free; $0.005 per path thereafter)&lt;/li&gt;
&lt;li&gt;SSL certificate costs (free with ACM)&lt;/li&gt;
&lt;/ul&gt;
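&lt;p&gt;As a rough sketch of how the invalidation pricing adds up, assuming the first 1,000 invalidation paths per month are free and $0.005 per path thereafter:&lt;/p&gt;

```python
def invalidation_cost(paths_this_month, free_tier=1000, rate=0.005):
    """Estimate the monthly cost of CloudFront invalidation paths:
    the first free_tier paths are free, then rate dollars per path."""
    return max(0, paths_this_month - free_tier) * rate

print(invalidation_cost(800))   # within the free tier: 0.0
print(invalidation_cost(3000))  # 2000 billable paths: 10.0
```

&lt;p&gt;A wildcard path like &lt;code&gt;/*&lt;/code&gt; counts as a single path, so one blanket invalidation per deploy is usually the cheapest option.&lt;/p&gt;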

&lt;h2&gt;
  
  
  Troubleshooting Common Issues
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Website not loading&lt;/strong&gt;: Check bucket permissions and CloudFront origin settings&lt;br&gt;
&lt;strong&gt;HTTPS errors&lt;/strong&gt;: Verify SSL certificate is properly attached to CloudFront&lt;br&gt;
&lt;strong&gt;Cache issues&lt;/strong&gt;: Create cache invalidation and wait for completion&lt;br&gt;
&lt;strong&gt;Routing problems&lt;/strong&gt;: Ensure you're using the S3 website endpoint, not the bucket endpoint&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This CloudFront + S3 setup provides a robust, scalable solution for hosting static websites. The combination offers excellent performance, security, and cost-effectiveness for most use cases.&lt;/p&gt;

&lt;p&gt;For a more comprehensive guide covering advanced scenarios like private buckets, check out this &lt;a href="https://codevup.com/posts/s3-cloudfront-static-website-full-guide" rel="noopener noreferrer"&gt;complete AWS static website deployment guide&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The real power of this setup is its simplicity. Once configured, it requires minimal maintenance while providing production-ready stability and low latency.&lt;/p&gt;

</description>
      <category>s3</category>
      <category>cloudfront</category>
      <category>aws</category>
      <category>serverless</category>
    </item>
    <item>
      <title>CI/CD: Building a pipeline used by multiple repositories with Jenkins and Artifactory integration</title>
      <dc:creator>Roberto Battaglia</dc:creator>
      <pubDate>Sun, 15 Dec 2019 14:31:38 +0000</pubDate>
      <link>https://forem.com/robertobatts/ci-cd-building-a-pipeline-used-by-multiple-repositories-with-jenkins-and-artifactory-integration-3kk5</link>
      <guid>https://forem.com/robertobatts/ci-cd-building-a-pipeline-used-by-multiple-repositories-with-jenkins-and-artifactory-integration-3kk5</guid>
      <description>&lt;h1&gt;
  
  
  &lt;a href="https://codevup.com/posts/jenkins-automatic-cicd/" rel="noopener noreferrer"&gt;Read this article directly from my website by clicking here&lt;/a&gt;
&lt;/h1&gt;

&lt;p&gt;Do you want to use the same pipeline for hundreds of projects without adding a Jenkinsfile to each repo? This guide is for you!&lt;/p&gt;

&lt;p&gt;In section &lt;strong&gt;A&lt;/strong&gt; I'm going to show how a single pipeline can be executed automatically by a commit to any of my repositories. I'm using the &lt;em&gt;Remote File Plugin&lt;/em&gt;, which allows you to &lt;strong&gt;make a single Jenkinsfile triggerable automatically by any of your repos&lt;/strong&gt;. The main advantage of this method is that it &lt;strong&gt;automates the creation of the jobs&lt;/strong&gt;. A similar result can also be achieved with the &lt;em&gt;Pipeline Shared Groovy Libraries Plugin&lt;/em&gt;, which gives you more flexibility if you want each repo to be built differently, but the downside is that you still have to create a Jenkinsfile for each project. &lt;br&gt;
Although I'm using Bitbucket (based on Git), it can easily be replaced with GitHub or any other version control tool supported by Jenkins.&lt;/p&gt;

&lt;p&gt;In section &lt;strong&gt;B&lt;/strong&gt; I'm going to build a pipeline that executes a Maven build, resolving the dependencies from Artifactory, and then publishes the artifacts and the build info to Artifactory.&lt;br&gt;
Finally, I'll test this pipeline on two repositories, one of which depends on the other.&lt;/p&gt;
&lt;h2&gt;
  
  
  &lt;strong&gt;A. Set up a pipeline for multiple repositories&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;1. Add a webhook on your Bitbucket repositories&lt;/strong&gt; to trigger the Jenkins job when a commit is pushed. The URL is just the address of the machine where Jenkins is installed, followed by &lt;em&gt;/bitbucket-hook/&lt;/em&gt;. Make sure to include the trailing &lt;em&gt;“/”&lt;/em&gt; in the URL; the hook won't work without it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6yjtj0q6fbkim30jd3v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq6yjtj0q6fbkim30jd3v.png" alt="Bitbucket webhook" width="800" height="593"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Create a Jenkins Job&lt;/strong&gt; by clicking on &lt;em&gt;New Item &amp;gt; Bitbucket Team/Project&lt;/em&gt; (you need &lt;a href="https://wiki.jenkins.io/display/JENKINS/Bitbucket+Branch+Source+Plugin" rel="noopener noreferrer"&gt;Bitbucket Branch Source Plugin&lt;/a&gt; for this).&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Put your Bitbucket owner and credentials under Projects. Under &lt;em&gt;Local File&lt;/em&gt;, insert "pom.xml" so that Jenkins triggers the pipeline for every repository that has a pom. After doing so, your job will automatically scan all your projects.&lt;/li&gt;
&lt;li&gt;Create a repository containing only a Jenkinsfile with the pipeline that you want executed for all of your repositories. If you're interested in integrating Maven builds with Artifactory, you can find an example pipeline in the next section; otherwise, just make sure to include the Bitbucket trigger:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight groovy"&gt;&lt;code&gt;&lt;span class="n"&gt;pipeline&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
    &lt;span class="o"&gt;...&lt;/span&gt;
    &lt;span class="n"&gt;triggers&lt;/span&gt; &lt;span class="o"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;bitbucketPush&lt;/span&gt;&lt;span class="o"&gt;()&lt;/span&gt;
    &lt;span class="o"&gt;}&lt;/span&gt;
    &lt;span class="o"&gt;...&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Connect your Jenkinsfile to your job&lt;/strong&gt;. Go to the job configuration, under &lt;em&gt;Projects&lt;/em&gt; click on &lt;em&gt;Add &amp;gt; Remote File Plugin&lt;/em&gt; (you need to install &lt;a href="https://plugins.jenkins.io/remote-file" rel="noopener noreferrer"&gt;Remote File Plugin&lt;/a&gt;), then add the information to access the repo containing your pipeline, and specify the name of the script that you want to be triggerable.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faqlo0bykgs07ndyakklf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faqlo0bykgs07ndyakklf.png" alt="Remote File Plugin" width="800" height="393"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From now on, if you want to add one more project, you just have to &lt;em&gt;Scan Organization Folder&lt;/em&gt;; afterwards, the pipeline will be called every time you push to your new repo. You can also automate the scan by setting a periodic scan trigger in the configuration page of your &lt;em&gt;Bitbucket Team/Project&lt;/em&gt;.&lt;/p&gt;
&lt;h2&gt;
  
  
  &lt;strong&gt;B. Integrate with JFrog Artifactory&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;1. Configure Maven and JDK&lt;/strong&gt; on &lt;em&gt;Manage Jenkins &amp;gt; Global Tool Configuration&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjf54hlj95v77cmzcvmei.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjf54hlj95v77cmzcvmei.png" alt="Maven Setup" width="552" height="293"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiaspn4p4cb2c7h6oyxvk.png" alt="JDK Setup" width="800" height="222"&gt;&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;sup&gt; N.B. Use “Install Automatically” only if Jenkins is running on a RedHat machine, otherwise it will throw an IllegalArgumentException&lt;/sup&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;2. Create the repositories on Artifactory&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a local repository on Artifactory&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxehvtzxttc3y8ozz7z0b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxehvtzxttc3y8ozz7z0b.png" alt="Local Repo Artifactory" width="800" height="443"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a virtual repository containing the local one&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F79aeyesbl32dspj0j15a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F79aeyesbl32dspj0j15a.png" alt="Virtual Repo Artifactory" width="800" height="291"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Configure Artifactory on Jenkins&lt;/strong&gt; on &lt;em&gt;Manage Jenkins &amp;gt; Configure System&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwszoyxj52knxky3yfkz4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwszoyxj52knxky3yfkz4.png" alt="Artifactory Setup" width="800" height="233"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In order to resolve your dependencies from Artifactory when a build is executed on Jenkins, you need to set up the settings.xml file.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Generate settings.xml from Artifactory by going on your virtual repository and then &lt;em&gt;Set Me Up &amp;gt; Generate Maven Settings&lt;/em&gt;
&lt;/li&gt;
&lt;li&gt;Go on &lt;em&gt;Manage Jenkins &amp;gt; Managed Files &amp;gt; Add a new Config&lt;/em&gt; and insert your settings.xml here (you need &lt;a href="https://wiki.jenkins-ci.org/display/JENKINS/Config+File+Provider+Plugin" rel="noopener noreferrer"&gt;Config File Provider Plugin&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;If you want to resolve the dependencies from Artifactory during your local builds as well, put the settings.xml also at &lt;em&gt;%USERPROFILE%\.m2\settings.xml&lt;/em&gt;. Your repository configuration should look like this:&lt;/li&gt;
&lt;/ul&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
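&lt;p&gt;A minimal sketch of what the mirror section of that settings.xml might contain; the virtual repository name (&lt;code&gt;libs-release&lt;/code&gt;) and the Artifactory URL are placeholders for your own setup:&lt;/p&gt;

```xml
&lt;settings&gt;
  &lt;mirrors&gt;
    &lt;mirror&gt;
      &lt;id&gt;artifactory&lt;/id&gt;
      &lt;mirrorOf&gt;*&lt;/mirrorOf&gt;
      &lt;url&gt;http://localhost:8081/artifactory/libs-release&lt;/url&gt;
    &lt;/mirror&gt;
  &lt;/mirrors&gt;
&lt;/settings&gt;
```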


&lt;p&gt;&lt;strong&gt;4. Write your pipeline&lt;/strong&gt; using the declarative syntax.&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
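&lt;p&gt;As a hedged sketch, such a pipeline can be written with the Artifactory plugin's declarative steps. The tool names, the &lt;code&gt;serverId&lt;/code&gt;, and the repository names below are placeholders that must match what you configured in the previous steps:&lt;/p&gt;

```groovy
pipeline {
    agent any
    // Trigger on every push from the Bitbucket webhook (see section A)
    triggers { bitbucketPush() }
    tools {
        maven 'maven'  // names set in Global Tool Configuration
        jdk 'jdk8'
    }
    stages {
        stage('Build and publish') {
            steps {
                // Resolve dependencies from the virtual repository
                rtMavenResolver(id: 'resolver', serverId: 'artifactory',
                        releaseRepo: 'libs-release', snapshotRepo: 'libs-snapshot')
                // Deploy artifacts to the local repositories
                rtMavenDeployer(id: 'deployer', serverId: 'artifactory',
                        releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local')
                rtMavenRun(pom: 'pom.xml', goals: 'clean install',
                        resolverId: 'resolver', deployerId: 'deployer')
                // Publish the build info to Artifactory
                rtPublishBuildInfo(serverId: 'artifactory')
            }
        }
    }
}
```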


&lt;p&gt;If you want to use it automatically on multiple projects, you just have to push this file to the repository declared on the &lt;em&gt;Remote File Plugin&lt;/em&gt; part of &lt;strong&gt;A.2&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Try it out!&lt;/strong&gt; I have two repositories on Bitbucket: &lt;em&gt;jenkins-project1&lt;/em&gt; and &lt;em&gt;jenkins-project2&lt;/em&gt;. The latter depends on the former, so that I can test whether the dependencies are resolved from Artifactory correctly. Let’s try building project1 and then project2.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4vf08d85jpdn1ie6xm8s.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4vf08d85jpdn1ie6xm8s.PNG" alt="Build Project1" width="800" height="210"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuqcixuhd6qbbczje0tob.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuqcixuhd6qbbczje0tob.PNG" alt="Build Project2" width="800" height="192"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Congratulations! You successfully integrated Artifactory and Jenkins with multiple repositories by creating only one pipeline. If you want to have a look at the projects I built in this tutorial, you can find them on &lt;a href="https://github.com/robertobatts/jenkins-artifactory-tutorial/" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>jenkins</category>
      <category>pipeline</category>
      <category>devops</category>
      <category>cicd</category>
    </item>
  </channel>
</rss>
