<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Edgar Moran</title>
    <description>The latest articles on Forem by Edgar Moran (@yucelmoran).</description>
    <link>https://forem.com/yucelmoran</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F496774%2Fb1625465-481a-4dc9-9225-b08f88b6d9c9.jpg</url>
      <title>Forem: Edgar Moran</title>
      <link>https://forem.com/yucelmoran</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/yucelmoran"/>
    <language>en</language>
    <item>
      <title>I Built a Mobile App to Monitor MuleSoft Anypoint Platform: Here's How It Works</title>
      <dc:creator>Edgar Moran</dc:creator>
      <pubDate>Sun, 03 May 2026 11:41:26 +0000</pubDate>
      <link>https://forem.com/yucelmoran/i-built-a-mobile-app-to-monitor-mulesoft-anypoint-platform-heres-how-it-works-3mfn</link>
      <guid>https://forem.com/yucelmoran/i-built-a-mobile-app-to-monitor-mulesoft-anypoint-platform-heres-how-it-works-3mfn</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cx1lvcgi8ny6vonocvk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2cx1lvcgi8ny6vonocvk.png" alt="cover"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  I Built a Mobile App to Monitor MuleSoft Anypoint Platform
&lt;/h2&gt;

&lt;p&gt;Picture this. It's 11pm on a Saturday. Your phone buzzes. Slack alert. Something in production is throwing 500s. You drag yourself to the desk, open the laptop, wait for the VPN to connect, load up the Anypoint Platform console, hunt for the right environment, find the app, pull the logs. Fifteen minutes gone. The incident thread already has 40 messages and someone's asking "any update?"&lt;/p&gt;

&lt;p&gt;I lived that too many times. So I decided to do something about it.&lt;/p&gt;




&lt;h2&gt;
  
  
  So What Exactly Is Muleye?
&lt;/h2&gt;

&lt;p&gt;Muleye is a mobile app for iOS and Android that gives MuleSoft engineers real operational visibility into their Anypoint Platform, right from their phone.&lt;/p&gt;

&lt;p&gt;I want to be clear about what it's not. It's not a status page. It's not a browser wrapped in a mobile shell. It's a proper monitoring tool, built with React Native and Expo, with native gestures, push notifications, and features shaped by the stuff integration engineers actually deal with at 2 AM.&lt;/p&gt;

&lt;p&gt;I shipped it on the &lt;a href="https://apps.apple.com/us/app/muleye/id6752311018" rel="noopener noreferrer"&gt;App Store&lt;/a&gt; and &lt;a href="https://play.google.com/store/apps/details?id=com.moran.anypointmobile" rel="noopener noreferrer"&gt;Google Play&lt;/a&gt; back in January 2026, and I've been iterating on it ever since.&lt;/p&gt;




&lt;h2&gt;
  
  
  The "Why" Behind It
&lt;/h2&gt;

&lt;p&gt;I've been working with MuleSoft for years. The Anypoint Platform browser console is great when you're sitting at your desk with your laptop open. But the second you step away, whether you're on-call, traveling, stuck in a meeting, or just trying to eat dinner, you're completely in the dark.&lt;/p&gt;

&lt;p&gt;There was no native mobile experience for Anypoint. The console isn't responsive. You can't tail logs from your phone. You can't quickly check on your apps while you're walking to a conference room.&lt;/p&gt;

&lt;p&gt;That gap between "something broke" and "I know what's happening" was consistently 15 to 20 minutes. I wanted to get that down to about 30 seconds. That's really what started this whole thing.&lt;/p&gt;




&lt;h2&gt;
  
  
  What It Can Do
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Live Application Monitoring
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faquwpunfv8780ukx1yyx.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faquwpunfv8780ukx1yyx.jpeg" alt="Home Page"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The home screen gives you a real-time health snapshot across every app and environment in your org. You can tell in one glance whether everything's good or if something's off.&lt;/p&gt;

&lt;p&gt;It works across CloudHub, CloudHub 2.0, Runtime Fabric, and hybrid runtimes, all in the same view. And if you manage multiple Anypoint organizations (most of us do), you can switch between them instantly from the sidebar without logging out.&lt;/p&gt;

&lt;h3&gt;
  
  
  Real-Time Log Streaming
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxhrxm9bxr9a0gj6scozz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxhrxm9bxr9a0gj6scozz.png" alt="realtime logs"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can tail application logs live from your phone. Filter by log level, search the stream, tap any entry to expand the full JSON payload.&lt;/p&gt;

&lt;p&gt;This ended up being the feature people use most during incidents. Instead of asking a teammate to paste logs into a Slack thread, you just pull out your phone and watch them scroll by in real time.&lt;/p&gt;
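&lt;p&gt;As a rough sketch, the level filter and search can be plain client-side predicates over the streamed entries. This isn't Muleye's actual code; the entry shape and helper name are assumptions for illustration:&lt;/p&gt;

```javascript
// Illustrative log filtering: keep entries at or above a minimum level,
// optionally matching a search query. Entry shape ({ level, message }) is assumed.
const LEVEL_ORDER = { DEBUG: 0, INFO: 1, WARN: 2, ERROR: 3 };

const filterLogEntries = (entries, { minLevel = "DEBUG", query = "" } = {}) => {
  const threshold = LEVEL_ORDER[minLevel] ?? 0;
  const needle = query.toLowerCase();
  return entries
    .filter((e) => (LEVEL_ORDER[e.level] ?? 0) >= threshold)
    .filter((e) => needle === "" || e.message.toLowerCase().includes(needle));
};
```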

&lt;h3&gt;
  
  
  Flow Map Visualization
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl067e5m1hn20dcmxy1jf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl067e5m1hn20dcmxy1jf.png" alt="Flow visualization"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This one was honestly the most fun to build.&lt;/p&gt;

&lt;p&gt;Here's what happens: Muleye downloads the deployed Mule application artifact, parses the raw XML, extracts every &lt;code&gt;&amp;lt;flow&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;sub-flow&amp;gt;&lt;/code&gt; definition, resolves all the &lt;code&gt;&amp;lt;flow-ref&amp;gt;&lt;/code&gt; connections between them, and renders an interactive SVG diagram. All of that happens on the device itself. No server involved.&lt;/p&gt;

&lt;p&gt;The layout engine runs a BFS-based topological sort to figure out which flows depend on which, and arranges them left to right by dependency depth. Sub-flows show up in purple, HTTP-triggered flows in blue, scheduler flows in amber. You can immediately see how the pieces connect.&lt;/p&gt;

&lt;p&gt;The diagram supports:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pinch-to-zoom&lt;/strong&gt; (0.3x to 3x) so you can handle dense flows&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pan&lt;/strong&gt; to move around large diagrams&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Double-tap&lt;/strong&gt; to snap back to the default view&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fullscreen mode&lt;/strong&gt; for when you really need the screen real estate&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Component preview&lt;/strong&gt; where each flow node shows up to 4 of its processors (connectors, transforms, etc.)&lt;/li&gt;
&lt;/ul&gt;
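&lt;p&gt;For the pinch bounds, the core of it is just clamping the accumulated scale to the 0.3x to 3x range. A minimal sketch (the helper names are mine, not the app's):&lt;/p&gt;

```javascript
// Clamp the diagram scale to the 0.3x-3x range described above.
const MIN_SCALE = 0.3;
const MAX_SCALE = 3;

const clampScale = (scale) => Math.min(MAX_SCALE, Math.max(MIN_SCALE, scale));

// A pinch gesture multiplies the current scale by the gesture's factor.
const applyPinch = (currentScale, pinchFactor) =>
  clampScale(currentScale * pinchFactor);
```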

&lt;p&gt;Here's a simplified look at how the graph gets built under the hood:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Parse Mule XML and extract flow definitions&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;extractFlowDefinitions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;definitions&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
  &lt;span class="c1"&gt;// Match &amp;lt;flow&amp;gt; and &amp;lt;sub-flow&amp;gt; tags&lt;/span&gt;
  &lt;span class="k"&gt;while &lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;match&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;FLOW_OPEN_TAG&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;exec&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;tagType&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;match&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt; &lt;span class="c1"&gt;// 'flow' or 'sub-flow'&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;name&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;match&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
    &lt;span class="nx"&gt;definitions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
      &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;sanitizeId&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;:&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
      &lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;tagType&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;sub-flow&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;sub-flow&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;flow&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;trigger&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;inferTrigger&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="c1"&gt;// HTTP, Scheduler, Message, File, Flow&lt;/span&gt;
      &lt;span class="na"&gt;components&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;extractComponents&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;definitions&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="c1"&gt;// Build the directed graph&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;buildMuleFlowGraph&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;files&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;nodes&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;edges&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;nodeByName&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Map&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

  &lt;span class="c1"&gt;// Extract all flow definitions from XML files&lt;/span&gt;
  &lt;span class="nb"&gt;Object&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;entries&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;files&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;(([&lt;/span&gt;&lt;span class="nx"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nf"&gt;extractFlowDefinitions&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;def&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;nodes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;def&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="nx"&gt;nodeByName&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;def&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;def&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="c1"&gt;// Resolve flow-ref edges&lt;/span&gt;
  &lt;span class="nx"&gt;nodes&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nf"&gt;extractReferencedFlows&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;targetName&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;target&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;nodeByName&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;targetName&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;target&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;return&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
      &lt;span class="nx"&gt;edges&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;-&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;target&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;from&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;to&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;target&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;nodes&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;edges&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And the layout engine that arranges everything into columns:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;buildFlowLayout&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;graph&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;viewportWidth&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;adjacency&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Map&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;indegree&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Map&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

  &lt;span class="c1"&gt;// BFS to assign levels (columns) by dependency depth&lt;/span&gt;
  &lt;span class="nx"&gt;graph&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;edges&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;edge&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;adjacency&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;edge&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;edge&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;to&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="nx"&gt;indegree&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;edge&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;to&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;indegree&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;edge&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;to&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="c1"&gt;// Start with nodes that have no incoming edges&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;queue&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[];&lt;/span&gt;
  &lt;span class="nx"&gt;indegree&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;key&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;

  &lt;span class="c1"&gt;// BFS assigns each node to the deepest level it reaches&lt;/span&gt;
  &lt;span class="k"&gt;while &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;current&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;shift&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;adjacency&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;current&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="p"&gt;[]).&lt;/span&gt;&lt;span class="nf"&gt;forEach&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;next&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;levelById&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;next&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;Math&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;max&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;levelById&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;next&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="nx"&gt;currentLevel&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
      &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;indegree&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;next&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;queue&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;next&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="c1"&gt;// Position nodes in a column grid&lt;/span&gt;
  &lt;span class="c1"&gt;// ... spacing, centering, edge routing&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The end result is a clean left-to-right flow diagram that fits on a phone screen and lets you trace the call chain without ever opening Anypoint Studio.&lt;/p&gt;

&lt;h3&gt;
  
  
  Push Notifications and Smart Alerts
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdu0k47cl0emw5tizqde0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdu0k47cl0emw5tizqde0.png" alt="notifications"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The alert system runs health checks in the background and pushes notifications to your phone even when the app isn't in the foreground.&lt;/p&gt;

&lt;p&gt;One thing I was really intentional about: no MuleSoft credentials ever touch my server. Here's how the flow works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The app detects issues locally (status changes, error spikes)&lt;/li&gt;
&lt;li&gt;It relays alert metadata to a Firebase Cloud Function&lt;/li&gt;
&lt;li&gt;The Cloud Function calls the Expo Push API&lt;/li&gt;
&lt;li&gt;APNs or FCM delivers the notification to your device&lt;/li&gt;
&lt;/ol&gt;
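&lt;p&gt;Step 3 boils down to shaping a message for the Expo Push API. A simplified sketch; the alert shape and helper name are my own, though the message fields (to, title, body, data, sound) follow Expo's documented push format:&lt;/p&gt;

```javascript
// Hypothetical helper: turn alert metadata into an Expo push message.
// The `data` payload is what makes tap-to-deep-link possible on the client.
const buildExpoPushMessage = (expoPushToken, alert) => ({
  to: expoPushToken,
  title: `${alert.appName}: ${alert.type}`,
  body: alert.summary,
  data: { appId: alert.appId, screen: "AlertDetail" },
  sound: "default",
});
```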

&lt;p&gt;When you tap a notification, it deep-links you straight to the relevant screen. Could be the specific app, its logs, or the alert detail. No hunting around.&lt;/p&gt;

&lt;p&gt;There's also a &lt;strong&gt;Daily Health Digest&lt;/strong&gt; that runs as a scheduled Cloud Function. Every morning it sends a push summary of your org's application health, so you start the day knowing if anything needs your attention before you even open the app.&lt;/p&gt;
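&lt;p&gt;The digest body itself can be as simple as folding per-app statuses into one line of text. A sketch of the idea (the "STARTED" value mirrors CloudHub's application states; everything else here is illustrative):&lt;/p&gt;

```javascript
// Hypothetical digest builder: summarize application health into one push body.
const buildDailyDigest = (apps) => {
  const unhealthy = apps.filter((a) => a.status !== "STARTED");
  if (unhealthy.length === 0) {
    return `All ${apps.length} applications healthy`;
  }
  const names = unhealthy.map((a) => a.name).join(", ");
  return `${unhealthy.length} of ${apps.length} apps need attention: ${names}`;
};
```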

&lt;h3&gt;
  
  
  And a Bunch More
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Performance Metrics&lt;/strong&gt; - CPU, memory, and request charts per application&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Application Command Center&lt;/strong&gt; - a unified dashboard with topology view, scheduler management, and metric cards&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API Manager&lt;/strong&gt; - browse your APIs, view policies, check contracts&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;API Governance&lt;/strong&gt; - conformance checking across your API portfolio&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Anypoint MQ Statistics&lt;/strong&gt; - queue and exchange monitoring with message counts&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Object Store Explorer&lt;/strong&gt; - browse and search key-value data&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Environment Comparison&lt;/strong&gt; - side-by-side view of apps across environments&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Multi-region&lt;/strong&gt; - US, EU, and GOV MuleSoft clouds all supported&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  The Technical Stack
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Layer&lt;/th&gt;
&lt;th&gt;Technology&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Framework&lt;/td&gt;
&lt;td&gt;Expo SDK 54, React Native 0.81.5&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Language&lt;/td&gt;
&lt;td&gt;JavaScript / TypeScript&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;JS Engine&lt;/td&gt;
&lt;td&gt;Hermes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Navigation&lt;/td&gt;
&lt;td&gt;React Navigation 7.x&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;HTTP&lt;/td&gt;
&lt;td&gt;Axios with interceptors for token refresh&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Auth&lt;/td&gt;
&lt;td&gt;Firebase Auth + MuleSoft OAuth2 (PKCE)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Subscriptions&lt;/td&gt;
&lt;td&gt;RevenueCat&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Push&lt;/td&gt;
&lt;td&gt;Expo Notifications + Firebase Cloud Functions&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Charts&lt;/td&gt;
&lt;td&gt;react-native-chart-kit&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Diagrams&lt;/td&gt;
&lt;td&gt;react-native-svg (flow maps + topology)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Gestures&lt;/td&gt;
&lt;td&gt;react-native-gesture-handler + Reanimated&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Storage&lt;/td&gt;
&lt;td&gt;Expo SecureStore (tokens), AsyncStorage (prefs)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  Authentication Was the Hardest Part
&lt;/h2&gt;

&lt;p&gt;Getting mobile OAuth2 working against MuleSoft's Anypoint Platform was probably the single biggest technical challenge of the whole project.&lt;/p&gt;

&lt;p&gt;Muleye uses what I call a &lt;strong&gt;dual authentication system&lt;/strong&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Firebase Auth&lt;/strong&gt; handles user identity (email/password or Apple Sign In)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;MuleSoft OAuth2 with PKCE&lt;/strong&gt; handles the actual platform access&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The clever part (if I can say that about my own code) is that the OAuth2 token exchange runs through Firebase Cloud Functions. That means &lt;strong&gt;no Anypoint credentials ever live on my server&lt;/strong&gt;. The mobile client holds its own tokens in Expo SecureStore, and the Axios interceptor automatically refreshes them when they expire.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;User opens app
  -&amp;gt; Firebase Auth check
  -&amp;gt; Link Anypoint account (OAuth2 PKCE)
  -&amp;gt; Token exchange via Firebase Functions
  -&amp;gt; Tokens stored in SecureStore
  -&amp;gt; Axios interceptor auto-refreshes on 401
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
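&lt;p&gt;The interceptor's core decision is small: retry once with a fresh token on a 401, and never loop. A stripped-down sketch of that logic (the &lt;code&gt;_retried&lt;/code&gt; marker is an assumed field name, and axios itself is omitted to keep this self-contained):&lt;/p&gt;

```javascript
// Decide whether a failed response should trigger a token refresh plus retry.
const shouldRefreshAndRetry = (error) => {
  if (error.config?._retried) return false; // already retried once; give up
  return error.response?.status === 401;
};

// Mark the request config so the retry cannot loop forever.
const markRetried = (config) => ({ ...config, _retried: true });
```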



&lt;p&gt;Multi-account support stores credentials per organization, so you can flip between your company's production org and your personal sandbox without having to log out and back in.&lt;/p&gt;

&lt;p&gt;Each MuleSoft region (US, EU, GOV) has its own OAuth client ID and API endpoint. Before each request goes out, the Axios interceptor picks the right base URL for whichever account is active.&lt;/p&gt;
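&lt;p&gt;Conceptually that selection is a small lookup keyed by the active account's region. The EU and GOV hostnames below reflect my understanding of Anypoint's regional endpoints and should be treated as illustrative:&lt;/p&gt;

```javascript
// Pick the Anypoint base URL for the active account's region,
// falling back to the US cloud for unknown values.
const REGION_BASE_URLS = {
  US: "https://anypoint.mulesoft.com",
  EU: "https://eu1.anypoint.mulesoft.com",
  GOV: "https://gov.anypoint.mulesoft.com",
};

const baseUrlForAccount = (account) =>
  REGION_BASE_URLS[account.region] ?? REGION_BASE_URLS.US;
```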




&lt;h2&gt;
  
  
  Design: Make It Feel Native, Not Webby
&lt;/h2&gt;

&lt;p&gt;From day one I wanted Muleye to feel like it belonged on your phone, not like a web app that got squeezed into a mobile container. That meant making a bunch of intentional choices:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Flat cards&lt;/strong&gt; with hairline borders instead of heavy drop shadows&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Grouped backgrounds&lt;/strong&gt; following the iOS &lt;code&gt;systemGroupedBackground&lt;/code&gt; pattern&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Haptic feedback&lt;/strong&gt; on tab switches, button taps, and quick actions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Native segmented controls&lt;/strong&gt; with Reanimated sliding thumb animations&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Dark mode&lt;/strong&gt; that actually works, which required a full multi-wave theme migration across all 44 screens&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The persistent tab bar at the bottom uses a graphite glass surface with blur on iOS, and every screen has its own native header rather than falling back on React Navigation's default chrome.&lt;/p&gt;

&lt;p&gt;The small stuff matters more than you'd think. The flow map uses iOS-standard corner radii. Alert severity colors follow the iOS semantic palette. Pull-to-refresh tint tracks the active accent color. Individually these are tiny decisions, but together they're the difference between "this feels right" and "this feels off."&lt;/p&gt;




&lt;h2&gt;
  
  
  The Business Model
&lt;/h2&gt;

&lt;p&gt;Muleye runs on a freemium model:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Free&lt;/strong&gt; includes the home dashboard, application list, alert center, events feed, platform status, and hybrid infrastructure monitoring.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Premium&lt;/strong&gt; unlocks real-time log streaming, performance metrics, flow maps, the command center, API management, AnypointMQ stats, and AI insights.&lt;/p&gt;

&lt;p&gt;The paywall is powered by RevenueCat with monthly and annual options. When I first turned the paywall on, I gave every existing user a 30-day grace period with full access. Nobody lost features overnight. The conversion rate during that grace window ended up being solid, which told me the approach was right.&lt;/p&gt;




&lt;h2&gt;
  
  
  Things I Learned Along the Way
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Ship the free version first.&lt;/strong&gt; The core monitoring features like app status and alerts are useful enough on their own. Once people rely on the free tier every day, the premium features sell themselves.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Do the work on-device when you can.&lt;/strong&gt; The flow map feature downloads the Mule artifact and parses XML entirely on the phone. No backend needed, no credentials exposed, instant results. The tradeoff is that really large apps with 30+ flows get dense on a phone screen, but pinch-to-zoom handles it, and it's still way faster than opening Studio.&lt;/p&gt;
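&lt;p&gt;To make the on-device idea concrete, here is a minimal sketch (in Python for brevity) of pulling flow names out of a Mule configuration file. It is illustrative only; the app's real parser also extracts the processors inside each flow and the references between flows.&lt;/p&gt;

```python
import xml.etree.ElementTree as ET

def flow_names(mule_xml):
    """Return the names of all flows declared in a Mule config file."""
    root = ET.fromstring(mule_xml)
    names = []
    for element in root.iter():
        # Mule elements carry an XML namespace, so match on the
        # tag's local name rather than the full qualified tag.
        if element.tag.split("}")[-1] == "flow":
            names.append(element.attrib.get("name"))
    return names
```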

&lt;p&gt;&lt;strong&gt;Stop fighting the platform.&lt;/strong&gt; Early versions of the app had custom gradients, heavy shadows, and what I'd call a "designed" look. Ripping all of that out and going with native iOS and Android patterns made the app feel dramatically better. It also cut the styling code roughly in half.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Push notifications are non-negotiable.&lt;/strong&gt; The day I shipped push alerts, daily active usage jumped noticeably. People have their phone on them all the time. If you can tell someone something's wrong before their manager does, you've earned a permanent spot on their home screen.&lt;/p&gt;




&lt;h2&gt;
  
  
  What's Coming Next
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Server-side monitoring so alerts work even when the app is fully closed&lt;/li&gt;
&lt;li&gt;Expanded AI insights with anomaly detection and trend analysis&lt;/li&gt;
&lt;li&gt;Deeper Runtime Fabric and hybrid runtime support&lt;/li&gt;
&lt;li&gt;API analytics and traffic visualization&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Give It a Try
&lt;/h2&gt;

&lt;p&gt;Muleye is live on both platforms:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;iOS&lt;/strong&gt;: &lt;a href="https://apps.apple.com/us/app/muleye/id6752311018" rel="noopener noreferrer"&gt;App Store&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Android&lt;/strong&gt;: &lt;a href="https://play.google.com/store/apps/details?id=com.moran.anypointmobile" rel="noopener noreferrer"&gt;Google Play&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you work with MuleSoft, I'd honestly love to hear what you think. This app was built by an integration engineer for integration engineers. Every feature started as a real frustration I had, and I'm always looking for the next one to solve.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Built by &lt;a href="https://www.linkedin.com/in/edgarmoran/" rel="noopener noreferrer"&gt;Edgar Moran&lt;/a&gt; - Sr. Integration Engineer, and someone who got really tired of opening a laptop at 11pm.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>mulesoft</category>
      <category>anypointplatform</category>
      <category>mobile</category>
      <category>monitoring</category>
    </item>
    <item>
      <title>Faster Mule deployments using Gitlab cache</title>
      <dc:creator>Edgar Moran</dc:creator>
      <pubDate>Thu, 31 Aug 2023 22:21:17 +0000</pubDate>
      <link>https://forem.com/yucelmoran/faster-mule-deployments-using-gitlab-cache-3pek</link>
      <guid>https://forem.com/yucelmoran/faster-mule-deployments-using-gitlab-cache-3pek</guid>
      <description>&lt;p&gt;Today I was curious about how we can make our deployments faster using CI processes, we have multiple platforms to handle the CI deployments for example GitHub, GitLab, Bitbucket CircleCI, TravisCI etc.. In this case I’m using GitLab.&lt;/p&gt;

&lt;p&gt;I created one application in MuleSoft with one simple scheduler and a logger; really, I just want to test the deployment.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1bov4xq18sy4uzzh2aw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu1bov4xq18sy4uzzh2aw.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The only couple of important items to consider are adding the .gitlab-ci.yml file and setting up your build tag in your pom.xml file. Let's see what our .gitlab-ci.yml looks like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;image: maven:3.6.1-jdk-8

variables: 
  MAVEN_OPTS: "-Dmaven.repo.local=$CI_PROJECT_DIR/.m2/repository"

cache:
  key: ${CI_COMMIT_REF_SLUG}
  paths:
    - .m2/repository

stages:
  - build 
  - test
  - deploy-staging
  - deploy-production

build:
  stage: build
  script:
    - mvn  -U -V -e -B clean -DskipTests package
  only:
    - merge_requests

test:
  stage: test
  script:
    - mvn -U clean test
  only:
    - merge_requests
  artifacts:
    when: always
    reports:
      junit:
        - target/surefire-reports/TEST-*.xml  

deploy-staging:
  stage: deploy-staging
  script:
    - mvn -U -V -e -B clean -DskipTests deploy -DmuleDeploy
  rules:
    - if: '$CI_COMMIT_BRANCH == "staging"'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As we can see, I'm specifying the lines below; with this, I tell GitLab to cache the dependencies in the .m2 repository, and the key allows the dependencies to persist per branch.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variables: 
  MAVEN_OPTS: "-Dmaven.repo.local=$CI_PROJECT_DIR/.m2/repository"

cache:
  key: ${CI_COMMIT_REF_SLUG}
  paths:
    - .m2/repository
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In my pom.xml this is the setup I have:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;?xml version="1.0" encoding="UTF-8"?&amp;gt;
&amp;lt;project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd"&amp;gt;
 &amp;lt;modelVersion&amp;gt;4.0.0&amp;lt;/modelVersion&amp;gt;

 &amp;lt;groupId&amp;gt;com.mycompany&amp;lt;/groupId&amp;gt;
 &amp;lt;artifactId&amp;gt;gitlab-cache-deployment&amp;lt;/artifactId&amp;gt;
 &amp;lt;version&amp;gt;1.0.0&amp;lt;/version&amp;gt;
 &amp;lt;packaging&amp;gt;mule-application&amp;lt;/packaging&amp;gt;

 &amp;lt;name&amp;gt;gitlab-cache-deployment&amp;lt;/name&amp;gt;

 &amp;lt;properties&amp;gt;
  &amp;lt;project.build.sourceEncoding&amp;gt;UTF-8&amp;lt;/project.build.sourceEncoding&amp;gt;
  &amp;lt;project.reporting.outputEncoding&amp;gt;UTF-8&amp;lt;/project.reporting.outputEncoding&amp;gt;

  &amp;lt;app.runtime&amp;gt;4.4.0&amp;lt;/app.runtime&amp;gt;
  &amp;lt;mule.maven.plugin.version&amp;gt;3.8.2&amp;lt;/mule.maven.plugin.version&amp;gt;
 &amp;lt;/properties&amp;gt;

 &amp;lt;build&amp;gt; 
  &amp;lt;plugins&amp;gt;
   &amp;lt;plugin&amp;gt;
    &amp;lt;groupId&amp;gt;org.apache.maven.plugins&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;maven-clean-plugin&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;3.2.0&amp;lt;/version&amp;gt;
   &amp;lt;/plugin&amp;gt;
   &amp;lt;plugin&amp;gt;
    &amp;lt;groupId&amp;gt;org.mule.tools.maven&amp;lt;/groupId&amp;gt;
    &amp;lt;artifactId&amp;gt;mule-maven-plugin&amp;lt;/artifactId&amp;gt;
    &amp;lt;version&amp;gt;${mule.maven.plugin.version}&amp;lt;/version&amp;gt;
    &amp;lt;extensions&amp;gt;true&amp;lt;/extensions&amp;gt;
    &amp;lt;configuration&amp;gt;
     &amp;lt;classifier&amp;gt;mule-application&amp;lt;/classifier&amp;gt;
     &amp;lt;cloudHubDeployment&amp;gt;
      &amp;lt;uri&amp;gt;${CLOUDHUB_URI}&amp;lt;/uri&amp;gt;
      &amp;lt;muleVersion&amp;gt;4.4.0&amp;lt;/muleVersion&amp;gt;
      &amp;lt;connectedAppClientId&amp;gt;${CLIENT_ID}&amp;lt;/connectedAppClientId&amp;gt;
      &amp;lt;connectedAppClientSecret&amp;gt;${CLIENT_SECRET}&amp;lt;/connectedAppClientSecret&amp;gt;
      &amp;lt;connectedAppGrantType&amp;gt;client_credentials&amp;lt;/connectedAppGrantType&amp;gt;
      &amp;lt;environment&amp;gt;Sandbox&amp;lt;/environment&amp;gt;
      &amp;lt;applicationName&amp;gt;gitlab-cache-deployment&amp;lt;/applicationName&amp;gt;
      &amp;lt;workerType&amp;gt;Micro&amp;lt;/workerType&amp;gt;
      &amp;lt;objectStoreV2&amp;gt;true&amp;lt;/objectStoreV2&amp;gt;
     &amp;lt;/cloudHubDeployment&amp;gt;
    &amp;lt;/configuration&amp;gt;
   &amp;lt;/plugin&amp;gt;
  &amp;lt;/plugins&amp;gt;
 &amp;lt;/build&amp;gt;

 &amp;lt;dependencies&amp;gt;
  &amp;lt;dependency&amp;gt;
   &amp;lt;groupId&amp;gt;org.mule.connectors&amp;lt;/groupId&amp;gt;
   &amp;lt;artifactId&amp;gt;mule-http-connector&amp;lt;/artifactId&amp;gt;
   &amp;lt;version&amp;gt;1.7.3&amp;lt;/version&amp;gt;
   &amp;lt;classifier&amp;gt;mule-plugin&amp;lt;/classifier&amp;gt;
  &amp;lt;/dependency&amp;gt;
  &amp;lt;dependency&amp;gt;
   &amp;lt;groupId&amp;gt;org.mule.connectors&amp;lt;/groupId&amp;gt;
   &amp;lt;artifactId&amp;gt;mule-sockets-connector&amp;lt;/artifactId&amp;gt;
   &amp;lt;version&amp;gt;1.2.3&amp;lt;/version&amp;gt;
   &amp;lt;classifier&amp;gt;mule-plugin&amp;lt;/classifier&amp;gt;
  &amp;lt;/dependency&amp;gt;
 &amp;lt;/dependencies&amp;gt;

 &amp;lt;repositories&amp;gt;
  &amp;lt;repository&amp;gt;
   &amp;lt;id&amp;gt;anypoint-exchange-v3&amp;lt;/id&amp;gt;
   &amp;lt;name&amp;gt;Anypoint Exchange&amp;lt;/name&amp;gt;
   &amp;lt;url&amp;gt;https://maven.anypoint.mulesoft.com/api/v3/maven&amp;lt;/url&amp;gt;
   &amp;lt;layout&amp;gt;default&amp;lt;/layout&amp;gt;
  &amp;lt;/repository&amp;gt;
  &amp;lt;repository&amp;gt;
   &amp;lt;id&amp;gt;mulesoft-releases&amp;lt;/id&amp;gt;
   &amp;lt;name&amp;gt;MuleSoft Releases Repository&amp;lt;/name&amp;gt;
   &amp;lt;url&amp;gt;https://repository.mulesoft.org/releases/&amp;lt;/url&amp;gt;
   &amp;lt;layout&amp;gt;default&amp;lt;/layout&amp;gt;
  &amp;lt;/repository&amp;gt;
 &amp;lt;/repositories&amp;gt;

 &amp;lt;pluginRepositories&amp;gt;
  &amp;lt;pluginRepository&amp;gt;
   &amp;lt;id&amp;gt;mulesoft-releases&amp;lt;/id&amp;gt;
   &amp;lt;name&amp;gt;MuleSoft Releases Repository&amp;lt;/name&amp;gt;
   &amp;lt;layout&amp;gt;default&amp;lt;/layout&amp;gt;
   &amp;lt;url&amp;gt;https://repository.mulesoft.org/releases/&amp;lt;/url&amp;gt;
   &amp;lt;snapshots&amp;gt;
    &amp;lt;enabled&amp;gt;false&amp;lt;/enabled&amp;gt;
   &amp;lt;/snapshots&amp;gt;
  &amp;lt;/pluginRepository&amp;gt;
 &amp;lt;/pluginRepositories&amp;gt;

&amp;lt;/project&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now I created a couple of repositories: one for an application using the cache and a second one not using the cache in the CI YAML file. This way we can validate and compare performance between both apps. In both repos I created three branches (master, staging, mydevbranch). To verify performance, our pipeline has three stages:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;build&lt;/strong&gt;: only builds the project and verifies it is successful&lt;br&gt;
&lt;strong&gt;test&lt;/strong&gt;: runs the tests (MUnit) in the pipeline&lt;br&gt;
&lt;strong&gt;deploy&lt;/strong&gt;: after a PR is approved and merged from the dev branch to staging, deploys to Anypoint Platform&lt;/p&gt;

&lt;p&gt;The comparison: in the end, using the cache improves the time of running a build, test, or deploy by minutes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;No cache&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Build&lt;/strong&gt;: took 1 minute, 37 seconds&lt;br&gt;
&lt;strong&gt;Test&lt;/strong&gt;: took 1 minute, 37 seconds&lt;br&gt;
&lt;strong&gt;Deploy&lt;/strong&gt;: Took 3 minutes 42 seconds&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiu1qdqtv0urbkgqj1vsa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiu1qdqtv0urbkgqj1vsa.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;With cache&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Build&lt;/strong&gt;: took 32 seconds&lt;br&gt;
&lt;strong&gt;Test&lt;/strong&gt;: 38 seconds&lt;br&gt;
&lt;strong&gt;Deploy&lt;/strong&gt;: 3 minutes 42 seconds&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpfzeg73pr8d8sbk46j28.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpfzeg73pr8d8sbk46j28.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As we can see, there's improvement on build and test, while deployment seems to take the same time. In the end, a few minutes gained is still a win.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxd69psykfmx2god5pzq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnxd69psykfmx2god5pzq.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfy5qyvsqk7dorwwotp0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqfy5qyvsqk7dorwwotp0.png" alt="Image description"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I will keep investigating whether there are better ways to reduce deployment time, though sometimes that time is related to the network and to availability on the platform as well.&lt;/p&gt;

&lt;p&gt;Hope this helps you with your deployments!&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Basic Google Big Query Operations with a Salesforce sync demo in MULE 4</title>
      <dc:creator>Edgar Moran</dc:creator>
      <pubDate>Thu, 17 Feb 2022 20:21:56 +0000</pubDate>
      <link>https://forem.com/yucelmoran/basic-google-big-query-operations-with-a-salesforce-sync-demo-mule-4-52ak</link>
      <guid>https://forem.com/yucelmoran/basic-google-big-query-operations-with-a-salesforce-sync-demo-mule-4-52ak</guid>
      <description>&lt;p&gt;If we think about data storage the first think it comes to our mind is a regular database, this can be any of the most popular ones like Mysql, SQL server, Postgres, Vertica etc, but I noticed no too many have interacted to one of the services Google provides with the same purpose Google Big Query. And maybe it is because of the &lt;a href="https://cloud.google.com/bigquery/pricing?utm_source=google&amp;amp;utm_medium=cpc&amp;amp;utm_campaign=na-US-all-en-dr-bkws-all-all-trial-e-dr-1011347&amp;amp;utm_content=text-ad-none-any-DEV_c-CRE_573148306951-ADGP_Desk%20%7C%20BKWS%20-%20EXA%20%7C%20Txt%20~%20Data%20Analytics%20~%20BigQuery_Pricing%20Google%20Google-KWID_43700068582852990-kwd-166600832170&amp;amp;utm_term=KW_google%20bigquery%20pricing-ST_google%20bigquery%20pricing&amp;amp;gclsrc=aw.ds&amp;amp;gclid=Cj0KCQiAjJOQBhCkARIsAEKMtO3fCohKU2ihxQrQ21u9XixqWhXy9w7QNu8MqGaVp58zn59WTN0ekA4aAtAeEALw_wcB"&gt;pricing&lt;/a&gt;, but in the end many companies are moving to cloud services and this service seems to be a great fit for them.&lt;/p&gt;

&lt;p&gt;In this post I would like to demonstrate, in a few steps, how we can build a sync job that describes a Salesforce instance and uses a few objects to create a full schema of those objects (tables) in a Google BigQuery dataset. Then, with the schema created, we should be able to push some data from Salesforce into BigQuery and see it in our Google Cloud Console project.&lt;/p&gt;

&lt;p&gt;To connect to Salesforce and Google BigQuery, there are a few prerequisites we need:&lt;/p&gt;

&lt;p&gt;Salesforce:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  If you don't have a Salesforce instance, you can create a developer one &lt;a href="https://developer.salesforce.com/signup"&gt;here&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;  From the Salesforce side you will need your username, password, and security token (you can follow &lt;a href="https://help.salesforce.com/s/articleView?id=sf.user_security_token.htm&amp;amp;type=5"&gt;this process&lt;/a&gt; to get it).&lt;/li&gt;
&lt;li&gt;  A developer instance contains only a few records; if you add some more data, the process will have more information to sync over.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;GCP (Google Cloud Platform)&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  You can sign up &lt;a href="https://console.cloud.google.com/freetrial/signup/tos?_ga=2.68094097.1640278748.1644510554-1516430238.1644510554&amp;amp;_gac=1.218077796.1644510554.Cj0KCQiAjJOQBhCkARIsAEKMtO0NyXkbcz86jMGZOta5V7HYUNkiDHCDR_6OSc4ioZFtAHlp0tw8_JUaAnI7EALw_wcB"&gt;here&lt;/a&gt; for free. Google gives you $300 for 90 days to test the product (similar to Azure). Also, if you already have a Google account you can use it for this.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  CREATING A NEW PROJECT IN GCP AND SETTING UP OUR SERVICE ACCOUNT KEY.
&lt;/h1&gt;

&lt;p&gt;Once you sign up for your account on GCP, you should be able to click on the New Project option and enter a project name; in this example I chose mulesoft.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--n9FinK3F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AR9cDYzV2GidJIUb-" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--n9FinK3F--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AR9cDYzV2GidJIUb-" alt="1" width="880" height="231"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fTcIOgh2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Ax1q_Mk2Ot1vKv2vd" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fTcIOgh2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Ax1q_Mk2Ot1vKv2vd" alt="2" width="880" height="558"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once a project is created, we can go to the menu on the left and select the IAM &amp;amp; Admin &amp;gt; Service Accounts option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7qTlinnw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AuS-P6MIGO6oIw1wL" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7qTlinnw--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AuS-P6MIGO6oIw1wL" alt="3" width="880" height="1326"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, we should be able to create our service account&lt;/p&gt;

&lt;p&gt;"A service account is a special type of Google account intended to represent a non-human user that needs to authenticate and be authorized to access data in Google APIs. Typically, service accounts are used in scenarios such as: Running workloads on virtual machines"&lt;/p&gt;

&lt;p&gt;At the top of the page you should see the option to create it; you just need to specify a name and click on Create and Continue.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--523B2Afq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Aem_KPwQsEYKRWxgl" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--523B2Afq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Aem_KPwQsEYKRWxgl" alt="4" width="880" height="606"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next step is to set the permissions; for this we need to select BigQuery Admin from the roles dropdown.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--i6Cx3CQD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2ATUGCNyhClV0vIBlk" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i6Cx3CQD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2ATUGCNyhClV0vIBlk" alt="5" width="880" height="683"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once created, we can select the Manage Keys option from the three-dot menu on the right.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7sdjvdSC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1244/0%2Atf1JBbFVxaEzKD6D" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7sdjvdSC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1244/0%2Atf1JBbFVxaEzKD6D" alt="6" width="622" height="686"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we can create a new key; in this case a JSON one should be enough. The key will be downloaded automatically to your computer (please keep this JSON key somewhere you can find it later).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YMdcQMda--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1144/0%2AqV-jZHfhqDjjrEeh" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YMdcQMda--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1144/0%2AqV-jZHfhqDjjrEeh" alt="7" width="572" height="322"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  DATASET IN BIG QUERY
&lt;/h1&gt;

&lt;p&gt;Datasets are top-level containers that are used to organize and control access to your tables and views. A table or view must belong to a dataset, so you need to create at least one dataset before loading data into BigQuery.&lt;/p&gt;

&lt;p&gt;From the left menu we can search for BigQuery and click on it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3CMOoU3V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1084/0%2Aco-42V2xq7tRUCQz" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3CMOoU3V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1084/0%2Aco-42V2xq7tRUCQz" alt="8" width="542" height="568"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;That will take us to the BigQuery console; now we can click on the three-dot menu and select the Create dataset option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m9DzLWeV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AJ52tiRCJEcidraxt" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m9DzLWeV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AJ52tiRCJEcidraxt" alt="9" width="880" height="752"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now we just need to set the name as salesforce and click on "Create Dataset".&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--FbKgqaW5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AUnK4bWtQKddU_f7J" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--FbKgqaW5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AUnK4bWtQKddU_f7J" alt="10" width="880" height="1009"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  SETTING UP OUR MULE APPLICATION.
&lt;/h1&gt;

&lt;p&gt;Since this is a sync job, we don't need an API specification, but the pattern can certainly fit scenarios where another application needs to consume specific endpoints or operations.&lt;/p&gt;

&lt;p&gt;Let's then open our Anypoint Studio app (in my case I'm using a Mac) and use the default template. For this we are going to create six flows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; Sync. This flow just triggers the process.&lt;/li&gt;
&lt;li&gt; DescribeInstance. This flow is in charge of calling the describe operation using the Salesforce connector to provide information about all objects in the Salesforce instance. It also has a loop that lets us run the job for the objects we are going to use.&lt;/li&gt;
&lt;li&gt; DescribeIndividualSalesforceObject. Describes a specific Salesforce object. It captures the fields and field types (STRING, EMAIL, ID, REFERENCE, etc.) and is in charge of building a payload that BigQuery will recognize so the table can be created in GBQ.&lt;/li&gt;
&lt;li&gt; BigQueryCreateTable. This flow is only in charge of creating the table in BigQuery based on the Salesforce object name and its fields.&lt;/li&gt;
&lt;li&gt; QuerySalesforceObject. This flow dynamically queries the Salesforce object and pulls the data (&lt;em&gt;for this we are limiting the output to 100 records, but at a bigger scale it should of course be done as a batch process&lt;/em&gt;).&lt;/li&gt;
&lt;li&gt; InsertDataIntoBigQuery. This flow only pushes the data over into BigQuery.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now let's grab our JSON key generated by Google and copy the file under the src/main/resources folder. The key will let us authenticate against our project and execute the operations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--t0RvNYjL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/872/0%2A7kE2a6bTDkCIQTIG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--t0RvNYjL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/872/0%2A7kE2a6bTDkCIQTIG" alt="11" width="436" height="356"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  IMPORT THE GOOGLE BIG QUERY CONNECTOR.
&lt;/h1&gt;

&lt;p&gt;From Exchange we can search for "Big Query" and we should see the connector listed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_by9eB2X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A78uaKN-gFwvJ0sNH" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_by9eB2X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A78uaKN-gFwvJ0sNH" alt="12" width="880" height="160"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then we can just use the "Add to project" option and we should see the operations in the palette.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--L0zxb-c9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1008/0%2AHGiDcGcpHF81nJZk" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--L0zxb-c9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1008/0%2AHGiDcGcpHF81nJZk" alt="13" width="504" height="804"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  SYNC FLOW
&lt;/h1&gt;

&lt;p&gt;As I mentioned, this is only in charge of triggering the whole application, so we only need one scheduler component and a flow reference to the DescribeInstance flow.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5RjRMuX3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/984/0%2AORVkakau2T0QX3cn" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5RjRMuX3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/984/0%2AORVkakau2T0QX3cn" alt="14" width="492" height="328"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  DESCRIBEINSTANCE
&lt;/h1&gt;

&lt;p&gt;This flow will describe the whole Salesforce instance using the &lt;a href="https://developer.salesforce.com/docs/atlas.en-us.208.0.api_rest.meta/api_rest/resources_describeGlobal.htm"&gt;Describe Global&lt;/a&gt; operation.&lt;/p&gt;

&lt;p&gt;The next step is to use a DataWeave transform to filter down to only the objects we are interested in. In this case I'm only pulling three: Accounts, Contacts, and a custom object called Project__c. I left a few more attributes in the transformation so that we only pull the objects we are able to query.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;%dw 2.0
import try, fail from dw::Runtime
output application/java

fun isDate(value: Any): Boolean = try(() -&amp;gt; value as Date).success

fun getDate(value: Any): Date | Null | Any =
    ( if ( isDate(value) ) value as Date as String else value )
---
(payload map (item, index) -&amp;gt; {
    (item mapObject ((value, key, index) -&amp;gt; {
        (key): (getDate(value))
    }))
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://gist.github.com/emoran/055a8b509044c899d9fdcddbfe66ff41/raw/6d80d9ca2fc6371ab575a45a41a0e8909cf6f656/mapSalesforceReocrds"&gt;view raw&lt;/a&gt;&lt;a href="https://gist.github.com/emoran/055a8b509044c899d9fdcddbfe66ff41#file-mapsalesforcereocrds"&gt;mapSalesforceReocrds&lt;/a&gt; hosted with ❤ by &lt;a href="https://github.com/"&gt;GitHub&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--s3qzuoYi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AnGmyj4akCqu_S5xT" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--s3qzuoYi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AnGmyj4akCqu_S5xT" alt="15" width="880" height="170"&gt;&lt;/a&gt;&lt;/p&gt;
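&lt;p&gt;To make the filtering idea concrete outside DataWeave, here's a minimal Python sketch. The sample response and object names are assumptions for illustration; &lt;code&gt;queryable&lt;/code&gt; is a real attribute on each entry of the Describe Global result.&lt;/p&gt;

```python
# Hypothetical subset of a Salesforce Describe Global response.
describe_global = {
    "sobjects": [
        {"name": "Account", "queryable": True},
        {"name": "Contact", "queryable": True},
        {"name": "Project__c", "queryable": True},
        {"name": "AccountHistory", "queryable": False},
        {"name": "Case", "queryable": True},
    ]
}

# The three objects this article pulls.
WANTED = {"Account", "Contact", "Project__c"}

def filter_objects(describe):
    """Keep only the objects we care about that we are actually able to query."""
    return [
        obj["name"]
        for obj in describe["sobjects"]
        if obj["name"] in WANTED and obj["queryable"]
    ]

print(filter_objects(describe_global))  # ['Account', 'Contact', 'Project__c']
```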

&lt;p&gt;Finally, we loop over these three objects; a flow reference inside the loop calls the other flows so the process can continue.&lt;/p&gt;

&lt;h1&gt;
  
  
  DESCRIBEINDIVIDUALSALESFORCEOBJECT
&lt;/h1&gt;

&lt;p&gt;This flow takes the name of a Salesforce object and describes it; the connector only asks for the object name. Then we have a pretty interesting DataWeave transformation:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;%dw 2.0
input payload application/java
output application/java

fun validateField(field) =
    if ( (field == "REFERENCE") or (field == "ID") or (field == "PICKLIST")
        or (field == "TEXTAREA") or (field == "ADDRESS") or (field == "EMAIL")
        or (field == "PHONE") or (field == "URL") ) "STRING"
    else if ( (field == "DOUBLE") or (field == "CURRENCY") ) "FLOAT"
    else if ( (field == "INT") ) "INTEGER"
    else field
---
payload.fields filter ($."type" != "LOCATION") map {
    fieldName : $.name,
    fieldType : validateField($."type")
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;(Script: &lt;a href="https://gist.github.com/emoran/170bde67c68a8dd5a18c9c3daee831e8"&gt;Salesforce to BigQuery Fields Schema gist on GitHub&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;Salesforce data types don't map one-to-one to BigQuery types, so we need a little trick to create a BigQuery schema that mirrors Salesforce seamlessly. I've created a small function that converts text-like fields (ID, REFERENCE, TEXTAREA, PHONE, ADDRESS, PICKLIST, EMAIL, URL) to STRING, since their values are really nothing more than text; DOUBLE and CURRENCY become FLOAT; and finally INT fields become INTEGER.&lt;/p&gt;

&lt;p&gt;Finally, because Location fields are a bit tricky and the API doesn't let us do much with them, I'm removing all Location fields.&lt;/p&gt;

&lt;p&gt;The output of this is the actual schema we will use to create the table in Google BigQuery.&lt;/p&gt;
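&lt;p&gt;The same type mapping reads naturally as a small Python sketch (illustration only; the DataWeave above is what actually runs in the flow, and the sample field list is made up):&lt;/p&gt;

```python
# Salesforce field types whose values are really just text in BigQuery.
TEXT_LIKE = {"REFERENCE", "ID", "PICKLIST", "TEXTAREA",
             "ADDRESS", "EMAIL", "PHONE", "URL"}

def to_bigquery_type(sf_type):
    """Map a Salesforce field type to its BigQuery counterpart."""
    if sf_type in TEXT_LIKE:
        return "STRING"
    if sf_type in ("DOUBLE", "CURRENCY"):
        return "FLOAT"
    if sf_type == "INT":
        return "INTEGER"
    return sf_type  # e.g. BOOLEAN and DATE pass through unchanged

def build_schema(fields):
    """Build the table schema, dropping LOCATION fields as the flow does."""
    return [
        {"fieldName": f["name"], "fieldType": to_bigquery_type(f["type"])}
        for f in fields
        if f["type"] != "LOCATION"
    ]

fields = [
    {"name": "Id", "type": "ID"},
    {"name": "AnnualRevenue", "type": "CURRENCY"},
    {"name": "Geo__c", "type": "LOCATION"},
]
print(build_schema(fields))
```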

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--N3j12KIX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Acb032ELY2g1anSgJ" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--N3j12KIX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Acb032ELY2g1anSgJ" alt="16" width="880" height="325"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  BIGQUERYCREATETABLE
&lt;/h1&gt;

&lt;p&gt;This flow allows us to create the table in BigQuery; we only need to specify Table, Dataset and Table Fields.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dE0t9zw5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AxG29lQviGNNLWXYW" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dE0t9zw5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2AxG29lQviGNNLWXYW" alt="17" width="880" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BmqCccOR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/960/0%2AFrXheP5MxHkdQ3oJ" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BmqCccOR--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/960/0%2AFrXheP5MxHkdQ3oJ" alt="18" width="480" height="380"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  QUERYSALESFORCEOBJECT
&lt;/h1&gt;

&lt;p&gt;This flow queries the object in Salesforce and then maps the data dynamically to prepare the payload for BigQuery.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m6K3UxAu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A9NH9k3AVkfWrGg-4" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m6K3UxAu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A9NH9k3AVkfWrGg-4" alt="19" width="880" height="427"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The query is built from the "salesforceFields" variable, the same field list we collected when we described the object, using this script:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;(payload.fields filter ($."type" != "LOCATION") map { fieldName : $.name }).fieldName joinBy ","
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;(Script: &lt;a href="https://gist.github.com/emoran/5d87cb4d2477e6b422687066b9150f2c"&gt;salesforceFields gist on GitHub&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;And finally I'm limiting the result to only 100 records.&lt;/p&gt;
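&lt;p&gt;In Python terms, that one-liner builds the SELECT field list like this (a sketch; the sample &lt;code&gt;fields&lt;/code&gt; list is made up):&lt;/p&gt;

```python
def soql_field_list(fields):
    """Comma-separated field names, skipping LOCATION fields as in the flow."""
    return ",".join(f["name"] for f in fields if f["type"] != "LOCATION")

fields = [
    {"name": "Id", "type": "ID"},
    {"name": "Name", "type": "STRING"},
    {"name": "Geo__c", "type": "LOCATION"},
]

# The flow then uses this list in the query, capped at 100 records.
query = "SELECT " + soql_field_list(fields) + " FROM Account LIMIT 100"
print(query)  # SELECT Id,Name FROM Account LIMIT 100
```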

&lt;p&gt;The next step is to map the Salesforce result data dynamically using this script:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;%dw 2.0
import try, fail from dw::Runtime
output application/java

fun isDate(value: Any): Boolean = try(() -&amp;gt; value as Date).success
fun getDate(value: Any): Date | Null | Any =
    if ( isDate(value) ) value as Date as String else value
---
payload map (item, index) -&amp;gt;
    item mapObject ((value, key, index) -&amp;gt; { (key): getDate(value) })
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;(Script: &lt;a href="https://gist.github.com/emoran/055a8b509044c899d9fdcddbfe66ff41"&gt;mapSalesforceReocrds gist on GitHub&lt;/a&gt;)&lt;/p&gt;

&lt;p&gt;Thanks so much to Alexandra Martinez for the insights on the utilities for DW 2.0! (&lt;a href="https://github.com/alexandramartinez/DataWeave-scripts/blob/main/utilities/utilities.dwl"&gt;https://github.com/alexandramartinez/DataWeave-scripts/blob/main/utilities/utilities.dwl&lt;/a&gt; )&lt;/p&gt;

&lt;p&gt;This last script maps each record, using the key as the field name and replacing any String value that is really a date or datetime with a proper Date. I consider it the best script in this app.&lt;/p&gt;
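&lt;p&gt;The idea behind the script can be sketched in Python like this (ISO parsing stands in for DataWeave's &lt;code&gt;as Date&lt;/code&gt; coercion, and the sample record is made up):&lt;/p&gt;

```python
from datetime import date

def is_date(value):
    """Rough equivalent of DataWeave's try(() -> value as Date).success."""
    try:
        date.fromisoformat(str(value))
        return True
    except ValueError:
        return False

def normalize(record):
    """Re-emit each key/value pair, normalizing date-like strings."""
    return {
        key: str(date.fromisoformat(value))
        if isinstance(value, str) and is_date(value) else value
        for key, value in record.items()
    }

rows = [{"Name": "Acme", "CloseDate": "2020-11-16", "Employees": 10}]
prepared = [normalize(r) for r in rows]
```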

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---UHkMz69--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A1SnsNjRyLH8VTDuM" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---UHkMz69--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A1SnsNjRyLH8VTDuM" alt="" width="880" height="327"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  INSERTDATAINTOBIGQUERY
&lt;/h1&gt;

&lt;p&gt;This flow simply inserts the data we prepared; we only need to specify the table ID, the dataset ID, and the row data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3Z4C_h1V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A-GRExUX-MkjriLG4" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3Z4C_h1V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A-GRExUX-MkjriLG4" alt="20" width="880" height="267"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dyLGHko_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Ar0z9PyW37QG1Q0Bt" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dyLGHko_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Ar0z9PyW37QG1Q0Bt" alt="21" width="880" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  RUNNING OUR MULE APPLICATION
&lt;/h1&gt;

&lt;p&gt;Now we should be able to run the application and see the new tables and data in Google BigQuery.&lt;/p&gt;

&lt;p&gt;On GCP I can see the tables I selected created:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gXA1ffrS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1336/0%2AWlTHj49AZqCjVyzn" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gXA1ffrS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1336/0%2AWlTHj49AZqCjVyzn" alt="22" width="668" height="782"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And if we open any of them, we can inspect the schema to verify all fields are there.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--c1Xa9QJE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A3Jl5IAwmGF0pvd-G" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--c1Xa9QJE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2A3Jl5IAwmGF0pvd-G" alt="23" width="880" height="1582"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, we can query the table in the console or click the Preview option to check that the data is there.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RHEyiLst--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Aej9ZJ1DKkWkTSy4_" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RHEyiLst--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1400/0%2Aej9ZJ1DKkWkTSy4_" alt="24" width="880" height="754"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I think this is a fairly common request in the integration space, and many tweaks can be added if you're planning big migrations or jobs that will eventually require tables to be created automatically from Salesforce into GCP.&lt;/p&gt;

&lt;p&gt;If you'd like to try it, I created this &lt;a href="https://github.com/emoran/sfdc-to-bigquery"&gt;GitHub&lt;/a&gt; repository. I hope this was useful, and I'm open to hearing about any enhancement or scenario.&lt;/p&gt;

</description>
      <category>mule4</category>
      <category>google</category>
      <category>bigquery</category>
      <category>mulesoft</category>
    </item>
    <item>
      <title>Image recognition using Mulesoft and Salesforce</title>
      <dc:creator>Edgar Moran</dc:creator>
      <pubDate>Mon, 16 Nov 2020 22:01:48 +0000</pubDate>
      <link>https://forem.com/yucelmoran/image-recognition-using-mulesoft-and-salesforce-4fe7</link>
      <guid>https://forem.com/yucelmoran/image-recognition-using-mulesoft-and-salesforce-4fe7</guid>
      <description>&lt;p&gt;Mulesoft and Salesforce seem to be the right combination of technologies to be able to deliver projects robust and complex in short time. I would like to demonstrate how we can use both of them to recognize images produced from a mobile device and recognize a picture bringing more information and interesting data for kind of a real scenario.&lt;/p&gt;

&lt;p&gt;So, how is this done? Here are the components I'm using for this project (I will go deep on each one):&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Salesforce developer Org&lt;/li&gt;
&lt;li&gt;Anypoint Platform (Sandbox) account&lt;/li&gt;
&lt;li&gt;Mulesoft mule-aws-recognition-system-api.&lt;/li&gt;
&lt;li&gt;Mulesoft mule-aws-recognition-process-api&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  Salesforce developer Org
&lt;/h2&gt;

&lt;p&gt;I got a Salesforce developer account from their developer site (developerforce.com). Salesforce in this case gives me:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Custom Objects (tables)&lt;/li&gt;
&lt;li&gt;Custom Fields on each of those objects created&lt;/li&gt;
&lt;li&gt;A way to expose a mobile application (previously known as Salesforce1), installable on iOS or Android devices&lt;/li&gt;
&lt;li&gt;Visualforce pages, allowing us to customize what we want to show in a mobile app or browser&lt;/li&gt;
&lt;li&gt;Apex classes: custom Apex code to handle data from a page, or to expose REST services from a custom Apex (Java-style) definition.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So here we have the design:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Standard objects (ContentVersion and ContentDocumentLink). Store the actual binary file in Salesforce&lt;/li&gt;
&lt;li&gt;Custom object (Hackathon Image). Provides a record to link the photo taken&lt;/li&gt;
&lt;li&gt;Custom object (Image Label). Stores the image's labels and how closely each label matches the image according to AWS.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here comes the fun part…&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A Visualforce page showing a UI to take the picture:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F17v83buqsp4tb1isij7v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F17v83buqsp4tb1isij7v.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Apex controller. Gets all the information from the picture and creates the ContentVersion and ContentDocumentLink records related to the Hackathon Image.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Apex REST controller. Exposes the mentioned endpoint, allowing us to trigger a push notification on the mobile device.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Where can I get this code? &lt;a href="https://github.com/emoran/sfdc-mulesoft-hackathon-2020.git" rel="noopener noreferrer"&gt;https://github.com/emoran/sfdc-mulesoft-hackathon-2020.git&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now a basic flow:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvffdari364hl066tdh3r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvffdari364hl066tdh3r.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Mulesoft mule-aws-recognition-system-api.
&lt;/h2&gt;

&lt;p&gt;Initially this system API was meant for AWS only, but given time and resource constraints I also included here one of the other pieces I needed to complete this exercise.&lt;/p&gt;

&lt;p&gt;As I mentioned, this system API processes a Base64 image by sending it to the Amazon Rekognition API; the result of the call is the set of labels Rekognition generated.&lt;/p&gt;
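&lt;p&gt;To show the shape of what comes back, here's a Python sketch that parses a DetectLabels-style Rekognition response (the sample response and the confidence threshold are made up for illustration):&lt;/p&gt;

```python
# Hypothetical DetectLabels response; the real API returns this shape.
response = {
    "Labels": [
        {"Name": "Dog", "Confidence": 98.2},
        {"Name": "Pet", "Confidence": 97.5},
        {"Name": "Furniture", "Confidence": 42.0},
    ]
}

def top_labels(resp, min_confidence=75.0):
    """Keep labels above a confidence threshold as (name, confidence) pairs."""
    return [
        (label["Name"], label["Confidence"])
        for label in resp["Labels"]
        if label["Confidence"] >= min_confidence
    ]

print(top_labels(response))  # [('Dog', 98.2), ('Pet', 97.5)]
```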

&lt;p&gt;This same application contains the logic to pull a few tweets using a parameter based on hashtags.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#%RAML 1.0
title: mule-aws-recognition-system-api

/image:
  post:
    body:
      application/json:

    responses:
      200:
        body:
          application/json:

/twitter:
  /tweets:
    get:
      queryParameters:
        q:
          description: "Parameters to filter by hashtag"
      responses:
        200:
          body:
            application/json:
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To process the image I used the AWS Java SDK to call the API; my flow looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwqo8o4yx314mklcwoan0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwqo8o4yx314mklcwoan0.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On the other hand, for the tweets we have a different endpoint that receives only a GET request and returns all tweets matching the provided hashtags.&lt;/p&gt;

&lt;p&gt;Here's how the flow looks:&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fk0et7hkipk2y1fk4jo8a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fk0et7hkipk2y1fk4jo8a.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As you can see, this is just a pretty simple HTTP request to the Twitter API; it's not included in the process API since we're not using a connector to extract the logic of this request.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fusrydm9n42zpit7ihdtx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fusrydm9n42zpit7ihdtx.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvmlzxwjcj6wbbb0muv6s.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fvmlzxwjcj6wbbb0muv6s.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can get the code of the system API from here: &lt;a href="https://github.com/emoran/mule-aws-recognition-system-api.git" rel="noopener noreferrer"&gt;https://github.com/emoran/mule-aws-recognition-system-api.git&lt;/a&gt; &lt;/p&gt;
&lt;h2&gt;
  
  
  Mulesoft mule-aws-recognition-process-api
&lt;/h2&gt;

&lt;p&gt;At this point, in the process API, we're really doing more and connecting the dots. I'll try to explain step by step what happens.&lt;/p&gt;

&lt;p&gt;The process API has this RAML:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#%RAML 1.0
title: mule-aws-recognition-process-api


/image:
  post:
    body:
      application/json:
    responses:
      200:
        body:
          application/json:
            example:

/sfdc:
  /images:
    get:
      responses:
        200:
          body:
            application/json:
  /contentVersion:
    get:
      queryParameters:
        id:
          description: imageId
          type: string
/tweets:
  get:
    responses:
      200:
        body:
          application/json:


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol&gt;
&lt;li&gt;&lt;p&gt;After the mobile application saves the picture we took with our device, Salesforce calls the /images endpoint we exposed in MuleSoft, passing three parameters: imageRecordId (Hackathon Image), contentVersionId (ID of the actual file in Salesforce), and contentDocumentLinkId (document link to the picture).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;MuleSoft takes the parameters, queries ContentVersion via the Salesforce connector, downloads the file (the actual image in Base64), then calls the system API, passing the image, and waits for the set of labels AWS recognized.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fpj22olnpa5hbypfuikrv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fpj22olnpa5hbypfuikrv.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fj98g5ifeyts44g0z363k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fj98g5ifeyts44g0z363k.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Once AWS responds, we create the labels in Salesforce (Image Label records) for the uploaded image, and lastly we call the REST service we exposed in Salesforce to notify the user that the image has been processed and now has labels.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2moeoualvk3isknffn9t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F2moeoualvk3isknffn9t.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It was really interesting to figure out how to call that REST service from the connector: with older versions of the connector we could grab the session ID and hit the REST endpoint directly, but in Mule 4 that's no longer possible, so in this case we use the connector's own capabilities to do it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgru79gfgrh7ck43wlgvv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fgru79gfgrh7ck43wlgvv.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Finally, as a user you can use your device to see the labels created per record, but I also built one more feature into this process API: a page served from MuleSoft that shows the information we saved!&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;How did I do it? In the same process API I placed a new configuration file named "portal", with a flow containing a "Load Static Resource" component that serves a page stored in a folder named "web" under src/main/resources.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F06q1vlq2rcex1g5vclen.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F06q1vlq2rcex1g5vclen.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The main page contains a script that uses jQuery to show the information about images and tweets.&lt;/p&gt;

&lt;p&gt;This is how the rendered page looks:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffp23s1divqsurqxtdiv3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Ffp23s1divqsurqxtdiv3.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Basically, the page shows you the labels we got from AWS, the picture we took, and the tweets related to the generated labels.&lt;/p&gt;

&lt;p&gt;You can get this code here: &lt;a href="https://github.com/emoran/mule-aws-recognition-process-api.git" rel="noopener noreferrer"&gt;https://github.com/emoran/mule-aws-recognition-process-api.git&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Watch the video:&lt;br&gt;
&lt;a href="http://www.youtube.com/watch?v=GWKP4U0o2Ng" rel="noopener noreferrer"&gt;http://www.youtube.com/watch?v=GWKP4U0o2Ng&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="http://www.youtube.com/watch?v=GWKP4U0o2Ng" rel="noopener noreferrer"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/http%3A%2F%2Fimg.youtube.com%2Fvi%2FGWKP4U0o2Ng%2F0.jpg"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>mulesofthackathon</category>
      <category>salesforce</category>
      <category>aws</category>
      <category>twitter</category>
    </item>
  </channel>
</rss>
