<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Karthik Ganesan</title>
    <description>The latest articles on Forem by Karthik Ganesan (@karth1k14).</description>
    <link>https://forem.com/karth1k14</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1013073%2F5b0b37d4-ca79-475a-a10c-846e0d6c97b1.png</url>
      <title>Forem: Karthik Ganesan</title>
      <link>https://forem.com/karth1k14</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/karth1k14"/>
    <language>en</language>
    <item>
      <title>List app registrations with credentials about to expire</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Mon, 20 Mar 2023 04:37:37 +0000</pubDate>
      <link>https://forem.com/karth1k14/list-app-registrations-with-credentials-about-to-expire-370i</link>
      <guid>https://forem.com/karth1k14/list-app-registrations-with-credentials-about-to-expire-370i</guid>
      <description>&lt;p&gt;App registrations is a mechanism in Azure AD allowing to work with an application and its permissions. It’s an object in Azure AD that represents the application, its redirect URI (where to redirect users after they have signed in), its logout URL (where to redirect users after they’ve signed out), API access and custom application roles for managing permissions to users and apps.&lt;/p&gt;

&lt;p&gt;Through an app registration, you can also restrict access to an application to a specific group of users if needed. For example, in a solution I built a few years ago, we had two separate apps: a customer-facing app and a management app, each with its own app registration. For the management app, I restricted access to the select group of people responsible for managing the system.&lt;/p&gt;

&lt;p&gt;Associated with an app registration is a service principal, which is the identity of that application. As you undoubtedly know, a service principal has credentials. What you may not know is that these credentials have an expiry date (end date). If you don’t monitor and manage those dates, you may end up with applications and services that stop working.&lt;/p&gt;

&lt;p&gt;So, we’ll see how to define an Azure Automation runbook that runs periodically to detect and list the credentials that are either expired or about to expire.&lt;/p&gt;

&lt;p&gt;If you are looking for an out-of-the-box solution rather than building a custom one, this article also recommends a tool that proactively monitors and alerts on &lt;a href="https://docs.serverless360.com/docs/app-registration"&gt;Azure app registration credential expiry&lt;/a&gt; without any coding or much manual effort.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setting up the automation runbook&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can create an Azure Automation runbook through the Azure portal or the CLI. We’ll show the portal way here.&lt;/p&gt;

&lt;p&gt;We first start by creating an Automation account. In the Azure portal, look for “Automation accounts”, then create a new instance:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YiibdoDM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x291rot8wbgqs8ff7pp3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YiibdoDM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x291rot8wbgqs8ff7pp3.png" alt="App registration - automation" width="576" height="470"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once the account is created, we need to make a runbook (an Automation account can host many runbooks, each handling a given task).&lt;/p&gt;

&lt;p&gt;Go to the “Runbooks” section, then click “Create a runbook” and enter the requested information.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rb5h8NvV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q78xn443x2at3fu6zh07.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rb5h8NvV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q78xn443x2at3fu6zh07.png" alt="runbook" width="719" height="339"&gt;&lt;/a&gt;&lt;/p&gt;
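&lt;p&gt;If you prefer scripting over the portal, a roughly equivalent setup with the Az.Automation PowerShell module could look like this (a sketch; the account, runbook, and resource group names are illustrative):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Create the Automation account
New-AzAutomationAccount -Name "expiry-check-aa" `
                        -ResourceGroupName "my-rg" `
                        -Location "westeurope"

# Create a PowerShell runbook inside it
New-AzAutomationRunbook -Name "List-ExpiringAppCredentials" `
                        -Type PowerShell `
                        -ResourceGroupName "my-rg" `
                        -AutomationAccountName "expiry-check-aa"
&lt;/code&gt;&lt;/pre&gt;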

&lt;p&gt;You’re then presented with a screen to enter the code for that runbook. Our code will be in PowerShell. We’ll get to the complete source code in the next section.&lt;/p&gt;

&lt;p&gt;For now, I’ve displayed some sample code:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In line 3, we import the “AzureAD” PowerShell module to interact with Azure AD. We use it at line 13 to get the list of all app registrations.&lt;/li&gt;
&lt;li&gt;Between lines 6 and 9, we authenticate to Azure AD before getting the list of app registrations (again, at line 13).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gcDoYDCZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udmcwyn1al6einonmaek.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gcDoYDCZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/udmcwyn1al6einonmaek.png" alt="powershell runbook" width="576" height="169"&gt;&lt;/a&gt;&lt;/p&gt;
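&lt;p&gt;In text form, the sample looks roughly like this (a sketch assuming the classic Run As connection; your authentication details may differ):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Import the module used to talk to Azure AD
Import-Module AzureAD

# Authenticate with the Automation account's Run As connection
$connection = Get-AutomationConnection -Name "AzureRunAsConnection"
Connect-AzureAD -TenantId $connection.TenantId `
                -ApplicationId $connection.ApplicationId `
                -CertificateThumbprint $connection.CertificateThumbprint

# Get the list of all app registrations
$applications = Get-AzureADApplication -All $true
&lt;/code&gt;&lt;/pre&gt;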

&lt;p&gt;From the toolbar (above the text editor), you can save the runbook, test it, publish it (you need to do that before you can use it in production), and revert to the previous version (in case the new version doesn’t work as expected).&lt;/p&gt;

&lt;p&gt;Since we’re importing a module (here, “AzureAD” at line 3), we first need to install it.&lt;/p&gt;

&lt;p&gt;To do that, at the Automation account level, we click on “Modules” and look for “AzureAD”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7a4hg6hg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wq2s1kq1da4w5lk3b1j2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7a4hg6hg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/wq2s1kq1da4w5lk3b1j2.png" alt="modules" width="517" height="273"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since that module isn’t installed, we need to install it from the gallery by clicking on “Add a module”. We’ll pick 5.1 as the runtime version:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ew5cAwad--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q4tf78gdvbdbxianxi8u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ew5cAwad--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q4tf78gdvbdbxianxi8u.png" alt="modules-2" width="576" height="523"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The code&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The PowerShell code to be added to the runbook is listed &lt;a href="https://gist.github.com/BelRarr/87add04e39dbe44801681a49376ee762"&gt;here&lt;/a&gt;. Replace the previous code with this one.&lt;/p&gt;

&lt;p&gt;The code is pretty easy to understand. One thing worth mentioning is the $daysToExpire variable, which you’ll have to set to an appropriate value for your scenario. It’s intended to detect the service principals whose credentials are about to expire within the next x days.&lt;/p&gt;
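&lt;p&gt;The core of the expiry check might look something like this (a hypothetical sketch; the property names follow the AzureAD module’s output, and the linked gist remains the authoritative version):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Flag credentials expiring within the next $daysToExpire days
$daysToExpire = 30
$threshold = (Get-Date).AddDays($daysToExpire)

foreach ($app in Get-AzureADApplication -All $true) {
    # PasswordCredentials are client secrets; KeyCredentials are certificates
    foreach ($cred in @($app.PasswordCredentials) + @($app.KeyCredentials)) {
        if ($cred.EndDate -lt $threshold) {
            Write-Output "$($app.DisplayName): credential expires $($cred.EndDate)"
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;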

&lt;p&gt;&lt;strong&gt;Configuring the permissions for the runbook&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;At this point, if you execute the runbook, you’ll notice that it might not work. That’s because the identity under which the runbook runs doesn’t have permissions to interact with Azure AD.&lt;/p&gt;

&lt;p&gt;An Azure Automation account has an associated identity. You can find it in the “Connections” section under “Shared resources” in the Azure portal.&lt;/p&gt;

&lt;p&gt;I’ll choose the “AzureRunAsConnection”, which is of type “Service principal”, and give it the appropriate permissions:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fNB09Mde--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/113z15qtm7uxy7j2leva.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fNB09Mde--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/113z15qtm7uxy7j2leva.png" alt="psautomation" width="576" height="188"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To find that service principal in Azure AD, I need to search for the name of the Automation account in the list of “All applications” under “App registrations”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KSMXYpw2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fqk8sfbosng72fxlnhyi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KSMXYpw2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fqk8sfbosng72fxlnhyi.png" alt="app registrations" width="576" height="312"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since we want to list app registrations from Azure AD, and following the least-privilege principle, we assign the directory role “Directory readers” to the service principal associated with our Automation account (the one that will execute the runbook).&lt;/p&gt;

&lt;p&gt;So, we go to “Roles and administrators” in our Azure AD tenant and select “Directory readers”:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fb-Iy2IS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8h3clmcrw6x9w2obpqjw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fb-Iy2IS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8h3clmcrw6x9w2obpqjw.png" alt="roles &amp;amp; administrators" width="576" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, we add an assignment to our service principal:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5f1YL260--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0apcc689o6jh40bc19t7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5f1YL260--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0apcc689o6jh40bc19t7.png" alt="assignment" width="576" height="355"&gt;&lt;/a&gt;&lt;/p&gt;
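&lt;p&gt;The same assignment can be scripted with the AzureAD module (a sketch; $automationSpId stands for the object ID of the Automation account’s service principal, and the “Directory Readers” role must already be activated in the tenant):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Find the activated "Directory Readers" directory role
$role = Get-AzureADDirectoryRole | Where-Object { $_.DisplayName -eq "Directory Readers" }

# Add the Automation account's service principal as a member
Add-AzureADDirectoryRoleMember -ObjectId $role.ObjectId -RefObjectId $automationSpId
&lt;/code&gt;&lt;/pre&gt;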

&lt;p&gt;And we’re done.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Setting up a schedule&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Schedule configuration is done in the runbook blade, since each runbook can have its own schedule:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OsxpOWQr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1ghj1i7to7vva64z3e1k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OsxpOWQr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/1ghj1i7to7vva64z3e1k.png" alt="list app" width="393" height="331"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The process is straightforward, and the frequency depends on your need. Once every 24 hours can be a reasonable frequency.&lt;/p&gt;
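&lt;p&gt;If you’d rather script the schedule too, the Az.Automation equivalent might look like this (a sketch; the names reuse the illustrative ones above):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# A schedule that fires once every 24 hours, starting in one hour
$schedule = New-AzAutomationSchedule -Name "daily-expiry-check" `
    -StartTime (Get-Date).AddHours(1) -DayInterval 1 `
    -ResourceGroupName "my-rg" -AutomationAccountName "expiry-check-aa"

# Attach the schedule to the runbook
Register-AzAutomationScheduledRunbook -RunbookName "List-ExpiringAppCredentials" `
    -ScheduleName $schedule.Name `
    -ResourceGroupName "my-rg" -AutomationAccountName "expiry-check-aa"
&lt;/code&gt;&lt;/pre&gt;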

&lt;p&gt;&lt;strong&gt;Get proactive alerts on credential expiration with an out-of-the-box solution&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Building custom solutions with lengthy code and various Azure services has been the usual way to get notified about credentials that are about to expire. This is because the native Azure monitoring tools don’t offer any features to fulfill such &lt;a href="https://docs.serverless360.com/docs/app-registration"&gt;App registration credential expiry monitoring&lt;/a&gt; needs.&lt;/p&gt;

&lt;p&gt;But the custom approach involves many manual tasks, can be error-prone, and, if you are on the Azure operations team, leaves you relying heavily on developers.&lt;/p&gt;

&lt;p&gt;So, it’s been a goal of Serverless360 to mitigate this drawback by monitoring the expiry dates of credentials associated with app registrations. Here is what it’s currently capable of:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Manage and monitor any number of credentials with ease.&lt;/li&gt;
&lt;li&gt;Get intelligent alerts before a client secret/credential expires.&lt;/li&gt;
&lt;li&gt;Customize the number of days to notify you before the credential expires.&lt;/li&gt;
&lt;li&gt;Extensively view the properties of your Azure App registrations and their credentials.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;To sum it up&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Azure Automation is an excellent service that you can use for various management tasks. In this article, we’ve used it to detect and list service principals whose credentials are either expired or about to expire, so we can act on them and decide whether to renew those credentials.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>appregistrations</category>
    </item>
    <item>
      <title>Azure Application Insights vs Log Analytics: Which one should you choose?</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Wed, 22 Feb 2023 06:57:41 +0000</pubDate>
      <link>https://forem.com/karth1k14/azure-application-insights-vs-log-analytics-which-one-should-you-choose-52jg</link>
      <guid>https://forem.com/karth1k14/azure-application-insights-vs-log-analytics-which-one-should-you-choose-52jg</guid>
      <description>&lt;p&gt;We often speak to organizations about Azure and one of the common questions is what is the difference between App Insights and Log Analytics.&lt;/p&gt;

&lt;p&gt;In this article we will aim to discuss those differences and overlaps.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr83laalhrsocshf1qyw8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr83laalhrsocshf1qyw8.png" alt=" " width="405" height="228"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If we think about the Azure Monitoring Platform / ecosystem, then Log Analytics and App Insights both play a role within that platform as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwe0ci8lod8lpgk1zuy4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbwe0ci8lod8lpgk1zuy4.png" alt=" " width="602" height="380"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Log Analytics plays a role in the storage of Log Data and the analysis of Log Data.&lt;/p&gt;

&lt;p&gt;App Insights plays a role in the analysis and insights into log data.&lt;/p&gt;

&lt;p&gt;The official line from Microsoft is that in 2018 Log Analytics and &lt;a href="https://www.serverless360.com/compare-application-insights" rel="noopener noreferrer"&gt;Application Insights&lt;/a&gt; became a single service providing powerful end-to-end monitoring for your applications. However, the two resource types are still separate, with different target use cases and overlaps, which we will explore below.&lt;/p&gt;

&lt;p&gt;The key change Microsoft introduced is that App Insights used to be a completely standalone service, whereas now you have workspace-based App Insights: when you configure an App Insights instance, you also get a Log Analytics workspace that it points to.&lt;/p&gt;

&lt;p&gt;Rather than having its own separate log storage, App Insights stores the logs in Log Analytics and provides views of that log data.&lt;/p&gt;

&lt;p&gt;It’s probably easiest to think of Log Analytics as being the raw database and tables for Log data and App Insights is a set of views aimed at providing useful views on your applications’ telemetry data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Log Analytics Use Cases&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The most familiar use case for Log Analytics is a virtual machine with a logging agent on it that sends telemetry to a Log Analytics workspace.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbu0x6wip964bvfd5x9us.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbu0x6wip964bvfd5x9us.png" alt=" " width="370" height="153"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can also configure other services to send their telemetry logs to a Log Analytics workspace with some examples such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Service Bus&lt;/li&gt;
&lt;li&gt;Front Door&lt;/li&gt;
&lt;li&gt;Azure SQL&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu150lo4y1waepuf8n2e6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu150lo4y1waepuf8n2e6.png" alt=" " width="364" height="301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For many Azure resources, you can configure diagnostic settings to send telemetry logs to Log Analytics; this is essentially infrastructure-level logging of how those resources are being used and how they are performing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;App Insights Use Cases&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With App Insights, you are looking at more application-level use cases. Some of the examples would be things like configuring a Web App, a Function App, or API Management to send application-level telemetry to App Insights.&lt;/p&gt;

&lt;p&gt;With the workspace-based App Insights the log data is still stored in a Log Analytics workspace but you are streaming your telemetry to App Insights and viewing it through App Insights.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpd8tzapvm73ie5ay9rw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flpd8tzapvm73ie5ay9rw.png" alt=" " width="589" height="392"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;A look under the hood&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With Log Analytics, the data sent to the logs is in a rawer format, typically in tables like AzureDiagnostics and AzureMetrics for the infrastructure-level logs discussed previously. But if you’re using App Insights, you will find that the Log Analytics workspace under the hood contains the tables shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gfytia7pmotwz8e4qp7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gfytia7pmotwz8e4qp7.png" alt=" " width="359" height="331"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;These are the tables where App Insights is storing the telemetry data in Log Analytics.&lt;/p&gt;

&lt;p&gt;If you look at the “logs” through App Insights (remember, we said to think of App Insights as a view over the tables in Log Analytics), you will see the following tables.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxkrra07jt3rll82xptkl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxkrra07jt3rll82xptkl.png" alt=" " width="334" height="324"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you write a query against the requests table in App Insights, it is really querying the AppRequests table in Log Analytics.&lt;/p&gt;

&lt;p&gt;From the perspective of writing a query, the two queries below are equivalent:&lt;/p&gt;

&lt;p&gt;App Insights:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbytapcgxys44u2c7io7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flbytapcgxys44u2c7io7.png" alt=" " width="800" height="173"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Log Analytics:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzbtibcy67r7396om1t2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqzbtibcy67r7396om1t2.png" alt=" " width="800" height="223"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice the key differences: the filter on the _ResourceId field scopes the query to the App Insights resource, and the TimeGenerated field in Log Analytics appears as timestamp in App Insights.&lt;/p&gt;
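&lt;p&gt;In text form, the two screenshots correspond to KQL queries along these lines (a sketch; the resource filter is a placeholder and the time window is arbitrary):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// App Insights: query the requests table directly
requests
| where timestamp &gt; ago(1h)
| summarize count() by resultCode

// Log Analytics: the same data lives in AppRequests,
// scoped to the App Insights resource via _ResourceId
AppRequests
| where TimeGenerated &gt; ago(1h)
| where _ResourceId contains "my-app-insights"
| summarize count() by ResultCode
&lt;/code&gt;&lt;/pre&gt;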

&lt;p&gt;&lt;strong&gt;What additional value does App Insights provide on top of Log Analytics&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To reiterate: Log Analytics is the log storage, and App Insights is a layer on top that provides friendly views of the log data to suit application performance management, whether your app is built with App Service or Function Apps, or is even a server-based IIS app with the telemetry agent. What App Insights gives you are added-value features that use that log data and present it in ways that help you build and operate great apps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Transaction Search&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One of the handiest features of App Insights is the transaction search. It allows you to search for requests, dependencies, or traces that have been executed by your application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmx0tx93djfejmp9joyyk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmx0tx93djfejmp9joyyk.png" alt=" " width="602" height="345"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can then open up the item you’re looking for, and it will walk through the distributed trace and show you the call stack of services that were called. Below, you can see a request received by API Management which was then forwarded to a Function App.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpiqwemhklckbw3xqawid.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpiqwemhklckbw3xqawid.png" alt=" " width="602" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Application Map&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The application map allows you to see the dependencies between your app and the resources it uses, such as an HTTP endpoint or a SQL database. App Insights looks at the captured telemetry and works out these services and their usage and performance levels.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1a95we3xp3k6aam218du.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1a95we3xp3k6aam218du.png" alt=" " width="602" height="429"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Live Metrics&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Live metrics is a great way to see a live stream overview of how your app is working and the requests it is processing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhoyxnn2ave29wxoub758.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhoyxnn2ave29wxoub758.png" alt=" " width="602" height="286"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Availability&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Availability allows you to create web tests to ping endpoints in your app and test that they are working and monitor their SLA.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7gvz69ghjd5qaswicv6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx7gvz69ghjd5qaswicv6.png" alt=" " width="602" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Failures&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Failures allows you to begin exploring the errors within your app to identify common problems and areas you need to work on.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvi10hkfhj3dizfmg5ij.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxvi10hkfhj3dizfmg5ij.png" alt=" " width="602" height="282"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Performance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Performance shows you the average response times, usage and useful aggregations of the performance data in your app. It’s a great way to explore and monitor your app usage and to troubleshoot performance issues.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs5i6nb7jqlfj0s9v74vq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs5i6nb7jqlfj0s9v74vq.png" alt=" " width="602" height="279"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Usage&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Usage provides a number of features aimed at helping you explore the user experience and usage of your application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fakszy0ga5olh3uvi4ny0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fakszy0ga5olh3uvi4ny0.png" alt=" " width="602" height="235"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can begin to understand user journeys in your app and how it is used in the real world.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz2luxuiuwzdnejk8umat.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz2luxuiuwzdnejk8umat.png" alt=" " width="602" height="245"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are also lots of deeper dive views and dashboards you can look into.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzo8b314v4a6d86fsed0g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzo8b314v4a6d86fsed0g.png" alt=" " width="602" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What do App Insights and Log Analytics mean for Serverless360&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Log Analytics and App Insights are awesome features on Azure and they can really help your team. The challenge with them really falls into the area of roles and responsibilities. In an organization, the Azure experts will be really comfortable with using App Insights and Log Analytics and can do deep dive analysis and learn deep insights about application performance.&lt;/p&gt;

&lt;p&gt;The problem is that not every stakeholder in the application or in IT operations is an Azure expert, and in most cases Azure experts are a scarce, premium resource.&lt;/p&gt;

&lt;p&gt;The value proposition for Serverless360 here is to leverage App Insights and Log Analytics data but present it in a much more consumable format for the IT operations team, so they can understand the data in the context of the application they are supporting without the steep learning curve of the Azure tooling.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.serverless360.com/" rel="noopener noreferrer"&gt;Serverless360&lt;/a&gt; is all about lowering the total cost of ownership of your solutions by moving support to the left away from your experts and allowing your support teams to be more productive.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd1m85gtvg7zvdtqr6ufk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd1m85gtvg7zvdtqr6ufk.png" alt=" " width="602" height="298"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The example below is a dashboard we created for one of our application teams. It shows an overview of how their application’s web services, which we expose through API Management, are performing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffzt5fe5rtt869fsqcib8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffzt5fe5rtt869fsqcib8.png" alt=" " width="602" height="288"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can learn more about how Serverless360 democratizes support of Azure solutions around App Insights and Log Analytics via the following articles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://www.serverless360.com/blog/integrating-log-analytics-in-serverless360" rel="noopener noreferrer"&gt;Integrating Log Analytics with Serverless360&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.serverless360.com/blog/application-insights-health-check-with-serverless360" rel="noopener noreferrer"&gt;Serverless360 and Application Insights&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.serverless360.com/blog/enable-security-manager-to-azure-web-application-firewall-waf-data" rel="noopener noreferrer"&gt;Enable your security manager to see Frontdoor WAF data&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=1N22XLhxsuM" rel="noopener noreferrer"&gt;Comparing Log Analytics, App Insights and Serverless360 BAM when using Logic Apps&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=n0Je2TrVHFs" rel="noopener noreferrer"&gt;Azure Monitor &amp;amp; Serverless360&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Hopefully, this article clarifies the relationship between Log Analytics and App Insights. Log Analytics is the log storage and query resource, and it can be used by both infrastructure-level and application-level solutions.&lt;/p&gt;

&lt;p&gt;Application Insights is a layer on top of Log Analytics aimed at application-level telemetry. It uses the log data stored in Log Analytics to provide an additional set of features that make App Insights an Application Performance Management tool.&lt;/p&gt;

&lt;p&gt;Both are highly likely to be part of your architecture, and both provide great features to help you build better apps and operate your infrastructure well.&lt;/p&gt;

</description>
      <category>firstpost</category>
      <category>posts</category>
      <category>introduction</category>
    </item>
    <item>
      <title>Using APIM as a Proxy for Serverless360 BAM</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Tue, 07 Feb 2023 04:37:16 +0000</pubDate>
      <link>https://forem.com/karth1k14/using-apim-as-a-proxy-for-serverless360-bam-45ng</link>
      <guid>https://forem.com/karth1k14/using-apim-as-a-proxy-for-serverless360-bam-45ng</guid>
      <description>&lt;p&gt;Many companies like all their API calls to external services routed through Azure API Management (APIM). If you are a user of Serverless360 BAM, then one of the options is to use Azure API Management as a proxy to your BAM API and then use APIM from within Logic Apps.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Serverless360 BAM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Serverless360 is an enterprise product that aims to provide simple yet powerful toolsets for all operational needs in the Azure cloud. Serverless360 offers an end-to-end distributed tracing feature called BAM. BAM helps users identify how a transaction flows through a business process and spotlight failures in those transactions. BAM can be instrumented in a business process containing services such as Logic Apps, Function Apps, APIM, or an on-premises system capable of making an API call.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure API Management Proxy&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The standard approach for BAM would look something like the below. There are three main areas:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The BAM Service provided by Serverless360&lt;/li&gt;
&lt;li&gt;Your database and storage where you want to store your BAM data&lt;/li&gt;
&lt;li&gt;The components from which you wish to publish BAM events&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu92waypppqr9p7m3ao42.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu92waypppqr9p7m3ao42.png" alt="Image description" width="453" height="347"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Depending on the hosting option you choose, you may host the BAM Service yourself or have it hosted on the Serverless360 SaaS platform.&lt;/p&gt;

&lt;p&gt;For the scenario we are discussing here, as a BAM user, you might pass your calls to Serverless360 BAM through your own Azure API Management solution before they go to Serverless360. It would look like the below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqwrch330evx3v2vqneo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgqwrch330evx3v2vqneo.png" alt="Image description" width="511" height="382"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why might you choose to do this?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;There are several different reasons you might consider doing this, which include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You want to centrally route all API calls through APIM for operational and monitoring purposes.&lt;/li&gt;
&lt;li&gt;You might want to connect APIM to BAM and use the advanced subscription and key management features in APIM to give other teams access to BAM.&lt;/li&gt;
&lt;li&gt;You might want to have a layer of abstraction from your implementation to a 3rd party service.&lt;/li&gt;
&lt;li&gt;You might want to simplify your configuration scenarios for logic apps, so you don’t need to use a custom connector.&lt;/li&gt;
&lt;li&gt;There can be some performance benefits from using APIM rather than a custom connector with Logic Apps.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;How do we set up this scenario?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This scenario is easy to set up; the steps are below.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Download the API Specification for Serverless360 BAM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You need to go to the action menu from the BAM home page and access the BAM connection details.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2c45eplugsv393v5z0dm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2c45eplugsv393v5z0dm.png" alt="Image description" width="299" height="237"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here you can get the link to the Swagger definition for your BAM API and view it in the Swagger UI.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff1tgi3lkqu4tk73au4kw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff1tgi3lkqu4tk73au4kw.png" alt="Image description" width="362" height="173"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can paste the link into the browser, and it will then show you the BAM API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3z7fmepcat5z82z5nk8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk3z7fmepcat5z82z5nk8.png" alt="Image description" width="602" height="432"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The Swagger link you need for APIM is indicated by the arrow above; you can click it to view the Swagger definition.&lt;/p&gt;

&lt;p&gt;Take this URL to APIM so you can import your API.&lt;/p&gt;
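
&lt;p&gt;Before importing, you can also inspect the spec programmatically. The sketch below is a minimal, hypothetical Python example: the embedded document is a stand-in for the real Swagger file you download from the BAM portal, and the paths and operation IDs are invented for illustration.&lt;/p&gt;

```python
import json

# Minimal stand-in for the Swagger (OpenAPI 2.0) document that the BAM
# portal exposes. In practice you would download it from the link shown
# in the connection details; these paths and operationIds are invented.
swagger = json.loads("""
{
  "swagger": "2.0",
  "info": {"title": "Serverless360 BAM", "version": "1.0"},
  "paths": {
    "/api/v2/businessprocess/starttransaction": {"post": {"operationId": "StartTransaction"}},
    "/api/v2/businessprocess/checkpoint": {"post": {"operationId": "CheckPoint"}}
  }
}
""")

# Enumerate the operations APIM will create when this spec is imported.
operations = [
    (verb.upper(), path, details.get("operationId", ""))
    for path, verbs in swagger["paths"].items()
    for verb, details in verbs.items()
]
for verb, path, op_id in operations:
    print(verb, path, op_id)
```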

&lt;p&gt;&lt;strong&gt;Create BAM API in APIM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, go to Azure APIM and start creating a new API using the OpenAPI import option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5w073i2pwosbx8u4f8cs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5w073i2pwosbx8u4f8cs.png" alt="Image description" width="402" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can then add the URL for the Swagger, which will import information about your BAM API. You will probably want to modify the API URL suffix so it matches the naming convention you use in your APIM instance.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fka08csf4v5mkv0wwj8f2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fka08csf4v5mkv0wwj8f2.png" alt="Image description" width="602" height="286"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next, click Create, and APIM will import the API and set up BAM for you.&lt;/p&gt;

&lt;p&gt;You will now see a new API, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gqklvbssmf8k6qku6es.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9gqklvbssmf8k6qku6es.png" alt="Image description" width="602" height="404"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Set the host and key for BAM&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, you need to set the key for calling your BAM API. The best way to do this is to get the key from the BAM portal and then, in the policy for All operations on the API, add a header that sets the key, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faslh8m8n5rdbf4ofg1qq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faslh8m8n5rdbf4ofg1qq.png" alt="Image description" width="505" height="322"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Most people will probably use a named value, either stored as a secret or linked to a Key Vault, to supply the BAM API key in the policy.&lt;/p&gt;
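
&lt;p&gt;As a rough sketch, the inbound policy might look like the fragment below. This is illustrative rather than taken from the product documentation: the header name is a placeholder for whatever header your BAM connection details specify, and &lt;code&gt;bam-api-key&lt;/code&gt; is an assumed named value.&lt;/p&gt;

```xml
&lt;inbound&gt;
    &lt;base /&gt;
    &lt;!-- Inject the BAM API key from a named value (possibly Key Vault backed).
         The header name "x-api-key" is a placeholder for this sketch. --&gt;
    &lt;set-header name="x-api-key" exists-action="override"&gt;
        &lt;value&gt;{{bam-api-key}}&lt;/value&gt;
    &lt;/set-header&gt;
&lt;/inbound&gt;
```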

&lt;p&gt;Your APIM is now set up to be able to send messages to BAM.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Use BAM in your Logic App&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, go to your Logic App and add an APIM action. If you select your APIM instance and the BAM API you created, you will see the list of operations imported when we set up APIM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh9ubb8ztmsgq1ajamvck.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh9ubb8ztmsgq1ajamvck.png" alt="Image description" width="602" height="278"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can now use these operations just as you would with the custom connector.&lt;/p&gt;

&lt;p&gt;With APIM having all of the schemas for the BAM API, your Logic App will be able to create a rich design-time experience, just like with the custom connector, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F07nwgom8b5cowgcki7nz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F07nwgom8b5cowgcki7nz.png" alt="Image description" width="602" height="385"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The main difference is that instead of creating an API connection behind the scenes, and inheriting the additional configuration complexity that Logic Apps have in that area, you inherit the features of Logic Apps combined with APIM. You will be able to implement more advanced security scenarios such as managed identity, and you can supply a subscription key and other settings depending on how you have set up APIM.&lt;/p&gt;
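
&lt;p&gt;For example, a caller outside Logic Apps could hit the proxied BAM API with a plain HTTP request carrying the APIM subscription key. This is a hedged sketch: the gateway URL, API suffix, payload, and header usage are illustrative placeholders, not the exact Serverless360 contract.&lt;/p&gt;

```python
import urllib.request

# Hypothetical APIM gateway URL and API suffix -- substitute your own.
url = "https://contoso.azure-api.net/bam/api/v2/businessprocess/checkpoint"

# Build a POST request carrying the APIM subscription key; the APIM policy
# then injects the real BAM API key before forwarding the call onward.
request = urllib.request.Request(
    url,
    data=b'{"stage": "Complete"}',  # illustrative checkpoint payload
    headers={
        "Ocp-Apim-Subscription-Key": "REPLACE_WITH_SUBSCRIPTION_KEY",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request)  # not executed here: no live gateway
```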

&lt;p&gt;You will also avoid the custom connector throttling limits, so we expect to see a performance improvement here too.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Hopefully, you will find that using APIM as a proxy is easy to set up and that it opens up some more advanced scenarios that some BAM users may find helpful. Choosing the right distributed tracing tool plays a vital role in your business.&lt;/p&gt;

</description>
      <category>devto</category>
      <category>announcement</category>
      <category>offers</category>
    </item>
    <item>
      <title>Business Activity Monitoring with Flat File Messages</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Tue, 07 Feb 2023 04:30:26 +0000</pubDate>
      <link>https://forem.com/karth1k14/business-activity-monitoring-with-flat-file-messages-2cc3</link>
      <guid>https://forem.com/karth1k14/business-activity-monitoring-with-flat-file-messages-2cc3</guid>
      <description>&lt;p&gt;Let us consider that you have implemented an integrated solution and are using Serverless360 Business Activity Monitoring (BAM) to help provide your support users and business users with visibility of what is happening in the business transaction. BAM provides you with distributed tracing to attain maximum visibility on the integration solution that the functional operations team needs. You might have a scenario where you have an unstructured message format that you want to archive in BAM so that the users can see the data which was processed.&lt;/p&gt;

&lt;p&gt;Some examples of these might be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Processing CSV files&lt;/li&gt;
&lt;li&gt;Processing plain text files&lt;/li&gt;
&lt;li&gt;Processing EDI messages&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this article, we will look at how you can implement BAM with a Logic App solution that will allow you to archive unstructured messages but use data from them within BAM.&lt;/p&gt;

&lt;p&gt;Before continuing, if you are unfamiliar with BAM or how it can be used in Logic Apps, please check out the resources below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.serverless360.com/docs/checkpointing-in-bam" rel="noopener noreferrer"&gt;https://docs.serverless360.com/docs/checkpointing-in-bam&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Sending flat file data to a system&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To start, we will look at the scenario where we have an interface that will run on a schedule and look up data from one of our systems and then produce a flat-file CSV message which will be written to the partner’s SFTP site.&lt;/p&gt;

&lt;p&gt;The solution will look like the below diagram.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv0vzburcowjbp5bxj7kj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv0vzburcowjbp5bxj7kj.png" alt="Image description" width="800" height="462"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A Logic App implements the solution with a recurrence trigger that will call API Management to query some data from our SaaS system. It will then use the Integration Account to convert the resulting data to a CSV format and deliver the data with the SFTP connector.&lt;/p&gt;
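
&lt;p&gt;Conceptually, the flat-file encoding step turns structured records into a delimited file. The sketch below shows the idea in plain Python; the field names and values are invented for illustration, and the real encoding is driven by the XSD schema in the Integration Account rather than code like this.&lt;/p&gt;

```python
import csv
import io

# Illustrative records such as the API Management call might return;
# the field names are made up for this sketch.
records = [
    {"OrderNumber": "PO-1001", "Customer": "Contoso", "Amount": "250.00"},
    {"OrderNumber": "PO-1002", "Customer": "Fabrikam", "Amount": "99.50"},
]

# Conceptually what the Integration Account flat-file encoding does:
# turn structured records into a delimited flat file for the SFTP drop.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["OrderNumber", "Customer", "Amount"])
writer.writeheader()
writer.writerows(records)
flat_file = buffer.getvalue()
print(flat_file)
```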

&lt;p&gt;We want to share the raw messages with BAM to support the solution effectively. The level 1 support operator can troubleshoot any data issues, and the user can do some self-service actions by checking transactions executed and the data passed between systems.&lt;/p&gt;

&lt;p&gt;To implement this, we will start by adding the BAM Start transaction shape, as shown below, to the Logic App.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnfcta9ffeil9a6h72wda.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnfcta9ffeil9a6h72wda.png" alt="Image description" width="602" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will also have a CheckPoint shape after retrieving the API data. In both shapes, we are passing information to BAM to advise where we currently are in the business transaction execution. In these shapes, we can send message body data to BAM, which can be used to extract properties from the data, or we can choose to archive the message so the support user can retrieve it.&lt;/p&gt;

&lt;p&gt;When we called the API in the Logic App, it returned JSON data. We can use the Transform XML shape, as shown below, to transform the JSON to XML representing a schema for the CSV file. You would use the Enterprise Integration Pack to create an XSD file for the schema and load it into the integration account.&lt;/p&gt;

&lt;p&gt;You can add a BAM CheckPoint shape if you want to archive the XML data in BAM.&lt;/p&gt;

&lt;p&gt;The key benefit of sending the XML or JSON data to BAM is that these structured representations of the data make it easy to promote properties to your BAM transaction so users can search for specific transactions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe7cdw6lhm40oi8fumyy3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe7cdw6lhm40oi8fumyy3.png" alt="Image description" width="602" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Once we have the XML representation of the data, we can use the Flat File Encoding shape, as shown above. It will let us specify the schema and convert the XML to CSV format.&lt;/p&gt;

&lt;p&gt;We can then deliver the file to the SFTP site and do a BAM Checkpoint to indicate the interface was successfully processed. The diagram below shows that the CSV data is sent as the message body to BAM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frylp064dkqitcyz7tluk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frylp064dkqitcyz7tluk.png" alt="Image description" width="602" height="544"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When the support or business user uses the BAM distributed tracing module, they will not need to worry about all of the shapes your Logic App uses to process a complex interface. They will simply see the critical milestones in the process. In a successful BAM transaction, you can see below that each stage lights up green, indicating success.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5uvakj1cyaelhgoytvx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx5uvakj1cyaelhgoytvx.png" alt="Image description" width="602" height="457"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Users can see the JSON data you archived to BAM if they click on the Get CRM Data shape.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl61i99hc5jxhqet5kt22.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl61i99hc5jxhqet5kt22.png" alt="Image description" width="602" height="631"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If the user clicks on the Complete stage in the BAM transaction, they can see the CSV message body sent to the destination system.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxzchonhl6s5y4r0zk3lh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxzchonhl6s5y4r0zk3lh.png" alt="Image description" width="602" height="556"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Receiving flat file data from a system&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The opposite scenario is when we receive data from a system and then need to decode and parse it before processing it.&lt;/p&gt;

&lt;p&gt;In this case, we also want to share the raw data with the support and business super users to help them see the data we are processing for troubleshooting purposes.&lt;/p&gt;

&lt;p&gt;In this case, we can implement BAM in our Logic App, and we will use the message archiving feature on the Start Transaction and CheckPoint shapes and pass the data at different stages in the process.&lt;/p&gt;

&lt;p&gt;You can see below that in this case we are using the StartTransaction shape, and the raw CSV file is passed as the message body.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwzlgj5hb41w337a8tvgt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwzlgj5hb41w337a8tvgt.png" alt="Image description" width="602" height="435"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We would then use the Integration Account flat file decoding shape to reference an XSD schema in the integration account, and this would convert the CSV data to XML format.&lt;/p&gt;
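
&lt;p&gt;The decoding step is conceptually the reverse of the encoding scenario: each delimited row becomes an XML record. The sketch below illustrates this in plain Python with invented element names; the real shape uses the XSD schema in the integration account rather than hand-built elements.&lt;/p&gt;

```python
import csv
import io
import xml.etree.ElementTree as ET

# Raw flat-file content as received from the partner (sample data only).
csv_text = "OrderNumber,Customer\nPO-1001,Contoso\nPO-1002,Fabrikam\n"

# Conceptually what the flat-file decoding shape does with the XSD:
# turn each delimited row into an XML record element.
root = ET.Element("Orders")
for row in csv.DictReader(io.StringIO(csv_text)):
    order = ET.SubElement(root, "Order")
    for name, value in row.items():
        ET.SubElement(order, name).text = value

xml_output = ET.tostring(root, encoding="unicode")
print(xml_output)
```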

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskyovbwlsjk5u42f52eh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fskyovbwlsjk5u42f52eh.png" alt="Image description" width="602" height="192"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can then use a CheckPoint shape after the decoding if we want to send the XML message to BAM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdu5fc0ugtaw6ksfr5fq1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdu5fc0ugtaw6ksfr5fq1.png" alt="Image description" width="602" height="589"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the BAM distributed tracing module, as earlier in the article, you will see the key milestones in your interface showing green if they were successful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqz0urj65ayc4jzxswqc1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqz0urj65ayc4jzxswqc1.png" alt="Image description" width="602" height="455"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you click on the first stage, where we archived the CSV data, the BAM user can see the raw message in BAM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj95e260qxpj18f70a0m2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fj95e260qxpj18f70a0m2.png" alt="Image description" width="602" height="448"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the real world, business users, level 1 support operators, and support people outside the integration team are often quite familiar with some data formats and business transactions. Using BAM distributed tracing to give them visibility into the execution and data being processed within your interface can let them participate in your interface’s day-to-day operations, leading to a lower operating cost for your solution.&lt;/p&gt;

&lt;p&gt;With Logic Apps, we have some excellent features for decoding, parsing, and transforming message formats such as CSV and EDI. If we send the raw messages to BAM, users can see exactly what they look like; if we send the decoded formats, such as the XML or JSON data, then it is easy for BAM to promote metadata from the message, allowing users to search for transactions that processed specific data.&lt;/p&gt;

</description>
      <category>announcement</category>
      <category>devto</category>
      <category>offers</category>
    </item>
    <item>
      <title>Tracking IDocs for Integration Scenarios with Serverless360 BAM</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Tue, 07 Feb 2023 04:24:24 +0000</pubDate>
      <link>https://forem.com/karth1k14/tracking-idocs-for-integration-scenarios-with-serverless360-bam-23mb</link>
      <guid>https://forem.com/karth1k14/tracking-idocs-for-integration-scenarios-with-serverless360-bam-23mb</guid>
      <description>&lt;p&gt;Suppose you are a user of the Microsoft integration stack, and your organization also uses SAP. In that case, you will likely have use cases where an IDOC triggers integration processes in SAP being published.&lt;/p&gt;

&lt;p&gt;One of the good things about Logic Apps on Azure is that the SAP connector allows you to register to receive IDocs published by SAP, and you can then use them in your integration processes.&lt;/p&gt;

&lt;p&gt;We implemented a pattern where a generic Logic App registered to receive IDocs would then use Azure Service Bus to implement a pub/sub pattern, allowing multiple different interfaces to act upon the receipt of an IDoc into the integration platform.&lt;/p&gt;

&lt;p&gt;Below is a representation of this architecture.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs72g5d3k1mfuyk6b4mqo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs72g5d3k1mfuyk6b4mqo.png" alt="Image description" width="800" height="286"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is quite a handy pattern, as it allows you to add new IDocs and new interfaces triggered via Service Bus in a plug-and-play style.&lt;/p&gt;
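
&lt;p&gt;The routing idea behind this pattern can be sketched in a few lines. This is a minimal in-memory illustration only; in the real architecture, Azure Service Bus topic subscriptions play the role of the handler lists, and all names here are illustrative.&lt;/p&gt;

```python
from collections import defaultdict

# Minimal in-memory sketch of the pub/sub routing: downstream interfaces
# subscribe by IDoc message type, and the generic receiver fans each
# IDoc out to every matching subscriber (Service Bus topic subscriptions
# do this in the real architecture).
subscribers = defaultdict(list)

def subscribe(mestyp, handler):
    """Register a downstream interface for a given IDoc message type."""
    subscribers[mestyp].append(handler)

def publish(idoc):
    """Fan an IDoc out to every interface subscribed to its message type."""
    for handler in subscribers[idoc["MESTYP"]]:
        handler(idoc)

orders_received = []
subscribe("ORDERS", orders_received.append)    # order-processing interface
subscribe("ORDERS", lambda idoc: None)         # a second interface, added plug-and-play
publish({"MESTYP": "ORDERS", "DOCNUM": "0000000123456789"})
print(len(orders_received))
```

&lt;p&gt;Adding a new interface is just another subscribe call, which is what makes the pattern plug-and-play.&lt;/p&gt;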

&lt;p&gt;&lt;strong&gt;The Challenge&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Because we have several interfaces triggered by IDOCS being published, it is challenging to track which Logic App run processed each IDOC received by the integration platform and which downstream interface processed each IDOC.&lt;/p&gt;

&lt;p&gt;The net result of this challenge is that the interfaces are harder to support, lots of time is spent tracking down issues, and the business users and SAP team feel like they have no visibility into the processes that are important to the effective running of their applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Value of BAM in this Scenario&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We felt BAM could help us a lot here. Tracking the IDocs published to and received by the integration platform, tracking their participation in interfaces, and then making this accessible would help us reduce those support costs and give a better-quality service to the users who depend on our integration platform.&lt;/p&gt;

&lt;p&gt;In BAM, we could achieve distributed tracing by creating business processes and transactions that represent a straightforward, business-friendly view of our interfaces. Then, we could create searchable properties that our integrations could use to make it easy to track and trace the information we needed.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BAM Global Properties&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In BAM, we have global properties, where you declare the properties that will be searchable. These are common business values shared across interfaces that users may wish to search on. For example, a BAM user may want to search for a transaction via an Order Number.&lt;/p&gt;

&lt;p&gt;We started by declaring common business properties such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CUSTOMER-NO&lt;/li&gt;
&lt;li&gt;ORDER-NO&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These have business meaning, but in different interfaces they may appear in different messages and message formats. For example, in one interface they may be in a JSON message, and in another they could be in XML. Connecting your interfaces to BAM allows you to map properties to the implementation at both design time and runtime to ensure you can extract the needed information.&lt;/p&gt;
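
&lt;p&gt;As a small illustration of the same business property living in two formats, here is one value (ORDER-NO) extracted from a JSON message and from an XML message. The field names are assumptions for illustration, not the article's actual message schemas.&lt;/p&gt;

```python
import json
import xml.etree.ElementTree as ET

# The same business property (ORDER-NO) promoted from two different
# interface formats; field names are illustrative only.
json_message = '{"order": {"orderNo": "SO-1001"}}'
order_no_from_json = json.loads(json_message)["order"]["orderNo"]

# Build an equivalent XML message in memory and read the same value.
root = ET.Element("Order")
ET.SubElement(root, "OrderNo").text = "SO-1001"
order_no_from_xml = root.findtext("OrderNo")

# Both interfaces would map to the single ORDER-NO global property in BAM.
print(order_no_from_json, order_no_from_xml)
```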

&lt;p&gt;&lt;strong&gt;SAP specific properties&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next up, we added some SAP-specific properties. In an IDOC, the EDI_DC40 segment is common to all the IDOCS we receive, and we felt that promoting some properties from this segment would be an excellent way to make helpful information searchable in BAM.&lt;/p&gt;

&lt;p&gt;The IDOC properties in the EDI_DC40 segment we are interested in include the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;DOCNUM is the IDOC number&lt;/li&gt;
&lt;li&gt;STATUS is the status of the IDOC&lt;/li&gt;
&lt;li&gt;IDOCTYP is the type of IDOC&lt;/li&gt;
&lt;li&gt;MESTYP is the message type for the IDOC&lt;/li&gt;
&lt;li&gt;CREDAT is the date the IDOC was created&lt;/li&gt;
&lt;li&gt;CRETIM is the time the IDOC was created&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We created the following list of properties in Serverless360 BAM:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdx503j24rp63pe6xnfds.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdx503j24rp63pe6xnfds.png" alt="Image description" width="800" height="358"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logic App-Specific Properties&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We also need some Logic App properties in BAM so that when we escalate to the IT support team, they can easily see which Logic App to check and, if needed, study the technical implementation of the interface in more detail.&lt;/p&gt;

&lt;p&gt;To do this, we add the below two properties to the BAM Global Properties too:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;LOGICAPP-RUN-ID&lt;/li&gt;
&lt;li&gt;LOGICAPP-NAME&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;At this point, we have all the properties we want for tracking our IDocs, and we can begin creating our transaction and Business Process.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BAM Business Process Setup&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;We chose a dedicated business process for our SAP-focused interfaces so they could easily be managed together. In the business process, we had a transaction called IDoc Published to the EAI Platform, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7jcdw6v8gyvoxw9momc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr7jcdw6v8gyvoxw9momc.png" alt="Image description" width="800" height="301"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Within the transaction, we created a simple activity diagram, as shown below, representing a simplified view of the Logic App: it receives, parses, and processes the IDoc, and then publishes it to Service Bus. The Logic App itself might have around 20&amp;ndash;30 actions, but the BAM user will likely be an IT support user, SAP application user, or business user. They are not so interested in the implementation detail of the Logic App; they want to know about the key milestones, which are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The IDOC was received&lt;/li&gt;
&lt;li&gt;It was successfully published to Service Bus&lt;/li&gt;
&lt;li&gt;There was an error&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F50l8dbw5suu4xu7hws8b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F50l8dbw5suu4xu7hws8b.png" alt="Image description" width="800" height="352"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can configure the properties on the IDOC Received from SAP action, allowing us to map the technical properties from the Logic App to the BAM properties. The picture below shows the properties we configured. We extract the first properties at the start of the transaction.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7wkrbn58kuwy766mdzv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ff7wkrbn58kuwy766mdzv.png" alt="Image description" width="800" height="403"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We promote some additional properties from the body of the IDOC after we have parsed it.&lt;/p&gt;

&lt;p&gt;There are two types of properties we are using here:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Message context properties which the Logic App provides at design time&lt;/li&gt;
&lt;li&gt;Message Body properties which we are extracting from the message the Logic App sends to BAM at runtime&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this case, the message header properties were the Logic App name and ID and the SAPActionUri from the SAP connector.&lt;/p&gt;

&lt;p&gt;The message body properties use XPath against the message body. The Logic App sends the IDoc XML to BAM, which allows us to extract data from it to show in BAM.&lt;/p&gt;

&lt;p&gt;Note:&lt;/p&gt;

&lt;p&gt;Sending a message body is optional; if you choose to do it, you also have the option to archive the message body so that the BAM user can access it later if needed. If you do not archive it, it will be discarded when the message body is processed.&lt;/p&gt;

&lt;p&gt;When configuring message properties from the SAP IDOC, you would use an XPath like the one below, extracting the DOCNUM, which is the IDOC number.&lt;/p&gt;

&lt;p&gt;Receive/idocData/EDI_DC40/DOCNUM/text()&lt;/p&gt;
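
&lt;p&gt;For readers who want to try the extraction locally, here is a rough Python equivalent of that XPath against a minimal in-memory IDoc. The Receive/idocData/EDI_DC40 structure mirrors the path above; the values are made up for illustration.&lt;/p&gt;

```python
import xml.etree.ElementTree as ET

# Build a minimal IDoc-shaped document mirroring the path in the article.
receive = ET.Element("Receive")
idoc_data = ET.SubElement(receive, "idocData")
edi_dc40 = ET.SubElement(idoc_data, "EDI_DC40")
ET.SubElement(edi_dc40, "DOCNUM").text = "0000000123456789"
ET.SubElement(edi_dc40, "MESTYP").text = "ORDERS"

# ElementTree's limited XPath omits text(); findtext returns the same value
# that Receive/idocData/EDI_DC40/DOCNUM/text() would select.
docnum = receive.findtext("idocData/EDI_DC40/DOCNUM")
print(docnum)
```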

&lt;p&gt;&lt;strong&gt;Adding BAM to the Logic App&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Most people will probably use the Serverless360 connector JSON with a custom connector in Logic Apps to set up a connector for BAM. There are several other ways to set up your Logic Apps to call BAM, such as via API Management.&lt;/p&gt;

&lt;p&gt;In the Logic App, you will add an action to start your transaction and then further actions for the checkpoints you want to send to BAM.&lt;/p&gt;

&lt;p&gt;Below is the Start Transaction shape I have configured in my Logic App.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fav6ytchq7wa8ayc347e2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fav6ytchq7wa8ayc347e2.png" alt="Image description" width="602" height="452"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It provides information that maps to my BAM transaction and the first step within it. I am also passing the IDOC message as a message body to extract some properties.&lt;/p&gt;

&lt;p&gt;Next, in my Logic App, I would do some work to process the IDoc and send a notification that it has been received, or send the message to Service Bus. I would then add a checkpoint shape for BAM, as shown below. This completes my transaction.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd62xxc4e9rppvgae46hc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd62xxc4e9rppvgae46hc.png" alt="Image description" width="602" height="287"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Runtime Tracking&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;At runtime, I can use the tracking features of BAM to query for the instances of my transactions that have received an IDoc and published it into the integration platform. I can create a search like the one below, which uses the friendly columns from my BAM properties to let me easily see the IDoc type, the IDoc number, and which Logic App processed it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F09zhj9k0am6jsgur2qdk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F09zhj9k0am6jsgur2qdk.png" alt="Image description" width="800" height="408"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One of the most common support queries will be, &amp;ldquo;What happened with IDOC [add idoc number here]?&amp;rdquo;&lt;/p&gt;

&lt;p&gt;With this query, we can check how we received it and see if it was successful. I can also open up the transaction by clicking on it to see what happened, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk6b61ub1el05q989fsfv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk6b61ub1el05q989fsfv.png" alt="Image description" width="480" height="370"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What if there is an issue?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The BAM tracking view gives your BAM users a self-service-friendly view of complex interfaces. If there was an obvious problem and they needed to escalate it, they could use the assign feature shown below, where they can link the transaction to another user in Serverless360. This allows the BAM user to escalate to the integration support person, who can quickly find the transaction and take a deeper look at what is happening.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fryevefg3vqhrb9vi7j8r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fryevefg3vqhrb9vi7j8r.png" alt="Image description" width="480" height="365"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is also an option to mark a transaction as a favorite so you can easily find the one you are working with later if you need to take some time to investigate a problem.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Reprocessing&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Suppose you wanted to allow the BAM users to reprocess a failed message without needing an integration expert to step in. It is possible to configure a reprocessing step for the transaction: you might set up a way for the BAM user to push the IDoc back to the Logic App if it needs to be processed again. There is more information about reprocessing in this article.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.serverless360.com/docs/bam-message-reprocessing" rel="noopener noreferrer"&gt;https://docs.serverless360.com/docs/bam-message-reprocessing&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Performance Dashboards&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In BAM, I can also create dashboards to give me an easy-to-consume view of how my processes run. Below is an example of one of the widgets I use where I can see how many IDOCS we receive over time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3zd2j56i7vo2wp1ibctb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3zd2j56i7vo2wp1ibctb.png" alt="Image description" width="800" height="767"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is interesting to see that we get a few bursts of load in these scenarios.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Downstream Interfaces&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Once we have received our IDOCS and published them to Service Bus for other interfaces, we can also add BAM to those interfaces, and we will be able to track and monitor them with similar features, as shown here.&lt;/p&gt;

&lt;p&gt;One of the key benefits is mapping properties such as the IDoc number wherever it is used in other interfaces, so we can easily trace the flow of an IDoc to other systems within our Business Process.&lt;/p&gt;

</description>
      <category>announcement</category>
      <category>offers</category>
      <category>devto</category>
      <category>web3</category>
    </item>
    <item>
      <title>Using BAM from Azure Synapse Pipelines</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Tue, 07 Feb 2023 04:11:49 +0000</pubDate>
      <link>https://forem.com/karth1k14/using-bam-from-azure-synapse-pipelines-56d</link>
      <guid>https://forem.com/karth1k14/using-bam-from-azure-synapse-pipelines-56d</guid>
      <description>&lt;p&gt;Many organizations are investing heavily in the data space, and Azure Synapse is one of the technologies in the Microsoft stack which is very popular. Within Synapse, Data Flows are the component of Synapse used to orchestrate the movement of data into and out of your Data Lake and for orchestrating jobs within your Data Platform.&lt;/p&gt;

&lt;p&gt;The business depends heavily on your data platform for the movement of data within the organization and for the analytics driven from that data via Synapse. The challenge, however, is the same old problem: your Data Platform is a black box that only your technology experts can manage, so you have a bottleneck when someone wants to know if everything is working, how things are running, and so on.&lt;/p&gt;

&lt;p&gt;With Serverless360, there is an opportunity to leverage the Business Activity Monitoring feature to open up the black box and provide visibility of those critical data flows and orchestrations in an easy-to-digest form, allowing non-experts to see what is going on safely and securely. This visibility will drive confidence in your Data Platform and help it become a key and popular asset to your organization.&lt;/p&gt;

&lt;p&gt;In this post, we wanted to demonstrate how you can use BAM from a Synapse Pipeline to help democratize your Data Platform and give users and non-Synapse experts visibility.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure Synapse Pipeline Data Flow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In this scenario, I have an application database for my rail car management system, which records railcars, what’s in them, and where they are at any time.&lt;/p&gt;

&lt;p&gt;I need to pull this data into Synapse to merge it with data from other systems and analyze it in Power BI.&lt;/p&gt;

&lt;p&gt;The solution will look something like the below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxtxr9cqlraiochuid2e0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxtxr9cqlraiochuid2e0.png" alt="Image description" width="602" height="243"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this scenario, we have some data flow pipelines which pull data into the data lake; Synapse can then query the data, and the analytics are served in Power BI.&lt;/p&gt;

&lt;p&gt;For this post, we will look at the area in the red box.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb3a87tq1j40fjdhtpjb0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fb3a87tq1j40fjdhtpjb0.png" alt="Image description" width="602" height="245"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The data flow pipeline is the thing we want to create visibility for. We will look at how we can implement BAM in the pipeline to track the processing of this crucial business process without being a Synapse expert.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BAM Design Time&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To start with, we need to implement a Business Process and Transaction within Serverless360 BAM, which will be used to represent our data flow.&lt;/p&gt;

&lt;p&gt;When you are looking at using BAM in this context, you could have a BAM process that covers multiple data flows or single ones. BAM is the simplified logical representation of distributed tracing and does not have to match the physical implementation precisely. BAM is about creating a friendly and easily understandable view.&lt;/p&gt;

&lt;p&gt;I assume you are already slightly familiar with building a business process and transaction for BAM. Please refer to this page for more info on Azure APIM as a proxy for Serverless360 BAM.&lt;/p&gt;

&lt;p&gt;In the below picture, you can see I have added a business process for my Synapse demo, and I have added a transaction called “Copy Railcar Data to Synapse.”&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsj320vb52soy7c40c5ma.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsj320vb52soy7c40c5ma.png" alt="Image description" width="602" height="246"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the design for your BAM transaction, you can add shapes to the designer to represent the high-level milestones within the complex data flow. You can see this below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3uirfs48smnnzn39o5k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fz3uirfs48smnnzn39o5k.png" alt="Image description" width="602" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this case, we will have a Pipeline Started milestone, then begin the Importing Railcar Data milestone, which may run for a few minutes, and then we will complete it and the transaction.&lt;/p&gt;

&lt;p&gt;It only takes a few minutes to set up, and then you have your BAM transaction ready.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implement BAM in Synapse Pipeline&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, we need to go to Synapse and modify our pipeline to include BAM. In this case, we will start with a basic copy data pipeline generated by the Copy Data Tool. The pipeline looks like the picture below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsttgmxgslivm9sonzb28.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsttgmxgslivm9sonzb28.png" alt="Image description" width="602" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When it runs, it will execute a ForEach and copy three tables into my data lake.&lt;/p&gt;

&lt;p&gt;If I want to add BAM to my data flow pipeline, we can use the Web Activity to make an HTTP call to BAM, and we will end up with some shapes like the below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0wf4za42jsytj4bdy3hf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0wf4za42jsytj4bdy3hf.png" alt="Image description" width="602" height="136"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Calling BAM API&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you want to call the BAM API from your data flow, it’s as easy as an HTTP call. You will need to supply the URL, some headers, a security key, and a body that maps to your BAM business process, transaction, and milestone stage.&lt;/p&gt;

&lt;p&gt;You can call the BAM API directly and get the settings from your BAM configuration settings, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2mwv6bqffua3krqaqfys.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2mwv6bqffua3krqaqfys.png" alt="Image description" width="471" height="327"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The alternative, and the way I&amp;rsquo;d expect many customers to implement this in the real world, would be to connect their internal Azure API Management to the BAM API. They can then share access to the API with multiple teams within their organization while centrally managing access control and usage.&lt;/p&gt;

&lt;p&gt;It would look like the below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3ct5f193twjbzro8b2u3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3ct5f193twjbzro8b2u3.png" alt="Image description" width="602" height="326"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Regardless of which way you implement the pattern, from the perspective of the Data Flow it&amp;rsquo;s just a different URL and API key. We discussed this APIM proxy approach in a previous article, Azure APIM as a proxy for Serverless360 BAM.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pipeline Variables&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If we come back to the pipeline, we start by adding some variables (or parameters) for the pipeline.&lt;/p&gt;

&lt;p&gt;I created three variables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;bamUrl will be the URL for the BAM API&lt;/li&gt;
&lt;li&gt;bamKey will be the API key for my BAM API&lt;/li&gt;
&lt;li&gt;bamTransactionInstanceId will hold the GUID returned by BAM when I start the transaction&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh0ls6voihy94o1k4vxnc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh0ls6voihy94o1k4vxnc.png" alt="Image description" width="602" height="265"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Start Transaction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the pipeline, the first activity I added is a web activity that will call the BAM API to start a transaction. When this activity executes, it will point to the Pipeline Began milestone within the BAM diagram and mark it green to show where we are in the process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8gcaxmdqwthoumujk720.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8gcaxmdqwthoumujk720.png" alt="Image description" width="602" height="242"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I will pass in the properties shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq21c4k73iktvtdfx8t7h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq21c4k73iktvtdfx8t7h.png" alt="Image description" width="602" height="454"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b3c7ztjkcvmhn5cs3ba.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0b3c7ztjkcvmhn5cs3ba.png" alt="Image description" width="753" height="379"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foamoye9qddnasc4v1qh1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foamoye9qddnasc4v1qh1.png" alt="Image description" width="759" height="700"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpc0uo9i0m0uyb36uztna.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpc0uo9i0m0uyb36uztna.png" alt="Image description" width="747" height="445"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Body&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The body for the request needs to, at a minimum, be the JSON value below. It’s a JSON object with a message body and message header section.&lt;/p&gt;

&lt;p&gt;{&lt;br&gt;
    "MessageBody": {},&lt;br&gt;
    "MessageHeader": {&lt;br&gt;
        }&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;You can populate the headers with a key-value pair of properties BAM can capture and promote to be searchable. You might consider passing things like the pipeline name, id, or custom values.&lt;/p&gt;

&lt;p&gt;When the pipeline runs, this activity will start the transaction in BAM.&lt;/p&gt;
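
&lt;p&gt;To make the shape of the call concrete, here is a small Python sketch of how a client might assemble the Start Transaction request. The route and header names here are illustrative assumptions, not the documented BAM API; take the real values from your BAM configuration settings as shown earlier.&lt;/p&gt;

```python
import json

def build_start_transaction_request(bam_url, bam_key, pipeline_name, run_id):
    """Assemble the HTTP pieces for a hypothetical Start Transaction call.

    The /starttransaction route and x-api-key header are assumptions for
    illustration; the body shape matches the minimum JSON shown above.
    """
    url = bam_url.rstrip("/") + "/starttransaction"   # hypothetical route
    headers = {
        "Content-Type": "application/json",
        "x-api-key": bam_key,                          # hypothetical header name
    }
    body = {
        "MessageBody": {},
        "MessageHeader": {
            # Key/value pairs BAM can capture and promote to be searchable.
            "PipelineName": pipeline_name,
            "PipelineRunId": run_id,
        },
    }
    return url, headers, json.dumps(body)

url, headers, payload = build_start_transaction_request(
    "https://bam.example.org/api", "my-bam-key", "CopyRailcarData", "run-001")
print(url)
```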

&lt;p&gt;&lt;strong&gt;BAM Transaction Instance Id&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The next activity in the pipeline is a Set Variable activity. It takes the response from the Start Transaction web activity and extracts the TransactionInstanceId property from the JSON response, which we will use when we call other BAM milestones to link them all together.&lt;/p&gt;

&lt;p&gt;I created the bamTransactionInstanceId variable at the pipeline level earlier, and we will point this set variable activity to that variable and use the following expression.&lt;/p&gt;

&lt;p&gt;@activity('BAM - Start Transaction').output['TransactionInstanceId']&lt;/p&gt;
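
&lt;p&gt;The extraction that expression performs is equivalent to a simple JSON lookup. Here is a sketch against a simulated response; the only field shown is the TransactionInstanceId the article describes, and any other fields in the real response are omitted.&lt;/p&gt;

```python
import json
import uuid

# Simulated Start Transaction response carrying a TransactionInstanceId GUID.
response_text = json.dumps({"TransactionInstanceId": str(uuid.uuid4())})

# Python equivalent of the pipeline expression
# @activity('BAM - Start Transaction').output['TransactionInstanceId']
bam_transaction_instance_id = json.loads(response_text)["TransactionInstanceId"]
print(bam_transaction_instance_id)
```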

&lt;p&gt;&lt;strong&gt;Import Started CheckPoint&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, we are about to start the long-running import process. At this point, I have decided to tell BAM that I am at a checkpoint I have called “Importing Railcar Data.” I will tell BAM that this checkpoint has started, but I expect it to take a little while, so I’ll come back later and mark it as complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjzmmvwzoke6jk1iq2qq0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjzmmvwzoke6jk1iq2qq0.png" alt="Image description" width="602" height="205"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To set this checkpoint, we use another web activity in the pipeline. In my demo it’s pretty simple, but in the real world you might have some work happening in the pipeline before you get to this checkpoint; maybe you look up some metadata or similar.&lt;/p&gt;

&lt;p&gt;In the pipeline, I am now at this point.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F49gneqbznhz5vkf29ggv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F49gneqbznhz5vkf29ggv.png" alt="Image description" width="602" height="136"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This shape is similar to the previous web activity that started the transaction. The differences are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The URL is not the same&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The route is to the CheckPoint operation rather than Start Transaction, so I use the expression below:&lt;/p&gt;

&lt;p&gt;&lt;a class="mentioned-user" href="https://dev.to/concat"&gt;@concat&lt;/a&gt;(variables('bamUrl'),'/api/CheckPoint')&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The headers are different&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I do not need to supply the SL360-BusinessProcess or SL360-Transaction headers. Instead, I pass the SL360-TransactionInstanceId header, using the GUID we retrieved from the Start Transaction response.&lt;/p&gt;

&lt;p&gt;I am also setting In Progress as the value for the SL360-StageStatus header.&lt;/p&gt;
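&lt;p&gt;Putting those differences together, the checkpoint headers can be sketched as below. The exact header set is an assumption pieced together from the headers named in this article (SL360-TransactionInstanceId, SL360-StageStatus, and the SL360-Stage header that matches a stage’s Tracking Name); check your BAM endpoint for the definitive list.&lt;/p&gt;

```python
def build_checkpoint_headers(transaction_instance_id, stage,
                             stage_status="In Progress"):
    """Headers for a BAM CheckPoint call.

    Unlike Start Transaction, no SL360-BusinessProcess or SL360-Transaction
    headers are needed; the instance id ties the checkpoint to the running
    transaction. SL360-Stage should match the Tracking Name configured for
    the stage in the BAM designer.
    """
    return {
        "Content-Type": "application/json",
        "SL360-TransactionInstanceId": transaction_instance_id,
        "SL360-Stage": stage,
        "SL360-StageStatus": stage_status,
    }


headers = build_checkpoint_headers(
    "0f8fad5b-d9cb-469f-a165-70867728950e",  # from the Start Transaction response
    "Importing Railcar Data",
)
```

The later success/failure activities would reuse the same shape with a stage status of Success or Failure.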

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftohavfvy4duv7ypxwvzs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftohavfvy4duv7ypxwvzs.png" alt="Image description" width="573" height="272"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The body can be different if you want&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In my case, I’m just using the same body as earlier, but you can send a different payload if you want to tell BAM about some new or changed metadata.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Simulating Some Work Happening&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next in the pipeline, I call the ForEach activity, which iterates over the tables configured for the import and copies them from my SQL database to the Data Lake.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpfvkqzmo71g19wb450vs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpfvkqzmo71g19wb450vs.png" alt="Image description" width="602" height="137"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Completing the Transaction&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The final stage in the pipeline tells BAM that the transaction is completed. It marks the “Importing Railcar Data” stage as either successful or failed. Because the earlier checkpoint set this stage to In Progress, it will show as orange in the BAM runtime view until it is updated to Success or Failure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm8ovwm603f1l8gew4rub.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm8ovwm603f1l8gew4rub.png" alt="Image description" width="602" height="165"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the pipeline, I have set up two web activities to update BAM. Using the output lines from the ForEach, one executes on success and the other on failure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdt5snhqhcxyomroopg0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkdt5snhqhcxyomroopg0.png" alt="Image description" width="602" height="143"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The only difference between these two shapes is that one sets the SL360-StageStatus header to Success and the other sets it to Failure.&lt;/p&gt;

&lt;p&gt;Both of them use the same URL as the previous step to do a checkpoint with the below expression:&lt;/p&gt;

&lt;p&gt;&lt;a class="mentioned-user" href="https://dev.to/concat"&gt;@concat&lt;/a&gt;(variables('bamUrl'),'/api/CheckPoint')&lt;/p&gt;

&lt;p&gt;The header properties from the success activity are shown below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxaa3zmadj2k2olhdid0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxaa3zmadj2k2olhdid0.png" alt="Image description" width="602" height="315"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Running the Data Flow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I have set up my pipeline to run on a recurring schedule at 8 am each day, like a typical scheduled import to the data lake. If we go to Synapse, we can see in the demo that the pipeline run completed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstqm950cmyjpi0lczidb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fstqm950cmyjpi0lczidb.png" alt="Image description" width="602" height="234"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the pipeline history, the visual shows us the pipeline execution.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp0eoclq4cpzxv5kiaz8o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp0eoclq4cpzxv5kiaz8o.png" alt="Image description" width="602" height="171"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Returning to the core problem: you need to be a Synapse expert to see this view and to interpret what is happening. This is where BAM comes into the solution, providing a simplified, user-friendly view of a complex implementation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BAM Runtime Tracking&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the BAM runtime view, I can now see the tracking data each time my pipeline runs. Below you can see the runs for the last few days.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjljx2j0vduqbzibqe1s6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjljx2j0vduqbzibqe1s6.png" alt="Image description" width="602" height="144"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I can quickly tell they were successful, and any that failed would appear here as red on the left-hand side.&lt;/p&gt;

&lt;p&gt;If I had sent any custom MessageHeaders from any web activities, I would have the ability to promote that data as searchable columns in the BAM view. An example might be that you promote a specific application property or similar. You can then search for transactions that match that value.&lt;/p&gt;

&lt;p&gt;If I click on one of the transactions in the grid, I can open it up, and it will show me the process diagram, and I can see from the key which stages were successful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa29qdjq28pcil8631mkb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa29qdjq28pcil8631mkb.png" alt="Image description" width="602" height="525"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If we had sent any metadata or body with the checkpoint actions, we can click on the stages in the BAM diagram and see them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Other BAM Features&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;There are several other great BAM features that you can now open up to potential users, such as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Action Required helps you keep on top of issues when they happen and ensure you have sorted them out&lt;/li&gt;
&lt;li&gt;Reprocessing gives the non-expert a way to replay an action&lt;/li&gt;
&lt;li&gt;Dashboards give you a holistic view of performance and processing across your BAM process&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can find out more about our BAM features in our documentation at this link:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://docs.serverless360.com/docs/what-is-business-activity-monitoring" rel="noopener noreferrer"&gt;https://docs.serverless360.com/docs/what-is-business-activity-monitoring&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I hope this article has shown how you can use BAM to achieve distributed tracing and improve the visibility and manageability of the data platform processes you implement with Synapse data flows.&lt;/p&gt;

&lt;p&gt;We aim to help lower the total cost of ownership through the democratization of support for your solution and improve your users’ experience and confidence in the solution, which will encourage further investment in your platform.&lt;/p&gt;

</description>
      <category>announcement</category>
      <category>devto</category>
      <category>web3</category>
      <category>offers</category>
    </item>
    <item>
      <title>Robotic Process Automation better monitored with Serverless360 BAM</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Mon, 06 Feb 2023 18:06:41 +0000</pubDate>
      <link>https://forem.com/karth1k14/robotic-process-automation-better-monitored-with-serverless360-bam-38l0</link>
      <guid>https://forem.com/karth1k14/robotic-process-automation-better-monitored-with-serverless360-bam-38l0</guid>
      <description>&lt;p&gt;The critical use case for BAM is to provide a simplified business-friendly view of the integrated processes that are key to your business. In business today, Robotic Process Automation (RPA) is a trending technology. While it has been around for quite a long time, it has gained a significant boost in popularity alongside the popularity of citizen developer and maker use cases. Microsoft Power Automate now includes modules in the RPA space, which has made the technology accessible to many users and scenarios through its association with the Microsoft Power Platform and Office 365.&lt;/p&gt;

&lt;p&gt;RPA lets you try out ideas in automation, which you may later rebuild as a fully-fledged automated solution, or you may keep the RPA solution if it simply works for your needs.&lt;/p&gt;

&lt;p&gt;Given the opportunities RPA offers, customers will want to fit RPA use cases into the same business activity monitoring (BAM) approaches they currently use for monitoring other critical business processes. With Serverless360, we wanted to explore how customers might use our BAM product to monitor RPA processes so they can easily track progress, monitor, and troubleshoot.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scenario&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Consider an example scenario: a sales user meets a customer to discuss a deal, and once the deal is concluded, they click a flow button on their mobile to trigger a cloud flow.&lt;/p&gt;

&lt;p&gt;The process for this example would be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The user captures some data on mobile&lt;/li&gt;
&lt;li&gt;The user clicks a button on mobile to submit data&lt;/li&gt;
&lt;li&gt;Power Automate cloud flow runs and queries additional information from SAP&lt;/li&gt;
&lt;li&gt;Cloud flow then runs a desktop flow to integrate with the legacy desktop application and process an order&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The below diagram illustrates what this might look like.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqibs95exprqwokojpe0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Foqibs95exprqwokojpe0.png" alt="Image description" width="602" height="249"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This automated process will execute across several places, and we want a consolidated view of what is happening. We want to see what’s happening in a way that is easy to digest so that the support and business user does not need to be an expert to understand it.&lt;/p&gt;

&lt;p&gt;By using Serverless360’s distributed tracing capability, BAM, we can send events about significant milestones. We will be able to send BAM events from the Power Automate Cloud flow and the Power Automate Desktop RPA flow, as shown in the diagram below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxoevkta0plnnyqoubr00.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxoevkta0plnnyqoubr00.png" alt="Image description" width="602" height="319"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BAM Design Time&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I described how a real cloud and RPA combined solution might work in the scenario above. In a simplified demo to walk through how you can use BAM, we will design a business transaction that looks like the following diagram.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqv3iit09yz4ahx11d3p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpqv3iit09yz4ahx11d3p.png" alt="Image description" width="602" height="258"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here we have four steps. The “Cloud Flow Triggered” and “Cloud Flow Complete” steps will execute in the Power Automate Cloud Flow. The middle two steps will run in our RPA Power Automate Desktop Flow.&lt;/p&gt;

&lt;p&gt;In each stage, you can configure the properties as shown in the image below; the critical one is the Tracking Name, which maps to one of the fields of information you will pass from the Flow.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ae3rdit44ru12g98o60.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5ae3rdit44ru12g98o60.png" alt="Image description" width="463" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you would like to take a step back for a moment and learn more about how to build the BAM transaction at design time, there is more information on this page: &lt;a href="https://docs.serverless360.com/docs/bam-end-to-end-tracking" rel="noopener noreferrer"&gt;https://docs.serverless360.com/docs/bam-end-to-end-tracking&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implement BAM in Cloud Flow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In the cloud flow, we will use the manual trigger so that a user can start the Flow from a flow button that accepts some user input, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqbz0u3w3s8bsrxj3wvx8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqbz0u3w3s8bsrxj3wvx8.png" alt="Image description" width="482" height="424"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The cloud flow will then use the Serverless360 Start Transaction, telling BAM that we have started a transaction. The Flow will then call Power Automate Desktop to run a flow on a machine, and then it will use the BAM CheckPoint shape to indicate the Flow is complete.&lt;/p&gt;

&lt;p&gt;The below shape shows calling the Start Transaction. We pass in some information about the order that triggered the process, so that we can see this in BAM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbmd5vuzuwcl54y5n0wk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwbmd5vuzuwcl54y5n0wk.png" alt="Image description" width="492" height="326"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The below shape shows calling the RPA process. The output from the BAM Start Transaction is a transaction instance id that we will pass down to the RPA flow, which will use this id to correlate BAM events it sends to the same transaction.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1x3a81gin5ibum5xc5sr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1x3a81gin5ibum5xc5sr.png" alt="Image description" width="602" height="294"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Finally, when the desktop flow is complete, we will use another CheckPoint shape to indicate the entire process is complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpoff35zsffhkzc1xepmo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpoff35zsffhkzc1xepmo.png" alt="Image description" width="602" height="239"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Implement BAM in Desktop Flow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In Power Automate Desktop (PAD), I can now create a flow that runs on a machine, simulating a user performing actions on the desktop.&lt;/p&gt;

&lt;p&gt;In PAD, there is the option to use the Invoke Web Service action, allowing you to make an HTTP call from your desktop flow.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Design Decision&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One of the design decisions you need to make here is how you will allow people to consume your BAM API. Personally, I would prefer RPA use cases to call out to an API on my Azure API Management instance, which sits in front of the BAM API. This is how most enterprise customers would likely do it, because it gives you rich API Management features for monitoring an API and managing security for multiple users, which will help you support many different RPA projects.&lt;/p&gt;

&lt;p&gt;If you decide to use API Management as a proxy to your BAM API, the only differences are the hostname for your API and the key and header you supply. In this case, I will use APIM as a proxy, as discussed in one of our previous articles.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;View of RPA Desktop Flow&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The desktop flow below is a brief demo. I set a variable holding the API key for calling my web service, then invoke a web service: the BAM CheckPoint, which we also used above from the cloud flow.&lt;/p&gt;

&lt;p&gt;I will then wait, simulating performing some actions to automate the desktop.&lt;/p&gt;

&lt;p&gt;I will then execute another checkpoint with the invoke web service action.&lt;/p&gt;

&lt;p&gt;The below picture shows the desktop flow.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiwycou6a5gh2u71em264.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fiwycou6a5gh2u71em264.png" alt="Image description" width="602" height="204"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first step in my Flow sets the key for my API calls to BAM. I can either store this as a variable in my desktop flow or pass it in as a variable from the cloud flow.&lt;/p&gt;

&lt;p&gt;If we look at the individual Invoke Web Service action, you can see that it does an HTTP POST to API Management, which then forwards the call to the BAM API. (Remember that you can call BAM directly if you don’t want to use APIM.)&lt;/p&gt;

&lt;p&gt;The key bits of information to supply are in the custom headers and body. Note that the SL360-TransactionInstanceId header is set from the variable we passed in from the cloud flow.&lt;/p&gt;

&lt;p&gt;The SL360-Stage will match the value of the tracking name property from the stage in the BAM designer.&lt;/p&gt;

&lt;p&gt;The below picture shows the various settings for calling BAM.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc7knyd8z7j3nc10lqfb5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc7knyd8z7j3nc10lqfb5.png" alt="Image description" width="493" height="621"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There is one KEY point you need to remember here. When using this action, you need to turn off the encode request body property on the advanced settings for the Invoke Web Service. If you don’t turn this off, the JSON body will be encoded, and you will get an error. The below picture shows this setting.&lt;/p&gt;
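&lt;p&gt;To see why the encoded body fails, compare a raw JSON payload with a percent-encoded copy of it (used here as a rough stand-in for what the encode option produces): the encoded string is no longer valid JSON, so the BAM API cannot parse it. A quick illustration in Python, with a made-up sample order id:&lt;/p&gt;

```python
import json
import urllib.parse

# A body like the one sent to the BAM CheckPoint; the order id is a sample.
raw_body = '{"messageBody": {}, "messageHeader": {"OrderId": "1234"}}'

# With the encode option left on, the payload effectively arrives encoded...
encoded_body = urllib.parse.quote(raw_body)

# ...and is no longer parseable as JSON.
json.loads(raw_body)  # parses fine
try:
    json.loads(encoded_body)
    body_still_valid = True
except json.JSONDecodeError:
    body_still_valid = False
```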

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7rh9ncyxh29f6q6yiw2p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7rh9ncyxh29f6q6yiw2p.png" alt="Image description" width="470" height="253"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the request body, I provided the below JSON.&lt;/p&gt;

&lt;p&gt;{&lt;br&gt;
    "messageBody":{},&lt;br&gt;
    "messageHeader":{&lt;br&gt;
        "OrderId":"%orderId%"&lt;br&gt;
    }&lt;br&gt;
}&lt;/p&gt;

&lt;p&gt;It allows me to pass a message body and a set of headers to BAM. In this case, I’m passing the order id. I could also pass others; something like the machine name from the unattended agent might make sense.&lt;/p&gt;
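&lt;p&gt;Before the request is sent, PAD substitutes the %orderId% variable into that body. The sketch below mimics that substitution in Python; real PAD variable resolution is richer than plain string replacement, and the sample order id is made up:&lt;/p&gt;

```python
import json

# The body template from the Invoke Web Service action, PAD %variable% syntax intact.
body_template = """{
    "messageBody": {},
    "messageHeader": {
        "OrderId": "%orderId%"
    }
}"""


def render_pad_body(template, variables):
    """Mimic PAD's %variable% substitution (simplified: real PAD
    expressions can do more than plain name replacement)."""
    for name, value in variables.items():
        template = template.replace(f"%{name}%", str(value))
    return template


body = render_pad_body(body_template, {"orderId": "ORD-1001"})  # sample id
payload = json.loads(body)  # the rendered body is valid JSON for BAM
```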

&lt;p&gt;&lt;strong&gt;Running the Scenario&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When I save and deploy my BAM Desktop flow, I should now have my cloud and desktop flow available. I will see a flow button on my mobile that can run my process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcgqla1461wd74h236hlt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcgqla1461wd74h236hlt.png" alt="Image description" width="447" height="558"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When I click this, it will open a form created by the Flow button to capture some data, and then I can click to start my Flow, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjdmes1ntl2xz9oy7ha6o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjdmes1ntl2xz9oy7ha6o.png" alt="Image description" width="378" height="472"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If I want to see what happened to the execution, as a Power Automate user who knows what they are doing, I could go into Power Automate and look at the run history and see how the Flow executed, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqp9jg8rbiw5rf5n3ub37.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqp9jg8rbiw5rf5n3ub37.png" alt="Image description" width="800" height="548"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While this is a pretty good experience, what happens when you move this to the real world and scale it up? What if you have many users triggering processes and need to be able to let your support team, who aren’t Power Automate experts, provide that triage to support your solution to deal with everyday issues? Here is where BAM can help you.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BAM Runtime Tracking&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If you are a super user or support user, you can access BAM and see what is happening. Because we sent events from our Cloud Flow and Desktop Flow to BAM, we get visibility of the transactions being processed.&lt;/p&gt;

&lt;p&gt;Below is a view from BAM tracking where some transactions are in flight.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8z4s2ijcwev7mnertkvk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8z4s2ijcwev7mnertkvk.png" alt="Image description" width="800" height="257"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;By promoting metadata from the flows, we can create excellent views that make it easy to see what’s happening. We can also search within these views, for example for an instance related to a specific order id.&lt;/p&gt;

&lt;p&gt;If you want to check what’s happening to a specific process instance, you can click on it in the above tracking view. It will open up an activity diagram, as shown below, representing the design time diagram you drew earlier in the article.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5lk0zk654n01da7hmxrw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5lk0zk654n01da7hmxrw.png" alt="Image description" width="602" height="618"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here, each shape is rendered based on the runtime data to indicate whether it has executed and what the result was.&lt;/p&gt;

&lt;p&gt;When the transactions finish executing, you can see in the view below that both are finished.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe4p8696lhbm6putl53a0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe4p8696lhbm6putl53a0.png" alt="Image description" width="800" height="262"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Next up, you can click the transaction again and see that all shapes are complete.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx4aso0h2pje0gvyp1w87.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx4aso0h2pje0gvyp1w87.png" alt="Image description" width="800" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This tracking view should make it a lot easier to troubleshoot your Power Automate Desktop RPA processes. &lt;/p&gt;

&lt;p&gt;Some of the other things you might choose to do when linking Power Automate Desktop or Cloud flows to BAM include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Send the run id from the Flow to BAM, so it’s easy to work out which Flow ran which process&lt;/li&gt;
&lt;li&gt;Send messages or responses to BAM so you can make them more accessible for the support user to see&lt;/li&gt;
&lt;li&gt;Promote additional metadata properties so you can search on them&lt;/li&gt;
&lt;/ul&gt;
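&lt;p&gt;The ideas above can be sketched in code. Below is a minimal, hypothetical Python sketch of building a checkpoint payload that promotes searchable metadata and carries the Flow run id; the property names, the transactionInstanceId field, and the overall shape are illustrative assumptions, not the documented BAM API contract.&lt;/p&gt;

```python
import json
import uuid

# Hypothetical sketch: field names and payload shape are assumptions for
# illustration, not the documented Serverless360 BAM API contract.
def build_checkpoint_payload(flow_run_id, order_id, status, message=None):
    """Build a BAM checkpoint body that promotes searchable metadata."""
    payload = {
        # Correlate later checkpoints to the same transaction instance
        "transactionInstanceId": str(uuid.uuid4()),
        "status": status,
        "properties": {
            # Promoted properties make the tracking views searchable
            "FlowRunId": flow_run_id,  # ties the BAM record back to the Flow run
            "OrderId": order_id,
        },
    }
    if message is not None:
        # Archive the message so support users can inspect it in BAM
        payload["messageBody"] = message
    return payload

payload = build_checkpoint_payload("run-123", "ORD-1042", "Success",
                                   message=json.dumps({"total": 99.5}))
```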

&lt;p&gt;&lt;strong&gt;BAM Dashboards&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;BAM has a dashboard feature that you might find very useful. Once your RPA processes are up and running and you are managing them with BAM, you might be interested to see some analysis of the success and performance of your processes.&lt;/p&gt;

&lt;p&gt;With BAM, you will get a generated dashboard for some metrics about your process. You can also add customized widgets based on specific properties or criteria that will provide a rich view of what’s happening.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ju7ivrw4bj8s768j6lo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2ju7ivrw4bj8s768j6lo.png" alt="Image description" width="602" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;BAM Action Required&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In BAM, we also have a feature called Action Required. It aims to help you manage any issues with your processes. If a transaction in BAM is recorded as a failure, it will appear in the Action Required list.&lt;/p&gt;

&lt;p&gt;You can then either ignore it, providing a message if you have handled it outside of Serverless360, or configure a reprocessing step that your Serverless360 BAM user can use to reprocess a message. This lets you give your support team a way to reprocess a message if needed, marking the transaction instance as having been reprocessed.&lt;/p&gt;

&lt;p&gt;Action Required is about helping you keep things healthy by knowing what issues haven’t been dealt with yet.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Conclusion&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;I hope this article shows how easy it is to add BAM to an RPA solution. As customers become more invested in the power of RPA solutions, we believe that plugging them into the modules you use simplifies the management and operations of your integrated solutions. It will help customers provide excellent support services to the business functions that depend on IT support for the RPA processes they develop.&lt;/p&gt;

</description>
      <category>discuss</category>
      <category>writing</category>
      <category>blogging</category>
    </item>
    <item>
      <title>Optimizing BAM Performance with API Management</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Mon, 06 Feb 2023 17:45:46 +0000</pubDate>
      <link>https://forem.com/karth1k14/optimizing-bam-performance-with-api-management-54fb</link>
      <guid>https://forem.com/karth1k14/optimizing-bam-performance-with-api-management-54fb</guid>
      <description>&lt;p&gt;In a previous article, I discussed how you could use API Management as a proxy for the Serverless360 BAM API. In the real world, I have used this approach a few times because I prefer it to the custom connectors that you can use with Logic Apps which are sometimes painful to use from a DevOps perspective and have known performance limitations that can affect the latency and throughput for connections Logic Apps make to Cloud Services.&lt;/p&gt;

&lt;p&gt;Using APIM as a proxy overcomes those bottlenecks and gives me several other benefits, such as key management, which we have discussed previously.&lt;/p&gt;

&lt;p&gt;A few customers have asked whether they can minimize, as much as possible, the latency impact in their Logic Apps when they want to use BAM. The latency involved in the BAM API comes from the fact that when your Logic App (or another component) pushes a message to the BAM API, we save the message to a queue on Service Bus behind the scenes, so that we can process it out of process and do the work of matching it to a transaction, etc. The BAM API therefore carries the latency of the call from your Logic App to the API plus the time to persist the message to the queue.&lt;/p&gt;

&lt;p&gt;One of the things that some customers have done is to use either the HTTP action or the APIM action in the Logic App, which takes the custom connector out of the call. This is the first performance optimization you can do if you want to tune your scenario to reduce latency.&lt;/p&gt;

&lt;p&gt;If you want to tune things to get even lower latency, an approach I have used is to take advantage of the send-one-way-request policy in API Management. This allows me to send a normal request from my Logic App to BAM via API Management, but inside the policy I use send-one-way-request to forward the message to BAM and then return a response. This means that APIM will respond very quickly to your Logic App so it can continue processing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What does this look like?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;If we consider a sequence diagram showing the typical flow of a Logic App consuming the BAM API, it will look like the below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KTGbbOiZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3h8cp3dyen8tixz3mk3y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KTGbbOiZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3h8cp3dyen8tixz3mk3y.png" alt="Image description" width="482" height="515"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this case, each time the Logic App calls BAM it waits until the message is persisted for processing and then continues.&lt;/p&gt;

&lt;p&gt;By putting APIM between the Logic App and BAM, we can use a sync-to-async pattern: the call from the Logic App will return quickly so the Logic App can continue processing, and the call to the BAM API will then be made in the background, out of process, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jzKg73Xy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2pkt2emltbz9wej7mwxa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jzKg73Xy--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2pkt2emltbz9wej7mwxa.png" alt="Image description" width="602" height="592"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Design Decision&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When you choose to use this sync to async process you do need to be aware that you will be making a trade-off. When you use the API directly you can be sure that the message got persisted and will appear in BAM before you continue processing the Logic App. By using the sync to async approach you accept that if there is an error between APIM and the BAM API then your Logic App will not be aware of this error.&lt;/p&gt;

&lt;p&gt;In BAM we chose Service Bus for its reliability, guaranteed at-least-once processing of messages, and advanced dead-letter and error support. But if you have a scenario where you are willing to accept the risk of losing a BAM event in an error condition, this sync-to-async pattern will remove the latency of the BAM API and the Logic App connector from your Logic App’s processing duration.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Approach&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With this pattern, we will import the API definition for BAM into API Management as we did in the previous article; however, this time we will clone the StartTransaction and CheckPoint operations and create two additional ones called StartTransaction-Async and CheckPoint-Async. This will allow us to use the out-of-the-box approach for the cases where we want to be sure BAM has the message before continuing, and to use the async ones where we are OK to risk potential message loss in exchange for reduced latency.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Walk Through&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;At the start of the walkthrough, I am assuming you have imported your BAM API definition into API Management as demonstrated in the previous article about using APIM as a proxy to BAM.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Add Sync to Async API Operations&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Next, we will add the cloned operations. You can use the Azure Portal to clone an operation by right-clicking on it as shown in the picture below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--efFX4gtI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k0ofwmmcznncd2dh0dp8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--efFX4gtI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/k0ofwmmcznncd2dh0dp8.png" alt="Image description" width="397" height="312"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will then go and modify the name and path for the operation in the operation settings.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eDYXju7q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ur7yaytfp2i5adiloyv6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eDYXju7q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ur7yaytfp2i5adiloyv6.png" alt="Image description" width="453" height="390"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will want to modify the name that Azure sets by default when cloning the operation and the URL path. I chose to just add a suffix of -Async on them.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cD47hXQH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y57qsm888plhwf3hn7us.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cD47hXQH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y57qsm888plhwf3hn7us.png" alt="Image description" width="480" height="339"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You will need to repeat these steps, so you have set up both StartTransaction and CheckPoint with async versions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Set Policy&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For each operation, you will need to modify its inbound policy to do a send-one-way-request and then return a response. This is shown in the samples below. Note that I set the BAM URL and API key as named values, so they are easy to access in the policy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Start Transaction Async&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4uKDh4Qg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2wd6e0uvozkujdlqciws.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4uKDh4Qg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2wd6e0uvozkujdlqciws.png" alt="Image description" width="775" height="631"&gt;&lt;/a&gt;&lt;/p&gt;
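&lt;p&gt;For reference, a minimal sketch of the shape of such an inbound policy is below. The named values bam-url and bam-api-key, the header name, and the operation path are assumptions for illustration; adjust them to match your imported BAM API definition.&lt;/p&gt;

```xml
<inbound>
    <base />
    <!-- Fire-and-forget: forward the request to BAM without waiting -->
    <send-one-way-request>
        <set-url>{{bam-url}}/api/StartTransaction</set-url>
        <set-method>POST</set-method>
        <set-header name="x-api-key" exists-action="override">
            <value>{{bam-api-key}}</value>
        </set-header>
        <set-body>@(context.Request.Body.As<string>(preserveContent: true))</set-body>
    </send-one-way-request>
    <!-- Respond immediately so the Logic App can continue processing -->
    <return-response>
        <set-status code="202" reason="Accepted" />
    </return-response>
</inbound>
```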

&lt;p&gt;&lt;strong&gt;CheckPoint Async&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UyHXAcbS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xfkus0t6mtyiv1dpflgc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UyHXAcbS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xfkus0t6mtyiv1dpflgc.png" alt="Image description" width="769" height="610"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logic App Setup&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In my Logic App, if I access API Management and choose my BAM API, you will see there are additional operations available, so I can use the async versions of the operations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sSyyvc9h--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mat5fnkbzi4rxzg8z9vm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sSyyvc9h--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/mat5fnkbzi4rxzg8z9vm.png" alt="Image description" width="602" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When using these operations, the one key difference you need to be aware of is that we have added a new property on StartTransaction so you can supply your own guid when starting a transaction, rather than having to wait for the one the BAM API gives you back, as shown below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--5zPR5XxK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vnip1zk98h9ja2x273kz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--5zPR5XxK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vnip1zk98h9ja2x273kz.png" alt="Image description" width="602" height="407"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this case, I just create a string variable at the start of my Logic App using the guid() expression and then reference it when calling BAM.&lt;/p&gt;
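&lt;p&gt;In the underlying workflow definition, that variable initialization looks roughly like the fragment below; the action and variable names are illustrative assumptions.&lt;/p&gt;

```json
{
    "Initialize_TransactionId": {
        "type": "InitializeVariable",
        "inputs": {
            "variables": [
                {
                    "name": "TransactionId",
                    "type": "string",
                    "value": "@{guid()}"
                }
            ]
        },
        "runAfter": {}
    }
}
```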

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--U7yLvgo6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ki73iuv4y1957mvgcews.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U7yLvgo6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ki73iuv4y1957mvgcews.png" alt="Image description" width="602" height="177"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You are now ready to use your BAM actions from APIM in your Logic App, and you will find that your Logic App takes less time to run using the sync-to-async pattern.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Comparison&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To do a basic comparison of the time the Logic App takes for the different approaches, I used three similar Logic Apps, each performing two BAM actions and returning a response, as shown in the picture below. The comparison follows.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lQ3Rrixp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/22l4kcz7r85y4g42jm6e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lQ3Rrixp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/22l4kcz7r85y4g42jm6e.png" alt="Image description" width="602" height="271"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logic App with Custom Connector&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wFBZmUxQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xk0rzp5c0zz3dfynhkip.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wFBZmUxQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xk0rzp5c0zz3dfynhkip.png" alt="Image description" width="602" height="139"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logic App with APIM – Sync Pattern&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IXRvzRtf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/by8jpah20ser0lf2tzam.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IXRvzRtf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/by8jpah20ser0lf2tzam.png" alt="Image description" width="602" height="160"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Logic App with APIM – Sync to Async Pattern&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--anpL4Bi---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n7jouh50r11ksqoallm2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--anpL4Bi---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n7jouh50r11ksqoallm2.png" alt="Image description" width="602" height="186"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Observations&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can see there is a slight latency benefit from the APIM sync approach compared to the custom connector, but it is not that big. Where APIM does offer a benefit is that when a connector is shared by a lot of Logic Apps, there can sometimes be throttling challenges. This is a common challenge people have come across with connectors for Logic Apps at scale, and there are patterns people use with multiple instances of connectors.&lt;/p&gt;

&lt;p&gt;The APIM action does not use a cloud connector in the same way, and you do not get the Azure API Connection resource type created; it is more like a wrapper over the HTTP action that uses discovery from Azure APIM to make it easier to consume the API. For this reason, we should not see the throttling challenges we get with API Connections. There is a small latency improvement in a basic test, but as you scale up I’d expect a more noticeable benefit.&lt;/p&gt;

&lt;p&gt;With the sync-to-async pattern, you can now see a noticeable benefit: the API calls to APIM are significantly quicker, demonstrating we can make our Logic App run much faster if we need to.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Hopefully, this article shows it is easy to use APIM to implement more advanced scenarios with the BAM API, where you can tune the performance of your interfaces when you have lower latency requirements but still want to use the cool operational features of our BAM solution.&lt;/p&gt;

</description>
      <category>azure</category>
      <category>api</category>
    </item>
    <item>
      <title>Azure Documentation tool – the right way to monitor Azure usage</title>
      <dc:creator>Karthik Ganesan</dc:creator>
      <pubDate>Mon, 06 Feb 2023 17:07:54 +0000</pubDate>
      <link>https://forem.com/karth1k14/azure-documentation-tool-the-right-way-to-monitor-azure-usage-2l9h</link>
      <guid>https://forem.com/karth1k14/azure-documentation-tool-the-right-way-to-monitor-azure-usage-2l9h</guid>
      <description>&lt;p&gt;As the adoption to cloud expand, subscribing and provisioning resources increases day by day. Enterprises have multiple strategies to implement their cloud deployments: single cloud, private cloud, multi-cloud, or hybrid cloud. There is a need to analyse and assess on regular basis to keep a watch on the usage to ensure expenditure on the cloud resources are efficient and there is no wastage.&lt;/p&gt;

&lt;p&gt;There are various possibilities, but they will need some level of technical competence and regular maintenance:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Use Azure portal capabilities to configure asset management and monitoring&lt;/li&gt;
&lt;li&gt;Build custom tools using APIs exposed by Microsoft Azure&lt;/li&gt;
&lt;li&gt;Use 3rd party tools and services&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Not all stakeholders will have access to, or the need to check through, these tools, but it is easy for decision-makers to investigate the reports, summaries, comparison assessments, and useful charts or diagrammatic representations. This will help them assess the usage at a glance and take strategic decisions.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"A well designed and articulated documentation of your Microsoft Azure environment can help in comprehending the usage easily without accessing too many tools and configurations."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bnum1wkx9nytnk1ms69.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6bnum1wkx9nytnk1ms69.png" alt="Documentation page" width="800" height="282"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.serverless360.com/azure-documenter" rel="noopener noreferrer"&gt;Azure Documentation Generator&lt;/a&gt; in Serverless360 helps you to document your Microsoft Azure Subscription. It comes with the capability to aggregate data from disparate resource providers in Microsoft Azure in a single report.&lt;/p&gt;

&lt;p&gt;This allows for the creation of comprehensive technical documentation across resources enabling richer insights that would otherwise be impossible. Interpreting your cost and resource information on Microsoft Azure subscription into legible documentation is what Azure Documenter is for.&lt;/p&gt;

&lt;h2&gt;Microsoft Azure usage is constantly evolving&lt;/h2&gt;

&lt;p&gt;Microsoft Azure, like any other cloud service provider, makes it easy to instantiate and deploy resources on a subscription basis. Though limits on usage and restricted user permissions can be set, resource deployment and usage grow day by day due to the nature of business demands.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Resources could be deployed manually or through automated processes&lt;/li&gt;
&lt;li&gt;Resources are deployed across geographical locations&lt;/li&gt;
&lt;li&gt;Resources and environments are managed by different teams&lt;/li&gt;
&lt;li&gt;Configurations are set or automated to support different workloads/demands&lt;/li&gt;
&lt;li&gt;Resources are billed across subscriptions and business units&lt;/li&gt;
&lt;li&gt;There is a need for different environments like Development, QA, Staging, Production, etc&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;Isn’t it necessary to have utmost clarity on the usage, deployments, types, configurations, and SKUs of these resources?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Deployment of resources in Microsoft Azure is made so simple that, without proper auditing and governance, resource management can become uncontrollable.&lt;/p&gt;

&lt;h2&gt;Challenges in maintaining the documentation on your Microsoft Azure subscriptions&lt;/h2&gt;

&lt;p&gt;It is very difficult to keep the document up to date with the changes that happen so frequently in the environment and infrastructure. Eventually, the documentation becomes stale, and getting the complete picture of how well Microsoft Azure is being used can be complicated.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;It can be quite challenging to consolidate all resource providers, their instances, locations, and cost details into comprehensive documentation.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Subsequently, sharing reports or documents with the respective stakeholders becomes too cumbersome, and their effectiveness suffers. It can also be hard for your team and partners to access and understand them. Many customers experience challenges in documenting the resources deployed to Azure, their configuration, and their settings.&lt;/p&gt;

&lt;h2&gt;How to document Microsoft Azure usage?&lt;/h2&gt;

&lt;p&gt;It is very helpful to document the whole environment in a way that can be easily exported to a standard format, giving you the ability to see exactly what is going on in your Azure environment.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The quantity and quality of the documentation should meet the stakeholders’ needs. Only then can you create accurate, just-enough documentation.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;There are certain aspects to the documentation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What is to be documented?&lt;/li&gt;
&lt;li&gt;How do we document it?&lt;/li&gt;
&lt;li&gt;How do we relate the documentation to the deployment and usage?&lt;/li&gt;
&lt;li&gt;How do we keep the documentation up to date?&lt;/li&gt;
&lt;li&gt;How do we make it transparent and accessible to the relevant stakeholders?&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Save time by using our Azure documentation tool&lt;/h2&gt;

&lt;p&gt;The solution is to use a documentation generator tool like Serverless360 &lt;a href="https://www.serverless360.com/azure-documenter" rel="noopener noreferrer"&gt;Azure Documenter&lt;/a&gt;, which helps document the Azure resources that are running, regardless of how or why they were provisioned, or the Azure region and subscription in which they are located.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsl3qsfpcrnak7ahdun0d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsl3qsfpcrnak7ahdun0d.png" alt="Azure Documenter" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We have made the configuration simple, so you can start generating documentation in a few steps.&lt;/p&gt;

&lt;p&gt;There is no separate installation or configuration change required in your Azure subscriptions. Serverless360 Azure Documenter requires a Service Principal identity with ‘Reader’ permission on your Azure subscription. After you have configured your account, you can easily generate documentation of your cloud environment.&lt;/p&gt;
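&lt;p&gt;If you do not already have a suitable Service Principal, one way to create one is with the Azure CLI; the display name below is an arbitrary example, and you would substitute your own subscription id for the placeholder.&lt;/p&gt;

```
# Create a Service Principal scoped to one subscription with only Reader rights.
# Requires the Azure CLI and sufficient permissions on the subscription.
az ad sp create-for-rbac \
  --name "serverless360-documenter" \
  --role "Reader" \
  --scopes "/subscriptions/<subscription-id>"
```

The command outputs the app id, tenant id, and a client secret, which you can then supply when configuring the Documenter.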

&lt;blockquote&gt;
&lt;p&gt;Our Azure documentation tool will automatically extract information about all the resources in your Microsoft Azure subscription through standard APIs and publish them.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Depending upon the size of your Azure deployment, the time taken for generating the document may differ.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What are the benefits of using Serverless360 Azure Documenter?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Azure Documenter saves a lot of time by automating up-to-date documentation of your Microsoft Azure environments. It generates a report on your assets and reduces manual effort.&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;Being able to easily document your Azure infrastructure will be most valuable to understand the usage.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;It provides stakeholders at all levels with confidence and clarity about the adoption of Microsoft Azure.&lt;/li&gt;
&lt;li&gt;It aggregates an inventory of all your resource groups, eliminating the manual activity of searching for and collecting data about different resources and deployments.&lt;/li&gt;
&lt;li&gt;It eliminates the need for a dedicated documentation team, which will save you a lot considering how little it costs.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Capabilities of Azure Documenter&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You would otherwise spend days creating and updating documentation; now you can spend that time on other important activities. Serverless360 Azure Documenter provides capabilities to generate:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Executive Summary&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Key decision-makers may not need all the intricate details of the cloud implementation and technical aspects. All they need is a summary at a high level:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What is the usage?&lt;/li&gt;
&lt;li&gt;Where are the wastages?&lt;/li&gt;
&lt;li&gt;Are the compliance and security guidelines met?&lt;/li&gt;
&lt;li&gt;What is the cost associated?&lt;/li&gt;
&lt;li&gt;Advisory and guidelines&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Billing &amp;amp; Metrics&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Generate documentation with vital information like billing and metrics for the overall or group-wise resources in the subscription. It provides a graphical representation of cost incurred by resource, resource type, location, and resource group, along with a split-up of cost consumed at the individual resource level.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Change tracking reports&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As the business grows, Azure cloud implementations can expand vastly, with thousands of resources and groups cutting across multiple subscriptions and environments. Due to changes in architecture, infrastructure, business logic, upgrades, and migrations, there will be many changes in resource configuration, additions, removals, etc. on a day-to-day basis. Not all of these are done manually; most are directed through DevOps and are automated. Also, there are a number of teams and people involved.&lt;/p&gt;

&lt;p&gt;Keeping track of all these changes, interpreting the need for them, etc. becomes very important. This will help you understand and check whether there were any modifications that could impact the application and, eventually, the business.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure Security Compliance reports&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The primary need for any infrastructure deployment is to ensure the security and compliance of data, practices, access, and use. Adopting Microsoft Azure is no different. From time to time, it is important to have absolute knowledge and control of data security, data losses due to storage, migration, and archives, and any security breaches.&lt;/p&gt;

&lt;p&gt;These reports help track the implementation of compliance rules and adherence to organisation standards and practices, and they support auditing.&lt;/p&gt;

&lt;h2&gt;Azure cost analysis&lt;/h2&gt;

&lt;p&gt;Businesses across the world are embracing cloud computing more than ever before. They are looking at migrations and new implementations, and interest in the cloud is only increasing day by day.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“Our commercial cloud revenue grew 36% year over year to $19.5 billion,” Microsoft Corp. announced in its results for the quarter ended June 30, 2021.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;This means enterprises are spending more on the cloud, and there could be overspend without periodic assessment of expenses. Though cloud vendors promote pay-as-you-go models and various other cost control practices, in reality the expenditure pattern is different: there are platform costs, most of which are fixed by the configuration; some costs spiral up exponentially with usage unless controlled; and there are hidden costs as well.&lt;/p&gt;

&lt;p&gt;You may also want to compare and analyse cost between two different months.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl5ithoopbi5oad0z7yig.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fl5ithoopbi5oad0z7yig.png" alt="Cost consumption" width="800" height="331"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is not always easy to use tools like Azure cost calculators for your production loads, or to interpret raw Azure billing data. It becomes imperative to understand the expenses, estimate future expenditure growth, and determine the wastage, with proper classification and deeper insights.&lt;/p&gt;
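&lt;p&gt;The idea behind a month-over-month comparison can be illustrated with a small sketch. The figures and resource group names below are made up; in practice they would come from a cost export or your billing data.&lt;/p&gt;

```python
# Illustrative sketch: compare Azure spend between two months, per resource
# group. The amounts and group names are hypothetical placeholders.

march = {"rg-web": 420.50, "rg-data": 1310.00, "rg-ops": 95.25}
april = {"rg-web": 455.75, "rg-data": 1190.00, "rg-ops": 95.25, "rg-ml": 220.00}

def compare_months(prev, curr):
    """Return the cost delta per resource group, treating missing groups as 0."""
    report = {}
    for rg in sorted(set(prev) | set(curr)):
        before, after = prev.get(rg, 0.0), curr.get(rg, 0.0)
        report[rg] = round(after - before, 2)
    return report

for rg, delta in compare_months(march, april).items():
    direction = "up" if delta > 0 else "down" if delta < 0 else "flat"
    print(f"{rg}: {direction} {abs(delta):.2f}")
```

New groups (like the hypothetical `rg-ml` above) show up as pure increases, which is exactly the kind of growth a periodic assessment is meant to catch.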

&lt;h2&gt;Schedule documentation and notify stakeholders&lt;/h2&gt;

&lt;p&gt;Generating documentation for an established, evolving Azure estate with thousands of resources can take a long time due to data collection, computation, and analysis. It is not practical to manually initiate documentation at the intended time across many subscriptions and configurations, and relevant stakeholders may need to be notified once documentation is complete. Azure Documenter provides automation to address these requirements, sending notifications to widely used channels such as email and Microsoft Teams.&lt;/p&gt;
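&lt;p&gt;A scheduled documentation run is essentially a timer-driven job followed by a notification step. The sketch below shows the shape of such a pipeline using Python's standard-library scheduler; the report name and notifier are placeholders, and a real setup would use a cron job or the product's built-in scheduler and post to email or a Teams webhook.&lt;/p&gt;

```python
# Illustrative sketch: run a documentation job on a schedule and notify
# stakeholders when it completes. All names here are hypothetical; a real
# notifier would send email or call a Microsoft Teams webhook.

import sched
import time

notifications = []

def generate_documentation():
    # Placeholder for the real data collection / report generation step.
    return "azure-estate-report-2021-07.pdf"

def notify_stakeholders(report_name):
    message = f"Documentation ready: {report_name}"
    notifications.append(message)  # stand-in for an email / webhook call
    print(message)

def documentation_job():
    report = generate_documentation()
    notify_stakeholders(report)

scheduler = sched.scheduler(time.monotonic, time.sleep)
scheduler.enter(0, 1, documentation_job)  # delay of 0 s: run immediately for the demo
scheduler.run()
```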

&lt;h2&gt;Documentation templates&lt;/h2&gt;

&lt;p&gt;Documentation needs range from quick summaries and reports that provide a high-level overview to comprehensive documentation that covers in-depth details. Create documents that represent not only the Azure resources but also their expenses, exact locations, resource groups, resource types, and a lot more.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;For example:&lt;/strong&gt; the executive team may need an expenditure summary, the auditing team may be interested in security reports, and the IT management team may need deeper insights.&lt;/p&gt;

&lt;p&gt;Azure Documenter provides default templates to get started with easy, insightful documentation, and lets you configure custom templates to meet your needs.&lt;/p&gt;

&lt;h2&gt;Publishing options – PDF, online &amp;amp; manage documentation history&lt;/h2&gt;

&lt;p&gt;The generated documentation can be published and shared with stakeholders. Azure Documenter provides two publishing options:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Publish as PDF&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Documents can be published as PDF files and stored either in the default storage provided by Serverless360 or in an Azure Storage account of your choice. Once the PDF file is created and stored, a unique link to the document is provided within the portal.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Publish to online platform – Document360&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Managing PDF files and maintaining versions can become difficult once you deal with multiple environments and the many documents generated over time. There are a lot of content publishing tools available online. One such software-as-a-service offering is Document360, an online knowledge base authoring and publishing platform.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;With Azure Documenter's integration with Document360, you can generate, publish, and authorize access to documents online, hassle-free.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;Archive documents&lt;/strong&gt; – whether you publish as PDF or online, Azure Documenter provides document history management so you can access old documents for future reference and comparison.&lt;/p&gt;

&lt;h2&gt;What are the benefits of integrating Azure Documenter with Document360?&lt;/h2&gt;

&lt;p&gt;You could, of course, read any data from your Azure environment and document it in Word or PDF. Once you create the documents, you might want to share them with a wider community, or review them to keep track of environments and send them to a colleague for review. Though you have SharePoint, Office 365, or other online platforms to share your files, they demand manual effort and resources to store, maintain versions, distribute, and control access.&lt;/p&gt;

&lt;p&gt;Maintaining versions in Word or PDF and sharing them with stakeholders through shared folder access becomes too difficult to manage. As teams grow, managing access control of documents across a wider audience is tedious.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feiaiajn44gbmgi8l3nia.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feiaiajn44gbmgi8l3nia.png" alt="Billing &amp;amp; Metrics" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Document360, an online knowledge base authoring and publishing platform, is used to publish the documentation generated by Azure Documenter. It provides online publishing, restricted access to the documentation, and archiving capabilities.&lt;/p&gt;

</description>
      <category>gratitude</category>
    </item>
  </channel>
</rss>
