Month: May 2025

How to use Logic App to send email from Synapse Pipeline

A task I completed a while ago that I wanted to share was creating two different pipelines for a business user to move data between one of their databases and another server for monthly processing. The business user requested that they receive an email if the pipeline failed so that they would know sooner rather than later that there had been a problem with processing. Unfortunately, there isn’t currently a way to natively send an email from a Synapse Analytics pipeline: there is no built-in activity for that (that I’m aware of). To enable the functionality the business users requested, I had to create a custom system to send an email.


Overview of the process

Since there isn’t a native activity within Synapse Analytics that allows you to send an email, the workaround I created is to make a Logic App that sends an email when triggered by an HTTP request, then add a Web activity in the pipeline that triggers the Logic App and feeds through error information so it can generate a useful email. Keep reading to learn the details of how to set up such a process.

Creating a Logic App to Send an Email

    1. Click on the component “Add a trigger” in the designer. In the search box, type “request” and then select the option that says “When an HTTP Request is received”.

    2. Click on the link to “Use sample payload to generate schema”

    3. Enter a simple JSON-formatted string containing the values you will need to use to send the email

    4. Click “Done” when you’ve included all the parameters you want in your incoming JSON, and your simple JSON will be automatically converted into the formal schema needed for the trigger to handle the request.

    5. For the “Method” that will be used for the HTTP request, select “POST”.

    6. Click into the open area of the designer to close the editor for the HTTP request component then click the “+” under the component to add another to the flow

    7. In the search box, search for “outlook”, scroll down until you see the section for “Office 365 Outlook” and click the component for “Send an Email (v2)”

    8. If you haven’t logged into an email account through the Logic App yet, it will request that you do so in order to start specifying how the email should be sent. Go through the process of signing into an email account. I recommend using a service account email instead of your own email, unless you’re the only person that will ever be working on this process in the future.

    9. Click into the “To” box in the properties of the “Send an Email” component, and you’ll see two blue icons pop up to the left of the box that will allow you to dynamically enter information into that text box. Click the top icon that looks like a lightning bolt to pick values from the previous step of the process.

    10. In the box that opens, select the “to” value from the HTTP request component to fill into the “To” box of the email properties.

    11. Repeat the same steps for the Subject of the email and the Body of the email.
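As an example, the simple payload from step 3 might look like the following. This is just a sketch: the values are placeholders, and the only real requirement is that the parameter names match what you send from your pipeline later.

```json
{
    "to": "recipient@example.com",
    "subject": "Synapse pipeline failure",
    "emailbody": "The pipeline failed."
}
```

The Logic App will expand this sample into the full JSON schema for you, so you only need to supply representative values here.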

    Adding the Email Capability to your Synapse Pipeline

    After saving the Logic App, you will be ready to use it from your Synapse pipeline. Once the app has been saved for the first time, you will see that a URL has been generated for the HTTP Request trigger; this is what you will use to call the Logic App from your pipeline. Copy that URL.

    Return to your Synapse pipeline so that we can set it up to call the Logic App to send an email. Since we essentially created an API that we can hit to send an email, we need to add a “Web” activity to our pipeline. In my case, I want to send an email to a business user when the pipeline fails, since they are the key party interested in knowing about failures. For that, I have added my “Web” component on the “On Failure” route of my pipeline (the red “x”/line).

    Open the “Web” activity to set it up to hit your new API. Give the Activity whatever name you would like, in my case it is named “Send Failure Email 1”. Then go to the “Settings” tab and paste the URL into the URL field. Then for “Method”, choose “POST” since we want to send information to the URL from our pipeline.

    Next, we need to write the body of text to send to the API. This Body will need to match the formatting that we specified in the Logic App when we configured the JSON payload. In that payload, I specified that the following parameters should be required: to, subject, and emailbody. Based on that, I have written the Body for my Web activity as the following:

    {
        "to":"@{pipeline().parameters.EmailRecipient}",
        "subject":"@{pipeline().parameters.EmailSubject}",
        "emailbody":"Pipeline failed when trying to copy data for table @{item().TableName}."
    }

    When working with the expression builder in Synapse, you need to be very particular about formatting when using pipeline parameters or variables in an expression. Pay close attention to the double-quotes, the curly brackets, and the @ symbols. In my HTTP Body, I set the values of two of the parameters I send to the Logic App through pipeline parameters so that I only had to change them in one place and not in every iteration of the email component in my pipeline. If you don’t want to use parameters like that, you could hard-code the values of the key-value pairs in the body to whatever you would like, similar to how I set the value of emailbody.

    Click “OK” to close the expression builder panel. The very last thing we need to add to our “Web” activity is a header for the POST HTTP message; we only need one. Click the “+ New” button, then enter the Name as “Content-Type” and the Value as “application/json”.
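Putting the pieces together, the Web activity's definition in the pipeline JSON ends up looking roughly like this. This is a sketch based on the general Synapse/Data Factory Web activity schema, not the exact pipeline from this post: the URL placeholder is where your copied Logic App URL goes, and "Copy Data" is a hypothetical upstream activity name for the failure dependency.

```json
{
    "name": "Send Failure Email 1",
    "type": "WebActivity",
    "dependsOn": [
        {
            "activity": "Copy Data",
            "dependencyConditions": [ "Failed" ]
        }
    ],
    "typeProperties": {
        "url": "<your Logic App HTTP POST URL>",
        "method": "POST",
        "headers": {
            "Content-Type": "application/json"
        },
        "body": "<the JSON body built in the expression builder>"
    }
}
```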

    Testing the Pipeline Email Capability

    Once you have finalized the addition of the “Web” activity in your pipeline, you are ready to Debug and test the changes, which you can do in the manner in which you normally test your Synapse pipelines. For me, I knew my pipeline was set up the way I wanted when I created a false failure scenario (since the email only sends when a component fails), ran a debug, and received an email containing the failure information I wanted after the previous component failed. Your testing may look slightly different.

    Summary

    Although Synapse Analytics doesn’t have a built-in feature for sending an email from a pipeline, you are able to build that functionality yourself fairly easily using a Logic App and a Web activity in your pipeline. I would love to hear if you found this method useful or implemented it yourself. Thanks for reading!

    Resources

    https://omnidata.com/how-to-setup-notifications-within-synapse-pipelines-using-logic-apps/


    Add Azure SQL Server auditing with Bicep

    As part of my ongoing series about working with Bicep and SQL Servers and SQL Databases, I am going to cover how you can turn on auditing for a SQL Server instance using Bicep. The auditing I am referring to is what you will find if you go to an Azure SQL Server (Platform as a Service, fully cloud version), then navigate to “Settings” and then “Auditing”. By default, the auditing option will track a few different things that happen on your server, and then you’re able to customize it further if you would like. This post will specifically cover how to enable auditing for a SQL Server using a Storage Account for the destination of the audit logs and the “Storage Access Keys” option for the Storage Authentication Type.



    What is SQL Server Auditing?

    According to the Azure portal, auditing “tracks database events and writes them to an audit log in your Azure Storage account, Log Analytics workspace or Event Hub.” This allows you to keep track of what is happening on your server and its databases in case you ever need to look back and see what changed or if you need to troubleshoot an error in more detail. Auditing gives you a similar level of server and database traceability as the standard SQL Server Logs on a normal instance, just in a slightly different form since it’s 100% in the cloud.

    In my organization, we have a requirement for auditing to be turned on for all our SQL Servers as a safety measure, which is why I wanted to learn how to enable it in the portal and then later how to enable it through a Bicep template now that we’re automating our resource creation more.

    Note: This Bicep template applies auditing at the SQL Server level, not the SQL Database level; both are options you can choose from. I am not showing how to enable auditing for a specific database, as the template for that is likely slightly different.

    In the portal, this is what the Auditing settings page looks like.

    Continue reading and I’ll show you how to convert that portal page into a Bicep template that you can deploy without much effort at all.

    Full Bicep Template

    This post is discussing a subset of a Bicep template that I’ve also covered in part in other posts. If you would like to see the Bicep template in its entirety, you can find it on my GitHub.

    Create a Storage Account Resource

    Since I have opted to send my SQL Server audit logs to a Storage Account, we first need to create a storage account to send the logs to. If you already have an existing storage account that you would like to send the files to instead, you can skip this step and do an “existing” type reference in the script.
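If you go the route of reusing an existing storage account, the reference might look like the following sketch. It assumes the account name is passed in as a parameter (here called stgAcctName, matching the variable used later in this post):

```bicep
// Reference a storage account that already exists, instead of creating one
resource storageAcct 'Microsoft.Storage/storageAccounts@2023-05-01' existing = {
  name: stgAcctName
}
```

The rest of the template can then use the symbolic name storageAcct exactly as if the account had been created in the same file.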

    The definition below creates a very standard storage account. The only thing to mention is that it must be a “V2” type of storage in order to work with auditing.

    resource storageAcct 'Microsoft.Storage/storageAccounts@2023-05-01' = {
      name: stgAcctName
      location: location
      sku: {
        name: 'Standard_RAGRS'
      }
      kind: 'StorageV2'
      properties: {
        dnsEndpointType: 'Standard'
        defaultToOAuthAuthentication: false
        publicNetworkAccess: 'Enabled'
        allowCrossTenantReplication: false
        minimumTlsVersion: 'TLS1_2'
        allowBlobPublicAccess: false
        allowSharedKeyAccess: true
        largeFileSharesState: 'Enabled'
        networkAcls: {
          defaultAction: 'Allow'
          bypass: 'AzureServices'
        }
        supportsHttpsTrafficOnly: true
        encryption: {
          requireInfrastructureEncryption: false
          services: {
            file: {
              keyType: 'Account'
              enabled: true
            }
            blob: {
              keyType: 'Account'
              enabled: true
            }
          }
          keySource: 'Microsoft.Storage'
        }
        accessTier: 'Hot'
      }
    }

    Create the Auditing Resource

    Once you have your storage account created and ready to use, you can enable auditing on your SQL Server by creating an auditingSettings resource like the one I have below.

    resource sqlServerAuditing 'Microsoft.Sql/servers/auditingSettings@2024-05-01-preview' = {
      name: 'default'
      parent: sqlServer
      properties: {
        state: 'Enabled'
        auditActionsAndGroups: [
          'BATCH_COMPLETED_GROUP'
          'SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP'
          'FAILED_DATABASE_AUTHENTICATION_GROUP'
        ]
        retentionDays: 90
        storageAccountSubscriptionId: '2b896190-565d-4f66-9397-92a34afbec85'
        storageEndpoint: 'https://${stgAcctName}.blob.core.windows.net'
        storageAccountAccessKey: storageAcct.listKeys().keys[0].value
      }
    }

    Set the Parent of the Auditing Resource

    The first notable thing is that I have specified the parent of the resource as the SQL Server I created earlier in the template (not shown in this post, but available on GitHub), which tells the auditing resource what it should be applied/connected to. If you created the SQL Server in a separate template, you would have to reference the server differently, with an “existing” resource reference, so the template understands what you’re applying the auditing to.
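For example, a reference to a server defined elsewhere might look like this sketch (sqlServerName is an assumed parameter, not part of the original template):

```bicep
// Reference a SQL Server created outside this template
resource sqlServer 'Microsoft.Sql/servers@2023-08-01-preview' existing = {
  name: sqlServerName
}
```

With that in place, the auditingSettings resource can still use parent: sqlServer as shown above.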

    Specify what you want to audit

    The next notable thing about this resource definition is the list of auditActionsAndGroups, which I chose as the default values. If you go to the Auditing settings in the portal and hover above the (i) icon next to where you enable the auditing feature, you will get information about what auditing rules the Azure SQL Server will get by default unless you specify otherwise.

    The default policies are:

    • BATCH_COMPLETED_GROUP
    • SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP
    • FAILED_DATABASE_AUTHENTICATION_GROUP

    In my experience so far, those options have been sufficient for my troubleshooting needs, but you can add others to the list if you would like. To see all the options, review this Microsoft document.

    The best part of creating the auditing resource through Bicep is that you are able to specify those additional “action groups”. Otherwise, you would have to use PowerShell to update the list of things you want to be audited, since you aren’t able to make that change in the portal at this time.

    Retention length and storage account connection

    One of the final things you will need to set in your auditing resource is the number of days you would like to retain the logs. I chose 90 days, according to my organization’s standards.

    Then you need to specify the storage account and the storage account keys to grant access with. I have set the storageEndpoint value using a variable which contains the name of my storage account, which was created earlier in the full template. I then need to provide the “primary” key of the storage account, which gives the auditing resource access to save logs to the storage account. To do this, since I created my storage account in the same template above, I extract the keys using built-in Bicep functionality: storageAcct.listKeys().keys[0].value.

    The breakdown of that key extraction is <storage account symbolic name>.<function that lists the account keys>.<first key in the array (the primary key)>.<that key’s value>.

    Summary

    With just two resource definitions in a Bicep template, you are able to specify custom auditing for your Azure SQL Server instance, including extra-detailed auditing beyond the defaults that isn’t available through the portal. Let me know in the comments below if you found this helpful. Thanks for reading!



    Add RBAC Role Definitions to Resources with Bicep Template

    If you’ve been following along in my series on using Bicep templates to deploy database-related resources in Azure, you may have seen my template for creating Azure SQL Servers and Databases on GitHub which contains a section for defining role assignments. I haven’t yet covered the specifics of that type of resource in a post, so I will be going over that today to simplify the information you need to know to assign particular Role-Based Access Control (RBAC) roles to resources you’re creating in Bicep templates. Because why would you want to manually add roles to a resource that you’re automatically deploying? (You don’t!)


    What the template does

    The template I am pulling the role assignment section from creates an Azure SQL Server, Azure SQL Database, and then a Key Vault, and you can learn more about that in my previous post. What I didn’t cover previously though, is that in my actual use case, I need to then give my team admin rights on the Key Vault after it’s created so that we have the permissions needed to add secrets to the Key Vault.

    The Full Template

    If you would like to see the Bicep template I will be covering in today’s post in full, instead of piecing it together from this post, you can find it on my GitHub.

    Assigning RBAC Roles to Resources with Bicep

    The process of creating role assignments on Azure resources using Bicep templates is relatively simple. The only data you need to supply are the Object ID of the group or user you want to assign the role to and the ID of the specific role definition you want to assign. The hardest part is finding the role’s ID in the Azure portal.

    Get the Object ID for the user or group

    To get to the Object ID value for the user or group to which you want to assign a role, navigate to Entra ID in the Azure Portal, then search for and select the user or group from the dropdown list. When you open the user/group, you will find the Object ID value to copy into your template.

    Finding the ID for the Role Definition

    The easiest way to figure out the ID value of the role you want to give to a user or group is to go to the type of resource that will get the assignment (in this case a Key Vault), open the IAM page of the resource in the portal, and then view the JSON for the role you need. If you know of a better way, please let me know in the comments below!

    First, navigate to an existing Key Vault in your Azure environment and go to the “Access control (IAM)” page.

    Screenshot showing the Access control page for a Key Vault resource in the Azure Portal

    Then, click on the “Roles” tab. On that page, search for “key vault”, or whatever keyword will help you find the role you’re looking for. Once you’ve found the role you want to add through your Bicep template, click on the “View” link for the role under the “Details” column.

    Screenshot showing how you can find a specific role from the Roles tab of the Access control page for a resource

    When you click on “View”, it will open another pane with details about the role. In that pane, select the “JSON” tab and then copy out the GUID value from the end of the string for “id” in the JSON description.

    Screenshot of the Azure Portal showing where you locate the ID of a role assignment from the JSON details of the role

    That is the value you will need to put into your template, so keep that handy.

    Creating the roleAssignment resource

    The short definition to assign a role to a user or group via Bicep is like the following:

    var roleDefIDKVAdmin = '00482a5a-887f-4fb3-b363-3b7fe8e74483' /* The role ID for "Key Vault Administrator" */
    var admin_entra_group = '0000-0000-0000-0000' /* Object ID for our Entra group */
    var roleAssignmentName = guid(admin_entra_group, roleDefIDKVAdmin, resourceGroup().id)
    
    resource roleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
      name: roleAssignmentName
      properties: {
        principalId: admin_entra_group
        roleDefinitionId: resourceId('Microsoft.Authorization/roleDefinitions', roleDefIDKVAdmin)
      }
    }

    Variables

    I have 3 variables in my template that simplify the process of specifying the settings for the roleAssignment resource.

    • roleDefIDKVAdmin: The ID value I retrieved from the Azure Portal for the role I want to assign
    • admin_entra_group: The Object ID for the Entra group I am assigning the role to
    • roleAssignmentName: A value I generate for the roleAssignment resource based on the other two variables. Important to note: the part “resourceGroup().id” reflects that I am granting the role assignment at the resource group level, which is what ties the assignment to the key vault’s resource group. If you want to apply the role assignment to a single resource instead, you would set the roleAssignment’s scope property to that resource and seed the guid with symbolicName.id.
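As a sketch of that single-resource alternative, a role assignment scoped to just the Key Vault might look like the following. It assumes the vault was created in the same template with the symbolic name keyVault (an assumption, since that resource isn't shown in this post); the scope property is what targets the individual resource:

```bicep
// Assign the role on the Key Vault only, not the whole resource group
resource kvRoleAssignment 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(admin_entra_group, roleDefIDKVAdmin, keyVault.id)
  scope: keyVault
  properties: {
    principalId: admin_entra_group
    roleDefinitionId: resourceId('Microsoft.Authorization/roleDefinitions', roleDefIDKVAdmin)
  }
}
```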

    Resource definition

    Once you have the two values retrieved as I specified above, you simply need to pass the variables into the correct locations of the resource definition. The name of the resource comes from the variable roleAssignmentName. The principalId is the user or group the role should be given to, which is provided through the variable admin_entra_group. And finally, the roleDefinitionId is provided by using the resourceId function in Bicep, which builds the fully qualified resource ID for the role definition from the GUID we retrieved manually in the portal. Bicep requires us to use this function instead of simply passing in the GUID ourselves.

    Summary

    Once you retrieve the Object ID of the user or group you want to give a role to and the ID of the specific role you want to assign, it’s very simple to tie those together in a Bicep template with a resource definition that assigns the role when the template deploys.


    Check the box to allow Azure Resources to Access SQL Server in Bicep Template

    This is the fifth in my series of posts about working with Bicep, the declarative scripting language created by Microsoft as the modern facade for Azure Resource Manager templates. Today’s post is focusing on a small bit of information that may be important for some Azure environments.

    For those of us who work in Azure and have Azure SQL Servers and Databases, we often want those resources to be available to other resources, such as applications, within the Azure environment. To grant that access through the portal, there is a simple checkbox on the Networking page of the Azure SQL Server that will allow that. But if you’re trying to declaratively create your Azure infrastructure with Bicep like I am, how are you meant to accomplish checking that same box? That is what this post will answer.


    Background

    For those of you who may not be as familiar with what I was referencing above, what I am referring to is this checkbox that you find on the Networking page of an Azure SQL Server instance in the Azure Portal.

    Screenshot showing the checkbox I am talking about, which allows other Azure resources to access the SQL Server resource

    The purpose of this checkbox, as you can learn by hovering over the information icon at the end of the line of text for the checkbox, is to configure the firewall settings to allow inbound access from Azure resources. By default, this box is left unchecked.

    Screenshot showing the information about the checkbox in the Portal screen

    However, if you are in a situation where you have a separate application that is going to be accessing the server and database (which is fairly common), you will also likely need to ensure this networking rule is checked when creating new Azure SQL Servers.

    How to check that box in a Bicep template

    Understanding what the checkbox does

    The heart of what that checkbox does in the background when it is checked is creating a firewall rule that indicates whether Azure services are allowed to connect to the server and its databases. You can see what it’s doing by querying the system table which contains the list of server firewall rules, which I wrote about in a previous post.

    When the box is checked, you can see a firewall rule for the server in the table sys.firewall_rules (run on the master database), called “AllowAllWindowsAzureIps” with start and end IP addresses both set to 0.0.0.0.

    Screenshot of SSMS showing query that lists all server firewall rules

    Conversely, if that box is not checked on the portal, you will not see a firewall rule in the table with that name or IP address range. Once you understand that the checkbox is setting a specific firewall rule for you in the background, it becomes simple to figure out how you can make that box get checked through a Bicep template.

    Bicep setup to check the box

    Since the checkbox in the portal is creating a simple firewall rule with a specific name and IP address, we can recreate that easily within a Bicep template by creating a firewallRule resource within the template and tying that to your Azure SQL Server with the parent attribute.

    Create the sqlServer resource in the template

    First, create the SQL Server resource so that we can tie the firewall rule resource to it. If you would like to see how to define such a resource to then create a firewall rule for, see last week’s post.

    Create the firewallRules resource in the template

    After you’ve created the Azure SQL Server resource in your script, the next step is to specify a second resource of type firewallRules, which we will use to check that box on the server’s networking page.

    resource firewallRule 'Microsoft.Sql/servers/firewallRules@2023-08-01-preview' = {
      name: 'AllowAllWindowsAzureIps'
      parent: sqlServer
      properties: {
        endIpAddress: '0.0.0.0'
        startIpAddress: '0.0.0.0'
      }
    }

    The important parts to note from that resource definition are the following:

    • name: 'AllowAllWindowsAzureIps': This specifies the name of the firewall rule that is going to be created. The rule needs to have this specific name in order to check the settings box in the portal and allow the access.
    • parent: sqlServer: This line tells the template that this firewall rule needs to be applied to the server we created previously in the same template, with the symbolic name sqlServer.
    • startIpAddress: '0.0.0.0' and endIpAddress: '0.0.0.0': Both must be set to all 0s, which is what Azure looks for when determining whether another Azure resource has access to the server. This “range” is special, reserved for this situation.

    Then you can also specify any Azure SQL Databases you want on the server in the template, and then deploy it. After the template has been deployed, navigate to your new server in the Azure Portal and review the networking settings to see that the “Allow Azure services and resources to access this server” box is successfully checked.
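A minimal database definition tied to the same server might look something like this. The database name and SKU here are placeholders I chose for illustration, not values from the original template:

```bicep
// Create a database on the server defined earlier (symbolic name sqlServer)
resource sqlDatabase 'Microsoft.Sql/servers/databases@2023-08-01-preview' = {
  name: 'MyDatabase'
  parent: sqlServer
  location: location
  sku: {
    name: 'Basic'
  }
}
```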

    Summary

    If you’ve been wondering how you can get that one box on the Azure SQL Server Networking page in the portal checked by using a Bicep template, it’s as simple as adding a firewall rule for the server with start and end IP addresses of all 0s. This special firewall rule indicates to Azure that the server may be accessed by other Azure resources.

    Let me know in the comments below if this tip helped you or if you have any questions regarding setting up firewall rules for an Azure SQL Server using Bicep.

    Resources

    • https://stackoverflow.com/questions/72433407/is-there-a-way-to-set-allow-access-to-azure-services-in-microsoft-dbforpostgre
    • https://learn.microsoft.com/en-us/azure/azure-sql/database/firewall-configure?view=azuresql#connections-from-inside-azure
    • https://github.com/Azure/bicep-registry-modules/blob/main/avm/res/sql/server/main.bicep
