Tag: Azure

Deploying A DACPAC to Azure SQL Server

One of the easiest ways to duplicate SQL Server database objects from one database to another–such as when you want to copy the structure of a production database down to a development database for testing and development–is to create a DACPAC file from the database you want to copy. With SQL Server, there are two ways to copy a database without creating and restoring a full database backup: 1) create a full copy of the database structure and data by exporting to a BACPAC file, or 2) create a copy of only the database structure by exporting to a DACPAC file.

This post will cover how you can import a previously generated DACPAC file onto an Azure SQL Database to duplicate the structure of another database with little effort on your part. It specifically covers copying an existing Azure SQL Database to another Azure SQL Database. This process may work with on-prem or managed instances of SQL Server, but I haven’t tested that yet.

Prerequisites

  • Have SSMS available to use
  • Have an existing Azure SQL Database ready to copy
  • Have an Azure SQL Server or Database to load the DACPAC to

Note: The source and target databases can be on the same Azure SQL Server or on different ones.

What are DACPACs and BACPACs?

There have already been many blog posts written on the subject of the difference between DACPACs and BACPACs, so I will refrain from trying to duplicate that information. Essentially, a DACPAC is an export of the schema/structure of a database, including the database users and logins. A BACPAC is similar to the DACPAC except that it contains the data from the database as well. If you would like more detailed information about both of these types of files, I recommend reading this post from SQLShack.

Then if you’re like me and are now wondering what the difference is between a BACPAC file and a full database backup (BAK file), I found this resource on SQL Server Central that explained the difference well.

Why use a DACPAC?

As I mentioned in the introduction above, the primary use case for making and importing a DACPAC file of your database is to quickly copy all database objects–including tables, procedures, functions, and even users–to a new version of the database, such as when trying to create a new dev version of an existing production database. That was my exact use case and what triggered me to write this post. I got tired of manually copying tables, procedures, and functions from one database to another whenever we needed a new version of a database.

How to Create a DACPAC File

When you have a source database that you would like to copy to a new database without the data, you create a DACPAC using a database “Task” in SSMS. To do this, right-click on the database name in the Object Explorer, select “Tasks”, then select “Extract Data-Tier Application”. This will open a wizard for you to create a DACPAC file of the selected database.

Screenshot of the context menu where you can select to Extract Data-tier Application for a database
Choose to “Extract Data-tier Application” from the context menu of the Database in the Object Explorer

If instead you click on “Export Data-Tier Application”, you will be brought to a wizard for creating a BACPAC file, which is not what we intend to do during this process, so make sure you choose the option to extract a data-tier application.

After selecting that option, you will see a screen like the following. Review the introduction page and make sure it refers to creating a DAC package (see screenshot below, underlined in red) and not a BACPAC.

Screenshot of the "Extract Data-tier Application" wizard in SSMS
The introduction page of the “Extract Data-tier Application” wizard; confirm that it is going to extract to a DAC package, not a BACPAC

Step through the wizard, provide a name for the “Application” (database name), and choose where you want the file to be downloaded to on your local machine.

Screenshot of the setup screen of the "Extract Data-Tier Application" wizard in SSMS.
The DACPAC extraction properties page of the “Extract Data-tier Application” wizard

Once you have reviewed the settings for your file, click “Next” until the creation process starts. When that completes, you will have a DACPAC file stored in the location you chose.

Using a DACPAC to Format an Azure SQL Database

There are two different ways that you can deploy a DACPAC to an Azure SQL Server. The first is to deploy the DACPAC to create a new database; the second is to update an existing database with the DACPAC.

Overview of the Steps

  • Choose whether you want to create a new database with the DACPAC or update/format an existing database
  • If choosing to create a new database:
    • Right-click on the “Databases” folder in the Object Explorer of SSMS, choose “Deploy Data-tier Application”
    • Specify the DACPAC you want to use and the name of the database you want to create
    • Run the wizard which will create a database with default settings
  • If choosing to use an existing database, whether blank or already in use:
    • Right-click on the database, select “Tasks” then “Upgrade Data-tier application”
    • Specify the DACPAC file you want to use
    • Review the changes the wizard identifies between the DACPAC and the target database
    • Run the wizard which will update the target database with all changes necessary from the DACPAC

Deploying a New Azure SQL Database using a DACPAC

If you want to make your new Azure SQL Database on an existing Azure SQL Server and don’t want to do that through the portal, you can create a new database on the server by “deploying” a DACPAC to create a new database.

The main downside to this method is that the database will be created with the default setup, which for me is a General Purpose standard database with 2 vCores. That tier may be pricey for some people, or it could be underpowered for others, depending on your needs. Because of that downside, this is not the method I chose for my DACPAC, but I still researched it so I could understand both ways. If you want to know how I used a DACPAC, skip down to the section on “upgrading” with a DACPAC below.

Deploying a Data-Tier Application

To create a new database from a DACPAC, you first need to connect to your target server through SSMS in the Object Explorer. Once you have connected, right-click on the “Databases” folder and choose “Deploy Data-Tier Application” from the list.

Screenshot of the context menu where you can choose to "Deploy Data-tier Application" for a server
Choose to “Deploy Data-tier Application” from the context menu of the Databases folder in the Object Explorer of SSMS

This will open a wizard that is very similar to the one for “upgrading” a data-tier application (discussed below). On the overview page of the wizard, you can confirm that you are going to deploy a DACPAC and not a BACPAC.

Screenshot of the introduction screen of the "Deploy Data-Tier Application" wizard in SSMS.
The introduction page of the “Deploy Data-tier Application” wizard

The first thing you will need to do in this wizard is to choose the DACPAC file you previously created.

Screenshot of the "Deploy Data-tier Application" wizard on the page where you choose the DACPAC you want to deploy
The page of the “Deploy Data-tier Application” wizard where you choose the DACPAC file you want to deploy from

Click Next to move to the following page, where you will choose the name of the database you want to create.

Screenshot of the "Deploy Data-tier Application" wizard on the page where you choose the the name of the new database you want to deploy
The page of the “Deploy Data-tier Application” wizard where you choose the name of the new database you want to deploy from the DACPAC

Click Next to move to the Summary page and review the process that the wizard is about to execute to ensure everything is set the way you want. Click Next again to deploy the DACPAC and create a new database from that file.

Once that process completes, you can close the wizard and then check on your database through the Object Explorer to see that you now have a new database with the same objects as the source database. At this point, you will also be able to see the newly created database in the Azure Portal under that SQL Server, where you can make any changes you would like to the default setup chosen by the deployment.
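If you want a quick sanity check beyond eyeballing the Object Explorer, a simple query like the following (my own addition, not part of the wizard) can be run on both the source and the new database to confirm that the object counts match:

-- Compare counts of user-created objects between the source and new databases
SELECT type_desc, COUNT(*) AS object_count
FROM sys.objects
WHERE is_ms_shipped = 0
GROUP BY type_desc
ORDER BY type_desc;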

Updating or Formatting an Existing Azure SQL Database with a DACPAC

This method of “updating” an existing Azure SQL Database is helpful if you already have a database that you want to align with another, or if you have already created a blank database meeting your standards that you want to quickly deploy database objects to. In the previous section, the method of “Deploy a Data-Tier Application” creates a new database from the specified DACPAC, but that database is created with the default Azure SQL Database settings, which is “General Purpose: Standard-series (Gen5), 2 vCores”. That database setup is estimated to cost (in our setup) $373 per month.

If that default setup for a database doesn’t work for you, or you don’t want to fix it after the database has been created, you can use this second method of “upgrading” your existing database with the DACPAC. Before you can use this method, you first need to create a blank database on your Azure SQL Server through whichever method you prefer, or use any other existing Azure SQL Database that you are fine with overwriting.
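If you would rather create that blank database with T-SQL than through the portal, here is a minimal sketch. Run it on the master database of the target server; the database name, max size, edition, and service objective below are example values, so pick whatever tier fits your budget:

-- Create an inexpensive blank database to "upgrade" with the DACPAC
CREATE DATABASE [MyDevDatabase]
(
    MAXSIZE = 10 GB,
    EDITION = 'Standard',
    SERVICE_OBJECTIVE = 'S0'
);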

Upgrading a Data-Tier Application

Connect to your target database in SSMS through the Object Explorer. When you are ready to import the source database objects onto this new database, right-click on the empty database then select “Tasks” then “Upgrade Data-tier Application”.

Screenshot of the context menu in SSMS where you can choose to "Upgrade Data-tier Application"
Choose to “Upgrade Data-tier Application” from the context menu of the existing database you want to deploy the DACPAC to

This will open another wizard that will walk you through the process of “upgrading” your chosen database with the DAC package of another database.

Screenshot of the "Upgrade Data-tier Application" wizard on the introduction page.
The introduction page of the “Upgrade Data-tier Application” wizard

On the second page of the wizard, you will be prompted to select the DAC package you would like to use for the upgrade. Browse to the DACPAC file you created earlier, then click Next.

Screenshot of the "Upgrade Data-tier Application" wizard on the page where you choose the DACPAC you want to deploy.
The page of the “Upgrade Data-tier Application” wizard where you choose the DACPAC file you want to deploy to the database

After you click Next, the system will start detecting changes between the target database and the DACPAC you selected, which may take a few minutes.

Screenshot of the "Upgrade Data-tier Application" wizard on the page where you have the wizard detect changes between the DACPAC and the target database.
The page of the “Upgrade Data-tier Application” wizard where the wizard checks for differences between the DACPAC and the target database

If you have already imported the DACPAC before or have anything on your target database that is different from the source, you will receive a warning like this:

Screenshot of the "Upgrade Data-tier Application" wizard on the page where you must specify that you want to proceed with the upgrade despite possible data loss.
The page of the “Upgrade Data-tier Application” wizard where you confirm that you are okay with proceeding despite the risk of data loss from deploying the DACPAC

I got that warning because I had already imported the DACPAC, so I no longer had a blank database. In this situation, you are required to check the box acknowledging that you’re okay with overwriting the existing structure by continuing. If it’s your first time importing the DACPAC onto a blank database, you should instead see a green check on this page.

On the next page, choose whether or not you want the process to roll back in the event of a failure during the “upgrade”. I chose not to have it roll back, since this was going to be a new dev database that I could tweak, fix, or simply recreate if something went wrong. You may want to choose differently.

Screenshot of the "Upgrade Data-tier Application" wizard on the page where you choose rollback options.
The page of the “Upgrade Data-tier Application” wizard where you choose the rollback option you want the wizard to use when deploying the DACPAC

The final step is to review the “upgrade” plan. On the “Review Upgrade Plan” page, you will see a list of all changes that will be applied to the database. If you’re happy with the proposed list of changes, click Next until the upgrade process starts and the DACPAC gets applied. Once that process completes, you can close the wizard and check on your database through the Object Explorer to see the database objects that have now been deployed.

Screenshot of the "Upgrade Data-tier Application" wizard on the page where you review the differences between the DACPAC and the target database that have been identified.
The page of the “Upgrade Data-tier Application” wizard where you review all the changes that will be applied based on the DACPAC

Can’t I just import the DACPAC from the Azure Portal?

This was the first thing I tried when importing my DACPAC into an Azure SQL Server. I uploaded my DACPAC file to a storage account, then clicked “Import” on the SQL Server page in the Portal. This started a new deployment in which the SQL Database was created, but the deployment of the DACPAC to the database then failed, leaving me with an empty database on my server. The error I received when the deployment failed was about not specifying a BACPAC file for the import process, so the Portal import expects that file type instead of a DACPAC.

If you can figure out how to import a new database to a SQL Server using the Portal, I would love to hear about it in the comments below.

Summary

If you, like me, were confused about how you could import a DACPAC file into a database on an Azure SQL Server (because the Microsoft documentation for this isn’t great), I hope these instructions help you with your import. It is as simple as creating the DACPAC file and then using a wizard to “upgrade” or “deploy” your database using the DACPAC.

Moving Data Between Azure SQL Databases–the Fast Way

I have done more database creation in the past couple of weeks than I had done in the previous year combined, all because my team is working on several different AI chat bots for our business users and we are being very agile about our development process. At first we had a separate Azure SQL Database for each chat bot to use for its backend, but then I decided it would be more prudent, cost effective, and easier to manage if we combined the three related bots into a single database. This meant that I needed to move data from three disparate databases into one, and I wanted to do it easily, without any extra tools.

Since I only had a few hundred rows to move from each source database, I decided to quickly export the data to a flat file and then import that flat file into the new database. As far as I am aware, this should work with all modern types of SQL Server (on-prem, IaaS, Azure SQL Database). Read on to learn how to follow the same process I did.

Exporting the data from the source

On your source database in SSMS, run a SELECT query to retrieve all the data you want to copy over to your target database. Once you have your result set, select all records, right-click on them, then choose “Save Results As…” from the menu.

Screenshot showing the "Save Results As" option when right-clicking on the results of a query in SSMS
You can export your query results using “Save Results As…” from the right-click dialog

That will open the normal “Save As” dialog, and you can choose where to save your results. Save the file to your computer.

Once the file has been saved, I recommend opening the CSV in Notepad++ or a similar tool to review the formatting of the data. A few times when I went through this process, the column names did not get exported to the flat file, and the columns were instead generically named “Column1”, “Column2”, and so on (see the final screenshot below for what this looks like when trying to import). Chances are that you won’t run into this issue, but it doesn’t hurt to check. If for some reason your file doesn’t have the column names you expect, simply export the query results to a file again and that should fix the issue.

Importing the data to the target

To import the data onto your target database, connect to it in SSMS as well, right-click on the database name in the Object Explorer, then choose “Tasks” > “Import Flat File”. This will open a wizard to help you import the data.

Screenshot showing where to navigate to in the database menu to import a flat file
Import a flat file using the Task “Import Flat File”, which will open a wizard tool for you

In the dialog that opens, browse to the file that you previously exported from the other database. Once you select that file, a table name is generated that the data will be imported to. You can change the name of that table if you wish. Also, choose the schema that table will go into. This process creates a new table from the imported data, so make sure you don’t use an existing table name.

Screenshot of the Import Flat File wizard where you specify the file you want to import and the table you want to load the data to
Specify the CSV file you created during the export process and the name of the new table you want to load the data into

Click “Next” to move to the next page, where you can preview the data that will be imported. If the data looks good to you, click “Next” to move to the “Modify Columns” page where you can change some settings on the table columns if you would like. I chose to “Allow Nulls” for all columns, just to make things easier for myself in this import process. I plan on deleting this table after I copy the data from it into the final target table, so it doesn’t matter to me if every column allows nulls.

Screenshot of the Import Flat File wizard showing the screen where you can modify the columns of the new table you are going to create during import
This screenshot shows you what can happen if the export process goes a little wonky–the column names from the source may not be put into the CSV file and thus only show as generic names in the Import wizard. From this menu, though, you can modify the properties of the columns in the new table that is going to be created for the import.

Keep clicking “Next” to move through the rest of the pages, review the process before you opt to import, then import the data. Once the import wizard has completed its work, you will now have a new table in your database, essentially a staging table.

To get the data from your staging table into the final target table in your database, you can use an INSERT INTO tablename SELECT... query. This step also allows you to modify the data before moving it into the target table if you need to. In my case, I needed to add an extra column that the source table didn’t have but the target requires, so I hard-coded the value for that column into my SELECT statement so it would be added to the final table.
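For example, something like this, where all the table and column names are made up for illustration:

-- Move rows from the imported staging table into the real target table,
-- hard-coding a value for a column the source data didn't have
INSERT INTO dbo.TargetTable (CustomerID, CustomerName, Region)
SELECT CustomerID, CustomerName, 'NorthAmerica' AS Region
FROM dbo.ImportedStagingTable;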

Cleanup

After you have moved your data into its final destination, make sure to clean up the temporary new table that you created during the import wizard process, to keep things as neat as possible in your database.
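Assuming the hypothetical staging table name from the example above, that cleanup is a one-liner:

-- Drop the staging table once the data is in its final home
DROP TABLE IF EXISTS dbo.ImportedStagingTable;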

Summary

Using this method to copy data between two Azure SQL Databases may be considered a bit lazy, but I truly think it’s the most efficient way to get data from point A to point B. Sure, I could have made a pipeline or used some other system to move the data around, but I only had a few hundred records to copy over so this method worked great for me. Let me know if this method saved you time as well!

How to View Pipelines Associated with A Synapse Data Source

This morning I received a question from a colleague about how to set up a new linked service for an on-prem server to use as a data source for a pipeline. As part of reviewing our current pipeline setup to make sure I answered properly, I needed to find an example pipeline where we were already doing that. I had a linked service and associated data source doing exactly what I needed to share with this other person, but then I got to wondering whether that data source was even used anywhere and set out to find out. As per usual though, Google betrayed me and wasn’t giving me anything related to the question I was asking, so I had to figure out the answer for myself. That is why I am now sharing the simple way you can see which pipelines are using which data sources, and thus linked services, in your Synapse Analytics workspace.

Seeing the Data Sources associated with a Linked Service

In Azure Synapse Analytics studio, it is very easy to figure out what data sources have been created using a given Linked Service. To view this information, you go to the “Manage” page in the workspace, then select the page for “Linked Services”.

Screenshot showing the left hand menu in the Synapse Analytics studio
Open the Manage page to view Linked Services
Screenshot showing the "Manage" menu with Synapse Analytics studio
Open the Linked Services page under Manage to view all Linked Services in the workspace

Once you are on the Linked Services tab, there is a column in the list of services called “Related” which lists the number of related data sources for each Linked Service. If you click on the number, it will open a new pane of information telling you which data sources are created using that Linked Service.

Screenshot showing the Linked Services page in the Synapse Analytics studio management tab, where you can view the data sources related to a Linked Service.
Click on the linked numbers in the Related column of the Linked Services list to see data sources using the Linked service
Screenshot showing the list of data sources related to a Linked Service
When clicking on the linked numbers in the Related column, you will see this Related pane.

In that “Related” pane, the names of any related data sources will be links, so you can click on those to take you to the data source editor for that specific data source within the workspace.

Seeing which pipelines use a Data Source

Once you navigate to the Data Source editor, you are then able to view which pipelines are using that data source. To view that information, you simply need to click on the “Properties” icon in the top right corner of the Data Source editor tab.

Screenshot showing where to locate the Properties panel of a data source so you can view the related pipelines
The Properties pane for the Data Source can be accessed from the top right corner of the Studio window.

Once you have the Properties pane opened, click the “Related” tab to view the list of any pipelines that are using the data source.

Screenshot showing the list of pipelines related to the current Data Source
In the Properties pane for the Data Source, all related pipelines can be found on the Related tab

If there are no pipelines using the data source, you will instead see this on the “Related” tab.

Screenshot showing an empty Related Pipelines panel for a given Data Source in Synapse Analytics studio
If there are no pipelines using the Data Source, you will see “No items to show”

Summary

I wish I had looked for this information sooner, because I think it could have saved me some time while working with Synapse pipelines. Viewing the Data Sources related to a Linked Service and then what pipelines are using a given Data Source is simple to accomplish and helpful for seeing the relationships between items in your Synapse workspace.

Azure Storage Explorer for Moving Files

Once again I am sharing a quick tip that will potentially speed up your work process, this time by using Azure Storage Explorer for moving database backup files between servers. This process saved me so much time this past weekend while completing a server migration, which is why I wanted to cover it with a quick post.

What is Azure Storage Explorer?

Azure Storage Explorer is a desktop application that helps you manage your Azure Blob Storage and Azure Data Lake storage. Before this past weekend, the only purpose that I had used it for was to propagate Access Control Lists (ACLs) for storage accounts, which is also a very helpful task it can accomplish. However, what I want to focus on for this post is how you can also use this tool to very quickly transfer files between servers, as long as those servers have access to the internet.

Moving files with Azure Storage Explorer

If you are ever in a situation where you are migrating a database from one server to another using the backup/restore method, and that backup file is very large even after being compressed, you may want to try this method of moving the file between your servers. This of course only works if you use Azure as your cloud provider.

With previous database migrations, I tried different methods of moving my files between servers, and I even have an old post discussing this. But this past weekend, I was doing the production migration for a database, and the file was larger than previous ones I had moved between servers, even after creating the backup in a compressed format. The first transfer method I tried was to drop the file onto a network share and then copy it from that share onto the destination server. While that worked, it was pretty slow, taking about 30 minutes to complete. That would have been… fine… if I hadn’t run into issues with the backup, which forced me to generate a new backup file that needed to be copied over as well. Since I didn’t want to force the rest of the upgrade team to wait for that again, I started trying a new method.

While that slow copy was in progress, I quickly downloaded Azure Storage Explorer (ASE) onto the source server and uploaded my backup file to a storage account in our Azure subscription. To my surprise, the upload of the 15 GB file took just a minute or two, if I recall correctly. Whatever the exact time, using ASE was significantly faster, and it didn’t cause a browser memory issue like when I attempted to upload the same file to the same storage account manually through the Azure portal. For some reason, even though I have gotten a manual upload to a storage account to work in the past, I have recently had issues with my browser, both Chrome and Firefox, crashing partway through the upload. So this new, faster transfer method is a huge win for me!

Then I downloaded and installed ASE onto the target server as well, and the download of the file from the storage account through the application was just as fast as the upload had been. I had my database backup copied over to the new server in the time it took the network drive copy to reach only 20% done, so I gratefully cancelled that copy process. I was happy about the speed of ASE, and I am sure my colleagues were happy they didn’t have to wait on the database part of the upgrade even longer.

Why is this so much faster?

Given how much faster the upload and download for my file went using Azure Storage Explorer compared to every other method, I really want to know how it manages to achieve that. Unfortunately, it seems that the information about why and how it manages to be so fast is limited. Part of the speed obviously came from our own network speed, but some of it certainly comes from something special with the tool since trying to upload manually through the portal proved to be super slow, and would then crash out in the browser.

From the few resources I’ve found online (listed in the Resources section below), it seems that the performance of ASE comes from how it uses the azcopy tool to parallelize file transfers and use multiple cores on the host machine to do so. Whatever makes it work so quickly, I am very happy that it exists and will likely be using this method of copying files between servers going forward. Downloading and installing ASE, then using it to upload and download files, is still much faster than trying to move big files any other way.

Summary

If you need to move a large file between two servers in your Azure cloud environment, I highly recommend using Azure Storage Explorer to drop that file onto a storage account, which will complete very quickly as long as your internet speed is normal, and then download that file using ASE as well, which will be equally fast. There are other methods of copying files between servers that I’ve discussed before, but this is now going to be my preferred method.

Resources

  • https://stackoverflow.com/questions/57729003/blob-code-download-much-slower-than-ms-azure-storage-explorer
  • https://azure.microsoft.com/en-us/blog/azcopy-support-in-azure-storage-explorer-now-available-in-public-preview/
  • https://learn.microsoft.com/en-us/azure/storage/storage-explorer/vs-azure-tools-storage-manage-with-storage-explorer?tabs=windows
  • https://azure.microsoft.com/en-us/products/storage/storage-explorer

How to easily transfer logins between servers

During server upgrades and migration projects, you will likely run into the task of transferring the server logins from the old server to the new one. For non-SQL-authenticated users, this is simple. But if your source server happens to have more than a handful of SQL Authentication users, or you don’t have access to the passwords of those users, it would normally be very difficult to transfer those logins to the new server.

I recently set up three new test environments based on a current production environment, and there were literally dozens of SQL Authentication users on the production server that needed to be copied to the new environments. I didn’t have access to the passwords of any of those users, so I couldn’t just recreate them manually. That is where two non-standard Microsoft stored procedures come in very handy to help migrate all these users.

Some Context

If you haven’t been challenged with this task before–copying SQL Authentication users from one server to another–you may not know why it can be difficult, so I’ll describe that first. The problem is that unless you know the password of the account, scripting out the login as a CREATE script will not give you a script that recreates the login exactly as it is on the original server. For good reason, when you right-click on a login and opt to script it as a CREATE statement, SQL Server generates the script but fills in a randomly generated string for the password rather than the true password. If you took that script and ran it on your new server, the login would be created, just with a different password than it originally had, which could cause a lot of problems for whatever or whoever is using that login. (Side note: I don’t know if I’ve seen this before, but when I scripted out my test login on my personal computer to demonstrate for this post, SSMS automatically added a comment saying the password is random, which is a helpful note.)

Screenshot showing the script that is generated if you right-click on a login and opt to "Script as CREATE", with a random password being generated for the script
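In text form, the generated script looks roughly like the following. The password here is a stand-in for the random string SSMS generates, not the login’s real password, and the exact options will mirror your login’s settings:

-- SSMS notes in a comment that the scripted password is random, not the real one
CREATE LOGIN [TestUser] WITH PASSWORD = N'<random string generated by SSMS>',
    DEFAULT_DATABASE = [master], CHECK_EXPIRATION = OFF, CHECK_POLICY = OFF;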

The Helpful Stored Procedures

Given the above context of why transferring these logins isn’t as simple as scripting them on the source server and running those scripts on the destination, how then are we supposed to accurately copy those logins to a new server? While there are some StackOverflow answers online that essentially provide a homegrown method of achieving the same goal, they all seemed overly complicated to me, so I didn’t want to attempt them. Thankfully, I found the Microsoft-approved method online that very easily does all the required work for you.

To create the stored procedures that accurately script out all your SQL Authentication logins without generating any security issues or wrong password problems, go to this Microsoft document page. On that page, you’ll want to scroll down a bit until you see “Method 2” in the bulleted list, which is where you find the scripts to create the helpful stored procedures.

Screenshot of the section of Microsoft documentation where it mentions the two stored procedures that can generate scripts to recreate SQL Server logins

Note: For your own safety, please review the code before you run the script to ensure you know exactly what it’s doing and that it won’t cause issues for your specific system.

Copy the scripts from the web page into a new query window in SSMS and then execute them on the master database of the server you wish to copy logins from.

The script as it exists on the Microsoft website starts with the USE master command, but this will not work if you are trying to run the script on an Azure SQL Database, which no longer allows the USE statement to switch database contexts. If you are running the query on such a database, you will need to remove that top line and make sure yourself that you are running the command on the master database.

Scripting the Logins

Once you have executed the Microsoft script, there will now be two new stored procedures on your master database: dbo.sp_hexadecimal and dbo.sp_help_revlogin. The first one is used by the second, and you will only be executing the second procedure manually.

When you are ready to generate scripts to recreate all the logins on your server, you can execute the stored procedure dbo.sp_help_revlogin on the master database. Doing that will return data in the “Messages” tab of the results window with all necessary CREATE statements, including encrypted/hashed passwords, for your logins.

EXEC dbo.sp_help_revlogin
Screenshot of the basic output of the stored procedure which generates the create scripts for SQL Server logins

Reviewing the output further, I can see that the SQL Authentication users I have on my server each have a “password” value displayed, which I can assure you is not the actual password I set for them. Instead, the password value you see in the script is the securely hashed version of the password stored in the depths of the server security settings. The script includes the HASHED keyword at the end of the password, which notifies the new server that the password should be handled as the hashed value it is, so the server doesn’t literally set the login password to the provided value.
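For illustration, the statements in the output look roughly like this, with the hexadecimal password hash and SID shortened and made up here:

-- Login: TestUser
CREATE LOGIN [TestUser] WITH PASSWORD = 0x0200A1B2C3D4 HASHED,
    SID = 0x1A2B3C4D, DEFAULT_DATABASE = [master],
    CHECK_POLICY = OFF, CHECK_EXPIRATION = OFF;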

Screenshot showing the output of the stored procedure that generates scripts with hashed passwords to recreate SQL Authentication logins

Creating the New Logins

Copy the output presented in the “Messages” tab of your query window into a new query window connected to the server you want to add the logins to. Now, to create all your logins from the original server, all you need to do is execute the script. It’s as simple as that! You have now copied all your logins, with their original passwords intact, to your new server.

Summary

Copying SQL Authentication logins from one server to another doesn’t have to be a difficult or time-consuming process involving looking up old saved passwords or writing your own script to maybe fully copy the logins. Instead of trying to write a homegrown solution yourself, save yourself the time and effort and use the two stored procedures that Microsoft already wrote and shared to do the work for you.

The Fastest way to Add Azure Firewall Rules

To keep resources secure in the Azure cloud environment, there are usually multiple levels of security that must be cleared before someone can access a resource. For Azure SQL Databases, for example, the user trying to access the database must have access granted to their user account on the database, but their IP address also needs to be allowed through the network firewall rules for the server or database resource.

I usually only need to add or update a single user’s firewall rule at a time, when our business users get their IP addresses updated sporadically, but I had a situation last week where I needed to add over 40 firewall rules to an Azure SQL Server resource for a new database I imported on the server. I did not want to manually add that many firewall rules one at a time through the Azure Portal, because that sounded tedious, boring, and like too many mouse clicks, so I instead figured out how to do it the fastest way possible–running a T-SQL script directly on the server or database through SQL Server Management Studio (SSMS).

Note: This would also work through Azure Data Studio or your chosen SQL Server IDE; I simply prefer SSMS, so I use it as my reference in comparison to completing the same actions directly through the Azure portal.

Server vs. Database Firewall Rules

According to this Microsoft document, it should be possible to grant firewall rule access for a user to a single database on a logical Azure SQL Server resource. In my experience in Azure so far, we’ve only ever granted firewall access at the server level, since that is acceptable for our business use cases. But since it’s possible to set the same firewall rules at the database level according to Microsoft, I have included how to do that in this post, as well as the server-level rules, which are what I will be creating for my use case.

T-SQL for Creating Firewall Rules

Did you know it was possible to create both database-level and server-level firewall rules for an Azure SQL Database through SSMS using T-SQL commands? I didn’t before I started this project request last week. Going forward, I will likely use the T-SQL route through SSMS when needing to make multiple firewall rules instead of using the Azure portal, to save myself time, effort, and mouse clicks.

Create a Server-Level Firewall Rule

To create new firewall rules at the server level, you can connect to your Azure SQL Server through SSMS and run the following command on the master database.

/* EXECUTE sp_set_firewall_rule N'<Rule Name>','<Start IP Address>','<End IP address>'; */
EXECUTE sp_set_firewall_rule N'Example DB Rule','0.0.0.4','0.0.0.4';

When you run that command, you are executing a built-in stored procedure on the master database that inserts a new record for a server-level firewall rule into the table sys.firewall_rules, creating the rule that allows access through the specified IP address or IP address range. In the example above, I have the same value for both the Start IP Address and End IP Address parameters, but you can just as easily set them to a range of addresses, like 10.0.5.0 for the start and 10.0.5.255 for the end. I usually prefer a range for a user’s IP address, since our systems can change the last octet fairly regularly, but since my current task was to grant access for servers, I set the start and end values to the same single IP address because I’m not expecting them to change frequently.
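As a quick sketch of that range scenario, a rule covering a whole block of addresses would look like this (rule name and addresses made up):

EXECUTE sp_set_firewall_rule N'Example Range Rule','10.0.5.0','10.0.5.255';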

Create a Database-Level Firewall Rule

If instead of granting access to all databases on the server through a server-level firewall rule you would prefer to grant access to only a single database on your Azure SQL Server instance, you can do that as well. The T-SQL command to create a database-level firewall rule is the following. Notice how it’s very similar to the server-level command above, but with “database” specified in the stored procedure name.

/* EXECUTE sp_set_database_firewall_rule N'<Rule Name>','<Start IP Address>','<End IP address>'; */
EXECUTE sp_set_database_firewall_rule N'Example DB Rule','0.0.0.4','0.0.0.4';

The parameters that are required to run this database stored procedure are the same as what’s required for the server-level procedure, so you can either specify the same Start and End IP address values to allow access through a single address, or you can specify different values for the two to give access to a range of addresses. This procedure inserts records into the system table sys.database_firewall_rules on the database you ran the command on.

Speeding up the Creation of Execution Scripts

Knowing the above commands is helpful, and even sufficient if you only have one or two firewall rules to add, but I promised I would tell you the fastest way possible to add 40+ firewall rules to a server, so how do I do that? How do I save myself the trouble of writing the above procedure execution commands 40+ times, or copying, pasting, and updating the line over and over again?

The trick I use is to generate my SQL queries in Excel, since Excel has the ability to concatenate strings and then to use the same formula across however many rows you have to generate the same concatenated string using multiple distinct values. I use this Excel trick quite frequently, whenever I need to generate the same type of SQL query/command multiple times based on specific values.

In this use case, I was provided a list of the names of the servers that needed inbound firewall access to my Azure SQL Database along with the IP addresses they would be using. I copied that information into two columns in Excel then wrote a formula in a third column to generate the SQL command that I needed to run to make a firewall rule for that particular server and IP address.

Here is an example of how I accomplish this type of task in Excel:

Screenshot of text in Excel demonstrating how to quickly develop SQL queries for adding firewall rules

In Column A of the spreadsheet, I copied in the list of the server names I was provided. I am going to set the name of the firewall rule for each to the name of the server that is being given access. Column B then has the IP address that the server will be connecting through.

Note: all server names and IP addresses were made up for this example.

Then in Column C, I use the CONCAT function of Excel to generate the SQL query that I want, which I will then be able to copy to the database and execute to generate the firewall rule. The following is the full formula I used to generate the SQL command:

=CONCAT("EXECUTE sp_set_firewall_rule N'",A2,"','",B2,"','",B2,"';")

After I’ve successfully made the command as I want it for the first server in the list, I then copy that same formula down the column for every row to generate the same command for each server and IP address combination. Once all the queries have been created, I copy the entire Column C into an SSMS query window:

Screenshot of SQL queries in SQL Server Management Studio for adding firewall rules to the server
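For a sense of what that pasted batch looks like, here are a few of the generated commands, with server names and IP addresses made up as in the screenshots:

EXECUTE sp_set_firewall_rule N'SERVER1','10.0.5.4','10.0.5.4';
EXECUTE sp_set_firewall_rule N'SERVER2','10.0.5.5','10.0.5.5';
EXECUTE sp_set_firewall_rule N'SERVER3','10.0.5.6','10.0.5.6';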

I then just have to click Execute and all the commands run, creating all the new firewall rules I needed in just a couple seconds. Believe me, this method of using Excel will save you a lot of time copying and pasting and updating IP addresses in the queries.

View existing Server-Level Firewall Rules Through SSMS

If you would like to review the list of server-level firewall rules that currently exist on your server, you can do so by running the following query on the master database of your Azure SQL Server instance.

SELECT id, [name], start_ip_address, end_ip_address, create_date, modify_date
FROM sys.firewall_rules
ORDER BY id;

This list will directly correspond to the list of values that you would see under “Networking” for your Azure SQL Server instance in the Azure portal.

View existing Database-Level Firewall Rules Through SSMS

If you would like to review the list of database-level firewall rules that currently exist on your database, you can do so by running the following query on the database you would like to see the firewall rules for.

SELECT id, [name], start_ip_address, end_ip_address, create_date, modify_date
FROM sys.database_firewall_rules;

As far as I am aware, there is no way to view this same list of details from the Azure portal, so this should be the source of truth for database-level firewall rules for an Azure SQL Database.

How to Delete Firewall Rules Through SSMS

Just as creating multiple firewall rules is more efficient with T-SQL queries in SSMS, you can also quickly delete many firewall rules at once through SSMS using a delete procedure. I haven’t yet had to delete a large number of firewall rules at once, but you can follow the same Excel process I outlined above for adding them, using the deletion procedure instead.

Delete Server-Level Firewall Rules

The T-SQL command you can use to delete a specified server-level firewall rule through SSMS is:

EXEC sp_delete_firewall_rule N'SERVER1';

When deleting a rule, you only need to provide the name of the firewall rule you would like to delete.

Delete Database-Level Firewall Rules

The T-SQL command you can use to delete a specified database-level firewall rule through SSMS is:

EXEC sp_delete_database_firewall_rule N'SERVER1';

When deleting a rule, you only need to provide the name of the firewall rule you would like to delete.

Summary

In this post, I outlined the full process I used to generate the 40+ server-level firewall rules requested for my Azure SQL Server instance. Before starting on this task, I had no idea that it was possible to create these firewall rules through a T-SQL command in SSMS; I only knew about adding them manually through the Azure portal. But like every good programmer, I knew there had to be a better and faster way, and I was correct. I hope this post helps save you a little time as well the next time you need to add more than a couple firewall rules to your server.

SQL Query to Get Schema-Level Database Permissions

This is going to be a super quick post because I only want to share a short query that I’ve developed for myself to be able to see what schema-level permissions have been assigned to users on an Azure SQL Server. If you didn’t see my previous post about my least favorite things about Azure, you should check that out because it includes other queries I’ve been using to identify what permissions have been granted on a server or database.

The Query

The query I use for getting a list of all schema-level permissions on a database uses the sys.database_principals, sys.database_permissions, and sys.schemas views.

SELECT DISTINCT pr.principal_id, pr.[name] AS [UserName], pr.[type_desc],
    pr.[authentication_type_desc], pe.[state_desc], pe.[permission_name],
    pe.class_desc, s.[name] AS [SchemaName]
FROM sys.database_principals AS pr
INNER JOIN sys.database_permissions AS pe
    ON pe.grantee_principal_id = pr.principal_id
INNER JOIN sys.schemas AS s
    ON s.schema_id = pe.major_id
WHERE pe.class = 3 -- class 3 = SCHEMA, so only schema-level permissions are returned
ORDER BY [UserName];

As you can see, the query is very short but it provides the most useful and relevant information for schema-level permissions on a database. It provides just enough information without being overly complicated or giving you more than you need. This query tells me what users have been given what type of permissions for every schema in my database.

Figuring out What Version of SQL Server You Have

If you connect to a SQL Server instance using SSMS and you’re trying to find what version of SQL Server it is, you might be left scratching your head, since the version doesn’t seem to be in the server properties anywhere. Or at least there isn’t a value that lines up with the version numbers you’re used to hearing (2016, 2019, 2022, etc.). This will be a quick and easy post explaining how you can find the normal version name for a SQL Server you’re connected to.

How to Find the SQL Server Version

The first thing you need to do is connect to your server instance using SQL Server Management Studio (SSMS). After you connect, you should see the server listed in the Object Explorer, which is where we will be getting the version number from. To the right of the server name, in parentheses, you should see something like “(SQL Server 16.0.1000.6 – myusername)”.

That is technically the most correct version number for the SQL Server software you are running, but it’s not the one that most people think of when they think of SQL Server versions. To get to that version, you can look at the very useful table included on this blog post from SQL Server Builds.

For the example above, the server is really SQL Server 2022, because the “Release To Manufacturing” number of “16.0.1000.6” corresponds to that version of SQL Server, as can be seen in the table included in that blog.

You can also find the same version number by right-clicking on the server in the Object Explorer list, then selecting “Properties” which will bring up a dialog where you can see the version and other settings for the server.
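If you would rather query for this information than dig through SSMS menus, the SERVERPROPERTY function returns the same details from any query window:

SELECT SERVERPROPERTY('ProductVersion') AS [ProductVersion], -- e.g. 16.0.1000.6
    SERVERPROPERTY('ProductLevel') AS [ProductLevel],        -- e.g. RTM
    SERVERPROPERTY('Edition') AS [Edition];                  -- e.g. Developer Edition (64-bit)

You still have to map the major build number to the marketing name (16 = 2022, 15 = 2019, 14 = 2017, and so on), just like with the Object Explorer value.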

Note: For even more ways that you can find the version number of any SQL Server, there is this very useful blog post from SQL Server Builds again that I found super informative.

Summary

Now you should know how you can find the normal SQL Server version number for your instance based on the “Release to Manufacturing” number you will find to the right of the server name in the Object Explorer of SSMS. I wish they would put “SQL Server 2022” somewhere in the Properties of the server so we didn’t have to do this lookup, but for now we aren’t that lucky.

Note: Curious about what “Release to Manufacturing” really means? I found a great explanation on Wikipedia that clarified that term for me.

My Least Favorite Thing About Azure Databases

Now that I have been working in the Azure cloud environment for a few months at my new job, I have come to the realization that one thing related to Azure SQL Server instances annoys me more than anything else in the Azure environment. Obviously, every cloud environment is going to have its pros and cons, so this is only one small problem in the whole ecosystem. Azure has a lot of nice things to work with, so this complaint is very superficial, but I know others at my company have been confused by this issue, so I thought I would share the annoying difference here for educational purposes.

On-Prem SQL Server Permissions

The one thing that has been driving me crazy about Azure SQL Server instances recently is that there is not an easy way to quickly get an overview of the permissions a user/account/group has on the server and database.

In on-prem/normal versions of SQL Server, you can open the “Security” section of the server or database then right-click on the user or group you want to review the permissions for, and then you’ll get a GUI that quickly shows all the server- or database-level permissions that user has. The following screenshot shows the Server Roles for a given login on my local test server, so you can quickly know what permissions the login has.

And then if you go to the “User Mapping” page of the Login properties, you can see any databases the login has access to as well as the permissions that user has on the database.

I love this interface, which I’m pretty sure has been around for decades. It’s so much easier to view and quickly change the permissions for a login/database user than the new system in Azure-based servers.

Azure SQL Server Permissions

In comparison, you can’t easily view or determine the server- or database-level permissions any login or user has on an Azure-based SQL Server. You can’t even view any general properties of the user like you can for on-prem installations. For example, here is a screenshot of what you see on an Azure SQL Server when you right-click a Login from the Object Explorer:

If you do the same thing on the on-prem SQL Server, you can see that there’s an option to view the properties of the user, which will get you to the menus where you can see server- and database-level permissions, along with other properties as well.

If you can’t get a nice GUI representation of those login permissions, how are you supposed to know what permissions anyone has on your server or database? A SQL query.

While that kind of makes sense for a platform where you do most of your work with SQL, it does mean you need to do more work to write or track down the appropriate SQL query to get the necessary data, instead of right-clicking on an object and having its information displayed to you immediately. The SQL query approach is also tedious to me because you can only run it for a single database at a time, instead of being able to quickly view the permissions for every database on the server in one place. For example, if you have 3 databases on your server, you need to run the query individually for each of those databases. You can’t even utilize the “USE [database]” statement to quickly switch between the databases and get all the results in a single result pane (if you want to read more about that, I have another upcoming post discussing that topic).

This is the query I use to review database permissions:

SELECT DISTINCT pr.principal_id, pr.[name], pr.[type_desc],
    pr.[authentication_type_desc], pe.[state_desc], pe.[permission_name]
FROM sys.database_principals AS pr
JOIN sys.database_permissions AS pe
    ON pe.grantee_principal_id = pr.principal_id
ORDER BY pr.name;

If you want to view the permissions at the server level, run that query on the master database. Then to view the permissions specific to a single database, you will need to switch to that database (using the database dropdown from the toolbar above the query window, or updating your full server connection to specify the other database).

To view the role assignments for users in your database, you can use the following query, which utilizes the views sys.database_permissions, sys.database_principals, and sys.database_role_members to gather all the useful information related to role assignments.

SELECT u.[name] AS [UserName],
       r.[name] AS RoleName 
FROM sys.database_principals u
     JOIN sys.database_role_members drm ON u.principal_id = drm.member_principal_id
     JOIN sys.database_principals r ON  drm.role_principal_id = r.principal_id
WHERE u.[name] = N'TestUser';

Summary

While it’s not hard to get all role assignments and assigned permissions for logins and database users in Azure SQL Server instances, it isn’t as easy as it used to be with on-prem servers. I don’t think the new method of getting the permissions gets the information you need in as useful a way as the old method of using the GUI. I personally don’t prefer having to scroll through a table of information about each user and deciphering the details of their permissions, but I will continue to deal with it since that’s the only option for Azure SQL Servers currently.

How to Download SQL Server 2022

You would think that figuring out how to download the installer for SQL Server 2022 Standard or Enterprise edition would be a simple task, much like the process for downloading any other installer or ISO for any other version of SQL Server. But with 2022, Microsoft seems to have changed their method of getting the software to people, making it a lot more locked down than other versions. I’m not sure why this change was made, and if you have some insight, please feel free to share it in a comment, but it was made, and I have finally figured out how to get that installer under this new system. It took my colleagues and me a whole work day to figure this out. We had to work with our Microsoft account managers to get to the bottom of it, and it seems a little crazy to me that finding the ISO for this software would be that unintuitive.

Background

In my company’s current architecture, we create SQL Server virtual machines (VMs) by creating standard Windows servers through the Azure portal, and then one of the database developers will log on to the VM and install SQL Server through the ISO/installer, instead of creating a SQL Server VM from a provided image in Azure. In the past, when we needed to install SQL Server onto a VM, we would google something like “SQL Server 2019 download”, click on the link that takes you to the Microsoft downloads page, and get the ISO from there. When we tried to do the same for SQL Server 2022, we were taken to a downloads page that only allowed us to download the Developer or Express editions; for Standard or Enterprise, we could only get an evaluation version after providing our information to start a free trial.

We obviously did not want a short trial version of the software, we wanted the full Standard edition, but could not find a full installer/ISO for that version. For a few months, we gave up and installed 2019 instead of 2022 whenever we needed to create a new server, but we reached a point where we are required to use 2022 for some applications, so I set out to find out how to get the Standard edition of SQL Server.

The Proper Method

There are two different ways that we found we could get a Standard or Enterprise edition of SQL Server 2022, and I’ll start with the method that I think they’re intending people to use. If your organization uses SQL Server in Azure already, you will likely need to go through this same method to get the installer for 2022.

I unfortunately can’t get any screenshots of this method of the process because I’m not an elevated user in our organization, but there is a way for your Azure tenant owner (and a few other select individuals) to get the latest ISOs and installers for every piece of Microsoft software your company has licenses for. I believe it is at admin.microsoft.com, but you can talk to your account representative to find out for sure. You can then navigate to “Billing” and then “Your Products” and then “Volume Licensing” where you can view all your licenses and get the downloads for them.

This is how we resolved the installer issue in our organization; we had our tenant owner download the ISO from this location and save the file in a shared location. Going forward, whenever we need an installer for any Microsoft product that we use, we will be required to have that one person download the files and send it to us (not a super convenient method if you’re a developer that doesn’t have full access to your entire Azure tenant).

Once you have the ISO, you can run it as you usually would to install any version of SQL Server, and your license key will be prefilled into the installer so you don’t need to worry about specifying that at any time during the installation process.

The Workaround Method

Before we had talked to our Microsoft account managers, this is the route we were going to use to be able to install SQL Server 2022 on our VMs. Go to the SQL Server 2022 page on the Microsoft website (https://www.microsoft.com/en-us/sql-server/sql-server-downloads) then download the “Developer Edition”.

After you download that version, run the installer that you downloaded. When it first opens, click the option that says “Download Media”:

On the next page, click “Download”, which will download the ISO for SQL Server:

After the ISO file has downloaded, double-click on the file in File Explorer, which will mount it as a virtual DVD drive on your system and open another directory. In that directory, double-click on “setup.exe” to run the installer for SQL Server.

When the installer window opens, click on the “Installation” page from the left-hand pane, then click on the first link that says “New SQL Server standalone installation or add features to an existing installation”.

If clicking that link doesn’t seem to open anything or do anything at all, check out my other post about how to resolve that issue.

Click “Next” multiple times to move through the initial installation steps for SQL Server 2022. Review the information on each page before moving past it to ensure the settings are what you want them to be. When you come to the following page, keep the radio button for “Perform New Installation” selected then click “Next” again.

The “Edition” page is the one we’ve been needing to get to. On this page, you can change the selection from “Specify a free edition” to either one of the other two options to install Standard or Enterprise editions.

If you select the option for “Use Pay-As-You-Go billing through Microsoft Azure”, there is a dropdown menu that will let you specify Standard or Enterprise editions. Note that going with this option would then later require you to specify Azure Subscription information to bill the server usage to.

If you select the option for “Enter the product key”, you can provide a license key for SQL Server 2022 if you already have one. Review the two optional checkboxes under the license key box and select any options that apply to you. It looks like if you don’t specify a license key after choosing this option, you can still move forward with the installation, but presumably at some point you’ll be required to provide a license key to keep using the software.

Note: If you go through the “proper method” I listed above, the license key should be prefilled in the third option when you get to this page with the properly downloaded ISO file. For our situation, we also need to make sure the checkbox below the license key box “I have a SQL Server license with Software Assurance or SQL Software Subscription” is checked, since we use Software Assurance for our licensing.

After you’ve made your selections on the “Edition” page of the installer, you are ready to move through the rest of the installation process, specifying whatever options are required for your own version of SQL Server.

Summary

There are two different methods I have found for getting the installer for SQL Server 2022 Standard edition, and neither of them are necessarily intuitive. Based on your needs, you can choose either the “proper” or the “workaround” methods. I hope this tutorial has helped you download the software without having to spend a day figuring it out like my coworkers and I had to.