As database administrators, we are always looking for ways to automate our daily processes, and SQL Server Agent has long been a great tool for doing so, whether for scheduling regular maintenance or administrative jobs. Those of you making the leap to the PaaS offering of Azure SQL Database will quickly discover that SQL Server Agent is no longer available. If you are starting to panic at the thought of waking up at 2:00 AM to manually run your weekly maintenance or nightly administrative job, don’t worry: this is where Azure Automation comes to save the day! Azure Automation brings a PowerShell workflow execution service to the Azure platform, letting you automate those maintenance and administrative tasks entirely within the Azure portal and take over the role of SQL Server Agent. To demonstrate how you can leverage Azure Automation, I will walk through a common request I have encountered with many clients: scheduling a stored procedure execution.
To get started, we will create a new runbook under an existing Azure Automation account. There are two options for creating a new runbook. The first is a quick create, where you define the name and type of your runbook and are given an empty runbook that you can build on or paste existing code into. The second is to import an existing file, which must be a PowerShell workflow, PowerShell script, or graphical runbook; when importing, the runbook name and type are populated based on the file selected. For this example, we will import an existing PowerShell workflow designed for executing a stored procedure.
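As an aside, the same import-and-publish steps can be scripted with the Az PowerShell module, which is handy if you keep your runbooks in source control. A rough sketch; the resource group, automation account, file path, and runbook name below are all placeholders:

```powershell
# Import a PowerShell workflow file as a new runbook (all names are placeholders)
Import-AzAutomationRunbook `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Path ".\Execute-StoredProcedure.ps1" `
    -Type PowerShellWorkflow `
    -Name "Execute-StoredProcedure"

# Publish the imported runbook so it can be started or scheduled
Publish-AzAutomationRunbook `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "Execute-StoredProcedure"
```

This requires the Az.Automation module and an authenticated session (Connect-AzAccount), so it only runs against a live subscription.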
While we could have created the workflow with hardcoded values for a specific request, that would not scale: each future implementation would require another runbook and workflow. Instead, I built the workflow to accept parameters for the SQL Server name, database name, stored procedure name, and credential, so one runbook and workflow can serve many different implementations.
workflow Execute-StoredProcedure
{
    param (
        # Fully-qualified name of the Azure DB server
        [Parameter(Mandatory = $true)] [string] $SqlServerName,
        # Name of database to connect and execute against
        [Parameter(Mandatory = $true)] [string] $DBName,
        # Name of stored procedure to be executed
        [Parameter(Mandatory = $true)] [string] $StoredProcName,
        # Credentials for $SqlServerName stored as an Azure Automation credential asset
        [Parameter(Mandatory = $true)] [PSCredential] $Credential
    )

    Write-Output "JOB STARTING"

    inlinescript
    {
        # Setup variables
        $ServerName = $Using:SqlServerName
        $UserId     = $Using:Credential.UserName
        $Password   = ($Using:Credential).GetNetworkCredential().Password
        $DB         = $Using:DBName
        $SP         = $Using:StoredProcName

        # Create & Open connection to Database
        $DatabaseConnection = New-Object System.Data.SqlClient.SqlConnection
        $DatabaseConnection.ConnectionString = "Data Source=$ServerName;Initial Catalog=$DB;User ID=$UserId;Password=$Password;"
        $DatabaseConnection.Open()
        Write-Output "CONNECTION OPENED"

        # Create & Define command and query text
        $DatabaseCommand = New-Object System.Data.SqlClient.SqlCommand
        $DatabaseCommand.CommandType = [System.Data.CommandType]::StoredProcedure
        $DatabaseCommand.Connection = $DatabaseConnection
        $DatabaseCommand.CommandText = $SP

        Write-Output "EXECUTING QUERY"
        # Execute the query
        $DatabaseCommand.ExecuteNonQuery() | Out-Null

        # Close connection to DB
        $DatabaseConnection.Close()
        Write-Output "CONNECTION CLOSED"
    }

    Write-Output "JOB COMPLETED"
}
After importing our PowerShell workflow, it has a status of ‘New’ and needs to be published before it can be started or scheduled. To publish, we are required to go into Edit mode, where we can edit the PS workflow as well as save, publish, revert, check in, or test any changes made. Personally, I really like the Test pane for testing prior to publishing and ultimately scheduling the runbook. Within the Test pane, we can manually enter the desired parameters, start the runbook, and make sure there are no errors in our PowerShell workflow.
Upon entering all of the mandatory parameters, we can start the test, which submits the runbook for execution. We will see it pass through the queued, starting, and running phases before showing as completed, or, of course, surfacing error messages if there is an issue in our workflow.
Once our testing completes successfully, as expected, we are ready to publish and start scheduling our new runbook!
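Beyond the portal, a published runbook can also be kicked off on demand from PowerShell. A hedged sketch with placeholder names; note that for a PSCredential runbook parameter, Azure Automation expects the name of the credential asset as a string:

```powershell
# Start the published runbook on demand, passing its four parameters
Start-AzAutomationRunbook `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "Execute-StoredProcedure" `
    -Parameters @{
        SqlServerName  = "myserver.database.windows.net"
        DBName         = "MyDatabase"
        StoredProcName = "dbo.usp_NightlyMaintenance"
        Credential     = "SqlDbCredentialAsset"   # name of the credential asset
    }
```

As with the earlier sketch, this assumes the Az.Automation module and an authenticated session against a live subscription.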
We have now published our very first runbook, likely the first of many! We can see the runbook now has a status of ‘Published’, and we are able to do more with it, in particular start it on demand or schedule it for automated runs. Any good database administrator takes every opportunity to automate as much as possible, so let us get into scheduling our runbook!
When we’re on the overview page of the runbook, we see there is a ‘Schedules’ tile that will tell us how many schedules are linked to it and give us the ability to view existing or add new schedules. When adding a new schedule, we need to create or link a schedule to our runbook and configure any parameters or specific run settings.
When linking a schedule to our runbook, we can either select a schedule that already exists under our automation account or create a new one. For this case, we will demonstrate creating a new schedule. Similar to SQL Server Agent job schedules, we can provide a name, description, start time, recurrence settings, and expiration options. I found that the recurrence options are not nearly as flexible as SQL Server Agent’s, so you may have to get creative to meet your needs. Here we created a simple hourly schedule:
After creating and linking a schedule, we need to configure parameters and run settings. For this example, the runbook has four mandatory parameters that we need to configure. This is where we benefit from a parameterized runbook so that we can define multiple schedules with different parameters in just a matter of minutes without changing or deploying a new runbook.
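The portal steps above (create the hourly schedule, link it to the runbook, supply the four parameters) can also be scripted with Az PowerShell. A sketch under the same placeholder names as before:

```powershell
# Create a simple hourly schedule under the automation account
# (the start time must be at least a few minutes in the future)
New-AzAutomationSchedule `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Name "Hourly" `
    -StartTime (Get-Date).AddMinutes(10) `
    -HourInterval 1

# Link the schedule to the runbook and configure its parameters
Register-AzAutomationScheduledRunbook `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -RunbookName "Execute-StoredProcedure" `
    -ScheduleName "Hourly" `
    -Parameters @{
        SqlServerName  = "myserver.database.windows.net"
        DBName         = "MyDatabase"
        StoredProcName = "dbo.usp_NightlyMaintenance"
        Credential     = "SqlDbCredentialAsset"
    }
```

Because the schedule lives under the automation account rather than the runbook, the same schedule can be linked to several runbooks, each with its own parameter set.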
Now we see the new schedule listed with details on the next run and status. Back on the overview page, the ‘Schedules’ tile will represent the new schedule.
BOOM! We have the replacement to our SQL Server Agent job deployed and scheduled in Azure Automation for automating daily processes or maintenance!
One quick way to review the details of our executions is under the ‘Job’ tile on the overview page. Here we can see the execution history and drill into each job for even more detail, which we can leverage when troubleshooting issues.
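If you prefer checking execution history from the command line, the same job records can be pulled with Az PowerShell (placeholder names again):

```powershell
# List recent jobs for the runbook
$jobs = Get-AzAutomationJob `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -RunbookName "Execute-StoredProcedure"

# Drill into the output streams of the most recent job
Get-AzAutomationJobOutput `
    -ResourceGroupName "MyResourceGroup" `
    -AutomationAccountName "MyAutomationAccount" `
    -Id ($jobs | Select-Object -First 1).JobId `
    -Stream Any
```

The Write-Output lines in the workflow (JOB STARTING, CONNECTION OPENED, and so on) show up in the Output stream here, which makes them handy breadcrumbs when troubleshooting.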
Thanks for reading!