Wednesday, November 04, 2020

PASS Virtual Summit 2020: I'm (Virtually) Presenting

I'll be presenting at 7AM Central Time in the first timeslot of the main three-day PASS Virtual Summit 2020 conference. It's long been a goal of mine to present at the best of all international SQL conferences, and this is the first year it happened for me, so I'm thrilled to be a part of it.

It's not too late to register for the all-online event, with the same great quality content as always, at a fraction of the usual cost of going to Seattle.

Like many (but not all) presentations at PASS Virtual Summit, my 75-minute presentation will feature roughly 60 minutes of pre-recorded (and painstakingly edited) content, with the rest of the time available for live Q&A with the speaker. 

My presentation will cover a lot of important foundational material about security, accounts, and authentication.

  • For folks new to SQL Server security design and administration, this will be a great foundation for your learning. 
  • For those experienced in SQL Server administration, this will be a thorough review of what you know, or thought you knew, and maybe reveal some gaps in what you know. 
  • I think there is content in here to interest everyone in the SQL career lifecycle, and I’m not just guessing at that. I got my first DBA job in 2006. I’ve been giving a presentation on security basics at User Groups and SQLSaturdays for years; it was one of the first topics I started speaking on technically a decade ago. As my own experience has deepened and broadened throughout my career, so has the content I build into this presentation. 

So I’m going to start basic, and build quickly from there, focusing my content around common hurdles and tasks that database administrators face, in the hopes of deepening or broadening your experience, as well. 

I'm setting the stage for a good conversation around security at PASS Virtual Summit 2020, especially around how permissions behave inside each database, how you can design database security, and the relationships between logins, users, and databases. My session is one of a four-part Learning Pathway on security. The speakers worked together over the past four months to make sure we're presenting a thorough conversation on security. 
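
To illustrate the login/user relationship the session covers, here's a minimal T-SQL sketch (the login, database, and table names are hypothetical): a login is a server-level principal, a user is the database-level principal mapped to it, and permissions are granted to the user inside the database.

CREATE LOGIN [DemoLogin] WITH PASSWORD = N'ComplexPasswordHere!1'; -- server-level principal
USE DemoDatabase;
CREATE USER [DemoLogin] FOR LOGIN [DemoLogin]; -- database-level principal mapped to the login
GRANT SELECT ON dbo.DemoTable TO [DemoLogin];  -- permission applied inside the database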

In subsequent presentations over the next three days:

Monday, November 02, 2020

Protect your SQL Server from Ransomware: Backup Your Service Master Key and More

Not many disaster recovery or SQL migration/upgrade scenarios require the SQL Server instance service master key to be restored. 

Some do.

Recently, by far the most common disaster recovery scenario for my clients has been the need for a complete, bare-metal rebuild or a restore of the master database. The cause hasn't been hardware failure but ransomware (file crypto-locking) attacks. 

You should consider complementary backup solutions that back up or snapshot the entire server (or VM) hosting SQL Server, but sometimes these technologies are limited or have too much of an impact on the server. A whole-VM snapshot that relies on VSS, for example, could incur an unacceptably long IO stun duration when it occurs. 

Regardless, in all cases, SQL Server backups of each database should be taken regularly. That's a conversation for another blog post, but a typical pattern is weekly full backups, nightly differential backups, and, for databases not in the SIMPLE recovery model, transaction log backups every 15 minutes.
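
For illustration only, a minimal sketch of that pattern for a single database (the database name and backup path are hypothetical, and in practice these would be scheduled as Agent jobs or a maintenance solution):

BACKUP DATABASE [DemoDatabase] TO DISK = 'B:\SQLBackups\DemoDatabase_FULL.bak' WITH COMPRESSION, CHECKSUM; -- weekly
BACKUP DATABASE [DemoDatabase] TO DISK = 'B:\SQLBackups\DemoDatabase_DIFF.bak' WITH DIFFERENTIAL, COMPRESSION, CHECKSUM; -- nightly
BACKUP LOG [DemoDatabase] TO DISK = 'B:\SQLBackups\DemoDatabase_LOG.trn' WITH COMPRESSION, CHECKSUM; -- every 15 minutes, FULL recovery model only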

In the case of SQL Servers needing a rebuild from nothing but SQL Server backups, some of the key pieces of information from this checklist will be helpful:

1. The exact SQL Server version of each instance to recover, so that you can restore system databases and settings. Storing the output of the @@Version global variable is helpful. Store this in server documentation.
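
For example, something like this captures both the version string and a few parse-friendly properties:

SELECT @@VERSION;
SELECT SERVERPROPERTY('ProductVersion'), SERVERPROPERTY('ProductLevel'), SERVERPROPERTY('Edition');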

2. The volume letters and paths of the SQL Server data and log files. Output from the system view sys.master_files captures these so that you can recreate the volumes. Store this in server documentation.
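
For example, a query along these lines records every data and log file path on the instance:

SELECT DB_NAME(database_id) AS database_name, name AS logical_name, type_desc, physical_name
FROM sys.master_files
ORDER BY database_id, file_id;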

3. The service master key backup file and its password are needed to restore certain items in the master database, like linked server information. Though the master database can be restored without restoring the service master key, some encrypted information will be unavailable and will need to be recreated. Backing up the key is very easy to do; the catch is making sure that the backup file you create and its password are stored securely in enterprise security vault software. There are many options out there for something like this, and I won't list any vendors, but you should be able to store both strings and small files securely, with metadata, and with enterprise security around it, like multi-factor authentication.

BACKUP SERVICE MASTER KEY --not actually important for TDE, but important overall and should be backed up regardless.
TO FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\InstanceNameHere_SQLServiceMasterKey_20120314.snk' 
    ENCRYPTION BY PASSWORD = 'complexpasswordhere';
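
Restoring it on the rebuilt instance is the mirror image; a sketch assuming the same backup file and password retrieved from your vault (add FORCE only if you accept losing any encrypted content that can't be decrypted):

RESTORE SERVICE MASTER KEY
FROM FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\InstanceNameHere_SQLServiceMasterKey_20120314.snk'
DECRYPTION BY PASSWORD = 'complexpasswordhere';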

4. Database master key files, in the event they are present. Here's an easy script to generate backup commands for each database's master key, if one exists. Other keys in each database should be backed up as well, upon creation, and stored in your enterprise security vault.
exec sp_msforeachdb 'use [?];
if exists(select * from sys.symmetric_keys )
begin
select ''Database key(s) found in [?]''
select ''USE [?];''
select ''OPEN MASTER KEY DECRYPTION BY PASSWORD = ''''passwordhere'''';   
BACKUP MASTER KEY TO FILE = ''''c:\temp\?_''+name+''_20200131.snk''''
    ENCRYPTION BY PASSWORD = ''''passwordhere'''';
GO  ''
from sys.symmetric_keys;
END'
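
The script above only generates commands; each row of output is a statement to copy and run. For a hypothetical database named DemoDatabase, the generated backup command would look roughly like this:

USE [DemoDatabase];
OPEN MASTER KEY DECRYPTION BY PASSWORD = 'passwordhere';
BACKUP MASTER KEY TO FILE = 'c:\temp\DemoDatabase_##MS_DatabaseMasterKey##_20200131.snk'
    ENCRYPTION BY PASSWORD = 'passwordhere';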

5. Transparent Data Encryption (TDE) certificates, keys, and passwords. You should have backed these up upon creation and stored them in your enterprise security vault. For example:

BACKUP CERTIFICATE TDECert_enctest
TO FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\TestingTDEcert.cer'
WITH PRIVATE KEY (
    FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\TestingTDEcert.key', --This is a new key file for the cert backup, NOT the same as the key for the database MASTER KEY
    ENCRYPTION BY PASSWORD = '$12345testpassword123' ); --This password is for the cert backup's key file. 
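
On the rebuilt server, that certificate is re-created in master from the two backup files before restoring the TDE-protected databases; a sketch using the same files and password (assuming a database master key already exists in master):

USE master;
CREATE CERTIFICATE TDECert_enctest
FROM FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\TestingTDEcert.cer'
WITH PRIVATE KEY ( FILE = 'E:\Program Files\Microsoft SQL Server\MSSQL14.SQL2K17\MSSQL\data\TestingTDEcert.key',
    DECRYPTION BY PASSWORD = '$12345testpassword123' );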

6. Shared Access Signature (SAS) credentials, in cases where your SQL Server has been configured to use a SAS credential to, for example, send backups directly to Azure Blob Storage via the Backup to URL feature. You should save the script used to create the credential when it is created, and store it in your enterprise security vault.
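
For reference, the credential behind Backup to URL looks something like this (the storage account, container, and SAS token below are placeholders):

CREATE CREDENTIAL [https://yourstorageaccount.blob.core.windows.net/sqlbackups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=...'; -- the SAS token, without the leading ?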

7. The Integration Services SSISDB database password for the SSIS Catalog. You created this password when you created the SSISDB catalog and should have stored it in your enterprise security vault. You can always try to open the key to test whether your records are correct: 

USE SSISDB;
OPEN MASTER KEY DECRYPTION BY PASSWORD = N'[old_password]'; --Password used when creating the SSISDB catalog
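
If you've restored SSISDB onto a rebuilt or different instance, a common follow-up step is to open the key with that password and re-encrypt it with the new instance's service master key, so SSIS can open it automatically from then on:

OPEN MASTER KEY DECRYPTION BY PASSWORD = N'[old_password]';
ALTER MASTER KEY ADD ENCRYPTION BY SERVICE MASTER KEY;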

More information here on restoring the SSISDB key: https://techcommunity.microsoft.com/t5/sql-server-integration-services/ssis-catalog-backup-and-restore/ba-p/388058

8. Reporting Services (SSRS) encryption key and password. Back up and restore this key using the Reporting Services Configuration Manager, and store them in your enterprise security vault.


In the comments: what other steps have you taken to prevent or recover a SQL Server from a ransomware attack?

Monday, October 19, 2020

SQLSaturday #1000! Join Oregon Virtually this Saturday

Honored to be a part of not just SQLSaturday #1000, but the long proud history of the free community series of SQLSaturdays celebrating their 1000th+ event. (I spoke at SQLSat Memphis #1003 virtually last week, but the numbers are doled out at the time of event registration, not event occurrence.) 

SQLSaturdays have come a long way. Having attended my first SQLSaturday in Pensacola in June 2009 (#14), then volunteering and speaking at my first SQLSaturday in Baton Rouge two months later in August (#17), it's been awesome to see the community response, legitimacy, and growth of what is and will always be a free, volunteer-driven, authentic event. 

The future of SQLSaturdays needs to remember the story of its success. SQLSaturdays are supported by consultants who travel the world, sponsors who give away cool swag, and venues that host increasingly large and sophisticated events. But the core of the events is about making sure they are accessible to the community, including job seekers and students, for free. Especially in these dire economic times, I firmly believe that SQLSaturdays have a role in supporting the development and recovery of the local IT community. I regret that SQLSatBR didn't happen this year and I missed my friends there, even though it was clearly the best decision not to hold a 500+ person event during a pandemic. 

I'm very proud to have been accepted to speak at SQLSat Oregon, once as a speaker and once as a co-speaker with my wife Christine, on our favorite topic. Last year we attended and spoke at SQLSat Oregon in person for the first time and really enjoyed it; this year (like everything) it's virtual, and we'll be joining in pajama bottoms.

Join me in the first two timeslots, then enjoy the rest of SQLSat Oregon!

Certification Exams Inside Out - 6AM PST - Track 2 - Slide deck

Ethics in Modern Data - 7:30AM PST - Track 6 - Slide deck 

Wednesday, October 14, 2020

Spoooky SQL at the Baton Rouge SQL/.NET Campfire Stories!

Enjoyed the heck out of tonight's October special edition of the virtual Baton Rouge .NET and SQL User Groups meeting, where we had a lot of active chat, three "spoooky" tech stories from two devs and a DBA, and then a roundtable on why the UK Public Health system used .xls files to send critical COVID-19 data. Scary stuff!

My 2019 PASS Summit SQL Idol award-losing presentation on Spoooooky SQL, expanded for this format, is here. The super scary duplicate foreign key demo is here.



Tuesday, September 29, 2020

SQLSat Memphis Virtually This Saturday

Thanks for joining us at SQLSaturday Memphis this weekend, and kudos to all the organizers for a well-run virtual event!

Microsoft is offering any one certification exam for USD 15 to anyone who has become unemployed or furloughed due to the pandemic this year. If you're interested in learning more about how to prep for cert exams, how they're written, and how the test questions are constructed, I'm presenting on this topic.

9:40AM: Certification Exams Inside Out | Slidedeck available for download here

Both my wife and I were on the schedule, and the organizers chose interesting nontechnical topics for both of us. Christine's presentation at 10:50AM: Organizational Trauma: Supporting Mental Health During a Crisis

Wednesday, September 23, 2020

Confounding Variables from Historical Bias

Note: co-authored with Christine Assaf, originally published in the now-defunct PASS Blog.

Historical data analysis that is naïve to past discrimination is doomed to parrot bias. How do we combat bias in our data analytics? 

First, we already know that complex, multi-talented teams are best-suited to face complex, multi-faceted problems. A more diverse group of researchers and engineers is more likely to recognize and think harder about bias problems that may impact them personally. “It’s difficult to build technology that serves the world without first building a team that reflects the diversity of the world,” wrote Microsoft President Brad Smith in Tools and Weapons (2019). 

Second, data collection is often provided to us through real-world interactions, subject to real-world bias. No real-world data exists in a vacuum. We should understand that bias in data collection and analysis may be inevitable but is not acceptable. It is not your responsibility to end bias (though that’s a worthy cause), but rather to be proactively informed and transparent.  

Look for systemic outcomes, not intentions. Only in outcomes are potential disparate impacts measured. Let’s review a couple of examples. 

Many Americans are familiar with the ACT test, an exam many students take as part of the college application process. In 2016, ACT admitted an achievement gap in composite scores based on family income. According to ACT’s own data, there is a 3-4 point difference in scores between poorer and wealthier households, and the gap continues to widen.

Credit to ACT for disclosing their research. Transparency is part of accounting for bias in historical data and data collection and is critically important to furthering a larger conversation about inequality of opportunity. 

Recently, more than 1,000 American institutions of higher learning have adopted test-optional admissions policies, meaning they no longer ask for the ACT (or SAT, a similar exam) on applications. An overwhelming number of studies suggest that the ACT does NOT predict college graduation outcomes as strongly as other factors, including high school GPA and household income.

Knowing the variables involved in your analysis is important. When conducting analysis, researchers must review, identify, and anticipate variables. You will never find the variables unless you are looking for them.   

This is why a new rule proposed by the United States Department of Housing and Urban Development in 2019 stirred a massive reaction from technologists and ethicists alike. The proposed rules, yet to be implemented, would make it nearly impossible for a company to be sued when racial minorities are disproportionately denied housing, as mortgage lenders or landlords could simply blame their algorithm to avoid penalty.

Centuries of racially divided housing policies in the United States evolved into legalized housing discrimination known as redlining, ensconced in federally backed mortgage lending starting in the 1930s. The long history of legal racial housing discrimination in the United States was arguably not directly addressed until the 1977 Community Reinvestment Act. Yet today, 75% of neighborhoods “redlined” on government maps 80 years ago continue to struggle economically, and minority communities continue to be denied housing loans at rates far higher than their white counterparts. If discriminatory outcomes in housing are to change for the better, the algorithmic orchestration of mortgage lending should not be excused from scrutiny.  

In both cases, we examined industries where algorithms draw from data revealing larger societal outcomes. These outcomes are the result of trends of economic inequality and a pervasive opportunity gap. Are such data systems to be trusted as the result of an algorithm, and thereby inherently ethical? No, not without scrutiny.  

These are examples of why data professionals must be aware of the historical and societal contexts from which our data is drawn, and how the outcomes of our data findings could be leveraged. Would our outcomes contribute to justice? For example, the industries of financial marketing, healthcare, criminal justice, education, and public contracting have histories checkered with injustice. We should learn that history. 

Transparency is desirable. It is needed to aid an informed societal conversation. We should not assume that an algorithm can overcome historical biases or other latent discrimination in its source data. Informed scrutiny should gaze upon historical data with a brave and honest eye.

Monday, September 14, 2020

[Updated] Prepping the local SSD D: in an Azure VM for SQL Server TempDB

One of the few good, safe uses for the local SSD volume of an Azure VM is for SQL Server tempdb data and log files. 

Note: this blog post has been updated, specifically to solution 3b below. See comments.

Typically this is the D: volume of an Azure VM, though not always in the case of migrated VMs.

Moving your SQL Server instance's tempdb files to the D: volume is recommended for performance, as long as the tempdb files fit in the D: that has been allocated, which is based on your VM size. 
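
For reference, moving the files is a one-time change that takes effect at the next service restart; a sketch assuming the default logical file names and a D:\TempDb subfolder:

ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, FILENAME = 'D:\TempDb\tempdb.mdf');
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, FILENAME = 'D:\TempDb\templog.ldf');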

When the contents of D: are lost due to deallocation, as expected, the subfolder you created for the tempdb files (if applicable) and the NTFS permissions granting the SQL Server service account access to that folder are no longer present. Putting the tempdb data and log files in the root of D: doesn't help either, because after deallocation the NTFS permissions on the root of D: won't exist. In either case, SQL Server will be unable to create the tempdb files and will not start.

In the Windows Application Event log of the Azure VM, you'll see something like:
Event ID: 17204
FCB::Open failed: Could not open file D:\tempdb.mdf for file number 1.  OS error: 2(The system cannot find the file specified.).
Event ID: 5123
CREATE FILE encountered operating system error 5(Access is denied.) while attempting to open or create the physical file 'D:\tempdb.mdf'.

On Azure Windows VMs, the local SSD temporary disk is typically D:, and I'll assume it is for you as well. On Azure Linux VMs, the local SSD temporary disk is /dev/sdb by default. The solutions below are for the Windows OS.

Solutions


There are three solutions to this problem in Windows. To test any of them, don't just restart your VM; you have to deallocate it (stop it in the Azure Portal, then start it again). That's the only way to be sure the contents of D: have been lost. Note that data on the temporary disk may also be lost during Azure maintenance, which is mostly out of your control, so it is important to solve this issue when using the D: for tempdb.

Note that simply shutting down the Azure VM from inside the operating system does not deallocate the VM; it only stops the VM. During a normal, successful reboot of the VM OS, data on the D: will not be lost. A reboot initiated from inside an RDP session, for example, shouldn't cause the contents of D: to be lost. You must stop the VM from the Azure Portal to deallocate it. This is the difference between a VM's status of "Stopped" and "Stopped (deallocated)". 

1. Windows Scheduled Task Calling PowerShell

The first solution appears to be the one Microsoft documented years ago here, but it is no longer necessary. To be clear, Solution 3 below, using the IaaS Agent, is the recommended and best option now.

The old workaround was something like this: a simple PowerShell script running in a Windows scheduled task, triggered at startup. It also requires setting the SQL Server service to "Automatic (Delayed Start)". 

The script creates the subfolder on D: for the tempdb, then grants permissions to the SQL Server service account, before SQL Server tries to start.

The extra script and the delayed startup are an added complication, but this solution works fine.

2. Add SQL service account to local Administrators group

The second solution is to place the tempdb files on the root of D: but then make the SQL Server service account a local Windows administrator, so that it has permissions to write to the root of D:.  This violates most security best practices (though it is more common than you'd think). 

The key here is placing tempdb in the root of D:. Even as a member of the local Administrators group on the server, SQL Server will not create subfolders upon startup, and a subfolder on D: won't exist after deallocation. 

I'm not a fan, but it's a solution, and it is certainly the simplest, though it violates the principle of least privilege.

3a. Azure SQL Server IaaS Agent Extension, installed with VM (recommended)

The third and best solution involves the Azure SQL VM resource provider. It installs the SQL Server IaaS Agent Extension, which connects the SQL instance to the Azure portal for all sorts of easy integrations, bringing a lot of manageability to the Azure Portal. It is generally highly recommended. 

The SQL Server IaaS Agent Extension should handle preparing D: for the tempdb, solving startup issues for an Azure VM with SQL Server. This definitely works for VM images with SQL Server from the Azure marketplace, which by default configure tempdb on the local SSD D: anyway. I have tested this out with the newest "SQL Server 2019 on Windows Server 2019" image available in the Azure Marketplace. It works. For example:


"If you create the VM from Azure portal or Azure quickstart templates and place Temp DB on the Local Disk then you do not need any further action." Turns out, Solution 3 here is very similar to Solution 1. When installed from the marketplace with the SQL Server IaaS Agent installed, the Azure VM has a Windows Scheduled task triggered At Startup to call "C:\Program Files\Microsoft SQL Server IaaS Agent\Bin\SqlIaaSExtension.SqlServerStarter.exe".  

I speculate that this app runs on startup to prep the D: for tempdb, handling the permissions on the D:\TempDb subfolder for the SQL Server service account, which is not a member of the local admins group by default. In this case, unlike Solution 1 above, the SQL Server service is not configured to "Automatic (Delayed Start)"; it is still configured to "Automatic", as usual.

Again, to test this out, don't just restart your VM; you have to deallocate it (stop it in the Azure Portal, then start it again). That's the only way to be sure the contents of D: have been lost.

I have split this solution into two parts because 3a only appears to work if SQL and the SQL VM Resource Provider are pre-installed via the image gallery. 

3b. Azure SQL Server IaaS Agent Extension installed manually

But what about for an existing VM, not created with the SQL Server IaaS Agent Extension pre-installed from the Azure VM image gallery? This is currently unclear. [UPDATE August 2023]: This now works. The IaaS Agent now configures tempdb on the ephemeral local SSD D: drive, even if registered on the VM after SQL Server was installed. This solution should work the same as 3a, above: Manage SQL Server virtual machines in Azure by using the Azure portal - SQL Server on Azure VMs.

We can install the SQL Server IaaS Agent Extension ourselves by registering the VM with the SQL VM resource provider. Once the resource provider is associated with your Azure subscription, adding the VM is easy. Install the SQL Server IaaS Agent Extension in lightweight management mode, without restarting the VM, from the Azure CLI:
az sql vm create --name vm-iaasag-postsql --resource-group vm-iaasag-postsql_group --location eastus2 --license-type PAYG
Options for PowerShell are available in the same link.

However, you'll notice the SQL Server IaaS Agent Extension in lightweight management mode doesn't actually create the executable found in "C:\Program Files\Microsoft SQL Server IaaS Agent\Bin\". For that to be present, you need to move up to full management mode. You can upgrade from lightweight management mode or go straight to full management mode, but either way, full management mode costs a SQL Server service restart.

Currently though, if I create a new Windows VM without SQL Server, install a SQL instance, install the SQL Server IaaS Agent Extension in full management mode, and even schedule "C:\Program Files\Microsoft SQL Server IaaS Agent\Bin\SqlIaaSExtension.SqlServerStarter.exe" in a scheduled task triggered on startup (just like a VM from the image gallery has), it doesn't appear to prep the local SSD D:\ drive for tempdb upon startup after deallocation. The SQL Server service still fails to start because it cannot create tempdb. 

So, the SQL Server IaaS Agent Extension appears to prep the local SSD D:\ drive for the TempDB files, but only if SQL and the SQL VM RP are pre-installed via the image gallery. I'm still actively working on a solution to this and reaching out to Microsoft. I will update this blog post at a later date.

Saturday, September 12, 2020

All-Day at Virtual Houston Tech Fest 2020

It's a full day at Houston Tech Fest 2020 for us. Speaking at Houston TechFest is a decade-long annual tradition, interrupted only by a hurricane and one year when I missed it to be at a wedding. This year, for the first time, it's all online, thanks to a host of great volunteers acting as virtual moderators.

Christine and I each presented four times (once jointly), and each session was recorded to YouTube:










Wednesday, September 09, 2020

New Microsoft Azure Certifications presentation at BRSSUG

Hi internet! 

Thanks to the surprising number of you who joined our first ever virtual meeting of the Baton Rouge .NET and SQL Server User Groups tonight!

I presented on the new generation of Microsoft Azure certifications coming out, talked about what they did with our old MCSEs, and highlighted the new cert paths for developers, DBAs, and data analysts. I also talked about the new remote-proctored cert exams available for you while *all this* is going on.

Our first ever virtual UG meeting in Baton Rouge was made possible by the awesome enthusiasm of volunteer streaming manager Bedford Nieves-Sanchez!

Slidedeck download here. Best of luck on your upcoming cert pursuits!

The first ever online BRSSUG/BRDNUG meeting 

We peaked at 39 viewers during our first ever stream!