What is dynamic data masking (DDM)?
Dynamic data masking is a data protection feature that masks, or hides, sensitive data and personally identifiable information. DDM applies a mask to the result set of a query over certain database fields.
The term “dynamic” means that the information is masked on the fly when a user queries the database; the data itself does not change. In addition, users with the appropriate level of permissions can still see the unmasked data on a need-to-know basis.
DDM is a built-in feature for both Microsoft SQL Server 2016 and Microsoft Azure SQL Database. By giving you more control over how data appears in the result set, DDM helps you protect sensitive and confidential information.
The benefits of dynamic data masking
- DDM makes it easier to prevent the unauthorized disclosure of sensitive data. Administrators can allow privileged users or roles to have access to this data, while restricting access to others.
- DDM is simple to configure and use with Transact-SQL commands, and requires no changes to application code.
- DDM includes functions for both full and partial masking, as well as a random mask for use on numeric data.
- DDM can be coordinated via a centralized data masking policy.
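Because masking rules are defined with ordinary Transact-SQL, adding a mask to an existing column takes a single statement. As a minimal sketch (the table and column names here are hypothetical):

```sql
-- Add a default mask to an existing column; the stored data is unchanged.
ALTER TABLE dbo.Customers
ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'default()');
```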
The limitations of dynamic data masking
DDM is not a complete solution for database security. By itself, DDM can't prevent users with query access from inferring sensitive values through repeated, carefully crafted queries against the masked columns.
In addition, DDM is not a method for physical data encryption. DDM is intended to be used in conjunction with other data security features in SQL Server, such as encryption, auditing, and Row-Level Security.
Finally, certain actions are not possible with DDM. For example, users can’t define masking rules for Always Encrypted columns, FILESTREAM, or COLUMN_SET. In addition, you can’t use a column with data masking as the key for a FULLTEXT index.
Dynamic data masking functions
DDM includes four types of masking functions that you can use to conceal your sensitive data: default, email, random, and custom.
The default data mask function is applied according to the type of data field:
- String: Masked as XXXX (fewer Xs if the field is shorter than four characters)
- Date/time: Masked as 01.01.1900 00:00:00.0000000
- Numeric, binary, or image: Masked as 0
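For example, applying the default mask to a numeric column (table and column names here are hypothetical) looks like this; a user without the UNMASK permission will see 0 in place of the real values:

```sql
-- Default mask on a numeric column: unprivileged users see 0.
ALTER TABLE dbo.Employees
ALTER COLUMN Salary ADD MASKED WITH (FUNCTION = 'default()');
```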
The email data mask function displays only the first character of an email address and hides the rest of the address (e.g. “aXXX@XXXX.com”).
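A sketch of applying the email mask (hypothetical table and column names):

```sql
-- Email mask: exposes the first character and the .com suffix,
-- e.g. "jane.doe@contoso.com" is returned as "jXXX@XXXX.com".
ALTER TABLE dbo.Customers
ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
```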
The random data mask function is used to hide the value of a numeric data type by randomly selecting a numeric value between a specified minimum and maximum.
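The minimum and maximum are passed as arguments to the function. A sketch (hypothetical table and column names):

```sql
-- Random mask: unprivileged users see a random value between 1 and 100.
ALTER TABLE dbo.Orders
ALTER COLUMN DiscountPercent ADD MASKED WITH (FUNCTION = 'random(1, 100)');
```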
The custom data mask function (partial()) lets you create a mask of your own choosing for string data. It exposes a specified number of leading and trailing characters and inserts a custom padding string between them.
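For instance, a credit card number can be masked so that only its last four digits remain visible (table and column names are hypothetical):

```sql
-- partial(prefix, padding, suffix): expose 0 leading and 4 trailing
-- characters, padding the middle with a fixed string.
ALTER TABLE dbo.Customers
ALTER COLUMN CreditCardNumber
ADD MASKED WITH (FUNCTION = 'partial(0, "XXXX-XXXX-XXXX-", 4)');
```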
Dynamic data masking permissions
- UNMASK: Users must have this permission in order to reveal the original unmasked data.
- ALTER ANY MASK: This permission allows users to add, replace, and remove masks on existing columns.
- CONTROL: This permission includes both the UNMASK and the ALTER ANY MASK permissions.
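These permissions are granted and revoked with standard Transact-SQL. A sketch, using a hypothetical database principal:

```sql
-- Allow a specific user to see the original, unmasked data.
GRANT UNMASK TO AuditUser;

-- Return the user to seeing masked values.
REVOKE UNMASK FROM AuditUser;
```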
How to use dynamic data masking
To start using DDM with your database, first decide which columns you want to mask. Select the masking function (default, email, random, or custom) that best fits your needs for each column.
Next, alter the columns in the database to apply the desired masking rules. Remember that the underlying data will not be affected by these changes.
Finally, designate which people should have what level of access to this masked data. Configure the appropriate permissions (UNMASK, ALTER ANY MASK, CONTROL) for the right users and roles.
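The steps above can be sketched end to end. All table, column, and user names below are hypothetical, and the masking functions chosen are just examples:

```sql
-- 1. Define the table with masking rules on the sensitive columns.
CREATE TABLE dbo.Members (
    MemberId  INT IDENTITY PRIMARY KEY,
    FirstName VARCHAR(100) MASKED WITH (FUNCTION = 'partial(1, "...", 0)'),
    Phone     VARCHAR(20)  MASKED WITH (FUNCTION = 'default()'),
    Email     VARCHAR(100) MASKED WITH (FUNCTION = 'email()')
);

-- 2. A user with only SELECT permission sees masked values.
CREATE USER SupportUser WITHOUT LOGIN;
GRANT SELECT ON dbo.Members TO SupportUser;

-- 3. A privileged user is granted UNMASK to see the original data.
GRANT UNMASK TO AnalystUser;
```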