Create microsoft.sp_update_statistics.sql #42

Open · wants to merge 2 commits into main
4 changes: 2 additions & 2 deletions README.md
@@ -1,6 +1,6 @@
-# Azure SQL Data Warehouse Samples Repository
+# Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Samples Repository

-This GitHub repository contains code samples that demonstrate how to use [Microsoft's Azure SQL Data Warehouse](http://aka.ms/sqldw) service. Each sample includes a README file
+This GitHub repository contains code samples that demonstrate how to use the [Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)](http://aka.ms/sqldw) service. Each sample includes a README file
that explains how to run and use the sample.

## Releases in this repository
4 changes: 2 additions & 2 deletions arm-templates/sqlDwLogicAppAutoPause/readme.md
@@ -1,4 +1,4 @@
-# SQL Data Warehouse Auto-Pause and Auto-Resume using Logic apps
+# Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Auto-Pause and Auto-Resume using Logic apps

This sample includes 3 templates to showcase different scenarios for using Logic apps to pause and resume compute resources for a SQL Data Warehouse

@@ -11,7 +11,7 @@ In this scenario, you deploy a timer-based Logic app configured with a [service
**1. Deploy timeschedule-based autopause and autoresume template**

> [!NOTE]
-> You will need an existing SQL Data Warehouse to deploy the template below.
+> You will need an existing Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) to deploy the template below.

* Use the server-name and database-name properties of your data warehouse, and a pause-resume schedule to complete the deployment.
* You can additionally specify the timezone for the pause-resume schedule, and if the schedule should only run on weekdays.
2 changes: 1 addition & 1 deletion arm-templates/sqlDwLogicAppAutoScale/readme.md
@@ -1,4 +1,4 @@
-# SQL Data Warehouse AutoScale using Logic apps
+# Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) AutoScale using Logic apps

This sample includes 3 templates to showcase different scenarios for using Logic apps to autoscale compute resources for a SQL Data Warehouse

3 changes: 2 additions & 1 deletion arm-templates/sqlDwSpokeDbTemplate/README.md
@@ -1,4 +1,4 @@
-# SQL Data Warehouse Hub Spoke Template with SQL Databases
+# Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Hub Spoke Template with SQL Databases

<a href="https://ms.portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FMicrosoft%2Fsql-data-warehouse-samples%2Fmaster%2Farm-templates%2FsqlDwSpokeDbTemplate%2Fazuredeploy.json" target="_blank">
<img src="https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/1-CONTRIBUTION-GUIDE/images/deploytoazure.png"/>
3 changes: 2 additions & 1 deletion arm-templates/sqlDwTimerScaler/README.md
@@ -1,5 +1,5 @@

-# SQL Data Warehouse Timer Scaler
+# Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Timer Scaler

<a href="https://ms.portal.azure.com/#create/Microsoft.Template/uri/https%3A%2F%2Fraw.githubusercontent.com%2FMicrosoft%2Fsql-data-warehouse-samples%2Fmaster%2Farm-templates%2FsqlDwTimerScaler%2Fazuredeploy.json" target="_blank">
<img src="https://raw.githubusercontent.com/Azure/azure-quickstart-templates/master/1-CONTRIBUTION-GUIDE/images/deploytoazure.png"/>
3 changes: 2 additions & 1 deletion samples/adf/Readme.md
@@ -1,6 +1,6 @@
# Azure Data Factory samples

-This is a collection of samples that demonstrate how to use [Azure SQL Data Warehouse](https://aka.ms/sqldw) with [Azure Data Factory](https://azure.microsoft.com/services/data-factory).
+This is a collection of samples that demonstrate how to use [Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)](https://aka.ms/sqldw) with [Azure Data Factory](https://azure.microsoft.com/services/data-factory).

Samples:

4 changes: 2 additions & 2 deletions samples/automation/README.md
@@ -1,6 +1,6 @@
# Azure Automation Samples

-This is a collection of samples that demonstrate how to use [Azure SQL Data Warehouse](https://aka.ms/sqldw) with [Azure Automation](https://azure.microsoft.com/services/automation).
+This is a collection of samples that demonstrate how to use [Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)](https://aka.ms/sqldw) with [Azure Automation](https://azure.microsoft.com/services/automation).

When using the Azure SQL Data Warehouse service (ADW), it can be important to automate operations. One of the goals of this project is to help with pausing, resuming, and scaling the data warehouse.
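Scaling itself can be issued from T-SQL while connected to the master database. A minimal sketch, where mydwdb and DW300c are hypothetical database and target names:

-- Scale the data warehouse (placeholder name 'mydwdb') to service objective DW300c.
ALTER DATABASE mydwdb MODIFY (SERVICE_OBJECTIVE = 'DW300c');

-- Confirm the current service objective once the operation completes.
SELECT db.[name], so.[service_objective]
FROM sys.database_service_objectives AS so
JOIN sys.databases AS db ON so.database_id = db.database_id;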

@@ -10,7 +10,7 @@ If you have an automated ETL/ELT process, it can be important to properly manage

## When to Pause or Scale ADW

-Your Azure SQL Data Warehouse is not a regular SQL server. Though it has many similarities, many of request of the service cause [data movement operations](https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-distribute#understanding-data-movement) which can be time consuming. When you pause/scale the service, all running queries are canceled. Other than having lost productivity, rollback of queries that are in the middle of data movement could take hours to complete. While the rollback operations are processing the compute will be running but you will not be able to connect or submit request. These workflows will test to ensure that is is OK to pause or scale before taking action. Though there are retry sequences in the workflows, it will also be important to ensure that the data warehouse is in the right state before your processes continue.
+Your Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) is not a regular SQL server. Though it has many similarities, many requests to the service cause [data movement operations](https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-distribute#understanding-data-movement) which can be time consuming. When you pause or scale the service, all running queries are canceled. Beyond the lost productivity, rollback of queries that are in the middle of data movement can take hours to complete. While the rollback operations are processing, the compute will be running but you will not be able to connect or submit requests. These workflows test that it is OK to pause or scale before taking action. Though there are retry sequences in the workflows, it is also important to ensure that the data warehouse is in the right state before your processes continue.
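One way to make that test concrete is to look for in-flight work before pausing or scaling. A minimal sketch, assuming the sys.dm_pdw_exec_requests DMV exposed by dedicated SQL pools:

-- Count requests that are still in flight; a non-zero result suggests
-- deferring the pause or scale operation until the work drains.
SELECT COUNT(*) AS active_requests
FROM sys.dm_pdw_exec_requests
WHERE [status] NOT IN ('Completed', 'Failed', 'Cancelled')
  AND [session_id] <> SESSION_ID();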

## How to Use Workflows

2 changes: 1 addition & 1 deletion samples/automation/RefreshReplicatedTable/README.md
@@ -1 +1 @@
-# Refresh Replicated Tables in a Azure SQL Data Warehouse
+# Refresh Replicated Tables in an Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)
@@ -1,4 +1,4 @@
-# Suspend or Pause Azure SQL Data Warehouse
+# Suspend or Pause Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)

See the blog post [here](https://blogs.msdn.microsoft.com/allanmiller/2017/09/20/pausing-azure-sql-data-warehouse-using-an-automation-runbook/ "Pausing Azure SQL Data Warehouse using an Automation Runbook")

2 changes: 1 addition & 1 deletion samples/scripts/Readme.md
@@ -1,6 +1,6 @@
# Database scripts

-Contains sample scripts for managing, monitoring, and maintaining Azure SQL Data Warehouse databases.
+Contains sample scripts for managing, monitoring, and maintaining Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) databases.

## Data-load
The [data-load](data-load/) folder contains TSQL scripts that can be adapted to simplify loading DW tables from Azure Blob Storage or Azure Data Lake Store using tables in DW as the model.
4 changes: 2 additions & 2 deletions samples/scripts/data-load/README.md
@@ -1,6 +1,6 @@
-# Auto Generate Azure SQL DW Load – TSQL Scripts
+# Auto Generate Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Load – TSQL Scripts

-These scripts were developed to help with a large Azure SQL DW POC with one of the DMJ customers. Some of the assumptions made in the scripts are very specific to the scenario encountered at the customer. Feel free to adapt these scripts to your data loading scenario.
+These scripts were developed to help with a large Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) POC with one of the DMJ customers. Some of the assumptions made in the scripts are very specific to the scenario encountered at the customer. Feel free to adapt these scripts to your data loading scenario.

Each of the scripts creates the appropriate objects to allow an optimal performance DW load from text files in blob storage into Azure SQL DW using Polybase. You can easily change the external data source to use Azure Data Lake as a source as well. The optimal nature of the load will depend on the number of files provided. Our customer had many gzip’d files that distributed well among several DW nodes (few larger gzip’d files would not have done as well).
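The generated objects follow the standard Polybase pattern: an external data source over the storage account, a file format describing the text files, an external table over the staged files, and a CTAS into a distributed table. A minimal sketch with hypothetical names (mystorage, loads, FactSales); a database-scoped credential would additionally be needed for private storage:

-- External data source over the blob container holding the load files.
CREATE EXTERNAL DATA SOURCE AzureBlobSource
WITH (TYPE = HADOOP, LOCATION = 'wasbs://loads@mystorage.blob.core.windows.net');

-- File format matching comma-delimited, gzip'd source files.
CREATE EXTERNAL FILE FORMAT GzipCsvFormat
WITH ( FORMAT_TYPE = DELIMITEDTEXT
     , FORMAT_OPTIONS (FIELD_TERMINATOR = ',')
     , DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec');

-- External table projected over the staged files.
CREATE EXTERNAL TABLE dbo.FactSales_ext
( sale_id INT, sale_date DATE, amount DECIMAL(18,2) )
WITH (LOCATION = '/factsales/', DATA_SOURCE = AzureBlobSource, FILE_FORMAT = GzipCsvFormat);

-- CTAS into a hash-distributed table for a parallel load across DW nodes.
CREATE TABLE dbo.FactSales
WITH (DISTRIBUTION = HASH(sale_id))
AS SELECT * FROM dbo.FactSales_ext;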

2 changes: 1 addition & 1 deletion samples/scripts/queries/Readme.md
@@ -1,5 +1,5 @@
# Database queries

-This folder contains a collection of scripts used to monitor query usage in your [Azure SQL Data Warehouse](http://aka.ms/sqldw) database.
+This folder contains a collection of scripts used to monitor query usage in your [Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)](http://aka.ms/sqldw) database.

- [query_memory_usage](query_memory_usage.sql): This query shows the memory utilization of queries run against your data warehouse. It allows you to easily identify any query that could benefit from additional memory via a larger resource class.
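Acting on that finding typically means moving the login into a larger dynamic resource class. A minimal sketch, with loaduser as a hypothetical login:

-- Grant 'loaduser' (hypothetical) the 'largerc' resource class for larger memory grants.
EXEC sp_addrolemember 'largerc', 'loaduser';

-- Return the login to the default resource class when no longer needed.
EXEC sp_droprolemember 'largerc', 'loaduser';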
2 changes: 1 addition & 1 deletion samples/sqlops/MonitoringScripts/README.md
@@ -1,6 +1,6 @@
## Introduction

-The following scripts are leveraged to create custom monitoring dashboard widgets in SQL Operations Studio (preview) for Azure SQL Data Warehouse.
+The following scripts are leveraged to create custom monitoring dashboard widgets in SQL Operations Studio (preview) for Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse).

## Current Widgets

6 changes: 3 additions & 3 deletions samples/sqlops/README.md
@@ -1,13 +1,13 @@
-## Azure SQL Data Warehouse Insights
+## Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Insights
The Azure SQL Data Warehouse insight widget extension provides a pre-built dashboard surfacing insights to your data warehouse. This helps with scenarios around managing and tuning your data warehouse to ensure it is optimized for consistent performance.

For any issues with this extension, please submit them here:
-[SQL Data Warehouse Samples - issues](https://github.com/Microsoft/sql-data-warehouse-samples/issues).
+[Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Samples - issues](https://github.com/Microsoft/sql-data-warehouse-samples/issues).

If you have any feedback, please contact: sqldwadvisor@service.microsoft.com.

The following widgets are generated by using T-SQL monitoring scripts embedded within SQL Operations Studio. All monitoring scripts are uploaded to the following github:
-[SQL Data Warehouse Samples](https://github.com/Microsoft/sql-data-warehouse-samples/tree/master/samples/sqlops/MonitoringScripts).
+[Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse) Samples](https://github.com/Microsoft/sql-data-warehouse-samples/tree/master/samples/sqlops/MonitoringScripts).

## Backup Details and User Activities
Identify and understand workload patterns through active sessions, active queries, queued queries, loads, and backups.
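Each widget of this kind is backed by a short monitoring query. A minimal sketch of the active/queued portion, assuming the sys.dm_pdw_exec_requests DMV used elsewhere in this repository:

-- Summarize the current workload by request status.
SELECT [status], COUNT(*) AS request_count
FROM sys.dm_pdw_exec_requests
WHERE [status] NOT IN ('Completed', 'Failed', 'Cancelled')
GROUP BY [status];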
4 changes: 2 additions & 2 deletions solutions/monitoring/Readme.md
@@ -1,5 +1,5 @@
-# Microsoft Toolkit for SQL Data Warehouse
-The Microsoft Toolkit for SQL Data Warehouse offers a set of views and procedures that help you optimize your [Azure SQL Data Warehouse](http://aka.ms/sqldw).
+# Microsoft Toolkit for Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)
+The Microsoft Toolkit for SQL Data Warehouse offers a set of views and procedures that help you optimize your [Azure Synapse Analytics SQL Pool (formerly SQL Data Warehouse)](http://aka.ms/sqldw).

## Installation
The toolkit installs the following items:
90 changes: 90 additions & 0 deletions microsoft.sp_update_statistics.sql
@@ -0,0 +1,90 @@
PRINT 'Info: Creating the ''microsoft.sp_update_statistics'' procedure';
GO

CREATE PROCEDURE [microsoft].[sp_update_statistics]
( @update_type  tinyint -- 1 = default, 2 = fullscan, 3 = sample, 4 = resample
, @sample_pct   tinyint -- sampling percentage for @update_type = 3; defaults to 20 when NULL
, @user_created tinyint -- 0 = system-created stats, 1 = user-created stats
)
AS

IF @update_type NOT IN (1,2,3,4)
BEGIN;
THROW 151000,'Invalid value for @update_type parameter. Valid range 1 (default), 2 (fullscan), 3 (sample) or 4 (resample).',1;
END;

-- Default to a 20 percent sample when no sampling percentage is supplied.
IF @sample_pct IS NULL
BEGIN;
    SET @sample_pct = 20;
END;

IF OBJECT_ID('tempdb..#stats_ddl') IS NOT NULL
BEGIN
DROP TABLE #stats_ddl
END

-- Generate one UPDATE STATISTICS statement per statistics object, keyed by [seq_nmbr].
CREATE TABLE #stats_ddl
WITH
(
DISTRIBUTION = HASH([seq_nmbr])
)
AS
SELECT
CASE @update_type
WHEN 1
THEN 'UPDATE STATISTICS '+[two_part_name]+'('+[stats_name]+');'
WHEN 2
THEN 'UPDATE STATISTICS '+[two_part_name]+'('+[stats_name]+') WITH FULLSCAN;'
WHEN 3
THEN 'UPDATE STATISTICS '+[two_part_name]+'('+[stats_name]+') WITH SAMPLE '+CAST(@sample_pct AS VARCHAR(20))+' PERCENT;'
WHEN 4
THEN 'UPDATE STATISTICS '+[two_part_name]+'('+[stats_name]+') WITH RESAMPLE;'
END AS [update_stats_ddl]
, [seq_nmbr]
FROM (
SELECT
sm.[name] AS [schema_name]
, tb.[name] AS [table_name]
, st.[name] AS [stats_name]
, st.[has_filter] AS [stats_is_filtered]
    , ROW_NUMBER() OVER(ORDER BY (SELECT NULL)) AS [seq_nmbr] -- arbitrary numbering that drives the execution loop
, QUOTENAME(sm.[name])+'.'+QUOTENAME(tb.[name]) AS [two_part_name]
, QUOTENAME(DB_NAME())+'.'+QUOTENAME(sm.[name])+'.'+QUOTENAME(tb.[name]) AS [three_part_name]
    , MAX(STATS_DATE(st.[object_id],st.[stats_id])) AS [stats_last_updated_date] -- MAX() because [object_id]/[stats_id] are not in the GROUP BY
FROM sys.objects AS ob
JOIN sys.stats AS st ON ob.[object_id] = st.[object_id]
JOIN sys.stats_columns AS sc ON st.[stats_id] = sc.[stats_id]
AND st.[object_id] = sc.[object_id]
JOIN sys.columns AS co ON sc.[column_id] = co.[column_id]
AND sc.[object_id] = co.[object_id]
JOIN sys.tables AS tb ON co.[object_id] = tb.[object_id]
JOIN sys.schemas AS sm ON tb.[schema_id] = sm.[schema_id]
WHERE STATS_DATE(st.[object_id],st.[stats_id]) IS NOT NULL
AND st.[user_created] = @user_created
GROUP BY
sm.[name]
, tb.[name]
, st.[name]
, st.[filter_definition]
, st.[has_filter]
) t1;

-- Execute each generated statement in sequence.
DECLARE @i INT = 1
      , @t INT = (SELECT COUNT(*) FROM #stats_ddl)
      , @s NVARCHAR(4000) = N'';

WHILE @i <= @t
BEGIN
    SET @s = (SELECT update_stats_ddl FROM #stats_ddl WHERE seq_nmbr = @i);

    PRINT @s;              -- echo the statement for logging
    EXEC sp_executesql @s; -- run the generated UPDATE STATISTICS command
    SET @i += 1;
END


DROP TABLE #stats_ddl;
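Once the procedure is created, invoking it is a single EXEC. For example, using the parameter values defined above:

-- Rebuild all user-created statistics with a full scan.
EXEC [microsoft].[sp_update_statistics] @update_type = 2, @sample_pct = NULL, @user_created = 1;

-- Refresh system-created statistics with a 50 percent sample.
EXEC [microsoft].[sp_update_statistics] @update_type = 3, @sample_pct = 50, @user_created = 0;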