
Commit d8407e5

Update sample import script to use a valid value for MaxSizeBytes (Azure#340)
The value previously in use (262144000, which is 250MB) is not a valid size for an S3 database. Trying to create a database of that size will throw an error like `The edition 'Standard' does not support the database data max size '262144000'.` I am not clear whether 250MB was *ever* a valid size for S3, but it certainly is not now. It used to be the case that the Import/Export service would silently swallow this error and just create a database of the max size for the SLO. However, that (buggy) behavior was fixed around November 2020 to respect the parameter.
1 parent 5aea814 commit d8407e5
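
For reference, PowerShell's byte-suffix literals make the relationship between the two values easy to check. The short session below is illustrative only and is not part of the commit:

# PowerShell expands the MB/GB suffixes into byte counts (Int64 values).
PS> 250MB    # the old value, previously passed as the string "262144000"
262144000
PS> 100GB    # the new value, a data max size the Standard edition accepts
107374182400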

1 file changed: +2 additions, -2 deletions


sql-database/import-from-bacpac/import-from-bacpac.ps1

Lines changed: 2 additions & 2 deletions
@@ -62,7 +62,7 @@ $serverFirewallRule = New-AzSqlServerFirewallRule -ResourceGroupName $resourceGr
 $importRequest = New-AzSqlDatabaseImport -ResourceGroupName $resourceGroupName `
     -ServerName $serverName `
     -DatabaseName $databaseName `
-    -DatabaseMaxSizeBytes "262144000" `
+    -DatabaseMaxSizeBytes 100GB `
     -StorageKeyType "StorageAccessKey" `
     -StorageKey $(Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -StorageAccountName $storageAccountName).Value[0] `
     -StorageUri "https://$storageaccountname.blob.core.windows.net/$storageContainerName/$bacpacFilename" `
@@ -91,4 +91,4 @@ Set-AzSqlDatabase -ResourceGroupName $resourceGroupName `
     -RequestedServiceObjectiveName "S0"
 
 # Clean up deployment
-# Remove-AzResourceGroup -ResourceGroupName $resourceGroupName
+# Remove-AzResourceGroup -ResourceGroupName $resourceGroupName
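
The hunk above cuts off at the `-StorageUri` continuation, so for orientation here is a sketch of how the complete corrected call could look. The parameters after `-StorageUri` (edition, service objective, administrator credentials) and the `$adminLogin`/`$adminPassword` variables are assumptions based on New-AzSqlDatabaseImport's documented parameter set, not lines taken from this diff:

# Sketch only: parameters after -StorageUri are assumed from the cmdlet's
# documented parameters; $adminLogin and $adminPassword are placeholder
# variables expected to be defined earlier in the script.
$importRequest = New-AzSqlDatabaseImport -ResourceGroupName $resourceGroupName `
    -ServerName $serverName `
    -DatabaseName $databaseName `
    -DatabaseMaxSizeBytes 100GB `
    -StorageKeyType "StorageAccessKey" `
    -StorageKey $(Get-AzStorageAccountKey -ResourceGroupName $resourceGroupName -StorageAccountName $storageAccountName).Value[0] `
    -StorageUri "https://$storageaccountname.blob.core.windows.net/$storageContainerName/$bacpacFilename" `
    -Edition "Standard" `
    -ServiceObjectiveName "S3" `
    -AdministratorLogin $adminLogin `
    -AdministratorLoginPassword $(ConvertTo-SecureString -String $adminPassword -AsPlainText -Force)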
