Import-DbaCsv, map correct types for BulkCopy #9479
Conversation
public/Import-DbaCsv.ps1
Outdated
$tableDef = Get-DbaDbTable $instance -SqlCredential $SqlCredential -Database $Database -Table $table -Schema $schema
if ($tableDef.Count -ne 1) {
    Stop-Function -Message "Could not fetch table definition for table $table in schema $schema" -ErrorRecord $_
I'm unsure if this is the right thing to do here... at this point I don't see any branches in the code that could allow reaching here without a table present, but... "what if"?
I'd leave it, because we run into "what if" more often than we'd expect.
My only opinion is that it's wonderful to have this addressed 😁 Thank you! I'll re-run whatever is failing to see if we can get it all green.
Oh, oops. It's a failure on Import-DbaCsv, @niphlod
Yeah, I know. I needed some sleep yesterday and didn't have time to look into it.
Yes! Thank you 🙏🏼
This is so good, @niphlod! Thank you. Will merge when andreas approves.
Looks good. Added some minor suggestions.
public/Import-DbaCsv.ps1
Outdated
# we do not use $server because the connection is active here
$tableDef = Get-TableDefinitionFromInfoSchema -table $table -schema $schema -sqlconn $sqlconn
if ($tableDef.Length -eq 0) {
    Stop-Function -Message "Could not fetch table definition for table $table in schema $schema" -ErrorRecord $_
I think -ErrorRecord $_ should only be used in a catch block. Here we don't have an error.
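For what it's worth, a minimal sketch of the convention being suggested (the connection-opening example is illustrative, not taken from the PR; Stop-Function is dbatools' internal error helper):

try {
    $sqlconn.Open()
} catch {
    # inside a catch block, $_ is a real ErrorRecord, so forwarding it makes sense
    Stop-Function -Message "Failed to open the connection" -ErrorRecord $_
    return
}

if ($tableDef.Length -eq 0) {
    # plain validation failure: there is no exception here, so -ErrorRecord is omitted
    Stop-Function -Message "Could not fetch table definition for table $table in schema $schema"
    return
}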
I knew I was missing something
public/Import-DbaCsv.ps1
Outdated
# start by getting the table definition
$tableDef = Get-TableDefinitionFromInfoSchema -table $table -schema $schema -sqlconn $sqlconn
if ($tableDef.Length -eq 0) {
    Stop-Function -Message "Could not fetch table definition for table $table in schema $schema" -ErrorRecord $_
Remove -ErrorRecord $_
public/Import-DbaCsv.ps1
Outdated
@@ -663,8 +830,8 @@ function Import-DbaCsv {
    $bulkCopy.Add_SqlRowsCopied( {
        $script:totalRowsCopied += (Get-AdjustedTotalRowsCopied -ReportedRowsCopied $args[1].RowsCopied -PreviousRowsCopied $script:prevRowsCopied).NewRowCountAdded

        $tstamp = $(Get-Date -Format 'yyyyMMddHHmmss')
        Write-Message -Level Verbose -Message "[$tstamp] The bulk copy library reported RowsCopied = $($args[1].RowsCopied). The previous RowsCopied = $($script:prevRowsCopied). The adjusted total rows copied = $($script:totalRowsCopied)"
        #Write-Message -Level Verbose -FunctionName "Import-DbaCsv" -Message " The bulk copy library reported RowsCopied = $($args[1].RowsCopied). The previous RowsCopied = $($script:prevRowsCopied). The adjusted total rows copied = $($script:totalRowsCopied)"
Should be removed.
Yep. BTW, I found instances of the same logging that produce a "bad-looking" message, so once this goes in I'll replace the other occurrences throughout our functions as well.
    $sqlconn
)

$query = "SELECT c.COLUMN_NAME, c.DATA_TYPE, c.ORDINAL_POSITION - 1 FROM INFORMATION_SCHEMA.COLUMNS AS c WHERE TABLE_SCHEMA = @schema AND TABLE_NAME = @table;"
Any chance this will ever hit a math error on the ordinal position? The ExecuteReader() call should be in a try/catch just in case it can/does.
ORDINAL_POSITION always starts at 1. I'll wrap ExecuteReader in a try/catch in a bit.
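A hedged sketch of what the wrapped helper could look like: the query is the one from the diff above, while the returned object shape and error message are my assumptions, not the PR's actual code.

function Get-TableDefinitionFromInfoSchema {
    param (
        $table,
        $schema,
        $sqlconn
    )
    $query = "SELECT c.COLUMN_NAME, c.DATA_TYPE, c.ORDINAL_POSITION - 1 FROM INFORMATION_SCHEMA.COLUMNS AS c WHERE TABLE_SCHEMA = @schema AND TABLE_NAME = @table;"
    $cmd = $sqlconn.CreateCommand()
    $cmd.CommandText = $query
    $null = $cmd.Parameters.AddWithValue("@schema", $schema)
    $null = $cmd.Parameters.AddWithValue("@table", $table)
    $tableDef = @()
    try {
        # ExecuteReader wrapped in try/catch, as suggested in the review
        $reader = $cmd.ExecuteReader()
        while ($reader.Read()) {
            $tableDef += [PSCustomObject]@{
                Name     = $reader.GetString(0)
                DataType = $reader.GetString(1)
                # ORDINAL_POSITION starts at 1; the query already shifts it to 0-based
                Ordinal  = $reader.GetInt32(2)
            }
        }
        $reader.Close()
    } catch {
        Stop-Function -Message "Could not query INFORMATION_SCHEMA for $schema.$table" -ErrorRecord $_
        return
    }
    $tableDef
}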
Co-authored-by: Shawn Melton <11204251+wsmelton@users.noreply.github.com>
hell yeah, thanks everyone! 🔥
Type of Change
(.\tests\manual.pester.ps1)

Purpose
Enlarge the use case of Import-DbaCsv.
Right now it either never fails (when it creates a table of all nvarchar(max) columns) or it fails as soon as the target table has some not-that-weird types (bit or guid, for example).
This should also greatly speed up loads into existing tables, without users needing to "stage" into an nvarchar(max) table and then copy over to the destination table.
I'm guessing it also helps even without counting the "stage-then-move" bit: loading a pre-existing table with the correct types should be faster than loading into an nvarchar(max) table.
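To make the idea concrete, here is a rough sketch of the kind of mapping involved. The $typeMap table and variable names are illustrative assumptions, not the PR's code; $bulkCopy and $tableDef stand in for the function's existing variables.

# translate a few SQL Server types to CLR types so SqlBulkCopy receives
# values that a bit/guid/int column will actually accept
$typeMap = @{
    'bit'              = [bool]
    'uniqueidentifier' = [guid]
    'int'              = [int]
    'decimal'          = [decimal]
    'datetime2'        = [datetime]
}

$dataTable = New-Object System.Data.DataTable
foreach ($col in $tableDef) {
    $clrType = if ($typeMap.ContainsKey($col.DataType)) { $typeMap[$col.DataType] } else { [string] }
    $null = $dataTable.Columns.Add($col.Name, $clrType)
    # map the source column to the destination column by name
    $null = $bulkCopy.ColumnMappings.Add($col.Name, $col.Name)
}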
Approach
It's long and windy, but I made the code redundant and verbose (hopefully readable and obvious as well) rather than short and speedy. My 2c here is that this bit of code is rather convoluted, so it's better to be explicit more than anything. The cost (as in milliseconds spent) of adding the proper machinery is minimal once you take into account loading even just 1k rows.
I'm unsure whether to approach the "get the types from the table" step using SMO (like this version does) or whether it'd be better to use a quicker query going directly to the system tables to get a "name, datatype" resultset and use that for the mapping. The pro of the latter is that we don't open another connection: at that point in the function a connection has already been established and a transaction is pending, so re-using $server for Get-DbaDbTable ends up in errors.
There's another bit that's missing, which is enabling this on a CSV that has no headers. I'm not sure users need it, but if so the code gets longer, because we can't let Lumen do "quite a bit of work" (the $reader.GetFieldHeaders() call). That being said, @potatoqualitee is the original creator and MAY have an opinionated view on how to tackle that.
@petervandivier: I got it working for most of my tables, but you may want to give it a whirl and maybe test extensively with any type you can throw at it.
Commands to test
The one in the report is good, short and simple.
NB: this is a first PROPOSAL. If this approach is "accepted" I'm going to add a few tests for it in new commits.