Avoid setting dummy values inappropriately for original files. #892
Conversation
Adding …

I have no idea what's going on here. It's imported, and builds fine locally. I assume it's some weird kind of merge conflict? Maybe I should wait a day, rebase and repush, and see what happens then.
It's likely a merge interaction on snoopy between your 2 PRs.
Yeah, looks like a duplicate import; maybe two different PRs each removed a different duplicate. (-:
It should be okay now. The conflict was with #881.
Since a number of DB changes are coming, this is likely not an issue, but as a heads-up: modifying the SQL files in place can cause weird errors if someone is working with an older version of the DB. It's usually safest to bump the patch number, forcing someone to re-create.
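The patch-number advice above can be sketched as a simple check. This is a hedged illustration, not OMERO's actual versioning code; the version and patch values below are invented for the example:

```python
# Illustrative sketch of why bumping the DB patch number helps: the server can
# detect an out-of-date schema up front instead of hitting "weird errors" at
# runtime. The expected (version, patch) pair here is hypothetical.
EXPECTED = ("OMERO4.4", 1)  # the code now expects patch 1 after the SQL change


def needs_recreate(db_version, db_patch, expected=EXPECTED):
    """Return True if the deployed schema does not match what the code expects."""
    return (db_version, db_patch) != expected


# A DB created before the bump (patch 0) is flagged for re-creation,
# while an up-to-date DB passes:
print(needs_recreate("OMERO4.4", 0))  # True
print(needs_recreate("OMERO4.4", 1))  # False
```

Editing a SQL file in place would change the schema without changing the (version, patch) pair, so this check could not catch the mismatch; bumping the patch number makes the mismatch detectable.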
Note: this is also still excluded. Should it be?
Can probably try including in build again; I don't have any reasons noted why not to. Patch number: good idea, thank you. I'll push a commit today for that to get into the habit.
Gah, actually maybe I oughtn't add a commit; I guess I add a few hundred kilobytes to the repository each time due to the directory name under … (Is it important to keep the old SQL in the current release? Can we just keep the version number out of the directory name and trust git and our archive of release artifacts to make older versions easily obtainable when necessary? Not for this PR...)
I think we definitely want to work toward removing all SQL scripts from the main repo.

(But yes, in another PR.)
For everyone else's benefit, using … Otherwise, interested to get this in. There may be some documentation we need to do in terms of "expect nulls for …". Merging.
Thanks very much, I'll remember that the …
Note: @chris-allan was able to deploy this merged PR, but without the dbpatch change the …
The …
Avoid setting dummy values inappropriately for original files.
The previous code had dummy required values being set for hashes and lengths of OriginalFile instances for things like files that hadn't yet been uploaded, and for directories. Discussion in http://trac.openmicroscopy.org.uk/ome/ticket/2581 considers the idea of simply allowing such properties to be set to null unless there is actually an applicable value for them, so as to declutter the code and the database.

For OriginalFile instances, this PR makes the size and checksum optional, to tidy things slightly in anticipation of further changes once the checksum algorithm is made selectable. The model and dummy values around Pixels instances remain unchanged.

In testing, one might want to check the originalfile table in the DB to make sure that some size or sha1 that was previously being correctly set still is set, i.e. that this PR causes no regressions. Obviously, also test things like importing and annotating images with files to make sure that there are no regressions there either.

In the longer term one may want to consider designing a richer type hierarchy around OriginalFile.

--no-rebase FS only
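The testing suggestion above can be sketched as a check over rows of the originalfile table. This is a minimal, hedged illustration: the rows are invented, and it assumes only that size and sha1 become nullable as the description says:

```python
# Hypothetical rows from the `originalfile` table after this PR: NULL (None)
# size/sha1 values are now allowed for not-yet-uploaded files and directories.
rows = [
    {"name": "image.dv", "size": 1024, "sha1": "ab" * 20},  # uploaded file
    {"name": "pending.txt", "size": None, "sha1": None},    # upload not started
    {"name": "somedir", "size": None, "sha1": None},        # directory
]


def no_regression(rows):
    """Any row that still has real content (a size) must also keep its sha1."""
    return all(r["sha1"] is not None for r in rows if r["size"] is not None)


print(no_regression(rows))  # True: previously-set values are still set
```

A regression of the kind the author warns about would show up as an uploaded file whose size or sha1 silently became NULL, which this kind of query-and-assert check would catch.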