
Commit 06aa412

Authored by cmadjar, Laetitia Fesselier (laemtl), and regisoc
Merge 24.1.7 into main (#879)
* update the VERSION for the next bug fix release (#816)
* [dcm2bids] Remove hardcoded dcm2niix binary to use the value stored in the `converter` Config setting (#815)
* modifies the dcm2niix command to use the Config converter value instead of hardcoding dcm2niix
* add a check to make sure the converter is a dcm2niix binary
* Pull 24.0.3 in 24.1 release (#820)
* Reload the mri_upload dictionary before checking if a tarchive has been validated (#783)
* reload mri_upload object
* remove debugging exit and print statements
* fix minor bugs when dealing with scans.tsv files (#774)
* fix regex search for excluded series description patterns (#786)
* fix return statement of create_imaging_upload_dict_from_upload_id function (#787)
* [dcm2bids] Insert into MRICandidateErrors if there is a Candidate PatientName mismatch (#790)
* insert into MRICandidateErrors when candidate mismatch or pname not matching between DICOMs and NIfTI
* remove exit
* fix table name to MRICandidateErrors instead of MriCandidateErrors as apparently it makes a difference on MariaDB/Linux VMs while it just worked on a local Mac install... (#793)
* Set DICOM dates to undef if the date does not follow proper DICOM standard (#794)
* set date to undef if it does not follow proper DICOM formats
* fix all dates set to NULL
* Installation and PET fixes (#818)
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mcgill.ca>
* Update VERSION file for next bug fix release
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mail.mcgill.ca>
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mcgill.ca>
* fix NoneType errors when the visit of a session does not exist so that proper logging is done (#824)
* fix some errors when RepetitionTime is not available in JSON file (#825)
* Add capability to download file from S3 (#826)
* add capability to download file from S3
* fix flake8 error
* Upload to S3: support object name starting with s3://bucket_name/ for upload (#827)
* add capability to download file from S3
* fix flake8 error
* add ability to remove s3://bucketname/ from the object name before upload
* fix database class pselect documentation for the return type (#828)
* map scan type to scan type ID when scan type provided to run_nifti_insertion.pl (#829)
* modify permission of script run_push_imaging_files_to_s3_pipeline.py to make it executable (#830)
* skip violation if not found on filesystem since it means the scan has been rerun (#831)
* update VERSION file (#832)
* do not push files to S3 when their path in the DB is already an S3 URL (#833)
* fix violation files path when checking if the files are on the filesystem before adding them to the list of files to push to S3 (#834)
* Merge 24.0 release into 24.1 release (#836)
* Reload the mri_upload dictionary before checking if a tarchive has been validated (#783)
* reload mri_upload object
* remove debugging exit and print statements
* fix minor bugs when dealing with scans.tsv files (#774)
* fix regex search for excluded series description patterns (#786)
* fix return statement of create_imaging_upload_dict_from_upload_id function (#787)
* [dcm2bids] Insert into MRICandidateErrors if there is a Candidate PatientName mismatch (#790)
* insert into MRICandidateErrors when candidate mismatch or pname not matching between DICOMs and NIfTI
* remove exit
* fix table name to MRICandidateErrors instead of MriCandidateErrors as apparently it makes a difference on MariaDB/Linux VMs while it just worked on a local Mac install... (#793)
* Set DICOM dates to undef if the date does not follow proper DICOM standard (#794)
* set date to undef if it does not follow proper DICOM formats
* fix all dates set to NULL
* Installation and PET fixes (#818)
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mcgill.ca>
* Update VERSION file for next bug fix release
* Project, event validation and protobuf update (#823)
* Project, event validation and protobuf update
* Site and project search when creating candidate
* missing import
* correct pscid search
* Events validation
* flake rules update
* review, and NULL value filtered out for site and project
* flake
* flake
Co-authored-by: regisoc <regis.ongaro-carcy@mcin.ca>
* fix conflict
* fix version
* fix version
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mail.mcgill.ca>
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mcgill.ca>
Co-authored-by: regisoc <regis.ongaro-carcy@mcin.ca>
* fix check if file already inserted in DB (#845)
* Fix logic of determining file run number when previously inserted files are already pushed to S3 and not on filesystem anymore (#846)
* fix bug
* fix listing of filenames
* comment new function
* update version file (#847)
* Chunk creation subprocess failure check (#848)
* Chunk creation subprocess failure check: raise error when the chunk creation subprocess fails. Fix #843
* Update python/lib/physiological.py: print actual error message
Co-authored-by: Cécile Madjar <cecile.madjar@mcin.ca>
Co-authored-by: Cécile Madjar <cecile.madjar@mcin.ca>
* Revert chunk_pb2.py changes (#849)
* remove prints in nifti_insertion_pipeline.py (#851)
* fix permission denied upon deletion of tmp dir (#853)
* update to next bug fix release (#854)
* fix duplicated protocols error when same scan type returned (#856)
* Add missing exit codes on the Python side (#857)
* add some missing exit codes
* add some missing exit codes
* add ignore case to regex (#859)
* add download from S3 and re-upload if file provided to run_nifti_insertion was an S3 URL (#860)
* fix IntendedFor bug when no acq time available (#861)
* fix IntendedFor bug when getting the list of files needing IntendedFor (#862)
* fix paths when there is no / at the end of the Config (#866). Tested on sandbox with Config `data_dir` = `/data/loris/data`
* fix NoneType error in /opt/loris/bin/mri/python/lib/dcm2bids_imaging_pipeline_lib/dicom_archive_loader_pipeline.py, line 346, in _add_intended_for_to_fieldmap_json_files (#867). Tested on HBCD sandbox on the dataset that caused the issue.
* Properly update `mri_upload` 'Inserting' column when different sections of the pipeline are run (#868)
* update mri_upload to Inserting=0 when the push-to-S3 pipeline is finished
* fix Inserting flag so it is properly set when the pipeline is running
* update version file to 24.1.6 (#870)
* Add download from S3 for fmap already pushed to S3 that needs to have IntendedFor written in them (#874)
* add download from S3 for fmap that needs to have IntendedFor written in them
* add print
* update version to 24.1.7 (#876)
* Merge 24.0.4 into 24.1 release (#878)
* Reload the mri_upload dictionary before checking if a tarchive has been validated (#783)
* reload mri_upload object
* remove debugging exit and print statements
* fix minor bugs when dealing with scans.tsv files (#774)
* fix regex search for excluded series description patterns (#786)
* fix return statement of create_imaging_upload_dict_from_upload_id function (#787)
* [dcm2bids] Insert into MRICandidateErrors if there is a Candidate PatientName mismatch (#790)
* insert into MRICandidateErrors when candidate mismatch or pname not matching between DICOMs and NIfTI
* remove exit
* fix table name to MRICandidateErrors instead of MriCandidateErrors as apparently it makes a difference on MariaDB/Linux VMs while it just worked on a local Mac install... (#793)
* Set DICOM dates to undef if the date does not follow proper DICOM standard (#794)
* set date to undef if it does not follow proper DICOM formats
* fix all dates set to NULL
* Installation and PET fixes (#818)
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mcgill.ca>
* Update VERSION file for next bug fix release
* Project, event validation and protobuf update (#823)
* Project, event validation and protobuf update
* Site and project search when creating candidate
* missing import
* correct pscid search
* Events validation
* flake rules update
* review, and NULL value filtered out for site and project
* flake
* flake
Co-authored-by: regisoc <regis.ongaro-carcy@mcin.ca>
* DICOM Archive broken archive link fix (#872)
* update version to 24.0.4 (#877)
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mail.mcgill.ca>
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mcgill.ca>
Co-authored-by: regisoc <regis.ongaro-carcy@mcin.ca>
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mail.mcgill.ca>
Co-authored-by: Laetitia Fesselier <laetitia.fesselier@mcgill.ca>
Co-authored-by: regisoc <regis.ongaro-carcy@mcin.ca>
Co-authored-by: regis <regisoc@users.noreply.github.com>
1 parent b713180 commit 06aa412
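
Several of the merged fixes above revolve around S3 object naming (#826, #827, #833). As a rough illustration of the behaviour described for #827, removing a leading s3://bucket_name/ from an object name before upload, here is a minimal sketch; the helper name and interface are assumptions for illustration only, not the project's actual S3 code:

def strip_bucket_prefix(object_name: str, bucket_name: str) -> str:
    """Hypothetical helper: drop a leading 's3://<bucket_name>/' from an object name.

    Illustrates the upload fix described in #827; the real implementation lives in
    the pipeline's S3 library and may differ.
    """
    prefix = f"s3://{bucket_name}/"
    return object_name[len(prefix):] if object_name.startswith(prefix) else object_name


# Both forms resolve to the same object key before upload (illustrative paths only)
assert strip_bucket_prefix("s3://my-bucket/bids/sub-01/fmap.json", "my-bucket") == "bids/sub-01/fmap.json"
assert strip_bucket_prefix("bids/sub-01/fmap.json", "my-bucket") == "bids/sub-01/fmap.json"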

File tree

4 files changed: +34 -5 lines changed


VERSION

Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-24.1.6
+24.1.7

python/lib/dcm2bids_imaging_pipeline_lib/dicom_archive_loader_pipeline.py

Lines changed: 8 additions & 1 deletion
@@ -58,6 +58,11 @@ def __init__(self, loris_getopt_obj, script_name):
         # ---------------------------------------------------------------------------------------------
         self.nifti_tmp_dir = self._run_dcm2niix_conversion()

+        # ---------------------------------------------------------------------------------------------
+        # Get S3 object from loris_getopt object
+        # ---------------------------------------------------------------------------------------------
+        self.s3_obj = self.loris_getopt_obj.s3_obj
+
         # ---------------------------------------------------------------------------------------------
         # Get the list of NIfTI files to run through NIfTI insertion pipeline
         # ---------------------------------------------------------------------------------------------
@@ -355,7 +360,9 @@ def _add_intended_for_to_fieldmap_json_files(self):

         for key in fmap_files_dict.keys():
             sorted_fmap_files_list = fmap_files_dict[key]
-            self.imaging_obj.modify_fmap_json_file_to_write_intended_for(sorted_fmap_files_list)
+            self.imaging_obj.modify_fmap_json_file_to_write_intended_for(
+                sorted_fmap_files_list, self.s3_obj, self.tmp_dir
+            )

     def _order_modalities_per_acquisition_type(self):
         """

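The entries in each sorted_fmap_files_list handed to modify_fmap_json_file_to_write_intended_for above are dictionaries carrying the fieldmap's JSON path and its IntendedFor content (see the python/lib/imaging.py hunk below). A minimal, hypothetical example of that shape; the paths and scan names are made up:

# Hypothetical example of one fieldmap entry consumed by
# modify_fmap_json_file_to_write_intended_for(); paths are illustrative only.
fmap_entry = {
    # Either a path relative to the LORIS data directory or an s3:// URL
    'json_file_path': 's3://loris-bucket/bids/sub-01/ses-V1/fmap/sub-01_ses-V1_phasediff.json',
    # BIDS IntendedFor content: the scans this fieldmap applies to
    'IntendedFor': ['ses-V1/func/sub-01_ses-V1_task-rest_bold.nii.gz'],
}
sorted_fmap_files_list = [fmap_entry]
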
python/lib/imaging.py

Lines changed: 24 additions & 2 deletions
@@ -1034,18 +1034,33 @@ def get_list_of_files_sorted_by_acq_time(self, files_list):

         return sorted_files_list

-    def modify_fmap_json_file_to_write_intended_for(self, sorted_fmap_files_list):
+    def modify_fmap_json_file_to_write_intended_for(self, sorted_fmap_files_list, s3_obj, tmp_dir):
         """
         Function that reads the JSON file and modifies it to add the BIDS IntendedFor field to it.

         :param sorted_fmap_files_list: list of dictionary that contains JSON file path info and IntendedFor content
         :type sorted_fmap_files_list: list
+        :param s3_obj: S3 object for downloading and uploading of S3 files
+        :type s3_obj: AWS object
+        :param tmp_dir: temporary directory where to download JSON file if file is on S3
+        :type tmp_dir: str
         """

         for fmap_dict in sorted_fmap_files_list:
             if 'IntendedFor' not in fmap_dict:
                 continue
-            json_file_path = os.path.join(self.config_db_obj.get_config('dataDirBasepath'), fmap_dict['json_file_path'])
+            json_file_path = ''
+            if fmap_dict['json_file_path'].startswith('s3://'):
+                try:
+                    json_file_path = os.path.join(tmp_dir, os.path.basename(fmap_dict['json_file_path']))
+                    s3_obj.download_file(fmap_dict['json_file_path'], json_file_path)
+                except Exception as err:
+                    print(err)
+                    continue
+            else:
+                data_dir = self.config_db_obj.get_config('dataDirBasepath')
+                json_file_path = os.path.join(data_dir, fmap_dict['json_file_path'])
+
             with open(json_file_path) as json_file:
                 json_data = json.load(json_file)
             json_data['IntendedFor'] = fmap_dict['IntendedFor']
@@ -1059,6 +1074,13 @@ def modify_fmap_json_file_to_write_intended_for(self, sorted_fmap_files_list):
             )
             self.param_file_db_obj.update_parameter_file(json_blake2, param_file_dict['ParameterFileID'])

+            if fmap_dict['json_file_path'].startswith('s3://'):
+                try:
+                    s3_obj.upload_file(json_file_path, fmap_dict['json_file_path'])
+                except Exception as err:
+                    print(err)
+                    continue
+
     @staticmethod
     def get_intended_for_list_of_scans_after_fieldmap_acquisition_based_on_acq_time(files_list, current_fmap_acq_time,
                                                                                     next_fmap_acq_time):

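The s3_obj passed into the method above is expected to expose download_file and upload_file methods that take an s3:// URL and a local path. That class is not part of this diff; the following boto3-based sketch only illustrates such an interface under that assumption and is not the project's actual S3 helper implementation:

import boto3
from urllib.parse import urlparse


class S3Sketch:
    """Hypothetical stand-in for the pipeline's S3 helper; illustrates the
    download_file/upload_file calls used in modify_fmap_json_file_to_write_intended_for."""

    def __init__(self):
        self.s3_client = boto3.client('s3')

    @staticmethod
    def _split_url(s3_url):
        # 's3://bucket/key/path.json' -> ('bucket', 'key/path.json')
        parsed = urlparse(s3_url)
        return parsed.netloc, parsed.path.lstrip('/')

    def download_file(self, s3_url, local_path):
        # Fetch the object behind an s3:// URL into a local file (e.g. under tmp_dir)
        bucket, key = self._split_url(s3_url)
        self.s3_client.download_file(bucket, key, local_path)

    def upload_file(self, local_path, s3_url):
        # Push the locally modified file (e.g. the JSON with IntendedFor added) back to S3
        bucket, key = self._split_url(s3_url)
        self.s3_client.upload_file(local_path, bucket, key)
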
uploadNeuroDB/NeuroDB/objectBroker/TarchiveOB.pm

Lines changed: 1 addition & 1 deletion
@@ -71,7 +71,7 @@ use File::Basename;
 use TryCatch;

 my @TARCHIVE_FIELDS = qw(
-    TarchiveID ArchiveLocation PatientName PatientID PatientDoB md5sumArchive
+    DicomArchiveID TarchiveID ArchiveLocation PatientName PatientID PatientDoB md5sumArchive
     ScannerManufacturer ScannerModel ScannerSerialNumber ScannerSoftwareVersion
     neurodbCenterName SourceLocation DateAcquired
 );
