Import DICOM data

Tutorial on how to import DICOM data into NiDB

Step 1 - Choose an Import Method

There are two main methods to import DICOM data into NiDB:

(a) Global Import - used by the DICOM receiver. All files go into the same directory and are archived completely unattended. Filenames must be unique. Preferable if you have a large batch of disorganized DICOM data.

(b) Individual Import - import a single zip file or directory. The import can contain subdirectories. This method parses and displays the contents of the import and waits for you to select which series to archive before any data is imported. Preferable if you have smaller batches of data, or data that must be parsed differently than with the default global method.

Step 2 - (a) Global Import

Overview

DICOM files are parsed into subject/study/series groups using three DICOM header tags (or sets of tags), checked in the following order (a quick way to inspect these tags is shown after the list).

  1. Subject - PatientID (0010,0020) - this uniquely identifies the subject. The PatientID is matched to existing subjects in the database (regardless of project enrollment) by comparing it against the UID and alternate UID fields in NiDB.

  2. Study - Modality (0008,0060) & StudyDate (0008,0020) & StudyTime (0008,0030) - this set of tags uniquely identifies the study. It is matched to existing studies within NiDB; those studies must also be associated with the subject from the previous step.

  3. Series - SeriesNumber (0020,0011) - this uniquely identifies the series. It is matched to existing series in NiDB by series number, within the study and subject from the previous steps.
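
If you want to see what NiDB will use for grouping before you import, you can dump these tags from a sample file. The sketch below assumes DCMTK's dcmdump utility is installed and that the file is named sample.dcm (both are assumptions; any DICOM toolkit that prints header tags will work):

# print the tags NiDB uses for subject/study/series grouping
dcmdump sample.dcm | grep -E '0010,0020|0008,0060|0008,0020|0008,0030|0020,0011'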

Copy/move DICOM files into the import directory

Check your configuration (Admin-->Settings-->NiDB Config) for the incomingdir variable. It will most likely be /nidb/data/dicomincoming. This is the directory NiDB searches every minute for new data, which is then automatically parsed and archived.
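
You can also check this setting from the command line. The config file path below is an assumption (it varies by installation), so adjust it to wherever your nidb.cfg actually lives:

# show the configured incoming directory (config file path is an assumption)
grep -i incomingdir /nidb/nidb.cfg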

From a Linux terminal on the NiDB server, run the following commands as the nidb user to find and copy all DICOM files.

cd /directory/with/the/data
find . -name '*.dcm' -exec cp {} /nidb/data/dicomincoming \;

You can also move files, instead of copying, by replacing cp with mv. If your files have a different extension, such as .IMG, or no extension, you can adjust the command accordingly, as in the variations below.
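
The variations below are sketches: the first moves files with a .IMG extension (matched case-insensitively) instead of copying them, and the second copies every regular file when the files have no extension, relying on NiDB to set aside anything unreadable.

# move instead of copy, matching the .IMG extension regardless of case
find . -iname '*.img' -exec mv {} /nidb/data/dicomincoming \;

# files with no extension: copy every regular file and let NiDB sort out what is readable
find . -type f -exec cp {} /nidb/data/dicomincoming \;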

Check status of archiving

Go to Admin-->Modules-->import-->View Logs to view the log files generated by the import process. The global import process expects a continuous stream of data, with no beginning and no end, so the log file will not delineate this particular import from any other data that was found and archived. This import method is also designed to accept a stream of potentially random data and use only the readable DICOM files.
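
If you prefer the terminal, you can follow the same log as it is written. The log directory and filename pattern below are assumptions; check your NiDB configuration for the actual log location on your system:

# follow the most recent import log (path and filename pattern are assumptions)
tail -f /nidb/logs/import*.log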

The global import method will only archive readable DICOM files. Any unreadable or non-DICOM files will be moved to the /nidb/data/problem directory.
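
To see whether any files were set aside, list the problem directory (this is the default path mentioned above; it may differ on your installation):

# list files that could not be read as DICOM, newest first
ls -lt /nidb/data/problem | head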

Here's a sample section of an import log file. Log files can be very detailed, but any errors will show up here.

   NiDB version 2022.6.847
   Build date [Jun 21 2022 16:55:11]
   C++ [201703]
   Qt compiled [6.3.1]
   Qt runtime [6.3.1]
   Build system [x86_64-little_endian-lp64]
[2022/07/01 14:13:01][2845079] Entering the import module
[2022/07/01 14:13:01][2845079] ********** Working on directory [/nidb/data/dicomincoming] with importRowID [0] **********
[2022/07/01 14:13:01][2845079] Found [7] files in [/nidb/data/dicomincoming]
[2022/07/01 14:13:01][2845079] dcmseries contains [1] entries
[2022/07/01 14:13:01][2845079] Getting list of files for seriesuid [1.3.12.2.1107.5.2.19.45351.2022070114112835334403330.0.0.0] - number of files is [7]
[2022/07/01 14:13:01][2845079] ArchiveDICOMSeries() Beginning to archive this DICOM series (0, -1, -1, -1, uidOrAltUID, ModalityStudyDate, SeriesNum, -1, , -1, , )
[2022/07/01 14:13:01][2845079] GetProject() Found project [121961] with id [213]
[2022/07/01 14:13:01][2845079] select subject_id from subjects where uid = ? [S6563ELA]
[2022/07/01 14:13:01][2845079] GetSubject() Subject [S6563ELA] with subjectRowID [7969] found by criteria [uidoraltuid]
[2022/07/01 14:13:01][2845079] ArchiveDICOMSeries() SubjectRowID [7969] found
[2022/07/01 14:13:01][2845079] GetFamily() Entering GetFamily()
[2022/07/01 14:13:01][2845079] GetFamily() Leaving GetFamily()
[2022/07/01 14:13:01][2845079] ArchiveDICOMSeries() GetFamily() returned familyID [4126]  familyUID []
[2022/07/01 14:13:01][2845079] select enrollment_id from enrollment where subject_id = ? and project_id = ? [7969] [213]
[2022/07/01 14:13:02][2845079] GetEnrollment() Subject is enrolled in this project [213], with enrollmentRowID [58018]
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() GetEnrollment returned enrollmentRowID [58018]
[2022/07/01 14:13:02][2845079] GetStudy() Study [S6563ELA2] with studyRowID [77409] found by criteria [modalitystudydate]
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() StudyRowID [77409] found
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() This MR series [13] exists, updating
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() Deleted from mr_qa table, now deleting from qc_results
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() Deleted from qc_results table, now deleting from qc_moduleseries
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() SeriesRowID: [360439]
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() Created outdir [/nidb/data/archive/S6563ELA/2/13/dicom]
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() Checking for existing files in outdir [/nidb/data/archive/S6563ELA/2/13/dicom]
[2022/07/01 14:13:02][2845079] ArchiveDICOMSeries() There are [384] existing files in [/nidb/data/archive/S6563ELA/2/13/dicom]. Beginning renaming of existing files [------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------]  Done renaming existings [0] files
[2022/07/01 14:13:03][2845079] CreateThumbnail() Executed command [convert -normalize /nidb/data/dicomincoming/1.3.12.2.1107.5.2.19.45351.2022070114180219558183887 /nidb/data/archive/S6563ELA/2/13/thumb.png], Output [], elapsed time [0.255 sec]
[2022/07/01 14:13:03][2845079] ArchiveDICOMSeries() Renaming new files [.......]  Done renaming [7] new files
[2022/07/01 14:13:03][2845079] ArchiveDICOMSeries() Archive directory [/nidb/data/archive/S6563ELA/2/13/dicom] is [807831694] bytes in size and contains [391] files
[2022/07/01 14:13:03][2845079] ArchiveDICOMSeries() Executed command [chmod -Rf 777 /nidb/data/archive/S6563ELA/2/13/dicom], Output [], elapsed time [0.056 sec]
[2022/07/01 14:13:03][2845079] ArchiveDICOMSeries() Starting copy to the backup directory
[2022/07/01 14:13:04][2845079] ArchiveDICOMSeries() Executed command [rsync -az /nidb/data/archive/S6563ELA/2/13/dicom/* /nidb/data/backup/S6563ELA/2/13], Output [], elapsed time [1.269 sec]
[2022/07/01 14:13:04][2845079] ArchiveDICOMSeries() Finished copying to the backup directory
[2022/07/01 14:13:04][2845079] Performance metrics
Elapsed time: 3s
Subjects [0]  Studies [0]  Series [0]
FilesRead [0]  FilesArchived [0]  FilesIgnored [0]  FilesError [0]
Read rate: Bytes/Sec [3.33333e-08]
[2022/07/01 14:13:04][2845079] ModuleCheckIfActive() returned true
[2022/07/01 14:13:04][2845079] Finished archiving data for [/nidb/data/dicomincoming]
[2022/07/01 14:13:04][2845079] Performance metrics
Elapsed time: 3s
Subjects [0]  Studies [0]  Series [0]
FilesRead [7]  FilesArchived [0]  FilesIgnored [0]  FilesError [0]
Read rate: Bytes/Sec [4.82082e+06]
[2022/07/01 14:13:04][2845079] Found [0] directories in [/nidb/data/dicomincoming]
[2022/07/01 14:13:04][2845079] Directories found: 
[2022/07/01 14:13:04][2845079] Leaving the import module
[2022/07/01 14:13:04][2845079] Successfully removed lock file [/nidb/lock/import.2845079]

You may ask... where's my data? Once archived, you can find it on the Search page by searching on ID, date, protocol, and other information.

Potential problems

The global import method will group files by the criteria described above. If one of those fields is blank for some or all of your data, the archiving process may create a subject/study/series hierarchy that does not match what you expect. Sometimes you will find that each series is placed in its own study, or that each study is placed under a unique subject.
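
To check whether the grouping tags are actually populated, dump them across a few files before importing. This is a sketch that again assumes DCMTK's dcmdump is available and that the files end in .dcm:

# print PatientID, Modality, StudyDate, StudyTime, and SeriesNumber for each file;
# an empty value in brackets [] means the tag is blank and will affect grouping
for f in *.dcm; do
    echo "== $f"
    dcmdump "$f" | grep -E '0010,0020|0008,0060|0008,0020|0008,0030|0020,0011'
done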

To troubleshoot these issues, try using the individual import method described below. This allows you to select different matching criteria and preview the data found before archiving it.

Step 2 - (b) Individual Import

This tutorial is based on the Importing data section of the User's Guide, but the content on this page is more detailed. See the User's Guide section:

Importing data

Go to Data-->Import Imaging. Click the New Import button.

Fill out the required information. Choose whether you are uploading a file or the data is located on an NFS path. Select the modality and project. Then select the matching criteria, which will determine how the data will be structured into a subject/study/series hierarchy. When everything is set, click Upload.
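
If your data is sitting in a loose directory and you want to upload it as a single file, you can zip it first. This assumes the zip utility is installed; the archive and directory names are just examples:

# bundle the DICOM directory (including subdirectories) into one zip file for upload
zip -r my_dicom_data.zip /directory/with/the/data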

Check the status of the import by going to Data-->Import Imaging and finding the import that was just created. The current import step will be displayed, and you can click View Import to see more details.

If the import has finished parsing, it will ask for your attention. You'll need to review the subjects, studies, and series that were found and then select which series you want to archive.
