pfcon FS and DS plugin example
This page provides instructions for interacting with pfcon in a manner similar to how CUBE would. Data is uploaded to swift storage; an FS plugin is then run on the data, followed by a DS plugin on the results.
The set of operations is:
- pull a sample dataset from github;
- push the dataset into swift object storage;
- run an FS plugin, pl-dircopy, that copies the data from swift storage and reorganizes it in a different location;
- run a DS plugin on the results that extracts some DICOM meta data.
- pfurl and supporting requirements. Doing a pip install pfurl in a Python virtualenv should take care of everything. If, however, there are issues with CURL_OPENSSL_3, you might need to also do (for Ubuntu):
apt-get install -y libssl-dev libcurl4-openssl-dev
- pl-pfdicom_tagsub and pl-pfdicom_tagextract: pull the containers:
docker pull fnndsc/pl-pfdicom_tagextract
docker pull fnndsc/pl-pfdicom_tagsub
and check versions:
docker run --rm fnndsc/pl-pfdicom_tagextract \
dcm_tagExtract.py --version /tmp /tmp 2>/dev/null
which should reply with
Plugin Version: 1.0
Internal pfdicom_tagExtract Version: 2.2.2
and
docker run --rm fnndsc/pl-pfdicom_tagsub \
dcm_tagSub.py --version /tmp /tmp 2>/dev/null
which should reply with
Plugin Version: 1.0.2
Internal pfdicom_tagSub Version: 1.4.6
Make sure you are in the base directory of the pfcon repo:
git clone https://github.com/FNNDSC/pfcon.git
cd pfcon
export HOST_IP=$(ip route | grep -v docker | awk '{if(NF==11) print $9}')
export HOST_PORT=8000
Pull a sample data set that will be used in this example:
git clone https://github.com/FNNDSC/SAG-anon
Set a convenience variable:
export DICOMDIR=$(pwd)/SAG-anon
Start (or restart) the pfcon services, clearing any state from previous runs:
unmake ; sudo rm -fr FS; rm -fr FS ; make
If you have instantiated pfcon, you might want to try the following in a new terminal:
cd pfcon
export HOST_IP=$(ip route | grep -v docker | awk '{if(NF==11) print $9}')
export HOST_PORT=8000
export DICOMDIR=$(pwd)/SAG-anon
The PUSH operation relies on the command line app swift, which is just a pip install away:
pip install python-swiftclient
Now, use the swiftCtl.sh script to push the data:
./swiftCtl.sh -A push -E dcm -D $DICOMDIR -P chris/uploads/DICOM/dataset1
./swiftCtl.sh
You should see a listing of files in swift storage:
chris/uploads/DICOM/dataset1/0001-1.3.12.2.1107.5.2.19.45152.2013030808110258929186035.dcm
chris/uploads/DICOM/dataset1/0002-1.3.12.2.1107.5.2.19.45152.2013030808110261698786039.dcm
chris/uploads/DICOM/dataset1/0003-1.3.12.2.1107.5.2.19.45152.2013030808110259940386037.dcm
chris/uploads/DICOM/dataset1/0004-1.3.12.2.1107.5.2.19.45152.2013030808110256555586033.dcm
...
...
chris/uploads/DICOM/dataset1/0190-1.3.12.2.1107.5.2.19.45152.2013030808105512578785411.dcm
chris/uploads/DICOM/dataset1/0191-1.3.12.2.1107.5.2.19.45152.2013030808105486367685381.dcm
chris/uploads/DICOM/dataset1/0192-1.3.12.2.1107.5.2.19.45152.2013030808105485455785379.dcm
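Each object name in the listing pairs a zero-padded sequence index with what appears to be the image's DICOM SOPInstanceUID, joined by a hyphen. A small illustrative sketch of parsing a name back apart (the helper is hypothetical, not part of swiftCtl.sh):

```python
def parse_object_name(path: str) -> tuple[int, str]:
    """Split a swift object name like 'chris/.../0001-<uid>.dcm'
    into its sequence index and UID components."""
    fname = path.rsplit("/", 1)[-1]      # drop the container prefix
    stem = fname.removesuffix(".dcm")    # drop the extension
    index, uid = stem.split("-", 1)      # first hyphen separates the parts
    return int(index), uid

idx, uid = parse_object_name(
    "chris/uploads/DICOM/dataset1/"
    "0001-1.3.12.2.1107.5.2.19.45152.2013030808110258929186035.dcm")
# idx == 1
# uid == "1.3.12.2.1107.5.2.19.45152.2013030808110258929186035"
```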
Using pfurl, call pfcon, passing it the location of the images in storage and instructing the pl-dircopy plugin to copy these files to a new location:
pfurl \
--verb POST --raw --http ${HOST_IP}:5005/api/v1/cmd \
--httpResponseBodyParse \
--jsonwrapper 'payload' --msg '
{
"action": "coordinate",
"meta-compute": {
"auid": "chris",
"cmd": "python3 /usr/src/dircopy/dircopy.py --saveinputmeta --saveoutputmeta --dir /share/incoming /share/outgoing",
"container": {
"manager": {
"app": "swarm.py",
"env": {
"meta-store": "key",
"serviceName": "1",
"serviceType": "docker",
"shareDir": "%shareDir"
},
"image": "fnndsc/swarm"
},
"target": {
"cmdParse": false,
"execshell": "python3",
"image": "fnndsc/pl-dircopy",
"selfexec": "dircopy.py",
"selfpath": "/usr/src/dircopy"
}
},
"cpu_limit": "1000m",
"gpu_limit": 0,
"jid": "1",
"memory_limit": "200Mi",
"number_of_workers": "1",
"service": "host",
"threaded": true
},
"meta-data": {
"localSource": {
"path": "chris/uploads/DICOM/dataset1",
"storageType": "swift"
},
"localTarget": {
"createDir": true,
"path": "chris/feed_1/dircopy_1/data"
},
"remote": {
"key": "%meta-store"
},
"service": "host",
"specialHandling": {
"cleanup": true,
"op": "plugin"
},
"transport": {
"compress": {
"archive": "zip",
"cleanup": true,
"unpack": true
},
"mechanism": "compress"
}
},
"meta-store": {
"key": "jid",
"meta": "meta-compute"
},
"threadAction": true
} ' --quiet --jsonpprintindent 4
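The long --msg argument is ordinary JSON. If you are scripting against pfcon, it can be less error-prone to build the message as a Python dict and serialize it; a minimal sketch, trimmed to the fields that differ between the two jobs in this walkthrough (see the full payload above for the container and transport sections; pfurl's --jsonwrapper 'payload' option nests the message under a top-level "payload" key):

```python
import json

# Skeleton of the "coordinate" message above, trimmed to the fields
# that differ between the two jobs in this walkthrough.
msg = {
    "action": "coordinate",
    "meta-compute": {
        "jid": "1",
        "cmd": "python3 /usr/src/dircopy/dircopy.py --saveinputmeta "
               "--saveoutputmeta --dir /share/incoming /share/outgoing",
        "service": "host",
    },
    "meta-data": {
        "localSource": {"path": "chris/uploads/DICOM/dataset1",
                        "storageType": "swift"},
        "localTarget": {"path": "chris/feed_1/dircopy_1/data",
                        "createDir": True},
        "service": "host",
    },
    "meta-store": {"key": "jid", "meta": "meta-compute"},
    "threadAction": True,
}

# What pfurl transmits when given --jsonwrapper 'payload':
body = json.dumps({"payload": msg})
```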
If you are monitoring the containers, you can, in three separate terminals, do:
# In terminal 1:
docker-compose logs --follow pfcon_service
# In terminal 2:
docker-compose logs --follow pman_service
# In terminal 3:
docker-compose logs --follow pfioh_service
If all goes well, in the pfcon log terminal you should see the tail end of a long output:
"chris/feed_1/dircopy_1/data/0189-1.3.12.2.1107.5.2.19.45152.2013030808105517130085417.dcm",
"chris/feed_1/dircopy_1/data/0190-1.3.12.2.1107.5.2.19.45152.2013030808105512578785411.dcm",
"chris/feed_1/dircopy_1/data/0191-1.3.12.2.1107.5.2.19.45152.2013030808105486367685381.dcm",
"chris/feed_1/dircopy_1/data/0192-1.3.12.2.1107.5.2.19.45152.2013030808105485455785379.dcm",
"chris/feed_1/dircopy_1/data/input.meta.json",
"chris/feed_1/dircopy_1/data/output.meta.json"
],
"fullPath": "chris/feed_1/dircopy_1/data"
}
}
Check the contents of the swift storage:
./swiftCtl.sh
and check that the following files exist:
chris/feed_1/dircopy_1/data/0001-1.3.12.2.1107.5.2.19.45152.2013030808110258929186035.dcm
chris/feed_1/dircopy_1/data/0002-1.3.12.2.1107.5.2.19.45152.2013030808110261698786039.dcm
chris/feed_1/dircopy_1/data/0003-1.3.12.2.1107.5.2.19.45152.2013030808110259940386037.dcm
chris/feed_1/dircopy_1/data/0004-1.3.12.2.1107.5.2.19.45152.2013030808110256555586033.dcm
chris/feed_1/dircopy_1/data/0005-1.3.12.2.1107.5.2.19.45152.2013030808110251492986029.dcm
...
chris/feed_1/dircopy_1/data/0190-1.3.12.2.1107.5.2.19.45152.2013030808105512578785411.dcm
chris/feed_1/dircopy_1/data/0191-1.3.12.2.1107.5.2.19.45152.2013030808105486367685381.dcm
chris/feed_1/dircopy_1/data/0192-1.3.12.2.1107.5.2.19.45152.2013030808105485455785379.dcm
chris/feed_1/dircopy_1/data/input.meta.json
chris/feed_1/dircopy_1/data/jobStatus.json
chris/feed_1/dircopy_1/data/jobStatusSummary.json
chris/feed_1/dircopy_1/data/output.meta.json
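The target paths used so far follow a consistent convention, chris/feed_<N>/<pluginName>_<jid>/data, with each plugin instance nesting its output under its predecessor's directory. A sketch of that convention (inferred from the paths in this example; the helper is hypothetical, not an official CUBE API):

```python
def plugin_output_path(parent: str, plugin_name: str, jid: str) -> str:
    """Build the next plugin instance's output path by nesting
    <pluginName>_<jid>/data under the parent's directory."""
    base = parent.removesuffix("/data")  # nest beside the parent's data dir
    return f"{base}/{plugin_name}_{jid}/data"

p1 = plugin_output_path("chris/feed_1", "dircopy", "1")
# p1 == "chris/feed_1/dircopy_1/data"
p2 = plugin_output_path(p1, "pfdicom_tagextract", "2")
# p2 == "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data"
```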
We will now run this data through another plugin, pl-pfdicom_tagextract, which will extract meta data from the files and also convert the middle image to a jpg:
pfurl \
--verb POST --raw --http ${HOST_IP}:5005/api/v1/cmd \
--httpResponseBodyParse \
--jsonwrapper 'payload' --msg '
{
"action": "coordinate",
"meta-compute": {
"auid": "chris",
"cmd": "python3 /usr/src/dcm_tagExtract/dcm_tagExtract.py /share/incoming /share/outgoing --saveinputmeta --saveoutputmeta -e dcm -m m:%_nospc|-_ProtocolName.jpg -s 2 -o %PatientID-%PatientAge -t raw,json,html,dict,col,csv --useIndexhtml --threads 0 -v 5",
"container": {
"manager": {
"app": "swarm.py",
"env": {
"meta-store": "key",
"serviceName": "2",
"serviceType": "docker",
"shareDir": "%shareDir"
},
"image": "fnndsc/swarm"
},
"target": {
"cmdParse": false,
"execshell": "python3",
"image": "fnndsc/pl-pfdicom_tagextract",
"selfexec": "dcm_tagExtract.py",
"selfpath": "/usr/src/dcm_tagExtract"
}
},
"cpu_limit": "1000m",
"gpu_limit": 0,
"jid": "2",
"memory_limit": "200Mi",
"number_of_workers": "1",
"service": "host",
"threaded": true
},
"meta-data": {
"localSource": {
"path": "chris/feed_1/dircopy_1/data",
"storageType": "swift"
},
"localTarget": {
"createDir": true,
"path": "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data"
},
"remote": {
"key": "%meta-store"
},
"service": "host",
"specialHandling": {
"cleanup": true,
"op": "plugin"
},
"transport": {
"compress": {
"archive": "zip",
"cleanup": true,
"unpack": true
},
"mechanism": "compress"
}
},
"meta-store": {
"key": "jid",
"meta": "meta-compute"
},
"threadAction": true
} ' --quiet --jsonpprintindent 4
As before, if you are monitoring the containers, you can watch the pfcon_service, pman_service, and pfioh_service logs in their three terminals.
If all goes well, in the pfcon log terminal you should see the tail end of a long output:
"lsList": [
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-col.txt",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-csv.txt",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-dict.txt",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-raw.txt",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y.json",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/SAG-MPRAGE-220-FOV.jpg",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/index.html",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/input.meta.json",
"chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/output.meta.json"
],
"fullPath": "chris/feed_1/dircopy_1/pfdicom_tagextract_2/data"
}
}
Check the contents of the swift storage:
./swiftCtl.sh
and check that the following files exist:
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-col.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-csv.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-dict.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y-raw.txt
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/1449c1d-003Y.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/SAG-MPRAGE-220-FOV.jpg
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/index.html
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/input.meta.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/jobStatus.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/jobStatusSummary.json
chris/feed_1/dircopy_1/pfdicom_tagextract_2/data/output.meta.json
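The output names above come straight from the templates in the plugin command: 1449c1d-003Y is the -o template %PatientID-%PatientAge, and the jpg name is the ProtocolName tag with spaces replaced by hyphens, per the %_nospc|-_ProtocolName part of the -m template. A sketch of that substitution (tag values inferred from the listing; this is illustrative, not pfdicom_tagExtract's actual code):

```python
# Hypothetical tag values, inferred from the output file names above.
tags = {"PatientID": "1449c1d", "PatientAge": "003Y",
        "ProtocolName": "SAG MPRAGE 220 FOV"}

def nospc(value: str, repl: str = "-") -> str:
    # Mirrors the %_nospc|-_... template: replace spaces before
    # using a tag value in a file name.
    return value.replace(" ", repl)

prefix = f"{tags['PatientID']}-{tags['PatientAge']}"   # -o template
jpg = nospc(tags["ProtocolName"]) + ".jpg"             # -m template
# prefix == "1449c1d-003Y"
# jpg == "SAG-MPRAGE-220-FOV.jpg"
```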
Finally, pull the resultant files from swift storage:
./swiftCtl.sh -A pull -P chris/feed_1/dircopy_1/pfdicom_tagextract_2/data -O pull
which should pull the files into a tree starting at ./pull.