Strange dice coefficient on volume-54.nii #5
And there is another question that confused me: 'epoch=3000' is used to train the network, but I found that the network tends to converge quite early (perhaps before epoch 1000).
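If the network really converges well before epoch 3000, a simple early-stopping check on the validation loss could save training time. This is a minimal sketch, not something present in this repository; the class name, `patience`, and `min_delta` are my own choices:

```python
# Minimal early-stopping sketch (not part of this repo): stop when the
# validation loss has not improved for `patience` consecutive checks.
class EarlyStopper:
    def __init__(self, patience=50, min_delta=1e-4):
        self.patience = patience      # checks to wait before stopping
        self.min_delta = min_delta    # improvement must exceed this margin
        self.best = float("inf")
        self.bad_checks = 0

    def step(self, val_loss):
        """Record one validation loss; return True when training should stop."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.bad_checks = 0
        else:
            self.bad_checks += 1
        return self.bad_checks >= self.patience
```

The training loop would call `stopper.step(val_loss)` once per validation pass and break when it returns True.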
Could you tell me whether you feed the whole volume for training or split it into 3D patches? Please confirm. Thanks.
And which file of the code do you use for testing?
As the author did, I used 256×256×48 3D patches as the input of the network. The patches were obtained by 'data_prepare/get_random_data.py' and 'dataset/data_random.py'. The former pre-processes the training data, including downsampling on the xy plane and keeping the slices that contain liver (expanded by 20 slices in both directions along the Z axis), while the latter randomly extracts 48 continuous slices from the former's output. The latter's results are used directly as the input of the network.
val.py
I followed the steps above. The first script gave me a 256×256×n image, but it did not give me 256×256×48 3D patches. Should I extract them manually? I understand the output of 'data_prepare/get_random_data.py', but the second script, 'dataset/data_random.py', does not give me a 256×256×48 3D patch directly. It is used in train_ds.py as `from dataset.dataset_random import train_ds`.
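For what it's worth, the random 48-slice crop described in this thread (turning a 256×256×n pre-processed volume into a 256×256×48 training patch) can be sketched as follows. This is a sketch under the thread's assumptions, not the repository's actual code; the function and parameter names are mine:

```python
import numpy as np

def random_depth_patch(volume, seg, depth=48, rng=None):
    """Randomly crop `depth` consecutive slices along the first (Z) axis.

    `volume` and `seg` are aligned arrays of shape (n_slices, H, W);
    the same random start index is applied to both so that image and
    label stay in sync. Requires n_slices >= depth.
    """
    rng = rng or np.random.default_rng()
    n_slices = volume.shape[0]
    start = int(rng.integers(0, n_slices - depth + 1))
    return volume[start:start + depth], seg[start:start + depth]
```

A dataset class would typically do such a crop inside `__getitem__`, so each epoch sees a different 48-slice window of every volume.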
Hi @mitiandi, @ahmadmubashir,
To tell the truth, I have forgotten the details, since a long time has passed. But it seems right; all you need to do is change the data path to yours. Good luck~
> On Wed, Jul 24, 2019, zz10001 wrote:
>
> Hi @mitiandi, @ahmadmubashir,
> Does this project start by using data_prepare/get_random_data.py to pre-process the training data and dataset/data_random.py to extract 48 continuous slices, after which I can run `python train_ds.py` directly to start training? Looking forward to your reply!
>
> Best,
> Ming
Thanks for your kind help; may you have a good day!
Hi, have you found the reason, or how did you solve it?
You can solve it by #6 (comment). |
Thank you very much!
Hi, I used this method, but the dice coefficient on volume-43 is still 0.67. I want to know if this is a normal value.
The input of DialResUnet is 512×512, but the output is 1024×1024. Shouldn't the input and output be the same size? Do you know why this is?
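An output that is exactly twice the input size usually means one upsampling step (e.g. a stride-2 transposed convolution) is not matched by a corresponding downsampling step. The spatial size of a 2-D transposed convolution follows a fixed formula, which makes the mismatch easy to check on paper. The helper name below is mine; the formula is the standard one documented for PyTorch's `ConvTranspose2d`:

```python
def conv_transpose2d_out(size, kernel_size, stride=1, padding=0, output_padding=0):
    """Spatial output size of a 2-D transposed convolution.

    Standard formula: out = (in - 1) * stride - 2 * padding
                            + kernel_size + output_padding
    """
    return (size - 1) * stride - 2 * padding + kernel_size + output_padding

# A stride-2, kernel-2 transposed conv doubles the spatial size
# (512 -> 1024), which would explain the mismatch described above.
```

Counting the network's stride-2 downsampling layers against its stride-2 upsampling layers should reveal the extra doubling.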
What is the code operation procedure for this project? Can you share it? Thank you.
Have you visualized the prediction with ITK-SNAP or other viewer software? Maybe you should look at the prediction first.
It seems easy to exchange the colors for liver and tumor: you just need to swap the labels in the .nii file you predicted.
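The label swap suggested above can be sketched with NumPy, assuming the LiTS convention (0 = background, 1 = liver, 2 = tumor); the function name is mine, and loading/saving the .nii file itself would be done with a NIfTI library such as nibabel:

```python
import numpy as np

def swap_liver_tumor(label_volume):
    """Swap label 1 (liver) and label 2 (tumor); background (0) is unchanged.

    Assumes the LiTS convention: 0 = background, 1 = liver, 2 = tumor.
    Masks are computed on the input so the two assignments don't interfere.
    """
    out = label_volume.copy()
    out[label_volume == 1] = 2
    out[label_volume == 2] = 1
    return out
```

Note that literally negating the array would produce labels 0/-1/-2 rather than exchanging the two foreground classes, which is why an explicit swap is safer.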
Hi @zz10001, @life-8079,
I randomly split the training dataset (131 cases) into two non-overlapping subsets: a training set (105 cases) and a test set (26 cases). When I finished training the network and tested it on the test set (26 cases), I obtained a dice per case of 0.932, which is lower than your result (0.957).
Most importantly, I found that the dice coefficient on volume-54.nii is very poor (0.18). I then visualized the segmentation result of volume-54.nii and compared it to its ground truth, and found a dislocation of about 10 slices between them: for example, the segmentation result started at slice 62, while the ground truth started at slice 52.
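For reference, the per-case dice coefficient discussed throughout this thread is 2|A∩B| / (|A| + |B|) over the binary foreground masks, which is why a ~10-slice misalignment drives the score down so sharply: whole slices contribute to |A| and |B| but not to the intersection. A minimal sketch (function name is mine):

```python
import numpy as np

def dice_per_case(pred, truth):
    """Dice coefficient 2|A∩B| / (|A| + |B|) for binary masks."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:
        return 1.0  # both masks empty: perfect agreement by convention
    return 2.0 * np.logical_and(pred, truth).sum() / denom
```

Running this per volume (rather than pooling voxels over the whole test set) gives the "dice per case" metric used in the LiTS evaluation.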