Tarfiles #27 (Open)
bganglia wants to merge 54 commits into mlmed:master from bganglia:tarfiles.
Commits (54), all by bganglia:

- 9f3eb7b Open MIMIC from tarfile
- 0414355 Merge branch 'master' of https://github.com/ieee8023/torchxrayvision …
- 289747d Merge branch 'master' of https://github.com/ieee8023/torchxrayvision …
- 303832c revert whitespace
- da2490b don't use get_image() in NIH_Dataset
- 8fb3f03 NIH_Dataset extends TarDataset
- 395e5e4 Store tarfiles in dictionary
- fa69973 use getnames instead of getmembers
- abbbfec use O(n) method for determining imgid from tar_path
- 2ba6f5d random data in MIMIC format
- cacc3ad script for generating random MIMIC data
- ecbf302 track random MIMIC data
- 04f1a32 tarfile test using random MIMIC data
- 90129ab fix test directory
- 0aa52a7 use .close() on tarfile and regenerate test directory
- 349babb support for tarfiles in NIH dataset
- 6999bd3 Inherit from TarDataset in PC_Dataset
- 842ddf8 Storage-agnostic dataset
- 37afa4e Inherit from storage agnostic loader
- bbd4007 tidy up tarfile code
- 34daddb remove previous TarDataset, ZipDataset classes
- 727d9ff Scripts for generating test data
- d2ae7c0 Test data
- 41b50c4 Tests for zip, tar in MIMIC, NIH, and PC
- 48d8170 clean up storage classes
- 5c4117e save progress
- 2773c69 inherit from Dataset in NIH_Dataset
- 7ffc252 Add code for automated tests with script-generated data
- 68a71ae script for writing random data
- ec9777b fall back on .index() instead of trying to load a cached version in .…
- 29498a6 support multiprocessing
- 3674357 Clean up new code for tests and format interfaces
- ccec9ae write partial metadata files with subset of columns
- c091734 Improve caching
- e56a565 fix tests
- 1dde4b7 fix error in data-generation script
- 1628db4 create .torchxrayvision if it does not already exist
- 124467c fix line adding .torchxrayvision
- 28816e5 Commit sample data for testing NLM_TB datasets, instead of auto-gener…
- ce38e57 Commit covid test cases
- 281935c Include parallel tests again
- 9c2c9d2 trycatch on reading/writing stored_mappings, with disk_unwriteable_ou…
- 7c6aebb work when .torchxrayvision is not writeable
- cb97e70 remove some print statements
- 950ae96 add test simulating an unwriteable disk
- 300c9d7 use filesystem instead of dictionary
- 218fa75 rewrite data generation scripts as python, not bash scripts; add para…
- b22cead cleanup: better variable names and use blake2b instead of hash (works…
- ae09bc9 Add test for asserting a dataset loads faster the second time
- 30c043b Don't invoke duration test, to avoid spurious errors
- bfdebf2 Call on new data generation script
- 0f7ea51 simplify and improve documentation
- 71c7a50 reorganize
- 1715b9d Fix path length in CheX_Dataset
Changes from commit 8fb3f039c56e4138d0db2c51af2939a6331af10a ("NIH_Dataset extends TarDataset"):
@@ -200,7 +200,7 @@ def __getitem__(self, idx):
 class TarDataset(Dataset):
     def __init__(self, imgpath):
-        if imgpath.endswith(".tar"):
+        if tarfile.is_tarfile(imgpath):
             self.tarred = tarfile.open(imgpath)
             self.tar_paths = self.tarred.getmembers()
         else:

@@ -215,7 +215,7 @@ def get_image(self, path):
         bytes = self.tarred.extractfile(name).read()
         return np.array(Image.open(BytesIO(bytes)))

-class NIH_Dataset(Dataset):
+class NIH_Dataset(TarDataset):
     """
     NIH ChestX-ray8 dataset

@@ -314,7 +314,7 @@ def __getitem__(self, idx):
         imgid = self.csv['Image Index'].iloc[idx]
         #img_path = os.path.join(self.imgpath, imgid)
         #print(img_path)
-        img = imread(img_path)
+        img = self.get_image(imgid)

         if self.normalize:
             img = normalize(img, self.MAXVAL)

Review comment on the get_image line: You added this for NIH_Dataset but it doesn't extend TarDataset.
Reply: Ah, let me fix that.
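The hunks above can be condensed into a self-contained sketch of the idea: detect a tarball with tarfile.is_tarfile() rather than trusting the file extension, and route image reads through a single get_image() helper. The TarReader name and the plain-directory fallback here are illustrative, not the PR's exact code.

```python
import tarfile
from io import BytesIO

import numpy as np
from PIL import Image


class TarReader:
    """Read images either from a directory or from a .tar archive."""

    def __init__(self, imgpath):
        self.imgpath = imgpath
        self.tarred = None
        if tarfile.is_tarfile(imgpath):
            self.tarred = tarfile.open(imgpath)
            # getnames() yields plain strings, which are cheaper to
            # search than the TarInfo objects from getmembers().
            self.tar_paths = self.tarred.getnames()

    def get_image(self, name):
        if self.tarred is not None:
            # Resolve the member whose path ends with the requested id.
            member = next(p for p in self.tar_paths if p.endswith(name))
            data = self.tarred.extractfile(member).read()
            return np.array(Image.open(BytesIO(data)))
        # Fall back to an ordinary file on disk.
        return np.array(Image.open(f"{self.imgpath}/{name}"))
```

Because callers only ever see get_image(), the same dataset class can serve images from a directory or an archive without branching at every access, which is the storage-agnostic direction the later commits take.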
Comment: So you said this takes a lot of time. I see how this approach is super robust. I did some timing tests and it takes only about 30 seconds for the MIMIC data, which seems a reasonable price to pay. What about caching the result in a dict keyed by the file path? That would speed things up when multiple objects are created. The alternative is to skip the scan and assume a fixed file structure, but that is not as nice.
Reply: OK, using a dict, the second load is around 10x faster on my machine. The dict could also be pickled so there is only one slow load. That option could cause problems if someone later changed the tarfile, although I don't know why they would.
Comment: Hashing and caching to a file could be nice, but it could also be annoying to debug and create more issues (for example, if there are no write permissions).
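A hedged sketch of what such an on-disk cache might look like, with a stable blake2b key and a silent fallback when the cache directory is not writable. The later commits mention switching to blake2b (Python's built-in hash() is not stable across runs) and handling an unwritable .torchxrayvision directory, but the function name and layout here are illustrative, not the PR's exact code.

```python
import hashlib
import os
import pickle
import tarfile


def cached_tar_index(imgpath, cache_dir):
    """Load a pickled name index for a tarball, rebuilding it on a miss.

    Any failure to read or write the cache falls back to a fresh scan,
    so an unwritable disk degrades performance, not correctness.
    """
    # blake2b of the absolute path gives a stable, filesystem-safe key.
    key = hashlib.blake2b(os.path.abspath(imgpath).encode()).hexdigest()
    cache_file = os.path.join(cache_dir, key + ".pkl")

    # Try the cache first; a corrupt or missing file just means a miss.
    try:
        with open(cache_file, "rb") as f:
            return pickle.load(f)
    except (OSError, pickle.PickleError):
        pass

    # Cache miss: do the slow scan of the archive's table of contents.
    with tarfile.open(imgpath) as tar:
        names = tar.getnames()

    # Best-effort write-back; e.g. a read-only filesystem skips caching.
    try:
        os.makedirs(cache_dir, exist_ok=True)
        with open(cache_file, "wb") as f:
            pickle.dump(names, f)
    except OSError:
        pass
    return names
```

Wrapping both the read and the write in try/except is what keeps the debugging surface small: the cache can never make a load fail, only make it slower, which matches the trycatch-with-fallback direction of the later commits.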