
Commit c4336b6

talmo, roomrys, sheridana authored Oct 10, 2022

Docs update (#993)

* Fix printing of auto-selected GPU free memory (#955)
* Fix printing of auto-selected GPU free memory
* Fix printing in inference
* Lint
* Fix add suggestions when target is current video (#956)
* Fix export cli arg parsing (#962)
* Fix cattr in Python 3.9 (#967)
* Remove structuring hook for forward reference and handle manually
* Handle missing key
* Add sample movies to docs (#992)
* Add initial examples
* Add clips for all datasets
* Add quick example clip
* Typo on instance count for flies13

Co-authored-by: Liezl Maree <[email protected]>
Co-authored-by: sheridana <[email protected]>
1 parent 659475e

File tree

8 files changed: +77 -29 lines changed

8 files changed

+77
-29
lines changed
 

.github/workflows/website.yml (+1 -1)

```diff
@@ -8,7 +8,7 @@ on:
       # 'main' triggers updates to 'sleap.ai', 'develop' to 'sleap.ai/develop'
       - main
       - develop
-      - liezl/siv-inference
+      - talmo/sample-movies
     paths:
       - "docs/**"
       - "README.rst"
```

docs/datasets.md (+23 -7)

`````diff
@@ -11,6 +11,10 @@ For the full set of labeled datasets and trained models reported in the SLEAP pa
 ([Pereira et al., Nature Methods, 2022](https://www.nature.com/articles/s41592-022-01426-1)),
 please see the [OSF repository](https://osf.io/36har/).
 
+````{hint}
+Need a quick testing clip? [Here's a video of a pair of mice in a standard home cage setting.](
+https://storage.googleapis.com/sleap-data/sandbox/sleap-mice-demo/mice.mp4)
+````
 
 ## `fly32`
 ![fly32](_static/example.fly32.jpg)
@@ -43,7 +47,9 @@ widths: 10 40
 * - Labels
   - 1500
 * - Download
-  - [Train](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/random_split1/train.pkg.slp) / [Validation](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/random_split1/val.pkg.slp) / [Test](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/random_split1/test.pkg.slp)
+  - [Train (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/random_split1/train.pkg.slp) / [Validation (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/random_split1/val.pkg.slp) / [Test (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/random_split1/test.pkg.slp)
+* - Example
+  - [Clip (`.mp4`)](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/clips/072212_173836%400-3200.mp4) / [Tracking (`.slp`)](https://storage.googleapis.com/sleap-data/datasets/BermanFlies/clips/072212_173836%400-3200.slp)
 * - Credit
   - [Berman et al. (2014)](https://royalsocietypublishing.org/doi/10.1098/rsif.2014.0672), [Pereira et al. (2019)](https://www.nature.com/articles/s41592-018-0234-5), [Pereira et al. (2022)](https://www.nature.com/articles/s41592-022-01426-1), Talmo Pereira, Gordon Berman, Joshua Shaevitz
 ```
@@ -77,9 +83,11 @@ widths: 10 40
 * - Identity
   - ✔
 * - Labels
-  - 2000 frames, 2000 instances
+  - 2000 frames, 4000 instances
 * - Download
-  - [Train](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/tracking_split2/train.pkg.slp) / [Validation](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/tracking_split2/val.pkg.slp) / [Test](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/tracking_split2/test.pkg.slp)
+  - [Train (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/tracking_split2/train.pkg.slp) / [Validation (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/tracking_split2/val.pkg.slp) / [Test (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/tracking_split2/test.pkg.slp)
+* - Example
+  - [Clip (`.mp4`)](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/clips/talk_title_slide%4013150-14500.mp4) / [Tracking (`.slp`)](https://storage.googleapis.com/sleap-data/datasets/wt_gold.13pt/clips/talk_title_slide%4013150-14500.slp)
 * - Credit
   - [Pereira et al. (2022)](https://www.nature.com/articles/s41592-022-01426-1), Junyu Li, Shruthi Ravindranath, Talmo Pereira, Mala Murthy
 ```
@@ -115,7 +123,9 @@ widths: 10 40
 * - Labels
   - 1000 frames, 2950 instances
 * - Download
-  - [Train](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/labels.full/random_split1/train.pkg.slp) / [Validation](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/labels.full/random_split1/val.pkg.slp) / [Test](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/labels.full/random_split1/test.pkg.slp)
+  - [Train (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/labels.full/random_split1/train.pkg.slp) / [Validation (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/labels.full/random_split1/val.pkg.slp) / [Test (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/labels.full/random_split1/test.pkg.slp)
+* - Example
+  - [Clip (`.mp4`)](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/clips/OFTsocial5mice-0000-00%4015488-18736.mp4) / [Tracking (`.slp`)](https://storage.googleapis.com/sleap-data/datasets/wang_4mice_john/clips/OFTsocial5mice-0000-00%4015488-18736.slp)
 * - Credit
   - [Pereira et al. (2022)](https://www.nature.com/articles/s41592-022-01426-1), John D'Uva, Mikhail Kislin, Samuel S.-H. Wang
 ```
@@ -151,7 +161,9 @@ widths: 10 40
 * - Labels
   - 1474 frames, 2948 instances
 * - Download
-  - [Train](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/random_split1/train.pkg.slp) / [Validation](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/random_split1/val.pkg.slp) / [Test](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/random_split1/test.pkg.slp)
+  - [Train (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/random_split1/train.pkg.slp) / [Validation (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/random_split1/val.pkg.slp) / [Test (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/random_split1/test.pkg.slp)
+* - Example
+  - [Clip (`.mp4`)](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/clips/20200111_USVpairs_court1_M1_F1_top-01112020145828-0000%400-2560.mp4) / [Tracking (`.slp`)](https://storage.googleapis.com/sleap-data/datasets/eleni_mice/clips/20200111_USVpairs_court1_M1_F1_top-01112020145828-0000%400-2560.slp)
 * - Credit
   - [Pereira et al. (2022)](https://www.nature.com/articles/s41592-022-01426-1), Eleni Papadoyannis, Mala Murthy, Annegret Falkner
 ```
@@ -187,7 +199,9 @@ widths: 10 40
 * - Labels
   - 804 frames, 1604 instances
 * - Download
-  - [Train](https://storage.googleapis.com/sleap-data/datasets/yan_bees/random_split1/train.pkg.slp) / [Validation](https://storage.googleapis.com/sleap-data/datasets/yan_bees/random_split1/val.pkg.slp) / [Test](https://storage.googleapis.com/sleap-data/datasets/yan_bees/random_split1/test.pkg.slp)
+  - [Train (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/yan_bees/random_split1/train.pkg.slp) / [Validation (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/yan_bees/random_split1/val.pkg.slp) / [Test (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/yan_bees/random_split1/test.pkg.slp)
+* - Example
+  - [Clip (`.mp4`)](https://storage.googleapis.com/sleap-data/datasets/yan_bees/clips/bees_demo%4021000-23000.mp4) / [Tracking (`.slp`)](https://storage.googleapis.com/sleap-data/datasets/yan_bees/clips/bees_demo%4021000-23000.slp)
 * - Credit
   - [Pereira et al. (2022)](https://www.nature.com/articles/s41592-022-01426-1), Grace McKenzie-Smith, Z. Yan Wang, Joshua Shaevitz, Sarah Kocher
 ```
@@ -223,7 +237,9 @@ widths: 10 40
 * - Labels
   - 425 frames, 1588 instances
 * - Download
-  - [Train](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/cohort1_compressedTalmo_23vids_march_7_to_march_17/random_split1.day001/train.pkg.slp) / [Validation](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/cohort1_compressedTalmo_23vids_march_7_to_march_17/random_split1.day001/val.pkg.slp) / [Test](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/cohort1_compressedTalmo_23vids_march_7_to_march_17/random_split1.day001/test.pkg.slp)
+  - [Train (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/cohort1_compressedTalmo_23vids_march_7_to_march_17/random_split1.day001/train.pkg.slp) / [Validation (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/cohort1_compressedTalmo_23vids_march_7_to_march_17/random_split1.day001/val.pkg.slp) / [Test (`.pkg.slp`)](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/cohort1_compressedTalmo_23vids_march_7_to_march_17/random_split1.day001/test.pkg.slp)
+* - Example
+  - [Clip (`.mp4`)](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/clips/2020-3-10_daytime_5mins_compressedTalmo%403200-5760.mp4) / [Tracking (`.slp`)](https://storage.googleapis.com/sleap-data/datasets/nyu-gerbils/clips/2020-3-10_daytime_5mins_compressedTalmo%403200-5760.slp)
 * - Credit
   - [Pereira et al. (2022)](https://www.nature.com/articles/s41592-022-01426-1), Catalin Mitelut, Marielisa Diez Castro, Dan H. Sanes
 ```
`````
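The sample-clip filenames above follow a `<name>@<start>-<end>` convention (the `@` is URL-encoded as `%40`), e.g. `072212_173836@0-3200.mp4` covers frames 0-3200 of the source video. A small sketch for recovering that range; the convention is inferred from the URLs above, not documented by SLEAP:

```python
from urllib.parse import unquote


def clip_frame_range(url: str) -> tuple:
    """Parse `<name>@<start>-<end>.<ext>` from a sample-clip URL."""
    # Take the final path component, decode %40 -> @, drop the extension.
    stem = unquote(url.rsplit("/", 1)[-1]).rsplit(".", 1)[0]
    name, _, rng = stem.rpartition("@")
    start, end = (int(x) for x in rng.split("-"))
    return name, start, end


url = (
    "https://storage.googleapis.com/sleap-data/datasets/BermanFlies/clips/"
    "072212_173836%400-3200.mp4"
)
print(clip_frame_range(url))  # ('072212_173836', 0, 3200)
```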

sleap/gui/commands.py (+13 -4)

```diff
@@ -2413,11 +2413,20 @@ def do_action(cls, context: CommandContext, params: dict):
         else:
             params["videos"] = context.labels.videos
 
-        new_suggestions = VideoFrameSuggestions.suggest(
-            labels=context.labels, params=params
-        )
+        try:
+            new_suggestions = VideoFrameSuggestions.suggest(
+                labels=context.labels, params=params
+            )
 
-        context.labels.append_suggestions(new_suggestions)
+            context.labels.append_suggestions(new_suggestions)
+        except Exception as e:
+            win.hide()
+            QtWidgets.QMessageBox(
+                text=f"An error occurred while generating suggestions. "
+                "Your command line terminal may have more information about "
+                "the error."
+            ).exec_()
+            raise e
 
         win.hide()
```
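The pattern applied in the diff above, as a GUI-free sketch: surface the failure to the user, then re-raise so the full traceback still reaches the terminal. `notify_user`, `generate_and_append`, and the callback parameters are hypothetical stand-ins for the QMessageBox call and the real SLEAP objects:

```python
def notify_user(message: str) -> None:
    # Hypothetical stand-in for the QMessageBox shown in the GUI.
    print(message)


def generate_and_append(labels, params, suggest_fn, append_fn):
    try:
        new_suggestions = suggest_fn(labels=labels, params=params)
        append_fn(new_suggestions)
    except Exception:
        # Tell the user something went wrong, but keep the traceback:
        # a bare `raise` re-raises the original exception.
        notify_user(
            "An error occurred while generating suggestions. "
            "Your command line terminal may have more information "
            "about the error."
        )
        raise
```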

sleap/gui/suggestions.py (+7 -4)

```diff
@@ -66,8 +66,10 @@ def suggest(cls, params: dict, labels: "Labels" = None) -> List[SuggestionFrame]
         if method_functions.get(method, None) is not None:
             return method_functions[method](labels=labels, **params)
         else:
-            print(f"No {method} method found for generating suggestions.")
-            return []
+            raise ValueError(
+                f"No {'' if method == '_' else method + ' '}method found for "
+                "generating suggestions."
+            )
 
     # Functions corresponding to "method" param
 
@@ -82,7 +84,8 @@ def basic_sample_suggestion_method(
     ):
         """Method to generate suggestions randomly or by taking strides through video."""
         suggestions = []
-        sugg_idx_dict: Dict[Video, list] = {video: [] for video in videos}
+        sugg_idx_dict: Dict[Video, list] = {video: [] for video in labels.videos}
+
         for sugg in labels.suggestions:
             sugg_idx_dict[sugg.video].append(sugg.frame_idx)
 
@@ -287,7 +290,7 @@ def filter_unique_suggestions(
         proposed_suggestions: List[SuggestionFrame],
     ) -> List[SuggestionFrame]:
         # Create log of suggestions that already exist
-        sugg_idx_dict: Dict[Video, list] = {video: [] for video in videos}
+        sugg_idx_dict: Dict[Video, list] = {video: [] for video in labels.videos}
         for sugg in labels.suggestions:
             sugg_idx_dict[sugg.video].append(sugg.frame_idx)
```
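The dispatch-and-raise change above, as a standalone sketch (names simplified from the real classmethod): raising instead of silently returning `[]` is what lets the new try/except in `commands.py` surface the failure to the user.

```python
from typing import Callable, Dict, List


def suggest(method: str, method_functions: Dict[str, Callable]) -> List:
    # Look up the requested suggestion method; raise on an unknown name
    # instead of printing and returning an empty list.
    fn = method_functions.get(method)
    if fn is not None:
        return fn()
    raise ValueError(
        f"No {'' if method == '_' else method + ' '}method found for "
        "generating suggestions."
    )
```

Note the `'_'` special case from the diff: an empty/placeholder method name yields "No method found..." rather than "No _ method found...".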

sleap/instance.py (+12 -6)

```diff
@@ -1232,6 +1232,13 @@ def structure_instances_list(x, type):
             if "score" in inst_data.keys():
                 inst = converter.structure(inst_data, PredictedInstance)
             else:
+                if (
+                    "from_predicted" in inst_data
+                    and inst_data["from_predicted"] is not None
+                ):
+                    inst_data["from_predicted"] = converter.structure(
+                        inst_data["from_predicted"], PredictedInstance
+                    )
                 inst = converter.structure(inst_data, Instance)
             inst_list.append(inst)
 
@@ -1243,14 +1250,13 @@ def structure_instances_list(x, type):
 
     # Structure forward reference for PredictedInstance for the Instance.from_predicted
     # attribute.
-    converter.register_structure_hook(
-        ForwardRef("PredictedInstance"),
-        lambda x, _: converter.structure(x, PredictedInstance),
+    converter.register_structure_hook_func(
+        lambda t: t.__class__ is ForwardRef,
+        lambda v, t: converter.structure(v, t.__forward_value__),
     )
-
     # converter.register_structure_hook(
-    # PredictedInstance,
-    # lambda x, type: converter.structure(x, PredictedInstance),
+    # ForwardRef("PredictedInstance"),
+    # lambda x, _: converter.structure(x, PredictedInstance),
     # )
 
     # We can register structure hooks for point arrays that do nothing
```
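The predicate/hook pair from the diff, shown standalone. Per the commit message, registering a hook against a specific `ForwardRef("PredictedInstance")` instance stopped working under Python 3.9, so the fix matches *any* `ForwardRef` by type. `structure` below is a hypothetical stand-in for `converter.structure`; with cattrs installed, the pair would be passed to `converter.register_structure_hook_func(matches_forward_ref, ...)`.

```python
from typing import ForwardRef


def matches_forward_ref(t) -> bool:
    # Predicate from the diff: match any ForwardRef, regardless of its target.
    return t.__class__ is ForwardRef


def structure_forward_ref(v, t, structure):
    # An evaluated ForwardRef stores its resolved target type in
    # __forward_value__; structure the value against that type.
    return structure(v, t.__forward_value__)
```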

sleap/nn/inference.py (+3 -3)

```diff
@@ -4238,7 +4238,7 @@ def export_cli():
     )
 
     args, _ = parser.parse_known_args()
-    export_model(args["models"], args["export_path"])
+    export_model(args.models, args.export_path)
 
 
 def _make_cli_parser() -> argparse.ArgumentParser:
```
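Why `args["models"]` failed: `parse_known_args` returns an `argparse.Namespace`, which supports attribute access but not subscripting. A minimal sketch; the argument names below are illustrative, not SLEAP's actual CLI spec:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("models", nargs="+")
parser.add_argument("-e", "--export_path", default="exported_model")

args, _ = parser.parse_known_args(["model_a", "model_b"])

# Attribute access works on a Namespace...
print(args.models)       # ['model_a', 'model_b']
print(args.export_path)  # exported_model

# ...but subscripting (the pre-fix code path) raises TypeError.
try:
    args["models"]
except TypeError as err:
    print(type(err).__name__)  # TypeError
```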
@@ -4620,9 +4620,9 @@ def main(args: list = None):
46204620
free_gpu_memory = sleap.nn.system.get_gpu_memory()
46214621
if len(free_gpu_memory) > 0:
46224622
gpu_ind = np.argmax(free_gpu_memory)
4623+
mem = free_gpu_memory[gpu_ind]
46234624
logger.info(
4624-
f"Auto-selected GPU {gpu_ind} with {free_gpu_memory} MiB of "
4625-
"free memory."
4625+
f"Auto-selected GPU {gpu_ind} with {mem} MiB of free memory."
46264626
)
46274627
else:
46284628
logger.info(
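The bug being fixed: the old f-string interpolated the whole `free_gpu_memory` list instead of the selected GPU's reading. A sketch with made-up per-GPU readings:

```python
import numpy as np

# Illustrative free-memory readings (MiB) for three GPUs.
free_gpu_memory = [1024, 8192, 4096]
gpu_ind = np.argmax(free_gpu_memory)

# Pre-fix message would have printed the entire list here;
# the fix indexes out just the selected GPU's free memory.
mem = free_gpu_memory[gpu_ind]
print(f"Auto-selected GPU {gpu_ind} with {mem} MiB of free memory.")
# → Auto-selected GPU 1 with 8192 MiB of free memory.
```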

sleap/nn/training.py (+2 -2)

```diff
@@ -1947,9 +1947,9 @@ def main():
         free_gpu_memory = sleap.nn.system.get_gpu_memory()
         if len(free_gpu_memory) > 0:
             gpu_ind = np.argmax(free_gpu_memory)
+            mem = free_gpu_memory[gpu_ind]
             logger.info(
-                f"Auto-selected GPU {gpu_ind} with {free_gpu_memory} MiB of "
-                "free memory."
+                f"Auto-selected GPU {gpu_ind} with {mem} MiB of free memory."
             )
         else:
             logger.info(
```

tests/gui/test_suggestions.py (+16 -2)

```diff
@@ -81,11 +81,13 @@ def test_frame_increment(centered_pair_predictions: Labels):
     print(centered_pair_predictions.videos)
 
 
-def test_video_selection(centered_pair_predictions: Labels):
+def test_video_selection(
+    centered_pair_predictions: Labels, small_robot_3_frame_vid: Video
+):
     # Testing the functionality of choosing a specific video in a project and
     # only generating suggestions for the video
 
-    centered_pair_predictions.add_video(Video.from_filename(filename="test.mp4"))
+    centered_pair_predictions.add_video(small_robot_3_frame_vid)
     # Testing suggestion generation from Image Features
     suggestions = VideoFrameSuggestions.suggest(
         labels=centered_pair_predictions,
@@ -150,6 +152,18 @@ def test_video_selection(centered_pair_predictions: Labels):
     # Confirming every suggestion is only for the video that is chosen and no other videos
     assert suggestions[i].video == centered_pair_predictions.videos[0]
 
+    # Ensure video target works given suggestions from another video already exist
+    centered_pair_predictions.set_suggestions(suggestions)
+    suggestions = VideoFrameSuggestions.suggest(
+        labels=centered_pair_predictions,
+        params={
+            "videos": [centered_pair_predictions.videos[1]],
+            "method": "sample",
+            "per_video": 3,
+            "sampling_method": "random",
+        },
+    )
+
 
 def assert_suggestions_unique(labels: Labels, new_suggestions: List[SuggestionFrame]):
     for sugg in labels.suggestions:
```
