Refactor metrics into its own module #4183
base: main
Conversation
This looks great! I'd vote to take the chance to make multi-channel template metrics included by default: they're very helpful.
I agree! Maybe we can make it the default for channel counts > 64?
```rst
.. code-block:: python

    import spikeinterface.qualitymetrics as sqm
    from spikeinterface.metrics.spiketrain import compute_firing_rates
```
Wouldn't this be easier?
```diff
- from spikeinterface.metrics.spiketrain import compute_firing_rates
+ from spikeinterface.metrics import compute_firing_rates
```
```python
    metric_params = {}  # to be defined in subclass
    metric_columns = {}  # column names and their dtypes of the dataframe
    needs_recording = False  # to be defined in subclass
    needs_tmp_data = False  # to be defined in subclass
```
I think this deserves more comments to explain the concept.
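A short illustrative sketch (not the actual code) of the kind of comments that could be added to these four class attributes, with a hypothetical `SNRMetric` subclass showing an override:

```python
class BaseMetric:
    # Default keyword arguments passed to the metric's compute function;
    # subclasses override with their own parameters.
    metric_params = {}
    # Mapping of output column name -> dtype of that column in the
    # resulting metrics dataframe; subclasses override.
    metric_columns = {}
    # True if computing this metric needs the raw recording (e.g.
    # waveform/noise based metrics), not only the sorting.
    needs_recording = False
    # True if the metric consumes shared temporary data prepared once
    # by the parent extension.
    needs_tmp_data = False


class SNRMetric(BaseMetric):
    # Hypothetical example: a signal-to-noise metric needs the recording
    # and produces a single float column.
    metric_columns = {"snr": float}
    needs_recording = True
```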
```python
import pytest
```
Why do we have this file in the sources? Wouldn't it be better in a tests subfolder?
It needs to be at the module level, so that the config is shared across submodules.
```python
import numpy as np
from spikeinterface.core.analyzer_extension_core import BaseMetric
```
How do we handle previous users doing `analyzer.get_extension("quality_metrics").get_data()["firing_rates"]`? Is it duplicated?
I think that it should work!
```python
    def _compute_metrics(
        self,
        sorting_analyzer: SortingAnalyzer,
```
An extension is already aware of its analyzer, no?
This is because the same function is also used for merges/splits, where you provide a new analyzer!
OK sure.
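A toy illustration of the point above, with made-up names: passing the analyzer explicitly lets the same method run against a different analyzer (e.g. a temporary merged one) than the one the extension was created with.

```python
class ExtensionSketch:
    # Hypothetical stand-in for an analyzer extension; the "analyzer"
    # here is just a dict of per-unit data, not the real SortingAnalyzer.
    def __init__(self, analyzer):
        self.sorting_analyzer = analyzer

    def _compute_metrics(self, sorting_analyzer, unit_ids):
        # Operates on whichever analyzer is passed in, which during a
        # merge/split may not be self.sorting_analyzer.
        return {u: sorting_analyzer["num_spikes"][u] for u in unit_ids}


ext = ExtensionSketch({"num_spikes": {0: 3}})
merged_analyzer = {"num_spikes": {0: 5}}  # e.g. after merging units
```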
```python
    if not include_multi_channel_metrics and num_channels >= MIN_CHANNELS_FOR_MULTI_CHANNEL_METRICS:
        include_multi_channel_metrics = True
```
I'm not sure I like the hidden magic choices.
That was Chris's idea :) Then we should make it True by default: these are really useful!
They are the best template metrics. I would vote True by default, but I'm a little worried that tetrode users will get loads of warnings about NaN results. Maybe we can discuss how to handle these better?
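As a standalone sketch of the rule being debated (the constant's value, 64, is taken from the suggestion earlier in this thread and is illustrative; the function name is made up):

```python
# Multi-channel template metrics are auto-enabled when the probe has
# "enough" channels; an explicit opt-in is always respected.
MIN_CHANNELS_FOR_MULTI_CHANNEL_METRICS = 64


def resolve_multi_channel_flag(include_multi_channel_metrics, num_channels):
    # Enable automatically for high channel counts, where these metrics
    # are most informative; leave low-channel setups (e.g. tetrodes) off.
    if not include_multi_channel_metrics and num_channels >= MIN_CHANNELS_FOR_MULTI_CHANNEL_METRICS:
        include_multi_channel_metrics = True
    return include_multi_channel_metrics
```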
```python
        )
        return params

    def _prepare_data(self, sorting_analyzer, unit_ids=None):
```
This also deserves more explanation, including how this cache is propagated to the other classes.
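A hypothetical sketch of the caching idea behind `_prepare_data` (names and structure are illustrative, not the real implementation): temporary data is computed once per extension instance and reused by every metric that needs it.

```python
class MetricExtensionSketch:
    # Toy stand-in for a metric extension that shares prepared data
    # across all of its metrics.
    def __init__(self):
        self._tmp_data = None
        self.prepare_calls = 0  # counter, to demonstrate the caching

    def _prepare_data(self, unit_ids):
        # Compute the shared temporary data only once; subsequent calls
        # return the cached object.
        if self._tmp_data is None:
            self.prepare_calls += 1
            self._tmp_data = {u: {"spike_count": 0} for u in unit_ids}
        return self._tmp_data
```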
This is mostly OK for me.
This PR includes a major refactor of the metrics concept.

It defines a `BaseMetric` with the core metadata of an individual metric, including dtypes, column names, extension dependencies, and a compute function. A `BaseMetricExtension` contains a collection of `BaseMetric`s and deals with most of the shared machinery.

The `template_metrics`, `quality_metrics`, and a new `spiketrain_metrics` extension now live in the `metrics` module. The latter only includes `num_spikes` and `firing_rate`, which are also importable as quality metrics.

Still finalizing tests, but this should be ~90% done.
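A toy sketch of the two-level design described in this PR, using simplified made-up classes (`NumSpikes` and `FiringRate` mirror the metric names mentioned above, but the code is illustrative, not the actual implementation):

```python
class BaseMetric:
    # Per-metric metadata: default compute parameters and output columns.
    metric_params = {}
    metric_columns = {}

    @staticmethod
    def compute(unit_data, **params):
        raise NotImplementedError


class NumSpikes(BaseMetric):
    metric_columns = {"num_spikes": int}

    @staticmethod
    def compute(unit_data, **params):
        return len(unit_data["spike_times"])


class FiringRate(BaseMetric):
    metric_params = {"duration_s": 1.0}
    metric_columns = {"firing_rate": float}

    @staticmethod
    def compute(unit_data, duration_s=1.0):
        return len(unit_data["spike_times"]) / duration_s


class BaseMetricExtension:
    # Subclasses list the BaseMetric classes they bundle together.
    metrics = []

    def compute_all(self, units):
        # One row per unit, one column per metric (each toy metric here
        # produces exactly one column).
        return {
            unit_id: {
                name: m.compute(unit_data, **m.metric_params)
                for m in self.metrics
                for name in m.metric_columns
            }
            for unit_id, unit_data in units.items()
        }


class SpiketrainMetrics(BaseMetricExtension):
    metrics = [NumSpikes, FiringRate]
```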