4 changes: 4 additions & 0 deletions doc/changes/latest.inc
@@ -28,6 +28,8 @@ Current (0.22.dev0)

.. |Hongjiang Ye| replace:: **Hongjiang Ye**

.. |Qianliang Li| replace:: **Qianliang Li**


Enhancements
~~~~~~~~~~~~
@@ -95,6 +97,8 @@ Enhancements

Bugs
~~~~
- Fix orthogonalization of power envelopes in :func:`mne.connectivity.envelope_correlation` (:gh:`8658` **by new contributor** |Qianliang Li|_ and `Eric Larson`_)

- Fix a transpose issue of :func:`mne.decoding.CSP.plot_filters` (:gh:`8580` **by new contributor** |Hongjiang Ye|_)

- Fix :func:`mne.io.read_raw_curry` to deal with Curry datasets that have channels that are listed in the labels file, but which are absent from the saved data file (e.g. 'Ref' channel). Also now populates info['meas_date'] if possible (:gh:`8400` **by new contributor** |Tod Flak|_)
2 changes: 2 additions & 0 deletions doc/changes/names.inc
@@ -341,3 +341,5 @@
.. _Evan Hathaway: https://github.com/ephathaway

.. _Hongjiang Ye: https://github.com/rubyyhj

.. _Qianliang Li: https://www.dtu.dk/english/service/phonebook/person?id=126774
@@ -9,6 +9,7 @@
:footcite:`HippEtAl2012,KhanEtAl2018` in source
space using resting state CTF data.
"""

# Authors: Eric Larson <larson.eric.d@gmail.com>
# Sheraz Khan <sheraz@khansheraz.com>
# Denis Engemann <denis.engemann@gmail.com>
39 changes: 33 additions & 6 deletions mne/connectivity/envelope.py
@@ -13,7 +13,7 @@

@verbose
def envelope_correlation(data, combine='mean', orthogonalize="pairwise",
                         verbose=None):
                         log=False, absolute=True, verbose=None):
"""Compute the envelope correlation.

Parameters
@@ -40,6 +40,18 @@ def envelope_correlation(data, combine='mean', orthogonalize="pairwise",
        absolute values.

        .. versionadded:: 0.19
    log : bool
        If True (default False), square and take the log before orthogonalizing
        envelopes or computing correlations.

        .. versionadded:: 0.22
    absolute : bool
        If True (default), then take the absolute value of correlation
        coefficients before making each epoch's correlation matrix
        symmetric (and thus before combining matrices across epochs).
        Only used when ``orthogonalize=True``.

        .. versionadded:: 0.22
    %(verbose)s

Returns
@@ -54,6 +66,10 @@ def envelope_correlation(data, combine='mean', orthogonalize="pairwise",
This function computes the power envelope correlation between
orthogonalized signals [1]_ [2]_.

.. versionchanged:: 0.22
Computations fixed for ``orthogonalize=True`` and diagonal entries are
set explicitly to zero.

References
----------
.. [1] Hipp JF, Hawellek DJ, Corbetta M, Siegel M, Engel AK (2012)
@@ -99,6 +115,9 @@ def envelope_correlation(data, combine='mean', orthogonalize="pairwise",
data_mag = np.abs(epoch_data)
data_conj_scaled = epoch_data.conj()
data_conj_scaled /= data_mag
if log:
    data_mag *= data_mag
    np.log(data_mag, out=data_mag)
# subtract means
data_mag_nomean = data_mag - np.mean(data_mag, axis=-1, keepdims=True)
# compute variances using linalg.norm (square, sum, sqrt) since mean=0
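A quick aside on that trick (an illustration, not code from the PR): for mean-subtracted data the Euclidean norm is just the standard deviation scaled by sqrt(n), and a fixed positive scale cancels when the dot product is divided by it, so the norm can stand in for the variance term.

    import numpy as np
    rng = np.random.RandomState(0)
    x0 = rng.randn(64)
    x0 -= x0.mean()  # zero-mean, as in the code above
    assert np.allclose(np.linalg.norm(x0), np.sqrt(64) * x0.std())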
@@ -107,21 +126,29 @@ def envelope_correlation(data, combine='mean', orthogonalize="pairwise",
corr = np.empty((n_nodes, n_nodes))
for li, label_data in enumerate(epoch_data):
    if orthogonalize is False:  # the new code
        label_data_orth = data_mag
        label_data_orth_std = data_mag_std
        label_data_orth = data_mag[li]
        label_data_orth_std = data_mag_std[li]
    else:
        label_data_orth = (label_data * data_conj_scaled).imag
        np.abs(label_data_orth, out=label_data_orth)
Comment (Member Author):

First change: when orthogonalizing, use abs of the orthogonalized data rather than just (data * data_conj_scaled).imag. Not 100% sure this is right, but conceptually the thing we're correlating with is abs(data), so it seems to make sense to take the abs here, too...

I think the original operated on np.log10(label_data_orth ** 2) so an abs was implicitly taken by the squaring operation.

Comment:

I am not sure about the log10, but they do square and take the log-transform.
Citation from their supplementary information:
"The signal Y(t,f) is orthogonalized in the complex plane with respect to X(t,f). This results in a positive number |Y_orth_X(t,f)| which is then squared and log-transformed, and correlated with |X(t,f)|." Source

Comment (Member Author):

log10 is the same as log up to a scale factor, and that scale factor goes away in the correlation so it doesn't really matter. But np.log is faster than np.log10 so why not!

Comment (Member Author):

... but that text does make it clear that it should result in a positive number even before the square and log transform, so I think this is correct now (and will continue to be on the next push where I change np.log10 to np.log)

Comment (Member):

I think abs was missing in the original code, because of the squaring in the next step.
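To illustrate the scale-factor point above (a sketch, not part of the PR): log10(x) = log(x) / log(10), and a fixed positive scale factor cannot change a Pearson correlation.

    import numpy as np
    rng = np.random.RandomState(0)
    x = np.abs(rng.randn(1000)) + 1.  # positive, envelope-like series
    y = rng.randn(1000)
    r_log = np.corrcoef(np.log(x), y)[0, 1]
    r_log10 = np.corrcoef(np.log10(x), y)[0, 1]
    assert np.allclose(r_log, r_log10)  # identical up to float error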

        # protect against invalid value -- this will be zero
        # after (log and) mean subtraction
        label_data_orth[li] = 1.
        if log:
            label_data_orth *= label_data_orth
            np.log(label_data_orth, out=label_data_orth)
        label_data_orth -= np.mean(label_data_orth, axis=-1,
                                   keepdims=True)
        label_data_orth_std = np.linalg.norm(label_data_orth, axis=-1)
        label_data_orth_std[label_data_orth_std == 0] = 1
    # correlation is dot product divided by variances
    corr[li] = np.dot(label_data_orth, data_mag_nomean[li])
    corr[li] /= data_mag_std[li]
    corr[li] = np.sum(label_data_orth * data_mag_nomean, axis=1)
Comment (Member Author):

Second change is what @Avoide talked about in the other issue, we need to correlate with the correct pairs. This (plus the log=True, absolute=False) makes our code match both what @Avoide suggested and what FieldTrip produces.

Comment (Member @SherazKhan, Dec 14, 2020):

We were using np.corrcoef but this is much faster
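For reference, the speedup relies on the identity corr(x, y) = x·y / (||x|| ||y||) for mean-subtracted vectors, which is what the dot-product-and-divide code here computes; a minimal check (illustration only, not PR code):

    import numpy as np
    rng = np.random.RandomState(0)
    x, y = rng.randn(64), rng.randn(64)
    x0, y0 = x - x.mean(), y - y.mean()  # subtract means first
    r_fast = np.dot(x0, y0) / (np.linalg.norm(x0) * np.linalg.norm(y0))
    assert np.allclose(r_fast, np.corrcoef(x, y)[0, 1])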

    corr[li] /= data_mag_std
    corr[li] /= label_data_orth_std
if orthogonalize is not False:
    # Make it symmetric (it isn't at this point)
    corr = np.abs(corr)
    if absolute:
        corr = np.abs(corr)
    corr = (corr.T + corr) / 2.
corrs.append(corr)
del corr
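Putting the new options together, a call that mirrors the FieldTrip-style behavior exercised in the tests below might look like this (a sketch on random data; only the function and its keyword arguments come from this PR):

    import numpy as np
    from scipy.signal import hilbert
    from mne.connectivity import envelope_correlation

    rng = np.random.RandomState(0)
    data = rng.randn(2, 4, 64)  # (epochs, signals, times)
    data_hilbert = hilbert(data, axis=-1)  # analytic signal
    # square-log envelopes, signed correlations, one matrix per epoch
    corr = envelope_correlation(data_hilbert, combine=None,
                                log=True, absolute=False)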
54 changes: 40 additions & 14 deletions mne/connectivity/tests/test_envelope.py
@@ -15,19 +15,23 @@
def _compute_corrs_orig(data):
    # This is the version of the code by Sheraz and Denis.
    # For this version (epochs, labels, time) must be -> (labels, time, epochs)
    data = np.transpose(data, (1, 2, 0))
    corr_mats = np.empty((data.shape[0], data.shape[0], data.shape[2]))
    for index, label_data in enumerate(data):
        label_data_orth = np.imag(label_data * (data.conj() / np.abs(data)))
        label_data_orig = np.abs(label_data)
        label_data_cont = np.transpose(
            np.dstack((label_data_orig, np.transpose(label_data_orth,
                                                     (1, 2, 0)))), (1, 2, 0))
        corr_mats[index] = np.array([np.corrcoef(dat)
                                     for dat in label_data_cont])[:, 0, 1:].T
    corr_mats = np.transpose(corr_mats, (2, 0, 1))
    corr = np.mean(np.array([(np.abs(corr_mat) + np.abs(corr_mat).T) / 2.
                             for corr_mat in corr_mats]), axis=0)
Comment (Member Author, on lines 16 to -30):

I always had a feeling since our working session at Biomag2018 that the MNE implementation does something different than the prototype.

@dengemann IIRC the code here was the prototype that I used to code a more efficient version in MNE, so it should have replicated it properly. Unfortunately I had to change this prototype code to do something different in order to get it to follow FieldTrip and the paper definitions...

Comment (Member):

No problem. If we can come up with a more flexible set of options, this is good for research on this method.
The comparisons in the previous discussion suggest that the difference may be rather subtle.

Comment (Member @SherazKhan, Dec 14, 2020):

I second that: the differences will be subtle, and the additional options will be worth comparing.

    n_epochs, n_labels, _ = data.shape
    corr = np.zeros((n_labels, n_labels))
    for epoch_data in data:
        for ii in range(n_labels):
            for jj in range(n_labels):
                # Get timeseries for each pair
                x, y = epoch_data[ii], epoch_data[jj]
                x_mag = np.abs(x)
                x_conj_scaled = x.conj()
                x_conj_scaled /= x_mag
                # Calculate orthogonalization
                y_orth_x = (y * x_conj_scaled).imag
                y_orth_x_mag = np.abs(y_orth_x)
                # Estimate correlation
                corr[ii, jj] += np.abs(np.corrcoef(x_mag, y_orth_x_mag)[0, 1])
Comment (Member Author):

@SherazKhan the actual critical change in the code below is from data_mag_nomean[li], which only used the li-th label's data to correlate with all N labels' label_data_orth series; this code now uses data_mag_nomean, i.e., all N labels' data to correlate with all N labels' label_data_orth series.

It's probably easier to see if you look at the "simplified" version here. The old/master version of the code does the correlation with the jj-th original (magnitude) data, which you could call y_mag:

                corr[ii, jj] += np.abs(np.corrcoef(np.abs(epoch_data[jj]), y_orth_x)[0, 1])

This PR instead does the correlation with the (magnitude of the) ii-th data, called x_mag here, equivalently written as:

                corr[ii, jj] += np.abs(np.corrcoef(np.abs(epoch_data[ii]), y_orth_x)[0, 1])

Comment (Member):

@larsoner indeed, this is closer to the original Hipp version.

    corr = (corr + corr.T) / (2. * n_epochs)
    corr.flat[::n_labels + 1] = 0.
    return corr
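As a sanity check on the orthogonalization step itself (an illustration, not part of the test suite): Im(y * conj(x) / |x|) removes whatever part of y shares the instantaneous phase of x, so a signal orthogonalized with respect to a phase-aligned copy of itself vanishes.

    import numpy as np
    rng = np.random.RandomState(0)
    x = rng.randn(64) + 1j * rng.randn(64)  # analytic-like signal
    y = 2.5 * x  # perfectly phase-aligned with x
    y_orth_x = (y * (x.conj() / np.abs(x))).imag
    assert np.allclose(y_orth_x, 0.)  # nothing survives orthogonalization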


@@ -37,7 +41,7 @@ def test_envelope_correlation():
    data = rng.randn(2, 4, 64)
    data_hilbert = hilbert(data, axis=-1)
    corr_orig = _compute_corrs_orig(data_hilbert)
    assert (0 < corr_orig).all()
    assert (0 <= corr_orig).all()
    assert (corr_orig < 1).all()
    # using complex data
    corr = envelope_correlation(data_hilbert)
@@ -72,3 +76,25 @@
    assert_allclose(np.diag(corr_plain_mean), 1)
    np_corr = np.array([np.corrcoef(np.abs(x)) for x in data_hilbert])
    assert_allclose(corr_plain, np_corr)

    # check against FieldTrip, which uses the square-log-norm version
    # from scipy.io import savemat
    # savemat('data.mat', dict(data_hilbert=data_hilbert))
    # matlab
    # load data
    # ft_connectivity_powcorr_ortho(reshape(data_hilbert(1,:,:), [4, 64]))
    # ft_connectivity_powcorr_ortho(reshape(data_hilbert(2,:,:), [4, 64]))
    ft_vals = np.array([
        [[np.nan, 0.196734553900236, 0.063173148355451, -0.242638384630448],
         [0.196734553900236, np.nan, 0.041799775495150, -0.088205187548542],
         [0.063173148355451, 0.041799775495150, np.nan, 0.090331428512317],
         [-0.242638384630448, -0.088205187548542, 0.090331428512317, np.nan]],
        [[np.nan, -0.013270857462890, 0.185200598081295, 0.140284351572544],
         [-0.013270857462890, np.nan, 0.150981508043722, -0.000671809276372],
         [0.185200598081295, 0.150981508043722, np.nan, 0.137460244313337],
         [0.140284351572544, -0.000671809276372, 0.137460244313337, np.nan]],
    ], float)
    ft_vals[np.isnan(ft_vals)] = 0
    corr_log = envelope_correlation(
        data, combine=None, log=True, absolute=False)
    assert_allclose(corr_log, ft_vals)