
Broadcasting error when saving recording after deep interpolation #3846

Open
@OleBialas

Description

Hi,
I'm having issues with storing the recording object after applying deepinterpolation.
When saving an interpolated recording, I get error messages like:

ValueError: could not broadcast input array from shape (29886,768) into shape (30000,768)

I assume that is because the convolutional network removes samples at the edges of the signal.
What is the recommended way of dealing with this? Should I pad the traces before applying deepinterpolation? If so, is there a function for this? Or would it be better to create a new object for storing the interpolated recording?

Below is a code example. I'm using spikeinterface version 0.100.6 because deepinterpolation requires Python <= 3.8.

from pathlib import Path
import numpy as np
import spikeinterface.extractors as se
from spikeinterface.preprocessing import common_reference, deepinterpolate, zero_channel_pad

root = Path(__file__).parent.parent.absolute()
model_path = list((root/"model").glob("*"))[0]

recording = se.read_spikeglx(folder_path=root / "data" / "neuropixels")

# pad so dimensions are compatible with model
recording = zero_channel_pad(recording, num_channels=384 * 2)

# create preprocessing pipeline
recording_cmr = common_reference(recording=recording, operator="median")
recording_deepint = deepinterpolate(recording=recording_cmr, model_path=str(model_path), use_gpu=False)

# process and save
recording_deepint.save()
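As a sketch of the padding idea in plain NumPy (not a spikeinterface API; the trim width and the fake_interpolate helper are hypothetical): if the model drops k samples from each edge, reflect-padding the time axis by k samples per side before processing keeps the output length equal to the input length.

```python
import numpy as np

TRIM = 57  # hypothetical: (30000 - 29886) / 2 samples lost per edge

def fake_interpolate(traces, trim=TRIM):
    """Stand-in for a model that drops `trim` samples at each edge (hypothetical)."""
    return traces[trim:-trim]

traces = np.random.randn(30000, 8).astype(np.float32)  # 8 channels for brevity

# Reflect-pad the time axis so the edge trim is absorbed by the padding
padded = np.pad(traces, pad_width=((TRIM, TRIM), (0, 0)), mode="reflect")
out = fake_interpolate(padded)

assert out.shape == traces.shape  # lengths now match, so saving can broadcast
```

With reflect padding, the trimmed region removes exactly the padded samples, so the central data is returned unchanged.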

Thanks in advance!

Metadata


    Labels

    bug (Something isn't working), preprocessing (Related to preprocessing module)
