
Embeddings from MOIRAI and MoE #215

@kaushikb258

Description


Hi,

Excellent work. I have a question about extracting embeddings for a time series with MOIRAI or MOIRAI-MoE. I am saving the variable `reprs` from `uni2ts/src/uni2ts/model/moirai_moe/module.py` for an input time series and using it as the embedding. However, this is very high-dimensional: N × 384, where N is the number of tokens. Is there a way to obtain a more compact, lower-dimensional embedding? I do not see a CLS token implemented anywhere in the code; why is a CLS token not used? Any suggestions on how to obtain a lower-dimensional embedding of a time series?
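For context, the kind of reduction I have in mind is simple pooling over the token axis. A minimal sketch (using NumPy as a stand-in for the actual tensors; the `(N, 384)` shape of `reprs` is my assumption from the dimensions above):

```python
import numpy as np

# Hypothetical token representations for one input time series:
# N tokens, each 384-dimensional (shape assumed, not taken from uni2ts).
N, d = 32, 384
rng = np.random.default_rng(0)
reprs = rng.standard_normal((N, d))

# Mean pooling over the token axis yields a single 384-dim vector,
# a common substitute when the model has no CLS token.
pooled = reprs.mean(axis=0)
print(pooled.shape)  # (384,)
```

Using the last token's representation, or max pooling, would be alternatives to the mean here.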

In Fig. 5 of the MoE paper, you show a t-SNE visualization of the embeddings. Did you use `in_reprs` for this? If so, did you run t-SNE on the full N×384 `in_reprs`, or on a lower-dimensional representation (say, 1×384 instead of N×384, using only one token per input time series)?
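In other words, is the pipeline something like the following sketch? (The shapes and the mean-pooling step are my assumptions, not taken from the paper; scikit-learn's `TSNE` stands in for whatever tool you used.)

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)

# Hypothetical batch: 60 time series, each tokenized into 32 tokens of dim 384.
token_reprs = rng.standard_normal((60, 32, 384))

# Mean-pool the token axis so each series becomes one 384-dim vector (1x384 per series).
series_emb = token_reprs.mean(axis=1)  # shape (60, 384)

# Project the pooled embeddings to 2D for visualization.
emb_2d = TSNE(n_components=2, perplexity=10, random_state=0).fit_transform(series_emb)
print(emb_2d.shape)  # (60, 2)
```

The alternative would be to feed all N×384 token vectors per series into t-SNE directly, which is what I am trying to distinguish.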

Thanks!
