
Conversation

jacobsfletch
Member

Description

Implements a simple relationship cache within Live Preview.

  • I have read and understand the CONTRIBUTING.md document in this repository.

Type of change

  • New feature (non-breaking change which adds functionality)

Checklist:

  • I have added tests that prove my fix is effective or that my feature works
  • Existing test suite passes locally with my changes

@matthijs166

Yes, awesome! We hit the rate limit constantly...

@matthijs166

Hey @jacobsfletch nice fix!!! We already implemented it on a site where the live preview was very slow.

One thing is that it did not work right away for us the way you built it.
I had to move the cache object to the outer scope of the handleMessage function for it to work.

I saw that the cache object was not the same one after everything resolved, which I checked with:

setInterval(() => {
    console.log(cache)
}, 1000)

We use a lot of rich text (Slate) media, so maybe the issue is there, I don't know exactly.
I can provide more details if you want; I don't have a lot of time at the moment :)

(etc)
let payloadLivePreviewFieldSchema // TODO: type this from `fieldSchemaToJSON` return type

const cache = new Map()

export const handleMessage = async <T>(args: {
(etc)
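
To clarify why the move matters: a cache created inside handleMessage only lives for that single invocation, while a module-scoped one survives across messages. A minimal illustration (hypothetical names, not the real handleMessage signature):

// Hypothetical illustration only.

// A cache declared inside the handler is a brand-new Map on every message,
// so nothing fetched for one message is ever reused by the next:
export const handleMessageLocalCache = async (): Promise<void> => {
  const cache = new Map<string, unknown>() // empty again on every invocation
  // ...fetch relationships and write them into `cache`...
  console.log(cache.size) // 0 here; a fresh Map is created each call, so nothing carries over
}

// Hoisted to module scope, the same Map survives for the life of the page:
const cache = new Map<string, unknown>()

export const handleMessageModuleCache = async (): Promise<void> => {
  // ...reads and writes go to the same `cache` across every message...
  console.log(cache.size) // the same Map instance is seen by every call
}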

@jacobsfletch
Member Author

Heyyy thank you @matthijs166 👍 this approach was directly inspired by the community plugin here: https://github.com/pemedia/payload-visual-editor so I can't take full credit. BUT this will likely end up being closed. Here's why:

I think the way to go here is to actually batch requests instead of cache them. This way, even if you have 30+ relationships, only one or two requests might ever be needed to populate every single one of them (the exact number would depend on the number of collections you are referencing, etc.). But the real reasoning behind this is that related documents can potentially be updated behind the scenes (like in a drawer, see #4287). Their IDs would never change, and so the cache would return stale data unless purged in some way. Most of the time, you'd probably want to re-fetch relationships, especially if it's only ever a single request.
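
To make the batching idea concrete, here's a rough sketch of the kind of thing I mean (hypothetical code, not the actual implementation, and the exact REST query shape is an assumption): collect every relationship ID from the incoming data, group the IDs by collection, then resolve each group with a single request.

// Hypothetical sketch of batching relationship population by collection.
// `RelationshipRef`, `populateBatched`, and the query string are illustrative.
type RelationshipRef = { relationTo: string; value: string }

export const populateBatched = async (
  serverURL: string,
  refs: RelationshipRef[],
): Promise<Map<string, unknown>> => {
  // Group the requested IDs by their collection slug
  const byCollection = new Map<string, Set<string>>()
  for (const { relationTo, value } of refs) {
    if (!byCollection.has(relationTo)) byCollection.set(relationTo, new Set())
    byCollection.get(relationTo)!.add(value)
  }

  // One request per collection, no matter how many relationships exist
  const populated = new Map<string, unknown>()
  await Promise.all(
    Array.from(byCollection.entries()).map(async ([slug, ids]) => {
      const query = `where[id][in]=${Array.from(ids).join(',')}&depth=0&limit=${ids.size}`
      const res = await fetch(`${serverURL}/api/${slug}?${query}`)
      const { docs } = (await res.json()) as { docs: { id: string }[] }
      for (const doc of docs) populated.set(`${slug}:${doc.id}`, doc)
    }),
  )

  return populated
}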

@jacobsfletch
Member Author

Yes, awesome! We hit the rate limit constantly...

This is likely another issue. API requests shouldn't be hit that much. Are you up to date with the latest versions? I've made some recent optimizations to how this all works. More to come.

@matthijs166

API requests shouldn't be hit that much

Yes, before using this fix we were on version 0.1.6. But we have a sophisticated navigation structure with numerous relationships to other pages.
Additionally, whenever a character is entered in the title field, the traverseFields function runs. Without caching, this means fetching each relation every time, which has a significant performance impact, especially when typing 10 characters with 30 relations on the page.

BUT this will likely end up being closed

I say let's push for releasing this feature as a temporary fix. It does the job of tackling performance problems on larger pages, especially when you've got 10 or more relations going on. It's a practical and much-needed improvement for the current situation.

batch requests instead of cache them

Yeah, going with that plan sounds like the way to go! Just loop through all the data, toss those IDs into a queue, hit a single call to fetch them, and boom, fill up that cache.

Also an idea: Why not consider a new live preview endpoint? It's like when you fetch a collection item with depth in the regular frontend.
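
For clarity, by "fetch a collection item with depth" I mean the regular REST call that returns relations already populated, something like (the slug and ID here are placeholders):

const res = await fetch(`${serverURL}/api/pages/${pageID}?depth=2`)
const pageWithPopulatedRelations = await res.json()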

Orrr throw WebSockets into Payload to subscribe to all the changes; then you could have a shared editing experience, and the live preview could connect over the WebSocket.

Their IDs would never change, and so the cache would return stale data unless purged in some way.

Totally get your point but fetching every time a user types a character does seem a bit much. Another option to consider could be setting a timeout to invalidate the cache after, say, x seconds. That way, you strike a balance between keeping things updated and not bombarding the server with requests every time someone makes a small edit.
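
Roughly what I have in mind for the timeout (just a sketch, all names made up): store a timestamp next to each cached doc and treat anything older than the TTL as a cache miss.

// Hypothetical time-based cache invalidation; names and TTL are illustrative.
type CacheEntry<T> = { doc: T; fetchedAt: number }

const relationshipCache = new Map<string, CacheEntry<unknown>>()
const ttlMs = 10_000 // treat entries older than ~10s as stale

export const getCached = <T>(key: string): T | undefined => {
  const entry = relationshipCache.get(key)
  if (!entry) return undefined
  if (Date.now() - entry.fetchedAt > ttlMs) {
    relationshipCache.delete(key) // stale: force a refetch
    return undefined
  }
  return entry.doc as T
}

export const setCached = (key: string, doc: unknown): void => {
  relationshipCache.set(key, { doc, fetchedAt: Date.now() })
}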

@jacobsfletch
Member Author

I really appreciate your response @matthijs166. Cache is not off the table yet, but I'd like to see how far we can go without implementing any sort of cache. I think we can get 99% of the way there for the majority of use cases.

Yeah, going with that plan sounds like the way to go! Just loop through all the data, toss those IDs into a queue, hit a single call to fetch them, and boom, fill up that cache.

Yeeaaaaa you got it

Totally get your point but fetching every time a user types a character does seem a bit much.

Well, this is definitely not the intended behavior and has since been fixed. Are you on the latest versions of Payload and Live Preview? I also just opened #4306, which will take it a step further. The only time relationships should ever be fetched is when you add new ones (or move them around, which batching and/or caching will help alleviate).
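
In other words, the handler should only reach out to the API when the set of relationship IDs actually changes between messages. Roughly (illustrative only, not the actual code in #4306):

// Hypothetical check: refetch only when the relationship IDs differ from the
// previous message. Names are made up for illustration.
let previousIDs = new Set<string>()

export const relationshipsChanged = (nextIDs: string[]): boolean => {
  const next = new Set(nextIDs)
  const changed =
    next.size !== previousIDs.size ||
    Array.from(next).some((id) => !previousIDs.has(id))
  previousIDs = next
  return changed // only hit the API when this is true
}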

Also an idea: Why not consider a new live preview endpoint? It's like when you fetch a collection item with depth in the regular frontend.

Interesting you mention this. I have played around with this, and even had it implemented at one time. But this only added complexity, and after batching, will probably only save a negligible number of requests. It also almost feels like we're recreating the Payload API at that point. Besides, this is precisely how the admin dashboard itself works. If you open a relationship selector dropdown, for instance, you'll see the requests in the network panel that populate the values beyond their IDs. They are not re-fetched when you open the selector again, because the results are stored in state.

@jacobsfletch
Member Author

I think the way to go here is to actually batch requests instead of cache them. This way, even if you have 30+ relationships, only one or two requests might ever be needed to populate every single one of them (the exact number would depend on the number of collections you are referencing, etc.).

I just opened #4315 which demonstrates this, check it out!

@jacobsfletch jacobsfletch deleted the chore/live-preview-cache branch November 29, 2023 19:10
@matthijs166

Awesome job!! 💪 I will look at it tonight

Interesting you mention this. I have played around with this, and even had it implemented at one time. But this only added complexity,

Ahhh, I can imagine. Who knows, maybe a topic to revisit if we get proper WebSockets and shared editing in Payload.

I will also check with the newest version whether typing a new character triggers a refetch or not. Sounds like it should all work smoothly now!

@matthijs166

@jacobsfletch Everything works like a charm! 💪 💪 💪 💪

@jacobsfletch
Member Author

That is such a relief, thank you for spinning this up so quickly 🙏

AlessioGr added a commit that referenced this pull request Dec 9, 2023
