Well, I have an application that displays 1.2 million items in a CellTable (50 items at a time).
I solved that by using AsyncDataProvider and doing bulk requests of around 2000 items (you can play with that value).
I have a SimplePager which displays 50 of them at a time, and as the user pages through, no server request is made until the list of 2000 is exhausted. Only then do I request the next 2000 items (in fact, the AsyncDataProvider interface takes care of that automatically).
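The chunked-buffer idea can be sketched in plain Java, independent of GWT. All names here (`ChunkedPager`, `getPage`, the backend function) are hypothetical, not the GWT API; the point is just that pages of 50 are served from a 2000-item cache, and the backend is only hit when the requested page falls outside that cache:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

// Framework-agnostic sketch (hypothetical names, not the GWT API): pages of
// PAGE_SIZE rows are served from a FETCH_SIZE-item cache, and the backend is
// only called when the requested page falls outside the cached window.
public class ChunkedPager {
    static final int PAGE_SIZE = 50;
    static final int FETCH_SIZE = 2000;

    private final BiFunction<Integer, Integer, List<String>> backend; // (offset, limit) -> rows
    private final List<String> cache = new ArrayList<>();
    private int cacheStart = -1; // offset of the first cached row, -1 = empty
    int backendCalls = 0;        // exposed so the usage example can count requests

    public ChunkedPager(BiFunction<Integer, Integer, List<String>> backend) {
        this.backend = backend;
    }

    public List<String> getPage(int pageIndex) {
        int start = pageIndex * PAGE_SIZE;
        if (cacheStart < 0 || start < cacheStart || start + PAGE_SIZE > cacheStart + cache.size()) {
            // Requested page is outside the cached window: fetch the next bulk chunk.
            cacheStart = (start / FETCH_SIZE) * FETCH_SIZE;
            cache.clear();
            cache.addAll(backend.apply(cacheStart, FETCH_SIZE));
            backendCalls++;
        }
        int from = start - cacheStart;
        return new ArrayList<>(cache.subList(from, Math.min(from + PAGE_SIZE, cache.size())));
    }
}
```

With these numbers, paging through the first 40 pages costs a single backend call; page 41 triggers the second.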
For filtering/searching I implemented a hybrid approach. For example, if the user types in a string for a name, it will first search on the server; again, only up to 2000 items are sent back from the backend, and for each new character I do another server request. However, I also return the total number of hits the search found. If that number is smaller than 2000, then the next time the user types another character the search runs only on the client side (because the whole hit set has already been transferred to the client).
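Here is a minimal plain-Java sketch of that hybrid search. The names (`HybridSearch`, `Result`, `serverSearch`) are hypothetical, not part of GWT: the server returns at most 2000 rows plus the total hit count, and once the complete hit set fits on the client, further narrowing of the same query is filtered locally:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

// Sketch of the hybrid search (hypothetical names): the first keystrokes go
// to the server; once a query's complete hit set is on the client, narrowing
// that query (typing more characters) is handled purely client-side.
public class HybridSearch {
    static final int FETCH_SIZE = 2000;

    public static class Result {
        public final List<String> rows;
        public final int totalHits; // total matches on the server, not just the rows sent
        public Result(List<String> rows, int totalHits) { this.rows = rows; this.totalHits = totalHits; }
    }

    private final Function<String, Result> serverSearch;
    private String cachedQuery = null;     // query whose complete hit set we hold
    private List<String> cachedRows = null;
    int serverCalls = 0;                   // exposed so the usage example can count requests

    public HybridSearch(Function<String, Result> serverSearch) { this.serverSearch = serverSearch; }

    public List<String> search(String query) {
        // Client-side path: we already hold the complete hit set for a prefix of this query.
        if (cachedQuery != null && query.startsWith(cachedQuery)) {
            List<String> filtered = new ArrayList<>();
            for (String row : cachedRows) if (row.contains(query)) filtered.add(row);
            return filtered;
        }
        // Server path: fetch up to FETCH_SIZE rows plus the total hit count.
        Result r = serverSearch.apply(query);
        serverCalls++;
        if (r.totalHits < FETCH_SIZE) {    // the whole hit set is now on the client
            cachedQuery = query;
            cachedRows = r.rows;
        }
        return r.rows;
    }
}
```

So typing "name1" hits the server once; typing the next character ("name12") is answered from the cached rows without another request.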
For sorting you can take the same path.
To be honest, I am not sure it's worth sending just the IDs and then replacing them later with a second call that fetches the real objects. I don't think it will be much of a performance problem even when you have a lot of user requests at the same time. You can try to solve that problem on the backend (via caching, etc.), and if you still have an issue you can implement your approach.
Don't forget Knuth's famous quote: "Premature optimization is the root of all evil."