Unable to view large datasets

Issue #56 open
Kym Eden created an issue

My users recently came across two related issues when trying to view large scan sets:

  1. Opening “View images” on an experiment or subject containing over 15GB in total takes prohibitively long, and usually ends with “ViewerMain: Loading chunk 15 failed. (timeout:…”
  2. Switching to “3d MPR” on a scan larger than 2GB produces the error: “ViewerMain: WebGL2RenderingContext.texImage3D: Argument 10 can't be an ArrayBuffer or an ArrayBufferView larger than 2 GB”

For the first, would it be possible to introduce a URL parameter like “scanId”, similar to “experimentId”, so the viewer can load a single scan? Ideally there could then be a “view” button alongside the details, download and delete buttons on the scan list, but even being able to manually craft the URL would unblock quite a lot of my users.
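
To make the request concrete, a URL along these lines is roughly what I have in mind (the path and parameter names here are purely illustrative of the idea, not the viewer’s current API):

    …/VIEWER/?projectId=MYPROJECT&experimentId=XNAT_E00001&scanId=2

where “scanId” would restrict the viewer to loading that single scan.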

For the MPR issue, is there a workaround for the WebGL2 limit?

Thanks for the help.

Comments (3)

  1. Mo Alsad
    • changed status to open

    Hi Kym,

    May I ask which version of the viewer you are using?

    The viewer typically pulls and loads the thumbnail image for each scan, then prefetches all the images for the current (active) scan. If you switch to another scan, the prefetch process pulls and caches the images for the newly selected scan and stops for the previous scan. So you can simply click on the thumbnail of the scan of interest to stop loading the other scans.
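
    To put that in concrete terms, here is a purely illustrative sketch of the policy described above (the names and structure are mine, not the plugin's actual code): thumbnails are fetched for every scan, while full images are only prefetched for whichever scan is active, and selecting a different scan cancels the previous prefetch.

        interface Scan {
          id: string;
          thumbnailUrl: string;
          imageUrls: string[];
        }

        // Illustrative sketch only; not the OHIF plugin's implementation.
        class ScanPrefetcher {
          private controller: AbortController | null = null;

          // Thumbnails are loaded up front for every scan.
          async loadThumbnails(scans: Scan[]): Promise<void> {
            await Promise.all(scans.map((s) => fetch(s.thumbnailUrl)));
          }

          // Full images are prefetched only for the active scan; selecting
          // another scan aborts the previous prefetch and starts a new one.
          setActiveScan(scan: Scan): void {
            this.controller?.abort();
            this.controller = new AbortController();
            const { signal } = this.controller;
            for (const url of scan.imageUrls) {
              fetch(url, { signal }).catch(() => {
                // aborted or failed requests are simply dropped here
              });
            }
          }
        }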

    Could you please share the structure and type of your data for further investigation?

    I will look into how feasible it is to enable the viewer command at the scan level.

    For the 3D MPR mode, the buffer size limitation comes from the browser and the underlying libraries used by the viewer. Have you tried to use Firefox? Would you be able to share some sample data?
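
    To give a sense of where this limit bites, here is a rough, purely illustrative calculation (the dimensions and element size below are hypothetical, not taken from your data or the viewer's code): the volume is handed to texImage3D as a single typed array, and the browser rejects any ArrayBuffer or ArrayBufferView larger than 2 GB.

        // Rough check of the browser's 2 GB limit on a single texImage3D
        // upload. Dimensions are hypothetical examples only.
        const TWO_GB = 2 ** 31; // 2,147,483,648 bytes

        function exceedsTexImage3dLimit(
          width: number,
          height: number,
          depth: number,
          bytesPerVoxel: number, // e.g. 2 for 16-bit CT voxels
        ): boolean {
          return width * height * depth * bytesPerVoxel > TWO_GB;
        }

        // A 1024 x 1024 x 1200 volume of 16-bit voxels is roughly 2.5 GB,
        // so it cannot be uploaded as one 3D texture.
        console.log(exceedsTexImage3dLimit(1024, 1024, 1200, 2)); // true

    So any mitigation on the viewer side would have to reduce what is uploaded in a single call (for example by downsampling or splitting the volume) rather than raise the limit itself.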

    Kind regards,

    Mo

  2. Kym Eden reporter

    Hi Mo,

    Sorry, yes, here is the missing information:

    OHIF plugin 3.6.0
    XNAT 1.8.9.2

    For both issues I have tested Firefox (127.0.1) and Chrome (126.0.6478.127), and they give the same behaviour, although Firefox displays an explicit stack trace while Chrome produces a small error dialog in the bottom right.

    We won’t be able to share the very large datasets where we experience the timeout issue. I’m enquiring about whether we can share a video of them, if that would be useful to you. Otherwise, I’m also looking at generating synthetic data that produces the same effect. The datasets are XA modality, in 49 scans, each consisting of a single file ranging in size from 2MB to 1.3GB. The one at the top of the list (which I assume is loaded by default) is 170MB. Some thumbnails do appear in the OHIF viewer sidebar, but many of the previews still show the loading wheel at the time the timeout error is displayed.

    We have only been able to test the >2GB MPR limit on CT data. I’m asking whether we can share this, or again whether we need a synthetic equivalent.

    Many thanks

    Kym
