Fractorium fails to set output texture when performing scaled rendering at large resolutions

Issue #34 closed
Benjamin Middaugh created an issue

I discovered Fractorium a few months ago and got very good results, but my GTX 750 Ti didn’t have enough video memory for renders at the resolution I wanted to target, so I recently installed a GTX 1060 instead. Now, Fractorium renders just fine at the resolution specified in the file, but when I set the final render to my desired scale (and set an appropriate number of strips) I get the following error message and the render fails immediately:

Rendering failed.

EmberCLns::RendererCL<float,float>::SetOutputTexture()
NVIDIA Corporation GeForce GTX 1060 6GB
Platform: 0, device: 0, error:
Failed to init set output texture

EmberCLns::RendererCL<float,float>::Alloc()
NVIDIA Corporation GeForce GTX 1060 6GB
Platform: 0, device: 0, error:
Failed to set output texture

Histogram, accumulator and samples buffer allocations failed, aborting.

ERROR: Invalid image size in cl::Image2D().

I’m trying to render at 18000 x 12000 (a 60-inch-wide by 40-inch-tall print at 300 DPI), but it clearly isn’t working. When I render at 12000 x 8000 (well, 12000 x 8001, because the rendering pipeline seems to eat a pixel somewhere when calculating the number of strips needed) with 4 strips, it renders fine.

Am I being too ambitious for my render target, or is there a bug somewhere? When I tried a render at 15001 x 10001 (at 4 strips) it failed after 1 strip with the following error:

Running final accumulation program failed

ERROR: Invalid image size in cl::Image2D().

ERROR: Memory object allocation failure in cl::CommandQueue::enqueueNDRangeKernel().

It is possible that I have some bad hardware somewhere: the graphics card is used, and I suspect an unstable RAM stick, since Windows crashes randomly although Linux runs fine. But I never had this problem with my 750 Ti (to be fair, it didn’t have enough VRAM to reach the sizes I’m having trouble with before I ran out of the number of strips the GUI would support), so I want to make sure the problem isn’t with the software before I try something expensive like changing my hardware again.

Thanks for your wonderful work with this software. It’s made my high-resolution flame rendering bearable on my old PC.

Comments (6)

  1. Benjamin Middaugh reporter

    I mentioned in passing that Fractorium’s final render dialog sometimes calculates the number of strips strangely. I tried another render at 12000 x 8001, and changing the resolution to 12000 x 8000 instead lowered the number of strips required from 7 to 4 (usually I encounter it the other way around, having to add one pixel, which is why I often try the extra pixel first).

  2. Matt Feemster repo owner

    Hi Ben, thanks for the feedback.

    Yes, you are being too ambitious with your render. Even if using strips allows you to perform iterations successfully by splitting up the histogram and density filtering buffers, the final output image buffer must be contiguous. So you are trying to create a texture that is 18000 x 12000, and your card can’t handle it. That is one limitation of rendering on the GPU. One possible idea for the distant future is to allow the final image buffer to be handled on the CPU, but that’s a ways out.

    Further, a GPU will have a max allocation size, which is less than the total amount of memory. No single buffer can be larger than this. If you look in the text box as you play around with the sizes, you will see an error message describing this.
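    As a rough illustration (not Fractorium’s actual code), the arithmetic below checks a requested output size against a max single allocation and a max 2D-image dimension. The ~1.5 GiB max allocation (total VRAM / 4), the 16384-pixel image dimension limit, and the RGBA-float pixel format are all assumed here; they are typical NVIDIA OpenCL values, but your driver reports the real ones.

    ```cpp
    #include <cassert>
    #include <cstddef>

    // Checks whether a w x h output texture can be created at all, given a
    // max single-allocation size and a max cl::Image2D dimension. Both the
    // RGBA-float pixel format and the limit values used below are assumptions.
    bool OutputTextureFits(size_t w, size_t h, size_t maxAlloc, size_t maxDim)
    {
        const size_t bytesPerPixel = 4 * sizeof(float); // assumed RGBA float

        if (w > maxDim || h > maxDim) // exceeds CL_DEVICE_IMAGE2D_MAX_WIDTH/HEIGHT
            return false;

        return w * h * bytesPerPixel <= maxAlloc; // must be one contiguous buffer
    }

    int main()
    {
        const size_t maxAlloc = 6ULL * 1024 * 1024 * 1024 / 4; // 6 GB card, assumed alloc cap of VRAM / 4
        const size_t maxDim = 16384; // common NVIDIA 2D image dimension limit

        assert(!OutputTextureFits(18000, 12000, maxAlloc, maxDim)); // width alone exceeds 16384
        assert(OutputTextureFits(12000, 8000, maxAlloc, maxDim));   // ~1.43 GiB, just fits
        return 0;
    }
    ```

    Under these assumed limits, 18000 already exceeds the per-dimension image cap before memory even comes into play, which would explain the "Invalid image size in cl::Image2D()" error regardless of strips.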

    Be sure to read this article on my website (as well as all of the other documentation):

    http://fractorium.com/?article=strips

    The error in computing the number of required strips is because there is a small gutter around the image buffer for rendering. So if you are specifying an image of a certain size, the actual buffer used internally is a bit bigger than that to handle some of the filtering processing. Just tinker with the size until you get the smallest number of strips needed. Then if your image has an extra line, crop it in an image editing program.
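    A hypothetical sketch of such a strip-count search (not Fractorium’s real logic): assuming the strip count must divide the image height evenly and each gutter-padded strip buffer must fit within the max allocation, an odd height like 8001 skips small counts such as 4 and lands on 7. The gutter width, bytes per pixel, and allocation cap below are invented parameters.

    ```cpp
    #include <cassert>
    #include <cstddef>

    // Hypothetical strip-count search: find the smallest count that divides
    // the image height evenly and keeps each gutter-padded strip buffer under
    // the max single allocation. All numeric parameters here are invented.
    size_t StripsNeeded(size_t w, size_t h, size_t gutter,
                        size_t bytesPerPixel, size_t maxAlloc)
    {
        for (size_t n = 1; n <= h; ++n)
        {
            if (h % n != 0) // assume strips must divide the height evenly
                continue;

            const size_t stripW = w + 2 * gutter;     // gutter-padded width
            const size_t stripH = h / n + 2 * gutter; // gutter-padded strip height

            if (stripW * stripH * bytesPerPixel <= maxAlloc)
                return n;
        }

        return 0; // no feasible strip count
    }

    int main()
    {
        const size_t maxAlloc = 400ULL * 1024 * 1024; // invented 400 MiB cap

        // 8000 divides by 4, so 4 strips suffice; 8001 skips 4, 5, and 6
        // (none divide it evenly) and lands on 7, mirroring the 7-vs-4
        // behavior reported above under these assumed parameters.
        assert(StripsNeeded(12000, 8000, 10, 16, maxAlloc) == 4);
        assert(StripsNeeded(12000, 8001, 10, 16, maxAlloc) == 7);
        return 0;
    }
    ```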

    Thanks for your interest in my program. Another user has spotted some bugs, so I’ll probably do a small release next year.

  3. Benjamin Middaugh reporter

    Thanks. That’s what I was afraid of. I definitely appreciate your work, and I’ll be looking forward to what’s to come.

  4. Matt Feemster repo owner

    Ok great, closing this.

    Remember, total available VRAM is not the only issue. Even within that, the driver sets a max allocation size that is somewhat less than the total. That is the largest single buffer you can allocate at a time.
