How to create equirectangular 360° panoramas in Flux

To create panoramic images which can be viewed in 360 VR or as skyboxes, I've trained this LoRA for Flux Dev:

https://civitai.com/models/735980?modelVersionId=823024

Updated workflow with better notes: https://civitai.com/models/745010?modelVersionId=833115

Some basic info about how 360 panoramas work: equirectangular projection images have a particular spherical lens distortion in which the top and bottom edges of the image represent the points directly above and below the viewer, and the left and right edges represent the space directly behind the viewer. The horizon line should also run along the central horizontal line of the image. Without the LoRA, the base model can occasionally generate images with a distortion like this, but it usually won't be very accurate. This LoRA was trained on a variety of professional equirectangular images to teach the model how this distortion should be applied, and it works quite well.
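To make the projection concrete, here's a minimal Python sketch (my own illustration, not part of the LoRA or workflow) of how an equirectangular pixel maps to a 3D view direction:

```python
import math

def pixel_to_direction(x, y, width, height):
    """Map an equirectangular pixel to a unit 3D view direction.

    Longitude spans -180 deg to +180 deg across the width (both the
    left and right edges face directly behind the viewer), and latitude
    spans +90 deg (straight up) at the top row to -90 deg (straight
    down) at the bottom row. The horizon sits on the central row.
    """
    lon = (x / width - 0.5) * 2.0 * math.pi   # -pi .. pi
    lat = (0.5 - y / height) * math.pi        # +pi/2 .. -pi/2
    # Unit vector with +z forward, +x right, +y up
    return (
        math.cos(lat) * math.sin(lon),
        math.sin(lat),
        math.cos(lat) * math.cos(lon),
    )

# The image centre looks straight ahead along +z:
print(pixel_to_direction(1024, 512, 2048, 1024))  # ≈ (0.0, 0.0, 1.0)
```

Because the left and right edges both map to the same backward direction, they have to depict the same strip of the scene, which is exactly why a mismatched seam there is so noticeable.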

However, it's still not perfect, and there will usually be a seam at the edges of the image, where the left and right edges won't match up exactly. The idea of the attached workflow is to resolve that seam by shifting the image horizontally by half its width, then inpainting the center section (where the seam now sits) to remove it. Afterwards, it shifts the image back to restore the original composition.
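The halfway shift itself is just a horizontal roll with wrap-around; applying it twice restores the original image. A minimal NumPy sketch of the idea (the function name is my own):

```python
import numpy as np

def shift_half(img):
    """Roll an equirectangular image horizontally by half its width,
    moving the wrap-around seam to the centre where it can be inpainted.
    Rolling by half the width again restores the original composition."""
    return np.roll(img, img.shape[1] // 2, axis=1)

# Toy example on a dummy 4-pixel-wide, 1-channel "image":
img = np.array([[[1], [2], [3], [4]]])
shifted = shift_half(img)       # columns become 3, 4, 1, 2
restored = shift_half(shifted)  # back to 1, 2, 3, 4
```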

After this, if you want a higher resolution (which is useful in VR), you can upscale the image and run another low-denoise pass on both the normal and shifted versions. I also attached a workflow that includes these upscale and hi-res fix steps.
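A quick way to sanity-check the seam after any of these steps is to compare the two edge columns, which should depict the same strip of the scene. A small NumPy sketch (the metric and function name are my own, not part of the workflow):

```python
import numpy as np

def seam_error(img):
    """Mean absolute difference between the left and right edge columns.
    A well-stitched equirectangular image should score near zero, since
    both edges face the same direction behind the viewer."""
    left = img[:, 0].astype(np.float64)
    right = img[:, -1].astype(np.float64)
    return float(np.abs(left - right).mean())
```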

There are other LoRAs that aim to do the same thing, but I'm not sure how they compare. The one I trained seems to work well for landscapes, but might not be as good for indoor scenes without trying a few seeds or doing some img2img steps. I'm hoping to improve the range of scenes it can generate in future versions.

Custom nodes used in the workflow that aren't in the registry:

  • The GGUF loader, available here: https://github.com/city96/ComfyUI-GGUF. If you aren't using GGUF, you can replace it with the standard diffusion model loader, but I highly recommend GGUF for the smaller model size with minimal change in quality.

  • The ComfyMath nodes, available here: https://github.com/evanspearman/ComfyMath (I've tried other math node packs, but I've been having issues with them for some reason)

  • The Stereo Image node, available here: https://github.com/Dobidop/ComfyStereo. It's optional and only needed if you plan to generate stereo images for VR.

Ways to view 360 panoramas:

  • In VR, with many different VR media players (e.g. SteamVR Media Player, Deo VR, etc.). These also support stereo images, which add a 3D effect that makes the environment more immersive. For stereo image generation, I recommend setting the fill_technique value to polylines_sharp so the 3D effect doesn't have jagged edges.

  • With a web tool for viewing them, like https://renderstuff.com/tools/360-panorama-web-viewer/

  • After uploading to a site that supports an immersive view, once the appropriate image metadata has been added. On Windows, you can add the equirectangular projection metadata with exiftool using a command like this: path\to\exiftool.exe -XMP:ProjectionType="equirectangular" image.png

  • Civitai VR native support when? :)
