
Nomos8kHAT-L_otf

Verified: SafeTensor

Type: Upscaler

Stats: 7,826

Reviews: 0

Published: Jun 25, 2024

Base Model: SDXL 1.0

Hash (AutoV2): 4AA7FC4B42

Creator: Creepybit

Note: This upscaler is not mine

Credit to Helaman

Originally uploaded at: https://openmodeldb.info/models/4x-Nomos8kHAT-L-otf

About version 2.0

Everything is the same as with the first upload, but converted to .safetensors. I had issues getting Forge and Automatic1111 to load the .safetensors version of the upscaler, but it works like a charm in ComfyUI.
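If you want to sanity-check the .safetensors file outside any UI, it can be opened directly with the safetensors library. A minimal sketch follows; the file name and path are assumptions, point them at your own download:

```python
# Minimal sketch: inspect the .safetensors upscaler weights directly.
# The file name below is an assumption -- adjust it to your download.
from safetensors.torch import load_file

state_dict = load_file("4xNomos8kHAT-L_otf.safetensors")

# Print a few tensor names and shapes to confirm the file is readable.
for i, (name, tensor) in enumerate(state_dict.items()):
    print(name, tuple(tensor.shape))
    if i >= 4:
        break
```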

General info

Hybrid Attention Transformer (HAT) combines channel attention and self-attention schemes and makes use of their complementary advantages. To enhance the interaction between neighboring window features, an overlapping cross-attention module is employed in HAT.
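For intuition, the channel-attention part behaves like a squeeze-and-excitation block: globally pooled channel statistics reweight each feature map before it is merged with the window self-attention branch. Below is a simplified PyTorch sketch of just that channel-attention piece; it is an illustration, not the reference HAT code, and the module name and reduction factor are assumptions:

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (simplified illustration,
    not the actual HAT implementation)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # global spatial average per channel
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),                            # per-channel weights in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = self.fc(self.pool(x))              # shape (N, C, 1, 1)
        return x * weights                           # reweight each feature map

# Quick shape check on a dummy feature map.
if __name__ == "__main__":
    feat = torch.randn(1, 64, 32, 32)
    print(ChannelAttention(64)(feat).shape)          # torch.Size([1, 64, 32, 32])
```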

Where does it go?

To use this (and other HAT upscalers) with Automatic1111 and Forge, follow these steps (a small script that automates them is shown after the note below).

  • Create a folder in \webui\models\ and name it HAT

  • Download the file either from this page or from the source

  • Place the file in \webui\models\HAT\

  • Restart your webui

Note: If you have issues getting the model to work, change the file extension from .pt to .pth
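Here is a small Python sketch that performs the folder setup, copies the file into place, and applies the optional .pt to .pth rename from the note above. The webui path and downloaded file name are assumptions; adjust them to your own install:

```python
# Sketch: place a downloaded HAT upscaler where Automatic1111/Forge expect it.
# The paths below are assumptions -- point them at your own webui install
# and at wherever you saved the download.
from pathlib import Path
import shutil

webui_root = Path(r"C:\stable-diffusion\webui")          # assumed install location
downloaded = Path(r"C:\Downloads\4xNomos8kHAT-L_otf.pt") # assumed download name

hat_dir = webui_root / "models" / "HAT"
hat_dir.mkdir(parents=True, exist_ok=True)               # creates \webui\models\HAT\

target = hat_dir / downloaded.name
shutil.copy2(downloaded, target)

# Optional workaround from the note above: rename .pt to .pth if the
# webui refuses to load the model under its original extension.
if target.suffix == ".pt":
    target = target.rename(target.with_suffix(".pth"))

print("Upscaler installed at:", target)
```

After running it (or doing the same steps by hand), restart the webui so the new HAT folder is picked up.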