
Running SD XL with 4 GB VRAM, can’t use LoRAs (Automatic1111)

So I have a fairly old laptop: 16 GB RAM and a GeForce 960M with 4 GB VRAM. I was able to load an SDXL model into Automatic1111 and create images. As you can guess, it's pretty slow, but it works, and I really don't mind how long it takes. The problem comes when I try to use a LoRA: if I load even a single one, the PC freezes, or the CMD prompt freezes, or weird things start to happen. I was able to use LoRAs a couple of times, but I don't know how or why. I'm
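
For reference, this is the kind of webui-user.bat launch config I mean for a low-VRAM card (I'm not certain these exact flags match what I have set, so take it as a sketch):

    @echo off
    set PYTHON=
    set GIT=
    set VENV_DIR=
    rem Low-VRAM options for a 4 GB card; --xformers needs the xformers package installed
    set COMMANDLINE_ARGS=--lowvram --xformers
    call webui.bat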