
Does the Real-HAT have a small version like HAT-S? #126

Open
ANYMS-A opened this issue Feb 1, 2024 · 6 comments

Comments

@ANYMS-A

ANYMS-A commented Feb 1, 2024

Does the Real-HAT have a small version like HAT-S? I want to deploy HAT on an edge-computing device, but Real-HAT is too large to deploy.

@0MiDo0

0MiDo0 commented Feb 2, 2024

I ran into the same problem: Real-HAT SRx4 requires too much VRAM. It would be better if the author released a Real-HAT SRx2 version.

@wangxinchao-bit

Have you trained the Real-HAT SRx2?

@0MiDo0

0MiDo0 commented Apr 26, 2024

> Have you trained the Real-HAT SRx2?

Not yet; my GPU (an RTX 4070 Ti with 12 GB) isn't enough for training. A good model needs a large batch size for training, and 12 GB is only enough for batch_size=1 at img_size 512.
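As a back-of-the-envelope sketch of the trade-off being discussed: for a fixed model, activation memory per sample scales roughly with the number of input pixels, so smaller training crops buy a larger batch within the same VRAM budget. The numbers below are illustrative, anchored only to the 11.7 GB figure quoted in this thread; real usage also includes model weights and optimizer state.

```python
# Rough fp32 activation-memory estimate, assuming memory per sample
# scales with pixel count. Reference point (from this thread):
# batch_size=1 at 512x512 ~ 11.7 GB.

def est_gb_per_sample(crop: int, ref_crop: int = 512, ref_gb: float = 11.7) -> float:
    """Scale the reference VRAM figure by the pixel-count ratio."""
    return ref_gb * (crop * crop) / (ref_crop * ref_crop)

def max_batch(crop: int, budget_gb: float = 12.0) -> int:
    """Largest batch size that fits the budget under this rough model."""
    return int(budget_gb // est_gb_per_sample(crop))

print(round(est_gb_per_sample(256), 3))  # 2.925 GB per 256x256 sample
print(max_batch(256))                    # 4
print(max_batch(512))                    # 1
```

Under this rough model, halving the crop side to 256 would fit roughly batch_size=4 in the same 12 GB, which is why many SR recipes train on small crops rather than full-size images.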

@wangxinchao-bit

Have you come across any smaller models that deliver slightly better performance? I'm interested in experimenting with a GAN. I saw that the author trained their network using a 2080 with a batch size of 4 for x4 super-resolution and achieved impressive results. However, I'm not sure about the duration of their training process.

@0MiDo0

0MiDo0 commented Apr 26, 2024

> Have you come across any smaller models that deliver slightly better performance? I'm interested in experimenting with a GAN. I saw that the author trained their network using a 2080 with a batch size of 4 for x4 super-resolution and achieved impressive results. However, I'm not sure about the duration of their training process.

Because my training images are large (512×512), batch_size 1 takes 11.7 GB of VRAM.
I'm now using Real-ESRGAN; it works for me, but it's not as good as HAT.
There is a new network called DRCT; you can have a look: https://github.com/ming053l/drct
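For anyone else landing here with limited VRAM: the Real-ESRGAN repo ships an inference script with a tiling option, which processes the image in chunks so large inputs fit on small GPUs. This is a sketch of the invocation as documented in that repo's README (flag names may change between versions; check the current README before relying on them):

```shell
# Run Real-ESRGAN x4 inference with 256px tiles to cap VRAM usage.
# inference_realesrgan.py, -n, -i, -o, --outscale and --tile are from
# the xinntao/Real-ESRGAN README; paths here are placeholders.
python inference_realesrgan.py \
    -n RealESRGAN_x4plus \
    -i inputs \
    -o results \
    --outscale 4 \
    --tile 256
```

Smaller `--tile` values use less VRAM at some cost in speed and, occasionally, visible seams at tile borders.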

@wangxinchao-bit

Thank you. I wonder how long it takes you to run super-resolution on 512×512 images with Real-ESRGAN?

3 participants