The effect of fine-tuning without domain gap #24

Open
rachel-lyq opened this issue Jul 19, 2023 · 1 comment
@rachel-lyq

Your outstanding performance surprised us, and I have tried to apply the method in my own work. In a few-shot classification experiment on a single dataset, MiniImageNet (i.e., without cross-domain data from multiple datasets), fine-tuning made very little difference, failing to bring even a 0.1% improvement. Was the fine-tuning effect significant in your experiments? What do you think might be the cause of this problem?

@hushell (Owner) commented Jun 5, 2024

I guess fine-tuning works better when the domain gap is not too small. On MiniImageNet, the pre-trained DINO features alone work amazingly well; it is now well understood that foundation models solve many standard classification problems. While scaling up foundation models solves more and more of them, certain domains (e.g., 3D vision) still lack a good foundation model, and there meta-learning and fine-tuning remain good practice.
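
For concreteness, here is a minimal sketch (not this repo's evaluation code) of the two settings being compared: a frozen-feature nearest-centroid baseline on DINO embeddings versus a short fine-tuning run on the support set. The episode tensors (`support_x`, `support_y`, `query_x`, `query_y`) and the hyperparameters (`steps`, `lr`) are hypothetical placeholders that your MiniImageNet episode sampler would supply:

```python
import copy

import torch
import torch.nn.functional as F

# Pre-trained DINO ViT-S/16 from torch.hub (facebookresearch/dino).
# Its forward pass returns the CLS-token embedding for each image.
backbone = torch.hub.load('facebookresearch/dino:main', 'dino_vits16')
backbone.eval()

@torch.no_grad()
def frozen_centroid_accuracy(support_x, support_y, query_x, query_y, n_way):
    """Baseline: classify queries by cosine similarity to per-class
    mean embeddings (prototypes) computed from frozen DINO features."""
    z_s = F.normalize(backbone(support_x), dim=-1)  # (n_support, d)
    z_q = F.normalize(backbone(query_x), dim=-1)    # (n_query, d)
    protos = torch.stack([z_s[support_y == c].mean(0) for c in range(n_way)])
    preds = (z_q @ F.normalize(protos, dim=-1).t()).argmax(dim=-1)
    return (preds == query_y).float().mean().item()

def finetuned_accuracy(support_x, support_y, query_x, query_y, n_way,
                       steps=50, lr=1e-4):
    """Variant: fine-tune a copy of the backbone plus a linear head on the
    support set, then evaluate on the queries. Hyperparameters are guesses."""
    model = copy.deepcopy(backbone).train()
    head = torch.nn.Linear(model.embed_dim, n_way)
    opt = torch.optim.Adam(
        list(model.parameters()) + list(head.parameters()), lr=lr)
    for _ in range(steps):
        loss = F.cross_entropy(head(model(support_x)), support_y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    model.eval()
    with torch.no_grad():
        preds = head(model(query_x)).argmax(dim=-1)
    return (preds == query_y).float().mean().item()
```

If the frozen baseline already scores within noise of the fine-tuned run, the sub-0.1% gap you observed is exactly what one would expect; on a domain far from DINO's pre-training data, the two numbers should diverge.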
