
Prelosses and Postlosses #76

Open
Naman9639 opened this issue Jan 29, 2020 · 2 comments

Comments

@Naman9639

Hey!
I am new to this field and was going through the code, but I can't figure out what exactly preloss and postloss mean here. Can someone help?

Thanks

@ryujaehun

ryujaehun commented Mar 31, 2020

My understanding is that preloss is the adaptation loss computed on label_a (number of classes per task) and postloss is the meta-learning loss computed on label_b (number of test examples per class).
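
Here is a minimal sketch (not this repo's actual code, written in a PyTorch style) of where the two losses usually appear in a MAML-type training step. Names such as `x_spt`/`y_spt` (support set, i.e. label_a) and `x_qry`/`y_qry` (query set, i.e. label_b) and the tiny linear model are made up for illustration:

```python
import torch
import torch.nn.functional as F

n_way, feat_dim, inner_lr = 5, 64, 0.01

# Meta-parameters of a tiny linear classifier (stand-in for the real network).
w = torch.randn(n_way, feat_dim, requires_grad=True)
b = torch.zeros(n_way, requires_grad=True)
meta_opt = torch.optim.Adam([w, b], lr=1e-3)

# One fake task: support set (label_a) and query set (label_b).
x_spt, y_spt = torch.randn(25, feat_dim), torch.randint(0, n_way, (25,))
x_qry, y_qry = torch.randn(75, feat_dim), torch.randint(0, n_way, (75,))

# preloss: adaptation loss on the support set, used for the inner-loop update.
preloss = F.cross_entropy(x_spt @ w.t() + b, y_spt)
g_w, g_b = torch.autograd.grad(preloss, [w, b], create_graph=True)
w_fast, b_fast = w - inner_lr * g_w, b - inner_lr * g_b  # one inner SGD step

# postloss: meta-learning loss on the query set, measured after adaptation.
postloss = F.cross_entropy(x_qry @ w_fast.t() + b_fast, y_qry)

# The outer (meta) update minimizes the post-adaptation query loss.
meta_opt.zero_grad()
postloss.backward()
meta_opt.step()
print(f"preloss={preloss.item():.4f}  postloss={postloss.item():.4f}")
```

So preloss is evaluated with the current meta-parameters on the support data, while postloss is evaluated with the adapted (fast) weights on the query data and is what the meta-optimizer actually minimizes.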

@asker-github

Hello, I have a question. Why does postloss get higher and higher during training? I would expect it to decrease; it is the accuracy that should rise.
