
would be good to talk about how vectors (col vectors) and covectors (row vectors) differ #3

Open
bohrium opened this issue Dec 11, 2022 · 0 comments

Comments


bohrium commented Dec 11, 2022

I'd be pleased to add this vector-vs-covector content if you see fit.

It's not just a point of theoretical pedantry: once students understand the distinction, facts like the coordinate dependence of gradient descent (said another way, the fact that the learning rate is an inverse Riemannian metric rather than a plain number) become manifest. From there, implicit regularization near a minimum becomes easy to analyze by dimensional analysis alone. Thus we come to appreciate, once again, how an inner product (which says how "alike" its two inputs are) controls generalization behavior, just as with kernel SVMs, L2-regularized underdetermined linear regression, and more.
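To make the coordinate dependence concrete, here is a minimal numerical sketch (my own illustration, not from the proposed content): because the gradient is a covector, it transforms with the inverse-transpose under a linear change of coordinates, so plain gradient descent traces out genuinely different trajectories in different coordinate systems.

```python
import numpy as np

# Quadratic loss on R^2: f(x) = 0.5 * x^T M x, minimized at the origin.
M = np.diag([1.0, 100.0])

def grad_f(x):
    return M @ x

# Change of coordinates: y = A x (vectors transform with A, and the
# gradient, being a covector, transforms with A^{-T}).
A = np.diag([10.0, 1.0])
A_inv = np.linalg.inv(A)

def grad_g(y):
    # g(y) = f(A^{-1} y), so by the chain rule grad g = A^{-T} grad f.
    return A_inv.T @ grad_f(A_inv @ y)

lr = 0.005
x = np.array([1.0, 1.0])
y = A @ x  # the same starting point, expressed in the new coordinates
for _ in range(100):
    x = x - lr * grad_f(x)
    y = y - lr * grad_g(y)

# If gradient descent were coordinate-free, mapping y back with A^{-1}
# would reproduce x exactly. It does not: the updates differ by A^{-1}A^{-T},
# i.e. by a choice of (inverse) metric hidden inside the learning rate.
print(np.allclose(A_inv @ y, x))  # prints False
```

Dividing a covector update by nothing and adding it to a vector only makes sense once a metric is fixed; the scalar learning rate silently plays that role.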

