Pinned
- togethercomputer/MoA (Public): Together Mixture-Of-Agents (MoA) – 65.1% on AlpacaEval with OSS models
- DS3Lab/AC-SGD (Public): Code associated with the paper **Fine-tuning Language Models over Slow Networks using Activation Compression with Guarantees**
- two-are-better-than-one (Public): Code associated with the paper **Two are Better Than One: Joint Entity and Relation Extraction with Table-Sequence Encoders** (EMNLP 2020)