[LoRA] feat/ add LoRA to fc_layer @open sesame 03/07 10:42 #2442
Conversation
📝 TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2442. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments quickly from reviewers. Your PR must pass all verification processes of cibot before a review by reviewers can start. If you are a new member joining this project, please read the manuals in the documentation folder and wiki page. To monitor the progress of your PR in more detail, visit http://ci.nnstreamer.ai/.
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
nntrainer/layers/fc_layer.cpp (outdated)
@@ -186,6 +237,11 @@ void FullyConnectedLayer::incremental_forwarding(RunLayerContext &context,
  }
}

/**
Please describe this in the header file.
LGTM
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
LGTM
cibot: @EunjuYang, nntrainer/layers/fc_layer.cpp does not include Doxygen tags such as @file @brief @author @bug. You must include the Doxygen tags in the source code. Please refer to the Doxygen manual at http://github.com/nnstreamer/TAOS-CI/blob/main/ci/doc/doxygen-documentation.md
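For reference, a minimal file header carrying the Doxygen tags the bot asks for might look like the fragment below (the field values are illustrative placeholders, not the actual ones used in this PR):

```cpp
/**
 * @file   fc_layer.cpp
 * @brief  Fully connected layer (with optional LoRA support)
 * @author <author name and e-mail>
 * @bug    No known bugs except for NYI items
 */
```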
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
cibot: @EunjuYang, nntrainer/layers/fc_layer.cpp does not include Doxygen tags such as @file @brief @author @bug. You must include the Doxygen tags in the source code. Please refer to the Doxygen manual at http://github.com/nnstreamer/TAOS-CI/blob/main/ci/doc/doxygen-documentation.md
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
I reflected the off-line discussion.
nntrainer/layers/fc_layer.cpp (outdated)
/** loraB: (lora_rank, out_dim) */
TensorDim loraB_dim(
  1, is_nchw ? 1 : unit, is_nchw ? lora_rank : 1,
  is_nchw ? unit : in_dim.channel(),
so there's no use of `lora_rank` in the channels-last case for B?
It should be fixed! Thanks!!:)
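The review comment above points out that `lora_rank` does not appear in the channels-last branch of `loraB_dim`. A minimal pure-Python sketch of the intended invariant; note that the channels-last mapping below is an assumption, since only the NCHW case can be read off the quoted diff:

```python
# Sketch, not nntrainer code: loraB represents a (lora_rank, out_dim) matrix,
# so lora_rank must appear in the tensor dimensions for BOTH memory formats.

def lora_b_dims(is_nchw: bool, lora_rank: int, unit: int):
    # nntrainer's TensorDim takes (batch, channel, height, width)
    if is_nchw:
        return (1, 1, lora_rank, unit)   # taken from the quoted diff
    # assumed channels-last fix: rank and unit both survive into the dims
    return (1, unit, lora_rank, 1)

for is_nchw in (True, False):
    dims = lora_b_dims(is_nchw, lora_rank=4, unit=8)
    assert 4 in dims, "lora_rank missing from loraB dims"
```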
This commit includes an implementation of LoRA only for the FC layer, which means it is not the generalized version. It should be written as a separate class in order to remove code duplication for other layers. Signed-off-by: Eunju Yang <[email protected]>
This commit updates the TfLite node exporter of the fully connected layer. It adds a new property (LoraRank) as an additional input property of the fully connected layer. Signed-off-by: Eunju Yang <[email protected]>
- update type of LoraRank property: Property<int> -> PositiveIntegerProperty
- fix typo: dot_batched_deriv_wrt_1 -> dot_deriv_wrt_1
- update code with add -> add_i
- apply clang-format

Signed-off-by: Eunju Yang <[email protected]>
- remove `forwarding_lora()` function
- update the forwarding path with the LoRA option
  - first compute the forwarding logits of the base weight (W) and the LoRA weights (A @ B) respectively, then merge the logits to return
  - [update] (W + A @ B)x -> Wx + (A @ B)x
- update `calcDerivative` to reflect the changes in the forwarding operation

**Self-evaluation:**
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Eunju Yang <[email protected]>
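The reordering described in the commit above, (W + A @ B)x -> Wx + (A @ B)x, does not change the result; by distributivity it only avoids materializing a merged weight. A pure-Python sketch with toy integer matrices (shapes and values are illustrative only, not nntrainer code):

```python
def matmul(m, n):
    """Naive matrix product of nested lists."""
    return [[sum(m[i][k] * n[k][j] for k in range(len(n)))
             for j in range(len(n[0]))] for i in range(len(m))]

def matadd(m, n):
    """Elementwise sum of two same-shape matrices."""
    return [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(m, n)]

W = [[1, 0, 2], [0, 1, 1]]        # base weight, (out_dim, in_dim)
A = [[1], [2], [3]]               # loraA, (in_dim, lora_rank)
B = [[1, -1]]                     # loraB, (lora_rank, out_dim)
x = [[1], [2], [3]]               # input column vector, (in_dim, 1)

AB = matmul(A, B)                 # (in_dim, out_dim)
ABt = [list(r) for r in zip(*AB)] # transpose to (out_dim, in_dim)

merged = matmul(matadd(W, ABt), x)            # old path: (W + A @ B) x
split = matadd(matmul(W, x), matmul(ABt, x))  # new path: Wx + (A @ B) x

assert merged == split            # identical logits either way
```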
- fix a typo in the code
- edit comments to add some explanations

Signed-off-by: Eunju Yang <[email protected]>
- apply clang format to - nntrainer/tensor/tensor_v2.cpp - nntrainer/utils/node_exporter.cpp Signed-off-by: Eunju Yang <[email protected]>
- remove a redundant and incorrect block comment in `nntrainer/layers/fc_layer.cpp` Signed-off-by: Eunju Yang <[email protected]>
- clang-format re-apply to pass static checker - `fc_layer.cpp` Signed-off-by: Eunju Yang <[email protected]>
Force-pushed from 10aae55 to aaebf77
LGTM!
I will make the FC layer support export to the tflite format later.
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
- updates the LoRA computation (applying Inception-LoRA)
  - compute with LoRA vectors without matrix construction
- revise `forwarding()`
- revise `calcGradient()`
- revise `calcDerivative()`

Self evaluation:
1. Build test: [X]Passed [ ]Failed [ ]Skipped
2. Run test: [X]Passed [ ]Failed [ ]Skipped

Signed-off-by: Eunju Yang <[email protected]>
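"Compute with LoRA vectors without matrix construction" suggests evaluating (x @ A) @ B as two thin products rather than forming the (in_dim, out_dim) matrix A @ B, which is cheap when lora_rank is much smaller than in_dim and out_dim. A hedged pure-Python sketch of that idea (toy shapes, not the actual nntrainer implementation):

```python
def matmul(m, n):
    """Naive matrix product of nested lists."""
    return [[sum(m[i][k] * n[k][j] for k in range(len(n)))
             for j in range(len(n[0]))] for i in range(len(m))]

x = [[1, 2, 3]]                  # (1, in_dim), row-vector convention
A = [[1, 0], [0, 1], [1, 1]]     # loraA, (in_dim, lora_rank)
B = [[1, -1], [2, 0]]            # loraB, (lora_rank, out_dim)

# naive: materialize the full (in_dim, out_dim) low-rank matrix first
y_naive = matmul(x, matmul(A, B))

# factored: two thin matmuls; A @ B is never constructed
y_factored = matmul(matmul(x, A), B)

# matrix multiplication is associative, so both orderings agree
assert y_naive == y_factored
```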
@EunjuYang, 💯 All CI checkers are successfully verified. Thanks.
LGTM!
LGTM
(Update)
This PR contains commits to
- `fc_layer`
- `node_exporter` of the fully connected layer (to pass the tflite test)

2024/03/06
- `forwarding` function is updated
- `forwarding_lora` is deleted

Self evaluation: