
[ccapi] add return last output option in incremental inference @open sesame 10/02 11:44 #2740

Open
wants to merge 1 commit into base: main

Conversation

@lhs8928 (Contributor) commented Sep 26, 2024

- Added a return-last-output option to incremental inference. To preserve backward compatibility, the default value is set to false.

Signed-off-by: hyeonseok <[email protected]>
@taos-ci (Collaborator) commented Sep 26, 2024

📝 TAOS-CI Version: 1.5.20200925. Thank you for submitting PR #2740. Please follow the 1 commit/1 PR (one commit per PR) policy to get comments from reviewers quickly. Your PR must pass all verification processes of cibot before the review process can start. If you are a new member joining this project, please read the manuals in the documentation folder and the wiki page. To monitor the progress of your PR in more detail, visit http://ci.nnstreamer.ai/.

@djeong20 (Contributor) left a comment:

LGTM! Please check the comments.

@@ -849,27 +849,33 @@ std::vector<float *> NeuralNetwork::incremental_inference(
   unsigned int step = from ? 0 : to - 1;

   for (auto &out : output_tensors) {
-    const auto &out_t = *out.get();
+    auto out_t = *out.get();
     float *last_out_buf_data = new float[batch_size * out_t.width()];
@djeong20 (Contributor) commented:

Is there any reason why you changed it to auto?
If out_t is not modified, const auto & would be a better option to avoid the Coverity issue.

@taos-ci (Collaborator) left a comment:
@lhs8928, 💯 All CI checkers are successfully verified. Thanks.

@@ -816,7 +816,7 @@ sharedConstTensors NeuralNetwork::incremental_inference(
 std::vector<float *> NeuralNetwork::incremental_inference(
   unsigned int batch_size, const std::vector<float *> &input,
   const std::vector<float *> &label, unsigned int init_seq_len,
-  unsigned int from, unsigned int to) {
+  unsigned int from, unsigned int to, bool return_last_output_only) {
@SeoHyungjun (Member) commented Sep 27, 2024:

How about naming it "full_output" or "need_only_last" instead of "return_last_output_only"? (I think the word 'return' is unnecessary and could be replaced with a better name.) It seems like an internal flag that determines the return value.

@baek2sm (Contributor) commented Oct 14, 2024:

I think the current parameter name is also good.
However, if renaming is necessary, I would recommend considering 'output_hidden_states' for user convenience, since this term is used in HuggingFace for a similar purpose (if "output_hidden_states == false", only the last output values are returned).

@SeoHyungjun (Member) commented:

If there is a similar precedent, it would be good to use the same name; developers will understand it easily.

@jijoongmoon jijoongmoon changed the title [ccapi] add return last output option in incremental inference [ccapi] add return last output option in incremental inference @open sesame 10/02 11:44 Oct 2, 2024
@taos-ci (Collaborator) left a comment:
@lhs8928, 💯 All CI checkers are successfully verified. Thanks.

@DonghakPark (Member) commented:

Please rebase onto upstream/main; #2741 resolves the cacheLoader_test_case issue.

@SeoHyungjun (Member) commented:

It seems there is a problem with CI. Please check it out.
