This repository has been archived by the owner on Dec 16, 2022. It is now read-only.

Making TransformerQAPredictor compatible with interpret modules #249

Merged
Merged 4 commits into main from transformer-qa-interpret
Apr 14, 2021

Conversation

AkshitaB
Contributor

@AkshitaB AkshitaB commented Apr 13, 2021

  • Implements predictions_to_labeled_instances and _json_to_instance.

Fixes allenai/allennlp-demo#679
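To make the change concrete, here is a minimal self-contained sketch of the two hooks this PR implements. This is not AllenNLP's actual code: `Instance`, the windowing logic, and the output keys (`best_span`) are simplified stand-ins for illustration only.

```python
import logging

logger = logging.getLogger(__name__)


class Instance(dict):
    """Stand-in for allennlp.data.Instance: a plain dict of fields."""


class TransformerQAPredictorSketch:
    """Illustrates the two hooks this PR implements.

    A reading-comprehension question over a long passage is windowed
    into several model inputs, so one JSON dict yields *multiple*
    instances. `_json_to_instance` (singular) therefore cannot return
    them all; per the diff below, it logs a warning, returns only the
    first window, and points callers at `_json_to_instances` instead.
    """

    window_size = 5  # toy stride, not the real tokenizer logic

    def _json_to_instances(self, json_dict):
        # One instance per sliding window over the passage.
        words = json_dict["passage"].split()
        return [
            Instance(question=json_dict["question"],
                     window=words[i:i + self.window_size])
            for i in range(0, len(words), self.window_size)
        ]

    def _json_to_instance(self, json_dict):
        logger.warning(
            "This predictor maps a question to multiple instances. "
            "Please use _json_to_instances instead."
        )
        return self._json_to_instances(json_dict)[0]

    def predictions_to_labeled_instances(self, instance, outputs):
        # Interpret modules (saliency maps, attackers) need instances
        # labeled with the model's own prediction; here we attach the
        # predicted span as the label.
        labeled = Instance(instance)
        labeled["answer_span"] = outputs["best_span"]
        return [labeled]
```

With these two methods in place, the gradient-based interpreters and attackers can turn the model's predictions back into labeled instances, which is what the linked demo issue required.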

- raise NotImplementedError(
-     "This predictor maps a question to multiple instances. "
-     "Please use _json_to_instances instead."
+ logger.warning(
Contributor Author


@dirkgr I'm not sure if just adding a warning instead of an error actually breaks anything.

Member


I don't think it breaks anything, but people don't read warnings.

@AkshitaB AkshitaB requested a review from dirkgr April 13, 2021 23:24

@AkshitaB AkshitaB merged commit f4fb932 into main Apr 14, 2021
@AkshitaB AkshitaB deleted the transformer-qa-interpret branch April 14, 2021 00:50
Development

Successfully merging this pull request may close these issues.

None of the attackers or interpreters work for the Transformer QA model.
2 participants