This repository has been archived by the owner on Jul 17, 2023. It is now read-only.
For H2O-3, this code snippet may be helpful. For a specific row in the dataset it builds a table showing how the prediction changes as a feature's value is varied, which is really important for explainability. Specifically, it creates partial dependence plots for the features with the largest positive and negative contributions for that row:
```python
import h2o
from h2o.automl import H2OAutoML

h2o.init()

# Load data into H2O
df = h2o.import_file('https://h2o-internal-release.s3-us-west-2.amazonaws.com/data/Splunk/churn.csv')

y = 'Churn?'
x = df.columns
x.remove(y)

# Build models
aml = H2OAutoML(max_models=2, seed=1)
aml.train(x=x, y=y, training_frame=df)

# Save the best model
model = aml.leader

# Get how much each feature contributed for each person
pred_contribs = model.predict_contributions(df).drop('BiasTerm').as_data_frame()

# ID of the phone number
row_id = 77

# Columns that are most important for this user: the features with the
# most negative and most positive contribution for this row
min_contrib = pred_contribs.idxmin(axis=1)[row_id]
max_contrib = pred_contribs.idxmax(axis=1)[row_id]

min_pdp = model.partial_plot(
    df,
    cols=[min_contrib],
    plot=False,  # return the table instead of plotting
    nbins=20 if not df[min_contrib].isfactor()[0] else 1 + df[min_contrib].nlevels()[0],
    row_index=row_id
)
display(min_pdp)

max_pdp = model.partial_plot(
    df,
    cols=[max_contrib],
    plot=False,  # return the table instead of plotting
    nbins=20 if not df[max_contrib].isfactor()[0] else 1 + df[max_contrib].nlevels()[0],
    row_index=row_id
)
display(max_pdp)
```
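Conceptually, the table that `partial_plot` builds can be sketched in plain Python: for each candidate value of the chosen feature, overwrite that feature in every row and average the model's predictions. This is a toy illustration only; the `predict` callable, feature names, and values below are made up and not H2O API:

```python
import statistics

def partial_dependence(predict, rows, feature, grid):
    """Sketch of a 1-D partial dependence table: for each value in `grid`,
    set `feature` to that value in every row and average the predictions.
    `predict` is any callable mapping a row dict to a number."""
    table = []
    for value in grid:
        preds = [predict({**row, feature: value}) for row in rows]
        table.append((value, statistics.mean(preds)))
    return table

# Toy model: churn risk rises with customer-service calls
predict = lambda r: 0.1 * r['CustServ Calls'] + 0.01 * r['Day Mins']
rows = [{'CustServ Calls': 1, 'Day Mins': 100},
        {'CustServ Calls': 4, 'Day Mins': 200}]

# Mean prediction increases monotonically with CustServ Calls
print(partial_dependence(predict, rows, 'CustServ Calls', [0, 2, 4]))
```

The real `partial_plot(..., plot=False)` call additionally handles categorical features (hence the `nbins` branching above) and, with `row_index`, computes the curve for a single row instead of averaging.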
This example: https://wave.h2o.ai/docs/examples/ml-h2o-shap shows how to get the SHAP values from a WaveML model. I think that is good enough for developers to build custom downstream plots/cards.
@geomodular Is there anything more we want to accomplish with WaveML regarding this?
The ideal goal would be to explain the model through the Wave ML interface without interfering with the .model param, i.e. m.explain(). The same should be doable with a DAI model as well.
That's the general idea; we can bend it as needed.
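One hypothetical shape for such an interface, wrapping the per-row contribution lookup from the snippet above, might look like the sketch below. The names `explain` and `TopContribs` are illustrative only and are not part of Wave ML; the contribution values are made up:

```python
from dataclasses import dataclass

@dataclass
class TopContribs:
    """Hypothetical result: the most negative and most positive feature for one row."""
    min_feature: str
    max_feature: str

def explain(contribs_row: dict) -> TopContribs:
    """Sketch of an m.explain()-style helper: given one row of per-feature
    contribution values (bias term already dropped), return the features
    that pushed the prediction down and up the most."""
    min_feature = min(contribs_row, key=contribs_row.get)
    max_feature = max(contribs_row, key=contribs_row.get)
    return TopContribs(min_feature=min_feature, max_feature=max_feature)

# Example with made-up contribution values
row = {'Day Mins': 0.42, 'Intl Plan': -0.31, 'CustServ Calls': 0.05}
print(explain(row))  # TopContribs(min_feature='Intl Plan', max_feature='Day Mins')
```

A real implementation would presumably compute the contributions internally (H2O-3 or DAI) and could go on to build the partial dependence tables for the selected features.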