Sycl implementation for objective functions #9846
Conversation
Hi, thank you for the work on objective functions. Some initial comments for discussion. It would be great if we could reduce the amount of copied code.
Find a better way to dispatch names of SYCL kernels with various
If you have ideas on how to refactor the code, please share. ;-)
```cpp
/*!
 * \brief Find the maximum iterator within the iterators
 * \param begin The beginning iterator.
 * \param end The end iterator.
 * \return the iterator pointing to the maximum value.
 * \tparam Iterator The type of the iterator.
 */
template<typename Iterator>
inline Iterator FindMaxIndex(Iterator begin, Iterator end) {
  Iterator maxit = begin;
  for (Iterator it = begin; it != end; ++it) {
    if (*it > *maxit) maxit = it;
  }
  return maxit;
}
```
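As an aside (not part of the PR), the quoted helper behaves like `std::max_element` from `<algorithm>`, which is likely what reusing a common header would amount to. A minimal, self-contained sketch for comparison; the local `FindMaxIndex` copy below exists only for demonstration:

```cpp
#include <algorithm>
#include <vector>

// Local copy of the quoted helper, for side-by-side comparison only.
template <typename Iterator>
inline Iterator FindMaxIndex(Iterator begin, Iterator end) {
  Iterator maxit = begin;
  for (Iterator it = begin; it != end; ++it) {
    if (*it > *maxit) maxit = it;
  }
  return maxit;
}
```

For a non-empty range, both this helper and `std::max_element` return an iterator to the first occurrence of the maximum, since both use a strict `>` / `<` comparison.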
Any chance we can use the one in `common/math.h`?
done
```cpp
struct SoftmaxMultiClassParam : public XGBoostParameter<SoftmaxMultiClassParam> {
  int num_class;
  // declare parameters
  DMLC_DECLARE_PARAMETER(SoftmaxMultiClassParam) {
    DMLC_DECLARE_FIELD(num_class).set_lower_bound(1)
        .describe("Number of output class in the multi-class classification.");
  }
};
```
Maybe create a header instead of copying?
done
```cpp
}
static bst_float SecondOrderGradient(bst_float predt, bst_float label) {
  const bst_float eps = 1e-16f;
  return std::max(predt * (1.0f - predt), eps);
```
So .. is this `sycl::max` or `std::max`? If the compiler can handle `std::max` then we can avoid lots of copy and paste.
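One hedged sketch of how the duplication could be avoided: keep the loss math in a shared header and route `max` through a thin wrapper that resolves differently during the device compilation pass. `__SYCL_DEVICE_ONLY__` is the macro DPC++ defines during its device pass; the wrapper and function names below are illustrative, not the PR's actual code:

```cpp
#include <algorithm>

// Illustrative wrapper: resolves to sycl::max inside SYCL device code and
// std::max on the host. (__SYCL_DEVICE_ONLY__ is defined by the SYCL
// compiler only during device compilation; a plain host build takes the
// std::max branch.)
template <typename T>
inline T MaxOf(T a, T b) {
#if defined(__SYCL_DEVICE_ONLY__)
  return sycl::max(a, b);
#else
  return std::max(a, b);
#endif
}

// The shared loss code can then be written once for both backends.
inline float SecondOrderGradientSketch(float predt) {
  const float eps = 1e-16f;
  return MaxOf(predt * (1.0f - predt), eps);
}
```

Whether this is needed at all depends on the answer to the question above: if the SYCL compiler accepts `std::max` in offloaded kernels directly, the wrapper collapses to nothing.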
```cpp
struct LinearSquareLoss {
  static bst_float PredTransform(bst_float x) { return x; }
  static bool CheckLabel(bst_float x) { return true; }
  static bst_float FirstOrderGradient(bst_float predt, bst_float label) {
    return predt - label;
  }
  static bst_float SecondOrderGradient(bst_float predt, bst_float label) {
    return 1.0f;
  }
  static bst_float ProbToMargin(bst_float base_score) { return base_score; }
  static const char* LabelErrorMsg() { return ""; }
  static const char* DefaultEvalMetric() { return "rmse"; }

  static const char* Name() { return "reg:squarederror_sycl"; }

  static ObjInfo Info() { return {ObjInfo::kRegression, true, false}; }
};
```
Looks like the same as the original one.
```cpp
static bst_float ProbToMargin(bst_float base_score) {
  CHECK(base_score > 0.0f && base_score < 1.0f)
      << "base_score must be in (0,1) for logistic loss, got: " << base_score;
  return -logf(1.0f / base_score - 1.0f);
```
```cpp
// TODO(razdoburdin): SYCL does not fully support std math inside offloaded kernels
```
Would be great if we could avoid copying when the function is supported.
I have made some refactoring. The original `regression_loss.h` is now fully reused.
src/learner.cc
```cpp
if (ctx_.IsSycl()) {
  tparam_.objective = ObjFunction::GetSyclImplementationName(tparam_.objective);
}
```
Is it possible that the dispatching happens inside `ObjFunction::Create`? Also, what happens if a user saves the model, loads it back on a different machine that doesn't have a sycl device?
It is a good idea. In this case we wouldn't have problems with saved models.
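A minimal sketch of what moving the dispatch into the factory could look like. All names below (`Context`, the standalone `GetSyclImplementationName`, `ResolveObjectiveName`, and which objectives get mapped) are illustrative stand-ins, not XGBoost's actual registry code; the point is that the serialized model keeps the device-neutral name and remapping happens only at construction time:

```cpp
#include <string>

// Hypothetical stand-in for the runtime context.
struct Context {
  bool is_sycl = false;
  bool IsSycl() const { return is_sycl; }
};

// Assumed mapping: append the "_sycl" suffix when a SYCL variant is
// registered, otherwise fall back to the original name unchanged.
inline std::string GetSyclImplementationName(const std::string& name) {
  if (name == "reg:squarederror" || name == "binary:logistic") {
    return name + "_sycl";
  }
  return name;  // no SYCL implementation: keep the original objective
}

// Dispatching inside the factory keeps saved models portable: the stored
// objective name never carries the "_sycl" suffix, so a model trained on
// a SYCL machine loads cleanly on a machine without a SYCL device.
inline std::string ResolveObjectiveName(const std::string& name,
                                        const Context& ctx) {
  return ctx.IsSycl() ? GetSyclImplementationName(name) : name;
}
```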
In continuation of #9800
Replaces #7732
This PR adds a SYCL implementation and corresponding tests for the following objectives:
- `multi:softmax`
- `multi:softprob`
- `reg:squarederror`
- `reg:squaredlogerror`
- `reg:logistic`
- `binary:logistic`
- `binary:logitraw`

The dispatching is organized as follows. If the execution device is set to `sycl`, the original name of the objective function is replaced by `original_name_sycl`. If the requested function has no SYCL implementation, the original name is used.