[on hold] generate kerning and allow sparseness in a dangerous and incorrect manner #345
Conversation
/// is an error.
pub variation_model: VariationModel,
/// is an error. This model enforces the no delta at the default location constraint
/// used in things like gvar.
I'm confused. The "no delta at the default location" constraint is not just "for things like gvar"; the entire OpenType Variations model works like that. Why do you need to define an alternative to ModelConstraints::ZeroAtDefault? What are you trying to do?
This stems from a misunderstanding on my part (as we discussed on Monday): I thought that in some cases variable metrics were missing values at the default location.
In reading the comments I think perhaps we are conflating source and binary output? I was imagining we might allow kerning to be specified without a value at the default. The binary would still get a value at default; it merely wouldn't be required in source.
As a concrete example, imagine if you will:
- A font with weight 100..900, default 400
- Hand-written fea for (some pair) that says kern +10 at weight 100, +20 at weight 900
We don't get a direct value at the default so we interpolate it. The binary is still written out based on deltas from the default; it's merely the sources that lose the limitation.
To me this feels like the logical outcome of approaching things in a sparse, variable-first, manner. I would extend it to glyphs as well, no need to provide an actual outline for default in source as long as you provide definitions that span the default so we can interpolate it.
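The weight example above can be checked numerically. Here is a minimal sketch (plain linear interpolation on one axis, in user-space weights; the function names are illustrative, not fontc's actual model code):

```rust
// Sketch: recover the missing default kern value by linear interpolation
// between two sparse masters on a single axis (user-space weights).
fn interpolate(at: f64, (loc_a, val_a): (f64, f64), (loc_b, val_b): (f64, f64)) -> f64 {
    let t = (at - loc_a) / (loc_b - loc_a);
    val_a + t * (val_b - val_a)
}

fn main() {
    // kern +10 at weight 100, +20 at weight 900; default at weight 400
    let default_kern = interpolate(400.0, (100.0, 10.0), (900.0, 20.0));
    assert_eq!(default_kern, 13.75); // the compiler-derived value at default
    println!("{default_kern}");
}
```

The binary would then carry 13.75 at the default and deltas from it, exactly as if the source had been dense.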
@@ -1247,4 +1303,29 @@ mod tests {
        model.deltas(&point_seqs).unwrap()
    );
}

#[test]
fn model_with_no_default_location() {
The base master's location is at 0.0 (in normalized coordinates) by definition; otherwise it is simply not the base/default/neutral master. A VariationModel must have a base master, and it has to be at 0.0; without one the model is just invalid/incomplete. What's this, and what does it have to do with variable kerning FEA?
/// locations we observe. We also use different `ModelConstraints` than glyf: we don't require
/// the effect at the default location to be zero.
We also use different `ModelConstraints` than glyf: we don't require the effect at the default location to be zero.
sorry but that's wrong. The effect at the default location is always 0 throughout the OT Variations, it's not about glyf or GPOS. The variable scalars in FEA must always specify a value for the default master's location, there's no exception here.
}

// For any kern that is incompletely specified fill in the missing values using UFO kerning lookup
// Not 100% sure if this is correct for .glyphs but lets start there
I would say yes, not least because fontmake has compiled .glyphs files by turning them into UFOs first for a long time now, and if Glyphs used a different kerning lookup algorithm than UFOs we would have known by now.
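For reference, the UFO kerning value lookup with group fallback can be sketched roughly like this. This is a simplified illustration (a single glyph-to-group map stands in for the separate side-1/side-2 kerning groups real sources have), not fontc's implementation:

```rust
use std::collections::HashMap;

/// Sketch of UFO kerning value lookup: try the glyph/glyph pair first,
/// then fall back through group combinations.
fn lookup_kern(
    kerning: &HashMap<(&str, &str), f64>,
    groups: &HashMap<&str, &str>, // glyph -> kerning group (simplified)
    first: &str,
    second: &str,
) -> Option<f64> {
    let g1 = groups.get(first).copied();
    let g2 = groups.get(second).copied();
    // UFO lookup order: glyph/glyph, glyph/group, group/glyph, group/group
    let candidates = [
        (Some(first), Some(second)),
        (Some(first), g2),
        (g1, Some(second)),
        (g1, g2),
    ];
    for (a, b) in candidates {
        if let (Some(a), Some(b)) = (a, b) {
            if let Some(v) = kerning.get(&(a, b)) {
                return Some(*v);
            }
        }
    }
    None
}

fn main() {
    let kerning = HashMap::from([
        (("public.kern1.O", "public.kern2.A"), -40.0),
        (("Q", "A"), -60.0), // glyph/glyph exception overrides the group kern
    ]);
    let groups = HashMap::from([
        ("O", "public.kern1.O"),
        ("Q", "public.kern1.O"),
        ("A", "public.kern2.A"),
    ]);
    assert_eq!(lookup_kern(&kerning, &groups, "O", "A"), Some(-40.0));
    assert_eq!(lookup_kern(&kerning, &groups, "Q", "A"), Some(-60.0));
    assert_eq!(lookup_kern(&kerning, &groups, "X", "Y"), None);
    println!("ok");
}
```

The glyph/glyph entry winning over the group kern is what makes kerning exceptions work; fontmake's UFO-first handling of .glyphs files suggests the same order applies to them.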
// For any kern that is incompletely specified fill in the missing values using UFO kerning lookup
// Not 100% sure if this is correct for .glyphs but lets start there
// Generate variable format kerning per https://github.com/adobe-type-tools/afdko/pull/1350
// Use design values per discussion on https://github.com/harfbuzz/boring-expansion-spec/issues/94
I understand that a consensus is building on using design coordinates but at least the current feaLib implementation expects user coordinates for variable scalars in FEA, and there might be code/fonts out there relying on this
I see below that normalized coordinates are being used (with the proposed `n` suffix), so maybe the comment referring to "design" coordinates is stale and needs to be updated.
Correct, IIRC this code was written in the interval where I wasn't aware suffixes were coming.
    .join(",")
});

// TODO can we skip some values by dropping where value == interpolated value?
You'd have to construct a new temporary model with a given master omitted, interpolate at that location, and compare the original with the interpolated value; that might become expensive to do for many kerns × masters. We don't currently do any of that in the python pipeline; source masters are always retained even if potentially redundant.
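The leave-one-out check described above might look like this along a single axis (a deliberately naive sketch with hypothetical helper names; a real check would rebuild the temporary VariationModel per omitted master, which is where the cost comes from):

```rust
// Linear interpolation between two (location, value) masters on one axis.
fn lerp(at: f64, (la, va): (f64, f64), (lb, vb): (f64, f64)) -> f64 {
    va + (at - la) / (lb - la) * (vb - va)
}

// A master is redundant if omitting it and interpolating at its location
// from its neighbours reproduces its value.
fn is_redundant(masters: &[(f64, f64)], i: usize) -> bool {
    // only interior masters with a neighbour on each side can be dropped
    if i == 0 || i + 1 == masters.len() {
        return false;
    }
    let (loc, val) = masters[i];
    (lerp(loc, masters[i - 1], masters[i + 1]) - val).abs() < 1e-9
}

fn main() {
    // wght -1.0, 0.0, 1.0 with values 10, 15, 20: the middle master lies
    // exactly on the line between its neighbours, so it carries no information.
    let masters = [(-1.0, 10.0), (0.0, 15.0), (1.0, 20.0)];
    assert!(is_redundant(&masters, 1));
    assert!(!is_redundant(&[(-1.0, 10.0), (0.0, 18.0), (1.0, 20.0)], 1));
    println!("ok");
}
```

Doing this per kern and per master multiplies that interpolation cost, which is presumably why the python pipeline never bothered.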
If we let people hand-write variable value records I think we have to support building new variation models in case they write a rule that references a point that isn't aligned with any master. To be fair, we could kick that out for now as I don't believe current compilation or sources allow this.
let static_metadata = weight_variable_static_metadata(300.0, 400.0, 700.0);
let var_info = FeaVariationInfo::new(&static_metadata);

// Find me the default value despite that not being one of the input locations
This example won't compile in fonttools because, as I said, a value for the base master (at the default location) is always required; you'll probably get "base master not found". What you seem to be doing here is like a naive version of the L4 instancer (moving the default). It may seem to make sense for this simple case with one axis and two masters, but it will blow up as soon as you make it more complicated. I'll repeat: the general rule in OT Variations is that the default is never sparse. It always densely contains all the values; it defines the skeleton of the VF, what's left when you take away all the variation tables; it's the reference point for all deltas. The default master cannot be left undefined, for it influences all the regions of the designspace (whose deltas are defined in terms of it). Therefore the font developer must know exactly, and take responsibility for specifying, where/what that looks like; it should not be left to the compiler to guess.
I understand that in the binary it must be dense at 0n. If we differ in opinion it is wrt whether it follows that the source needs to be dense at default. IMHO it doesn't, it should be fine to have a source with say weight 300-700, default 400, with source defining only two outlines, at 300 and 700, for a glyph.
Ultimately IMHO in source you just need to provide a definition that spans default. That is, provide at least one of:
- A definition at default
  - if there are no other definitions the glyph won't vary
- Definitions at 2 or more locations that define a region that encompasses 0n
  - the compiler will figure out what the value at 0n should be and the deltas to give the expected results
I'm with Cosimo here. What you are doing, resolving the value at the default location and reinserting it into the set of masters, breaks the modeling. As Cosimo points out, to do it correctly you need the full L4 instancing logic. Here's an example. Imagine we have two axes, A and B, both running from 0 to 1, and three kerns: at 0,0=10; 0,1=20; and 1,0=30. In this model, if the first master is picked as the base master for modeling purposes, the interpolated kern at 1,1 comes out as 40. That's desirable. Now if you interpolate the kern at .5,.5 you get 25. If you insert that as your new default location/value, then suddenly the kern you get at 1,1 drops to 25, because the rest of the masters are inactive at 1,1: they are in different quadrants. That's just how the OT model works.
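The two-axis example above can be reproduced with a tiny re-implementation of OT region support scalars. A sketch only: it assumes well-formed regions with nonzero peaks, and the names are illustrative, not fontc's API:

```rust
use std::collections::HashMap;

// A region is (lower, peak, upper) per axis; axes absent from the region
// impose no restriction (scalar 1 along them).
type Region = HashMap<&'static str, (f64, f64, f64)>;

// Simplified OT support scalar: how much a region applies at a location.
fn support_scalar(loc: &HashMap<&'static str, f64>, region: &Region) -> f64 {
    let mut scalar = 1.0;
    for (axis, &(lower, peak, upper)) in region {
        let v = *loc.get(axis).unwrap_or(&0.0);
        if v == peak {
            continue;
        }
        if v <= lower || v >= upper {
            return 0.0;
        }
        scalar *= if v < peak {
            (v - lower) / (peak - lower)
        } else {
            (upper - v) / (upper - peak)
        };
    }
    scalar
}

fn interpolate(loc: &HashMap<&'static str, f64>, deltas: &[(Region, f64)]) -> f64 {
    deltas.iter().map(|(r, d)| support_scalar(loc, r) * d).sum()
}

fn main() {
    let at = |a: f64, b: f64| HashMap::from([("A", a), ("B", b)]);

    // Masters 0,0=10; 0,1=20; 1,0=30 with 0,0 as base: deltas are
    // 10 (base), +10 along B, +20 along A.
    let model = vec![
        (Region::new(), 10.0),
        (Region::from([("B", (0.0, 1.0, 1.0))]), 10.0),
        (Region::from([("A", (0.0, 1.0, 1.0))]), 20.0),
    ];
    assert_eq!(interpolate(&at(1.0, 1.0), &model), 40.0); // corner deltas add up
    assert_eq!(interpolate(&at(0.5, 0.5), &model), 25.0);

    // Reinsert 25 as a new default at .5,.5: after renormalizing, the three
    // original masters sit in negative quadrants, so they are all inactive
    // at the renormalized 1,1 corner and the kern there collapses to 25.
    let neg = (-1.0, -1.0, 0.0);
    let pos = (0.0, 1.0, 1.0);
    let moved = vec![
        (Region::new(), 25.0),
        (Region::from([("A", neg), ("B", neg)]), -15.0),
        (Region::from([("A", neg), ("B", pos)]), -5.0),
        (Region::from([("A", pos), ("B", neg)]), 5.0),
    ];
    assert_eq!(interpolate(&at(1.0, 1.0), &moved), 25.0); // was 40
    assert_eq!(interpolate(&at(-1.0, -1.0), &moved), 10.0); // old masters survive
    println!("ok");
}
```

The last two assertions show the failure mode: the original corner values are preserved, but the far corner silently loses its interpolated 40.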
Even then, you still need to have or select some default to begin with before you can move it somewhere else, with the consequences in terms of additional masters that this entails.
Correct. You'd get different interpolation results based on which master you pick as base.
IIUC this is exclusively due to the zero influence at 0n rule. OK, fine: make a non-OT model (one that doesn't zero at zero) and use that to figure out an OT model that gives the desired results. I wouldn't claim the WIP does that correctly, it's just exploratory hackery, but I think there might be a path to some interesting capabilities.
Current naive output akin to:
NOTE TO SELF: make sure we have a test where the locations in the conditionset are not identical.