
Add ack and compression parameters for Kafka #1359 #1712

Merged · 1 commit · Sep 12, 2019

Conversation

@chandresh-pancholi (Contributor) commented Aug 6, 2019

Which problem is this PR solving?

Short description of the changes

  • Add RequiredAck, Compressor, and Compression Level params

@chandresh-pancholi (Contributor Author)

@yurishkuro, I have made all the required changes along with the tests. Kindly review.

defaultRequiredAcks = "local"
defaultCompression = "none"
defaultCompressionLevel = -1000
suffixProtocolVersion = ".protocol-version"
Member:

please keep suffix constants together

Contributor Author:

fixed

func getCompressionLevel(compressionMode string, compressionLevel int) (int, error) {
compressionModeData := compressionModes[compressionMode]

if compressionLevel == defaultCompressionLevel {
Member:

this is confusing. I suggest if compressionLevel == 0 {, which makes it obvious that the type's default value is being converted.

Contributor Author:

This is a snippet from the sarama library; it is the reason I made defaultCompressionLevel = -1000:

// CompressionLevelDefault is the constant to use in CompressionLevel
// to have the default compression level for any codec. The value is picked
// that we don't use any existing compression levels.
const CompressionLevelDefault = -1000

Member:

that's not the point. The meaning of "default" in sarama is different from what we have here. When dealing with the CLI, if you want the default value for the respective compressor, you simply don't specify the compression-level flag. Both 0 and -1000 are invalid values for most compressors except for zstd, where BOTH are valid and neither is the default. So I don't see a reason why this bad parameter design from sarama should be carried over to our flags. 0 is the natural null value for an int field. It makes no sense to be playing tricks with -1000, which achieves the same result.

Contributor Author:

I agree with you, I will make the changes.

Contributor Author:

It's been fixed.

Member:

I prefer that we compare with 0 explicitly, not with defaultCompressionLevel (which happens to be 0), because it makes the logic easier to understand: you're replacing the zero value.
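The design settled on in this thread (treat the flag's zero value as "use the codec's own default level") can be sketched as follows. The shape of the compressionModes table and the level ranges shown here are illustrative assumptions, not the merged code:

```go
package main

import "fmt"

// compressionLevels holds a per-codec default and valid range.
// The values below mirror common gzip/zstd conventions and are
// illustrative only.
type compressionLevels struct {
	defaultLevel, minLevel, maxLevel int
}

var compressionModes = map[string]compressionLevels{
	"gzip": {defaultLevel: 6, minLevel: 1, maxLevel: 9},
	"zstd": {defaultLevel: 3, minLevel: 1, maxLevel: 22},
}

// getCompressionLevel converts the int zero value (flag not set) to the
// codec's own default, and rejects out-of-range levels otherwise.
func getCompressionLevel(mode string, level int) (int, error) {
	m, ok := compressionModes[mode]
	if !ok {
		return 0, fmt.Errorf("cannot find compression mode: %s", mode)
	}
	if level == 0 { // flag not specified: use the codec's default
		return m.defaultLevel, nil
	}
	if level < m.minLevel || level > m.maxLevel {
		return 0, fmt.Errorf("compression level %d for '%s' is not within valid range [%d, %d]",
			level, mode, m.minLevel, m.maxLevel)
	}
	return level, nil
}

func main() {
	l, err := getCompressionLevel("gzip", 0)
	fmt.Println(l, err) // 6 <nil>
	_, err = getCompressionLevel("gzip", 42)
	fmt.Println(err != nil) // true
}
```

With this shape there is no sentinel like -1000: a user who wants a codec's default simply omits the flag, and the zero value does the rest.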


// getCompressionLevel to get compression level from compression type
func getCompressionLevel(compressionMode string, compressionLevel int) (int, error) {
compressionModeData := compressionModes[compressionMode]
Member:

in the other functions you use ToLower, should it be consistent?

Contributor Author:

fixed

return compressionModeData.defaultCompressionLevel, nil
}

if compressionModeData.minCompressionLevel <= compressionLevel && compressionModeData.maxCompressionLevel >= compressionLevel {
Member:

I suggest reversing the condition and returning an error "compression level is not within valid range"

Contributor Author:

fixed

assert.Equal(t, val.compressor, sarama.CompressionSnappy)
}

func TestRequiredAcks(t *testing.T) {
Member:

I don't think this test is useful, it just tests that map[] works as expected

Contributor Author:

fixed

Member:

fixed how? what is it testing? It's just a map lookup.

Contributor Author:

My bad. I have removed it now.

require.Error(t, err)
}

func TestCompressionModes(t *testing.T) {
Member:

I don't think this test is useful, it just tests that map[] works as expected

Contributor Author:

fixed

plugin/storage/kafka/options_test.go (outdated thread, resolved)
}

if compressionModeData.minCompressionLevel > compressionLevel || compressionModeData.maxCompressionLevel < compressionLevel {
return 0, fmt.Errorf("compression level is not within valid range")
@yurishkuro (Member) commented Aug 10, 2019:

could make the error more descriptive:

fmt.Errorf("compression level %d for '%s' is not within valid range [%d, %d]", level, mode, min, max)

Contributor Author:

fixed

}

//getCompressionModes maps input modes to sarama CompressionCodec
func getCompressionModes(mode string) (sarama.CompressionCodec, error) {
Member:

s/getCompressionModes/getCompressionMode/ (it returns a singular value, not plural)

Contributor Author:

fixed
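Taken together with the earlier note about ToLower consistency, the renamed helper might look like the sketch below. CompressionCodec here is a local stand-in for sarama.CompressionCodec (an int8 enum) so the snippet runs without the sarama dependency, and the exact set of supported mode strings is an assumption:

```go
package main

import (
	"fmt"
	"strings"
)

// CompressionCodec stands in for sarama.CompressionCodec so this sketch
// is self-contained.
type CompressionCodec int8

const (
	CompressionNone CompressionCodec = iota
	CompressionGZIP
	CompressionSnappy
	CompressionLZ4
	CompressionZSTD
)

// getCompressionMode returns a single codec (hence the singular name),
// lower-casing the input for consistency with the other lookup helpers.
func getCompressionMode(mode string) (CompressionCodec, error) {
	switch strings.ToLower(mode) {
	case "none":
		return CompressionNone, nil
	case "gzip":
		return CompressionGZIP, nil
	case "snappy":
		return CompressionSnappy, nil
	case "lz4":
		return CompressionLZ4, nil
	case "zstd":
		return CompressionZSTD, nil
	default:
		return 0, fmt.Errorf("cannot find compression mode for %q", mode)
	}
}

func main() {
	c, err := getCompressionMode("Snappy")
	fmt.Println(c == CompressionSnappy, err) // true <nil>
}
```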


func TestCompressionModesFailures(t *testing.T) {
_, err := getCompressionModes("test")
require.Error(t, err)
Member:

nit: this should be assert, not require (you CAN continue the test)

Contributor Author:

fixed

codecov bot commented Aug 20, 2019

Codecov Report

❗ No coverage uploaded for pull request base (master@2f98e82).
The diff coverage is 87.23%.


@@            Coverage Diff            @@
##             master    #1712   +/-   ##
=========================================
  Coverage          ?   98.17%           
=========================================
  Files             ?      195           
  Lines             ?     9602           
  Branches          ?        0           
=========================================
  Hits              ?     9427           
  Misses            ?      137           
  Partials          ?       38
Impacted Files Coverage Δ
plugin/storage/kafka/options.go 92.2% <87.23%> (ø)

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update 2f98e82...e5582d9.

@chandresh-pancholi (Contributor Author)

@yurishkuro Kindly review.

suffixEncoding = ".encoding"
suffixRequiredAcks = ".required-acks"
suffixCompression = ".compression"
suffixCompressionLevel = ".compression.level"
Contributor:

Could you change this to compression-level? Otherwise it will cause problems when specifying the options in YAML, due to the previous option.

Contributor Author:

done
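The YAML clash being described: with the old .compression.level suffix, the compression key would have to be both a scalar value and a parent mapping, which YAML does not allow. A sketch of the two shapes (key names follow the flags in this PR):

```yaml
# Invalid: "compression" appears both as a string value and as a
# mapping with a nested "level" key, which YAML rejects.
kafka:
  producer:
    compression: gzip
    compression:
      level: 7

# Valid with the renamed flag: two distinct sibling keys.
kafka:
  producer:
    compression: gzip
    compression-level: 7
```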

plugin/storage/kafka/options.go (thread resolved)
_, err := getCompressionMode("test")
assert.Error(t, err)

_, err = getCompressionMode("temp")
Contributor:

Why test both "test" and "temp" - isn't this one redundant?

Contributor Author:

fixed it.

assert.Error(t, err)
}

func TestRequiredAcksFailures(t *testing.T) {
Contributor:

What about testing the valid values as well?

Also, same comment as above regarding "test" and "temp".

Contributor Author:

fixed it. Also added test for valid values.

@yurishkuro (Member)

NB: the code coverage drop is only from error clauses.

[screenshot: coverage report highlighting the uncovered error clauses]

@chandresh-pancholi (Contributor Author)

@yurishkuro How do you propose to test it? I am not able to figure it out.

"--kafka.producer.encoding=protobuf",
"--kafka.producer.required-acks=local",
"--kafka.producer.compression=gzip",
"--kafka.producer.compression.level=6"})
Contributor:

This should have been changed to compression-level - so I'm not sure why the unit tests seem to be passing at the moment?

Contributor Author:

Such a stupid mistake on my side. :(
Fixed and pushed it.

"--kafka.producer.encoding=protobuf",
"--kafka.producer.required-acks=local",
"--kafka.producer.compression=gzip",
"--kafka.producer.compression-level=6"})
Contributor:

You need to change the value, as 6 is the default for gzip - so regardless of whether this flag is present, the test would currently pass.

Contributor Author:

done.

@objectiser (Contributor) left a comment:

@chandresh-pancholi thanks.

For future reference: if you avoid squashing commits during the review cycles, it is easier to review the changes. The commits will be squashed anyway when the PR is merged.

@objectiser (Contributor)

@chandresh-pancholi To address the code coverage, you should create some additional tests, variations of TestOptionsWithFlags, where the flags exhibit some error values.

@chandresh-pancholi (Contributor Author)

@objectiser Thank you so much for the review. I didn't know about commit squashing. I will keep that in mind for future commits.

@chandresh-pancholi (Contributor Author)

@objectiser I agree, but how do we test log.Fatal? It's easy to reach that line, but an assertion won't help here. What's your suggestion? I came up with the code below just to increase the code coverage.

func TestOptionsWithFlagsWithInvalidRequiredAcks(t *testing.T)  {
	opts := &Options{}
	v, command := config.Viperize(opts.AddFlags)
	command.ParseFlags([]string{
		"--kafka.producer.topic=topic1",
		"--kafka.producer.brokers=127.0.0.1:9092, 0.0.0:1234",
		"--kafka.producer.encoding=protobuf",
		"--kafka.producer.required-acks=test",
		"--kafka.producer.compression=gzip",
		"--kafka.producer.compression-level=7"})
	opts.InitFromViper(v)
}

@chandresh-pancholi (Contributor Author)

Is this PR mergeable to master?

@objectiser (Contributor)

@chandresh-pancholi Good point - possibly the best way would be to return the error and handle it up the stack? From a quick look, the other storage plugins don't seem to validate the options there.

@yurishkuro Thoughts?

@jpkrohling (Contributor) left a comment:

LGTM. The coverage for the error handling inside InitFromViper could be fixed with some hacking, but not sure it brings much value, as the underlying error conditions are being tested in isolation already.

@yurishkuro (Member)

how to test log.Fatal?

Exactly why we, as a rule, don't use log except in main().
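The rule being argued for (return errors, and reserve process termination for main) could look like the sketch below. Options and the "local" ack name come from this PR; the error-returning initRequiredAcks helper and the "noack"/"all" names are assumptions modeled on sarama's RequiredAcks values, not the merged code:

```go
package main

import (
	"fmt"
	"log"
)

// Options is a trimmed stand-in for the Kafka producer options in this PR.
type Options struct {
	RequiredAcks int
}

// initRequiredAcks returns an error instead of calling log.Fatal,
// so invalid flag values can be asserted in unit tests.
func (o *Options) initRequiredAcks(acks string) error {
	switch acks {
	case "noack": // fire-and-forget (sarama.NoResponse)
		o.RequiredAcks = 0
	case "local": // wait for the local commit (sarama.WaitForLocal)
		o.RequiredAcks = 1
	case "all": // wait for all in-sync replicas (sarama.WaitForAll)
		o.RequiredAcks = -1
	default:
		return fmt.Errorf("unknown RequiredAck: %s", acks)
	}
	return nil
}

func main() {
	var o Options
	// Only main (or code close to it) should terminate on error.
	if err := o.initRequiredAcks("local"); err != nil {
		log.Fatal(err)
	}
	fmt.Println(o.RequiredAcks) // 1

	err := (&Options{}).initRequiredAcks("test")
	fmt.Println(err != nil) // true: a test can assert this instead of dying
}
```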


@yurishkuro yurishkuro changed the title Improve configuration parameters for Kafka #1359 Add ack and compression parameters for Kafka #1359 Sep 12, 2019
@yurishkuro yurishkuro merged commit 865f169 into jaegertracing:master Sep 12, 2019
@yurishkuro yurishkuro added this to the Release 1.15 milestone Nov 7, 2019
Successfully merging this pull request may close these issues: Improve configuration parameters for Kafka

4 participants