
Handle DriverWrapper when scanning registered drivers in JDBCWrapper.getConnector() #147

Closed
JoshRosen wants to merge 1 commit into master from jdbc-driver-precedence-round-2

Conversation

JoshRosen
Contributor

This is a follow-up to #143; it fixes a corner-case bug in our scanning of registered JDBC drivers: we need to properly handle Spark's DriverWrapper drivers, which wrap JDBC drivers in order to make them accessible from the root classloader so that the DriverManager can find them.

A simpler, reflection-free version of this change was incorporated into apache/spark#10519.
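
For reference, the approach looks roughly like the sketch below. This is not the exact patch: the check on the DriverWrapper class name and the reflective call to its wrapped accessor are assumptions about Spark internals (the class has moved between Spark versions), which is exactly why the real change uses reflection instead of a compile-time dependency.

```scala
import java.sql.{Driver, DriverManager}
import scala.collection.JavaConverters._

// Sketch: find the registered driver matching `driverClass`, unwrapping
// Spark's DriverWrapper when necessary. The wrapper's class name and its
// `wrapped` accessor are assumptions about Spark internals, hence reflection.
def findRegisteredDriver(driverClass: String): Driver = {
  DriverManager.getDrivers.asScala.collectFirst {
    // Spark wraps drivers loaded from user JARs so that DriverManager, which
    // only consults the root classloader, can still find them.
    case d if d.getClass.getName.endsWith("DriverWrapper") &&
        d.getClass.getMethod("wrapped").invoke(d)
          .getClass.getName == driverClass => d
    // Common case: the driver was registered directly.
    case d if d.getClass.getName == driverClass => d
  }.getOrElse {
    throw new IllegalArgumentException(
      s"Did not find registered driver with class $driverClass")
  }
}
```

Matching on the wrapped driver's class name keeps user-supplied drivers (loaded via --jars or --packages) usable even though DriverManager only ever sees the wrapper.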

@JoshRosen JoshRosen added this to the 0.6.0 milestone Jan 4, 2016
@JoshRosen JoshRosen added the bug label Jan 4, 2016
@codecov-io

Current coverage is 89.18%

Merging #147 into master will not affect coverage as of 61c04da

@@            master    #147   diff @@
======================================
  Files           13      13       
  Stmts          638     638       
  Branches       140     140       
  Methods          0       0       
======================================
  Hit            569     569       
  Partial          0       0       
  Missed          69      69       

Review entire Coverage Diff as of 61c04da

Powered by Codecov. Updated on successful CI builds.

@JoshRosen
Contributor Author

Unfortunately, this is a little tricky to test in an automated fashion because we need to rig the classloaders in order to make sure that the test JDBC drivers aren't loaded by the root classloader.

As a pragmatic trade-off, I'm inclined to test this manually using spark-shell and --packages with the Postgres driver. I'll do this tomorrow.
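
As a quick sanity check during that manual test, the registered drivers can be listed from spark-shell with something like the following (just a sketch; it prints each driver's class name so you can see whether the Postgres driver appears directly or behind a DriverWrapper):

```scala
import java.sql.DriverManager
import scala.collection.JavaConverters._

// Print every driver currently registered with DriverManager. Drivers loaded
// from --jars / --packages should show up behind Spark's DriverWrapper.
DriverManager.getDrivers.asScala.foreach(d => println(d.getClass.getName))
```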

@JoshRosen
Contributor Author

I tested this manually with spark-shell, --jars, and a modified copy of this code with additional logging statements. I ran my manual tests on Spark 1.4.1, 1.5.1, and 2.0.0-SNAPSHOT, so I'm confident that the portions of the reflection code here which aren't covered by unit tests will work in those versions.

In case we don't get automated tests for this in place by the time of the next release, I'll open a PR to document the manual testing procedure and will create a checklist so that future release coordinators / maintainers know how to verify that this doesn't break.

I'm going to merge this now and will begin preparation for a new spark-redshift release tomorrow.

@JoshRosen JoshRosen closed this in 038d061 Jan 6, 2016
@JoshRosen JoshRosen deleted the jdbc-driver-precedence-round-2 branch January 6, 2016 08:41