Process for releasing registered but apparently unused names for reuse #1487
So, we don't currently have a real process for this. It's pretty ad hoc at the moment. It appears pyspark is registered to user
Hmm, no, I don't recognize that name. @holdenk, do you?
It's close to the name of the person who created SPARK-1267 back in 2014 about adding pip installability to Spark. I can try and see if I can get in touch with them through JIRA (though it's fine if we end up needing to use apache-spark anyway).
If y'all don't know the name, I can poke them via the email address they have on file in PyPI's DB and see if they'll give up the name. I just figured if you knew them it'd be easiest that way first.
@holdenk - I've contacted that person on JIRA to see if it's them: SPARK-18128.
Is it possible this person is the same as @prabinb? That user has a fork of https://github.com/apache/spark.
It sounds like that user came through on JIRA and released the registration. @holdenk - Shall we close this?
Sounds good :)
What's the process for investigating whether a name registered on PyPI can be released for reuse by another party? Do we file an issue here, submit something via the feedback form, or something else?
To provide context, some contributors to the Apache Spark project are gearing up to package the project for distribution via PyPI. We're considering publishing under the name `pyspark`, since that's the name of the Python API for Apache Spark, and it's how the interactive PySpark shell is invoked. However, it appears that `pyspark` is already registered.
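For reference, here is a minimal sketch of what publishing under that name looks like on the packaging side, assuming a standard setuptools layout. The version, description, and Python requirement below are illustrative placeholders, not the actual Apache Spark packaging configuration:

```python
# Illustrative setup.py only: shows the PyPI project name under discussion.
# All metadata values are placeholders, not Spark's real packaging config.
from setuptools import setup, find_packages

setup(
    name="pyspark",            # the PyPI name this issue asks to have released
    version="0.0.0",           # placeholder version
    description="Python API for Apache Spark (placeholder metadata)",
    packages=find_packages(),  # would pick up the pyspark package directory
    python_requires=">=2.7",   # assumption; the actual requirement may differ
)
```

Once the registration is released, an sdist built from metadata like this and uploaded to PyPI would claim the name, which is why the Spark contributors want the question settled before their first release.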