(UndefinedFunctionError) function nil.node_count/0 is undefined (module nil is not available) #16

Closed
fdietz opened this issue Apr 28, 2017 · 7 comments


fdietz commented Apr 28, 2017

Thanks for working on this library - just what I was looking for!

I seem to run into a problem when doing a lookup, like this:

Geolix.lookup("131.231.23.11")

My guess is that it's just a configuration problem on my side. The complete error message follows:

[error] Task #PID<0.653.0> started from #PID<0.608.0> terminating
** (UndefinedFunctionError) function nil.node_count/0 is undefined (module nil is not available)
    nil.node_count()
    (geolix) lib/geolix/adapter/mmdb2/lookup_tree.ex:59: Geolix.Adapter.MMDB2.LookupTree.traverse/6
    (geolix) lib/geolix/adapter/mmdb2/database.ex:35: Geolix.Adapter.MMDB2.Database.lookup/5
    (elixir) lib/task/supervised.ex:85: Task.Supervised.do_apply/2
    (elixir) lib/task/supervised.ex:36: Task.Supervised.reply/5
    (stdlib) proc_lib.erl:247: :proc_lib.init_p_do_apply/3
Function: #Function<2.65470077/0 in Geolix.Server.Worker.lookup_all/3>
    Args: []
** (exit) exited in: GenServer.call(#PID<0.608.0>, {:lookup, {131, 231, 23, 11}, [as: :struct, locale: :en, where: nil]}, 5000)
    ** (EXIT) an exception was raised:
        ** (UndefinedFunctionError) function nil.node_count/0 is undefined or private
            nil.node_count()
            (geolix) lib/geolix/adapter/mmdb2/lookup_tree.ex:59: Geolix.Adapter.MMDB2.LookupTree.traverse/6
            (geolix) lib/geolix/adapter/mmdb2/database.ex:35: Geolix.Adapter.MMDB2.Database.lookup/5
            (elixir) lib/task/supervised.ex:85: Task.Supervised.do_apply/2
            (elixir) lib/task/supervised.ex:36: Task.Supervised.reply/5
            (stdlib) proc_lib.erl:247: :proc_lib.init_p_do_apply/3
     (elixir) lib/gen_server.ex:737: GenServer.call/3
    (poolboy) src/poolboy.erl:76: :poolboy.transaction/3

And this is my configuration:

config :geolix,
  databases: [
    %{
      id:      :city,
      adapter: Geolix.Adapter.MMDB2,
      source:  "./data/maxmind/geolite2-city.tar.gz"
    },
    %{
      id:      :country,
      adapter: Geolix.Adapter.MMDB2,
      source:  "./data/maxmind/geolite2-country.tar.gz"
    }
  ]

Hope that gives you enough information to help me get things running. Thanks a lot in advance!

@mneudert (Member)

The first thing I would check is whether the application was started and able to load your databases:

iex(1)> Process.whereis(Geolix.Database.Supervisor)
#PID<0.139.0>
iex(2)> GenServer.call(Geolix.Database.Loader, :registered)
[:city, :country]

Can you confirm the output of those calls?

It might be the way you have stored your databases. For compressed databases the library uses :zlib.gunzip/1 (Erlang docs). I don't think it detects your .tar.gz compression; the resulting tar file cannot be parsed and leaves you with a broken state.
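
A minimal sketch of what happens with a tarball, using the path from the config above; :zlib.gunzip/1 only strips the gzip layer, so a .tar.gz yields raw tar bytes instead of an .mmdb database:

# sketch only: inspect what gunzip actually returns for a .tar.gz
data =
  "./data/maxmind/geolite2-city.tar.gz"
  |> File.read!()
  |> :zlib.gunzip()

# for a tarball this prints the tar "ustar" magic at offset 257,
# not MMDB metadata, which is why the database ends up in a broken state
IO.inspect(binary_part(data, 257, 5))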

That should lead to an {:error, :no_metadata}, but there might be a case where this message is dropped when the application starts initially. You should, however, see it when loading the database manually:

Enum.each(Application.get_env(:geolix, :databases), &Geolix.load_database/1)

That is the problem I suspect at the moment...


fdietz commented Apr 28, 2017

Hi! And thanks for your quick response:

iex(1)> Process.whereis(Geolix.Database.Supervisor)
#PID<0.287.0>
iex(2)> GenServer.call(Geolix.Database.Loader, :registered)
[:city, :country]

And loading the database:

iex(3)> Enum.each(Application.get_env(:geolix, :databases), &Geolix.load_database/1)
:ok

It seems that I don't get any error messages.

Unfortunately, I have to leave now, but I will test again tomorrow using zip compression instead of the tar.gz. I'll keep you posted!

Thanks a lot for your assistance! And have a nice evening!


mneudert commented Apr 28, 2017

Err, sorry, my bad -.-

That should have been Enum.map, as Enum.each does not return the results of the individual iterations. So please try that again :D
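
For reference, a minimal sketch of the corrected call; the example return values are assumptions about what a failed load could look like:

# Enum.map returns the per-database results instead of discarding them
Application.get_env(:geolix, :databases)
|> Enum.map(&Geolix.load_database/1)
# e.g. [{:error, :no_metadata}, {:error, :no_metadata}] if the archives cannot be parsed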

I will do some digging myself to see if I can reproduce this behaviour and maybe get a fix up and running.


fdietz commented Apr 29, 2017

Missed the Enum.each, too ;-) Using Enum.map showed an actual error.

I got it working by extracting the archive and using the "city.mmdb" file directly.

So, my initial mistake was to download the files from here:
http://dev.maxmind.com/geoip/geoip2/geolite2/

For the city and country:
http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz
http://geolite.maxmind.com/download/geoip/database/GeoLite2-Country.tar.gz

And I was using these archives directly with geolix, assuming that's the way to go. They don't work because they contain a directory with a readme and the db file.

Additionally, I have now tried to zip and also to tar.gz the "city.mmdb" file; neither works. How do you compress the files yourself to make it work?

What would you suggest when using your library on Heroku? I assume it won't work because of Heroku's ephemeral filesystem. Would it be possible to make your library download the db files from somewhere and keep them in memory instead?

@mneudert (Member)

The library swallowing the error (probably {:error, :no_metadata}?) is something I have an upcoming fix for. At least then one would notice there is something wrong with the database.


But MaxMind having changed their download links (and compression/archive format) is a different matter with its own implications...

Back when the downloads were last changed there was a real "pure gzip" download location: just the database file, just gzipped, directly usable with in-memory decompression. A tarball (and the new ones especially) will never work that way because it contains folders and multiple files.

So there are two ways for the database to be used:

  1. Plain gzip compression. Only the .mmdb file and nothing more. And real gzip, not zip or something else (see the sketch after this list).
  2. Just don't compress it :)
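
For the first option, a minimal sketch of re-compressing the extracted database with plain gzip; the paths are assumptions:

# gzip just the .mmdb file so :zlib.gunzip/1 can handle it on load
mmdb = File.read!("/path/to/GeoLite2-City.mmdb")
File.write!("./data/maxmind/GeoLite2-City.mmdb.gz", :zlib.gzip(mmdb))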

The second option might be tricky to configure, however. As the path in the extracted archive seems to change with the "last build date", you might need some shell foo to get a predictable location (if not using an auto-updater from your system package manager):

# not really tested, probably there are easier ways
# also depends on the "GNU" version of commands like sort
find . -name 'GeoLite2-City.mmdb' \
  | sort -rV \
  | head -1 \
  | xargs -I {} cp {} /path/to/GeoLite2-City.mmdb

That should find the newest file (by the date in the folder name) and copy the database to a custom location. Something like that in your Procfile might be enough to keep things going, and it should keep the file available whenever you update your software or move to a different dyno.


The only way of using "direct download" files is by providing a URL for them (see Remote Files Configuration). You could use S3 or some other storage of yours for that. With the changed downloads you would need to manage the compression yourself.
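
A minimal sketch of such a configuration; the URL is a placeholder for self-managed storage and assumes a plain-gzipped .mmdb file:

config :geolix,
  databases: [
    %{
      id:      :city,
      adapter: Geolix.Adapter.MMDB2,
      # placeholder URL, not a real download location
      source:  "https://example-bucket.s3.amazonaws.com/GeoLite2-City.mmdb.gz"
    }
  ]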

Anything more would involve some real package management, and that is somewhat out of scope here. At least out of scope for the main library; there might be a use for a dedicated "download and update database" application...

@mneudert (Member)

After digging through the hex code there might be a way to handle tarballs in memory. So there might be an easy way out :)
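
A minimal sketch of what in-memory tarball handling could look like via :erl_tar; the path is an assumption and this is not the library's actual implementation:

tgz = File.read!("./data/maxmind/GeoLite2-City.tar.gz")

# :memory keeps the extracted files as binaries instead of writing them to disk
{:ok, files} = :erl_tar.extract({:binary, tgz}, [:memory, :compressed])

# pick the .mmdb entry out of the extracted file list
mmdb =
  Enum.find_value(files, fn {name, data} ->
    if String.ends_with?(List.to_string(name), ".mmdb"), do: data
  end)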


fdietz commented Apr 30, 2017

Great that you found the swallowed error!

I've tested it again and used gzip on the mmdb files directly and everything works as expected now! Awesome!

After some more research, I found that some people use Heroku buildpacks, for example: https://github.com/Shopify/heroku-buildpack-geoip. The buildpack then downloads and extracts everything. You just have to make sure you deploy on a regular basis so the buildpack re-downloads a fresh db.
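
A possible configuration for the buildpack approach; the path below is purely illustrative of where a buildpack might leave the extracted database, not a documented location:

config :geolix,
  databases: [
    %{
      id:      :city,
      adapter: Geolix.Adapter.MMDB2,
      # illustrative path only; check where your buildpack actually extracts the file
      source:  "/app/vendor/geoip/GeoLite2-City.mmdb"
    }
  ]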

Totally understand that you don't want to include the update db stuff in your package 👍

I'm not 100% certain yet that an in-memory approach makes sense here. I would have to think more about it. But if it's a quick win for you that offers another option to deploy things more easily - why not ;-)

Thank you so much for assisting me here :-)
