
Vagrant: Failed to connect to VM! Failed to boot? #63

Closed
chriskottom opened this issue May 12, 2010 · 14 comments

@chriskottom

At times it seems to be impossible to bring up a Vagrant box using what should be a relatively vanilla Vagrantfile. Is this a problem that other people are also having? Is it a solved problem?

Base OS: Ubuntu Lucid Desktop
Box OS: Ubuntu Lucid Server
VirtualBox 3.1.6 r59338

Vagrantfile:
Vagrant::Config.run do |config|
  config.vm.box = "base_lucid"
end

Vagrant::Config.run do |config|
  config.vm.forward_port("web", 80, 4567)
end

Vagrant::Config.run do |config|
  config.vm.provisioner = :chef_solo
  config.chef.cookbooks_path = "cookbooks"
end
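
(For what it's worth, Vagrant merges multiple Config.run blocks, so the three blocks above should be equivalent to this single block:)

Vagrant::Config.run do |config|
  config.vm.box = "base_lucid"
  config.vm.forward_port("web", 80, 4567)
  config.vm.provisioner = :chef_solo
  config.chef.cookbooks_path = "cookbooks"
end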

Output from "vagrant up":
[INFO 05-12-2010 06:16:24] Vagrant: Provisioning enabled with Vagrant::Provisioners::ChefSolo
[INFO 05-12-2010 06:16:24] Vagrant: Importing base VM (/home/ck1/.vagrant/boxes/base_lucid/box.ovf)...
[INFO 05-12-2010 06:18:11] Vagrant: Persisting the VM UUID (dc164fff-eeb8-40a8-84c6-5b76f7b3ba44)...
[INFO 05-12-2010 06:18:12] Vagrant: Matching MAC addresses...
[INFO 05-12-2010 06:18:12] Vagrant: Running any VM customizations...
[INFO 05-12-2010 06:18:13] Vagrant: Deleting any previously set forwarded ports...
[INFO 05-12-2010 06:18:13] Vagrant: Forwarding ports...
[INFO 05-12-2010 06:18:13] Vagrant: Forwarding "ssh": 22 => 2222
[INFO 05-12-2010 06:18:13] Vagrant: Forwarding "web": 80 => 4567
[INFO 05-12-2010 06:18:13] Vagrant: Clearing previously set shared folders...
[INFO 05-12-2010 06:18:14] Vagrant: Creating shared folders metadata...
[INFO 05-12-2010 06:18:14] Vagrant: Booting VM...
[INFO 05-12-2010 06:18:15] Vagrant: Waiting for VM to boot...
[INFO 05-12-2010 06:18:15] Vagrant: Trying to connect (attempt #1 of 10)...
[INFO 05-12-2010 06:18:20] Vagrant: Trying to connect (attempt #2 of 10)...
[INFO 05-12-2010 06:18:55] Vagrant: Trying to connect (attempt #3 of 10)...
[INFO 05-12-2010 06:19:30] Vagrant: Trying to connect (attempt #4 of 10)...
[INFO 05-12-2010 06:20:05] Vagrant: Trying to connect (attempt #5 of 10)...
[INFO 05-12-2010 06:20:40] Vagrant: Trying to connect (attempt #6 of 10)...
[INFO 05-12-2010 06:21:15] Vagrant: Trying to connect (attempt #7 of 10)...
[INFO 05-12-2010 06:21:50] Vagrant: Trying to connect (attempt #8 of 10)...
[INFO 05-12-2010 06:22:25] Vagrant: Trying to connect (attempt #9 of 10)...
[INFO 05-12-2010 06:23:00] Vagrant: Trying to connect (attempt #10 of 10)...
[INFO 05-12-2010 06:23:35] Vagrant: Failed to connect to VM! Failed to boot?
=====================================================================
Vagrant experienced an error!

Failed to connect to VM! Failed to boot?
=====================================================================
@mitchellh
Contributor

Can you try with the "base" box? http://files.vagrantup.com/base.box
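
That is, something like:

vagrant box add base http://files.vagrantup.com/base.box

and then point config.vm.box at "base" in your Vagrantfile.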

Another thing you can try is setting config.vm.boot_mode = "gui", which will boot with an open GUI window so you can see if there is a kernel panic (which I'm certain there is).
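
For example, a minimal sketch (keep your other settings; only the boot_mode line is new):

Vagrant::Config.run do |config|
  config.vm.box = "base_lucid"
  # Boot with a visible VirtualBox window instead of headless mode
  config.vm.boot_mode = "gui"
end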

The Lucid box has caused trouble before on Ubuntu-based hosts, so I have a feeling that's what is causing it.

Thanks.

@jasherai

I was having the same issue, and after using the GUI mode I saw that the box VM was booting correctly.

I narrowed it down to too many authentication failures while trying to log in with my existing SSH keys (found by running vagrant ssh to see what the SSH errors were).

My issue was some weird conflict with the built-in GNOME SSH agent on the host OS (Lucid). I managed to resolve it by turning off the "ssh key agent" program in Startup Applications.

Can I suggest that we set ForwardAgent no in the SSH config and also pass the -a switch on the ssh Kernel.exec line?
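
For example, to test the same thing by hand (assuming the standard "vagrant" user and the forwarded port 2222 from the log above):

ssh -a -p 2222 vagrant@localhost    # -a disables agent forwarding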

This is my tuppence, but I hope it resolves it for you.
jasherai

@mitchellh
Contributor

jasherai,

Interesting. I don't see any reason why we can't disable agent forwarding. Are you certain that this is what fixed the issue?

Additionally, can you verify that your ~/.ssh folder is chmodded to 0600? I had an issue where anything other than 0600 caused the "too many failures" error.
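
Something like the following should sort the permissions out (note that the directory itself conventionally needs 0700 so it stays traversable, and id_rsa below stands in for whatever key file you use):

chmod 700 ~/.ssh           # directory: owner-only, keeps the execute bit
chmod 600 ~/.ssh/id_rsa    # key files: owner read/write only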

Mitchell

@chriskottom
Author

I've been looking at the issue for a couple of days, and my problems seem to be caused by the VirtualBox DHCP server not allocating an IP address to the VM. Even though this doesn't seem to occur for other, non-Vagrant VMs I'm using, the root of the problem seems unrelated to Vagrant. I will keep looking into it, though, and post whatever I find.

Snippet from healthy log file:

00:00:14.757 NAT: IPv6 not supported
00:00:14.498 Display::handleDisplayResize(): uScreenId = 0, pvVRAM=adcc7000 w=640 h=480 bpp=0 cbLine=0x140
00:00:15.223 NAT: DHCP offered IP address 10.0.2.15
00:00:15.224 NAT: DHCP offered IP address 10.0.2.15
00:00:15.832 SharedFolders host service: connected, u32ClientID = 1
00:00:15.982 PATM: patmR3RefreshPatch: succeeded to refresh patch at c0175d50 
00:00:16.215 PATM: patmR3RefreshPatch: succeeded to refresh patch at c01333a0 
00:00:23.500 PATM: patmR3RefreshPatch: succeeded to refresh patch at c015b500 
00:00:23.542 SharedFolders host service: request to map folder v-csc-0
00:00:23.542 SharedFolders host service: map operation result VINF_SUCCESS.
00:00:23.542     Mapped to handle 0.
00:00:23.679 SharedFolders host service: request to map folder v-root
00:00:23.680 SharedFolders host service: map operation result VINF_SUCCESS.
00:00:23.680     Mapped to handle 1.

Snippet from unhealthy log file:

00:00:14.757 NAT: IPv6 not supported
00:00:14.836 Display::handleDisplayResize(): uScreenId = 0, pvVRAM=adc1d000 w=640 h=480 bpp=0 cbLine=0x140
00:00:15.996 SharedFolders host service: connected, u32ClientID = 1
00:00:16.344 PATM: patmR3RefreshPatch: succeeded to refresh patch at c01333a0
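
(For comparison, a quick way to check whether a given run ever got a lease is to search that machine's VirtualBox log for the offer line; the path below assumes VirtualBox 3.x defaults, and <vm name> stands in for the imported machine's name:)

grep "DHCP offered" ~/.VirtualBox/Machines/<vm name>/Logs/VBox.log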

@mitchellh
Contributor

Strange! I've never heard of this issue occurring, and of course I can't reproduce it, so I can't offer much help here. I may have asked this before, but you've tried with more than one base box, correct? I just want to make sure this isn't something isolated to the "base_lucid" box somehow.

@eignerchris

Using the 'getting_started' base box and seeing the 10 failed connections as well. Booted with config.vm.boot_mode = "gui" and confirmed the box starts. I'm not very familiar with VirtualBox network troubleshooting, but I can tell you that I can't ping anything from inside the VM.

@mitchellh
Contributor

eignerchris,

You're probably hitting the same DHCP issue, or something similar. I haven't figured out what is causing this yet. In the meantime, could you please post your OS, Ruby version (ruby -v), and Vagrant version (vagrant -v)?

Thanks

@eignerchris

OS: Darwin host-246-74.pubnet.pdx.edu 10.3.0 Darwin Kernel Version 10.3.0: Fri Feb 26 11:58:09 PST 2010; root:xnu-1504.3.12~1/RELEASE_I386 i386

ruby: ruby 1.8.6 (2010-01-11 patchlevel 388) [i686-darwin10.2.0]

vagrant: 0.3.4

@mitchellh
Contributor

Is this issue still occurring? Sorry, it's been a while since an update, and I wanted to verify what's going on here.

@eignerchris

Not seeing this behavior anymore. Now just seeing an exception about JSON...

eignerchris: ~/Development/test_boxes/excelsior $ vagrant init
eignerchris: ~/Development/test_boxes/excelsior $ vagrant up
[default] Creating VM 'default'
[default] Importing base VM (/Users/eignerchris/.vagrant/boxes/base/box.ovf)...
[default] Persisting the VM UUID (fb393071-9348-4d28-a477-c823caa76fe5)...
/opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/environment.rb:324:in `to_json': uninitialized constant JSON::SAFE_STATE_PROTOTYPE (NameError)
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/environment.rb:324:in `update_dotfile'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/environment.rb:323:in `open'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/environment.rb:323:in `update_dotfile'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/environment.rb:311:in `update_dotfile'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/actions/vm/up.rb:29:in `update_dotfile'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/actions/vm/up.rb:22:in `after_import'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/actions/runner.rb:125:in `send'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/actions/runner.rb:125:in `invoke_callback'
    ... 20 levels...
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/lib/vagrant/command.rb:13:in `execute'
    from /opt/local/lib/ruby/gems/1.8/gems/vagrant-0.4.1/bin/vagrant:7
    from /opt/local/bin/vagrant:19:in `load'
    from /opt/local/bin/vagrant:19

@mitchellh
Contributor

Ah, this is caused by conflicting versions of json. Do a gem list | grep json and make sure you remove any conflicting versions. It's generally best to just gem uninstall json and gem uninstall json_pure altogether, then reinstall json.
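
Concretely, that cleanup looks something like:

gem list | grep json        # list every installed json variant
gem uninstall json          # remove all versions when prompted
gem uninstall json_pure
gem install json            # reinstall a single clean copy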

@eignerchris

Thanks! Working now. Successfully booted a test VM.

@mitchellh
Contributor

Great, closing this ancient issue :)

@chriskottom
Author

I tested this against my previous Lucid box, and it was still failing at least some of the time, even after updating the Vagrant gem. I rolled a new base box last night (again Ubuntu Lucid) and after running about a dozen or so startup tests, I've not seen the same problem yet, so it could have been a combination of the version and the box itself. In any case, thanks for the support!

@ghost ghost locked and limited conversation to collaborators Apr 17, 2020
This issue was closed.