
Error "Something went wrong" when sending code completion request to Ollama instance #684

Closed
Majroch opened this issue Sep 9, 2024 · 9 comments
Labels
bug Something isn't working

Comments

@Majroch

Majroch commented Sep 9, 2024

What happened?

After updating to the new version of the plugin in PhpStorm, an error is thrown whenever I write something in the editor with code completion enabled. The same happens in the Chat window.

The only exception is fetching the model list from the instance: that request is sent and a valid list of available models is returned.

Relevant log output or stack trace

Something went wrong

java.net.ConnectException
	at java.net.http/jdk.internal.net.http.HttpClientImpl.send(HttpClientImpl.java:951)
	at java.net.http/jdk.internal.net.http.HttpClientFacade.send(HttpClientFacade.java:133)
	at ee.carlrobert.llm.client.ollama.OllamaClient.processStreamRequest(OllamaClient.java:222)
	at ee.carlrobert.llm.client.ollama.OllamaClient.lambda$getCompletionAsync$2(OllamaClient.java:79)
	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.run(CompletableFuture.java:1804)
	at java.base/java.util.concurrent.CompletableFuture$AsyncRun.exec(CompletableFuture.java:1796)
	at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:507)
	at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1491)
	at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:2073)
	at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:2035)
	at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:187)
Caused by: java.net.ConnectException
	at java.net.http/jdk.internal.net.http.common.Utils.toConnectException(Utils.java:1028)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.connectAsync(PlainHttpConnection.java:227)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.checkRetryConnect(PlainHttpConnection.java:280)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.lambda$connectAsync$2(PlainHttpConnection.java:238)
	at java.base/java.util.concurrent.CompletableFuture.uniHandle(CompletableFuture.java:934)
	at java.base/java.util.concurrent.CompletableFuture$UniHandle.tryFire(CompletableFuture.java:911)
	at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:510)
	at java.base/java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1773)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
	at java.base/java.lang.Thread.run(Thread.java:1583)
Caused by: java.nio.channels.ClosedChannelException
	at java.base/sun.nio.ch.SocketChannelImpl.ensureOpen(SocketChannelImpl.java:202)
	at java.base/sun.nio.ch.SocketChannelImpl.beginConnect(SocketChannelImpl.java:786)
	at java.base/sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:874)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.lambda$connectAsync$1(PlainHttpConnection.java:210)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:571)
	at java.net.http/jdk.internal.net.http.PlainHttpConnection.connectAsync(PlainHttpConnection.java:212)
	... 9 more

Steps to reproduce

  1. Install the newest update
  2. Enable code completion with an Ollama instance configured
  3. Write something in the Chat or in an open file

CodeGPT version

2.11.0-241.1

Operating System

Linux

@Majroch Majroch added the bug Something isn't working label Sep 9, 2024
@Kenterfie

The problem currently prevents any use of this plugin, and downgrading no longer works because of the new IntelliJ version.

@carlrobertoh
Owner

It looks like the host can't be overridden since the last version. So, if your Ollama server is running on a port other than 11434, the connection will fail.

I will fix this in the next release. As a workaround, please run the server on the default port (11434).
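To illustrate the class of bug being described (a host override that never reaches the request), here is a minimal sketch of how a configured base URL would typically be resolved before building the request URI. `OllamaEndpoint` and its method names are hypothetical, not the plugin's actual code; `/api/generate` is Ollama's completion endpoint.

```java
// Illustrative sketch only: resolve a user-configured base URL instead of a
// hard-coded default host. Class and method names are hypothetical.
import java.net.URI;

class OllamaEndpoint {
    private static final String DEFAULT_BASE_URL = "http://localhost:11434";

    // Fall back to the default host/port only when no override is configured,
    // and strip trailing slashes before appending the API path.
    static URI completionUri(String configuredBaseUrl) {
        String base = (configuredBaseUrl == null || configuredBaseUrl.isBlank())
                ? DEFAULT_BASE_URL
                : configuredBaseUrl.replaceAll("/+$", "");
        return URI.create(base + "/api/generate");
    }
}
```

If the default base URL is used unconditionally, any instance on a non-default host or port (such as the setups reported below) will fail with a `ConnectException` like the one in the stack trace.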

@Kenterfie

This cannot be the reason; in my case, the port is already the default one.

My current setting is http://llm.local.net:11434

@carlrobertoh
Owner

carlrobertoh commented Sep 11, 2024

It looks like the host can't be overridden

The same issue affects your use case as well: 6b7e26

@Majroch
Author

Majroch commented Sep 12, 2024

It looks like the host can't be overridden since the last version. So, if your Ollama server is running on a port other than 11434, the connection will fail.

OK, my instance is on port 443, so that could be it. I use Nginx as a proxy between the Ollama instance and PhpStorm.

I'll wait for the new release and check whether it works :)

@tomsykes

I've just tried installing CodeGPT and ran into this issue. My Ollama instance is hosted remotely, and I was getting the "something went wrong" message.

I've set up an SSH tunnel (from localhost:11434 to remote:port) to work around it for now.
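For reference, a tunnel like the one described can be opened along these lines; `user`, `remote-host`, and the remote port are placeholders, not values from this thread:

```shell
# Forward local port 11434 to the Ollama port on the remote machine,
# so the plugin can keep talking to localhost:11434.
# Replace user, remote-host, and 11434 after "localhost:" with your own values.
ssh -N -L 11434:localhost:11434 user@remote-host
```

The `-N` flag keeps the session open without running a remote command; the tunnel lasts as long as the SSH connection does.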

@carlrobertoh
Owner

This issue should be fixed in the latest version (2.11.1). Please reopen the ticket if it is still reproducible.

@Majroch
Author

Majroch commented Sep 13, 2024

I can confirm: everything works after updating to 2.11.1 :D

@jwsims

jwsims commented Sep 13, 2024

I'm not sure if this is related to the fix or not, but has anyone else noticed that Ollama ignores the timeout setting in Advanced Settings?
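One plausible cause worth checking (an assumption, not confirmed by this thread): with `java.net.http`, a timeout must be attached to each `HttpRequest` (and/or a connect timeout on the `HttpClient`); a request built without one simply never times out, so a configured setting would be silently ignored. `TimeoutExample` below is an illustrative sketch, not the plugin's actual wiring:

```java
// Illustrative sketch: attaching a user-configured timeout to a request.
// If the builder chain omits .timeout(...), the setting has no effect.
import java.net.URI;
import java.net.http.HttpRequest;
import java.time.Duration;

class TimeoutExample {
    static HttpRequest withTimeout(URI uri, long timeoutSeconds) {
        return HttpRequest.newBuilder(uri)
                .timeout(Duration.ofSeconds(timeoutSeconds))
                .GET()
                .build();
    }
}
```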

