
An error occurred when I entered my input after running "ollama serve" and "npm run dev" #53

Open
Kyrie-LiuX opened this issue Jun 25, 2024 · 0 comments

The error output is:
```
dell@dell:/mnt/data/chatbot-ollama$ npm run dev

[email protected] dev
next dev

▲ Next.js 14.1.0

✓ Ready in 5.1s
✓ Compiled /api/models in 202ms (77 modules)
○ Compiling /_error ...
✓ Compiled /_error in 3.9s (304 modules)
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
○ Compiling / ...
✓ Compiled / in 980ms (1659 modules)
⚠ Fast Refresh had to perform a full reload. Read more: https://nextjs.org/docs/messages/fast-refresh-reload
(node:1207575) [DEP0040] DeprecationWarning: The punycode module is deprecated. Please use a userland alternative instead.
(Use node --trace-deprecation ... to show where the warning was created)
✓ Compiled /api/chat in 103ms (80 modules)
Error: aborted
    at abortIncoming (node:_http_server:806:17)
    at socketOnClose (node:_http_server:800:3)
    at Socket.emit (node:events:532:35)
    at TCP.<anonymous> (node:net:339:12) {
  code: 'ECONNRESET'
}
⨯ uncaughtException: Error: aborted
    at abortIncoming (node:_http_server:806:17)
    at socketOnClose (node:_http_server:800:3)
    at Socket.emit (node:events:532:35)
    at TCP.<anonymous> (node:net:339:12) {
  code: 'ECONNRESET'
}
⨯ uncaughtException: Error: aborted
    at abortIncoming (node:_http_server:806:17)
    at socketOnClose (node:_http_server:800:3)
    at Socket.emit (node:events:532:35)
    at TCP.<anonymous> (node:net:339:12) {
  code: 'ECONNRESET'
}
⨯ Error: failed to pipe response
    at pipeToNodeResponse (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/pipe-readable.js:111:15)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async DevServer.runEdgeFunction (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/next-server.js:1225:13)
    at async NextNodeServer.handleCatchallRenderRequest (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/next-server.js:247:37)
    at async DevServer.handleRequestImpl (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/base-server.js:807:17)
    at async /mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/dev/next-dev-server.js:331:20
    at async Span.traceAsyncFn (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/trace/trace.js:151:20)
    at async DevServer.handleRequest (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/dev/next-dev-server.js:328:24)
    at async invokeRender (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/router-server.js:163:21)
    at async handleRequest (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/router-server.js:342:24)
    at async requestHandlerImpl (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/router-server.js:366:13)
    at async Server.requestListener (/mnt/data/Psy_KGLLM_LX/chatbot-ollama/node_modules/next/dist/server/lib/start-server.js:140:13) {
  [cause]: SyntaxError: Unexpected non-whitespace character after JSON at position 97 (line 2 column 1)
      at JSON.parse (<anonymous>)
      at Object.start (webpack-internal:///(middleware)/./utils/server/index.ts:46:45)
      at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
}
```
The chatbot UI at the time of the error: (screenshot attached in the original issue)

Can you help me solve this problem? I would really appreciate it.
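For what it's worth, the `[cause]` at the bottom of the trace (`Unexpected non-whitespace character after JSON ... line 2 column 1`, thrown from `JSON.parse` inside `utils/server/index.ts:46`) usually means a raw stream chunk containing more than one newline-delimited JSON object was passed to `JSON.parse` in one call. Ollama streams its responses as NDJSON, so one network chunk can hold several objects, or an object split across two chunks. A minimal sketch of a chunk parser that buffers partial lines, assuming the handler receives raw text chunks (`parseNdjsonChunk` is a hypothetical helper name, not from the repository):

```typescript
// Assumption: Ollama streams newline-delimited JSON (NDJSON), so a single
// network chunk may contain several JSON objects, or a trailing partial one.
// Feeding the raw chunk straight to JSON.parse fails with
// "Unexpected non-whitespace character after JSON" at "line 2 column 1".

let buffer = "";

// Parse every complete JSON line in a chunk; keep the incomplete tail
// (if any) in `buffer` for the next chunk.
function parseNdjsonChunk(chunk: string): unknown[] {
  buffer += chunk;
  const lines = buffer.split("\n");
  // The last split element is either "" (chunk ended on a newline) or a
  // partial object; either way, carry it over instead of parsing it now.
  buffer = lines.pop() ?? "";
  return lines
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line));
}
```

With this approach, a chunk that ends mid-object yields only the complete objects, and the remainder is completed by the next chunk instead of crashing the stream handler.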
