
streamlit based web_demo #117

Merged
merged 1 commit into THUDM:dev on Mar 17, 2023

Conversation

AdamBear
Contributor

Add a streamlit-based demo, web_demo2.py, for a better UI.
You need to install the streamlit and streamlit-chat packages first:

pip install streamlit
pip install streamlit-chat

Then run it with the following command:
streamlit run web_demo2.py --server.port 6006
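For reference, a minimal sketch of what such a streamlit-chat demo can look like (this is not the merged web_demo2.py; it assumes the standard model.chat() API from the THUDM/chatglm-6b model card and a recent Streamlit with st.cache_resource):

import streamlit as st
from streamlit_chat import message
from transformers import AutoModel, AutoTokenizer

@st.cache_resource  # cache the model across Streamlit reruns
def load_model():
    tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
    model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
    return tokenizer, model.eval()

tokenizer, model = load_model()

if "history" not in st.session_state:
    st.session_state.history = []

prompt = st.text_area("Input", key="input")
if st.button("Send") and prompt:
    # model.chat returns (response, updated_history); the history holds (query, reply) pairs
    _, st.session_state.history = model.chat(tokenizer, prompt, history=st.session_state.history)

# render the conversation as chat bubbles
for i, (query, reply) in enumerate(st.session_state.history):
    message(query, is_user=True, key=f"user_{i}")
    message(reply, key=f"bot_{i}")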

@duzx16 duzx16 changed the base branch from main to dev March 17, 2023 12:58
@duzx16 duzx16 merged commit 45f4566 into THUDM:dev Mar 17, 2023
@hdjx520

hdjx520 commented Mar 22, 2023

RuntimeError: Library cudart is not initialized

@ywtll

ywtll commented Mar 29, 2023

RuntimeError: Library cudart is not initialized

Install the CUDA Toolkit.
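As a quick check (a hedged sketch, assuming the standard loading code from the repository README): confirm that PyTorch actually sees a CUDA device, and fall back to the CPU path if it does not. "Library cudart is not initialized" typically means the CUDA runtime is missing or a CPU-only torch build is installed.

import torch
from transformers import AutoModel, AutoTokenizer

print(torch.cuda.is_available())  # False -> install the CUDA Toolkit or a CUDA-enabled torch build

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
if torch.cuda.is_available():
    model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
else:
    # CPU fallback from the README; float() needs roughly 32 GB of RAM
    model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).float()
model = model.eval()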

maybeluo pushed a commit to maybeluo/ChatGLM-6B that referenced this pull request Apr 2, 2023
@hellojazy

ERROR: Ignored the following versions that require a different python version: 0.0.2.1 Requires-Python >=3.8; 0.0.2.2 Requires-Python >=3.8

@andrewsysu

TypeError: Protocols cannot be instantiated

@yy835055664

Add a streamlit-based demo, web_demo2.py, for a better UI. You need to install the streamlit and streamlit-chat packages first:

pip install streamlit
pip install streamlit-chat

then run with the following command: streamlit run web_demo2.py --server.port 6006

Hello, after swapping in my fine-tuned model, no answer is generated. At runtime it warns: 'You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.'

May I ask how to correctly call the fine-tuned model?
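One possible cause is that only the P-Tuning prefix encoder was saved, so the checkpoint has to be loaded on top of the base weights. A hedged sketch following the pattern in the repository's ptuning README (CHECKPOINT_PATH and pre_seq_len below are placeholders for your own training run):

import os
import torch
from transformers import AutoConfig, AutoModel, AutoTokenizer

CHECKPOINT_PATH = "output/your-ptuning-checkpoint"  # placeholder: your fine-tuned checkpoint directory

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
config = AutoConfig.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True, pre_seq_len=128)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", config=config, trust_remote_code=True)

# load only the trained prefix-encoder weights over the base model
prefix_state_dict = torch.load(os.path.join(CHECKPOINT_PATH, "pytorch_model.bin"))
new_prefix_state_dict = {}
for k, v in prefix_state_dict.items():
    if k.startswith("transformer.prefix_encoder."):
        new_prefix_state_dict[k[len("transformer.prefix_encoder."):]] = v
model.transformer.prefix_encoder.load_state_dict(new_prefix_state_dict)

model = model.half().cuda().eval()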

@Mike2100

Mike2100 commented May 4, 2023

Hello, there is an "uncaught exception": Protocols cannot be instantiated. Could you please help me solve it?

2023-05-05 07:20:53.674 Uncaught exception
Traceback (most recent call last):
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python39\lib\site-packages\tornado\http1connection.py", line 276, in _read_message
delegate.finish()
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python39\lib\site-packages\tornado\routing.py", line 268, in finish
self.delegate.finish()
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python39\lib\site-packages\tornado\web.py", line 2395, in finish
self.execute()
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python39\lib\site-packages\tornado\web.py", line 2417, in execute
self.handler = self.handler_class(
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python39\lib\site-packages\tornado\websocket.py", line 219, in init
super().init(application, request, **kwargs)
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python39\lib\site-packages\tornado\web.py", line 214, in init
super().init()
File "C:\Users\Administrator\AppData\Local\Programs\Python\Python39\lib\typing.py", line 1083, in _no_init
raise TypeError('Protocols cannot be instantiated')
TypeError: Protocols cannot be instantiated
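This particular TypeError is usually not caused by the demo itself: it is a known clash between typing.Protocol in Python 3.9.7 and the tornado version Streamlit depends on. A hedged first step is to check both versions and then upgrade tornado (pip install -U tornado) or move to a newer Python 3.9.x patch release:

import sys
import tornado

# Python 3.9.7 + an older tornado is the combination usually reported with this error
print(sys.version)
print(tornado.version)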

@BiggieW

BiggieW commented May 19, 2023

Hi, I've encountered the following error:

SSLError: HTTPSConnectionPool(host='huggingface.co', port=443): Max retries exceeded with url: /THUDM/chatglm-6b/resolve/main/tokenizer_config.json (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: self signed certificate in certificate chain (_ssl.c:1007)')))

Adding os.environ['CURL_CA_BUNDLE'] = '' or calling requests.get('https://huggingface.co/THUDM/chatglm-6b/resolve/main/tokenizer_config.json', verify=False) didn't help.

Any help will be appreciated.
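If a proxy is injecting a self-signed certificate, a hedged workaround is to point requests/huggingface_hub at that CA bundle before transformers is imported (the path below is a placeholder), or to clone the model repository locally so no TLS connection to huggingface.co is needed:

import os
os.environ["REQUESTS_CA_BUNDLE"] = r"C:\path\to\corporate-ca.pem"  # placeholder: your proxy's CA certificate

from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda().eval()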

@wzfjesun

wzfjesun commented Jun 8, 2023

Can you write one more for P-Tuning? I don't know Python. @AdamBear

@ghost

ghost commented Jul 3, 2023

Explicitly passing a revision is encouraged when loading a model with custom code to ensure no malicious code has been contributed in a newer revision.
Traceback (most recent call last):
File "E:\pychram\python387\lib\site-packages\urllib3\connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
File "E:\pychram\python387\lib\site-packages\urllib3\connectionpool.py", line 386, in _make_request
self._validate_conn(conn)
File "E:\pychram\python387\lib\site-packages\urllib3\connectionpool.py", line 1042, in validate_conn
conn.connect()
File "E:\pychram\python387\lib\site-packages\urllib3\connection.py", line 414, in connect
self.sock = ssl_wrap_socket(
File "E:\pychram\python387\lib\site-packages\urllib3\util\ssl
.py", line 449, in ssl_wrap_socket
ssl_sock = ssl_wrap_socket_impl(
File "E:\pychram\python387\lib\site-packages\urllib3\util\ssl
.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "E:\pychram\python387\lib\ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "E:\pychram\python387\lib\ssl.py", line 1040, in _create
self.do_handshake()
File "E:\pychram\python387\lib\ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ConnectionResetError: [WinError 10054] The remote host forcibly closed an existing connection.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "E:\pychram\python387\lib\site-packages\requests\adapters.py", line 489, in send
resp = conn.urlopen(
File "E:\pychram\python387\lib\site-packages\urllib3\connectionpool.py", line 787, in urlopen
retries = retries.increment(
File "E:\pychram\python387\lib\site-packages\urllib3\util\retry.py", line 550, in increment
raise six.reraise(type(error), error, _stacktrace)
File "E:\pychram\python387\lib\site-packages\urllib3\packages\six.py", line 769, in reraise
raise value.with_traceback(tb)
File "E:\pychram\python387\lib\site-packages\urllib3\connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
File "E:\pychram\python387\lib\site-packages\urllib3\connectionpool.py", line 386, in _make_request
self._validate_conn(conn)
File "E:\pychram\python387\lib\site-packages\urllib3\connectionpool.py", line 1042, in validate_conn
conn.connect()
File "E:\pychram\python387\lib\site-packages\urllib3\connection.py", line 414, in connect
self.sock = ssl_wrap_socket(
File "E:\pychram\python387\lib\site-packages\urllib3\util\ssl
.py", line 449, in ssl_wrap_socket
ssl_sock = ssl_wrap_socket_impl(
File "E:\pychram\python387\lib\site-packages\urllib3\util\ssl
.py", line 493, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "E:\pychram\python387\lib\ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "E:\pychram\python387\lib\ssl.py", line 1040, in _create
self.do_handshake()
File "E:\pychram\python387\lib\ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionResetError(10054, 'The remote host forcibly closed an existing connection.', None, 10054, None))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "web_demo.py", line 6, in
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
File "E:\pychram\python387\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 663, in from_pretrained
tokenizer_class = get_class_from_dynamic_module(
File "E:\pychram\python387\lib\site-packages\transformers\dynamic_module_utils.py", line 388, in get_class_from_dynamic_module
final_module = get_cached_module_file(
File "E:\pychram\python387\lib\site-packages\transformers\dynamic_module_utils.py", line 286, in get_cached_module_file
commit_hash = model_info(pretrained_model_name_or_path, revision=revision, token=use_auth_token).sha
File "E:\pychram\python387\lib\site-packages\huggingface_hub\utils_validators.py", line 118, in _inner_fn
return fn(*args, **kwargs)
File "E:\pychram\python387\lib\site-packages\huggingface_hub\hf_api.py", line 1675, in model_info
r = get_session().get(path, headers=headers, timeout=timeout, params=params)
File "E:\pychram\python387\lib\site-packages\requests\sessions.py", line 600, in get
return self.request("GET", url, **kwargs)
File "E:\pychram\python387\lib\site-packages\requests\sessions.py", line 587, in request
resp = self.send(prep, **send_kwargs)
File "E:\pychram\python387\lib\site-packages\requests\sessions.py", line 701, in send
r = adapter.send(request, **kwargs)
File "E:\pychram\python387\lib\site-packages\requests\adapters.py", line 547, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(10054, 'The remote host forcibly closed an existing connection.', None, 10054, None))

python web_demo.py

@gao-jian-bin

Mm-hm.

@choko-oni

OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like THUDM/chatglm-6b is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
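A hedged offline-mode sketch, as the error message suggests: download the weights once (for example with git clone https://huggingface.co/THUDM/chatglm-6b) and point from_pretrained at the local folder (the path below is a placeholder):

import os
os.environ["TRANSFORMERS_OFFLINE"] = "1"  # optional: forbid any network lookups

from transformers import AutoModel, AutoTokenizer

LOCAL_MODEL_DIR = "./chatglm-6b"  # placeholder: local clone of THUDM/chatglm-6b
tokenizer = AutoTokenizer.from_pretrained(LOCAL_MODEL_DIR, trust_remote_code=True)
model = AutoModel.from_pretrained(LOCAL_MODEL_DIR, trust_remote_code=True).half().cuda().eval()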

@GoldenGYQ

How can we run streamlit run web_demo2.py on a different host or port?
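Streamlit exposes the bind address and port as server flags, so (assuming a recent Streamlit) something like:

streamlit run web_demo2.py --server.address 0.0.0.0 --server.port 7860

should make the demo reachable on another interface or port.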
