Gradio Error: _queue.Empty #21

Open
Kingdroper opened this issue Dec 20, 2024 · 2 comments

Comments

@Kingdroper

Can't generate anything using the sample prompt:
[Screenshot 2024-12-20 14 49 21]

@thuwzy
Collaborator

thuwzy commented Dec 21, 2024

What is the output on the command line?

@luispatriciorf

Hello,
Same issue over here:

[screenshot attached]

I got this from the command line:

The attention mask and the pad token id were not set. As a consequence, you may observe unexpected behavior. Please pass your input's `attention_mask` to obtain reliable results.
Setting `pad_token_id` to `eos_token_id`:128009 for open-end generation.
Traceback (most recent call last):
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\queueing.py", line 625, in process_events
    response = await route_utils.call_process_api(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<5 lines>...
    )
    ^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\route_utils.py", line 322, in call_process_api
    output = await app.get_blocks().process_api(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<11 lines>...
    )
    ^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\blocks.py", line 2042, in process_api
    result = await self.call_function(
             ^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<8 lines>...
    )
    ^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\blocks.py", line 1601, in call_function
    prediction = await utils.async_iteration(iterator)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\utils.py", line 728, in async_iteration
    return await anext(iterator)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\utils.py", line 833, in asyncgen_wrapper
    response = await iterator.__anext__()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\chat_interface.py", line 884, in _stream_fn
    first_response = await utils.async_iteration(generator)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\utils.py", line 728, in async_iteration
    return await anext(iterator)
           ^^^^^^^^^^^^^^^^^^^^^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\utils.py", line 722, in __anext__
    return await anyio.to_thread.run_sync(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        run_sync_iterator_async, self.iterator, limiter=self.limiter
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\anyio\to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        func, args, abandon_on_cancel=abandon_on_cancel, limiter=limiter
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\anyio\_backends\_asyncio.py", line 2461, in run_sync_in_worker_thread
    return await future
           ^^^^^^^^^^^^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\anyio\_backends\_asyncio.py", line 962, in run
    result = context.run(func, *args)
  File "D:\LamaMesh\lamamesh\Lib\site-packages\gradio\utils.py", line 705, in run_sync_iterator_async
    return next(iterator)
  File "d:\LamaMesh\LLaMA-Mesh-main\app.py", line 161, in chat_llama3_8b
    for text in streamer:
                ^^^^^^^^
  File "D:\LamaMesh\lamamesh\Lib\site-packages\transformers\generation\streamers.py", line 224, in __next__
    value = self.text_queue.get(timeout=self.timeout)
  File "C:\Python313\Lib\queue.py", line 212, in get
    raise Empty
_queue.Empty
(The same two warnings and the same traceback were printed again for each subsequent generation attempt.)
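
For context: `_queue.Empty` is raised inside `TextIteratorStreamer.__next__`. The streamer waits up to `timeout` seconds for the next token from the background `generate()` thread and gives up when nothing arrives in time, which usually means generation is extremely slow on the machine (for example, the model fell back to CPU) or the generation thread died before producing its first token. Below is a minimal sketch of the usual `TextIteratorStreamer` setup, not the exact code in app.py; the model id, prompt, timeout value, and generation arguments are illustrative assumptions.

```python
# Minimal streaming-generation sketch (assumptions: model id, prompt, timeout, generation args).
from threading import Thread

from transformers import AutoModelForCausalLM, AutoTokenizer, TextIteratorStreamer

model_id = "Zhengyi/LLaMA-Mesh"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Tokenizing with return_tensors="pt" also returns the attention_mask,
# which silences the "attention mask ... not set" warning above.
inputs = tokenizer("Create a 3D model of a table.", return_tensors="pt").to(model.device)

# _queue.Empty comes from the streamer when no token arrives within `timeout`
# seconds. A generous timeout (or timeout=None to wait indefinitely) avoids the
# crash on slow hardware, but it does not make generation itself any faster.
streamer = TextIteratorStreamer(
    tokenizer, timeout=300.0, skip_prompt=True, skip_special_tokens=True
)

generation_kwargs = dict(
    **inputs,
    streamer=streamer,
    max_new_tokens=4096,
    pad_token_id=tokenizer.eos_token_id,  # silences the pad_token_id warning
)
Thread(target=model.generate, kwargs=generation_kwargs).start()

# Consuming the streamer on the main thread mirrors the `for text in streamer:`
# loop that raised the exception in app.py.
for text in streamer:
    print(text, end="", flush=True)
```

Note that raising or removing the timeout only prevents the crash; if `generate()` never produces tokens at all (out of memory, CPU-only inference that is too slow, or an exception in the generation thread), the underlying cause still needs to be fixed.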
