Confirm this is an issue with the Python library and not an underlying OpenAI API issue
This is an issue with the Python library
Describe the bug
When calling OpenAI.beta.chat.completions.parse in a highly concurrent environment, and providing a class as the response_format, I get the following error:
response = self.client.beta.chat.completions.parse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/resources/beta/chat/completions.py", line 156, in parse
return self._post(
^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1280, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 957, in request
return self._request(
^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1063, in _request
return self._process_response(
^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1162, in _process_response
return api_response.parse()
^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_response.py", line 319, in parse
parsed = self._options.post_parser(parsed)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/resources/beta/chat/completions.py", line 150, in parser
return _parse_chat_completion(
^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 122, in parse_chat_completion
construct_type_unchecked(
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 445, in construct_type_unchecked
return cast(_T, construct_type(value=value, type_=type_))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 519, in construct_type
return type_.construct(**value) # type: ignore[arg-type]
^^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 230, in construct
fields_values[name] = _construct_field(value=values[key], field=field, key=key)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 394, in _construct_field
return construct_type(value=value, type_=type_)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 513, in construct_type
if not is_literal_type(type_) and (issubclass(origin, BaseModel) or issubclass(origin, GenericModel)):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen abc>", line 123, in __subclasscheck__
TypeError: issubclass() arg 1 must be a class
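In isolation, that final TypeError simply means that issubclass() received something other than an actual class as its first argument; the message is easy to reproduce on its own (a hypothetical two-liner, unrelated to the library):

from typing import Optional

issubclass(Optional[int], int)  # raises TypeError: issubclass() arg 1 must be a class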
This is being run in a multi-threaded environment; when I run with a single thread, I don't see this issue. I'm trying to churn through a bunch of data, so I'm using about 100 threads to make these API requests in parallel. If I reduce the thread count to 1, the problem goes away.
To work around this, I believe I can stop using the beta parse method with the provided response_format.
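As a rough sketch of that workaround (not from the original report: the model name, schema, and prompt below are illustrative, and it assumes Pydantic v2), one can call the plain create endpoint with a hand-built json_schema response_format and validate the JSON manually:

from openai import OpenAI
from pydantic import BaseModel

client = OpenAI()

class Extraction(BaseModel):  # hypothetical response model, for illustration only
    summary: str

# Build the json_schema response_format by hand instead of handing the class
# to the beta parse() helper.
schema = Extraction.model_json_schema()
schema["additionalProperties"] = False  # strict mode requires this

completion = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "Summarize this record."}],
    response_format={
        "type": "json_schema",
        "json_schema": {"name": "extraction", "schema": schema, "strict": True},
    },
)

# Validate the raw JSON string with the Pydantic model ourselves.
parsed = Extraction.model_validate_json(completion.choices[0].message.content)

This keeps the response model class out of the library's type-introspection code, which is where both tracebacks above originate.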
To Reproduce
Call OpenAI.beta.chat.completions.parse with 100 threads simultaneously
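A minimal reproduction sketch along those lines (the thread count, model name, and Pydantic model are illustrative assumptions, not taken verbatim from the report) might look like:

from concurrent.futures import ThreadPoolExecutor

from openai import OpenAI
from pydantic import BaseModel

client = OpenAI()

class Extraction(BaseModel):  # hypothetical schema, for illustration only
    summary: str

def call_parse(i: int):
    # All workers share one client and pass the class itself as response_format.
    response = client.beta.chat.completions.parse(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": f"Summarize record {i}."}],
        response_format=Extraction,
    )
    return response.choices[0].message.parsed

# Roughly 100 concurrent threads, matching the report; single-threaded runs are fine.
with ThreadPoolExecutor(max_workers=100) as pool:
    results = list(pool.map(call_parse, range(1000)))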
I can confirm I'm having the same issue, which is making it impossible for me to use o1-preview. I tried changing versions of pydantic, typing_extensions, and other libraries to no avail. Python 3.11.0, with this traceback:
File "D:\Trabajo\random2\game of Quatro\openai_player.py", line 172, in select_AI_move
completion = client.beta.chat.completions.parse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Trabajo\random2\game of Quatro\env\Lib\site-packages\openai\resources\beta\chat\completions.py", line 181, in parse
"response_format": _type_to_response_format(response_format),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Trabajo\random2\game of Quatro\env\Lib\site-packages\openai\lib\_parsing\_completions.py", line 248, in type_to_response_format_param
if is_basemodel_type(response_format):
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\Trabajo\random2\game of Quatro\env\Lib\site-packages\openai\lib\_pydantic.py", line 130, in is_basemodel_type
return issubclass(typ, pydantic.BaseModel)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "<frozen abc>", line 123, in __subclasscheck__
TypeError: issubclass() arg 1 must be a class
Thanks for the bug report. I haven't been able to reproduce this issue, but it should be fixed in the next release, as I've added some more inspect.isclass() checks to the places referenced in the stack traces.
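For readers hitting this before the release, the kind of guard being described is roughly the following (a simplified sketch of the pattern, not the library's actual patch):

import inspect

import pydantic

def is_basemodel_type(typ: object) -> bool:
    # Reject non-class values up front so issubclass() never sees them and
    # cannot raise "issubclass() arg 1 must be a class".
    return inspect.isclass(typ) and issubclass(typ, pydantic.BaseModel)

Per the comment above, the same inspect.isclass() guard pattern would apply at the construct_type check in _models.py that the first traceback points to.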
OS: Ubuntu 22.04.4 LTS
Python version: v3.11.10
Library version: openai v1.57.2