TypeError: issubclass() arg 1 must be a class #1936

Closed
1 task done
aardvarkk opened this issue Dec 10, 2024 · 2 comments
Labels
bug Something isn't working

Comments


aardvarkk commented Dec 10, 2024

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

When calling OpenAI.beta.chat.completions.parse in a highly concurrent environment, and providing a class as the response_format, I get the following error:

    response = self.client.beta.chat.completions.parse(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/resources/beta/chat/completions.py", line 156, in parse
    return self._post(
           ^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1280, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 957, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1063, in _request
    return self._process_response(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_base_client.py", line 1162, in _process_response
    return api_response.parse()
           ^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_response.py", line 319, in parse
    parsed = self._options.post_parser(parsed)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/resources/beta/chat/completions.py", line 150, in parser
    return _parse_chat_completion(
           ^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/lib/_parsing/_completions.py", line 122, in parse_chat_completion
    construct_type_unchecked(
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 445, in construct_type_unchecked
    return cast(_T, construct_type(value=value, type_=type_))
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 519, in construct_type
    return type_.construct(**value)  # type: ignore[arg-type]
           ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 230, in construct
    fields_values[name] = _construct_field(value=values[key], field=field, key=key)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 394, in _construct_field
    return construct_type(value=value, type_=type_)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/local_disk0/.ephemeral_nfs/cluster_libraries/python/lib/python3.11/site-packages/openai/_models.py", line 513, in construct_type
    if not is_literal_type(type_) and (issubclass(origin, BaseModel) or issubclass(origin, GenericModel)):
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen abc>", line 123, in __subclasscheck__
TypeError: issubclass() arg 1 must be a class
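
For context, the failing line passes origin to issubclass(), and issubclass() raises exactly this TypeError whenever its first argument is not a class. A minimal standalone illustration of the exception itself (separate from the library's code path):

from typing import Optional

from pydantic import BaseModel

# issubclass() requires its first argument to be a class, so a typing
# construct such as Optional[int] raises the same TypeError seen above.
issubclass(Optional[int], BaseModel)  # TypeError: issubclass() arg 1 must be a class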

This is being run in a multi-threaded environment: I'm trying to churn through a bunch of data, so I'm making these API requests in parallel from about 100 threads. If I reduce the thread count to 1, the problem goes away.

To work around this, I believe I can stop using the beta parse method with the provided response_format and instead parse the model output myself.
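
A hedged sketch of that kind of workaround, calling the non-beta chat.completions.create endpoint in JSON mode and validating the result manually with Pydantic (the helper name get_foo and the prompts are illustrative, not from the original report):

from openai import OpenAI

def get_foo(client: OpenAI) -> "Foo":
    # Workaround sketch only: request plain JSON instead of going through
    # beta.chat.completions.parse, then validate it against Foo ourselves.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content": "Reply with JSON containing keys 'a' and 'b'."},
            {"role": "user", "content": "..."},
        ],
    )
    raw = response.choices[0].message.content or ""
    return Foo.model_validate_json(raw)  # manual parsing into the Foo model shown below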

To Reproduce

  1. Call OpenAI.beta.chat.completions.parse with 100 threads simultaneously

Code snippets

from enum import Enum

from openai import OpenAI
from pydantic import BaseModel


class Bar(Enum):
    C = "C"

class Qux(Enum):
    D = "D"

class Foo(BaseModel):
    a: Bar
    b: Qux

def main():  # Called by 100 threads concurrently
    OpenAI(api_key="...").beta.chat.completions.parse(
        model="gpt-4o-mini",
        seed=0,
        temperature=0,
        messages=[
            {"role": "system", "content": "..."},
            {"role": "user", "content": "..."},
        ],
        response_format=Foo,
    )
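
For completeness, a minimal sketch of the kind of concurrent driver that triggers this (the original harness isn't shown in the report; ThreadPoolExecutor and the call count are assumptions):

from concurrent.futures import ThreadPoolExecutor

def run_concurrently(n_threads: int = 100, n_calls: int = 1000) -> None:
    # Fan main() out across ~100 worker threads, mirroring the setup above;
    # any TypeError raised inside a worker is re-raised by future.result().
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        futures = [pool.submit(main) for _ in range(n_calls)]
        for future in futures:
            future.result()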

OS

Ubuntu 22.04.4 LTS

Python version

Python v3.11.10

Library version

openai v1.57.2

@aardvarkk aardvarkk added the bug Something isn't working label Dec 10, 2024
@Joseelmax-00

I can confirm I'm having the same issue, which makes it impossible for me to use o1-preview. I tried changing versions of pydantic, typing_extensions, and other libraries to no avail. Python 3.11.0, with this environment:

Package           Version
----------------- ----------
annotated-types   0.7.0
anyio             4.7.0
certifi           2024.12.14
colorama          0.4.6
distro            1.9.0
h11               0.14.0
httpcore          1.0.7
httpx             0.28.1
idna              3.10
jiter             0.8.2
mypy-extensions   1.0.0
openai            1.58.1
pip               24.3.1
pydantic          2.10.4
pydantic_core     2.27.2
python-dotenv     1.0.1
setuptools        65.5.0
sniffio           1.3.1
tqdm              4.67.1
typing_extensions 4.12.2
typing-inspect    0.9.0

File "D:\Trabajo\random2\game of Quatro\openai_player.py", line 172, in select_AI_move
    completion = client.beta.chat.completions.parse(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Trabajo\random2\game of Quatro\env\Lib\site-packages\openai\resources\beta\chat\completions.py", line 181, in parse
    "response_format": _type_to_response_format(response_format),
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Trabajo\random2\game of Quatro\env\Lib\site-packages\openai\lib\_parsing\_completions.py", line 248, in type_to_response_format_param
    if is_basemodel_type(response_format):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Trabajo\random2\game of Quatro\env\Lib\site-packages\openai\lib\_pydantic.py", line 130, in is_basemodel_type
    return issubclass(typ, pydantic.BaseModel)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen abc>", line 123, in __subclasscheck__
TypeError: issubclass() arg 1 must be a class

@RobertCraigie
Collaborator

Thanks for the bug report. I haven't been able to reproduce this issue, but it should be fixed in the next release, as I've added some more inspect.isclass() checks to the places referenced in the stack traces.

#1987
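
For anyone hitting this before upgrading, a hedged illustration of the kind of guard described, modelled on the is_basemodel_type frame in the second traceback (not necessarily the exact change in #1987):

import inspect

import pydantic

def is_basemodel_type(typ: object) -> bool:
    # Check that typ is actually a class before calling issubclass(), so
    # non-class typing constructs return False instead of raising TypeError.
    return inspect.isclass(typ) and issubclass(typ, pydantic.BaseModel)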
