
Limit max request size #2155

Open · Kludex opened this issue May 26, 2023 · Discussed in #1516 · 8 comments · May be fixed by #2328

Kludex commented May 26, 2023

Discussed in #1516

Originally posted by aviramha April 5, 2020
As discussed on Gitter, my opinion is that Starlette should provide a default limit on request size.
The main reason is that without it, any Starlette application is vulnerable to a very easy DoS attack.
For example, a newbie like me can write a program as follows:

from starlette.requests import Request
from starlette.responses import Response


async def app(scope, receive, send):
    assert scope['type'] == 'http'
    request = Request(scope, receive)
    # Reads and parses the entire request body in memory, with no size limit.
    data = await request.json()
    response = Response(repr(data), media_type='text/plain')
    await response(scope, receive, send)

As a malicious user, I could send a 30 GB JSON payload and drive the process out of memory (OOM).
Other frameworks, such as Django and Quart, already support such a limit.
My proposal is to add a default limit that can be overridden in the app configuration.

adriangb commented:

If we're going to make it a configurable middleware, it might also make sense to have some sort of timeout for the connection and for each chunk, perhaps infinite by default but definitely tunable.

Another thing to keep in mind is that this is likely something users want to control on a per-endpoint basis. That is, if my app has an upload feature that expects 1 GB files, it's probably a single endpoint that expects them, so I'd want to raise the limit just for that endpoint. That makes me think the best strategy may be a per-endpoint middleware with a companion middleware that tweaks the config by changing it in the scope (see the sketch below). That would allow layering and overriding of these settings. This would be similar to #2026.
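
To make that layering idea concrete, here is a minimal sketch under some assumptions: the scope key, both middleware class names, and the Content-Length-only check are illustrative inventions, not part of Starlette or of any agreed design.

from starlette.responses import Response

# Hypothetical scope key used to pass the limit between middlewares;
# not an ASGI or Starlette convention.
MAX_REQUEST_SIZE_KEY = "app.max_request_size"


class RequestSizeOverrideMiddleware:
    """Companion middleware: only records the desired limit in the scope."""

    def __init__(self, app, max_request_size: int):
        self.app = app
        self.max_request_size = max_request_size

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http":
            scope[MAX_REQUEST_SIZE_KEY] = self.max_request_size
        await self.app(scope, receive, send)


class RequestSizeLimitMiddleware:
    """Enforcing middleware: rejects requests whose declared Content-Length
    exceeds the limit found in the scope (falling back to a default)."""

    def __init__(self, app, default_max_request_size: int = 1_048_576):  # 1 MB
        self.app = app
        self.default_max_request_size = default_max_request_size

    async def __call__(self, scope, receive, send):
        if scope["type"] == "http":
            limit = scope.get(MAX_REQUEST_SIZE_KEY, self.default_max_request_size)
            declared = dict(scope["headers"]).get(b"content-length")
            if declared is not None and int(declared) > limit:
                response = Response("Request body too large", status_code=413)
                await response(scope, receive, send)
                return
        await self.app(scope, receive, send)

In this sketch the override middleware has to be mounted outside (run before) the enforcing one so the scope value is set by the time it is read, and only the declared Content-Length is checked; counting the bytes actually streamed is sketched later in the thread.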

alex-oleshkevich commented:

This is a good one! I also agree that we need both a global setting and per-route ones (Route + Mount).
Use case: a global limit of 1 MB, with a 10 MB limit on the photo upload endpoint.

We could add a max_body_size parameter to the request.form(), request.json(), and request.stream() methods.
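
Such a max_body_size parameter doesn't exist in Starlette today, but as a rough, hypothetical sketch of what it could enforce internally, the body can be consumed through request.stream() and the read aborted as soon as the running total passes the limit:

import json

from starlette.requests import Request


async def read_body_limited(request: Request, max_body_size: int) -> bytes:
    """Hypothetical helper: read the request body, failing as soon as it grows past max_body_size."""
    chunks = []
    total = 0
    async for chunk in request.stream():
        total += len(chunk)
        if total > max_body_size:
            # Stop before buffering the rest of an oversized body in memory.
            raise ValueError("request body exceeds max_body_size")
        chunks.append(chunk)
    return b"".join(chunks)


async def json_limited(request: Request, max_body_size: int):
    """Roughly what a request.json(max_body_size=...) call could boil down to."""
    return json.loads(await read_body_limited(request, max_body_size))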

@Kludex Kludex added the feature New feature or request label May 31, 2023
@Kludex Kludex added this to the Version 1.x milestone May 31, 2023

Kludex commented Jun 24, 2023

Why should the ASGI application be the one to set this instead of the server?

alex-oleshkevich commented Jun 24, 2023

> Why should the ASGI application be the one to set this instead of the server?

Example: the global POST limit is 1 MB, but selected endpoints that upload files need 100 MB.
Setting this at the server level is global and leaves no chance to override it per endpoint.

abersheeran commented:

Adding a LimitRequestSizeMiddleware is the simplest and most forward-compatible way.
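
As a minimal sketch of what such a middleware could look like as a pure ASGI wrapper (the class name comes from this comment; the implementation details below are assumptions, not necessarily what #2328 does), it counts the bytes delivered through receive and answers with 413 once the total passes the configured limit:

from starlette.responses import Response
from starlette.types import ASGIApp, Receive, Scope, Send


class _BodyTooLarge(Exception):
    """Raised from the wrapped receive once the limit is exceeded."""


class LimitRequestSizeMiddleware:
    """Hypothetical sketch: reject HTTP request bodies larger than max_request_size bytes."""

    def __init__(self, app: ASGIApp, max_request_size: int = 1_048_576) -> None:  # 1 MB default
        self.app = app
        self.max_request_size = max_request_size

    async def __call__(self, scope: Scope, receive: Receive, send: Send) -> None:
        if scope["type"] != "http":
            await self.app(scope, receive, send)
            return

        received = 0
        response_started = False

        async def wrapped_receive():
            # Count body bytes as the application pulls them from the server.
            nonlocal received
            message = await receive()
            if message["type"] == "http.request":
                received += len(message.get("body", b""))
                if received > self.max_request_size:
                    raise _BodyTooLarge()
            return message

        async def wrapped_send(message):
            # Remember whether the application already started its own response.
            nonlocal response_started
            if message["type"] == "http.response.start":
                response_started = True
            await send(message)

        try:
            await self.app(scope, wrapped_receive, wrapped_send)
        except _BodyTooLarge:
            if not response_started:
                response = Response("Request Entity Too Large", status_code=413)
                await response(scope, receive, send)

An app would then wrap itself as app = LimitRequestSizeMiddleware(app, max_request_size=10 * 1024 * 1024) for a 10 MB cap. Whether the exception raised from receive propagates cleanly depends on how the wrapped app consumes the body, which is exactly the kind of detail to settle in the PR.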

Kludex commented Nov 6, 2023

> Adding a LimitRequestSizeMiddleware is the simplest and most forward-compatible way.

Yeah. Shall we follow this path?

adriangb commented Nov 6, 2023

Yes, I think someone should make a PR and we can discuss the details (override vs. min/max, whether there should be a default, etc.) there.

adriangb commented Nov 8, 2023

> Yes, I think someone should make a PR

I am someone, I made a PR 😆 : #2328
