
Configuring for image uploads and ML inference - issues with file buffering #378

Answered by tiangolo
austinben asked this question in Questions

I wouldn't know the best way to handle it. Unfortunately, I haven't used Flask or this Docker image in a long while, so I wouldn't have a chance to try to optimize the configs for your use case.

I'm biased, but I would try to use FastAPI instead 😅 I think handling Uvicorn alone might be much simpler than this setup with several components. 🤓
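
For reference, here's a minimal sketch of what that could look like: a single FastAPI app served directly by Uvicorn, streaming the uploaded image to disk before running inference. The `/predict` route, the chunk size, and the `run_inference()` helper are placeholder assumptions for illustration, not something from this thread.

```python
# Minimal sketch, assuming FastAPI + Uvicorn replace the Flask/Gunicorn/Nginx stack.
# Requires `fastapi`, `uvicorn`, and `python-multipart` (needed for file uploads).
import tempfile

from fastapi import FastAPI, File, UploadFile

app = FastAPI()

CHUNK_SIZE = 1024 * 1024  # placeholder: read the upload in 1 MiB chunks


def run_inference(image_path: str) -> dict:
    # Hypothetical stand-in for whatever model you load at startup.
    return {"path": image_path, "label": "unknown"}


@app.post("/predict")
async def predict(file: UploadFile = File(...)):
    # Stream the upload to a temporary file chunk by chunk instead of
    # buffering the whole image in memory before inference.
    with tempfile.NamedTemporaryFile(delete=False) as tmp:
        while chunk := await file.read(CHUNK_SIZE):
            tmp.write(chunk)
        tmp_path = tmp.name
    return run_inference(tmp_path)
```

The app can then be run with something like `uvicorn main:app --host 0.0.0.0 --port 80` (assuming the module is named `main.py`), with no separate Gunicorn or Nginx layer to tune.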

Answer selected by tiangolo

This discussion was converted from issue #282 on August 29, 2024 00:24.