tornado stream upload
Tornado 4.0 added the tornado.web.stream_request_body decorator for handling streaming request bodies.
Streaming uploads let you handle large requests without buffering everything into memory, but there are still generally some limits on what you're willing to handle. The max_buffer_size and max_body_size parameters are now separate, but they both default to 100 MB. With streaming uploads, you can increase max_body_size as much as you want without increasing your memory requirements, but make sure you have enough disk space (or S3 budget, etc.) to handle the uploads you'll get. You can even set max_body_size on a per-request basis by calling self.request.connection.set_max_body_size() from prepare().
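Both limits can also be set server-wide instead of per request, by passing them to HTTPServer. A minimal sketch (assuming Tornado >= 4.0; max_buffer_size and max_body_size are real HTTPServer parameters, the empty Application is just a placeholder):

```python
import tornado.web
import tornado.httpserver

MB = 1024 * 1024
GB = 1024 * MB

app = tornado.web.Application([])
server = tornado.httpserver.HTTPServer(
    app,
    max_buffer_size=100 * MB,  # cap on what is buffered in memory at once
    max_body_size=1 * GB,      # cap on the total request body size
)
print(type(server).__name__)
```

A per-request call to set_max_body_size() from prepare() (as in the handler below) can still raise the limit above the server-wide value for specific endpoints.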
import tornado.web
import tornado.ioloop

MB = 1024 * 1024
GB = 1024 * MB
TB = 1024 * GB
MAX_STREAMED_SIZE = 1 * GB

@tornado.web.stream_request_body
class MainHandler(tornado.web.RequestHandler):
    def prepare(self):
        self.f = open("xxxxxxxx", "wb")
        # Without raising max_body_size, files larger than 100 MB cannot be uploaded
        self.request.connection.set_max_body_size(MAX_STREAMED_SIZE)

    def post(self):
        print("upload completed")
        self.f.close()

    def data_received(self, data):
        self.f.write(data)

if __name__ == "__main__":
    application = tornado.web.Application([
        (r"/", MainHandler),
    ])
    application.listen(7777)
    tornado.ioloop.IOLoop.instance().start()
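The streaming pattern above is simple: prepare() runs when headers arrive, data_received() is called once per chunk, and post() runs after the last chunk. You don't have to write chunks to disk; any incremental consumer works. A self-contained sketch (a stand-in class, not a real Tornado handler, so the flow can be driven directly) that hashes the upload instead of storing it:

```python
import hashlib

class HashingUploadHandler:
    """Stand-in illustrating the streaming callbacks; in Tornado this
    would subclass RequestHandler and carry @tornado.web.stream_request_body."""

    def prepare(self):
        # Called when the request headers arrive, before any body data.
        self.hasher = hashlib.sha256()
        self.size = 0

    def data_received(self, chunk):
        # Called once per body chunk; consume incrementally, never buffer all.
        self.hasher.update(chunk)
        self.size += len(chunk)

    def post(self):
        # Called after the final chunk.
        return self.size, self.hasher.hexdigest()

h = HashingUploadHandler()
h.prepare()
for chunk in (b"hello ", b"world"):
    h.data_received(chunk)
size, digest = h.post()
print(size)   # 11
print(digest)
```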
tornado.web.stream_request_body source code:
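The decorator itself is tiny. Roughly paraphrased from Tornado 4.x (exact wording may differ by version; the stand-in RequestHandler class here is only so the sketch runs on its own): it type-checks the class and sets a private flag that the HTTP layer inspects to decide whether to deliver the body in chunks.

```python
class RequestHandler:
    """Stand-in for tornado.web.RequestHandler."""
    pass

def stream_request_body(cls):
    """Sketch of tornado.web.stream_request_body: flag the handler class
    so the connection streams body chunks to data_received()."""
    if not issubclass(cls, RequestHandler):
        raise TypeError("expected subclass of RequestHandler, got %r" % cls)
    cls._stream_request_body = True
    return cls

@stream_request_body
class MyHandler(RequestHandler):
    pass

print(MyHandler._stream_request_body)  # True
```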
Test:
curl -v -XPOST --data-binary @presto-server-0.144.2.tar.gz http://127.0.0.1:7777/
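If you don't have a large archive handy, you can generate one to exercise the >100 MB path (the filename big.bin is arbitrary; the curl call only succeeds if the server above is running on port 7777):

```shell
# Create a 200 MB test file of zeros
dd if=/dev/zero of=big.bin bs=1M count=200

# Stream it to the handler (requires the server above to be listening)
curl -v -X POST --data-binary @big.bin http://127.0.0.1:7777/ || echo "server not running"
```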
Keywords: Python, tornado