Scrapy 1.4 allows remote attackers to cause a denial of service (memory consumption) via large files: arbitrarily many files are read entirely into memory, which is especially problematic when each file is then written, in a separate thread, to a slow storage resource, as demonstrated by the interaction between dataReceived (in core/downloader/handlers/http11.py) and S3FilesStore.
The product does not properly control the allocation and maintenance of a limited resource.
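The flaw described above is an instance of unbounded in-memory buffering: the download handler appends every received chunk to a buffer with no size cap. Below is a minimal Python sketch of the vulnerable pattern and of a bounded variant that aborts once a configured maximum is exceeded, analogous in spirit to Scrapy's DOWNLOAD_MAXSIZE setting. The class and method names here are illustrative, not Scrapy's actual API.

```python
class ResponseBuffer:
    """Sketch of the vulnerable pattern: every received chunk is
    appended to an in-memory buffer with no upper bound, so a large
    (or attacker-controlled) response consumes arbitrary memory."""

    def __init__(self) -> None:
        self._chunks: list[bytes] = []
        self._bytes = 0

    def data_received(self, chunk: bytes) -> None:
        self._chunks.append(chunk)  # unbounded growth
        self._bytes += len(chunk)


class BoundedResponseBuffer(ResponseBuffer):
    """Mitigation sketch: reject further data once a configured
    maximum size would be exceeded, instead of buffering forever."""

    def __init__(self, maxsize: int) -> None:
        super().__init__()
        self.maxsize = maxsize

    def data_received(self, chunk: bytes) -> None:
        if self._bytes + len(chunk) > self.maxsize:
            # Abort the download rather than exhaust memory.
            raise MemoryError(
                f"response would exceed {self.maxsize} bytes; aborting"
            )
        super().data_received(chunk)
```

The key design point is that the size check happens per chunk as data arrives, so the buffer can never grow past the limit regardless of how the storage backend (e.g. a slow S3 upload) paces consumption.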
| Link | Tags |
|---|---|
| https://github.com/scrapy/scrapy/issues/482 | issue tracking, third party advisory |
| http://blog.csdn.net/wangtua/article/details/75228728 | third party advisory, exploit |