MantisBT - Piwigo
View Issue Details
ID: 0000949  Project: Piwigo  Category: web API  View Status: public  Date Submitted: 2009.03.15 21:36  Last Update: 2009.04.15 00:54
Web server: Apache 1.3.x
0000949: [pwg.images.add] heavy photo upload fails when bigger than memory limit
The current algorithm (for 2.0.1, see bug:941) splits the file into chunks for upload and then merges them back together before calling base64_decode.

The problem is that if your photo weighs 8MB, it becomes a 10.5MB base64 string. The merge_chunks function loads the full 10.5MB string into memory, then base64_decode creates another 8MB string, so you have 18.5MB in memory (not counting baseline memory usage). The standard PHP memory limit is 16MB, so the upload fails, BUT pwg.images.add still returns code 200 (i.e. "OK").
An algorithm that reads chunks whose length is a multiple of 4 should make a partial base64_decode possible, which would solve the problem :-) Unfortunately, I haven't been able to make it work yet.
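The "multiple of 4" idea rests on the fact that base64 maps every 3 raw bytes to a self-contained group of 4 output characters, so any prefix whose length is a multiple of 4 decodes independently. A minimal sketch of this streaming decode, in Python rather than Piwigo's PHP (the function name and chunk size are illustrative, not from the Piwigo code):

```python
import base64
import io

def decode_base64_stream(src, dst, chunk_size=4096):
    """Decode a base64 stream chunk by chunk.

    chunk_size must be a multiple of 4 so that every chunk is a whole
    number of base64 quartets and can be decoded on its own, without
    ever holding the full encoded string in memory.
    """
    assert chunk_size % 4 == 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(base64.b64decode(chunk))

# Usage: round-trip some data through the streaming decoder.
raw = b"\x00\x01\x02" * 5000
encoded = io.BytesIO(base64.b64encode(raw))
decoded = io.BytesIO()
decode_base64_stream(encoded, decoded)
```

Peak memory is then bounded by the chunk size instead of the image size.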
No tags attached.
related to 0000966 (closed, rosman): pLoader: encode chunks one by one
Issue History
2009.03.15 21:36  plg  New Issue
2009.03.15 21:36  plg  Status: new => assigned
2009.03.15 21:36  plg  Assigned To: => plg
2009.03.15 21:36  plg  Browser: => any
2009.03.15 21:36  plg  Web server: => Apache 1.3.x
2009.04.08 00:15  plg  Note Added: 0002571
2009.04.08 00:15  plg  Target Version: => 2.0.2
2009.04.08 00:17  plg  Relationship added: related to 0000966
2009.04.15 00:33  plg  Note Edited: 0002571
2009.04.15 00:54  plg  Note Added: 0002575
2009.04.15 00:54  plg  Status: assigned => closed
2009.04.15 00:54  plg  Resolution: open => fixed
2009.04.15 00:54  plg  Fixed in Version: => 2.0.2

2009.04.08 00:15   
(edited on: 2009.04.15 00:33)
A solution is to base64-encode the chunks one by one, so that pwg.images.addChunk can base64-decode a small amount of data at a time; no more memory problem. The raw chunks can then be appended to the final file without loading the full file into memory.
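The adopted fix can be sketched as follows, again in Python for illustration (Piwigo and pLoader are PHP/Perl; the function names and chunk size here are hypothetical, and receive_chunk stands in for what pwg.images.addChunk does server-side):

```python
import base64
import io

CHUNK_SIZE = 500_000  # raw bytes per chunk; illustrative value

def split_and_encode(data):
    """Client side (pLoader's role): split the raw file into chunks
    and base64-encode each chunk independently, so the server never
    needs to reassemble one giant base64 string."""
    return [base64.b64encode(data[i:i + CHUNK_SIZE])
            for i in range(0, len(data), CHUNK_SIZE)]

def receive_chunk(encoded_chunk, out):
    """Server side (stand-in for pwg.images.addChunk): decode one
    small chunk and append the raw bytes directly to the final file,
    keeping peak memory proportional to CHUNK_SIZE."""
    out.write(base64.b64decode(encoded_chunk))

# Usage: simulate the upload of an image as a sequence of chunks.
image = bytes(range(256)) * 4000  # ~1MB of sample data
final_file = io.BytesIO()
for chunk in split_and_encode(image):
    receive_chunk(chunk, final_file)
```

Since each chunk is a complete base64 document, decode order only matters for the append, and the server-side memory footprint no longer depends on the photo size.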

2009.04.15 00:54   
Fixed on branch 2.0 in [Subversion] r3239.
Merged to trunk in [Subversion] r3240.