Using FTP and the Dropbox plugin, I’ve uploaded a 115 MB PDF file to our repository, then created the corresponding Item.
Now, if I click on the file thumbnail, the PDF won’t open (error message: “Impossible to load the PDF document”), and if I try to download it (right-click, Save as…), the browser won’t let me do that either.
I’ve checked that the file saved by Omeka is fine (I used FTP to download it from files/original), and other, smaller files work correctly, so I suspect the problem is the size. Is there any solution to this?
I’m not sure what’s happening here… It looks like the server keeps dropping the connection if you try to download that file.
Apache itself should have no problem with a file of that size, but are you doing something else with downloads, like using a plugin? If so, there’s likely a problem at that layer. Or maybe there’s a proxy server in front of your install?
It’s unlikely to be a problem with Omeka itself, though.
I’d guess the most likely issue is that the Stats plugin actually loads the whole file into memory, and you’re just hitting PHP’s configured memory limit with this large file.
I suppose the issue never appeared before, so nobody noticed it.
One solution would be for the plugin to degrade gracefully and at least allow the file to be downloaded (or opened, skipping the load-into-memory step) whenever the file size exceeds PHP’s memory limit. Let’s see what @Daniel_KM says about it.
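The graceful-degradation idea above could be sketched roughly like this (this is not the plugin’s actual code; the function names are illustrative, and the headroom factor is an assumption):

```php
<?php
// Convert PHP's memory_limit setting (e.g. "128M") to bytes.
function memoryLimitBytes(): int
{
    $limit = ini_get('memory_limit');
    if ($limit === '-1') {
        return PHP_INT_MAX; // no limit configured
    }
    $unit = strtoupper(substr($limit, -1));
    $value = (int) $limit;
    switch ($unit) {
        case 'G': return $value * 1024 ** 3;
        case 'M': return $value * 1024 ** 2;
        case 'K': return $value * 1024;
        default:  return $value;
    }
}

// Hypothetical sender: load small files into memory, but stream large
// ones in chunks so we never approach the memory limit.
function sendFileBody(string $filepath): void
{
    // Factor of 2 leaves headroom for the rest of the request (assumed).
    if (filesize($filepath) < memoryLimitBytes() / 2) {
        echo file_get_contents($filepath); // small file: in-memory is fine
    } else {
        readfile($filepath); // large file: streamed, not buffered in memory
    }
}
```

Note that `readfile()` itself streams in chunks, so the memory problem only arises when the file’s contents are pulled into a PHP variable first.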
I’ve tried commenting out readfile($filepath) first, with no result; then I also commented out $response->sendHeaders(), and execution reached the die('test') I had added. So it seems like the problem starts with $response->sendHeaders().
I’ve then checked the headers’ content, and all the variables seem to be fine:
If that doesn’t work, the last resort will be to create a symlink inside files/ and redirect the user to it, so the download will be handled by Apache directly (it should be allowed via .htaccess). The symlink will have to be removed afterwards.
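A minimal sketch of that symlink fallback might look like this (the function name, parameters, and the random-name scheme are my own illustration, not existing Omeka code):

```php
<?php
// Expose a stored file under the web-served files/ directory through a
// temporary symlink, then redirect so Apache serves the bytes itself.
// Returns the symlink path so the caller can unlink() it later (e.g. via
// a cleanup cron job), since PHP never sees the end of the download.
function redirectViaSymlink(string $filepath, string $filesDir, string $publicBase): string
{
    $ext = pathinfo($filepath, PATHINFO_EXTENSION);
    // Unguessable temporary name, so the link doesn't leak other files.
    $linkName = bin2hex(random_bytes(8)) . ($ext !== '' ? '.' . $ext : '');
    $linkPath = rtrim($filesDir, '/') . '/' . $linkName;

    symlink($filepath, $linkPath);
    header('Location: ' . rtrim($publicBase, '/') . '/' . $linkName, true, 302);

    return $linkPath;
}
```

For this to work, Apache needs `Options +FollowSymLinks` (or `SymLinksIfOwnerMatch`) enabled for the files/ directory.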
It’s been a very long time since I looked into this, but the basic issue (I seem to remember) is that you have to make sure all the various forms of output buffering are closed before you try to use fpassthru or similar options to stream the file to output.
I believe the general way to do this is to use ob_get_level to see how many buffers are active and then call ob_end_flush that number of times to close them all. Turning off zlib.output_compression may also be necessary.
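Putting those two steps together, the buffer-draining approach would look something like this (a sketch, not the plugin’s code; note that disabling zlib.output_compression at runtime only works if no output has been sent yet):

```php
<?php
// Close every active output buffering level, then stream the file
// straight to the client with fpassthru().
function streamFile(string $filepath): void
{
    // ob_get_level() reports how many buffers are stacked; close them all.
    while (ob_get_level() > 0) {
        ob_end_flush();
    }
    // zlib adds its own transparent compression buffer; turn it off too.
    if (ini_get('zlib.output_compression')) {
        ini_set('zlib.output_compression', 'Off');
    }
    $fp = fopen($filepath, 'rb');
    fpassthru($fp); // writes directly to output, chunk by chunk
    fclose($fp);
}
```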
GitLab is a lot better, in particular for free-software support (GitHub is not free software) and for privacy, so I prefer to use it for now. Nevertheless, I will continue to push updates to GitHub for some time, at least until the next major version.