Error message when trying to preview or download a large PDF file

Hello.

Using FTP and the Dropbox plugin, I’ve uploaded a 115 MB PDF file to our repository and then created the corresponding Item.

Now, if I click on the file thumbnail, the PDF won’t open (error message: “Impossible to load the PDF document”), and if I try to download it (right click, Save as…), the browser won’t let me.

I’ve checked that the file saved by Omeka is fine (I used FTP to download it from files/original), and smaller files work too, so I suppose the problem is the size. Is there any solution?

Thanks.

P.S.: the link to the problematic Item is https://www.bitoteko.it/items/show/1174

I’m not sure what’s happening here… It looks like the server keeps dropping the connection if you try to download that file.

Apache itself should have no problem with a file of that size, but are you doing something else with downloads, such as using a plugin? If so, the problem is likely at that layer. Or maybe there’s a proxy server in front of your install?

It’s unlikely to be a problem with Omeka itself, though.

Spot on, John: I think I’ve identified the cause as the Stats plugin (by Daniel Berthereau), and specifically the following line it adds to the .htaccess file:

RewriteRule ^files/original/(.*)$ http://www.bitoteko.it/download/files/original/$1 [NC,L]

When I disable the plugin and comment out that line in .htaccess, the problem disappears.

Now, I hope @Daniel_KM will have a chance to look into it and maybe find a fix (I’ll open an issue on GitHub).

Thanks.

I’d guess the most likely issue is that the Stats plugin actually loads the whole file into memory, and you’re just hitting PHP’s configured memory limit with this large file.
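Roughly this kind of pattern, for illustration (a sketch, not the plugin’s actual code; $filepath and $response are placeholders):

$filepath = '/path/to/files/original/large.pdf'; // assumed path, for illustration only

// Loading the whole 121 MB file into a string allocates it all at once,
// and copying it into the response body keeps a second copy around,
// so a 256 MB memory_limit is easy to exhaust.
$file = file_get_contents($filepath);  // first copy: the entire file
$response->setBody($file);             // second copy held by the response
$response->sendResponse();             // the buffered body is echoed here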

That’s probably the reason, yes.

I suppose the issue had never appeared before, so nobody noticed it.

One solution would be for the plugin to degrade gracefully and at least give a chance to download the file (or even open it, skipping the load-into-memory step) whenever the file size is bigger than PHP’s memory limit. Let’s see what @Daniel_KM says about it.
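Something like the following sketch, for instance (the memory-limit parsing and the fallback here are my own assumptions, not the plugin’s code):

// Sketch of a graceful fallback: stream the file when it would not fit
// within PHP’s memory limit, otherwise keep the current behaviour.
$limit = ini_get('memory_limit');                 // e.g. "256M", or "-1" for unlimited
$units = array('K' => 1024, 'M' => 1048576, 'G' => 1073741824);
$suffix = strtoupper(substr($limit, -1));
$limitBytes = isset($units[$suffix]) ? (int) $limit * $units[$suffix] : (int) $limit;

if ($limitBytes > 0 && filesize($filepath) > $limitBytes - memory_get_usage(true)) {
    // Too big to load safely: send the headers and stream it straight to output.
    $response->sendHeaders();
    readfile($filepath);
} else {
    $response->setBody(file_get_contents($filepath));
}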

Generally I fix all issues in Omeka S, but more rarely Omeka 2 ones. Can you try the latest version (master)?

Installed and tried, but the problem is unfortunately still there, I’m afraid :frowning:

Can you try this patch? https://github.com/Daniel-KM/Omeka-plugin-Stats/issues/5

Sorry, but that’s not clear: the patch you mentioned refers to older code, since in the current version (i.e. the one I’ve installed) the code

// $file = file_get_contents($filepath);
// $response->setBody($file);

is already commented out. I’ve tried anyway to replace

$response->sendHeaders();
readfile($filepath);

with

$file = fopen($filepath, "rb");

set_time_limit(0);
while (!feof($file)) {
    ob_start();
    $content = fread($file, 1024 * 8 * 1024);
    print $content;
    ob_flush();
}

but the problem is still there (and smaller files do open up with no problem at all).
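For reference, a more conventional form of that chunked-read loop would also flush the system buffer on each pass and close the handle at the end; this is just a sketch for comparison:

set_time_limit(0);
$file = fopen($filepath, 'rb');
while (!feof($file)) {
    print fread($file, 8 * 1024 * 1024); // 8 MB chunks
    if (ob_get_level()) {
        ob_flush();                      // flush PHP’s output buffer, if any
    }
    flush();                             // flush the web server / system buffer
}
fclose($file);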

Can you try die() instead of return?

$response->sendHeaders();
readfile($filepath);
die();

Done. The error is still there.

I’ve tried commenting out readfile($filepath) first, with no result; then I also commented out $response->sendHeaders(), and it gets to the die('test') I’ve added. So it seems the problem starts with $response->sendHeaders().

I then checked the headers’ content, and all the variables seem fine:

$mode = inline
filename = bb889fa1974524733915ef6ed86a28d9.pdf
$contentType = application/pdf
$filesize = 121433738
gmdate = Sat, 24 Apr 2021 13:15:51 GMT

Let me know if you need me to do other tests.
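For reference, here is a hypothetical reconstruction of how those values would be sent through the Zend response object; the setHeader() calls are illustrative, not the plugin’s exact code:

// Illustrative only: how values like the ones above are typically set
// on Zend_Controller_Response_Abstract before the body is sent.
$response->setHeader('Content-Disposition', $mode . '; filename="' . $filename . '"', true)
    ->setHeader('Content-Type', $contentType, true)
    ->setHeader('Content-Length', $filesize, true)
    ->setHeader('Last-Modified', $gmdate, true); // the gmdate value shown above
$response->sendHeaders();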

All the solutions above should work. We have tried all the solutions from https://stackoverflow.com/questions/1754883/serve-a-large-file-via-zend-framework#1754926 except the first one:

$this->_helper->layout()->disableLayout();
$this->_helper->viewRenderer->setNoRender(true);
$response->sendHeaders();
fpassthru(fopen($filepath, 'rb')); // fpassthru() streams from an open file handle
exit();

If it’s not working, the last solution will be to create a symlink inside files/ and to redirect the user to it, so the download will be handled by Apache directly (it should be allowed via .htaccess). The symlink will have to be removed later.
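A rough sketch of that symlink fallback (the directory, naming scheme and redirect helper below are assumptions, not tested code):

// Link the original file into a web-reachable location and redirect,
// so Apache streams the file directly and PHP’s memory limit no longer matters.
$publicDir = FILES_DIR . '/tmp';                          // assumed web-reachable directory
$linkName  = uniqid('dl_', true) . '_' . basename($filepath);
$linkPath  = $publicDir . '/' . $linkName;

if (!is_dir($publicDir)) {
    mkdir($publicDir, 0755, true);
}
if (!file_exists($linkPath)) {
    symlink($filepath, $linkPath);
}

// Let Apache handle the download; the symlink must be removed later (e.g. by a cron job).
$this->_helper->redirector->gotoUrl('/files/tmp/' . $linkName);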

I tried this too, but I’m afraid it didn’t solve the problem.

Here’s the relevant server log entry:

`[error] [client xxx.xxx.xxx.xxx] - www.bitoteko.it - AH01215: PHP Fatal error: Allowed memory size of 268435456 bytes exhausted (tried to allocate 121434112 bytes) in /web/htdocs/www.bitoteko.it/home/application/libraries/Zend/Controller/Response/Abstract.php on line 588: /web/htdocs/www.bitoteko.it/home/index.php`

I don’t understand why 121 MB causes a problem when the memory limit is 256 MB.

It’s been a very long time since I looked into this, but the basic issue (I seem to remember) is that you have to make sure all the various forms of output buffering are closed before you try to use fpassthru or similar options to stream the file to output.

I believe the general way to do this is to use ob_get_level to see how many buffers are active and then call ob_end_flush that number of times to close them all. Turning off zlib.output_compression may also be necessary.
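In code, that looks roughly like this (a sketch of the idea above, not tested against the plugin):

// Turn off zlib output compression and close every active output buffer
// before streaming the file, as described above.
if (ini_get('zlib.output_compression')) {
    ini_set('zlib.output_compression', 'Off');
}
while (ob_get_level() > 0) {
    ob_end_flush();
}
readfile($filepath);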

So can you try:

$this->_helper->layout()->disableLayout();
$this->_helper->viewRenderer->setNoRender(true);
while (ob_get_level()) {
    ob_end_clean();
}
$response->sendHeaders();
readfile($filepath);
die();

You can try readfile() and then fpassthru(), and ob_end_clean() and then ob_end_flush().
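For example, the fpassthru() variant would look roughly like this (again just a sketch):

// Same idea with fpassthru(): discard any open output buffers,
// send the headers, then stream the file handle directly.
while (ob_get_level()) {
    ob_end_clean();
}
$response->sendHeaders();
$handle = fopen($filepath, 'rb');
fpassthru($handle);
fclose($handle);
die();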

I’ve tried different variations, and the one that seems to work is

// $this->_helper->layout()->disableLayout();
// $this->_helper->viewRenderer->setNoRender(true);
while (ob_get_level()) {
    ob_end_clean();
}
$response->sendHeaders();
readfile($filepath);
return true;

If I try to uncomment any of the first two lines, the error comes back.

Ok, so I’ll include this one in version 2.2.4.2 of the plugin (https://gitlab.com/Daniel-KM/Omeka-plugin-Stats).

Great. Thanks to you and to @jflatnes for helping us with this.

Daniel, will you also update the repository on GitHub, or have you moved permanently to GitLab? Just so I know where to submit issues or pull requests.

GitLab is a lot better, in particular for free software support (GitHub is not free software) and for privacy, so I prefer to use it for now. Nevertheless, I’ll continue to push updates to GitHub for some time, at least until the next major version.
