Hi,
After updating to Piwigo 11.1.0 from 11.0 using the auto-updater I get a timeout message when trying to access the admin area.
The error message in the server logs is as follows:
mod_fcgid: stderr: VALUES('544851526ca2d26005e98374eddd2280cc07','pwg_device|s:7:\\"desktop\\";pwg_caps|a:3:{i:0;s:1:\\"1\\";i:1;s:4:\\"1349\\";i:2;s:4:\\"1085\\";}pwg_uid|i:1;cache_activity_last_weeks|a:2:{s:13:\\"calculated_on\\";i:1611388972;s:4:\\"data\\";a:3:{i:0;a:1:{i:7;a:3:{s:7:\\"details\\";a:1:{s:4:\\"User\\";a:2:{s:5:\\"Login\\";s:1:\\"2\\";s:6:\\"Logout\\";s:1:\\"2\\";}}s:6:\\"number\\";i:4;s:4:\\"date\\";s:24:\\"Samstag 26 Dezember 2020\\";}}i:2;a:2:{i:6;a:3:{s:7:\\"details\\";a:1:{s:4:\\"User\\";a:2:{s:5:\\"Login\\";s:1:\\"1\\";s:6:\\"Logout\\";s:1:\\"1\\";}}s:6:\\"number\\";i:2;s:4:\\"date\\";s:22:\\"Freitag 15 Januar 2021\\";}i:7;a:3:{s:7:\\"details\\";a:3:{s:5:\\"Photo\\";a:1:{s:3:\\"Add\\";s:1:\\"8\\";}s:3:\\"Tag\\";a:1:{s:6:\\"Delete\\";s:1:\\"2\\";}s:4:\\"User\\";a:2:{s:5:\\"Login\\";s:1:\\"1\\";s:6:\\"Logout\\";s:1:\\"1\\";}}s:6:\\"number\\";i:12;s:4:\\"date\\";s:22:\\"Samstag 16 Januar 2021\\";}}i:3;a:3:{i:3;a:3:{s:7:\\"details\\";a:1:{s:4:\\"User\\";a:2:{s:5:\\"Login\\";s:1:\\"2\\";s:6:\\"Lo in [/...]/include/dblayer/functions_mysqli.inc.php on line 864
Browsing the gallery works normally, so I would conclude it is not a general database issue.
Files in directory include/dblayer have date 22.01.2021 17:29:36, so they come from Piwigo 11.1.
Logging out and back in does not solve the issue.
PHP 7.3
5.5.41-MariaDB
Apache server
Piwigo URL: https://mfux.ch/photos
If you have a solution please advise how to update since I cannot access the auto-update function.
It looks like the error is coming from the mod_fcgid Apache module. Maybe try temporarily disabling that module?
How can I deactivate mod_fcgid via .htaccess? I have no server access; the site is on shared hosting.
I tried the settings described in https://www.apachelounge.com/viewtopic.php?p=24256 without success.
Uploading functions_mysqli.inc.php (the file where the error happens, according to the error log) from version 11.0 did not solve the issue.
Then I found a workaround: using the direct link https://mfux.ch/photos/admin.php?page=updates I can access the admin area and click everywhere EXCEPT on Dashboard, which leads to the error described in my initial post.
I just upgraded my primary site to 11.1.0 and noticed getting into the Admin area the first time took a long time. I think it was generating all the stats for the new home page (i.e., activity peak in the last weeks). It eventually did load, but the delay reminded me of this thread. I wonder if that is what's timing out for you.
Tried extending the PHP timeout to 150 seconds (the maximum allowed by my shared hosting)
Tried setting longer timeouts via .htaccess and php.ini
Deleted the Piwigo history in the DB (reduced it from 43 MB to a few kB)
No success; the timeout still appears when trying to open the dashboard in the admin section.
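For reference, requesting higher limits from within a PHP script can be sketched as below; whether a shared host actually honors set_time_limit() and ini_set() varies, and these calls may be silently ignored or capped:

```php
<?php
// Sketch: raising PHP execution limits from within a script.
// On shared hosting these calls may be silently ignored or capped by the host.
@set_time_limit(150);                   // request a 150-second limit
@ini_set('max_execution_time', '150');  // the same setting via the ini API

// Check what the runtime actually accepted.
echo ini_get('max_execution_time'), "\n";
```

If this prints the old value, the host has locked the setting and only a .htaccess or control-panel change (if any) can raise it.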
In file admin/intro.php, line 254, remove:

$_SESSION['cache_activity_last_weeks'] = array(
  'calculated_on' => time(),
  'data' => $activity_last_weeks,
);

and a few lines after that, also remove:

$activity_last_weeks = $_SESSION['cache_activity_last_weeks']['data'];

then try opening the dashboard again.
Commented out lines 254-257 and 260.
But loading the Dashboard still results in a timeout and error.
(after working directly on martin's Piwigo)
OK, so the problem is not the activity chart but the new function that calculates the cache directory size. I'll find one that works!
So this does not work (timeout):
if ($path !== false && $path != '' && file_exists($path))
{
  foreach (new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS)) as $object)
  {
    $bytestotal += $object->getSize();
  }
}
This does not work (timeout) either:
$size = 0;
$files = glob($path.'/*');
foreach ($files as $filepath)
{
  is_file($filepath) && $size += filesize($filepath);
  is_dir($filepath) && $size += get_fs_directory_size($filepath);
}
return $size;
and this one returns false (but quickly):
if (!function_exists('exec'))
{
  return false;
}
@exec('du -sk '.$path, $returnarray);
if (is_array($returnarray) and !empty($returnarray[0]) and preg_match('/^(\d+)\s/', $returnarray[0], $matches))
{
  $bytestotal = $matches[1] * 1024;
}
return $bytestotal;
The problem with this last one is that we're sure it does not work on Windows (which is not that problematic, since Piwigo is not officially compatible with Windows anyway).
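A sketch of how the du-based helper could guard against Windows and against exec() being disabled, returning false ("unknown") instead of timing out. The function name get_fs_directory_size is taken from the snippet earlier in the thread; the rest is illustrative, not the actual Piwigo implementation:

```php
<?php
// Sketch: cache-directory size via `du`, with guards.
// Returns the size in bytes, or false when it cannot be determined.
function get_fs_directory_size($path)
{
  if (DIRECTORY_SEPARATOR === '\\')
  {
    return false; // `du` is not available on Windows
  }
  if (!function_exists('exec') || !is_dir($path))
  {
    return false;
  }

  $output = array();
  @exec('du -sk '.escapeshellarg($path), $output);

  if (isset($output[0]) && preg_match('/^(\d+)\s/', $output[0], $matches))
  {
    return (int)$matches[1] * 1024; // du -k reports kilobytes
  }
  return false;
}
```

escapeshellarg() also protects against spaces and shell metacharacters in the path, which the original snippet did not.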
What is the actual bottleneck that makes it time out? The number of directories? The number of files? Both? A slow disk? All three? A nasty short timeout? ;-P
In case exec() or popen() is allowed, a du call is probably the best to use.
Else, if it's the number of directories with files (i.e. under _data/i/upload/), a "fuzzy" status could be obtained one directory at a time, caching the result in a database table and remembering for each directory the datetime when it was last measured, then incrementally refreshing the oldest or not yet touched ones (for which a list of all subdirectories would need to be obtained, which hopefully would not time out). The result wouldn't be accurate, and would be too low at the beginning, but would get better over time. The percentage of scanned directories could be shown alongside. Just a quick idea.
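That incremental idea could be sketched roughly as follows; the cache is a plain array here (in Piwigo it would be a database table), and all function and field names are illustrative, not existing Piwigo API:

```php
<?php
// Sketch of the incremental approach: keep a per-directory size cache and
// refresh only the N stalest entries per run, so no single request has to
// scan the whole tree.
// size_one_dir() sums regular files directly inside $dir (no recursion).
function size_one_dir($dir)
{
  $bytes = 0;
  foreach (glob(rtrim($dir, '/').'/*') as $entry)
  {
    if (is_file($entry))
    {
      $bytes += filesize($entry);
    }
  }
  return $bytes;
}

// $cache maps dir => array('size' => bytes, 'computed_on' => timestamp).
// Refresh the $batch directories with the oldest (or missing) timestamps.
function refresh_dir_sizes(array $cache, array $all_dirs, $batch = 10)
{
  usort($all_dirs, function ($a, $b) use ($cache) {
    $ta = isset($cache[$a]) ? $cache[$a]['computed_on'] : 0;
    $tb = isset($cache[$b]) ? $cache[$b]['computed_on'] : 0;
    return $ta - $tb; // oldest first
  });

  foreach (array_slice($all_dirs, 0, $batch) as $dir)
  {
    $cache[$dir] = array('size' => size_one_dir($dir), 'computed_on' => time());
  }
  return $cache;
}

// The total is the sum of all cached entries: fuzzy at first, better over time.
function cached_total_size(array $cache)
{
  return array_sum(array_column($cache, 'size'));
}
```

Each dashboard request would call refresh_dir_sizes() once and display cached_total_size() along with the percentage of directories scanned so far.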
erAck wrote:
What is the actual bottleneck that makes it time out? The number of directories? The number of files? Both? A slow disk? All three? A nasty short timeout? ;-P
Can't say. The PHP execution times out on:
$bytestotal += $object->getSize();
It's code I copied from Stack Overflow, and I have absolutely no experience with RecursiveDirectoryIterator. I guess it's not as "works everywhere" as it was described!
erAck wrote:
In case exec() or popen() is allowed, a du call is probably the best to use.
Not 100% satisfying, BUT it works fine and quickly where it can, and doesn't break the execution where it doesn't work. Not having the "cache" size is not a major issue (we didn't have it in Piwigo 2.10).
erAck wrote:
Else, if it's the amount of directories (i.e. under _data/i/upload/) with files, a "fuzzy" status could be obtained one by one directory and caching the result [...] Just a quick idea..
Makes me think of the caching already performed by [extension by mistic100] Batch Downloader:

mysql> desc piwigo_image_sizes;
+-----------+--------------+------+-----+---------+-------+
| Field     | Type         | Null | Key | Default | Extra |
+-----------+--------------+------+-----+---------+-------+
| image_id  | mediumint(8) | NO   | PRI | NULL    |       |
| type      | varchar(16)  | NO   |     | NULL    |       |
| width     | smallint(9)  | NO   |     | NULL    |       |
| height    | smallint(9)  | NO   |     | NULL    |       |
| filesize  | mediumint(9) | NO   |     | NULL    |       |
| filemtime | int(16)      | NO   |     | NULL    |       |
+-----------+--------------+------+-----+---------+-------+
6 rows in set (0.01 sec)

mysql> select * from piwigo_image_sizes;
+----------+-------+-------+--------+----------+------------+
| image_id | type  | width | height | filesize | filemtime  |
+----------+-------+-------+--------+----------+------------+
|      287 | small |   576 |    432 |      105 | 1527176891 |
|      288 | small |   576 |    384 |       89 | 1527176891 |
+----------+-------+-------+--------+----------+------------+
2 rows in set (0.01 sec)
You could temporarily place this standalone code somewhere in the document path
<?php
$path = '/path/piwigo/_data';
$entries = 0;
$bytestotal = 0;
if ($path !== false && $path != '' && file_exists($path))
{
  foreach (new RecursiveIteratorIterator(new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS)) as $object)
  {
    ++$entries;
    echo "entry: $entries ";
    echo "object: $object ";
    $size = $object->getSize();
    echo "size: $size ";
    $bytestotal += $object->getSize();
    echo "bytes: $bytestotal\n";
  }
}
else
{
  echo "nopath\n";
}
echo "total: $bytestotal\n";
?>
and call it with a curl request like
curl http://example.com/dirsize.php
and see if, when, and where it fails.
Though unlikely, it might even be that the process times out because a subdirectory entry can't be read due to a disk failure, an awkward file system access, or something similar.
Apart from that, the code is fragile because it throws an exception when encountering, for example, a symbolic link that points into the void, i.e. whose target does not exist. Though by default RecursiveDirectoryIterator does not follow symlinks, getSize() tries a stat() call on them, which fails when the target is void, and counts the target on valid symlinks, which is also undesired. It also throws when an entry can't be read due to permissions. These cases at least should be caught.
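Those failure modes could be handled with a more tolerant variant of the loop. A sketch, assuming that silently skipping dangling symlinks and unreadable entries is acceptable for a dashboard statistic:

```php
<?php
// Sketch: directory size via the SplFileInfo iterators, hardened against
// dangling symlinks and unreadable entries instead of letting them throw.
function safe_directory_size($path)
{
  if (!is_dir($path))
  {
    return 0;
  }

  $bytes = 0;
  $iterator = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($path, FilesystemIterator::SKIP_DOTS),
    RecursiveIteratorIterator::LEAVES_ONLY,
    RecursiveIteratorIterator::CATCH_GET_CHILD // skip unreadable subdirectories
  );

  foreach ($iterator as $object)
  {
    if ($object->isLink())
    {
      continue; // don't stat symlink targets (they may be dangling)
    }
    try
    {
      $bytes += $object->getSize();
    }
    catch (RuntimeException $e)
    {
      // entry vanished or is unreadable; skip it
    }
  }
  return $bytes;
}
```

CATCH_GET_CHILD makes the iterator skip subdirectories it cannot descend into, and the isLink() check avoids both the exception on dangling symlinks and counting symlink targets.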