Hi,
I have been using Piwigo for a couple of weeks now. I use it as a web portal to serve images from a NAS, so the files are on the local file system.
Initially, the synchronization worked great, but when I now want to synchronize again via the portal, I get a 504 timeout. I've read suggestions to increase the PHP timeout and to select folders one by one. Neither is a real option for me (a full sync takes a couple of hours and there are plenty of folders).
So I wonder: is there a command or tool available that I can use from the command line?
Cheers,
Piscator
I'm afraid there's no way to synchronize physical directories from the command line. Synchronization from the web interface is a legacy way to add photos to albums.
But if you have a lot of photos, you can use the command line to add your photos through the API.
I assume you mean this?
http://piwigo.org/demo/tools/ws.htm#top
I wasn't aware of this. Thanks for pointing me in this direction.
Any hints which API calls I could use?
I found this one: "pwg.images.syncMetadata"
But I don't think that's what I would need.
That does only what its name says: it syncs metadata from the image files.
It seems there is no API call to sync physical directories. There is tools/remote_sync.pl, but it in turn uses the site_update page, so it will not help against the server-side timeout. Maybe you could write a script that moves images in batches from a temporary location into the galleries/ directory albums and calls tools/remote_sync.pl on each batch.
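The batching idea could be sketched like this in Python. All paths, the batch size, and the way remote_sync.pl is invoked are hypothetical placeholders; adjust them for your own setup:

```python
#!/usr/bin/env python3
"""Sketch: move photos into galleries/ in small batches and sync each batch,
so each sync finishes before the server timeout. Paths are assumptions."""
import shutil
import subprocess
from pathlib import Path

STAGING = Path("/data/staging")                       # hypothetical holding area
GALLERIES = Path("/var/www/piwigo/galleries/album1")  # hypothetical target album
BATCH_SIZE = 50

def batches(items, size):
    """Yield successive slices of at most `size` items."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def move_and_sync():
    photos = sorted(p for p in STAGING.iterdir()
                    if p.suffix.lower() in {".jpg", ".jpeg", ".png"})
    for batch in batches(photos, BATCH_SIZE):
        for photo in batch:
            shutil.move(str(photo), str(GALLERIES / photo.name))
        # each small batch should finish well inside the server timeout
        subprocess.run(["perl", "tools/remote_sync.pl"], check=True)
```

Call move_and_sync() from a wrapper or cron job; the point is only that each remote_sync.pl run sees a bounded number of new files.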
Piscator wrote:
I assume you mean this?
http://piwigo.org/demo/tools/ws.htm#top
Yes, I meant the web service when I said API.
Piscator wrote:
Any hints which API calls I could use?
I found this one: "pwg.images.syncMetadata"
But I don't think that's what I would need.
I think you'd better use pwg.images.addSimple or pwg.images.addFile. I'm not sure you noticed it, but you would have to switch from physical albums to virtual ones.
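A minimal sketch of an upload through the web service via pwg.images.addSimple, assuming a hypothetical host, credentials, and category id. The parameter names follow the ws.php explorer linked above, but verify them against your own installation:

```python
#!/usr/bin/env python3
"""Sketch: upload one photo with pwg.images.addSimple.
BASE_URL and the arguments below are placeholders, not real values."""
import requests

BASE_URL = "http://mypiwigo_server"  # hypothetical host

def upload_photo(path, category_id, user, password):
    session = requests.Session()
    # authenticate first so the upload is allowed
    login = session.post(BASE_URL + "/ws.php?format=json",
                         data={"method": "pwg.session.login",
                               "username": user, "password": password})
    login.raise_for_status()
    # send the file itself as a multipart upload into a (virtual) album
    with open(path, "rb") as f:
        response = session.post(
            BASE_URL + "/ws.php?format=json",
            data={"method": "pwg.images.addSimple", "category": category_id},
            files={"image": f})
    response.raise_for_status()
    return response.json()
```

Looping this over a directory tree would replicate what the synchronization page does, but one request per photo, so no single request can hit the timeout.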
I finally had some time to look into 'physical directory' synchronization again.
Just for testing purposes, I increased the PHP timeout from 30s to 300s. When I add multiple folders in one go, it always seems to time out, but the pictures appear correctly in the database and the physical directories seem to work as expected. Adding a single folder works fine (no timeout). So I suspect there's a bug in the script that makes it hang indefinitely.
As a workaround, I was thinking of adapting the script *tools/remote_sync.pl* so that I can add a folder parameter (and then wrapping it in a script that syncs folder by folder).
# perform the synchronization
$form = {
    'sync'             => 'files',
    'display_info'     => 1,
    'add_to_caddie'    => 1,
    'privacy_level'    => 0,
    'sync_meta'        => 1, # remove this parameter, setting it to 0 is not enough
    'simulate'         => 0,
    'subcats-included' => 1,
    'submit'           => 1,
};
When I look at the script, this should be the place to add a folder. Does anyone know if this is possible?
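For illustration, the same form could be POSTed from Python with an extra field restricting the sync to one album. Note that the field name 'cat' and the album id are only my assumptions; check the real synchronization form on your installation (admin.php?page=site_update) for the actual name:

```python
#!/usr/bin/env python3
"""Sketch: the remote_sync.pl form fields as Python, plus a hypothetical
per-album restriction. The 'cat' field name is an assumption, not verified."""
import requests

form = {
    "sync": "files",
    "display_info": 1,
    "add_to_caddie": 1,
    "privacy_level": 0,
    # 'sync_meta' omitted entirely, per the comment in the Perl form
    "simulate": 0,
    "subcats-included": 1,
    "submit": 1,
    "cat": 42,  # hypothetical album id to restrict the sync to
}

def sync_album(session, base_url, data):
    # session must already be authenticated (pwg.session.login)
    return session.post(base_url + "/admin.php?page=site_update&site=1", data=data)
```

Iterating over album ids with small per-album syncs would keep each request short enough to avoid the 504.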
For anyone who is interested: I made a Python 3 script that synchronizes all physical albums from the command line. There is no need to sync sub-albums separately. You can run the script from a crontab to keep Piwigo in sync with newly added / deleted photos.
The script works as follows.
* Log in with pwg.session.login
* Send an HTTP request to start a synchronization
* If there is a timeout, watch the piwigo docker container to determine when the sync is finished (based on CPU load). If you don't use docker, you'll have to use an alternate method to check the CPU load of the piwigo process / PHP requests.
At first I sent a new sync request after each timeout, but that just starts the sync over again in parallel, doubling the CPU load. After 6-10 requests, piwigo is overloaded and completely inaccessible.
#!/usr/bin/env python3
import sys
import time
import subprocess

import requests


class PiwigoConnector(object):

    def __init__(self, base_url, user, password, docker_container="piwigo"):
        self.base_url = base_url
        self.user = user
        self.password = password
        self.docker_container = docker_container

    def sync(self):
        time_start = time.time()
        response, duration = self.sync_single()
        print("Status: {}, Duration: {}".format(response.status_code, duration))
        if response.status_code == 504:
            self.wait_for_sync()
        time_end = time.time()
        print("Total Duration: {}".format(time_end - time_start))

    def sync_single(self):
        login_data = {
            "method": "pwg.session.login",
            "username": self.user,
            "password": self.password,
        }
        session = requests.Session()
        print("Login: ...")
        login_response = session.post(self.base_url + "/ws.php?format=json", data=login_data)
        if login_response.status_code != 200:
            print(login_response)
            raise Exception("Login Failed!")
        print("Login: OK!")
        sync_data = {
            "sync": "files",
            "display_info": 1,
            "add_to_caddie": 1,
            "privacy_level": 0,
            "sync_meta": 1,
            "simulate": 0,
            "subcats-included": 1,
            "submit": 1,
        }
        time_start = time.time()
        sync_response = session.post(self.base_url + "/admin.php?page=site_update&site=1", data=sync_data)
        time_end = time.time()
        session.close()
        print("Connection closed")
        return sync_response, (time_end - time_start)

    def wait_for_sync(self, delay_time=10.0):
        # poll the container's CPU load; the sync is done once it idles
        command = "docker stats " + self.docker_container + " --no-stream --format \"{{.CPUPerc}}\""
        result = subprocess.check_output(command.split(" "))
        cpu_percentage = float(result[1:-3])  # strip the quotes, '%' and newline
        while cpu_percentage > 2.0:
            print(".", end="")
            sys.stdout.flush()
            time.sleep(delay_time)
            result = subprocess.check_output(command.split(" "))
            cpu_percentage = float(result[1:-3])
        print()


if __name__ == '__main__':
    base_url = "http://mypiwigo_server"
    user = "admin"
    password = "<password>"
    PiwigoConnector(base_url, user, password).sync()
Piscator,
Awesome, just what I was looking for. In my case, I adapted it slightly to sync only the particular category I wanted to update (I didn't want to sync the whole directory for resource reasons).
I've got your command running in a crontab process along with some other odds and ends using rsync.
Great stuff, Thanks, P404.
Whilst I have nothing against Python, why write it in Python when a PHP CLI is feasible, especially given that this is a PHP app? Also, are there any plans to include this in the package so that it can be shipped with the installation and documented?
Just adding to the previous message: bulk operations might involve a lot of files. See for instance this sync preview:
----------
104 albums added to the database
5345 photos added to the database
0 albums deleted in the database
0 photos deleted from the database
422 photos updated in the database
0 errors during synchronisation
----------
A CLI that talks directly to a Synchronise service would definitely be better, also because it could run in the background with the PHP CLI alone, without involving a web service/server.
Last edited by gebbione (2023-09-11 12:06:03)