
Feature to re-sync local folder structure #8874

@christianlupus

Description


How to use GitHub

  • Please use the 👍 reaction to show that you are interested in the same feature.
  • Please don't comment if you have no relevant information to add. It's just extra noise for everyone subscribed to this issue.
  • Subscribe to receive notifications on status change and new comments.

Feature request

Which Nextcloud version are you currently using: 23/31/32; the version is probably not relevant to this feature request.

Is your feature request related to a problem? Please describe.
I have installed the NC sync client on Archlinux and have it sync my files. This mostly works on my installation 🎉.

From time to time, however, there are glitches that I cannot reproduce. They might be related to power outages, OOM kills, or similar events; I simply do not know. All I know is that at some point the sync operation was aborted and forgot about certain files it had started to sync: these files/folders end up deselected in the list of folders to sync. This has a few implications:

  1. I do not necessarily realize that the sync process is broken. I add more data locally, then want to access the files via the web frontend or on another machine, and am surprised by the missing data.
  2. If I rely on NC as a safeguard against data loss (e.g. a broken hard disk), this is especially nasty: I assume everything was synced and that I can simply replace the disk, but doing so loses the unsynced data.
  3. Re-adding the folder causes the client to warn that there are already files in the location, and the location is then left alone.

While I understand that point 3 prevents further conflicts, I see no easy way to get the data synced again. My current workaround is to create a local copy of all data, sync the upstream files (downloading GBs of data I already have locally), and then use tools like rmlint, fdupes, or similar to remove all duplicated data from my local copy. Then I can manually sync the remaining files back in. This is time-consuming and only feasible for tech-savvy people.
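For illustration, here is a minimal sketch of the dedup step in that workaround. The folder paths are assumptions, and real tools like rmlint or fdupes handle this far more robustly (moved files, hardlinks, etc.); this only shows the idea of keeping what the server never received.

```python
#!/usr/bin/env python3
"""Sketch: remove files from a manual backup copy whose content already
exists at the same relative path in the freshly re-synced folder."""
import hashlib
from pathlib import Path

SYNCED = Path.home() / "Nextcloud"         # freshly re-synced folder (assumed path)
BACKUP = Path.home() / "Nextcloud.backup"  # manual copy made before re-adding

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

for backup_file in BACKUP.rglob("*"):
    if not backup_file.is_file():
        continue
    synced_file = SYNCED / backup_file.relative_to(BACKUP)
    # Delete the backup copy only if the synced file exists and matches bit for bit.
    if synced_file.is_file() and sha256(synced_file) == sha256(backup_file):
        backup_file.unlink()

# Whatever remains under BACKUP is data the server never received and can be
# moved back into SYNCED by hand.
```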

Describe the solution you'd like
There are two things that I would like to have:

  1. Prevent the problem in the first place. This is more of a bug, but I cannot reproduce it, and I am not willing to risk my data consistency in a futile effort to trigger the crash again. So if there is something that can be addressed in hindsight, we can talk about that (this is now the third time I have had to clean up my locally synced data).
  2. Some way to re-establish syncing without completely re-downloading all files and then cleaning up the copy on the local disk at the file-system level. (This is the main point of this feature request.)

I think this could be done either by temporarily disabling the check for a pre-populated folder and then resolving the conflicts one by one. Alternatively, one could make the local copy and tell the desktop client to use it as a source of duplicate candidates. Think of rsync's --copy-dest parameter: before a file is downloaded, it is first checked whether the local copy already contains the file with a hash matching the upstream hash, and if so, the local file is used instead of downloading. This would also help in other situations, e.g. when you want to transfer huge amounts of data via sneakernet.
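To make the --copy-dest idea concrete, here is a minimal sketch, not the actual client API: fetch(), download(), and the way the upstream checksum is obtained (remote_checksum) are all hypothetical, and SHA-1 is just a placeholder for whatever checksum the server reports.

```python
"""Sketch of an rsync --copy-dest-style lookup before downloading."""
import hashlib
import shutil
from pathlib import Path

def local_checksum(path: Path) -> str:
    h = hashlib.sha1()  # placeholder algorithm (assumption)
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def download(remote_path: str, target: Path) -> None:
    raise NotImplementedError("placeholder for the client's network fetch")

def fetch(remote_path: str, target: Path, copy_dest: Path,
          remote_checksum: str) -> None:
    """Materialize remote_path at target, preferring a local candidate.

    If copy_dest holds a file at the same relative path whose checksum
    matches the upstream one, copy it locally instead of downloading.
    """
    candidate = copy_dest / remote_path  # remote_path assumed relative
    if candidate.is_file() and local_checksum(candidate) == remote_checksum:
        shutil.copy2(candidate, target)  # reuse local data, no network traffic
    else:
        download(remote_path, target)
```

With such a hook, a sneakernet workflow would reduce to copying the data onto the new machine and pointing the client at that copy.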

Describe alternatives you've considered
Keeping the manual process and hoping for no further data loss.

Additional context
Currently running Archlinux with NC desktop version 3.17.2 from the official repositories.
