Rclone
Rclone is a command line program to manage files on cloud storage. It is a feature-rich alternative to cloud vendors' web storage interfaces. Over 40 cloud storage products support rclone, including S3 object stores, business and consumer file storage services, and standard transfer protocols.
Rclone has powerful cloud equivalents to the Unix commands rsync, cp, mv, mount, ls, ncdu, tree, rm, and cat. Rclone's familiar syntax includes shell pipeline support and --dry-run protection. It can be used at the command line, in scripts, or via its API.
Rclone documentation and an in-depth installation guide can be found at rclone.org.
Rclone helps you:
- Backup (and encrypt) files to cloud storage
- Restore (and decrypt) files from cloud storage
- Mirror cloud data to other cloud services or locally
- Migrate data to cloud, or between cloud storage vendors
- Mount multiple, encrypted, cached or diverse cloud storage as a disk
- Analyse and account for data held on cloud storage using lsf, lsjson, size, ncdu
- Union file systems together to present multiple local and/or cloud file systems as one.
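As a quick illustration of the capabilities above, a typical backup-and-restore workflow might look like the following sketch. All commands and flags shown are standard rclone; the remote name `mydrive:` and the paths are placeholders for whatever you set up with `rclone config`.

```shell
# One-time setup: configure a remote interactively
rclone config

# Preview what a backup would transfer, without changing anything
rclone sync ~/Documents mydrive:backup/Documents --dry-run

# Run the backup for real, with progress output
rclone sync ~/Documents mydrive:backup/Documents --progress

# Restore (copy back) from cloud storage
rclone copy mydrive:backup/Documents ~/restore/Documents

# Inspect what is held remotely
rclone lsf mydrive:backup
rclone size mydrive:backup
```

Note that `sync` makes the destination match the source (deleting extra files there), while `copy` only adds and updates, which is why the restore step uses `copy`.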
Fixing duplicates in Google Drive using rclone dedupe:
<# RCLONE HELP
   https://rclone.org/commands/rclone_dedupe/
   Useful options I chose:
     --max-depth int
     --dry-run
     --log-file=PATH
     --tpslimit 1   # can help prevent rate-limiting errors you might see if you run verbose x2, i.e. `-vv`
     --checkers 1
#>

# Install rclone and a log viewer to make it nice to review results
choco upgrade rclone -y
choco upgrade tailblazer -y   # nice tail log view for streaming rclone activity

# Alias for usage
New-Alias rclone -Value "C:\Program Files\rclone\rclone-v1.42-windows-amd64\rclone.exe" -Force

# Logging to this directory
New-Item 'C:\temp' -ItemType Directory -Force
$LogFile = "C:\temp\rclone-$(Get-Date -Format 'yyyy-MM-dd').log"

# Tested against a folder with duplicates, with a dry run
rclone dedupe newest googleappsdrive:Test --log-file=$LogFile --dry-run

# Ran against the folder and it removed 3 of the 4 copies, leaving only one file, now deduplicated
rclone dedupe newest googleappsdrive:Test --log-file=$LogFile

# Another, larger folder with duplicates. This ran into issues when I didn't limit depth
rclone dedupe newest --dry-run googleappsdrive:"Amazon Drive\Development" --log-file=$LogFile --max-depth 2

# Merge only the root folder level of duplicate folders
rclone dedupe newest googleappsdrive:"" --drive-skip-gdocs --log-file=$LogFile -vv --tpslimit 4 --transfers 1 --fast-list --max-depth 1 --stats=30s

# Now dig into my Lightroom library and deduplicate a bit more.
# I did this to confirm it was working before I ran it against all of my folders.
rclone dedupe newest googleappsdrive:"Lightroom" --drive-skip-gdocs --log-file=$LogFile -vv --tpslimit 4 --transfers 1 --fast-list --max-depth 1 --stats=30s
rclone dedupe newest googleappsdrive:"Lightroom" --drive-skip-gdocs --log-file=$LogFile -vv --tpslimit 4 --transfers 1 --fast-list --max-depth 2 --stats=30s
rclone dedupe newest googleappsdrive:"Lightroom" --drive-skip-gdocs --log-file=$LogFile -vv --tpslimit 4 --transfers 1 --fast-list --max-depth 3 --stats=30s

# Now that I've got everything figured out, run against the entire drive
rclone dedupe newest googleappsdrive:"" --drive-skip-gdocs --log-file=$LogFile -vv --tpslimit 4 --transfers 1 --fast-list --stats=30s
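For reference, `newest` is only one of the deduplication strategies `rclone dedupe` supports; the default is an interactive prompt, which is a safer first step on a folder you have not inspected. A brief sketch, reusing the same `googleappsdrive:` remote name from above:

```shell
# Interactive mode (the default): prompts for each set of duplicates
rclone dedupe googleappsdrive:Test --dry-run

# Strategies can also be supplied via --dedupe-mode
rclone dedupe --dedupe-mode oldest googleappsdrive:Test --dry-run   # keep the oldest copy
rclone dedupe --dedupe-mode rename googleappsdrive:Test --dry-run   # keep everything, rename to unique names
```

The `rename` mode is useful when you are not sure the duplicates are truly identical, since nothing is deleted.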
Use rclone for cloud storage and coordination
Rclone can connect your cloud storage (Google Drive, Dropbox, OneDrive, Nextcloud, ...) to your PC like a local hard drive, making it a good alternative to Google Drive File Stream.
rclone mount --vfs-cache-mode full Gdrive: z: -v
rclone mount --vfs-cache-mode full Gdrive: x: -v
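The local cache created by `--vfs-cache-mode full` can grow large, so it is worth bounding it. The flags below are standard rclone VFS options; the size and age values are arbitrary examples to tune for your own disk, and `Gdrive:`/`z:` follow the remote and drive letter used above.

```shell
# Mount with a bounded VFS cache (example limits: 10 GiB, entries expire after 24h)
rclone mount Gdrive: z: -v --vfs-cache-mode full --vfs-cache-max-size 10G --vfs-cache-max-age 24h
```

Without limits, the cache is still cleaned periodically, but setting `--vfs-cache-max-size` gives you a hard ceiling on local disk usage.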
How to transfer files that are marked as abusive
--drive-acknowledge-abuse   # Allows files which return cannotDownloadAbusiveFile to be downloaded.
rclone sync -P A: B: --drive-acknowledge-abuse --exclude "/some_folder/**"
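When the exclusion list grows beyond one or two patterns, it can be moved into a file instead of the command line. `--exclude-from` is a standard rclone filtering flag; the file name and patterns below are placeholder examples matching the sync above.

```shell
# excludes.txt, one pattern per line:
#   /some_folder/**
#   *.tmp
rclone sync -P A: B: --drive-acknowledge-abuse --exclude-from excludes.txt
```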
Reference links
For global flags, check the docs.
https://www.osc.edu/resources/getting_started/howto/howto_use_rclone_to_upload_data
….