Backup & Disaster Recovery Solution Partners

The use of Wasabi’s hot cloud storage service with backup and disaster recovery solutions is one of the most popular applications today. Wasabi’s partners in this area include: Archiware, Actifio, Arcserve, Aparavi, Arq, CloudBerry Lab, Comet Backup, Commvault, Duplicacy, Duplicati, GoodSync, IBM, Iperius Backup, qBackup, Quest, reevert, Restic, Retrospect, StorageCraft, Syncovery, Veeam, Veritas, and Vertical Backup.

Wasabi’s technology alliance partners play an essential role in helping us achieve our vision of making cloud storage a fast and affordable utility that’s as simple as plugging into an electric socket. Wasabi works with many different types of technology alliance partners covering a wide variety of important storage solutions and related categories. That’s why we’re thrilled at how many partners have already joined us (over one hundred and counting) and at the rapid pace at which new partners are discovering us. Here is just a small sample of our current alliance partners and the categories to which they belong. If you would like to learn a bit more about what each technology partner does, just click on the appropriate link below or visit /partners/technology-partners. If you build a platform that offers an AWS S3 interface and you would like to work with Wasabi as a technology alliance partner, we’d love to hear from you.

Overriding the “last_snapshot” Folder Name in rclone_jobber

Response:

Can you rearrange your existing destination backup directories to match the directory structure used by rclone_jobber? That is, the structure implied by:

cmd="rclone sync $source $dest/$new $backup_dir $log_option $options"

As is, rclone_jobber.sh will always back up to “last_snapshot”. The rclone_jobber directory structures are illustrated in [figure not recovered]. If you cannot rearrange your existing destination backup directories, you can customize the rclone_jobber.sh script. A minimal rclone_jobber.sh customization would be to move the $new variable to a parameter, for example:

new="$7"

and then specify "$new" in the backup jobs. Write a separate backup job for each directory; e.g. “job_backup_pictures” backs up to new="pictures". Example job_backup_pictures:

#!/usr/bin/env sh
options="--filter-from=$rclone_jobber/examples/filter_rules --checksum --dry-run"
$rclone_jobber/rclone_jobber.sh "$source" "$dest" "$move_old_files_to" "$options" "$(basename $0)" "$monitoring_URL" "$new"

If the job specifies new="last_snapshot", it behaves like the stock rclone_jobber.sh script. Of course, test on a small test directory before deploying real backup jobs.

Follow-up:

Thanks for responding! I tried this on a small test directory and ran into problems. I edited rclone_jobber.sh to replace the new="latest_backup" variable with new="$7". The rclone target is “localtest_crypt”, set as $test_remote, which is a subfolder in my home directory called rclone_test. The folder I’m trying to use as the latest backup folder is “music”, so in the job script I set the variable new="music". The job script:

# testing using parameters to override the "latest_backup" folder name
# substitute $rclone_jobber and $ with paths on your system
rclone_jobber=$rclone_jobber  # path to rclone_jobber directory
#display test directories (comment these if calling from job scheduler)
$rclone_jobber/rclone_jobber.sh "$source" "$dest" "$move_old_files_to" "$options" "$(basename $0)" "$monitoring_URL"

The log shows:

Back up in progress _21:51:35 _music_test.sh
rclone sync /home/joti/test_rclone_data localtest_crypt:/ -backup-dir=localtest_crypt:/archive/2018/_21:51:35 -log-file=/home/joti/github_repos/rclone_jobber/rclone_jobber.log
ERROR: _music_test.sh failed.

The localtest_crypt:/ is what I think is wrong; it looks like rclone_jobber.sh isn’t handling the $new variable in the job script properly, but I don’t know why.
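The customization discussed in the thread can be sketched end to end. Below is a hypothetical stand-in for the relevant part of rclone_jobber.sh (it is not the real script: the argument list is shortened for brevity, and the rclone command is only echoed, never run). It uses a `${n:-default}` expansion so a job that omits the extra argument falls back to the stock folder name. Note that the bare `new="$7"` edit has no such fallback: a job script that passes only six arguments leaves $new empty, which would produce a destination like "localtest_crypt:/" with no folder name, as in the log above.

```shell
#!/usr/bin/env sh
# Hypothetical sketch of the "move $new to a parameter" customization.
# In the real script the folder name would be the 7th parameter; here it
# is the 4th because the other arguments are dropped for brevity.

build_cmd() {
    source="$1"
    dest="$2"
    options="$3"
    # Default to the stock folder name when the job omits the argument.
    # A plain new="$4" would instead leave $new empty in that case.
    new="${4:-last_snapshot}"
    echo "rclone sync $source $dest/$new $options"
}

build_cmd /home/user/pictures remote: --checksum pictures
# -> rclone sync /home/user/pictures remote:/pictures --checksum

build_cmd /home/user/pictures remote: --checksum
# -> rclone sync /home/user/pictures remote:/last_snapshot --checksum
```

With the default in place, existing jobs that never pass a folder name keep behaving like the stock script, while new jobs can opt in per directory.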