Rclone Pictures to Google Drive
From Rabbi Blog
Move Files
Perl
- Purpose:
- Read source dir
- Rename files and move them into a Year/Month/Day structure
Code
use strict;
use warnings;
use English;
use File::Basename qw( fileparse );
use File::Path qw( make_path );
use File::Spec;
use File::Copy;
use Date::Format;
my $dir = '/home/user/dropbox/incoming';
my $destination ='/home/user/dropbox/sorted/';
foreach my $fp (glob("$dir/*.jpg")) {
print "_______________________________________\n";
print "Source File: $fp\n";
my $filename_year="";
my $filename_month="";
my $filename_day="";
my $filename_camera="";
my $filename_time="";
my $filename_extension="";
my $output_filename="";
### Filename Conversion #############################
# The original script used the file name as the source of the time;
# however, the camera in use apparently switches to GMT without
# warning...
#
# Pull the date from the file's last-modification time and use that instead
##############################################
# Fought with Perl's time & epoch conversion and ended up just using
# Google to check the values
################################################
my @array = stat($fp);
print "$array[9]\n";
my $inputEpoch = $array[9];
# The original subtracted a hard-coded 14400 seconds for EDT and used
# gmtime; localtime handles the UTC offset and DST changes automatically
my ($sec,$min,$hour,$mday,$mon,$year,$wday,$yday,$isdst) = localtime($inputEpoch);
$mon++;
$year = $year + 1900;
my $humanTime = sprintf ("%04d-%02d-%02d %02d:%02d:%02d", $year, $mon, $mday, $hour, $min, $sec);
print "$humanTime\n";
#################################################
# Output Filename creation ###
#################################################
$filename_camera="Prefix";
$filename_year=sprintf ("%04d", $year);
$filename_month=sprintf ("%02d", $mon);
$filename_day=sprintf ("%02d", $mday);
$filename_time=sprintf ("%02d:%02d:%02d", $hour, $min, $sec);
$filename_extension=".jpg";
#$output_filename=$filename_year.$filename_month.$filename_day.'_'.$filename_time.'_'.$filename_camera.$filename_extension;
$output_filename=$filename_camera.'-'.$filename_year.$filename_month.$filename_day.'_'.$filename_time.$filename_extension;
print "Output File: $output_filename\n";
###################################
# Check the destination folder(s) and create as needed
###################################
### Check for destination directory - YEAR
my $dest_year = $destination.$filename_year;
print "Dest Year: $dest_year\n";
### Check for destination directory - MONTH
my $dest_month = $dest_year.'/'.$filename_month;
print "Dest Month: $dest_month\n";
### Check for destination directory - DAY
my $dest_day = $dest_month.'/'.$filename_day;
print "Dest Day: $dest_day\n";
### Dest File
my $dest_file = $dest_day.'/'.$output_filename;
print "Dest File: $dest_file\n";
############# Create Dirs ################
# make_path creates intermediate directories, so one call on the
# day-level path covers the year and month levels as well
if ( !-d $dest_day )
{
print "Creating $dest_day\n";
make_path $dest_day or die "Failed to create path: $dest_day";
}
#### MOVE FILES #####
# copy ($fp,$dest_file) or die "The move operation failed: $!";
move ($fp,$dest_file) or die "The move operation failed: $!";
#
}
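The date-to-path mapping the script performs can be spot-checked from the shell. This is an illustrative sketch only; it assumes GNU date and stat, and /tmp/demo.jpg is a placeholder file, not part of the original setup:

```shell
#!/bin/sh
# Take a file's mtime (the same value Perl reads as (stat $fp)[9]) and
# print the sorted/<YYYY>/<MM>/<DD>/ destination it would be filed under.
f=/tmp/demo.jpg                          # placeholder path for the demo
touch -d '2021-06-15 10:30:00' "$f"      # fake an mtime for illustration
epoch=$(stat -c %Y "$f")
dest=$(date -d "@$epoch" +%Y/%m/%d)      # local time, like the script
echo "sorted/$dest/"                     # -> sorted/2021/06/15/
```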
rclone
- Install rclone
- Reference for setup: here
- Execute rclone config from the account that will be using it.
- N(ew)
- Name: <NameofRcloneProject>
- Select the Google Drive entry (13 at the time of writing; the number varies by rclone version):
- Client_ID: enter (default)
- Client_Secret: enter (default)
- Scope: 1, full access (scope 3, drive.file, which only sees files rclone created, is worth a look)
- Optional: create a folder in Drive and note its folder ID for the next prompt
- Root folder ID: press Enter for the Drive root, or paste the folder ID from the previous step
- Service Account: Enter
- Edit Advanced config: n
- Use autoconfig? N (headless)
- Copy line to browser that is logged into the Google account to be used
- Grant rclone access
- copy the code
- Enter verification code: paste the code
- Configure this as a team drive? N
- Select: Yes this is OK
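Once the config is confirmed, rclone writes the remote to ~/.config/rclone/rclone.conf. The resulting section looks roughly like this (token abbreviated; the remote name comes from the steps above):

```
[NameofRcloneProject]
type = drive
scope = drive
token = {"access_token":"...","token_type":"Bearer","refresh_token":"...","expiry":"..."}
```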
Example rclone lines
copy
rclone copy --update --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s "/home/someuser/dropbox/sorted" "NameofRcloneProject:NameofGoogleDriveFolder"
sync
rclone sync /home/someuser/dropbox/sorted NameofRcloneProject:GoogleDriveFolder --transfers=4 --checkers=2 --tpslimit=2 --drive-chunk-size=1M --bwlimit 4M
Google Drive Limited Space
With the 15 GB free-tier limit, consider letting the sync delete remote files during the check (--delete-during below). Alternatively, something local could check the current date and move older folders out of the sync path, though that means the local drive could fill up some day.
rclone sync /home/userdir/securitycameras NameofRcloneProject:NameofGoogleDriveFolder --transfers=4 --checkers=2 --tpslimit=2 --drive-chunk-size=1M --bwlimit 4M --delete-during
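The "move older folders out of the sync path" idea above can be sketched with find. The /tmp paths, the demo day folder, and the 30-day cutoff are all placeholders, not part of the original setup:

```shell
#!/bin/sh
# Demo setup: a day-level folder with an old mtime, under placeholder paths
SYNC=/tmp/securitycameras
ARCHIVE=/tmp/securitycameras-archive
mkdir -p "$SYNC/2021/06/15" "$ARCHIVE"
touch -d '40 days ago' "$SYNC/2021/06/15"

# Move day-level folders older than 30 days out of the sync path so the
# Drive side stays under quota; archived copies remain on local disk.
# -mindepth/-maxdepth 3 matches the <YYYY>/<MM>/<DD> layout from movefiles.pl
find "$SYNC" -mindepth 3 -maxdepth 3 -type d -mtime +30 |
while read -r day; do
  rel=${day#"$SYNC/"}                    # e.g. 2021/06/15
  mkdir -p "$ARCHIVE/$(dirname "$rel")"
  mv "$day" "$ARCHIVE/$rel"
done
```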
runscript.sh
- Purpose:
- make sure rclone isn't running
- run fetchmail, then if mail arrived, pull out the MIME attachments
- run the Perl script to sort the files
- run rclone to sync
Contents
if ps ax | grep -v grep | grep rclone > /dev/null
then
echo "RCLONE is running, let's not bind the system up"
else
echo "RCLONE is not running"
fetchmail
cp /var/spool/mail/somedir /home/somedir/
echo 'd *' | mail -N
ripmime -i somedir -d /home/somedir/checkmail
rm /home/somedir/checkmail/text*
perl /home/somedir/movefiles.pl
# rclone copy /home/somedir/securitycameras rclone:securitycameras --transfers=40 --checkers=15 --tpslimit=10 --drive-chunk-size=1M --bwlimit 4M
rclone sync /home/somedir/securitycameras rclone:securitycameras --transfers=4 --checkers=2 --tpslimit=2 --drive-chunk-size=1M --bwlimit 4M
exit 0
fi
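The ps|grep guard at the top can race if two cron runs start at the same moment. A flock(1)-based variant is more reliable; the lockfile path is a placeholder:

```shell
#!/bin/sh
# Open a lockfile on fd 9 and take an exclusive, non-blocking lock;
# if another run already holds it, exit quietly instead of piling up.
exec 9>/tmp/runscript.lock
if ! flock -n 9; then
    echo "RCLONE run already in progress, let's not bind the system up"
    exit 0
fi
# Everything from here on runs only in the single lock holder;
# the fetchmail/ripmime/perl/rclone steps would go here.
echo "Lock acquired, safe to run the sync"
```

The lock is released automatically when the script exits, so no cleanup step is needed.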