Lately I have become completely frustrated with the Dropbox and R connection. I used the Dropbox folder on my production server hosted on AWS for months, but something would invariably cause the sync to stop working. I found it very difficult to get the Dropbox folders to re-sync. I would waste a lot of time fiddling with it and finally get it working, only to find that it would stop again for some reason outside my control. I loved the simplicity of saving client files to a Dropbox folder and just letting them access it, but I could not afford the downtime on time-sensitive reports. I just really want to set it and forget it.

As an alternative, I have found that the aws.s3 library is bulletproof, but getting clients to access an S3 bucket is not something I would like to manage. So I recently turned to the googledrive library. It has taken some setup, but I think I have a method for using it to replace my Dropbox use. The first problem is that googledrive does not simply sync a folder from my AWS server to my Google Drive. So I needed to save the files one at a time, using the naming conventions the Drive API expects. It took a little while to work around that, and I came up with this:

library(googledrive)

# Remove all report files from Google Drive and re-upload them from the local
# data folder so the two stay in sync.
# This is the location in my Google Drive where I have previously saved reports;
# `client` is set earlier in the script (one run per client).
google_path = paste0("Client_Reports/", client, "/reports/")
# drive_ls() returns a dribble (a tibble of Drive files), so the Excel files can be deleted in bulk
tmp.df <- drive_ls(google_path, recursive = TRUE)
# Permanently remove all the Excel files. Since I run this every day,
# I don't need to keep backups or versions of these reports.
drive_rm(tmp.df[grepl("\\.xlsx$", tmp.df$name), ])

I build the location this way so the script can be run with Rscript and is not tied to a specific environment. Since I use this report logic for many different clients, it needs to work both ways and for many clients. The files I want to put into Google Drive are saved in a data folder and then separated by client name. You can use subfolders here and they will be uploaded into Google Drive automatically, as long as the matching folders already exist in the Drive.
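Running this under Rscript also means there is no browser available for the OAuth dance, so the script needs either a cached token or a service-account key. A minimal sketch of non-interactive auth with googledrive; the key path and email below are assumptions, substitute your own:

```r
library(googledrive)

# Authenticate non-interactively with a service-account JSON key.
# The path is a placeholder; point it at your own key file.
drive_auth(path = "/home/rstudio/.secrets/drive-token.json")

# Alternatively, cache a user token once interactively, then reuse it in scripts:
# options(gargle_oauth_cache = "/home/rstudio/.secrets")
# drive_auth(email = "me@example.com")
```

If you use a service account, remember to share the Drive folder with the service account's email address, or it will not be able to see the files.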

local_path = paste0("/home/rstudio/sync", "/data/", client, "/reports/")
# Only pick up the Excel reports; `pattern` keeps stray files out of the upload.
x = dir(local_path, recursive = TRUE, pattern = "\\.xlsx$")

# Loop through all the Excel files in the local data folder and save a copy to Google Drive.
lapply(x, function(.x) {
  p = paste0(local_path, .x)  # full local path; dir() returns paths relative to local_path
  n = basename(.x)            # file name to use on the Drive
  d = dirname(.x)             # subfolder, or "." for files at the top level
  g = if (d == ".") google_path else paste0(google_path, d, "/")
  drive_upload(media = p, path = g, name = n)
})
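One caveat with the loop above: drive_upload() fails if the target Drive folder does not exist yet, which is why the subfolders must already be present on the Drive. If you would rather create missing subfolders on the fly, a sketch using drive_mkdir() is below; it assumes `google_path` and `x` are defined as in the script above and that subfolders are only one level deep:

```r
library(googledrive)

# Names of the folders that already exist under the client's reports folder.
existing = drive_ls(google_path)$name

# Every subfolder referenced by the local files, skipping top-level files (".").
for (d in setdiff(unique(dirname(x)), ".")) {
  if (!d %in% existing) {
    # Create the missing subfolder inside google_path before any uploads go there.
    drive_mkdir(name = d, path = google_path)
  }
}
```

Running this once before the upload loop keeps the loop itself simple, at the cost of one extra drive_ls() call per run.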