When I initially set up my 1999.io server, I wrote an article comparing Fargo with 1999.io, and in it I noted that one advantage Fargo had in using Dropbox was built-in backup for my writing.
Tonight I used some sample code from the 1999-server support group to implement a script that copies what I write on my server to an S3 bucket, which I configured for web hosting. The result is that everything I write is now stored both on my server and on S3, giving me the backup copy I wanted.
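I won't reproduce the support group's code here, but the idea can be sketched with the AWS CLI. The bucket name and local directory below are placeholders for illustration, not the actual values from my setup:

```shell
#!/bin/sh
# Sketch: mirror the pages rendered by a 1999.io server to an S3 bucket.
# LOCAL_DIR and BUCKET are hypothetical -- substitute your own values.
LOCAL_DIR="/home/1999/www"
BUCKET="s3://my-1999-backup"

# --delete removes objects from the bucket that no longer exist locally,
# keeping the bucket an exact mirror of the server's rendered output.
aws s3 sync "$LOCAL_DIR" "$BUCKET" --delete
```

Run from cron, a sync like this keeps the bucket current without any manual steps.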
The configuration also avoids a dependency that my Fargo blogs have on FargoPublisher to access what I write. FargoPublisher is the web server for the Fargo blogs, using S3 as its file system.
The consequence is that my Fargo blogs require both S3 and a server running publisher.js under Node.js for anyone to access my content. If the server hosting publisher.js goes down, webnotes.frankmcpherson.net becomes inaccessible even though the site hosting the content (S3) is still up.
Copying the content I write in 1999.io to an S3 bucket and configuring that bucket for web hosting means that if my 1999 server goes down, the content will still be accessible. Amazon is far more capable of keeping S3 available than I am of keeping my own servers running.
Best of all, keeping a web site running on S3 is just a matter of paying the AWS bill each month; it doesn't require the ongoing administration that my Linux servers do.
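For anyone setting this up themselves, enabling web hosting on the bucket is a one-time configuration step. A minimal sketch with the AWS CLI, again using a placeholder bucket name:

```shell
# One-time setup: turn on static website hosting for the bucket.
# The bucket name is a placeholder; your bucket policy must also
# allow public reads for the site to be reachable.
aws s3 website s3://my-1999-backup \
    --index-document index.html \
    --error-document error.html
```

After this, S3 serves the bucket's contents at its website endpoint with no server of yours in the loop.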
One really cool feature is that I can edit stories published to S3 directly within the page; the edits are sent back to my instance of 1999.io, rendered on the server, and copied back to the bucket. There is no need to move back and forth between an editor and the web site to edit content, just like you could do with EditThisPage.