backup to twitter?

Ridiculous? Yes.

Pointless? Arguably, yes.

Possible? Yes.

While catching up on some tweets last night, I came across this:

Which, naturally, made me wonder if you could actually use twitter as a backup service. Never mind that no one in their right mind would ever want to do this; I had to see if I could. My first thought was: encode, split, post? My first action was:

   $ man -k encode

Oh, neat: uuencode(1) will let me encode binary data as text. If you work in the web world, like I do, you might think of PHP’s base64_encode() function. uuencode(1) does the same thing, but on the command line: it takes binary data and converts it to printable ASCII. That happens to be exactly what twitter accepts, so it seemed like a perfect match.
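
To get a feel for the output, a quick test might look something like this (hello.txt is just a throwaway file, and the 644 in the header will depend on your umask):

   $ echo 'hello twitter' > hello.txt
   $ uuencode -m hello.txt hello.txt
   begin-base64 644 hello.txt
   aGVsbG8gdHdpdHRlcgo=
   ====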

So, what am I going to encode? A file would be too easy; I need something a bit more like what you might actually back up. How about a directory?

   $ mkdir twitterbackup
   $ cd twitterbackup
   $ touch file1.txt
   $ touch file2.txt
   $ touch file3.txt  
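
The files need a little content, so something like this will do (the text itself doesn’t matter):

   $ echo 'one' > file1.txt
   $ echo 'two' > file2.txt
   $ echo 'three' > file3.txt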

With a bit of text in each file, we’re good. Test case alpha, ready to go. So, what’s next? Oh, look at that: the uuencode(1) man page has an example of doing almost this same thing, but over email. Let’s steal that:

   $ tar cf - src_tree | compress | \
      uuencode src_tree.tar.Z | mail user@example.com

And adapt it to what we’re doing:

   $ tar cf - ./twitterbackup | compress | \
      uuencode -m twitterbackup.tar.Z | split -b 140

So, except for the mail bit, all I did differently was add the -m flag to uuencode(1) to ask for base64 encoding instead of the standard encoding. Base64 output is much more twitter-friendly, since there are no @ signs or # signs in it.

The only other change I made was to pipe the output of all of that into split(1) and ask for chunks of 140 bytes. Since an ASCII character is a single byte, 140 bytes equals 140 characters, which is just right for twitter.
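
By default, split(1) names its chunks xaa, xab, and so on, so a quick sanity check looks something like this:

   $ ls x??
   $ wc -c x??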

I now have 5 files of 140 characters each, which can be posted to twitter:

Cool. If I take the contents of those posts and put them in a single file, I should be able to get back to my original files by simply reversing the process. After a bit of tweaking, yes, it works. It turns out twitter slightly mangles the data: some newlines became spaces after being posted, but that should be easy to fix.
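
Spelled out, the return trip looks something like this (tweets.txt is just my name for the re-assembled posts, with the stray spaces turned back into newlines):

   $ uudecode -c tweets.txt          # recreates twitterbackup.tar.Z
   $ uncompress twitterbackup.tar.Z
   $ tar xf twitterbackup.tar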

After I fix the spacing issues, I’m able to uudecode -c the file, uncompress it, and untar it. I’m left with my original directory, with 3 files inside, and in one of them, the following message:

“Will this all be here when we get back?”

So, it seems that yes, twitter can be used to back up arbitrary data, though it’s quite tedious, probably violates some terms of service somewhere, and probably isn’t very reliable for important data. The other thing I’d want to do before releasing my twitter-based backup solution is encrypt the data before posting it; otherwise, anyone with a bit of patience could easily restore my backups.
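
For what it’s worth, dropping something like OpenSSL into the middle of the pipeline would cover that. A rough sketch (aes-256-cbc is just one reasonable cipher choice; openssl will prompt for a passphrase):

   $ tar cf - ./twitterbackup | compress | \
      openssl enc -aes-256-cbc -salt | \
      uuencode -m twitterbackup.tar.Z.enc | split -b 140

Restoring would then need a matching openssl enc -d -aes-256-cbc step before uncompress.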

 