using vim and the command line to back up and remove a lot of directories

Was just working on some server cleanup for a client and thought that this might be a handy tip for anyone out there in a similar situation...

Here's the problem, I have a directory on a server that has lots (i.e. thousands) of subdirectories. I need to copy those up to an S3 backup bucket - organized by year and month - and then delete them.

The tricky bit is they're not organized by month/year right now. It's just a giant mess of subdirectories without much of a coherent naming strategy.

So, here's what I've come up with so far.

First, I generate a text file listing all the directories, filtering through grep for the year I'm interested in, i.e.

ls -ltr | grep 2019 > list.txt

This gives me a list of every directory entry whose long-format line contains '2019'. Since ls -l includes the modification date, that should catch everything last touched in 2019 (it'll also pick up anything with '2019' in its name, so the list is worth a quick scan).
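At this point each line in list.txt is a full long-format entry. A typical line (the directory name here is made up) looks something like:

drwxr-xr-x 2 www-data www-data 4096 Apr 15  2019 some-client-dir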

Next, I open that file in vim, pull out just the lines for the month I'm backing up, and use a regex to remove the leading bit of each line that shows the permissions, owner, size, date, etc.

:% s/.*2019 //g

So now I have a list of just the directory names, one per line.
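Using the sample line from above, that leaves just:

some-client-dir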

Then I use another regex to turn each line into an aws cli command that copies the directory up to the bucket. The aws cli command looks something like this:

aws s3 cp --recursive [directory] s3://[bucket]/2019/04/[directory]

The vim regex looks like this, with the command from above dropped in. I use # as the substitute delimiter so the slashes in the S3 URL don't need escaping, and the \1 backreference fills in the directory name in both spots (note there's no /g flag; with a .* pattern it would match a second, empty time at the end of the line and duplicate the command):

:%s#\(.*\)#aws s3 cp --recursive \1 s3://[bucket]/2019/04/\1#
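Once that runs, every line in the file is a complete command. With the made-up directory from earlier and a made-up bucket name, a line would look like:

aws s3 cp --recursive some-client-dir s3://my-backup-bucket/2019/04/some-client-dir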

Once that's done, I have a big old text file of aws cli commands. I quit out of vim, make that file executable, and run it on the command line.
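Concretely, that part is just the following (you could equally skip the chmod and run it with bash list.txt):

chmod +x list.txt
./list.txt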

To remove the directories after the copy finishes, I basically use the exact same process - except that the last substitution generates the appropriate rm -rf command instead - and execute that file.
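The substitution for the delete pass is just a simpler version of the one above, something like:

:%s/\(.*\)/rm -rf \1/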

I'm sure there are better/smoother processes, but this one works pretty well for my needs. One thing I need to remember to do next time I start a long-running command is to run it in a screen session in case of connection issues. That's a post for another day!
