Necessity is the Mother of Invention
Background
A lot of my recent projects, the ones I quickly lose interest in, are things I decided to build solely because I wanted to build something. When something is really needed, though, it becomes top priority, and a real sense of accomplishment comes from overcoming the challenge.
I used to never delete pictures. However, since I got my DSLR, I tend to shoot in both RAW and JPG format simultaneously. It's a feature I love about my camera, and I occasionally use the RAW files for touch-ups and the like. These files are large. I decided that I can't keep everything anymore, and that's OK. There really isn't a good reason to keep a picture that's a complete blur, especially since I tend to take more than one picture at a time. Why not just keep the good ones?
Well, going to Groundhog Day was the last straw. It was time to automate.
The Problem
When I move all of my pictures from my camera to my PC, I keep them in a folder structure that looks like this:
Camera/
    Year.Month/
        *.JPG
        raw/
            *.CR2
So, the JPG of my close-up of the Punxsutawney Phil sign is located at Camera/2010.02/IMG1532.JPG, and the CR2 file (Canon's RAW format) is located at Camera/2010.02/raw/IMG1532.CR2.
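To make that mapping concrete, here's a quick sketch of how a JPG path translates to its RAW counterpart (just an illustration, not part of the actual script):

#!/usr/bin/perl
use strict;
use warnings;
use File::Basename;

# Toy example: derive the CR2 path from a JPG path.
my $jpg = "Camera/2010.02/IMG1532.JPG";
my ($base, $dir) = fileparse($jpg, qr/\.JPG$/i);  # "IMG1532" and "Camera/2010.02/"
print "${dir}raw/$base.CR2\n";                    # prints Camera/2010.02/raw/IMG1532.CR2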
After moving the pictures, I load them up in digiKam, my photo manager. I tag them, sometimes write captions for a slideshow, and delete the blurry and unusable pictures. I used to then go into the raw/ folder and delete the corresponding CR2 files by hand. To give some perspective on the number of files I'm talking about: I took a lot of pictures on Groundhog Day and kept about 183 of them.
I decided to write a script to delete CR2 files for me.
My Solution
First, I backed up my pictures directory. Then, I made a copy of the 2010.02/ folder to test my script against. I get paranoid about my pictures and documents. I can't just download these again. They're originals. Once they're gone, they're gone for good.
Then, I broke out the old perl skills. I love perl.
#!/usr/bin/perl
use warnings;
use strict;

# Open the picture folder and its raw/ subfolder.
opendir (JPGDIR, "./") || die "Can't open ./";
opendir (CR2DIR, "./raw/") || die "Can't open ./raw/";

# Collect the base name of every JPG that survived the purge.
my @jpgs;
foreach (readdir(JPGDIR))
{
    if (/(.+)\.JPG/i)
    {
        push (@jpgs, $1);
    }
}
closedir(JPGDIR);

# Check every CR2 against that list.
foreach (readdir(CR2DIR))
{
    if (/(.+)\.CR2/)
    {
        &removeifmissing($1);
    }
}
closedir(CR2DIR);

exit;

# Delete raw/<name>.CR2 unless <name> still has a matching JPG.
sub removeifmissing {
    my $raw = $_[0];
    foreach (@jpgs)
    {
        return if ($_ eq $raw);
    }
    qx{rm ./raw/$raw.CR2};
    print "$raw.CR2 removed\n";
}
What does that mean?
First, the script opens both directories. Then it finds every file name ending in ".JPG", strips the extension, and copies the base name into an array. Finally, it finds every file name ending in ".CR2" and calls removeifmissing with its base name.
removeifmissing compares the CR2 base name (such as "IMG_1592") with each element in the array. If the name is found, then the JPG still exists and the function returns without doing anything. If it isn't found, then I must have deleted the corresponding JPG, so the script deletes the CR2 file.
Improvements/Justifications
Let me be the first to say that this script can be improved. There is a lot of iterating that I'm sure could be done more intelligently and quickly. I know I can use a proper perl call to delete a file rather than performing a direct "rm" call. The script only checks CR2 files, which I did on purpose, since I sometimes make XCF (GIMP format) and TIFF files and don't want those deleted along the way.
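If I ever did bother, the cleaned-up version would probably look something like this, with a hash instead of the inner loop and Perl's built-in unlink instead of shelling out to rm (an untested sketch, not what I actually run):

#!/usr/bin/perl
use strict;
use warnings;

# Build a hash of surviving JPG base names for quick lookups.
opendir(my $jpgdir, "./") or die "Can't open ./: $!";
my %jpgs = map { /(.+)\.JPG$/i ? ($1 => 1) : () } readdir($jpgdir);
closedir($jpgdir);

# Delete any CR2 whose base name no longer has a matching JPG.
opendir(my $cr2dir, "./raw/") or die "Can't open ./raw/: $!";
foreach my $entry (readdir($cr2dir)) {
    next unless $entry =~ /(.+)\.CR2$/;
    unless ($jpgs{$1}) {
        unlink("./raw/$entry") or warn "Couldn't remove $entry: $!";
        print "$entry removed\n";
    }
}
closedir($cr2dir);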
Here's the other side of the coin though: I don't care. There's a principle of diminishing returns. Sure, I could spruce this up a bit, but how much will that really help me? I literally typed this script up in about five minutes in vim. It gets the job done. If I ever get a Nikon, I can fix my script in literally three lines of typing:
vim ~/bin/rmCR2s.pl
:%s/CR2/NEF/g
:w ~/bin/rmNEFs.pl
Hell, I could even do it in one line on the shell with sed (see the one-liner below). The point is that by not trying to make everything perfect and elegant, I got the job done quickly, mostly because I needed it, not because I wanted it.
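For the record, that sed one-liner would be something along these lines (same file names as the vim version above, and just as untested):

sed 's/CR2/NEF/g' ~/bin/rmCR2s.pl > ~/bin/rmNEFs.pl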