One thing I've learned recently is to deny Googlebot access to several image subdirectories so the images don't get indexed; otherwise people find them on Google Images and hotlink to them.
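For anyone curious what that looks like in practice, here's a rough robots.txt sketch. The folder names are just placeholders for whatever subdirectories you actually keep images in, and Googlebot-Image is the crawler Google uses for image search:

User-agent: Googlebot-Image
# placeholder folder names - substitute your own image subdirectories
Disallow: /pics-a/
Disallow: /pics-b/

Keep in mind robots.txt only asks crawlers to stay out; it doesn't stop someone who already has the image URL from hotlinking it, which is where the next part comes in.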
When I do find one or more people stealing my bandwidth, I simply rename the subdirectory. I know there are more sophisticated ways to foil hotlinkers, but this simple and effective method works fine for me.
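If your host runs Apache with mod_rewrite, the more sophisticated route is usually a referer check in .htaccess, something along these lines (example.com is a placeholder for your own domain, and the extensions should match whatever image types you serve):

RewriteEngine On
# allow requests with no referer (direct visits; some browsers and proxies strip the header)
RewriteCond %{HTTP_REFERER} !^$
# allow requests coming from your own site - example.com is a placeholder
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# anything else asking for an image gets a 403
RewriteRule \.(jpe?g|gif|png|webp)$ - [F,NC]

I haven't tried that on every setup, so treat it as a starting point rather than a drop-in.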
I've also learned (from Terra) that more than 3,000 files in one directory puts a strain on the server and slows things down, so I keep it below 1,000 files per folder just to be on the safe side.
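To keep an eye on that, a quick Python script like this will flag any folder creeping past the limit when you run it with your top directory as an argument; the 1,000 figure is just my own comfort margin:

#!/usr/bin/env python3
# Flag folders whose file count is getting too high.
# The 1000 limit mirrors the margin mentioned above; adjust to taste.
import os
import sys

root = sys.argv[1] if len(sys.argv) > 1 else "."
limit = 1000

for dirpath, dirnames, filenames in os.walk(root):
    if len(filenames) > limit:
        print(f"{dirpath}: {len(filenames)} files")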
Whether to keep historical files on the server depends on how many there are and how often you may need to access them. I just keep them locally myself.