round-robin
I tend to use the round-robin system a lot; that is, do a small change here, then move to the next script and update that, add a feature, whatever, then the next, and so on. In actual fact, it's more like opening brackets, and that ugly word, 'dependencies'; you make one wee change, and everything is affected. Perhaps a slight exaggeration; but for sure, hundreds of on-site scripts have seen an update this week; some big, mostly small.
To my HORROR*, I noticed that since I chucked the files from three different distro machines into one (as it now has infinite depth capabilities), lots of old external links to the source view pages were broken. Sure, search engines would eventually catch up to the moved files, and the distro machine has a facility for redirecting moved files, too; but we're talking entire sections of files, and I wasn't about to create a hundred new preference items.
So I devised a better solution; something that's been in the back of my mind for a while. If you link to a file that doesn't exist, the distro machine searches for it, and if it's there, presents it, regardless of its new location. As well as catching genuine bad links to good files, it enables webmasters to be really lazy with their own source links.. /engine?source=whatever.php, and even if it's actually in php/foo/bar/whatever.php, the link still works fine, so long as the name is exactly the same.
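In PHP, the idea boils down to something like this; a rough sketch only, mind - the function name, the 'source' directory and the variable names below are made up for illustration, not the distro machine's actual innards..
<?php
// fallback lookup: if the requested file isn't where it used to be,
// hunt the source tree for a file with exactly the same name..
function find_source($dir, $name) {
    foreach (scandir($dir) as $item) {
        if ($item == '.' || $item == '..') { continue; }
        $path = $dir . '/' . $item;
        if (is_dir($path)) {
            $found = find_source($path, $name);
            if ($found) { return $found; }
        } elseif ($item === $name) {
            return $path;   // same name, new location
        }
    }
    return false;
}

// /engine?source=whatever.php ..
$requested = basename($_GET['source']);
$source = is_file('source/' . $requested) ? 'source/' . $requested : find_source('source', $requested);
if ($source !== false) {
    // present the source view as usual
}
.. the point being, only the name needs to match; the path is irrelevant.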
I've had similar redirections working previously, using mod_rewrite in .htaccess; but this is much more fun, more complete, requires zero .htaccess hacking, and is extremely simple to implement; basically, you don't need to do anything at all!
Both methods have a downside; you now have potentially duplicate versions of the same page. However, only bad links will get the "alternative" version, and that's better than getting nothing; and spiders will only ever spider the good links, anyway. This doesn't affect SEO. If you don't believe me, Google for Linux Website Sync (or similar)**, which has had similar all-roads-lead-to-one-script redirection in place for some time.
Of course, the distro machine still performs all the regular checks before presenting the moved source file for the browser's perusal; allowed extension, inside the source tree, and so on.
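Purely to illustrate the kind of thing I mean (hypothetical paths and an example extension whitelist; the real checks are rather more involved)..
<?php
// sanity checks before serving a relocated file..
$source = 'source/php/foo/bar/whatever.php';    // say, the path located by the search above
$allowed = array('php', 'js', 'css', 'ini', 'txt');     // example whitelist only
$source_root = realpath('source');
$candidate = realpath($source);

if ($candidate === false
    || strpos($candidate, $source_root . DIRECTORY_SEPARATOR) !== 0    // must live inside the source tree
    || !in_array(strtolower(pathinfo($candidate, PATHINFO_EXTENSION)), $allowed)) {    // allowed extensions only
    header('HTTP/1.1 404 Not Found');
    exit('no such file!');
}
// passed the checks; on with the show..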
I've not added this capability for zip downloads, as yet. I'll still need to think about whether I want that; I'm not too bothered about inward links directly to zips, anyway. Mainly I was interested in unbreaking search engine hits, not making things easy for leechers.
There's loads more I could tell you about, and I may blog again soon. In the meantime, I should mention that 2MB+ of spam DID NOT make its way to the comments. Shields at 100%!
for now..
;o)
ps. here's an anecdote.. On loading my Google Analytics, I was double-plus horrified to find that my site visitors, after a rousing comeback, had dwindled again to their hundreds, then dozens! Calamity! I put my thinking cap on and began the investigation.
Had some international SuperPower barred IP access to corz.org? No? Had my web host performed some devious server trick to redirect away all my traffic except me and folk I know? No. What, then?
Simply, I had uploaded my metadata script, and my local copy has the Google Analytics code commented out. Oops.
This foolish blunder was easily rectified with something similar to..
<?php
// only spit out the tracking code on the live site..
if (stristr($_SERVER['HTTP_HOST'], 'corz.org')) {
echo '
<script src="http://www.google-analytics.com/urchin.js" type="text/javascript">
</script>
<script type="text/javascript">
_uacct = "OO-000000-0";
urchinTracker();
</script>';
}
.. and will never happen again, etc.
references:
*I'm kidding
** and remember, that's after a week of being away for six months, too! 1st from 2,000,000+ results. Thanks Google! Thoroughly deserved, of course - and with another exciting update this very day!