checksum
point-and-click MD5, SHA1 and BLAKE2 hashing for Windows..
The world's fastest hashing application just got faster!
Welcome to checksum, a blisteringly fast, no-nonsense file hashing application for Windows; a program that generates and verifies BLAKE2, SHA1 and MD5 hashes (aka "MD5 sums", or "digital fingerprints") of a file, a folder, or, recursively, even an entire disk or volume, and does it extremely quickly, intelligently, and without fuss. Many people strongly believe it to be the best hashing utility on planet Earth.
Did I say fast? Not only mind-blowing hashing speeds (way faster than even the fastest SSD) but the quickest "get stuff done" time. With checksum you point and click and files, folders, even complete hard drives get hashed. Or verified. Simple. checksum just gets on with the job. Click-and-Go..
Available for 64 bit or 32 bit Windows (a basic Linux/UNIX/BSD version is also included).
Why?
In the decade before checksum, I must have installed and uninstalled dozens, perhaps hundreds of Windows MD5 hashing utilities, and overwhelmingly they leave me muttering "brain-dead POS!" under my breath, or words to that effect, or not under my breath. I always knew that data verification should be simple, even easy, but it invariably ended up a chore.
Either the brain-dead programs don't know how to recurse, or don't even pretend to, or they give the MD5 hash files daft, generic names, or they can't handle long file names, or foreign file names, or multiple files, or they run in MS DOS, or choke on UTF-8, or are painfully slow, or insist on presenting me with a complex interface, or don't have any decent hashing algorithms, or don't know how to synchronize new files with old, or have no shell integration or any combination of these things; and I would usually end up shouting "FFS! JUST DO IT!!!".
No more! Now I have checksum, and it suffers from none of these problems; as well as adding quite a few tricks of its own..
What is it for, exactly?
Peace of mind! BLAKE2, SHA1 and MD5 hashes are used to verify that a file or group of files has not changed. Simple as that. This is useful, even crucial, in all kinds of situations where data integrity is important.
For instance, these days it's not uncommon to find MD5 hashes (and, increasingly, SHA1 hashes) published alongside downloads, even Windows downloads. Such a hash, when checked, ensures that the file you downloaded is exactly the same file the author uploaded, and hasn't been tampered with in any way, had a Trojan added, etc.; even the slightest change in the data produces a wildly different hash.
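If you fancy rolling your own quick check some day, the idea is simple enough to sketch in a few lines of Python; this is generic hashlib code, not checksum's implementation, and the file is hashed in chunks so even huge downloads fit in constant memory:

```python
import hashlib
import os
import tempfile

def file_digest(path, algorithm="md5", chunk_size=1 << 20):
    """Hash a file in fixed-size chunks; works for MD5, SHA1, BLAKE2, etc."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo: hash a small test file and "verify" it against its own digest,
# the way you would compare a download against a published hash.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello, world\n")
    path = f.name

published = file_digest(path, "md5")   # stands in for the author's published hash
assert file_digest(path, "md5") == published
os.remove(path)
```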
A file hash is also the best way to ensure your 3D Printed propeller blade hasn't been "redesigned" to self-destruct!
It's also useful if you want to compare files and folders/directories; using checksums is far more accurate than simply comparing file sizes, dates or any other property. For quick file compare tasks, there's also checksum's little brother, simple checksum: simply drag & drop two files for an instant hash-accurate comparison.
If you burn a lot of data to CD or DVD, you can use checksum to accurately verify the integrity of your data right after a burn, and at any time in the future. If you distribute data in any way, maybe torrenteering your favourite things, run a file server of some kind, or just email a few files to your friends; hashes enable the person at the other end to be absolutely sure that the file arrived perfectly, 100% intact.
As well as providing secure verification against tampering, virus infection, file (and backup file) corruption, transfer errors and more, digital fingerprints can serve as an "early warning" of possible media failures, be they optical or magnetic. It was a hash failure that recently alerted me to a failing batch of DVD-R disks; I saved my fading data in time, and got a refund on the disks. I'll leave you to consider the million other uses. There's only one reason, though; peace of mind.
Absolutely no-nonsense file verification..
checksum can create (two clicks, or a drag-and-drop) or verify (one click) hashes of a file, a folder, even a whole disk full of files and folders, in one simple, no-nonsense, high-performance operation. Basically, you point it at a file or folder and go! The parameters are controlled by command-line switches, but most folk won't have to worry about that; it all happens invisibly, built in to your Windows® Explorer context (aka "right-click") commands.
Note: while checksum operates with command-line switches, it is NOT a Windows® console application; there's no messy DOS box, or anything like that. But if you want to run it from a console, that's covered, too.
There are a wealth of command-line options, but most people find that checksum just works exactly as they would expect, without any messing about; right-click and go! But, if you are the sort who likes to customize and hack at things, you will find plenty to keep you occupied!
On-the-fly configuration..
If you want to change any of checksum's options on-the-fly, simply hold down the SHIFT key when you select its Explorer context menu item, and checksum will pop up a dialog for you to tweak the process. If you want to have anything permanently set, checksum comes with a standard plain text Windows ini file for you to tweak to your heart's content. Anyone smart enough to use MD5 sums can edit plain text.
The options dialog is most useful when you want to hash only certain files in a folder, like mp3's, or movies. With your file mask groups, you can configure file-type specific hashing with just a couple of clicks.
Common music, video, and archive formats come setup and ready to go, and you can easily edit or add to these at any time.
Popping up the options by holding down the SHIFT key means the advanced options are easy to get to whenever you need them. The same goes for verification, though generally you won't need it; checksum is smart enough to just get on with the job, verifying whatever checksum files it finds in the path, be they MD5, SHA1 or BLAKE2, or all of the above, and you'll probably never need anything but the default verify command, no matter how advanced you are! And because checksum recognizes other formats of MD5 and SHA1 files (there is no standard BLAKE2 format), it can be used not only to create and verify new checksums, but also to verify existing checksum files, even ancient ones, automatically.
I expect there is some weird MD5 file format out there that I don't have an example of (Wang, maybe?), but in practice checksum supports ALL known MD5 verification file formats; that is, known by me. If you find an MD5 file format that checksum doesn't support, send me that file!!
There isn't really a standard SHA1 format yet, but checksum's is pretty good (it's the same as the output from a *NIX sha1sum command in binary mode). Shall we?
100% Portable..
checksum usually operates as a regular installed desktop application with Explorer context menus, custom .hash, .md5, .sha1 and .blake2 desktop icons, Windows start menu entries, and so on; but checksum can also operate in a completely portable state, and happily works from a pen-drive, DVD, or wherever you happen to be; no less than total portability.
Even with its little brother, simple checksum, tagging along, the whole lot fits easily on the smallest pen-drive (the 32 bit version will UPX onto a floppy disk!), enabling you to create BLAKE2, SHA1 and MD5 hashes wherever you are. To activate portable mode, simply drop a checksum.ini file next to checksum.exe (or run it once with the "portable" switch), and you're done.
It's no problem to run checksum both ways simultaneously, or to run checksum in portable mode on a desktop where checksum is already installed. Simply put, if there's a checksum.ini next to it, checksum will use it, and if there isn't an ini there, checksum uses the one in your user data folder (aka "Application Data", aka "AppData").
If you like applications to run in a portable state, even on your own desktop, no problem; you can skip the installer altogether and simply copy the files (checksum.exe and simple checksum.exe) to wherever you like. They are in the installer's files/ directory inside the main zip archive. There's also a checksum.ini inside the archive, so you can unzip-and-go.
Note: Regardless of whether you install checksum or run it in a portable state, its functionality is identical.
Introducing.. The Unified Hash Extension™
And Multi-Hashing™..
checksum uses the MD5, SHA1 and BLAKE2 hashing algorithms, and can create .md5 and .sha1 and .blake2 (or .b2, or whatever you use) files to contain these hashes. But checksum prefers to instead use a single .hash extension for all your hash files, whatever algorithm you use. Welcome to the unified .hash extension..
I feel there are quite enough file extensions to deal with, and with some effort on the part of software developers, this may catch on. I hope it does, anyway, and that you agree. A single, unified hash extension looks like the way forward, to me. All comments welcome, below.
As well as being able to verify MD5, SHA1 and BLAKE2 hashes, even mixed up in the same file, checksum can also create such a file, if you so desire. At any rate, if you start using BLAKE2 or SHA1 hashes some day, you can keep your old MD5 hashes handy, inside your .hash files..
The single, unified hash extension gives us not only the freedom to effortlessly upgrade algorithms at any time, without having to handle yet-another-file-type, but also the ability to easily store output from multiple hashing algorithms inside a single .hash file. Welcome to multi-hashing, which will doubtless have security benefits, to boot (producing a file that collides in two different algorithms at once is, for practical purposes, impossible).
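To make the idea concrete, here is what multi-hashing boils down to, sketched with Python's hashlib; the `digest *filename` line format is the standard md5sum one, and this is only an illustration, not checksum's actual output:

```python
import hashlib

def multi_hash_lines(name, data):
    """Build md5sum-style '*' lines for several algorithms over the same
    data; a unified .hash file can hold them all side by side."""
    return [
        hashlib.md5(data).hexdigest() + " *" + name,
        hashlib.sha1(data).hexdigest() + " *" + name,
        hashlib.blake2s(data).hexdigest() + " *" + name,  # BLAKE2s, 256-bit
    ]

for line in multi_hash_lines("album.flac", b"example bytes"):
    print(line)
```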
Lightning fast..
If you do a lot of hashing, you will know that it's an intensive process, and relatively slow. Well, checksum is fast, lightning fast.
Even on my old desktop (a lowly 1.3GHz, where checksum was initially developed) it would rip through a 100MB file in under one second. The latest checksum can crunch data faster than any hard drive or even SSD can supply it. Hashing your average album or TV episode is instantaneous.
With right-click convenience, intelligent recursion and synchronization, full automation, and crazy-fast hashing speeds, digital fingerprinting is no longer a chore; it's a joy!
Okay, I'm getting carried away, but seriously, this is how hashing was always meant to be.
Features..
If you like lists, and who doesn't, here's a list of checksum's "features", as compared to your average md5 utility..
True point-and-click hash creation and verification..
No-brainer hash creation and verification. In a word; simple.
Choice of MD5, SHA1 or BLAKE2 hashing algorithms..
Create a regular MD5sum (128-bit), or further increase security by using the SHA1 algorithm (160-bit). For the ultimate in security, you can create BLAKE2 hashes (technically, BLAKE2s-256, which kicks the SHA family's butt in both security AND hashing speed). checksum recognizes and works with all these formats, even mixed up in the same file.
hash single files, or folders/directories full of files.. no problem..
checksum can create hash files for individual files or folders full of files, and importantly, automatically recognizes both kinds during verification, verifying every kind of checksum file it can find. Also, when creating individual hash files, checksum is smart enough to skip any that already exist.
Effortless recursion (point at a folder/directory or volume and GO!) ..
Not only fully automatic creation and verification of hashes for files, and folders full of files, but checksum can also hash all the files and folders inside, and all the folders inside them, and so on, through an entire volume, if you desire.. one click! Drive hashing is now officially EASY!
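If you're curious what recursion amounts to under the hood, here's a bare-bones sketch in generic Python, building one md5sum-style list per folder; checksum itself offers far more control over where the hashes land:

```python
import hashlib
import os

def hash_tree(root):
    """Walk a whole tree and build an md5sum-style line list per folder.
    A sketch of the concept only -- not checksum's implementation."""
    results = {}  # folder path -> list of "digest *name" lines
    for folder, _dirs, files in os.walk(root):
        lines = []
        for name in sorted(files):
            with open(os.path.join(folder, name), "rb") as f:
                lines.append(hashlib.md5(f.read()).hexdigest() + " *" + name)
        if lines:
            results[folder] = lines
    return results
```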
LONG PATH support..
All checksum's internal file operations use UNC-style long paths, so it can easily create and verify hashes for files with paths up to 32,767 characters in length. Goodbye MAX_PATH!
Full UNICODE file name support..
checksum can work with file names in ANY language, even the tricky ones like Russian, Arabic, Greek, Japanese, Belarusian and Urdu. checksum can also handle those special characters and symbols that lurk inside many fonts. In short, if you can use it as a Windows file or folder name, checksum can hash it!
"root", folder or individual file hashes, your call..
Some people prefer hashes of folders, some prefer "root" hashes (with an entire volume's hashes in a single file). Some people like individual hashes of every single file. I like all three, depending on the situation, and checksum has always been able to do it all.
Email notifications..
checksum can mail you when it detects errors in your files; especially handy for scheduled tasks running while you are away or otherwise engaged. checksum's Mail-On-Fail can do CC, BCC, SSL, single and multiple file attachments (including attaching your generated log file), mail priority and more.
Multiple user-defined file mask groups..
For instance, hash only audio files, or only movies; whatever you like, available from a handy drop-down menu. All your favourite file types can be stored in custom groups for easy-peezy file-type-specific hashing.
The most common groups are already provided, and it's trivial to create your own. You can also enter custom masks directly into the one-shot options, e.g. report*.pdf, to hash all the reports in a folder, create ad-hoc groups, or whatever.
Automatic music playlist creation..
Another killer feature: checksum can create music playlist files along with your checksums! When creating a folder hash, if checksum encounters any of the music files you have specified in your preferences (mp3's, ogg files, wma, whatever) it can create a playlist for the collection (i.e. the album). Rather nifty, and a perfect addition to the custom command in the tips and tricks section.
As well as regular Windows standard .m3u/.m3u8 playlist files (Winamp, etc.), checksum also supports .pls (SHOUTcast/Icecast) playlists.
Effortlessly handles all known** legacy md5 files..
If you discover an MD5sum that checksum doesn't support, Send Me That FILE!
Create lowercase or UPPERCASE checksums at will..
Like many things, this can also be set permanently, if you so wish.
Automatic synchronization of old and new files..
Automatically add new hashes to existing checksum files.
That's right! Automatically add new hashes to existing checksum files!
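The mechanics of synchronization are easy to picture; roughly like this generic Python sketch over md5sum-style lines (not checksum's code, and ignoring the many options checksum layers on top):

```python
import hashlib
import os

def sync_folder_hashes(folder, hash_file):
    """Append md5sum-style lines for any files in folder that aren't yet
    listed in hash_file; existing entries are left untouched."""
    known = set()
    if os.path.exists(hash_file):
        with open(hash_file) as f:
            known = {line.split("*", 1)[1].strip() for line in f if "*" in line}
    added = 0
    with open(hash_file, "a") as out:
        for name in sorted(os.listdir(folder)):
            path = os.path.join(folder, name)
            if (name in known or not os.path.isfile(path)
                    or os.path.abspath(path) == os.path.abspath(hash_file)):
                continue  # already hashed, a folder, or the checksum file itself
            with open(path, "rb") as f:
                out.write(hashlib.md5(f.read()).hexdigest() + " *" + name + "\n")
            added += 1
    return added
```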
Integrated Windows® Explorer context (right-click) operation..
The installer will set up Windows® Explorer context commands for all files and folders, so you can right-click anything and create or verify checksums at will. Very handy. "setup", the rather clever installer, is also available in its own right, as a free, 100% ini-driven installer engine for your own goodies. Stuffed with features, easy to use, and definitely deserving a page to itself. Soon.
As explained above, you can also bypass the installer altogether, and simply unzip-and-go, for 100% portable checksumming. Or you can have both.
Scheduler Wizard..
One of checksum's special startup tasks is a Scheduler Wizard, which will guide you simply through the process of creating a checksum scheduled command in Windows Task Scheduler.
Click a few buttons, set your preferences in the familiar one-shot options dialog, and go!
No-fuss intelligent checksum verification..
Cut and paste your own checksum files if you like, rename them, mix and match legacy MD5 formats in a single file, even throw in a few SHA1 or BLAKE2 hashes just for fun; worry not; checksum will work it out!
Permanently ignore any file types..
Obviously we don't want checksum files of checksum files, for starters, but if you have other file types you'd like permanently ignored; desktop.ini files, thumbs.db, whatever; it's easy to set up. The most common annoying file types already are.
Ignored folders..
As well as a set of permanently ignored folders (like "System Volume Information", $RECYCLER, and so on) you can set custom ignore masks on a per-job basis, using standard Windows file masks, e.g. "foo*" or "?bar".
Real-time tool-tip style dynamic progress update..
Drag it around the screen - it snaps to the edges, and stays there (checksum also remembers its dialog screen positions, for intuitive, fast operation).
Tool-tip progress can be disabled altogether, if you wish.
Right-click the Tooltip for extra options.
During verification, any failures can be seen real-time in a system tray tool-tip, hover your mouse over the tray icon for details. checksum also flashes the progress tooltip red momentarily, and (optionally) beeps your PC speaker, to let you know of any hash failures. If there were errors, the final tooltip is red (by default). Anything to make life a bit easier.
Verify a mix of multiple (and nested) MD5, SHA1 and BLAKE2 checksum files with a single command..
Does what it says on the can!
Extensionless checksum files..
Traditionally, individual checksum files are named filename.ext.md5. Personally, I find this inelegant, and prefer them to be named filename.md5. I like it so much, I made it the default, but you can change that, if you like. When running extensionless, if checksum encounters multiple files with the same name, it simply adds them to the same checksum file, so checksums for foo.txt, foo.htm, and foo.jpg would all go inside foo.md5, or better yet, foo.hash. Highly groovy.
On the verify side of things, checksum has always verified every possible checksum it can find, so these multi-hash files look just like regular folder hash files, and verify perfectly; so long as the data hasn't changed, of course!
Search & Verify Single Files..
With checksum, you can verify a single file, anywhere in your system, from anywhere in your system, regardless of where its associated .hash file is in the file tree, be it in a folder or root (aggregate) hash.
checksum will search up the tree, first looking for matching individual .hash files, and then folder hashes, all the way up to the root of the volume, until it finds one containing a hash for your file, at which point it will verify that one hash and return the result. Another fantastic time-saver!
This works best as an explorer context menu command (supplied).
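The upward search itself is simple to picture; a rough, generic sketch, assuming md5sum-style '*name' entries (checksum's real search also understands its other formats and root hashes; the `stop` boundary is an addition here, standing in for the volume root):

```python
import os

HASH_EXTS = (".hash", ".md5", ".sha1", ".blake2")

def find_hash_for(path, stop=None):
    """Search upward from a file for a checksum file containing its entry.
    A sketch only; the search ends at `stop`, or at the volume root."""
    name = os.path.basename(path)
    folder = os.path.dirname(os.path.abspath(path))
    while True:
        for entry in sorted(os.listdir(folder)):
            if entry.endswith(HASH_EXTS):
                candidate = os.path.join(folder, entry)
                with open(candidate) as f:
                    if any("*" + name in line for line in f):
                        return candidate
        if folder in (stop, os.path.dirname(folder)):
            return None   # boundary reached; no hash found
        folder = os.path.dirname(folder)
```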
Smart checksum file naming, with dynamic @tokens..
checksum file names reflect the actual files or folders checked! Automatically.
If you want more, you can specify either static or dynamic checksum file names, with a wide range of automagically transforming tokens. See below for details.
Report Changed/Corrupt/Missing States..
checksum can optionally store a file's modification date and time along with the checksums, like so..
#md5#info.nfo#2009.09.26@19.49:36
5deee1f6ac75961d2f5b3cfc01bdb39c *info.nfo
Thanks to the extra information, during verification checksum will report files with mismatched hashes as either "CHANGED" (the file has been modified by some user or process) or "CORRUPT" (the hash fails but the modification time stamp is unchanged).
These will show as a different color in your HTML logs.
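The logic behind the distinction is worth spelling out: a failed hash plus an untouched timestamp means nothing legitimately wrote to the file, so the bytes themselves must have rotted. A generic Python sketch (the 2-second tolerance is an assumption of mine, to allow for FAT's coarse timestamps):

```python
import os

def classify(path, stored_mtime, hash_matches):
    """CHANGED vs CORRUPT, sketched: a hash mismatch with an unchanged
    modification time points at silent corruption, not an edit."""
    if hash_matches:
        return "OK"
    if abs(os.path.getmtime(path) - stored_mtime) < 2:
        return "CORRUPT"   # data differs, but nobody 'modified' the file
    return "CHANGED"       # some user or process wrote to it
```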
You can choose whether or not to report (and log) missing, changed, or corrupted files. For example, if you only want to know about CORRUPT files, but don't care about changed or missing files, you would set..
report_missing=false
report_changed=false
report_corrupt=true
As one commenter (below) pointed out, with this sort of functionality, checksum would become "the only tool against silent data corruption". I believe this goal has now been achieved.
The chosen algorithm is also stored along with this information, for possible future use (aye, more algorithms!).
Automatically remove hashes for missing files..
Stuff gets deleted, on purpose; it's a fact of computing life. When verifying your hashes, you can have checksum remove those entries from your .hash file automatically, so you never have to think about them again!
The number of deleted hashes, if any, is posted in your final notification.
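Conceptually, the pruning is just a filter over the .hash file's lines; something like this generic Python sketch, assuming md5sum-style entries (checksum handles all its formats, of course):

```python
import os

def prune_missing(hash_file, folder):
    """Drop '*filename' entries whose file no longer exists; returns the
    number of entries removed. A sketch, not checksum's implementation."""
    with open(hash_file) as f:
        lines = f.readlines()
    kept, removed = [], 0
    for line in lines:
        if "*" in line:
            name = line.split("*", 1)[1].strip()
            if not os.path.exists(os.path.join(folder, name)):
                removed += 1
                continue  # file is gone; drop its hash line
        kept.append(line)
    with open(hash_file, "w") as f:
        f.writelines(kept)
    return removed
```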
Automatically update hashes for changed files..
Files get mindfully altered; another fact of computing life. MP3's get new ID3 tags, documents get edited, and so on. Now you can have your hashes updated, too! That's right! During verification, you can instruct checksum to automatically update (aka "refresh") those entries (and their associated timestamps) inside your .hash file. No more editing required!
The number of updated hashes, if any, is also posted in your final notification.
Effortless hashing of read-only volumes..
checksum can create BLAKE2, SHA1 and MD5 hashes for a read-only volume, but store the checksum files elsewhere; either with relative paths inside, so you can later copy the checksum file into other copies of the volume, or with absolute paths, so you can keep tabs on the originals from anywhere.
checksum currently has three different read-only fall-back strategies to choose from; use whichever most suits your needs.
Extensive logging capabilities, with intelligent log handling and dynamic log naming..
checksum always gives you the option to log failures, but you can log everything if you prefer. Hashing times can be included in the logs, and proper CSS classes ensure you can tell what's-what at a glance.
Relative or absolute log file path locations can be configured in your preferences, as can the checksum log name itself; with dynamic date and time, as well as dynamic location and status tokens, so you can customize the output naming format to your exact requirements.
In other words, as well as leaving it to checksum to work out automatically, or typing a regular name into your prefs, such as "checksum.log", you can use cool @tokens to insert the current..
@sec ... seconds value, from 00 to 59
@min ... minutes value, from 00 to 59
@hour ... hours value, in 24-hour format, from 00 to 23
@mday ... numeric day of month, from 01 to 31
@mon ... numeric month, from 01 to 12
@year ... four-digit year
@wday ... numeric day of week, from 1 to 7, corresponding to Sunday through Saturday
@yday ... numeric day of year, from 1 to 366 (365 if not a leap year)
There are also two special tokens: @item, which is transformed into the name of the file or folder being checked, and @status, which automatically transforms into the current success/failure status.
You can mix these up with regular strings, like so..
log_name=[@year-@mon-@mday @ @hour.@min.@sec] checksums for @item [@status!].log
The @status strings can also be individually configured in your prefs, if you wish. Roll the whole thing up, and with the settings above, the final log name might look like..
[2007-11-11 @ 16.43.50] checksums for golden boy [100% AOK!].log
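The token mechanics are easy to mimic; here's a rough Python equivalent of the date/time subset, mapped onto strftime (an illustration only; checksum does its own expansion, and @wday is omitted because strftime numbers weekdays differently):

```python
import time

# @token -> strftime format; names taken from the token list above.
TOKENS = {"@sec": "%S", "@min": "%M", "@hour": "%H",
          "@mday": "%d", "@mon": "%m", "@year": "%Y", "@yday": "%j"}

def expand_log_name(template, item, status, when=None):
    """Expand @item, @status and the date/time tokens in a log-name template."""
    when = time.localtime() if when is None else when
    out = template.replace("@item", item).replace("@status", status)
    for token, fmt in TOKENS.items():
        out = out.replace(token, time.strftime(fmt, when))
    return out

print(expand_log_name(
    "[@year-@mon-@mday @ @hour.@min.@sec] checksums for @item [@status!].log",
    "golden boy", "100% AOK"))
```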
HTML logging with log append and auto log-rotation..
As well as good old plain text, checksum can output logs in lovely XHTML, with CSS used for all style and positional elements. With the ability to append new logs to old, and auto-transforming tokens, you can set up automatic daily/monthly/whatever log rotation by doing no more than choosing the correct name. You can even have your logs organized by section and date, all automatically, via the free energy of your @tokens.
Click here to see a sample of checksum's log output, amongst other things.
Exit Command..
checksum can be instructed to run a program upon job completion. It can also pass its own exit code to the program.
Total cross-platform and legacy md5 file format support..
Work with hidden checksums..
If you don't like to see those .hash files, no problem; checksum can create and verify hidden checksum files as easily as visible ones. Like most options, as well as on-the-fly configuration via the options dialog (hold down SHIFT when you launch checksum), you can set this permanently by altering checksum.ini.
To create hidden checksums (same as attrib +h), use "h" on the command-line, or choose that option from the options dialog.
Don't worry about creating music playlists with the invisible option enabled, the playlists will be perfectly visible, only the checksums get hidden! (well, someone asked! ;o)
"Quiet" operation..
Handy if you are making scheduled items, etc, and want to disable all dialogs and notifications. Simply add a 'q' (or check the box in the one-shot options).
You can also set checksum to only pop up dialogs for "long operations". Just how long constitutes a long operation is, of course, up to you. The default is 0, so you get "SUCCESS!" even if it only took a millisecond. Check your preferences for many more wee tricks like this.
"No-Lock" file reading..
checksum doesn't care if a file is in-use; it will hash it anyway! And it won't lock your files up while it's doing it. Feel free to point checksum at any folder.
Audio alerts..
Unrelated to the "quiet" option (above), checksum can thoughtfully invoke your PC speaker to notify you of any verification failures as they happen, as well as sounding shorter double-pips on completion (if your PC supports this; many modern PCs don't). You can even specify the exact frequency for the beeps, whatever suits you best.
You can also assign WAV files for the success and failure sounds, if you prefer. A few samples can be found here.
Drag-and-drop files, folders and drives onto checksum..
If you prefer to drag and drop things, you can keep checksum (or a shortcut to it) handy on your desktop/toolbars/SendTo menu, and drag files or folders onto it for instant checksum creation. This works for verification, too; if you drag a hash file onto checksum, its hashes are instantly verified.
Note: like regular menu activation, you can use the SHIFT key to pop-up the options dialog at launch-time. You can also drag and drop files and folders onto the one-shot options dialogs, to have their paths automatically inserted for you.
User preferences are stored in a plain text Windows® ini file..
You can look at it, edit it, back it up, and script with it. Lots of things can be tweaked and set from here, though 99.36% of people will probably find the defaults are just fine, and the one-shot option dialogs handle everything else they could ever need. But if you are a more advanced user, with special requirements, chances are checksum has a setting just for you. Click here to find out more about checksum.ini
Comprehensive set of command-line switches..
Normally with checksum, you simply click-and-go; but checksum also accepts a large number of command-line switches. If you are creating a custom front-end, modifying your explorer context menu commands, or creating a custom scheduled task or batch file, take a look at checksum's many switches. For lots more details, see here.
If you simply have some special task to perform, it can probably be achieved via the one-shot options dialog.
Shutdown when done..
If your system doesn't normally run 24/7, don't let that stop you from hashing Terabytes of data! checksum can be instructed to shutdown your PC at the end of the job.
That's a lot of features! And it's not even all of them!
checksum is jam-packed with thoughtful little touches; you might even call it Artificial Intelligence! Wherever possible, if checksum can anticipate and interpret the user's intentions, it will.
Legacy and cross-platform MD5/SHA1 file formats that checksum can handle..
If you look inside any MD5/SHA1 checksum file (it's plain text) you'll find all sorts of things.
Here's what a regular (MD5) checksum file looks like..
01805fe7528f0d98c *01 - Stygian Vista (radio controlled).mp3
Each line begins with the MD5/SHA1 digest (hash), followed by a space, then an asterisk, then the filename. It's a clear format, flexible, relatively fool-proof ("*" is not allowed on any file system), and well supported.
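For the curious, parsing that format is a one-liner affair; a generic Python sketch (checksum's own parser obviously does far more):

```python
def parse_sum_line(line):
    """Split a 'digest *filename' line into its parts, also tolerating
    the space-, double-space- and TAB-delimited variants."""
    digest, rest = line.rstrip("\n").split(None, 1)
    if rest.startswith("*"):      # '*' marks md5sum's binary mode
        rest = rest[1:]
    return digest.lower(), rest

print(parse_sum_line("024f061d2262d95d0864fa558fd938f9 *checksum.zip"))
# → ('024f061d2262d95d0864fa558fd938f9', 'checksum.zip')
```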
Other formats I've come across..
single file, single MD5/SHA1 hash types - these necessarily have the same name as the file, with a ".md5" or ".sha1" extension added, and are often hand-made by system admins, or else piped from a shell md5/sha command..
01805fe7528f0d98c
4988ae20125db8071
space delimited hashes (before we figured out the clever asterisk)..
01805fe7528f0d98c 01 - Stygian Vista (radio controlled).mp3
4988ae20125db8071 01 - Stygian Vista (radio controlled).mp3
double-space delimited hashes (just silly, really)..
Believe it or not, this is the de facto standard for md5 files, mainly because it's the output of the UNIX md5sum/sha1sum command in 'text' mode, which, amazingly, is the default setting. By the way, md5sum's "-b" or "--binary" switch overrides this insanity.
01805fe7528f0d98c  01 - Stygian Vista (radio controlled).mp3
4988ae20125db8071  01 - Stygian Vista (radio controlled).mp3
TAB delimited hashes (I am assured these do exist!)..
01805fe7528f0d98c	01 - Stygian Vista (radio controlled).mp3
4988ae20125db8071	01 - Stygian Vista (radio controlled).mp3
back-to-front hashes in parentheses - this is quite a common format around the UNIX/Solaris archives of the world (it's the output from the openssl dgst command)..
MD5(01 - Stygian Vista (radio controlled).mp3)= 01805fe7528f0d98c
or..
MD5 (01 - Stygian Vista (radio controlled).mp3) = 01805fe7528f0d98c
even..
SHA1(01 - Stygian Vista (radio controlled).mp3)= 4988ae20125db8071
checksum supports verification of all these formats with ease, so feel free to point it at any old folder structure, Linux CD, whatever, or any .md5 or .sha1 files you have lying around, and get results.
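The back-to-front BSD style needs slightly more care than the others, since the filename can itself contain parentheses; a greedy regex handles it (a sketch of mine, not checksum's actual parser):

```python
import re

# 'ALG (filename) = digest' as emitted by `openssl dgst`; the greedy (.*)
# correctly swallows parentheses inside the filename itself.
BSD_LINE = re.compile(r"^(MD5|SHA1)\s*\((.*)\)\s*=\s*([0-9A-Fa-f]+)\s*$")

def parse_bsd_line(line):
    """Return (algorithm, filename, digest), or None if the line doesn't match."""
    m = BSD_LINE.match(line)
    if not m:
        return None
    algorithm, filename, digest = m.groups()
    return algorithm, filename, digest.lower()

print(parse_bsd_line("MD5 (01 - Stygian Vista (radio controlled).mp3) = 01805fe7528f0d98c"))
# → ('MD5', '01 - Stygian Vista (radio controlled).mp3', '01805fe7528f0d98c')
```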
And in case the above track names got you googled here, yes, checksum also works great in Microsoft® Vista, and Windows 7, Windows 8, Windows 8.1, Windows 10 and Windows Server of course, even XP! ;o)
simple checksum
Supplied along with checksum is checksum's little brother app, "simple checksum", a supremely simple, handy, free, and highly cute drag-and-drop desktop checksumming tool utilizing checksum's ultra-fast hashing library; for all those "wee" hashing tasks..
Drop a file onto simple checksum, get an instant MD5, SHA1 or BLAKE2 hash readout.
Drop two files, and get an instant MD5, SHA1 or BLAKE2 file compare.
Drop two folders, and get a hash-perfect folder compare (using checksum as the back-end).
Drop a file onto simple checksum with a hash in your clipboard, get an instant clipboard hash compare.
And that works from your "SendTo" menu, too (select two files, SendTo → simple checksum.. instant file compare; send two folders, get a hash-perfect folder compare), as well as by drag and drop onto simple checksum itself, or a shortcut to it.
Packed with intuitive HotKeys and time-saving automatic settings, simple checksum is Handy Indeed!
And simple checksum is COMPLETELY FREE, as in beer. Check it out..
download
Download and use checksum, for free..
click to see zip archive contents
# made with checksum.. point-and-click hashing for windows (64-bit edition).
# from corz.org.. http://corz.org/windows/software/checksum/
#
#md5#checksum.zip#2015.07.04@01.26:25
024f061d2262d95d0864fa558fd938f9 *checksum.zip
#sha1#checksum.zip#2015.07.04@01.26:25
199ef31f91c06786a05eeead114c026a67426488 *checksum.zip
click to see zip archive contents
# made with checksum.. point-and-click hashing for windows (64-bit edition).
# from corz.org.. http://corz.org/windows/software/checksum/
#
#md5#checksum_x64.zip#2015.07.04@01.26:28
72e1cac7bd2dfd4ce3cf862920350bfa *checksum_x64.zip
#sha1#checksum_x64.zip#2015.07.04@01.26:28
86d8db98f96b5c8e196594667b9d324e066f4215 *checksum_x64.zip
NOTE: If your Anti-Virus software detects anything in this software, I recommend you switch to an Anti-Virus that isn't brain-dead. If you DO discover an actual virus, malware, trojan, or anything of that nature inside this software, please mail me, and I will send you a cheque for a Million Pounds, as a reward. In other words, this software is clean.
These guys agree..
(note: I've now removed checksum from most of these sites!)
(Ahh.. The beauty of PAD Files!)
License Upgrade
If you need to upgrade your ancient license to the new format (checksum v1.3+) go here.

Itstory..
aka. 'version info', aka. 'changes'..
This is usually bang-up-to-date, and will keep you informed if you are messing around with the latest beta, and let you know what's coming up next. Note: it was getting a bit long to include here in the main page, so now there's a link to the original document, instead..
You can get the latest version.nfo in a pop-up windoid, here, or via a regular link at the top of this page.
Leave a comment about checksum..
If you think you have found a bug, click here. If you want to suggest a feature, click here. For anything else, feel free to leave a comment below..
Will checksum do a hash of a complete drive as well? Is there a size limitation?
Situation: I want to confirm an accurate duplication of a hard drive.
Also, can it create a printout to report time and date and the two hashes?
I don't know what you mean about "the two hashes" (each file gets only one), but apart from that..
Yes, complete drives are no problem; I hash entire DVDs regularly. I recently checksummed my entire archive drive, too (160GB)
No, there is no size limit; at least, not in theory. I've not tried anything much over 1.5GB, but there shouldn't be any problems hashing even really huge files.
Logging only takes place on verification, but yes, you can tell checksum to log every single item, success or fail, and the log is always timestamped, looks something like this..
Send me a mail; I'll be putting out the latest beta some time this week (all those who have previously mailed will get something in their inbox very soon) so you can play with it, and let me know how big it really goes.
for now..
;o)
ps. about the "two hashes", do you mean, the original drive, and the duplicated drive? I suspect so. The way to do it is simply run checksum on the drive before duplication, and then afterwards, run checksum again (in verify mode) on the duplicate drive. The checksum files will have been duplicated along with the drive. Keeping the checksum files along with the real files is the best method, by far.
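The before/after workflow described above is easy to picture in code. This is a hypothetical Python sketch (not checksum's actual source, and a much-simplified version of what it does): create a hash file over a tree, duplicate the drive, then re-hash the duplicate against the same file..

```python
import hashlib, os

def md5_file(path, chunk=1 << 20):
    """Hash a file in chunks, so even huge files use constant memory."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def create_checksums(root, sumfile="drive.md5"):
    """Walk the tree and record 'hash *relative/path' lines, md5sum-style."""
    with open(os.path.join(root, sumfile), "w") as out:
        for folder, _, files in os.walk(root):
            for name in files:
                if name == sumfile:
                    continue
                path = os.path.join(folder, name)
                rel = os.path.relpath(path, root)
                out.write(f"{md5_file(path)} *{rel}\n")

def verify_checksums(root, sumfile="drive.md5"):
    """On the duplicate, re-hash every listed file; return the failures."""
    failures = []
    with open(os.path.join(root, sumfile)) as f:
        for line in f:
            digest, _, rel = line.rstrip("\n").partition(" *")
            if md5_file(os.path.join(root, rel)) != digest:
                failures.append(rel)
    return failures
```

Because the hash file travels with the data, the duplicated drive carries its own proof of integrity, which is exactly the "keep the checksum files along with the real files" method.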
hello,
very interesting what you wrote down here, what do I need to do, in order to get a copy of your program :-)
currently I'm using md5summer (http://www.md5summer.org/download.html) to get the job done, but your app looks much more powerful.
hope you release it as GPL?
Best regards from berlin/germany
- Phil
I agree, Phil! Very interesting indeed!
How do you get a copy? Join the "ßeta program", unless you are running XP, I have enough XP testers! However, I am keen to enroll someone running Vista - mail me for details.
You're right, checksum looks much more powerful. That's because it is. Nothing even comes close. From a recent mail from one of the beta testers (all hardened MD5 utility users)..
I love that quote!
I already have a bag of similar quotes, and it's not even released!
Guys! Post them HERE, eh!
No.
Ironically, all my other apps are open source, and though they get many downloads, source packs, too; no one ever mails me about them, or offers their appreciation and thanks, or asks about the licensing terms - copy+paste is easier. Okay, my /windows area hasn't been up for long, but still; already THREE people have asked me if I plan to release the code for checksum! What does this tell you?
But no. I plan to keep this one all to myself. I know from experience that when you release code, it just gets ripped off, and folk go on to claim it as their own (I've got loads of php 'out there', thinly disguised as other things). Almost never will you get actual credit. I don't mind too much, I do it for me, and release it because I can, and have a knack for readable documentation.
With checksum, I decided that there was just so much cool stuff going on inside, no way was I going to let someone come along and just steal it. It's MY baby!
The other reason is that I plan to charge for the full version (the free version will be MD5-only) because I'd like this place to start making some cash! An artist's gotta eat, you know.
for now..
;o)
I just came across another app that gets the job done:
www.slavasoft.com/fsum
While I can understand that you hope to get some ca$h, the problem, I think, is that every security-related application has to be OpenSource, so that everybody can check the source code for any threats.
But of course I can also understand if you want to earn some money - hope it doesn't cost too much, but from what I understand there will also be a free version, nice.
I sometimes use paypal to honor good work from people, releasing their work for free.
keep us informed about future updates - if you're interested in beta-testing for win2k, drop me a line.
- Phil
www.slavasoft.com/fsum
One of many many many that will "do the job", but not in a way that this human finds acceptable. And judging by my inbox recently, many other humans, too.
No. Checksum implements widely-known, open-source algorithms. There's nothing inherently secure or insecure about checksum itself, so these sorts of concerns would be completely misplaced.
Both the MD5 and SHA1 algorithms are not only open source, they are public domain - feel free to check their robustness at any recognized code archive, as well as checksum's ability to 100% comply with their specifications using the reference checksums provided at MIT, wikipedia, etc.
Even if I wrote the loosest, most insecure code on the planet (which I don't), it wouldn't affect anyone's security in the slightest, so long as the checksums themselves are 100% accurate, which they are. Bottom line: the only person who needs to see the source, is me.
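Anyone can run that compliance check themselves; the reference digests for simple inputs are published in the specs (MD5 in RFC 1321, SHA1 in FIPS 180). For instance, in Python..

```python
import hashlib

# Published reference test vectors: MD5 and SHA1 of the string "abc".
assert hashlib.md5(b"abc").hexdigest() == "900150983cd24fb0d6963f7d28e17f72"
assert hashlib.sha1(b"abc").hexdigest() == "a9993e364706816aba3e25717850c26c9cd0d89d"
print("both algorithms match their published test vectors")
```

Hash "abc" (or any other reference input) with checksum, compare against the published vector, and you've verified its spec-compliance without ever seeing a line of source.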
Yes, I do! I've been releasing open source code for years, with tens of thousands of users, yet I could count the donations on one hand. But in spite of that, yes, there will be a free version. The only difference; the free version will do MD5 only. For the vast majority of people, md5 is all they need, so even with the SHA1-enabled payware version, it will still be a miracle if I can cover my domain fees!
Simple checksum (included, and also free) works with SHA1 as well as MD5, though on a one-shot basis. checksum "pro" (haha!) will be around £10.
You are, quite literally, one in a million! Feel free to browse around the corz.org/engine !
The itstory (aka. "changes") above these comments is dynamically included, and sometimes you can quite literally watch it grow, live (with a page refresh, anyway). If something happens, it'll be there.
I'll see if I can find your email address in the list (I'm assuming you entered it at some point), because I'd like to know if everything's okay on Win2K. I today got confirmation that it's working swell on Vista, so 2K would be the Hat-Trick.
Thanks!
;o)
I just came across another app that gets the job done:
www.slavasoft.com/fsum
Hey phil,
FSUM was one of the many hash checking utilities I've tried during my quest to find the perfect one. Although it's a very good one, it simply cannot handle the types of "jobs" I've been able to accomplish with checksum.
I've written advanced batch scripts for FSUM which have greatly enhanced its "out-of-the-box" capabilities but even then it doesn't handle certain situations as well as checksum does.
If you just need a program that will let you make a checksum file for certain files/folders, or a program that will let you verify a single checksum here and a single checksum there whenever you need to, then I agree with you, FSUM will get the job done. But if you need something more powerful, then you will love checksum.
And just when you think it can't get any better, it gets better! cor is always adding new features and making improvements in one way or another. checksum really is great software and I highly recommend it.
-seVen
I agree with seVen!
And just when you thought it couldn't get any better.. Along comes spiffy HTML Logging!
Here's a sample of the output..
https://corz.org/windows/software/checksum/files/checksum-example-log-output.html
;o)
When will checksum leave the beta-testing phase? I'm just curious to try it out myself, and I want to make sure that I really don't need to look any further for another checksum utility.
The current version is bug-free, as far as most of my testers are concerned - but actually contains a couple of beauties. They only occur under unusual circumstances, but still, it will need a little more work yet before a general release.
I'm tied up for the next couple of weeks, so it's unlikely I'll be fixing them until after that, but if all goes to plan, the release version will hit this page at the start of the new year.
Something to look forward to, eh!
;o)
ps. about the source (from earlier) I meant to mention, I am referring to the checksum program source; not the actual MD5/SHA1 routines in the hashing dll; the source for which you can easily check by following the credits in the dll's version information (via Explorer properties dialog).
Excuse me,
how can I download "checksum" program?
This would be an absolutely awesome program to integrate into a file-level scan for a file copy/compare process. Parsing large (>1TB) volumes of millions of small files could benefit greatly from a file system scan via hashing. It would allow the user to create a hash, copy all files to a destination as a backup, then the following day another run could be commenced, copying the differential data to the destination - but the origin file system could be scanned via hash rather than a scan-and-compare operation. Sounds fast, anyway.
Any updates on a release date?
If you have a feel for a release date, please let us know. Really interested in seeing checksum and simple checksum in operation.
Cor,
Your beta history section lists version 0.9.19 on 4th Jan, but nothing past v0.9.18.1b seems to be available...
-thm
There's a bit about this in my /blog/.
In short, I'm just back from a rather long tech-break (though sadly not a holiday!) and checksum is definitely on the list of jobs for March.
I've decided to just release checksum for free (actually, "shirtware" - see recent blog for more details), and the next checksum job is to rip out the payware code and run some tests. There will be one more beta release (rc1) in a week or two (God-willing) for the beta testers (at least, those that mailed me along the lines of "hey! my beta ran out!"). If all the reports come back favourable; a proper release will follow soon after.
In the meantime, those who haven't read the itstory are urged to do so (starting at the bottom!); not just to appreciate the tremendous amount of hard work that's gone into checksum, but also to get a handle on its many features, some of which aren't documented elsewhere.
for now..
;o)
I'm so sad. I spend all that time reading the page and getting excited about checksum...and it's in closed beta testing.
Boooo
In the trade, we call this "building anticipation".
;o)
Glad to hear about this project! Like many others, I've been persistently irritated with current checksum tools.
Are you planning to add SFV support down the line? I know there are good current utilities for Windows but it'd be nice to get them all in one reliable package.
I haven't even considered SFV support; basically because it's fairly worthless. An "upgrade SFV checksums to MD5/SHA1" might be doable. Creating SFV, however, is not on the cards.
;o)
ive been waiting for this since last year. any news on the release cor??? i think youve built up enough anticipation.
Hey, if you're that desperate, mail me, and I'll throw you in with the beta testers!
;o)
Hey, I have a question. I know that MD5 software can usually be used to check the contents of a burned data DVD against a MD5 file. But what I'm interested in is checking burned DVDs as a whole. Like in ImgBurn, when you go to verify a previously burned DVD against the original ISO file, and it shows you that the MD5 checksums of each of them match. I'm looking for a way to do the same thing, except using a MD5 instead of an ISO file for verification. Will your program work for that?
cor, i sent you an email from a gmail account. look out for it.
galva12, what you are trying to do sounds, to me, like a poor second-cousin to what checksum actually does. It also sounds like you are keeping the old ISO around, which would be a huge waste of space.
Doing things the checksum way gives you way more flexibility - you can check every file on the disk with a single checksum file, or one checksum for each file, with the checksums on the disk, or elsewhere, or both. If you need to know the status of a single file, or all files, you can do that.
I don't see any advantages to the ImgBurn method. In fact, I think an md5 which changes because dates and times of file access change is a recipe for user confusion. No, checksum won't be going there!
Having said that, you could do what you want with checksum, by simply ripping an ISO of the disk at any time, and comparing it to the original iso md5.
Thanks noname, I'm still catching up after my birthday celebrations, but I did get your mail. Expect a reply soon.
;o)
I didn't know you had your birthday recently cor. happy belated birthday! i hope you got heaps of presents. i look forward to getting your reply.
Hey cor, thanks for replying. I don't think I explained what I'm trying to do properly though. As you said, keeping the ISO files around would be a massive waste of space, which is why I'd rather verify my burns after the fact using premade MD5 files.
I'm dealing with images that don't contain just easily-accessible data files though; they're in some kind of convoluted format that makes it impossible to scan them file-by-file; they aren't listed in Windows explorer. I need a checksum of the DVD image as a whole.
In ImgBurn, when you verify a DVD it's shown that the checksum of the original ISO file is identical to that of the burned DVD. I'm just looking for a way to cut the ISO out of it, and use a premade checksum of the ISO instead. It's no task to get the ISO's checksum, but I haven't yet found any method to get the checksum of an entire DVD, as an image and not file-by-file. I wonder how ImgBurn does it?
It simply reads the disc into an ISO image. You can do this with most burning tools; Nero, MagicISO, etc. Once you have your image, you can checksum it, and compare it with the original ISO md5.
ImgBurn probably just hashes-as-it-reads, which is kinda slick, as you don't have to dump the ISO file anywhere, missing out the middle step.
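That hash-as-you-read trick is simple enough to sketch. A hypothetical Python version (not how checksum or ImgBurn actually do it): read the source in chunks and feed the digest as you go, so the full image never has to land on disk..

```python
import hashlib

def hash_stream(fileobj, algo="md5", chunk=1 << 20):
    """Update the digest chunk-by-chunk while reading; nothing is written out."""
    h = hashlib.new(algo)
    for block in iter(lambda: fileobj.read(chunk), b""):
        h.update(block)
    return h.hexdigest()

# Usage idea: open the burned disc (or its raw device node) and compare the
# result with the original ISO's md5. The device path below is illustrative.
# with open(r"\\.\D:", "rb") as disc:
#     print(hash_stream(disc))
```

The middle step (dumping an ISO file somewhere) disappears entirely; the only cost is one sequential read of the disc.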
;o)
ps. cheers defier!
I'll be emailing you in a few minutes. I have tried all of the software listed at these two sites
http://en.wikipedia.org/wiki/Comparison_of_file_verification_software
http://koolmonkey.bravehost.com/sfv-md5.html
and even the ones that cost money are not very good (I can't afford them anyway). I am religious about the security of my data, and I the only good piece of software I've been able to find is ViceVersa (http://www.tgrmn.com/index.htm), which does automatic backups as soon as you edit a file, and then it does a CRC32 check to make sure the backup is good. The advantage ViceVersa has over a RAID 1 mirroring setup is that ViceVersa also keeps old copies in an archive. I can't tell you how many times I've spent two days editing a file, and then realized that the version I had 3 days (or months) ago was much better, and I was glad that VV kept the old copies for me. It's not CVS versioning or revision control, but it's good enough for just me on my machine, while I work on something that may have come from a revision control system (RCS) like Subversion or SourceSafe.
Anyway, I've got backup reasonably solved with ViceVersa. Now I need checksum for general-purpose use, like the times when I need to make a manual backup copy. Gimme? Beta? Please? Desperate!
I got that, ta.
I feel the same way about my data; more and more people do. My approach was to bug the developer of my favourite text editor to add comprehensive backup facilities, and now I have a folder with every single version of every single thing I've worked on, timestamped in neat directory-named folders. Pretty cool, though sometimes it does get HUGE, and needs to be archived. Suits me fine. Btw, surely you mean RAID5.
As to the checksum applications, yup, they're all crap! Well, okay, one of them is actually slightly faster than checksum (that's the only good thing about it!), and a couple of them have kinda cool (though pointless) GUI, but apart from that.. pfff. That's why we're here!
I just put out rc1 to the beta testers on Wednesday. Early feedback..
I love it, love it, and love it... Bravo!
..and as I work on rc2, I'm looking to widen the net some; let a few more folk try it. There will be no more free licenses, but as I'm putting it out as Shirtware, I guess that's less of an incentive, anyway.
Look forward to some juicy download details in your inbox in the next few minutes. All feedback welcome.
;o)
I like raid 1, since it's not about write speed for me, just cheap redundancy with only 1 extra disk. My text editor does backup too, but I just let viceversa handle it, since it'll back it up as soon as I change something.
Yup, that's what Editplus does. Every single change creates a new backup. If you enable it, that is.
Ahh right.. RAID1. For some reason I get RAID0 and RAID1 mixed up. I don't use either, is probably why.
;o)
What's up cor!
Glad to see checksum is close to being released to the public! To tell you the truth, it's so good that it makes me feel guilty that I (as a beta tester) get to use it daily while the rest of the world has to deal with the pains of using inferior hash-checking software.
Don't worry guys, hopefully you will all have it soon. Only then will you truly understand.
Yes, we're making the world a better place for our children, and our children's children! Hopefully we can handle the pain of guilt for a wee while longer, yet!
Hey! I did add another batch of testers to the latest release (rc2), yesterday. Anyone who has mailed over the last few months about testing checksum should definitely check whatever address they used - there could be juicy download details waiting for you right now!
I'm thinking I might go semi-public with rc3. I've been messing around with my mailer this week, getting it ship-shape to handle the thousands of folk who have signed up for "corz odd mailing" (a list I've not used yet) with a view to letting them have a crack at checksum before a full release. But it's slow going, and I keep finding reasons to do something else instead. I might just install PHPList and be done with it!
Anyways, in ever increasing circles, it's coming folks, and real soon.
;o)
The only checksum utility I could find on the net.
Thanks a lot
I just read through the changelog. It looks like checksum is getting a lot of work done on it to polish it up for a release. Thank you very much for all the work you've done.
Yes! And so far only a single bug report for rc3 (which is already fixed).
So it's looking like Real Soon Now!
;o)
This is just what I've been looking for! I can finally replace the non-recursive QuickSFV. A pity it's not done yet. I'll watch this page closely for the release.
As for speed, I really can't see how you can improve much. On PCs today you're usually limited by disk I/O rather than processing power. Not that less CPU usage isn't welcome.
Can you compare two directories with this tools without generating md5 files everywhere? E.g. to verify that a file copy has been done without errors.
Any thoughts on including par2 support or some other repair capable algorithm in the future?
Enough nagging
Thanks for using your spare time to make this. If it works as well as you say, I'll recommend it to everyone I know, and I'll definitely donate some money.
Para Noid, thank you! Yes! I've been working on it whenever possible, check out the itstory for details - the last thing I wanted before calling it 1.0, and letting it out, was a "unified hash extension", as I like to call it. I'd already done the tricky bits, teaching checksum how to interrogate files line-by-line, and not give a damn about the file extension. Who knows what hashes might be in future files, and one more file extension is quite enough!
I got that done a couple of nights ago, and the unified .hash extension is with us now; checksum phase two, complete. Pretty new icons and a funky PAD file followed suit.
As for speed, yes, I had a similar conversation with a beta-tester, and it's totally true; utilizing 100% cpu for a file hash is, for now, a pipe-dream, and file system I/O is always the deciding factor. Having said that, it's nice to get a file into ram and just go Woosh!, when that's possible. It is rather fast, and the aforementioned conversation is definitely at the back of the most recent batch of speed improvements - I had to get it into the top three! And as storage devices get faster, so will checksum!
Ideally, I get T-Shirt Three-of-Three ready, and then do one grueling night of checksum promotion: dropping blogs, signing up with download sites - dropping a link right here is the one I look forward to most. This page needs some completion! Especially since the beautiful search engines of the world are already sending people.
But Three-of-Three is "The Amazing Metaphysical Traveling Map", and for years it has caused me no end of grief, refusing to settle itself into one simple design. I literally have hundreds of sketches; the concept being more important, of course, but design is important, too, as any user, even potential user, of checksum appreciates. I'll get there, but if I don't nail it within a week - that, and an even smarter way to "upgrade" multiple hash files to the new unified hashing scheme - then it's getting released anyway.
A week, tops.
Well, they wouldn't go everywhere! Just inside the directories in question. I know what you mean, though. No. checksum makes real hash files*. However, it will happily work with hidden checksum files, creating or verifying them, so you don't have to see them if you don't want to**. Having said all that, hash files a) are rather reassuring, sitting there, and b) have a really cute new icon, took me ages!
It also seems wrong, somehow, to not keep the hash data. We used CPU cycles to create something, a state of things, captured. Why not keep that data? I like to think that we'll all get used to living with hashes. They are so small and beautiful, and perfect! "Perfect until your data isn't!". Keep 'em! I say!
All I know of Par is that you make tiny wee extra files that somehow can be used to re-create missing parts of very large files, usually Rars. It seems to defy the laws of physics, even though I'm reliably informed that it's all true. Which is to say; No, I haven't considered it. I'm not at that stage with PAR, yet, and even if I was, I'm not sure it's something checksum should be doing, anyway. Of course, I'll happily accept enlightenment on this topic!
Keep 'em coming!
;o)
ps. I had to dash out in the middle of that post, sorry about that!
* For quick file-compare tasks, I more usually use simple checksum, checksum's wee brother. And the more I think about it, the more I think that if I added these capabilities anywhere, it would more likely be to simple checksum.
** I used to always keep hidden files visible in my desktops, but more and more these days, I like to keep hidden things hidden - I did a macro to hotkey between the two states, when required.
I totally agree with your blurb. Making MD5s was always a pain! I have been checking this page regularly for a couple of weeks since I first found it. It's great to see checksum released at last! AWESOME! It's exactly what I've been looking for!
Thank YOU!
JkR
p.s. It was a bit weird my md5 files becoming hash files but I totally get where you are coming from with that and I hope it catches on. Good luck!
Looks like JkR is another happy camper
I agree, it's awesome to see checksum finally out in the open. I bet cor is PROUD right now.
As a beta tester, I've been lucky enough to have checksum by my side for quite some time now. And over time, I sometimes forget about all the trouble I used to go through just to verify a couple of folders or recursively create hashes. checksum is just so simple, and well, it just "makes sense".
Why did it take so long for something like this to be created? That's something I'll always wonder. It's like all those other developers of hashing software just didn't care... I'm glad one of them did.
Thanks guys!
I actually started on a new feature minutes after I uploaded rc4; a request; WAV file alerts. Tonight, I also made a wee change to the installer's setup.ini (removed Explorer context menu "dividers"), and built an rc5. It's up now, in the same download place.
Keep the feedback coming, here and in my inbox, it's all good, even the bad! - though don't be insulting, cuz I'm a right c*nt when I'm annoyed!
It's about ready to be labeled v1.0, and promoted left, right and centre, I think; which will probably be at the weekend, when I'll have more time to trawl the popular download sites*.
Feel free to tell anyone and everyone about checksum, even torrent it at your favourite tracker, whatever; it's good, so let's get it out there!
for now..
;o)
When checksum appears for download on a popular download site, post it here so I can get it. If my download fails for whatever reason, corz.org blocks me from trying to download it again.
If you use a regular web browser, and don't try anything funky, there's absolutely no reason your download would fail. And if it does, email me, and I'll gladly send you a copy by hand.
How's that for service!
And by the way, the popular download sites will most likely just post a link back to here. Why wait? It's already here!
;o)
when you going to make a tshirt with your emoticon logo thingy?:
;o)
Great work. Thanks.
notnymous said..
;o)
It has crossed my mind, but my self-indulgence-o-meter is already in the red, so I'll probably at least wait until I get the first three designs ready. Also, it encourages "fans", and that was one of the reasons I got out of the music industry when I did!
I guess digital fans is okay; buying shirts, as opposed to ripping them off your back! It's a cool smiley, but, would YOU buy such a shirt?
;o)
ps. if I could find a manufacturer that could reproduce my site gradient on a T-Shirt; yes, I'd go for it, even just for myself.
I've been using a command line MD5 utility that does only files. But if a folder matches the filespec it attempts to access it and fails and quits. Inane behavior.
Oh, I've seen a lot crazier behaviour than that in hashing utilities! But let's forget about those, and look to the future: checksum.
;o)
Thanks a lot for writing this! I'd been using a command line md5 tool and was looking forward to point-and-click ease.
I deal with big (1GB - 13GB) video files that are typically moved around or backed up individually, so I always use the "individual checksums" option. I would like this to be the default. Unfortunately every time I right click on a directory and choose "Create checksums" I get one checksum file in the dir. The only way to get the "individual checksums" is to do the shift-key thing every single time, which gets old. Is there any way to have the "individual checksums" choice stick?
Also: Is there any way to have an audio alert only when there is an error? The PC speaker beep quickly becomes very irritating.
Thanks ...
Hey, Mr. Snout, no problem!
checksum's operation is completely controlled by switches, which makes it extremely flexible; so although there's no actual preference for "always create individual checksums" (I hadn't considered someone might want this set permanently, though I'll be sure to add it for a future release - cheers!), it's easy enough to achieve right now: simply add the i switch to your Explorer context menu commands.
There are at least two ways to achieve this; probably the quickest is to add the i switch to the directory and/or drive commands in the registry..
HKEY_CLASSES_ROOT\Directory\shell\01.checksum
HKEY_CLASSES_ROOT\Drive\shell\01.checksum
Currently the switches in the command will be cr, so make them cri, and from then on, folder checksum commands will always create individual checksum files. Of course, if you occasionally want the usual per-folder checksums, you can do the SHIFT thing.
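Put together as an importable .reg file, the edit could look something like this (a hypothetical example; the install path shown is an assumption - use whatever path checksum.exe actually lives at on your system)..

```reg
Windows Registry Editor Version 5.00

; Hypothetical example: adds the "i" switch (individual checksum files)
; to checksum's folder context menu command. Adjust the path to your install.
[HKEY_CLASSES_ROOT\Directory\shell\01.checksum\command]
@="\"C:\\Program Files\\checksum\\checksum.exe\" cri \"%1\""
```

Double-click the .reg file to merge it, and the folder command is updated in one go.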
You could also create additional commands specifically for creating individual checksum files directly from the explorer menu in much the same way as the custom music file commands I outline in the tricks & tips page, here. You could even make the context menu entry specific to movie files, using crim(movies).
If you don't want to play with the registry, simply edit the installer's setup.ini file, adding the switches in there, instead..
HKCR\Directory\shell\01.|name|\command="|InstalledApp|" cri "%1"
HKCR\Drive\shell\01.|name|\command="|InstalledApp|" cri "%1"
and then reinstall checksum to get your updated Explorer context menu items. Actually, that's probably even quicker than editing the registry!
As for the audio alerts; if you want checksum to not beep on successful completion, yet still beep on any failures, simply set beep_success to a blank WAV file.
checksum will still alert you of failures with either a WAV or beep, depending on your preferences, but all success notifications will be completely silent. I've uploaded a small, blank WAV file for the purpose, here.
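If you'd rather roll your own silent WAV than download one, any tool that writes a valid header with zero audio frames will do - for instance, a quick Python one-off (an illustration, nothing to do with checksum itself)..

```python
import wave

# Write a valid WAV file containing zero audio frames: header only, pure silence.
with wave.open("blank.wav", "wb") as w:
    w.setnchannels(1)     # mono
    w.setsampwidth(2)     # 16-bit samples
    w.setframerate(8000)  # the rate doesn't matter when there's nothing to play
    w.writeframes(b"")    # no frames at all
```

Point beep_success at the resulting file and successful runs finish in silence, while failures still sound off.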
Of course, audio alerts can also be disabled completely in checksum.ini, if you wish.
Have fun!
;o)
To those who have read the page, and are now just following the comments..
There is a new section!
Check this out!
;o)
Very good. I also was thinking hashes should be done with point and click.
Thanks!
Hey cor. That's a nice tool - unfortunately it isn't what I'm searching for. Though it might be possible to make it so, easily.
I'm actually looking for a tool with which I can easily check a file against a hash published on a web site. You too publish the hashes for your file here but to check the downloaded zip against it, I'd have to generate a hash file, open it, compare it against the one on the website, close the file and delete it.
So any way to generate and just display a hash to compare it visually? Or even better compare it automatically if a hash is found in the clipboard.
Thanks, chris
Hey Chris. I anticipated this need, and created "simple checksum", checksum's wee brother, which does exactly what you want. It's also very slim, and designed to sit out of the way somewhere, so you can do visual comparisons easily with your browser open - you can even make simple checksum semi-transparent, and leave it there.
As a bonus, you can also switch easily between MD5 and SHA1 (there's an application menu item and HotKeys for both) and simple checksum will recalculate the other algorithm for you, without the need for a second drag-and-drop. It comes free with checksum, just start it up!
Cheers, Laz3r. In fact, you pre-empted my new checksum slogan, there! Once the new beta goes up, I'll upload all the new pages, and you'll see what I mean.
;o)
Well cor, that surely works. But where to put that app, to be invisible when not used, but at hand when needed? Holding Ctrl while starting the app from the context menu does something different than without, but I'm not sure what it's for. I still think it would be nice behaviour to tell checksum to only display the result when a modifier key is pressed.
BTW, when a hash file's content is verified, checksum tries to start cmd.exe. If I prevent that, everything still works. What is the aim of this?
Nonono chris, simple checksum is a totally different application! If you ran the checksum installer, it should have been placed beside checksum.exe, in the program folder; it's the one with the green icon. Click this link; it's a page all about simple checksum.
There are no modifier keys for simple checksum. You just run it, and it stays open, displaying hashes for whatever you type into it, or drop onto it. You can toggle it (show/hide) by simply clicking its tray icon.
checksum doesn't display hashes, ever. checksum's job is to create and verify hash files.
But neither checksum nor simple checksum attempts to start cmd.exe, at any time, for any reason; I suspect something else is happening on your system. Feel free to mail me more details about this if you like.
However, checksum will attempt to run compact.exe after creating logs, to compress them (more details in the itstory), and though compact.exe is a console application (it's built-in to windows) it should all happen in the background, and shouldn't be seen. If it's not happening in the background on your system, then I definitely want to know more about it; OS details, etc, in my inbox, or here, cheers!
;o)
Hi again!
I've been testing checksum on about 1 TB worth of files and it looks good so far. Easy to use, with plenty of features. There are, however, a few features I miss, which you can consider or just file under /dev/null:
- Delete missing files from hash files
I have replaced several files with newer versions, but the old files are still listed in the hash files, which report them as missing every time I do a verify. I can delete them manually from the hash files, of course, but where's the fun in that? It shouldn't be a default option, of course.
- Rename files to match filename in hash file or vice versa
Let's say a file has been renamed for some reason. Could be an irc server replacing spaces with underscores for instance. Or someone has renamed a file to match their own personal format.
The idea is to calculate the checksums for all files in a dir that aren't listed in the hash file(s), see if any of them matches the checksum of a hash entry that is missing its file, and rename accordingly.
Some considerations: empty files should be ignored (and very small ones?). Multiple hash files can point to files in the same dir, so it's probably a good idea to merge as many as possible to find the files that don't have a hash.
Another option that's much simpler to implement is to set it to only run on single hash files. It could then look for missing files in that hash file, regardless of whether they're listed anywhere else.
I'm sure I can give you more suggestions if you're interested. :-)
Keep up the good work :-)
Yes, good work! I've just been testing checksum out on my archive, very impressed. It does lots of things I've wanted to do but couldn't until now. Now I just have to acquire a "decent text editor" (as you say!) so I can have a proper fiddle with all the preferences.
No one has mentioned simple checksum I see, it's very handy too. They make a good pair.
Thanks for all your work!
Excellent! Thank you.
I don't see where version 1.1b can be downloaded. Has it been released yet?
Thanks guys!
Cheers for the suggestions, Para Noid (and the other thing!). "Delete missing files from hash files" sounds useful, but highly risky; like you say, it wouldn't be the default! I'll definitely consider a special switch for this, though.
"Rename files to match filename in hash file or vice versa" I'll have to think about. If I did something like this, it would almost certainly work on a line-by-line basis, so the location of changed file names would be whatever scheme the original name was, regardless of relative or absolute paths. You can mix them all up real good in the latest version.
Setting this for only "single hash files" is is doable, so long as we remember, the idea of a single hash file is purely a virtual one, at least in verification (there is no "i" switch for verification, for this reason). A hash file is viewed simply as a container for hashes. Where each individual hash points, is determined on a line-by-line basis. I'll definitely consider all this, though.
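Meantime, if anyone fancies pruning missing entries themselves, it's a simple scripting job. A rough sketch only, assuming the common md5sum-style format (one "hash *relative-path" entry per line, paths resolved against the hash file's own folder) - and back up the hash file first, obviously:

```python
import os
import sys

def prune_missing(hash_file):
    """Drop lines whose referenced file no longer exists (backup first!)."""
    base = os.path.dirname(os.path.abspath(hash_file))
    kept = []
    with open(hash_file, encoding="utf-8") as fh:
        for line in fh:
            parts = line.rstrip("\n").split(" *", 1)
            # keep comments/blank lines, and any entry whose file still exists
            if len(parts) != 2 or os.path.exists(os.path.join(base, parts[1])):
                kept.append(line)
    with open(hash_file, "w", encoding="utf-8") as fh:
        fh.writelines(kept)

if __name__ == "__main__" and len(sys.argv) > 1:
    prune_missing(sys.argv[1])
```

Files whose names happen to contain " *" would confuse this wee sketch; a real implementation would want sturdier parsing.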
notnymous, it's mostly done. Unfortunately, I've been away a lot this last week, and that will be true for a few days, yet, so the list is slow-going. I'm hoping to have 1.1b up this weekend, but that's not a promise!
Meantime, you could play with Batch Runner, a small but useful wee app I put up tonight; something I've been trying to make time for all week. I built it to run batches of tests on checksum before releases, but it's already proving handy for other jobs.
;o)
ps. Randy, also, their icon colour schemes complement each other beautifully!
Nice. Thanks.
awesome!!
I got an error installing on Vista (SP1) about the zip.dll not getting registered. I got the folder out of the zip, ran it again, and it worked, but I thought you should know.
Great app, it was worth it!
All the best.
Bobz
Sorry about that, Bobz. It's actually sorta fixed, but y'all have to wait. For others who experience this, simply unzip the files.zip inside the "files" folder (yup, that's right, it will be /files/files/) and edit the setup.ini to read "files" instead of "files.zip". Continue as normal.
That x-zip plugin has been more trouble than it's worth, and while working on corzipper (to come), I decided to bin it. I've now built a whole new zip plugin especially for my own stuff, which will be used for the installer, as well as some of my other apps (corzipper/backup/and more). As well as great performance, it boasts password-protected zip and unzip, which I utilize elsewhere.
x-zip will still be supported (my apps will look for my own zipper first, and fall back to x-zip if it's not available), but I'll be phasing it out ASAP. As well as occasionally not registering itself, it uses available memory for all its operations, so while it's good for small things, installers and such, it starts to wobble as you increase the size of the archive. The new zipper plays nice with your RAM, and never uses more than around 8MB to zip or unzip an archive of *any* size.
The new zip plugin is just one of a whole host of new stuff coming to the corz.org windows software area, soon. I also plan to install Vista myself this week - I hear SP1 makes Vista "not-crap", so here goes!
Probably all my stuff is full of Vista-related bugs, so if all goes to plan, I'll start squashing them in a few days.
l*rz..
;o)
Just wanted to add my support for Para Noid's suggestion for dealing with renamed files. It would definitely save time compared with renaming them in the hash file, since a lot of the time I will hash files as soon as possible when I get them, and only worry about renaming them later.
Totally awesome program, though! It was exactly what I was looking for. It's amazing how far behind most of the other programs out there are.
Thanks Cor
PS: noticed you are hashing "Van Diemans Land" in the other screenshot. I live in what used to be called Van Diemans Land!!!
Hi Cor,
Your software looks like what i've been looking for, although I'd like to confirm a few things.
I've used a few crc/md5/sha file checkers that will do just that, ie compare a single file to a published MD5 value etc.
However, I'm after a program where I can, say, create an MD5 of the sum total of a folder with hundreds of files (or even a volume with hundreds of folders and thousands of files, etc.).
There have been some problems in the past with external hard drive (USB) chipsets vs file sizes etc. Plus, where any data is being moved, say from your PC to flash memory, DVD media or an external hard drive, I would find it useful to check that the files are an exact copy. I don't need an MD5 created for each file; simply, if I copy and paste a whole volume (bar hidden system files etc.), can I use your program to confirm that the 100 GB of data was copied to its destination successfully?
Thanks.
Fated to use checksum! That's just what I happened to be listening to when I did this page. In truth, I don't even know where Van Diemans Land is! Ireland? *ahem* anyways..
Thanks Jeff; I'll definitely be looking into the renamed files thing during my next code block; though it will most likely be a special function, quite removed from the regular create and verify processes. Reason being, during verification, checksum doesn't care what files are in the folder, doesn't even look; its only concern is files listed in the hash file, so checking for renamed files could potentially introduce some overhead which I wouldn't want affecting regular (super-fast) usage. Perhaps a "refresh hash file" function. Hmmm.
The way I usually do it, currently, is to verify the hashes, then make changes, renaming, ID3 tags, etc, then write a completely fresh hash. It only takes moments with checksum. Still, a refresh function would be handy.
Vorlon, checksum has no problem handling whole volumes with hundreds of gigabytes, even terabytes of data (I recently received a gushing thank-you mail about checksum's brisk work on a 5TB archive, so I can say that with certainty!), and if you wish, you can put all the hashes into one single "root" hash file (again, checksum can handle massive root hash files). Though initially dubious of the usefulness of root hash files, I find myself using them more and more these days, particularly in cross-platform work.
Also, checksum will happily hash hidden files, system files, even locked and in-use files, so you can be 100% certain that the volume was copied exactly. Or not, as the case may be!
Give it a whirl!
;o)
ps. here's a tip.. I note that someone mentioned (either here, earlier, or in a mail) that they found reading two hashes (to compare them) a choreful process; here's how I do it.. Paste hash one into an editor, then select all and paste hash two over the top of it. If you then Undo and Redo, you can see any variations at-a-glance. I think most people just read the first and last few characters, anyway!
In the future, simple checksum will do this automatically, but that tip's for the meantime!
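If you prefer a script to an editor, the same at-a-glance comparison is only a few lines; a wee sketch (assuming both digests come from the same algorithm, hence the same length):

```python
def diff_digests(a, b):
    """Return the positions where two hex digests differ (empty = match)."""
    a, b = a.lower().strip(), b.lower().strip()
    if len(a) != len(b):
        raise ValueError("digests are different lengths - different algorithms?")
    return [i for i, (x, y) in enumerate(zip(a, b)) if x != y]

# identical digests -> no differing positions, i.e. an empty list
print(diff_digests("d41d8cd98f00b204e9800998ecf8427e",
                   "d41d8cd98f00b204e9800998ecf8427e"))  # []
```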
Hi Cor,
Is it possible to do retrospective checking on read only memory storage ie previously burnt cdr/dvdr media with checksum?
Scenario:
Say you have already burnt an archive DVD-R of your drivers and apps etc prior to having installed or even known about checksum, but the directory/file structure still remains exactly as it was on your hard drive volume when you did the copy.
Could you then install checksum to your system and then hash that original source volume/drive/folder/file structure etc, then afterwards use checksum to verify the DVD-R but use the *.hash files of original source (on the Hard drive) as the MD5 reference of the files on the DVD-R?
Basically does checksum need a *.hash file/s on the media it's checking in all cases or can it do it "remotely" (so to speak)?
Best regards,
Steve
Yes Steve, checksum's operation on read-only volumes has received quite a bit of attention, internally, and there are a number of ways to achieve what you want.
Probably the easiest method in your situation, is to work in reverse, that is; create a "root" checksum of the read-only volume, and then simply copy that to the original location on your hard drive, and click it. You're done.
Note, checksum won't create a root hash file by default; you can use the one-shot options to set that, add the "1" switch manually (if working from the command-line), or, if you do this sort of thing a lot, set checksum's read-only "fallback_level" to 1, in checksum.ini.
There's more information about checksum's various read-only fallback strategies, here.
I should add, you can also create and verify hashes containing absolute paths, for truly "remote" hashing, as you put it. This can be highly useful, but could be limiting if some fixed volume got a letter change in the future, or was moved to a different machine where its original drive letter was not available.
However, in a future version, I plan to add path "mappings" where absolute hash files can later be transparently remapped to new locations. It's one of those features that would probably be rarely used, but occasionally extremely handy; at least, it would be useful to those not familiar with regex search and replace in their text editors!
Anyway, that's for the future; what you want is doable right now, with a simple root hash.
;o)
Hey cor,
thanks for your reply.
Tasmania, Australia was originally named Van Diemens Land.
Hashing is a great way to track file changes, but is there any way to safeguard against the hash files getting corrupted?
cheers
Jeff
Tasmania! What a beautiful word that is.
Guard against corrupt hash files? There are a few ways. You could start by making a backup of the hash files. It's fairly easy to create an archive of only the hash files, even from a volume of many individual hash files. When decompressed in-situ, all the hash files would drop back into the correct locations. Also, hash the archive!
Then there's absolute (aka 'remote') hashing, which I discussed in my previous post. You could keep the hash files somewhere else, and still hash the original files as if they were right next to the hash files. The hash file itself could live on some solid-state, read-only volume.
For the extremely cautious, after hashing the volume as normal, you could temporarily remove .hash (or .md5/.sha1 if you use those extensions) from checksum's ignore list, and do a root hash of *only* the hash files, using absolute paths, and store that hash file somewhere safe. Before checking the file hashes, you would check the hash hashes!
And don't forget double-hashing! You can have a sha1 AND md5 hash for each file, inside the same hash file, if required. It would be extremely unlikely, if corruption is the potential issue, for both hashes to become corrupted.
Finally, if intentional tampering is a possible issue, storing absolute hashes elsewhere is probably the best method. And don't forget, checksum will happily make invisible hashes, always handy.
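The backup idea above is easy to script, too; a rough sketch, assuming checksum's default .hash extension, which sweeps every hash file under a tree into a single zip (with relative paths, so it can be decompressed in-situ later) and then returns an MD5 of the archive itself - hash the hashes!

```python
import hashlib
import os
import zipfile

def archive_hash_files(root, archive_path):
    """Collect every *.hash file under root into a zip, then MD5 the zip."""
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for folder, _dirs, files in os.walk(root):
            for name in files:
                if name.lower().endswith(".hash"):
                    full = os.path.join(folder, name)
                    # store paths relative to root, so the archive can
                    # later be decompressed in-situ
                    zf.write(full, os.path.relpath(full, root))
    with open(archive_path, "rb") as fh:
        return hashlib.md5(fh.read()).hexdigest()
```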
for now..
;o)
gud utility
Absolutely Superb!!!
Thank you.
Hi Cor,
I've finally got around to using Checksum properly, and started moving some of my old data from an old PC to my new build. Checksum is fantastic in its "intelligence", ie knowing that you are using read-only media when making a root hash, and storing it safely on your hard drive for later reference. At first I was looking for where to enter a path for the hash to be stored, as the media being checked was read-only. But guessing you had catered for that, I just ran the program to see what would happen - brilliant!
Another great thing with the root hash is folders on your PC like say the Steam Folder which might contain GB's of data, can be copied from say your old PC to your new PC and then checked without worrying that Valve's VAC system might see innocent hash files in some of the directory structure as modifications. Of course you could remove the individual hash files after moving the data, but that would take time.
I just realised that I made a mistake adding the k switch along with the 1 switch, especially when comparing removable media with a different drive letter - doh! (and you had pointed that out, above). Well, it was late and I was trying to be thorough. Although I did realise that the k switch gives the absolute path (ie including drive letter), I hadn't realised there was still a wealth of information given in the log file by default, ie the subdirectory path. I had thought the use of the k switch implied all or nothing, ie full path or just the file name.
Checksum has got to be the best utility I have found and used in recent years.
Steve
Thanks for your comments, Steve - I always get a kick, hearing other folk appreciate checksum's "intelligence", as you call it. A lot of thought goes into it.
The main design goal, with checksum, was to create a hashing utility that would "JUST DO IT!", skipping as many pointless steps as possible and, as far as possible, getting on with what the user wants done, immediately, and without prompting. I'd already wasted a significant portion of my life instructing daft hashing utilities to do the same thing over and over. As Kosh said, it was necessary!
But, as you have discovered, checksum is also able to handle those "unusual" jobs, and absolute, root hashing is but one of its many tricks. If you find yourself doing this sort of thing a lot, I definitely recommend you check out the 1.1 beta, which hasn't prompted a single bug report (it will become the release version soon enough). Being able to see (and edit) the switches inside the one-shot GUI can be extremely useful.
By the way, if you inadvertently add a "k" switch; being plain text, it's a fairly trivial operation to run a search & replace on the .hash file, in your preferred text editor; switching the drive letter to another, or even removing it altogether; making the hashes "relative". I've done it more than once, myself.
;o)
Quite right about the "Kosh" comment Cor :-)
I had forgotten to add to my above post that I am using the version 1.1beta, which works fine.
Checksum has already brought something minor to my attention, through burning a DVD with the full Nero 7. Occasionally I save technical webpages, especially along with some utilities, so I don't have loads of printed paper etc. However, and I have noted this before, Nero (as with other burning utilities) truncates long file names, which are used in abundance these days on webpages.
So after checking my DVD-R media, Checksum alerted me that X amount of files were missing. Of course, Nero had effectively renamed the files and directories of the saved webpages by truncating them. I believe the basic Joliet system allows up to 64 characters, but I have found some info on the Net (not from the Nero site, unfortunately) that they relaxed this limit in Nero 7.
Anyway, something I hadn't even checked or considered (although I had seen it before) was picked up by Checksum.
The beauty of Checksum in this scenario is that it's not limited to just reporting a different, ie non-matching, checksum; it alerts you, too, to missing files....
Also I like the html format report summary....
Steve
Hi Cor,
Would it be possible, when you Verify with Checksum and check the "Log Everything" box (and add your own LOG path if necessary), for the resultant HTML report file to group "Passed", "Missing" and (I assume, as I haven't seen it) "Failed" together?
No doubt most people would just use the default of logging a problem, but then I assume "Failed" & "Missing" wouldn't be grouped either?
Although the html report is very clear, you still have to scroll through it to make sure you haven't missed a specific error amongst a plethora of "Passed" or "Missing" etc...
Thanks,
Steve
checksum has brought a few things to my attention, too; things I would have otherwise missed. Interesting about the Joliet truncation; I'll see if I can get checksum to look for those shortened DOS file names during verification. I haven't encountered this myself - when I burn disks, I generally relax everything; it's not like computers are going backwards!
To avoid long file names in saved pages in the first place, investigate MAF. I use this almost exclusively for saving web pages. Essentially, it's a zip file, with all the resources safely tucked inside; very handy (so long as you do NOT use the "MAF Zip Archive" format - regular MAFs are zips already - strange but true).
One highly cute thing about .maff archives is, if you have multiple pages in a series, you can save over the old .maff file, and rather than overwrite it, Firefox adds the new pages to the old .maff. Another interesting undocumented feature I discovered about MAFs, is that if you save your pages in reverse order, when you click on the .maff file, the whole lot loads back into your browser, in the correct order. Nifty.
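Since a regular .maff really is just a zip, any zip tool can peek inside one; a tiny sketch:

```python
import zipfile

def list_maff(path):
    """List the files tucked inside a .maff archive (it is a plain zip)."""
    with zipfile.ZipFile(path) as zf:
        return zf.namelist()
```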
About the logging; true, most people log only errors, but even if you log everything, it's easy enough to locate a fail using type-to-find. I see you use Firefox, and I'm assuming it still has this capability (all browsers will have this, eventually, surely!). I use it hundreds of times a day.
One caveat; if I recall correctly, by default this functionality doesn't auto-start, or only works on URLs or somesuch. At any rate, the following in Firefox's user.js gets it working beautifully..
// "Find As You Type" settings..
// main on/off switch
user_pref("accessibility.typeaheadfind", true);
// set to true to auto-start Find As You Type
// if set to false (the default), you must hit / (find text) or ' (find links) first
user_pref("accessibility.typeaheadfind.autostart", true);
// set this to true to only search for links. how crazy is that?
user_pref("accessibility.typeaheadfind.linksonly", false);
// only applies to above, with this true, you have to type the *first* characters of the link
user_pref("accessibility.typeaheadfind.startlinksonly", false);
// how many milliseconds before Find As You Type stops watching your keystrokes..
user_pref("accessibility.typeaheadfind.timeout", 3333);
Grouping the failed and missing files in the log would be problematic to code - checksum basically logs as it goes - but I'll certainly look into it as a possible future option.
Thanks for the feedback.
;o)
Thanks Cor,
I will check MAF out properly and see if it would be preferable for me to use, which it looks like it might be :-)
Talking about "Nifty" ideas, I keep noticing refinements in Checksum that, if it were another utility, probably wouldn't be added until version 6, 7, 8, 9 etc.....
The more I use it, the more I can see that so much is already catered for.
Hopefully it'll get a lot more exposure, because imo it's one of those "Must Have" utilities.
Steve
Agreed. Agreed. And Agreed!
I love it when folk notice how special checksum is! I guess I would say that; at least here I would; when it comes to more exposure "out there", I leave that to you guys, having a strict no-self-promotion policy!
In other words people, feel free to drop a link to this page anywhere you like! I see you already did, Vorlon. Much appreciated, ta!
A little while back, someone made a post about a Windows accessory of mine1 (KDE-Sizer) on lifehacker, and as well as sending downloads through the roof, it got its page posted and linked-to in all sorts of places over the globe. If something similar were to happen to checksum, I'd definitely approve!
About the MAF archives, Steve, there's one drawback - the contents don't show up in a search-in-files2, being inside a zip archive. Because of this, I tend to give my MAFs long, descriptive titles - I'll definitely need to relax everything when burning those to disk!
Keep on Hashing!
l*rz..
;o)
2. Google Desktop can search inside them; check out the IFilterShop Google Desktop Search plugin.
Sterling work.
Cor,
I just checked out your utility in addition to a few others today. The synchronize function could be a little more robust.
Google a utility called SmartMD5. The author has not updated it in a while, but take it for a spin and then examine his hash file. It's not completely standard, but the additional metadata does facilitate fast, intelligent hash updates.
I like that you provide command-line functionality.
BlackholeForever, if you have any specific issues with the sync, I'd very much like to know the details.
As for SmartMD5, I remember testing it a couple of years back (I grabbed it again just now to check - yup). Aside from the infuriating gui and usual bag of assorted time-wasting foibles that checksum was specifically designed to avoid (that's as far as I got with it, last time around), it uses a completely non-standard hashing format. That's not really a lot of use these days. checksum can work with any standard md5/sha1 hash format from any platform, but I do not plan to even consider adding support for that!
As for "metadata", that would easy enough to add, and without messing up the hash format itself, either. The trouble is, I can't see any reason to. I'm always open to good suggestions, though.
I'm glad you appreciate the command-line functionality. Every program has a command-line, so it makes sense to make the most of it!
And really, about the sync; if you have any issues with it, anything at all, I want to know. If you have a lot of details/data, mail me. Ta.
;o)
I'm a bit late to the party, but, in response to Cor's tip of pasting two sums on top of each other into an editor to compare them:
You can paste just one of the sums into the editor and search for the other sum. If a match is found, both sums are the same, obviously. And you don't even have to read a single number...
But yeah, I mostly just scan the numbers to see if they're similar.
I use simple checksum to compare files, drag one in, see its hash, then drag the other, and it's easy to see any changes.
Great app! (both of them!)
LiLo
Cool app. Glad to see more work being done in the checksum space. I tried `checksum' and when it encounters filenames greater than MAX_PATH, it seems to ignore the files, but does not give an error. Sounds like a feature for V2? Thanks!
Thanks for the feedback, LowTek, but understand, it is the Win32 API itself that does not support paths longer than 260 characters (MAX_PATH)..
http://msdn.microsoft.com/en-us/library/aa365247.aspx
While it would likely be possible to create a workaround using Unicode path schemes, such as \\?\C:\<path>, I'm not entirely sure that would be wise. Files whose path length exceeds MAX_PATH cannot be manipulated in the shell, and a scenario where checksum might actually create such files is unthinkable. The error-checking and fall-back handling would be a headache in the making.
It would be doable, but certainly not trivial. And the benefits would only be felt by a slim minority, who haven't even bought a license!
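For the curious, the workaround would look something like this; a sketch only, under the assumptions above (the \\?\ prefix is only honoured by the Unicode Win32 APIs, and wants an absolute, normalized path), and all the stated caveats stand:

```python
import hashlib
import os

def extended_length_path(path):
    """Prefix an absolute Windows path so the Unicode APIs accept > MAX_PATH."""
    path = os.path.abspath(path)
    if os.name == "nt" and not path.startswith("\\\\?\\"):
        path = "\\\\?\\" + path
    return path

def md5_long_path(path):
    """Hash a file even when its full path exceeds MAX_PATH (Windows only)."""
    digest = hashlib.md5()
    with open(extended_length_path(path), "rb") as fh:
        # read in chunks, so huge files don't eat RAM
        for chunk in iter(lambda: fh.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()
```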
;o)
Awesome
In case you watch these comments, as folk do: simple checksum now has file compare. If that was good news to you, then I recommend you subscribe to the /devblog/; it's fairly low volume, but the plan is to announce all dev news there.
FILE COMPARE!
Simply drop two files onto the input instead of one. Dropping onto the program icon works the exact same way; simple checksum hashes one, then the other; and then reports whether or not the files match. Simple.
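Conceptually, the comparison is as simple as it sounds; a hedged sketch of the same idea (not simple checksum's actual code, of course):

```python
import hashlib

def files_match(path_a, path_b, algorithm="md5"):
    """Hash one file, then the other; report whether the digests match."""
    digests = []
    for path in (path_a, path_b):
        h = hashlib.new(algorithm)
        with open(path, "rb") as fh:
            # stream in chunks, so large files don't eat RAM
            for chunk in iter(lambda: fh.read(1024 * 1024), b""):
                h.update(chunk)
        digests.append(h.hexdigest())
    return digests[0] == digests[1]
```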
Please download, test, and if you find any weirdness, mail me, ta.
Have fun!
;o)
p.s. The beta is in the checksum beta folder.
THANK YOU!
This is the one thing I felt was missing from the checksum package, so thanks. I'm always comparing files. I've just tested it (XP, SP3) and it's working great.
Truth is, I've been looking for a good reason to post. I've told a few people about checksum - and sent them hash files! - and got a good response back, too. Anyway, good stuff! Keep it up!
-fm
This is gonna be one of those nights! First, thank you. And that's doubly, because I would have thanked you before, but didn't know how. If that's cryptic, then know, I know you! Well, of course, I don't know you, but I have something you did, with a hash! And so that's thanks X3! For the added publicity..
By the way, they need a general release, know what I'm saying..
;o)
Very nicely implemented!
First off - LOVE the app. I'm moving 6TB of data from two RAID 5 arrays to another single store and had no simple way of checking it worked perfectly. Of course, now I have to wait a while for 6TB of data to hash...
Secondly - a feature request: I am hashing files on RAID arrays on multiple drives (as in entirely separate drives, not just logically different), and as such I can do at least two checksums at once with no issue. Could you add an "allow multiple instances" option for use in such cases? It would literally halve the amount of time my hashes take!
Hi Peter. First off, thanks. I wager that if I had given checksum fewer features, or perhaps more bugs, many more folk would drop in to say stuff like that. *phew*
Secondly, your requested feature was added back in v0.9.22, you simply need to mindfully enable it in the prefs..
allow_multiple=true.
For more information, check out the version.nfo (link at the very top of this page).
In my latest (beta) version of checksum, if you have this set to false and attempt a second instance, it pops up a dialog telling you exactly how to enable it (rather than just letting you know that you currently cannot).
If you use checksum a lot, I definitely recommend having a read through the version.nfo; there are a lot of tricks and tips in there, though it is rather long. At the very least, chuck checksum.ini into a text editor and have a scroll. There are probably other things you wish it could do that it already does!
for now..
;o)
AMAZING Software! I've been using it for weeks and it's way better than anything else for hashing. Now a feature request! I'd like to be able to rename files that I've already made checksums for. Is that possible? Pretty Please!
Is the checksum 1.1.6b beta available for download anywhere? I see it in the changelog, but can't find it anywhere, not even the beta section of your distro machine. As always, keep up the great work!
Hi Cor,
First, thanks for a great program. Very fast and does everything you stated in an intelligent way.
Second, a feature request. Suppose I wanted to create a checksum for 10 thousand historical images that I intend to back up in archives to be burned to DVD. I want a checksum to remain on my hard drive so I can verify whether any new images I accumulate are ones I already have. In this case, the fact that the name must be identical is a problem, as many files get renamed.
Out of 10,000 files, could your program give a similar message to this: "9939 exact matches, 57 hash matches with wrong names, 4 missing files. Would you like to add the exceptions to the log? (Y/N)"
This would not require going into the .hash file to rename anything, but it does let me know that, out of 10,000 files, I already have some of them (under different names) and can delete the newer duplicates before I make my next DVD backup.
This would add tremendous functionality to an already great program. Imagine being able to know whether or not you already have a file (verified by md5 or sha1 instead of crc32) on a dvd without having to pull it off the shelf and unarchive the contents. That the file has been renamed is of little consequence to me in my scenario, but the program would specify exact matches versus matches with wrong names so as to meet everyone's needs at the same time.
Thank you for considering my suggestion.
- Nameless Checksum
Thanks for the positive feedback, Nameless Checksum, whynotbuyafeckinlicensethen! *ahem*
This sounds like tremendous functionality with tremendous overhead. For example, I have a folder with 100 files, but I only keep hashes for 25 of them; this is intentional. With this functionality enabled, checksum would NEED to hash every single file in the folder, to discover if any of the hashes match. The whole operation, then, would be 300% slower. It's something I might consider having on a switch, but not the default behaviour.
The fact that your newer files might have different names from the original files sounds like a situation specific to your own archiving scenario. In the real world, files usually get new names only when we give them one. Somehow you have downloaded files you already have, except under new names. And because of this error (whoever renamed the files in the first place), you download them again, not realizing you have them already. And then you delete them anyway. The solution is to stop this crazy renaming at its source! Mail them!
At any rate, the whole "rename" business is something I am already considering, and will doubtless tackle when next I sit with the checksum code (i.e. when I get motivated to devote yet more time to it). The functionality you seek will be rolled into this anyway, your dialog would more likely read.. "9939 exact matches, 57 renamed files, 4 missing files. Would you like to: [rename hashes to match files] | [log errors and continue] | [just quit]" or something along those lines. We might be looking at a special rename mode.
In the meantime, you do not need to unarchive the file to discover a duplicate. Simply hash your new file (maybe drag it into simple checksum, perhaps with auto-copy enabled) and then search the .hash file for the hash string; a few key presses, much quicker than pulling a DVD from the shelf!
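That search is scriptable, too; a rough sketch which hashes the new file and scans a .hash file for the digest (again assuming the standard "hash *filename" line format):

```python
import hashlib

def find_duplicate(new_file, hash_file):
    """Return the archived name whose digest matches new_file, else None."""
    h = hashlib.md5()
    with open(new_file, "rb") as fh:
        for chunk in iter(lambda: fh.read(1024 * 1024), b""):
            h.update(chunk)
    digest = h.hexdigest().lower()
    with open(hash_file, encoding="utf-8") as fh:
        for line in fh:
            parts = line.strip().split(" *", 1)
            if len(parts) == 2 and parts[0].lower() == digest:
                return parts[1]  # same content, possibly a different name
    return None
```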
;o)
Thanks Cor for the quick reply.
I downloaded your program the day I wrote, and am still putting it through the paces and becoming familiar with all of its features. I have bookmarked your site, and when finances allow I hope to return to thank you for your work.
I see now that my request was more complicated than I realized. I have made CSVs before with another program for the DVD verification scenario that I mentioned, but it was not as fast as yours, nor as thorough (CRC-32 only), so I was naturally curious how closely yours could come to what I was already familiar with.
Renamed files is a huge problem. I could have "Abraham Lincoln.jpg", "Abe Lincoln.jpg", "Abe.jpg", "Lincoln.jpg", "President.jpg" etc. etc. and not know it. That is why I asked about hash matches with wrong names, so I would know which files I can delete today rather than archive. I do not expect to be searching a hash file on an item by item basis on the possibility that some of the files are duplicates, however. There are too many other pleasant things in life to enjoy, and life is far too short as it is.
At any rate I enjoy your site. Good luck to you.
- Nameless Checksum
Not so quick this time!
The more I read about your situation, the more I think you could be doing with some kind of specific synchronization utility. When I was playing with Vista, I came across a rather good free one, from Microsoft, of all people. "SyncToy", I think it was called. Many of these apps (and there are many) will allow you to keep off-line databases, so to speak, of the archived files, and automatically synchronize the two based on a set of user-defined rules.
I'm still curious as to how you come to have so many potential duplicate files with differing names. This seems so much like a problem with the other end of things, where the images enter the system in the first place.
Good luck!
;o)
Hi cor,
great program you wrote!
I have a question and a comment.
If I create a central root checksum file for a tree of many big files (in my case: .TS video recordings from my DVB-S receiver), I can simply include the checksums for the new recordings I copy daily/weekly to the folder structure.
But if I decide to move already-existing files (on disc and in the central checksum file) to other folders in this tree (for example, I move the folder "The Simpsons" from "Comedy" to "Animated stuff"), the relative and absolute paths of the files change... and the old entries are not deleted automatically from the central root checksum file. This is "wanted" by design (and it's good in my eyes).
The question is: is there a switch / option I can set so checksum "cleans" the central root checksum file by deleting entries for no-longer-existing files?
If not, maybe you could consider implementing such a switch:
"Clean checksum file" -> deletes old entries from the checksum file (but does not create new entries or verify files in the same run).
And the comment:
>I'm still curious as to how you come to have so many potential duplicate files with differing names.
One guess: you download thousands of music or picture files from wherever you get them. You have no influence over their names... and you cannot get the "publishers" or "creators" to agree on a naming scheme.
(Yes, it's a typical scenario for people doing illegal things... no other real scenario where I download thousands of files comes to mind...)
Michael
Looks like you've got a winner of a tool here, but I've got one key question:
Is there a way to get checksum to recursively check a folder (and its sub-folders) and drop the checksums into a *single* file that's located somewhere else? I don't like intermixing .hash/.md5/.sha files with the normal data files, even when hidden (my Windows Explorer default is to show hidden files).
This would be very similar (I believe) to how you handle read-only media.
I have the same issue Michael posted about:
I want to remove old entries from the hash file, because I moved the files around, or to a new place.
Can you develop this?
If this were available, your software would indeed be complete and I would buy it many times over!
Interested, yes, create a "root" hash. Launch checksum with the SHIFT key, and you will see the option in the one-shot dialog. You can also set it in the prefs to always create hashes this way, if you prefer. There's also a command-line switch for it. See the help files.
Friend, that's not how it works. You support the author, THEN he gets motivated to add extra features. But so few do. I've already put in hundreds of hours for near-zero financial reward. People keep telling me how great checksum is, but I already know that, I use it dozens of times a day. I'd now prefer some cash, thanks.
Michael, Friend, it's definitely doable. In fact, I have something similar penciled into my 2do list. The plan is, on verification, if there are missing files, checksum will ask if you wish to remove those hashes from the checksum file. Simple.
Of course, the question is whether or not I can get around to adding these things. Currently, nothing could be farther from my mind!
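Until something like that lands, the cleanup can be approximated outside checksum. A hedged sketch, assuming standard md5sum-style entries ("<hash> *<relative path>") and running from the root of the tree; "root.md5" is just an example name:

```shell
# Copy across only the entries whose file still exists on disk;
# entries for moved or deleted files are dropped from the copy.
while IFS= read -r line; do
  path=${line#* \*}                 # strip the "<hash> *" prefix
  [ -e "$path" ] && printf '%s\n' "$line"
done < root.md5 > root.md5.clean
```

Note this only removes stale entries; files that were moved would still need re-hashing (or a hand-edit of their paths) to reappear in the list.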
;o)
After downloading checksum, and before using it, my virus scanner detects this trojan in the download file:
Trojan horse Generic_c.AJOW. Is this for real or part of the program?
See here.
;o)
A quality product. Thank you.
@Me
Re: Virus False Positive.
There are a few AV products out there that use the same engine, and hence the same definitions, so of course if one flags x file as a false positive, they all will.
However, for peace of mind, I always run any file my AV might throw up a false positive on through www.virustotal.com.
I tend to hoard some files, like Nvidia/ATI drivers (for example), and it's amazing that these files can be "clean" through years of scanning and then suddenly my AV suggests I have some sort of virus/trojan in one of them. 99% of the time, too, they "have" (apparently) a virus that has no write-up or database entry and is AV-vendor specific. Sometimes, too (and it can be months), they suddenly become "clean" again.
I'm quite wary of files and download sources anyway and can honestly say that since 1994 (and an early Norton AV Dos version not long after that), I've only had one confirmed virus/trojan ever, which was happy99.exe
Use VirusTotal, though, as it will check the file with many up-to-date antivirus programs, including the industrial-strength Sophos Anti-Virus (amongst many).
If it says the file has already been uploaded (i.e. someone has tested the file) you can always upload yours and have it retested.
It's interesting to see which AVs show the false positive, because often you can see they use the same engines; e.g. Kaspersky might license their engine for another vendor to use. The real giveaway is that the virus definition version is exactly the same.
As for buying a checksum license, I've bought two.
I bought two because I use checksum on my laptop and PC, although Cor tells me one would have been sufficient. However, at just £5, and for the amount of time and hassle checksum has saved me (plus peace of mind), it's worth, IMO, much more than that - hence I wanted to buy two.
When I see some other utilities, say through CNET downloads (or similar) at £20-£25 ($30-40), that do very little AND impose strict time allowances or functionality limits "for testing/trial purposes", checksum IS a bargain.
I hope Checksum gets more exposure as it wouldn't surprise me if there are companies out there with large databases that would find Checksum more than just useful in data transfer integrity checking, but an absolute Must.
EDIT
Cor, I'm having problems downloading the latest version of checksum with both Firefox and IE. Unusually, the result of the download is around 200k (a few times). It seems after a few attempts I've managed to get an uncorrupted version of the zip file. I don't usually get any corrupt downloads.
P.S. I'll have to look up one of my old emails from yourself, but is entering the license info the same in the new version of checksum?
Hi V! (who represents 50% of my licensed user base!)
You know, when I started working on making checksum something I could distribute, that exact thought was at the back of my mind. I saw, and still see, a gap in the market, where an intuitive yet highly functional data integrity checking utility like checksum would fit perfectly, and actually does. The stampede of checksum industrial licenses, as I saw it, would easily sponsor my other activities, keep all my media and information free, and my software "trial periods" infinitely long. How wrong can a man be?
It's definitely an exposure thing. I see a new site with a post about (my) KDE sizer almost every other day, after a single post on lifehacker months back. I'm getting by right now, so I can just watch and muse and think wishfully. Of course, that won't last forever, and the whole shirtware license thing will likely need to be overhauled. Bummer.
It's a culture thing, I guess. We're used to getting stuff for free; software, music, media, because mainly it's theft from "large, corrupt corporations", and is morally okay in most people's eyes. The trouble is when that thinking spills over onto the hand-crafted output of the likes of me, making the viability of any business model relying solely on the honesty of its users dicey, at best.
Och well..
;o)
ps. I've just tested both browsers, and again on a remote desktop, just to be sure. 100% every time (around 1.04MB). Erm.. Maybe some cache between you and me? Otherwise, usual tricks; delete browser cache and cookies, restart it. If that fails, it's never a problem to mail me for the latest copy. I'm always happy to go the extra mile for my users!
pps. YES! That reg code is good for the life of checksum v1.*, and as checksum users will be aware, there's a lot that goes on between one version digit and the next, so it will probably be years before you need to think about buying another.
ppps. Yes, my feet are a mess of bullet-holes!
McAfee detects setup.exe as W32/Autorun.worm.gen - type Virus. How come?
Hey there, just a quick note, very nice and useful software. Don't buy too much whiskey with my £5!
One small issue though, hashing across the network seems rather slow... I copied 274GB (all legal I swear, Officer!) to my local drive at around 30MB/s, and hashed it locally in 74 minutes (so roughly 61MB/s). However hashing the same data stored remotely is running at not more than 9MB/s i.e. a lot slower than the capacity of the link would suggest. I would've thought that hashing the data is simply a linear read operation? Unless I am missing something here.
Anyways not a big deal, it'll be done by tomorrow morning.
Thanks again for this very handy piece of software!
- James
McAfee detects setup.exe as W32/Autorun.worm.gen - type Virus. How come?
Looks like a False Positive
For peace of mind, check the file with around 40 AV scanners at www.virustotal.com
Looks like a great program and I would love to switch over, but would you be willing to add CRC-32 (.sfv) verification to the routines?
I am part of www.redump.org, which hashes many things, and a one-stop shop for hashes (as well as posting only a .hash file instead of .sfv, .md5, and .sha1) could help make the site smoother and help convert a fairly large community to your program.
Of course, CRC-32 verification/generating needn't be mandatory, perhaps a .ini setting?
Looking forward to your reply!
---Qriist
PS,
Also, maybe add a function switch in your "hash type selector" to automatically do all hash types?
Cheers!
---Qriist
Thanks James, I look forward to getting that! (note, my PayPal was offline for a few days - I forgot to update my site prefs with my new PayPal email address - oops!)
As for your network, I dunno; there's not enough information to draw any valuable conclusion. I didn't have networking specifically in mind when I created checksum, and all tests passed, though its file routines are most definitely optimized for local files. The data is read fairly linearly, yes. But like I say, I'd need more info to consider this. Feel free to mail me wads of it, and I will look more closely.
Qriist, this has been asked before, or thereabouts. I might add Automatic SFV Conversion/Update; perhaps. I've currently completed my 2do list for the hashing dll (where this code would need to live), so it wouldn't be any time soon. Having said that, if I were to suddenly receive large amounts of support, I could definitely look at squeezing something in!
And having said THAT, isn't it about time you updated your folks to a decent verification algorithm? checksum could convert them all.
An extra option for "both hashes" is already on my 2do list. Future versions of checksum may have the ability to keep the "one-shot" options open, so they are no longer one-shot. Tied in with this capability would be the discrete capability to perform double-runs, so-to-speak, and automatic multi-hashing™ would be one side-effect.
It's not at the top of the list, however.
But even if that didn't happen, some other way to perform automatic multi-hashing is definitely on the cards.
;o)
ps. cheers V!
It isn't much, but you may expect a payment from me come the end of the week.
^_^
Great software - discovered it today. Does all I wanted and more! Thanks heaps! And congratulations on the editor's pick award at Brothersoft! Definitely deserved.
(HY)
Ha! Nice find Henry! (Brothersoft is one of only three sites officially allowed to promote checksum, so this is definitely okay) I dunno if you read all the earlier comments here, but that's double-plus funny with a little history! I swear guys, I DID NOT PAY A DIME! And there's no bs links on this page, either!
Good stuff! I noticed a couple of new tutorials this week recommending checksum as the tool of choice, too. Always gives me a warm fuzzy glow, albeit temporary, and not entirely effective against the bitter Scottish winters. But it's April now..
Cheers, Qriist. Though I'll reserve my actual thanks, for now. As we say in Aberdeen.. Heard it!
Editor's Pick, eh…
;o)
Heeey... Cor!
Check your recent orders... ^_^
---Qriist
Thank you very much, Qriist!
And within minutes (okay, like 110 minutes, but still, WITHIN MINUTES!) another happy checksum user drops some beans into the big money-go-round, and the checksum licensed user base almost doubles, just-like-that! Happy days, indeed.
I wonder if this has anything to do with my new, highly subtle checksum license page?
l*rz..
;o)
Glad I could help. By the way, I really got a kick out of the reg_code!
After reading this page I thought there was no way checksum could live up to all your hype. But it does! Way to go! Thanks m8.
Re: false positive, even after analysis
from Avira GmbH
date Mon, May 4, 2009 at 6:50 AM
Dear Sir or Madam,
Thank you for your recent inquiry.
We could not find a virus in the attachment you have sent us. This is a false positive.
We will take out the pattern recognition in one of our next updates.
We thank you for your assistance.
Thanks for the report. Most of the AV companies that messed up have now re-classified checksum as a false-positive, so hopefully there should be no problems along these lines from now on.
But that's not a guarantee; it's nothing to do with me! ;o)
I had checksum installed for quite some time, but I had to uninstall it as it created far too many .hash files which I never created. It's as if it was creating hash files for any files my PC accessed. I searched for .hash files on my PC and 7500 of them were found. Of those 7500 files, I created fewer than 20 myself, so why was checksum creating so many unwanted files? The amount of wasted CPU time must be considerable, which is why I had to uninstall it.
All the file-modified dates of these files fell within the time that checksum was installed, and there is nothing else installed to create checksums other than HashTab, which does not create such files.
So why was it creating these files?
Under no circumstances will checksum ever create .hash files "by itself", syat. You, or someone at your workstation must have created them.
Remember, by default, checksum creates hashes for all the files and folders it finds in the path you specify, and all the folders inside those. If you want to change this behaviour, hold down the SHIFT key during launch, and un-check the recurse option, or set that permanently in your preferences.
;o)
It is likely, especially since it was right after install and he wasn't used to the extra buttons, that he simply hit create checksums and... finito, 7500 lovely wasted CPU cycles... But you damn well better believe those files were hashed like no others!
To that end, I kinda sorta did this myself....very shortly after install, though I caught it and stopped it. I must say, it isn't a big problem after a couple days.
Hmm.. maybe. I guess after using software that tends towards mollycoddling, checksum's just-do-it style of operation could lead to a small period of adjustment!
Whether this is related, dunno; I'm looking at making checksum stand-alone, in the sense that it will no longer need an installer, and will be able to set up your explorer menus itself, file associations and so on (and remove those same settings). I've used this approach on other apps I've been working on recently and it feels much cleaner.
;o)
- VirusTotal.com Scan Results - May 30, 2009:
http://www.virustotal.com/analisis/81a02d38056d7b29047bbb80e03eceac2906b379a125d6f6046db86f73f20e21-1243663817
Great work, too!
With regard to my previous post and the thousands of hashes created: I could have right-clicked folders/drives and mistakenly hash-checked all of them. But I have since re-installed it to give it another try, and I have found .hash files which I know I did not create, as they were for programs that I had been running! Unfortunately I deleted them without checking the contents. I will monitor it and check the hash files next time to see what is creating them. I do have a few hash programs installed.
I completed a VirusTotal scan yesterday and again a short time ago. Note the hashes which do not match the hashes of idkj's VT scan. The hashes for the file I uploaded match the hashes supplied by Cor underneath the download button. So this is the accurate scan. I don't know what idkj uploaded, but it was NOT checksum.
http://www.virustotal.com/analisis/e149250716a2dfdd8b3888ca9ba316604ff9b8e7b993be6d2de79def0edb0085-1245044141
CAT-QuickHeal is now off the list
http://www.virustotal.com/analisis/e149250716a2dfdd8b3888ca9ba316604ff9b8e7b993be6d2de79def0edb0085-1245127745
I have attempted to contact a few of the AV companies to see if I could get them to update their databases so that they don't report checksum as infected. CAT-QuickHeal have responded saying they have removed the false positive and whitelisted the file.
But McAfee still have not responded, and Nod32, which many people use, sent me an email saying my email did not contain a valid ticket number, so they have not read it. ESET (Nod32) are a terrible company when it comes to contacting them. They never reply to virus submissions or add them to their databases, even after 6 weeks or more, so I don't hold out much hope of a successful false-positive removal from them. The best part is that their forum gives directions on how to get a false positive removed, but after following them you get an automated "invalid support ticket" response.
Unfortunately Nod32 gets recommended by many online sites and magazines, so people will use it. But knowing their poor support, I would never use or recommend that software to anyone. I would tell people to avoid it.
thank you
Excellent features; now to try them out.
Finally, something worth it !!!!
I'm gonna donate you $1,000,000...
Thanks
Hey Cor! If you happen to get that million, share the wealth with some new features!
does it work with vista?
checksum does work with Vista, and in the 64-bit flavors as well.
very good
From "http://www.virustotal.com/analisis/e149250716a2dfdd8b3888ca9ba316604ff9b8e7b993be6d2de79def0edb0085-1248435459":
File checksum.zip received on 2009.07.24 11:37:39 (UTC)
Result: 8/39 (20.52%)
Antivirus Version Last Update Result
-------------------------------------------------
a-squared 4.5.0.24 2009.07.24 Worm.Win32.Podik!IK
AhnLab-V3 5.0.0.2 2009.07.24 Win-Trojan/Xema.variant
:
eSafe 7.0.17.0 2009.07.23 Suspicious File
:
Ikarus T3.1.1.64.0 2009.07.24 Worm.Win32.Podik
:
K7AntiVirus 7.10.801 2009.07.24 Trojan.Win32.Malware.3
:
McAfee 5686 2009.07.23 W32/Autorun.worm.gen
McAfee+Artemis 5686 2009.07.23 Artemis!73F82F2D64F1
:
VBA32 3.12.10.9 2009.07.24 Trojan-Downloader.Autoit.gen
Just on checksum.exe:
File checksum.exe received on 2009.07.24 11:43:18 (UTC)
Result: 3/41 (7.32%)
eSafe...7.0.17.0....2009.07.23...Suspicious File
Prevx...3.0.........2009.07.24...High Risk Worm
VBA32...3.12.10.9...2009.07.24...Trojan-Downloader.Autoit.gen
http://www.virustotal.com/analisis/86ccbc1209087aab77fa0b0d937b070fd7df5d604986fa966e8391ef548ceb7b-1248435798
That may be true, but look: they are flagging the AutoIt portion. AutoIt is a scripting language, easier to learn the ropes of than some others. A lot of viruses are made with the AutoIt language, and so valid programs often get buggered because of this.
When generating log files to a SD card is there a way to know if these files are tampered with later on?
It sounds awkward, but you could always hash the logs when you generate them.
Qriist - The problem is that the SD card resides in the field and someone could eject the card and alter a file and then replace it. I was hoping that there might be a way that each file is automatically hashed as it is created.
Good night
You can contact the following antivirus producers to get any information related to a false positive corrected.
All of these results came from the VirusTotal service (www.virustotal.com):
a-squared 4.5.0.24 2009.08.08 Worm.Win32.Podik!IK
AhnLab-V3 5.0.0.2 2009.08.07 Win-Trojan/Xema.variant
eSafe 7.0.17.0 2009.08.06 Suspicious File
Ikarus T3.1.1.64.0 2009.08.08 Worm.Win32.Podik
K7AntiVirus 7.10.813 2009.08.07 Trojan.Win32.Malware.3
McAfee 5701 2009.08.07 W32/Autorun.worm.gen
McAfee+Artemis 5701 2009.08.07 Artemis!73F82F2D64F1
VBA32 3.12.10.9 2009.08.07 Trojan-Downloader.Autoit.gen
Re: The False Positives
I'm mega careful about viruses/spyware/trojans/rootkits/adware etc. and can say I have had only 1 (one) confirmed virus since 1994, the rest being false positives (and even those run to less than a dozen).
VirusTotal is an excellent checker and a service I use from time to time. But seeing none of what I would call the top AVs, i.e. Sophos, Norton, Kaspersky, identify anything wrong with checksum, I feel confident the software is clean. Equally, even those that do detect "something" amount to 20% of the bunch, and often some of those AVs use the same engine, so are essentially the same.
Eventually, if Cor has got time, it would be good to iron things out with the AV vendors (false-positive-wise), if anything, to boost the popularity of this excellent product. I have purchased two licences because I have found checksum to be probably the best utility I have ever used. Equally, Cor has offered a non-crippled program essentially for free and hoped people would make a contribution purely on trust. Because of that (and I have never met Cor) I genuinely feel checksum is a "clean" piece of software. I've not had any "issues" in any regard with checksum.
Just to add to my post above about false positives:
I noted whilst checking Valve Software's Steam platform the other day the number of PC games that are seen as "suspect" by NOD32, ESET, Avira, and others, when of course they are legitimate.
https://support.steampowered.com/kb_article.php?ref=4361-MVDP-3638
I was thinking you should make it read *.md5* and *.sha* files too, since not everyone uses .hash. The point of using the program is ease of use, so I'd rather not spend any time editing extensions just so the program will verify those files.
Otherwise, excellent program, three thumbs up.
I downloaded the file "CentOS-5.3-x86_64-bin-DVD.iso" and the associated MD5 and SHA1 hashes via the ".torrent" file from the following CentOS website:
http://mirror.its.uidaho.edu/pub/centos/5.3/isos/x86_64/
Using your checksum program, I attempted to verify the "CentOS-5.3-x86_64-bin-DVD.iso" file against the CentOS-supplied hashes. The MD5 hash checked out, no problem. However, when I used checksum to verify the file against the SHA1 file, it showed an error and indicated the file had been changed. I notified the CentOS team about the file not matching the hash, and they double-checked it and said it worked.
So I downloaded another hash verification program called ExactFile (www.exactfile.com). Using that program, I created a SHA1 hash of the "CentOS-5.3-x86_64-bin-DVD.iso" file and it matched the CentOS supplied SHA1 hash perfectly. Then I used your checksum program to create a SHA1 hash of the "CentOS-5.3-x86_64-bin-DVD.iso" file and it was completely different. Here are the SHA1 hashes:
f8ca12b4acc714f4e4a21f3f35af083952ab46e0 CentOS hash
f8ca12b4acc714f4e4a21f3f35af083952ab46e0 ExactFile hash
d0b44cf7c32ed3851da1df42c594c4eda8363ca2 Checksum hash
In short, I believe your Checksum program logic for SHA1 hashes is faulty.
Mike
There is a known flaw with SHA1 starting at exactly 4 gigabytes. Cor is working on it.
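For the technically curious: a bug that bites at exactly 4 gigabytes smells like a byte counter kept in a 32-bit variable, which wraps to zero at 2^32 bytes and would corrupt the length block in SHA1's final padding. That is only a guess at the cause, not a confirmed diagnosis, but the wrap itself is easy to illustrate:

```shell
# 4 GB is exactly 2^32 bytes; truncated to 32 bits, the running
# byte count wraps to zero at precisely that boundary.
bytes=4294967296                                  # 4 GB
printf 'low 32 bits of the count: %d\n' $(( bytes & 0xFFFFFFFF ))
# prints: low 32 bits of the count: 0
```

Any file even one byte short of 4 GB would hash correctly under such a bug, which matches the "starting at exactly 4 gigabytes" symptom.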
helpful, thanks
Hope this is as easy as you say.
Thanks for this cool tool. It works great from the command line. But when I execute it using the "system" command from a C++ program on Windows, it takes 2 seconds to create or verify a checksum for the "checksum.exe" file itself. The tool's status shows the action is done in 0.02 seconds, but the system command returns only after 2 seconds. I am not sure what the issue is. Could you please let me know about this?
time_t startTime = time(0);
string command = "C:\\work\\BinaryPatcher-Tools\\checksum.exe cq C:\\work\\BinaryPatcher-Tools\\checksum.exe";
system(command.c_str());
cout << "time taken : " << (time(0) - startTime) << " \n";
Thanks
Nirmala
checking this out.
Yo!
Apologies for my somewhat prolonged absence; this has been a year of great changes and upheavals, not quite finished yet, either. Okay, working back, sorta..
Nirmala: checksum is reporting the time taken to do the actual job. Your tool is measuring the total time checksum ran, including posting the results (where the actual time is posted). You can change these things in your prefs, by the way. If you are looking to improve speeds a little (because, let's face it, checksum is ridiculously fast as it is), you can disable the per-hash timer!
Also remember, your timing will include the time your system took to decompress checksum.exe in the first place (it is a UPX-compressed binary, for smallness), so that will vary depending on the speed of the testing system, and is completely outside checksum's control.
LaSenal, I don't lie, it creates too much trouble!
The 4GB Bug. In a word... OUCH! I was on the road when I got this one, and actually had a copy of the checksum.dll C source with me; took a look, saw nothing obvious. Obviously I'll need to get in and fix this at some point; my plan is: "when I get ten licenses". Almost there!
Another thing slightly hindering forward motion in this is that, these days, I rarely use Windows! (Windows 7 was the final straw, sorree!) However, I do plan to set up an XP box to put out a few "almost finished" Windows programs I have kicking around, as well as updates, maintenance and fixes for existing programs, including fixing this nasty checksum.dll bug.
I can't make any promises about timescale, however - other things must come first for now. I will say this though; the Checksum for Linux alpha will be with you shortly!
Sam, did you actually attempt to verify some .md5 or .sha1 files? checksum handles them by default! No editing required! And I mean ANY .md5 file! The juicy generic .hash file handling is on top of that basic stuff.
Anti-Virus warnings. In a word... pffff. I make no apologies for using AutoIt to stitch together useful Windows programs. Hey, if there was something even more high-level, I'd use THAT instead! But if AV "vendors" insist on using brain-dead catch-all algorithms at the back of their Amazing New Products, then I have no sympathy for them. Nor for those who rely upon them to fool themselves into feeling secure. AutoIt is too darned useful to be blacklisted simply for existing.
And on the subject of vendors, I have contacted some of them, and had reports from checksum users who have done the same. The success of these endeavours is mixed. Some have removed checksum from their brain-dead blacklist (essentially adding it to their brain-dead "whitelist" instead), others have replied acknowledging that checksum is V-free and promising to update their lists, but haven't; and others still have simply ignored us. Even if I got paid for all this, I'd have no more motivation left to correct their incompetence.
There are good reasons that people like Vorlon don't suffer the ills these vendors (cl)aim to protect us from. Firstly, they understand security, and apply the principles to their entire computer system (installing a firewall and AV does not, by itself, confer "good security" - for the record, I run neither), and secondly, they research the tools they use on their systems *very* carefully. It's no mistake that Vorlon could mention three of the best Windows AVs off the top of his head. I'm assuming he is a he! If not, sorry Vorlon! And for using you as an example; but hey, at least it's a good example!
Janna. yeah, Qriist's solution is good. After all, that's exactly what checksum is for. Of course, you will want to keep the actual .hash file somewhere else, perhaps on your hard drive.
[later.. Janna, I missed your second post the first time around.. ]
That's an interesting technical challenge. But you didn't mention what kind of log files they are, what kind of computer they are being created on, or even the operating system. The only thing I know for sure is that it's a computer system with some sort of SD capability, and there's a program running on it that creates log files.
Obviously, offering an actual solution isn't possible with so little information, but you might consider something along these lines.. Run a daemon, or a scheduled script, that watches the log folder for new logs (the criteria for what constitutes a "new log" depend on the system creating them, and how). When one arrives, it is automatically hashed, and that hash piped or copied immediately to a remote server (via WiFi/etc.) or even simply to a secure local folder on the same machine, somewhere only the root user (presumably you) can access. However, as this machine is logging to an SD card, it's likely Plan A would be preferable, as it probably doesn't have a hard drive at all!
You might even want to make that script copy the actual logs!
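To make the idea concrete, here's a minimal sketch of such a scheduled script. All paths and names are made-up examples; adapt them to the actual system:

```shell
#!/bin/sh
# Hash any log not yet recorded, appending to a ledger kept somewhere
# the SD card's user cannot reach (or pipe it to a remote host instead).
LOGS=/media/sdcard/logs
LEDGER=/root/log-hashes.sha1
for f in "$LOGS"/*.log; do
  [ -e "$f" ] || continue
  if ! grep -qF "  $f" "$LEDGER" 2>/dev/null; then
    sha1sum "$f" >> "$LEDGER"     # record only logs not yet seen
  fi
done
# later, checking the card for tampering is just:  sha1sum -c "$LEDGER"
```

Because the ledger lives off the card (or off the machine entirely), anyone who ejects the card and edits a log can't cover their tracks by re-hashing.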
Keep on hashing!
;o)
Cor!
W00t! After 4 emails from me over the course of 4 months (each email harsher than the previous one), I finally got an email today from McAfee saying that the false positive should be fixed with the next DAT file! Progress is being made, one anti-virus vendor at a time!
Also, I would've sent this next part via email, but I had deleted the old emails that you and I traded. The private release of checksum that you sent me (to fix the bug where foreign characters in file names mess up the .hash file) has been working perfectly for me over the past 9 months, with the exception of the AutoIt error that sometimes comes up when I exit the program by right-clicking on the notification bar and telling it to stop (which I already reported).
Thanks for your work! I expect to have some funds freed up in October, so hopefully I can FINALLY send a token of thanks for your generosity to the community.
Bunsen
Thanks for your effort, DrBunsen! If everyone takes on one AV vendor each, we can lick it!
You know, always, at back of my AutoIt coding is the idea that by making useful, even "professional" tools with it, and making them available, I might firstly in some way help AutoIt out of its own bad reputation (which let's face it, it has because it's so darned useful), and also, help lesser AV vendors towards more intelligent detection methods. After all, if your current AV complains every time you use perfectly good software, wouldn' t you consider switching AVs? I hope so.
Checksum on Linux will consist of no more than a bash script (utilizing KDE & Gnome's built-in ad-hoc GUI programs) coupled with some service (right-click) menus. I'd wager there isn't an AV vendor in the world that would flag it as "bad", which is good, but not good, because they wouldn't flag it because it's just a bash script, which is bad, really, because a bash script can be the most powerful, devastating kind of program there is..
#!/bin/bash
rm -rf /
A Killer one-liner!
Of course, I could be completely wrong about that, and every vendor might flag it as bad, simply because it's a bash script. But what about..
#!/bin/bash
echo "Someone loves you"
Wait, that can't be right! Someone DOES love me! How dare the vendors deny me this simple truth! Thing is, any half-decent coder could write a virus that could outsmart most AV simply because they aren't very smart, and that's not right. The only reason people don't, is that most people genuinely don't want to wreak havoc on the world. AutoIt has proved this - just because you can build a virus in half the time, doesn't mean that people will.
If I want to do a task, I use whatever tools are available, DOS, BASIC, C++, AutoIt, Assembly, bash, whatever - if you can understand one manual, you can probably understand any of them. Virus coders are the same; a language's utility is what makes it popular. What matters is what the code DOES.
In the future, as well as run the usual heuristic and semantic checks, AV software will actually RUN the software, in an automatic virtual environment, watching everything the software DOES in there. The more advanced vendors will run whole swathes of "virtual user experience" tests, simulating actual user activity over the course of days and weeks, moving the date forward super-fast, collecting virtual emails, the works. Yup, virtual machines downloading porn, just to see if LatestMediaPlayer has some special chocolate chips in its cookies!
Of course, they likely haven't thought of this, and bang goes yet another idea I should be selling! Point is, a .bat can be your friend, and so can bash. But a .bat can also bite you, and bash can, well, bash. A program that does no more than post a dialog box saying "Your system is now secure" would be better than most current AV, because at least you'd get a laugh.
for now..
;o)
Cor
Great thoughts, Cor. It'd be nice if the AV vendors would grasp some of those truths, but alas, they're locked in all sorts of unwinnable battles. Not just against the virus makers, but against each other for market share, against their own errors (false positives that do WAY worse than just delete the Checksum installer), as well as the battle with end users which I'm sure contact them for all sorts of issues, many of which aren't the vendors' fault.
Fortunately, none of those things are MY fault, so I have a clear conscience when dealing with them. As an IT Manager, I did my best to rattle my meager saber (future spending dollars) with McAfee. Unfortunately, it still hasn't produced any true results. My earlier report that things would be fixed was because I was contacted by someone from McAfee (as opposed to a generic support email address) who said that they had identified it as not infected and that the false positive would be fixed with the next DAT file. But three different DAT files later, it hasn't made much of a difference. The only change seems to be that it's now recognized as a different virus - "Artemis!33E0F7E127BA". So I wrote another email expressing my disappointment.
It took 4 emails over several months before I even got a reply. And their attempt to fix it: FAIL. Yep, brain-dead. I'm not impressed at all. I'm going to vote with my budget dollars and switch to another vendor and recommend against McAfee to my dying day. (I would usually give a vendor a pass for a screw-up, but this has been going on for at least a half-year, not to mention that I was bitten by them back in the mid-90's as well for a different reason.)
I'll let you know if I find out that they do accidentally fix it, but until then anyone can check their level of competence by uploading Checksum's setup program to VirusTotal.com.
Bunsen
Checksum has saved me yet again a load of time and ironically on something I mentioned to you before Cor.
Recently I decided to try (as it was on offer) O&O DriveImage 4 Pro, when I normally use Acronis True Image and have trouble free use.
So, to give this new software a go, I created an image of two of my main partitions on an internal hard drive, which resulted in the creation of a 115GB compressed image file. The target for this image file was a 1TB external hard drive, and according to DriveImage 4 the operation completed successfully. However, when I used the program's own image validation function, the image file failed.
Remembering that some years ago a few external drive enclosure chipsets were causing corruption on large files, I thought the only way to prove whether this modern chipset was doing something similar would be to use checksum.
So I created a few "Root option" hashes with checksum on some folders that totalled around 120GB, "zipped" them with 7-zip using solid block compression, and copied the file to the external drive.
For a quick check I can of course use 7-zip's CRC to check that the contents are intact, but to prove the point accurately, decompressing the folders and verifying with checksum is a really good confidence check for your setup, especially when using any form of removable/portable media/storage.
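For anyone wanting to script that round-trip test on Linux, here is a rough sketch using the stock md5sum tools rather than checksum itself (the function name and paths are just examples): hash the source tree, copy data plus manifest to the suspect drive, and verify there.

```shell
#!/bin/bash
# Sketch of a round-trip integrity test with the standard md5sum tools:
# hash everything under src, copy data + manifest to dest (e.g. the
# external drive), then verify on the destination. A non-zero exit
# from the final check means the copy was corrupted in transit.
verify_roundtrip() {    # verify_roundtrip <source-dir> <dest-dir>
    local src="$1" dest="$2" manifest
    manifest=$(mktemp)
    (cd "$src" && find . -type f -exec md5sum {} +) > "$manifest"
    cp -r "$src/." "$dest"                       # the suspect drive
    cp "$manifest" "$dest/src.hash"
    (cd "$dest" && md5sum --quiet -c src.hash)   # only failures print
}
```

Any enclosure chipset silently mangling large files will fail the final `md5sum -c`, which is exactly the point Vorlon was proving.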
I'm fairly confident that I will be returning to my trusty Acronis products, as my external drive's chipset seems to be handling data correctly. Also, after validating an Acronis True Image file (*.tib), I create a hash file for it via checksum. For future checks, using the Windows Explorer checksum verify function and checking via MD5 seems to be quicker than starting Acronis True Image and verifying the file via their method.
Re: Viruses and false positives.
I only ever use a virus scanner "on demand", and only have one "machine", my laptop, with a full array of protection.
Firstly, my main anti-virus product, KAV2010, sees all the checksum versions I have (i.e. one installed, older versions archived) as clean.
Secondly, with a view to removing a particularly nasty fake AV from a friend's laptop, I downloaded and have now installed for my own testing the following anti-malware programs, and have since checked checksum (various builds) with them.
1) SuperAntispyware
2) Malwarebytes (excellent product)
3) SpyBot Search & Destroy
All find NO problems with checksum.
The good thing about the VirusTotal site is that you'll often find an innocent file flagged as containing malware by at least one of the AV scanners on the list. Because that does happen, I think it's reasonable to decide for yourself what percentage of scanners flagging a problem is acceptable. I'm quite happy if a file is deemed clean by at least 60% of the scanners, especially if they are the respectable scanners.
Thanks for your excellent input, Vorlon.
As for speed, I would imagine that to be correct. Checksum is ridiculously fast. For example, in all my tests, on the exact same hardware, the standard UNIX md5sum tool will take 2-3 times longer to hash a file than checksum. This is why I never did quite get around to documenting the hash.dll API, because I'm greedy like that and didn't want other folks' apps to get these improved routines!
I must get around to that soon, as well as a full source package for the dll + API documentation. Once I fix the code in the SHA1 routines, that is! :/
Back on topic..
Yeah V, Checksum has saved me, many times, too! Just the other day, checksum (or rather my using checksum) discovered that my new USB >> HD converter, while flawless on Linux, randomly corrupts files when attached to XP! Over 30% (THIRTY PERCENT!) of the files (1TB total) apparently copied perfectly, but in fact, were messed up. The unit also crapped out completely more than once on vanilla XP. Very nasty, and without this (27MB!) .hash file, I wouldn't have known about the corruption until I went to actually use one of the affected files, long after the original volume had been reformatted. Ouch!
This is great software!
Seriously though, it is. I realise it has a few wee foibles, and there's still some things I'd like to add yet (left click tray menu to show/hide progress, for one), but I literally couldn't live without it. Or at least, I couldn't live without checksums, and before I had a drop-dead simple way to create and verify them, I tended not to use them, much to my agonizing detriment. This is why getting checksum for Linux working was so important to me. A computer feels lost without it!
On the platform subject, I now have an XP box setup again (and I remember how bloody fast XP Explorer is as a file manager) and have started nosing around in my AutoIt dev folder, refreshing those parts of the brain involved in verbose, high-level coding. First off the block will be LoopDropZ, and a few other wee apps I need to get the hell outta here, and then into the checksum update. At least, that's the plan. Of course, it's intertwined with a dozen other plans, so...
Anyways, I love to hear about checksum users' experiences; good or bad, it's always good. I like them HERE, too. Mails are great, but often I get comments and questions that I have the urge to repeat here, in case someone else asks. The flowers I can keep to myself.
One good question I got recently: "Will Checksum for Linux be available for Windows?" (see! someone does read my dev blog, or at least the headlines!)
Now, you may laugh, but that's actually a way better question than it first appears to be. The answer is, "I WISH!".
After working some with the Bash+GUI system, I realize that it would be a beautiful common code base, but alas, Windows lacks the back-end tools, and the shell just isn't capable enough. It's a nice idea, though. Who knows, I may bite the bullet one day and start from scratch in Python, or something!
L*rz folks!
;o)
ps. Keep at it, DrBunsen! Elsewhere on site, proof of how effectively pestering large corporations (in this instance British Telecom) CAN make a difference! But your dollars talk loudest of all! Good work!
pps. I got my Ten! So now, I have no excuse, eh!
Hi Cor,
Checksum has brought something to my attention over the last few days, whilst I've been trying out HD imaging programs and USB HD chipset data integrity.
Anyway, I've reverted back to my trusty Acronis True Image 2009 (I've had most of the preceding versions) and wondered why Checksum was showing an MD5 difference between a file saved to my internal HD and "the same" file saved on my external hard drive. Thinking I had a USB chipset issue here, I decided to conduct an experiment: first with two image files being saved to a separate internal drive partition, and two to the external drive, but this time with no compression (so no form of random truncation could occur).
I didn't have to do too much before I got some odd results. I used the Acronis boot/rescue disk as I mostly do, so there is nothing else going on and it's just a basic Linux environment. So I fully imaged just one 3GB partition (for speed) and saved the image file (say image1.tib) with no compression and with validation to a separate internal drive. Then, immediately that image was completed, and whilst still in that environment, I did exactly the same thing, calling the new full backup image file image2.tib.
Both files validated perfectly, and both are internal to the PC's drives.
So, once in Windows, I checked both their sizes, and they are exactly the same, as you would expect. But using Checksum to create MD5s and checking the results showed that the files were in some way constructed differently, as the MD5 values were different. I tried Windows' FC.exe too: the files have been constructed differently, but the original data is in fact intact in both files.
Any ideas why the same source is constructed differently in the resultant image file? I've tried it a few times with other partitions, and although the resultant files are the same size, their construction isn't.
At first I thought I had data corruption, which doesn't appear to be the case; it's simply Checksum highlighting that these "exact files" are in fact different.
Thanks,
Steve
BIG advertisement...
SMALL size...
Even smaller "functionality"...
No GUI? What the .... you're thinking? Who cares if it's the best or the fastest since it forces user to "waste" time by non-existent user interface. Similarly "Assembler" (fast like yours) beats ".NET" yet not chosen by most...
Advertise again after you made a GUI...
Vorlon, that is quite odd! After scratching my head for a bit, I have no answer, so I dug around the web some. The only similar situation I could find was, interestingly, also about Acronis TrueImage files. There it was suggested that perhaps TrueImage adds some kind of time stamp (perhaps encoded in some way) to the images, and this causes the discrepancy. It does sound plausible. Original details here.
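As a toy illustration of that theory (this is made-up data, not TrueImage's actual format): give two "images" an identical payload but a different embedded creation time, and they come out the same size with completely different digests, exactly as observed above.

```shell
#!/bin/bash
# Toy demo (NOT TrueImage's real format): build an "image" from a payload
# plus an embedded fixed-width timestamp. Two images made seconds apart
# are byte-for-byte the same size, yet every bit of the digest differs,
# because a hash covers the container, not just the payload it protects.
make_image() {    # make_image <payload-file> <timestamp> <output-file>
    { cat "$1"; printf 'created=%s\n' "$2"; } > "$3"
}
```

Usage: `make_image partition.raw "2009-09-01T10:00:00" image1.tib`, then again with a later timestamp for image2.tib; the two files match in size but not in MD5.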
I've given up attaching my USB > HD converter to my windows box, there seems to be some kind of weird bottleneck causing it to crap out and corrupt files randomly. It works great attached to my Kubuntu laptop and shared up via samba, even fast enough to check hashes over the wire. Of course, I can do that easy enough on Linux now, too.
makeitup, stay clear of strong liquor! If you want a GUI, hold down the Shift key whilst launching checksum. Tada! Oh, and remember to release it again before posting here!
As for the rest, no comment required.
for now..
;o)
ps. I dropped a slightly updated hash.dll
Hi Cor,
Thanks for the reply and the link. In fact, I did get a response on the new Acronis forums from one of the regulars, who suggested something similar. I suppose, as the time stamp is the only variable (in this case), and maybe is relied upon in more respects than initially come to mind, then perhaps it's that that alters much of the file structure.
At first I thought it was a compression thing, as I've found in the past that some compression programs, for whatever reason, decide to compress identical files in a different way (i.e. at different times/separately), even when the settings are the same. So in the above instance I disabled compression and went for plain "store" only.
In fact, I set True Image to "Validate" the images after creation, and they all passed. I'm not sure what their validation method comprises, as although the resultant image file is built differently every time, the validation of its contents remains OK.
Steve
Your program is confusing and seems to do everything backwards just to be different. It's not an app within a zip file, and it's not an executable installer - it's a zip file with an executable installer in it. And it tries to install in c:\program files instead of c:\program files (x86) like (almost) all of my other 32-bit apps do.
The window that comes up to show all the command-line options is so tall that the buttons at the bottom are almost all the way off the bottom of my screen. The first time I ran it I clicked cancel without knowing what I was doing.
Then there is the option to use SHA1 or MD5, and other options (I needed SHA1), but now when I run the program those options don't come up. I remember some window coming up and saying it would only come up once, with some weird exception like if you hit the shift key, so I'm assuming I have to remember that and run it with shift pressed to get those options again. (Would it be too hard just to provide a button in the main interface that brings up those options?)
It said I'm not using context menus, and that I need to provide command-line options.. but this isn't true. I ran it and performed a checksum without having to read the darn command-line list. So maybe instead of making the options window only pop up once except when you press shift, how about making that command-line listing only pop up once except when you press shift (or better, when you press some button).
Then, instead of just acting like a normal program and showing status somewhere within the program window, it takes the liberty of making a little white banner at the top of the desktop showing what's going on. I mean, it's neat that you can do that with Windows, but IMO it just makes things more complicated than necessary when there's no reason whatsoever for it.
Finally, it spent about 4 minutes computing the SHA1 of a large ISO file, and then when it was done it gave me a little pop-up window saying exactly how long it took, but nothing useful like, say, the SHA1 result. I checked the desktop for the banner at the top, hoping it would say it there -- the banner was gone. There was no other window anywhere showing the result. Considering I had to deal with the CPU usage being at 100% for 4 minutes, it was a little frustrating that your "unorthodox" style of user interface failed to provide me with an actual result.
Basically, I won't use this software unless I have to.
Hi inhahe,
I find it confusing that you are having problems with Checksum, and regarding your last comment, about no hash value being shown after creating an SHA sum for an ISO file, I tend to think you've missed the overall idea of Checksum and the way it has been designed to work.
You will find a text/hash file residing in either the root or with the file you checked (depending on the options you set), with the MD5/SHA1 value written in it. The idea being that if you were (like most of us) to want to create the MD5 values of drive volumes or dozens of folders/sub-folders, you can invoke checksum, leave it to do its thing, and it will generate a text/hash file for you which you can read, or which Checksum will use to verify the file structures if you ask it to. It would be pointless to have a steady stream of MD5/SHA1 values "running" up the screen as it created values for hundreds of files; it would just be a blur - hence the results are in a .hash file that you just open with notepad.
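For the curious, this is roughly what such a manifest looks like inside, demonstrated here with the stock md5sum tool (checksum's .hash files follow a similar one-line-per-file "digest filename" layout): human-readable in any text editor, and verifiable in a single pass.

```shell
#!/bin/bash
# Demonstration of a hash manifest: one "digest filename" line per file,
# readable as plain text, and checkable in one pass. Names are examples.
cd "$(mktemp -d)"
mkdir music && echo "track data" > music/01.mp3
md5sum music/01.mp3 > music.hash   # writes e.g. "5a4e...  music/01.mp3"
cat music.hash                     # readable in any text editor
md5sum -c music.hash               # verifies the whole manifest
```

With hundreds of files, the manifest just grows by one line per file; verification is still a single command against the single .hash file.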
I use Checksum all the time and can genuinely say it's the most useful piece of software I have purchased since 1993
Vorlon, I too have seen compressors choose algorithms seemingly on a whim, and had identical files compressed in different manners. I suspect other parameters, like perhaps system load, could be factors in its decision, far-fetched as that sounds.
So long as you have hash files *inside* the volume you are creating an image of, they can, at least, always be relied upon to remain accurate.
inhahe, I firstly must echo what Vorlon said.. i tend to think you've missed the overall idea of Checksum and the way it has been designed to work. But you took the time to detail your issues, so I'll try my best to address each of them..
There is a zip, for efficient transport over the net. You unzip it somewhere, and you get a folder. Inside that folder is a program called setup.exe, which you can click. I hadn't considered this to be complex, but I'm certainly willing to hear any specific issues anyone might have. Note, though, future versions of checksum may not have an installer at all - everything I've written since checksum has been completely self-contained (can setup its own file types, menus and such), and checksum will likely end up the same way.
As for the install location, checksum will only try and install where you tell it to! A path is suggested, but that is all. Simply click the "choose" button right next to the path, to choose another.
Indeed, there are a lot of options now! But that window will still fit on an 800x600 screen (and netbook users all have my KDE-Mover-Sizer installed, so I'm told, so they can just alt-click to drag it around!). Rather than click cancel without knowing what you are doing, it might be an idea to read some of the text in the dialog, even just the wee paragraph after the options list; it explains exactly what's going on, and precisely why you might want to click each of the buttons. I bet you could read it in less time than it took me to write it!
So you DID remember that if you hold the shift key, you can get back the one-shot options at any time. Good! Intuitive, isn't it! I use the one-shot options a fair bit myself, and find the shift modifier behaviour highly convenient. Otherwise, I expect checksum to just get on with it, immediately, using my default settings.
No, it would be completely impossible. By that time, checksum is already getting on with the job. It's probably finished. That dialog provides a way to alter checksum's behaviour for a particular task. During that task, is too late.
Well, technically, you are providing a command-line. That's exactly what dragging something onto a program does. But you are right, I could add that to the error dialog, too (except, isn't it tall enough!). It does state all over here that you can drag and drop stuff directly onto checksum, though. I guess I thought most people would want to get the most out of it, and read the faq, tricks & tips and stuff, but no matter how much I slave over them, this will never be the case. Och well.
I would expect, though, that someone who considers themselves savvy enough to NOT go with the default install, would be savvy enough to figure out how else they were going to run checksum! Of course they would probably want to thoroughly digest the contents of checksum.ini.
The dialog you are referring to is basically an error message, meaning "you clicked checksum directly, and that's not how it was designed to be run + lots of info". That dialog even offers you the option to launch simple checksum, just in case you really do need a GUI program that you can throw things at, set the algorithm, see the hash itself, and so on. Click "Yes" next time, and you'll see what I mean. The idea is, during normal use, you will not see that dialog, ever.
I agree, no progress information is required during hashing operation, no nothing whatsoever! Just the .hash files! But some people like that, and so I added the progress windoid we all love so much. I mostly use it myself these days, too. You can click it and drag it to another place, by the way, if the top placement bothers you. Another by the way; it might look cool and high-tech, but it's just a regular window like any other. Also note, you can disable it altogether, if you wish.
And hey, if there's "no reason whatsoever" for even a minute progress window, how much less reason is there for a full-blown hashing GUI? Which is why checksum doesn't have one.
Of course, you later realized that there was a hash file sitting right next to whatever you were hashing. That's what checksum does. Even if you were hashing in some read-only space, checksum will leave a .hash file, except in some other place, of course. In fact, checksum will go to extraordinary lengths to ensure that for no reason, ever, do you "get nothing".
Absolutely! And that's a maxim I apply to ALL software!
for now..
;o)
I routinely hash 4 or 5 large files (3GB each, several minutes of hashing per file) currently using other application that does not support selecting files via Windows Explorer's context menu. I'd like to do this with checksum, which offers such possibility (I've enabled the 'allow_multiple' option in the ini file, of course).
The question is -> Any chance for implementing a semaphore (e.g. using temporary files) that will prevent multiple instances of checksum from reading the disk simultaneously (after user selects multiple files in Windows Explorer for hashing)?
For me, checksum without semaphore - , checksum with semaphore - all I need .
Rgds,
marcinw (Poland)
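For what it's worth, marcinw's semaphore idea can be approximated outside checksum with a wrapper script. Everything here is hypothetical (the lock path, the function name, and the use of md5sum in place of checksum); flock(1) simply queues each instance behind a shared lock file, so only one hashing job reads the disk at a time.

```shell
#!/bin/bash
# Hypothetical lock-file semaphore, not a checksum feature: flock(1)
# makes each instance wait for the shared lock, so concurrent hashing
# jobs run one after another instead of thrashing the disk together.
LOCK="${TMPDIR:-/tmp}/hash-queue.lock"
hash_serialized() {
    flock "$LOCK" md5sum "$@"    # blocks until any other holder exits
}
```

Launching this on several big files at once would hash them strictly in sequence, which is the disk-friendly behaviour marcinw was after.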
OK, I can always move the files to a temporary folder (within the same disk partition, this doesn't physically move data, so it's fast), then use checksum (by right-clicking) on the temporary folder, and move the files back.
So, I guess my proposal from previous post is not as much needed as I thought.
marcinw, take a look at the Batch Processing section.
;o)
Yes, you are right -> I can use hashDROP.
However, it is a bit frustrating (at least for me), when for hashing a couple of small files I can use multiselect and right-click menu, but when huge files come into play I need to use, in a completely analogous situation, some other methods (hashDROP or *.bat file).
As I can see, such inconsistency is not a problem for you. What can I say? OK, you're in charge here, you decide.
marcinw
marcinw, you are creating problems where they don't exist. There is nothing at all to stop you using checksum in this manner on any number of files, of any size. It simply wasn't designed to be used in this way, that's all. Why not hash the entire folder? Or use a file mask? Even doing it your way, it will work just fine; your drives will thrash a bit for the duration, that's all. No big deal.
I suggested a batch processor, as you seem to want to perform some series of unusual hashing commands (that is, not hashing a file or a folder, or a volume, but a specific set of named files). If that's the case, simply setup a batch, and it's done.
And yes, I am in charge, and I do decide. And I decided, a long time ago, to listen carefully to all intelligent suggestions, and implement any that made checksum better. I'm not so keen on dismissiveness and wild assumptions, though.
One thing I have considered, which might be useful in your case, is to have checksum accept drag and drop of multiple files onto its file mask input in the one-shot options dialog, so it might read, "file1.jpg,file3.txt,file2.ocd", and thereby produce a hash for only those three files.
If it's always the same or similar files, it would be easy enough to setup a shortcut somewhere. For example, I like to give all my latest torrent downloads an individual hash file, so I keep a shortcut in my favourites menu (which I don't use for internet stuff, it's too handy there in Explorer to waste on that stuff). Its command-line is..
"C:\Program Files\corz\checksum\checksum.exe" crtbim(movies) "D:\Torrents\Completed"
At any rate, there's usually an easy way to achieve what you want with checksum, if you are willing to explore its many options. And if not, suggest one!
;o)
Just wanted to say Checksum is a great program and keep up the good work!
Downloaded it, Tested it, Love it!
Features suggestions:
- Support SHA2 hash family
- Support RSA signing?
Worm detected on open of Simple Checksum download.
Generic.AOUJ
One million pounds please!
My AV software reports a trojan in this software.
@ Slorp & Person
What your virus scanner is doing is reporting a "False Positive", when in reality there is NO virus. Basically, some virus scanners (with their current definitions) are wrongly identifying code, or bits of code, in Checksum that may loosely resemble code known to have been used in a negative way. Rather than taking a more accurate approach, some scanners will tar some types of code with a very broad brush, inadvertently flagging legitimate code too.
Virus scanners are like anything else when it comes to quality, ie some are very good, fast and accurate and others let themselves down by missing malicious files and flagging those that are innocent, ie False Positives.
Because most people don't have, or can't afford to purchase, the 40 or so anti-virus scanners, including the "bank-strength" type scanners from companies such as Sophos, I suggest you go to www.virustotal.com and check any files you are concerned about.
I also highly recommend three malware scanners: MalwareBytes, Spybot Search & Destroy, and SuperAntispyware.
So for peace of mind try:-
www.virustotal.com
http://www.malwarebytes.org/
http://www.safer-networking.org/
http://www.superantispyware.com/
EDIT
I've just checked the current Checksum version at virustotal.com, and out of 41 AV scanners, only 6 suggest that there may be something, and of those, 2 (maybe more) appear to use the same AV engine.
Those that find NOTHING (i.e. the file is clean) are:-
(As you can see, the most highly respected AV scanner vendors find the file, i.e. the complete zip and ALL its contents, to be clean)
a-squared
AhnLab-V3
AntiVir
Antiy-AVL
Authentium
Avast
BitDefender
CAT-QuickHeal
ClamAV
Comodo
DrWeb
eTrust-Vet
F-Prot
F-Secure
Fortinet
GData
Jiangmin
Kaspersky
McAfee
McAfee-GW-Edition
Microsoft
NOD32
Norman
nProtect
Panda
PCTools
Prevx
Rising
Sophos
Sunbelt
Symantec
TheHacker
TrendMicro
ViRobot
VirusBuster
All the AV Vendor product versions/virus definitions release dates can be found on VirusTotal along with all the results, but as you will see the vast majority, including some of the most respected scanners find Checksum to be clean.
Thought you would like to know that your MD5 hashing is accurate for files 45GB+.
Good stuff!
Thanks for that, Vorlon; and a useful list. In future I'll probably just delete all those sorts of comments on my next pass through; I might even post your list in the page proper, save us having to repeat it!
You see, slorp , to qualify, it needs to be a real worm. As I know checksum has none, my Million Pounds is quite safe - for you to get paid, two miracles would have to happen!
Qriist, I didn't suspect for a moment that it wouldn't be!
It's good to know, though. But I gotta ask, what the HELL sorta file is 45GB?
;o)
I dumped 2 Blu-ray movies to ISO format to test checksum. Resultant test files were quite large.
I just installed checksum, and this program is awesome. I really like the approach, it's contrary to most windows developers out there.
I am having a problem though: I can't seem to get the options dialog to pop up when I hold down shift. I'm using Win7 32-bit, so I'm assuming MS might have done something in Explorer to break this functionality? Command-line options work as expected, so I'm still happy. I was just wondering if anyone else had this problem.
A quick workaround for anyone else with this problem - modify setup.ini to always show options and then reinstall. The file is heavily commented so it should be pretty self explanatory.
Ahh Qriist, you are technologically more advanced than me, I'm not in the Blu-ray loop yet. That's some BIG files! I guess folk are gonna want to sha1sum them at some point.
Thanks for the heads-up, shoguntom, and the kind words. There's a high probability that they will make their way into my front page auto-quote feature! Oh, wait a minute, I knew there was something I forgot to put back in, last time I updated the front page. Doh! Maybe this is just the kick I need; you basically hit the nail on the head, shoguntom; it's all about the approach. In this case mainly, "Get the job done, now!".
I'll keep my eyes open for others posting about context menus on W7/32 (through the power of web voodoo, I keep a vague and distant eye on some of the places out there where checksum is discussed; to make it better, you understand). If this issue starts getting airplay, I'll definitely look into it, I mean, context menu use is another thing checksum is "all about". Can a thing be "all about" more than one thing?
NOTE: You don't have to reinstall checksum to get this functionality. If you check out the tips page, there's a section on how to create a context menu especially for music (albums). There's even a sample registry file you could use/edit. Simply insert whatever switches you want, and merge into your registry. And it's easy enough to add a command, instead, so you have one regular launch, and one with-options. Or whatever else you need.
There's a lot of reading around here, I know, but if you are a frequent checksum user, most of it is fairly rewarding stuff.
Thanks guys!
;o)
Awesome tool - truly awesome! The only key thing it's missing, IMO, is the ability to specify a destination folder for your hash. Say I have a RW share that I'm mirroring. I want to hash it and then verify the copy using the hash. I wish I could just specify where the hash will get generated, rather than having to pollute the source and then move it from there.
I can see how that might be useful, Michael. I'll add it to my 2do list.
In the meantime, if you set the read-only bit (temporarily) on the root directory, you will force checksum into fall-back mode, and from there, you have a multitude of options about the format and location of the final .hash file.
Thanks for the flowers!
;o)
I see checksum has gotten itself onto the latest Reanimator Extreme Edition: http://dcp.sovserv.ru/program/48738/2009/09/02/reanimator_ee/
You don't always have permissions to change ACL on the folder that you have RW access to, so your workaround won't unfortunately work in my case.
Btw, another feature request I have is to allow filtering of sub-trees on a regexp basis. Say I want to checksum all the files in the tree except for the sub-trees called "abc*x" and "yz*df" (there can be multiple instances of each hit). If the tree is several TBs in size and these are sprinkled throughout it, manually selecting what to hash is not feasible. What I want is a single hash file that holds the entire tree except for a few subtrees that match a specific name pattern, basically.
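For what it's worth, the kind of pattern-based sub-tree exclusion described here can be sketched in a few lines of Python. This is purely illustrative and not a checksum feature; the `files_to_hash` helper is a hypothetical name, and the patterns are the examples from the post.

```python
# Sketch of pattern-based sub-tree exclusion (illustrative only; not
# part of checksum). Directories whose names match any EXCLUDE pattern
# are pruned, so nothing inside them is ever visited.
import fnmatch
import os

EXCLUDE = ["abc*x", "yz*df"]

def files_to_hash(root):
    """Yield every file under root, skipping sub-trees whose
    directory name matches one of the EXCLUDE patterns."""
    for dirpath, dirnames, filenames in os.walk(root):
        # Prune matching directories in place so os.walk never descends.
        dirnames[:] = [d for d in dirnames
                       if not any(fnmatch.fnmatch(d, p) for p in EXCLUDE)]
        for name in filenames:
            yield os.path.join(dirpath, name)
```

The in-place assignment to `dirnames[:]` is what makes `os.walk` skip the excluded branches entirely, which matters when the tree is several TBs in size.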
Thanks!
Nice program, almost perfect!
1) I would like to exclude files from getting checksummed based on file patterns.
In my directories there are files named "compositepage-01.jpg", "compositepage-02.jpg", etc.
If I put in "compositepage-" as the text string under ignore-named files, it does not work. Please let us exclude files based on patterns.
2) On the website I think a full description of all the options would be nice, and easier to read than the description in the ini file.
3) I like to be able to use multiple 'profiles' in windows (more versions of checksum.ini?) for different scenarios.
4) Maybe a windows frontend with more (all available) options.
Very nice program!
hi cor,
i'm planning to use it on my 6.5 TB raid5 (Win 2008 R2 Standard 64-bit on Phenom 9650 2.3 GHz 8 GB RAM) home-made video server to check if files are staying the same over time because i discovered a few vob files that wouldn't play in my music video shows directory. do you code for 64-bit full 4 cores (or all available processors) to maximize speed?
count me in to beta test if it doesn't already and you need proof on a version that does.
the data stream on my raid5 sub-array is between 50 and 100 MB/s depending on file size and whether it reads or writes. data size is 6 TB already, only 500 GB free, and i still have another 6 TB of video data to move dvd > online. i also have a 4 TB music server (80% full) and a 2 TB e-book and software server (75% full) on win server 2003 32-bit to verify beta versions, but these are dual-core. i also fluently speak and write french if you need to translate and get more market share. i also have a hosting site with ~100 GB available if you need some web space.
mega-thanks in advance
pol :-)
tested it with 154 GB of National Geographic videos from my torrent server (win xp sp3 on a dell ws530, dual 1.5 GHz Xeons, 1 GB ECC RAM, 500 GB Sata-2 for torrents). the create-md5 job finished after 5600 seconds. then it finished verifying in 2400 seconds on my 6.5 TB raid5 (Win 2008 R2 Standard 64-bit on Phenom 9650 2.3 GHz 8 GB RAM) video server, where i had sent a copy of the videos. Pretty impressive!
one suggestion: add a browse button to choose logs directory.
pol :-)
I'd also like to point the "commandline geeks" here to Microsoft's free "fciv.exe" command-line program.
It calculates MD5 and/or sha1, can recurse directories and stores the values for all files in an XML file.
E.g.:
fciv.exe -add %systemroot% -r -XML c:\MyDir\windowshashes.XML
will store all files below your Windows directory in c:\MyDir\windowshashes.XML,
while
fciv -v -XML c:\MyDir\windowshashes.XML
will verify all current file versions against the versions stored in the XML file.
You can also use "fciv.exe -list -XML {xmlfile}" to display all hashed files in plain text.
It returns "0" if nothing has changed, and "1" if modified files have been found.
This is especially nice for batch operation using %errorlevel%.
Not as powerful as the "checksum.exe" here, but good enough for most simple tasks.
This is "sort of" out of the context of your program, but of great interest to me. What did you use to create the setup program?
I use InstallShield's Premier line, but they've bitten me in the sitter-downer before. To wit, if you fail to predict a system crash and uninstall Macrovision/Acresso/Flexera's product in advance, you can't reinstall on your repaired/new system because their licensing server assumes you pirated a copy.
That eventually leads you into a tangle with their support - who only work banker's hours and make no real effort to conceal their suspicion that you are ripping them off.
<b>Which ticks me off.</b>
Your install is...sweet, neat, complete. So how do you do it/what product do you use?
I may be mistaken, but I think cor created his own installer from the same language used in checksum itself - AutoIt.
For a ready made installer I have had success with WinRAR in the past. You can make a self-extracting archive that can be made to remove any references to WinRAR itself, making it more or less your own.
Great looking app, and it almost does what I want, but I am really looking for a small, easy-to-use (easy to download, no install needed) app for checking files I downloaded from the Internet. I use it a lot on computers that are not my own (customers').
I used your app today, but I had to make a hash file from a checksum on a web page. It would be great if, after I've dropped a file on simple checksum and got a checksum, I could just paste a checksum into the application and it would compare it to the checksum it just created and tell me if it is a match.
Or a button or a shortcut key to make simple checksum read the clipboard and compare it to the checksum it has just made.
Would save me a bunch of irritating steps to do a checksum comparison. And then I will for sure nag my coworkers to try it too and that they also would donate.
Keep up the good work.
Hi,
I'd like to know if there is a way to slow down checksum ?
I run it in a virtual machine that is also running another process (producing files that I then run through checksum before they are sent over sftp and then rechecked with checksum on the destination computer; I find an error rate of at least 1 bit per 100 GB)
the problem is that I run checksum about once every 3 days, but when I do, it so intensely hogs the hard drive that the producing application is affected, lowering output
I don't mind if it takes 4 times longer, but I have to make sure I'm not slowing down production
is there a way to do that ?
thanks !
-shodan
I have this nice big hash file sitting in the root folder but since its creation
I have deleted/moved several folder/files which are now "missing files" during checksum verification.
I had hoped verify would ask me to delete those entries in the hash file
or that the synchronise feature would take care of it.
Do I have to hash the whole drive again and replace the current hash file?
Of course I could edit the hash file manually but...
what about teaching "checksum" a new trick to deal with this situation?
Pretty please?
Your app works well with most MD5 file formats, but the following format from a Mac could not be verified with checksum (note the additional dash):
;Created with MD5 by Eternal Storms Software
01805fe7528f0d98c595ba97b798717a - 20100205_0001.dng
01805fe7528f0d98c595ba97b798717b - 20100205_0002.dng
01805fe7528f0d98c595ba97b798717c - 20100205_0003.dng
The app is called 'MD5' from here: http://www.eternalstorms.at/md5/
Would you add a parser for this?
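A converter for that dash format is easy to sketch. The following Python is hypothetical (the `convert` helper is mine, not part of checksum), rewriting the "hash - filename" layout into the common md5sum "hash *filename" form that standard tools already read:

```python
# Hypothetical converter for the "hash - filename" lines shown above,
# rewriting them into the usual md5sum "hash *filename" layout.
# Comment lines beginning with ";" (like ";Created with...") are dropped.
import re

LINE = re.compile(r"^([0-9a-fA-F]{32}) - (.+)$")

def convert(text):
    out = []
    for line in text.splitlines():
        if line.startswith(";"):      # app-specific comment line
            continue
        m = LINE.match(line)
        if m:
            out.append("%s *%s" % (m.group(1).lower(), m.group(2)))
    return "\n".join(out)
```

Running the sample lines from the post through `convert` yields standard md5sum lines that any conforming verifier should accept.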
hi, just wanted to let you know "checksum" saved me a lot of trouble today ... I have to renew a file/database server and copy all the stuff to a temp. fileserver: approx. 12 MILLION files (in several folders) with a couple of TBs in total. Some users had "locked"/inaccessible files etc. (don't even ask me how this is possible) - so in the end I wanted to make sure I really got every file before cleaning up the old server, and right now the machine is checking the directories overnight (I just copied the root *.hash file to the other backup server) - so far it looks like your tool works flawlessly, even for millions of files - Thanks a lot!
andi
PS.: McAfee indeed showed a trojan message, but ESET NOD32 says the prog. is fine ...
Hi, just came across this site. A real pleasure to read. The colors and design are good; your site is versatile - I don't even know which category to file it under in my bookmarks!
I do have a question about checksum: is it possible to hash a drive by sector? Since your program runs under Windows, the feature would only be practical if we could run it under DOS or Linux, where the drive would not be altered when mounted. But I guess that's outside the goals of this program? (Too bad...)
Will come again to read your stuff!
Thanks for the continued suggestions and comments; all are considered carefully. And for the test cases, too pol and andi (it's always great to see the words "MILLIONS" and "TB" and "impressive" and "checksum" in the same sentence!), and flowers, ta.
Michael, bert0123, I remember writing (quite extensive) code for this sort of thing in mangleezee (a renamer). It's quite possible that it may work its way into a future version of checksum for both inclusion and exclusion of files and directories.
bert0123, you can create ad-hoc "profiles" right now by simply making copies of the checksum program (aye, so they each have their own ini). For a front-end, hold down SHIFT when you launch checksum!
ibsteve2u, if you (or anyone) want to play with the installer, mail me for a copy+documentation. It's being deprecated in favour of self-contained apps, but does work rather well.
Peter B, I do like the idea of simple checksum comparing its most recently calculated hash to "whatever" string is in the clipboard (either a hash, or a chunk of text/html/etc. containing a hash), and if they match, posting that information somewhere. This is definitely on the cards.
Stephan, perhaps you need a different tool.
shodan, no, there is no easy way to slow down checksum, yet. I advise creating more checksum files (i.e. per-directory) as opposed to one big root hash file. This will by itself create more "breaks" in checksum's operation, so that other processes can get stuff done. Also, you can verify smaller areas at once, if applicable. Another method is to verify hashes over a network connexion to the local machine. As crazy as it sounds; under certain configurations, it can free up I/O and keep everything running more smoothly (at the expense of a little CPU).
PeterPaulW, checksum already supports more hash formats over more platforms than any other hashing program I am aware of. However, it's unlikely I'm going to add support for every new non-standard hash format that comes along - it really is up to them to work the other way around. At any rate, this hash format looks pretty dumb to me; though I am ready and willing to be enlightened further on the subject!
Interestingly, it states that it uses the OSX built-in hashing tools. But they are based on BSD, right?, which checksum has always supported. Who or what is adding the dash? Please find out, and mail them to ask why!
pop, not sectors, no, just files; though you could always make an image of the disk and hash that. As for your bookmarks; yeah, it's a marketing nightmare! Maybe file under "Folk Art".
for now..
;o)
I am not sure if there is something wrong on my end or yours but when I download the Checksum.zip file, I can't seem to open it. It tells me that it is an invalid zip file. I would love to give this a try, but I can't seem to open it. I have downloaded it twice, I have tried different programs to try to open it and nothing seems to work.
Are you running an anti-virus? I have this problem as well when mine is running (you know, always) and I think it stems from the fact that AutoIt is often flagged as bad if the file is incomplete (hell if I know why).
Try disabling your antivirus for the few moments it would take to download the checksum.zip...feel free to re-enable and scan the file you just downloaded, of course.
Good luck!
---Qriist
I dunno about Anti-Virus (cheers Qriist!), but I do know that download managers and suchlike will get you no result. There should be no problems with a regular browser.
If anyone has real difficulties obtaining checksum, feel free to mail me at the usual address, I'll mail you a copy.
;o)
ps. I have found that invalid downloaded zips are sometimes not zips at all, but regular web pages, sometimes explaining what happened, except with a .zip extension! A decent text editor reveals all.
Hi Cor,
Yes, occasionally I have problems with the checksum download where the resultant download amounts to around 100-200KB rather than around 1MB (if I remember rightly)
I'm still using checksum often and "religiously". It's one of those must-have programs and, along with 7-Zip, gets installed immediately after a clean Windows install. Glad too that I purchased two checksum licences, as the optional registration costs were an absolute steal (hence the two!)
Best Regards,
Steve
Hiho,
i just dl'ed your tool and started to run a check on a folder.
While it is still running, you show a "tool-tip"-like message at the top of the desktop.
"creating checksum: ...."
status message^^
well, for people who have their windows taskbar on top, that's not a good placement for it
+ when you use the windows taskbar, your status message will go behind it, so you can't see it anymore.
I use win7, so i can still see a bit of it behind the bar, because of aero transparency - so it's still there^^
when i move the taskbar, it's back.
Well, you may consider placing it somewhere else.
For example, a few pixels under the top of the desktop, so there would be enough space for a possible taskbar.
but overall it's a nice tool.
ty
"justmakedup"
You can drag the Checksum Status bar around the screen to wherever you like, by clicking on it, holding and moving it.
Hey there,
Love it, but I am having a brain fart issue.
All I want to do is take a single hash of a folder. I want 1 hash created OUTSIDE of that folder to verify the entire folder.
How do I do that?
Right now, the hash runs on the entire CONTENTS of the folder, even if I select the root option. It creates a hash INSIDE of the folder which would negate any hash created ON the folder.
Tom
create a folder "X"
create a sub-folder "Y"
put all the files and folders you want checksummed into "Y"
right click on "X" and do checksum, using the single-checksum option.
you get X.hash which is the single checksum for "Y"
You could also just checksum "Y" and get Y.hash, which would also probably work, depending upon what you want to do with the files and checksum.
Hi,
I'm currently working with large amounts of files that I need to create checksums for but I want to be able to set the directory that the checksums end up in. I am doing this because of the need to capture data and transfer it without writing to the drive from which it is being captured. If you have any ideas on how this can be accomplished it would be appreciated.
Thanks,
Russell
Hi
There seems to be a bug in checksum.exe, where it ignores some files. E.g.
create a file on the desktop called "NCOM-20091119-060303-000.log" and then put some data into it (with a text editor or so). Right-clicking and 'create checksum' on the file results in 'Completed in 0S, Nothing was done'.
Somehow I don't think that behaviour is correct.
Cheers
Kym
btw, I'm using Windows Vista business 64 bit.
I can't install!
AutoIt Error
Line -1:
Error: Unknown function name
I'm running XP with SP3.
Thanks, brosoph
thanks
Cor,
Today (13/7/2010) Kaspersky AV is reporting the Win32.Sality.aa virus in the http://hashdrop.50webs.com/hashDROP_v1.03.zip for hashDROP
I suspect it *may* be a false positive, as other download sites produce the same malware detection. Worth checking with your pristine copy?
FYI
As expected, it's a false-positive. ;o)
"checksum's little bother; simple checksum,"
I am wondering what that little annoyance is.
Perhaps a typo?
This failed to uninstall correctly under Windows 7 (64-bit). The installer ran and removed the uninstall program; however, it left the software installed, the icons in the start menu, and the right-click context menu hook in place.
greetings,
this is a maddeningly true self-buttkicking story.
before transferring 2 TB of tv series from one of my servers
to another one to free the 6 WD green 2 TB drives to create a
ZFS array under solaris, i corzchecksummed them. as it would
turn out later, this was the best move in a shameful series
of drive configuration missteps.
the transfer to the other server went well, i turned the
source and destination servers off and went north 600 km to
fish for 5 days with my 85-year-old dad. that was from july
2nd to 7th. we had a lot of fun jigging and threw 1/3 of our
catches back in the lake as they were too small or too large.
the evening i came back i turned the destination server
on and 2 hours later the sil3124 controller that the
2 WD 1-TB green drives were plugged on started giving error
messages because of these cheap chinese-made sata cable
connectors that warp and loosen contacts as they warm-up.
mark my words, sata connectors made of thermoplastics
instead of bakelite are badly designed...
i spent a couple hours replacing the sata cables until the
boot sequence indicated the drives were seen by the controller
and then after getting back in windows, the silicon image
sataraid utility told me the sad, sad, sad story: the raid 0
array i created out of an impulse to get higher write speed
and things done before going fishing (worst move in a 31
year IT career) was gone bringing in the tomb with it
2,000,000,000,000 precious bytes of complete series of 24,
2 and a half men, 30 rock, alias, baywatch (11 seasons of
its delicious syndication), and 5500 episodes of other
mouth-watering etc.
so i spent the month of july testing all these wonderful
raid recovery softwares proclaiming to be the best available.
i ended up using the latest trial diskinternals raid recovery
to figure out the raid 0 parameters to access the drives
which it did superbly automatically, but couldn't afford
the 250 bucks they wanted from me to register and do the job.
so i set up a 2007 file scavenger version 3.2 to the task
and after 48 hours of recovery on 2 1-TB external drives i
tested the recovery process with none other than corz' best.
of course corz helped me figure out right from the start
that scavenger's quick mode didn't know diddly and files
presumed recovered with it were unplayable. but it also
proved that the long mode was dead on and did the deed.
indeed, just a few seconds of right-clicking and selecting
the "verify checksums" option on each drive roots started
an automated day-long process validating the quality of
5500 video files. mind you this was done on a quad-core
amd 9650 with 8 GB ram running 64-bit win 7. congratulations
to corz for software that works in win 7 and you will
sell me an XXL t-shirt and my son a M-one since you affordably
contributed in a major real-life recovery from a bad choice
of raid setups.
xoxoxo
paul
Just got to know that there is such a good software
One more step before downloading~
anticipating~!
When I downloaded the files and tried to unzip and then execute... McAfee identified your download twice as having a virus attached. The following is what it identified:
About This Virus
Detected: Artemis!33E0F7E127BA (Virus)
Quarantined from: C:\Users\<user>\AppData\Local\Temp\Temp1_checksum[1].zip\checksum\setup.exe
You might want to check it out... I saw earlier that it caught in other virus scanners... is this the same problem seen before?
Lady_Will
I would say that if you downloaded checksum from this site, then McAfee is detecting a "False Positive".
For peace of mind you could check the whole zip over at www.virustotal.com against 40 odd AV scanners which often shows up the worst AntiVirus Scanners when it comes to false reporting.
Of course the MD5 for the checksum zip is displayed on this site so that could be checked retrospectively or by another MD5 checker.
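Checking a download against a published MD5 needs nothing exotic; a quick Python sketch (the filename and the expected value below are just the examples from this thread):

```python
# Compute a file's MD5 so it can be compared with a published value.
# The expected hash and filename are examples from this discussion.
import hashlib

def file_md5(path):
    h = hashlib.md5()
    with open(path, "rb") as f:
        # Read in 1 MB chunks so large zips don't load into memory at once.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# expected = "f396abec141a50321caa995665db2986"   # value shown on the site
# print(file_md5("checksum.zip") == expected)
```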
I have used checksum for ages now and whilst I did get the odd false positive in the past, I think with Kaspersky - that was fixed by them. Since then I've not had any false positives with any of the anti-malware/virus products I use. VirusTotal though will give you peace of mind as you'll find just a few scanners (not the top ones) show false positives and often if you see more it'll be antivirus products that licence the same "Engine" from an AV Vendor.
EDIT: Just checked checksum.zip (MD5 = f396abec141a50321caa995665db2986) at VirusTotal and out of 40 virus checkers, 10 suggest the file is infected. However, the real figure is 8, as 2 products have obvious variants.
So out of those 8, the "infections" are based on "AutoIt", which is a legitimate scripting language. Unfortunately, the scanners displaying false positives are likely doing so because they "tar" every piece of software using AutoIt with the same brush.
No doubt larger software vendors have the ability to "shout" at these Anti-Malware companies if their software products are seen falsely as infected, because they have real clout and no doubt legal departments at their disposal! It's much more difficult for a lone software vendor to contact the right people in every anti virus/malware house out there so that their definitions remain checksum aware.
If anyone wants info on what AutoIt is see the website here: http://www.autoitscript.com/autoit3/index.shtml
HashDROP is likely malware. virustotal shows 22 out of 40 scanners flag it as malware: http://www.virustotal.com/file-scan/report.html?id=18412e30aeaa2693538aa8e4f29f0af14d7180cd3ece0022b5a67783f558b65e-1283806520
I uploaded to Avira Antivir's submission system as a suspected false positive, but they confirm it as malware. It was downloaded directly from the HashDROP website: http://hashdrop.50webs.com/ and the MD5 for the hashDROP_v1.03.zip file shown on the website is correct as: 89dc09f3649154c9e874a7cd8f3f014d
I don't know how to read this info, but if you want to look for yourself, here's an analysis of the file's activity:
http://www.sunbeltsecurity.com/cwsandboxreport.aspx?id=9032335&cs=4D5F19B607163C34A66B2CCB216C8350
Is there any MD5 software out there that can verify the files generated by checksum?
Is there a way to use checksum to verify files on several different drives at once? So far, it refuses to run more than one instance, even though I've got a multi-processor PC and several different drives that are fully capable of doing this simultaneously, instead of waiting a couple of days for each verify job to finish before starting the next one.
I tried using HashDROP to at least automate a serialized operation, if it's impossible to do it the right way, but the virus risk nixed that idea.
allow_multiple=true
;o)
Thank You
I haven't tried it yet. I will leave a comment once I tried
I added checksum here:
http://alternativeto.net/software/corz-checksum/
checksum could use a few extra features that support updating a hash file after moving, renaming, deleting, changing, or adding a file, without the need to create an entirely new file. This wouldn't be too demanding if checksum could do this in a continuous process, where it constantly locates changes and updates the hash file. It would need to keep a history though, so unexpected changes could be located and corrected.
For actually correcting files, MultiPar (and its alternatives) has done a great job:
http://alternativeto.net/software/multipar/
It doesn't yet support updating minor changes without completely redoing all of its work for the entire file set. It isn't quite as fast as checksum for verification.
Hi There,
Love your utility. Can you or will you be adding support for sha256 hashes?
Thanks,
Petrus4
Great util. Especially love the portable mode with the .ini.
Only thing it needs is sha256 support, such as for Fedora ISOs.
Was looking for that kind of things on Windows and happy I found your app.
I have some things to ask.
-Speed: the 100 MB/s you described sounds like it is in fact limited by your hard drive speed. Have you, or anyone else, ever tried it on a very fast array, for example? How fast would it be on 300-500 MB/s media?
-Logging capabilities: there are different ways to log changes in files. To get the maximum information about what might have happened when a file fails to verify, it would be helpful to log, alongside the hash at creation time, the file's modification date and its exact size in bytes. Then, when a file fails to verify, the date and size can be checked as well. If the modification date has not changed but the size has, there is suspicion of unwanted modification (virus...); if neither has changed, it could be an unwanted change or data corruption...
That would be nice to have a report that can give that kind of information.
-last thing about creating the hashes and managing them. looking for these options:
--clean a given directory of all the .hash files
--update all: recreate all hashes for all files
--update new: only create hashes for new files, files that were not hashed previously
--update: create hashes for all new files and recreate hashes for changed files, without touching hashes for files whose modification date and size have not changed and which were previously hashed (hence the benefit of storing this info alongside the hashes in the .hash file)
--verify: would read all files and check their hashes against the stored ones, flagging modification-date changes, size changes and hash mismatches, and also flagging non-hashed files, all in the log.
--status: do a quick check of the status of files without checking the hashes, looking only at modification dates and sizes.
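The "--status" idea - flagging changes from sizes and modification dates alone, without reading any file data - can be sketched like this. Purely illustrative: the helper names are mine, and checksum's .hash files store only hashes and names, not this metadata:

```python
# Illustrative sketch of a quick "status" check: record each file's
# size and mtime once, then flag files that appeared, disappeared, or
# changed, without reading any file contents. Not a checksum feature.
import os

def snapshot(root):
    state = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            state[path] = (st.st_size, st.st_mtime)
    return state

def status(old, new):
    added   = sorted(set(new) - set(old))
    missing = sorted(set(old) - set(new))
    changed = sorted(p for p in set(old) & set(new) if old[p] != new[p])
    return added, missing, changed
```

Since only `stat` calls are made, a status pass over dozens of thousands of files costs seconds rather than the hours a full re-hash of a slow 2TB drive would take.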
I think the Artemis false positive is known and, according to the following post, a definitive solution should be possible.
Check this out: https://community.mcafee.com/thread/2016
Hmm
I noticed that the itstory page still shows the version as 1.1.6b, but the version available for download is v1.1.4.0. I've been using the v1.1.5.6 that you generated for me 2 years ago, with only a few cosmetic glitches (which I reported to you). Is v1.1.6b going to become available any time soon?
Thanks!
Yes, it's very useful and powerful software, easily configurable and customizable, ready to be used in very different ways for almost all imaginable purposes.
I made a customized setup, customized bat files, customized context items (verify, synchronize, build all - without user input), and a customized ini file, and almost achieved all the necessary functionality
but
I still have some things not working as expected, or rather not working in an optimal way.
Some of them are mentioned in other posts, but let me put them again.
What do I have, and what do I want to do?
Let's say I want to make an archive of my files and have control over whether it is intact.
I made a folder on an external HDD; there are hundreds of GBs and dozens of thousands of files in it.
To keep it simple, and not add even more files which only I use, I chose to use a one-file checksum.
Because it is an archive, I sometimes add, move, or delete files inside the folder.
What I get this way are some files not hashed, and some errors because of missing files.
The first problem I can fix by using "synchronization" (which is not synchronization for me, but rather building an incremental checksum, by the way it works), but with missing files there is no easy way.
Yes, I can create the checksum from scratch (after checking that everything else is correct), but with a cheap 2TB HDD (sloow) it takes a lot of time, even if I've just moved a few little files.
The first thing I tried was looking inside the hash file and manually deleting the bad entries. And there's the first strange thing: the files are in an illogical order for me; not by name, not by hash, not by size I think, not by folder/filename (my preferred, expected way), so it would be very hard to manually delete the data for particular folders or files.
Yes, I see "synchronization" adds entries at the end of the file, and it is probably the easiest way to do such an operation, but I just think that a switch to sort the output could be useful sometimes.
Anyway, it is easy and can also be done manually with other tools, so no big deal, but also not hard to add (?).
So, almost done - but not in the situation where a lot has happened and a lot of different files in different subfolders have changed place.
Sorting and manually removing "missing" errors will not save time, and mistakes are easy to make (it is no big deal to delete a few extra file entries, but it increases the time to "synchronize" everything again).
So the solution could be a feature to "synchronize more accurately". One part is doing it in a single pass; another is a switch to "remove missing errors" by deleting the entries for files that no longer exist. This way, bringing the checksum file up to date would take much less time than the manual workarounds described:
1. verify the existing content (it is easy to see from the log that there are only missing-file errors, thanks to the different colors, but it could also be covered in the status message: "only missing-file errors found: [correct][synchronize][log]?"),
2. delete the missing-file entries,
3. run "synchronization".
When I see how long synchronization takes when there are dozens of thousands of files in the folder and only a few small files to add, I think that doing (2) and (3) (or (1) and (2), since the information about missing files is already known after (1)) in one pass could save hours, not to mention it would extend the HDD's lifespan.
But it is still a great piece of software: a diamond (small only in size) that I was lucky to find!
Thanks
freddy was here and took your great software.
thankyou for the software, im hoping it will be as you say, for that is exactly what is needed. geoff.
nice tool, odd site--thanks for both!
eaters
Bloody good!
Great...my only question is how to execute as part of an SSIS package - I need to generate a file and send it AND the checksum.hash to a vendor's FTP site...any ideas?
Also, is there any way to change the resultant extension to .ctrl instead of .hash?
Thanks...
blah
** POSSIBLE VIRUS / WORM / TROJAN ***
I downloaded checksum tonight from this website and sent the two EXE files to VirusTotal
VirusTotal shows:
simple_checksum.exe contains a worm. 3/43 hits
checksum.exe contains a worm 4/43 hits
Maybe a false positive, as there were no hits from the "top" virus checkers that I consider
to be the best ones, i.e. none from Avira, Avast, AVG, Norton, Kaspersky, NOD32, ClamAV, etc.
I think these are just false positives because of the EXE packer used by the author? Am I right?
Perhaps the author should disable EXE packing when he builds the application; there is no real need
to pack the EXE, and it has the potential to cause heuristic anti-virus scanners to report a false positive.
Does the author read this forum?
-- Cheers!
I third the call for SHA-2 (AKA SHA2 AKA SHA-224, SHA-256, SHA-384, SHA-512).
And please add SHA-3 as soon as it is done.
MD5 is severely broken, and SHA-1 is now partially compromised as well. SHA-2 is not yet broken, but is seen as under threat - hence the development of SHA-3.
Thanks!
Love this one.
Small, Fast, Standalone and it Worked
thanks man
@Kelly Clowers: The reality is that the security of MD5/SHA1 doesn't matter for almost anyone using Checksum, so it doesn't matter how "broken" they are. We're not looking for absolute security, we're looking for a fast, convenient way to know if our files changed due to random corruption. Intentional corruption/manipulation can still be a problem even if a hashing algorithm is used that isn't as prone to collisions as MD5.
Take this as an example: Let's say that Checksum decided to support SHA512 to generate hashes with. If some bad guy comes along and changes a file, then couldn't they also change the value of the SHA512 checksum in the hash file? Sure, you could save the .hash file elsewhere to know that it didn't get changed, but anyone who really needs that level of security would almost certainly not be allowed to use a program like Checksum on their file systems. True security means knowing and approving all programs that can touch the data, and unless the programs were intensely tested and vetted, they shouldn't be used in high-risk environments. Checksum is about file integrity for the common man, not file security for high-risk environments.
In fact, I would argue in the other direction. I'd really like for CRC32 or Adler-32 to be an option since they can be calculated WAY faster than MD5. I don't care that my uninteresting files could be intentionally manipulated by bad guys to match CRC32 or Adler-32 checksums, I care about random corruption. The chances of a file being randomly changed such that it would still match CRC32 or Adler-32 is still extremely remote.
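The poster's point about random corruption can be illustrated with Python's standard library. This is only an illustration, not anything checksum itself offers: it simply shows that even the "weak" CRC32, like MD5, flags a single flipped byte.

```python
import zlib
import hashlib

def crc32_hex(data: bytes) -> str:
    # zlib.crc32 returns an unsigned 32-bit integer in Python 3
    return format(zlib.crc32(data), "08x")

def md5_hex(data: bytes) -> str:
    return hashlib.md5(data).hexdigest()

original = b"some file contents"
flipped = b"some file c0ntents"  # one corrupted byte

# Either digest detects the random single-byte change;
# CRC32 is guaranteed to catch any single-byte error.
assert crc32_hex(original) != crc32_hex(flipped)
assert md5_hex(original) != md5_hex(flipped)
```

CRC32 is much cheaper to compute, which is the poster's argument; the trade-off is that it offers no resistance at all to deliberate tampering.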
Good job, but I don't see the Explorer context menu add on. Is any fix available?
Great tool thanks Corz
thanks, it's just what I'm looking for
When I double-click on "checksum.exe" I get a "command-line error" window that is too big to fit my notebook screen, which is only 800 pixels tall. A window with vertical scroll bars, sized to 80% of screen height, would do.
Also, could there be options to save the file's size, creation date and write date?
When you update a hash file, do you add only the newer files, or do you recalculate the MD5 of files whose size or date has changed?
Amir, Cor has a solution for that too: https://corz.org/windows/software/accessories/KDE-resizing-moving-for-XP-or-Vista.php
First off, thanks for that awesome tool. It's doing all I've been searching for for years..
..except~ it doesn't seem to support Unicode/UTF-8. When I try to hash files named with non-Latin characters (east Asian, for example) the tool simply skips the file.
If that functionality could be added I'm sure my new hashing-hobby would follow me in my dreams - in a very good way!
Thanks~ :p
Ok, installed the app after checking the MD5 with my old hash reader, but ran into a similar "command line error" window as the other user above when trying to run the full app; all I could use was the widget version. Figured the build might have been bad somehow, so I tried to uninstall, and Windows thought it was already uninstalled. At this point the shortcuts and program files were all deleted. So, although I'm not sure of the reason, this build seems a little sketchy...
I also would like the addition of more hash algorithms like sha256-512 and sha3 when it is available.
Cheers
looking forward
Any chance a command-line switch could be added to "ignore PE Header"? What I'm trying to find is a simple tool to compare all data in a .dll with another .dll - except data in the PE header (for .Net - the file version info and timestamp info).
Good...I like it..
Thanks
Thanks
I love it just simple and easy
thanks - just what i needed!
An education to be sure! I wanted to start teaching myself PHP and MySQL. The download sites recommend that you verify the signature before installation. Having never done this before I was at a loss. A web search came up with your tool. Very nice! Thank you and bless you for making this available.
Hi Cor -
I have been using your checksum utility and, while I like it, I still do not find it completely logical. An option exists for creating a one-file "root" checksum file in the root folder. However, this is not recursive: if one goes down to a subfolder, the hash file for that subfolder only contains the hashes for the files in that subfolder (and not hashes for any files further down in the folder hierarchy). Currently, there is no way to have each folder in the hierarchy carry a root-style checksum of its own, except by creating one manually for each folder. This is, of course, impractical for large hierarchies.
I suggest that a fully logical checksum should do the following:
if A = {B,C,D,{m,n,p,q,r}} where A, B, C, D = folders and m, n, p, q, r = files
then A.hash = Union(B.hash,C.hash,D.hash,m.hash,n.hash,p.hash,q.hash,r.hash)
RECURSIVELY.
So basically a folder's hash file collects all the hashes in the hash files for each of its subfolders and files, recursively.
This should be the base creation operation, or the default. Starting from this, one can add various negative switches to modify behaviour. For instance:
-i = would not produce hash files for individual files but still record the file hashes in the parent folder's hash file;
-f = would not record subfolder hashes in a parent folder's hash file;
-if = the current default behaviour for checksum.
The advantage of my proposal is that it would achieve completeness and full logical consistency while sacrificing none of the current options.
Best,
-- Ruber
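Ruber's recursive union can be sketched roughly as follows. This is a hypothetical illustration, not checksum's actual implementation: each folder's table collects its own files' MD5s plus everything from its subfolders' tables, built bottom-up.

```python
import hashlib
import os

def build_recursive_hashes(root):
    """Return {folder: {relative_path: md5}} where each folder's map
    includes every file beneath it, recursively (Ruber's union idea)."""
    table = {}
    # walk bottom-up so each subfolder's table exists before its parent's
    for folder, dirs, files in os.walk(root, topdown=False):
        entries = {}
        for name in files:
            path = os.path.join(folder, name)
            with open(path, "rb") as fh:
                entries[name] = hashlib.md5(fh.read()).hexdigest()
        for d in dirs:
            sub = os.path.join(folder, d)
            # fold the subfolder's hashes in, re-rooted under its name
            for rel, digest in table.get(sub, {}).items():
                entries[os.path.join(d, rel)] = digest
        table[folder] = entries
    return table
```

Note the hashes themselves are computed only once per file; the union step just copies entries upward, so the extra cost over per-folder hash files is small.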
I also have problems with folders and files containing Asian characters. I have tens of gigs of files that can't be hashed because of this limitation. Are there any plans to fix this?
I have no problems hashing non-roman characters, maybe it's your system.
hope it works.... save me lot of time redownloading files
Simple and powerful tool. Does what i want.
@Grizer
other-alphabetinator has the same issue. I'm running the latest version of Checksum on two different computers with Windows 7 x64 SP1. That's a pretty standard config these days so I doubt it's my system.
Hi,
first I'd like to mention that I like checksum very much, and that I found a few incorrectly copied files while verifying the destination (on more than one piece of hardware).
Nevertheless, I'd like to request a feature (maybe I overlooked that it already exists, but I didn't find anything appropriate):
I have several file structures looking like
folder1/binaries/setup.exe
folder2/binaries/setup.exe
.
.
.
folder999/binaries/setup.exe
this will result in 999 binaries.hash files,
all with the same name.
I am looking for some fail-safe feature, so that one can easily identify which folder each of the hash files belongs to.
(e.g. anybody out there who accidentally moved something with the mouse? ...)
Maybe there is a way to make hash file name creation consider not only the parent folder but one level more?
Or to include the path as a comment within the hash file?
But I'd prefer not to use absolute paths.
Best, Erp
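Erp's "one level more" naming idea could look something like this sketch. This is a hypothetical helper, not an existing checksum feature: the hash file's name is derived from the folder and its parent, so the 999 binaries.hash files become distinguishable.

```python
import os

def hash_name_two_levels(folder):
    """Build a hash-file name from the folder and its parent,
    e.g. .../folder42/binaries -> 'folder42.binaries.hash'.
    Hypothetical naming scheme, sketched for illustration."""
    folder = os.path.normpath(folder)
    parent = os.path.basename(os.path.dirname(folder))
    base = os.path.basename(folder)
    # fall back to the plain name when there is no parent component
    return f"{parent}.{base}.hash" if parent else f"{base}.hash"
```

With this scheme, folder1/binaries and folder2/binaries get distinct hash file names without embedding absolute paths, which matches Erp's preference.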
I, for one, truly hope that Cor doesn't follow RB's suggestion about storing the entire folder structure in a single hash file, because I move folders around too often. With the folders having their own hash files, Checksum works fine in the destination. With RB's way I'd have to re-checksum and/or manually edit hash files to account for the moved files and folders. Checksum's handling of one file per folder is EXACTLY the way I want it to work.
However, I do have some enhancement requests, just in case you're feeling industrious...
1) How about storing the file size and date/time stamp in the hash file along with the file name? Then, when doing a verify, alert us to the files that were changed, along with the current size/date/time stamps as well as the values stored in the hash files. That would help us speed up troubleshooting why the files are different, as well as help us look for backed up files with the proper size and date.
2) Almost always when Checksum finds a changed or missing file it's because it was intentionally changed or deleted. I appreciate that it found them but it would be nice if there was a nice way to automate updating the hashes in the .hash files after we've had a chance to verify that the new values are correct. How about this: When Checksum goes through and verifies files and finds a hash value that is different from what was stored in the hash file, rather than discarding that new hash information go ahead and save it in some way, like in the reporting HTML file, or maybe in a log file. Then have a special mode of Checksum that would read that information and allow us to replace the old/unwanted hashes with the newer ones that were found, as well as deleting the hashes of files that were intentionally deleted. I'd want to pick and choose the ones to update or delete, with an option to "select all".
3) I'd also like to win the lottery and have ice cream at every meal. And never get old or fat. And live forever. Please work on those requests as well.
Thanks!
Bunsen
Thanks
@DrBunsen
With regret, you did not understand what I was proposing nor what Checksum currently does. Checksum currently has the option of storing every single file hash in a unique "root" hash file. You may or may not be using this option, but it's there. As you correctly point out, this limits your flexibility if you start moving folders around. Checksum of course also has the option of creating a hash file per folder but this is limited to the files in the folder (it is non-recursive). What I was proposing was that Checksum enhances the hash-file-per-folder idea by making every folder's hash file collect all the hashes downstream (e.g. all the hashes of the files in its subfolders also, not only the hashes of its own files). It is a generalisation of the "root" hash file idea to every single folder. This way, your flexibility is enhanced, every folder being ready to become the root folder for a hierarchy of its own if you choose to extract it from the archived hierarchy and make it independent. Moreover, while giving you more options, all the existing options would be preserved as well, just perhaps activated by different switches.
Love your product. Been using it a while now. Just sent you some money (sorry it's so trivial).
I have the same issue as Dr Bunsen. I am using Checksum to verify all my older photos. They go back 20 years and there are 53,000 of them in folders by year taken.
Although I do off-site verified backup of these old pictures on a yearly basis, I don't know if a few bits got damaged in one or more pictures so your utility is very helpful to me. If a file or files get damaged or lost your utility will let me know and I can then decide to first get a new hard drive (as I assume lost bits are a sign the drive is going), copy the files to the new drive, verify and restore all damaged/lost files from my backups so that future backups won't be backing up damaged photos.
The problem is that I occasionally edit or rename photos taken even many years ago. I also use your utility on this year's photos which are even more dynamic. I often go in and fix the dark photos, delete duplicates, etc. When I run your utility I get warnings about sporadic photos that I've edited, deleted or renamed. I have to check each one in the log, then rehash the whole folder. Often it's obvious that the file changed due to the date alone but we can't trust that. I guess ideally the utility should stop, pop up the file in a way that I can open it, and then an option to request that the file have its hash updated.
Dr Bunsen's suggestions are reasonable also.
I'm sure you could find another solution that would help this issue.
This looks like an interesting tool. A few suggestions:
First, get rid of the mp3 playlist generation feature, and make it into a standalone utility. It's obviously completely unrelated feature-creep that has no place in a checksumming utility.
Second, please allow the standalone checksumming utility to simply and directly send the checksum to the clipboard.. bonus points for not producing any other type of output, not even a window.. though perhaps a very brief "onscreen display" indicator that the checksum had been calculated would be nice to have as an option.
Finally, it would be good if there was an option to send the checksum(s) to VirusTotal and have the results displayed on a web page in the user's default browser.
I would like a way to select 2 or more files in Explorer (using Win 7 64 bit), right-click, and get one of two dialogs:
"compared files and yes, they are all the same"
or
"compared files, and they are different. here are the checksums...."
I agree with previous poster about removing the MP3 player functionality. I like the idea about sending directly to clipboard.
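The compare-selected-files behaviour requested above amounts to hashing each selected file and checking whether all the digests agree. A rough sketch (hypothetical helper, not part of checksum; MD5, chunked reads):

```python
import hashlib

def compare_files(paths):
    """Hash each file; return (all_same, {path: digest}), mirroring the
    requested 'are these files identical?' right-click dialog."""
    digests = {}
    for p in paths:
        h = hashlib.md5()
        with open(p, "rb") as fh:
            # read in 1 MiB chunks so large files don't exhaust memory
            for chunk in iter(lambda: fh.read(1 << 20), b""):
                h.update(chunk)
        digests[p] = h.hexdigest()
    all_same = len(set(digests.values())) == 1
    return all_same, digests
```

The returned digest map supplies the "here are the checksums...." part of the second dialog when the files differ.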
I'm looking for a portable MD5 checker and yours is absolutely great. However it skipped all file names in Unicode. Wish to see a fix to that. Appreciate your work!
This is by far the best candidate to replace the many random checksum generators out there. The unified hash idea is a great one, and checksum's context menu and options dialog makes creating hash files much easier.
However, there are still quite a few issues that will impede checksum from taking over:
1) It is not open source.
1a) While I understand that Corz deserves some contribution for his work, I think that author-driven projects tend to go defunct. Open source ensures the project can continue if the developer loses interest (as is the case with so many such programs).
1b) Additionally, open source is the only way to create an open standard for the unified hash file. Currently, there is no indication of the type of hash provided, and it must be deduced from its length. There should be a standard marker for each type of hash (such as the "urn:sha1:" sequence). Of course, backward compatibility can be maintained by also simultaneously generating regular md5/sha1/sfv files. The ".hash" file needs an expandable standard.
1c) Cross-platform capability. Open source could help to create cross-platform versions of the software, which will promote the standards it creates.
2) Not enough hashing algorithms. The unified hash file is a great idea, but without more hashing algorithms, it doesn't serve nearly as much of a purpose. FSumFrontEnd provides plenty of algorithms, but has no context menus and is not being actively developed. Checksum could use Fsum to obtain further algorithms.
3) Context menu customization. Cascading context menus are a must. In addition, this would allow the user to select the hash algorithm (or group thereof) from the menu itself (a la hkSFV). My personal dream is to be able to setup a custom group of hashes so that I can right click, select checksum for a cascade, and select "Create SFV/MD5/SHA1/HASH" to create everything necessary as defined by the group (i.e. create all of them simultaneously).
We also still need that divider fix.
There are many other suggestions (some in these comments, some I have reserved in my brain), but these are the ones I feel will most affect Checksum's adoption as the premier hashing client. Please consider all of this and feel free to e-mail me.
I really do appreciate your work on Checksum and hope it can be developed into the best hashing client. Thank you.
This is nice software, but it hasn't been updated since 2009. It needs to be open sourced so development can continue, especially on more than just windows.
Just want to try it out
The uninstaller in Win 7 does not work at all.
Hi all,
I'm very impressed with this program.
I was looking to check my files before I do backups or copying, and this is an ideal tool.
Talking about SILENT DATA CORRUPTION:
If CHECKSUM could alert us not only to which files have changed, but also to which changed files still carry the same modification DATE they had when the hash files were created, this would be the only tool against silent data corruption.
That way we could know which files were corrupted by physical drive failures or viruses, and those files could be repaired from a backup copy.
PS. All was Ok with installer in Windows 7 64 bit
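The silent-corruption heuristic described above (hash changed while the recorded modification date did not) can be sketched like so. Note the assumption: this supposes the verifier stored an mtime alongside each hash, which checksum's .hash files do not currently do.

```python
import hashlib
import os

def classify_change(path, stored_hash, stored_mtime):
    """Distinguish a normal edit (mtime moved) from likely silent
    corruption (content changed but the recorded mtime did not).
    Sketch only; assumes mtimes were recorded at hashing time."""
    with open(path, "rb") as fh:
        current = hashlib.md5(fh.read()).hexdigest()
    if current == stored_hash:
        return "ok"
    if abs(os.path.getmtime(path) - stored_mtime) < 1e-6:
        # content changed but the timestamp didn't: suspicious
        return "silent corruption?"
    return "edited"
```

Files flagged "silent corruption?" are the ones worth restoring from backup; plain "edited" files are usually intentional changes.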
PLEASE don't remove the playlist feature. I use this every day. Thnx!
Jolly good ware. Been using it for a year and a half now, now my downloaded programs only crash if they have bugs.
Also the GUI matches my shirt.
09/03/11 - McAfee reports that hashDROP has the Artemis! trojan virus
Do you by chance have an application hash repository of known good hash files/applications? We're looking to add to ours and are interested in finding other sources. We typically prefer the hash generated directly by the publisher for provenance reasons, but we also generate hashes from other sources and simply give those a lower rating, etc. Anyway, if you have a hash repository, or know someone who does, please send them my way. We're happy to compensate accordingly.
Doesn't work on a file with this Russian name:
Португальский.djvu
but it's still a very good hash application. I'm also "forced" to write my own hash application for some of those "small" problems.
Doesn't work. Windows 7 32-bit. No right-click context menus.
This program is great. I love the unconventional flexibility.
I have a massive collection of files (over 8tb) and needed to keep tabs on it.
- Anything go missing or get corrupted?
- Do my backup copies match original? Easier to transport and compare checksums than compare 8tb over a network.
I wrote a fancy BAT file that looks for errors and adds in new files. It reports overall success via HTML formatted email. I only used this program, sendEmail and the BAT I wrote.
Big question: do you have a list of the ErrorLevels you used in checksum? It's great that you used return codes so we can differentiate conditions and decide what is a problem or not. But without the definitions I can only create problems and capture the return codes. I'm sure there are MANY more I'm missing.
-9 = cancelled out of sync/options dialog
-5 = nothing to do
-2 = there were verify errors
-1 = checksum.ini is missing or.. read-only media was encountered
(initially, this was the only non-success exit code)
0 = normal exit
1 = attempted to recurse a regular file
2 = nothing to check
3 = asked to verify a non-hash file
4 = checksum launched with no parameters
5 = process already running (and disallowed in checksum.ini)
6 = asked to hash a non-existent path
I realize they are somewhat haphazard, hopefully the next version will see them attain some logical order. Have fun!
;o)
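For script authors, the exit codes listed above can be mapped to readable messages. A small sketch: the code table comes straight from the list above, and the commented-out subprocess call is only illustrative.

```python
# Exit codes as published by the author above
EXIT_CODES = {
    -9: "cancelled out of sync/options dialog",
    -5: "nothing to do",
    -2: "there were verify errors",
    -1: "checksum.ini is missing, or read-only media was encountered",
    0: "normal exit",
    1: "attempted to recurse a regular file",
    2: "nothing to check",
    3: "asked to verify a non-hash file",
    4: "checksum launched with no parameters",
    5: "process already running (and disallowed in checksum.ini)",
    6: "asked to hash a non-existent path",
}

def describe_exit(code):
    """Translate a checksum return code into a human-readable message."""
    return EXIT_CODES.get(code, f"unknown exit code {code}")

# e.g. in a wrapper script (illustrative, not run here):
# rc = subprocess.call(["checksum.exe", "v", r"X:\F\files.hash"])
# print(describe_exit(rc))
```

This is the same idea as the BAT-file approach described in the comment, just with the definitions filled in so "verify errors" can be treated differently from "nothing to do".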
Hi Cor,
Just like to say that I'm still using CheckSum regularly and the small sum I paid for two licences has to be the best software purchase/s I have ever made.
Checked out your blog (last one March 2011, I think). CONGRATULATIONS on the new addition to the family.
Always look out to see if you have come up with any new software gems, but appreciate that you must be mega busy at present. All the best mate.
Vor
P.S. @Jeff (above). If you downloaded CheckSum from this site, you will undoubtedly find your AntiVirus software product is generating a "False Positive". This means it is reading the Checksum code as something malicious when it is Not.
If you want reassurance I suggest you head over to VirusTotal.com http://www.virustotal.com/ where you can upload the whole checksum distribution zip file (or the files separately) and allow 40+ (last time I checked) up-to-date AntiVirus scanners to check it. All the market leaders are there, i.e. Kaspersky, Norton and so on.
You will find that most of the AV products find the file/s CLEAN and often if there is a query it will be of scanners using the same scanning engine skewing the results.
And yes, with two extra mouths to feed, my time is somewhat limited! But be assured, checksum is not forgotten! ;o)
Hi Cor, may I ask if I can exclude a file here? Let's say, for example, I want to hash Folder1, which has 5 files (a, b, c, d, e) inside it, but I want to exclude file 'c' from the comparison. Is it possible? Thanks..
For excluding your "c" file from the regular checksum process, there are two ways..
If this is a one time deal, simply hold <SHIFT> when you launch checksum to pop up the one-shot options and use a (standard Windows) file mask that includes everything except the "c", in this case: a,b,d,e
If this "c" file is one you always want to avoid, or want to avoid for a great many checksum tasks, drop it into your ignore_files preference inside checksum.ini..
ignore_files=desktop.ini,folder.jpg,.desktop,.directory,thumbs.db,c
;o)
Hi Cor, I would like to know if it's possible to have a log of all the hash values of the files in the main folder, including files inside the subfolders. For example, I'm about to hash a Main folder that includes 2 subfolders, each with 2 files. After I create the hashes, there would be a log in the Main folder listing the hash values of the files inside the Main folder and also the files inside the subfolders. Thanks Cor..
You can also choose to have those hashes stored..
(in the root, this behaviour sounds most like what you are after)
(containing hashes for all the files in that one folder - this is the default behaviour)
Hold down the <shift> key when you launch checksum (from your explorer context menu) for lots more options.
Have fun!
;o)
ps. the new beta is almost ready
Thanks Cor.
Im really enjoying using Checksum.
More power to you..
Hi -- downloads of the checksum files using the browser Opera 11.60 (32 bit) under Windows 7 (64 bit) are incomplete:
12/27/2011 03:20 AM 3,076 checksum.zip
12/27/2011 03:15 AM 3,094 checksum_1.2b.zip
About Opera (version information):
Version: 11.60
Build: 1185
Platform: Win32
System: Windows 7
XHTML+Voice: Plug-in not loaded
downloads using the browser Firefox Nightly 12.0a1 (2011-12-26) are complete and functional
12/27/2011 03:22 AM 1,094,829 checksum.zip
12/27/2011 03:22 AM 1,176,397 checksum_1.2b.zip
MultiPar can verify and repair damaged files, and even restore lost files. It's not as fast as checksum for some things, so if you just want to do a quick verification, checksum might be better. <snipped>
Hi,
I just start using checksum and I have some hints for further improvements:
-when synchronizing the hash file, it seems with big hash files that only one CPU is working (25% CPU) for quite some time. Maybe using multiple CPUs, e.g. splitting the hash file into parts, would be helpful
-when verifying or calculating the hash file, only the number of files is displayed. It would be helpful to also display the size, since when hashing a large number of files of varying sizes, the number of files does not say much about the time left.
-when the process is hashing/verifying a large file, right-clicking the task icon does not work, or has a big delay
-I'm missing a pause option. How do I stop hashing and resume later? Is the hash file complete when I "exit"?
One other suggestion would be a service which automatically hashes into a root hash file whenever any file is stored or updated on a particular drive. This would be very helpful for keeping an up-to-date hash file, especially for a drive which stores only data (certainly not a drive holding the operating system).
Nevertheless this is a very helpful tool and very fast.
Thanks for the fast answers.
In my case I checked the disk access in the background while synchronizing, and it took some minutes without any reasonable IO access, but the CPU was at its limit during that time (25% on a quad core).
Maybe due to caching.
On a side note, I noticed (hunting some C++ documentation) that there is a flag one can use to force disk reads to not use caching, always fresh reads. This may appear in the next version of the hashing DLL, at the expense of checksum appearing slower in certain situations. Comments welcome. ;o)
Ok, but when hashing for over 6h (in my case) I would like an estimate of whether it will be 6h or some more hours, since my PC is not running 24h a day. Nevertheless, speed is much more important than this info.
Is it really so costly just to sum up the sizes of the files already hashed?
;o)
For information.
I downloaded checksum to try as it looked ideal for me. Whilst it worked fine on the first two files I tried, it failed miserably on a file size of 5,197,092,864 bytes. I ran it twice on the same file and got two different results (!), both of which were not the same as the published checksum for the file. When I searched for and found another utility, it gave the correct checksum (consistently). I downloaded another copy of the same big file and the results were the same.
So checksum is not for large files, it seems.
This was using fully updated Windows 7 (64-bit), by the way, running with full administrator's rights.
For more of this sort of information, check out checksum's version.nfo, link at the top of this page.
;o)
great prog buddy! is it possible for it not to change/update the "last modified" date of the folders?
Do you have a ETA on the release of version 1.3?
ps. please note, you do not need to enter an email address to post here, especially a fake one.
When you charge for something, that should be UPFRONT.
I'll use the similar FREEWARE, thanks very much.
Also, you can use it for free for as long as you like. Quit bitchin.
;o)
G'Day,
*LOVE* Checksum for Windows... good & brilliant work - Many thanks. Any hope of a version for OS X? The world needs it.
PS: I appreciate you sense of humor, and applaud your restraint.
Cheers,
RGM
You may have noticed the modern checksum release contains (in the extras/ folder) a Linux version. This wouldn't be too difficult port across, the underlying tools are the same and it's pure bash. The optional GUI bits should easily find Mac equivalents. Most of the main checksum goodies are there, and it's GPL licensed, have fun!
;o)
Hello corz,
A year ago, I posted under the name thomas5267, look for me!
Firstly, if you are still loving JKDefrag, you should try out MyDefrag, it is great!
Secondly, the right-click menus are still not working. I cannot find any registry entry for checksum in either HKCR\Directory\shell\ or HKCR\*\shell\; a fix is needed.
specific. Or else try reinstalling.
Thirdly, can I be a beta tester?
Lastly, does your website send an email to the replier to notify them that their comment has been replied to? If not, add it!
My System Spec:
Windows 7 SP1 64 bit
No registry protection AFAIK
Drivers not up to date, but not crashing
Probably not relevant:
GIGABYTE GTX560
Philips 244E2
Gigabyte P55A-UD3
Acbel 380W Power Supply
Epson CX5900 LOL!
WD Harddisk
2x 2GB Kingston KVR1333D3N9/2G
Sorry for the consecutive posting, but this issue has to be emphasized. Your website is saying that I am hammering your server because you have set the auto-redirection time too fast.
That said, I'm working on an AJAX version which doesn't need refreshing (see previous post for the timescale on that)
Yes, I am using Google Chrome.
PS: I decided to take revenge, so I posted this instead of editing my previous post.
Hello corz,
Answer this question first, what do you do for living?
OK, now I will elaborate on the right-click menu issue.
It doesn't appear in the menu, neither when the mouse is pointing at a directory nor at a .exe. Yet double-clicking .hash files still verifies them.
Running with elevated privileges, alone or together with XP compatibility mode, doesn't help either.
Alternatively, see here for some registry files you can merge yourself.
Then answer this, how can I donate and have a beta version? I would like to have a shirt but shipping to HK is too expensive.
EDIT: NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO!!!! The redirect thingy is being misjudged as a DoS...
;o)
Hi
corzcor, (Haha, not funny) Cor, where do you live, and what's with those mung-beans, Tahini and Lea & Perrins?
An embarrassing question: are you a gentleman or a lady?
I have donated, so gimme the beta checksum! Consider adding the soon to be announced SHA-3.
Mail me!
;o)
We downloaded the software, but after all the people on Helpdesk and even our system people tried to install it, we couldn't get past the error:
"there was an error registering the zipper dll". We then followed the instructions to give everyone access to the zip file and register it manually. But it still didn't work. It then proceeds to give the error message that the
ATL dll can't be found.
Please can you help. It seems the installer doesn't work correctly.
As for checksum.. Someone once mailed me about this error, then mailed back a few minutes later saying they had uninstalled and reinstalled, and everything worked. I did not get any more information on the error, no OS details, nothing; much like your report. Anyway...
First, ensure your download is uncorrupted.
Then ensure you don't already have the corz installer files installed in the shared user folder, and being used. And ensure the checksum installer isn't already running on the system.
Then try running the installer as an administrator, if you haven't already.
After all that, you don't need to use the installer's unzipping mechanism, you can unzip the files yourself, right where they are and edit the setup.ini to use a raw install, change..
archive=files.zip
to..
archive=files
(FYI: checksum.exe et al will now be inside checksum/files/files/)
Then run the installer as usual.
Alternatively, simply unzip the files and drop them somewhere, for a completely portable install. The installer was always optional, and most likely soon to be deprecated.
En-Joy!
;o)
Hi,
I'm looking for a replacement for my current folder hashing proggy. I'm running win7 x64 on a LAN and the old proggy seems to have issues somewhere. Could be with large files (30 - 70 GB), or handling long filenames (all those files whose backups end up in \\servername\Andy's files\backups\2012.todays.date\original path+file name) whether checksummed across a network via a share or checksummed locally. Folder sizes are up to about 300k - 800k files. Sadly the end result was the number of files per hash output and number of files per "folder properties" were never the same with old proggy. I couldn't work out if it was because of large files, long path/filenames, network v. local use, or large numbers of files, or some combination. Great when you're trying to verify backup integrity and want it to "just work" ;-)
So here I am looking for new proggy!
Seeing you have a friendly FAQ and blog thread and Checksum looks actively developed, I thought I'd ask how Checksum handles these things. Also anything else you reckon's worth knowing
Hello!
You want(ed) to write the best hashing program? Then it would be great to implement the Tiger algorithm and the other algorithms ;-) (BTW, MD5 is already cracked - by supercomputers and distributed computing)
Nevertheless, good program and great motivation!
No go, Cor, can you recheck things.
I used checksum on a local folder and it missed out all files with path+filename length > 257 which DIR /S listed.
More detail:
Windows 7 file server has a folder "X:\F" with ~358k files. Folder properties and DIR "\\?\X:\F" /S both agree the folder contains 358291 files. Hidden and system files, as you can imagine, are always visible (and the entire drive was ATTRIB -H -S to make sure). Checksum was used with a single .hash file and absolute paths selected in checksum.ini (I can email you the ini file if you like). Checksum was started locally on the file server, using a right click on the folder in Windows Explorer. The status popup built the file list then stated it was on file N of 358291 (as expected!), so we know it's reading the number of files correctly. When I opened the .hash file it only had 357720 files listed in it, though. No errors or log file; checksum reported successful completion. The culprit turned out to be that the 571 files with full path+filename length > 257 were missing from the .hash file (I can't tell if they actually got hashed or not).
Hope this helps you to figure what's up, it seems reproducible (same effect on 2 other disks on file servers here).
Didn't seem to be a problem with Windows - then again I used \\?\X:\.... rather than X:\.... for the DIR reports which apparently causes Windows to use the long filename API. I didn't check whether checksum would have worked if it was given a folder in that format as well - I should have checked.
What was curious was that checksum got a full correct folder count when building its file list, it just didn't create a line for the hash of each. So it obviously did use a Windows API that got the filenames even on long paths first time, then didn't use it second time. I gather there's a proper API for long filename access that guarantees file reads will be done using the API that handles 32k path length - maybe make Checksum use that API when it's hashing? That seemed to be all that it wasn't doing.
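For reference, the long-filename route the poster describes amounts to prefixing absolute paths with \\?\, which tells Windows to use the wide-character file API and lifts the ~260-character MAX_PATH limit (to roughly 32k). A minimal illustrative sketch, not checksum's actual code:

```python
# Illustrative only - not checksum's implementation. Prefixing a path
# with \\?\ routes it through the Windows wide-character file API,
# raising the effective path limit from ~260 to ~32,767 characters.
def long_path(path: str) -> str:
    if path.startswith("\\\\?\\"):
        return path                      # already extended-length
    if path.startswith("\\\\"):          # UNC share -> \\?\UNC\server\share
        return "\\\\?\\UNC" + path[1:]
    return "\\\\?\\" + path

print(long_path(r"X:\F\deep\folder"))    # \\?\X:\F\deep\folder
```

Any absolute path (including UNC shares) can be wrapped this way before being handed to the file APIs; relative paths cannot use the prefix.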
It would be great to not have to think about it, but we'll need to wait for a better version of Windows to appear. I can't afford the time involved in creating a workable solution for current Windows versions, assuming it's even possible, and you guys aren't paying for it, are you? ;o)
One functionality request :-) the dialog that pops up first time or when "shift" is held down .... can we have a button on it for advanced options so that location to put the hash or other options can when needed be set case by case? Just asking!
up & coming newness? Yay!
One simple "sort of fix" for the long path thingy would be if checksum could somehow detect whether long filenames were likely to have affected the checksum output, and notify the user if so, so at least the user has some heads-up.
Also the eventual replacement for NTFS (ReFS) is said to handle paths and filenames of 32k and as a "ground up" design it would be surprising if it didn't get this well-known issue right. No guarantees though... it's the Windows team ;-)
I googled the md5 hash for this zip file (7024ccb56480d868b82127e53bdb7a6e) and it was flagged by 4 antivirus programs.
https://www.virustotal.com/file/1471876c79d7b4ef0ca1b9ff1ce9b2035cd836192046fd99e23fba8fafd7dafa/analysis/1332082386/
Enjoy your trojan, gentlemen!
Enjoy your delusion, Steve! Try not to be like those four A/V.. Brain-Dead.
;o)
Do I need a license if I am only using Simple Checksum?
Some feature requests:
Add an option to remove file hashes for missing files from the *.hash file. These show up as errors but are mostly noise. The report makes no differentiation between missing and damaged, and I have to look one by one. As I rename and move files around constantly, this situation is very annoying.
Add "expand all nodes" to the report file.
Add word wrap to the tooltip. The tooltips are divided into logical parts; this is how it would show up in three lines:
creating checksums: [folder: n of m] [file: x of y]
drive:\path\
hash filename: hash value
Right now they are too long, and as I use the taskbar on top, everything is covered and some of it isn't even legible because it goes beyond the screen limits. I know that I can move the tooltip around, but I want it at the top of the screen.
Thank you, this program is a very useful and well thought out application. I wish all applications were like this: doing one thing and doing it well.
Note, feature requests go here. ;o)
Thank you for your answer. Even if it was only to show condescension.
This is what the "version.nfo" says: You can also now choose whether or not to report (and log) missing, changed, or corrupted files.
Which is not the same thing. In cosmetic terms it might be, but not in practical terms. The hashes of the missing files would still be in the *.hash file. The more "moves" and "renames", the worse it would get. Removing them is the only logical solution, not hiding them.
Again you didn't read. My suggestion was to expand all nodes. Right now clicking on the title only expands the first level.
Three times bigger, but it would cover less than 5% of the screen and would only be there for a small amount of time. This would be an option; if users wanted to cover their screens completely with the tooltip, what would it matter to you?
But your objections make no sense. My suggestion would make the tooltip more readable with very few side effects: semantically and logically separated information. Three lines, two more than now.
Your "solution" of scrolling text is, even according to you, a bad "solution"!
As for using a smaller font, that is not an option, neither for long text (since the font is already as small as can be) nor for organizing the text.
I should perhaps add, if you are a new checksum user, it can sometimes take a while to get over the wow factor in the way it works, and the tooltip only compounds this. After a time, you may just disable it altogether, happy in the knowledge that things are getting done. In the upcoming version, a left click on checksum's tray icon will momentarily display a hidden tooltip, which is a nice compromise.
I provided useful feedback after careful reflection. I'm sorry that you think that the users of your programs are all morons.
;o)
ps. you know you shot yourself in the foot, posting "Some feature requests: …" on totally the wrong page. How am I supposed to take whatever comes next? You clearly didn't read my lovingly crafted text output, but expect me to read yours. Many of the checksum users that I've communicated with are a lot smarter than me, but with a start like that, I didn't count you among them.
Hi Cor,
May I ask if it's possible to exclude a specific folder, and all its subfolders and files, from hashing? Thank you.
Hey Cor,
I just installed your program for the first time - it looks exactly like the thing I need for my external backup HDD. The only problem is using it with big font settings in Windows XP (220% DPI setting - connected to a TV): the GUI is messed up (does not adapt). Using the screenshots from the website I can deduce the functions and hotkeys so that I'm able to use it - nonetheless it would be great if you could make the GUI high and low DPI ready.
Thank you for the program,
MP1
Very simple question that I can't seem to figure out. How do I use checksum to verify the SHA1 of a disc in its entirety, i.e. one checksum for an entire physical CD/DVD? Right-clicking on a drive D:, for example, produces a hash for every file on the disc, which is not what I'm looking for. I just want one number to compare to the ISO file's hash.
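One way to get a single number for the whole disc is to hash a raw image of it (e.g. an ISO ripped from the drive) rather than the mounted files; the published ISO hash then compares directly. A hedged Python sketch of the chunked hashing involved:

```python
# Sketch: one SHA1 for an entire disc image, hashed in chunks so
# arbitrarily large images never need to fit in memory.
import hashlib

def sha1_of(path: str, chunk: int = 1 << 20) -> str:
    h = hashlib.sha1()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

# sha1_of("disc.iso") -> compare against the publisher's ISO hash
```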
Is it possible to amend the comment block that appears at the top of each hash file? If possible we'd like to include a link to our own organisation and some extra text.
I know it's possible to manually edit the file after it's generated but I wondered if it can be done automatically when the file is created.
Hi There,
First off, I love checksum (and simple checksum), excellent work.
I was wondering if you could comment on the reliability of interrupted checksum operations.
I've been using checksum to verify copies of backups I am making. In total, I'm running checksum on 1.3TB of data located on a server which I'm connected to by gigabit LAN. As you can imagine, this takes a while, which is understandable, but unfortunately the new system I'm running checksum on (not the server) seems to be crashing and I have to restart the checksum procedure. This (finally) brings me to 2 questions:
(Q1) I've been using the "synchronize" option as a sort of 'resume' feature. Am I correct in thinking this mode will not re-process checksums for files that were processed before the system crashed?
(Q2) If checksum is interrupted during the processing of a file, would it have recorded an intermediate result to the .hash file that is incorrect? If I was correct about Q1, that would mess up my use of 'synchronize' as a 'resume' method because checksum would think the file had been processed when really it hadn't completed. So are the checksums only written to the .hash file AFTER the target is 100% processed?
Any other tips on how to recover from an interrupted checksum job?
If it's only checksum that's interrupted, there's nothing special to note; aside from choosing synchronize, which you got.
Thanks again for making such a great tool!
Please consider a license!
;o)
Any news regarding CRC32-implementation?
Thanks!
;o)
Windows XP SP3 portable mode checksum 1.2.3.9 and simplechecksum ignore the following checksum.ini parameters:
unified_extension=false
Should produce .md5 output file extension instead of .hash with checksum.ini parameter algorithm=md5, but does not. Other hash file checkers are incompatible with the unconventional .hash extension.
beep_alert=false
checksum's <Shift> dialog still defaults to true.
The output hash file's # (pound sign) comment prefix is incompatible with other hash checkers, which expect the conventional md5sum format's ; (semi-colon) comment prefix - notably FileVerifier++ 0.6.3.5830, hkSFV 2.0.1.84, and ilSFV 1.10, and likely many others, though not MD5 Checker 3.3.0.12. My Windows system language is English-US; Windows' other language tabs are their original defaults.
Checksum's dialogs are microscopic on 1280 x 800 laptop screens even at 120 dpi scaling -- you so obviously have a gargantuan monitor, maybe two!
The file drag and drop paradigm is unwieldy, as it requires either two Windows Explorer instances or a third-party dual pane file browser in portable mode, or a lot of unwelcome gymnastics in conventionally installed mode.
Thanks for the feedback.
;o)
Great program.
Thank You
I like this program so far. I'm a beginner at using checksum to verify my stored data.
Whenever I would burn data to cd or dvd I would at least check that the "properties" could read the
data "size" and "size on disk". But from now on I'll store checksum values on the disks along with the data.
I have some foreign language (Korean) folder and file names. Checksum doesn't work for these files - it simply ignores the files or folders and can't find anything to checksum.
Yet you mentioned it can handle foreign languages, so I'm quite puzzled whether it's just my computer...
Hello.
I recently ran into the following critical error (dialog field):
"Hashing Library Error!
I couldn't open the hashing library..
c:\bat\hash.dll
Perhaps it could not be saved in its specified location"
It appears and no hashing is done. I noticed, however, that the hash.dll under c:\bat is created and is left there even after the program terminates.
I should add that I run Windows 8, but I'm damn sure checksum has worked here before.
Any thoughts? Thank you!
indigital
Is the function "Find a file with specific hash" supported ?
First, many thanks for the program. It's great!
It seems that the checksum is computed across NTFS file streams. I assume this is part of why it is so fast - bypassing the file system. Still, it would be nice to have the option to include the streams or not (perhaps as a command-line switch), treating them as the same file or not. For example, I have
foo.txt
foo.txt:bar
foo.txt has two streams: the default (often called "", ":", or "$DATA") and bar. I would like to be able to create checksums as follows.
Option 1:
abcd foo.txt (includes both the default and bar streams)
Option 2:
dcba foo.txt (the default stream only)
Option 3:
dcba foo.txt (just the default stream)
aaaa foo.txt:bar (just the bar stream)
I'm not sure how much slower this would be, but it would be worth it for some of my applications.
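"Option 3" above reduces to hashing each stream independently; on Windows a named stream is just another openable path ("foo.txt:bar"), so the same chunked loop covers both. An illustrative sketch of the idea (not a checksum feature):

```python
# Illustrative only: per-stream hashing. On NTFS, open("foo.txt:bar", "rb")
# reads the named stream "bar"; open("foo.txt", "rb") reads the default
# $DATA stream - so each stream gets its own, independent hash.
import hashlib

def md5_of(path: str) -> str:
    h = hashlib.md5()
    with open(path, "rb") as f:
        while block := f.read(1 << 20):
            h.update(block)
    return h.hexdigest()

# md5_of("foo.txt")      -> hash of the default stream (Option 2)
# md5_of("foo.txt:bar")  -> hash of the "bar" stream (Windows/NTFS only)
```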
Maybe some big potential corporate user will come along and sponsor that change. That's unlikely though, as support for streams on Windows is shaky and inconsistent. If anything, the idea that I could attach data to a file and not change its hash is much more useful.
;o)
Hi. Installing checksum I get "Autolt Error Line -1: Error:Unknown function name". I notice that someone previously reported this and was advised to re-download. Done this - but still the same. Thanks.
Please ignore last message - a reboot fixed it.
Hi, one HUGE feature I rely very heavily on with RapidCRC is sadly (and very, very strangely) not available in checksum.
It's the ability to select multiple folders or files to be hashed.
What I normally do is select multiple non-adjacent folders or files (ctrl+click) and then right-click send-to > RapidCRC. RapidCRC launches and then proceeds to hash all the files I selected. It's dead simple and very intuitive.
Now when I do the same with checksum, there's a pop-up warning me about multiple instances of checksum thrashing my hard drive! After that, checksum only works on a single entry I selected and ignores the rest! I have to manually select each item to hash and wait for it to finish before selecting another one. ONE AT A TIME. The only other option is to move all the things I want to hash into one folder, then right-click that folder > create checksums. After that I have to manually move everything back to where it was, along with the new hashes, which is even more tedious than hashing individual items one at a time.
RapidCRC doesn't launch multiple instances. It simply launches ONE instance with all the files and/or folders I selected and just checksums them one at a time inside a queue. It seems so natural and logical I don't understand how checksum doesn't have this function.
I can't fathom why you would assume it takes me so long to select multiple folders.
I wouldn't waste my time manually selecting tens or hundreds of folders if that's what you're imagining I'm doing. Think 15 items MAX. Usually it's just 2. TWO folders. That's it. The problem is, if I decide to hash those TWO folders with checksum, I end up wasting a lot of time. This is because the folders I work with aren't usually very small. Think 20-60 GB EACH. It may be only two folders but already they can total more than 40-100 GB.
Normally I would just select those two folders, right-click, send to RapidCRC, then walk away. When I come back, it's done. Simple.
But with checksum I can't just walk away. I have to wait for it to hash ALL 60 GB in one folder, then have it start on the OTHER 60 GB folder, and THEN I can walk away. BTW, those TWO folders reside on a 2TB drive almost completely filled with lots of large files mixed with thousands of tiny files, and it's one of many drives I manage.
You're telling me the only other options I have are:
a: Just hash the entire drive. ALL 2000 GB instead of just the two 60 GB folders. BTW I have in fact hashed 2TB worth of data before, and it takes HOURS. Checksum (or any program) can't magically hash faster than a hard drive can read.
b: Put the two folders I want to hash into another folder, and then hash that folder.
Then move the two folders back to where they were and delete the temporary folder. Like I said before, this gets old very quickly especially when you have to do this multiple times a week.
I can avoid having to move folders and files back and forth by using an extremely nifty program called "Virtual Disk" (virtualdisk.net) which is unlike anything I've ever seen before or since. Basically, it allows me to add folders or files from anywhere and they'll appear inside the virtual disk as if they were physically there, but the folders haven't actually moved. It's essentially NTFS junction points on steroids. This allows me to select multiple folders and files even from different physical drives, and have checksum hash them all in one go. While this is a neat workaround for checksum's limitation of only being able to hash one partition at a time, it is far from ideal for smaller tasks that need to be done on a regular basis.
This also applies to verifying hashes. Sometimes I just want to verify TWO large files or folders, not the entire drive, which again, takes a lot of time.
All I'm asking for is essentially an "enqueue" function, and I can finally drop RapidCRC for all my non-CRC32 hashing tasks. If you're telling me it is infeasible to implement such a simple function in checksum, then I ALMOST regret donating those $7 for a checksum license.
I do understand the convenience of selecting multiple folders for hashing, though it's been years since I considered doing it myself - stuff that gets hashed tends to live together these days.
I didn't say it was infeasible; in fact, I told you nothing! But I can also assure you it is no "simple function". You are spawning multiple processes with Explorer - checksum (the first instance) must track all those processes and queue their command lines into a batch of some sort before killing them all for the task itself, and that's assuming Explorer has indeed launched them all yet. Or else have them all running, feeding their commands into a central queue. I could certainly look into it - in fact I've thought of a couple of unusual approaches as I've been typing this - but there's currently a freeze on new functions so that I can get the current beta released before Summer!
In the meantime, if it's something you do a lot, with the same folders, it's simple enough to whip up a wee batch file to handle the job, or else check out this.
If you need a hand with any of this, get in touch!
;o)
Thank you for pointing out hashDROP. It solved my problem perfectly, and it uses the same ini as checksum which is very convenient.
I mistakenly assumed queuing multiple files was simple since it seemed that way with programs like RapidCRC and foobar2000. There's even a method in basic JavaScript for opening multiple entries in web browsers, so it seemed trivial to implement a similar function to handle local directory paths. I was thinking something along the lines of recording the file and folder paths to an internal list and then just going through it one at a time. But it seems hashDROP already does exactly this, so I understand how there's no need to do work that's already been done. I'm only curious as to what hashDROP does that can't be easily done within checksum. Is it doing what you described, just opening and closing instances of checksum automatically?
c:\path\to\program.exe "d:\path\to\some.file"
This is trivial to deal with, internally. Things get decidedly trickier when you send multiple files. If you did that via drag and drop, the command line might look like this..
c:\path\to\program.exe "d:\path\to\some.file" "d:\path\to\some.file2" "d:\path\to\some.file3"
Which is also fairly easy to deal with, internally. However, if you did the same thing via a context (right-click) Explorer menu, the command line looks like this..
c:\path\to\program.exe "d:\path\to\some.file"
and this..
c:\path\to\program.exe "d:\path\to\some.file2"
and this..
c:\path\to\program.exe "d:\path\to\some.file3"
In other words, Explorer launches three separate processes, one for each file.
So now we need some kind of built-in mechanism to track completely separate instances of the same program running on the system (checksum already has this capability), then some way of passing the data (file name and command-line switches) from the other instances back to the first instance of the running program, queue those commands, and then kill the other instances so only the first remains to - once it is certain no more data is coming - complete all the tasks.
While not trivial, it's certainly doable. I've just never had a burning need to do it. I did quickly play around with an idea after your last post, which worked quite well, so it may appear in checksum sooner than expected!
By the way, afaik, foobar2000 was unable to do this until fairly recently, much to the annoyance of its users - in a media player, this sort of capability is definitely expected.
;o)
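The track-and-queue mechanism Cor outlines can be sketched with a local socket: the first instance to bind a port becomes the queue master, later instances hand over their command line and exit, and a short accept timeout stands in for "certain no more data is coming". The port number and timeout here are arbitrary assumptions, and this is not checksum's actual code:

```python
# Sketch of a first-instance-wins command queue (not checksum's code).
import socket

PORT = 47814        # arbitrary; assumed free for this illustration

def run(path: str) -> list[str]:
    srv = socket.socket()
    try:
        srv.bind(("127.0.0.1", PORT))   # first instance wins the bind
    except OSError:
        # a master already exists: hand over our argument and exit
        with socket.create_connection(("127.0.0.1", PORT)) as c:
            c.sendall(path.encode())
        return []
    srv.listen()
    srv.settimeout(0.5)                 # "no more data is coming"
    queue = [path]
    while True:
        try:
            conn, _ = srv.accept()
        except socket.timeout:
            break
        with conn:
            queue.append(conn.recv(4096).decode())
    srv.close()
    return queue    # the surviving instance now hashes the whole queue
```

A named mutex plus a message window is the more idiomatic Windows route, but the socket version shows the shape of the problem in a few lines.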
Is there a console application nestled in your program?
It seems your program is too good for my needs. I'm comparing files on different computers. And running hash tools over the network is ugly in the extreme. So I've written a clever perl script to write a customized batch file, copy the hash utility over to that remote directory, and run it locally. The concept is great, except I haven't found a utility that doesn't need installing, is console based, won't crash, and in fciv's case won't try to remember past work.
https://corz.org/windows/software/checksum/checksum-tricks.php#custom-command-line-switches
checksum can run in a completely portable state too, so long as there is a checksum.ini locally, checksum will use it. No installation required.
And if you are running checksum from a command prompt or some remote launching mechanism, checksum even has a special switch ("q") to suppress all dialogs on the local machine. Add to that fully customizable local logging and it looks like you are good to go.
Need anything else?
;o)
I have been sent a SHA1 hash but I have no idea how to open it, reply to it, or even read it. I would love some help on what to do, please. I know how to save keys to GPA 0.9.1 svn1024 and encrypt a message and send it, but that's about as far as my knowledge goes. Here is what I've been sent, and hopefully you can help me open it and teach me to do it myself..
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
-----BEGIN PGP SIGNATURE-----
Version: GnuPG v1.4.11 (GNU/Linux)
iQEcBAEBAgAGBsJRPSJMAAoJENSgSmxCqWM3/DQIAKFO2oesfQZxinoxrQ7WS+xb
qiV0I5I2xcCJEFPC+33n072KYcUS4vrrXQSTlTNhvxnAxA0cNlWJdxqGtbw//hsZ
gRwuJ1/D7pf67aHnxY4iKo/NHm3DxIxnNggsAyhlke6ux/33n5xXL+qGPLRRJyMi
cq2dv36yRVWteq3n7Wn/THxWm++PJnncsrB5yzUe6nZpgT3Co/a00vlnZoLQVYFB
nNrxL3aButWpp+EqUnjz+fNyJ9epIfNzgCX61Z7uUuBleXyfHrZRcN3AcRtkeces
19sdys7mlPSzvoKNfkQTpoZjcVkMy27MUXtttnKPpMrXMbdSr7ohLi+2ciCr9cU=
=RcAm
-----END PGP SIGNATURE-----
Looking forward to your reply. Thanks, Shabe
You don't have to do anything with this SHA1 hash. If you like, you can install gnupg, maybe the Thunderbird OpenPGP plugin and, once this individual's key has been verified and added to your keyring, use the above hash to verify a sender is who they say they are. I'm assuming by "sent", you mean email.
You can also use these signatures to import a person's public key from a public key server (there are many) and, assuming that information is okay, download, install and use their published public key to send them encrypted messages which only they can read. There may be other things you can do with them!
But again, nothing to do with checksum.
;o)
Hi, checksum seems to ignore files and folders with certain characters like: ★
(There are others but I can't remember them right now.)
Also, checksum errors out when trying to checksum folders with names longer than 100 characters - 101 or more characters results in an error and unhelpful fallback behavior.
If file name length is the problem, I wish checksum would just say so, though 100 characters is a bit low for a limit. Even the old Win32 API has a limit of 255 characters (or 249, or whatever) - way higher than 100.
;o)
Can I download the beta?
I tried here: https://corz.org/engine?download=menu&section=beta%2Fwindows%2Fchecksum
At the time of this writing, there's nothing there.
Hi, thanks for making this very useful tool. After donating, how does one get a license number?
;o)
Hi Corz
I must be missing something. I looked over the history, and there you speak about the 1.3.5 version. I see somebody above asked about trying out that beta, and you said
"Only registered users have access to unreleased betas."
What do you mean by only registered users? I don't see any sign-in/register subpage anywhere, unless I'm overlooking it. Which registered users are you referring to, and can I become one?
Secondly, while at the history page, I've seen there were other versions prior to 1.3.5 mentioned there, but the one in the download area is 0.9.7.6.
The current in-testing release is v1.3.5.6, which only a few participating registered users are using. There have been no issues so far (though feedback is scant as ever - usually a good thing), so a public release is inevitable, RSN.
Thirdly you answered on april that in a couple of weeks the 1.3.5 should be released, and it's july now, so I'm just asking if I'm doing something wrong, or looking at the wrong place. Is there some new subpage for current checksum app version?
cheers
Lucas
There used to be a special secure page for beta-testing downloads, but this proved problematic. However, with my new licensing scheme now in place, it's highly possible a beta-testing area may reappear in the future, somewhere registered users can grab the latest builds and what-not.
For now a simple mail gets you the latest version by return.
;o)
Thanks for getting back to me that fast. Ok, to go point by point.
1. I saw 0.9.7.6 version at the bar of the corz installer, and I assumed that shows the version of the checksum being installed.
Ah yes! That is the current corz installer version. As I've mentioned, this will likely be deprecated at some point.
When I went to Control Panel in windows it indeed shows 1.2.3.9 version, so all is in order it seems.
2. Lastly the registration thing. When going to license subpage, I only see this:
I don't see any registration/sign-in link. If you referred me to this page to indicate that I need to buy a license to be registered,
I did so months ago.
But all I got was a code in a mail which I entered in the application tool, not anywhere on the page as I didn't see anywhere a place for it.
I guess I'm missing something completely obvious. Could you please help me with that?
cheers
Lucas
;o)
I'd like to take this opportunity to thank Cor for the excellent direct email support he gave me over the past couple of days, which helped solve my "problem" as well as answer a couple of further questions.
Lucas
I'm still not entirely sure of the best way to run the beta testing process. I've tried something different with every major release (my minor versions are like most developers' major versions!) and have to admit I've still not found the ideal system.
But having people make the effort in sending a mail seems to be a better way of weeding out those who wouldn't make the effort to give feedback, which is the whole point of letting betas "out there".
But basically anyone who takes the trouble to mail me deserves a reply.
Have fun with the beta!
;o)
Congratulations on your initiative.
Great job!
"itstory is here"
Fabulous sentence.
The site design is formidable.
Thanks!
By the way, if you like that sentence, you'll love these!
;o)
Hi Cor,
Using Checksum has become an integral part of how I deal with moving files, etc.
For example, I have over time backed up my Steam purchases (Valve's software platform) which consists mainly of games spread across different drives. I now have a 3TB usb 3.0 drive to handle the compressed backups of all of the games I have purchased/registered through the Valve Software's platform.
For simplicity I use Teamviewer to transfer files across my LAN, albeit quite slowly in comparison to what the USB 3.0 standard can achieve. However, I do face another issue, as my laptop's USB 3.0 ports and chipset were an early adoption of this newish USB standard, and that's where Checksum really helps (as I've had all sorts of odd blips when the USB 3.0 ports are in use).
On each of my game backups, which vary between 100MB and 22GB, I use checksum individually and select the "Root" option. I set the Steam backup program to break the files into 1100MB chunks to facilitate file manipulation, e.g. to help with things like defragging and moving files.
I could of course do one "Root" file for every bunch of backups I move at a time, but I do prefer to have a separate Root entry for each game backup. The Checksum # file will then remain with that backup file set as a sort of ID badge.
Checksum is invaluable to me for checking that the game backup I made on X PC is exactly the same when it's copied to Y laptop.
I truly don't know what I'd do without Checksum now. Thanks for a fantastic program and keeping your product, website and feedback up-to-date
Indeed I am at this very moment putting the finishing touches to the new release (I always forget how much work is involved in that! I could do with an assistant!). The documentation is in need of an update, too.
I haven't had a single bug report for v1.3, so as soon as I can get everything packaged up (and run through my batch of test install scenarios) and a page created where folk can update their licenses to the new scheme, it's good to go. Of course, as is usual around release time, the Universe conspires to scupper the final process! This morning it's simple checksum no longer building with my updated compiler. Fun and games!
Watch this space!
;o)
Hi,
I have been trying your checksum utility for a while and find it is a very useful app to have around! However, I have noticed that when verifying large files (500MB+), it appears to be using disk read caching in Windows 7 64. When I verify a large file on disk, it takes some time for checksum to read it, and there is clear indication of disk access. If I immediately do it again, the checksum is very fast and it's obvious that the disk is not being read - the cached data in memory is being checked, not the data on disk!
This unfortunately becomes an issue when copying large files and then trying to verify the copy. A scenario sometimes occurs that I first create a checksum for a file in a source dir, and verify it. I copy the file to another dir or drive, and verify the checksum on the new copied file, and it's clear that the checksum app is just reading the cached value in memory and not the data from disk. So, you can see that this creates a problem for checking copied files.
Is it possible to force the checksum app to re-verify data in large files by bypassing the Windows disk cache?
The long answer involves some fairly low-level jiggery-pokery by the operating system which I tried for a very long day to get around. FILE_FLAG_NO_BUFFERING and such only work on file writes. I played with quite a few promising code examples, but nothing actually bypassed Windows cache in the real world. Google was not my friend.
I did this because I too want checksum to always read the data from the disk. At least have the option to do this. To hell with the benchmarks! I went to bed in frustration.
The only way around it, I have found, is to either empty your cache (by filling your RAM, not as easy as it sounds), or else rename a parent folder before verifying. Of course, if you leave it a while and do some work and stuff, the cache will clear out. Checksums are most useful after-the-fact, anyway.
If your disk is removable (some SATA hard drives are also technically removable, depending on your adapter), you can have Windows disable the caching altogether (in Device Manager > Disk > Policies).
And if anyone knows how to do this (in C/C++), I'm definitely up for hearing about it!
;o)
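For anyone tempted by Cor's C/C++ challenge, the Windows-side ingredients look roughly like this - shown as an untested Python/ctypes sketch. The constants are from the Win32 headers; note that no-buffering handles additionally require sector-aligned, sector-multiple reads, which may be exactly where naive attempts fall over:

```python
# Untested sketch, Windows only - NOT a confirmed working cache bypass.
import ctypes

GENERIC_READ           = 0x80000000
FILE_SHARE_READ        = 0x00000001
OPEN_EXISTING          = 3
FILE_FLAG_NO_BUFFERING = 0x20000000  # ask Windows not to cache reads
INVALID_HANDLE_VALUE   = -1
SECTOR = 4096  # assumed; query the volume for the true sector size

def open_uncached(path):
    """Return a raw handle whose reads should bypass the disk cache."""
    k32 = ctypes.windll.kernel32  # AttributeError off-Windows
    h = k32.CreateFileW(path, GENERIC_READ, FILE_SHARE_READ, None,
                        OPEN_EXISTING, FILE_FLAG_NO_BUFFERING, None)
    if h == INVALID_HANDLE_VALUE:
        raise OSError("CreateFileW failed")
    # reads must be SECTOR-sized multiples into SECTOR-aligned buffers
    return h
```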
Thank you for creating Checksum, it's wonderful.
I have been eagerly looking forward to the updated version, and when I saw that one was available I went to download it.
But guess what?
I was disappointed to see that the download was the old version I already have.
So I thought, OK, I'll download the x64 version - maybe it also has the x86 version bundled with it - but NO! I just got a file-missing error message.
What happened? Can you fix it? Can I fix it?
Thank You.
EDIT: What is the current release? The version I have is 1.2.3.9
Hi Cor, thanks for answering my question about the read caching!!
eric
I just came across this a few days ago and it looks like a really nice program. I'm trying to decide if it will work for something I want to do, and I can't figure it out from the manuals.
I have a large folder with many files (100 GB, 100,000 files) that I often change files in, which I want to monitor for data corruption. Currently, I do this with ZFS on my Linux backup server and I've been experimenting with using Snapraid on the windows workstation to detect and recover from corruption there. The problem is that I don't really need redundancy on the workstation, because I can recover from the backup, and a RAID solution requires at least one extra disk. Since all I want is to know if corruption is occurring, I really only need checksums.
I think my goal could be achieved with Checksum if I could make the "sync" option create new checksums for files that are new or have new modtimes, not just files that are new. Is this possible, or is there another way to do what I want without a lot of manual intervention?
As it stands, checksum can't do everything you need automatically. Plans are afoot for a checksum "post-processor" which will do mop-up type jobs like removing stale checksums from your .hash file, which would be all you need to automate this process.
The new timestamp functionality (along with the CHANGED/MODIFIED/CORRUPT flagging) was created, partly, to enable this sort of functionality. It's on the way. If you know some big company looking for file verification software, let them know about checksum, I can take a week and make it happen!
Currently you would need to parse the log output and use that to remove entries from the .hash file. It's certainly doable in a shell script but probably beyond the capabilities of the average user, who probably deserves this functionality in what is otherwise the best hashing app on the planet!
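For anyone who wants to experiment in the meantime, here is a rough Python sketch of that parse-and-prune idea. It assumes the common md5sum-style entry format (`<hash> *<relative path>`), which may not match your .hash files exactly, so treat it as a starting point, not a finished tool:

```python
import os

def prune_missing(hash_file):
    """Drop entries from a .hash file whose referenced file no longer exists on disk."""
    base = os.path.dirname(os.path.abspath(hash_file))
    with open(hash_file, encoding="utf-8") as f:
        lines = f.readlines()
    kept = []
    for line in lines:
        # Assumed entry format: "<hash> *<relative path>"
        parts = line.rstrip("\r\n").split(" *", 1)
        if len(parts) == 2 and not os.path.exists(os.path.join(base, parts[1])):
            continue  # stale entry: the hashed file is gone
        kept.append(line)
    with open(hash_file, "w", encoding="utf-8") as f:
        f.writelines(kept)
```

Run it against a copy of your .hash file first, of course.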
The post-processor will do other things, I'm still collating all the suggestions into the best plan of action, and accepting new ones, to boot.
;o)
Thanks very much for your reply. I'm just an academic; no company with $$ behind me, and I wouldn't trust myself to write that script properly. But it's great to know that you're thinking about this kind of functionality. I'll keep a lookout for the update!
Is version 1.3.5 available for download? I'm getting version 1.2.x I'm up for paying if that gets the most recent version. It has features I need.
Thanks,
Alex
I'm curious now, which of the new features interest you?
;o)
Thank you very much for simple-checksum!
I've read "64 and 32 bit versions now available" at itstory, but I only see the x32 package to download (although the installer put it in "Program files", not "Program files (x86)").
Is the x64 version available to "free" (aka non-registered) users, too?
TIA!
Giving checksum a tryout and ...
I have changed the .ini file.
log_to_file=true
log_everything=true
I get a log file when I do a verify, but I don't get anything generated into a log file when doing a "create checksums" or "synchronize".
Am I doing something wrong?
For a log of checksum's creation operations, check out the .hash files!
It's all in the manual! ;o)
Great job!
have you considered adding SHA2 and/or SHA3 hash functions?
But of course, misinformation and market forces will likely compel me to include them at some point. Such is life!
Best regards!
Hi again!
Yes, you are right about SHA2 and SHA3 performance.
BUT (hehehe) if you are concerned about speed, performance and resource optimization, check out BLAKE2:
https://blake2.net/
About the statement that checksum is "the only tool against silent data corruption", there is actually an even better tool for that purpose. It's MultiPar, which you can get here:
http://multipar.eu/
You can read about the technology MultiPar uses on Wikipedia, here:
http://en.wikipedia.org/wiki/Parchive
Finally, you can follow development in the forum for MultiPar, here:
https://www.livebusinesschat.com/smf/index.php?board=396.0
I just did a quick comparison with the latest version of checksum, and one thing is certain, checksum is ridiculously fast, and easily beats MultiPar in speed. For myself, I use both checksum and MultiPar. I use checksum mostly when I'm transferring huge quantities of files and I just want to quickly verify the transfer was good (I use FastCopy for small copy and paste jobs where checksum's speed isn't as important).
I mostly use MultiPar for long term archival storage for data that doesn't change very often. MultiPar's primary advantage is that it can repair errors when data corruption occurs. If I already have a MultiPar data set, I don't need to use checksum to verify it, but the newer versions of checksum are so fast, I think I will start using checksum even when I already have a MultiPar data set.
For real-time data verification and repair on actively used and frequently changing data, I use the ZFS filesystem in FreeBSD in a RAIDZ configuration. Then, ZFS will pull a good copy of the data from one of the mirrors if it detects an error. It only works well on actively used data, so MultiPar is still helpful for archival storage. I haven't tried using checksum on FreeBSD + ZFS, but maybe its speed will make it useful there too.
But I did state my belief!
What you say is true, in certain circumstances. I played around with PAR files myself some years ago but eventually concluded that any serious backup strategy of mine is going to involve at least two full copies of the data. The last thing I need is even more data. This is true for a great many people/organisations. Storage space still isn't nearly cheap enough.
Secondly, for a brief spell, I came to partly rely on PAR files, which was a mistake. I came, tragically, to realize one fundamental fact of computing, and that is; even if you store your data on ticker tape, and even then, there is always the chance of total catastrophic failure. Even partial hardware failure can corrupt enough of the par data to render repairs unexpectedly impossible. Ouch!** I like the all-or-nothingness of data, the one and zero of it. This grey area annoys me!
PAR is a nice idea (as are many RAID strategies, but they're STILL NOT A BACKUP! (that was for someone else! ;o)), but at the end of the day, the "perfect" backup is, for me, a second copy of the data and, of course, some way to know exactly when one needs to refer to it to replace corrupt files (I put it better, here.).
I do think PAR is a nice idea, and in certain circumstances (binary newsgroups, being the obvious example), highly useful. But for simplicity, speed (oh yeah!), ease of use, opaqueness, and all-round real world practicality, I still believe good backups + basic hashing is a better, more secure strategy for important data.
When the data is trivial (like TV/music downloads), a second copy isn't required, and PAR files are excess - it's usually easy re-got. And when the data is not trivial, nothing beats the warm, fuzzy feeling of knowing that there exist two perfect copies - even if one of those disks/tapes/drives/whatever disintegrated into a pile of molten metals and plastics, the data is still 100% (and you go get a replacement disk NOW).
There is certainly room for both systems in the world, and when PAR3 hits the masses, I might take another gander myself. But no way I'll be ditching my .hash files!
;o)
And because checksum is so darned fast (that was a month of my life, folks!), people are more likely to actually verify their data, which is really the key to staying on top of "silent data corruption".
Hi Corz. The new(new for me) 1.3.6 update looks great, thanks for that.
Got a question I asked some time ago, and was wondering if you've put any thought into it and maybe plan to implement it in the next version. Namely, the "removing entries of missing files" from the hash file. It's great you gave the option to ignore the missing-file notifications, but is there a chance you could also add an option to remove those entries while updating (or checking) the previously created hash file?
I'm sure you know what I mean, but just to be absolutely sure, if we have file A in directory B, and move it to directory C, then previously checksum would report the file A is missing in directory B. If we update the hash file, we will have the entries for both file A in directory B and file A in directory C. Afterwards if we ran the previous version of checksum it would obviously report the file A in B directory missing, which you kinda fixed by the flag to not report missing files.
But still, this entry will be there, and will make the hash file unnecessarily larger than it should be, especially if we move a lot of files/directories around. What I'm asking for is a setting whereby checksum, while updating/checking the hash file, upon finding a missing-file entry, just removes that entry from the hash file.
Lucas
Cor, since my last post about MultiPar above, I've decided to start using checksum a lot more due to its speed. You are absolutely correct that:
I've noticed that's true for myself too, and I'm checking my data more frequently when I know it will only take a few minutes to verify large quantities of data. I also keep multiple copies in backups, but the motto I coined for MultiPar is "backup is not enough".
There was one case where I had somewhere around 4 to 7 copies of my data in various mirrored backups, including at least 3 of those copies secured offsite. My backup software was checksumming the copied data every time, so I knew it was being copied correctly. What I didn't know was that my data had been corrupted, and my backup software was backing up corrupted data! Several of my files just disappeared in a file system corruption event, and the files that I still had were older obsolete versions, with errors. Here is how it played out:
* My backup software failed to warn me of the problem.
* checksum succeeded in detecting the corruption.
* None of my backups contained a good copy of the original data.
MultiPar saved the day by regenerating the missing and damaged data. I lost NOTHING in a worst-case scenario, thanks to MultiPar. Similar, though less extreme, versions of that story have happened to me on multiple occasions, but that was the first and only time when ALL of my normal options failed simultaneously, including ridiculous numbers of backups.
What we're missing today is some sort of software "spider" to traverse the data and verify it is good automatically in the background, without the need for me to keep clicking on .hash and .par2 files all the time. Something like that could alert me that my backups are corrupted, so I can take action before my good backups are updated with bad backups. (something along these lines is already planned for future checksum - see discussion re: "checksum agent", elsewhere - ed)
By the way, PAR1 has been obsolete for a long time, and PAR2 prior to MultiPar is obsolete too. MultiPar's version of PAR2 can handle a directory structure, much like checksum's root file feature. MultiPar has added many other novel features to PAR2, and it has been thoroughly tested and proven to work. PAR3 so far hasn't offered enough new features for anyone to feel any sense of urgency in completing specifications and an implementation of it. MultiPar's version of PAR2 is ready, NOW.
You definitely need to give MultiPar another look, it is fantastic for long term data storage, especially if you use offsite backups. There have been many cases where 100+ GB files have had a small amount of corruption after I've downloaded them from one of my remote offsite backups. With only checksum in my toolkit, I would have to redownload the files over and over until I'm lucky enough to get a 100% correct copy. MultiPar eliminates that hassle, and it can fix the defective copy in only a few seconds.
Even with ZFS and/or tools like rsync, MultiPar's archival nature is irreplaceable, because ZFS only checks data integrity when it is read. For infrequently accessed archival data, like with physical data storage devices (HDD's, tape, etc) sitting offline in a closet somewhere, MultiPar is critically necessary to ensure the data will be fully recoverable when a few random errors are expected and inevitable. The best part about MultiPar is that it can fix ANY random errors in a large data archive, using a comparatively small PAR2 data set.
You can keep 7 duplicate copies of your data like I do, but if each copy has its own unique set of corruption errors like swiss-cheese holes, the duplicates are useless. MultiPar is necessary for identifying which parts of the data are good, which parts are bad, and then replacing the bad parts with regenerated good data. checksum can't do that. In some cases, even the ultimate ZFS RAIDZ system can't do that. MultiPar CAN do that!
Backup is not enough. checksum is not enough. MultiPar alone is not enough. Combine MultiPar with backup and checksum, then your odds of permanently losing data will drop to almost nothing.
I need to report a bug. I've found that checksum doesn't generate checksums for .lnk Windows shortcut files. Those are real files, and checksum shouldn't skip them if they are present in a checksummed group of data.
ignore_types=md5,sha1,hash,sfv,crc,lnk,url,m3u,pls,log
;o)
checksum only has 2 "likes" at AlternativeTo:
http://alternativeto.net/software/corz-checksum/
It's way better than most of the other software.
I commented out all of the ignore preferences in checksum.ini, for ignore_types and ignore_files, so everything will be checksummed for me. You might want to consider putting a readme file in the checksum directory that explains where to find the ini file, and where to find more thorough documentation.
I was told by another checksum user that "The paid version has an option to include a check of the file date/time and will notify you if the file has been modified (based on a new date/time stamp) or corrupted (unchanged date/time stamp but fails verification)." Is this true? and if so does that menu show up after you right click and select "create checksums" or "verify checksums"?
As for the time stamp feature, yes, it's true! Details on this very page. No menu entry required.
;o)
Hi - great program, I'd started writing my own after having the same problems as you, but I don't need to now I know about yours.. I might also write a few utilities to post process the host file, if so I'll send you a link.
However, you've missed a BIG issue - while it does a great job maxing out the disk transfer, it means (under Windows) that the user had better not want to access anything else on that disk while it is running, as it will be very, very slow.
You need two things
1) an option to set the priority of the checksum process (I've been setting it to low manually) either in the config file or via the right mouse button in the windows tray
2) some way of pausing the checksum process - either through the tray icon or the popup bubble (say clicking on it).
You need both of these things because even if it is reading and processing 100 MB/s (say, an average reasonable hard disk), that's still only 360 GB an hour, so over 10 hours for a full 4TB drive...
And if you do those two things, you may as well do the third obvious thing -
3) a 'shutdown when finished' option ie if I run this on my main video collection it's going to take over 24 hours to run, it would be nice to auto shutdown at the end...
Ian
thanks for the quick reply!
On number 2, I didn't know about the pause key, but when I try it I only get an option to abort the whole job, not to pause it. Any idea what I might be doing wrong?
(ps, my job from yesterday is still creating checksums - it's not your program that's slow it's modern hard disks! I did it on a 12TB directory... )
ahh, I didn't think of that...(the pausing..)
And I'd try it on a faster hard disk but unfortunately anything faster I have is on my unix box, ie my main video editing machine has no problems completely saturating the 6 Gb/s SATA 3 bus reading from my array of raptor drives - I've measured real sequential reads in the high five-hundred megabytes/sec range...
But unless you do a unix version I won't be able to test it out :-)
On that note, I'm getting a bit sick of SATA 3 and gigabit LANs - I can't wait to update everything to SATA 4 and 10GbE ... :-)
I have really enjoyed using Checksum over the past few months! I decided to purchase a personal use license the "old-fashioned way" with Paypal. Afterwards I attempted to register; it asked for my email, so I put in my Paypal email (I am assuming that is the one that got registered?), then it asked me for the code, however I have not received a code. Am I supposed to be getting an email with the code, or am I missing something?
EDIT:
Yup, I should have thought of that. Thanks again!
Hi Cor, I'd like to thank you for your program - and learn how to use it.
I want to make sure that some folder with 100,000+ files is identical to its copy on another drive. Is there a better way to do it than comparing 100,000 checksums visually? I'm sorry, but I didn't find the answer here.
Is there a way to allow checksum to continue while a dialog is shown?
eg. I have 3 checksums queued to verify. When the first checksum completes, a dialog box pops up to inform me that everything has been verified successfully. At this point, checksum pauses and will not continue verification of the other files until the dialog is automatically dismissed (if I have it set to do so) or I click OK.
What I'd like is for the dialog to stay open so I see it when I come back to the computer, but for the other checksum files to verify and pop up their own dialog boxes. Perhaps it would also be good if the dialog boxes contain the name of the checksum file that has just been verified. Perhaps it could also add the queued checksums to the current process and present a dialog box after all checksums in the queue have been completed?
You do have some options. The first is to simply disable queueing. This will thrash your disks a bit, though, with processes fighting for disk I/O.
A better idea is to disable dialogs altogether and trust that checksum will let you know if there are any errors (by opening the log folder, assuming logging is enabled - it usually is).
The latter option is superior, because it promotes a "click and forget" attitude to using checksum. As soon as you activate the verify task, you can put it out of your mind! If your attention is required (errors), checksum will show you the log file you need to examine.
And, crucially, if your attention is not required (no errors), you don't have to click any dialogs or be reminded at all, you can continue with whatever else you were doing in life!
But that's just how I prefer to work, so I will surely be considering ways of improving the queueing for others at some point. Your suggestions are noted.
;o)
Root file checksum does not work for a selection of files and folders in the root. It's all or nothing. I have to right-click on the root folder to checksum everything. It won't work to just select a portion of the files and folders in the root folder. That surprised me. It tries to create many checksum files all over the place if I try to do it that way.
Also, some kind of file list GUI thing might be helpful to preview what will be done, so some things could be removed manually if I want them to be skipped.
Another issue: A file with a path of 260 characters is correctly checksummed, but in verification, it will be marked as "missing". A file with a path of 251 characters did not have that problem, so I'm guessing it has something to do with a 255 character path limit.
I have many, many files with path lengths much larger than that. Path lengths could reach into the thousands, and I seem to remember a trend toward raising path length limits to 32,000 characters to eliminate most issues from path length limits.
;o)
Setting root hashing as default does not do what you claim it does. You claim that what badon wants to do is possible. It is not. Setting one_hash=true only works if you select just one folder.
All badon wants to do is select a few files and folders and get ONE (1) hash file.
Perhaps I misunderstood. I imagined what he wanted was to select a few folders and get root hashes inside each of the folders. That is what I was giving advice about.
<more snippage>
Please try and remember, you can use this software, which already represents a huge amount of work and effort, for free. When time and finances allow, I do my best to make it the best it can be. Feel free to purchase a 250 user license, I will get right on any feature requests you may have! (though please post them on the correct page).
;o)
Hi. Just started testing this program. I'm liking it so far as it has more features than other tools that I've tried like ExactFile. But one thing that I'm finding is that the right-click tray menu option is flaky. Sometimes, I will right-click over the checksum tray icon and after ~5 minutes, the menu will finally appear. Perhaps this is because the app is too busy hashing a large video file?
Anyway, if you would consider making the right-click menu option something that can be more easily accessed, that would be great. Maybe an option to make the right-click tray menu always appear as part of the tool tip?
Thanks!
If you need to access the menu immediately, hit the <pause> key on your keyboard. Voila! ;o)
Also, perhaps you can add a hash checksum to the hash checksum file itself? That way, I can be sure that a large root checksum file for one of my 3TB drives hasn't itself been corrupted.
Thanks!
But there is nothing to stop you creating a .hash file of your .hash file. You would need to remove .hash from the ignored types, of course. I'm assuming this hash-of-hash would be stored on some other media. Or you could simply make a copy somewhere and compare them (with simple checksum).
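If you'd rather script the hash-of-hash idea, it is only a few lines in most languages. A Python sketch, reading in chunks so even a huge .hash file needn't fit in memory (BLAKE2 chosen here simply because checksum supports it; any algorithm from hashlib would do):

```python
import hashlib

def blake2_of_file(path, chunk_size=1 << 20):
    """Return the BLAKE2b hex digest of a file, read in 1 MB chunks."""
    h = hashlib.blake2b()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Store the resulting digest somewhere other than the disk holding the .hash file, or it proves nothing.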
Please note, in future, please put feature requests on the tricks and tips page. Thanks!
;o)
Hi, first of all I want to thank you for your great programs, they save me so much time!! =)
I've set up a batch of hash verifications of some directories on an external hard disk using both checksum and batch runner. It all works perfectly; there's just one thing I'd like to understand better about checksum's returned status codes... because although the checksum logs assure me that all the verifications were successful, the batch runner log says that one of the checksum runs returned 3 as its result, not 0 as usual on success.
So, what's the meaning of this code?
Thanks!
-33 Exited from Splash Screen
-30 User hard exited from hashing operation (dll exit)
-14 User quit (and nothing was done)
-11 checksum.ini is missing
-9 User Quit.
-5 Nothing was done.
-2 There were verification errors.
-1 Read-only media -> checksums on desktop/fallback.
0 Normal Termination.
1 Attempted recursion on a regular file.
2 Nothing to check (no checksum files found).
3 Not a checksum file (this will be returned even when there were good checksum files, too).
4 No arguments supplied.
5 checksum already running (and disabled in ini).
6 Non-existent file.
7 Invalid Path.
They are poorly ordered, and a bit brain-dead; for example, regardless of what happened, if checksum had to use read-only fallback, you get code -1. So it goes.
;o)
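For anyone scripting around checksum, the table above is all you need to branch on the result. A minimal Python sketch; note that the executable path and the `v` verify switch used here are placeholders for illustration, not verified against checksum's actual command line:

```python
import subprocess

# Exit codes from the table above, mapped to friendly messages
MESSAGES = {
    0: "all good",
    -1: "read-only media - checksums written to fallback location",
    -2: "verification errors - check the log",
    3: "not a checksum file (may still have processed good ones)",
}

def describe(code):
    """Translate a checksum exit code into a human-readable message."""
    return MESSAGES.get(code, f"unexpected exit code: {code}")

def verify(target, exe=r"C:\Program Files\checksum\checksum.exe"):
    """Run a verify job and describe its outcome (path and switch are assumptions)."""
    result = subprocess.run([exe, "v", target])
    return describe(result.returncode)
```

So in badalex's case, `describe(3)` would have told him that one of the queued items wasn't a valid checksum file, even though the good ones verified fine.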
I have my Checksum configured in the ini file to always create a root hash file, which is why I think this problem is either a bug or a missing feature. I don't need to hold down the shift key to do root hashes, in other words. Instead of getting a single root hash, I get many root hashes, one for each file and directory.
By the way, when are you going to get a real forum? SMF is BSD-licensed, and much more capable of handling the quantity of posts your software elicits these days, as word-of-mouth spreads about what you've got here. Forum software is very flexible. I have set up SMF to work as:
* A help desk
* A bug tracker
* A task tracker and prioritizer
* An event scheduler
* A blogging platform
...and probably a few other things I've forgotten about. Really, there's not much basic forum software can't do when it comes to communication and categorization of nearly any kind. We specifically avoid using any fancy plugins or custom code for our forums. We don't even bother to customize the themes! That makes it easy to migrate to other forum software if we ever become dissatisfied with what we're using.
For everything else, we use Semantic MediaWiki, which is incredibly powerful, and a hugely complex project to maintain. Stick with the forum software; it's easy and peaceful.
As it is, I just noticed that your custom comment thing won't let me link the word "SMF" to the SMF website. Instead it wants to send me to dictionary.reference.com, so here's the link to SMF in a plain URL:
http://www.simplemachines.org/
Well, it looks like it kind of borks that too, so let's try this again:
http://www.simplemachines.org/
Nope, also borked. I'm out of ideas. I guess you're going to have to copy and paste the URL if you want to go to the SMF website. I tried!
Oh, one more thing, we recently discovered that our backup techniques weren't as reliable as we thought they were on our FreeBSD system, so we need to be distrustful like we are on Windows, and use at least 2 different ways to verify that our backups are good. As I've mentioned before, I use MultiPar and Checksum on windows for verification and repair. We have par2cmdline-tbb that can take the place of MultiPar on FreeBSD, but I haven't yet found a replacement for Checksum on FreeBSD.
I noticed that there's supposedly a linux version around here somewhere - would it be possible to create FreeBSD binaries too? There might be a way to use the linux version on FreeBSD, but I haven't tried using the features for that yet, and it would be much better if we could have Checksum in the FreeBSD ports system (as a binary package is OK too).
What do you think?
EDIT: It looks like the URLs were only borked in the preview. They seem to show up fine in the actual post. A bug?
A simple question from me, a simple man.
When a 'verify' indicates an existing hash has changed and asks me whether I want to log or not, I investigate and find that what I need to do is update that particular file's hash, as I do recognise the change. How can I do that without having to rehash the whole folder again? I am manually editing folder hash files when I feel there must be a quicker way. I suppose what I am asking is: when it asks whether I want to log, should it not also be asking whether I want to rehash the failed file?
Thanks for a very useful app. One minor item that you might like to address:
I just got an indignant "The file [...] is not a checksum file!" error message when attempting to verify a couple of .md5 checksums I downloaded today.
They most definitely are checksum files, and after a little digging, I realized that checksum was barfing on them because there was no line delimiter at the end of the single line of text in each file. Simply adding a carriage return before EOF resolved the problem.
Might be nice if checksum could handle such an eventuality a little more gracefully.
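Until checksum handles that case itself, a tiny script can patch such files before verification. A Python sketch, assuming (as in the case above) that the missing final newline really is the only problem:

```python
def ensure_trailing_newline(path):
    """Append a CRLF to a file if its last byte isn't a newline; return True if patched."""
    with open(path, "rb") as f:
        data = f.read()
    if data and not data.endswith(b"\n"):
        with open(path, "ab") as f:
            f.write(b"\r\n")  # CRLF, as is conventional on Windows
        return True  # file was patched
    return False  # already ends with a newline (or is empty)
```

Running it twice is harmless; the second call is a no-op.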
Hi.
I have been using checksum since 2008, and am happy using it.
This comment below is for checksum Ver 1.2.3.9
<snipped>
Thank you for your info!
I've confirmed it, the other computer is using the older version 1.1.4. All the latest versions are ok. I will upgrade all with the latest version.
You do deserve some donation for your fast response!
Thank you again and keep up the good work!
Fantastic article. Getting people to use md5 hash to verify files is a good thing. I've used it many times to save myself headaches in my sysadmin career, and now with so many power users at home, and with security concerns, promoting use of hashing is a great thing.
I actually listed quite a few reasons I use them in everyday "sysadmin" life, easy to extrapolate how you can use md5 hash to everyday home use.
http://geekswing.com/geek/the-magic-of-hash-and-i-mean-of-the-md5-and-sha-1-vintage/
Thanks for the post!
Amazing!
Hey C(O)R(a)Z(y) guy !
I really, really, really promise you: if I ever become a millionaire (there are just a few of them), I'm gonna send you a hell of a lot of money!!
But NOT for your software!! (which, when I have to fly some terabytes around via many hard disk drives, is the only software that allows me to sleep well!)
So what for?!
Well... having used many, many, many hundreds of programs since 1984 (Sinclair ZX Spectrum Z80) for any application you could imagine, I tell you that you are the ONLY crazy programmer who remembers to show us (users) that behind the software lives a very human face! One which can have a very... "ShockDoctrine" (Naomi Klein) sense of humor, which is an absolute must in this stupid, hard and non-human world!!
Thank you for being there !! ( ^1000 )
ps
I can't stop laughing because of your answer to Holodecker loooooooool
I intuit you would have made history as a politician, or at least as a journalist covering politics!
Hey, if you liked these responses, you should check out some of the fun on my .htaccess tutorial (part2 comments are especially funny)
;o)
ps. I could have been a politician, and a rock star, amongst other well paying things, but I was too busy writing software for the ZX80, ZX81, Oric 1, Spectrum, Amstrad CPC, and so on... Ahh.. carefree (if somewhat naive) days!
Please make the complete drive hashing real, eg. not putting separate hashes in each directory, but ONE hash file.
Also the directory hashing; the same for it.
Else, thanks for having such a decent program.
Is there any way to disable the nagbox after verifying the hashes, eg. in the settings.ini??
Sometimes it's pretty irritating.
So that's a Yes.
Please do skim through the documentation and comprehensive ini file, at least once!
;o)
Thank you for this useful and versatile tool. That's just what I needed.
And thanks for the extended documentation.
By the way, I have tried a solution to your question on how to insert a "proper" context menu divider into the registry:
A Windows Registry key can be marked as separator by setting the DWORD CommandFlags=8.
For example, I successfully changed your line in the setup.ini file
HKCR\*\shell\00. divider=————————
into
HKCR\*\shell\00.divider=CommandFlags||REG_DWORD||8
... similar for the other lines with divider.
<edit>This has since made a dramatic difference to all my own personal context menus, too. You should definitely contact me for a free checksum license!</edit>
Trying to install this on Windows 8.1 I'm getting
"corz installer error...
Error loading hashing library!
Setup aborted."
I'm not sure why, any ideas? I've tried with anti-virus disabled. It installed on my virtual W7 box fine though, so I'm not sure if it is a windows issue or elsewise.
Of course, you can dispense with the installer altogether and simply copy the contents of the checksum directory (inside files/) to anywhere in your system and checksum will run fine (note: if you leave the .ini file next to checksum.exe, it will run in portable mode).
;o)
I did attempt it as admin, and moved it to different drives just for fun; no go. I ended up doing as you suggested to check a few, then upon a reboot the setup worked fine, so I've no idea.
For those that don't follow my devblog but do follow these comments, please note; a new beta has been released. Available in the download section (above).
New features include FULL Unicode file handling, LONG PATH capabilities and more.
Please test, Test, TEST!
Enjoy!
;o)
And another beta release has flown out the door!
New features include the ability to delete hashes for missing files (you kept asking for it!), shutdown computer when done option, CPU priority and more.
Note: The beta releases (from v1.4.1.5) now have their own version checking URL, so if you are playing with the beta, you can be assured of keeping up-to-date with the latest changes. More info here.
;o)
Hi,
I am just trying your tool, as I am looking for a hashing tool that doesn't lock the file while it's hashing and that seems to work great!
Is there a way to have it execute really quietly? Even using q in a batch file, I get a small popup. It closes, but I'd like to avoid it.
Thanks
Patrick
If you mean something else, let me know! ;o)
Your app started to give a nagbox about donation everytime I use it.
If I want to donate, I will, no need to remind it about every time I want to
create/verify checksums.
I guess I have to uninstall this.
But if you want to uninstall rather than even consider a license, that's your choice! ;o)
Ref Tontza Comment.
I do find it incredible that someone will take the time to complain about a perfectly functioning piece of software that he/she is using for free just because it gives the odd nag screen.
Ironically, in the same time you could have purchased a licence, as I did years ago.
This website has been running for years now, with Cor giving feedback to those who have questions about his software and those requesting features. Not many commercial companies give that level of support - my point being it all costs time and money.
Hi,
I want to check that my folder was correctly copied.
Source:
I:\Pictures
destination:
J:\Pictures
In explorer, I right-click on the source and pick create checksum
In explorer, I right-click on the destination and pick verify checksum
but that is not working.
It seems that your software requires you to create the checksum files before copying the files to the destination drive.
is this right?
is there a way to use checksum after the files have been copied?
thx
For quick folder compare jobs, you can also use simple checksum.
;o)
Hello!
Thanks for a great and FREE product and support!
I'd like to keep my context menu as KISS as possible, but the setup.ini editing seems a bit daunting for a non-Windows-BAT expert. Basically I just want the most basic option of having only one extra item on my context menu (right-click on file) for calculating its MD5.
1. What would I need to change to accomplish this?
2. And what is the difference between the setup.ini and the checksum.ini files?
It would also be useful to have another screenshot of the context menu, showing what the default settings give you.
Thanks again in advance.
setup.ini is used only once, at installation time, by setup.exe. It's quite well commented. If there are any menu items you don't want, simply delete that section.
checksum.ini is used by checksum and stores all your preferences. It has nothing to do with your explorer menus which are setup only during installation.
If you prefer, you can simply copy the main exe files somewhere and skip installation, so you will have no menus at all! Then, if you like, you can add a single menu entry, like the one here, named "checksum-all-files-command-[edit+merge-me].reg"
I will add more screenshots of context menus (there is one for drives at the top of this page) the next time I'm in my virtual test machine (the context menus on my workstation are HUGE!).
;o)
Hi Cor,
First, let me tell you Cor that I simply love checksum!!!
I've been using it for some time now to check the files I receive from other people and/or torrents I download and I am truly impressed how easy it is to work with it. Thus, for some months now checksum is placed on my system in the Quick Launch menu! along with the very few icons I can place there (just to show to you how high is my opinion of checksum and how much I came to rely on it).
Now I began to distribute software to other people (both as an .ISO image and/or as a physical bootable DVD of that image) and to let people check the integrity of the software they receive I tried at first the "conventional" way (i.e, to send the checksum separately from the data it checksums). I know it seems silly but I am being bombarded with requests for "clarifications" from people who simply are too computer-ignorant and I am spending a lot of my free time answering, over and over again, how, why and when to checksum the data.
I thought about it and imagined a way to overcome this problem but I simply cannot find a way to make it stick.
The way I thought of doing the software integrity control was simply to have a simple .bat job referenced in the autorun.inf at the root of the DVD. When the person placed the DVD in a Windows environment, it would open a dialog box asking whether or not they wanted to checksum the contents of the DVD, then proceed to run the task and exit, stating the integrity of the software.
So my two parts question is:
1) is there a way to have the checksum data sit "inside" the folder that is being checksum'd (or, as in my case, to sit inside the DVD that is being checksum'd, and thus be included in the ISO image that would be distributed)?
2) is there a way to make the batch job above work so that it would not interfere with (or be integrated into) the boot process when the system is booted from the DVD (I can use either the boot.img that comes with the Windows Vista SP-2 DVD or one of the .img files from BurnAware, either DRDOS 7.02 or 7.03)?
I know that I am probably behaving in exactly the same silly fashion as the people who bombard me all the time and I apologize in advance for it but I am trying and trying and am yet to find a way to do that task.
And Cor, thank you very much for your help: it will be surely appreciated.
Mike
As to your question, yes, it's quite possible to have the checksums for the disk inside the disk. Simply create a "root" .hash file in the root directory before the disk is burned (in the original folder/mounted image/whatever you use to create your DVD). That .hash file will contain hashes for all the files on the disk.
As for actually using this .hash file, that's another story. Many people have autorun disabled, so any batch file you create and reference in your autorun.inf may fail to load, anyway. Solutions which rely on the user's system being configured a certain way are generally best avoided.
If it were me doing this (and it's not something I do), I would most likely include a readme; IMPORTANT.TXT (or similar), in the root of the drive, e.g..
1. Download checksum (https://corz.org/windows/software/checksum/#section-Download)
2. Run the installer (setup.exe)
3. Click checksum.hash
Good luck!
;o)
ps. Don't forget, you cannot legally distribute checksum in anything but its original zipped form. The latest version of checksum's license can always be found here.
[edit]Actually, I've been thinking about this, and so long as the full distribution is somewhere on your release, I'm happy enough to have checksum "go it alone", for one-click data verification purposes. I have amended the license to reflect this change, and even incorporated some features to make it easier to do![/edit]
I've just found about checksum for Windows but, whilst trying it out, I've found a bug.
With v1.5.0.0 x86 version if the hash file is hidden (using hide_checksums=true in the checksum.ini file) and a file is changed then when I verify the hash file with the 'update changed hashes' option I get the msgs "Success!" and "(1 hash was updated)" but the hash file is NOT updated. If the hash file is not hidden it works as expected (i.e. same msgs but the hash file is indeed updated).
[edit] Fixed in v1.5.1.0 [/edit]
;o)
Cor, you might find this discussion interesting: Idea: Optional fast verification using Corz Checksum. The ultra-short version is that other software ought to be using Checksum and its .hash files as a standard way of doing things. The only feature that seems to be missing (I haven't checked myself) is for it to be able to call another program after it completes.
Wow, thanks for your quick action in adding those useful new features! I have another feature request. When doing synchronize operations to add new checksums to a .hash file, I'm not always 100% confident I have done the operation correctly. It would be helpful if the "hash completed" info box thing gave a summary of what was done, in the form of counts. Then I could be confident I hashed the correct data, and synchronized the new files, just by comparing the number of files in the .hash that checksum produced.
I'd been using an older version for way too long and saw that there was an update. I LOVE the new features, especially being able to delete hashes for missing files, and differentiating between changed and corrupted files. But in my first test I noticed a bug. If a hash file has missing files that you opt to delete the hashes for, they won't be logged if they're before a non-missing file in the hash file. For instance, if the 1st, 2nd, 4th, and 5th files listed in the hash file are missing and you opt to have checksum delete the hashes for those 4 missing files, then the log file will only show that the 4th and 5th files were deleted, since they appeared in the hash file after the 3rd file which does exist.
Hopefully that's the only bug (and I'm glad it's minor) , and hopefully you're able to track it down!
I'm using the 64 bit version, and I'm registered. (Hope the chicken was delicious!)
As for the logging, none of the deleted files should show in the log, unless you have chosen to log everything.
I tried to reproduce this, with variations on your theme, and it always works as expected. Hmmm. I will need more information about the task which produced this issue.
Please EMAIL me with more details. Oh wait a minute... [continued..]
;o)
Hmm, it's trickier than that. The way I just described it would result in nothing in the log file about deleting the files. And I'm having trouble finding the pattern. Sometimes a deleted hash is logged, but more often not.
Cor, great program! I appreciate that you've got it working with the Windows 7 menus. For a quick check of a single file in different folders, it would be nice if there were an option to leave a dialog open with the checksum.
Multiple Instances?
Cor, I'm looking at using checksum to beef up my unraid server. Right now I'm generating hashes for multiple disks each containing several terabytes of data.
While checksum is busy hashing a disk it looks like I cannot ask it to do an immediate single file hash. Instead it seems to queue the request until it gets done with its current server task.
Is there any way to run multiple checksum instances simultaneously?
Thanks for an excellent utility!
paul
WOW Cor! That's the most responsive support reply ever. Thanks a million!! - paul
Corz -- First off - REALLY good app!! I've been using it for about 2 weeks, and I'm really impressed! I tried about a dozen checksum apps before finding yours, and none of them even come close.
I have a massive mp3 collection, and I am trying to develop a solid routine to check for corrupted files, both in my main storage location and also on two backup drives. First, I create regular checksums for a folder (which could have 10-200 subfolders, my albums), then I create a one-file root checksum and synchronize it with the existing checksums, so I can view the log showing all files, in addition to each individual album in the root folder. So far, so good.
But where I am having trouble is when I replace an album with a higher quality rip -- I rename the original folder (for example: album-old) then add the replacement folder, giving it the same name as the original, lower quality album. I delete the original hash file from the older album and create a new hash, and create a hash for the new folder. When I go to verify the root folder, checksum reads the one-file root checksum and tells me that the files have changed. I tried choosing create checksum, holding down the shift key, selecting "one-file root checksum", then clicking on "synchronize" when checksum finds the original, but it doesn't update the existing root hash file. The only way I have found to get the one-file root checksum hash to verify is to delete it and create a new one. This takes forever for root folders with 150 or so albums.
Is there a better way to do this? Basically, I just want to be able to verify that my backups have the same files as my main library, and it seems like using the one-file root checksum hash is the best/only way to do this. Any advice?
Okay, note: updating and deleting hashes happens only during a verify operation. So all you need to do is verify your root hash with the update option enabled* and it will happen automatically.
;o)
* hold down the <SHIFT> key at launch to bring up the one-shot options, or else add a w to your command-line. You might also want to enable the delete missing hashes option (command-line: x).
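Put together, a batch or context-menu command for that kind of maintenance verify might look like this (a sketch only; the install path and the music folder are placeholder examples):

```shell
:: Verify the root hash, updating changed hashes (w) and
:: deleting hashes for missing files (x), per the switches above.
:: Both paths here are examples - adjust for your own system.
"C:\Program Files\corz\checksum\checksum.exe" vwx "D:\Music"
```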
<<Thank you for your kind words.
Okay, note: updating and deleting hashes happens only during a verify operation. So all you need to do is verify your root hash with the update option enabled* and it will happen automatically.
;o)
>>
Thanks for the quick reply, Cor!
I knew I must have been missing something.... Been playing around with this and I'm able to do everything I need to now. Your software is a pleasure to use!
Yo, I like the way you think. And you listen to Sun Ra, too. I Love it. Keep up the good work.
Hi Cor-
Your software is a pleasure to use. I've been using it for years.
Many thanks for making it available to us commonfolk - it wouldn't be accessible to me were it not for your generosity in providing your work with an opensource license. I truly appreciate it.
Listened to your music - good on you for putting your thoughts to paper. It's rare to find someone committed enough to actually make their thoughts materialize ...it's easier to say 'I would if I had time...and it would be GOOD!'. My SON is a drummer. He, like you, can hear music in his head which needs to get out, and he can hear the imaginary instruments in the background even before he composes the parts.
Thanks - appreciate your hard work.
I hope that you're okay ...noticed you can't reply. Best wishes if you are struggling with a personal health issue or other difficulty.
-str8arrow
;o)
Hi corz
I have a question regarding the "synchronize" functionality. How does it actually help in the speed department when generating a hash file? I stopped hash generation at some point, then went back to it the next day, however I still saw the same files being processed on the popup, and some larger files (which had already been processed the day before) being up there for a big portion of the time (which would suggest those files were not skipped at all).
Maybe I am mistaken and I overlooked things, but I would appreciate if you could explain how the synchronize option exactly works in a little bit more detail than what there is in the ini file.
sincerely
Lucas
If you added a single file to a volume which had already been hashed, and re-hashed the volume; assuming there are no other changes; only one file (the new one) would get hashed.
If you look inside the .hash file (in a plain text editor) you will see the newest hashes at the end.
;o)
Hello,
I'm testing out checksum to see if it will work for my purpose. I'd like a way to automatically verify files periodically and notify me of silent corruption.
I saw that checksum can be run from the commandline. I ran it with the v switch and noticed that the commandline launched the UI asynchronously and returned immediately, therefore not giving me any output on the commandline. Once the verify was done, the result dialog popped up reporting no errors, etc. I thought maybe if I ran it in quiet mode (q switch), it would present something on the commandline, so I tried that too. That did remove the result dialog, but still nothing on the commandline. I tried redirecting (> output.txt) on the commandline and that didn't provide any text either.
Is there any way to get a "success" or "failure" as output on the commandline from checksum? If not, I'll look into the possibility of a more complicated orchestration of logging the output to the html/text file, parsing that, and notifying me of verification failures.
Thanks,
David
@echo off
start /wait C:\PROGRA~1\corz\checksum\checksum.exe vq-t "B:\Test\checksum\1000 small files"
:: -2 is checksum's exit code if there were verification errors.
if %ERRORLEVEL% equ -2 goto ErrorRoutine
:: 100% OK
exit
You can get a list of all the exit codes here.
;o)
Not a happy camper here!!
Trial version does not offer the ability to check directories - only individual files. I need to check entire directories for multiple files so I purchased the product in order to have this capability.
I have received no order confirmation and I am now expected to wait days to receive the unlock code for the product!!!
This is 2015 where email confirmation and receiving product registration keys instantaneously is common practice.
This is VERY disturbing!!! I needed the application today!
Ridiculous!
Note: the "trial" version is identical in function to the "full" version. There is a splash windoid at the start, and some cosmetics can't be customised (the about box and the preamble comment in .hash files). Either version will do what you need. It's the default behaviour!
checksum always hashes (or verifies) all the files in all the directories, digging into the entire tree, unless instructed otherwise. Perhaps you expected all the hashes to be in a single file in the root directory, but they are in each and every directory. Check for that. checksum can easily create root .hash files (with hashes for the entire tree in a single .hash file) by passing the "1" switch (no quotes) on the command-line, or by choosing that option in your one-shot options. (hold down shift key on checksum launch to get to the options)
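For instance (the target path is an example), a create command producing a single root .hash for an entire tree could look like:

```shell
:: c = create, r = recurse, 1 = one-file "root" checksum,
:: as described above. "E:\MyBackup" is a placeholder path.
checksum.exe cr1 "E:\MyBackup"
```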
Registration: if you bought a license, thank you! Your registration code was sent immediately on verified payment, to the email address used to make the payment - the system has been working well for a few years now. In almost all cases of "missing" registrations so far, a spam folder somewhere in the email chain was the culprit. Check there if you can. If not, send me details from a more accessible email address and I'll resend your registration with pleasure.
Function is identical, but the paid version has instant startup and those other minor differences, making it most suitable for corporate environments or those who like the warm fuzzy glow of paid-for, independent software.
;o)
kbucher - I'm not sure what the trial version is, but I downloaded the version here: https://corz.org/engine?section=windows&download=checksum_x64.zip
That one allows for "one file 'root' checksum" and "recurse" options, that will create one checksum file for all files, at all child levels, inside a folder. That would seem to give you what you are seeking.
David
Indeed kbucher, it's all there, and much more (the "trial" version is fully functional).
The documentation online here is also a goldmine of useful usage data.
Anything you think I've missed, or new features you want documented first, let me know here or on the feature request comments on tricks and tips page, or by email.
;o)
I love your software! I started using it in 2011 to monitor the health of files in my collection. Funny story, in retrospect anyway: I had it working great for a month, then it got really slow. Seemed like the whole stream got hung. I had it running quiet on an unattended machine from an "AT" scheduled job. A few times I picked it up trying to figure it out. Last weekend I discovered it took exactly 10 minutes per folder. When I ran it by hand I saw the "Register Me" popup. I wasn't aware of the effect of the trial expiring, so the project lay dormant until last weekend.
I have one request: the ability to specify the log folder from the command line. I know it's configurable from the INI, but I would like to keep the logs in their own folder per project; not global to the machine, and not in with the files I'm monitoring. If the command-line switch is not present, fall back to the INI value.
This might cause you a problem with your queuing mechanism. I don't use queuing if that helps.
BTW, how can I clear the queue? While experimenting with options in an automated BAT file I got thousands of jobs queued. I had to walk away for the day when I couldn't figure out how to clear it.
I look forward to experimenting with new found features like playlist creation.
I'm glad you are still in business and that I can resume my project. Years ago it was a pain finding a program to handle checksumming like I wanted it. Then when I ran into the unknown trial issue I searched for a replacement several times to no avail.
I will look into making the logging location configurable on the command-line. In the meantime, consider running checksum in portable mode, where the logs go in the program folder. You could also run multiple portable copies, each with individual settings.
;o)
ps. playlist creation has been around since almost the beginning! But there are LOADS of other new features for you to play with!
Cor,
Was wanting to know if you can change the seed value, and can you verify software on different machines by plugging into them? If so, what connection type do you prefer? And possibly, have you thought about also verifying in GAT?
Thanks,
Mark
Seed value? There is no such thing in checksum or its algorithms. You're thinking about pseudo-random number generators, perhaps.
"verify software on different machines by plugging into them" - I have no idea what that means. You can verify files on any machine whose file system you have access to.
connection type? This is outside the scope of this software. Checksum works on file systems, be they local or network. It doesn't care how they are connected, so long as they are. checksum prefers fast connexions!
verifying in GAT? Again, I have no idea what this means. No definition or connotation of the acronym/initialism that I know of has any relevance to checksum. I thought a GAT was a rifle!
;o)
I seem to be battling with excluding folders from being checksummed. I set the following in checksum.ini:
ignore_folders=M:\$AVG,M:\$RECYCLE.BIN,M:\System Volume Information,M:\Sort\uTorrent,M:\Xbox
but as I was watching the status of checksum working in the background I could see it checksumming the files in the folders mentioned above.
So what is the correct way to exclude folders? Is this better:
ignore_folders=$AVG,$RECYCLE.BIN,System Volume Information,uTorrent,Xbox
As an example I have this folder layout:
M:\Sort\uTorrent\Downloaded
M:\Sort\uTorrent\Downloading
M:\Sort\somefiles.txt
How can I checksum the files in M:\Sort but exclude ALL files/folders/subfolders in M:\Sort\uTorrent\Downloaded and M:\Sort\uTorrent\Downloading?
Thanks for any help!
In your scenario, using..
ignore_folders=uTorrent
would do the trick. Don't forget, you can also put directory exclude masks on the command-line, e.g. checksum.exe cr1x(foo*,*bar,baz*qux) "D:\MyDir"
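Applied to the M:\Sort layout above, that could be (assuming checksum.exe is on your PATH; otherwise use the full path to the program):

```shell
:: Create recursive checksums for M:\Sort, excluding any folder
:: matching the mask "uTorrent" (and so everything beneath it),
:: using the x(...) exclude syntax shown above.
checksum.exe crx(uTorrent) "M:\Sort"
```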
;o)
Email notifications!
1. Do you happen to have a template to use gmail as checksums mail server?
Mail Server: smtp.gmail.com
User: You@gmail.com
Password: YourPassword
Name: checksum
From: You@gmail.com
To: whateverAddress@whereverInbox
SMTP Port: 465
Use SSL: (checked)
2. Is there a way that we can test the email notification, to see if everything was set up correctly?
3. Could you also add an option to always get email notifications every time we run checksum, i.e. showing how many files changed and got deleted? It would be useful if we set up a Windows scheduler, just to confirm it ran properly.
DeMo! I moved your comment to my email inbox and have a nice reply, but when I go to send, I realize you didn't leave your email address! Pop me an email and we can delve into it.
People! For bug reports, especially for older versions, please use email!
;o)
Hi,
thanks for this great software!
My problem: I need to compare two MD5 lists, one generated by checksum and another from md5sum under Linux. The directory trees are different. Is this possible with checksum?
Without more details it's impossible to give worthwhile advice about the best way to tackle this.
;o)
Hello Cor,
it seems your program is what I was searching for years. A friend recommended it today and I tried it right away with very promising results under Windows 8.1 Pro.
Why do I need "Checksum"?
I have a large video collection on DVDs that I have transferred to several hard disks for easier handling. There's one root directory for every DVD which contains the subdirectories AUDIO_TS and VIDEO_TS like every DVD-Video. Until today I used Total Commander to create MD5-Checksums of the complete directory VIDEO_TS since it's not going to change. The result is e.g. \Scifi\DVD999\VIDEO_TS.md5. The checksum is placed in the same directory as the subdirectory VIDEO_TS.
While Total Commander does this fine and can even verify multiple MD5 checksums within the same directory it isn't able to verify recursively. That's why I need "Checksum". It's time consuming and boring to have to change the directories again and again and start each verify manually. Your "Checksum" is right now checking all MD5s of a 3 TB hard drive recursively in the background. What a relief! Great program!
But I have two problems I haven't found a solution for yet:
I want "Checksum" to produce hashes with the extension "md5" instead of "hash".
I need the MD5 in the root directory from where I start the creation of the checksum for the directory "VIDEO_TS". At the moment the checksum is automatically stored within the directory "VIDEO_TS".
I've searched the help here, the INI-file and preferences but didn't find out how it is done. Can you help me? I'm using the 64 bit version of the program (installed) because apart from these two problems it does everything I need right out of the box.
with kind regards
wafranyofl
For the first preference, look for the following setting in checksum.ini:
unified_extension=true
Make it false, and from then on, checksum will create hash files with .md5 (or .sha1 / .b2, depending on the algorithm used) extensions.
But before you do that, at least consider the advantages of the unified extension!
As for your second request, I'm puzzled. checksum will always create a root hash in whatever directory you start it in (a trick I often use to create hashes and playlists from multi-CD albums).
Perhaps you didn't enable the root hashing option (it's in the options dialog (hold down <SHIFT> key when you launch checksum), or put a "1" (no quotes) on the command line or context menu command).
;o)
Hello Cor!
unified_extension=false did the trick for MD5 extensions. But whatever I do, the hash of a subdirectory is written to the subdirectory itself, which is not what I want. I right-click on the "VIDEO_TS" directory, press the shift key and let checksum generate an MD5. The switches used are "qcrtb1", which is irritating because quiet_operation=false is set in checksum.ini. The resulting "VIDEO_TS.md5" is still written to the directory "VIDEO_TS" instead of the root directory from where I issued the command.
with kind regards
wafranyofl
OKAY. The only time that the "q" switch will be added by checksum (during hash creation) is if you have a queue in progress and you have apply_options_to_queue set to true (so that you don't get bugged over and over again to apply the same options). The queue file is created in your system temp directory, by the way.
As for the hash location, you need to create the hash in the directory above the VIDEO_TS directory; i.e. open Explorer to the folder where you can see the VIDEO_TS folder (and possibly AUDIO_TS folder) and right-click on an empty space in that folder (technically known as the folder "background menu"), create from there.
Or go up and right-click the parent folder directly. Currently, by clicking on the VIDEO_TS directly, you are telling checksum to create hashes inside VIDEO_TS.
Lastly, if you aren't already, please use the latest beta.
;o)
That's it! When you do things the same way for a very long time, you're so used to it that you don't see alternatives. I would never have had the idea to click into the empty space of the directory instead of on the VIDEO_TS folder. That's because I need a checksum of this folder only, and not the others like AUDIO_TS, which is usually empty on DVD-Video.
Yo corz,
where is this Scheduler Wizard You talkin' about?
"One of checksum's special startup tasks is a Scheduler Wizard.."
You can now create basic tasks for your system scheduler, to perform
checksum jobs on a schedule. Handy for hashing your morning torrent
downloads, weekly verifying your backups and much more..
A "wizard" will ask you a few questions about the task, user name, time
and so on, and then pop open the options dialog (create or verify,
depending on the task), where you can set your options for the scheduled
task.
You can also have schedule setup as your default startup command (see
below - v1.5.2.0 changes)..
startup_command=schedule
If you set that inside checksum.ini, launching checksum with nothing to do, will instead pop up the scheduler "wizard". The text seems to imply that there is another way to activate the scheduler wizard. Currently there is not.
If you are a "Power User" with an hour to give, I definitely recommend checking out the itstory; it is filled with juicy bits!
;o)
-> Thanks for the quick reply! ..searched the whole site but hadn't considered taking a peek there.
Hi, great program, but for me this program should have more hash algorithms.. SHA256 is also an often-used hash algorithm.
For example I can't check the IrfanView checksum here:
http://www.irfanview.com/
I think adding this algorithm is really necessary to make this a really powerful tool.
Best Regards
S. D.
Hi I am happy to be testing your program and have a feature request that will make it even more useful:
Context: I have several terabytes of image (photo) files and perform periodic archiving to backup disks. At times I want to test the integrity of the archive disks (against themselves, to ensure no file corruption) and also to test the active master disk against the archive.
I know I can do a top-level directory hash and verify an entire disk. I also know I can set checksum to create separate hash files in each folder. Those options are great, thanks. At times I want to test a full directory, other times a sub-directory (e.g. a year), and other times a single folder.
When I check my working directory disks against my archive disks I expect some files to change, specifically the .xml sidecar files to RAW files, and jpgs and DNG files. These all change when Adobe Lightroom (or another raw processor) writes updated metadata to non-RAW formats.
Based on the above...
Requests:
1) Root and folder checksums in one pass: to avoid re-running checksum multiple times to produce BOTH the top-level hash AND the folder-level hash files, could you enable both levels of hash file to be created in a single run, i.e. I could select both FOLDER (DEFAULT) CHECKSUM and ROOT CHECKSUM? Taking this one step further, might you consider an option for a checksum at each level of a directory tree? In my example I could then verify at root, year and folder levels.
2) Filter verification file target: even more important than #1: it would be great if one could verify only a subset of the checksums in a hash file. In other words, if I have a root checksum hash file and only want to verify RAW files, or only files from a given year or subdirectory, it would be great to have that option. This could be done with a combination of a file type filter and a directory structure check-box that would permit checking/un-checking parts of the drive directory to yield the desired levels/folders to verify. With terabytes of files and different verification needs at different times (weekly vs monthly vs yearly) this would save huge time vs verifying entire roots of all file types (or having so many different checksum hash files created for desired filters at multiple directory levels).
3) Filter/hide/sort log report: with huge drives holding hundreds of thousands of files, it would be great to be able to filter/hide/sort the verification log/report. It would also be great to be able to export this to a text or other file for deeper investigation (possibly the filtering would be easy for the user if imported into Excel, etc.). The point here is that if one gets a large number of failures (or expected changed files in an archive-to-working-drive comparison), one wants to examine the results via filters/sorting/hiding to drill down to what is important to know from the verification result; too many failure listings become meaningless because one can't digest the quantity of raw info.
Sorry for the very long request. I spent the entire day researching programs like yours, find yours to be the best, and am hoping you can incorporate these features that I found in other, less capable programs.
THanks!!!!
OK, 1. I have considered adding an option to create root hashes at every level of a tree, for exactly the reasons you specify, but decided against it at the time because of the downside (the massive time taken if someone simply decides to verify the actual root).
Creating both root and folder hashes in a single run sounds doable, but the code involved, versus how easy it is for users to create a batch script/Batch Runner job/run two jobs, means it's unlikely to be done soon, unless a) someone REALLY wants it and buys a big license, or b) I figure out an elegant and quick way to achieve this.
I've been working on ffe for the last month+ so the inner workings of checksum are like childhood memories at the moment. I'll get back to you about this when I'm back inside checksum.
2. I like the first part of this idea and have added it to my 2do list. I am referring to verifying only certain files/hashes based on a set of filters/wildcards, e.g. "*.mkv".
As for ignoring certain parts of the tree; again, this sounds like a lot of work that only one or two people would benefit from. And again, hey, if those people are generous, I'll dive into it! Besides, checksum can already do a lot of this if you utilize its ignore feature. Set up verification commands/schedules for different jobs with different ignore parameters. If you need lots of commands, check out Batch Runner.
As for 3, I'm pretty sure checksum can already output plain text logs (the HTML came later), do..
log_format=text
From there, you and your text editor can filter away to your heart's content!
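For instance, assuming the plain-text log flags problems with status words like "FAILED", "CHANGED" or "MISSING" (check the actual wording in your own logs; these are assumptions), a few lines of shell gets you most of the way:

```shell
#!/bin/sh
# filter_log: keep only the interesting lines of a plain-text verify log.
# The status words below are assumptions; adjust them to your log's wording.
filter_log() {   # usage: filter_log <logfile>
    grep -E 'FAILED|CHANGED|MISSING' "$1"
}
```

Then something like `filter_log verify.log | sort > problems.txt` gives you a sorted shortlist to work through.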
Seriously though, I like the idea of filtering the logs. What sort of things would you be looking to filter in/out?
;o)
Hello! Thank you for this amazing software. How can I create an md5 checksum file with the extension .md5 instead of .hash? Thank you
unified_extension=false
;o)
I have a library of large mp4 files on a NAS, and am trying to build a hash file on Win10.
After hashing a dozen or more files, the program crashes. I tried several times.
I looked into various things, and found a faulty lan cable. I changed the cable and found the problem goes away.
Maybe need a bit better error handling?
There isn't, AFAIK, a good automated way to simulate network errors so that people can test their software for failure modes. Unplugging the cable works.
checksum already handles disconnecting drives, ejected media and other interferences with aplomb. I'll see what I can do.
;o)
ps. you might want to look into checksum for Linux (inside the main checksum distro) and build your hashes locally.
Hi Cor -
Been using checksum for about a year now -- really loving it. Finally I have peace of mind that I am not backing up corrupted files!!!
Is there a way to tell checksum to create a log that verifies checksum did run and that no errors were found (like in the pop-up when running it manually)?
Ideally, in the log filename, and maybe the number of hashes verified in the log itself.
For example, File name: [2016-01-04] TUNES - NO ERRORS.htm
log contents: Success! no errors. All data is 100% intact. 4222 hashes verified.
What I've been doing so far is using Task Scheduler to run scripts -- first it moves any existing checksum logs from the log directory, then runs checksum, then checks for the existence of a new log (indicating errors found), then appends to a separate log whether there were errors or not. It works, but it's convoluted and kind of tedious, because I have about 35 separate scripts that I use to verify all my data. So then I need to open each of the 35 script logs and scroll to the bottom of each to see whether checksum found errors or not. Then, if errors were found, go locate the checksum log and open it to see the errors. (Yes, I'm that obsessive about my files.)
It would be great to just open the checksum log folder, check the log file names and be able to confirm that checksum did run as scheduled and whether errors were found at a glance.
If this is mentioned in the .ini file, I apologize for taking up your time reading this, but I didn't see it. If not, maybe something you could work into a future release?
BTW, I registered checksum a couple months ago -- I hope you're doing really well, because your software is really top-shelf!
Peace!
It sounds like you are creating work for yourself. checksum's normal settings work fine. If there is no log, that means the data is 100%. Job Done. Simply checking for the existence (or not) of a log, is enough. Then, only if a log is created do you need to investigate further.
Remember also, checksum will return a non-zero exit code, which you can check in your script.
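For example, a wrapper along these lines boils each job down to one summary line. This is only a sketch: the real `checksum v <dir>` call is stubbed out (via FAKE_EXIT) so the exit-code pattern is clear regardless of platform -- swap in your actual command and paths:

```shell
#!/bin/sh
# One-line-per-job summary driven by checksum's exit code.
# run_checksum is a stub standing in for the real call, e.g.:
#     checksum v "D:\TUNES"
run_checksum() {
    return "${FAKE_EXIT:-0}"   # pretend exit code; 0 = no errors
}

summarize() {   # usage: summarize <dir> <label>
    if run_checksum "$1"; then
        echo "$(date +%F) $2 - NO ERRORS"
    else
        echo "$(date +%F) $2 - ERRORS FOUND (see checksum log folder)"
    fi
}

summarize "/data/tunes" "TUNES"
```

Append each job's output to a single summary file and you can see all 35 results at a glance.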
;o)
ps. Thanks. I'm great, but as for software sales.. not so much. I think I'm currently hovering around 20p/hour for my work on checksum. Perhaps I need to re-think my "use-it-for-free" policy.
You're the man Cor!
Thanks for adding the 2 flags I asked for, namely "delete_missing_hashes" and "update_changed_hashes". The last 2 things I was missing in your checksum software!
I got one question. It's not as important a feature for me as the 2 above, but I think it would be nice to have. Would it be possible to allow changing the "shutdown_when_done" flag dynamically, per checksum process? I mean, let's say I start a checksum operation and in the middle I decide I would like to shut down the computer when it finishes. Currently (from my understanding) the setting is set in stone in checksum.ini and is unchangeable after the checksum operation starts. It would be great if I could turn that setting on/off while the checksum operation is ongoing.
Hi Cor! I've been evaluating checksum lately and find it quite useful and generally well-built. I do have some feedback which might be useful for you so I thought I would share it.
First, I managed to spot what I think are a few tiny bugs.
1) When asked to verify an individual file (checksum vi C:\test.txt), checksum forgets to look at individual hash files that include the file extension in their name (test.txt.hash), and/or at folder-level hash files which have checksum.ini-configured custom names (hashfile_name=MyHash); the result is that it reports "No matching hash could be found!", even though the respective hash entries are available in the checksum-created .hash files with the naming schemes mentioned above.
2) When asked to hash the modified files in a directory (checksum vwm C:\dir), checksum will do so; however, every entry in the .hash file, which is modified via this routine, will have "md5" as hash name in its hash header, regardless of the actual hash type (the modified hash itself is of correct type). So you might see something like this, where the hash header says it's md5, but the hash itself is blake2:
#md5#test.txt#2016.01.01@12.00:00
d74dc3ac2ac95f346fb3bd30606d214e65ca7dc76f1dd6454f5a1d94b870benc *test.txt
Currently it doesn't break anything, since checksum seems to infer hash type from the hash itself, but I suppose that is not the intended behaviour.
3) When using the allow_multiple=queue setting, checksum consistently misses some of the files which should have been added to the queue. For example, I can select 3 files and launch an individual verify operation (checksum vi file_path) via the Explorer shell menu on Windows 7, but only 2 out of the 3 files will be verified. Similarly, I can get 4 out of 7, or even 4 out of 10. Tested on a few-years-old 3+GHz Intel 4-core processor. I suspect the root of the problem is that checksum re-launches itself to process the next queued item (instead of making an internal call to the relevant part of the code), and that this interferes with adding items to the queue, if the timing is right. It may be exacerbated by the unregistered-version splash on startup (I assume the splash is removed upon registration) - I cannot immediately verify that since I'm playing with the unregistered version so far.
Second, there are a few improvements which I feel like I should suggest.
1) I saw this one mentioned in another comment and you didn't seem to appreciate it. Normally, that would stop me from bringing it up, but being involved with software engineering and UX, I really feel this would make checksum more intuitive and powerful, AND the necessary code appears to be already there...

What I'm talking about is this: for a user who opens a hash file with checksum, the reasonable expectation is that checksum will check the entries inside the hash file against the file system, and report any inconsistencies. Any files that are not listed inside the hash file are irrelevant to that procedure. That is logical and makes sense. And that is how checksum does it. Now, for a user who right-clicks a directory and selects Verify, the reasonable expectation is that checksum will check the files inside that directory against the available hash files, and report all inconsistencies - be they corrupted, missing, changed, or new (unhashed) files. That is logical and makes a lot of sense too. I imagine that when checksum is run with the cy flags it does most of what would be needed to implement this, the rest being in the code for the current verify.

Why would this be useful? If you deal with a directory which was modified since its checksums were created, you may want to check its state before updating the hashes. When doing so, you want to see whether the changes are aligned with your expectations (the missing files are the ones you deleted, the changed files are the ones you explicitly modified, the unhashed files are the ones you have added, and no corrupt files are present). This lets you update the hashes without doubt. Currently, I see no simple way to achieve this kind of verification.
2) Since the assumption behind checksum is that any file may become corrupted, unpredictably, and the function of the app is to reliably detect such corruption and inform the user - it seems reasonable for checksum to expect corruption of the hash file itself, and have at least some option to deal with it. Currently, every report about a corrupted file may mean one of two things: either the reported file had been corrupted, or the respective part in the hash file had been corrupted. Having a checksum of the hash file at the end of said hash file would solve this...
3) It's good to have a progress indicator during longer operations, but the (sometimes) very rapidly changing text in the Progress ToolTip can be quite annoying to some people. A simpler variant (e.g. with operation type & progress %) might be a good idea.
4) And this is a really tiny one. Checksum offers an option to open the log folder after it has logged something. Having the same kind of option where it would open the log FILE would be nice too.
Sorry for such a long post, I hope some of my feedback was worth reading.
In any case I want to thank you for your work on checksum, and wish you all the best!
As for your suggestions, all reasonable and already mostly under consideration. A simplified progress indicator is on the cards. Opening the log file I hadn't considered, but is now in my 2do list.
A lot of work has gone into checksum's verify routines as you know; adding the facility to check for un-hashed files will now be easier to implement and this is something I hope to do in a future version.
Hashing the hash file is a nice idea, but storing it inside the actual hash file creates its own problems. Being plain text, they often find themselves being edited inside text editors and such, which would, of course, alter the file's hash.
Also, if the hash file itself becomes corrupted, its utility is obviously greatly diminished, so knowing that it is corrupted is not a big help, other than to inform us that the following results are not to be trusted. And what if only the hash is corrupted? I briefly went down this road at the beginning of checksum's development (a decade ago) but decided it was a can of worms best left unopened.
A better approach for users who want this sort of double-protection might be to copy their .hash files to a second (backup) location. Even per-folder hashes can be easily collected up inside an archive and stored elsewhere. I should add, in all the years checksum has been around, no one has ever reported this happening!
Thanks again, good post.
;o)
I was wondering which is the best way to achieve this.
1) I have moved data from 4 hard disks to a single hard disk
2) I would like to check that the data from each disk has moved successfully to the new merged data disk
3) I don't want to do this folder by folder (we are talking about thousands of folders adding up to 4TB)
4) Ideally I don't want to write any files inside the nested folders (those are my movie folders and ideally I don't want to change anything inside those folders)
What is the best way for me to achieve this? If I do a DISK checksum for both disks and then compare them, what information will I get?
Thanks for the quick reply. Will it work also if I have already moved other data onto the new drive (data that is not on any of the original four drives)?
You said that:
"Checksum will verify only the hashes it finds inside .hash files."
I assume this will happen when I right click on the hash file for the drive I moved across to the new drive from the old drive, and I select which entry exactly?
So this is my understanding of the steps required
1) I right-click on each of the 4 old drives and select "Create checksums..", also pressing the shift key so I can ask for a single root hash to be created. I leave everything else unchanged.
2) Move the hash files created on the 4 old drives to the new drive.
3) Right-click on each .hash file and choose which option?
What is the output of step 3? A report that will tell me that all the files in the hash file for the drive were found on the new drive, and were identical?
You want to either a) right-click the DRIVE and choose "Verify checksums..", or else open the drive to the root folder and right-click in the empty space and choose the exact same option. Both actions do the same thing: verify every checksum file on the drive in one single operation. Assuming the directory structures have not changed, it will work fine.
If everything is 100%, there will be no output, other than a success dialog. If there are errors, checksum will let you know and possibly create a log file.
If you are confused, I recommend playing around with checksum on a smaller set of files until you are familiar with how it works.
;o)
Hello,
Your checksum software seems great, and I would love to try it out, but I am concerned about some of the hits it gets on VirusTotal.com, especially when I would install it on my company's servers.
Could you give us a reason why some virus scanners show it as malware, or perhaps post the lines of code that you think is tripping up the scanner?
Thanks in advance!
CRV
Load this page with all comments for discussions along these lines.
;o)
Thank you very much for this great utility! It makes generating and checking hashes a breeze. But I have an enhancement request. Could you add support for checking the longer blake2 hashes? I'm using another tool to generate blake2 hashes on my unRAID server, but sadly they are the long version of blake2. Your tool seems to generate short-version ones, and when it encounters the long ones, it thinks they are sha256.
Hi, Cor.
Does Checksum do MD5 of individual lines of text within a text file?
I'm trying to generate the MD5 of all IP addresses for a project I'm doing, and it's slow going having to calculate the MD5 checksums in batches of 100,000 at a time (I've only got 10 million done so far).
It'd be nice to just load the list of all IP addresses, let the program chew on it, and spit out a tab or comma delimited text file with IP ADDRESS[tab]MD5 or IP ADDRESS[,]MD5.
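Something along these lines is all I'm really after (a rough sketch with md5sum from coreutils, just to show the shape of output I mean; checksum itself isn't involved here):

```shell
#!/bin/sh
# Hash each LINE of a file (not the file itself) and emit "LINE<TAB>MD5".
# Uses coreutils md5sum; a loop like this is far slower than a native
# implementation would be, which is why built-in support would be nice.
md5_lines() {   # usage: md5_lines <file>
    while IFS= read -r line; do
        sum=$(printf '%s' "$line" | md5sum | cut -d' ' -f1)
        printf '%s\t%s\n' "$line" "$sum"
    done < "$1"
}
```

e.g. `md5_lines ips.txt > ips_md5.tsv` for a tab-delimited IP/MD5 listing.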
The checksum software seems like a perfect fit for my needs but I am having a hard time finding where I can configure the "Email on Fail" options. I have the "checksum one-shot verify options" window displayed & then right click the tray icon but the only options available are:
- About checksum..
- Edit Prefs (checksum.ini)
- Exit
I'm also unable to locate the Scheduler Wizard.
I'm using checksum version 1.7.0.1 on Windows 2008r2 x64.
Can you point me in the right direction?
startup_command=schedule
inside checksum.ini and launch checksum (with nothing to do).
For LOTS more tips, check out the itstory (link at the top of this page).
;o)
Thanks for the quick reply! I purchased a license & was able to set up the email-on-fail as expected. It may be helpful to others to indicate when a feature is only included in licensed versions. I saw a few others mention this in the comments, but I would love an "email on success" feature as well, without the need for full logging of all successfully hashed files. Great work & good luck to you sir!
In moving things around on FreeBSD, PCBSD, pfSense, Windows, Ubuntu, etc, I'm really wishing I had something that is at least compatible with .hash files. Maybe you know of something? I've been using MultiPar, par2+tbb, rsync, etc, and it would be so much nicer if I could just use Corz Checksum for everything.
This is a great piece of software!
So I'm using it to verify the backup of my photos, and it seems a few files have become corrupted in the backup. Easy - just recopy them! Is there a quick way of doing that with your program? Maybe a log format that produces a batch file? The quickest way I found was copying the log text into Excel, and generating batch commands from there. But that's because Excel is my go-to tool for pretty much anything. I did read through the docs, and I didn't see anything that suggested I would be able to do this.
(And yes, I probably do need a new hard drive for my backup.)
No, there isn't an in-built way to produce a batch script from the log output. I tend to use a text editor and some quick regex for that sort of thing, as there may be many variables. It's not a bad idea, though. I'll make a note.
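If Excel ever lets you down, a few lines of shell gets there too. A sketch only: the "FAILED" prefix and the /master and /backup roots are assumptions, so adapt them to your log's actual wording and your own paths:

```shell
#!/bin/sh
# Turn a plain-text verify log into a re-copy script.
# Assumed log format: failed lines start with "FAILED " followed by a
# path that is relative to both the master and backup roots.
make_recopy() {   # usage: make_recopy <logfile>
    grep '^FAILED ' "$1" | while IFS= read -r line; do
        f=${line#FAILED }   # strip the status prefix, keep the path
        printf 'cp "/master/%s" "/backup/%s"\n' "$f" "$f"
    done
}
```

Run it as `make_recopy verify.log > recopy.sh`, then eyeball recopy.sh before executing it.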
;o)
Possible small bug: with write_command_line=true, it writes all new command-lines to the same line; a new line for each command-line executed would probably be better?
This sort of thing is usually best handled by email (mail link above these comments, where it says "If you think you have found a bug, click here.").
Thanks for caring!
;o)
ps. I think you're right about the linebreaks, I'll add it to the list.
Hi Cor
Was just going to post a comment stating that an error occurred, and realized I was in utnubu...
Cheers.
Every time I run Verify, the log shows several files as CHANGED. I re-run Create and then Verify, and the log shows the same files as CHANGED. Are these files being changed daily (somehow)? I'd like to make new hashes for the changed files... why does it continue to show the files as changed in the logs? I understand I can suppress the reporting of CHANGED files, but I'd like to know if there were changes since the most recent checksum. I don't care about historical changes. I create a new hash daily and then test, and every day I'm seeing CHANGED in the log for the same subset of files. I do not believe that the files are being changed every day. Are they? Or does the log show historical changes? Can I suppress the re-logging of historical changes? I'd prefer no output if the file is unchanged and correct since the most recent checksum; the daily indication of CHANGED is confusing.
Thanks,
David
I've had similar questions before, and the problem was one of two things: either they hadn't deleted or updated the old hashes at all, but had instead synched the hashes (which adds new hashes to an old .hash file) so that the original (CHANGED) hashes were untouched; or else the files really had changed.
Historical changes aren't logged (how could checksum know?!). However, logs can be appended to, so old log data could be in your log, depending on your personal settings. Confirm by adding a date @token to your log name.
Without knowing exactly what files you are hashing and what commands you are using, it's impossible to say exactly what your issue is. It would be best to mail me about this, including the command-line used, a copy of your .hash and log files, system details and such.
;o)
Hi Cor,
So I tried checksum in command line and I quite like it. That said, I do have few questions about it,
1. How to exclude certain file extensions during checksum creation? For instance say I want to create a root checksum and exclude any .txt and .ini files in the directory.
2. How to specify log output directory in command line? Also any way to force a text log rather than HTML?
3. Any plans to make it compatible with b2sum in GNU coreutils? This means,
* Add support for Blake2b (b2sum in coreutils doesn't support Blake2s)
* Output checksum without UTF-8 BOM (coreutils rely on auto detection)
* Using Linux line ending format (LF), as opposed to Windows line ending format (CRLF).
2. Again, this is set inside checksum.ini (as is plain text logging). You can also set this on a per-job basis with the options dialog.
3. No to Blake2b, it already does the rest.
For more information, see here.
;o)
Are there any plans to update Checksum to include more recent and complex hashing algorithms? Many download sites these days are providing SHA256 checksums for their products, making the current version of Checksum useless for download verification. There are other free utilities available that will do SHA256, but it would be nice to be able to stay with the product that I've known and supported for years.
Thanks for caring.
;o)
thank you for your work & maintenance of Checksum - we have used it for a long time with great success.
recently I've noticed on our audits that the process of creating a hash for a folder involves a file just called '[' briefly being created & then deleted. do you have any idea why this is? (note for HTML purposes; character is a single square bracket).
I really like this program so far. Do you have any plans to make it possible to add a standalone executable to verify a hash specifically? (For example, you generate the hash, and it outputs a single exe file, which when run verifies the hash automatically.) Thanks!
Hi Cor,
Just a quick note to say hi. I sent you a note many moons ago, but my health got even worse since then.
Then more gunk showed up and I was off the air for some time; this is my first eMail in many years.
Anyway, this note is just to let you know that I am still around (Well just about) but seeing you are still on line Cheers me up a lot.
I am hoping that whatever has gotten in your way will remove itself and you will end up doing better than ever.
Anyway I will say goodbye and wish you all the best.
( I don't have any eMail so am delighted I can contact you through this mail, Thank You. )
Hi Cor,
I have just tried Checksum. I am happy to report that I could hash several hundred GB (several hundred thousand files) within a few hours.
I have noted a few things (please correct me if I'm wrong):
1- It doesn't prevent the computer from sleeping while hashing or verifying.
2- I have tried a few times with the x switch to delete hashes for missing/removed files, but it didn't seem to work. Maybe I didn't understand how to type the command correctly (I found the explanation a bit ambiguous). I edited the .ini and then it worked fine.
3- There is no way to verify only files with the same timestamp. Maybe that could be useful for silent corruption checking(?).
I used Checksum to check for silent corruption. My procedure was as follows:
a- Create a checksum of the root of the folder (crtb1)
b- After some time, verify checksums with update of modified files (w) and removal of deleted files (x) in the hash file.
=> Any corrupted files should pop up here.
c- (Recover corrupted files with backups if necessary) Back up and/or sync data known to be good.
d- Create again a new checksum of the root to include new files.
Then loop back to step b.
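In script form, my loop looks something like this. The switch letters (crtb1, v, w, x) are the ones mentioned above, so please double-check them against the docs; CHECKSUM defaults to merely echoing the commands, which makes this safe to run as-is:

```shell
#!/bin/sh
# The a-to-d routine as a script sketch. By default CHECKSUM just echoes
# the commands; set CHECKSUM=checksum to actually run them.
CHECKSUM="${CHECKSUM:-echo checksum}"

verify_pass() {   # steps b-d, intended for a scheduled run
    $CHECKSUM vwx "$1"     # b) verify, update changed (w), prune missing (x)
    # c) restore any corrupt files from backup here, then sync/back up...
    $CHECKSUM crtb1 "$1"   # d) re-create the root hash to pick up new files
}

$CHECKSUM crtb1 /data      # a) one-time initial root hash
verify_pass /data          # then repeat this on a schedule (loop to b)
```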
Does that seem sensible? Any suggestion?
Many thanks!
Cheers,
Do you plan to support BLAKE3?
Are there any plans to support SHA-256?
Thank you for this great program (checksum)! However, I generated MD5 hashes recursively (default config) by right clicking on a folder, and that worked. But then I right clicked the folder to run a "verify" and it gave me errors that stated:
"(BLAKE) MISSING" for all files in the subfolder.
Also in the log file, the file paths showed up as:
"D:\Test 1\\Test 1\Executable\Setup.exe"
When they are files in folder "D:\Test 1\Executable\"
Hi Cor,
fellow dev here. I love checksum and I haven't found anything better to be honest. But now I have a feature request:
Let's say I allow multiple instances and I'm using a fast M.2 SSD with multiple gigabytes of transfer speed.
Running one instance of checksum will hit the cpu limit with reading speeds of nearly 800 MB/s.
Now, I can mark two folders and run verify which will run two instances, doubling the reading speed.
But what if I have many folders and I'd like to use the full speed of the M.2 SSD and the cores of the CPU? I can't click on 50 folders and run 50 instances of checksum.
I had two solutions in mind:
One) allow multiple instances but always queue multiple selected folders -> This way I could select one half of all the folders and click verify which will run one instance of checksum with a queue of the selected folders. When I do that again, another instance with another queue starts
Two) MultiInstance with Queue: I can set the number of maximum instances that will be created in the ini file with their own queue -> Let's say I set it to 4 and I select 40 Folders -> 4 instances that check 10 folders each will be created
Of course, none of these solutions will make use of the M.2 SSD and the Multicore CPU in the best possible way. For that you'd have to allow X amount of instances that share one queue. Would that even be possible? Or could you use multithreading within one instance?
Kind Regards
Alex
PS: Feel free to email me. I hope I'll hear from you.
Would be great to implement MHL hash-list checking as well. This has become the de facto standard in parts of the cinema industry. This tool, as is, works great and is basically perfect at what it does. MHL with XXHASH/MD5/SHA options would be a great addition to have. For now, the existing tool called SealVerify from Pomfort does mostly what is needed, and everything else can be had by running the command line (generate MHL), but the way the Corz tool works just makes so much sense.