# checksum tricks and tips

hints, secrets, behaviours, assumptions and more..

## Get the most from your hashing!

checksum represents a whole new way of working with hashes. This page aims to help you get the most out of the experience, wherever you're at..

## Absolute beginners..

The basics: checksum creates "hash files". A hash file is a simple, plain text file containing one or more file hashes, aka. "checksums". Hashes are small strings which uniquely represent the data that was hashed. e.g..

cf88430390b98416d1fb415baa494dce *08. Allow Your Mind To Wander.mp3

(Mike Mainieri - Journey Thru An Electric Tube [1968] - I have the vinyl)

If you want to know more about the algorithms that checksum uses to hash files (MD5, SHA1 and BLAKE2), see here.

Once these hashes have been created for a particular file or folder (or disk), you have a snapshot that can be used, at any time in the future, to verify that not one bit of data has changed. And I do mean a "bit"; even the slightest change in the data will, thanks to the avalanche effect, produce a wildly different hash, which is what makes these algorithms so good for data verification.
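The avalanche effect is easy to see for yourself. Here is a quick Python sketch (purely illustrative; the messages and file name are made up, and Python's standard hashlib stands in for checksum's hashing engine)..

```python
import hashlib

m1 = b"Hello, world"
m2 = b"Iello, world"  # 'H' (0x48) vs 'I' (0x49): a single flipped bit

h1 = hashlib.md5(m1).hexdigest()
h2 = hashlib.md5(m2).hexdigest()

print(h1)  # two wildly different 32-digit hex strings..
print(h2)

# a hash file line is simply: digest, space, asterisk, file name
print(f"{h1} *some-file.dat")
```

Flip any single bit of the input and the entire digest changes, which is what makes a stored hash such a reliable snapshot.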

Most people will simply install checksum, and then use the Explorer context (right-click) menu to create and verify checksums, rarely needing any of the "extra" functionality that lurks beneath checksum's simple exterior. After all, checksum is designed to save you time, as well as aid peace of mind. This is how I mostly use it, too..

### Create checksums..

Right-click a file, the checksum option produces a hash file (aka. 'checksum file') with the same name as the file you clicked, except with a .hash extension (or .md5/.sha1, if you use those, instead). So a checksum of some-movie.avi would be created, named some-movie.hash (if you don't use the unified .hash extension, your file would instead be named some-movie.md5 or some-movie.sha1, depending on the algorithm used).

Right-click a folder, the Create checksums.. option will produce a hash file in that folder, containing checksums for all the files in the folder (and so on, inside any interior folders), named after the folder(s), again, with a .hash extension, e.g. somefolder.hash

### Verify checksums..

Click (left-click) a hash file (or right-click and choose Verify this checksum file..), checksum immediately verifies all the hashes (.hash/.md5/.sha1) contained within.

Right-click a folder, the Verify checksums.. option instructs checksum to scan the directory and immediately verify any hash files contained within.

That's about it, and this simple usage is fine for most situations. But occasionally we need more..

## checksum launch modifiers..

When you launch checksum, you can modify its default behaviour in two important ways.

The first modifier is the <SHIFT> key. Hold it down when checksum launches, and you pop up the one-shot options dialog, which enables you to change lots of other things about what checksum does next. This works with both create and verify tasks, from Explorer menus or drag-and-drop. Here's what the one-shot create dialog looks like..

In there, as you can see, you can set all sorts of things. Hover over any control to get a Help ToolTip (you might need to repeat that to read the entire tip!). You can also drag files and folders directly onto that dialog, if you want to alter the path setting without typing. Same for the verify options.

The file mask: input is, by default, *.*, which means "All files", "*" being a wildcard, which matches any number of any characters. You can have multiple types, too, separated by commas. For example, if you wanted to hash only PNG files, you would use *.png; if you wanted to hash only text files beginning with "2008", you could use 2008*.txt, and so on.

If you click the drop-down button to the right of the input, you can access your pre-defined file groups, ready-for-use (you can easily add to/edit these in your checksum.ini)..
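The mask syntax is standard wildcard matching. If you want to experiment with the rules outside checksum, Python's fnmatch module (standard library) follows the same style - a rough illustration, not checksum's own matcher..

```python
import fnmatch

files = ["cover.png", "notes.txt", "2008-01 report.txt", "track01.mp3"]

png_only = fnmatch.filter(files, "*.png")       # just the PNG
from_2008 = fnmatch.filter(files, "2008*.txt")  # .txt names beginning "2008"

print(png_only)
print(from_2008)
```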

NOTE: Normally one drops folders into checksum's create options (path input). If you drop a file onto the create options, the path is inserted into the path: input, and the file name is added to the mask input - it is also inserted into the drop-down list in case you need to get back to it.

If you drop multiple files into the create options, checksum will create a custom file mask from your selection. For example, if you dropped a .jpg, a .txt and a .pdf file, checksum would create the mask: "*.jpg,*.txt,*.pdf".
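That mask-building step is easy to sketch. Something like this (a guess at the logic from the documented behaviour, not checksum's actual code; the file names are placeholders)..

```python
from pathlib import Path

def build_mask(dropped):
    # derive a combined file mask from a list of dropped file names
    masks = []
    for name in dropped:
        mask = "*" + Path(name).suffix  # '.jpg' -> '*.jpg'
        if mask not in masks:           # skip duplicates, keep drop order
            masks.append(mask)
    return ",".join(masks)

print(build_mask(["hasher.jpg", "notes.txt", "security.pdf"]))
# -> *.jpg,*.txt,*.pdf
```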

Here is what the one-shot verify options dialog looks like..

The second modifier is the <Ctrl> key. Hold it down when checksum launches and you force checksum into verify mode, that is to say, no matter what type of file it was, you instruct checksum to treat it as a hash file, and verify it. This works with drag-and-drop too, onto checksum itself, or a shortcut to checksum. checksum's default drag-and-drop action is to create hashes.

Amongst other things, this is useful for verifying folders in portable mode, simply Ctrl+drag-and-drop the folder directly onto checksum (or a shortcut to it), and all its hashes will be immediately verified.

Hit the modifier key as soon as checksum launches, in other words, hit the <SHIFT>/<Ctrl> key right after you choose the Explorer menu item, or before you let go of a drag & drop, and so on. Hold the key down until checksum appears a moment later.

## batch processing..

### hashDROP: a batch-processing front-end for checksum..

Because checksum can be controlled by command-line switches, it's possible to create all sorts of interesting front-ends for it. The first of these to come to my attention, is a neat wee application called "hashDROP", which enables you to run big batches of jobs through checksum, using a single set of customizable command-line switches.

As developer seVen explains on the hashDROP page..

Actually, that looks to be down right now. I've put a temporary copy of the page and download here.

hashDROP is a front-end for checksum which enables you to queue a bunch of jobs (files/folders) and then pass them all through checksum with your own custom switches in one batch process.

On seVen's desktop, at least, it looks something like this..

### Batch Runner: run multiple programs in a batch..

I originally designed Batch Runner to run a big batch of tests on checksum before release, but it has since proven useful for other tasks, so I spruced it up a bit and made it available.

If you want to run loads of hashing jobs using the same switches, hashDROP is probably more useful to you. But if you want to run lots of checksum jobs with different switches, or as part of a larger batch of jobs involving other programs, then check out Batch Runner.

Batches can be saved, selected from a drop-down, run from the command-line, even from inside other batches, so it's handy for repetitive scheduled tasks, or application test suites, as well as general batch duties. At least on my desktop, it looks like this..

## Automatic Music playlists..

Perhaps checksum's second most common extra usage is making music playlists. After you have ripped an album, you will most likely want a playlist along with your checksums, so why not do both at once? checksum can.

Right-click a folder and SHIFT-Select the checksum option (which pops up the one-shot options dialog), check either the Winamp playlists (.m3u/.m3u8) or shoutcast playlists (.pls) option, and then do it now! You're done.

By default, checksum will also recurse (dig) into other folders inside the root (top) folder. Now you've got music playlists that you can click to play the whole album in your media player.

Note that checksum will thoughtfully switch your file masks to your current music group when you select a playlists option, reckoning that you'll probably only want to actually hash the music files, not associated images, info files and such, but it's easy enough to switch it back to *.* (hash all files) if you need that. The rationale behind this being that it's what most people want, so the majority get the simpler, two-click task.

If you do this sort of thing a lot, check out the next section, for how to put this functionality directly into your Explorer context menu, and skip the dialog altogether..

On the subject of music files, you may encounter a lot of these, and fancy creating a custom explorer right-click command along the lines of "checksum all music files", or something like that. No problem; you can simply create a new command in the registry, add the "m" switch and your file masks, right?

But what if you change your file masks? Perhaps add a new music file type? Do you have to go and change your registry again? NO! checksum has it covered. Instead of specifying individual file masks, use your group name in the command, e.g. m(music) and checksum applies all the file masks from that group automatically, so your custom command is always up-to-date with your latest preferences.

Here's an example registry command that would do exactly that. Copy and paste into an empty plain text file, save as something.reg, and merge it into your registry. If you installed checksum in a different location, edit the path to checksum before you merge it into the registry (not forgetting to escape all path backslashes - in other words, double them)..

Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\Directory\shell\01b.checksum music]
@="Checksum &MUSIC files.."

[HKEY_CLASSES_ROOT\Directory\shell\01b.checksum music\command]
@="\"C:\\Program Files\\corz\\checksum\\checksum.exe\" crm(music) \"%1\""

NOTE! If you add a "3" to the switches [i.e. make them c3rm(music)] you'll get media player album playlist files created automatically along with the checksum files. Groovy! Here's one I prepared earlier.

### Setting new default Explorer context actions..

You can also change checksum's default Explorer commands, as well as add new commands, without going anywhere near the registry. Simply edit the installer's setup.ini file, [keys] section. For example, to always bring up the one-shot options dialog when creating checksums on folders and drives, you would add an "o" to those two commands..

HKCR\Directory\shell\01.|name|\command="|InstalledApp|" cor "%1"
HKCR\Drive\shell\01.|name|\command="|InstalledApp|" cor "%1"
Then run checksum's installer (setup.exe), and install/reinstall checksum with the new options. From then on, any time you select the "Create checksums.." Explorer context menu item, you will get the one-shot options dialog. If you would prefer to synchronize hashes under all circumstances, add a y, and so on.

## Creating checksums "quietly"..

If you want to script or schedule your hashing tasks, you will probably want checksum to run without any dialogs, notifications and so forth. If so, add the Quiet switch.. q

When the q option is used alone, if checksum encounters existing hash files, it continues as if you had clicked "No to All", in the "checksum file exists!" dialog, so no existing files are altered in any way. This is the safest default.

If you would prefer checksum to act as if you had clicked "Yes to all", instead, use q+, and any existing checksums will be overwritten.

If you want synchronization, add a y switch (it can be anywhere in the switches, so long as it's in there somewhere, but qy is just fine)

Quiet operation also works for verification; failures are logged, as usual. Like most of checksum's command-line switches, these behaviours can be set permanently, in your checksum.ini.

## Working with Cross-Platform hashes..

checksum has a number of features designed to make your cross-platform, inter-networking life a bit easier.

You don't have to do anything special to verify hash files created on Linux, UNIX, Solaris, Mac, or indeed any other major computing platform; checksum can handle these out-of-the-box.

If you need to create hash files for use on other platforms, perhaps with some particular system file verification tool, checksum has a few preferences which might help..

You will perhaps appreciate checksum's plain text ini file (checksum.ini) containing all the permanent preferences. Inside there you can set not only which Line Feeds checksum uses in its files (Windows, UNIX, or Mac), but also enable UTF-8 files, single-file "root" hashing, generic hash file naming, UNIX file paths, and more. Lob checksum.ini into your favourite text editor and have a scroll.
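Most cross-platform friction comes down to line endings and encoding. For the curious, here's what the three Line Feed settings mean for a single hash line (plain Python, purely illustrative)..

```python
line = "cf88430390b98416d1fb415baa494dce *08. Allow Your Mind To Wander.mp3"

windows = (line + "\r\n").encode("utf-8")  # CR+LF line ending
unix    = (line + "\n").encode("utf-8")    # LF only
mac     = (line + "\r").encode("utf-8")    # classic Mac CR

# same content, three different byte sequences on disk
print(windows[-2:], unix[-1:], mac[-1:])
```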

There are a number of ways to run checksum. One handy way, especially if you are running checksum in portable mode without Explorer menus, is to keep a shortcut to checksum in the SendTo menu.

Simply put; any regular file or folder sent to checksum will be immediately hashed. Send a checksum file (.hash, .md5, .sha1, plus whatever UNIX hash files you have set), and it will be immediately verified. If you want extra options, hold down the <SHIFT> key, as usual.

If you want to send a non-checksum file, but have checksum treat it as a checksum file, hold down the <Ctrl> key during launch, to force checksum into verify mode (either just after you activate the SendTo item, or perhaps easier; hold down <SHIFT> AND <Ctrl> together while you click, to bring up the one-shot verify options). This is also handy for verifying folder hashes.

## How to accurately compare two folders or disks..

This is an easy one. First, create a "root" hash in the root of the first (source) folder/disk, then copy the .hash file over to the second (target) folder and click it.

That's it!

For situations where you don't need a permanent record of the hashes, you can fully automate the process of comparing two folders with simple checksum.
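For the curious, the principle behind a root-hash comparison looks roughly like this in Python - a sketch of the idea only (relative paths plus one digest per file), not checksum's implementation..

```python
import hashlib
from pathlib import Path

def folder_digests(root):
    """Relative path -> MD5 digest for every file under root -
    roughly the information a single 'root' .hash file records."""
    root = Path(root)
    return {str(p.relative_to(root)): hashlib.md5(p.read_bytes()).hexdigest()
            for p in sorted(root.rglob("*")) if p.is_file()}

def compare(source, target):
    """Like copying the source's .hash file into the target and clicking it."""
    src, dst = folder_digests(source), folder_digests(target)
    missing = sorted(f for f in src if f not in dst)
    changed = sorted(f for f in src if f in dst and src[f] != dst[f])
    return missing, changed
```

compare() reports roughly what checksum would flag as MISSING and CHANGED; files that exist only on the target side pass unnoticed, just as with a copied hash file.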

## How to accurately compare two CDs, DVDs, etc. (even when they don't have hash files on them)..

When hashing read-only media, obviously we can't store the hash files on the disk itself. However, thanks to checksum's range of intelligent read-only fall-back strategies, you can make light work of comparing read-only disks with super-accurate MD5 or SHA1 hashes, even if those disks were burned without hashes.

All we need to do, is ask checksum to create a "root" hash file of the original disk, using the "Absolute paths" option. This will produce a hash file containing hashes for the entire disk, with full, absolute paths, e.g..

531a3ce6b631bb0048508d872fb1d72f *D:\Sweet.rar
558e40b6996e8a35db668011394cb390 *D:\Backups\Efficient.cab
832e98561d3fe5464b45ce67d4007c11 *D:\Sales Reports\April.zip

There are a few ways to achieve this. For one-off jobs, you can simply add k1 to your usual command-line switches. For example, to create a recursive root hash file of a disk, with absolute paths, you would use crk1.

Another way, is to set (and forget) checksum's fallback_level preference to 2, inside checksum.ini..

fallback_level=2

With fallback_level=2, when checksum is asked to create hashes of a read-only volume, it will fall-back to creating a single "root" hash with absolute paths, inside your fall-back location (also configurable), which is exactly what we need!

Then in the future, to verify the original disk, or copies of the disk; you simply insert it, and click the hash file.

You can store the .hash file anywhere you like; so long as the disk is always at D:\, or whatever drive letter you used to create the .hash file originally, it will continue to function perfectly.

If you want to know more about checksum's read-only fall-back strategies, see here.

## Or accurately compare a burned disk to its original .iso hashes..

If you have a .hash file of the original .iso file, in theory, a future rip of the disk to ISO format should produce an .iso file with the exact same checksum as the original. My burner is getting old, but I needed to know, and so tested the theory.

I Torrented an .iso file of a DVD (Linux Distro) - the hashes were published onsite, checksum verified these were correct. I burned the disk to a blank DVD-R, and then deleted the original .iso file. Everything is now on the disk only. The .hash file is still on my desktop.

Then I used the ever-wonderful ImgBurn, to read the DVD to a temporary .iso file on my desktop.

Fortunately, the new .iso file and the original had the same name, so I didn't need to edit the .hash file in any way. Then the moment of truth. I click the .hash file, and checksum spins into action, verifying. A few seconds later... Beep-Beep! No Errors! It's a perfect match!

I can't speak for other software, but with ImgBurn at least, a burned disk can produce an .iso file with a hash bit-perfectly identical to that of the original .iso file used to create the disk, and can be relied upon for data verification. Good to know.

## checksum as an installer watcher..

Because checksum can so accurately inform you of changes in files, it can function as an excellent ad-hoc installer watcher. All you do is create a root checksum for the area you would like to watch. Run the installer. And afterwards, verify the checksum. If anything has changed, checksum will let you know about it, with the option to log the list to funky XHTML or plain text.

checksum can be utilized in any situation where you need to know about changed files. You can even use it to compare registry files, one exported before, the other after. If the hashes match, there's no need to investigate further.
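The registry-file trick boils down to hashing both exports and comparing digests. A trivial sketch (Python's hashlib SHA-1 standing in for checksum)..

```python
import hashlib
from pathlib import Path

def same_content(before, after):
    # True if the two files are bit-for-bit identical, judged by SHA1 digest
    digest = lambda p: hashlib.sha1(Path(p).read_bytes()).hexdigest()
    return digest(before) == digest(after)
```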

### checksum as an upgrade helper..

I have more than once used checksum to help me with upgrades. Imagine the scenario.. Your local zwamp installation keeps bugging you to update. But you put it off because you modified a few files but don't remember exactly which files. No need! Let checksum take care of it..

Hash the installation folder, using a single-file "root" hash.

Copy the .hash file to the root (same folder) of a fresh copy of zwamp and click it.

Voila! A list of modified files. Now upgrading is easy.

## checksum's custom command-line switches..

Click & Go! is the usual way to operate checksum; but checksum also contains a lot of special functionality, accessed by the use of "switches"; meaningful letter combinations which instruct checksum to alter its operation in some way.

If you have some unusual task to accomplish, the one-shot options dialog enables you to manipulate the most common switches with simple checkbox controls. You can see the current switches in a readout, updating dynamically as you check and uncheck each option. But this output is also an input, where you can manipulate the switches directly, if you wish. If that is the case, you will probably find the following reference useful.

You may also find this section useful if you are constructing a full checksum command-line for some reason, maybe a Batch Runner command or batch script, or custom front-end for checksum, or altering your explorer context menu, or creating a scheduled task, or uTorrent finished command*, or some Übertask for your Windows Run command (Win+R) or something else. In each case, switches are placed before the file/folder path, for example; the full command-line to verify a checksum file might look like this..

"C:\Program Files\corz\checksum\checksum.exe" v "C:\path\to\file.hash"

Here are all the currently available switches:

c
Create checksums.
v
Verify checksums.
r
Recurse (only for directories).
y
Synchronize (add any new file hashes to existing checksum files).
i
During creation: create individual hash files (one per file).

During verification: perform a hash search for an individual file (see examples, below).

s
Create SHA1 checksums (default is to create MD5 checksums).
2
Create BLAKE2 checksums (default is to create MD5 checksums).
u
UPPERCASE hashes (default is lowercase).
m

During Creation: File masks. Put these in brackets after the m. e.g..  m(*.avi,*.rm)
Note: You can use your file groups here, e.g. m(music)

During verification: Check Only Modified Files.

Perform operations only on files with a modified timestamp more recent than their recorded timestamp. This is generally used along with the "w" switch, to update the hash and timestamp of a file or set of files you have mindfully changed, whilst skipping bit-checking all other files, potentially saving a lot of disk access and massive amounts of time, especially over network links.

Note: This feature is currently marked as "experimental".

j
Custom hash name (think: "John"!). Put this in brackets after the j. e.g..  j(my-hashes)
d

During Creation: Output Directory. Put this in brackets after the d. e.g..  d(C:\hashes)
NOTE: Make this the last bracketed switch on your command-line, i.e. m(..)j(..)d(..).

During verification: Override Root Directory. *ßeta Only

Using the d switch, you can specify a new root directory for relative hash files, enabling them to be checked outside their original location.

For example, if you created a relative .hash file for files in "D:\my files" and put the hashes in a folder, "e:\my hashes" using a command-line something like..

crd("e:\my hashes") "D:\my files"

You can now verify this .hash file ("e:\my hashes\my files.hash") IN-PLACE using similar syntax:

vrd("D:\my files") "e:\my hashes"

e
Add file extensions to checksum file name (for individual file hashes)..
1
Create one-file "root" checksums, like Linux CDs often have.
3
Create .m3u/.m3u8 playlists for all music files encountered (only for folder hashing)..
p
Create .pls playlists for all music files encountered (only for folder hashing)..
q
Quiet operation, no dialogs (for scripting checksum - see help for other options)..
h
Hide checksum files (equivalent to 'attrib +h').
o
One-shot Options. Brings up a dialog where you can select extra options for a job.
(to pop up the options at run-time, hold down the <SHIFT> key at launch)
b
Beeps. Enable audio alerts (PC speaker beeps or WAV files).
t
ToolTip. Enable the progress ToolTip windoid.
n
No Pause. Normally checksum pauses on completion so you can see the status. This disables it.
(note: you can also set the length of the pause, in your prefs)
k
Absolute Paths. Record the absolute path inside the (root) checksum file.
Use this only if you are ABSOLUTELY sure the drive letter isn't going to change in the future..
f
Log to a file
(if there are failures, checksum always gives you the option to log them to a file)
g
Go to errors.
If a log was created (e.g. there were errors), open the log folder on task completion.
l
Log everything.
(the default is to only log failures, if any).
w
Update changed hashes. (think: reWrite)
(during verification, hashes and timestamps for "CHANGED" files can be updated in your .hash file).

USE WITH CAUTION ON VOLUMES YOU KNOW TO BE GOOD!

x
During creation: used to specify ignored directories, using standard file masks.
e.g. checksum.exe cr1x(foo*,*bar,baz*qux) "D:\MyDir"

During verification: delete missing hashes.
(hashes for "MISSING" files are removed from your .hash file).

a
Only verify these checksum files.
(followed by algorithm letter: am for MD5, as for SHA1, a2 for BLAKE2 - see example below).
z
Shutdown when done.
Handy for long operations on desktop systems. A dialog will appear for 60 seconds, enabling you to abort the process, if required.

The 'a', 'f', 'g', 'l' and 'w' switches take effect when verifying hashes.

The '1', '2', '3', 'e', 'h', 'j', 'k', 'm', 'p', 's', 'u', and 'y' switches take effect when creating hashes.

The 'd', 'i' and 'x' switches have different functions for creation and verification.

In other words..

global switches = b, n, o, q, r, t, z.
creation switches = 1, 2, 3, c, d, e, h, i, j, k, m, p, s, u, x, y.
verify switches = a(m/s/2), d, f, g, i, l, v, w, x.

Switches can be combined, like this..

… checksum.exe v "C:\my long path\to\files.md5"
[ note 'long' path (with spaces) enclosed in "quotes" ]
… checksum.exe cim(movies) d:\movies
[ create individual checksums for all my movie files - note use of group name ]
… checksum.exe vas c:\archives
[ check all *.sha1 files in the path, not *.md5 files ]
… checksum.exe c3rm(music) p:\audio
[ recursive music file checksum creation, with automatic playlists ]
… checksum.exe cr1m(*.zip) d:\
[ create a "root" checksum for all zip files on drive D: ]
… checksum.exe vi d:\path\to\some\video file.avi
[ search for matching entry for "d:\path\to\some\video file.avi" and verify that one file ]
… checksum.exe crkq1m(movies)j(video-hashes)d(@desktop) v:\
[ quietly create a "root" checksum (named "video-hashes.hash") for all movie files on drive V: and place it on the desktop ]

note: Although it won't appear in the options dialog, the custom name, "video-hashes", will still be set when you begin the job.

* A useful uTorrent command..
[Prefs >> Advanced >> Run Program (when a torrent finishes)]..

"C:\Program Files\corz\checksum\checksum.exe" cq "%D\%F"

### notes:

• The order of the switches isn't important, though the "m" switch must always be immediately followed by the file masks (in brackets), e.g. m(*.avi) (same with d(dir), j(name) and x(file*mask) switches), and the "a" switch must be the first letter of a two-letter combination, e.g. am

• The d() (output directory/root directory) switch must be the last (bracketed) switch on your command-line. If you are using a custom hash file name and/or custom file groups, put those first, e.g..

"c:\path to\checksum.exe" cr1qnm(movies)j(my-hashes)d("c:\some (dir) here") "D:\My Movies"

..which would quietly create a root hash file for all the movie files in D:\My Movies, and put those hashes in "c:\some (dir) here\my-hashes.hash". Those tricky brackets inside the example path are why it goes last.

• You don't need to specify the m(music) group switch to create playlists, only the 3. A command like checksum.exe rc3 "P:\audio" would create checksums recursively for all files in the path, whilst creating playlists for only the music file types. Nifty, huh?

• Most of these switches also have a preference inside checksum.ini. If that preference is enabled, you can disable it temporarily by prepending the switch with a - (minus) character, e.g. to disable the Progress ToolTip, use -t

• Any of these switches can be easily added permanently to your Explorer right-click (context) menus. For instance, you may like to always use the one-shot options, without having to hold the SHIFT key every time. So simply add an o, and it will be so. See here for details of how to permanently alter checksum's Explorer context menu commands.
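The d()-must-go-last rule above is easier to see with a toy parser. This is an assumption about the reasoning, not checksum's actual code: earlier bracketed switches stop at the first closing bracket, while the final directory argument can be read greedily to the last bracket, so brackets inside the path do no harm..

```python
import re

def parse_switches(s):
    """Toy parser: m() stops at the first ')', d() runs to the last ')'.
    That is why d(), which may itself contain brackets, must come last."""
    m_arg = re.search(r"m\(([^)]*)\)", s)  # first closing bracket ends m()
    d_arg = re.search(r"d\((.*)\)", s)     # greedy: last closing bracket ends d()
    return ((m_arg.group(1) if m_arg else None),
            (d_arg.group(1) if d_arg else None))

print(parse_switches(r"cr1qnm(movies)d(c:\some (dir) here)"))
```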

And remember, if there's some specific behaviour that you want set permanently, you can do that, and a lot more, inside checksum.ini..

## checksum.ini: working with checksum's UNIX-style preference file..

checksum has a lot of available options. Here is a page that will help you get the most out of them.

## I do requests!

If there's something you would like to accomplish with checksum, but don't think checksum can; feel free to request a feature, below..

## Request a feature!

wraithdu - 19.02.09 5:14 pm

Hi there again. I have a request for your next version. Allow checksums to be verified against another directory. Scenario: I copied a large amount of files from a network share, I've hashed my local files, now I want to compare the hashes to the network share to verify my download. Currently a user has to copy the HASH file to the network location, or hash the network files and copy the HASH file to the local directory. Keep up the great work!

Tuwase - 25.02.09 6:37 pm

This is a Beautiful Site.

cor - 26.02.09 11:13 pm

Hi again, wraithdu. I'm looking at your post wondering what might be easier than simply copying the hash file over to the new location. It's a single-click and drag operation! If you can think of something easier, let me know, and I'll definitely consider it.

Cheers, Tuwase. People keep telling me this, but no one ever says, "Gee, Cor, your site design is so pleasing to my eyes, I'd like to donate you a hundred bucks!", sadly.

I think I'll put up a "comment thing" just for general stuff. Maybe right on the front page. That would be novel. Hmm..

;o) Cor

tessellated - 26.03.09 5:24 am

I would like to congratulate you on a very fine tool. Unfortunately for me, I seem to have hit a snag when using it with my NAS device. My OS is Vista x64 and I am using wifi (usually) to connect to my storage device. For very large files (say 10+GB but the threshold could be lower) checksum will run to completion and claim that hashing has completed at which point I lose all network connectivity on my PC (to the internet, the NAS device, and presumably anything else on the network).

Any idea what might be going on here?

cor - 26.03.09 8:50 am

Very strange. It sounds similar to troubles I used to have on my own network, transferring large files. If I remember correctly, a driver update on my network card fixed it.

It doesn't sound like a checksum error, as such, more to do with your network throughput. I'd wager that any application that tries to pull that same file over the network in one go would invoke a similar situation (mine used to crap out pulling Apache log files into my text editor, amongst other things). Though without more information, that could just be wishful thinking on my part. Have you tried another hashing application on the file(s)? Try something similarly fast, like fastsum; see what happens.

I'd like to have a lot more information about the errors you are receiving, before I consider it further. Does your system, or any other applications give some specific error messages? Can I see those? If there are none, try and do some ftp over the link and see what your ftp client says (often ftp clients give rather good network error messages). I'm assuming your NAS has an ftp server on-board. At any rate, more information is always better than less.

Also, has the hashing actually completed? In other words, is there a hash file at the end of the process? Or did that also fail?

Thanks for the compliment, though.

;o) Cor

tessellated - 27.03.09 1:53 am

I dug into this a little more today and was able to resolve my problem. It was, indeed, a driver issue with my wireless card. I use an Intel wireless/PRO 5100 AGN. A simple driver update via device manager has made all my issues go away.

I did run an ftp test and it failed in a similar fashion (prior to my fix). Additionally, checksum was able to create a hash file, but it was of zero length. Fastsum, strangely enough, didn't suffer from this problem. It worked before I applied the driver update. I'm not entirely sure why that is. I suspect that it is a function of faster run time. Fastsum is limited to md5 and I prefer using sha1. I'm not sure if that incurs a runtime penalty with your program or not, but my statistically insignificant tests indicate that Fastsum completed its md5 hashing ~10 minutes faster than checksum could complete sha1 hashing. That doesn't exactly leap out at me as hugely significant (82 minutes vs. 73 minutes...) but it's hard to say. Anyhow, it's a moot issue now.

cor - 30.03.09 4:50 pm

Aha! Yes, it looked like a driver issue. I'm glad you got it sorted, you will probably find lots of network things work better.

As for the hashing speeds, considering that SHA1 hashing speeds are very much slower than MD5 hashing speeds, that's actually a great result for checksum! Think about it. Actually, if you really think about it, it's clear the bottle-neck isn't the so much the speed of the hash calculation, as the network link. I'm curious how long checksum would take to do an MD5 of that same folder.

As for your suggestion, do you mean long operations with many files, or long operations on a single file. If it's the former, then this might be doable, though could potentially make it tricky to see all the information (though I know some thought on that would fix it). If it's the latter, it would require completely re-coding my hashing engine, which is designed to just "get on with the job", without interruption, not handing back control until the task is complete. This kind of change is unlikely to be considered at this time.

I'm thinking about it, though. It's a nice idea.

;o) Cor

tessellated - 31.03.09 7:28 am

Ok, just for fun I ran checksum using MD5. I'm afraid the result was less than illuminating: 87.9 minutes. These are very crude measurements, as they completely disregard network traffic and the I/O load on the NAS device. However, I will say that in all cases I held my personal use of the device to zero, but that doesn't rule out background services, daemons, etc. Probably a much better test would be to copy the file to my local disk and run checksum against it multiple times using both algorithms.

Anyhow, I know you are right about the network I/O being the limiting factor because I attained much faster run times when the file was located on an external HD attached via USB.

RE: the suggested windoid change, I really just meant that anytime there was a very long delay in the feedback given by the windoid, I found myself checking task manager for progress information. I think when you have the case of many files of short length, the user gets a fairly reliable indication of progress from the count of files thus-far processed. The shorter the average file length, and the smaller the standard deviation from that average, the more reliable an indicator this count is. If the change would cause a noticeable degradation in performance then I wouldn't want it, as I have other means of tracking program performance (task manager's I/O columns work fairly well for this). I suppose if it were me (and I have no idea how you coded your engine) I'd use a separate thread for the windoid and then occasionally pass it progress info from the engine thread.

Even so, forgetting all of that and just turning your processed file counter into a visual color gradient would have utility.

cor - 28.04.09 8:05 pm

Apologies for the delay in replying; it's been crazy here, this must have slipped through.

Well, the times tell us one thing; test results on that system are not reliable! Interestingly, someone on the main page was commenting about checksumming over networks, and I had to admit that it's not something I saw checksum doing a lot, and I didn't make any special consideration for over-network use - I was far more interested in being able to work with all the different formats of cross-platform hash files. All my (and my beta tester's) network tests passed without issue, I should add.

checksum is designed to load the data from the file system as fast as possible, and there probably are network setups where this could, if not overwhelm, then at least stress the networking components. Networking is always one of those areas where the mileage variables are stretched to their limits, and while there should theoretically be no issues hashing data over any network link; in practice, networks can contain many weak links, and I'd be more inclined to see checksum's behaviour as an indicator that other issues need attention. In other words, in an ideal network, there's no problem.

At any rate, checksum was designed to work with hashes from any platform, the idea being that file hashes are always created and verified locally, using whatever tools are locally available. Hashing over a network, if you think about it, is kinda backward, because the operating system will, at some point, need to pull that data over the network to work with it, and it's AFTER that process that its integrity might need checking. The OS on the other box is the one best suited to checking its local hashes.

NAS sits squarely in the middle of all this, and perhaps demands special attention. I'll definitely be looking into this, checksum sensing over-network use, and perhaps altering its data reading behaviour slightly. Thanks for the heads-up.

As far as the progress bar goes, yes, I hear exactly what you're saying. My take on it is that verifying hashes is something that the computer does, not something *I* do. That there's a progress windoid at all, is an indicator of how far I'm prepared to compromise my integrity to produce usable software! Actually, I've considered this, and still am. It's in the "maybe" list.

Maybe in a future version. Once checksum has a million users and a team of coders!

;o) Cor

3picide - 17.05.09 3:52 pm

I just want to say how much I love your program. I'm a bit paranoid sometimes, so this tool really helps. Also, I've been working on a cryptology project and am proud to include an example in it using your program. If I had the ability to, I would definitely donate (and buy a shirt; I already have plenty for now, sorry).

I love all of the information about the tool on your site. It's pretty simple doing regular things, but it just makes me giddy seeing all of the capabilities that this amazing freewa--ahem, shirtware has to offer. I hope to see more amazing tools. You've done a great job!

cor - 17.05.09 6:36 pm

High praise indeed! Thanks!

Your wish is my command..   Believe it or not, I am, at this very moment, putting the finishing touches to not one but THREE  brand new Windows apps!

Although not released yet, their in-progress pages do exist, if you know where to look.

And hey! I'm proud to be a part of your cryptology project!

;o) Cor

gheil - 19.07.09 10:17 pm

Lots of capability in this proggy!
Unfortunately it is a little verbose. It took me half an hour just undoing the commenting in the checksum.ini file so my editor could properly line-wrap it. Just about every special case was exercised, which is why it took so long - very tedious.

When I was done there was no way to do what I wanted ;-( To my way of thinking that option is obvious: what is the checksum for the ENTIRE directory?

To work around it I had to:

1) create a file of all checksums recursively in a directory (as a single file; I don't know why this is not the DEFAULT)
2) rename the file from .hash so checksum would not ignore it
3) create a (sub)directory for each directory to be compared
4) hash the (sub)directories in 3)
5) finally do the comparisons

An option for a simple summary checksum would obviate this convoluted workaround...

Beavis - 20.07.09 9:34 pm

lol! what you wanted to do is built-in!

cor - 21.07.09 7:48 pm

I'm not so sure, Beavis!

gheil, It sounds like you are trying to use checksum as a file-compare utility. It isn't a file compare utility. Sure, it has certain applications along these lines, and you can even drag two files into simple checksum for an instant hash-compare, but checksum is strictly for creating and verifying checksums.

Your whole procedure sounds extremely convoluted. If you tell me precisely what you are trying to accomplish, I probably know of a better way for you.

;o)
Cor

ps. You can remove ALL the comments with a single regex replace in any decent text editor.
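The same regex trick can be scripted. A hypothetical Python sketch, assuming comments are whole lines beginning with ";" or "#" (the usual ini conventions; checksum.ini's actual comment style may differ):

```python
import re

def strip_ini_comments(text: str) -> str:
    """Remove whole-line comments (lines starting with ';' or '#',
    optionally indented) from ini-style text. Assumes comments
    occupy entire lines, not the tails of key=value lines."""
    return re.sub(r"(?m)^[ \t]*[;#].*\n?", "", text)

sample = "; a comment\nkey=value\n    ; indented comment\nother=1\n"
print(strip_ini_comments(sample))
```

The `(?m)` flag makes `^` match at every line start, so one substitution clears every comment line at once, which is exactly what a single editor regex replace would do.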

Wurzel - 22.07.09 9:24 pm

I have zero experience with checksums but here is what I am trying to accomplish.
I will be generating data files each day to a flash card. I need to be sure that the files are not altered.
Is there a way that the software can reside on the flash card and alert me if any file has been tampered with since its original creation?

gheil - 24.07.09 7:33 am

Cor

Yes, I ended up using a file compare on two .hash files to see if they were identical. But that requires using another program...

What I need is a single number (e.g. an MD5) for an entire directory, not one number for each file in that directory (recursively). For my purposes that file ended up being ~66KB, requiring some kind of program to test it. And if the files are ordered differently in a directory, a comparison program's output is confusing.
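A single digest for a whole tree isn't something checksum produces, but it can be approximated by hashing the per-file hashes themselves. A hypothetical Python sketch (the function and approach are illustrative only, not a checksum feature):

```python
import hashlib
from pathlib import Path

def directory_digest(root: str) -> str:
    """One MD5 for an entire tree: hash every file, then hash the
    sorted list of '<digest> <relative path>' lines. Two trees with
    identical contents yield identical digests, regardless of the
    order in which the files happen to be scanned."""
    lines = []
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            # read_bytes() loads whole files; chunk this for huge data
            digest = hashlib.md5(p.read_bytes()).hexdigest()
            lines.append(f"{digest} {p.relative_to(root).as_posix()}")
    return hashlib.md5("\n".join(lines).encode("utf-8")).hexdigest()
```

Sorting before the final hash is what makes the result independent of scan order, so comparing two directories reduces to comparing two short strings.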

MrCyberdude - 23.08.09 5:31 am

Nice work on your app, will be better when you sort out the 4Gb SHA1 calculation issue.

I have 2 EXISTING folders say
C:\movies
Z:\movies.Backup

and they both have identical (un-verified) files in them.
I want to delete the original C:\movies, as I have watched them, but want to keep a backup.
How can I do this?

I was hoping your app had a right click compare folders/files option that shows differences.

These are measured in TB, so copying is not an option; the backup jobs took nearly 200hrs to do.
At the moment I am using ZTree 2.0 (binary compare) to do this, but
it's not a simple click option and I do not know what its CRC/MD5 method is.

I was thinking to run your check on both folders to create the MD5 hash files, but then how can I compare the two sets?

Your solution or Thoughts would be much appreciated.

cor - 23.09.09 8:23 pm

MrCyberdude, comparing files, just drop them both onto Simple Checksum. Job Done.

As for directories; I do this sort of thing a fair bit myself, like this..

• Create root hash in folder A
• Copy .hash file to folder B
• Click .hash file

gheil, that simple technique will also work fine for you. Understand, you CANNOT checksum directories, per se. Only the files in them. If you really must checksum a directory as a single unit, you will need to tar/zip/archive it, first.

;o)
Cor
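The three-step folder compare above can be emulated with any hashing tool. A hypothetical Python sketch using the common "digest *relative/path" hash-file format (these helpers are illustrative, not checksum's own code):

```python
import hashlib
from pathlib import Path

def create_root_hash(folder: str, hash_file: str) -> None:
    """Write one md5-style line per file: '<digest> *<relative path>'.
    Keep hash_file outside `folder`, or it will end up hashing itself."""
    with open(hash_file, "w", encoding="utf-8") as out:
        for p in sorted(Path(folder).rglob("*")):
            if p.is_file():
                digest = hashlib.md5(p.read_bytes()).hexdigest()
                out.write(f"{digest} *{p.relative_to(folder).as_posix()}\n")

def verify_root_hash(folder: str, hash_file: str) -> list[str]:
    """Return the relative paths that are missing or have changed."""
    bad = []
    with open(hash_file, encoding="utf-8") as f:
        for line in f:
            digest, _, rel = line.rstrip("\n").partition(" *")
            p = Path(folder) / rel
            if not p.is_file() or hashlib.md5(p.read_bytes()).hexdigest() != digest:
                bad.append(rel)
    return bad
```

Create the hash file against folder A, then run the verify step against folder B: an empty result means every file listed for A exists in B with identical content.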

John - 30.11.09 7:32 pm

I downloaded checksum, but when I tried to upload it this was the message shown to me.
Though I am just a beginner, and really love to learn hacking as a whole course of my life.

I love it. Please help me out with this as a start, and I will be going on to be a STUDENT here. NO SHAME bro

I ignored your post for ages and ages; that was your start, as you call it. Any good hacker learns to troubleshoot his universe efficiently; he asks himself, first.

The Corrupt Zip is as old a technical dilemma as the zip itself, most likely older! You wanna hack data systems, you will need to learn how to deal with it. In this case it's as simple as downloading again; not always so, though.

Your reference to shame interests me, but I don't have time to delve. Suffice to say, the best hackers mostly start by hacking themselves. ;o) Cor

mariannerd - 01.01.10 4:46 pm

my PC has died. It says the checksum files don't match (paraphrasing) and it won't boot. It takes none of my recovery disks and wants the whole Vista installation disk, which I don't have.

My software won't help you with this precise issue, but this may. ;o) Cor

cantuninstall - 26.06.10 1:53 am

I am unable to uninstall checksum. When I attempt it via the uninstall option in PROGRAMS, it brings up a text file. Checksum does not show up in Windows ADD-REMOVE SOFTWARE.

It sounds like you have either deleted the checksum install directory (inside shared user app data folder) or checksum's installer registry entries. Simply re-install checksum, and then uninstall again right away.

Or else, simply delete the checksum program folder and user prefs folder, job done. Aside from the data mentioned above, there's nothing else left on your system. ;o) Cor

David - 16.08.10 1:44 pm

Is there a way to run this on a folder (recursively), and then have the changed sub-folders and files logged before the new checksum is written?

For technical reasons, the log is currently written only at the very end of checksum's entire task. ;o) Cor

CELLIBI - 02.11.10 4:52 am

I am a student and wish to learn more about hashes and the encryption/decryption processes; consider me just a beginner.
Is it possible to find out the serials of small, lightweight programs using brute force? Are there any programs for that? I have a program called ophcrack, but I don't know how to use it.
Are hashes involved in any way?
And please do tell me how to make a torrent file if I have a whole movie file myself.

Dave - 05.11.10 4:46 pm

Great checksum! I was wondering, is there a way to make the root file automatically go to the fallback location, and not the directory being scanned (when the folder is not writable)? I want to run checksum on some folders which I can read on the network, but not alter or add to in any way.

checksum already does this automatically. And if yours doesn't, file a bug! Include lots of error dialog screenshots, logs, that sort of thing.

If you had a command that could change the root destination, that would be awesome!

With checksum's dazzling array of features, the ability to choose the output dir of the hash file at-will does seem like something of an omission. One that doesn't go unmentioned, either. It's on the 2do list. ;o) Cor

Dennis - 19.11.10 12:07 pm

I second Dave's proposal, and would also like to be able to run verify on a read-only directory with a different hash location, but without needing a hash with absolute paths.

Checksum already verifies read-only directories just fine with all the usual settings, even if the hash file itself is many levels above (and technically, below) the directory in question, or in it. checksum just does it.

If you are referring to files on an entirely different volume (DVD, etc.), then how will checksum know where the files are without the absolute paths stored in the hash file? ;o) Cor

kami - 07.12.10 10:20 am

I love your program and I am happy to have found it. But there are a few questions, and options I would like.

1. Working a directory tree recursively, is there a way to create individual checksums for each file and, at the same time, another in the root folder (cr1 and cri combined)? Then when I need the two options I don't have to run the task twice.

Not currently. You might want to take a look at Batch Runner.

2. Double-clicking a file in Windows Explorer whose file extension has no assigned program runs checksum. Is this intended, or is it a bug? How can I switch it off?

This is most likely because checksum has the topmost position in your Explorer menu and Windows gets confused sometimes, especially in low memory situations.

You cannot. My initially confusing versioning system is explained elsewhere. NOTE: Licensed users can play with betas on request.

4. I read that you consider checksum "only" for creating and checking hashes, but couldn't you implement a comparison function? E.g. creating a hash and opening an input box where to paste a second hash, then comparing the two hashes? Or clicking a hash file as if to check it, but then opening an input box where to paste a second hash and comparing them?
Or do you have a program that does just that?

Clearly you missed all the blurb for Simple Checksum, which comes with checksum. Give it a whirl!

5. I would love a more sophisticated synchronize function, with options to remove the hashes of deleted files (to clean the checksum files) and to re-check files which have changed since the last synchronization. It might be done either by storing file sizes and dates in the checksum file, or by using the archive bit, resetting it after checking a file, then checking the file again if the archive bit is set once more.

It might be done by... has been the start of many a thought train along these lines. More thought will go in yet, much more, before anything comes out the other end.

But anyway, checksum is the best program I have found for checksums. Thank you very much!

I agree! Thank You! ;o) Cor

Nate - 29.12.10 3:13 am

3 DVD burners, 3 copies of the same image, no existing files on hdd, but I want to confirm all three DVDs are identical. I assume checksum/hash, but I'm new to this, so if you could give me some pointers: what would be the best, most accurate, and fastest way to go about it?

There's a section on this on the FAQ page! So, Yes, checksum can certainly help you here.

Well, for more information: there are 9 DVD burners, running through different SCSIs on some computers (but all are assigned drive letters), but only 3 copies of each image, so the ultimate goal would be to run 3 different checks on 3 hopefully identical images all at once with the 9 drives (unless it goes really fast; I have no clue how long each check takes on a full DVD). Unfortunately the DVD burner software/device I use doesn't have any verifying options :( so I assume this is the way to go, possibly running 3 instances of the same program to verify 3 DVDs each. Any help at all would be greatly appreciated, tyvm for your time

Yes, you can run the checksum verify tasks all at once, no problem. You could also create one checksum file to verify them all in one pre-arranged sequence. Have fun! ;o) Cor

archivist - 15.02.11 12:41 pm

Hi corz.

First and foremost, thank you for the very, very, very fast checksum-tool. I have one feature-request though:

The ability to verify a checksum-file with relative paths against the relevant folder, wherever that folder might be situated. More verbosely, I would like to situate my hash-files in a separate folder from the hashed files, so that I can move those files around and still be able to verify them without having to move the hash-files around as well. Thanks!

/archivist

Check your switches already! (hint: the "k" switch is what you are after.. "Absolute Paths") ;o) Cor

Ver Radam - 24.03.11 9:13 am

I found your site interesting, but what I have read is still not clear to me. I will visit again soon if I have time.
The thing really interesting to me is the use of checksum. Maybe it is just difficult to understand.
I will come back to your site soon.
Godspeed...

ben - 14.02.12 7:08 am

Hi,

Realising this is probably due to deficient thinking, I can't seem to find where the logs are being written. I thought it would be with my checksum.ini file, but there is nothing there, and a system-wide search for log (or html) files doesn't show any that obviously (as in, written in the past day or so) belong to checksum.

The command I'm using is:
""c:\Program Files\corz\checksum\checksum.exe" cils d:\"

My understanding was that cils would create a sha1 hash for each file and log it. d: is a writable drive.

Only checksum.ini is at %userprofile%\AppData\Roaming\corz\checksum\, and there is no directory at %userprofile%\log\ or %appdata%\log. I can't think of other places it would be hidden.

I've been through checksum.ini and other pages on this site but for the slow-witted it doesn't seem that file locations are that apparent.

P.S. Realising is British spelling, not at all improbable.

The first place you looked was valid enough; the trouble is that logging is only for verification. See the switches section, above. By the way, if you think about it, the .hash file itself is a log of the creation operation; unless nothing was done, there will always be one of those.

During verification, if any logs are created checksum will, by default, open that folder for you at the end of the operation, no need to search.

;o) Cor

ps. "realize" and "realise" are both valid here in Scotland.

ben - 14.02.12 11:14 pm

Ah right. I missed the switches that are only activated on verification. So, the 'l' switch isn't for logging everything? Just logging every verification. No way of combining them into a single task?

No, the "l" switch is for logging "Everything", as opposed to logging only failures. But logging is only appropriate during verification.

I guess it shall still suit my purposes, with verification. The intention is to emulate end to end filesystem checksum; a poor man's ZFS checksumming if you like. Though I would like to know every time the checksum is created, and how many times previously checksums had been created. Kinda an anti-tamper process except it is about data corruption rather than security.

I don't get what value there is in knowing when and how often checksums were created. All that matters is whether they still match the files they represent. Isn't it?

All said and done, this software is awesome, and you can count on a donation and promotion.

Thanks. I look forward to that.

In the checksum.ini file, I love how verbose you are in explaining what you are intending with each item. Unfortunately most of the explanations go over my head.

Maybe you just read them too quickly! Seriously though, if there's any in particular you find confusing, let me know and I'll take a look.

P.S. Okay, why did your website tell me 'realise' is an "improbable" word?

Ahh.. my spell-checker! I guess "realising" isn't in its word list (though "realise" definitely is), and as I generally use the "z" variants myself, it hasn't yet been added. "donation" isn't in the list, either! ;o) Cor

fred - 27.02.12 3:27 pm

This has got to be the most comprehensive MD5 hasher I've found which meets 95% of my needs.

I've a couple of questions which I'm not completely clear about
1) When I synchronize, will deleted files be updated in the .hash? And will changed files be updated in the .hash as well?

During synchronization, checksum will only add new files to an existing .hash file. Nothing is taken away.

2) Will it be possible to create a .hash in each folder, recursively? I can foresee the need to move subfolders, and I would want to check their integrity, yet not have to create a .hash in each folder manually.

For example, this is my folder structure, which I've hashed at the E: level.

e:\photos
e:\photos\2010
e:\photos\2010\jan...
e:\photos\2011
e:\photos\2011\jan...

e:\videos
e:\videos\english
e:\videos\french...

e:\stuff
e:\stuff\i like
e:\stuff\i hate...

This is the default behaviour.

And I want to copy all the photos in e:\photos\2011 for my friend. I would now need to rehash at that level again. Am I making any sense?

thanks!

You would need to do nothing, simply copy/zip/send/whatever the folder. Its .hash file will travel with it (along with all the .hash files in all the folders inside it, and so on). ;o) Cor

David - 05.03.12 4:17 am

Hi Cor,
Firstly let me say fantastic program. You have managed to include many features but still kept it useable!

I am just using checksum from a scheduled script to verify my backup files are still good after they are transferred over a WAN connection. My backup program actually generates the MD5 hashes internally.

So far everything is working fine except I have around 400GB in total of files to check. What I really want to do is check just the few hundred MB of updated files after they arrive at the destination and go over the whole set of files once a month or so.

Can I selectively verify the hashes directly in checksum based on the archive bit on the files? Once the check has passed I could then reset the archive bit.

If it can't be done directly in checksum, can checksum selectively verify files based on an input file list? I shouldn't have too much trouble automatically generating the input file based on files with the archive bit set, and then using the same input file to reset the bit once the test has successfully completed.

This is the command line I’m currently using and I have made some tweaks to the ini file for the stuff I couldn’t do on the command line.

checksum.exe" vflrq-t %target%\ >> %logfile% 2>&1

Love to hear your Ideas on solving this problem.

Thanks

can checksum selectively verify files based on an input file list

Yes! It's called a .hash file!

You have to understand, during verification checksum isn't rummaging through your filesystem checking archive bits or anything else, it is reading the .hash file (or .md5 or whatever) that you feed it. If you feed it a system path, checksum scans for all available .hash files. Once checksum has its .hash file(s), it simply checks all the files listed within.

If your backup software is producing hashes, it may be possible for it to pipe the new hashes to a single file, even a temporary file, which checksum could work with.

But really, 400GB isn't so much. If you set all this to happen when you are in bed, you won't even notice it has, unless there are errors (checksum will (optionally) pop open the log folder).

;o) Cor

ps. be careful with the "l" switch, it can produce HUGE logs.
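Since checksum simply checks every file listed in the hash file it is fed, selective verification comes down to generating a hash file that lists only the recent arrivals. A hypothetical Python sketch, filtering by modification time rather than the Windows archive bit (which the stdlib doesn't expose); the function name and format are illustrative:

```python
import hashlib
import time
from pathlib import Path

def hash_recent_files(root: str, out_file: str, max_age_hours: float = 24) -> int:
    """Write '<digest> *<relative path>' lines only for files modified
    within the last `max_age_hours`; returns how many files were listed.
    The resulting file can then be fed to a verifier as a normal hash file."""
    cutoff = time.time() - max_age_hours * 3600
    count = 0
    with open(out_file, "w", encoding="utf-8") as out:
        for p in sorted(Path(root).rglob("*")):
            if p.is_file() and p.stat().st_mtime >= cutoff:
                digest = hashlib.md5(p.read_bytes()).hexdigest()
                out.write(f"{digest} *{p.relative_to(root).as_posix()}\n")
                count += 1
    return count
```

Run this after each transfer for the quick daily check, and fall back to the full 400GB verification on the monthly schedule.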

Jan - 12.03.12 5:10 am

Hi Cor
Thank you for a very useful program. It really is a must-have on any computer. I have been using it for checking copies of folders made for archival purpose, works perfectly.
One problem I do have, if you can suggest a solution. I inadvertently did individual checksums instead of "root" on a number of folders containing large amounts of photo-files, ending up with hash-files on every photo. How can I remove the hash-files instead of only hiding them?
Thank you.

Use standard Windows Explorer commands.. Search [F3] for "*.hash", select all ([Ctrl+A]) and delete [Del].

If you view the results by "Details", you can order by date, size, etc., which makes it easy to delete only the ones you want gone.

;o) Cor
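The same clean-up can be scripted if Explorer isn't handy. A hypothetical Python sketch, dry-run by default since it deletes files:

```python
from pathlib import Path

def delete_hash_files(root: str, dry_run: bool = True) -> list[str]:
    """Collect (and, with dry_run=False, delete) every *.hash file
    under `root`. Run with dry_run=True first and inspect the list
    before actually deleting anything."""
    victims = [p for p in Path(root).rglob("*.hash") if p.is_file()]
    if not dry_run:
        for p in victims:
            p.unlink()
    return [str(p) for p in victims]
```

Swap the `"*.hash"` pattern for `"*.md5"` or `"*.sha1"` if those are the extensions in use.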

Francisco - 23.03.12 6:57 pm

Hi Cor
First I wanna thank you for your program, it's really unique! Also, I have a problem. I have a bunch of files on a drive and I have already checksummed them, but I have to re-check them with my girlfriend and I am sure that we will delete some of them, so what can I do to re-synchronize everything? I know about the synchronize switch, but when a file is deleted it just shows "SHA1 missing". I wanna delete all the hashes of the deleted files to avoid the "SHA1 missing" errors.

If checksum could automatically handle this scenario, it would verify the hashes of all the files, then (optionally) present the user with a list of missing files. Of course, it already does all this, as well as offering to log all those missing files for you.

If you were 100% certain that all the missing files were intentionally missing, you could have checksum go straight to manipulating the existing .hash file(s), removing entries for your "approved" missing files.

checksum does most of this already, right up to the point where the existing .hash files are altered/replaced. Something along these lines is planned for a future version.

If you use a "root" .hash file and you are handy with some form of scripting language, I'm sure it wouldn't be too difficult to create a script that parsed checksum's log (searching for "MISSING") and removed those entries. Of course, you would need to instruct checksum to output plain text logs (XHTML logging is the default).

With a "root" .hash file it's also a trivial operation in your text editor, with a little regex magic, of course.
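The pruning step of such a script might look like this. A hypothetical Python sketch: it assumes the common "digest *relative/path" line format, and leaves gathering the MISSING paths from the plain-text log to the reader, since the exact log format isn't shown here:

```python
def prune_missing(hash_file: str, missing: set[str]) -> None:
    """Rewrite a root .hash file, dropping entries whose relative path
    appears in `missing` (e.g. paths gathered from a 'MISSING' log)."""
    with open(hash_file, encoding="utf-8") as f:
        lines = f.readlines()
    kept = [ln for ln in lines
            if ln.partition(" *")[2].rstrip("\n") not in missing]
    with open(hash_file, "w", encoding="utf-8") as f:
        f.writelines(kept)
```

Only run it once you are certain every missing file was deleted on purpose; entries removed here are gone from the .hash file for good.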

Failing that, the easiest way to approach this currently is to:
• Run a verify operation on the entire folder structure (i.e. drive)
• Check (in the log) that all the errors are "expected". If not, fix any issues (from backups, etc)
• Delete all .hash files on volume. (see my previous post here)
• Re-create hashes for the entire volume.

Now you have up-to-date, error-free hashes for the entire volume. If the volume is large, do the last part when you are sleeping.

While I'm here I should mention that the upcoming beta of checksum, as well as enabling you to differentiate between CORRUPT and CHANGED files, has the ability to toggle the reporting and logging for these and MISSING files, so you can simply ignore them!

Also, I have another problem: when I verify the checksums, I get that error of "cannot create hashes even on the fallback directory", something like that. I have read your site and it says that maybe it is because my drive is read-only, but that is not the case.

I have seen this error when the fall-back folder has "unusual" permissions (check those) or where the fall-back path is so deep that re-creating folder structures would lead to file paths longer than Windows allows (move it somewhere closer to the root), or use root hashing (one file).

But it may be something else. Try re-installing checksum and moving/removing any checksum-created fall-back folders. The installer is completely non-destructive, so it won't affect your settings.

If you are still having issues, mail me with rough details of your computer setup and a copy of your checksum configuration (ini) file, I'll look into it.

;o) Cor

Francisco - 24.03.12 2:49 am

Thanks Cor for the quick reply and easy solutions. For my first problem I did what you told me: I verified all, then deleted the hashes.

And for my second problem, it was because the files were very deep in the drive (folders and folders and folders...), and maybe because it was a saved web page, so I deleted that and have just created a root hash.

Ricard - 11.04.12 6:51 pm

Hi,

Trying to modify options with Shift and Ctrl keys before launching but all I get is a help page. Help needed.

Are you saying that when you hold a) <SHIFT> or b) <Ctrl> during launch you get a help page instead of the a) options dialog and b) force verify controls? Seriously? Nah, I misunderstand you, surely.

I want to compare a set of folders to see if all the files are in place and, after that, to verify the integrity of the files. How should I proceed to achieve the result?

Thanks in advance. Congratulations for the soft. Regards.

Simply hash the original folder (use a root .hash file) and then copy that .hash file to the second folder, click it. ;o) Cor

Ricard - 12.04.12 4:25 pm

Hi,

"Are you saying that when you hold a) <SHIFT> or b) <Ctrl> during launch you get a help page instead of the a) options dialog and b) force verify controls? Seriously? Nah, I misunderstand you, surely."

No, you did not misunderstand. That's exactly what happens. I am on Windows 7, 64-bit. But it is not a help page exactly; it is a dialog that begins with "checksum [v1.2.3.9] was given nothing to do!"

The only possible explanation I can think of is that you are tapping the key, instead of holding it down. If you are holding it down (until you SEE checksum) then I have no idea what the problem could be. It's a first!

And of course, there's no point just launching checksum (unless you want to read the "checksum was given nothing to do" dialog), you need to launch it with something to do (i.e. from your Explorer context menu for a file/folder, drag & drop, etc.)

;o) Cor

Dunc - 27.04.12 10:10 am

Hi Cor.

I have some files that I have previously checksummed that I know have changed (and I know change a lot), but I want to re-checksum them prior to backing up to another drive, so I can verify the checksum on the backup to make sure it has copied ok, and that the backup is still good in the future.

I thought I could just run checksum with "qctry", but it doesn't seem to update the hash file if it already exists? I know synchronise would add new files, but it doesn't update existing hashes (which is entirely sensible; you generally wouldn't want to update the checksum to include any possible file errors!).

The only way I can get the behaviour I want is to delete the hash files and re-checksum, which is fine I guess - just wondered if there was a way to force checksum to regenerate the hashes itself?

During verification, checksum makes no changes to your .hash files.

The best way to go about this would be, as you suggest, to verify the existing hashes, then ensure the error log contains only files you expected to be changed, then hash again (overwriting existing hashes).

If it's something you do a lot, because checksum can be controlled by the command-line, it should be easy enough to set up a wee batch script/batch runner set. ;o) Cor

Draguen - 11.05.12 9:49 am

I got two different files in a folder with the same hash. Only one file gets written to the hash file. I would love to get a switch (I use command-line) to hash and list all files, even if they have the same hash. If the second file isn't written, you might never know if it gets corrupted.
BTW - Great App!

That is not the expected behaviour. Regardless of the hash, checksum should list ALL the files, unless they are ignored for some reason (i.e. in your preferences).

Please mail me full details, including the full names of all the files in the dir, what command-line you are using, and the resultant .hash file. If you can, zip and send the entire dir to me.

;o) Cor

Draguen - 11.05.12 7:16 pm

Right you are, the file "folder.jpg" is by default ignored in the prefs. Sorry for the trouble.... I found this after sending the zipped folder... so you can ignore the email. Thanks a lot.

No problem! I suspected it might be something along these lines. I'm glad you have your solution. ;o) Cor

Ricard - 24.05.12 5:23 pm

Hi Cor,

About a month ago, I told you that the soft was not working with the Shift or Ctrl keys before launching. Still the same :-( I am using an HP Pavilion with 8GB RAM and Windows 7, and no success. What I want is to generate hash files for a number of files in a folder, to compare with another one. What I mean is: if I have 100 files in folder A and 120 files in folder B, I want to generate a hash file for each one of the files in each folder, to compare them later. How can I achieve this (with the command line, I mean), provided I can't launch the program any other way?

Ricard - 24.05.12 5:45 pm

Hi Cor,

Just tested in Windows XP and it works normally. Any suggestion?

There's no reason it shouldn't work perfectly on Windows 7, as well. No one else has reported any problems with this and my first instinct is to say, "Are you SURE?". You are holding the modifier keys down until you see checksum, right?

As for the folder compare, checksum isn't really designed for this task, but creating a standard checksum file for the folders (standard right-click on folder, choose "Create checksums") is all you need. To compare with another folder simply copy over the .hash file and click it.

;o) Cor

Alex - 24.05.12 8:47 pm

I have been using checksum to verify files I archive to my LG NAS N1A1. When I run verify on a local file on my computer, checksum works very fast, like it is supposed to. But when I run verify on a copy of the file which resides on my NAS box, over the local network, it takes forever. Why is this, and is there anything I can do about it?

The limiting factor in checksum's speed is invariably file i/o. Even on a local file system, it's unlikely you will max out your CPU doing checksums, the disk read/write speed will slow things down. On a network, this is even more pronounced.

Basically, anything you can do to speed up your network (and there is usually a lot you can do), will speed up checksum. It's waiting for data from your NAS. Check your network settings thoroughly, and consider gigabit ethernet (compared to a modern hard disk read, even a super fast LAN is DEAD SLOW). checksum won't be the only program that will benefit from improved network speeds.

;o) Cor

Ricard - 25.05.12 11:15 am

Hi Cor,

Everything is working fine now! It's just that I did not hold the key down until the interface appeared. You should emphasize that the user must hold the key down until checksum appears. Otherwise, dummies like me will be playing the idiot for a while

Thanks a lot. I will purchase a shirt, sure!

Noted. I will make the instructions more clear. Thanks! ;o) Cor

Ricard - 25.05.12 11:39 am

Hi Cor,

I have made several copies of DVDs to hard disc. Now, I want to compare each DVD with the corresponding folder. How should I proceed to compare the two? For each folder I can generate a hash file, but how do I proceed with the DVD, as it is read-only? And, once I have generated the hash files for both, how do I compare them?

You can simply run a normal Create checksums.. Explorer context menu command on the folders on the DVD drive. Because it's read-only, checksum will create a folder on your desktop (or other chosen location) and create any .hash files in there.

The entire structure will be recreated, so you can simply drag and drop the whole thing over to the copy, if need be. Verify normally with the Verify checksums.. command.

Remember, if you have already hashed the copy, you will need to rename one of the (sets of) .hash files or else copying over the directory structure will overwrite the copy's .hash files!

So it's best to simply begin with hashing the DVD. If all goes to plan and there are no errors when you copy the .hash files over, your DVD .hash files can become your copy's .hash files! Job done!

checksum has a number of configurable methods of dealing with read-only fallback conditions, as well as a myriad of configurations for .hash file naming, so if this is something you do a lot, you will probably want to drop checksum.ini into a decent text editor (one with syntax highlighting) and have a scroll.

By the way, nice shirt!

;o) Cor

Ricard - 25.05.12 3:43 pm

Hi Cor,

Thanks for your answer. By the way, I can't find the file checksum.ini. I have registered with the program itself but I have been looking in the corresponding folder but there is no checksum.ini (and I have done a search in all the PC of course). Your advice would be great.

Thanks and regards.

See here (recently updated).

As well as the absolute best way to get to checksum.ini, there's also a link there ( and right here! -> ) to the brief checksum.ini page.

Have fun with those prefs!

;o) Cor

Ricard - 31.05.12 6:32 pm

Hi,

I have copied several DVDs to corresponding folders in hard disc. Now I want to compare DVDs to folders. What I would like is to have a unique file hash for all the DVD or folder (not a hash file with a list of file hashes in it). This is because each DVD has about 17.000 files. Is there any way to do so?

And another thing. Is there any way to make the process faster because to generate the hashes for a DVD it takes about an hour and a half. I have 78 DVDs, so it will take me about 14 days working 8 hours a day :-(

Thanks in advance. Congratulations for the software and best regards.

It would have been smart to hash the folders before they were burned to the DVD.

As it is, simply hash the disk as normal, perhaps with a root hash file, and then copy that to the hard drive folder for verification.

And what's wrong with having a .hash file with all the files in it? 17,000 files is no problem for checksum. I've got .hash files with hundreds of thousands, maybe millions of entries (I know for a fact my local archive drive .hash has over 750,000 entries).

File hashing is a superior system to disk (image) hashing -- if one single file is damaged, with a disk hash you have total checksum failure and no way of knowing which file is damaged. Ouch! It's a lot easier to locate and renew one single file than 17,000 of them!

As for making it faster, checksum will hash the disk as fast as your operating system can read it. My workstation is ancient and I can hash a DVD in a few minutes - it sounds like you need to upgrade your DVD reader - it will only cost a couple of man-hours or less for a decent fast model. Think of the savings!

By the way, I recommend Pioneer for DVD drives.

;o) Cor

FredMora - 31.05.12 7:16 pm

I second Ricard's request.

The "tips and tricks" page describes how to checksum a burnt CD or DVD by first dumping the media contents to an ISO file with ImgBurn, then using the corz checksum utility on the ISO file.

One really cool feature would be to bypass the need for ImgBurn, and directly create or verify the checksum of the bytes on the burnt volume.

Thoughts, Cor?

Check out my response to the comment directly above this one. It's not a practice I would encourage - just hash the files!

For those that really need this sort of functionality, ImgBurn is an excellent program, beautifully simple and intuitive to operate, but with all the advanced features a geek could need. If you have a DVD drive or ever handle disk images, it's an essential tool for your kit-bag.

;o) Cor

FredMora - 06.06.12 8:28 pm

Hi Cor,

Check out my response to the comment directly above this one. It's not a practice I would encourage - just hash the files!

Actually, I sometimes use a Windows machine to handle ISOs that are not meant to be mounted on Windows, such as Linux-created ISOs. These ISOs often contain file paths that are too long to be read by the Win32 API, and thus the file hashing is not reliable. However, a checksum of the full ISO is still possible and meaningful. Hence my request.

Thank you,

--Fred

In that scenario, hashing the ISO makes sense. I have many ISO .hash files myself. And of course, if you have access to a Linux system, you can mount the volume and use checksum for Linux. But wherever possible, hash files, not disk images. ;o) Cor

Hi Cor,
Great app! So far I have been able to do nearly everything I want with it; only one thing is missing: the ability to make a single file hash with an absolute path. Maybe I missed a switch or a combo of switches, but I just can't make a single file hash with an absolute path. I hope this is a "lack of reading on my end" type of post; if not, then I guess it's a request type

Absolute paths are only available when creating a "root" hash file, i.e. a single .hash file for an entire directory structure. It doesn't seem sensible to do this in any other context, though I'm open to suggestions to the contrary. ;o) Cor

Hi Cor,

Here is my scenario.
I hash files on my PC's internal hard-drive with a copy of the 4 level directory structure of my two archive drives (one is redundancy). Some hashes are complete directories, others are single files. I then transfer to the archive drives. I check the hashes to verify good transfer. All my hashes are put in the root of my archive drives for easy referencing of the files. I put a copy of the hashes in a backup folder for peace of mind (I use a little app that adds the drive letter to the hashes in just a few clicks). I also append all new hashes to a master hash file for the whole drive (I am aware of Checksum's synchronization feature, but that means rehashing the new files and that is extra work/time I can easily do without). I can thus check the whole drive, if I need to transfer the whole drive (or to verify drive health) or just check one single hash when I need to pull just that file from the archive, all the while having a functional backup of the hashes.

Presently, I am adding the paths manually, until I find a better solution, quite tedious work.

I fully understand that this doesn't seem sensible (my wife is always telling me that I am not sensible enough, go figure!). Adding paths to single file hashes is only good if you don't want your hash files in the directory your file is in. I am guessing most people are content with the hash file right next to the hashed file. Unfortunately, that is not my case. I hope this is enough to convince you, if not, the search goes on. I would really appreciate an all-in-one hash tool for my needs.

Thanks for listening!

Have a great day!
Cheers,

Take a look at checksum's "Root" hashing. You can hash the entire drive and have the checksums in one single file in the root - that's how I hash my own archive drives. If you also use absolute paths, you can keep the .hash file wherever you like.

I guarantee you will save time compared to your current method, because you don't have to *do* anything, at least not manually. Let checksum do the leg-work!

Also note, during synchronization there is no "rehashing"; only files that do not already exist inside the .hash file will be hashed. Existing hashes are ignored.

I can certainly look into adding absolute paths for all contexts, but I think if you try letting checksum take care of this you will save yourself a lot of hassle.

;o) Cor

flux3000 - 13.07.12 5:57 am

Greetings - first of all, thank you so much for this wonderful tool!

I am wondering - is it not possible to create files with .md5 extension rather than .hash extension when using the batch checksum generator on directories? It seems you only have this option when creating checksums from the files themselves. I am guessing there may be a good reason for this limitation...

Thank you sir!

There is no such limitation! Simply set:

unified_extension=false

And ALL checksum files will have an .md5 or .sha1 extension, regardless of the context.

;o) Cor

Joe - 27.09.12 3:18 pm

All I want is a simple syntax to create 1 checksum file for 1 input file that I can place in a bat file.

I don't want ini files created all over the shop for every user that happens to invoke the command
I don't want popups, ever.

I want a formal syntax that does exactly 1 thing. This is the simplest possible use of a command line yet ...

checksum.exe -qualifier < myfile.txt > myfile.hash

The normal syntax will work fine. See above for how to get no "popups", as you call them. I don't know what you mean by "helpful crap", though it does sound like a contradiction in terms. Everything you need is in the documentation.

;o) Cor

Art - 28.09.12 9:26 pm

Hi Cor,
Excellent work on checksum, impressive amount of effort to cater for (so many!) various options. Still having some trouble to make it perform the way I'd like though. I'm wondering whether I've misunderstood the .ini settings, and whether you could explain.

The problem: I need to protect an entire OS inside a virtual machine from user-tampering, more specifically, from users introducing their own executables renamed as OS .exe, .dll, .com files, and such. (I can prevent any unrecognised process from executing using other means.) So I set out_dir to some location users can neither see nor access, and run checksum.exe cr on c:\windows\system32 with a mask for *.exe, *.dll, etc., set absolute_paths=true, and all .hash files are created in my destination dir as expected.
But when I wish to verify, I call checksum v <hash-path>, and I get a log with errors for every single file as "missing", because it apparently still expects each hash file in the associated source dir. (It works fine if hashes are written out there.) Unfortunately, this is not an option for me, because even with the hidden attribute it would be trivially easy for users to find the hash files there, figure out that it's md5 or sha1, and edit it to match the rogue executable that is to replace the original. I could do a checksum on the hash files themselves, and so on, but life's too short for infinite recursion.

So my question is, what .ini settings would cause checksum.exe to look for the source files in their original location regardless of where the hashfiles themselves are stored? Despite your extensive documentation in ini and on the web, I haven't figured it out yet. What might I be doing wrong?

Ensure you are using "root" hashing; that's a 1 on the command-line, or..

one_hash=true

in the ini file. Currently, checksum only creates absolute paths for root .hash files. For your needs, a single .hash file sounds like a better option, anyway.

;o) Cor

indigital - 03.01.13 11:01 am

Hello.

First and foremost, thank you for your hash tool.

I'm trying to create a checksum with the parameter "1sq" of the following folder/file:

"z:\Filme_1\Die Geschichte des Jungen, der geküßt werden wollte.The Story of a Boy Who Wanted to Be Kissed.L'histoire du garçon qui voulait qu'on l'embrasse.Französisch\L'Histoire du Garcon qui voulait qu'on l'Embrasse.Fr.avi"

Most likely due to the path and file name length, checksum isn't able to store the sha1 file in the folder. To me it appears that the hash creation itself completes.

Do you see any solution to that problem?

Thanks.

indigital

Yes, don't break Windows' path character limit. It's just asking for all-round trouble. ;o) Cor

Lucas - 06.01.13 10:46 am

Hi corz

I'm writing here because I've sent 2 emails to you, but they were left without a response.

Apologies. It's the time of year, the inbox fills quickly and only really high priority stuff gets attention. I get there in the end!

First of all, great program, and the registration procedure (purchase procedure) made me smile a bit. You have a pretty good sense of humour.

Ok, to the main point of my message. I like the synchronize option, but one thing I'm missing, and I think it would benefit the application, is checking for files that are not there anymore and, while creating checksums with the "synchronize" option, removing those entries from the hash file created before.

Can you implement this feature in the next version of corz checksum?

ps. When you respond to this message, can you let me know by email, so that I don't have to come here every day and check manually?

cheers
Lucas

This has been asked and answered (on the page where you originally posted this) more than once. I'll certainly look into it as an option, but there are no immediate plans to rewrite the way checksum works. If it's any consolation, the next version of checksum has the option to ignore missing files. ;o) Cor

colorfred - 27.08.13 11:16 pm

First thing, great work! So far, it looks like it already does just about everything I need. What I really need is the command line version to return a pass/fail that I can easily capture so I can have a script perform certain operations in case of failure. What I am thinking is using this as part of a Jenkins job (a plugin for Jenkins might be a potential add on for you?? hint-hint-nudge-nudge) where if the checksum fails, I send an email to folks about the failure.

checksum already does this! If you need a hand with the command-line, or with capturing the result, let me know.

;o) Cor
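For readers wondering how that capture works in general: command-line verifiers signal pass/fail through their exit status, which any script (or a Jenkins shell step) can branch on. A sketch using md5sum -c as a stand-in (it exits 0 on success, non-zero on any failed or missing file; all names here are made up for the demo):

```shell
#!/bin/sh
# Tiny demo: one file, one hash file.
tmp=$(mktemp -d)
echo "payload" > "$tmp/data.bin"
cd "$tmp"
md5sum data.bin > data.md5

# Branch on the verifier's exit status; a real job would send mail here.
if md5sum -c --quiet data.md5; then
    echo "verify passed"
else
    echo "verify FAILED, alert someone" >&2
    exit 1
fi
```

In a Jenkins job, a non-zero exit from the shell step marks the build failed, which can then trigger the email notification.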

Chris - 03.11.13 8:04 am

Hello,
I can say that I'm very happy with your nice checksum tool. The only missing crucial function is to support the handling of multiple selected files/folders. Very often, I have to place checksums that way, but at different locations. So preparing a separate script is no option at all.

Thank you
Chris

Steve - 18.12.13 12:20 pm

Great software.

I was wondering if it's possible to create one hash of an entire directory structure so the .hash file has one hash in it.

I want to be able to compare that 2 directories (Main & backup) are identical.

Thanks

Yes! To compare directories, simply create a root hash (use the "1" switch on the command-line, or easier; check the root hash option in the one-shot options dialog) in the root of the first directory. Then copy the .hash file over to the second directory and click it. Job Done.

If you do this a lot, consider adding a custom command for it to your Explorer's context (right-click) menu - there's a section on how to do this on this very page!

;o) Cor
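The same compare-two-trees idea can be sketched on Linux with find and md5sum (a hypothetical Main/backup pair; checksum's root hash does the equivalent job with one click):

```shell
#!/bin/sh
# Build two identical demo trees standing in for Main and backup.
main=$(mktemp -d); backup=$(mktemp -d); hashfile=$(mktemp)
mkdir -p "$main/sub"
echo one > "$main/a.txt"
echo two > "$main/sub/b.txt"
cp -r "$main/." "$backup/"

# "Root hash" Main: one hash file covering every file in the tree.
( cd "$main" && find . -type f -print0 | sort -z | xargs -0 md5sum > "$hashfile" )

# Verify backup against Main's hashes; exit 0 means the trees match.
( cd "$backup" && md5sum -c "$hashfile" )
```

Because the hash file records relative paths, verifying it from inside the second directory is what makes the comparison work.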

Robert F. - 18.02.14 10:18 pm

Hello,

this seems to be a great and valuable tool. As I've been looking for something like this for a while now, I'm really happy and thankful. I tried it out and everything works great - good usability and sooo fast under Windows.
And with a little, very easy customization, the hash files are compatible with md5deep under Linux, too.

I will use this tool mainly for ensuring integrity and consistency among my multimedia directory trees (mainly photos), including backup / archive trees and current photo workflow.

There is one remaining question, a cmdline switch combination (or ini setting) I must have missed and I simply cannot find it in the online docu or by playing around.

It's regarding hash verification.
When I verify a larger tree, I can easily find (and get a report about) missing and changed files.
But at the same time I would like to know whether there are any new files in the tree (valid files that match the pattern, are not excluded, but do not have any line in the hash-file).

It would be like calling checksum vyr, which does not work as hoped (the switches can be called together, but it does not report new files).

To make a clear point (as English is obviously not my native language and what I write could easily be misleading):

I am just looking for a report / log of additional, unhashed files during a verification run.
Did I miss something very simple and obvious? Could you be so kind as to help or hint me?

Or is the only solution to run checksum twice, once with "vr" (to get missing and changed files) and then "y" (to hash new ones)? But with this approach I did not find a way to obtain a summary log of the newly added files. Of course, it would be preferable to have the new files just reported, with the possibility to decide whether to add them to the hash table in a 2nd run (or simply to delete them if they are there for no good reason).

Thank you again and greetings from Austria,
Robert F.

During verification, checksum doesn't look at the file system; it simply reads the .hash file(s) and verifies that the checksums within are valid.

If you want to create/add hashes, you need to run checksum in create mode (with synchronize enabled/chosen). Any new hashes will simply be tagged onto the end of the .hash file. You can have checksum time-stamp the entries, so it's easy enough to see what was added; see: do_timestamp=false in your checksum.ini (set it to true, obviously!).

;o) Cor

ps. thank you for your support! much appreciated!
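In the meantime, one workaround for spotting unhashed files is to diff the file list on disk against the names recorded in an md5sum-format hash file (which is essentially what a .hash file is). A self-contained sketch, with all file names invented for the demo (GNU find/cut assumed):

```shell
#!/bin/sh
tmp=$(mktemp -d)
cd "$tmp"
echo a > old.txt
md5sum old.txt > tree.md5       # old.txt is hashed
echo b > new.txt                # new.txt arrives after hashing

# Hash lines are "<32 hex chars><2 chars><filename>", so names start at column 35.
cut -c35- tree.md5 | sort > hashed.list
find . -type f ! -name '*.md5' ! -name '*.list' -printf '%P\n' | sort > disk.list

comm -13 hashed.list disk.list  # prints files on disk but not in the hash file
```

Here comm -13 suppresses lines unique to the hashed list and lines common to both, leaving only the unhashed newcomers.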

supernoob - 21.02.14 6:26 am

Hello,

Here is my case:
I have a program that can create a bootable disc. Six months ago I created one and decided to make an ISO file from it and save it. Then, today, I grabbed the same program, made another bootable disc, and created a new ISO file, just like the last time. Surprisingly, at least for me, both ISO files have exactly the same checksum, I mean the same MD5 and SHA-1.
I thought that every time I create an ISO file from a CD the resulting file would have a different hash, or that if I burned different bootable CDs with the same creator program and made ISO files from each one, the resulting hashes would be different. But no, they're the same.
Is this correct, or am I missing something?

Thanks.

If the bootable disk is the same (same creator app, same settings), there's no reason why the checksum wouldn't also be the same.

The real question is, what made you decide to create another bootable disk when you already have one?

;o) Cor

Mike - 19.05.14 2:39 am

Got a problem.

Just finished a full 4TB disk hash to root (1 day 5 hrs!!) and decided to check it was ok, so I ran checksum again using 'create' followed by synchronise, not expecting to find anything; that would be my check. BUT ... it found a missing movie folder (Zero D*** Thi*ty) and began creating fresh checksums. As nothing had changed, and there was literally only 5 mins between runs, this must be a bug, yes? Something to do with the folder and files starting with 'Z', maybe?? Both runs of checksum finished with success, but the first run did NOT check all of the disk's filestore.

In actual fact, there was 1 day 5 hours between runs (building the list of current files is the first thing checksum does). The folder was most likely added during that time. By the way, that does seem a bit slow. Perhaps that disk needs defragmented. ;o) Cor

jumperalex - 20.08.14 8:48 pm

OK so I erroneously emailed you (vice using this form) about a feature request concerning adding Tiger and Whirlpool hashes to be compatible with hashdeep. Ignore that please.

What will REALLY be clutch, and will help the entire headless media server community, like Limetech's Unraid community, is if you can release a command-line utility for Linux. I know you've got a GUI version, but it would be best if we could avoid having to run that on generally headless machines. We have the ability (still in beta at the moment) to create Docker containers and KVM virtual machines that can run Windows or something with KDE, but that is really not an ideal situation for what most of us really need. What we need is a command line that can be scripted and added to cron.

100% ideal solution - a slackware 14.1 command line build for native unraid use
90% solution - Arch (for the KVM users) and Ubuntu (for the Dockers users) command line builds

Thank you for your consideration and an awesome tool.

-Alex

Mike - 20.08.14 9:43 pm

Hi! Is there a way I can check only one file from a folder? I have a .hash file with all the file hashes, but when I tell it to verify it checks all the files which takes too much time.
The only solution I see is to open the hash file in notepad, find the hash of the file I want to check, copy it, then use simple checksum to compare them.

If you want to be able to verify individual files, make individual .hash files (there is an option in the one-shot dialog for this, or use the "i" switch on the command-line). ;o) Cor

ps. I do however like the idea of being able to pluck a single hash, or group of hashes out of a big .hash file, and only verify those. Hmmm.. Let me think about this!

jumperalex - 20.08.14 10:04 pm

Sorry to double post, I was too slow to edit:

So I see that I can grab "checksum" out of /usr/local/bin and use it on the command line of UnRaid Slackware 14.1 64-bit and Arch 64-bit. Unfortunately, I cannot seem to get "verify" to work, as it tells me it is not executable. "file verify" tells me it is "data". And no command switches seem to work using "checksum", so how would I go about synchronizing?

Am I doing something wrong?
Or do I need to just hope you see it in your heart to create a full command line suite for linux?

I know this now reads like a straight question but it is still of course related to my feature request

For reference Unraid on Slackware 14.1 does NOT have the ability to run 32-bit programs.

The other commands (verify, kverify, etc.) are just symlinks to checksum. In fact, I think something got messed up in the zipping, because they no longer seem to be valid symlinks. It's been a long while since I did any work on the Linux version of checksum. Simply make a symlink to checksum named "verify" (checksum will then recognize that it has been invoked for verification purposes). You can also use the --verify switch with checksum.

At the end of the day, "checksum" is simply a bash script (therefore not 32 or 64 bit) front-end for the tools that are already on your system (md5sum, sha1sum, etc.). There is no checksum "executable" as such, just a script.

You could easily script md5deep or hashdeep to do a similar job.

;o) Cor
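To illustrate that point, the core of such a script really is just a couple of lines around md5sum. A minimal, self-contained create-and-verify pass (demo tree and names invented here; swap in sha1sum for SHA1) might look like:

```shell
#!/bin/sh
# Throwaway demo tree.
root=$(mktemp -d)
mkdir -p "$root/sub"
echo one > "$root/a.txt"
echo two > "$root/sub/b.txt"
cd "$root"

# Create: a recursive "root" hash of every file (excluding the hash file itself).
find . -type f ! -name root.md5 -print0 | sort -z | xargs -0 md5sum > root.md5

# Verify: md5sum re-reads each file and compares against the stored hashes.
md5sum -c root.md5
```

Everything beyond this in a checksum-like script is convenience: switch parsing, masks, logging, and desktop notifications.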

hmmm ok so here is what I typed based on your comment above

root@Tower:~# checksum /mnt/disk1/test.txt

Creating hashes in "/mnt/disk1/test.hash" ..
** To abort, press Ctrl-C **

All done with hashing.

All done in 0 seconds.

I then tried to verify using the switch and got this:
root@Tower:~# checksum --verify /mnt/disk1/test.hash
This is a checksum file. You can verify it.
I haven't tried to symlink yet because well ... I need to learn how to do that first and I have to leave for a Dr's appointment

"You could easily script md5deep or hashdeep to do a similar job."

Yeah, if that were true I wouldn't be having the problems I'm having

Looking inside the checksum script, it looks like, for some reason, that switch isn't implemented yet. Using a symlink does work, though..

cd /usr/local/bin
ln -s checksum verify

;o) Cor

jumperalex - 21.08.14 4:56 am

Success!!! Verify now works. It will tell me what is OK, MISSING, FAILED.

However, it also seems like some of the other switches aren't working. I intentionally renamed a file so that I had a new file and a missing file, and I went inside another file and intentionally changed it. A straight verify operation correctly identified the MISSING and FAILED hashes.

I then tried 'verify x' to delete the missing hash (it did not) and 'w' to reWrite the 'FAILED' hash (it did not).

re: the added file after initial hash generation. Using 'checksum' it adds the new file's hash and skips all the others without using the 's' switch. I tried again with 'checksum s' and got the same result.

I'm guessing that is normal behavior but then I wonder what 's' is for in this context. Since the verify 'w' switch isn't working I'm left to ask how do I refresh hashes besides just deleting the .hash file?

Sorry I'm bringing this all back from the dead, as you said you haven't touched it in a while.

FWIW, I think you'll get a few more customers from UnRaid if this can all get hashed out. Some of us are in fact using checksum over the network, but I know others aren't because it can be pretty slow and even more would love to have a way to automate this with alerts. To do that means command line.

Thanks,
Alex

Firstly, understand checksum is a Windows application. The Linux version is a VERY basic implementation so that I could perform checksum-like operations on my old KDE desktop.

You really need to read the notes at the top of the script itself. The switches are listed there, namely: --kde, --zenity | --zen, --hybrid, --x, --append, --noappend, --algo, --md5, --sha | --sha1, --sha2, --sha3, --sha5, --mask=file.mask

And that's it!

Any of the notes and information you see around this part of the site relates to the Windows version, so none of the switches you mentioned will work. Only those above.

checksum on Windows is under constant development, but I haven't touched the Linux version for a long time - these days I don't use a Linux desktop at all - my Slackware box is CLI only.

I recently added the Linux version to the regular Windows distro in the hope that someone might find it as useful as I had (it seemed a shame not to let it out there) and perhaps even expand its functionality. Feel free!

;o) Cor

Oh crickey!!! I'm really sorry about that. Thanks for pointing me in the right direction. And yes indeed my Unraid Slackware and Arch VM are CLI only as well. That's why I'm not pinging you about getting the KDE version running ;-)

Well, all I can say is that if you have any inclination to beef up the Linux CLI version, I'm pretty sure there is a market among those of us running headless NAS / media centers who worry about long-term bit rot and identifying which files caused a parity check failure.

Hey, no worries! You can head over to this directory, where I did eventually get a checksum on Linux section sort of up. You can grab the Linux checksum distro, as well as a working set of symlinks (the bin directory) in a gzip, to replace the non-functioning set in the distro, and a syntax-highlighted web view of the main checksum script (though your favourite text editor would probably do better).

You are right, there is still a lot that could be done at the Linux CLI end of things. The error reporting needs fixed, for a start; when there are errors, it's not immediately obvious, which it should be. And more of the Windows functionality needs added, of course.

The documentation at the top of the Linux script is slightly confusing in parts. Och, loads! It's only when people show an interest that I develop things beyond my own basic needs. checksum on Windows has come a long, long way since I got it doing most of what I wanted for myself.

Tell me what's important to you.

If there's interest, I'll start some comments over in linux/ specifically for this stuff, implement what I can in what time I have available.

;o) Cor

Gabriel - 03.09.14 11:38 pm

Hi Cor,

just starting to test your checksum sw and it rocks! Congrats.

Would you consider developing a QNAP NAS app (Linux) so I could check that the files that landed on the NAS are still fine? I would definitely pay for it, too

Cheers
Gabriel
http://www.qnap.com/i/en/app_center/

Thanks.

checksum is already quite popular in the NAS community, either creating and verifying hashes over the network, or else using the Linux version (included in the Windows distro) to work with hashes locally on the NAS box. The Linux version is pretty basic but quite functional.

I don't have the time right now to develop a QNAP-specific app, but if you let me know what the requirements are (or point me to an SDK or similar), I could certainly look into making the next Linux version more QNAP friendly, if it isn't already.

;o) Cor

Gabriel - 05.09.14 8:33 pm

Thanks Cor, please find here the details about the QNAP packages http://www.qnap.com/dev/en/

Anything I could tease you to develop it?

Otherwise, where can I find details on how to use the Linux version in console mode, since the QNAP is a Linux box but doesn't let you install a GUI, afaik (and my Linux days are decades behind me)?

Cheers
Gabriel

It's quite simple to operate from the command-line. Once you have the binaries and symlinks in place, you simply do..

checksum /some/path

and to verify..

verify /some/path/path.hash

The Linux version produces "root" hashes, that is, a single .hash file in the root of the directory checked. It's quite basic, but capable enough to hash/verify an entire drive in one command.

See: http://corz.org/linux/software/checksum/

;o) Cor

archedraft - 12.09.14 8:45 pm

Cor,

I just wanted to add to Jumperalex's posts that having an updated corz linux cli would be fantastic. I currently use my Windows box with checksum but I am sure it would check much faster if I could perform checksums directly from my NAS box. Either way, keep up the great work! I'll continue to send as many friends your way as I can.

-archedraft

Friends are good, thank you! Customers, too - you can never have too many of those!

checksum on Linux is only sleeping. There will be more ASAP!

;o) Cor

jumperalex - 17.09.14 11:20 am

The Revolution Has Begun!!!

Thanks Cor, I appreciate you taking the time to set up the Linux section and giving consideration to beefing up linux-checksum. "We" over at unRAID recently had a bit of a "thing" with some silent file corruption due to a bad kernel patch. As you can imagine, there are a bunch of people suddenly very interested in hash checks. Those of us who took the time to run hashes using Windows checksum over the network were feeling very smug and self-satisfied, even if running the verifies took a while.

So thanks again for a great product and for any future work to get more capabilities into the linux scripts.

The Linux section was well overdue. Thanks (to the other *nix users, too) for giving me a kick in that direction. I must admit, I use checksum on Windows over my own network; my workstation is on 24/7, so the checksum "way" of click-done suits me fine. But I realize not everyone works that way, and a local checksum operation on a Linux box would certainly be faster (on most real-world networks, anyway - in the next few years the choice of where to run checksum from could quickly become academic: basically, wherever the most CPU power resides).

In the meantime, checksum on Linux will pick up some needed features, and hopefully, as time allows, more of the checksum finesse Windows users have come to know and love!

Suggestions you make NOW, loudly and/or in numbers, will most likely find their way in, so let me know what is most important to you and I'll get on that first.

;o) Cor

Nacho - 06.10.14 11:34 pm

Hi. How about adding the ability to specify a target path when verifying a non-absolute-path hash file located somewhere other than the original folder?

e.g.:
1.- I'm creating a one file, non-absolute path, root hash of the entire C:\test1 folder, and placing it into C:\myhashes\test1.hash

checksum crq1d(C:\myhashes) "C:\test1"

2.- If I want to verify that, I must move C:\myhashes\test1.hash to "C:\test1" in order to run:

checksum v "C:\test1\test1.hash"

It would be great to add a switch to specify the verifying path without having to move the .hash file, like this:

checksum p("C:\test1")v "C:\myhashes\test1.hash"

It would mean: verify the relative path hashes found in C:\myhashes\test1.hash against the path C:\test1
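Nacho's proposed behaviour — verifying a relative-path .hash file that lives outside the tree it describes — could be sketched like this in Python (`verify_against` is a hypothetical name, and the simple "md5hex *relative/path" line format is an assumption; this mimics the idea, not checksum's internals):

```python
import hashlib
from pathlib import Path

def verify_against(hash_file: str, base_dir: str) -> dict:
    """Verify relative-path hash lines against an arbitrary base
    directory, so the .hash file can live anywhere -- the effect of
    the proposed p() switch. Returns {relpath: 'ok'|'changed'|'missing'}."""
    results = {}
    for line in Path(hash_file).read_text().splitlines():
        if not line or line.startswith("#"):
            continue  # skip blanks and comment lines
        digest, _, name = line.partition(" *")
        target = Path(base_dir) / name
        if not target.is_file():
            results[name] = "missing"
        else:
            actual = hashlib.md5(target.read_bytes()).hexdigest()
            results[name] = "ok" if actual == digest.lower() else "changed"
    return results
```

The key point is that the hash file's own location drops out entirely: only the base directory passed in matters.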

PS: I'm using command-line only quiet mode.

I have something very similar in my 2do list, but in my idea, the file name specifies the path, e.g. C:\some\path\to\c~test1.hash (the "~" denoting path separators).

I think I like your idea better, I will look into adding this.

;o) Cor

ps. I see in my (large) 2do file that I also have an idea for an ini [section] for moved directory mapping. I will consolidate all ideas into a grand solution!

Gerard Lally - 14.10.14 6:21 am

Hi,
nice utility. Couple of issues for me:
1) the text in the help dialog is quite small on a high-resolution monitor (1920*1080), and towards the bottom of the dialog, where the switch examples are, some of the text is completely missing;
2) it would be nice to have a command-line switch for the help file.
Otherwise, job well done. Will sort you out when I get my next cheque ;-)

I noticed recently that 32 bit installs have the final brace chopped off the longest example command (only the notes, not the actual command) because of the extra install path length (i.e. the " (x86)" part). In longer install paths, even more will get chopped off, because the path is inserted dynamically from checksum's actual path on your system.

At any rate, the real help file is here online - my plan is to reduce the "checksum was given nothing to do" dialog to something much simpler, basically linking to this page.

With the advent of the new startup command facility (v1.5.2.0+), I expect fewer people will be seeing that dialog (I am toying with the idea of having checksum simply launch a web page instead of a dialog, so that I can keep it better up-to-date, avoid screen resolution issues, etc.).

You can always get here from checksum's about box.

By the way, for users of KDE-Mover-Sizer (which is surely everyone these days!) these sorts of things are never an issue!

;o) Cor

NPSR - 24.10.14 5:06 pm

Hi Cor,

First, thanks a lot for checksum: very useful and very fast ;-)

Considering I want to monitor a complete folder, when I want to check and synchronise the hash file, I think I have to do it in three steps:
1/ run the check and inspect the log file to ensure change/missing detection is right, and repair any corrupted files.
2/ run the check again with the "delete missing files" and "update changed files" options
3/ run a "synchronise" to update the hash file with new files

It is probably possible to optimize this in the CLI by combining 2/ and 3/, but it is still two steps (unless I missed something).
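The information these steps work from boils down to a four-way classification of every file against its recorded hash. A self-contained Python sketch of that pass (an illustration, not checksum's code; `audit` is an invented name, and `recorded` stands in for a parsed .hash file, mapping relative paths to hex digests):

```python
import hashlib
from pathlib import Path

def audit(base_dir: str, recorded: dict) -> dict:
    """One verify-style pass over a tree, classifying every file as
    ok / changed / missing / new against its recorded MD5 digest."""
    report = {"ok": [], "changed": [], "missing": [], "new": []}
    base = Path(base_dir)
    seen = set()
    for p in sorted(base.rglob("*")):
        if not p.is_file() or p.suffix == ".hash":
            continue
        rel = p.relative_to(base).as_posix()
        seen.add(rel)
        if rel not in recorded:
            report["new"].append(rel)
        elif hashlib.md5(p.read_bytes()).hexdigest() == recorded[rel]:
            report["ok"].append(rel)
        else:
            report["changed"].append(rel)
    # anything recorded but never seen on disk has gone missing
    report["missing"] = sorted(set(recorded) - seen)
    return report
```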

My request:
Would it be possible to get an interface at the end of the check step asking the user to confirm an action for each hash:
- if file changed => default action selected: update hash
- if file missing => default action selected: delete hash
- if file corrupted => keep the hash + the possibility to check the file again (so I could recover the file from backup and re-check that file specifically)
- if file is new => default action selected: add hash

The interface could be like the "simulation" step in SyncBack, for example.

Hoping this is a useful request ;-)

Kind regards,
Nico

P.S. If there is a way to translate the software, I offer to translate it in French.

I do like the idea of having default actions based on error conditions, e.g. "if file changed => default action = update hash". I will look into adding something like this in the future.

It's unlikely an extra GUI will be involved; more likely it will be a simple ini setting/switch to control the behaviour.

Translation is already on my 2do list. At the moment there is no mechanism for it, but if all goes to plan, it will eventually be possible to add simple translation mapping files for any language.

When this happens, I will be in touch!

Thanks!

;o) Cor

frito - 02.11.14 6:03 am

First: Very NEAT tool! I like the site design too.

Second, I've got a question/feature req.: Is it possible to batch compare Renamed files (with the same content)?
I tried with verify but it is sensitive to the original file name (even if the checksum matches).

cheers
f

Thank you!

Yes, checksum is not only "sensitive" to the file name, it absolutely relies on it, and does a lot of behind-the-scenes voodoo to ensure any file it attempts to hash matches a required file listed in your .hash file, regardless of what path scheme was used, and where we are in the directory tree.

For checksum to predict user renaming would involve hashing irrelevant files (gasp!) to even know if a "hash match" was possible, which takes time. And then it would have to know somehow if that really was a renaming and not simply two identical files.

What is more likely to happen is that I make my renaming tool (MangleeZee) "checksum-aware", switching out the names of affected .hash files. Ideally it will search up the tree for any .hash files potentially containing the renamed file(s) and switch them automatically. I'm still thinking about it.

It's on the 2do list!

In the meantime, you can compare any two files (or folders) with simple checksum, which is installed alongside checksum. I keep a copy in my SendTo menu and often send two files or folders to it for a quick compare job.
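The content-only comparison simple checksum performs — same data, names irrelevant — can be sketched in Python (illustrative only; `same_content` is an invented helper, not the tool's actual code):

```python
import hashlib

def file_digest(path: str, algo: str = "md5") -> str:
    """Hash a file in chunks, so arbitrarily large files fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def same_content(a: str, b: str) -> bool:
    """True when two files carry identical data, whatever their names."""
    return file_digest(a) == file_digest(b)
```

This is why renaming defeats name-based verification but not content comparison: the digest depends only on the bytes.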

;o) Cor

Skippy - 21.01.15 3:02 pm

Hi,

First, great tool!! I discovered it yesterday.

I have a suggestion

Can you add an option to the verify procedure for folders? Today you check the hash file to determine whether the previously hashed folders are OK. That's good, but not 100%, because you check only the files that were hashed previously; there is no information about new files that have since been introduced into the folders.

For me, you would get "yyyyyyy.xxx is a new file" in the log, and with this option you would also update the previous xxx.hash in parallel, ready for the next verification, so next time this file will be compared normally.

Thanks

Regards

Skippy

After adding the new files, perform a hash create on the drive, using the synchronize option. Then the .hash will contain hashes for ALL the files. ;o) Cor
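What synchronize does during a create run — leave existing entries alone, append hashes only for files not yet listed — might be sketched like this (a hypothetical Python mock-up of the behaviour, assuming a simple "md5hex *relative/path" line format; not checksum's code):

```python
import hashlib
from pathlib import Path

def synchronize(hash_file: str, base_dir: str) -> int:
    """Append entries for files not yet listed in hash_file, leaving
    existing lines untouched. Returns the number of lines added."""
    hf = Path(hash_file)
    existing = set()
    if hf.exists():
        for line in hf.read_text().splitlines():
            if " *" in line:
                existing.add(line.split(" *", 1)[1])
    added = 0
    with hf.open("a") as out:
        for p in sorted(Path(base_dir).rglob("*")):
            if not p.is_file() or p.suffix == ".hash":
                continue
            rel = p.relative_to(base_dir).as_posix()
            if rel not in existing:
                digest = hashlib.md5(p.read_bytes()).hexdigest()
                out.write(f"{digest} *{rel}\n")
                added += 1
    return added
```

Because new entries only ever go on the end, a quick glance at the tail of the .hash file shows exactly what a synchronize run picked up.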

Skippy - 21.01.15 4:10 pm

Another request:

Can you add the possibility, when you use the checksum launch modifiers to add/remove some options, to save the resulting hash/verify command to a txt or bat file in checksum's root folder? That way you could easily use it again.

thanks

Skippy

You want checksum to copy its current command-line to a text file? Sure that's fairly trivial. Coming up. ;o) Cor

Diamond - 25.01.15 4:40 pm

Hi Corz...
Hope this isn't a stupid question... but how can I minimize simple checksum to the tray on exit (pressing the close 'X' button)? Is there a way to accomplish this with the .ini file, and if so, how?
I'm using 4t Tray Minimizer (freeware) at present, but I keep forgetting to hit 'minimize' and I keep closing simple checksum accidentally (very annoying!)

Pressing close (or hitting Esc) will always exit simple checksum. You can minimize it to the tray by clicking its tray icon.

Or you might prefer to set the transparency higher and leave it on the desktop somewhere.

;o) Cor

Gryph - 02.02.15 12:46 am

Hello Corz,

I'd like to exclude all system folders and .svn folders (among others) from hashing. Also I'd like to get both SHA1 and MD5 checksums for all files in a single run. What is the easiest way to do so?

At the moment I use two runs (for two hashes) and command line string
cqyrb1snkx(".s*n","progra*s","*ecycle*","cache","*pplicati*ata","*ygwin*","*ython*")d(k:\infopack)

It works, but... )

Alternatively, I tried feeding checksum a list of files I need hashes for (filenames exported from Locate32), but it takes too long: a checksum process is started and terminated for each file.

Thanks,
Gryph

There is currently no facility to add multiple hashing algorithms in a single run, though I will consider this.

When I find myself doing this sort of thing with any app, especially if there are large command-line parameters, I turn to Batch Runner.

;o) Cor

Paul - 13.02.15 4:11 am

Hey there!

I have the same request as Nacho. I'd like to be able to specifically state and compare a hash file to a directory, but I don't want the hash file to reside inside the directory. I can see how to state which directory the hash gets written to, but I'm confused about how to actually verify or compare afterwards without copying it into the folder.

If you use absolute paths, you can simply click it wherever it is.

Also, I'd like to be able to run the create flag and consolidate nested folders' hashes into the top-level hash. Instead of having a hash file per nested folder, I'd like to have just one hash file.

Paul

checksum has always been able to do this. See the "root" hashing option ("1" on the command-line) ;o) Cor

Keith Douglas - 27.04.15 12:08 pm

I love this tool.

How about being able to checksum just the image-portion of a JPEG? That way if, for example, I update the keywords, the JPEG checksum will still read as OK.

I was thinking of doing something like having ExifTool strip all metadata, then doing the checksum.

exiftool -all= image.jpg
<do checksum on resulting file>

Similarly for video file types that support metadata -- although that subject seems to be much more murky.

Thanks!

It's something I've been thinking about for a long time. Specifically, with MP3 files (so I can change the ID3 tags without re-hashing). But even with jpegs there is EXIF, IPTC, Comments and more. The entire subject is murky!

And quite a big job. Part of the work (hashing only part of a file) has been done with the recent embedded hash facility, but there is still a lot to be done. It's unlikely I'll get around to this any time soon unless some large corporation purchases a large license and wants it badly!

;o) Cor

Keith Douglas - 28.04.15 11:58 pm

... follow on to the JPG request above ...

I've made a python script to open and checksum the actual jpg image (so no metadata). I'd like to re-use your ".hash" container.

Would you add a ".ini" switch to ignore unknown checksum types? For example, if I specify hash-type "md5_jpg" in the hash file, your tool would ignore it rather than reporting a changed/corrupt file. I'd like to continue to use your tool for all but JPG files.

(And also, unknown checksum types should not be overwritten)

The hash types in the comment (date/time) line are, in fact, not used. One day they may be. So currently, it's not trivial to skip hashes based on what these lines contain. I will look into adding a mechanism for this. ;o) Cor

<edit>Incorporated into v1.7.1.*</edit>

Guy Gordon - 16.06.15 9:41 am

Great tool, thanks. Have a question about the synchronization option. If I've got a folder full of a huge music collection and I want to synchronize the hashes once every week (for example) to a single root .hash file, and I want it to update the hash for any changed files (from things like tagging & art changes, etc) and remove any from the .hash for files that are no longer in the collection, what is the process to do that? There seems to be options in the verify for updating changed, but I can't seem to get a create process that would pick up on new stuff to also be able to update changed and remove missing. I'm doing cloud backup of the .hash files so I can always go back to a previous version if I want to. I'm just trying to keep the root .hash file in true sync with what's currently (and up to date) in the collection.

Thanks!

You need two commands for this.

First, a verify command (which you can configure to remove hashes for missing files, update changed hashes and such), and then a create command to add any new hashes.

Note: you can put both commands into a single scheduled task.

;o) Cor

Guy Gordon - 22.06.15 5:57 pm

Ah, I was wondering if a separate verify step was needed. But it seems to be verifying all files when I try, not just the changed/added ones. Is there a switch I'm not noticing that only checks files with a newer date/time stamp, and skips the rest that haven't changed? I'm checking 20+ TB of stuff, and the "vrywxq1" switches I'm testing with take a long time, slowing down to read all the files during the verify step, even the ones that haven't changed.

But how do you know they haven't changed? There may be corruption, and unless you check them all, you won't know about it. Checking only "new" files is a recipe for disaster.

This sounds like the sort of task you would want to schedule for when you are asleep.

Having said that, I think I know where you are coming from; you want checksum to perform the "special" functions that can only be done during a verify operation without performing an actual verify operation.

But the problem is, checksum can't know if a file is changed, unless it checks it!

By the way, the "y" switch is only for checksum creation. More details here.

;o) Cor

Guy Gordon - 27.06.15 11:30 pm

I'm using Checksum as a "just to be safe" fallback. I'm checking against 15 drives in an unRAID array, more than 20 TB across a GB network link. It takes days to run a full check, and it is a lot of needless drive activity when those drives could be spun down, particularly with directory caching not spinning up drives unless actual file access is needed (iow, more than just name, date and the usual directory info). unRAID is already doing full monthly health and parity checks. I'm just wanting to use Checksum as an extra level of safety, just in case the unthinkable happens and I suffer more than one drive death at the same time. In that case, the other drives would still be readable and contiguous file data would still be on those; what Checksum would do for me is let me verify the failed drives against the hashes (in an unlikely multi-drive failure situation) and see what on those drives is still OK. The ones that didn't fail could also be checked, of course, and added back into a parity-protected array again.

To put it more simply, it's really not worth me using Checksum if I have to read more than 20TB of stuff across a network for days every time I want to simply update the hashes for changed/added files (which I'd probably do on a weekly basis). I simply want a mirror of hash info for what's there, as of the last time I ran Checksum. I don't recall how long the first run across everything took, but it was at least a couple of days. It doesn't take too long to do the create pass to add in any new files, and of course it only needs to spin up drives when it does have new files to read & hash. If, however, I can't trust the hashes on files that were already there to be up to date with any edits, tag changes, etc, then it's simply not worth the effort and resources of maintaining.

You stated that Checksum can't know if a file changed unless it checks it. Then what's the date/time stamp in the hash for? If it sees the file is newer than the stamp in the hash, then it updates the hash. At least that's what I'm trying to get it to do.

If Checksum can't do this, do you happen to know if there's a different app out there that can? I haven't gone hunting to see as yet. Was hoping this one could, as I really like everything else about it so far.

Apologies for the slightly delayed reply. The whole Google Malware fiasco was more than a bit consuming.

Back on topic.. If all you want to do is add hashes for new files, checksum can rip through your 20TB array in no time, adding hashes only for files that don't already exist in your .hash file(s).

Updating changed files is where it gets tricky. As it stands, when creating hashes, if checksum discovers a file that already has an entry in the corresponding .hash file, it moves immediately onto the next file.

During verification, checksum has the option to update changed files, as you know. Ideally, this is something that would only be done after a full verification of the volume(s), and subsequent manual inspection of the resultant log file.

The trouble with simply enabling some flag and having checksum update all changed files on a scheduled basis for entire volumes is that it only takes a single pass to update the hash of a recently CORRUPTED file, which also has a changed modification time. And WHAM! Now you have a false checksum; file corruption goes unnoticed and accumulates, until it's too late. This is, of course, exactly the sort of situation checksum was designed to prevent!
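The point bears demonstrating: a file's modification time can be preserved while its content changes, so only re-hashing actually detects corruption. A self-contained Python demo:

```python
import hashlib
import os
import tempfile

# Create a scratch file with known content.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(b"important data")

before = hashlib.md5(open(path, "rb").read()).hexdigest()
stat = os.stat(path)  # remember the original timestamps

# "Corrupt" the file: overwrite the first byte...
with open(path, "r+b") as f:
    f.write(b"X")
# ...then put the original mtime back, byte-for-byte.
os.utime(path, ns=(stat.st_atime_ns, stat.st_mtime_ns))

after = hashlib.md5(open(path, "rb").read()).hexdigest()
assert os.stat(path).st_mtime_ns == stat.st_mtime_ns  # timestamp looks untouched
assert before != after                                # ...but the data changed
os.unlink(path)
```

Silent corruption won't even do you the courtesy of calling `utime` — the timestamp simply never changes in the first place, which is exactly why mtime-based skipping can't be trusted.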

It's only after a full verification of the files that you have the information (a comprehensive log) required to make that sort of decision for each file. If all the changed files in the log were mindfully changed, you can go ahead and rewrite hashes and timestamps for those files.

Having this happen automatically, or on a schedule, is a recipe for disaster. And adding a flag to skip unmodified files encourages scheduled, mindless hashing, defeating the whole purpose of hashing.

Having said all that, I do see the utility of having checksum update the hashes of only changed files for many users and actually slipped this exact functionality into the latest release (1.7.0.1), which I put up yesterday (slightly ahead of schedule thanks to the GoogleDebackle).

Most unusually for a general (non-beta) release, this functionality hasn't been tested much, and there is a known issue with verification inside Daylight Saving Time (BST) adding an hour (or perhaps other amounts) to the calculated modification times of files hashed outside DST (somewhere deep in the Windows API!). If you don't use the "m" switch during verification, there is no impact on the rest of the code-base, so I left it in as an "experimental feature". Have fun! And do let me know of any issues.

Although it inevitably takes time, a full verification is always advisable before switching on any of checksum's auto-update flags. But if you know your drive and what you are doing, knock yourself out!

Please do enable logging and do examine those LOGS!

;o) Cor

archedraft - 07.07.15 2:21 am

Quoting my earlier post..
archedraft - 12.09.14 8:45 pm

Cor,

I just wanted to add to Jumperalex's posts that having an updated corz linux cli would be fantastic. I currently use my Windows box with checksum but I am sure it would check much faster if I could perform checksums directly from my NAS box. Either way, keep up the great work! I'll continue to send as many friends your way as I can.

-archedraft

Friends are good, thank you! Customers, too - you can never have too many of those!

checksum on Linux is only sleeping. There will be more ASAP!

;o) Cor

Just wanted to check in and see if checksum on Linux has awoken!

Any updates to the Linux version will be noted on the checksum for Linux page. ;o) Cor

Nacho - 08.07.15 6:21 pm

Hi! Any updates on the ability to specify a custom target path to verify hashes (see my last post Nacho - 06.10.14 11:34 pm)?
Also, Paul asked about it (Paul - 13.02.15 4:11 am). You answered "If you use absolute paths, you can simply click it wherever it is.", but that's not an option if you want batch processing and you are trying to verify a hash file with relative paths from another place, for instance from a backup on a USB drive where drive letters change.

I also would like to suggest the ability to use relative paths while creating and verifying hashes:

checksum cr1 "..\myfolder"
checksum v "..\myfolder"

That way you could run a portable checksum.exe from a removable device, no matter what drive letter is assigned.

Thanks again for your time and effort

All updates are posted in the usual place!

As for relative operation, checksum can already do this. Ensure you correctly set the working directory (or run from the portable location) and you should be good to go.

;o) Cor

Nacho - 08.07.15 7:50 pm

Hi again.

"As for relative operation. checksum can already do this. Ensure you correctly set the working directory (or run from the portable location) and you should be good to go."

I haven't run the installer. Simply copied checksum.exe, simple checksum.exe and checksum.ini to a folder (R:\checksum). Then I open a CMD in there. When I run checksum cr1 "..\myfolder" all I get is "Path does not exist!". And obviously R:\myfolder exists. Any idea?

Thanks.

checksum currently does not support relative path traversal. It will happily translate ".\" and "..\", but won't accept anything after those. If you really want this, I could look into adding support. ;o) Cor

Nacho - 08.07.15 11:08 pm

"(...) If you really want this, I could look into adding support."

Thanks for the heads up and being so well disposed, Cor. If I had to choose, I'd rather insist on the ability to specify a custom target path while verifying. I find that feature way more useful.
Since you can specify a destination output folder for the hash files d(c:\myhashes), you could use the same command-line switch to specify the target folder for the verification of a non-absolute path root hash file. That way you could store all your hash files wherever you want and won't need to copy them back to each folder to verify them. Useful for batch automation, sorting things and processing.

That's actually a really elegant solution to an existing dilemma: how best to specify absolute paths for relative hash files. I've been pondering this for quite a while. Thanks!

Now I must find an equally elegant way to implement this internally. Leave it with me.

;o) Cor

ps. both features have now been implemented in the new beta (1.7.1.*), available soon.

Guy Gordon - 14.07.15 5:26 pm

Having said all that, I do see the utility of having checksum update the hashes of only changed files for many users and actually slipped this exact functionality into the latest release (1.7.0.1), which I put up yesterday (slightly ahead of schedule thanks to the GoogleDebackle).

Thanks for the feature update. Very cool. Been busy the last week or two and haven't gotten back to messing with this. I have some time during the next couple of days to do some testing, and will let you know the results. I get the points, and they are certainly valid points, but my specific use case is perhaps a little less common. I do have the .hash files stored in a location that is backed up to CrashPlan (as are the more important of the files that I'm hashing), so I've got revision retention on the .hash files themselves to go back to in particular situations. Like I said, the hashing is something of an extra tertiary level of confidence, and I'm aware of the tradeoffs of how I'm wanting to do it.

I'll let you know if I run into any weirdness after testing.

TiM - 19.08.15 10:26 am

Hi - is there any timescale for getting the erroneous second set of Create Checksums & Verify Checksums entries (which throw up "This file does not have a program associated with it..." errors) removed from the File Explorer menu when right-clicking a folder in the left-hand pane?

This has been there in Win 8, Win 8.1 & Win 10.

Cheers

TiM.

It sounds like you are reporting a bug in checksum's installer. My "timescale" on fixing bugs is "immediately on hearing about them", so there is no timescale, as I don't know what you are referring to.

If you want to report a bug in the installer (the installer creates Explorer context menus, not checksum), please email me with details of your system, OS, how you installed checksum, copy of checksum.ini and so on. But please, by email.

Whatever the cause, a few seconds in Regedit would certainly fix it.

;o) Cor

My apologies for not reading that I should email - though obviously I can't provide attachments using the webform on the contact page.

No worries. Hey, I might look into adding an attachment facility to the mailer. That could be useful. Cheers. ;o) Cor

Charles Barnes - 23.08.15 6:48 pm

A very useful tool indeed. One small addition would make it even more useful to me: can you add a switch so that creation and/or verification optionally wholly ignore NTFS directory junctions and symlinks (ie don't follow them)? I have a few of those and of course they point, in my various backup levels, always to the original directories which may well have changed since the backup. So I sometimes get entirely correct yet spurious CHANGED entries in the logs.

I can imagine how this could get annoying. I will look into it. ;o) Cor

magicool - 03.09.15 6:52 am

Hi! If I (or someone else) add new files to a directory (after creating the checksum in that directory's root) and then verify, the tool reports that all is OK, ignoring the new files. Is it possible to get a warning about the new files? Or a complete report containing all the new, not-yet-hashed files?
Thank you!

During verification, checksum is examining your .hash file(s) and verifying the hashes found within. It isn't examining your file system.

If you want to add hashes to an old .hash file, simply run checksum over it (in create mode) with the synchronize option enabled. New hashes are added to the end of the .hash file, so it's easy to see what was added. You can even have checksum timestamp new entries.

;o) Cor

magicool - 07.09.15 3:55 am

Hi!
I hope you will consider adding this as an option in the future releases, so we can do one click folder integrity check.
Thank you!

It won't happen. That's just not how checksum works. If you want checksum to perform two operations, make a script/schedule/macro/whatever to do both tasks.

We already have a one-click integrity check.

;o) Cor

Pedro - 02.10.15 5:48 am

Hi,

I'm testing your checksum here and found it amazing, thank you.
I have only one question for now: why do you use the BLAKE2s implementation? Do you use BLAKE2b for x64? I just read about this on the BLAKE2 website.

Thank you

Thanks. I think it's pretty amazing, too!

As for BLAKE2, different algorithms produce different checksums, so you would think that the main reason to use the S variant would be compatibility, but in actual fact, it was primarily for speed. On all my test systems, x64 included, the S variant runs faster.
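For the curious, both variants ship in Python's hashlib, which makes the practical difference easy to poke at — BLAKE2s produces a 256-bit digest by default (and targets 8- to 32-bit platforms), BLAKE2b a 512-bit digest (targeting 64-bit platforms). This illustrates the algorithms themselves, not checksum's own implementation:

```python
import hashlib

data = b"The quick brown fox jumps over the lazy dog"

# BLAKE2s: 256-bit (32-byte) digest by default.
s = hashlib.blake2s(data).hexdigest()
# BLAKE2b: 512-bit (64-byte) digest by default.
b = hashlib.blake2b(data).hexdigest()

print(len(s), len(b))  # → 64 128 (hex characters)
```

Both also accept a `digest_size` argument, so either can be dialed down to a shorter digest when .hash file size matters more than collision margin.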

;o) Cor

DavidS - 21.10.15 5:33 am

I am a developer both personal/business and for a company.
I have need for a checksum library that I can call that would give me a checksum for a file.
Of course I could find the code to do so but from reading about your checksum I would much rather have a library from you.
Is this possible?

Yes. Mail me. ;o) Cor

ljm42 - 09.11.15 6:47 am

Checksum is great, thank you for it! I have one request/suggestion:

I have individual_hashes=false in checksum.ini, so when I right-click on a folder and choose "Create checksums", it creates a single foldername.hash file in that directory, which is perfect.

However, if I then right-click on a file inside that folder and choose "checksum", it ignores the foldername.hash file and creates a redundant filename.hash. Could the app be modified to recognize that the foldername.hash file exists and use that instead of creating a new one?

Put another way, I'd prefer to not have any filename.hash files, only foldername.hash

Thanks for considering it

I'm not considering it!

Checksum isn't ignoring anything. You specifically asked for an individual hash of a file. checksum is doing exactly what is expected.

If you only want folder hashes, only hash folders!

;o) Cor

Scott - 04.12.15 4:01 pm

Nice Utility. Hopefully I can get this working the way that I want.

I would like to script / batch file the success/failure of the checksum command.

Are return codes / error levels documented somewhere?

Thanks,

Scott

Gavin Greenwalt - 17.12.15 7:49 am

I had a bunch of checksums fail because I was missing /path/to/file/._filename.ext files. Looks like it's another form of thumbnail file that would make a good addition to the default ignore list:
"._*"

Alternatively, they're also always in "*\__MACOSX*" directories.
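An ignore filter of this sort is straightforward with shell-style patterns. A Python sketch (the pattern list here is illustrative, and the matching semantics of checksum's real ini settings may differ):

```python
import fnmatch

# Example ignore list -- the "._*" pattern suggested above, plus a few
# typical entries. Not checksum's actual defaults.
IGNORE_PATTERNS = ["._*", "*\\__MACOSX*", "Thumbs.db", "*.hash"]

def is_ignored(relpath: str) -> bool:
    """Shell-style (fnmatch) test of a path against the ignore list."""
    return any(fnmatch.fnmatch(relpath, pat) for pat in IGNORE_PATTERNS)

print(is_ignored("._DSC001.jpg"))       # AppleDouble resource fork: skipped
print(is_ignored("photos/DSC001.jpg"))  # ordinary file: hashed
```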

WinRAR now optionally stores BLAKE2 hashes within its archives, and you can also set a switch which puts that entire file list at an accessible location for fast reads.

That feature you describe with "folder compare" is intriguing. Wonder if it would be possible to do the same for winrar archives? Drop the folder and the backup into a window then have checksum spit out any differences?

We use the standard duplicity shell for Linux backups (acceptable), but Windows backup software has pretty much always been crap. If you use WinRAR for incremental backups, though, you get those BLAKE2 hashes + Reed-Solomon error correction. For data backups, probably nothing better. Unfortunately, tools for making use of these features are lacking.

Curious! I don't use WinRAR, but if you want to send me details of where I can find out how and where it is storing the hashes, I will certainly look into it. ;o) Cor

Ron - 13.03.16 8:18 pm

Hello. Sorry in advance for asking (most likely) a stupid usage question. I will use your utility to check the health of all my data, which I had to transfer (copy) to a new HD on a new PC. This means thousands of subdirectories, and therefore thousands of hash files. How can I compare them? I thought of the following: search for and select all the hash files, move them to a new location on the PC, and pack them all into a single zip file. Do the same for the source drive too. Create hash files for the two archives, and compare those? Am I being an idiot? Is this already a function of your utility? Is there a user guide for it? Thank you.

On this very page! See here. ;o) Cor

Ron - 13.03.16 11:36 pm

That was easy...I told you I was an idiot. Thanks

Klaus Hummel - 18.04.16 4:30 am

I like that checksum creates hash files for every subfolder. I use it for a large archive of jpegs.

But is it also possible to create a top-level hash file covering all the hash files in the subfolders? Otherwise it is not possible to detect a lost subfolder (because both the subfolder and its hash file are lost...).

This happened to me because a subfolder was accidentally deleted by my daughter...
(Thank God I found the data on old DVDs again...).

Klaus

Klaus Hummel - 19.04.16 3:53 am

hash of all hash files...

Finally, I found a solution. Now I can create a hash file covering all the subfolder hash files.

The solution is to create a portable checksum by copying checksum.exe and checksum.ini into a new folder. Then I removed the "hash" file type from the "ignore_types" setting in the ini and called checksum from the command line:

checksum.exe cr1m(*.hash)j(rooth) path_to_folder

Now I have a root.hash file listing all the subfolder hash files.
The Explorer menu item "Verify checksum..." still works, and also checks the root hash file.

So my last request can be closed.

There is no need for any of that! You simply use the "1" switch (root hashing). Hold down the SHIFT key when you launch checksum for this, and many more options. ;o) Cor

Klaus Hummel - 23.04.16 4:53 am

Hi. I think that's not right. I tried your way before I realized that checksum ignores files with the *.hash extension. The checksum.ini setting "ignore_types" is not available in the user interface (or I am blind... :-) )
So I have to use a solution with a modified checksum.ini file.

Rusty - 16.05.16 5:44 pm

Hi

I would like the following features:
1 A tab to display the files in a scanned folder whose hashes do not match.
2 Same as "1", except displaying the files that do match.
3 A window showing "All files in a folder match" or "Do not match".
4 The ability to export the results in some form to Excel, probably CSV.

Checksum is a great program. However, for large folders I have found it very tedious to identify the copied files that do not match the originals.

Phil - 03.06.16 10:33 pm

Hi,

I sent an email a few days ago, before noticing there is a feature request blog, so here it is:

I have data folders where I often remove old files and add new ones. Many files never change.
Most are 3-8 GB files.

I regularly run "create checksums..." on the folders to add new files. However, during creation there is no option to update the hash file by removing entries for non-existent files.

Because of the folder size, I don't like running a "verify" as often as I run "create checksums" for new files; I verify only on specific timed occasions.

For now, I can remove the hashes of missing files only during a verify of the whole folder contents.

What about making the "delete_missing_hashes" option also work during "create", or just adding an option to clean orphans out of hash files without verifying each data file?

Adding it during create seems a good choice as it will keep the current context menu unchanged.

Best regards

I have touched on this before. Basically, during creation, checksum isn't looking at the existing .hash file, so it's not gonna happen. ;o) Cor
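In the meantime, pruning orphan entries from a plain md5-style hash file can be scripted without re-hashing anything. A sketch follows, with invented demo files; checksum's own .hash format may differ in detail, so treat this as the general idea rather than a drop-in tool:

```shell
# Sketch: remove entries for files that no longer exist from an md5-style
# hash file, without re-hashing anything. Demo files are hypothetical.
mkdir -p /tmp/prune-demo && cd /tmp/prune-demo
echo "data" > keep.txt
md5sum keep.txt > folder.md5
echo "d41d8cd98f00b204e9800998ecf8427e *gone.txt" >> folder.md5   # orphan entry

# Keep only lines whose referenced file still exists on disk.
while IFS= read -r line; do
  f=${line#* }      # drop the hash field
  f=${f#\*}         # drop a binary-mode asterisk, if present
  f=${f# }          # drop a text-mode leading space, if present
  [ -e "$f" ] && printf '%s\n' "$line"
done < folder.md5 > folder.md5.clean
mv folder.md5.clean folder.md5
```

After running it, folder.md5 still lists keep.txt but the gone.txt orphan is gone; no file data was read at all.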

Luke - 29.06.16 3:12 pm

It would be great if checksum had options to be more command-console friendly:
Don't launch a system tray icon.
Print progress to standard console output.
Print only a success/fail line to standard console output.
Print the list of switches to standard console output.

Thank you!

Ralph - 10.07.16 7:53 am

Nevermind - it works as I wanted.

Lester - 20.07.16 8:47 am

I am trying to create an MD5 hash from a string, i.e. "20160720Lester", to use as a password key on a website.
I can generate the string, but not the hash, in batch files. Can I do this using this app?

checksum won't do this (though it seems like a nice idea for a future version of checksum and/or simple checksum). In a batch file, try this. ;o) Cor
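On Unix-like systems (or Windows with something like Git Bash installed), hashing a string rather than a file is a one-liner; a sketch, noting that `printf '%s'` matters because `echo` would append a trailing newline and produce a completely different hash:

```shell
# Sketch: MD5 of a string (not a file).
# printf '%s' avoids the trailing newline that echo would add,
# which would change the resulting hash entirely.
printf '%s' "20160720Lester" | md5sum | cut -d' ' -f1
```

The output is the 32-character hex digest, ready to paste wherever the password key is needed.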

Steve - 28.07.16 2:27 am

Not sure if this is the place for this...

I'm interested in using hashDROP with checksum, but Chrome flagged the download as malicious. When I check the link at VirusTotal, it returns a lot of hits:

https://www.virustotal.com/en/file/355a385d179b2fd69ffee8ace86fec3bd74884a60cf7be1a8faf433b4d3dfe99/analysis/

Is there some explanation for this other than the file being infected?

It's as good a place as any!

Hey, funny link! But also fascinating. I've seen this happen with some of my own (AutoIt-based) releases.. One or more brain-dead vendors flag the app for some generic-unknown-whatever-pattern-based-shite-they-saw-in-some-app-somewhere, then this pops up in someone's scanner and they report it, and other brain-dead vendors blacklist it, and so on and so on. 50% is an impressive ratio for this kind of Chinese-whispers misinformation!

At any rate, I know the author of HashDrop (seVen was one of checksum's original beta testers) and I know for a fact his downloads are completely malware-free.

If in doubt, run it in Sandboxie inside a virtual machine.

;o) Cor

Ralph - 04.08.16 5:18 am

Is there any way to ignore directories on verify? I don't want to verify the $RECYCLE.BIN directory. The x($*) option works on create, but not on verify.

KP - 06.08.16 7:30 am

Hi Cor,
Can you add SHA256, SHA384 & SHA512 to your program? I have a need to check files that are in SHA512. It would be great to be able to use your program for any and all checksums!

Thanks!
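Until such support exists, one interim option on Unix-like systems is GNU coreutils' sha512sum, which creates and verifies SHA-512 hash files in the familiar "hash  filename" layout. A sketch, with made-up file names:

```shell
# Sketch: create and verify a SHA-512 hash file with GNU sha512sum
# as an interim stand-in. File names are invented for the demo.
mkdir -p /tmp/sha-demo && cd /tmp/sha-demo
echo "big important data" > data.bin
sha512sum data.bin > data.sha512   # same "hash  filename" layout as .md5 files
sha512sum -c data.sha512           # prints "data.bin: OK" on success
```

sha256sum and sha384sum work identically for the other two algorithms.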

Support for FreeBSD, PC-BSD, Mac, Ubuntu Linux, and other Linux/BSD systems would all be nice. I saw you did a little bit with a Bash script; maybe something like Lua or Python would be better for that. I quit creating shell scripts (bash, sh, zsh, csh, tcsh) because the code is completely useless in terms of platform independence. I'm now choosing pretty much anything else, and I've found it much easier to use multiple popular programming languages instead of trying to port obscure shell scripts.

swg42 - 26.09.16 9:45 am

Has anything changed recently with hashDROP? Both Chrome and my antivirus (Avast) are blocking it.

Two months ago in July, VirusTotal reported only one detection for the site (http://imgur.com/9K1Tqro), but now there are three positive results.

Edit: Nevermind, my apologies, just saw Steve's post from two months back.

Grant - 27.10.16 9:33 am

It would be nice to have an option to report, when verifying, on files that do not have hashes. Maybe even add the checksums right then if a switch is used, like it updates changed files now.

It would be fantastic to be able to run it from a script and have the script wait while it runs, instead of it running separately with the command returning immediately. Maybe even return an exit code in %ERRORLEVEL% if there are any errors. That would make it much easier to write scripts that automatically verify files and email me if there are errors. Or maybe just an option to send an email if changes are detected?
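The pattern being asked for looks something like the sketch below, with md5sum -c standing in for checksum (since checksum itself currently returns immediately); the verifier's exit code drives the alert step, which is left as a placeholder echo:

```shell
# Sketch: synchronous verify-and-alert, with md5sum -c as a stand-in
# verifier. Demo files are invented; the "alert" is a placeholder echo.
mkdir -p /tmp/exitcode-demo && cd /tmp/exitcode-demo
echo "payload" > file.bin
md5sum file.bin > file.md5

# The if-condition waits for the verifier and tests its exit code.
if md5sum -c --quiet file.md5; then
  echo "all files verified"
else
  echo "verification FAILED"   # e.g. send an email alert here
fi
```

In a Windows batch file the same shape would use `start /wait` plus an `IF ERRORLEVEL` check, once checksum sets an exit code.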
