MD5 Calculator
EDIT 2013-02-11 : new version, new shot :-)
EDIT 2008-07-06 : deleted obsolete and desperately buggy 2 years old version, see below for new info and shot
Download now: http://jafile.com/uploads/dos386/fbmd5.zip (118 KiB)
http://ibiblio.org/pub/micro/pc-stuff/f ... /fbmd5.zip
http://ibiblio.org/pub/micro/pc-stuff/f ... /util/file
Last edited by DOS386 on Feb 13, 2013 2:42, edited 8 times in total.
Had to edit the following line:

Code: Select all
COMMA=Ucase$(Command$(FFF2+1))

to

Code: Select all
COMMA=Command$(FFF2+1)

so it works on Linux. It doesn't report performance info; it says the file is too small, and it's bigger than the one in your screenshot. It gives the same result as the system's md5sum command.

*edit*
Looking at your code, apparently it's reading over 5 MB/s on an Athlon64 3200, 1 GB RAM, Ubuntu 6.06.
Too small

sir_mud wrote:
> It gives the same result as the system's md5sum command.

Good. :-)

> Doesn't report performance info, says file too small and it's bigger than the one in your screenshot.

Huh. Have you a bigger file? >100 MB, for your CPU ;-)

> looking at your code apparently it's reading over 5 MB/s on an Athlon64 3200
MD5 licensing
> I want to use your code in a closed-source project.

It's not mine. You can consider my "contribution" as public domain. IIRC I based it on some older MD5 code ported to FB by others. Should be no problem.

> In which form should I give you credit (About, Help)?

Give credit to RSA in the About box or the Help ;-)

> So, I must edit your code to get it working properly?

No problem, it's a BSD-like license, not silly LGPL ;-)
sir_mud wrote:
> Had to edit the following line
Fixed.
What's new:
- Support for files > 4 GiB (up to 128 GiB), also in DOS
- Added progress indicator
- Added size override, can process less than full file size
- Faster
- Fixed sir_mud's "bug"
- Deleted floats from DOS version, should work on 80386 and 80486 without FPU also (tests welcome)
Last edited by DOS386 on Feb 11, 2013 4:17, edited 3 times in total.
DOS386 wrote:
> - Deleted floats from DOS version, should work on 80386 and 80486 without FPU also (tests welcome)

Sorry, but you still need the FPU emulator (I used WMEMU387.DXE) in order to run it. I tested my 486 SX/25 yesterday. It does run with emulation though, and it's quite fast (well, 3x as fast as Blair's 16-bit FreeDOS version: 24 secs. vs. his 1 min 12 secs. on a 1.3 MB file, no cache loaded). I got approx. 320+ KB/sec (which apparently wasn't fast enough for you, heh, "BUH it's sloooooooow!").
Already e-mailed you all this info, just posting it here for others' curiosity. ;-)
P.S. P4 2.52 GHz "Northwood", XP Home SP3: 14 secs, 15000 KB/s (DOS), yet 10 secs, 20000 KB/s (Win32), but 24 secs (Blair's) on a 214 MB .ISO.
I tried extracting this on my Linux system, and found the zip contained a tar(?!?), which GNU tar 1.20 extracted, but complained "tar: A lone zero block at 394", so I don't know if it's really extracted correctly or not.
Tested on a few sample files on an ext2 partition, with an initial run of md5sum to get the file cached if small enough, then a run of fbmd5, then md5sum again to compare times. Machine has 2 GB of RAM, Athlon 64 X2 4200+, 64-bit Gentoo with kernel 2.6.25, SATA 3.0 Gbps disk, md5sum from GNU coreutils 6.10 compiled with GCC 4.3.1 -O2 -march=k8.
346 MB - ~6.6 s (~1.2 s md5sum)
I tried some larger files (3.7, 3.9 and 6.4 GB), but got 'FATAL: Failed to open file "..." !!!'
After changing back to the built-in file routines (they're there for a reason!) I can successfully hash larger files.
346 MB (same file as before, retested to validate built-in file routines) - ~6.6 s
6.4 GB - ~2m24s (~1m47s md5sum)
So this md5 calculator gets about 52 MB/s when cached and 42 MB/s when not, and md5sum gets about 288 MB/s when cached and 61 MB/s when not.
Very valuable info :-)

Rugxulo wrote:
> Sorry, but you still need the FPU emulator

Known fact ... already discussed elsewhere :-(

> tested my 486 SX/25 yesterday. It does run with emulation though, and it's quite fast (well, 3x as fast as Blair's 16-bit FreeDOS version: 24 secs. vs. his 1 min 12 secs. on a 1.3 MB file, no cache loaded). I got approx. 320+ KB/sec (which apparently wasn't fast enough for you, heh, "BUH it's sloooooooow!").

Thanks. :-) Unsurprisingly, Blair's 16-bit code is very slow :-|

> e-mailed you

I'll check :hungry:

> P4 2.52 GHz "Northwood", XP Home SP3: 14 secs, 15000 KB/s (DOS), yet 10 secs, 20000 KB/s (Win32), but 24 secs (Blair's) on a 214 MB .ISO

P4 2.52 GHz "Northwood" with DOS and SATA-III drivers would be cool :-)
DrV wrote:
> I tried extracting this on my Linux system, and found the zip contained a tar(?!?), which GNU tar 1.20 extracted, but complained "tar: A lone zero block at 394", so I don't know if it's really extracted correctly or not.

GNU tar is unusable (well-known fact, working on a better one). MD5.BAS is supposed to be 20'711 bytes.

> Tested on a few sample files

Thanks.

> Machine has 2 GB of RAM, Athlon 64 X2 4200+, 64-bit Gentoo with kernel 2.6.25, SATA 3.0 Gbps disk, md5sum from GNU coreutils 6.10 compiled with GCC 4.3.1 -O2 -march=k8. 346 MB - ~6.6 s (~1.2 s md5sum)

Thus the "md5sum" competitor is 5 times faster? Probably the AT&T-64 vs 32-bit FB effect ;-) No problem for me, my code is almost not optimized, and TotalCommander is nevertheless even slower (how did the guy achieve it???).

> I tried some larger files (3.7, 3.9 and 6.4 GB), but got 'FATAL: Failed to open file "..." !!!'

Some CRT Linux problem ... I don't have Linux; it worked in DOS (more tests, also > 4 GiB) and in Win32 (fewer tests, all < 2 GiB).

> After changing back to the built-in file routines (they're there for a reason!) I can successfully hash larger files.

Using GET from FB 0.20 returning the amount of data read? Since you don't complain about wrong results, I assume there are none :-)
DOS386 wrote:
> GNU tar is unusable (well-known fact, working on a better one). MD5.BAS is supposed to be 20'711 bytes.

Seems quite usable otherwise; anyway, I was more concerned that there was a TAR within a ZIP (which already provides the capabilities of TAR).

> Thus the "md5sum" competitor is 5 times faster? Probably the AT&T-64 vs 32-bit FB effect ;-)

That might be part of it, but I also think that the size of the buffer is important; even if fread() (used internally by GET as well) does some buffering, there will be tons of (slow) syscalls if you are using an 8 KB buffer for a 6 GB file. :)

> Some CRT Linux problem ... I don't have Linux; it worked in DOS (more tests, also > 4 GiB) and in Win32 (fewer tests, all < 2 GiB).

Not really a "problem", but more of a "compatibility artifact": you must specify some #defines to use 64-bit file offset capable fopen() and friends, which the rtlib already takes care of if you use the built-in routines.

> Using GET from FB 0.20 returning the amount of data read?

Yes, but you don't need the amount of data read, as it should always be the amount you've specified except for the very last read, which you can calculate since you know the size of the file.

And yes, all results were correct compared with md5sum, so it is at least bug-free in that regard. :)
Last edited by DrV on Jul 25, 2008 18:33, edited 1 time in total.
"tar -xvzf fbmd5.zip" will work too (since Gzip seems to handle a .ZIP if it *only* has one file in it). At least, it works on DOS (with old, old GNU tar 1.12a).DrV wrote: Seems quite usable otherwise; anyway, I was more concerned that there was a TAR within a ZIP (which already provides the capabilities of TAR).
You mean this? (Not sure, just looking what bzip2 does.)Not really a "problem", but more of a "compatibility artifact": you must specify some #defines to use 64-bit file offset capable fopen() and friends, which the rtlib already takes care of if you use the built-in routines.
#define _FILE_OFFSET_BITS 64
DrV wrote:
> Seems quite usable otherwise

I recently got pretty disastrous results with the DJGPP version in DOS ... not sure whether / how far it's supposed to work at all; just placed it on my ignore list and started my own ;-)

> more concerned that there was a TAR within a ZIP (which already provides the capabilities of TAR).

It's my style ... for good reasons :-D

> tons of (slow) syscalls if you are using an 8 KB buffer for a 6 GB file.

Missed the point: the buffer grows up to 256 KiB ;-) (DOS regrettably doesn't "directly" support >= 64 KiB :-( )

> Not really a "problem", but more of a "compatibility artifact": you must specify some #defines to use 64-bit file offset capable fopen() and friends

Heh, funny, released an MD5 calculator supporting huge files ONLY in DOS :-D If someone reveals the "some" defines to me, I can add them ...