18

I was looking at this question and it covers everything but CRC. Is there a good way to do this on Ubuntu?

Alter Lagos
  • CRC means Cyclic Redundancy Check. It's a type of (insecure) hash, rather than a specific standard. https://en.wikipedia.org/wiki/Cyclic_redundancy_check lists many kinds of CRC. (CRC32 is perhaps the most common.) – mwfearnley Mar 18 '19 at 12:35

6 Answers

26
$ sudo apt-get install libarchive-zip-perl
$ crc32 my_file
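
As a quick sanity check (test.txt is just a throwaway file here; the CRC-32 of the ASCII string 123456789 is the well-known test vector cbf43926, though the exact output formatting of the crc32 tool may vary between versions):

$ printf '%s' 123456789 > test.txt
$ crc32 test.txt
cbf43926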
Michael Mrozek
Alter Lagos
  • `perl-Archive-Zip` in fedora, mind you – Nemo Dec 18 '15 at 13:00
  • Very helpful for comparing that a file inside a JAR is the correct version. – jjj Oct 06 '17 at 15:40
  • Adding to what Nemo said, for CentOS, Red Hat, Fedora, and similar distros the library is installed with `yum install perl-Archive-Zip` – Terry Sep 25 '19 at 19:33
17

One way to calculate it is this:

cksum "file"

Another one is

crc32 "file"

To use this last command, you need to install the libarchive-zip-perl package.
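
Note that the two commands are not interchangeable: cksum implements the POSIX CRC (which also folds the file length into the checksum), while crc32 produces the zip/zlib-style CRC-32 used in SFV files, so they print different values for the same file:

$ cksum "file"     # POSIX CRC, followed by byte count and file name
$ crc32 "file"     # zip/zlib-style CRC-32 in hex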

Leo
  • `cksum` is not compatible with `crc32`; it uses a different algorithm – red75prime Apr 27 '18 at 08:29
  • I don't know which type of CRC the OP had, but the version of `cksum` on my Linux box (a Synology NAS unit) can produce four different outputs. One with no parameters (`cksum file`), but it also accepts `-o1` through `-o3` options. Using `-o3` produces the same value as used in SFV verification files (although it produces them in decimal while the files have them in hex)... that _might_ be the same algorithm as the OP needs. – TripeHound Nov 20 '20 at 12:29
4

I'd use the internal md5sum or one of the provided SHA programs:

sha1sum (1)          - compute and check SHA1 message digest
sha224sum (1)        - compute and check SHA224 message digest
sha256sum (1)        - compute and check SHA256 message digest
sha384sum (1)        - compute and check SHA384 message digest
sha512sum (1)        - compute and check SHA512 message digest

cksum is pretty much outmoded these days because of its problems.
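
For example, with sha256sum (my_file and my_file.sha256 are just placeholder names):

$ sha256sum my_file > my_file.sha256     # record the digest
$ sha256sum -c my_file.sha256            # verify it later
my_file: OK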

mdpc
  • What problems? I want to know if two files are duplicates - is `cksum` not good enough for that purpose? – Marc.2377 Nov 24 '19 at 06:06
  • Perhaps the "problem" is the fact that CRC is not a cryptographic hash (meaning it's considered easy to create two files with different contents that have the same CRC if that's what you're trying to do). However, when you're talking about random errors, CRC is not too bad AFAIK. – adentinger Jan 28 '20 at 19:28
  • CRC is about 10x faster than md5 in my current tests. So CRC vs md5/sha involves a tradeoff between key space (probability of accidental collision) and performance. In a small device or with high data volume performance might matter. – Joseph Sheedy Feb 11 '21 at 17:11
3

The cksfv app from the cksfv package generates CRC32 checksums as well.

muru
Miki
  • Could you give a usage example? – Zanna Dec 08 '16 at 06:58
  • `cksfv -c "file"` prints the CRC32 to stdout. If you want to suppress the header, a `cksfv -c "file" 2>/dev/null | grep -v ^\;` gives the filename + CRC32 and no warning for a directory. – emk2203 Jun 14 '19 at 17:12
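
Putting the comments above into copy-pasteable form (the -c flag and the grep filter come from the commenter; the exact ';' header lines cksfv prints may vary between versions, so treat this as a sketch):

$ cksfv "file" > files.sfv                        # write an SFV listing ("file" plus its CRC32 in hex)
$ cksfv -c "file" 2>/dev/null | grep -v '^;'      # print just the file name and CRC32, per the comment above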
2

You can try to use rhash.

Test:

$ sudo apt-get install rhash
$ echo -n 123456789 | rhash --simple -
cbf43926  (stdin)
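
To checksum a file rather than stdin, rhash also takes file arguments; as far as I know it accepts a --crc32 option to select CRC32 explicitly, but check rhash --help on your version (my_file is a placeholder):

$ rhash --simple --crc32 my_file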
Woosung
0

You can accomplish this with a very simple Perl script (String::CRC32 is packaged as libstring-crc32-perl on Debian/Ubuntu):

use strict;
use warnings;
use String::CRC32;

local $/;                # slurp mode: read all of stdin, not just the first line
my $s = <STDIN>;

my $crc = crc32($s);     # standard CRC-32, as used in zip/SFV
printf "%08x\n", $crc;