Commit graph

7 commits

Author SHA1 Message Date
Stefan Eßer
c8f8c94b50 Fix two bugs affecting bonnie runs with large data files:
- A printf format specified a long operand but received an offset_t value.

- The random numbers for large seek ranges were computed without regard
  for operator precedence and could become negative (the code treated
  file sizes over 4GB as "large", which was the wrong threshold anyway,
  since it did not account for the chunk size).

The problems were found by etc at fluffles dot net (Enlightenment). Thanks
for the detailed report. Apparently there aren't many Bonnie users with
multi-terabyte RAID systems, or these bugs would have been found long ago.
2006-06-11 18:41:57 +00:00
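
For readers without the diff at hand, here is a minimal sketch of both bug
classes in C; the variable names, the 6 GB example size, and the exact buggy
expression are illustrative assumptions, not Bonnie's actual source:

    #include <stdio.h>
    #include <stdlib.h>
    #include <stdint.h>
    #include <sys/types.h>

    int main(void)
    {
        off_t size   = (off_t)6 << 30;   /* a 6 GB data file */
        off_t chunk  = 8 * 1024;
        off_t chunks = size / chunk;

        /* Bug 1: a "%ld" format paired with a 64-bit off_t argument
         * reads only half the value on 32-bit platforms.  Casting to
         * intmax_t and printing with "%jd" is one portable fix. */
        printf("file size: %jd bytes (%jd chunks)\n",
               (intmax_t)size, (intmax_t)chunks);

        /* Bug 2: an unparenthesized expression such as
         *     random() << 32 + random()
         * parses as random() << (32 + random()), because "+" binds
         * tighter than "<<", so wide seek targets could come out
         * negative.  Parenthesizing and widening first keeps the
         * offset within [0, size - chunk]: */
        off_t offset = (((off_t)random() << 31 | random()) % chunks) * chunk;
        printf("seek offset: %jd\n", (intmax_t)offset);
        return 0;
    }
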
David E. O'Brien
7a89e59b7d Use 8k chunks as Bonnie 1.0 did rather than 16k ones.
Otherwise the benchmark output differs from (and is noticeably lower than)
Bonnie 1.0's.

Requested by:	se
2002-08-27 16:13:47 +00:00
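
The change itself is presumably a one-line constant; a hypothetical sketch
(the Chunk symbol name is an assumption about Bonnie's source, not taken
from the diff):

    /* Chunk is the I/O transfer size used by the block read/write
     * phases; with 16 KB chunks the reported numbers come out
     * noticeably lower than Bonnie 1.0's, so restore the 8 KB size. */
    #define Chunk (8 * 1024)    /* was (16 * 1024) */
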
David E. O'Brien
de3739fa31 Upgrade to version 2.0.6.
Reviewed by:	se
2002-08-27 15:57:24 +00:00
David E. O'Brien
62252d236c It is easier to modify the manpage if it is normal text rather than a patch.
2002-08-27 15:55:42 +00:00
Kris Kennaway
f5e1e9e924 Respect CC and CFLAGS
2000-01-24 00:32:34 +00:00
Stefan Eßer
c50b317dcd Prevent overflow of "size" for file sizes of 2048MB and more (PR 11430).
While I'm here:
 - Make an error message start on a new line.
 - Move installation into port Makefile.
 - Split patch-aa in two (adding patch-ac), since it modified two files.
 - Close file descriptor 0 in seeker processes.
PR:		11430
1999-12-30 17:18:22 +00:00
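
A minimal sketch of the overflow class fixed here, assuming the file size
was derived from a megabyte count in plain int arithmetic (names and values
are illustrative):

    #include <stdio.h>
    #include <stdint.h>
    #include <sys/types.h>

    int main(void)
    {
        int megabytes = 2048;

        /* Buggy: both operands are int, so the multiplication is done
         * in 32-bit signed arithmetic and overflows at 2048 MB (2^31
         * bytes), typically wrapping to a negative size. */
        int bad_size = megabytes * 1024 * 1024;

        /* Fixed: widen to off_t before multiplying. */
        off_t good_size = (off_t)megabytes * 1024 * 1024;

        printf("bad: %d  good: %jd\n", bad_size, (intmax_t)good_size);
        return 0;
    }
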
Satoshi Asami
d483bc7ba1 Bonnie benchmarking tool.
Submitted by:	se
1995-05-19 09:49:09 +00:00