The fundamental insight behind this engine is that wiki pages are read far more often than they are modified, so the generated HTML can be cached. It follows that the main code path checks whether the .html file already exists and, in the vast majority of cases, simply copies it to stdout. The .html file generated from each .wiki file is about the same size as the .wiki file itself, so there is no particular I/O advantage, but there is a huge CPU advantage and a significant memory-footprint advantage; since I want to run a wiki on a geriatric 20 MB, 33 MHz 386 machine, this is a good thing.

Online demo: http://quickie.sourceforge.net/cgi-bin/quickie
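
In rough outline, that cache-first code path looks like the sketch below. This is illustrative only, not the actual Quickie source: serve_page() and render_wiki() are invented names, and it leans on C++17 <filesystem> purely for brevity, which the real 386-era code would not.

    #include <filesystem>
    #include <fstream>
    #include <iostream>
    #include <string>

    namespace fs = std::filesystem;

    // Hypothetical stand-in for the real wiki-to-HTML converter.
    static void render_wiki(const fs::path& wiki, const fs::path& html)
    {
        std::ifstream in(wiki);
        std::ofstream out(html);
        out << "<pre>\n" << in.rdbuf() << "</pre>\n";
    }

    // Serve one page: regenerate the cached .html only when it is missing
    // or older than the .wiki source, then copy it to stdout.
    void serve_page(const std::string& page)
    {
        fs::path wiki = page + ".wiki";
        fs::path html = page + ".html";

        if (!fs::exists(html) ||
            fs::last_write_time(html) < fs::last_write_time(wiki))
        {
            render_wiki(wiki, html);     // slow path: rebuild the cache
        }

        std::ifstream in(html, std::ios::binary);
        std::cout << in.rdbuf();         // fast path: stream the cached HTML
    }

    int main()
    {
        // Create a tiny sample page so the example is self-contained.
        std::ofstream("FrontPage.wiki") << "Hello, wiki world.\n";
        serve_page("FrontPage");         // hypothetical page name
    }

The point of the design is that the expensive wiki-to-HTML conversion runs only on the rare edit, while every ordinary page view is a stat plus a file copy.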
$NetBSD: patch-aa,v 1.1.1.1 2006/09/08 04:18:14 samott Exp $

--- lib/arglex.cc.orig	Sat May 20 23:56:52 2006
+++ lib/arglex.cc	Thu May 25 12:36:38 2006
@@ -21,6 +21,11 @@
 
 #include <cctype>
 
+#ifndef linux
+#include <unistd.h>
+#include <stdio.h>
+#endif
+
 #include <arglex.h>
 #include <progname.h>
 #include <quit.h>