Switch to PostgreSQL.

Mathieu Othacehe 2021-01-05 10:20:34 +01:00
parent ca7a7ca989
commit cbc462679d
No known key found for this signature in database
GPG Key ID: 8354763531769CA6
36 changed files with 1458 additions and 1729 deletions


@@ -13,9 +13,6 @@
(eval put 'test-error 'scheme-indent-function 1)
(eval put 'make-parameter 'scheme-indent-function 1)
(eval put 'with-database 'scheme-indent-function 0)
(eval put 'with-queue-writer-worker 'scheme-indent-function 0)
(eval put 'with-db-worker-thread 'scheme-indent-function 1)
(eval put 'with-db-writer-worker-thread 'scheme-indent-function 1))
(texinfo-mode
(indent-tabs-mode)
(fill-column . 72)


@@ -79,25 +79,7 @@ nodist_webobject_DATA = \
dist_pkgdata_DATA = src/schema.sql
dist_sql_DATA = \
src/sql/upgrade-1.sql \
src/sql/upgrade-2.sql \
src/sql/upgrade-3.sql \
src/sql/upgrade-4.sql \
src/sql/upgrade-5.sql \
src/sql/upgrade-6.sql \
src/sql/upgrade-7.sql \
src/sql/upgrade-8.sql \
src/sql/upgrade-9.sql \
src/sql/upgrade-10.sql \
src/sql/upgrade-11.sql \
src/sql/upgrade-12.sql \
src/sql/upgrade-13.sql \
src/sql/upgrade-14.sql \
src/sql/upgrade-15.sql \
src/sql/upgrade-16.sql \
src/sql/upgrade-17.sql \
src/sql/upgrade-18.sql \
src/sql/upgrade-19.sql
src/sql/upgrade-1.sql
dist_css_DATA = \
src/static/css/cuirass.css \
@@ -163,12 +145,6 @@ CLEANFILES = \
$(nodist_guileobject_DATA) \
src/cuirass/config.go
.PHONY: sql-check
sql-check: src/schema.sql
@echo "$<"
$(AM_V_at)sqlite3 tmp-$$$.db < $< ; \
rm tmp-$$$.db
## -------------- ##
## Distribution. ##
## -------------- ##

README

@@ -1,8 +1,8 @@
Cuirass is a continuous integration tool using GNU Guix. It is intended as a
replacement for Hydra.
-*- mode: org -*-
Requirements
============
Cuirass is a continuous integration tool using GNU Guix.
* Requirements
Cuirass currently depends on the following packages:
@@ -10,7 +10,7 @@ Cuirass currently depends on the following packages:
- GNU Guix (and all its development dependencies)
- GNU Make
- Guile-JSON 3.x
- Guile-SQLite3
- Guile-Squee
- Guile-Git
- Guile-zlib
- Fibers
@@ -18,52 +18,94 @@ Cuirass currently depends on the following packages:
A convenient way to install those dependencies is to install Guix and execute
the following command:
#+BEGIN_EXAMPLE
$ guix environment -l build-aux/guix.scm
#+END_EXAMPLE
This will build and enter an environment which provides all the necessary
dependencies.
Build Instructions
==================
* Build Instructions
When all the dependencies are available on your system, you can build and
install Cuirass with the usual:
#+BEGIN_EXAMPLE
$ ./configure && sudo make install
#+END_EXAMPLE
An alternative way is to directly install Cuirass in your Guix profile, using:
#+BEGIN_EXAMPLE
$ guix package -f build-aux/guix.scm
#+END_EXAMPLE
To build it, but not install it, run:
#+BEGIN_EXAMPLE
$ guix build -f build-aux/guix.scm
#+END_EXAMPLE
Example
=======
* Database connection
Cuirass uses PostgreSQL to store information about jobs, past build results
and to coordinate the execution of jobs. The database connection string must
be passed to Cuirass using the =database= argument, under the keyword/value
format described [[https://www.postgresql.org/docs/10/libpq-connect.html#LIBPQ-CONNSTRING][here]]. The PostgreSQL database must be created beforehand.
For instance, to connect using Unix sockets to the =cuirass= database:
#+BEGIN_EXAMPLE
./pre-inst-env cuirass --database="dbname=cuirass host=/var/run/postgresql"
#+END_EXAMPLE
or using a TCP connection:
#+BEGIN_EXAMPLE
./pre-inst-env cuirass --database="dbname=cuirass host=127.0.0.1"
#+END_EXAMPLE
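As noted above, the database must exist before Cuirass connects to it. A
minimal sketch using the standard PostgreSQL client tools (assuming a local
server whose socket directory is =/var/run/postgresql= and a role allowed to
create databases):

#+BEGIN_EXAMPLE
$ createdb -h /var/run/postgresql cuirass
#+END_EXAMPLE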
* Run tests
The Cuirass tests also require access to a PostgreSQL database. This
database must be dedicated to testing, as its content will be dropped. The
database name and host must be passed using the =CUIRASS_DATABASE= and
=CUIRASS_HOST= environment variables, respectively.
#+BEGIN_EXAMPLE
CUIRASS_DATABASE="test_tmp" CUIRASS_HOST="/var/run/postgresql" make check
#+END_EXAMPLE
* Example
A quick way to manually test Cuirass is to execute:
./pre-inst-env cuirass --specifications=examples/hello-singleton.scm --database=test.db
#+BEGIN_EXAMPLE
./pre-inst-env cuirass --specifications=examples/hello-singleton.scm --database="dbname=cuirass host=/var/run/postgresql"
#+END_EXAMPLE
This will read the file "examples/hello-singleton.scm" which contains a list of
specifications and add them to the database "test.db" which is created if it
doesn't already exist.
This will read the file "examples/hello-singleton.scm" which contains a list
of specifications and add them to the =cuirass= database.
'cuirass' then loops evaluating/building the specs. The database keeps track
Cuirass then loops evaluating/building the specs. The database keeps track
of the specifications in order to allow users to accumulate specifications.
To resume the evaluation/build process you can execute the same command
without the '--specifications' option:
./pre-inst-env cuirass --database=test.db
#+BEGIN_EXAMPLE
./pre-inst-env cuirass --database="dbname=cuirass host=/var/run/postgresql"
#+END_EXAMPLE
To start the web interface run:
./pre-inst-env cuirass --web
#+BEGIN_EXAMPLE
./pre-inst-env cuirass --database="dbname=cuirass host=/var/run/postgresql" --web
#+END_EXAMPLE
Contributing
============
* Contributing
See the manual for useful hacking information, by running:
#+BEGIN_EXAMPLE
info -f doc/cuirass.info "Contributing"
#+END_EXAMPLE


@@ -57,8 +57,6 @@ exec ${GUILE:-@GUILE@} --no-auto-compile -e main -s "$0" "$@"
-p --port=NUM Port of the HTTP server.
--listen=HOST Listen on the network interface for HOST
-I, --interval=N Wait N seconds between each poll
-Q, --queue-size=N Set the writer queue size to N elements.
--log-queries=FILE Log SQL queries in FILE.
--build-remote Use the remote build mechanism
--use-substitutes Allow usage of pre-built substitutes
--record-events Record events for distribution
@@ -77,12 +75,10 @@ exec ${GUILE:-@GUILE@} --no-auto-compile -e main -s "$0" "$@"
(port (single-char #\p) (value #t))
(listen (value #t))
(interval (single-char #\I) (value #t))
(queue-size (single-char #\Q) (value #t))
(build-remote (value #f))
(use-substitutes (value #f))
(threads (value #t))
(fallback (value #f))
(log-queries (value #t))
(record-events (value #f))
(ttl (value #t))
(version (single-char #\V) (value #f))
@@ -110,9 +106,7 @@ exec ${GUILE:-@GUILE@} --no-auto-compile -e main -s "$0" "$@"
(%fallback? (option-ref opts 'fallback #f))
(%record-events? (option-ref opts 'record-events #f))
(%gc-root-ttl
(time-second (string->duration (option-ref opts 'ttl "30d"))))
(%db-writer-queue-size
(string->number (option-ref opts 'queue-size "1"))))
(time-second (string->duration (option-ref opts 'ttl "30d")))))
(cond
((option-ref opts 'help #f)
(show-help)
@@ -129,7 +123,6 @@ exec ${GUILE:-@GUILE@} --no-auto-compile -e main -s "$0" "$@"
(host (option-ref opts 'listen "localhost"))
(interval (string->number (option-ref opts 'interval "300")))
(specfile (option-ref opts 'specifications #f))
(queries-file (option-ref opts 'log-queries #f))
;; Since our work is mostly I/O-bound, default to a maximum of 4
;; kernel threads. Going beyond that can increase overhead (GC
@@ -140,95 +133,87 @@ exec ${GUILE:-@GUILE@} --no-auto-compile -e main -s "$0" "$@"
(min (current-processor-count) 4))))
(prepare-git)
(unless (option-ref opts 'web #f)
(log-message "performing database optimizations")
(db-optimize))
(log-message "running Fibers on ~a kernel threads" threads)
(run-fibers
(lambda ()
(with-database
(with-queue-writer-worker
(and specfile
(let ((new-specs (save-module-excursion
(lambda ()
(set-current-module
(make-user-module '()))
(primitive-load specfile)))))
(for-each db-add-specification new-specs)))
(and specfile
(let ((new-specs (save-module-excursion
(lambda ()
(set-current-module
(make-user-module '()))
(primitive-load specfile)))))
(when queries-file
(log-message "Enable SQL query logging.")
(db-log-queries queries-file))
(for-each db-add-specification new-specs)))
(if one-shot?
(process-specs (db-get-specifications))
(let ((exit-channel (make-channel)))
(start-watchdog)
(if (option-ref opts 'web #f)
(begin
(spawn-fiber
(essential-task
'web exit-channel
(lambda ()
(run-cuirass-server #:host host
#:port port)))
#:parallel? #t)
(if one-shot?
(process-specs (db-get-specifications))
(let ((exit-channel (make-channel)))
(start-watchdog)
(if (option-ref opts 'web #f)
(begin
(spawn-fiber
(essential-task
'web exit-channel
(lambda ()
(run-cuirass-server #:host host
#:port port)))
#:parallel? #t)
(spawn-fiber
(essential-task
'monitor exit-channel
(lambda ()
(while #t
(log-monitoring-stats)
(sleep 600))))))
(spawn-fiber
(essential-task
'monitor exit-channel
(lambda ()
(while #t
(log-monitoring-stats)
(sleep 600))))))
(begin
(clear-build-queue)
(begin
(clear-build-queue)
;; If Cuirass was stopped during an evaluation,
;; abort it. Builds that were not registered
;; during this evaluation will be registered
;; during the next evaluation.
(db-abort-pending-evaluations)
;; If Cuirass was stopped during an evaluation,
;; abort it. Builds that were not registered
;; during this evaluation will be registered
;; during the next evaluation.
(db-abort-pending-evaluations)
;; First off, restart builds that had not
;; completed or were not even started on a
;; previous run.
(spawn-fiber
(essential-task
'restart-builds exit-channel
(lambda ()
(restart-builds))))
;; First off, restart builds that had not
;; completed or were not even started on a
;; previous run.
(spawn-fiber
(essential-task
'restart-builds exit-channel
(lambda ()
(restart-builds))))
(spawn-fiber
(essential-task
'build exit-channel
(lambda ()
(while #t
(process-specs (db-get-specifications))
(log-message
"next evaluation in ~a seconds" interval)
(sleep interval)))))
(spawn-fiber
(essential-task
'build exit-channel
(lambda ()
(while #t
(process-specs (db-get-specifications))
(log-message
"next evaluation in ~a seconds" interval)
(sleep interval)))))
(spawn-fiber
(essential-task
'metrics exit-channel
(lambda ()
(while #t
(with-time-logging
"Metrics update"
(db-update-metrics))
(sleep 3600)))))
(spawn-fiber
(essential-task
'metrics exit-channel
(lambda ()
(while #t
(with-time-logging
"Metrics update"
(db-update-metrics))
(sleep 3600)))))
(spawn-fiber
(essential-task
'monitor exit-channel
(lambda ()
(while #t
(log-monitoring-stats)
(sleep 600)))))))
(primitive-exit (get-message exit-channel)))))))
(spawn-fiber
(essential-task
'monitor exit-channel
(lambda ()
(while #t
(log-monitoring-stats)
(sleep 600)))))))
(primitive-exit (get-message exit-channel))))))
;; Most of our code is I/O so preemption doesn't matter much (it
;; could help while we're doing SQL requests, for instance, but it


@@ -67,11 +67,11 @@
;; Wrap the 'cuirass' command to refer to the right modules.
(let* ((out (assoc-ref outputs "out"))
(json (assoc-ref inputs "guile-json"))
(sqlite (assoc-ref inputs "guile-sqlite3"))
(squee (assoc-ref inputs "guile-squee"))
(zlib (assoc-ref inputs "guile-zlib"))
(guix (assoc-ref inputs "guix"))
(mods (string-append json "/share/guile/site/3.0:"
sqlite "/share/guile/site/3.0:"
squee "/share/guile/site/3.0:"
zlib "/share/guile/site/3.0:"
guix "/share/guile/site/3.0")))
(wrap-program (string-append out "/bin/cuirass")
@@ -82,7 +82,7 @@
'("guile"
"guile-fibers"
"guile-json"
"guile-sqlite3"
"guile-squee"
"guile-git"
"guile-zlib"
"guix")))


@@ -47,7 +47,7 @@ GUILE_MODULE_REQUIRED([guix])
GUILE_MODULE_REQUIRED([guix git])
GUILE_MODULE_REQUIRED([git])
GUILE_MODULE_REQUIRED([json])
GUILE_MODULE_REQUIRED([sqlite3])
GUILE_MODULE_REQUIRED([squee])
GUILE_MODULE_REQUIRED([fibers])
GUILE_MODULE_REQUIRED([zlib])


@@ -173,7 +173,7 @@ Currently the only way to add those specifications to cuirass is to put
a list of them in a file and set the @code{--specifications} command
line option argument with the file name when launching the daemon
(@pxref{Invocation}). The specifications are persistent (they are kept
in a SQLite database) so the next time @command{cuirass} is run the
in a PostgreSQL database) so the next time @command{cuirass} is run the
previously added specifications will remain active even if you don't
keep the @code{--specifications} option.
@@ -209,9 +209,9 @@ database before launching the evaluation and build processes.
@item --database=@var{database}
@itemx -D @var{database}
Use @var{database} as the database containing the jobs and the past
build results. Since @code{cuirass} uses SQLite as a database engine,
@var{database} must be a file name. If the file doesn't exist, it will
be created.
build results. Since @code{cuirass} uses PostgreSQL as a database
engine, @var{database} must be a libpq keyword/value connection
string, for instance @code{dbname=cuirass host=/var/run/postgresql}.
The database must be created beforehand.
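For instance, to point @command{cuirass} at a database reachable over the
local Unix socket (mirroring the README example):

@example
cuirass --database="dbname=cuirass host=/var/run/postgresql"
@end example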
@item --ttl=@var{duration}
Cuirass registers build results as garbage collector (GC) roots, thereby
@@ -263,11 +263,11 @@ Display a help message that summarizes all the options provided.
@node Database
@chapter Database schema
@cindex cuirass database
@cindex sqlite database
@cindex postgresql database
@cindex persistent configuration
Cuirass uses a SQLite database to store information about jobs and past
build results, but also to coordinate the execution of jobs.
Cuirass uses a PostgreSQL database to store information about jobs and
past build results, but also to coordinate the execution of jobs.
The database contains the following tables: @code{Specifications},
@code{Inputs}, @code{Checkouts}, @code{Evaluations}, @code{Builds} and


@@ -636,8 +636,7 @@ updating the database accordingly."
"Reset the status of builds in the database that are marked as \"started\".
This procedure is meant to be called at startup."
(log-message "marking stale builds as \"scheduled\"...")
(with-db-worker-thread db
(sqlite-exec db "UPDATE Builds SET status = -2 WHERE status = -1;")))
(db-clear-build-queue))
(define (restart-builds)
"Restart builds whose status in the database is \"pending\" (scheduled or

File diff suppressed because it is too large


@@ -250,14 +250,14 @@ Hydra format.
#:avg-eval-build-start-time
(db-get-metrics-with-id 'average-eval-build-start-time
#:limit 100
#:order "field ASC")
#:order "cast(field as int) ASC")
#:builds-per-day
(db-get-metrics-with-id 'builds-per-day
#:limit 100)
#:eval-completion-speed
(db-get-metrics-with-id 'evaluation-completion-speed
#:limit 100
#:order "field ASC")
#:order "cast(field as int) ASC")
#:new-derivations-per-day
(db-get-metrics-with-id 'new-derivations-per-day
#:limit 100)


@@ -20,13 +20,16 @@
#:use-module (cuirass database)
#:use-module (cuirass logging)
#:use-module (guix records)
#:use-module (squee)
#:use-module (srfi srfi-1)
#:use-module (srfi srfi-19)
#:use-module (srfi srfi-26)
#:use-module (ice-9 i18n)
#:use-module (ice-9 match)
#:export (metric
metric?
metric-id
metric-field-type
metric-proc
%metrics
@@ -47,6 +50,8 @@
metric?
(id metric-id)
(compute-proc metric-compute-proc)
(field-type metric-field-type
(default 'int))
(field-proc metric-field-proc
(default #f)))
@@ -55,72 +60,98 @@
;;; Database procedures.
;;;
(define-syntax-rule (return-exact body ...)
(match (expect-one-row body ...)
((result)
(and result (string->number result)))))
(define-syntax-rule (return-inexact body ...)
(match (expect-one-row body ...)
((result)
(and result (locale-string->inexact result)))))
(define* (db-average-eval-duration-per-spec spec #:key limit)
"Return the average evaluation duration for SPEC. Limit the average
computation to the most recent LIMIT records if this argument is set."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "SELECT AVG(duration) FROM
(let ((query "\
SELECT AVG(m.duration) FROM
(SELECT (evaltime - timestamp) as duration
FROM Evaluations WHERE specification = " spec
" AND evaltime != 0 ORDER BY rowid DESC
LIMIT " (or limit -1) ");")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
FROM Evaluations WHERE specification = :spec
AND evaltime != 0 ORDER BY id DESC LIMIT ~a) m;")
(params `((#:spec . ,spec))))
(return-inexact
(exec-query/bind-params db
(format #f query
(if limit
(number->string limit)
"ALL"))
params)))))
(define (db-builds-previous-day _)
"Return the builds count of the previous day."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "SELECT COUNT(*) from Builds
WHERE date(timestamp, 'unixepoch') = date('now', '-1 day') AND
date(stoptime, 'unixepoch') = date('now', '-1 day');")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
(return-exact
(exec-query/bind db "SELECT COUNT(*) from Builds
WHERE to_timestamp(timestamp)::date = 'yesterday'::date AND
to_timestamp(stoptime)::date = 'yesterday'::date;"))))
(define (db-new-derivations-previous-day _)
"Return the new derivations count of the previous day."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "SELECT COUNT(*) from Builds
WHERE date(timestamp, 'unixepoch') = date('now', '-1 day');")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
(return-exact
(exec-query/bind db "SELECT COUNT(*) from Builds
WHERE to_timestamp(timestamp)::date = 'yesterday'::date;"))))
(define (db-pending-builds _)
"Return the current pending builds count."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "SELECT COUNT(*) from Builds
WHERE status < 0;")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
(return-exact
(exec-query/bind db "SELECT COUNT(*) from Builds
WHERE status < 0;"))))
(define* (db-percentage-failed-eval-per-spec spec #:key limit)
"Return the failed evaluation percentage for SPEC. If LIMIT is set, limit
the percentage computation to the most recent LIMIT records."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "\
SELECT 100 * CAST(SUM(status > 0) as float) / COUNT(*) FROM
(SELECT status from Evaluations WHERE specification = " spec
" ORDER BY rowid DESC LIMIT " (or limit -1) ");")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
(let ((query "\
SELECT 100 *
CAST(SUM(CASE WHEN m.status > 0 THEN 1 ELSE 0 END) as float) /
COUNT(*) FROM
(SELECT status from Evaluations WHERE specification = :spec
ORDER BY id DESC LIMIT ~a) m")
(params `((#:spec . ,spec))))
(return-inexact
(exec-query/bind-params db
(format #f query
(if limit
(number->string limit)
"ALL"))
params)))))
(define* (db-average-build-start-time-per-eval eval)
"Return the average build start time for the given EVAL."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "\
(return-inexact
(exec-query/bind db "\
SELECT AVG(B.starttime - E.evaltime) FROM
(SELECT id, evaltime
FROM Evaluations WHERE id = " eval ") E
LEFT JOIN Builds as B
ON E.id = B.evaluation and B.starttime > 0
GROUP BY E.id;")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
GROUP BY E.id;"))))
(define* (db-average-build-complete-time-per-eval eval)
"Return the average build complete time for the given EVAL."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "\
(return-inexact
(exec-query/bind db "\
SELECT AVG(B.stoptime - E.evaltime) FROM
(SELECT id, evaltime
FROM Evaluations WHERE id = " eval ") E
LEFT JOIN Builds as B
ON E.id = B.evaluation and B.stoptime > 0
GROUP BY E.id;")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
GROUP BY E.id;"))))
(define* (db-evaluation-completion-speed eval)
"Return the evaluation completion speed of the given EVAL. The speed is
@@ -133,45 +164,45 @@ expressed in builds per hour.
;; evaluation_duration (seconds) = max(build_stop_time) - eval_start_time
;; If the evaluation builds are all completed.
(with-db-worker-thread db
(let ((rows (sqlite-exec db "\
(return-inexact
(exec-query/bind db "\
SELECT
3600.0 * SUM(B.status = 0) /
(CASE SUM(status < 0)
3600.0 * SUM(CASE WHEN B.status = 0 THEN 1 ELSE 0 END) /
(CASE SUM(CASE WHEN status < 0 THEN 1 ELSE 0 END)
WHEN 0 THEN MAX(stoptime)
ELSE strftime('%s', 'now')
ELSE extract(epoch from 'today'::date)
END - E.evaltime) FROM
(SELECT id, evaltime
FROM Evaluations WHERE id = " eval ") E
LEFT JOIN Builds as B
ON E.id = B.evaluation and B.stoptime > 0
GROUP BY E.id;")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
GROUP BY E.id, E.evaltime;"))))
(define (db-previous-day-timestamp)
"Return the timestamp of the previous day."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "SELECT strftime('%s',
date('now', '-1 day'));")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
(return-exact
(exec-query
db "SELECT extract(epoch from 'yesterday'::date);"))))
(define (db-current-day-timestamp)
"Return the timestamp of the current day."
(with-db-worker-thread db
(let ((rows (sqlite-exec db "SELECT strftime('%s',
date('now'));")))
(and=> (expect-one-row rows) (cut vector-ref <> 0)))))
(return-exact
(exec-query
db "SELECT extract(epoch from 'today'::date);"))))
(define* (db-latest-evaluations #:key (days 3))
"Return the successful evaluations added during the previous DAYS."
(with-db-worker-thread db
(let ((query (format #f "SELECT id from Evaluations
WHERE date(timestamp, 'unixepoch') > date('now', '-~a day') AND
status = 0 ORDER BY rowid DESC" days)))
(let loop ((rows (sqlite-exec db query))
WHERE to_timestamp(timestamp)::date > 'today'::date - interval '~a day' AND
status = 0 ORDER BY id DESC" days)))
(let loop ((rows (exec-query db query))
(evaluations '()))
(match rows
(() (reverse evaluations))
((#(id) . rest)
(((id) . rest)
(loop rest
(cons id evaluations))))))))
@@ -187,16 +218,19 @@ status = 0 ORDER BY rowid DESC" days)))
;; Average evaluation duration per specification.
(metric
(id 'average-10-last-eval-duration-per-spec)
(field-type 'string)
(compute-proc
(cut db-average-eval-duration-per-spec <> #:limit 10)))
(metric
(id 'average-100-last-eval-duration-per-spec)
(field-type 'string)
(compute-proc
(cut db-average-eval-duration-per-spec <> #:limit 100)))
(metric
(id 'average-eval-duration-per-spec)
(field-type 'string)
(compute-proc db-average-eval-duration-per-spec))
;; Builds count per day.
@@ -220,16 +254,19 @@ status = 0 ORDER BY rowid DESC" days)))
;; Percentage of failed evaluations per specification.
(metric
(id 'percentage-failure-10-last-eval-per-spec)
(field-type 'string)
(compute-proc
(cut db-percentage-failed-eval-per-spec <> #:limit 10)))
(metric
(id 'percentage-failure-100-last-eval-per-spec)
(field-type 'string)
(compute-proc
(cut db-percentage-failed-eval-per-spec <> #:limit 100)))
(metric
(id 'percentage-failed-eval-per-spec)
(field-type 'string)
(compute-proc db-percentage-failed-eval-per-spec))
;; Average time to start a build for an evaluation.
@@ -268,33 +305,38 @@ to identify the metric type in database.
(define* (db-get-metric id field)
"Return the metric with the given ID and FIELD."
(let* ((metric (find-metric id))
(type (metric->type metric)))
(with-db-worker-thread db
(let ((rows (sqlite-exec db "SELECT value from Metrics
WHERE type = " type " AND field = " field ";")))
(and=> (expect-one-row rows) (cut vector-ref <> 0))))))
(with-db-worker-thread db
(let* ((metric (find-metric id))
(type (metric->type metric)))
(return-inexact
(exec-query/bind db "SELECT value from Metrics
WHERE type = " type " AND field = " field ";")))))
(define* (db-get-metrics-with-id id
#:key
limit
(order "rowid DESC"))
(order "id DESC"))
"Return the metrics with the given ID. If LIMIT is set, the resulting list
is restricted to LIMIT records."
(let* ((metric (find-metric id))
(type (metric->type metric))
(limit (or limit -1)))
(with-db-worker-thread db
(with-db-worker-thread db
(let* ((metric (find-metric id))
(type (metric->type metric))
(field-type (metric-field-type metric))
(limit (or limit "ALL")))
(let ((query (format #f "SELECT field, value from Metrics
WHERE type = ? ORDER BY ~a LIMIT ~a" order limit)))
(let loop ((rows (%sqlite-exec db query type))
WHERE type = :type ORDER BY ~a LIMIT ~a" order limit))
(params `((#:type . ,type))))
(let loop ((rows (exec-query/bind-params db query params))
(metrics '()))
(match rows
(() (reverse metrics))
((#(field value) . rest)
(loop rest
`((,field . ,value)
,@metrics)))))))))
(((field value) . rest)
(let ((field (match field-type
('int (string->number field))
(else field))))
(loop rest
`((,field . ,(locale-string->inexact value))
,@metrics))))))))))
(define* (db-update-metric id #:optional field)
"Compute and update the value of the metric ID in database.
@@ -306,67 +348,66 @@ for periodical metrics for instance.
(define now
(time-second (current-time time-utc)))
(let* ((metric (find-metric id))
(field-proc (metric-field-proc metric))
(field (or field (field-proc)))
(value (compute-metric metric field)))
(if value
(begin
(log-message "Updating metric ~a (~a) to ~a."
(symbol->string id) field value)
(with-db-worker-thread db
(sqlite-exec db "\
INSERT OR REPLACE INTO Metrics (field, type, value,
(with-db-worker-thread db
(let* ((metric (find-metric id))
(field-proc (metric-field-proc metric))
(field (or field (field-proc)))
(value (compute-metric metric field)))
(if value
(begin
(log-message "Updating metric ~a (~a) to ~a."
(symbol->string id) field value)
(exec-query/bind db "\
INSERT INTO Metrics (field, type, value,
timestamp) VALUES ("
field ", "
(metric->type metric) ", "
value ", "
now ");")
(last-insert-rowid db)))
(log-message "Failed to compute metric ~a (~a)."
(symbol->string id) field))))
field ", "
(metric->type metric) ", "
value ", "
now ")
ON CONFLICT ON CONSTRAINT metrics_pkey DO
UPDATE SET value = " value ", timestamp = " now ";"))
(log-message "Failed to compute metric ~a (~a)."
(symbol->string id) field)))))
(define (db-update-metrics)
"Compute and update all available metrics in database."
(with-db-writer-worker-thread/force db
(catch-sqlite-error
;; We can not update all evaluations metrics for performance reasons.
;; Limit to the evaluations that were added during the past three days.
(let ((specifications
(map (cut assq-ref <> #:name) (db-get-specifications)))
(evaluations (db-latest-evaluations)))
(sqlite-exec db "BEGIN TRANSACTION;")
;; We can not update all evaluations metrics for performance reasons.
;; Limit to the evaluations that were added during the past three days.
(with-db-worker-thread db
(let ((specifications
(map (cut assq-ref <> #:name) (db-get-specifications)))
(evaluations (db-latest-evaluations)))
(exec-query db "BEGIN TRANSACTION;")
(db-update-metric 'builds-per-day)
(db-update-metric 'new-derivations-per-day)
(db-update-metric 'pending-builds)
(db-update-metric 'builds-per-day)
(db-update-metric 'new-derivations-per-day)
(db-update-metric 'pending-builds)
;; Update specification related metrics.
(for-each (lambda (spec)
(db-update-metric
'average-10-last-eval-duration-per-spec spec)
(db-update-metric
'average-100-last-eval-duration-per-spec spec)
(db-update-metric
'average-eval-duration-per-spec spec)
;; Update specification related metrics.
(for-each (lambda (spec)
(db-update-metric
'average-10-last-eval-duration-per-spec spec)
(db-update-metric
'average-100-last-eval-duration-per-spec spec)
(db-update-metric
'average-eval-duration-per-spec spec)
(db-update-metric
'percentage-failure-10-last-eval-per-spec spec)
(db-update-metric
'percentage-failure-100-last-eval-per-spec spec)
(db-update-metric
'percentage-failed-eval-per-spec spec))
specifications)
(db-update-metric
'percentage-failure-10-last-eval-per-spec spec)
(db-update-metric
'percentage-failure-100-last-eval-per-spec spec)
(db-update-metric
'percentage-failed-eval-per-spec spec))
specifications)
;; Update evaluation related metrics.
(for-each (lambda (evaluation)
(db-update-metric
'average-eval-build-start-time evaluation)
(db-update-metric
'average-eval-build-complete-time evaluation)
(db-update-metric
'evaluation-completion-speed evaluation))
evaluations)
;; Update evaluation related metrics.
(for-each (lambda (evaluation)
(db-update-metric
'average-eval-build-start-time evaluation)
(db-update-metric
'average-eval-build-complete-time evaluation)
(db-update-metric
'evaluation-completion-speed evaluation))
evaluations)
(sqlite-exec db "COMMIT;"))
(on SQLITE_BUSY_SNAPSHOT => #f))))
(exec-query db "COMMIT;"))))


@@ -157,6 +157,7 @@ system whose names start with " (code "guile-") ":" (br)
(define (status-class status)
(cond
((= (build-status submitted) status) "oi oi-clock text-warning")
((= (build-status scheduled) status) "oi oi-clock text-warning")
((= (build-status started) status) "oi oi-reload text-warning")
((= (build-status succeeded) status) "oi oi-check text-success")
@@ -168,6 +169,7 @@ system whose names start with " (code "guile-") ":" (br)
(define (status-title status)
(cond
((= (build-status submitted) status) "Submitted")
((= (build-status scheduled) status) "Scheduled")
((= (build-status started) status) "Started")
((= (build-status succeeded) status) "Succeeded")


@@ -23,6 +23,10 @@
#:use-module (cuirass logging)
#:use-module (ice-9 match)
#:use-module (ice-9 threads)
#:use-module ((ice-9 suspendable-ports)
#:select (current-read-waiter
current-write-waiter))
#:use-module (ice-9 ports internal)
#:use-module (rnrs bytevectors)
#:use-module (system foreign)
#:use-module (srfi srfi-1)
@@ -106,58 +110,32 @@ delimited continuations and fibers.
(make-parameter #f))
(define* (make-worker-thread-channel initializer
#:key
(parallelism 1)
queue-size
(queue-proc (const #t)))
#:key (parallelism 1))
"Return a channel used to offload work to a dedicated thread. ARGS are the
arguments of the worker thread procedure. This procedure supports deferring
work sent to the worker. If QUEUE-SIZE is set, each work query will be
appended to a queue that will be run once it reaches QUEUE-SIZE elements.
When that happens, the QUEUE-PROC procedure is called with %WORKER-THREAD-ARGS
and a procedure running the queued work as arguments. The worker thread can
be passed options. When #:FORCE? option is set, the worker runs the sent work
immediately even if QUEUE-SIZE has been set."
arguments of the worker thread procedure."
(parameterize (((@@ (fibers internal) current-fiber) #f))
(let ((channel (make-channel)))
(for-each
(lambda _
(let ((args (initializer)))
(call-with-new-thread
(lambda ()
(parameterize ((%worker-thread-args args))
(let loop ((queue '()))
(match (get-message channel)
(((? channel? reply) options (? procedure? proc))
(put-message
reply
(catch #t
(lambda ()
(cond
((or (not queue-size)
(assq-ref options #:force?))
(parameterize ((current-read-waiter (lambda (port)
(port-poll port "r")))
(current-write-waiter (lambda (port)
(port-poll port "w"))))
(lambda ()
(parameterize ((%worker-thread-args args))
(let loop ()
(match (get-message channel)
(((? channel? reply) . (? procedure? proc))
(put-message
reply
(catch #t
(lambda ()
(apply proc args))
(else
(length queue))))
(lambda (key . args)
(cons* 'worker-thread-error key args))))
(let ((new-queue
(cond
((or (not queue-size)
(assq-ref options #:force?))
'())
((= (1+ (length queue)) queue-size)
(let ((run-queue
(lambda ()
(for-each (lambda (thunk)
(apply thunk args))
(append queue (list proc))))))
(apply queue-proc (append args (list run-queue)))
'()))
(else
(append queue (list proc))))))
(loop new-queue))))))))))
(lambda (key . args)
(cons* 'worker-thread-error key args))))))
(loop))))))))
(iota parallelism))
channel)))
@@ -225,7 +203,6 @@ put-operation until it succeeds.
(define* (call-with-worker-thread channel proc
#:key
options
send-timeout
send-timeout-proc
receive-timeout
@@ -239,15 +216,12 @@ to a worker thread.
The same goes for RECEIVE-TIMEOUT and RECEIVE-TIMEOUT-PROC, except that the
timer expires if there is no response from the database worker PROC was sent
to.
OPTIONS are forwarded to the worker thread. See MAKE-WORKER-THREAD-CHANNEL
for a description of the supported options."
to."
(let ((args (%worker-thread-args)))
(if args
(apply proc args)
(let* ((reply (make-channel))
(message (list reply options proc)))
(message (cons reply proc)))
(if (and send-timeout (current-fiber))
(put-message-with-timeout channel message
#:seconds send-timeout


@@ -1,5 +1,9 @@
BEGIN TRANSACTION;
CREATE TABLE SchemaVersion (
version INTEGER NOT NULL
);
CREATE TABLE Specifications (
name TEXT NOT NULL PRIMARY KEY,
load_path_inputs TEXT NOT NULL, -- list of input names whose load path will be in Guile's %load-path
@@ -23,7 +27,17 @@ CREATE TABLE Inputs (
revision TEXT,
no_compile_p INTEGER,
PRIMARY KEY (specification, name),
FOREIGN KEY (specification) REFERENCES Specifications (name)
FOREIGN KEY (specification) REFERENCES Specifications(name)
);
CREATE TABLE Evaluations (
id SERIAL PRIMARY KEY,
specification TEXT NOT NULL,
status INTEGER NOT NULL,
timestamp INTEGER NOT NULL,
checkouttime INTEGER NOT NULL,
evaltime INTEGER NOT NULL,
FOREIGN KEY (specification) REFERENCES Specifications(name)
);
CREATE TABLE Checkouts (
@@ -34,30 +48,13 @@ CREATE TABLE Checkouts (
directory TEXT NOT NULL,
timestamp INTEGER NOT NULL,
PRIMARY KEY (specification, revision),
FOREIGN KEY (evaluation) REFERENCES Evaluations (id),
FOREIGN KEY (specification) REFERENCES Specifications (name),
FOREIGN KEY (input) REFERENCES Inputs (name)
);
CREATE TABLE Evaluations (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
specification TEXT NOT NULL,
status INTEGER NOT NULL,
timestamp INTEGER NOT NULL,
checkouttime INTEGER NOT NULL,
evaltime INTEGER NOT NULL,
FOREIGN KEY (specification) REFERENCES Specifications (name)
);
CREATE TABLE Outputs (
derivation TEXT NOT NULL,
name TEXT NOT NULL,
path TEXT NOT NULL PRIMARY KEY,
FOREIGN KEY (derivation) REFERENCES Builds (derivation)
FOREIGN KEY (evaluation) REFERENCES Evaluations(id),
FOREIGN KEY (specification) REFERENCES Specifications(name),
FOREIGN KEY (specification, input) REFERENCES Inputs(specification, name)
);
CREATE TABLE Builds (
id INTEGER NOT NULL PRIMARY KEY,
id SERIAL PRIMARY KEY,
derivation TEXT NOT NULL UNIQUE,
evaluation INTEGER NOT NULL,
job_name TEXT NOT NULL,
@@ -72,11 +69,19 @@ CREATE TABLE Builds (
timestamp INTEGER NOT NULL,
starttime INTEGER NOT NULL,
stoptime INTEGER NOT NULL,
FOREIGN KEY (evaluation) REFERENCES Evaluations (id)
FOREIGN KEY (evaluation) REFERENCES Evaluations(id)
);
CREATE TABLE Outputs (
derivation TEXT NOT NULL,
name TEXT NOT NULL,
path TEXT NOT NULL PRIMARY KEY,
FOREIGN KEY (derivation) REFERENCES Builds(derivation) ON DELETE CASCADE
);
CREATE TABLE Metrics (
field INTEGER NOT NULL,
id SERIAL,
field TEXT NOT NULL,
type INTEGER NOT NULL,
value DOUBLE PRECISION NOT NULL,
timestamp INTEGER NOT NULL,
@@ -84,17 +89,18 @@ CREATE TABLE Metrics (
);
CREATE TABLE BuildProducts (
id SERIAL,
build INTEGER NOT NULL,
type TEXT NOT NULL,
file_size BIGINT NOT NULL,
checksum TEXT NOT NULL,
path TEXT NOT NULL,
PRIMARY KEY (build, path)
FOREIGN KEY (build) REFERENCES Builds (id) ON DELETE CASCADE
PRIMARY KEY (build, path),
FOREIGN KEY (build) REFERENCES Builds(id) ON DELETE CASCADE
);
CREATE TABLE Events (
id INTEGER PRIMARY KEY,
id SERIAL PRIMARY KEY,
type TEXT NOT NULL,
timestamp INTEGER NOT NULL,
event_json TEXT NOT NULL
@@ -112,12 +118,12 @@ CREATE TABLE Workers (
CREATE INDEX Builds_status_index ON Builds (status);
CREATE INDEX Builds_evaluation_index ON Builds (evaluation, status);
CREATE INDEX Builds_job_name_timestamp on Builds(job_name, timestamp);
CREATE INDEX Builds_nix_name ON Builds (nix_name COLLATE NOCASE);
CREATE INDEX Builds_nix_name ON Builds (nix_name);
CREATE INDEX Builds_timestamp_stoptime on Builds(timestamp, stoptime);
CREATE INDEX Builds_stoptime on Builds(stoptime DESC);
CREATE INDEX Builds_stoptime_id on Builds(stoptime DESC, id DESC);
CREATE INDEX Builds_status_ts_id on Builds(status DESC, timestamp DESC, id ASC);
CREATE INDEX Builds_priority_timestamp on Builds(priority DESC, timestamp ASC);
CREATE INDEX Builds_priority_timestamp on Builds(priority ASC, timestamp DESC);
CREATE INDEX Evaluations_status_index ON Evaluations (id, status);
CREATE INDEX Evaluations_specification_index ON Evaluations (specification, id DESC);
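A note on the id columns in the schema above: the PostgreSQL version uses the `SERIAL` pseudo-type where the SQLite schema used `INTEGER ... AUTOINCREMENT`. As a rough sketch of what `SERIAL` stands for (illustrative table and sequence names, not part of schema.sql):

```sql
-- "id SERIAL PRIMARY KEY" is roughly equivalent to the following;
-- PostgreSQL creates the sequence and the DEFAULT automatically.
CREATE SEQUENCE example_id_seq;
CREATE TABLE Example (
    id integer NOT NULL PRIMARY KEY DEFAULT nextval('example_id_seq')
);
ALTER SEQUENCE example_id_seq OWNED BY Example.id;
```

Because of the `OWNED BY` link, dropping the table also drops its sequence, so no extra cleanup is needed.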

View File

@@ -1,78 +1,3 @@
BEGIN TRANSACTION;
DROP INDEX Specifications_index;
ALTER TABLE Specifications RENAME TO tmp_Specifications;
ALTER TABLE Stamps RENAME TO tmp_Stamps;
ALTER TABLE Evaluations RENAME TO tmp_Evaluations;
CREATE TABLE Specifications (
name TEXT NOT NULL PRIMARY KEY,
load_path_inputs TEXT NOT NULL, -- list of input names whose load path will be in Guile's %load-path
package_path_inputs TEXT NOT NULL, -- list of input names whose load paths will be in GUIX_PACKAGE_PATH
proc_input TEXT NOT NULL, -- name of the input containing the proc that does the evaluation
proc_file TEXT NOT NULL, -- file containing the procedure that does the evaluation, relative to proc_input
proc TEXT NOT NULL, -- defined in proc_file
proc_args TEXT NOT NULL -- passed to proc
);
CREATE TABLE Inputs (
specification TEXT NOT NULL,
name TEXT NOT NULL,
url TEXT NOT NULL,
load_path TEXT NOT NULL,
-- The following columns are optional.
branch TEXT,
tag TEXT,
revision TEXT,
no_compile_p INTEGER,
PRIMARY KEY (specification, name),
FOREIGN KEY (specification) REFERENCES Specifications (name)
);
CREATE TABLE Stamps (
specification TEXT NOT NULL PRIMARY KEY,
stamp TEXT NOT NULL,
FOREIGN KEY (specification) REFERENCES Specifications (name)
);
CREATE TABLE Evaluations (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
specification TEXT NOT NULL,
commits TEXT NOT NULL,
FOREIGN KEY (specification) REFERENCES Specifications (name)
);
INSERT INTO Specifications (name, load_path_inputs, package_path_inputs, proc_input, proc_file, proc, proc_args)
SELECT printf('%s-%s', repo_name, branch) AS name,
printf('("%s")', repo_name) AS load_path_inputs,
'()' AS package_path_inputs,
repo_name AS proc_input,
file AS proc_file,
proc,
arguments AS proc_args
FROM tmp_Specifications;
INSERT INTO Inputs (specification, name, url, load_path, branch, tag, revision, no_compile_p)
SELECT printf('%s-%s', repo_name, branch) AS specification,
repo_name AS name,
url, load_path, branch, tag, revision, no_compile_p
FROM tmp_Specifications;
INSERT INTO Stamps (specification, stamp)
SELECT Specifications.name AS specification, stamp
FROM tmp_Stamps
LEFT JOIN Specifications ON Specifications.proc_input = tmp_Stamps.specification;
INSERT INTO Evaluations (id, specification, commits)
SELECT id, Specifications.name AS specification, revision
FROM tmp_Evaluations
LEFT JOIN Specifications ON Specifications.proc_input = tmp_Evaluations.specification;
CREATE INDEX Inputs_index ON Inputs(specification, name, branch);
DROP TABLE tmp_Specifications;
DROP TABLE tmp_Stamps;
DROP TABLE tmp_Evaluations;
COMMIT;

View File

@@ -1,12 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE Evaluations RENAME COLUMN in_progress TO status;
-- Set all pending evaluations to aborted.
UPDATE Evaluations SET status = 2 WHERE status = 1;
-- All evaluations that did not trigger any build are set to failed.
UPDATE Evaluations SET status = 1 WHERE id NOT IN
(SELECT evaluation FROM Builds);
COMMIT;

View File

@@ -1,11 +0,0 @@
BEGIN TRANSACTION;
CREATE TABLE Metrics (
field INTEGER NOT NULL,
type INTEGER NOT NULL,
value DOUBLE PRECISION NOT NULL,
timestamp INTEGER NOT NULL,
PRIMARY KEY (field, type)
);
COMMIT;

View File

@@ -1,7 +0,0 @@
BEGIN TRANSACTION;
CREATE INDEX Builds_evaluation_index ON Builds (evaluation, status);
CREATE INDEX Evaluations_status_index ON Evaluations (id, status);
CREATE INDEX Evaluations_specification_index ON Evaluations (specification, id DESC);
COMMIT;

View File

@@ -1,5 +0,0 @@
BEGIN TRANSACTION;
CREATE INDEX Builds_nix_name ON Builds (nix_name COLLATE NOCASE);
COMMIT;

View File

@@ -1,5 +0,0 @@
BEGIN TRANSACTION;
CREATE INDEX Builds_timestamp_stoptime on Builds(timestamp, stoptime);
COMMIT;

View File

@@ -1,7 +0,0 @@
BEGIN TRANSACTION;
CREATE INDEX Builds_stoptime on Builds(stoptime DESC);
CREATE INDEX Builds_stoptime_id on Builds(stoptime DESC, id DESC);
CREATE INDEX Builds_status_ts_id on Builds(status DESC, timestamp DESC, id ASC);
COMMIT;

View File

@@ -1,5 +0,0 @@
BEGIN TRANSACTION;
CREATE INDEX Builds_job_name_timestamp on Builds(job_name, timestamp);
COMMIT;

View File

@@ -1,5 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE Builds ADD worker TEXT DEFAULT NULL;
COMMIT;

View File

@@ -1,10 +0,0 @@
BEGIN TRANSACTION;
CREATE TABLE Workers (
name TEXT NOT NULL PRIMARY KEY,
address TEXT NOT NULL,
systems TEXT NOT NULL,
last_seen INTEGER NOT NULL
);
COMMIT;

View File

@@ -1,11 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE Specifications ADD priority INTEGER NOT NULL DEFAULT 0;
ALTER TABLE Builds ADD priority INTEGER NOT NULL DEFAULT 0;
ALTER TABLE Builds ADD max_silent INTEGER NOT NULL DEFAULT 0;
ALTER TABLE Builds ADD timeout INTEGER NOT NULL DEFAULT 0;
CREATE INDEX Builds_priority_timestamp on Builds(priority DESC, timestamp ASC);
COMMIT;

View File

@@ -1,49 +0,0 @@
BEGIN TRANSACTION;
DROP INDEX Derivations_index;
DROP INDEX Builds_Derivations_index;
ALTER TABLE Outputs RENAME TO tmp_Outputs;
ALTER TABLE Builds RENAME TO tmp_Builds;
CREATE TABLE Builds (
derivation TEXT NOT NULL PRIMARY KEY,
evaluation INTEGER NOT NULL,
job_name TEXT NOT NULL,
system TEXT NOT NULL,
nix_name TEXT NOT NULL,
log TEXT NOT NULL,
status INTEGER NOT NULL,
timestamp INTEGER NOT NULL,
starttime INTEGER NOT NULL,
stoptime INTEGER NOT NULL,
FOREIGN KEY (evaluation) REFERENCES Evaluations (id)
);
CREATE TABLE Outputs (
derivation TEXT NOT NULL,
name TEXT NOT NULL,
path TEXT NOT NULL,
PRIMARY KEY (derivation, name),
FOREIGN KEY (derivation) REFERENCES Builds (derivation)
);
INSERT OR IGNORE INTO Builds (derivation, evaluation, job_name, system, nix_name, log, status, timestamp, starttime, stoptime)
SELECT Derivations.derivation, Derivations.evaluation, Derivations.job_name, Derivations.system, Derivations.nix_name,
tmp_Builds.log, tmp_Builds.status, tmp_Builds.timestamp, tmp_Builds.starttime, tmp_Builds.stoptime
FROM Derivations
INNER JOIN tmp_Builds ON tmp_Builds.derivation = Derivations.derivation
AND tmp_Builds.evaluation = Derivations.evaluation;
INSERT OR IGNORE INTO Outputs (derivation, name, path)
SELECT tmp_Builds.derivation, tmp_Outputs.name, tmp_Outputs.path
FROM tmp_Outputs
INNER JOIN tmp_Builds on tmp_Builds.id = tmp_Outputs.build;
CREATE INDEX Builds_index ON Builds(job_name, system, status ASC, timestamp ASC, derivation, evaluation, stoptime DESC);
DROP TABLE tmp_Builds;
DROP TABLE tmp_Outputs;
DROP TABLE Derivations;
COMMIT;

View File

@@ -1,46 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE Evaluations RENAME TO tmp_Evaluations;
CREATE TABLE Evaluations (
id INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,
specification TEXT NOT NULL,
in_progress INTEGER NOT NULL,
FOREIGN KEY (specification) REFERENCES Specifications (name)
);
CREATE TABLE Checkouts (
specification TEXT NOT NULL,
revision TEXT NOT NULL,
evaluation INTEGER NOT NULL,
input TEXT NOT NULL,
directory TEXT NOT NULL,
PRIMARY KEY (specification, revision),
FOREIGN KEY (evaluation) REFERENCES Evaluations (id),
FOREIGN KEY (specification) REFERENCES Specifications (name),
FOREIGN KEY (input) REFERENCES Inputs (name)
);
INSERT INTO Evaluations (id, specification, in_progress)
SELECT id, specification, false
FROM tmp_Evaluations;
-- Copied from https://www.samuelbosch.com/2018/02/split-into-rows-sqlite.html.
INSERT OR IGNORE INTO Checkouts (specification, revision, evaluation, input, directory)
WITH RECURSIVE split(id, specification, revision, rest) AS (
SELECT id, specification, '', commits || ' ' FROM tmp_Evaluations
UNION ALL
SELECT id,
specification,
substr(rest, 0, instr(rest, ' ')),
substr(rest, instr(rest, ' ') + 1)
FROM split
WHERE rest <> '')
SELECT specification, revision, id, 'unknown', 'unknown'
FROM split
WHERE revision <> '';
DROP TABLE tmp_Evaluations;
DROP TABLE Stamps;
COMMIT;
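The recursive CTE removed above split each evaluation's space-separated `commits` column into one `Checkouts` row per revision, working around SQLite's lack of a string-splitting function. For comparison only (this migration is SQLite-era code and is simply deleted by this commit), PostgreSQL could express the same transformation without recursion:

```sql
-- Hypothetical PostgreSQL equivalent of the recursive split above:
-- regexp_split_to_table() expands each commits string into rows.
INSERT INTO Checkouts (specification, revision, evaluation, input, directory)
SELECT e.specification, r.revision, e.id, 'unknown', 'unknown'
FROM tmp_Evaluations AS e,
     regexp_split_to_table(e.commits, ' ') AS r(revision)
WHERE r.revision <> '';
```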

View File

@@ -1,18 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE Outputs RENAME TO tmp_Outputs;
CREATE TABLE Outputs (
derivation TEXT NOT NULL,
name TEXT NOT NULL,
path TEXT NOT NULL PRIMARY KEY,
FOREIGN KEY (derivation) REFERENCES Builds (derivation)
);
INSERT OR IGNORE INTO Outputs (derivation, name, path)
SELECT derivation, name, path
FROM tmp_Outputs;
DROP TABLE tmp_Outputs;
COMMIT;

View File

@@ -1,15 +0,0 @@
BEGIN TRANSACTION;
CREATE TABLE Events (
id INTEGER PRIMARY KEY,
type TEXT NOT NULL,
timestamp INTEGER NOT NULL,
event_json TEXT NOT NULL
);
CREATE TABLE EventsOutbox (
event_id INTEGER NOT NULL,
FOREIGN KEY (event_id) REFERENCES Events (id)
);
COMMIT;

View File

@@ -1,47 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE Builds RENAME TO OldBuilds;
CREATE TABLE Builds (
id INTEGER NOT NULL PRIMARY KEY,
derivation TEXT NOT NULL UNIQUE,
evaluation INTEGER NOT NULL,
job_name TEXT NOT NULL,
system TEXT NOT NULL,
nix_name TEXT NOT NULL,
log TEXT NOT NULL,
status INTEGER NOT NULL,
timestamp INTEGER NOT NULL,
starttime INTEGER NOT NULL,
stoptime INTEGER NOT NULL,
FOREIGN KEY (evaluation) REFERENCES Evaluations (id)
);
INSERT INTO Builds(
id,
derivation,
evaluation,
job_name,
system,
nix_name,
log,
status,
timestamp,
starttime,
stoptime
) SELECT rowid,
derivation,
evaluation,
job_name,
system,
nix_name,
log,
status,
timestamp,
starttime,
stoptime
FROM OldBuilds;
DROP TABLE OldBuilds;
COMMIT;

View File

@@ -1,15 +0,0 @@
BEGIN TRANSACTION;
CREATE TABLE BuildProducts (
build INTEGER NOT NULL,
type TEXT NOT NULL,
file_size BIGINT NOT NULL,
checksum TEXT NOT NULL,
path TEXT NOT NULL,
PRIMARY KEY (build, path)
FOREIGN KEY (build) REFERENCES Builds (id) ON DELETE CASCADE
);
ALTER TABLE Specifications ADD build_outputs TEXT NOT NULL DEFAULT "()";
COMMIT;

View File

@@ -1,7 +0,0 @@
BEGIN TRANSACTION;
CREATE INDEX Builds_status_index ON Builds (status);
CREATE INDEX Outputs_derivation_index ON Outputs (derivation);
COMMIT;

View File

@@ -1,9 +0,0 @@
BEGIN TRANSACTION;
ALTER TABLE Evaluations ADD timestamp INTEGER NOT NULL DEFAULT 0;
ALTER TABLE Evaluations ADD checkouttime INTEGER NOT NULL DEFAULT 0;
ALTER TABLE Evaluations ADD evaltime INTEGER NOT NULL DEFAULT 0;
ALTER TABLE Checkouts ADD timestamp INTEGER NOT NULL DEFAULT 0;
COMMIT;

View File

@@ -21,8 +21,12 @@
;;; along with Cuirass. If not, see <http://www.gnu.org/licenses/>.
(use-modules (cuirass database)
((guix utils) #:select (call-with-temporary-output-file))
(cuirass remote)
(cuirass utils)
((guix utils) #:select (call-with-temporary-output-file))
(squee)
(ice-9 match)
(srfi srfi-19)
(srfi srfi-64))
(define example-spec
@@ -33,15 +37,15 @@
(#:proc-file . "/tmp/gnu-system.scm")
(#:proc . hydra-jobs)
(#:proc-args (subset . "hello"))
(#:inputs . (((#:name . "savannah")
(#:url . "git://git.savannah.gnu.org/guix.git")
(#:inputs . (((#:name . "maintenance")
(#:url . "git://git.savannah.gnu.org/guix/maintenance.git")
(#:load-path . ".")
(#:branch . "master")
(#:tag . #f)
(#:commit . #f)
(#:no-compile? . #f))
((#:name . "maintenance")
(#:url . "git://git.savannah.gnu.org/guix/maintenance.git")
((#:name . "savannah")
(#:url . "git://git.savannah.gnu.org/guix.git")
(#:load-path . ".")
(#:branch . "master")
(#:tag . #f)
@@ -52,173 +56,353 @@
(define (make-dummy-checkouts fakesha1 fakesha2)
`(((#:commit . ,fakesha1)
(#:input . "guix")
(#:input . "savannah")
(#:directory . "foo"))
((#:commit . ,fakesha2)
(#:input . "packages")
(#:input . "maintenance")
(#:directory . "bar"))))
(define* (make-dummy-build drv
#:optional (eval-id 42)
#:optional (eval-id 2)
#:key (outputs
`(("foo" . ,(format #f "~a.output" drv)))))
`((#:derivation . ,drv)
(#:eval-id . ,eval-id)
(#:job-name . "job")
(#:timestamp . ,(time-second (current-time time-utc)))
(#:system . "x86_64-linux")
(#:nix-name . "foo")
(#:log . "log")
(#:outputs . ,outputs)))
(define-syntax-rule (with-temporary-database body ...)
(call-with-temporary-output-file
(lambda (file port)
(parameterize ((%package-database file))
(db-init file)
(with-database
(parameterize ((%db-writer-channel (%db-channel)))
body ...))))))
(define %dummy-worker
(worker
(name "worker")
(address "address")
(systems '("a" "b"))
(last-seen "1")))
(define %db
;; Global Slot for a database object.
(make-parameter #t))
(make-parameter #f))
(define database-name
;; Use an empty and temporary database for the tests.
(string-append (getcwd) "/" (number->string (getpid)) "-tmp.db"))
(define db-name "test_database")
(%record-events? #t)
(test-group-with-cleanup "database"
(test-assert "db-init"
(begin
(%db (db-init database-name))
(%db (db-open))
(%db-channel (make-worker-thread-channel
(lambda ()
(list (%db)))))
(%db-writer-channel (%db-channel))
#t))
(test-assert "sqlite-exec"
(begin
(sqlite-exec (%db) "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES (1, 0, 0, 0, 0);")
(sqlite-exec (%db) "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES (2, 0, 0, 0, 0);")
(sqlite-exec (%db) "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES (3, 0, 0, 0, 0);")
(sqlite-exec (%db) "SELECT * FROM Evaluations;")))
(test-equal "db-add-specification"
example-spec
"guix"
(db-add-specification example-spec))
(test-assert "exec-query"
(begin
(db-add-specification example-spec)
(car (db-get-specifications))))
(exec-query (%db) "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES ('guix', 0, 0, 0, 0);")
(exec-query (%db) "SELECT * FROM Evaluations;")))
(test-equal "db-get-specification"
example-spec
(db-get-specification "guix"))
(test-equal "db-add-build"
#f
(let ((build (make-dummy-build "/foo.drv")))
(db-add-build build)
(test-equal "db-add-evaluation"
'(2 3)
(list
(db-add-evaluation "guix"
(make-dummy-checkouts "fakesha1" "fakesha2"))
(db-add-evaluation "guix"
(make-dummy-checkouts "fakesha3" "fakesha4"))))
;; Should return #f when adding a build whose derivation is already
;; there, see <https://bugs.gnu.org/28094>.
(catch-sqlite-error
(db-add-build build)
(on SQLITE_CONSTRAINT_UNIQUE => #f))))
(test-assert "db-set-evaluation-status"
(db-set-evaluation-status 2 (evaluation-status started)))
(test-assert "db-set-evaluation-time"
(db-set-evaluation-time 2))
(test-assert "db-abort-pending-evaluations"
(db-abort-pending-evaluations))
(test-equal "db-add-build"
"/foo.drv"
(let ((build (make-dummy-build "/foo.drv")))
(db-add-build build)))
(test-equal "db-add-build duplicate"
"/foo.drv"
(let ((build (make-dummy-build "/foo.drv")))
(db-add-build build)))
(test-assert "db-add-build-product"
(db-add-build-product `((#:build . 1)
(#:type . "1")
(#:file-size . 1)
(#:checksum . "sum")
(#:path . "path"))))
(test-equal "db-get-output"
'((#:derivation . "/foo.drv") (#:name . "foo"))
(db-get-output "/foo.drv.output"))
(test-equal "db-get-outputs"
'(("foo" (#:path . "/foo.drv.output")))
(db-get-outputs "/foo.drv"))
(test-assert "db-get-time-since-previous-build"
(db-get-time-since-previous-build "job" "guix"))
(test-assert "db-register-builds"
(let ((drv "/test.drv"))
(db-register-builds `(((#:job-name . "test")
(#:derivation . ,drv)
(#:system . "x86_64-linux")
(#:nix-name . "test")
(#:log . "log")
(#:outputs .
(("foo" . ,(format #f "~a.output" drv))
("foo2" . ,(format #f "~a.output.2" drv))))))
2 (db-get-specification "guix"))))
(test-assert "db-update-build-status!"
(db-update-build-status! "/test.drv"
(build-status failed)))
(test-assert "db-update-build-worker!"
(db-update-build-worker! "/test.drv" "worker"))
(test-equal "db-get-builds-by-search"
'(3 1 "test")
(let ((build
(match (db-get-builds-by-search
'((nr . 1)
(query . "status:failed test")))
((build) build))))
(list
(assoc-ref build #:id)
(assoc-ref build #:status)
(assoc-ref build #:job-name))))
(test-assert "db-get-builds"
(let* ((build (match (db-get-builds `((order . build-id)
(status . failed)))
((build) build)))
(outputs (assq-ref build #:outputs)))
(equal? outputs
'(("foo" (#:path . "/test.drv.output"))
("foo2" (#:path . "/test.drv.output.2"))))))
(test-equal "db-get-builds job-name"
"/foo.drv"
(let ((build (match (db-get-builds `((order . build-id)
(job . "job")))
((build) build))))
(assoc-ref build #:derivation)))
(test-equal "db-get-build"
"/foo.drv"
(let ((build (db-get-build 1)))
(assoc-ref build #:derivation)))
(test-equal "db-get-build derivation"
1
(let ((build (db-get-build "/foo.drv")))
(assoc-ref build #:id)))
(test-equal "db-get-events"
'evaluation
(let ((event (match (db-get-events '((nr . 1)
(type . evaluation)))
((event) event))))
(assoc-ref event #:type)))
(test-equal "db-delete-events-with-ids-<=-to"
1
(db-delete-events-with-ids-<=-to 1))
(test-equal "db-get-pending-derivations"
'("/foo.drv")
(db-get-pending-derivations))
(test-assert "db-get-checkouts"
(equal? (db-get-checkouts 2)
(make-dummy-checkouts "fakesha1" "fakesha2")))
(test-equal "db-get-evaluation"
"guix"
(let ((evaluation (db-get-evaluation 2)))
(assq-ref evaluation #:specification)))
(test-equal "db-get-evaluations"
'("guix" "guix")
(map (lambda (eval)
(assq-ref eval #:specification))
(db-get-evaluations 2)))
(test-equal "db-get-evaluations-build-summary"
'((0 0 0) (0 1 1))
(let ((summaries
(db-get-evaluations-build-summary "guix" 2 #f #f)))
(map (lambda (summary)
(list
(assq-ref summary #:succeeded)
(assq-ref summary #:failed)
(assq-ref summary #:scheduled)))
summaries)))
(test-equal "db-get-evaluations-id-min"
1
(db-get-evaluations-id-min "guix"))
(test-equal "db-get-evaluations-id-min"
#f
(db-get-evaluations-id-min "foo"))
(test-equal "db-get-evaluations-id-max"
3
(db-get-evaluations-id-max "guix"))
(test-equal "db-get-evaluations-id-max"
#f
(db-get-evaluations-id-max "foo"))
(test-equal "db-get-evaluation-summary"
'(2 0 1 1)
(let* ((summary (db-get-evaluation-summary 2))
(total (assq-ref summary #:total))
(succeeded (assq-ref summary #:succeeded))
(failed (assq-ref summary #:failed))
(scheduled (assq-ref summary #:scheduled)))
(list total succeeded failed scheduled)))
(test-equal "db-get-evaluation-summary empty"
'(0 0 0 0)
(let* ((summary (db-get-evaluation-summary 3))
(total (assq-ref summary #:total))
(succeeded (assq-ref summary #:succeeded))
(failed (assq-ref summary #:failed))
(scheduled (assq-ref summary #:scheduled)))
(list total succeeded failed scheduled)))
(test-equal "db-get-builds-query-min"
'(1)
(db-get-builds-query-min "spec:guix foo"))
(test-equal "db-get-builds-query-max"
'(3)
(db-get-builds-query-min "spec:guix status:failed test"))
(test-equal "db-get-builds-min"
3
(match (db-get-builds-min 2 "failed")
((timestamp id)
id)))
(test-equal "db-get-builds-max"
1
(match (db-get-builds-max 2 "pending")
((timestamp id)
id)))
(test-equal "db-get-evaluation-specification"
"guix"
(db-get-evaluation-specification 2))
(test-equal "db-get-build-products"
`(((#:id . 1)
(#:type . "1")
(#:file-size . 1)
(#:checksum . "sum")
(#:path . "path")))
(db-get-build-products 1))
(test-equal "db-get-build-product-path"
"path"
(db-get-build-product-path 1))
(test-equal "db-add-worker"
1
(db-add-worker %dummy-worker))
(test-equal "db-get-workers"
(list %dummy-worker)
(db-get-workers))
(test-equal "db-clear-workers"
'()
(begin
(db-clear-workers)
(db-get-workers)))
(test-equal "db-update-build-status!"
(list (build-status scheduled)
(build-status started)
(build-status succeeded)
"/foo.drv.log")
(with-temporary-database
(let* ((derivation (db-add-build
(make-dummy-build "/foo.drv" 1
#:outputs '(("out" . "/foo")))))
(get-status (lambda* (#:optional (key #:status))
(assq-ref (db-get-build derivation) key))))
(db-add-evaluation "guix" (make-dummy-checkouts "fakesha1"
"fakesha2"))
(db-add-specification example-spec)
"/foo2.drv.log")
(let* ((derivation (db-add-build
(make-dummy-build "/foo2.drv" 2
#:outputs '(("out" . "/foo")))))
(get-status (lambda* (#:optional (key #:status))
(assq-ref (db-get-build derivation) key))))
(let ((status0 (get-status)))
(db-update-build-status! "/foo2.drv" (build-status started))
(let ((status1 (get-status)))
(db-update-build-status! "/foo2.drv" (build-status succeeded)
#:log-file "/foo2.drv.log")
(let ((status0 (get-status)))
(db-update-build-status! "/foo.drv" (build-status started))
(let ((status1 (get-status)))
(db-update-build-status! "/foo.drv" (build-status succeeded)
#:log-file "/foo.drv.log")
;; Second call shouldn't make any difference.
(db-update-build-status! "/foo2.drv" (build-status succeeded)
#:log-file "/foo2.drv.log")
;; Second call shouldn't make any difference.
(db-update-build-status! "/foo.drv" (build-status succeeded)
#:log-file "/foo.drv.log")
(let ((status2 (get-status))
(start (get-status #:starttime))
(end (get-status #:stoptime))
(log (get-status #:log)))
(and (> start 0) (>= end start)
(list status0 status1 status2 log))))))))
(let ((status2 (get-status))
(start (get-status #:starttime))
(end (get-status #:stoptime))
(log (get-status #:log)))
(and (> start 0) (>= end start)
(list status0 status1 status2 log)))))))
(test-equal "db-get-builds"
#(((1 "/foo.drv") (2 "/bar.drv") (3 "/baz.drv")) ;ascending order
((3 "/baz.drv") (2 "/bar.drv") (1 "/foo.drv")) ;descending order
((3 "/baz.drv") (2 "/bar.drv") (1 "/foo.drv")) ;ditto
((3 "/baz.drv")) ;nr = 1
((2 "/bar.drv") (1 "/foo.drv") (3 "/baz.drv"))) ;status+submission-time
(with-temporary-database
;; Populate the 'Builds', 'Evaluations', and
;; 'Specifications' tables in a consistent way, as expected by the
;; 'db-get-builds' query.
(db-add-build (make-dummy-build "/foo.drv" 1
#:outputs `(("out" . "/foo"))))
'(("/baa.drv" "/bar.drv" "/baz.drv") ;ascending order
("/baz.drv" "/bar.drv" "/baa.drv") ;descending order
("/baz.drv" "/bar.drv" "/baa.drv") ;ditto
("/baz.drv") ;nr = 1
("/bar.drv" "/baa.drv" "/baz.drv")) ;status+submission-time
(begin
(exec-query (%db) "DELETE FROM Builds;")
(db-add-build (make-dummy-build "/baa.drv" 2
#:outputs `(("out" . "/baa"))))
(db-add-build (make-dummy-build "/bar.drv" 2
#:outputs `(("out" . "/bar"))))
(db-add-build (make-dummy-build "/baz.drv" 3
(db-add-build (make-dummy-build "/baz.drv" 2
#:outputs `(("out" . "/baz"))))
(db-add-evaluation "guix" (make-dummy-checkouts "fakesha1" "fakesha2"))
(db-add-evaluation "guix" (make-dummy-checkouts "fakesha1" "fakesha3"))
(db-add-evaluation "guix" (make-dummy-checkouts "fakesha2" "fakesha3"))
(db-add-specification example-spec)
(db-update-build-status! "/bar.drv" (build-status started)
#:log-file "/bar.drv.log")
(let ((summarize (lambda (alist)
(list (assq-ref alist #:id)
(assq-ref alist #:derivation)))))
(vector (map summarize (db-get-builds '((nr . 3) (order . build-id))))
(map summarize (db-get-builds '()))
(map summarize (db-get-builds '((jobset . "guix"))))
(map summarize (db-get-builds '((nr . 1))))
(map summarize
(db-get-builds '((order . status+submission-time))))))))
(assq-ref alist #:derivation))))
(list (map summarize (db-get-builds '((nr . 3) (order . build-id))))
(map summarize (db-get-builds '()))
(map summarize (db-get-builds '((jobset . "guix"))))
(map summarize (db-get-builds '((nr . 1))))
(map summarize
(db-get-builds '((order . status+submission-time))))))))
(test-equal "db-get-pending-derivations"
'("/bar.drv" "/foo.drv")
(with-temporary-database
;; Populate the 'Builds', 'Evaluations', and 'Specifications' tables.
(begin
(exec-query (%db) "DELETE FROM Builds;")
(db-add-build (make-dummy-build "/foo.drv" 1
#:outputs `(("out" . "/foo"))))
(db-add-build (make-dummy-build "/bar.drv" 2
#:outputs `(("out" . "/bar"))))
(db-add-evaluation "guix" (make-dummy-checkouts "fakesha1" "fakesha2"))
(db-add-evaluation "guix" (make-dummy-checkouts "fakesha1" "fakesha3"))
(db-add-evaluation "guix" (make-dummy-checkouts "fakesha2" "fakesha3"))
(db-add-specification example-spec)
(sort (db-get-pending-derivations) string<?)))
(test-assert "db-close"
(db-close (%db)))
(begin
(%db-channel #f)
(delete-file database-name)))
;;; Local Variables:
;;; eval: (put 'with-temporary-database 'scheme-indent-function 0)
;;; End:
(begin
(exec-query (%db) (format #f "DROP OWNED BY CURRENT_USER;"))
(db-close (%db))
#t)))

View File

@@ -24,6 +24,7 @@
(cuirass utils)
(json)
(fibers)
(squee)
(web uri)
(web client)
(web response)
@@ -48,13 +49,8 @@
(define (test-cuirass-uri route)
(string-append "http://localhost:6688" route))
(define database-name
;; Use an empty and temporary database for the tests.
(string-append (getcwd) "/" (number->string (getpid)) "-tmp.db"))
(define %db
;; Global Slot for a database object.
(make-parameter #t))
(make-parameter #f))
(define build-query-result
'((#:id . 1)
@@ -111,11 +107,10 @@
(test-assert "db-init"
(begin
(%db (db-init database-name))
(%db (db-open))
(%db-channel (make-worker-thread-channel
(lambda ()
(list (%db)))))
(%db-writer-channel (%db-channel))
#t))
(test-assert "cuirass-run"
@@ -191,13 +186,13 @@
((#:commit . "fakesha3")
(#:input . "packages")
(#:directory . "dir4")))))
(db-add-build build1)
(db-add-build build2)
(db-add-specification specification)
(db-add-evaluation "guix" checkouts1
#:timestamp 1501347493)
(db-add-evaluation "guix" checkouts2
#:timestamp 1501347493)))
#:timestamp 1501347493)
(db-add-build build1)
(db-add-build build2)))
(test-assert "/specifications"
(match (call-with-input-string
@@ -290,8 +285,7 @@
(http-get-body (test-cuirass-uri "/api/evaluations?nr=1")))))
(test-assert "db-close"
(db-close (%db)))
(begin
(%db-channel #f)
(delete-file database-name)))
(begin
(exec-query (%db) (format #f "DROP OWNED BY CURRENT_USER;"))
(db-close (%db))
#t)))

View File

@@ -21,16 +21,9 @@
(cuirass metrics)
(cuirass utils)
((guix utils) #:select (call-with-temporary-output-file))
(squee)
(srfi srfi-64))
(define-syntax-rule (with-temporary-database body ...)
(call-with-temporary-output-file
(lambda (file port)
(parameterize ((%package-database file))
(db-init file)
(with-database
body ...)))))
(define today
(let ((time (current-time)))
(- time (modulo time 86400))))
@@ -39,50 +32,49 @@
(- today 86400))
(define %db
;; Global Slot for a database object.
(make-parameter #t))
(define database-name
;; Use an empty and temporary database for the tests.
(string-append (getcwd) "/" (number->string (getpid)) "-tmp.db"))
(make-parameter #f))
(test-group-with-cleanup "database"
(test-assert "db-init"
(begin
(%db (db-init database-name))
(%db (db-open))
(%db-channel (make-worker-thread-channel
(lambda ()
(list (%db)))))
(%db-writer-channel (%db-channel))
#t))
(test-assert "sqlite-exec"
(test-assert "exec-query"
(begin
(sqlite-exec (%db) "\
(exec-query (%db) "\
INSERT INTO Specifications (name, load_path_inputs, package_path_inputs,
proc_input, proc_file, proc, proc_args, build_outputs, priority)
VALUES ('guix', '()', '()', 'guix', 'build-aux/cuirass/gnu-system.scm',
'cuirass-jobs', '', '', 2);")
(exec-query (%db) "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES ('guix', -1, 1600174547, 0, 0);")
(sqlite-exec (%db) (format #f "\
(exec-query (%db) (format #f "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES ('guix', 0, ~a, ~a, ~a);\
" yesterday (+ yesterday 100) (+ yesterday 600)))
(sqlite-exec (%db) "\
(exec-query (%db) "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES ('guix', 1, 1600174547,
1600174548, 0);")
(sqlite-exec (%db) "\
(exec-query (%db) "\
INSERT INTO Evaluations (specification, status,
timestamp, checkouttime, evaltime) VALUES ('guix', 1, 1600174547,
1600174548, 1600174647);")
(sqlite-exec (%db) (format #f "\
(exec-query (%db) (format #f "\
INSERT INTO Builds (id, derivation, evaluation, job_name, system,
nix_name, log, status, timestamp, starttime, stoptime) VALUES
(1, '/gnu/store/1.drv', 2, '', '', '', '', 0, ~a, ~a, ~a);\
" yesterday (+ yesterday 1600) (+ yesterday 2600)))
(sqlite-exec (%db) (format #f "\
(exec-query (%db) (format #f "\
INSERT INTO Builds (id, derivation, evaluation, job_name, system,
nix_name, log, status, timestamp, starttime, stoptime) VALUES
(2, '/gnu/store/2.drv', 2, '', '', '', '', -2, 0, 0, 0);"))
(sqlite-exec (%db) (format #f "\
(exec-query (%db) (format #f "\
INSERT INTO Builds (id, derivation, evaluation, job_name, system,
nix_name, log, status, timestamp, starttime, stoptime) VALUES
(3, '/gnu/store/3.drv', 4, '', '', '', '', 0, 1600174451, 1600174451,
@@ -94,65 +86,60 @@ nix_name, log, status, timestamp, starttime, stoptime) VALUES
(db-update-metric 'average-eval-duration-per-spec "guix")
(db-get-metrics-with-id 'average-eval-duration-per-spec)))
(test-equal "builds-per-day"
1.0
(begin
(db-update-metric 'builds-per-day)
(db-get-metric 'builds-per-day yesterday)))
(test-equal "pending-builds"
`((,today . 1.0))
(begin
(db-update-metric 'pending-builds)
(db-get-metrics-with-id 'pending-builds)))
(test-equal "new-derivations-per-day"
`((,yesterday . 1.0))
(begin
(db-update-metric 'new-derivations-per-day)
(db-get-metrics-with-id 'new-derivations-per-day)))
(test-equal "percentage-failed-eval-per-spec"
`(("guix" . 50.0))
(begin
(db-update-metric 'percentage-failed-eval-per-spec "guix")
(db-get-metrics-with-id 'percentage-failed-eval-per-spec)))
(test-equal "db-update-metrics"
`((,today . 2.0))
(begin
(exec-query (%db) (format #f "\
INSERT INTO Builds (id, derivation, evaluation, job_name, system,
nix_name, log, status, timestamp, starttime, stoptime) VALUES
(4, '/gnu/store/4.drv', 1, '', '', '', '', -2, 0, 0, 0);"))
(db-update-metrics)
(db-get-metrics-with-id 'pending-builds)))
(test-equal "average-eval-build-start-time"
`((2 . 1000.0))
(begin
(db-update-metric 'average-eval-build-start-time 2)
(db-get-metrics-with-id 'average-eval-build-start-time)))
(test-equal "average-eval-build-complete-time"
`((2 . 2000.0))
(begin
(db-update-metric 'average-eval-build-complete-time 2)
(db-get-metrics-with-id 'average-eval-build-complete-time)))
(test-equal "evaluation-completion-speed"
900.0
(begin
(db-update-metric 'evaluation-completion-speed 4)
(db-get-metric 'evaluation-completion-speed 4)))
(test-assert "db-close"
(begin
(exec-query (%db) (format #f "DROP OWNED BY CURRENT_USER;"))
(db-close (%db))
#t)))