_id stringlengths 64 64 | repository stringlengths 6 84 | name stringlengths 4 110 | content stringlengths 0 248k | license null | download_url stringlengths 89 454 | language stringclasses 7 values | comments stringlengths 0 74.6k | code stringlengths 0 248k |
|---|---|---|---|---|---|---|---|---|
be49a02be786327e38ddc0ccfeb63d305dc41675643fb6a7417cb40b6c0e1f78 | YoshikuniJujo/funpaala | sumN1.hs | sumN :: Integer -> Integer
sumN n = sum [0 .. n]
sum3N :: Integer -> Integer
sum3N n = sum $ map (* 3) [0 .. n]
sum3N5 :: Integer -> Integer
sum3N5 n = sum . map (* 3) $ filter ((/= 0) . (`mod` 5)) [0 .. n]
| null | https://raw.githubusercontent.com/YoshikuniJujo/funpaala/5366130826da0e6b1180992dfff94c4a634cda99/samples/10_repetition/sumN1.hs | haskell | sumN :: Integer -> Integer
sumN n = sum [0 .. n]
sum3N :: Integer -> Integer
sum3N n = sum $ map (* 3) [0 .. n]
sum3N5 :: Integer -> Integer
sum3N5 n = sum . map (* 3) $ filter ((/= 0) . (`mod` 5)) [0 .. n]
| |
cb1c09bb351572e1bed98f4c0177d36fe05d4e3caab552bbfb1244f02fc366d7 | dmitryvk/sbcl-win32-threads | stack.lisp | ;;;; This file implements the stack analysis phase in the compiler. We
;;;; analyse lifetime of dynamically allocated object packets on stack
;;;; and insert cleanups where necessary.
;;;;
Currently there are two kinds of interesting stack packets : UVLs ,
;;;; whose use and destination lie in different blocks, and LVARs of
;;;; constructors of dynamic-extent objects.
This software is part of the SBCL system . See the README file for
;;;; more information.
;;;;
This software is derived from the CMU CL system , which was
written at Carnegie Mellon University and released into the
;;;; public domain. The software is in the public domain and is
;;;; provided with absolutely no warranty. See the COPYING and CREDITS
;;;; files for more information.
(in-package "SB!C")
;;; Scan through BLOCK looking for uses of :UNKNOWN lvars that have
;;; their DEST outside of the block. We do some checking to verify the
;;; invariant that all pushes come after the last pop.
(defun find-pushed-lvars (block)
(let* ((2block (block-info block))
(popped (ir2-block-popped 2block))
(last-pop (if popped
(lvar-dest (car (last popped)))
nil)))
(collect ((pushed))
(let ((saw-last nil))
(do-nodes (node lvar block)
(when (eq node last-pop)
(setq saw-last t))
(when (and lvar
(or (lvar-dynamic-extent lvar)
(let ((dest (lvar-dest lvar))
(2lvar (lvar-info lvar)))
(and (not (eq (node-block dest) block))
2lvar
(eq (ir2-lvar-kind 2lvar) :unknown)))))
(aver (or saw-last (not last-pop)))
(pushed lvar))))
(setf (ir2-block-pushed 2block) (pushed))))
(values))
;;;; Computation of live UVL sets
(defun nle-block-nlx-info (block)
(let* ((start-node (block-start-node block))
(nlx-ref (ctran-next (node-next start-node)))
(nlx-info (constant-value (ref-leaf nlx-ref))))
nlx-info))
(defun nle-block-entry-block (block)
(let* ((nlx-info (nle-block-nlx-info block))
(mess-up (cleanup-mess-up (nlx-info-cleanup nlx-info)))
(entry-block (node-block mess-up)))
entry-block))
Add LVARs from LATE to EARLY ; use EQ to check whether EARLY has
;;; been changed.
(defun merge-uvl-live-sets (early late)
(declare (type list early late))
;; FIXME: O(N^2)
(dolist (e late early)
(pushnew e early)))
;;; Update information on stacks of unknown-values LVARs on the
;;; boundaries of BLOCK. Return true if the start stack has been
;;; changed.
;;;
An LVAR is live at the end iff it is live at some of blocks , which
BLOCK can transfer control to . There are two kind of control
transfers : normal , expressed with BLOCK - SUCC , and NLX .
(defun update-uvl-live-sets (block)
(declare (type cblock block))
(let* ((2block (block-info block))
(original-start (ir2-block-start-stack 2block))
(end (ir2-block-end-stack 2block))
(new-end end))
(dolist (succ (block-succ block))
(setq new-end (merge-uvl-live-sets new-end
(ir2-block-start-stack (block-info succ)))))
(map-block-nlxes (lambda (nlx-info)
(let* ((nle (nlx-info-target nlx-info))
(nle-start-stack (ir2-block-start-stack
(block-info nle)))
(exit-lvar (nlx-info-lvar nlx-info))
(next-stack (if exit-lvar
(remove exit-lvar nle-start-stack)
nle-start-stack)))
(setq new-end (merge-uvl-live-sets
new-end next-stack))))
block
(lambda (dx-cleanup)
(dolist (lvar (cleanup-info dx-cleanup))
(do-uses (generator lvar)
(let* ((block (node-block generator))
(2block (block-info block)))
DX objects , living in the LVAR , are alive in
;; the environment, protected by the CLEANUP. We
;; also cannot move them (because, in general, we
;; cannot track all references to them).
;; Therefore, everything, allocated deeper than a
DX object -- that is , before the DX object --
;; should be kept alive until the object is
;; deallocated.
;;
Since DX generators end their blocks , we can
;; find out UVLs allocated before them by looking
;; at the stack at the end of the block.
;;
FIXME : This is not quite true : REFs to DX
;; closures don't end their blocks!
(setq new-end (merge-uvl-live-sets
new-end (ir2-block-end-stack 2block)))
(setq new-end (merge-uvl-live-sets
new-end (ir2-block-pushed 2block))))))))
(setf (ir2-block-end-stack 2block) new-end)
(let ((start new-end))
(setq start (set-difference start (ir2-block-pushed 2block)))
(setq start (merge-uvl-live-sets start (ir2-block-popped 2block)))
We can not delete unused UVLs during NLX , so all UVLs live at
ENTRY will be actually live at NLE .
;;
;; BUT, UNWIND-PROTECTor is called in the environment, which has
;; nothing in common with the environment of its entry. So we
;; fictively compute its stack from the containing cleanups, but
;; do not propagate additional LVARs from the entry, thus
;; preveting bogus stack cleanings.
;;
;; TODO: Insert a check that no values are discarded in UWP. Or,
;; maybe, we just don't need to create NLX-ENTRY for UWP?
(when (and (eq (component-head (block-component block))
(first (block-pred block)))
(not (bind-p (block-start-node block))))
(let* ((nlx-info (nle-block-nlx-info block))
(cleanup (nlx-info-cleanup nlx-info)))
(unless (eq (cleanup-kind cleanup) :unwind-protect)
(let* ((entry-block (node-block (cleanup-mess-up cleanup)))
(entry-stack (ir2-block-start-stack (block-info entry-block))))
(setq start (merge-uvl-live-sets start entry-stack))))))
(when *check-consistency*
(aver (subsetp original-start start)))
(cond ((subsetp start original-start)
nil)
(t
(setf (ir2-block-start-stack 2block) start)
t)))))
;;;; Ordering of live UVL stacks
;;; Put UVLs on the start/end stacks of BLOCK in the right order. PRED
;;; is a predecessor of BLOCK with already sorted stacks; because all
;;; UVLs being live at the BLOCK start are live in PRED, we just need
;;; to delete dead UVLs.
(defun order-block-uvl-sets (block pred)
(let* ((2block (block-info block))
(pred-end-stack (ir2-block-end-stack (block-info pred)))
(start (ir2-block-start-stack 2block))
(start-stack (loop for lvar in pred-end-stack
when (memq lvar start)
collect lvar))
(end (ir2-block-end-stack 2block)))
(when *check-consistency*
(aver (subsetp start start-stack)))
(setf (ir2-block-start-stack 2block) start-stack)
(let* ((last (block-last block))
(tailp-lvar (if (node-tail-p last) (node-lvar last)))
(end-stack start-stack))
(dolist (pop (ir2-block-popped 2block))
(aver (eq pop (car end-stack)))
(pop end-stack))
(dolist (push (ir2-block-pushed 2block))
(aver (not (memq push end-stack)))
(push push end-stack))
(aver (subsetp end end-stack))
(when (and tailp-lvar
(eq (ir2-lvar-kind (lvar-info tailp-lvar)) :unknown))
(aver (eq tailp-lvar (first end-stack)))
(pop end-stack))
(setf (ir2-block-end-stack 2block) end-stack))))
(defun order-uvl-sets (component)
(clear-flags component)
: Workaround for lp#308914 : we keep track of number of blocks
;; needing repeats, and bug out if we get stuck.
(loop with head = (component-head component)
with todo = 0
with last-todo = 0
do (psetq last-todo todo
todo 0)
do (do-blocks (block component)
(unless (block-flag block)
(let ((pred (find-if #'block-flag (block-pred block))))
(when (and (eq pred head)
(not (bind-p (block-start-node block))))
(let ((entry (nle-block-entry-block block)))
(setq pred (if (block-flag entry) entry nil))))
(cond (pred
(setf (block-flag block) t)
(order-block-uvl-sets block pred))
(t
(incf todo))))))
do (when (= last-todo todo)
;; If the todo count is the same as on last iteration, it means
;; we are stuck, which in turn means the unmarked blocks are
actually unreachable , so UVL set ordering for them does n't
;; matter.
(return-from order-uvl-sets))
while (plusp todo)))
;;; This is called when we discover that the stack-top unknown-values
;;; lvar at the end of BLOCK1 is different from that at the start of
;;; BLOCK2 (its successor).
;;;
;;; We insert a call to a funny function in a new cleanup block
;;; introduced between BLOCK1 and BLOCK2. Since control analysis and
LTN have already run , we must do make an IR2 block , then do
;;; ADD-TO-EMIT-ORDER and LTN-ANALYZE-BELATED-BLOCK on the new
;;; block. The new block is inserted after BLOCK1 in the emit order.
;;;
;;; If the control transfer between BLOCK1 and BLOCK2 represents a
;;; tail-recursive return or a non-local exit, then the cleanup code
;;; will never actually be executed. It doesn't seem to be worth the
;;; risk of trying to optimize this, since this rarely happens and
;;; wastes only space.
(defun discard-unused-values (block1 block2)
(declare (type cblock block1 block2))
(collect ((cleanup-code))
(labels ((find-popped (before after)
;; Returns (VALUES popped last-popped rest), where
BEFORE = ( APPEND popped rest ) and
( EQ ( FIRST rest ) ( FIRST after ) )
(if (null after)
(values before (first (last before)) nil)
(loop with first-preserved = (car after)
for last-popped = nil then maybe-popped
for rest on before
for maybe-popped = (car rest)
while (neq maybe-popped first-preserved)
collect maybe-popped into popped
finally (return (values popped last-popped rest)))))
(discard (before-stack after-stack)
(cond
((eq (car before-stack) (car after-stack))
(binding* ((moved-count (mismatch before-stack after-stack)
:exit-if-null)
((moved qmoved)
(loop for moved-lvar in before-stack
repeat moved-count
collect moved-lvar into moved
collect `',moved-lvar into qmoved
finally (return (values moved qmoved))))
(q-last-moved (car (last qmoved)))
((nil last-nipped rest)
(find-popped (nthcdr moved-count before-stack)
(nthcdr moved-count after-stack))))
(cleanup-code
`(%nip-values ',last-nipped ,q-last-moved
,@qmoved))
(discard (nconc moved rest) after-stack)))
(t
(multiple-value-bind (popped last-popped rest)
(find-popped before-stack after-stack)
(declare (ignore popped))
(cleanup-code `(%pop-values ',last-popped))
(discard rest after-stack))))))
(discard (ir2-block-end-stack (block-info block1))
(ir2-block-start-stack (block-info block2))))
(when (cleanup-code)
(let* ((block (insert-cleanup-code block1 block2
(block-start-node block2)
`(progn ,@(cleanup-code))))
(2block (make-ir2-block block)))
(setf (block-info block) 2block)
(add-to-emit-order 2block (block-info block1))
(ltn-analyze-belated-block block))))
(values))
;;;; stack analysis
Return a list of all the blocks containing genuine uses of one of
the RECEIVERS ( blocks ) and DX - LVARS . Exits are excluded , since
;;; they don't drop through to the receiver.
(defun find-pushing-blocks (receivers dx-lvars)
(declare (list receivers dx-lvars))
(collect ((res nil adjoin))
(dolist (rec receivers)
(dolist (pop (ir2-block-popped (block-info rec)))
(do-uses (use pop)
(unless (exit-p use)
(res (node-block use))))))
(dolist (dx-lvar dx-lvars)
(do-uses (use dx-lvar)
(res (node-block use))))
(res)))
Analyze the use of unknown - values and DX lvars in COMPONENT ,
;;; inserting cleanup code to discard values that are generated but
;;; never received. This phase doesn't need to be run when
Values - Receivers and Dx - Lvars are null , i.e. there are no
unknown - values lvars used across block boundaries and no DX LVARs .
(defun stack-analyze (component)
(declare (type component component))
(let* ((2comp (component-info component))
(receivers (ir2-component-values-receivers 2comp))
(generators (find-pushing-blocks receivers
(component-dx-lvars component))))
(dolist (block generators)
(find-pushed-lvars block))
Compute sets of live UVLs and DX LVARs
(loop for did-something = nil
do (do-blocks-backwards (block component)
(when (update-uvl-live-sets block)
(setq did-something t)))
while did-something)
(order-uvl-sets component)
(do-blocks (block component)
(let ((top (ir2-block-end-stack (block-info block))))
(dolist (succ (block-succ block))
(when (and (block-start succ)
(not (eq (ir2-block-start-stack (block-info succ))
top)))
(discard-unused-values block succ))))))
(values))
| null | https://raw.githubusercontent.com/dmitryvk/sbcl-win32-threads/5abfd64b00a0937ba2df2919f177697d1d91bde4/src/compiler/stack.lisp | lisp | This file implements the stack analysis phase in the compiler. We
analyse lifetime of dynamically allocated object packets on stack
and insert cleanups where necessary.
whose use and destination lie in different blocks, and LVARs of
constructors of dynamic-extent objects.
more information.
public domain. The software is in the public domain and is
provided with absolutely no warranty. See the COPYING and CREDITS
files for more information.
Scan through BLOCK looking for uses of :UNKNOWN lvars that have
their DEST outside of the block. We do some checking to verify the
invariant that all pushes come after the last pop.
Computation of live UVL sets
use EQ to check whether EARLY has
been changed.
FIXME: O(N^2)
Update information on stacks of unknown-values LVARs on the
boundaries of BLOCK. Return true if the start stack has been
changed.
the environment, protected by the CLEANUP. We
also cannot move them (because, in general, we
cannot track all references to them).
Therefore, everything, allocated deeper than a
should be kept alive until the object is
deallocated.
find out UVLs allocated before them by looking
at the stack at the end of the block.
closures don't end their blocks!
BUT, UNWIND-PROTECTor is called in the environment, which has
nothing in common with the environment of its entry. So we
fictively compute its stack from the containing cleanups, but
do not propagate additional LVARs from the entry, thus
preveting bogus stack cleanings.
TODO: Insert a check that no values are discarded in UWP. Or,
maybe, we just don't need to create NLX-ENTRY for UWP?
Ordering of live UVL stacks
Put UVLs on the start/end stacks of BLOCK in the right order. PRED
is a predecessor of BLOCK with already sorted stacks; because all
UVLs being live at the BLOCK start are live in PRED, we just need
to delete dead UVLs.
needing repeats, and bug out if we get stuck.
If the todo count is the same as on last iteration, it means
we are stuck, which in turn means the unmarked blocks are
matter.
This is called when we discover that the stack-top unknown-values
lvar at the end of BLOCK1 is different from that at the start of
BLOCK2 (its successor).
We insert a call to a funny function in a new cleanup block
introduced between BLOCK1 and BLOCK2. Since control analysis and
ADD-TO-EMIT-ORDER and LTN-ANALYZE-BELATED-BLOCK on the new
block. The new block is inserted after BLOCK1 in the emit order.
If the control transfer between BLOCK1 and BLOCK2 represents a
tail-recursive return or a non-local exit, then the cleanup code
will never actually be executed. It doesn't seem to be worth the
risk of trying to optimize this, since this rarely happens and
wastes only space.
Returns (VALUES popped last-popped rest), where
stack analysis
they don't drop through to the receiver.
inserting cleanup code to discard values that are generated but
never received. This phase doesn't need to be run when | Currently there are two kinds of interesting stack packets : UVLs ,
This software is part of the SBCL system . See the README file for
This software is derived from the CMU CL system , which was
written at Carnegie Mellon University and released into the
(in-package "SB!C")
(defun find-pushed-lvars (block)
(let* ((2block (block-info block))
(popped (ir2-block-popped 2block))
(last-pop (if popped
(lvar-dest (car (last popped)))
nil)))
(collect ((pushed))
(let ((saw-last nil))
(do-nodes (node lvar block)
(when (eq node last-pop)
(setq saw-last t))
(when (and lvar
(or (lvar-dynamic-extent lvar)
(let ((dest (lvar-dest lvar))
(2lvar (lvar-info lvar)))
(and (not (eq (node-block dest) block))
2lvar
(eq (ir2-lvar-kind 2lvar) :unknown)))))
(aver (or saw-last (not last-pop)))
(pushed lvar))))
(setf (ir2-block-pushed 2block) (pushed))))
(values))
(defun nle-block-nlx-info (block)
(let* ((start-node (block-start-node block))
(nlx-ref (ctran-next (node-next start-node)))
(nlx-info (constant-value (ref-leaf nlx-ref))))
nlx-info))
(defun nle-block-entry-block (block)
(let* ((nlx-info (nle-block-nlx-info block))
(mess-up (cleanup-mess-up (nlx-info-cleanup nlx-info)))
(entry-block (node-block mess-up)))
entry-block))
(defun merge-uvl-live-sets (early late)
(declare (type list early late))
(dolist (e late early)
(pushnew e early)))
An LVAR is live at the end iff it is live at some of blocks , which
BLOCK can transfer control to . There are two kind of control
transfers : normal , expressed with BLOCK - SUCC , and NLX .
(defun update-uvl-live-sets (block)
(declare (type cblock block))
(let* ((2block (block-info block))
(original-start (ir2-block-start-stack 2block))
(end (ir2-block-end-stack 2block))
(new-end end))
(dolist (succ (block-succ block))
(setq new-end (merge-uvl-live-sets new-end
(ir2-block-start-stack (block-info succ)))))
(map-block-nlxes (lambda (nlx-info)
(let* ((nle (nlx-info-target nlx-info))
(nle-start-stack (ir2-block-start-stack
(block-info nle)))
(exit-lvar (nlx-info-lvar nlx-info))
(next-stack (if exit-lvar
(remove exit-lvar nle-start-stack)
nle-start-stack)))
(setq new-end (merge-uvl-live-sets
new-end next-stack))))
block
(lambda (dx-cleanup)
(dolist (lvar (cleanup-info dx-cleanup))
(do-uses (generator lvar)
(let* ((block (node-block generator))
(2block (block-info block)))
DX objects , living in the LVAR , are alive in
DX object -- that is , before the DX object --
Since DX generators end their blocks , we can
FIXME : This is not quite true : REFs to DX
(setq new-end (merge-uvl-live-sets
new-end (ir2-block-end-stack 2block)))
(setq new-end (merge-uvl-live-sets
new-end (ir2-block-pushed 2block))))))))
(setf (ir2-block-end-stack 2block) new-end)
(let ((start new-end))
(setq start (set-difference start (ir2-block-pushed 2block)))
(setq start (merge-uvl-live-sets start (ir2-block-popped 2block)))
We can not delete unused UVLs during NLX , so all UVLs live at
ENTRY will be actually live at NLE .
(when (and (eq (component-head (block-component block))
(first (block-pred block)))
(not (bind-p (block-start-node block))))
(let* ((nlx-info (nle-block-nlx-info block))
(cleanup (nlx-info-cleanup nlx-info)))
(unless (eq (cleanup-kind cleanup) :unwind-protect)
(let* ((entry-block (node-block (cleanup-mess-up cleanup)))
(entry-stack (ir2-block-start-stack (block-info entry-block))))
(setq start (merge-uvl-live-sets start entry-stack))))))
(when *check-consistency*
(aver (subsetp original-start start)))
(cond ((subsetp start original-start)
nil)
(t
(setf (ir2-block-start-stack 2block) start)
t)))))
(defun order-block-uvl-sets (block pred)
(let* ((2block (block-info block))
(pred-end-stack (ir2-block-end-stack (block-info pred)))
(start (ir2-block-start-stack 2block))
(start-stack (loop for lvar in pred-end-stack
when (memq lvar start)
collect lvar))
(end (ir2-block-end-stack 2block)))
(when *check-consistency*
(aver (subsetp start start-stack)))
(setf (ir2-block-start-stack 2block) start-stack)
(let* ((last (block-last block))
(tailp-lvar (if (node-tail-p last) (node-lvar last)))
(end-stack start-stack))
(dolist (pop (ir2-block-popped 2block))
(aver (eq pop (car end-stack)))
(pop end-stack))
(dolist (push (ir2-block-pushed 2block))
(aver (not (memq push end-stack)))
(push push end-stack))
(aver (subsetp end end-stack))
(when (and tailp-lvar
(eq (ir2-lvar-kind (lvar-info tailp-lvar)) :unknown))
(aver (eq tailp-lvar (first end-stack)))
(pop end-stack))
(setf (ir2-block-end-stack 2block) end-stack))))
(defun order-uvl-sets (component)
(clear-flags component)
: Workaround for lp#308914 : we keep track of number of blocks
(loop with head = (component-head component)
with todo = 0
with last-todo = 0
do (psetq last-todo todo
todo 0)
do (do-blocks (block component)
(unless (block-flag block)
(let ((pred (find-if #'block-flag (block-pred block))))
(when (and (eq pred head)
(not (bind-p (block-start-node block))))
(let ((entry (nle-block-entry-block block)))
(setq pred (if (block-flag entry) entry nil))))
(cond (pred
(setf (block-flag block) t)
(order-block-uvl-sets block pred))
(t
(incf todo))))))
do (when (= last-todo todo)
actually unreachable , so UVL set ordering for them does n't
(return-from order-uvl-sets))
while (plusp todo)))
LTN have already run , we must do make an IR2 block , then do
(defun discard-unused-values (block1 block2)
(declare (type cblock block1 block2))
(collect ((cleanup-code))
(labels ((find-popped (before after)
BEFORE = ( APPEND popped rest ) and
( EQ ( FIRST rest ) ( FIRST after ) )
(if (null after)
(values before (first (last before)) nil)
(loop with first-preserved = (car after)
for last-popped = nil then maybe-popped
for rest on before
for maybe-popped = (car rest)
while (neq maybe-popped first-preserved)
collect maybe-popped into popped
finally (return (values popped last-popped rest)))))
(discard (before-stack after-stack)
(cond
((eq (car before-stack) (car after-stack))
(binding* ((moved-count (mismatch before-stack after-stack)
:exit-if-null)
((moved qmoved)
(loop for moved-lvar in before-stack
repeat moved-count
collect moved-lvar into moved
collect `',moved-lvar into qmoved
finally (return (values moved qmoved))))
(q-last-moved (car (last qmoved)))
((nil last-nipped rest)
(find-popped (nthcdr moved-count before-stack)
(nthcdr moved-count after-stack))))
(cleanup-code
`(%nip-values ',last-nipped ,q-last-moved
,@qmoved))
(discard (nconc moved rest) after-stack)))
(t
(multiple-value-bind (popped last-popped rest)
(find-popped before-stack after-stack)
(declare (ignore popped))
(cleanup-code `(%pop-values ',last-popped))
(discard rest after-stack))))))
(discard (ir2-block-end-stack (block-info block1))
(ir2-block-start-stack (block-info block2))))
(when (cleanup-code)
(let* ((block (insert-cleanup-code block1 block2
(block-start-node block2)
`(progn ,@(cleanup-code))))
(2block (make-ir2-block block)))
(setf (block-info block) 2block)
(add-to-emit-order 2block (block-info block1))
(ltn-analyze-belated-block block))))
(values))
Return a list of all the blocks containing genuine uses of one of
the RECEIVERS ( blocks ) and DX - LVARS . Exits are excluded , since
(defun find-pushing-blocks (receivers dx-lvars)
(declare (list receivers dx-lvars))
(collect ((res nil adjoin))
(dolist (rec receivers)
(dolist (pop (ir2-block-popped (block-info rec)))
(do-uses (use pop)
(unless (exit-p use)
(res (node-block use))))))
(dolist (dx-lvar dx-lvars)
(do-uses (use dx-lvar)
(res (node-block use))))
(res)))
Analyze the use of unknown - values and DX lvars in COMPONENT ,
Values - Receivers and Dx - Lvars are null , i.e. there are no
unknown - values lvars used across block boundaries and no DX LVARs .
(defun stack-analyze (component)
(declare (type component component))
(let* ((2comp (component-info component))
(receivers (ir2-component-values-receivers 2comp))
(generators (find-pushing-blocks receivers
(component-dx-lvars component))))
(dolist (block generators)
(find-pushed-lvars block))
Compute sets of live UVLs and DX LVARs
(loop for did-something = nil
do (do-blocks-backwards (block component)
(when (update-uvl-live-sets block)
(setq did-something t)))
while did-something)
(order-uvl-sets component)
(do-blocks (block component)
(let ((top (ir2-block-end-stack (block-info block))))
(dolist (succ (block-succ block))
(when (and (block-start succ)
(not (eq (ir2-block-start-stack (block-info succ))
top)))
(discard-unused-values block succ))))))
(values))
|
024c96448c344bd0eccdad0908b47847f5ed996aae7e86149b5b3a23e6d19a69 | otabat/couchbase-clj | client.clj | (ns couchbase-clj.client
(:import [java.net URI]
[java.util Collection]
[java.util.concurrent TimeUnit Future]
[net.spy.memcached CASValue]
[net.spy.memcached.internal GetFuture BulkGetFuture OperationFuture]
[net.spy.memcached.transcoders Transcoder]
[net.spy.memcached PersistTo ReplicateTo]
[com.couchbase.client CouchbaseClient CouchbaseConnectionFactory]
[com.couchbase.client.internal HttpFuture]
[com.couchbase.client.protocol.views Query View ViewRow])
(:refer-clojure :exclude [get set replace flush inc dec replicate
future-cancel future-cancelled? future-done?])
(:require [couchbase-clj.query :as cb-query]
[couchbase-clj.config :as cb-config]
[couchbase-clj.client-builder :as cb-client-builder]
[couchbase-clj.future :as cb-future]
[couchbase-clj.util :as cb-util]))
(def ^:private persist-to-map {:master PersistTo/MASTER
:one PersistTo/ONE
:two PersistTo/TWO
:three PersistTo/THREE
:four PersistTo/FOUR})
(def ^:private replicate-to-map {:zero ReplicateTo/ZERO
:one ReplicateTo/ONE
:two ReplicateTo/TWO
:three ReplicateTo/THREE})
(defn persist-to
"Get the PersistTo object by specifying a corresponding keyword argument.
persist can be :master, :one, :two, :three, :four.
If other value or no argument is specified,
@couchbase-clj.config/default-persist will be specified as the default value.
MASTER or ONE requires Persist to the Master.
TWO requires Persist to at least two nodes including the Master.
THREE requires Persist to at least three nodes including the Master.
FOUR requires Persist to at least four nodes including the Master."
([] (@cb-config/default-persist persist-to-map))
([persist]
(or (and persist (persist persist-to-map))
(@cb-config/default-persist persist-to-map))))
(defn replicate-to
"Get the ReplicateTo object by specifying a corresponding keyword argument.
replicate can be :zero, :one, :two, :three
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as the default value.
ZERO requires no replication.
ONE requires the data to be replicated with at least one replica.
TWO requires the data to be replicated with at least two replicas.
THREE requires the data to be replicated with at least three replicas."
([] (@cb-config/default-replicate replicate-to-map))
([replicate]
(or (and replicate (replicate replicate-to-map))
(@cb-config/default-replicate replicate-to-map))))
(defn cas-id
"Get the cas ID from the CASValue object."
[^CASValue c]
(when c
(.getCas c)))
(defn cas-val
"Get the value from the CASValue object"
[^CASValue c]
(when c
(.getValue c)))
(defn cas-val-json
"Get the JSON string value converted to Clojure data from the CASValue object.
nil is returned, if c is nil."
[^CASValue c]
(cb-util/read-json (.getValue c)))
(defn view-id
"Get the ID of query result from ViewRow object."
[^ViewRow view]
(.getId view))
(defn view-key
"Get the key of query result from ViewRow object."
[^ViewRow view]
(.getKey view))
(defn view-key-json
"Get the JSON string key of query result from ViewRow object,
converted to Clojure data."
[^ViewRow view]
(cb-util/read-json (.getKey view)))
(defn view-val
"Get the value of query result from ViewRow object."
[^ViewRow view]
(.getValue view))
(defn view-val-json
"Get the JSON string value of query result from ViewRow object,
converted to Clojure data."
[^ViewRow view]
(cb-util/read-json (.getValue view)))
(defn view-doc
"Get the document of query result when include-docs is set to true."
[^ViewRow view]
(.getDocument view))
(defn view-doc-json
"Get the JSON string document of query result converted to Clojure data
when include-docs is set to true."
[^ViewRow view]
(cb-util/read-json (.getDocument view)))
(defprotocol ICouchbaseCljClient
(get-client [clj-client] "Get the CouchbaseClient object.")
(get-factory [clj-client] "Get the CouchbaseConnectionFactory object.")
(get-available-servers [clj-client]
"Get the addresses of available servers in a Vector.")
(get-unavailable-servers [clj-client]
"Get the addresses of unavailable servers in a Vector.")
(get-node-locator [clj-client]
"Get a read-only wrapper around the node locator wrapping this instance.")
(get-versions [clj-client]
"Get versions of all of the connected servers in a Map.")
(get-sasl-mechanisms [clj-client] "Get the list of sasl mechanisms in a Set.")
(get-client-status
[clj-client]
"Get all of the stats from all of the connections in a Map."
;; TODO: Seems not working
; [clj-client k]
)
(get-auth-descriptor [clj-client] "Get the auth descriptor.")
(get-failure-mode [clj-client] "Get the failure mode.")
(get-hash-alg [clj-client] "Get the hashing algorithm.")
(get-max-reconnect-delay [clj-client] "Get the max reconnect delay.")
;; TODO: Not working
;(get-min-reconnect-interval [clj-client] "Get the min reconnect interval.")
;; TODO: APIs not provided?
;(get-obs-poll-interval [clj-client])
;(get-obs-poll-max [clj-client])
(get-op-queue-max-block-time [clj-client] "Get the op queue max block time.")
(get-op-timeout [clj-client]
"Get the operation timeout.
This is used as a default timeout value for sync and async client operations.")
(get-read-buffer-size [clj-client] "Get the read buffer size.")
(get-timeout-exception-threshold [clj-client]
"Get the timeout exception threshold.")
(get-transcoder [clj-client] "Get the default transcoder.")
(daemon? [clj-client]
"Return true if IO thread should be a daemon thread,
otherwise return false.")
(should-optimize? [clj-client]
"Return if the performance should be optimized for the network,
otherwise return false.")
(use-nagle-algorithm? [clj-client]
"Return true if the Nagle algorithm is specified, otherwise return false.")
(async-add
[clj-client k v]
[clj-client k v opts]
"Asynchronously add a value with the specified key
that does not already exist.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :observe, :persist,
and :replicate.
When :observe is set to true, persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(add
[clj-client k v]
[clj-client k v opts]
"Synchronously add a value with the specified key
that does not already exist.
If adding has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :timeout, :observe, :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days)
are interpreted as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(async-add-json
[clj-client k v]
[clj-client k v opts]
"Asynchronously add a value that will be converted to JSON string
with the specified key that does not already exist.
Return value will be a CouchbaseCljOperationFuture object.
Arguments are the same as async-add.")
(add-json
[clj-client k v]
[clj-client k v opts]
"Synchronously add a value that will be converted to JSON string
with the specified key that does not already exist.
If adding has succeeded then true is returned, otherwise false.
Arguments are the same as add.")
(async-append
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously append a value to an existing key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(append
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously append a value to an existing key.
If appending has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :transcoder and :timeout.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-prepend
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously prepend a value to an existing key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(prepend
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously prepend a value to an existing key.
If prepending has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :transcoder and :timeout.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-delete
[clj-client k]
[clj-client k opts]
"Asynchronously delete the specified key.
k is the key and can be a keyword, symbol or a string.
Currently no options can be specified.")
(delete
[clj-client k]
[clj-client k opts]
"Synchronously delete the specified key.
If deletion has succeeded then true is returned, otherwise false.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :timeout.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-get
[clj-client k]
[clj-client k opts]
"Asynchronously get the value of the specified key.
Return value is a CouchbaseCljGetFuture object.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get
[clj-client k]
[clj-client k opts]
"Synchronously get the value of the specified key.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-json
[clj-client k]
[clj-client k opts]
"Synchronously get the JSON string value converted
to a Clojure data of the specified key.
You can specify a optional transcoder keyword in a map.
Arguments are the same as get.")
(async-get-touch
[clj-client k]
[clj-client k opts]
"Asynchronously get a value and update the expiration time for a given key
Return value is a CouchbaseCljOperationFuture object.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-touch
[clj-client k]
[clj-client k opts]
"Synchronously get a value and update the expiration time for a given key
Return value is a CASValue object.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(async-get-multi
[clj-client ks]
[clj-client ks opts]
"Asynchronously get multiple keys.
ks is a sequential collection containing keys.
Key can be a keyword, symbol or a string.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-multi
[clj-client ks]
[clj-client ks opts]
"Synchronously get multiple keys.
ks is a sequential collection containing keys.
Key can be a keyword, symbol or a string.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-multi-json
[clj-client k]
[clj-client k opts]
"Synchronously get multiple JSON string value converted
to a Clojure data of the specified key.
You can specify a optional transcoder keyword in a map.
Arguments are the same as get-multi.")
(async-get-lock
[clj-client k]
[clj-client k opts]
"Asynchronously get a lock.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-lock
[clj-client k]
[clj-client k opts]
"Synchronously get a lock.
Return value is a CASValue object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(locked? [clj-client k]
"Retrun true if key is locked.
k is the key and can be a keyword, symbol or a string.")
(async-get-cas
[clj-client k]
[clj-client k opts]
"Asynchronously get single key value with CAS value.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-cas
[clj-client k]
[clj-client k opts]
"Synchronously get single key value with CAS value.
Return value is a CASValue object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-cas-id
[clj-client k]
[clj-client k opts]
"Synchronously get a CAS ID.
Integer CAS ID is returned.
Arguments are the same as get-cas.")
(async-inc
[clj-client k]
[clj-client k opts]
"Asynchronously increment the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :offset.
offset is the integer offset value to increment.
If offset is not specified,
@couchbase-clj.config/default-inc-offset will be specified
as the default value.")
(inc
[clj-client k]
[clj-client k opts]
"Synchronously increment the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :offset, :default and :expiry.
offset is the integer offset value to increment.
If offset is not specified,
@couchbase-clj.config/default-inc-offset will be specified
as the default value.
default is the default value to increment if key does not exist.
If default is not specified,
@couchbase-clj.config/default-inc-default will be specified
as the default value.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.")
(async-dec
[clj-client k]
[clj-client k opts]
"Asynchronously decrement the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :offset.
offset is the integer offset value to decrement.
If offset is not specified,
@couchbase-clj.config/default-dec-offset will be specified
as the default value.")
(dec
[clj-client k]
[clj-client k opts]
"Synchronously decrement the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :offset, :default and :expiry.
offset is the integer offset value to increment.
If offset is not specified,
@couchbase-clj.config/default-inc-offset will be specified
as the default value.
default is the default value to increment if key does not exist.
If default is not specified,
@couchbase-clj.config/default-inc-default will be specified
as the default value.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.")
(async-replace
[clj-client k v]
[clj-client k v opts]
"Asynchronously update an existing key with a new value.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :observe, :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(replace
[clj-client k v]
[clj-client k v opts]
"Synchronously update an existing key with a new value.
If replacing has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :timeout, :observe :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days)
are interpreted as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(async-replace-json
[clj-client k v]
[clj-client k v opts]
"Asynchronously update an existing key with a new value
that will be converted to a JSON string value.
Arguments are the same as async-replace.")
(replace-json
[clj-client k v]
[clj-client k v opts]
"Synchronously update an existing key with a new value
that will be converted to a JSON string value.
Arguments are the same as replace.")
(async-set
[clj-client k v]
[clj-client k v opts]
"Asynchronously store a value using the specified key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :observe, :persist,
and :repliate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(set
[clj-client k v]
[clj-client k v opts]
"Synchronously store a value using the specified key.
If set has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :timeout, :observe, :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(async-set-json
[clj-client k v]
[clj-client k v opts]
"Asynchronously store a value that will be converted to a JSON String
using the specified key.
Return value is a CouchbaseCljOperationFuture object.
Arguments are the same as async-set.")
(set-json
[clj-client k v]
[clj-client k v opts]
"Synchronously store a value that will be converted to a JSON String
using the specified key.
If set has succeeded then true is returned, otherwise false.
Arguments are the same as set.")
(async-set-cas
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously compare the CAS ID and store a value using the specified key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(set-cas
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously compare the CAS ID and store a value using the specified key.
Keyword results that are originally defined in CASResponse
and mapped by cas-response function will be returned.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(async-set-cas-json
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously compare the CAS ID and store a value that is
converted to a JSON string
using the specified key.
Return value is a CouchbaseCljOperationFuture object.
Arguments are the same as async-set-cas.")
(set-cas-json
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously compare the CAS ID and store a value that is
converted to a JSON string
using the specified key.
Keyword results that are originally defined in CASResponse
and mapped by cas-response function will be returned.
Arguments are the same as set-cas.")
(async-touch
[clj-client k]
[clj-client k opts]
"Asynchronously update the expiration time for a given key
Return value is a CouchbaseCljOperationFuture object.
You can specify a optional key value map as the opts argument.
Optional keyword is :expiry.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.")
(touch
[clj-client k]
[clj-client k opts]
"Synchronously update the expiration time for a given key
If update has succeeded then true is returned, otherwise false.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :timeout.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-unlock
[clj-client k cas-id]
[clj-client k cas-id opts]
"Asynchronously unlock.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(unlock
[clj-client k cas-id]
[clj-client k cas-id opts]
"Synchronously unlock.
If unlocking has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(async-get-view [clj-client design-doc view-name]
"Asynchronously get a new view.
Return value is a CouchbaseCljHttpFuture object.
design-doc is a design document name.
view-name is a view name within a design document.")
(get-view [clj-client design-doc view-name]
"Synchronously get a new view.
Return value is a View object.
design-doc is a design document name.
view-name is a view name within a design document.")
TODO : Currently not supported due to API change in the Couchbase Client .
;(async-get-views [clj-client design-doc]
; "Asynchronously get a Vector of views.
;Return value is a CouchbaseCljHttpFuture object.
;design-doc is a design document name.")
;(get-views [clj-client design-doc]
" Synchronously get a sequence of views .
Return value is a Vector of views .
;design-doc is a design document name.")
(async-query
[clj-client view q]
[clj-client design-doc view-name q]
"Asynchronously query a view within a design doc.
Return value is a CouchbaseCljHttpFuture object.
view is a View object.
q is a CouchbaseCljQuery object or query parameters.
design-doc is a design document name.
view-name is a view name within a design document.")
(query
[clj-client view q]
[clj-client design-doc view-name q]
"Synchronously query a view within a design doc.
Return value is a sequence of ViewRows.
view is a View object.
q is a CouchbaseCljQuery object or query parameters.
design-doc is a design document name.
view-name is a view name within a design document.")
(lazy-query
[clj-client view q num]
[clj-client design-doc view-name q num]
"Lazily query a view within a design doc.
Response is a lazy sequence of a ViewResponse.
view is a View object.
q is a CouchbaseCljQuery object or query parameters.
design-doc is a design document name.
view-name is a view name within a design document.
num is an integer to specify the amount of documents to get in each iterations.
lazy-query can be used to query a large data lazily
that it allows you to only get the amount of documents specified per iteration.
ex:
(doseq [res (lazy-query clj-client view q num)]
(println (map view-id res)))
=> (:id1 :id2 :id3 :id4 :id5)")
(wait-queue
[clj-client]
[clj-client timeout]
"Synchronously wait for the queues to die down.
Return true if the queues have died down, otherwise false.
You can specifiy a optional operation timeout value.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
;; TODO: Add observer methods
;(observe [clj-client k cas-id])
;(add-observer [clj-client conn-obs])
;(remove-observer [clj-client conn-obs])
; (flush
; [clj-client]
; [clj-client delay]
; "Flush all cached and persisted data.
; If flushing has succeeded then true is returned, otherwise false.
delay is the period of time to delay , in seconds
To do flushing , you 'll need to enable flush_all by using cbepctl command .
ex : cbepctl localhost:11210 set flush_param flushall_enabled true
; Currently there is a bug in this command and it may not work as expected.")
(shutdown
[clj-client]
[clj-client timeout]
"Shut down the client.
If no argument is specified, client will shutdown immediately.
If you specify a optional integer operation timeout value (in milliseconds),
shutdown will occur gracefully.
timeout is the max waiting time."))
(deftype CouchbaseCljClient [^CouchbaseClient cc ^CouchbaseConnectionFactory cf]
ICouchbaseCljClient
(get-client [clj-client] cc)
(get-factory [clj-client] cf)
(get-available-servers [clj-client]
(let [vc (into [] (.getAvailableServers cc))]
(when-not (empty? vc)
vc)))
(get-unavailable-servers [clj-client]
(let [vc (into [] (.getUnavailableServers cc))]
(when-not (empty? vc)
vc)))
(get-node-locator [clj-client] (.getNodeLocator cc))
(get-versions [clj-client] (into {} (.getVersions cc)))
(get-sasl-mechanisms [clj-client] (into #{} (.listSaslMechanisms cc)))
(get-client-status [clj-client] (into {} (.getStats cc )))
;; TODO: Seems not working?
;(get-client-status [clj-client k]
; (let [^String nk (name k)]
( into { } ( .getStats cc nk ) ) ) )
(get-auth-descriptor [clj-client] (.getAuthDescriptor cf))
(get-failure-mode [clj-client] (.getFailureMode cf))
(get-hash-alg [clj-client] (.getHashAlg cf))
(get-max-reconnect-delay [clj-client] (.getMaxReconnectDelay cf))
;; TODO: Not working
;(get-min-reconnect-interval [clj-client] (.getMinReconnectInterval cf))
;; TODO: APIs not provided?
;(get-obs-poll-interval [clj-client] (.getObsPollInterval cf))
;(get-obs-poll-max [clj-client])
(get-op-queue-max-block-time [clj-client] (.getOpQueueMaxBlockTime cf))
(get-op-timeout [clj-client] (.getOperationTimeout cf))
(get-read-buffer-size [clj-client] (.getReadBufSize cf))
(get-timeout-exception-threshold [clj-client]
(.getTimeoutExceptionThreshold cf))
(get-transcoder [clj-client] (.getTranscoder cc))
(daemon? [clj-client] (.isDaemon cf))
(should-optimize? [clj-client] (.shouldOptimize cf))
(use-nagle-algorithm? [clj-client] (.useNagleAlgorithm cf))
(async-add [clj-client k v] (async-add clj-client k v {}))
(async-add [clj-client k v {:keys [expiry ^Transcoder transcoder
observe persist replicate]}]
(let [^String nk (name k)
^String sv (str v)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^PersistTo p (persist-to persist)
^ReplicateTo r (replicate-to replicate)
^OperationFuture fut (if transcoder
(.add cc nk exp v transcoder)
(if (true? observe)
(.add cc nk exp sv p r)
(.add cc nk exp v)))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(add [clj-client k v] (add clj-client k v {}))
(add [clj-client k v {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-add clj-client k v opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-add-json [clj-client k v] (async-add-json clj-client k v {}))
(async-add-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(async-add clj-client k jv opts)))
(add-json [clj-client k v] (add-json clj-client k v {}))
(add-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(add clj-client k jv opts)))
(async-append [clj-client k v cas-id] (async-append clj-client k v cas-id {}))
(async-append [clj-client k v cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.append cc ^long cas-id nk v transcoder)
(.append cc ^long cas-id nk v))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(append [clj-client k v cas-id] (append clj-client k v cas-id {}))
(append [clj-client k v cas-id {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-append clj-client k v cas-id opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-prepend [clj-client k v cas-id]
(async-prepend clj-client k v cas-id {}))
(async-prepend [clj-client k v cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.prepend cc ^long cas-id nk v transcoder)
(.prepend cc ^long cas-id nk v))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(prepend [clj-client k v cas-id] (prepend clj-client k v cas-id {}))
(prepend [clj-client k v cas-id {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-prepend clj-client k v cas-id opts))]
(.get fut to TimeUnit/MILLISECONDS)))
TODO : Currently delete command through observe is unavailable
;; due to a bug in the couchbase cilent sdk.
(async-delete [clj-client k]
(let [^String nk (name k)
^OperationFuture fut (.delete cc nk)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(delete [clj-client k] (delete clj-client k {}))
TODO : Currently delete command through observe is unavailable
;; due to a bug in the couchbase-cilent.
(delete [clj-client k {:keys [^long timeout]}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-delete clj-client k))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-get [clj-client k]
(async-get clj-client k {}))
(async-get [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^GetFuture fut (if transcoder
(.asyncGet cc nk transcoder)
(.asyncGet cc nk))]
(cb-future/->CouchbaseCljGetFuture cf fut)))
(get [clj-client k] (get clj-client k {}))
(get [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)]
(if transcoder
(.get cc nk transcoder)
(.get cc nk))))
(get-json [clj-client k] (get-json clj-client k {}))
(get-json [clj-client k opts] (cb-util/read-json (get clj-client k opts)))
(async-get-touch [clj-client k]
(async-get-touch clj-client k {}))
(async-get-touch [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^OperationFuture fut (if transcoder
(.asyncGetAndTouch cc nk exp transcoder)
(.asyncGetAndTouch cc nk exp))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(get-touch [clj-client k] (get-touch clj-client k {}))
(get-touch [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)]
(when-let [^CASValue c (if transcoder
(.getAndTouch cc nk exp transcoder)
(.getAndTouch cc nk exp))]
c)))
(async-get-multi [clj-client ks]
(async-get-multi clj-client ks {}))
(async-get-multi [clj-client ks {:keys [^Transcoder transcoder]}]
(let [^Collection seq-ks (map name ks)
^BulkGetFuture fut (if transcoder
(.asyncGetBulk cc seq-ks transcoder)
(.asyncGetBulk cc seq-ks))]
(cb-future/->CouchbaseCljBulkGetFuture cf fut)))
(get-multi [clj-client ks] (get-multi clj-client ks {}))
(get-multi [clj-client ks {:keys [^Transcoder transcoder]}]
(let [^Collection seq-ks (map name ks)
m (into {} (if transcoder
(.getBulk cc seq-ks transcoder)
(.getBulk cc seq-ks)))]
(when-not (empty? m)
m)))
(get-multi-json [clj-client k] (get-multi-json clj-client k {}))
(get-multi-json [clj-client k opts]
(reduce #(merge %1 {(key %2)
(cb-util/read-json (val %2))})
nil
(get-multi clj-client k opts)))
(async-get-lock [clj-client k]
(async-get-lock clj-client k {}))
(async-get-lock [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-lock-expiry) int)
^OperationFuture fut (if transcoder
(.asyncGetAndLock cc nk exp transcoder)
(.asyncGetAndLock cc nk exp))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(get-lock [clj-client k] (get-lock clj-client k {}))
(get-lock [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-lock-expiry) int)
^CASValue c (if transcoder
(.getAndLock cc nk exp transcoder)
(.getAndLock cc nk exp))]
c))
(locked?
[clj-client k]
(let [cas-id (get-cas-id clj-client k)]
(if (= cas-id -1)
true
false)))
(async-get-cas [clj-client k]
(async-get-cas clj-client k {}))
(async-get-cas [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.asyncGets cc nk transcoder)
(.asyncGets cc nk))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(get-cas [clj-client k] (get-cas clj-client k {}))
(get-cas [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)]
(when-let [^CASValue c (if transcoder
(.gets cc nk transcoder)
(.gets cc nk))]
c)))
(get-cas-id [clj-client k] (get-cas-id clj-client k {}))
(get-cas-id [clj-client k opts]
(let [^CASValue c (get-cas clj-client k opts)]
(when c
(.getCas c))))
(async-inc [clj-client k]
(async-inc clj-client k {}))
(async-inc [clj-client k {:keys [^long offset]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-inc-offset)
^OperationFuture fut (.asyncIncr cc nk ofst)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(inc [clj-client k] (inc clj-client k {}))
(inc [clj-client k {:keys [^long offset ^long default expiry]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-inc-offset)
^long dflt (or default ^long @cb-config/default-inc-default)
exp (-> (or expiry @cb-config/default-data-expiry) int)]
(.incr cc nk ofst dflt exp)))
(async-dec [clj-client k]
(async-dec clj-client k {}))
(async-dec [clj-client k {:keys [^long offset]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-dec-offset)
^OperationFuture fut (.asyncDecr cc nk ofst)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(dec [clj-client k] (dec clj-client k {}))
(dec [clj-client k {:keys [^long offset ^long default expiry]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-dec-offset)
^long dflt (or default ^long @cb-config/default-dec-default)
exp (-> (or expiry @cb-config/default-data-expiry) int)]
(.decr cc nk ofst dflt exp)))
(async-replace [clj-client k v]
(async-replace clj-client k v {}))
(async-replace [clj-client k v {:keys [expiry ^Transcoder transcoder
observe persist replicate]}]
(let [^String nk (name k)
^String sv (str v)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^PersistTo p (persist-to persist)
^ReplicateTo r (replicate-to replicate)
^OperationFuture fut (if transcoder
(.replace cc nk exp v transcoder)
(if (true? observe)
(.replace cc nk exp sv p r)
(.replace cc nk exp v)))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(replace [clj-client k v] (replace clj-client k v {}))
(replace [clj-client k v {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-replace clj-client k v opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-replace-json [clj-client k v] (async-replace-json clj-client k v {}))
(async-replace-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(async-replace clj-client k jv opts)))
(replace-json [clj-client k v] (replace-json clj-client k v {}))
(replace-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(replace clj-client k jv opts)))
(async-set [clj-client k v] (async-set clj-client k v {}))
(async-set [clj-client k v {:keys [expiry ^Transcoder transcoder
observe persist replicate]}]
(let [^String nk (name k)
^String sv (str v)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^PersistTo p (persist-to persist)
^ReplicateTo r (replicate-to replicate)
^OperationFuture fut (if transcoder
(.set cc nk exp v transcoder)
(if (true? observe)
(.set cc nk exp sv p r)
(.set cc nk exp v)))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(set [clj-client k v] (set clj-client k v {}))
(set [clj-client k v {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-set clj-client k v opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-set-json [clj-client k v] (async-set-json clj-client k v {}))
(async-set-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(async-set clj-client k jv opts)))
(set-json [clj-client k v] (set-json clj-client k v {}))
(set-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(set clj-client k jv opts)))
(async-set-cas [clj-client k v cas-id]
(async-set-cas clj-client k v cas-id {}))
(async-set-cas [clj-client k v cas-id
{:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^Transcoder tc (or transcoder (get-transcoder clj-client))
^OperationFuture fut (.asyncCAS cc nk ^long cas-id exp v tc)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(set-cas [clj-client k v cas-id] (set-cas clj-client k v cas-id {}))
(set-cas [clj-client k v cas-id {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^Transcoder tc (or transcoder (get-transcoder clj-client))]
(cb-future/cas-response (.cas cc nk ^long cas-id exp v tc))))
(async-set-cas-json [clj-client k v cas-id]
(async-set-cas-json clj-client k v cas-id {}))
(async-set-cas-json [clj-client k v cas-id opts]
(let [jv (cb-util/write-json v)]
(async-set-cas clj-client k jv cas-id opts)))
(set-cas-json [clj-client k v cas-id]
(set-cas-json clj-client k v cas-id {}))
(set-cas-json [clj-client k v cas-id opts]
(let [jv (cb-util/write-json v)]
(set-cas clj-client k jv cas-id opts)))
(async-touch [clj-client k]
(async-touch clj-client k {}))
(async-touch [clj-client k {:keys [expiry]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^OperationFuture fut (.touch cc nk exp)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(touch [clj-client k] (touch clj-client k {}))
(touch [clj-client k {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-touch clj-client k opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-unlock [clj-client k cas-id]
(async-unlock clj-client k cas-id {}))
(async-unlock [clj-client k cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.asyncUnlock cc nk ^long cas-id transcoder)
(.asyncUnlock cc nk ^long cas-id))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(unlock [clj-client k cas-id] (unlock clj-client k cas-id {}))
(unlock [clj-client k cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)]
(if transcoder
(.unlock cc nk ^long cas-id transcoder)
(.unlock cc nk ^long cas-id))))
(async-get-view [clj-client design-doc view-name]
(let [^HttpFuture fut (.asyncGetView cc design-doc view-name)]
(cb-future/->CouchbaseCljHttpFuture cf fut)))
(get-view [clj-client design-doc view-name] (.getView cc design-doc view-name))
TODO : Currently not supported due to API change in the Couchbase Client .
;(async-get-views [clj-client design-doc]
( let [ ^Future fut ( .asyncGetViews cc design - doc ) ]
; (->CouchbaseCljHttpFuture cf fut)))
;(get-views [clj-client design-doc]
( when - let [ rs ( .getViews cc design - doc ) ]
( seq rs ) ) )
(async-query [clj-client view q]
(let [^couchbase_clj.query.CouchbaseCljQuery
new-q (if (instance? couchbase_clj.query.CouchbaseCljQuery q)
q
(cb-query/create-query q))
^HttpFuture fut (.asyncQuery cc view (cb-query/get-query new-q))]
(cb-future/->CouchbaseCljHttpFuture cf fut)))
(async-query [clj-client design-doc view-name q]
(let [^View view (get-view clj-client design-doc view-name)]
(async-query clj-client view q)))
(query [clj-client view q]
(let [^couchbase_clj.query.CouchbaseCljQuery
new-q (if (instance? couchbase_clj.query.CouchbaseCljQuery q)
q
(cb-query/create-query q))]
(seq (.query cc view (cb-query/get-query new-q)))))
(query [clj-client design-doc view-name q]
(let [^View view (get-view clj-client design-doc view-name)]
(query clj-client view q)))
(lazy-query [clj-client view q num]
(let [^couchbase_clj.query.CouchbaseCljQuery
new-q (if (instance? couchbase_clj.query.CouchbaseCljQuery q)
q
(cb-query/create-query q))]
(-> (.paginatedQuery cc view (cb-query/get-query new-q) num)
iterator-seq
lazy-seq)))
(lazy-query [clj-client design-doc view-name q num]
(let [^View view (get-view clj-client design-doc view-name)]
(lazy-query clj-client view q num)))
(wait-queue [clj-client] (wait-queue clj-client (.getOperationTimeout cf)))
(wait-queue [clj-client timeout]
(.waitForQueues cc timeout TimeUnit/MILLISECONDS))
;; TODO: Currently not working
; (observe [clj-client k cas-id]
; (let [^String nk (name k)]
; (.observe cc nk ^long cas-id)))
;; TODO: Add observer methods
( add - observer [ clj - client conn - obs ] ( .addObserver cc conn - obs ) )
( remove - observer [ clj - client conn - obs ] ( conn - obs ) )
;; TODO: Currently not working
;(flush [clj-client] (flush clj-client -1))
;(flush [clj-client delay] (.isSuccess (.getStatus (.flush cc delay))))
(shutdown [clj-client] (shutdown clj-client -1))
(shutdown [clj-client timeout] (.shutdown cc timeout TimeUnit/MILLISECONDS)))
(defn create-client
"Create and return a Couchbase client.
If no parameters are specified, client will be created
from default values specified in couchbase-clj.config.
You can specify keywords parameters: bucket, username, password, uris,
client-builder, factory and other opts.
bucket is the bucket name. Default value is defined
as @default-bucket and is \"default\".
username is the bucket username. Default value is defined
as @default-username and is a empty string.
Currently username is ignored.
password is the bucket password. Default value is defined
as @default-password and is a empty string.
uris is a Collection of string uris, ex: [\":8091/pools\"]
Other options can be specified for CouchbaseConnectionFactoryBuilder
object creation.
Internally, :failure-mode and :hash-alg must have a value and those
default values are :redistribute and :native-hash respectively.
All options for CouchbaseConnectionFactoryBuilder can be looked at
couchbase-clj.client-builder/method-map Var.
You can specify the client-builder keyword with the value of
CouchbaseCljClientBuilder object which is created by
couchbase-clj.client-builder/create-client-builder function.
When doing this, bucket, username, password keywords should be specified.
By using a factory keyword, you can pass a CouchbaseConnectionFactory object
which is created by couchbase-clj.client-builder/create-factory function.
ex:
(create-client)
(create-client {:bucket \"default\"
:username \"\"
:password \"\"
:uris [\":8091/pools\"]})
(create-client {:auth-descriptor auth-descriptor-object
:daemon false
:failure-mode :redistribute
:hash-alg :native-hash
:max-reconnect-delay 30000
:obs-poll-interval 100
:obs-poll-max 400
:op-queue-max-block-time 10000
:op-timeout 10000
:read-buffer-size 16384
:should-optimize false
:timeout-exception-threshold 1000
:transcoder (SerializingTranscoder.)
:use-nagle-algorithm false})
(create-client {:client-builder (create-client-builder
{:hash-alg :native-hash
:failure-mode :redistribute
:max-reconnect-delay 30000})
:uris [(URI. \":8091/pools\")]
:bucket \"default\"
:username \"\"
:password \"\"})
(create-client {:factory couchbase-connection-factory-object})"
([] (create-client {}))
([{:keys [client-builder factory] :as opts}]
(let [cf (cond
(and client-builder
(instance?
couchbase_clj.client_builder.CouchbaseCljClientBuilder
client-builder))
(cb-client-builder/create-factory
(-> (assoc opts
:factory-builder
(cb-client-builder/get-factory-builder client-builder))
(dissoc :client-builder)))
(and factory (instance? CouchbaseConnectionFactory factory))
factory
:else (cb-client-builder/build opts))]
(->CouchbaseCljClient (CouchbaseClient. cf) cf))))
(defmacro defclient
"A macro that defines a Var with Couchbase client specified by a name
with or without options.
See create-client function for detail."
([name]
`(def ~name (create-client)))
([name opts]
`(def ~name (create-client ~opts))))
| null | https://raw.githubusercontent.com/otabat/couchbase-clj/5f975dc85d0eec554034eefa9d97f1ee7ad9e84a/src/couchbase_clj/client.clj | clojure | TODO: Seems not working
[clj-client k]
TODO: Not working
(get-min-reconnect-interval [clj-client] "Get the min reconnect interval.")
TODO: APIs not provided?
(get-obs-poll-interval [clj-client])
(get-obs-poll-max [clj-client])
(async-get-views [clj-client design-doc]
"Asynchronously get a Vector of views.
Return value is a CouchbaseCljHttpFuture object.
design-doc is a design document name.")
(get-views [clj-client design-doc]
design-doc is a design document name.")
TODO: Add observer methods
(observe [clj-client k cas-id])
(add-observer [clj-client conn-obs])
(remove-observer [clj-client conn-obs])
(flush
[clj-client]
[clj-client delay]
"Flush all cached and persisted data.
If flushing has succeeded then true is returned, otherwise false.
Currently there is a bug in this command and it may not work as expected.")
TODO: Seems not working?
(get-client-status [clj-client k]
(let [^String nk (name k)]
TODO: Not working
(get-min-reconnect-interval [clj-client] (.getMinReconnectInterval cf))
TODO: APIs not provided?
(get-obs-poll-interval [clj-client] (.getObsPollInterval cf))
(get-obs-poll-max [clj-client])
due to a bug in the couchbase cilent sdk.
due to a bug in the couchbase-cilent.
(async-get-views [clj-client design-doc]
(->CouchbaseCljHttpFuture cf fut)))
(get-views [clj-client design-doc]
TODO: Currently not working
(observe [clj-client k cas-id]
(let [^String nk (name k)]
(.observe cc nk ^long cas-id)))
TODO: Add observer methods
TODO: Currently not working
(flush [clj-client] (flush clj-client -1))
(flush [clj-client delay] (.isSuccess (.getStatus (.flush cc delay)))) | (ns couchbase-clj.client
(:import [java.net URI]
[java.util Collection]
[java.util.concurrent TimeUnit Future]
[net.spy.memcached CASValue]
[net.spy.memcached.internal GetFuture BulkGetFuture OperationFuture]
[net.spy.memcached.transcoders Transcoder]
[net.spy.memcached PersistTo ReplicateTo]
[com.couchbase.client CouchbaseClient CouchbaseConnectionFactory]
[com.couchbase.client.internal HttpFuture]
[com.couchbase.client.protocol.views Query View ViewRow])
(:refer-clojure :exclude [get set replace flush inc dec replicate
future-cancel future-cancelled? future-done?])
(:require [couchbase-clj.query :as cb-query]
[couchbase-clj.config :as cb-config]
[couchbase-clj.client-builder :as cb-client-builder]
[couchbase-clj.future :as cb-future]
[couchbase-clj.util :as cb-util]))
(def ^:private persist-to-map {:master PersistTo/MASTER
:one PersistTo/ONE
:two PersistTo/TWO
:three PersistTo/THREE
:four PersistTo/FOUR})
(def ^:private replicate-to-map {:zero ReplicateTo/ZERO
:one ReplicateTo/ONE
:two ReplicateTo/TWO
:three ReplicateTo/THREE})
(defn persist-to
"Get the PersistTo object by specifying a corresponding keyword argument.
persist can be :master, :one, :two, :three, :four.
If other value or no argument is specified,
@couchbase-clj.config/default-persist will be specified as the default value.
MASTER or ONE requires Persist to the Master.
TWO requires Persist to at least two nodes including the Master.
THREE requires Persist to at least three nodes including the Master.
FOUR requires Persist to at least four nodes including the Master."
([] (@cb-config/default-persist persist-to-map))
([persist]
(or (and persist (persist persist-to-map))
(@cb-config/default-persist persist-to-map))))
(defn replicate-to
"Get the ReplicateTo object by specifying a corresponding keyword argument.
replicate can be :zero, :one, :two, :three
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as the default value.
ZERO requires no replication.
ONE requires the data to be replicated with at least one replica.
TWO requires the data to be replicated with at least two replicas.
THREE requires the data to be replicated with at least three replicas."
([] (@cb-config/default-replicate replicate-to-map))
([replicate]
(or (and replicate (replicate replicate-to-map))
(@cb-config/default-replicate replicate-to-map))))
(defn cas-id
"Get the cas ID from the CASValue object."
[^CASValue c]
(when c
(.getCas c)))
(defn cas-val
"Get the value from the CASValue object"
[^CASValue c]
(when c
(.getValue c)))
(defn cas-val-json
"Get the JSON string value converted to Clojure data from the CASValue object.
nil is returned, if c is nil."
[^CASValue c]
(cb-util/read-json (.getValue c)))
(defn view-id
"Get the ID of query result from ViewRow object."
[^ViewRow view]
(.getId view))
(defn view-key
"Get the key of query result from ViewRow object."
[^ViewRow view]
(.getKey view))
(defn view-key-json
"Get the JSON string key of query result from ViewRow object,
converted to Clojure data."
[^ViewRow view]
(cb-util/read-json (.getKey view)))
(defn view-val
"Get the value of query result from ViewRow object."
[^ViewRow view]
(.getValue view))
(defn view-val-json
"Get the JSON string value of query result from ViewRow object,
converted to Clojure data."
[^ViewRow view]
(cb-util/read-json (.getValue view)))
(defn view-doc
"Get the document of query result when include-docs is set to true."
[^ViewRow view]
(.getDocument view))
(defn view-doc-json
"Get the JSON string document of query result converted to Clojure data
when include-docs is set to true."
[^ViewRow view]
(cb-util/read-json (.getDocument view)))
(defprotocol ICouchbaseCljClient
(get-client [clj-client] "Get the CouchbaseClient object.")
(get-factory [clj-client] "Get the CouchbaseConnectionFactory object.")
(get-available-servers [clj-client]
"Get the addresses of available servers in a Vector.")
(get-unavailable-servers [clj-client]
"Get the addresses of unavailable servers in a Vector.")
(get-node-locator [clj-client]
"Get a read-only wrapper around the node locator wrapping this instance.")
(get-versions [clj-client]
"Get versions of all of the connected servers in a Map.")
(get-sasl-mechanisms [clj-client] "Get the list of sasl mechanisms in a Set.")
(get-client-status
[clj-client]
"Get all of the stats from all of the connections in a Map."
)
(get-auth-descriptor [clj-client] "Get the auth descriptor.")
(get-failure-mode [clj-client] "Get the failure mode.")
(get-hash-alg [clj-client] "Get the hashing algorithm.")
(get-max-reconnect-delay [clj-client] "Get the max reconnect delay.")
(get-op-queue-max-block-time [clj-client] "Get the op queue max block time.")
(get-op-timeout [clj-client]
"Get the operation timeout.
This is used as a default timeout value for sync and async client operations.")
(get-read-buffer-size [clj-client] "Get the read buffer size.")
(get-timeout-exception-threshold [clj-client]
"Get the timeout exception threshold.")
(get-transcoder [clj-client] "Get the default transcoder.")
(daemon? [clj-client]
"Return true if IO thread should be a daemon thread,
otherwise return false.")
(should-optimize? [clj-client]
"Return if the performance should be optimized for the network,
otherwise return false.")
(use-nagle-algorithm? [clj-client]
"Return true if the Nagle algorithm is specified, otherwise return false.")
(async-add
[clj-client k v]
[clj-client k v opts]
"Asynchronously add a value with the specified key
that does not already exist.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :observe, :persist,
and :replicate.
When :observe is set to true, persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(add
[clj-client k v]
[clj-client k v opts]
"Synchronously add a value with the specified key
that does not already exist.
If adding has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :timeout, :observe, :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days)
are interpreted as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(async-add-json
[clj-client k v]
[clj-client k v opts]
"Asynchronously add a value that will be converted to JSON string
with the specified key that does not already exist.
Return value will be a CouchbaseCljOperationFuture object.
Arguments are the same as async-add.")
(add-json
[clj-client k v]
[clj-client k v opts]
"Synchronously add a value that will be converted to JSON string
with the specified key that does not already exist.
If adding has succeeded then true is returned, otherwise false.
Arguments are the same as add.")
(async-append
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously append a value to an existing key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(append
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously append a value to an existing key.
If appending has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :transcoder and :timeout.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-prepend
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously prepend a value to an existing key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(prepend
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously prepend a value to an existing key.
If prepending has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :transcoder and :timeout.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-delete
[clj-client k]
[clj-client k opts]
"Asynchronously delete the specified key.
k is the key and can be a keyword, symbol or a string.
Currently no options can be specified.")
(delete
[clj-client k]
[clj-client k opts]
"Synchronously delete the specified key.
If deletion has succeeded then true is returned, otherwise false.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :timeout.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-get
[clj-client k]
[clj-client k opts]
"Asynchronously get the value of the specified key.
Return value is a CouchbaseCljGetFuture object.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get
[clj-client k]
[clj-client k opts]
"Synchronously get the value of the specified key.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-json
[clj-client k]
[clj-client k opts]
"Synchronously get the JSON string value converted
to a Clojure data of the specified key.
You can specify a optional transcoder keyword in a map.
Arguments are the same as get.")
(async-get-touch
[clj-client k]
[clj-client k opts]
"Asynchronously get a value and update the expiration time for a given key
Return value is a CouchbaseCljOperationFuture object.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-touch
[clj-client k]
[clj-client k opts]
"Synchronously get a value and update the expiration time for a given key
Return value is a CASValue object.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(async-get-multi
[clj-client ks]
[clj-client ks opts]
"Asynchronously get multiple keys.
ks is a sequential collection containing keys.
Key can be a keyword, symbol or a string.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-multi
[clj-client ks]
[clj-client ks opts]
"Synchronously get multiple keys.
ks is a sequential collection containing keys.
Key can be a keyword, symbol or a string.
You can specify a optional transcoder keyword in a map.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-multi-json
[clj-client k]
[clj-client k opts]
"Synchronously get multiple JSON string value converted
to a Clojure data of the specified key.
You can specify a optional transcoder keyword in a map.
Arguments are the same as get-multi.")
(async-get-lock
[clj-client k]
[clj-client k opts]
"Asynchronously get a lock.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-lock
[clj-client k]
[clj-client k opts]
"Synchronously get a lock.
Return value is a CASValue object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(locked? [clj-client k]
"Retrun true if key is locked.
k is the key and can be a keyword, symbol or a string.")
(async-get-cas
[clj-client k]
[clj-client k opts]
"Asynchronously get single key value with CAS value.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-cas
[clj-client k]
[clj-client k opts]
"Synchronously get single key value with CAS value.
Return value is a CASValue object.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(get-cas-id
[clj-client k]
[clj-client k opts]
"Synchronously get a CAS ID.
Integer CAS ID is returned.
Arguments are the same as get-cas.")
(async-inc
[clj-client k]
[clj-client k opts]
"Asynchronously increment the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :offset.
offset is the integer offset value to increment.
If offset is not specified,
@couchbase-clj.config/default-inc-offset will be specified
as the default value.")
(inc
[clj-client k]
[clj-client k opts]
"Synchronously increment the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :offset, :default and :expiry.
offset is the integer offset value to increment.
If offset is not specified,
@couchbase-clj.config/default-inc-offset will be specified
as the default value.
default is the default value to increment if key does not exist.
If default is not specified,
@couchbase-clj.config/default-inc-default will be specified
as the default value.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.")
(async-dec
[clj-client k]
[clj-client k opts]
"Asynchronously decrement the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keyword is :offset.
offset is the integer offset value to decrement.
If offset is not specified,
@couchbase-clj.config/default-dec-offset will be specified
as the default value.")
(dec
[clj-client k]
[clj-client k opts]
"Synchronously decrement the value of an existing key.
k is the key and can be a keyword, symbol or a string.
You can specify a optional key value map as the opts argument.
Optional keywords are :offset, :default and :expiry.
offset is the integer offset value to increment.
If offset is not specified,
@couchbase-clj.config/default-inc-offset will be specified
as the default value.
default is the default value to increment if key does not exist.
If default is not specified,
@couchbase-clj.config/default-inc-default will be specified
as the default value.
expiry is the integer expiry time for key in seconds.
If expiry is not specified,
@couchbase-clj.config/default-lock-expiry will be specified
as the default value.")
(async-replace
[clj-client k v]
[clj-client k v opts]
"Asynchronously update an existing key with a new value.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :observe, :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(replace
[clj-client k v]
[clj-client k v opts]
"Synchronously update an existing key with a new value.
If replacing has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :timeout, :observe :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days)
are interpreted as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(async-replace-json
[clj-client k v]
[clj-client k v opts]
"Asynchronously update an existing key with a new value
that will be converted to a JSON string value.
Arguments are the same as async-replace.")
(replace-json
[clj-client k v]
[clj-client k v opts]
"Synchronously update an existing key with a new value
that will be converted to a JSON string value.
Arguments are the same as replace.")
(async-set
[clj-client k v]
[clj-client k v opts]
"Asynchronously store a value using the specified key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :observe, :persist,
and :repliate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(set
[clj-client k v]
[clj-client k v opts]
"Synchronously store a value using the specified key.
If set has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry, :transcoder, :timeout, :observe, :persist,
and :replicate.
When :observe is set to true, :persist, and :replicate can be set.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.
observe is the Boolean flag to enable persist and replicate options.
persist is the keyword to specify Persist requirements
to Master and more servers.
Values can be :master, :one, :two, :three, :four.
If persist is not specified,
@couchbase-clj.config/default-persist will be specified as the default value.
replicate is the keyword to specify Replication requirements
to zero or more replicas.
Values can be :zero, :one, :two, :three.
If other value or no argument is specified,
@couchbase-clj.config/default-replicate will be specified as a default value.")
(async-set-json
[clj-client k v]
[clj-client k v opts]
"Asynchronously store a value that will be converted to a JSON String
using the specified key.
Return value is a CouchbaseCljOperationFuture object.
Arguments are the same as async-set.")
(set-json
[clj-client k v]
[clj-client k v opts]
"Synchronously store a value that will be converted to a JSON String
using the specified key.
If set has succeeded then true is returned, otherwise false.
Arguments are the same as set.")
(async-set-cas
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously compare the CAS ID and store a value using the specified key.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(set-cas
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously compare the CAS ID and store a value using the specified key.
Keyword results that are originally defined in CASResponse
and mapped by cas-response function will be returned.
k is the key and can be a keyword, symbol or a string.
v is the value to be stored.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :transcoder.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(async-set-cas-json
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Asynchronously compare the CAS ID and store a value that is
converted to a JSON string
using the specified key.
Return value is a CouchbaseCljOperationFuture object.
Arguments are the same as async-set-cas.")
(set-cas-json
[clj-client k v cas-id]
[clj-client k v cas-id opts]
"Synchronously compare the CAS ID and store a value that is
converted to a JSON string
using the specified key.
Keyword results that are originally defined in CASResponse
and mapped by cas-response function will be returned.
Arguments are the same as set-cas.")
(async-touch
[clj-client k]
[clj-client k opts]
"Asynchronously update the expiration time for a given key
Return value is a CouchbaseCljOperationFuture object.
You can specify a optional key value map as the opts argument.
Optional keyword is :expiry.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.")
(touch
[clj-client k]
[clj-client k opts]
"Synchronously update the expiration time for a given key
If update has succeeded then true is returned, otherwise false.
You can specify a optional key value map as the opts argument.
Optional keywords are :expiry and :timeout.
expiry is the integer expiry time for key in seconds.
Values larger than 30*24*60*60 seconds (30 days) are interpreted
as absolute times from the epoch.
By specifying -1, expiry can be disabled.
If expiry is not specified,
@couchbase-clj.config/default-data-expiry will be specified
as the default value.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
(async-unlock
[clj-client k cas-id]
[clj-client k cas-id opts]
"Asynchronously unlock.
Return value is a CouchbaseCljOperationFuture object.
k is the key and can be a keyword, symbol or a string.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(unlock
[clj-client k cas-id]
[clj-client k cas-id opts]
"Synchronously unlock.
If unlocking has succeeded then true is returned, otherwise false.
k is the key and can be a keyword, symbol or a string.
cas-id is the integer unique value to identify key/value combination.
You can specify a optional key value map as the opts argument.
Optional keyword is :transcoder.
transcoder is the Transcoder object to be used to serialize the value.
If transcoder is not specified,
SerializingTranscoder will be specified as the default transcoder.")
(async-get-view [clj-client design-doc view-name]
"Asynchronously get a new view.
Return value is a CouchbaseCljHttpFuture object.
design-doc is a design document name.
view-name is a view name within a design document.")
(get-view [clj-client design-doc view-name]
"Synchronously get a new view.
Return value is a View object.
design-doc is a design document name.
view-name is a view name within a design document.")
TODO : Currently not supported due to API change in the Couchbase Client .
" Synchronously get a sequence of views .
Return value is a Vector of views .
(async-query
[clj-client view q]
[clj-client design-doc view-name q]
"Asynchronously query a view within a design doc.
Return value is a CouchbaseCljHttpFuture object.
view is a View object.
q is a CouchbaseCljQuery object or query parameters.
design-doc is a design document name.
view-name is a view name within a design document.")
(query
[clj-client view q]
[clj-client design-doc view-name q]
"Synchronously query a view within a design doc.
Return value is a sequence of ViewRows.
view is a View object.
q is a CouchbaseCljQuery object or query parameters.
design-doc is a design document name.
view-name is a view name within a design document.")
(lazy-query
[clj-client view q num]
[clj-client design-doc view-name q num]
"Lazily query a view within a design doc.
Response is a lazy sequence of a ViewResponse.
view is a View object.
q is a CouchbaseCljQuery object or query parameters.
design-doc is a design document name.
view-name is a view name within a design document.
num is an integer to specify the amount of documents to get in each iterations.
lazy-query can be used to query a large data lazily
that it allows you to only get the amount of documents specified per iteration.
ex:
(doseq [res (lazy-query clj-client view q num)]
(println (map view-id res)))
=> (:id1 :id2 :id3 :id4 :id5)")
(wait-queue
[clj-client]
[clj-client timeout]
"Synchronously wait for the queues to die down.
Return true if the queues have died down, otherwise false.
You can specifiy a optional operation timeout value.
timeout is the integer operation timeout value in milliseconds.
If timeout is not specified, the default value will be the value
set by create-client function.
It can be retrived by get-op-timeout function.")
delay is the period of time to delay , in seconds
To do flushing , you 'll need to enable flush_all by using cbepctl command .
ex : cbepctl localhost:11210 set flush_param flushall_enabled true
(shutdown
[clj-client]
[clj-client timeout]
"Shut down the client.
If no argument is specified, client will shutdown immediately.
If you specify a optional integer operation timeout value (in milliseconds),
shutdown will occur gracefully.
timeout is the max waiting time."))
(deftype CouchbaseCljClient [^CouchbaseClient cc ^CouchbaseConnectionFactory cf]
ICouchbaseCljClient
(get-client [clj-client] cc)
(get-factory [clj-client] cf)
(get-available-servers [clj-client]
(let [vc (into [] (.getAvailableServers cc))]
(when-not (empty? vc)
vc)))
(get-unavailable-servers [clj-client]
(let [vc (into [] (.getUnavailableServers cc))]
(when-not (empty? vc)
vc)))
(get-node-locator [clj-client] (.getNodeLocator cc))
(get-versions [clj-client] (into {} (.getVersions cc)))
(get-sasl-mechanisms [clj-client] (into #{} (.listSaslMechanisms cc)))
(get-client-status [clj-client] (into {} (.getStats cc )))
( into { } ( .getStats cc nk ) ) ) )
(get-auth-descriptor [clj-client] (.getAuthDescriptor cf))
(get-failure-mode [clj-client] (.getFailureMode cf))
(get-hash-alg [clj-client] (.getHashAlg cf))
(get-max-reconnect-delay [clj-client] (.getMaxReconnectDelay cf))
(get-op-queue-max-block-time [clj-client] (.getOpQueueMaxBlockTime cf))
(get-op-timeout [clj-client] (.getOperationTimeout cf))
(get-read-buffer-size [clj-client] (.getReadBufSize cf))
(get-timeout-exception-threshold [clj-client]
(.getTimeoutExceptionThreshold cf))
(get-transcoder [clj-client] (.getTranscoder cc))
(daemon? [clj-client] (.isDaemon cf))
(should-optimize? [clj-client] (.shouldOptimize cf))
(use-nagle-algorithm? [clj-client] (.useNagleAlgorithm cf))
(async-add [clj-client k v] (async-add clj-client k v {}))
(async-add [clj-client k v {:keys [expiry ^Transcoder transcoder
observe persist replicate]}]
(let [^String nk (name k)
^String sv (str v)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^PersistTo p (persist-to persist)
^ReplicateTo r (replicate-to replicate)
^OperationFuture fut (if transcoder
(.add cc nk exp v transcoder)
(if (true? observe)
(.add cc nk exp sv p r)
(.add cc nk exp v)))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(add [clj-client k v] (add clj-client k v {}))
(add [clj-client k v {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-add clj-client k v opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-add-json [clj-client k v] (async-add-json clj-client k v {}))
(async-add-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(async-add clj-client k jv opts)))
(add-json [clj-client k v] (add-json clj-client k v {}))
(add-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(add clj-client k jv opts)))
(async-append [clj-client k v cas-id] (async-append clj-client k v cas-id {}))
(async-append [clj-client k v cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.append cc ^long cas-id nk v transcoder)
(.append cc ^long cas-id nk v))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(append [clj-client k v cas-id] (append clj-client k v cas-id {}))
(append [clj-client k v cas-id {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-append clj-client k v cas-id opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-prepend [clj-client k v cas-id]
(async-prepend clj-client k v cas-id {}))
(async-prepend [clj-client k v cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.prepend cc ^long cas-id nk v transcoder)
(.prepend cc ^long cas-id nk v))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(prepend [clj-client k v cas-id] (prepend clj-client k v cas-id {}))
(prepend [clj-client k v cas-id {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-prepend clj-client k v cas-id opts))]
(.get fut to TimeUnit/MILLISECONDS)))
TODO : Currently delete command through observe is unavailable
(async-delete [clj-client k]
(let [^String nk (name k)
^OperationFuture fut (.delete cc nk)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(delete [clj-client k] (delete clj-client k {}))
TODO : Currently delete command through observe is unavailable
(delete [clj-client k {:keys [^long timeout]}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-delete clj-client k))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-get [clj-client k]
(async-get clj-client k {}))
(async-get [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^GetFuture fut (if transcoder
(.asyncGet cc nk transcoder)
(.asyncGet cc nk))]
(cb-future/->CouchbaseCljGetFuture cf fut)))
(get [clj-client k] (get clj-client k {}))
(get [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)]
(if transcoder
(.get cc nk transcoder)
(.get cc nk))))
(get-json [clj-client k] (get-json clj-client k {}))
(get-json [clj-client k opts] (cb-util/read-json (get clj-client k opts)))
(async-get-touch [clj-client k]
(async-get-touch clj-client k {}))
(async-get-touch [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^OperationFuture fut (if transcoder
(.asyncGetAndTouch cc nk exp transcoder)
(.asyncGetAndTouch cc nk exp))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(get-touch [clj-client k] (get-touch clj-client k {}))
(get-touch [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)]
(when-let [^CASValue c (if transcoder
(.getAndTouch cc nk exp transcoder)
(.getAndTouch cc nk exp))]
c)))
(async-get-multi [clj-client ks]
(async-get-multi clj-client ks {}))
(async-get-multi [clj-client ks {:keys [^Transcoder transcoder]}]
(let [^Collection seq-ks (map name ks)
^BulkGetFuture fut (if transcoder
(.asyncGetBulk cc seq-ks transcoder)
(.asyncGetBulk cc seq-ks))]
(cb-future/->CouchbaseCljBulkGetFuture cf fut)))
(get-multi [clj-client ks] (get-multi clj-client ks {}))
(get-multi [clj-client ks {:keys [^Transcoder transcoder]}]
(let [^Collection seq-ks (map name ks)
m (into {} (if transcoder
(.getBulk cc seq-ks transcoder)
(.getBulk cc seq-ks)))]
(when-not (empty? m)
m)))
(get-multi-json [clj-client k] (get-multi-json clj-client k {}))
(get-multi-json [clj-client k opts]
(reduce #(merge %1 {(key %2)
(cb-util/read-json (val %2))})
nil
(get-multi clj-client k opts)))
(async-get-lock [clj-client k]
(async-get-lock clj-client k {}))
(async-get-lock [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-lock-expiry) int)
^OperationFuture fut (if transcoder
(.asyncGetAndLock cc nk exp transcoder)
(.asyncGetAndLock cc nk exp))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(get-lock [clj-client k] (get-lock clj-client k {}))
(get-lock [clj-client k {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-lock-expiry) int)
^CASValue c (if transcoder
(.getAndLock cc nk exp transcoder)
(.getAndLock cc nk exp))]
c))
(locked?
[clj-client k]
(let [cas-id (get-cas-id clj-client k)]
(if (= cas-id -1)
true
false)))
(async-get-cas [clj-client k]
(async-get-cas clj-client k {}))
(async-get-cas [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.asyncGets cc nk transcoder)
(.asyncGets cc nk))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(get-cas [clj-client k] (get-cas clj-client k {}))
(get-cas [clj-client k {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)]
(when-let [^CASValue c (if transcoder
(.gets cc nk transcoder)
(.gets cc nk))]
c)))
(get-cas-id [clj-client k] (get-cas-id clj-client k {}))
(get-cas-id [clj-client k opts]
(let [^CASValue c (get-cas clj-client k opts)]
(when c
(.getCas c))))
(async-inc [clj-client k]
(async-inc clj-client k {}))
(async-inc [clj-client k {:keys [^long offset]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-inc-offset)
^OperationFuture fut (.asyncIncr cc nk ofst)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(inc [clj-client k] (inc clj-client k {}))
(inc [clj-client k {:keys [^long offset ^long default expiry]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-inc-offset)
^long dflt (or default ^long @cb-config/default-inc-default)
exp (-> (or expiry @cb-config/default-data-expiry) int)]
(.incr cc nk ofst dflt exp)))
(async-dec [clj-client k]
(async-dec clj-client k {}))
(async-dec [clj-client k {:keys [^long offset]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-dec-offset)
^OperationFuture fut (.asyncDecr cc nk ofst)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(dec [clj-client k] (dec clj-client k {}))
(dec [clj-client k {:keys [^long offset ^long default expiry]}]
(let [^String nk (name k)
^long ofst (or offset ^long @cb-config/default-dec-offset)
^long dflt (or default ^long @cb-config/default-dec-default)
exp (-> (or expiry @cb-config/default-data-expiry) int)]
(.decr cc nk ofst dflt exp)))
(async-replace [clj-client k v]
(async-replace clj-client k v {}))
(async-replace [clj-client k v {:keys [expiry ^Transcoder transcoder
observe persist replicate]}]
(let [^String nk (name k)
^String sv (str v)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^PersistTo p (persist-to persist)
^ReplicateTo r (replicate-to replicate)
^OperationFuture fut (if transcoder
(.replace cc nk exp v transcoder)
(if (true? observe)
(.replace cc nk exp sv p r)
(.replace cc nk exp v)))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(replace [clj-client k v] (replace clj-client k v {}))
(replace [clj-client k v {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-replace clj-client k v opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-replace-json [clj-client k v] (async-replace-json clj-client k v {}))
(async-replace-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(async-replace clj-client k jv opts)))
(replace-json [clj-client k v] (replace-json clj-client k v {}))
(replace-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(replace clj-client k jv opts)))
(async-set [clj-client k v] (async-set clj-client k v {}))
(async-set [clj-client k v {:keys [expiry ^Transcoder transcoder
observe persist replicate]}]
(let [^String nk (name k)
^String sv (str v)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^PersistTo p (persist-to persist)
^ReplicateTo r (replicate-to replicate)
^OperationFuture fut (if transcoder
(.set cc nk exp v transcoder)
(if (true? observe)
(.set cc nk exp sv p r)
(.set cc nk exp v)))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(set [clj-client k v] (set clj-client k v {}))
(set [clj-client k v {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-set clj-client k v opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-set-json [clj-client k v] (async-set-json clj-client k v {}))
(async-set-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(async-set clj-client k jv opts)))
(set-json [clj-client k v] (set-json clj-client k v {}))
(set-json [clj-client k v opts]
(let [jv (cb-util/write-json v)]
(set clj-client k jv opts)))
(async-set-cas [clj-client k v cas-id]
(async-set-cas clj-client k v cas-id {}))
(async-set-cas [clj-client k v cas-id
{:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^Transcoder tc (or transcoder (get-transcoder clj-client))
^OperationFuture fut (.asyncCAS cc nk ^long cas-id exp v tc)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(set-cas [clj-client k v cas-id] (set-cas clj-client k v cas-id {}))
(set-cas [clj-client k v cas-id {:keys [expiry ^Transcoder transcoder]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^Transcoder tc (or transcoder (get-transcoder clj-client))]
(cb-future/cas-response (.cas cc nk ^long cas-id exp v tc))))
(async-set-cas-json [clj-client k v cas-id]
(async-set-cas-json clj-client k v cas-id {}))
(async-set-cas-json [clj-client k v cas-id opts]
(let [jv (cb-util/write-json v)]
(async-set-cas clj-client k jv cas-id opts)))
(set-cas-json [clj-client k v cas-id]
(set-cas-json clj-client k v cas-id {}))
(set-cas-json [clj-client k v cas-id opts]
(let [jv (cb-util/write-json v)]
(set-cas clj-client k jv cas-id opts)))
(async-touch [clj-client k]
(async-touch clj-client k {}))
(async-touch [clj-client k {:keys [expiry]}]
(let [^String nk (name k)
exp (-> (or expiry @cb-config/default-data-expiry) int)
^OperationFuture fut (.touch cc nk exp)]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(touch [clj-client k] (touch clj-client k {}))
(touch [clj-client k {:keys [^long timeout] :as opts}]
(let [^long to (or timeout (.getOperationTimeout cf))
^OperationFuture fut (cb-future/get-future
(async-touch clj-client k opts))]
(.get fut to TimeUnit/MILLISECONDS)))
(async-unlock [clj-client k cas-id]
(async-unlock clj-client k cas-id {}))
(async-unlock [clj-client k cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)
^OperationFuture fut (if transcoder
(.asyncUnlock cc nk ^long cas-id transcoder)
(.asyncUnlock cc nk ^long cas-id))]
(cb-future/->CouchbaseCljOperationFuture cf fut)))
(unlock [clj-client k cas-id] (unlock clj-client k cas-id {}))
(unlock [clj-client k cas-id {:keys [^Transcoder transcoder]}]
(let [^String nk (name k)]
(if transcoder
(.unlock cc nk ^long cas-id transcoder)
(.unlock cc nk ^long cas-id))))
(async-get-view [clj-client design-doc view-name]
(let [^HttpFuture fut (.asyncGetView cc design-doc view-name)]
(cb-future/->CouchbaseCljHttpFuture cf fut)))
(get-view [clj-client design-doc view-name] (.getView cc design-doc view-name))
TODO : Currently not supported due to API change in the Couchbase Client .
( let [ ^Future fut ( .asyncGetViews cc design - doc ) ]
( when - let [ rs ( .getViews cc design - doc ) ]
( seq rs ) ) )
(async-query [clj-client view q]
(let [^couchbase_clj.query.CouchbaseCljQuery
new-q (if (instance? couchbase_clj.query.CouchbaseCljQuery q)
q
(cb-query/create-query q))
^HttpFuture fut (.asyncQuery cc view (cb-query/get-query new-q))]
(cb-future/->CouchbaseCljHttpFuture cf fut)))
(async-query [clj-client design-doc view-name q]
(let [^View view (get-view clj-client design-doc view-name)]
(async-query clj-client view q)))
(query [clj-client view q]
(let [^couchbase_clj.query.CouchbaseCljQuery
new-q (if (instance? couchbase_clj.query.CouchbaseCljQuery q)
q
(cb-query/create-query q))]
(seq (.query cc view (cb-query/get-query new-q)))))
(query [clj-client design-doc view-name q]
(let [^View view (get-view clj-client design-doc view-name)]
(query clj-client view q)))
(lazy-query [clj-client view q num]
(let [^couchbase_clj.query.CouchbaseCljQuery
new-q (if (instance? couchbase_clj.query.CouchbaseCljQuery q)
q
(cb-query/create-query q))]
(-> (.paginatedQuery cc view (cb-query/get-query new-q) num)
iterator-seq
lazy-seq)))
(lazy-query [clj-client design-doc view-name q num]
(let [^View view (get-view clj-client design-doc view-name)]
(lazy-query clj-client view q num)))
(wait-queue [clj-client] (wait-queue clj-client (.getOperationTimeout cf)))
(wait-queue [clj-client timeout]
(.waitForQueues cc timeout TimeUnit/MILLISECONDS))
( add - observer [ clj - client conn - obs ] ( .addObserver cc conn - obs ) )
( remove - observer [ clj - client conn - obs ] ( conn - obs ) )
(shutdown [clj-client] (shutdown clj-client -1))
(shutdown [clj-client timeout] (.shutdown cc timeout TimeUnit/MILLISECONDS)))
(defn create-client
"Create and return a Couchbase client.
If no parameters are specified, client will be created
from default values specified in couchbase-clj.config.
You can specify keywords parameters: bucket, username, password, uris,
client-builder, factory and other opts.
bucket is the bucket name. Default value is defined
as @default-bucket and is \"default\".
username is the bucket username. Default value is defined
as @default-username and is a empty string.
Currently username is ignored.
password is the bucket password. Default value is defined
as @default-password and is a empty string.
uris is a Collection of string uris, ex: [\":8091/pools\"]
Other options can be specified for CouchbaseConnectionFactoryBuilder
object creation.
Internally, :failure-mode and :hash-alg must have a value and those
default values are :redistribute and :native-hash respectively.
All options for CouchbaseConnectionFactoryBuilder can be looked at
couchbase-clj.client-builder/method-map Var.
You can specify the client-builder keyword with the value of
CouchbaseCljClientBuilder object which is created by
couchbase-clj.client-builder/create-client-builder function.
When doing this, bucket, username, password keywords should be specified.
By using a factory keyword, you can pass a CouchbaseConnectionFactory object
which is created by couchbase-clj.client-builder/create-factory function.
ex:
(create-client)
(create-client {:bucket \"default\"
:username \"\"
:password \"\"
:uris [\":8091/pools\"]})
(create-client {:auth-descriptor auth-descriptor-object
:daemon false
:failure-mode :redistribute
:hash-alg :native-hash
:max-reconnect-delay 30000
:obs-poll-interval 100
:obs-poll-max 400
:op-queue-max-block-time 10000
:op-timeout 10000
:read-buffer-size 16384
:should-optimize false
:timeout-exception-threshold 1000
:transcoder (SerializingTranscoder.)
:use-nagle-algorithm false})
(create-client {:client-builder (create-client-builder
{:hash-alg :native-hash
:failure-mode :redistribute
:max-reconnect-delay 30000})
:uris [(URI. \":8091/pools\")]
:bucket \"default\"
:username \"\"
:password \"\"})
(create-client {:factory couchbase-connection-factory-object})"
([] (create-client {}))
([{:keys [client-builder factory] :as opts}]
(let [cf (cond
(and client-builder
(instance?
couchbase_clj.client_builder.CouchbaseCljClientBuilder
client-builder))
(cb-client-builder/create-factory
(-> (assoc opts
:factory-builder
(cb-client-builder/get-factory-builder client-builder))
(dissoc :client-builder)))
(and factory (instance? CouchbaseConnectionFactory factory))
factory
:else (cb-client-builder/build opts))]
(->CouchbaseCljClient (CouchbaseClient. cf) cf))))
(defmacro defclient
"A macro that defines a Var with Couchbase client specified by a name
with or without options.
See create-client function for detail."
([name]
`(def ~name (create-client)))
([name opts]
`(def ~name (create-client ~opts))))
|
b638c54523fe04b76eb70ac8cdb343e426185d598db636b952363a608bd8c764 | aws-beam/aws-erlang | aws_amplifybackend.erl | %% WARNING: DO NOT EDIT, AUTO-GENERATED CODE!
See -beam/aws-codegen for more details .
%% @doc AWS Amplify Admin API
-module(aws_amplifybackend).
-export([clone_backend/4,
clone_backend/5,
create_backend/2,
create_backend/3,
create_backend_api/3,
create_backend_api/4,
create_backend_auth/3,
create_backend_auth/4,
create_backend_config/3,
create_backend_config/4,
create_backend_storage/3,
create_backend_storage/4,
create_token/3,
create_token/4,
delete_backend/4,
delete_backend/5,
delete_backend_api/4,
delete_backend_api/5,
delete_backend_auth/4,
delete_backend_auth/5,
delete_backend_storage/4,
delete_backend_storage/5,
delete_token/4,
delete_token/5,
generate_backend_api_models/4,
generate_backend_api_models/5,
get_backend/3,
get_backend/4,
get_backend_api/4,
get_backend_api/5,
get_backend_api_models/4,
get_backend_api_models/5,
get_backend_auth/4,
get_backend_auth/5,
get_backend_job/4,
get_backend_job/6,
get_backend_job/7,
get_backend_storage/4,
get_backend_storage/5,
get_token/3,
get_token/5,
get_token/6,
import_backend_auth/4,
import_backend_auth/5,
import_backend_storage/4,
import_backend_storage/5,
list_backend_jobs/4,
list_backend_jobs/5,
list_s3_buckets/2,
list_s3_buckets/3,
remove_all_backends/3,
remove_all_backends/4,
remove_backend_config/3,
remove_backend_config/4,
update_backend_api/4,
update_backend_api/5,
update_backend_auth/4,
update_backend_auth/5,
update_backend_config/3,
update_backend_config/4,
update_backend_job/5,
update_backend_job/6,
update_backend_storage/4,
update_backend_storage/5]).
-include_lib("hackney/include/hackney_lib.hrl").
%%====================================================================
%% API
%%====================================================================
%% @doc This operation clones an existing backend.
clone_backend(Client, AppId, BackendEnvironmentName, Input) ->
clone_backend(Client, AppId, BackendEnvironmentName, Input, []).
clone_backend(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/environments/", aws_util:encode_uri(BackendEnvironmentName), "/clone"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc This operation creates a backend for an Amplify app .
%%
%% Backends are automatically created at the time of app creation.
create_backend(Client, Input) ->
create_backend(Client, Input, []).
create_backend(Client, Input0, Options0) ->
Method = post,
Path = ["/backend"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Creates a new backend API resource.
create_backend_api(Client, AppId, Input) ->
create_backend_api(Client, AppId, Input, []).
create_backend_api(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Creates a new backend authentication resource.
create_backend_auth(Client, AppId, Input) ->
create_backend_auth(Client, AppId, Input, []).
create_backend_auth(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Creates a config object for a backend.
create_backend_config(Client, AppId, Input) ->
create_backend_config(Client, AppId, Input, []).
create_backend_config(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/config"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Creates a backend storage resource.
create_backend_storage(Client, AppId, Input) ->
create_backend_storage(Client, AppId, Input, []).
create_backend_storage(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Generates a one - time challenge code to authenticate a user into your
Amplify Admin UI .
create_token(Client, AppId, Input) ->
create_token(Client, AppId, Input, []).
create_token(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/challenge"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Removes an existing environment from your Amplify project.
delete_backend(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/environments/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Deletes an existing backend API resource.
delete_backend_api(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend_api(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend_api(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Deletes an existing backend authentication resource.
delete_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Removes the specified backend storage resource.
delete_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Deletes the challenge token based on the given appId and sessionId .
delete_token(Client, AppId, SessionId, Input) ->
delete_token(Client, AppId, SessionId, Input, []).
delete_token(Client, AppId, SessionId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/challenge/", aws_util:encode_uri(SessionId), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Generates a model schema for an existing backend API resource.
generate_backend_api_models(Client, AppId, BackendEnvironmentName, Input) ->
generate_backend_api_models(Client, AppId, BackendEnvironmentName, Input, []).
generate_backend_api_models(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/generateModels"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Provides project-level details for your Amplify UI project.
get_backend(Client, AppId, Input) ->
get_backend(Client, AppId, Input, []).
get_backend(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Gets the details for a backend API.
get_backend_api(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_api(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_api(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Gets a model introspection schema for an existing backend API
%% resource.
get_backend_api_models(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_api_models(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_api_models(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/getModels"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Gets a backend auth details.
get_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Returns information about a specific job.
get_backend_job(Client, AppId, BackendEnvironmentName, JobId)
when is_map(Client) ->
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, #{}, #{}).
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, QueryMap, HeadersMap)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap) ->
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, QueryMap, HeadersMap, []).
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, QueryMap, HeadersMap, Options0)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap), is_list(Options0) ->
Path = ["/backend/", aws_util:encode_uri(AppId), "/job/", aws_util:encode_uri(BackendEnvironmentName), "/", aws_util:encode_uri(JobId), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false}
| Options0],
Headers = [],
Query_ = [],
request(Client, get, Path, Query_, Headers, undefined, Options, SuccessStatusCode).
%% @doc Gets details for a backend storage resource.
get_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Gets the challenge token based on the given appId and sessionId .
get_token(Client, AppId, SessionId)
when is_map(Client) ->
get_token(Client, AppId, SessionId, #{}, #{}).
get_token(Client, AppId, SessionId, QueryMap, HeadersMap)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap) ->
get_token(Client, AppId, SessionId, QueryMap, HeadersMap, []).
get_token(Client, AppId, SessionId, QueryMap, HeadersMap, Options0)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap), is_list(Options0) ->
Path = ["/backend/", aws_util:encode_uri(AppId), "/challenge/", aws_util:encode_uri(SessionId), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false}
| Options0],
Headers = [],
Query_ = [],
request(Client, get, Path, Query_, Headers, undefined, Options, SuccessStatusCode).
%% @doc Imports an existing backend authentication resource.
import_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
import_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
import_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), "/import"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Imports an existing backend storage resource.
import_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
import_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
import_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), "/import"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Lists the jobs for the backend of an Amplify app .
list_backend_jobs(Client, AppId, BackendEnvironmentName, Input) ->
list_backend_jobs(Client, AppId, BackendEnvironmentName, Input, []).
list_backend_jobs(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/job/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc The list of S3 buckets in your account.
list_s3_buckets(Client, Input) ->
list_s3_buckets(Client, Input, []).
list_s3_buckets(Client, Input0, Options0) ->
Method = post,
Path = ["/s3Buckets"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Removes all backend environments from your Amplify project.
remove_all_backends(Client, AppId, Input) ->
remove_all_backends(Client, AppId, Input, []).
remove_all_backends(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Removes the AWS resources required to access the Amplify Admin UI .
remove_backend_config(Client, AppId, Input) ->
remove_backend_config(Client, AppId, Input, []).
remove_backend_config(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/config/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Updates an existing backend API resource.
update_backend_api(Client, AppId, BackendEnvironmentName, Input) ->
update_backend_api(Client, AppId, BackendEnvironmentName, Input, []).
update_backend_api(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Updates an existing backend authentication resource.
update_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
update_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
update_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Updates the AWS resources required to access the Amplify Admin UI .
update_backend_config(Client, AppId, Input) ->
update_backend_config(Client, AppId, Input, []).
update_backend_config(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/config/update"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Updates a specific job.
update_backend_job(Client, AppId, BackendEnvironmentName, JobId, Input) ->
update_backend_job(Client, AppId, BackendEnvironmentName, JobId, Input, []).
update_backend_job(Client, AppId, BackendEnvironmentName, JobId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/job/", aws_util:encode_uri(BackendEnvironmentName), "/", aws_util:encode_uri(JobId), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%% @doc Updates an existing backend storage resource.
update_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
update_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
update_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
%%====================================================================
Internal functions
%%====================================================================
-spec request(aws_client:aws_client(), atom(), iolist(), list(),
list(), map() | undefined, list(), pos_integer() | undefined) ->
{ok, {integer(), list()}} |
{ok, Result, {integer(), list(), hackney:client()}} |
{error, Error, {integer(), list(), hackney:client()}} |
{error, term()} when
Result :: map(),
Error :: map().
request(Client, Method, Path, Query, Headers0, Input, Options, SuccessStatusCode) ->
RequestFun = fun() -> do_request(Client, Method, Path, Query, Headers0, Input, Options, SuccessStatusCode) end,
aws_request:request(RequestFun, Options).
do_request(Client, Method, Path, Query, Headers0, Input, Options, SuccessStatusCode) ->
Client1 = Client#{service => <<"amplifybackend">>},
Host = build_host(<<"amplifybackend">>, Client1),
URL0 = build_url(Host, Path, Client1),
URL = aws_request:add_query(URL0, Query),
AdditionalHeaders1 = [ {<<"Host">>, Host}
, {<<"Content-Type">>, <<"application/x-amz-json-1.1">>}
],
Payload =
case proplists:get_value(send_body_as_binary, Options) of
true ->
maps:get(<<"Body">>, Input, <<"">>);
false ->
encode_payload(Input)
end,
AdditionalHeaders = case proplists:get_value(append_sha256_content_hash, Options, false) of
true ->
add_checksum_hash_header(AdditionalHeaders1, Payload);
false ->
AdditionalHeaders1
end,
Headers1 = aws_request:add_headers(AdditionalHeaders, Headers0),
MethodBin = aws_request:method_to_binary(Method),
SignedHeaders = aws_request:sign_request(Client1, MethodBin, URL, Headers1, Payload),
Response = hackney:request(Method, URL, SignedHeaders, Payload, Options),
DecodeBody = not proplists:get_value(receive_body_as_binary, Options),
handle_response(Response, SuccessStatusCode, DecodeBody).
add_checksum_hash_header(Headers, Body) ->
[ {<<"X-Amz-CheckSum-SHA256">>, base64:encode(crypto:hash(sha256, Body))}
| Headers
].
handle_response({ok, StatusCode, ResponseHeaders}, SuccessStatusCode, _DecodeBody)
when StatusCode =:= 200;
StatusCode =:= 202;
StatusCode =:= 204;
StatusCode =:= 206;
StatusCode =:= SuccessStatusCode ->
{ok, {StatusCode, ResponseHeaders}};
handle_response({ok, StatusCode, ResponseHeaders}, _, _DecodeBody) ->
{error, {StatusCode, ResponseHeaders}};
handle_response({ok, StatusCode, ResponseHeaders, Client}, SuccessStatusCode, DecodeBody)
when StatusCode =:= 200;
StatusCode =:= 202;
StatusCode =:= 204;
StatusCode =:= 206;
StatusCode =:= SuccessStatusCode ->
case hackney:body(Client) of
{ok, <<>>} when StatusCode =:= 200;
StatusCode =:= SuccessStatusCode ->
{ok, #{}, {StatusCode, ResponseHeaders, Client}};
{ok, Body} ->
Result = case DecodeBody of
true ->
try
jsx:decode(Body)
catch
Error:Reason:Stack ->
erlang:raise(error, {body_decode_failed, Error, Reason, StatusCode, Body}, Stack)
end;
false -> #{<<"Body">> => Body}
end,
{ok, Result, {StatusCode, ResponseHeaders, Client}}
end;
handle_response({ok, StatusCode, _ResponseHeaders, _Client}, _, _DecodeBody)
when StatusCode =:= 503 ->
Retriable error if retries are enabled
{error, service_unavailable};
handle_response({ok, StatusCode, ResponseHeaders, Client}, _, _DecodeBody) ->
{ok, Body} = hackney:body(Client),
try
DecodedError = jsx:decode(Body),
{error, DecodedError, {StatusCode, ResponseHeaders, Client}}
catch
Error:Reason:Stack ->
erlang:raise(error, {body_decode_failed, Error, Reason, StatusCode, Body}, Stack)
end;
handle_response({error, Reason}, _, _DecodeBody) ->
{error, Reason}.
build_host(_EndpointPrefix, #{region := <<"local">>, endpoint := Endpoint}) ->
Endpoint;
build_host(_EndpointPrefix, #{region := <<"local">>}) ->
<<"localhost">>;
build_host(EndpointPrefix, #{region := Region, endpoint := Endpoint}) ->
aws_util:binary_join([EndpointPrefix, Region, Endpoint], <<".">>).
build_url(Host, Path0, Client) ->
Proto = aws_client:proto(Client),
Path = erlang:iolist_to_binary(Path0),
Port = aws_client:port(Client),
aws_util:binary_join([Proto, <<"://">>, Host, <<":">>, Port, Path], <<"">>).
-spec encode_payload(undefined | map()) -> binary().
encode_payload(undefined) ->
<<>>;
encode_payload(Input) ->
jsx:encode(Input).
| null | https://raw.githubusercontent.com/aws-beam/aws-erlang/699287cee7dfc9dc8c08ced5f090dcc192c9cba8/src/aws_amplifybackend.erl | erlang | WARNING: DO NOT EDIT, AUTO-GENERATED CODE!
@doc AWS Amplify Admin API
====================================================================
API
====================================================================
@doc This operation clones an existing backend.
Backends are automatically created at the time of app creation.
@doc Creates a new backend API resource.
@doc Creates a new backend authentication resource.
@doc Creates a config object for a backend.
@doc Creates a backend storage resource.
@doc Removes an existing environment from your Amplify project.
@doc Deletes an existing backend API resource.
@doc Deletes an existing backend authentication resource.
@doc Removes the specified backend storage resource.
@doc Generates a model schema for an existing backend API resource.
@doc Provides project-level details for your Amplify UI project.
@doc Gets the details for a backend API.
@doc Gets a model introspection schema for an existing backend API
resource.
@doc Gets a backend auth details.
@doc Returns information about a specific job.
@doc Gets details for a backend storage resource.
@doc Imports an existing backend authentication resource.
@doc Imports an existing backend storage resource.
@doc The list of S3 buckets in your account.
@doc Removes all backend environments from your Amplify project.
@doc Updates an existing backend API resource.
@doc Updates an existing backend authentication resource.
@doc Updates a specific job.
@doc Updates an existing backend storage resource.
====================================================================
==================================================================== | See -beam/aws-codegen for more details .
-module(aws_amplifybackend).
-export([clone_backend/4,
clone_backend/5,
create_backend/2,
create_backend/3,
create_backend_api/3,
create_backend_api/4,
create_backend_auth/3,
create_backend_auth/4,
create_backend_config/3,
create_backend_config/4,
create_backend_storage/3,
create_backend_storage/4,
create_token/3,
create_token/4,
delete_backend/4,
delete_backend/5,
delete_backend_api/4,
delete_backend_api/5,
delete_backend_auth/4,
delete_backend_auth/5,
delete_backend_storage/4,
delete_backend_storage/5,
delete_token/4,
delete_token/5,
generate_backend_api_models/4,
generate_backend_api_models/5,
get_backend/3,
get_backend/4,
get_backend_api/4,
get_backend_api/5,
get_backend_api_models/4,
get_backend_api_models/5,
get_backend_auth/4,
get_backend_auth/5,
get_backend_job/4,
get_backend_job/6,
get_backend_job/7,
get_backend_storage/4,
get_backend_storage/5,
get_token/3,
get_token/5,
get_token/6,
import_backend_auth/4,
import_backend_auth/5,
import_backend_storage/4,
import_backend_storage/5,
list_backend_jobs/4,
list_backend_jobs/5,
list_s3_buckets/2,
list_s3_buckets/3,
remove_all_backends/3,
remove_all_backends/4,
remove_backend_config/3,
remove_backend_config/4,
update_backend_api/4,
update_backend_api/5,
update_backend_auth/4,
update_backend_auth/5,
update_backend_config/3,
update_backend_config/4,
update_backend_job/5,
update_backend_job/6,
update_backend_storage/4,
update_backend_storage/5]).
-include_lib("hackney/include/hackney_lib.hrl").
clone_backend(Client, AppId, BackendEnvironmentName, Input) ->
clone_backend(Client, AppId, BackendEnvironmentName, Input, []).
clone_backend(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/environments/", aws_util:encode_uri(BackendEnvironmentName), "/clone"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc This operation creates a backend for an Amplify app .
create_backend(Client, Input) ->
create_backend(Client, Input, []).
create_backend(Client, Input0, Options0) ->
Method = post,
Path = ["/backend"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
create_backend_api(Client, AppId, Input) ->
create_backend_api(Client, AppId, Input, []).
create_backend_api(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
create_backend_auth(Client, AppId, Input) ->
create_backend_auth(Client, AppId, Input, []).
create_backend_auth(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
create_backend_config(Client, AppId, Input) ->
create_backend_config(Client, AppId, Input, []).
create_backend_config(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/config"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
create_backend_storage(Client, AppId, Input) ->
create_backend_storage(Client, AppId, Input, []).
create_backend_storage(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Generates a one - time challenge code to authenticate a user into your
Amplify Admin UI .
create_token(Client, AppId, Input) ->
create_token(Client, AppId, Input, []).
create_token(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/challenge"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
delete_backend(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/environments/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
delete_backend_api(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend_api(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend_api(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
delete_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
delete_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
delete_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
delete_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Deletes the challenge token based on the given appId and sessionId .
delete_token(Client, AppId, SessionId, Input) ->
delete_token(Client, AppId, SessionId, Input, []).
delete_token(Client, AppId, SessionId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/challenge/", aws_util:encode_uri(SessionId), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
generate_backend_api_models(Client, AppId, BackendEnvironmentName, Input) ->
generate_backend_api_models(Client, AppId, BackendEnvironmentName, Input, []).
generate_backend_api_models(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/generateModels"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
get_backend(Client, AppId, Input) ->
get_backend(Client, AppId, Input, []).
get_backend(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
get_backend_api(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_api(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_api(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
get_backend_api_models(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_api_models(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_api_models(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), "/getModels"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
get_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
get_backend_job(Client, AppId, BackendEnvironmentName, JobId)
when is_map(Client) ->
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, #{}, #{}).
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, QueryMap, HeadersMap)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap) ->
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, QueryMap, HeadersMap, []).
get_backend_job(Client, AppId, BackendEnvironmentName, JobId, QueryMap, HeadersMap, Options0)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap), is_list(Options0) ->
Path = ["/backend/", aws_util:encode_uri(AppId), "/job/", aws_util:encode_uri(BackendEnvironmentName), "/", aws_util:encode_uri(JobId), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false}
| Options0],
Headers = [],
Query_ = [],
request(Client, get, Path, Query_, Headers, undefined, Options, SuccessStatusCode).
get_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
get_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
get_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), "/details"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Gets the challenge token based on the given appId and sessionId .
get_token(Client, AppId, SessionId)
when is_map(Client) ->
get_token(Client, AppId, SessionId, #{}, #{}).
get_token(Client, AppId, SessionId, QueryMap, HeadersMap)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap) ->
get_token(Client, AppId, SessionId, QueryMap, HeadersMap, []).
get_token(Client, AppId, SessionId, QueryMap, HeadersMap, Options0)
when is_map(Client), is_map(QueryMap), is_map(HeadersMap), is_list(Options0) ->
Path = ["/backend/", aws_util:encode_uri(AppId), "/challenge/", aws_util:encode_uri(SessionId), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false}
| Options0],
Headers = [],
Query_ = [],
request(Client, get, Path, Query_, Headers, undefined, Options, SuccessStatusCode).
import_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
import_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
import_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), "/import"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
import_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
import_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
import_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), "/import"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Lists the jobs for the backend of an Amplify app .
list_backend_jobs(Client, AppId, BackendEnvironmentName, Input) ->
list_backend_jobs(Client, AppId, BackendEnvironmentName, Input, []).
list_backend_jobs(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/job/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
list_s3_buckets(Client, Input) ->
list_s3_buckets(Client, Input, []).
list_s3_buckets(Client, Input0, Options0) ->
Method = post,
Path = ["/s3Buckets"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
remove_all_backends(Client, AppId, Input) ->
remove_all_backends(Client, AppId, Input, []).
remove_all_backends(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Removes the AWS resources required to access the Amplify Admin UI .
remove_backend_config(Client, AppId, Input) ->
remove_backend_config(Client, AppId, Input, []).
remove_backend_config(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/config/remove"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
update_backend_api(Client, AppId, BackendEnvironmentName, Input) ->
update_backend_api(Client, AppId, BackendEnvironmentName, Input, []).
update_backend_api(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/api/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
update_backend_auth(Client, AppId, BackendEnvironmentName, Input) ->
update_backend_auth(Client, AppId, BackendEnvironmentName, Input, []).
update_backend_auth(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/auth/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
@doc Updates the AWS resources required to access the Amplify Admin UI .
update_backend_config(Client, AppId, Input) ->
update_backend_config(Client, AppId, Input, []).
update_backend_config(Client, AppId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/config/update"],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
update_backend_job(Client, AppId, BackendEnvironmentName, JobId, Input) ->
update_backend_job(Client, AppId, BackendEnvironmentName, JobId, Input, []).
update_backend_job(Client, AppId, BackendEnvironmentName, JobId, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/job/", aws_util:encode_uri(BackendEnvironmentName), "/", aws_util:encode_uri(JobId), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
update_backend_storage(Client, AppId, BackendEnvironmentName, Input) ->
update_backend_storage(Client, AppId, BackendEnvironmentName, Input, []).
update_backend_storage(Client, AppId, BackendEnvironmentName, Input0, Options0) ->
Method = post,
Path = ["/backend/", aws_util:encode_uri(AppId), "/storage/", aws_util:encode_uri(BackendEnvironmentName), ""],
SuccessStatusCode = 200,
Options = [{send_body_as_binary, false},
{receive_body_as_binary, false},
{append_sha256_content_hash, false}
| Options0],
Headers = [],
Input1 = Input0,
CustomHeaders = [],
Input2 = Input1,
Query_ = [],
Input = Input2,
request(Client, Method, Path, Query_, CustomHeaders ++ Headers, Input, Options, SuccessStatusCode).
Internal functions
-spec request(aws_client:aws_client(), atom(), iolist(), list(),
list(), map() | undefined, list(), pos_integer() | undefined) ->
{ok, {integer(), list()}} |
{ok, Result, {integer(), list(), hackney:client()}} |
{error, Error, {integer(), list(), hackney:client()}} |
{error, term()} when
Result :: map(),
Error :: map().
request(Client, Method, Path, Query, Headers0, Input, Options, SuccessStatusCode) ->
RequestFun = fun() -> do_request(Client, Method, Path, Query, Headers0, Input, Options, SuccessStatusCode) end,
aws_request:request(RequestFun, Options).
do_request(Client, Method, Path, Query, Headers0, Input, Options, SuccessStatusCode) ->
Client1 = Client#{service => <<"amplifybackend">>},
Host = build_host(<<"amplifybackend">>, Client1),
URL0 = build_url(Host, Path, Client1),
URL = aws_request:add_query(URL0, Query),
AdditionalHeaders1 = [ {<<"Host">>, Host}
, {<<"Content-Type">>, <<"application/x-amz-json-1.1">>}
],
Payload =
case proplists:get_value(send_body_as_binary, Options) of
true ->
maps:get(<<"Body">>, Input, <<"">>);
false ->
encode_payload(Input)
end,
AdditionalHeaders = case proplists:get_value(append_sha256_content_hash, Options, false) of
true ->
add_checksum_hash_header(AdditionalHeaders1, Payload);
false ->
AdditionalHeaders1
end,
Headers1 = aws_request:add_headers(AdditionalHeaders, Headers0),
MethodBin = aws_request:method_to_binary(Method),
SignedHeaders = aws_request:sign_request(Client1, MethodBin, URL, Headers1, Payload),
Response = hackney:request(Method, URL, SignedHeaders, Payload, Options),
DecodeBody = not proplists:get_value(receive_body_as_binary, Options),
handle_response(Response, SuccessStatusCode, DecodeBody).
add_checksum_hash_header(Headers, Body) ->
[ {<<"X-Amz-CheckSum-SHA256">>, base64:encode(crypto:hash(sha256, Body))}
| Headers
].
handle_response({ok, StatusCode, ResponseHeaders}, SuccessStatusCode, _DecodeBody)
when StatusCode =:= 200;
StatusCode =:= 202;
StatusCode =:= 204;
StatusCode =:= 206;
StatusCode =:= SuccessStatusCode ->
{ok, {StatusCode, ResponseHeaders}};
handle_response({ok, StatusCode, ResponseHeaders}, _, _DecodeBody) ->
{error, {StatusCode, ResponseHeaders}};
handle_response({ok, StatusCode, ResponseHeaders, Client}, SuccessStatusCode, DecodeBody)
when StatusCode =:= 200;
StatusCode =:= 202;
StatusCode =:= 204;
StatusCode =:= 206;
StatusCode =:= SuccessStatusCode ->
case hackney:body(Client) of
{ok, <<>>} when StatusCode =:= 200;
StatusCode =:= SuccessStatusCode ->
{ok, #{}, {StatusCode, ResponseHeaders, Client}};
{ok, Body} ->
Result = case DecodeBody of
true ->
try
jsx:decode(Body)
catch
Error:Reason:Stack ->
erlang:raise(error, {body_decode_failed, Error, Reason, StatusCode, Body}, Stack)
end;
false -> #{<<"Body">> => Body}
end,
{ok, Result, {StatusCode, ResponseHeaders, Client}}
end;
handle_response({ok, StatusCode, _ResponseHeaders, _Client}, _, _DecodeBody)
when StatusCode =:= 503 ->
Retriable error if retries are enabled
{error, service_unavailable};
handle_response({ok, StatusCode, ResponseHeaders, Client}, _, _DecodeBody) ->
{ok, Body} = hackney:body(Client),
try
DecodedError = jsx:decode(Body),
{error, DecodedError, {StatusCode, ResponseHeaders, Client}}
catch
Error:Reason:Stack ->
erlang:raise(error, {body_decode_failed, Error, Reason, StatusCode, Body}, Stack)
end;
handle_response({error, Reason}, _, _DecodeBody) ->
{error, Reason}.
build_host(_EndpointPrefix, #{region := <<"local">>, endpoint := Endpoint}) ->
Endpoint;
build_host(_EndpointPrefix, #{region := <<"local">>}) ->
<<"localhost">>;
build_host(EndpointPrefix, #{region := Region, endpoint := Endpoint}) ->
aws_util:binary_join([EndpointPrefix, Region, Endpoint], <<".">>).
build_url(Host, Path0, Client) ->
Proto = aws_client:proto(Client),
Path = erlang:iolist_to_binary(Path0),
Port = aws_client:port(Client),
aws_util:binary_join([Proto, <<"://">>, Host, <<":">>, Port, Path], <<"">>).
-spec encode_payload(undefined | map()) -> binary().
encode_payload(undefined) ->
<<>>;
encode_payload(Input) ->
jsx:encode(Input).
|
45f5949ea5f2e52ce25838c8d2c27f3573ccf15b251680561dbfb59583fa9cde | amnh/poy5 | pdfdraft.ml | (* \chaptertitle{Pdfdraft}{Make Draft Documents} *)
(* Make a PDF suitable for draft printing by replacing its images by crossed
boxes. Summary: pdfdraft \texttt{input.pdf} \texttt{output.pdf}.*)
open Utility
(* Predicate on an xobject: true if an image xobject. *)
let isimage pdf (_, xobj) =
Pdf.lookup_direct pdf "/Subtype" xobj = Some (Pdf.Name "/Image")
(* Given a set of resources for a page, and the name of a resource, determine if
that name refers to an image xobject. *)
let xobject_isimage pdf resources name =
match resources with
| Pdf.Dictionary _ ->
begin match Pdf.lookup_direct pdf "/XObject" resources with
| Some xobjects ->
isimage pdf ("", Pdf.lookup_fail "xobject not there" pdf name xobjects)
| _ -> false
end
| _ -> failwith "bad resources"
(* Remove any image xobjects from a set of resources. *)
let remove_image_xobjects pdf resources =
match resources with
| Pdf.Dictionary res ->
begin match Pdf.lookup_direct pdf "/XObject" resources with
| Some (Pdf.Dictionary xobjects) ->
Pdf.Dictionary
(replace "/XObject" (Pdf.Dictionary (lose (isimage pdf) xobjects)) res)
| _ -> resources
end
| _ -> failwith "bad resources"
(* The subsitute for an image. *)
let substitute =
rev
[Pdfpages.Op_q;
Pdfpages.Op_w 0.;
Pdfpages.Op_G 0.;
Pdfpages.Op_re (0., 0., 1., 1.);
Pdfpages.Op_m (0., 0.);
Pdfpages.Op_l (1., 1.);
Pdfpages.Op_m (0., 1.);
Pdfpages.Op_l (1., 0.);
Pdfpages.Op_S;
Pdfpages.Op_Q]
(* Remove references to images from a graphics stream. *)
let rec remove_images_stream pdf resources prev = function
| [] -> rev prev
| (Pdfpages.Op_Do name) as h::t ->
if xobject_isimage pdf resources name
then remove_images_stream pdf resources (substitute @ prev) t
else remove_images_stream pdf resources (h::prev) t
| Pdfpages.InlineImage _::t ->
remove_images_stream pdf resources (substitute @ prev) t
| h::t ->
remove_images_stream pdf resources (h::prev) t
(* Remove images from a page. *)
let remove_images_page pdf page =
let content' =
remove_images_stream pdf page.Pdfdoc.resources []
(Pdfpages.parse_operators pdf page.Pdfdoc.resources page.Pdfdoc.content)
in
{page with
Pdfdoc.content =
(let stream = Pdfpages.stream_of_ops content' in
Pdfcodec.encode_pdfstream pdf Pdfcodec.Flate stream;
[stream]);
Pdfdoc.resources =
remove_image_xobjects pdf page.Pdfdoc.resources}
(* Remove images from all pages in a document. *)
let remove_images pdf =
let pages = Pdfdoc.pages_of_pagetree pdf in
let pages' = map (remove_images_page pdf) pages in
let pdf, pagetree_num = Pdfdoc.add_pagetree pages' pdf in
let pdf = Pdfdoc.add_root pagetree_num [] pdf in
Pdf.remove_unreferenced pdf
(* Read command line arguments and call [remove_images] *)
let _ =
match Array.to_list Sys.argv with
| [_; in_file; out_file] ->
begin try
let ch = open_in_bin in_file in
let pdf = Pdfread.pdf_of_channel ch in
Pdfwrite.pdf_to_file (remove_images pdf) out_file;
close_in ch
with
err ->
Printf.printf "Failed to produce output.\n%s\n\n" (Printexc.to_string err);
exit 1
end
| _ ->
print_string "Syntax: pdfdraft <input> <output>\n\n"; exit 1
| null | https://raw.githubusercontent.com/amnh/poy5/da563a2339d3fa9c0110ae86cc35fad576f728ab/src/camlpdf-0.3/pdfdraft.ml | ocaml | \chaptertitle{Pdfdraft}{Make Draft Documents}
Make a PDF suitable for draft printing by replacing its images by crossed
boxes. Summary: pdfdraft \texttt{input.pdf} \texttt{output.pdf}.
Predicate on an xobject: true if an image xobject.
Given a set of resources for a page, and the name of a resource, determine if
that name refers to an image xobject.
Remove any image xobjects from a set of resources.
The subsitute for an image.
Remove references to images from a graphics stream.
Remove images from a page.
Remove images from all pages in a document.
Read command line arguments and call [remove_images] |
open Utility
let isimage pdf (_, xobj) =
Pdf.lookup_direct pdf "/Subtype" xobj = Some (Pdf.Name "/Image")
let xobject_isimage pdf resources name =
match resources with
| Pdf.Dictionary _ ->
begin match Pdf.lookup_direct pdf "/XObject" resources with
| Some xobjects ->
isimage pdf ("", Pdf.lookup_fail "xobject not there" pdf name xobjects)
| _ -> false
end
| _ -> failwith "bad resources"
let remove_image_xobjects pdf resources =
match resources with
| Pdf.Dictionary res ->
begin match Pdf.lookup_direct pdf "/XObject" resources with
| Some (Pdf.Dictionary xobjects) ->
Pdf.Dictionary
(replace "/XObject" (Pdf.Dictionary (lose (isimage pdf) xobjects)) res)
| _ -> resources
end
| _ -> failwith "bad resources"
let substitute =
rev
[Pdfpages.Op_q;
Pdfpages.Op_w 0.;
Pdfpages.Op_G 0.;
Pdfpages.Op_re (0., 0., 1., 1.);
Pdfpages.Op_m (0., 0.);
Pdfpages.Op_l (1., 1.);
Pdfpages.Op_m (0., 1.);
Pdfpages.Op_l (1., 0.);
Pdfpages.Op_S;
Pdfpages.Op_Q]
let rec remove_images_stream pdf resources prev = function
| [] -> rev prev
| (Pdfpages.Op_Do name) as h::t ->
if xobject_isimage pdf resources name
then remove_images_stream pdf resources (substitute @ prev) t
else remove_images_stream pdf resources (h::prev) t
| Pdfpages.InlineImage _::t ->
remove_images_stream pdf resources (substitute @ prev) t
| h::t ->
remove_images_stream pdf resources (h::prev) t
let remove_images_page pdf page =
let content' =
remove_images_stream pdf page.Pdfdoc.resources []
(Pdfpages.parse_operators pdf page.Pdfdoc.resources page.Pdfdoc.content)
in
{page with
Pdfdoc.content =
(let stream = Pdfpages.stream_of_ops content' in
Pdfcodec.encode_pdfstream pdf Pdfcodec.Flate stream;
[stream]);
Pdfdoc.resources =
remove_image_xobjects pdf page.Pdfdoc.resources}
let remove_images pdf =
let pages = Pdfdoc.pages_of_pagetree pdf in
let pages' = map (remove_images_page pdf) pages in
let pdf, pagetree_num = Pdfdoc.add_pagetree pages' pdf in
let pdf = Pdfdoc.add_root pagetree_num [] pdf in
Pdf.remove_unreferenced pdf
let _ =
match Array.to_list Sys.argv with
| [_; in_file; out_file] ->
begin try
let ch = open_in_bin in_file in
let pdf = Pdfread.pdf_of_channel ch in
Pdfwrite.pdf_to_file (remove_images pdf) out_file;
close_in ch
with
err ->
Printf.printf "Failed to produce output.\n%s\n\n" (Printexc.to_string err);
exit 1
end
| _ ->
print_string "Syntax: pdfdraft <input> <output>\n\n"; exit 1
|
0ca8fd5fd260004935d5bfaa22c3d7ebdfeba608def97b250f095235d751e251 | own-pt/cl-krr | tptp.lisp | Copyright 2016 IBM
Licensed under the Apache License , Version 2.0 ( the " License " ) ;
;; you may not use this file except in compliance with the License.
;; You may obtain a copy of the License at
;; -2.0
;; Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an " AS IS " BASIS ,
;; WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
;; See the License for the specific language governing permissions and
;; limitations under the License.
(in-package #:suo-kif)
(defun binary-logical-formula-tptp (l)
(ecase (car l)
(|and| (format nil "(~{~a~^ & ~})" (mapcar #'formula-tptp (cdr l))))
(|or| (format nil "(~{~a~^ | ~})" (mapcar #'formula-tptp (cdr l))))
(=> (format nil "(~a => ~a)" (formula-tptp (cadr l)) (formula-tptp (caddr l))))
(<=> (format nil "(~a <=> ~a)" (formula-tptp (cadr l)) (formula-tptp (caddr l))))
(|equal| (format nil "(~a = ~a)" (formula-tptp (cadr l)) (formula-tptp (caddr l))))))
(defun unary-logical-formula-tptp (l)
(let ((op (ecase (car l)
(|not| "~"))))
(format nil "(~a ~a)" op (formula-tptp (cadr l)))))
(defun atom-tptp (a)
(cond
((eq '|True| a) "$true")
((eq '|False| a) "$false")
((numberp a) (write-to-string a))
((stringp a) (format nil "'~a'" (escape-quotes (remove #\return (remove #\newline a)))))
((regular-varp a) (variable-tptp a))
(t (let ((name (symbol-name a)))
(if (relationp a)
(replace-special-chars (format nil "s_~a_m" name))
(replace-special-chars (format nil "s_~a" name)))))))
(defun relation-name-tptp (r)
(case r
(<= "lesseq")
(< "less")
(> "greater")
(>= "greatereq")
(lessThanOrEqualTo "lesseq")
(lessThan "less")
(greaterThan "greater")
(greatherThanOrEqualTo "greatereq")
(MultiplicationFn "times")
(DivisionFn "divide")
(AdditionFn "plus")
(SutractionFn "minus")
(otherwise (format nil "s_~a" (symbol-name r)))))
(defun predicate-tptp (l)
(format nil "~a(~{~a~^, ~})" (relation-name-tptp (car l)) (mapcar #'formula-tptp (cdr l))))
(defun formula-tptp (l)
(cond
((atom l) (atom-tptp l))
((quantifierp l) (quantifier-tptp (car l) (cadr l) (caddr l)))
((binary-logical-formulap l) (binary-logical-formula-tptp l))
((unary-logical-formulap l) (unary-logical-formula-tptp l))
(t (predicate-tptp l))))
(defun variable-tptp (variable)
(replace-special-chars (string-upcase (subseq (symbol-name variable) 1))))
(defun atoms-tptp (atoms)
(format nil "~{~a~^,~}" (mapcar #'atom-tptp atoms)))
(defun quantifier-tptp (quantifier variables formula)
(let ((fmt (ecase quantifier
(|forall| "! [~a] : (~a)")
(|exists| "? [~a] : (~a)"))))
(format nil fmt (atoms-tptp variables) (formula-tptp formula))))
(defun can-translate-to-FOL (f &optional ctx)
"Checks if F is a traditional first-order logic formula that is
supported by the FOF variant of TPTP."
(cond
((atom f) (not (member f '(|True| |False|))))
((quantifier-termp (car f)) (every #'identity (mapcar (lambda (x) (can-translate-to-FOL x ctx)) (cddr f))))
((logical-operatorp (car f))
(unless (member :predicate ctx)
(every #'identity (mapcar (lambda (x) (can-translate-to-FOL x (cons :logic ctx))) (cdr f)))))
((and (relationp (car f)) (not (kif-functionp (car f))))
(unless (or (member :function ctx) (member :predicate ctx))
(every #'identity (mapcar (lambda (x) (can-translate-to-FOL x (cons :predicate ctx))) (cdr f)))))
((kif-functionp (car f))
(unless (eq :logic (car ctx))
(every #'identity (mapcar (lambda (x) (can-translate-to-FOL x (cons :function ctx))) (cdr f)))))
(t (and (can-translate-to-FOL (car f) ctx) (can-translate-to-FOL (cdr f) ctx)))))
(defun kif-tptp (file formulas &optional (statement-type "axiom"))
(with-output-to-file (out file :if-exists :supersede)
(let ((count 0))
(dolist (frm formulas)
(format out "~%/*~%~a~%*/~%" frm)
(if (can-translate-to-FOL frm)
(format out "fof(a~a,~a,~a).~%" (incf count) statement-type (formula-tptp frm))
(format out "%% no translation to TPTP/FOF available.~%"))))))
| null | https://raw.githubusercontent.com/own-pt/cl-krr/d21ce3c385ecb0b5b51bd2b6491a082532f9867c/tptp.lisp | lisp |
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
-2.0
Unless required by applicable law or agreed to in writing, software
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License. | Copyright 2016 IBM
distributed under the License is distributed on an " AS IS " BASIS ,
(in-package #:suo-kif)
(defun binary-logical-formula-tptp (l)
(ecase (car l)
(|and| (format nil "(~{~a~^ & ~})" (mapcar #'formula-tptp (cdr l))))
(|or| (format nil "(~{~a~^ | ~})" (mapcar #'formula-tptp (cdr l))))
(=> (format nil "(~a => ~a)" (formula-tptp (cadr l)) (formula-tptp (caddr l))))
(<=> (format nil "(~a <=> ~a)" (formula-tptp (cadr l)) (formula-tptp (caddr l))))
(|equal| (format nil "(~a = ~a)" (formula-tptp (cadr l)) (formula-tptp (caddr l))))))
(defun unary-logical-formula-tptp (l)
(let ((op (ecase (car l)
(|not| "~"))))
(format nil "(~a ~a)" op (formula-tptp (cadr l)))))
(defun atom-tptp (a)
(cond
((eq '|True| a) "$true")
((eq '|False| a) "$false")
((numberp a) (write-to-string a))
((stringp a) (format nil "'~a'" (escape-quotes (remove #\return (remove #\newline a)))))
((regular-varp a) (variable-tptp a))
(t (let ((name (symbol-name a)))
(if (relationp a)
(replace-special-chars (format nil "s_~a_m" name))
(replace-special-chars (format nil "s_~a" name)))))))
(defun relation-name-tptp (r)
(case r
(<= "lesseq")
(< "less")
(> "greater")
(>= "greatereq")
(lessThanOrEqualTo "lesseq")
(lessThan "less")
(greaterThan "greater")
(greatherThanOrEqualTo "greatereq")
(MultiplicationFn "times")
(DivisionFn "divide")
(AdditionFn "plus")
(SutractionFn "minus")
(otherwise (format nil "s_~a" (symbol-name r)))))
(defun predicate-tptp (l)
(format nil "~a(~{~a~^, ~})" (relation-name-tptp (car l)) (mapcar #'formula-tptp (cdr l))))
(defun formula-tptp (l)
(cond
((atom l) (atom-tptp l))
((quantifierp l) (quantifier-tptp (car l) (cadr l) (caddr l)))
((binary-logical-formulap l) (binary-logical-formula-tptp l))
((unary-logical-formulap l) (unary-logical-formula-tptp l))
(t (predicate-tptp l))))
(defun variable-tptp (variable)
(replace-special-chars (string-upcase (subseq (symbol-name variable) 1))))
(defun atoms-tptp (atoms)
(format nil "~{~a~^,~}" (mapcar #'atom-tptp atoms)))
(defun quantifier-tptp (quantifier variables formula)
(let ((fmt (ecase quantifier
(|forall| "! [~a] : (~a)")
(|exists| "? [~a] : (~a)"))))
(format nil fmt (atoms-tptp variables) (formula-tptp formula))))
(defun can-translate-to-FOL (f &optional ctx)
"Checks if F is a traditional first-order logic formula that is
supported by the FOF variant of TPTP."
(cond
((atom f) (not (member f '(|True| |False|))))
((quantifier-termp (car f)) (every #'identity (mapcar (lambda (x) (can-translate-to-FOL x ctx)) (cddr f))))
((logical-operatorp (car f))
(unless (member :predicate ctx)
(every #'identity (mapcar (lambda (x) (can-translate-to-FOL x (cons :logic ctx))) (cdr f)))))
((and (relationp (car f)) (not (kif-functionp (car f))))
(unless (or (member :function ctx) (member :predicate ctx))
(every #'identity (mapcar (lambda (x) (can-translate-to-FOL x (cons :predicate ctx))) (cdr f)))))
((kif-functionp (car f))
(unless (eq :logic (car ctx))
(every #'identity (mapcar (lambda (x) (can-translate-to-FOL x (cons :function ctx))) (cdr f)))))
(t (and (can-translate-to-FOL (car f) ctx) (can-translate-to-FOL (cdr f) ctx)))))
(defun kif-tptp (file formulas &optional (statement-type "axiom"))
(with-output-to-file (out file :if-exists :supersede)
(let ((count 0))
(dolist (frm formulas)
(format out "~%/*~%~a~%*/~%" frm)
(if (can-translate-to-FOL frm)
(format out "fof(a~a,~a,~a).~%" (incf count) statement-type (formula-tptp frm))
(format out "%% no translation to TPTP/FOF available.~%"))))))
|
df1e8c67bca1add5658e73a0cf7f93751f7db8667ee35970533269ac01a47986 | backtracking/ocamlgraph | prim.ml | (**************************************************************************)
(* *)
: a generic graph library for OCaml
Copyright ( C ) 2004 - 2010
, and
(* *)
(* This software is free software; you can redistribute it and/or *)
modify it under the terms of the GNU Library General Public
License version 2.1 , with the special exception on linking
(* described in file LICENSE. *)
(* *)
(* This software is distributed in the hope that it will be useful, *)
(* but WITHOUT ANY WARRANTY; without even the implied warranty of *)
(* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. *)
(* *)
(**************************************************************************)
module type G = sig
type t
module V : Sig.COMPARABLE
module E : sig
type t
type label
val label : t -> label
val dst : t -> V.t
val src : t -> V.t
val compare : t -> t -> int
end
val iter_vertex : (V.t -> unit) -> t -> unit
val iter_edges_e : (E.t -> unit) -> t -> unit
val iter_succ_e : (E.t -> unit) -> t -> V.t -> unit
end
module Make
(G: G)
(W: Sig.WEIGHT with type edge = G.E.t) =
struct
open G.E
module H = Hashtbl.Make(G.V)
module Elt = struct
type t = W.t * G.V.t
weights are compared first , and minimal weights come first in the
queue
queue *)
let compare (w1,v1) (w2,v2) =
let cw = W.compare w2 w1 in
if cw != 0 then cw else G.V.compare v1 v2
end
module Q = Heap.Imperative(Elt)
let spanningtree_from g r =
let visited = H.create 97 in
let key = H.create 97 in
let q = Q.create 17 in
Q.add q (W.zero, r);
while not (Q.is_empty q) do
let (_,u) = Q.pop_maximum q in
if not (H.mem visited u) then begin
H.add visited u ();
G.iter_succ_e (fun e ->
let v = dst e in
if not (H.mem visited v) then begin
let wuv = W.weight e in
let improvement =
try W.compare wuv (fst (H.find key v)) < 0 with Not_found -> true
in
if improvement then begin
H.replace key v (wuv, e);
Q.add q (wuv, v)
end;
end) g u
end
done;
H.fold (fun _ (_, e) acc -> e :: acc) key []
let spanningtree g =
let r = ref None in
try
G.iter_vertex (fun v -> r := Some v; raise Exit) g;
invalid_arg "spanningtree"
with Exit ->
match !r with
| None -> assert false
| Some r -> spanningtree_from g r
end
| null | https://raw.githubusercontent.com/backtracking/ocamlgraph/1c028af097339ca8bc379436f7bd9477fa3a49cd/src/prim.ml | ocaml | ************************************************************************
This software is free software; you can redistribute it and/or
described in file LICENSE.
This software is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
************************************************************************ | : a generic graph library for OCaml
Copyright ( C ) 2004 - 2010
, and
modify it under the terms of the GNU Library General Public
License version 2.1 , with the special exception on linking
module type G = sig
type t
module V : Sig.COMPARABLE
module E : sig
type t
type label
val label : t -> label
val dst : t -> V.t
val src : t -> V.t
val compare : t -> t -> int
end
val iter_vertex : (V.t -> unit) -> t -> unit
val iter_edges_e : (E.t -> unit) -> t -> unit
val iter_succ_e : (E.t -> unit) -> t -> V.t -> unit
end
module Make
(G: G)
(W: Sig.WEIGHT with type edge = G.E.t) =
struct
open G.E
module H = Hashtbl.Make(G.V)
module Elt = struct
type t = W.t * G.V.t
weights are compared first , and minimal weights come first in the
queue
queue *)
let compare (w1,v1) (w2,v2) =
let cw = W.compare w2 w1 in
if cw != 0 then cw else G.V.compare v1 v2
end
module Q = Heap.Imperative(Elt)
let spanningtree_from g r =
let visited = H.create 97 in
let key = H.create 97 in
let q = Q.create 17 in
Q.add q (W.zero, r);
while not (Q.is_empty q) do
let (_,u) = Q.pop_maximum q in
if not (H.mem visited u) then begin
H.add visited u ();
G.iter_succ_e (fun e ->
let v = dst e in
if not (H.mem visited v) then begin
let wuv = W.weight e in
let improvement =
try W.compare wuv (fst (H.find key v)) < 0 with Not_found -> true
in
if improvement then begin
H.replace key v (wuv, e);
Q.add q (wuv, v)
end;
end) g u
end
done;
H.fold (fun _ (_, e) acc -> e :: acc) key []
let spanningtree g =
let r = ref None in
try
G.iter_vertex (fun v -> r := Some v; raise Exit) g;
invalid_arg "spanningtree"
with Exit ->
match !r with
| None -> assert false
| Some r -> spanningtree_from g r
end
|
4f0448502c6dd21d5f19a52a69ec8ae899a5e715954270d5c0562d86937050f1 | erlangonrails/devdb | file_handle_cache.erl | The contents of this file are subject to the Mozilla Public License
Version 1.1 ( the " License " ) ; you may not use this file except in
%% compliance with the License. You may obtain a copy of the License at
%% /
%%
Software distributed under the License is distributed on an " AS IS "
%% basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See the
%% License for the specific language governing rights and limitations
%% under the License.
%%
The Original Code is RabbitMQ .
%%
The Initial Developers of the Original Code are LShift Ltd ,
Cohesive Financial Technologies LLC , and Rabbit Technologies Ltd.
%%
Portions created before 22 - Nov-2008 00:00:00 GMT by LShift Ltd ,
Cohesive Financial Technologies LLC , or Rabbit Technologies Ltd
are Copyright ( C ) 2007 - 2008 LShift Ltd , Cohesive Financial
Technologies LLC , and Rabbit Technologies Ltd.
%%
Portions created by LShift Ltd are Copyright ( C ) 2007 - 2010 LShift
Ltd. Portions created by Cohesive Financial Technologies LLC are
Copyright ( C ) 2007 - 2010 Cohesive Financial Technologies
LLC . Portions created by Rabbit Technologies Ltd are Copyright
( C ) 2007 - 2010 Rabbit Technologies Ltd.
%%
%% All Rights Reserved.
%%
%% Contributor(s): ______________________________________.
%%
-module(file_handle_cache).
%% A File Handle Cache
%%
This extends a subset of the functionality of the Erlang file
%% module.
%%
%% Some constraints
1 ) This supports one writer , multiple readers per file . Nothing
%% else.
2 ) Do not open the same file from different processes . Bad things
%% may happen.
3 ) Writes are all appends . You can not write to the middle of a
%% file, although you can truncate and then append if you want.
4 ) Although there is a write buffer , there is no read buffer . Feel
%% free to use the read_ahead mode, but beware of the interaction
%% between that buffer and the write buffer.
%%
%% Some benefits
1 ) You do not have to remember to call sync before close
2 ) Buffering is much more flexible than with plain file module , and
%% you can control when the buffer gets flushed out. This means that
%% you can rely on reads-after-writes working, without having to call
%% the expensive sync.
3 ) Unnecessary calls to position and sync get optimised out .
4 ) You can find out what your ' real ' offset is , and what your
' virtual ' offset is ( i.e. where the hdl really is , and where it
%% would be after the write buffer is written out).
5 ) You can find out what the offset was when you last sync'd .
%%
%% There is also a server component which serves to limit the number
%% of open file handles in a "soft" way - the server will never
%% prevent a client from opening a handle, but may immediately tell it
to close the handle . Thus you can set the limit to zero and it will
%% still all work correctly, it is just that effectively no caching
%% will take place. The operation of limiting is as follows:
%%
%% On open and close, the client sends messages to the server
%% informing it of opens and closes. This allows the server to keep
%% track of the number of open handles. The client also keeps a
%% gb_tree which is updated on every use of a file handle, mapping the
%% time at which the file handle was last used (timestamp) to the
%% handle. Thus the smallest key in this tree maps to the file handle
%% that has not been used for the longest amount of time. This
%% smallest key is included in the messages to the server. As such,
%% the server keeps track of when the least recently used file handle
%% was used *at the point of the most recent open or close* by each
%% client.
%%
%% Note that this data can go very out of date, by the client using
%% the least recently used handle.
%%
%% When the limit is reached, the server calculates the average age of
%% the last reported least recently used file handle of all the
%% clients. It then tells all the clients to close any handles not
%% used for longer than this average, by invoking the callback the
%% client registered. The client should receive this message and pass
%% it into set_maximum_since_use/1. However, it is highly possible
%% this age will be greater than the ages of all the handles the
%% client knows of because the client has used its file handles in the
%% mean time. Thus at this point the client reports to the server the
%% current timestamp at which its least recently used file handle was
last used . The server will check two seconds later that either it
%% is back under the limit, in which case all is well again, or if
%% not, it will calculate a new average age. Its data will be much
%% more recent now, and so it is very likely that when this is
%% communicated to the clients, the clients will close file handles.
%%
%% The advantage of this scheme is that there is only communication
%% from the client to the server on open, close, and when in the
%% process of trying to reduce file handle usage. There is no
%% communication from the client to the server on normal file handle
%% operations. This scheme forms a feed-back loop - the server does
%% not care which file handles are closed, just that some are, and it
%% checks this repeatedly when over the limit. Given the guarantees of
now ( ) , even if there is just one file handle open , a limit of 1 ,
and one client , it is certain that when the client calculates the
%% age of the handle, it will be greater than when the server
%% calculated it, hence it should be closed.
%%
%% Handles which are closed as a result of the server are put into a
%% "soft-closed" state in which the handle is closed (data flushed out
and sync'd first ) but the state is maintained . The handle will be
%% fully reopened again as soon as needed, thus users of this library
%% do not need to worry about their handles being closed by the server
%% - reopening them when necessary is handled transparently.
%%
%% The server also supports obtain and release_on_death. obtain/0
blocks until a file descriptor is available . release_on_death/1
takes a pid and monitors the pid , reducing the count by 1 when the
pid dies . Thus the assumption is that obtain/0 is called first , and
when that returns , release_on_death/1 is called with the pid who
%% "owns" the file descriptor. This is, for example, used to track the
%% use of file descriptors through network sockets.
-behaviour(gen_server).
-export([register_callback/3]).
-export([open/3, close/1, read/2, append/2, sync/1, position/2, truncate/1,
last_sync_offset/1, current_virtual_offset/1, current_raw_offset/1,
flush/1, copy/3, set_maximum_since_use/1, delete/1, clear/1]).
-export([release_on_death/1, obtain/0]).
-export([start_link/0, init/1, handle_call/3, handle_cast/2, handle_info/2,
terminate/2, code_change/3]).
-define(SERVER, ?MODULE).
-define(RESERVED_FOR_OTHERS, 100).
-define(FILE_HANDLES_LIMIT_WINDOWS, 10000000).
-define(FILE_HANDLES_LIMIT_OTHER, 1024).
-define(FILE_HANDLES_CHECK_INTERVAL, 2000).
%%----------------------------------------------------------------------------
-record(file,
{ reader_count,
has_writer
}).
-record(handle,
{ hdl,
offset,
trusted_offset,
is_dirty,
write_buffer_size,
write_buffer_size_limit,
write_buffer,
at_eof,
path,
mode,
options,
is_write,
is_read,
last_used_at
}).
-record(fhc_state,
{ elders,
limit,
count,
obtains,
callbacks,
client_mrefs,
timer_ref
}).
%%----------------------------------------------------------------------------
Specs
%%----------------------------------------------------------------------------
-ifdef(use_specs).
-type(ref() :: any()).
-type(error() :: {'error', any()}).
-type(ok_or_error() :: ('ok' | error())).
-type(val_or_error(T) :: ({'ok', T} | error())).
-type(position() :: ('bof' | 'eof' | non_neg_integer() |
{('bof' |'eof'), non_neg_integer()} | {'cur', integer()})).
-type(offset() :: non_neg_integer()).
-spec(register_callback/3 :: (atom(), atom(), [any()]) -> 'ok').
-spec(open/3 ::
(string(), [any()],
[{'write_buffer', (non_neg_integer() | 'infinity' | 'unbuffered')}]) ->
val_or_error(ref())).
-spec(close/1 :: (ref()) -> ok_or_error()).
-spec(read/2 :: (ref(), non_neg_integer()) ->
val_or_error([char()] | binary()) | 'eof').
-spec(append/2 :: (ref(), iodata()) -> ok_or_error()).
-spec(sync/1 :: (ref()) -> ok_or_error()).
-spec(position/2 :: (ref(), position()) -> val_or_error(offset())).
-spec(truncate/1 :: (ref()) -> ok_or_error()).
-spec(last_sync_offset/1 :: (ref()) -> val_or_error(offset())).
-spec(current_virtual_offset/1 :: (ref()) -> val_or_error(offset())).
-spec(current_raw_offset/1 :: (ref()) -> val_or_error(offset())).
-spec(flush/1 :: (ref()) -> ok_or_error()).
-spec(copy/3 :: (ref(), ref(), non_neg_integer()) ->
val_or_error(non_neg_integer())).
-spec(set_maximum_since_use/1 :: (non_neg_integer()) -> 'ok').
-spec(delete/1 :: (ref()) -> ok_or_error()).
-spec(clear/1 :: (ref()) -> ok_or_error()).
-spec(release_on_death/1 :: (pid()) -> 'ok').
-spec(obtain/0 :: () -> 'ok').
-endif.
%%----------------------------------------------------------------------------
%% Public API
%%----------------------------------------------------------------------------
start_link() ->
gen_server:start_link({local, ?SERVER}, ?MODULE, [], [{timeout, infinity}]).
register_callback(M, F, A)
when is_atom(M) andalso is_atom(F) andalso is_list(A) ->
gen_server:cast(?SERVER, {register_callback, self(), {M, F, A}}).
open(Path, Mode, Options) ->
Path1 = filename:absname(Path),
File1 = #file { reader_count = RCount, has_writer = HasWriter } =
case get({Path1, fhc_file}) of
File = #file {} -> File;
undefined -> #file { reader_count = 0,
has_writer = false }
end,
Mode1 = append_to_write(Mode),
IsWriter = is_writer(Mode1),
case IsWriter andalso HasWriter of
true -> {error, writer_exists};
false -> Ref = make_ref(),
case open1(Path1, Mode1, Options, Ref, bof, new) of
{ok, _Handle} ->
RCount1 = case is_reader(Mode1) of
true -> RCount + 1;
false -> RCount
end,
HasWriter1 = HasWriter orelse IsWriter,
put({Path1, fhc_file},
File1 #file { reader_count = RCount1,
has_writer = HasWriter1 }),
{ok, Ref};
Error ->
Error
end
end.
close(Ref) ->
case erase({Ref, fhc_handle}) of
undefined -> ok;
Handle -> case hard_close(Handle) of
ok -> ok;
{Error, Handle1} -> put_handle(Ref, Handle1),
Error
end
end.
read(Ref, Count) ->
with_flushed_handles(
[Ref],
fun ([#handle { is_read = false }]) ->
{error, not_open_for_reading};
([Handle = #handle { hdl = Hdl, offset = Offset }]) ->
case file:read(Hdl, Count) of
{ok, Data} = Obj -> Offset1 = Offset + iolist_size(Data),
{Obj,
[Handle #handle { offset = Offset1 }]};
eof -> {eof, [Handle #handle { at_eof = true }]};
Error -> {Error, [Handle]}
end
end).
append(Ref, Data) ->
with_handles(
[Ref],
fun ([#handle { is_write = false }]) ->
{error, not_open_for_writing};
([Handle]) ->
case maybe_seek(eof, Handle) of
{{ok, _Offset}, #handle { hdl = Hdl, offset = Offset,
write_buffer_size_limit = 0,
at_eof = true } = Handle1} ->
Offset1 = Offset + iolist_size(Data),
{file:write(Hdl, Data),
[Handle1 #handle { is_dirty = true, offset = Offset1 }]};
{{ok, _Offset}, #handle { write_buffer = WriteBuffer,
write_buffer_size = Size,
write_buffer_size_limit = Limit,
at_eof = true } = Handle1} ->
WriteBuffer1 = [Data | WriteBuffer],
Size1 = Size + iolist_size(Data),
Handle2 = Handle1 #handle { write_buffer = WriteBuffer1,
write_buffer_size = Size1 },
case Limit /= infinity andalso Size1 > Limit of
true -> {Result, Handle3} = write_buffer(Handle2),
{Result, [Handle3]};
false -> {ok, [Handle2]}
end;
{{error, _} = Error, Handle1} ->
{Error, [Handle1]}
end
end).
sync(Ref) ->
with_flushed_handles(
[Ref],
fun ([#handle { is_dirty = false, write_buffer = [] }]) ->
ok;
([Handle = #handle { hdl = Hdl, offset = Offset,
is_dirty = true, write_buffer = [] }]) ->
case file:sync(Hdl) of
ok -> {ok, [Handle #handle { trusted_offset = Offset,
is_dirty = false }]};
Error -> {Error, [Handle]}
end
end).
position(Ref, NewOffset) ->
with_flushed_handles(
[Ref],
fun ([Handle]) -> {Result, Handle1} = maybe_seek(NewOffset, Handle),
{Result, [Handle1]}
end).
truncate(Ref) ->
with_flushed_handles(
[Ref],
fun ([Handle1 = #handle { hdl = Hdl, offset = Offset,
trusted_offset = TOffset }]) ->
case file:truncate(Hdl) of
ok -> TOffset1 = lists:min([Offset, TOffset]),
{ok, [Handle1 #handle { trusted_offset = TOffset1,
at_eof = true }]};
Error -> {Error, [Handle1]}
end
end).
last_sync_offset(Ref) ->
with_handles([Ref], fun ([#handle { trusted_offset = TOffset }]) ->
{ok, TOffset}
end).
current_virtual_offset(Ref) ->
with_handles([Ref], fun ([#handle { at_eof = true, is_write = true,
offset = Offset,
write_buffer_size = Size }]) ->
{ok, Offset + Size};
([#handle { offset = Offset }]) ->
{ok, Offset}
end).
current_raw_offset(Ref) ->
with_handles([Ref], fun ([Handle]) -> {ok, Handle #handle.offset} end).
flush(Ref) ->
with_flushed_handles([Ref], fun ([Handle]) -> {ok, [Handle]} end).
copy(Src, Dest, Count) ->
with_flushed_handles(
[Src, Dest],
fun ([SHandle = #handle { is_read = true, hdl = SHdl, offset = SOffset },
DHandle = #handle { is_write = true, hdl = DHdl, offset = DOffset }]
) ->
case file:copy(SHdl, DHdl, Count) of
{ok, Count1} = Result1 ->
{Result1,
[SHandle #handle { offset = SOffset + Count1 },
DHandle #handle { offset = DOffset + Count1 }]};
Error ->
{Error, [SHandle, DHandle]}
end;
(_Handles) ->
{error, incorrect_handle_modes}
end).
delete(Ref) ->
case erase({Ref, fhc_handle}) of
undefined ->
ok;
Handle = #handle { path = Path } ->
case hard_close(Handle #handle { is_dirty = false,
write_buffer = [] }) of
ok -> file:delete(Path);
{Error, Handle1} -> put_handle(Ref, Handle1),
Error
end
end.
clear(Ref) ->
with_handles(
[Ref],
fun ([#handle { at_eof = true, write_buffer_size = 0, offset = 0 }]) ->
ok;
([Handle]) ->
case maybe_seek(bof, Handle #handle { write_buffer = [],
write_buffer_size = 0 }) of
{{ok, 0}, Handle1 = #handle { hdl = Hdl }} ->
case file:truncate(Hdl) of
ok -> {ok, [Handle1 #handle {trusted_offset = 0,
at_eof = true }]};
Error -> {Error, [Handle1]}
end;
{{error, _} = Error, Handle1} ->
{Error, [Handle1]}
end
end).
set_maximum_since_use(MaximumAge) ->
Now = now(),
case lists:foldl(
fun ({{Ref, fhc_handle},
Handle = #handle { hdl = Hdl, last_used_at = Then }}, Rep) ->
Age = timer:now_diff(Now, Then),
case Hdl /= closed andalso Age >= MaximumAge of
true -> {Res, Handle1} = soft_close(Handle),
case Res of
ok -> put({Ref, fhc_handle}, Handle1),
false;
_ -> put_handle(Ref, Handle1),
Rep
end;
false -> Rep
end;
(_KeyValuePair, Rep) ->
Rep
end, true, get()) of
true -> age_tree_change(), ok;
false -> ok
end.
release_on_death(Pid) when is_pid(Pid) ->
gen_server:cast(?SERVER, {release_on_death, Pid}).
obtain() ->
gen_server:call(?SERVER, obtain, infinity).
%%----------------------------------------------------------------------------
Internal functions
%%----------------------------------------------------------------------------
is_reader(Mode) -> lists:member(read, Mode).
is_writer(Mode) -> lists:member(write, Mode).
append_to_write(Mode) ->
case lists:member(append, Mode) of
true -> [write | Mode -- [append, write]];
false -> Mode
end.
with_handles(Refs, Fun) ->
ResHandles = lists:foldl(
fun (Ref, {ok, HandlesAcc}) ->
case get_or_reopen(Ref) of
{ok, Handle} -> {ok, [Handle | HandlesAcc]};
Error -> Error
end;
(_Ref, Error) ->
Error
end, {ok, []}, Refs),
case ResHandles of
{ok, Handles} ->
case Fun(lists:reverse(Handles)) of
{Result, Handles1} when is_list(Handles1) ->
lists:zipwith(fun put_handle/2, Refs, Handles1),
Result;
Result ->
Result
end;
Error ->
Error
end.
with_flushed_handles(Refs, Fun) ->
with_handles(
Refs,
fun (Handles) ->
case lists:foldl(
fun (Handle, {ok, HandlesAcc}) ->
{Res, Handle1} = write_buffer(Handle),
{Res, [Handle1 | HandlesAcc]};
(Handle, {Error, HandlesAcc}) ->
{Error, [Handle | HandlesAcc]}
end, {ok, []}, Handles) of
{ok, Handles1} ->
Fun(lists:reverse(Handles1));
{Error, Handles1} ->
{Error, lists:reverse(Handles1)}
end
end).
get_or_reopen(Ref) ->
case get({Ref, fhc_handle}) of
undefined ->
{error, not_open, Ref};
#handle { hdl = closed, offset = Offset,
path = Path, mode = Mode, options = Options } ->
open1(Path, Mode, Options, Ref, Offset, reopen);
Handle ->
{ok, Handle}
end.
put_handle(Ref, Handle = #handle { last_used_at = Then }) ->
Now = now(),
age_tree_update(Then, Now, Ref),
put({Ref, fhc_handle}, Handle #handle { last_used_at = Now }).
with_age_tree(Fun) ->
put(fhc_age_tree, Fun(case get(fhc_age_tree) of
undefined -> gb_trees:empty();
AgeTree -> AgeTree
end)).
age_tree_insert(Now, Ref) ->
with_age_tree(
fun (Tree) ->
Tree1 = gb_trees:insert(Now, Ref, Tree),
{Oldest, _Ref} = gb_trees:smallest(Tree1),
gen_server:cast(?SERVER, {open, self(), Oldest}),
Tree1
end).
age_tree_update(Then, Now, Ref) ->
with_age_tree(
fun (Tree) ->
gb_trees:insert(Now, Ref, gb_trees:delete_any(Then, Tree))
end).
age_tree_delete(Then) ->
with_age_tree(
fun (Tree) ->
Tree1 = gb_trees:delete_any(Then, Tree),
Oldest = case gb_trees:is_empty(Tree1) of
true ->
undefined;
false ->
{Oldest1, _Ref} = gb_trees:smallest(Tree1),
Oldest1
end,
gen_server:cast(?SERVER, {close, self(), Oldest}),
Tree1
end).
age_tree_change() ->
with_age_tree(
fun (Tree) ->
case gb_trees:is_empty(Tree) of
true -> Tree;
false -> {Oldest, _Ref} = gb_trees:smallest(Tree),
gen_server:cast(?SERVER, {update, self(), Oldest})
end,
Tree
end).
open1(Path, Mode, Options, Ref, Offset, NewOrReopen) ->
Mode1 = case NewOrReopen of
new -> Mode;
reopen -> [read | Mode]
end,
case file:open(Path, Mode1) of
{ok, Hdl} ->
WriteBufferSize =
case proplists:get_value(write_buffer, Options, unbuffered) of
unbuffered -> 0;
infinity -> infinity;
N when is_integer(N) -> N
end,
Now = now(),
Handle = #handle { hdl = Hdl,
offset = 0,
trusted_offset = 0,
is_dirty = false,
write_buffer_size = 0,
write_buffer_size_limit = WriteBufferSize,
write_buffer = [],
at_eof = false,
path = Path,
mode = Mode,
options = Options,
is_write = is_writer(Mode),
is_read = is_reader(Mode),
last_used_at = Now },
{{ok, Offset1}, Handle1} = maybe_seek(Offset, Handle),
Handle2 = Handle1 #handle { trusted_offset = Offset1 },
put({Ref, fhc_handle}, Handle2),
age_tree_insert(Now, Ref),
{ok, Handle2};
{error, Reason} ->
{error, Reason}
end.
soft_close(Handle = #handle { hdl = closed }) ->
{ok, Handle};
soft_close(Handle) ->
case write_buffer(Handle) of
{ok, #handle { hdl = Hdl, offset = Offset, is_dirty = IsDirty,
last_used_at = Then } = Handle1 } ->
ok = case IsDirty of
true -> file:sync(Hdl);
false -> ok
end,
ok = file:close(Hdl),
age_tree_delete(Then),
{ok, Handle1 #handle { hdl = closed, trusted_offset = Offset,
is_dirty = false }};
{_Error, _Handle} = Result ->
Result
end.
hard_close(Handle) ->
case soft_close(Handle) of
{ok, #handle { path = Path,
is_read = IsReader, is_write = IsWriter }} ->
#file { reader_count = RCount, has_writer = HasWriter } = File =
get({Path, fhc_file}),
RCount1 = case IsReader of
true -> RCount - 1;
false -> RCount
end,
HasWriter1 = HasWriter andalso not IsWriter,
case RCount1 =:= 0 andalso not HasWriter1 of
true -> erase({Path, fhc_file});
false -> put({Path, fhc_file},
File #file { reader_count = RCount1,
has_writer = HasWriter1 })
end,
ok;
{_Error, _Handle} = Result ->
Result
end.
maybe_seek(NewOffset, Handle = #handle { hdl = Hdl, offset = Offset,
at_eof = AtEoF }) ->
{AtEoF1, NeedsSeek} = needs_seek(AtEoF, Offset, NewOffset),
case (case NeedsSeek of
true -> file:position(Hdl, NewOffset);
false -> {ok, Offset}
end) of
{ok, Offset1} = Result ->
{Result, Handle #handle { offset = Offset1, at_eof = AtEoF1 }};
{error, _} = Error ->
{Error, Handle}
end.
needs_seek( AtEoF, _CurOffset, cur ) -> {AtEoF, false};
needs_seek( AtEoF, _CurOffset, {cur, 0}) -> {AtEoF, false};
needs_seek( true, _CurOffset, eof ) -> {true , false};
needs_seek( true, _CurOffset, {eof, 0}) -> {true , false};
needs_seek( false, _CurOffset, eof ) -> {true , true };
needs_seek( false, _CurOffset, {eof, 0}) -> {true , true };
needs_seek( AtEoF, 0, bof ) -> {AtEoF, false};
needs_seek( AtEoF, 0, {bof, 0}) -> {AtEoF, false};
needs_seek( AtEoF, CurOffset, CurOffset) -> {AtEoF, false};
needs_seek( true, CurOffset, {bof, DesiredOffset})
when DesiredOffset >= CurOffset ->
{true, true};
needs_seek( true, _CurOffset, {cur, DesiredOffset})
when DesiredOffset > 0 ->
{true, true};
needs_seek( true, CurOffset, DesiredOffset) %% same as {bof, DO}
when is_integer(DesiredOffset) andalso DesiredOffset >= CurOffset ->
{true, true};
because we ca n't really track size , we could well end up at EoF and not know
needs_seek(_AtEoF, _CurOffset, _DesiredOffset) ->
{false, true}.
write_buffer(Handle = #handle { write_buffer = [] }) ->
{ok, Handle};
write_buffer(Handle = #handle { hdl = Hdl, offset = Offset,
write_buffer = WriteBuffer,
write_buffer_size = DataSize,
at_eof = true }) ->
case file:write(Hdl, lists:reverse(WriteBuffer)) of
ok ->
Offset1 = Offset + DataSize,
{ok, Handle #handle { offset = Offset1, is_dirty = true,
write_buffer = [], write_buffer_size = 0 }};
{error, _} = Error ->
{Error, Handle}
end.
%%----------------------------------------------------------------------------
%% gen_server callbacks
%%----------------------------------------------------------------------------
init([]) ->
Limit = case application:get_env(file_handles_high_watermark) of
{ok, Watermark} when (is_integer(Watermark) andalso
Watermark > 0) ->
Watermark;
_ ->
ulimit()
end,
error_logger:info_msg("Limiting to approx ~p file handles~n", [Limit]),
{ok, #fhc_state { elders = dict:new(), limit = Limit, count = 0,
obtains = [], callbacks = dict:new(),
client_mrefs = dict:new(), timer_ref = undefined }}.
handle_call(obtain, From, State = #fhc_state { count = Count }) ->
State1 = #fhc_state { count = Count1, limit = Limit, obtains = Obtains } =
maybe_reduce(State #fhc_state { count = Count + 1 }),
case Limit /= infinity andalso Count1 >= Limit of
true -> {noreply, State1 #fhc_state { obtains = [From | Obtains],
count = Count1 - 1 }};
false -> {reply, ok, State1}
end.
handle_cast({register_callback, Pid, MFA},
State = #fhc_state { callbacks = Callbacks }) ->
{noreply, ensure_mref(
Pid, State #fhc_state {
callbacks = dict:store(Pid, MFA, Callbacks) })};
handle_cast({open, Pid, EldestUnusedSince}, State =
#fhc_state { elders = Elders, count = Count }) ->
Elders1 = dict:store(Pid, EldestUnusedSince, Elders),
{noreply, maybe_reduce(
ensure_mref(Pid, State #fhc_state { elders = Elders1,
count = Count + 1 }))};
handle_cast({update, Pid, EldestUnusedSince}, State =
#fhc_state { elders = Elders }) ->
Elders1 = dict:store(Pid, EldestUnusedSince, Elders),
%% don't call maybe_reduce from here otherwise we can create a
%% storm of messages
{noreply, ensure_mref(Pid, State #fhc_state { elders = Elders1 })};
handle_cast({close, Pid, EldestUnusedSince}, State =
#fhc_state { elders = Elders, count = Count }) ->
Elders1 = case EldestUnusedSince of
undefined -> dict:erase(Pid, Elders);
_ -> dict:store(Pid, EldestUnusedSince, Elders)
end,
{noreply, process_obtains(
ensure_mref(Pid, State #fhc_state { elders = Elders1,
count = Count - 1 }))};
handle_cast(check_counts, State) ->
{noreply, maybe_reduce(State #fhc_state { timer_ref = undefined })};
handle_cast({release_on_death, Pid}, State) ->
_MRef = erlang:monitor(process, Pid),
{noreply, State}.
handle_info({'DOWN', MRef, process, Pid, _Reason}, State =
#fhc_state { count = Count, callbacks = Callbacks,
client_mrefs = ClientMRefs, elders = Elders }) ->
{noreply, process_obtains(
case dict:find(Pid, ClientMRefs) of
{ok, MRef} -> State #fhc_state {
elders = dict:erase(Pid, Elders),
client_mrefs = dict:erase(Pid, ClientMRefs),
callbacks = dict:erase(Pid, Callbacks) };
_ -> State #fhc_state { count = Count - 1 }
end)}.
terminate(_Reason, State) ->
State.
code_change(_OldVsn, State, _Extra) ->
{ok, State}.
%%----------------------------------------------------------------------------
%% server helpers
%%----------------------------------------------------------------------------
process_obtains(State = #fhc_state { obtains = [] }) ->
State;
process_obtains(State = #fhc_state { limit = Limit, count = Count })
when Limit /= infinity andalso Count >= Limit ->
State;
process_obtains(State = #fhc_state { limit = Limit, count = Count,
obtains = Obtains }) ->
ObtainsLen = length(Obtains),
ObtainableLen = lists:min([ObtainsLen, Limit - Count]),
Take = ObtainsLen - ObtainableLen,
{ObtainsNew, ObtainableRev} = lists:split(Take, Obtains),
[gen_server:reply(From, ok) || From <- ObtainableRev],
State #fhc_state { count = Count + ObtainableLen, obtains = ObtainsNew }.
maybe_reduce(State = #fhc_state { limit = Limit, count = Count, elders = Elders,
callbacks = Callbacks, timer_ref = TRef })
when Limit /= infinity andalso Count >= Limit ->
Now = now(),
{Pids, Sum, ClientCount} =
dict:fold(fun (_Pid, undefined, Accs) ->
Accs;
(Pid, Eldest, {PidsAcc, SumAcc, CountAcc}) ->
{[Pid|PidsAcc], SumAcc + timer:now_diff(Now, Eldest),
CountAcc + 1}
end, {[], 0, 0}, Elders),
case Pids of
[] -> ok;
_ -> AverageAge = Sum / ClientCount,
lists:foreach(
fun (Pid) ->
case dict:find(Pid, Callbacks) of
error -> ok;
{ok, {M, F, A}} -> apply(M, F, A ++ [AverageAge])
end
end, Pids)
end,
case TRef of
undefined -> {ok, TRef1} = timer:apply_after(
?FILE_HANDLES_CHECK_INTERVAL,
gen_server, cast, [?SERVER, check_counts]),
State #fhc_state { timer_ref = TRef1 };
_ -> State
end;
maybe_reduce(State) ->
State.
Googling around suggests that Windows has a limit somewhere around
16 M , eg
%% For everything else, assume ulimit exists. Further googling
suggests that BSDs ( incl OS X ) , solaris and linux all agree that
%% ulimit -n is file handles
ulimit() ->
case os:type() of
{win32, _OsName} ->
?FILE_HANDLES_LIMIT_WINDOWS;
{unix, _OsName} ->
Under Linux , Solaris and FreeBSD , ulimit is a shell
builtin , not a command . In OS X , it 's a command .
%% Fortunately, os:cmd invokes the cmd in a shell env, so
%% we're safe in all cases.
case os:cmd("ulimit -n") of
"unlimited" ->
infinity;
String = [C|_] when $0 =< C andalso C =< $9 ->
Num = list_to_integer(
lists:takewhile(
fun (D) -> $0 =< D andalso D =< $9 end, String)) -
?RESERVED_FOR_OTHERS,
lists:max([1, Num]);
_ ->
%% probably a variant of
" /bin / sh : line 1 : ulimit : command not found\n "
?FILE_HANDLES_LIMIT_OTHER - ?RESERVED_FOR_OTHERS
end;
_ ->
?FILE_HANDLES_LIMIT_OTHER - ?RESERVED_FOR_OTHERS
end.
ensure_mref(Pid, State = #fhc_state { client_mrefs = ClientMRefs }) ->
case dict:find(Pid, ClientMRefs) of
{ok, _MRef} -> State;
error -> MRef = erlang:monitor(process, Pid),
State #fhc_state {
client_mrefs = dict:store(Pid, MRef, ClientMRefs) }
end.
| null | https://raw.githubusercontent.com/erlangonrails/devdb/0e7eaa6bd810ec3892bfc3d933439560620d0941/dev/rabbitmq-server-1.8.0/src/file_handle_cache.erl | erlang | compliance with the License. You may obtain a copy of the License at
/
basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See the
License for the specific language governing rights and limitations
under the License.
All Rights Reserved.
Contributor(s): ______________________________________.
A File Handle Cache
module.
Some constraints
else.
may happen.
file, although you can truncate and then append if you want.
free to use the read_ahead mode, but beware of the interaction
between that buffer and the write buffer.
Some benefits
you can control when the buffer gets flushed out. This means that
you can rely on reads-after-writes working, without having to call
the expensive sync.
would be after the write buffer is written out).
There is also a server component which serves to limit the number
of open file handles in a "soft" way - the server will never
prevent a client from opening a handle, but may immediately tell it
still all work correctly, it is just that effectively no caching
will take place. The operation of limiting is as follows:
On open and close, the client sends messages to the server
informing it of opens and closes. This allows the server to keep
track of the number of open handles. The client also keeps a
gb_tree which is updated on every use of a file handle, mapping the
time at which the file handle was last used (timestamp) to the
handle. Thus the smallest key in this tree maps to the file handle
that has not been used for the longest amount of time. This
smallest key is included in the messages to the server. As such,
the server keeps track of when the least recently used file handle
was used *at the point of the most recent open or close* by each
client.
Note that this data can go very out of date, by the client using
the least recently used handle.
When the limit is reached, the server calculates the average age of
the last reported least recently used file handle of all the
clients. It then tells all the clients to close any handles not
used for longer than this average, by invoking the callback the
client registered. The client should receive this message and pass
it into set_maximum_since_use/1. However, it is highly possible
this age will be greater than the ages of all the handles the
client knows of because the client has used its file handles in the
mean time. Thus at this point the client reports to the server the
current timestamp at which its least recently used file handle was
is back under the limit, in which case all is well again, or if
not, it will calculate a new average age. Its data will be much
more recent now, and so it is very likely that when this is
communicated to the clients, the clients will close file handles.
The advantage of this scheme is that there is only communication
from the client to the server on open, close, and when in the
process of trying to reduce file handle usage. There is no
communication from the client to the server on normal file handle
operations. This scheme forms a feed-back loop - the server does
not care which file handles are closed, just that some are, and it
checks this repeatedly when over the limit. Given the guarantees of
age of the handle, it will be greater than when the server
calculated it, hence it should be closed.
Handles which are closed as a result of the server are put into a
"soft-closed" state in which the handle is closed (data flushed out
fully reopened again as soon as needed, thus users of this library
do not need to worry about their handles being closed by the server
- reopening them when necessary is handled transparently.
The server also supports obtain and release_on_death. obtain/0
"owns" the file descriptor. This is, for example, used to track the
use of file descriptors through network sockets.
----------------------------------------------------------------------------
----------------------------------------------------------------------------
----------------------------------------------------------------------------
----------------------------------------------------------------------------
Public API
----------------------------------------------------------------------------
----------------------------------------------------------------------------
----------------------------------------------------------------------------
same as {bof, DO}
----------------------------------------------------------------------------
gen_server callbacks
----------------------------------------------------------------------------
don't call maybe_reduce from here otherwise we can create a
storm of messages
----------------------------------------------------------------------------
server helpers
----------------------------------------------------------------------------
For everything else, assume ulimit exists. Further googling
ulimit -n is file handles
Fortunately, os:cmd invokes the cmd in a shell env, so
we're safe in all cases.
probably a variant of | The contents of this file are subject to the Mozilla Public License
Version 1.1 ( the " License " ) ; you may not use this file except in
Software distributed under the License is distributed on an " AS IS "
The Original Code is RabbitMQ .
The Initial Developers of the Original Code are LShift Ltd ,
Cohesive Financial Technologies LLC , and Rabbit Technologies Ltd.
Portions created before 22 - Nov-2008 00:00:00 GMT by LShift Ltd ,
Cohesive Financial Technologies LLC , or Rabbit Technologies Ltd
are Copyright ( C ) 2007 - 2008 LShift Ltd , Cohesive Financial
Technologies LLC , and Rabbit Technologies Ltd.
Portions created by LShift Ltd are Copyright ( C ) 2007 - 2010 LShift
Ltd. Portions created by Cohesive Financial Technologies LLC are
Copyright ( C ) 2007 - 2010 Cohesive Financial Technologies
LLC . Portions created by Rabbit Technologies Ltd are Copyright
( C ) 2007 - 2010 Rabbit Technologies Ltd.
-module(file_handle_cache).
This extends a subset of the functionality of the Erlang file
1 ) This supports one writer , multiple readers per file . Nothing
2 ) Do not open the same file from different processes . Bad things
3 ) Writes are all appends . You can not write to the middle of a
4 ) Although there is a write buffer , there is no read buffer . Feel
1 ) You do not have to remember to call sync before close
2 ) Buffering is much more flexible than with plain file module , and
3 ) Unnecessary calls to position and sync get optimised out .
4 ) You can find out what your ' real ' offset is , and what your
' virtual ' offset is ( i.e. where the hdl really is , and where it
5 ) You can find out what the offset was when you last sync'd .
to close the handle . Thus you can set the limit to zero and it will
last used . The server will check two seconds later that either it
now ( ) , even if there is just one file handle open , a limit of 1 ,
and one client , it is certain that when the client calculates the
and sync'd first ) but the state is maintained . The handle will be
blocks until a file descriptor is available . release_on_death/1
takes a pid and monitors the pid , reducing the count by 1 when the
pid dies . Thus the assumption is that obtain/0 is called first , and
when that returns , release_on_death/1 is called with the pid who
-behaviour(gen_server).
-export([register_callback/3]).
-export([open/3, close/1, read/2, append/2, sync/1, position/2, truncate/1,
last_sync_offset/1, current_virtual_offset/1, current_raw_offset/1,
flush/1, copy/3, set_maximum_since_use/1, delete/1, clear/1]).
-export([release_on_death/1, obtain/0]).
-export([start_link/0, init/1, handle_call/3, handle_cast/2, handle_info/2,
terminate/2, code_change/3]).
-define(SERVER, ?MODULE).
-define(RESERVED_FOR_OTHERS, 100).
-define(FILE_HANDLES_LIMIT_WINDOWS, 10000000).
-define(FILE_HANDLES_LIMIT_OTHER, 1024).
-define(FILE_HANDLES_CHECK_INTERVAL, 2000).
-record(file,
{ reader_count,
has_writer
}).
-record(handle,
{ hdl,
offset,
trusted_offset,
is_dirty,
write_buffer_size,
write_buffer_size_limit,
write_buffer,
at_eof,
path,
mode,
options,
is_write,
is_read,
last_used_at
}).
-record(fhc_state,
{ elders,
limit,
count,
obtains,
callbacks,
client_mrefs,
timer_ref
}).
Specs
-ifdef(use_specs).
-type(ref() :: any()).
-type(error() :: {'error', any()}).
-type(ok_or_error() :: ('ok' | error())).
-type(val_or_error(T) :: ({'ok', T} | error())).
-type(position() :: ('bof' | 'eof' | non_neg_integer() |
{('bof' |'eof'), non_neg_integer()} | {'cur', integer()})).
-type(offset() :: non_neg_integer()).
-spec(register_callback/3 :: (atom(), atom(), [any()]) -> 'ok').
-spec(open/3 ::
(string(), [any()],
[{'write_buffer', (non_neg_integer() | 'infinity' | 'unbuffered')}]) ->
val_or_error(ref())).
-spec(close/1 :: (ref()) -> ok_or_error()).
-spec(read/2 :: (ref(), non_neg_integer()) ->
val_or_error([char()] | binary()) | 'eof').
-spec(append/2 :: (ref(), iodata()) -> ok_or_error()).
-spec(sync/1 :: (ref()) -> ok_or_error()).
-spec(position/2 :: (ref(), position()) -> val_or_error(offset())).
-spec(truncate/1 :: (ref()) -> ok_or_error()).
-spec(last_sync_offset/1 :: (ref()) -> val_or_error(offset())).
-spec(current_virtual_offset/1 :: (ref()) -> val_or_error(offset())).
-spec(current_raw_offset/1 :: (ref()) -> val_or_error(offset())).
-spec(flush/1 :: (ref()) -> ok_or_error()).
-spec(copy/3 :: (ref(), ref(), non_neg_integer()) ->
val_or_error(non_neg_integer())).
-spec(set_maximum_since_use/1 :: (non_neg_integer()) -> 'ok').
-spec(delete/1 :: (ref()) -> ok_or_error()).
-spec(clear/1 :: (ref()) -> ok_or_error()).
-spec(release_on_death/1 :: (pid()) -> 'ok').
-spec(obtain/0 :: () -> 'ok').
-endif.
start_link() ->
gen_server:start_link({local, ?SERVER}, ?MODULE, [], [{timeout, infinity}]).
register_callback(M, F, A)
when is_atom(M) andalso is_atom(F) andalso is_list(A) ->
gen_server:cast(?SERVER, {register_callback, self(), {M, F, A}}).
open(Path, Mode, Options) ->
Path1 = filename:absname(Path),
File1 = #file { reader_count = RCount, has_writer = HasWriter } =
case get({Path1, fhc_file}) of
File = #file {} -> File;
undefined -> #file { reader_count = 0,
has_writer = false }
end,
Mode1 = append_to_write(Mode),
IsWriter = is_writer(Mode1),
case IsWriter andalso HasWriter of
true -> {error, writer_exists};
false -> Ref = make_ref(),
case open1(Path1, Mode1, Options, Ref, bof, new) of
{ok, _Handle} ->
RCount1 = case is_reader(Mode1) of
true -> RCount + 1;
false -> RCount
end,
HasWriter1 = HasWriter orelse IsWriter,
put({Path1, fhc_file},
File1 #file { reader_count = RCount1,
has_writer = HasWriter1 }),
{ok, Ref};
Error ->
Error
end
end.
close(Ref) ->
case erase({Ref, fhc_handle}) of
undefined -> ok;
Handle -> case hard_close(Handle) of
ok -> ok;
{Error, Handle1} -> put_handle(Ref, Handle1),
Error
end
end.
read(Ref, Count) ->
with_flushed_handles(
[Ref],
fun ([#handle { is_read = false }]) ->
{error, not_open_for_reading};
([Handle = #handle { hdl = Hdl, offset = Offset }]) ->
case file:read(Hdl, Count) of
{ok, Data} = Obj -> Offset1 = Offset + iolist_size(Data),
{Obj,
[Handle #handle { offset = Offset1 }]};
eof -> {eof, [Handle #handle { at_eof = true }]};
Error -> {Error, [Handle]}
end
end).
append(Ref, Data) ->
with_handles(
[Ref],
fun ([#handle { is_write = false }]) ->
{error, not_open_for_writing};
([Handle]) ->
case maybe_seek(eof, Handle) of
{{ok, _Offset}, #handle { hdl = Hdl, offset = Offset,
write_buffer_size_limit = 0,
at_eof = true } = Handle1} ->
Offset1 = Offset + iolist_size(Data),
{file:write(Hdl, Data),
[Handle1 #handle { is_dirty = true, offset = Offset1 }]};
{{ok, _Offset}, #handle { write_buffer = WriteBuffer,
write_buffer_size = Size,
write_buffer_size_limit = Limit,
at_eof = true } = Handle1} ->
WriteBuffer1 = [Data | WriteBuffer],
Size1 = Size + iolist_size(Data),
Handle2 = Handle1 #handle { write_buffer = WriteBuffer1,
write_buffer_size = Size1 },
case Limit /= infinity andalso Size1 > Limit of
true -> {Result, Handle3} = write_buffer(Handle2),
{Result, [Handle3]};
false -> {ok, [Handle2]}
end;
{{error, _} = Error, Handle1} ->
{Error, [Handle1]}
end
end).
sync(Ref) ->
with_flushed_handles(
[Ref],
fun ([#handle { is_dirty = false, write_buffer = [] }]) ->
ok;
([Handle = #handle { hdl = Hdl, offset = Offset,
is_dirty = true, write_buffer = [] }]) ->
case file:sync(Hdl) of
ok -> {ok, [Handle #handle { trusted_offset = Offset,
is_dirty = false }]};
Error -> {Error, [Handle]}
end
end).
position(Ref, NewOffset) ->
with_flushed_handles(
[Ref],
fun ([Handle]) -> {Result, Handle1} = maybe_seek(NewOffset, Handle),
{Result, [Handle1]}
end).
truncate(Ref) ->
with_flushed_handles(
[Ref],
fun ([Handle1 = #handle { hdl = Hdl, offset = Offset,
trusted_offset = TOffset }]) ->
case file:truncate(Hdl) of
ok -> TOffset1 = lists:min([Offset, TOffset]),
{ok, [Handle1 #handle { trusted_offset = TOffset1,
at_eof = true }]};
Error -> {Error, [Handle1]}
end
end).
last_sync_offset(Ref) ->
with_handles([Ref], fun ([#handle { trusted_offset = TOffset }]) ->
{ok, TOffset}
end).
current_virtual_offset(Ref) ->
with_handles([Ref], fun ([#handle { at_eof = true, is_write = true,
offset = Offset,
write_buffer_size = Size }]) ->
{ok, Offset + Size};
([#handle { offset = Offset }]) ->
{ok, Offset}
end).
current_raw_offset(Ref) ->
with_handles([Ref], fun ([Handle]) -> {ok, Handle #handle.offset} end).
flush(Ref) ->
with_flushed_handles([Ref], fun ([Handle]) -> {ok, [Handle]} end).
copy(Src, Dest, Count) ->
with_flushed_handles(
[Src, Dest],
fun ([SHandle = #handle { is_read = true, hdl = SHdl, offset = SOffset },
DHandle = #handle { is_write = true, hdl = DHdl, offset = DOffset }]
) ->
case file:copy(SHdl, DHdl, Count) of
{ok, Count1} = Result1 ->
{Result1,
[SHandle #handle { offset = SOffset + Count1 },
DHandle #handle { offset = DOffset + Count1 }]};
Error ->
{Error, [SHandle, DHandle]}
end;
(_Handles) ->
{error, incorrect_handle_modes}
end).
delete(Ref) ->
case erase({Ref, fhc_handle}) of
undefined ->
ok;
Handle = #handle { path = Path } ->
case hard_close(Handle #handle { is_dirty = false,
write_buffer = [] }) of
ok -> file:delete(Path);
{Error, Handle1} -> put_handle(Ref, Handle1),
Error
end
end.
clear(Ref) ->
with_handles(
[Ref],
fun ([#handle { at_eof = true, write_buffer_size = 0, offset = 0 }]) ->
ok;
([Handle]) ->
case maybe_seek(bof, Handle #handle { write_buffer = [],
write_buffer_size = 0 }) of
{{ok, 0}, Handle1 = #handle { hdl = Hdl }} ->
case file:truncate(Hdl) of
ok -> {ok, [Handle1 #handle {trusted_offset = 0,
at_eof = true }]};
Error -> {Error, [Handle1]}
end;
{{error, _} = Error, Handle1} ->
{Error, [Handle1]}
end
end).
set_maximum_since_use(MaximumAge) ->
Now = now(),
case lists:foldl(
fun ({{Ref, fhc_handle},
Handle = #handle { hdl = Hdl, last_used_at = Then }}, Rep) ->
Age = timer:now_diff(Now, Then),
case Hdl /= closed andalso Age >= MaximumAge of
true -> {Res, Handle1} = soft_close(Handle),
case Res of
ok -> put({Ref, fhc_handle}, Handle1),
false;
_ -> put_handle(Ref, Handle1),
Rep
end;
false -> Rep
end;
(_KeyValuePair, Rep) ->
Rep
end, true, get()) of
true -> age_tree_change(), ok;
false -> ok
end.
release_on_death(Pid) when is_pid(Pid) ->
gen_server:cast(?SERVER, {release_on_death, Pid}).
obtain() ->
gen_server:call(?SERVER, obtain, infinity).
Internal functions
is_reader(Mode) -> lists:member(read, Mode).
is_writer(Mode) -> lists:member(write, Mode).
append_to_write(Mode) ->
case lists:member(append, Mode) of
true -> [write | Mode -- [append, write]];
false -> Mode
end.
with_handles(Refs, Fun) ->
ResHandles = lists:foldl(
fun (Ref, {ok, HandlesAcc}) ->
case get_or_reopen(Ref) of
{ok, Handle} -> {ok, [Handle | HandlesAcc]};
Error -> Error
end;
(_Ref, Error) ->
Error
end, {ok, []}, Refs),
case ResHandles of
{ok, Handles} ->
case Fun(lists:reverse(Handles)) of
{Result, Handles1} when is_list(Handles1) ->
lists:zipwith(fun put_handle/2, Refs, Handles1),
Result;
Result ->
Result
end;
Error ->
Error
end.
with_flushed_handles(Refs, Fun) ->
with_handles(
Refs,
fun (Handles) ->
case lists:foldl(
fun (Handle, {ok, HandlesAcc}) ->
{Res, Handle1} = write_buffer(Handle),
{Res, [Handle1 | HandlesAcc]};
(Handle, {Error, HandlesAcc}) ->
{Error, [Handle | HandlesAcc]}
end, {ok, []}, Handles) of
{ok, Handles1} ->
Fun(lists:reverse(Handles1));
{Error, Handles1} ->
{Error, lists:reverse(Handles1)}
end
end).
get_or_reopen(Ref) ->
case get({Ref, fhc_handle}) of
undefined ->
{error, not_open, Ref};
#handle { hdl = closed, offset = Offset,
path = Path, mode = Mode, options = Options } ->
open1(Path, Mode, Options, Ref, Offset, reopen);
Handle ->
{ok, Handle}
end.
put_handle(Ref, Handle = #handle { last_used_at = Then }) ->
Now = now(),
age_tree_update(Then, Now, Ref),
put({Ref, fhc_handle}, Handle #handle { last_used_at = Now }).
with_age_tree(Fun) ->
put(fhc_age_tree, Fun(case get(fhc_age_tree) of
undefined -> gb_trees:empty();
AgeTree -> AgeTree
end)).
age_tree_insert(Now, Ref) ->
with_age_tree(
fun (Tree) ->
Tree1 = gb_trees:insert(Now, Ref, Tree),
{Oldest, _Ref} = gb_trees:smallest(Tree1),
gen_server:cast(?SERVER, {open, self(), Oldest}),
Tree1
end).
age_tree_update(Then, Now, Ref) ->
with_age_tree(
fun (Tree) ->
gb_trees:insert(Now, Ref, gb_trees:delete_any(Then, Tree))
end).
age_tree_delete(Then) ->
with_age_tree(
fun (Tree) ->
Tree1 = gb_trees:delete_any(Then, Tree),
Oldest = case gb_trees:is_empty(Tree1) of
true ->
undefined;
false ->
{Oldest1, _Ref} = gb_trees:smallest(Tree1),
Oldest1
end,
gen_server:cast(?SERVER, {close, self(), Oldest}),
Tree1
end).
age_tree_change() ->
with_age_tree(
fun (Tree) ->
case gb_trees:is_empty(Tree) of
true -> Tree;
false -> {Oldest, _Ref} = gb_trees:smallest(Tree),
gen_server:cast(?SERVER, {update, self(), Oldest})
end,
Tree
end).
open1(Path, Mode, Options, Ref, Offset, NewOrReopen) ->
Mode1 = case NewOrReopen of
new -> Mode;
reopen -> [read | Mode]
end,
case file:open(Path, Mode1) of
{ok, Hdl} ->
WriteBufferSize =
case proplists:get_value(write_buffer, Options, unbuffered) of
unbuffered -> 0;
infinity -> infinity;
N when is_integer(N) -> N
end,
Now = now(),
Handle = #handle { hdl = Hdl,
offset = 0,
trusted_offset = 0,
is_dirty = false,
write_buffer_size = 0,
write_buffer_size_limit = WriteBufferSize,
write_buffer = [],
at_eof = false,
path = Path,
mode = Mode,
options = Options,
is_write = is_writer(Mode),
is_read = is_reader(Mode),
last_used_at = Now },
{{ok, Offset1}, Handle1} = maybe_seek(Offset, Handle),
Handle2 = Handle1 #handle { trusted_offset = Offset1 },
put({Ref, fhc_handle}, Handle2),
age_tree_insert(Now, Ref),
{ok, Handle2};
{error, Reason} ->
{error, Reason}
end.
soft_close(Handle = #handle { hdl = closed }) ->
{ok, Handle};
soft_close(Handle) ->
case write_buffer(Handle) of
{ok, #handle { hdl = Hdl, offset = Offset, is_dirty = IsDirty,
last_used_at = Then } = Handle1 } ->
ok = case IsDirty of
true -> file:sync(Hdl);
false -> ok
end,
ok = file:close(Hdl),
age_tree_delete(Then),
{ok, Handle1 #handle { hdl = closed, trusted_offset = Offset,
is_dirty = false }};
{_Error, _Handle} = Result ->
Result
end.
hard_close(Handle) ->
case soft_close(Handle) of
{ok, #handle { path = Path,
is_read = IsReader, is_write = IsWriter }} ->
#file { reader_count = RCount, has_writer = HasWriter } = File =
get({Path, fhc_file}),
RCount1 = case IsReader of
true -> RCount - 1;
false -> RCount
end,
HasWriter1 = HasWriter andalso not IsWriter,
case RCount1 =:= 0 andalso not HasWriter1 of
true -> erase({Path, fhc_file});
false -> put({Path, fhc_file},
File #file { reader_count = RCount1,
has_writer = HasWriter1 })
end,
ok;
{_Error, _Handle} = Result ->
Result
end.
maybe_seek(NewOffset, Handle = #handle { hdl = Hdl, offset = Offset,
at_eof = AtEoF }) ->
{AtEoF1, NeedsSeek} = needs_seek(AtEoF, Offset, NewOffset),
case (case NeedsSeek of
true -> file:position(Hdl, NewOffset);
false -> {ok, Offset}
end) of
{ok, Offset1} = Result ->
{Result, Handle #handle { offset = Offset1, at_eof = AtEoF1 }};
{error, _} = Error ->
{Error, Handle}
end.
needs_seek( AtEoF, _CurOffset, cur ) -> {AtEoF, false};
needs_seek( AtEoF, _CurOffset, {cur, 0}) -> {AtEoF, false};
needs_seek( true, _CurOffset, eof ) -> {true , false};
needs_seek( true, _CurOffset, {eof, 0}) -> {true , false};
needs_seek( false, _CurOffset, eof ) -> {true , true };
needs_seek( false, _CurOffset, {eof, 0}) -> {true , true };
needs_seek( AtEoF, 0, bof ) -> {AtEoF, false};
needs_seek( AtEoF, 0, {bof, 0}) -> {AtEoF, false};
needs_seek( AtEoF, CurOffset, CurOffset) -> {AtEoF, false};
needs_seek( true, CurOffset, {bof, DesiredOffset})
when DesiredOffset >= CurOffset ->
{true, true};
needs_seek( true, _CurOffset, {cur, DesiredOffset})
when DesiredOffset > 0 ->
{true, true};
when is_integer(DesiredOffset) andalso DesiredOffset >= CurOffset ->
{true, true};
because we ca n't really track size , we could well end up at EoF and not know
needs_seek(_AtEoF, _CurOffset, _DesiredOffset) ->
{false, true}.
write_buffer(Handle = #handle { write_buffer = [] }) ->
{ok, Handle};
write_buffer(Handle = #handle { hdl = Hdl, offset = Offset,
write_buffer = WriteBuffer,
write_buffer_size = DataSize,
at_eof = true }) ->
case file:write(Hdl, lists:reverse(WriteBuffer)) of
ok ->
Offset1 = Offset + DataSize,
{ok, Handle #handle { offset = Offset1, is_dirty = true,
write_buffer = [], write_buffer_size = 0 }};
{error, _} = Error ->
{Error, Handle}
end.
init([]) ->
Limit = case application:get_env(file_handles_high_watermark) of
{ok, Watermark} when (is_integer(Watermark) andalso
Watermark > 0) ->
Watermark;
_ ->
ulimit()
end,
error_logger:info_msg("Limiting to approx ~p file handles~n", [Limit]),
{ok, #fhc_state { elders = dict:new(), limit = Limit, count = 0,
obtains = [], callbacks = dict:new(),
client_mrefs = dict:new(), timer_ref = undefined }}.
handle_call(obtain, From, State = #fhc_state { count = Count }) ->
State1 = #fhc_state { count = Count1, limit = Limit, obtains = Obtains } =
maybe_reduce(State #fhc_state { count = Count + 1 }),
case Limit /= infinity andalso Count1 >= Limit of
true -> {noreply, State1 #fhc_state { obtains = [From | Obtains],
count = Count1 - 1 }};
false -> {reply, ok, State1}
end.
handle_cast({register_callback, Pid, MFA},
State = #fhc_state { callbacks = Callbacks }) ->
{noreply, ensure_mref(
Pid, State #fhc_state {
callbacks = dict:store(Pid, MFA, Callbacks) })};
handle_cast({open, Pid, EldestUnusedSince}, State =
#fhc_state { elders = Elders, count = Count }) ->
Elders1 = dict:store(Pid, EldestUnusedSince, Elders),
{noreply, maybe_reduce(
ensure_mref(Pid, State #fhc_state { elders = Elders1,
count = Count + 1 }))};
handle_cast({update, Pid, EldestUnusedSince}, State =
#fhc_state { elders = Elders }) ->
Elders1 = dict:store(Pid, EldestUnusedSince, Elders),
{noreply, ensure_mref(Pid, State #fhc_state { elders = Elders1 })};
handle_cast({close, Pid, EldestUnusedSince}, State =
#fhc_state { elders = Elders, count = Count }) ->
Elders1 = case EldestUnusedSince of
undefined -> dict:erase(Pid, Elders);
_ -> dict:store(Pid, EldestUnusedSince, Elders)
end,
{noreply, process_obtains(
ensure_mref(Pid, State #fhc_state { elders = Elders1,
count = Count - 1 }))};
handle_cast(check_counts, State) ->
{noreply, maybe_reduce(State #fhc_state { timer_ref = undefined })};
handle_cast({release_on_death, Pid}, State) ->
_MRef = erlang:monitor(process, Pid),
{noreply, State}.
handle_info({'DOWN', MRef, process, Pid, _Reason}, State =
#fhc_state { count = Count, callbacks = Callbacks,
client_mrefs = ClientMRefs, elders = Elders }) ->
{noreply, process_obtains(
case dict:find(Pid, ClientMRefs) of
{ok, MRef} -> State #fhc_state {
elders = dict:erase(Pid, Elders),
client_mrefs = dict:erase(Pid, ClientMRefs),
callbacks = dict:erase(Pid, Callbacks) };
_ -> State #fhc_state { count = Count - 1 }
end)}.
terminate(_Reason, State) ->
State.
code_change(_OldVsn, State, _Extra) ->
{ok, State}.
process_obtains(State = #fhc_state { obtains = [] }) ->
State;
process_obtains(State = #fhc_state { limit = Limit, count = Count })
when Limit /= infinity andalso Count >= Limit ->
State;
process_obtains(State = #fhc_state { limit = Limit, count = Count,
obtains = Obtains }) ->
ObtainsLen = length(Obtains),
ObtainableLen = lists:min([ObtainsLen, Limit - Count]),
Take = ObtainsLen - ObtainableLen,
{ObtainsNew, ObtainableRev} = lists:split(Take, Obtains),
[gen_server:reply(From, ok) || From <- ObtainableRev],
State #fhc_state { count = Count + ObtainableLen, obtains = ObtainsNew }.
maybe_reduce(State = #fhc_state { limit = Limit, count = Count, elders = Elders,
callbacks = Callbacks, timer_ref = TRef })
when Limit /= infinity andalso Count >= Limit ->
Now = now(),
{Pids, Sum, ClientCount} =
dict:fold(fun (_Pid, undefined, Accs) ->
Accs;
(Pid, Eldest, {PidsAcc, SumAcc, CountAcc}) ->
{[Pid|PidsAcc], SumAcc + timer:now_diff(Now, Eldest),
CountAcc + 1}
end, {[], 0, 0}, Elders),
case Pids of
[] -> ok;
_ -> AverageAge = Sum / ClientCount,
lists:foreach(
fun (Pid) ->
case dict:find(Pid, Callbacks) of
error -> ok;
{ok, {M, F, A}} -> apply(M, F, A ++ [AverageAge])
end
end, Pids)
end,
case TRef of
undefined -> {ok, TRef1} = timer:apply_after(
?FILE_HANDLES_CHECK_INTERVAL,
gen_server, cast, [?SERVER, check_counts]),
State #fhc_state { timer_ref = TRef1 };
_ -> State
end;
maybe_reduce(State) ->
State.
Googling around suggests that Windows has a limit somewhere around
16 M , eg
suggests that BSDs ( incl OS X ) , solaris and linux all agree that
ulimit() ->
case os:type() of
{win32, _OsName} ->
?FILE_HANDLES_LIMIT_WINDOWS;
{unix, _OsName} ->
Under Linux , Solaris and FreeBSD , ulimit is a shell
builtin , not a command . In OS X , it 's a command .
case os:cmd("ulimit -n") of
"unlimited" ->
infinity;
String = [C|_] when $0 =< C andalso C =< $9 ->
Num = list_to_integer(
lists:takewhile(
fun (D) -> $0 =< D andalso D =< $9 end, String)) -
?RESERVED_FOR_OTHERS,
lists:max([1, Num]);
_ ->
" /bin / sh : line 1 : ulimit : command not found\n "
?FILE_HANDLES_LIMIT_OTHER - ?RESERVED_FOR_OTHERS
end;
_ ->
?FILE_HANDLES_LIMIT_OTHER - ?RESERVED_FOR_OTHERS
end.
ensure_mref(Pid, State = #fhc_state { client_mrefs = ClientMRefs }) ->
case dict:find(Pid, ClientMRefs) of
{ok, _MRef} -> State;
error -> MRef = erlang:monitor(process, Pid),
State #fhc_state {
client_mrefs = dict:store(Pid, MRef, ClientMRefs) }
end.
|
cbec615018c11fb07b481776956cdc13ba0b59f4cf2e2ee4d040a7022b8fce0f | binaryage/cljs-devtools | main.cljs | (ns devtools.main
(:require [devtools.core]
[devtools.dead-code.core]))
| null | https://raw.githubusercontent.com/binaryage/cljs-devtools/d07fc6d404479b1ddd32cecc105009de77e3cba7/test/src/dead-code-no-mention/devtools/main.cljs | clojure | (ns devtools.main
(:require [devtools.core]
[devtools.dead-code.core]))
| |
6f2632712bb7b8536f54cc689af6d0d8cc527a2b1fe7642098a7c7b91a4e2360 | mfp/oraft | oraft.mli | * Implentation of the RAFT consensus algorithm .
*
* Refer to
* " In Search of an Understandable Consensus Algorithm " , and * Ousterhout , Stanford University . ( Draft of October 7 , 2013 ) .
* [ ]
*
*
* Refer to
* "In Search of an Understandable Consensus Algorithm", Diego Ongaro and John
* Ousterhout, Stanford University. (Draft of October 7, 2013).
* []
* *)
module Types :
sig
type status = Leader | Follower | Candidate
type term = Int64.t
type index = Int64.t
type rep_id = string
type client_id = string
type req_id = client_id * Int64.t
type address = string
type config =
Simple_config of simple_config * passive_peers
| Joint_config of simple_config * simple_config * passive_peers
and simple_config = (rep_id * address) list
and passive_peers = (rep_id * address) list
type 'a message =
Request_vote of request_vote
| Vote_result of vote_result
| Append_entries of 'a append_entries
| Append_result of append_result
| Ping of ping
| Pong of ping
and request_vote = {
term : term;
candidate_id : rep_id;
last_log_index : index;
last_log_term : term;
}
and vote_result = {
term : term;
vote_granted : bool;
}
and 'a append_entries = {
term : term;
leader_id : rep_id;
prev_log_index : index;
prev_log_term : term;
entries : (index * ('a entry * term)) list;
leader_commit : index;
}
and 'a entry = Nop | Op of 'a | Config of config
and append_result =
{
term : term;
result : actual_append_result;
}
and actual_append_result =
Append_success of index (* last log entry included in msg we respond to *)
| Append_failure of index (* index of log entry preceding those in
message we respond to *)
and ping = { term : term; n : Int64.t; }
type 'a action =
Apply of (index * 'a * term) list
| Become_candidate
| Become_follower of rep_id option
| Become_leader
| Changed_config
| Exec_readonly of Int64.t
| Redirect of rep_id option * 'a
| Reset_election_timeout
| Reset_heartbeat
| Send of rep_id * address * 'a message
| Send_snapshot of rep_id * address * index * config
| Stop
end
module Core :
sig
open Types
type 'a state
val make :
id:rep_id -> current_term:term -> voted_for:rep_id option ->
log:(index * 'a entry * term) list ->
config:config -> unit -> 'a state
val is_single_node_cluster : 'a state -> bool
val leader_id : 'a state -> rep_id option
val id : 'a state -> rep_id
val status : 'a state -> status
val config : 'a state -> config
val committed_config : 'a state -> config
val last_index : 'a state -> index
val last_term : 'a state -> term
val peers : 'a state -> (rep_id * address) list
val receive_msg :
'a state -> rep_id -> 'a message -> 'a state * 'a action list
val election_timeout : 'a state -> 'a state * 'a action list
val heartbeat_timeout : 'a state -> 'a state * 'a action list
val client_command : 'a -> 'a state -> 'a state * 'a action list
* @return [ ( state , None ) ] if the node is not the leader ,
* [ ( state , Some ( i d , actions ) ) ] otherwise , where [ i d ] identifies the
* requested read - only operation , which can be executed once an
* [ Exec_readonly m ] action with [ m > = i d ] is returned within the same term
* ( i.e. , with no intermediate [ Become_candidate ] , [ Become_follower ] or
* [ Become_leader ] ) .
* [(state, Some (id, actions))] otherwise, where [id] identifies the
* requested read-only operation, which can be executed once an
* [Exec_readonly m] action with [m >= id] is returned within the same term
* (i.e., with no intermediate [Become_candidate], [Become_follower] or
* [Become_leader]). *)
val readonly_operation :
'a state -> 'a state * (Int64.t * 'a action list) option
val snapshot_sent :
rep_id -> last_index:index -> 'a state -> ('a state * 'a action list)
val snapshot_send_failed : rep_id -> 'a state -> ('a state * 'a action list)
val install_snapshot :
last_term:term -> last_index:index -> config:config -> 'a state ->
'a state * bool
val compact_log : index -> 'a state -> 'a state
module Config :
sig
type 'a result =
[
| `Already_changed
| `Cannot_change
| `Change_in_process
| `Redirect of (rep_id * address) option
| `Start_change of 'a state
| `Unsafe_change of simple_config * passive_peers
]
val add_failover : rep_id -> address -> 'a state -> 'a result
val remove_failover : rep_id -> 'a state -> 'a result
val decommission : rep_id -> 'a state -> 'a result
val demote : rep_id -> 'a state -> 'a result
val promote : rep_id -> 'a state -> 'a result
val replace : replacee:rep_id -> failover:rep_id -> 'a state -> 'a result
end
end
| null | https://raw.githubusercontent.com/mfp/oraft/cf7352eb8f1324717d47dc294a058c857ebef9eb/src/oraft.mli | ocaml | last log entry included in msg we respond to
index of log entry preceding those in
message we respond to | * Implentation of the RAFT consensus algorithm .
*
* Refer to
* " In Search of an Understandable Consensus Algorithm " , and * Ousterhout , Stanford University . ( Draft of October 7 , 2013 ) .
* [ ]
*
*
* Refer to
* "In Search of an Understandable Consensus Algorithm", Diego Ongaro and John
* Ousterhout, Stanford University. (Draft of October 7, 2013).
* []
* *)
module Types :
sig
type status = Leader | Follower | Candidate
type term = Int64.t
type index = Int64.t
type rep_id = string
type client_id = string
type req_id = client_id * Int64.t
type address = string
type config =
Simple_config of simple_config * passive_peers
| Joint_config of simple_config * simple_config * passive_peers
and simple_config = (rep_id * address) list
and passive_peers = (rep_id * address) list
type 'a message =
Request_vote of request_vote
| Vote_result of vote_result
| Append_entries of 'a append_entries
| Append_result of append_result
| Ping of ping
| Pong of ping
and request_vote = {
term : term;
candidate_id : rep_id;
last_log_index : index;
last_log_term : term;
}
and vote_result = {
term : term;
vote_granted : bool;
}
and 'a append_entries = {
term : term;
leader_id : rep_id;
prev_log_index : index;
prev_log_term : term;
entries : (index * ('a entry * term)) list;
leader_commit : index;
}
and 'a entry = Nop | Op of 'a | Config of config
and append_result =
{
term : term;
result : actual_append_result;
}
and actual_append_result =
and ping = { term : term; n : Int64.t; }
type 'a action =
Apply of (index * 'a * term) list
| Become_candidate
| Become_follower of rep_id option
| Become_leader
| Changed_config
| Exec_readonly of Int64.t
| Redirect of rep_id option * 'a
| Reset_election_timeout
| Reset_heartbeat
| Send of rep_id * address * 'a message
| Send_snapshot of rep_id * address * index * config
| Stop
end
module Core :
sig
open Types
type 'a state
val make :
id:rep_id -> current_term:term -> voted_for:rep_id option ->
log:(index * 'a entry * term) list ->
config:config -> unit -> 'a state
val is_single_node_cluster : 'a state -> bool
val leader_id : 'a state -> rep_id option
val id : 'a state -> rep_id
val status : 'a state -> status
val config : 'a state -> config
val committed_config : 'a state -> config
val last_index : 'a state -> index
val last_term : 'a state -> term
val peers : 'a state -> (rep_id * address) list
val receive_msg :
'a state -> rep_id -> 'a message -> 'a state * 'a action list
val election_timeout : 'a state -> 'a state * 'a action list
val heartbeat_timeout : 'a state -> 'a state * 'a action list
val client_command : 'a -> 'a state -> 'a state * 'a action list
* @return [ ( state , None ) ] if the node is not the leader ,
* [ ( state , Some ( i d , actions ) ) ] otherwise , where [ i d ] identifies the
* requested read - only operation , which can be executed once an
* [ Exec_readonly m ] action with [ m > = i d ] is returned within the same term
* ( i.e. , with no intermediate [ Become_candidate ] , [ Become_follower ] or
* [ Become_leader ] ) .
* [(state, Some (id, actions))] otherwise, where [id] identifies the
* requested read-only operation, which can be executed once an
* [Exec_readonly m] action with [m >= id] is returned within the same term
* (i.e., with no intermediate [Become_candidate], [Become_follower] or
* [Become_leader]). *)
val readonly_operation :
'a state -> 'a state * (Int64.t * 'a action list) option
val snapshot_sent :
rep_id -> last_index:index -> 'a state -> ('a state * 'a action list)
val snapshot_send_failed : rep_id -> 'a state -> ('a state * 'a action list)
val install_snapshot :
last_term:term -> last_index:index -> config:config -> 'a state ->
'a state * bool
val compact_log : index -> 'a state -> 'a state
module Config :
sig
type 'a result =
[
| `Already_changed
| `Cannot_change
| `Change_in_process
| `Redirect of (rep_id * address) option
| `Start_change of 'a state
| `Unsafe_change of simple_config * passive_peers
]
val add_failover : rep_id -> address -> 'a state -> 'a result
val remove_failover : rep_id -> 'a state -> 'a result
val decommission : rep_id -> 'a state -> 'a result
val demote : rep_id -> 'a state -> 'a result
val promote : rep_id -> 'a state -> 'a result
val replace : replacee:rep_id -> failover:rep_id -> 'a state -> 'a result
end
end
|
3e862bfdb97235edf7eedfcfdefd52cbb38bb13b696be671a27f0b6e1163373a | Elzair/nazghul | player.scm | ;;============================================================================
This defines the gob for the player party in haxima .
;;----------------------------------------------------------------------------
Generic functions
(define (player-has? key)
(not (null? (tbl-get (gob (kern-get-player)) key))))
(define (player-get key)
(tbl-get (gob (kern-get-player)) key))
(define (player-set! key val)
( tbl - set ! ( gob ( kern - get - player ) ) key ) )
(tbl-set! (gob (kern-get-player)) key val))
;; Update older versions of saved games with new player gob fields.
(define (player-reconcile-gob kplayer)
(if (not (player-has? 'rep))
(player-set! 'rep 0))
(if (not (player-has? 'rapsheet))
(player-set! 'rapsheet '()))
;; update the dtable
(kern-mk-dtable
non pla men cgb acc mon tro spd out
(list 2 0 0 0 -1 -2 -2 -2 0 -2 -2 0 0 0 0) ;; none
(list 0 2 2 -2 -2 -2 -2 -2 -2 -2 -2 -2 2 2 2) ;; player
(list -1 2 2 -1 -2 -2 -2 -2 -2 -2 -2 -2 2 2 -2) ;; men
(list -1 -2 -2 2 -1 -2 0 -2 -2 -1 -2 -2 0 -2 -2) ;; cave goblin
(list -1 -2 -1 -1 2 -2 -1 -1 -2 -1 -2 -2 0 -2 -2) ;; accursed
(list -2 -2 -2 -2 -2 2 -2 0 -2 0 -2 0 0 -2 -2) ;; monsters
(list -2 -2 -2 0 -1 -2 2 -2 -2 -1 -2 -1 0 -2 -2) ;; hill trolls
(list -2 -2 -2 -2 -1 0 -2 2 -2 -1 -2 0 0 -2 -2) ;; wood spiders
(list 0 -2 -2 -2 -2 -2 -2 -2 2 -2 -2 -1 0 -2 0) ;; outlaws
(list -2 -2 -2 -1 -1 0 -1 -1 -2 2 -2 -1 0 -2 -2) ;; gint
(list -2 -2 -2 -2 -2 -2 -2 -2 -2 -2 2 -2 0 -2 -2) ;; demon
(list 0 -2 -2 -2 -2 0 -2 0 -1 -1 -2 2 0 -2 -2) ;; forest goblin
(list 0 2 2 0 0 0 0 0 0 0 0 0 2 2 2) ;; prisoners
(list -1 2 2 -1 -2 -2 -2 -2 -2 -2 -2 -2 2 2 -2) ;; glasdrin
(list 0 2 -2 -2 -2 -2 -2 -2 0 -2 -2 -2 2 -2 2) ;; player-outlaw
))
(kern-add-hook 'new_game_start_hook 'player-reconcile-gob)
;;----------------------------------------------------------------------------
Specialized queries
(define (player-found-warritrix?)
(not (null? (quest-data-getvalue 'questentry-warritrix 'found))))
(define (player-stewardess-trial-done?)
(not (null? (quest-data-getvalue 'questentry-warritrix 'avenged))))
| null | https://raw.githubusercontent.com/Elzair/nazghul/8f3a45ed6289cd9f469c4ff618d39366f2fbc1d8/worlds/haxima-1.002/player.scm | scheme | ============================================================================
----------------------------------------------------------------------------
Update older versions of saved games with new player gob fields.
update the dtable
none
player
men
cave goblin
accursed
monsters
hill trolls
wood spiders
outlaws
gint
demon
forest goblin
prisoners
glasdrin
player-outlaw
---------------------------------------------------------------------------- | This defines the gob for the player party in haxima .
Generic functions
(define (player-has? key)
(not (null? (tbl-get (gob (kern-get-player)) key))))
(define (player-get key)
(tbl-get (gob (kern-get-player)) key))
(define (player-set! key val)
( tbl - set ! ( gob ( kern - get - player ) ) key ) )
(tbl-set! (gob (kern-get-player)) key val))
(define (player-reconcile-gob kplayer)
(if (not (player-has? 'rep))
(player-set! 'rep 0))
(if (not (player-has? 'rapsheet))
(player-set! 'rapsheet '()))
(kern-mk-dtable
non pla men cgb acc mon tro spd out
))
(kern-add-hook 'new_game_start_hook 'player-reconcile-gob)
Specialized queries
(define (player-found-warritrix?)
(not (null? (quest-data-getvalue 'questentry-warritrix 'found))))
(define (player-stewardess-trial-done?)
(not (null? (quest-data-getvalue 'questentry-warritrix 'avenged))))
|
bf76042628510e082c149a71ed6e83bf08d91ee3e79794b1f5ba0d0e9c945b63 | ulises/sliver | util.clj | (ns sliver.util
(:require [bytebuffer.buff :refer [pack byte-buffer]])
(:import [java.nio ByteBuffer]
[java.security MessageDigest]))
(defn flip-pack
[size fmt bytes-seq]
(.flip ^ByteBuffer (apply pack (byte-buffer size) fmt bytes-seq)))
(defn gen-challenge []
(rand-int Integer/MAX_VALUE))
(defn- md5 [^String string]
(let [^MessageDigest md5 (MessageDigest/getInstance "MD5")]
(do (.reset md5)
(.update md5 (.getBytes string))
(.digest md5))))
(defn digest
[challenge cookie]
(let [dig (md5 (str cookie (String/valueOf challenge)))]
(take 16 dig)))
(defn maybe-split [name]
(clojure.string/split name #"@"))
(defn plain-name [{:keys [node-name] :as maybe-name}]
"Strips the @IP/host part of a node name"
(if node-name
(first (maybe-split node-name))
(first (maybe-split maybe-name))))
(defn fqdn [{:keys [node-name host]}]
(str node-name "@" host))
(defn register-shutdown [node name]
(swap! (:state node) update-in
[:shutdown-notify] conj name))
(defn writer-name [other-node]
(symbol (str (plain-name other-node) "-writer")))
(defn reader-name [other-node]
(symbol (str (plain-name other-node) "-reader")))
| null | https://raw.githubusercontent.com/ulises/sliver/a0b600e157e41298f026e030868465b618fbd73c/src/sliver/util.clj | clojure | (ns sliver.util
(:require [bytebuffer.buff :refer [pack byte-buffer]])
(:import [java.nio ByteBuffer]
[java.security MessageDigest]))
(defn flip-pack
[size fmt bytes-seq]
(.flip ^ByteBuffer (apply pack (byte-buffer size) fmt bytes-seq)))
(defn gen-challenge []
(rand-int Integer/MAX_VALUE))
(defn- md5 [^String string]
(let [^MessageDigest md5 (MessageDigest/getInstance "MD5")]
(do (.reset md5)
(.update md5 (.getBytes string))
(.digest md5))))
(defn digest
[challenge cookie]
(let [dig (md5 (str cookie (String/valueOf challenge)))]
(take 16 dig)))
(defn maybe-split [name]
(clojure.string/split name #"@"))
(defn plain-name [{:keys [node-name] :as maybe-name}]
"Strips the @IP/host part of a node name"
(if node-name
(first (maybe-split node-name))
(first (maybe-split maybe-name))))
(defn fqdn [{:keys [node-name host]}]
(str node-name "@" host))
(defn register-shutdown [node name]
(swap! (:state node) update-in
[:shutdown-notify] conj name))
(defn writer-name [other-node]
(symbol (str (plain-name other-node) "-writer")))
(defn reader-name [other-node]
(symbol (str (plain-name other-node) "-reader")))
| |
578a83f50188953c5aa8e1c0cd58ec0243268c28f88fb36e45f7e34fb7eadddf | patrikja/AFPcourse | Families.hs | # LANGUAGE TypeFamilies , GADTs #
module Families where
type Nat = Int
The " " function can be simulated using type families
type family Silly n :: *
type instance Silly Zero = Bool
type instance Silly (Suc Zero) = Nat
type instance Silly (Suc (Suc n)) = (Silly n , Silly (Suc n))
-- But note that the argument n is a type, not a natural number.
-- Thus we cannot pattern match on it to define "silly".
The family can be " simulated " using type level
data Zero
data Suc n
data Vec a n where
Nil :: Vec a Zero
Cons :: a -> Vec a n -> Vec a (Suc n)
-- Some simple functions
head :: Vec a (Suc n) -> a
head (Cons x _) = x -- Providing a Nil case would be a type error
-- Does the definition look familiar?
vmap :: (a -> b) -> Vec a n -> Vec b n
vmap _ Nil = Nil
vmap f (Cons x xs) = Cons (f x) (vmap f xs)
(+++) :: Vec a n -> Vec a m -> Vec a (Add n m)
Nil +++ ys = ys
(Cons x xs) +++ ys = Cons x (xs +++ ys)
-- We have no predefined addition types, so we have to define it
type family Add m n :: *
type instance Add Zero m = m
type instance Add (Suc n) m = Suc (Add n m)
-- Nothing is stopping us from strange cases:
type instance Add Bool Char = Float
-- The identity type.
data Equal a b where
Refl :: Equal a a
{-
Exercise: implement Fin and vector indexing.
-}
data Fin n where
FZ :: Fin (Suc n)
FS :: Fin n -> Fin (Suc n)
-- no constructor for Fin Zero
type Fin0 = Fin Zero
type Fin1 = Fin (Suc Zero)
type Fin2 = Fin (Suc (Suc Zero))
type Fin3 = Fin (Suc (Suc (Suc Zero)))
-- Some example values
zer1 :: Fin1
zer1 = FZ
zer2, one2 :: Fin2
zer2 = FZ
one2 = FS zer1
zer3, one3, two3 :: Fin3
zer3 = FZ
one3 = FS zer2
two3 = FS one2
----------------
-- Vector indexing:
index :: Vec a n -> Fin n -> a
index (Cons x _) (FZ) = x
index (Cons _ xs) (FS m) = index xs m
index Nil _ = error "index: an empty vector has no elements"
type Four = Suc (Suc (Suc (Suc Zero)))
testVec :: Vec Int Four
testVec = Cons 1 $ Cons 7 $ Cons 3 $ Cons 8 $ Nil
test1 :: Int
test1 = index testVec (FS FZ)
-- Mention "Promotion": /~ghc/latest/docs/html/users_guide/promotion.html
| null | https://raw.githubusercontent.com/patrikja/AFPcourse/1a079ae80ba2dbb36f3f79f0fc96a502c0f670b6/L13/src/Families.hs | haskell | But note that the argument n is a type, not a natural number.
Thus we cannot pattern match on it to define "silly".
Some simple functions
Providing a Nil case would be a type error
Does the definition look familiar?
We have no predefined addition types, so we have to define it
Nothing is stopping us from strange cases:
The identity type.
Exercise: implement Fin and vector indexing.
no constructor for Fin Zero
Some example values
--------------
Vector indexing:
Mention "Promotion": /~ghc/latest/docs/html/users_guide/promotion.html | # LANGUAGE TypeFamilies , GADTs #
module Families where
type Nat = Int
The " " function can be simulated using type families
type family Silly n :: *
type instance Silly Zero = Bool
type instance Silly (Suc Zero) = Nat
type instance Silly (Suc (Suc n)) = (Silly n , Silly (Suc n))
The family can be " simulated " using type level
data Zero
data Suc n
data Vec a n where
Nil :: Vec a Zero
Cons :: a -> Vec a n -> Vec a (Suc n)
head :: Vec a (Suc n) -> a
vmap :: (a -> b) -> Vec a n -> Vec b n
vmap _ Nil = Nil
vmap f (Cons x xs) = Cons (f x) (vmap f xs)
(+++) :: Vec a n -> Vec a m -> Vec a (Add n m)
Nil +++ ys = ys
(Cons x xs) +++ ys = Cons x (xs +++ ys)
type family Add m n :: *
type instance Add Zero m = m
type instance Add (Suc n) m = Suc (Add n m)
type instance Add Bool Char = Float
data Equal a b where
Refl :: Equal a a
data Fin n where
FZ :: Fin (Suc n)
FS :: Fin n -> Fin (Suc n)
type Fin0 = Fin Zero
type Fin1 = Fin (Suc Zero)
type Fin2 = Fin (Suc (Suc Zero))
type Fin3 = Fin (Suc (Suc (Suc Zero)))
zer1 :: Fin1
zer1 = FZ
zer2, one2 :: Fin2
zer2 = FZ
one2 = FS zer1
zer3, one3, two3 :: Fin3
zer3 = FZ
one3 = FS zer2
two3 = FS one2
index :: Vec a n -> Fin n -> a
index (Cons x _) (FZ) = x
index (Cons _ xs) (FS m) = index xs m
index Nil _ = error "index: an empty vector has no elements"
type Four = Suc (Suc (Suc (Suc Zero)))
testVec :: Vec Int Four
testVec = Cons 1 $ Cons 7 $ Cons 3 $ Cons 8 $ Nil
test1 :: Int
test1 = index testVec (FS FZ)
|
933b81a6865a9254f98ab8952731ccbc632ce26c9279491149e7cd9565be85f6 | hasufell/hsfm | Dialogs.hs | -
HSFM , a written in Haskell .
Copyright ( C ) 2016
This program is free software ; you can redistribute it and/or
modify it under the terms of the GNU General Public License
version 2 as published by the Free Software Foundation .
This program is distributed in the hope that it will be useful ,
but WITHOUT ANY WARRANTY ; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the
GNU General Public License for more details .
You should have received a copy of the GNU General Public License
along with this program ; if not , write to the Free Software
Foundation , Inc. , 51 Franklin Street , Fifth Floor , Boston , MA 02110 - 1301 , USA .
-
HSFM, a filemanager written in Haskell.
Copyright (C) 2016 Julian Ospald
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
version 2 as published by the Free Software Foundation.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
--}
# LANGUAGE CPP #
# OPTIONS_HADDOCK ignore - exports #
module HSFM.GUI.Gtk.Dialogs where
import Codec.Binary.UTF8.String
(
decodeString
)
import Control.Exception
(
catches
, displayException
, throwIO
, IOException
, Handler(..)
)
import Control.Monad
(
forM
, when
, void
)
import Data.ByteString
(
ByteString
)
import qualified Data.ByteString as BS
import Data.ByteString.UTF8
(
fromString
)
import Distribution.Package
(
PackageIdentifier(..)
, packageVersion
, unPackageName
)
#if MIN_VERSION_Cabal(2,0,0)
import Distribution.Version
(
showVersion
)
#else
import Data.Version
(
showVersion
)
#endif
import Distribution.PackageDescription
(
GenericPackageDescription(..)
, PackageDescription(..)
)
#if MIN_VERSION_Cabal(2,2,0)
import Distribution.PackageDescription.Parsec
#else
import Distribution.PackageDescription.Parse
#endif
(
#if MIN_VERSION_Cabal(2,0,0)
readGenericPackageDescription,
#else
readPackageDescription,
#endif
)
import Distribution.Verbosity
(
silent
)
import Graphics.UI.Gtk
import qualified HPath as P
import HPath.IO.Errors
import HSFM.FileSystem.FileType
import HSFM.FileSystem.UtilTypes
import HSFM.GUI.Glib.GlibString()
import HSFM.GUI.Gtk.Data
import HSFM.GUI.Gtk.Errors
import Paths_hsfm
(
getDataFileName
)
import System.Glib.UTFString
(
GlibString
)
import System.Posix.FilePath
(
takeFileName
)
---------------------
--[ Dialog popups ]--
---------------------
-- |Pops up an error Dialog with the given String.
showErrorDialog :: String -> IO ()
showErrorDialog str = do
errorDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageError
ButtonsClose
str
_ <- dialogRun errorDialog
widgetDestroy errorDialog
-- |Asks the user for confirmation and returns True/False.
showConfirmationDialog :: String -> IO Bool
showConfirmationDialog str = do
confirmDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsYesNo
str
rID <- dialogRun confirmDialog
widgetDestroy confirmDialog
case rID of
ResponseYes -> return True
ResponseNo -> return False
_ -> return False
fileCollisionDialog :: ByteString -> IO (Maybe FCollisonMode)
fileCollisionDialog t = do
chooserDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsNone
(fromString "Target \"" `BS.append`
t `BS.append`
fromString "\" exists, how to proceed?")
_ <- dialogAddButton chooserDialog "Cancel" (ResponseUser 0)
_ <- dialogAddButton chooserDialog "Overwrite" (ResponseUser 1)
_ <- dialogAddButton chooserDialog "Overwrite all" (ResponseUser 2)
_ <- dialogAddButton chooserDialog "Skip" (ResponseUser 3)
_ <- dialogAddButton chooserDialog "Rename" (ResponseUser 4)
rID <- dialogRun chooserDialog
widgetDestroy chooserDialog
case rID of
ResponseUser 0 -> return Nothing
ResponseUser 1 -> return (Just Overwrite)
ResponseUser 2 -> return (Just OverwriteAll)
ResponseUser 3 -> return (Just Skip)
ResponseUser 4 -> do
mfn <- textInputDialog (fromString "Enter new name") (takeFileName t)
forM mfn $ \fn -> do
pfn <- P.parseRel (fromString fn)
return $ Rename pfn
_ -> throwIO UnknownDialogButton
renameDialog :: ByteString -> IO (Maybe FCollisonMode)
renameDialog t = do
chooserDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsNone
(fromString "Target \"" `BS.append`
t `BS.append`
fromString "\" exists, how to proceed?")
_ <- dialogAddButton chooserDialog "Cancel" (ResponseUser 0)
_ <- dialogAddButton chooserDialog "Skip" (ResponseUser 1)
_ <- dialogAddButton chooserDialog "Rename" (ResponseUser 2)
rID <- dialogRun chooserDialog
widgetDestroy chooserDialog
case rID of
ResponseUser 0 -> return Nothing
ResponseUser 1 -> return (Just Skip)
ResponseUser 2 -> do
mfn <- textInputDialog (fromString "Enter new name") (takeFileName t)
forM mfn $ \fn -> do
pfn <- P.parseRel (fromString fn)
return $ Rename pfn
_ -> throwIO UnknownDialogButton
-- |Shows the about dialog from the help menu.
showAboutDialog :: IO ()
showAboutDialog = do
ad <- aboutDialogNew
lstr <- Prelude.readFile =<< getDataFileName "LICENSE"
hsfmicon <- pixbufNewFromFile =<< getDataFileName "data/Gtk/icons/hsfm.png"
pdesc <- fmap packageDescription
#if MIN_VERSION_Cabal(2,0,0)
(readGenericPackageDescription silent
#else
(readPackageDescription silent
#endif
=<< getDataFileName "hsfm.cabal")
set ad
[ aboutDialogProgramName := (unPackageName . pkgName . package) pdesc
, aboutDialogName := (unPackageName . pkgName . package) pdesc
, aboutDialogVersion := (showVersion . packageVersion . package) pdesc
, aboutDialogCopyright := copyright pdesc
, aboutDialogComments := description pdesc
, aboutDialogLicense := Just lstr
, aboutDialogWebsite := homepage pdesc
, aboutDialogAuthors := [author pdesc]
, aboutDialogLogo := Just hsfmicon
, aboutDialogWrapLicense := True
]
_ <- dialogRun ad
widgetDestroy ad
-- |Carry out an IO action with a confirmation dialog.
-- If the user presses "No", then do nothing.
withConfirmationDialog :: String -> IO () -> IO ()
withConfirmationDialog str io = do
run <- showConfirmationDialog str
when run io
-- |Execute the given IO action. If the action throws exceptions,
-- visualize them via 'showErrorDialog'.
withErrorDialog :: IO a -> IO ()
withErrorDialog io =
catches (void io)
[ Handler (\e -> showErrorDialog
. decodeString
. displayException
$ (e :: IOException))
, Handler (\e -> showErrorDialog
$ displayException (e :: HPathIOException))
]
-- |Asks the user which directory copy mode he wants via dialog popup
-- and returns 'DirCopyMode'.
textInputDialog :: (GlibString s1, GlibString s2)
=> s1 -- ^ window title
-> s2 -- ^ initial text in input widget
-> IO (Maybe String)
textInputDialog title inittext = do
chooserDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsNone
title
entry <- entryNew
entrySetText entry inittext
cbox <- dialogGetActionArea chooserDialog
_ <- dialogAddButton chooserDialog "Ok" (ResponseUser 0)
_ <- dialogAddButton chooserDialog "Cancel" (ResponseUser 1)
boxPackStart (castToBox cbox) entry PackNatural 5
widgetShowAll chooserDialog
rID <- dialogRun chooserDialog
ret <- case rID of
-- TODO: make this more safe
ResponseUser 0 -> Just <$> entryGetText entry
ResponseUser 1 -> return Nothing
_ -> throwIO UnknownDialogButton
widgetDestroy chooserDialog
return ret
showFilePropertyDialog :: [Item] -> MyGUI -> MyView -> IO ()
showFilePropertyDialog [item] mygui _ = do
dialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageInfo
ButtonsNone
"File Properties"
let fprop' = fprop mygui
grid = fpropGrid fprop'
entrySetText (fpropFnEntry fprop') (maybe BS.empty P.fromRel
$ P.basename . path $ item)
entrySetText (fpropLocEntry fprop') (P.fromAbs . P.dirname . path $ item)
entrySetText (fpropTsEntry fprop') (show . fileSize $ fvar item)
entrySetText (fpropModEntry fprop') (packModTime item)
entrySetText (fpropAcEntry fprop') (packAccessTime item)
entrySetText (fpropFTEntry fprop') (packFileType item)
entrySetText (fpropPermEntry fprop')
(tail $ packPermissions item) -- throw away the filetype part
case packLinkDestination item of
(Just dest) -> do
widgetSetSensitive (fpropLDEntry fprop') True
entrySetText (fpropLDEntry fprop') dest
Nothing -> do
widgetSetSensitive (fpropLDEntry fprop') False
entrySetText (fpropLDEntry fprop') "( Not a symlink )"
cbox <- dialogGetActionArea dialog
_ <- dialogAddButton dialog "Ok" (ResponseUser 0)
_ <- dialogAddButton dialog "Cancel" (ResponseUser 1)
boxPackStart (castToBox cbox) grid PackNatural 5
widgetShowAll dialog
_ <- dialogRun dialog
-- make sure our grid does not get destroyed
containerRemove (castToBox cbox) grid
widgetDestroy dialog
return ()
showFilePropertyDialog _ _ _ = return ()
| null | https://raw.githubusercontent.com/hasufell/hsfm/322c766ae534fb21e3427d2845011123ddb90952/src/HSFM/GUI/Gtk/Dialogs.hs | haskell | }
-------------------
[ Dialog popups ]--
-------------------
|Pops up an error Dialog with the given String.
|Asks the user for confirmation and returns True/False.
|Shows the about dialog from the help menu.
|Carry out an IO action with a confirmation dialog.
If the user presses "No", then do nothing.
|Execute the given IO action. If the action throws exceptions,
visualize them via 'showErrorDialog'.
|Asks the user which directory copy mode he wants via dialog popup
and returns 'DirCopyMode'.
^ window title
^ initial text in input widget
TODO: make this more safe
throw away the filetype part
make sure our grid does not get destroyed | -
HSFM , a written in Haskell .
Copyright ( C ) 2016
This program is free software ; you can redistribute it and/or
modify it under the terms of the GNU General Public License
version 2 as published by the Free Software Foundation .
This program is distributed in the hope that it will be useful ,
but WITHOUT ANY WARRANTY ; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the
GNU General Public License for more details .
You should have received a copy of the GNU General Public License
along with this program ; if not , write to the Free Software
Foundation , Inc. , 51 Franklin Street , Fifth Floor , Boston , MA 02110 - 1301 , USA .
-
HSFM, a filemanager written in Haskell.
Copyright (C) 2016 Julian Ospald
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
version 2 as published by the Free Software Foundation.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
# LANGUAGE CPP #
# OPTIONS_HADDOCK ignore - exports #
module HSFM.GUI.Gtk.Dialogs where
import Codec.Binary.UTF8.String
(
decodeString
)
import Control.Exception
(
catches
, displayException
, throwIO
, IOException
, Handler(..)
)
import Control.Monad
(
forM
, when
, void
)
import Data.ByteString
(
ByteString
)
import qualified Data.ByteString as BS
import Data.ByteString.UTF8
(
fromString
)
import Distribution.Package
(
PackageIdentifier(..)
, packageVersion
, unPackageName
)
#if MIN_VERSION_Cabal(2,0,0)
import Distribution.Version
(
showVersion
)
#else
import Data.Version
(
showVersion
)
#endif
import Distribution.PackageDescription
(
GenericPackageDescription(..)
, PackageDescription(..)
)
#if MIN_VERSION_Cabal(2,2,0)
import Distribution.PackageDescription.Parsec
#else
import Distribution.PackageDescription.Parse
#endif
(
#if MIN_VERSION_Cabal(2,0,0)
readGenericPackageDescription,
#else
readPackageDescription,
#endif
)
import Distribution.Verbosity
(
silent
)
import Graphics.UI.Gtk
import qualified HPath as P
import HPath.IO.Errors
import HSFM.FileSystem.FileType
import HSFM.FileSystem.UtilTypes
import HSFM.GUI.Glib.GlibString()
import HSFM.GUI.Gtk.Data
import HSFM.GUI.Gtk.Errors
import Paths_hsfm
(
getDataFileName
)
import System.Glib.UTFString
(
GlibString
)
import System.Posix.FilePath
(
takeFileName
)
showErrorDialog :: String -> IO ()
showErrorDialog str = do
errorDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageError
ButtonsClose
str
_ <- dialogRun errorDialog
widgetDestroy errorDialog
showConfirmationDialog :: String -> IO Bool
showConfirmationDialog str = do
confirmDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsYesNo
str
rID <- dialogRun confirmDialog
widgetDestroy confirmDialog
case rID of
ResponseYes -> return True
ResponseNo -> return False
_ -> return False
fileCollisionDialog :: ByteString -> IO (Maybe FCollisonMode)
fileCollisionDialog t = do
chooserDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsNone
(fromString "Target \"" `BS.append`
t `BS.append`
fromString "\" exists, how to proceed?")
_ <- dialogAddButton chooserDialog "Cancel" (ResponseUser 0)
_ <- dialogAddButton chooserDialog "Overwrite" (ResponseUser 1)
_ <- dialogAddButton chooserDialog "Overwrite all" (ResponseUser 2)
_ <- dialogAddButton chooserDialog "Skip" (ResponseUser 3)
_ <- dialogAddButton chooserDialog "Rename" (ResponseUser 4)
rID <- dialogRun chooserDialog
widgetDestroy chooserDialog
case rID of
ResponseUser 0 -> return Nothing
ResponseUser 1 -> return (Just Overwrite)
ResponseUser 2 -> return (Just OverwriteAll)
ResponseUser 3 -> return (Just Skip)
ResponseUser 4 -> do
mfn <- textInputDialog (fromString "Enter new name") (takeFileName t)
forM mfn $ \fn -> do
pfn <- P.parseRel (fromString fn)
return $ Rename pfn
_ -> throwIO UnknownDialogButton
renameDialog :: ByteString -> IO (Maybe FCollisonMode)
renameDialog t = do
chooserDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsNone
(fromString "Target \"" `BS.append`
t `BS.append`
fromString "\" exists, how to proceed?")
_ <- dialogAddButton chooserDialog "Cancel" (ResponseUser 0)
_ <- dialogAddButton chooserDialog "Skip" (ResponseUser 1)
_ <- dialogAddButton chooserDialog "Rename" (ResponseUser 2)
rID <- dialogRun chooserDialog
widgetDestroy chooserDialog
case rID of
ResponseUser 0 -> return Nothing
ResponseUser 1 -> return (Just Skip)
ResponseUser 2 -> do
mfn <- textInputDialog (fromString "Enter new name") (takeFileName t)
forM mfn $ \fn -> do
pfn <- P.parseRel (fromString fn)
return $ Rename pfn
_ -> throwIO UnknownDialogButton
showAboutDialog :: IO ()
showAboutDialog = do
ad <- aboutDialogNew
lstr <- Prelude.readFile =<< getDataFileName "LICENSE"
hsfmicon <- pixbufNewFromFile =<< getDataFileName "data/Gtk/icons/hsfm.png"
pdesc <- fmap packageDescription
#if MIN_VERSION_Cabal(2,0,0)
(readGenericPackageDescription silent
#else
(readPackageDescription silent
#endif
=<< getDataFileName "hsfm.cabal")
set ad
[ aboutDialogProgramName := (unPackageName . pkgName . package) pdesc
, aboutDialogName := (unPackageName . pkgName . package) pdesc
, aboutDialogVersion := (showVersion . packageVersion . package) pdesc
, aboutDialogCopyright := copyright pdesc
, aboutDialogComments := description pdesc
, aboutDialogLicense := Just lstr
, aboutDialogWebsite := homepage pdesc
, aboutDialogAuthors := [author pdesc]
, aboutDialogLogo := Just hsfmicon
, aboutDialogWrapLicense := True
]
_ <- dialogRun ad
widgetDestroy ad
withConfirmationDialog :: String -> IO () -> IO ()
withConfirmationDialog str io = do
run <- showConfirmationDialog str
when run io
withErrorDialog :: IO a -> IO ()
withErrorDialog io =
catches (void io)
[ Handler (\e -> showErrorDialog
. decodeString
. displayException
$ (e :: IOException))
, Handler (\e -> showErrorDialog
$ displayException (e :: HPathIOException))
]
textInputDialog :: (GlibString s1, GlibString s2)
-> IO (Maybe String)
textInputDialog title inittext = do
chooserDialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageQuestion
ButtonsNone
title
entry <- entryNew
entrySetText entry inittext
cbox <- dialogGetActionArea chooserDialog
_ <- dialogAddButton chooserDialog "Ok" (ResponseUser 0)
_ <- dialogAddButton chooserDialog "Cancel" (ResponseUser 1)
boxPackStart (castToBox cbox) entry PackNatural 5
widgetShowAll chooserDialog
rID <- dialogRun chooserDialog
ret <- case rID of
ResponseUser 0 -> Just <$> entryGetText entry
ResponseUser 1 -> return Nothing
_ -> throwIO UnknownDialogButton
widgetDestroy chooserDialog
return ret
showFilePropertyDialog :: [Item] -> MyGUI -> MyView -> IO ()
showFilePropertyDialog [item] mygui _ = do
dialog <- messageDialogNew Nothing
[DialogDestroyWithParent]
MessageInfo
ButtonsNone
"File Properties"
let fprop' = fprop mygui
grid = fpropGrid fprop'
entrySetText (fpropFnEntry fprop') (maybe BS.empty P.fromRel
$ P.basename . path $ item)
entrySetText (fpropLocEntry fprop') (P.fromAbs . P.dirname . path $ item)
entrySetText (fpropTsEntry fprop') (show . fileSize $ fvar item)
entrySetText (fpropModEntry fprop') (packModTime item)
entrySetText (fpropAcEntry fprop') (packAccessTime item)
entrySetText (fpropFTEntry fprop') (packFileType item)
entrySetText (fpropPermEntry fprop')
case packLinkDestination item of
(Just dest) -> do
widgetSetSensitive (fpropLDEntry fprop') True
entrySetText (fpropLDEntry fprop') dest
Nothing -> do
widgetSetSensitive (fpropLDEntry fprop') False
entrySetText (fpropLDEntry fprop') "( Not a symlink )"
cbox <- dialogGetActionArea dialog
_ <- dialogAddButton dialog "Ok" (ResponseUser 0)
_ <- dialogAddButton dialog "Cancel" (ResponseUser 1)
boxPackStart (castToBox cbox) grid PackNatural 5
widgetShowAll dialog
_ <- dialogRun dialog
containerRemove (castToBox cbox) grid
widgetDestroy dialog
return ()
showFilePropertyDialog _ _ _ = return ()
|
032339a1baa2cab0981b4ef7dcea12294ea869545a6138704406f01a2ba40566 | tategakibunko/jingoo | test.ml | open OUnit2
let () =
run_test_tt_main ("jingoo" >::: [ Test_runtime.suite
; Test_output.suite
; Test_parser.suite
; Test_dead_code_elimination.suite
])
| null | https://raw.githubusercontent.com/tategakibunko/jingoo/1ed8f036c8f37294f282fe147f767bbd11a5386d/tests/test.ml | ocaml | open OUnit2
let () =
run_test_tt_main ("jingoo" >::: [ Test_runtime.suite
; Test_output.suite
; Test_parser.suite
; Test_dead_code_elimination.suite
])
| |
a1bc0beef0c977c19ba2e1c5ce650c0c3b1b748943460da016aa222f0d0bf1f5 | rescript-lang/rescript-compiler | cmij_cache.mli | type t = { module_names : string array; module_data : bytes array }
type cmi_data = Cmi_format.cmi_infos
type cmj_data = { values : Js_cmj_format.keyed_cmj_value array; pure : bool }
val marshal_cmi_data : cmi_data -> bytes
val marshal_cmj_data : cmj_data -> bytes
val unmarshal_cmi_data : bytes -> cmi_data
val unmarshal_cmj_data : bytes -> cmj_data
| null | https://raw.githubusercontent.com/rescript-lang/rescript-compiler/81a3dc63ca387b2af23fed297db283254ae3ab20/jscomp/cmij/cmij_cache.mli | ocaml | type t = { module_names : string array; module_data : bytes array }
type cmi_data = Cmi_format.cmi_infos
type cmj_data = { values : Js_cmj_format.keyed_cmj_value array; pure : bool }
val marshal_cmi_data : cmi_data -> bytes
val marshal_cmj_data : cmj_data -> bytes
val unmarshal_cmi_data : bytes -> cmi_data
val unmarshal_cmj_data : bytes -> cmj_data
| |
e7252e09ca7464ecc24e34362dab2d43c24c8371de63a8cfee4ac136fa8f7c55 | tmattio/spin | cmd_new.ml | open Spin
let run ~ignore_config ~use_defaults ~template ~path =
let open Result.Syntax in
let path = Option.value path ~default:Filename.current_dir_name in
let* () =
try
match Sys.readdir path with
| [||] ->
Ok ()
| _ ->
Error
(Spin_error.failed_to_generate "The output directory is not empty.")
with
| Sys_error _ ->
Sys.mkdir_p path;
Ok ()
in
let* context =
if ignore_config then
Ok None
else
let* user_config = User_config.read () in
match user_config with
| None ->
Logs.app (fun m ->
m
"\n\
⚠️ No config file found. To save some time in the future, \
create one with %a"
Pp.pp_blue
"spin config");
Ok None
| Some user_config ->
let context = User_config.to_context user_config in
Ok (Some context)
in
match Template.source_of_string template with
| Some source ->
(try
let* template = Template.read ?context ~use_defaults source in
Template.generate ~path template
with
| Sys.Break | Failure _ ->
exit 1
| e ->
raise e)
| None ->
Logs.err (fun m -> m "This template does not exist");
Ok ()
(* Command line interface *)
open Cmdliner
let doc = "Generate a new project from a template"
let sdocs = Manpage.s_common_options
let exits = Common.exits
let envs = Common.envs
let man_xrefs = [ `Main; `Cmd "ls" ]
let man =
[ `S Manpage.s_description
; `P
"$(tname) generates projects from templates. The template can be either \
an official template, local directory or a remote git repository."
; `P "You can use spin-ls(1) to list the official templates."
]
let info = Term.info "new" ~doc ~sdocs ~exits ~envs ~man ~man_xrefs
let term =
let open Common.Syntax in
let+ _term = Common.term
and+ ignore_config = Common.ignore_config_arg
and+ use_defaults = Common.use_defaults_arg
and+ template =
let doc =
"The template to use. The template can be the name of an official \
template, a local directory or a remote git repository."
in
let docv = "TEMPLATE" in
Arg.(required & pos 0 (some string) None & info [] ~doc ~docv)
and+ path =
let doc =
"The path where the project will be generated. If absent, the project \
will be generated in the current working directory."
in
let docv = "PATH" in
Arg.(value & pos 1 (some string) None & info [] ~doc ~docv)
in
run ~ignore_config ~use_defaults ~template ~path |> Common.handle_errors
let cmd = term, info
| null | https://raw.githubusercontent.com/tmattio/spin/5aad207960ded083a948cf0aea2d5fd2eb5dd555/bin/commands/cmd_new.ml | ocaml | Command line interface | open Spin
let run ~ignore_config ~use_defaults ~template ~path =
let open Result.Syntax in
let path = Option.value path ~default:Filename.current_dir_name in
let* () =
try
match Sys.readdir path with
| [||] ->
Ok ()
| _ ->
Error
(Spin_error.failed_to_generate "The output directory is not empty.")
with
| Sys_error _ ->
Sys.mkdir_p path;
Ok ()
in
let* context =
if ignore_config then
Ok None
else
let* user_config = User_config.read () in
match user_config with
| None ->
Logs.app (fun m ->
m
"\n\
⚠️ No config file found. To save some time in the future, \
create one with %a"
Pp.pp_blue
"spin config");
Ok None
| Some user_config ->
let context = User_config.to_context user_config in
Ok (Some context)
in
match Template.source_of_string template with
| Some source ->
(try
let* template = Template.read ?context ~use_defaults source in
Template.generate ~path template
with
| Sys.Break | Failure _ ->
exit 1
| e ->
raise e)
| None ->
Logs.err (fun m -> m "This template does not exist");
Ok ()
open Cmdliner
let doc = "Generate a new project from a template"
let sdocs = Manpage.s_common_options
let exits = Common.exits
let envs = Common.envs
let man_xrefs = [ `Main; `Cmd "ls" ]
let man =
[ `S Manpage.s_description
; `P
"$(tname) generates projects from templates. The template can be either \
an official template, local directory or a remote git repository."
; `P "You can use spin-ls(1) to list the official templates."
]
let info = Term.info "new" ~doc ~sdocs ~exits ~envs ~man ~man_xrefs
let term =
let open Common.Syntax in
let+ _term = Common.term
and+ ignore_config = Common.ignore_config_arg
and+ use_defaults = Common.use_defaults_arg
and+ template =
let doc =
"The template to use. The template can be the name of an official \
template, a local directory or a remote git repository."
in
let docv = "TEMPLATE" in
Arg.(required & pos 0 (some string) None & info [] ~doc ~docv)
and+ path =
let doc =
"The path where the project will be generated. If absent, the project \
will be generated in the current working directory."
in
let docv = "PATH" in
Arg.(value & pos 1 (some string) None & info [] ~doc ~docv)
in
run ~ignore_config ~use_defaults ~template ~path |> Common.handle_errors
let cmd = term, info
|
ad3945c4e33069350f81eaa7932b3228a07787ac005b0d86527a486c343b35ff | tsahyt/clingo-haskell | Solving.hs | module Clingo.Solving
(
ResultReady(..),
MonadSolve(..),
solverClose,
withModel
)
where
import Control.Monad.IO.Class
import Control.Monad.Catch
import Clingo.Model (MonadModel)
import Clingo.Internal.Types
import Foreign.Ptr
import Foreign.Marshal.Utils
import Clingo.Internal.Utils
import qualified Clingo.Raw as Raw
data ResultReady = Ready | NotReady
deriving (Eq, Show, Read, Ord)
toRR :: Bool -> ResultReady
toRR True = Ready
toRR False = NotReady
class MonadModel m => MonadSolve m where
-- | Get the next solve result.
getResult :: Solver s -> m s SolveResult
-- | Get the next model if it exists.
getModel :: Solver s -> m s (Maybe (Model s))
-- | Wait for the specified time to check if the result is ready.
solverWait :: Solver s -> Double -> m s ResultReady
-- | Discard the last model and start search for the next.
solverResume :: Solver s -> m s ()
-- | Stop the running search and block until done.
solverCancel :: Solver s -> m s ()
instance MonadSolve IOSym where
getResult = getResult'
getModel = getModel'
solverWait = solverWait'
solverResume = solverResume'
solverCancel = solverCancel'
instance (MonadThrow m, MonadIO m) => MonadSolve (ClingoT m) where
getResult = getResult'
getModel = getModel'
solverWait = solverWait'
solverResume = solverResume'
solverCancel = solverCancel'
-- | Convenience method to get a models. Provide a callback function which is called with a model as its argument.
-- Note that this is dependent on the solver configuration!
withModel :: (Monad (m s), MonadSolve m, Monoid a) => (Model s -> m s a) -> Solver s -> m s a
withModel f solver = do
solverResume solver
m <- getModel solver
case m of
Nothing -> pure mempty
Just x -> (<>) <$> f x <*> withModel f solver
getResult' :: (MonadThrow m, MonadIO m) => Solver s -> m SolveResult
getResult' (Solver s) = fromRawSolveResult <$> marshal1 (Raw.solveHandleGet s)
getModel' :: (MonadThrow m, MonadIO m) => Solver s -> m (Maybe (Model s))
getModel' (Solver s) = do
m@(Raw.Model x) <- marshal1 $ Raw.solveHandleModel s
pure $ if x == nullPtr then Nothing else Just (Model m)
solverWait' :: MonadIO m => Solver s -> Double -> m ResultReady
solverWait' (Solver s) timeout = do
x <- marshal1V $ Raw.solveHandleWait s (realToFrac timeout)
pure . toRR . toBool $ x
solverResume' :: (MonadThrow m, MonadIO m) => Solver s -> m ()
solverResume' (Solver s) = marshal0 $ Raw.solveHandleResume s
solverCancel' :: (MonadThrow m, MonadIO m) => Solver s -> m ()
solverCancel' (Solver s) = marshal0 $ Raw.solveHandleCancel s
-- | Stops the running search and releases the handle. Blocks until search is
-- stopped.
solverClose :: (MonadThrow m, MonadIO m) => Solver s -> ClingoT m s ()
solverClose (Solver s) = marshal0 $ Raw.solveHandleClose s
| null | https://raw.githubusercontent.com/tsahyt/clingo-haskell/083c84aae63565067644ccaa72223a4c12b33b88/src/Clingo/Solving.hs | haskell | | Get the next solve result.
| Get the next model if it exists.
| Wait for the specified time to check if the result is ready.
| Discard the last model and start search for the next.
| Stop the running search and block until done.
| Convenience method to get a models. Provide a callback function which is called with a model as its argument.
Note that this is dependent on the solver configuration!
| Stops the running search and releases the handle. Blocks until search is
stopped. | module Clingo.Solving
(
ResultReady(..),
MonadSolve(..),
solverClose,
withModel
)
where
import Control.Monad.IO.Class
import Control.Monad.Catch
import Clingo.Model (MonadModel)
import Clingo.Internal.Types
import Foreign.Ptr
import Foreign.Marshal.Utils
import Clingo.Internal.Utils
import qualified Clingo.Raw as Raw
data ResultReady = Ready | NotReady
deriving (Eq, Show, Read, Ord)
toRR :: Bool -> ResultReady
toRR True = Ready
toRR False = NotReady
class MonadModel m => MonadSolve m where
getResult :: Solver s -> m s SolveResult
getModel :: Solver s -> m s (Maybe (Model s))
solverWait :: Solver s -> Double -> m s ResultReady
solverResume :: Solver s -> m s ()
solverCancel :: Solver s -> m s ()
instance MonadSolve IOSym where
getResult = getResult'
getModel = getModel'
solverWait = solverWait'
solverResume = solverResume'
solverCancel = solverCancel'
instance (MonadThrow m, MonadIO m) => MonadSolve (ClingoT m) where
getResult = getResult'
getModel = getModel'
solverWait = solverWait'
solverResume = solverResume'
solverCancel = solverCancel'
withModel :: (Monad (m s), MonadSolve m, Monoid a) => (Model s -> m s a) -> Solver s -> m s a
withModel f solver = do
solverResume solver
m <- getModel solver
case m of
Nothing -> pure mempty
Just x -> (<>) <$> f x <*> withModel f solver
getResult' :: (MonadThrow m, MonadIO m) => Solver s -> m SolveResult
getResult' (Solver s) = fromRawSolveResult <$> marshal1 (Raw.solveHandleGet s)
getModel' :: (MonadThrow m, MonadIO m) => Solver s -> m (Maybe (Model s))
getModel' (Solver s) = do
m@(Raw.Model x) <- marshal1 $ Raw.solveHandleModel s
pure $ if x == nullPtr then Nothing else Just (Model m)
solverWait' :: MonadIO m => Solver s -> Double -> m ResultReady
solverWait' (Solver s) timeout = do
x <- marshal1V $ Raw.solveHandleWait s (realToFrac timeout)
pure . toRR . toBool $ x
solverResume' :: (MonadThrow m, MonadIO m) => Solver s -> m ()
solverResume' (Solver s) = marshal0 $ Raw.solveHandleResume s
solverCancel' :: (MonadThrow m, MonadIO m) => Solver s -> m ()
solverCancel' (Solver s) = marshal0 $ Raw.solveHandleCancel s
solverClose :: (MonadThrow m, MonadIO m) => Solver s -> ClingoT m s ()
solverClose (Solver s) = marshal0 $ Raw.solveHandleClose s
|
80025a5b6b5e5a0d1fb5ab932114994362856457316979a58e1dcf6b1ded0542 | coccinelle/herodotos | rowDisplacement.ml | (**************************************************************************)
(* *)
Menhir
(* *)
, INRIA Rocquencourt
, PPS , Université Paris Diderot
(* *)
Copyright 2005 - 2008 Institut National de Recherche en Informatique
(* et en Automatique. All rights reserved. This file is distributed *)
under the terms of the GNU Library General Public License , with the
(* special exception on linking described in file LICENSE. *)
(* *)
(**************************************************************************)
This module compresses a two - dimensional table , where some values
are considered insignificant , via row displacement .
are considered insignificant, via row displacement. *)
This idea reportedly appears in and 's ` ` Principles
of Compiler Design '' ( 1977 ) . It is evaluated in Tarjan and 's
` ` Storing a Sparse Table '' ( 1979 ) and in , , and Heuft 's
` ` Optimization of Parser Tables for Portable Compilers '' ( 1984 ) .
of Compiler Design'' (1977). It is evaluated in Tarjan and Yao's
``Storing a Sparse Table'' (1979) and in Dencker, Dürre, and Heuft's
``Optimization of Parser Tables for Portable Compilers'' (1984). *)
(* A compressed table is represented as a pair of arrays. The
displacement array is an array of offsets into the data array. *)
type 'a table =
int array * (* displacement *)
'a array (* data *)
In a natural version of this algorithm , displacements would be greater
than ( or equal to ) [ -n ] . However , in the particular setting of Menhir ,
both arrays are intended to be compressed with [ PackedIntArray ] , which
does not efficiently support negative numbers . For this reason , we are
careful not to produce negative displacements .
than (or equal to) [-n]. However, in the particular setting of Menhir,
both arrays are intended to be compressed with [PackedIntArray], which
does not efficiently support negative numbers. For this reason, we are
careful not to produce negative displacements. *)
(* In order to avoid producing negative displacements, we simply use the
least significant bit as the sign bit. This is implemented by [encode]
and [decode] below. *)
(* One could also think, say, of adding [n] to every displacement, so as
to ensure that all displacements are nonnegative. This would work, but
would require [n] to be published, for use by the decoder. *)
let encode (displacement : int) : int =
if displacement >= 0 then
displacement lsl 1
else
(-displacement) lsl 1 + 1
let decode (displacement : int) : int =
if displacement land 1 = 0 then
displacement lsr 1
else
-(displacement lsr 1)
(* It is reasonable to assume that, as matrices grow large, their
density becomes low, i.e., they have many insignificant entries.
As a result, it is important to work with a sparse data structure
for rows. We internally represent a row as a list of its
significant entries, where each entry is a pair of a [j] index and
an element. *)
type 'a row =
(int * 'a) list
[ compress equal insignificant dummy m n t ] turns the two - dimensional table
[ t ] into a compressed table . The parameter [ equal ] is equality of data
values . The parameter [ wildcard ] tells which data values are insignificant ,
and can thus be overwritten with other values . The parameter [ dummy ] is
used to fill holes in the data array . [ m ] and [ n ] are the integer
dimensions of the table [ t ] .
[t] into a compressed table. The parameter [equal] is equality of data
values. The parameter [wildcard] tells which data values are insignificant,
and can thus be overwritten with other values. The parameter [dummy] is
used to fill holes in the data array. [m] and [n] are the integer
dimensions of the table [t]. *)
let compress
(equal : 'a -> 'a -> bool)
(insignificant : 'a -> bool)
(dummy : 'a)
(m : int) (n : int)
(t : 'a array array)
: 'a table =
(* Be defensive. *)
assert (Array.length t = m);
assert begin
for i = 0 to m - 1 do
assert (Array.length t.(i) = n)
done;
true
end;
(* This turns a row-as-array into a row-as-sparse-list. *)
let sparse (line : 'a array) : 'a row =
let rec loop (j : int) (row : 'a row) =
if j < 0 then
row
else
let x = line.(j) in
loop
(j - 1)
(if insignificant x then row else (j, x) :: row)
in
loop (n - 1) []
in
(* Define the rank of a row as its number of significant entries. *)
let rank (row : 'a row) : int =
List.length row
in
Construct a list of all rows , together with their index and rank .
let rows : (int * int * 'a row) list = (* index, rank, row *)
Array.to_list (
Array.mapi (fun i line ->
let row = sparse line in
i, rank row, row
) t
)
in
Sort this list by decreasing rank . This does not have any impact
on correctness , but reportedly improves compression . The
intuitive idea is that rows with few significant elements are
easy to fit , so they should be inserted last , after the problem
has become quite constrained by fitting the heavier rows . This
heuristic is attributed to .
on correctness, but reportedly improves compression. The
intuitive idea is that rows with few significant elements are
easy to fit, so they should be inserted last, after the problem
has become quite constrained by fitting the heavier rows. This
heuristic is attributed to Ziegler. *)
let rows =
List.sort (fun (_, rank1, _) (_, rank2, _) ->
compare rank2 rank1
) rows
in
Allocate a one - dimensional array of displacements .
let displacement : int array =
Array.make m 0
in
Allocate a one - dimensional , infinite array of values . Indices
into this array are written [ k ] .
into this array are written [k]. *)
let data : 'a InfiniteArray.t =
InfiniteArray.make dummy
in
(* Determine whether [row] fits at offset [k] within the current [data]
array, up to extension of this array. *)
(* Note that this check always succeeds when [k] equals the length of
the [data] array. Indeed, the loop is then skipped. This property
guarantees the termination of the recursive function [fit] below. *)
let fits k (row : 'a row) : bool =
let d = InfiniteArray.extent data in
let rec loop = function
| [] ->
true
| (j, x) :: row ->
(* [x] is a significant element. *)
(* By hypothesis, [k + j] is nonnegative. If it is greater than or
equal to the current length of the data array, stop -- the row
fits. *)
assert (k + j >= 0);
if k + j >= d then
true
(* We now know that [k + j] is within bounds of the data
array. Check whether it is compatible with the element [y] found
there. If it is, continue. If it isn't, stop -- the row does not
fit. *)
else
let y = InfiniteArray.get data (k + j) in
if insignificant y || equal x y then
loop row
else
false
in
loop row
in
(* Find the leftmost position where a row fits. *)
(* If the leftmost significant element in this row is at offset [j],
then we can hope to fit as far left as [-j] -- so this element
lands at offset [0] in the data array. *)
(* Note that displacements may be negative. This means that, for
insignificant elements, accesses to the data array could fail: they could
be out of bounds, either towards the left or towards the right. This is
not a problem, as long as [get] is invoked only at significant
elements. *)
let rec fit k row : int =
if fits k row then
k
else
fit (k + 1) row
in
let fit row =
match row with
| [] ->
0 (* irrelevant *)
| (j, _) :: _ ->
fit (-j) row
in
(* Write [row] at (compatible) offset [k]. *)
let rec write k = function
| [] ->
()
| (j, x) :: row ->
InfiniteArray.set data (k + j) x;
write k row
in
(* Iterate over the sorted list of rows. Fit and write each row at
the leftmost compatible offset. Update the displacement table. *)
let () =
List.iter (fun (i, _, row) ->
let k = fit row in (* if [row] has leading insignificant elements, then [k] can be negative *)
write k row;
displacement.(i) <- encode k
) rows
in
(* Return the compressed tables. *)
displacement, InfiniteArray.domain data
(* [get ct i j] returns the value found at indices [i] and [j] in the
compressed table [ct]. This function call is permitted only if the
value found at indices [i] and [j] in the original table is
significant -- otherwise, it could fail abruptly. *)
(* Together, [compress] and [get] have the property that, if the value
found at indices [i] and [j] in an uncompressed table [t] is
significant, then [get (compress t) i j] is equal to that value. *)
let get (displacement, data) i j =
assert (0 <= i && i < Array.length displacement);
let k = decode displacement.(i) in
assert (0 <= k + j && k + j < Array.length data);
(* failure of this assertion indicates an attempt to access an
insignificant element that happens to be mapped out of the bounds
of the [data] array. *)
data.(k + j)
[ getget ] is a variant of [ get ] which only requires read access ,
via accessors , to the two components of the table .
via accessors, to the two components of the table. *)
let getget get_displacement get_data (displacement, data) i j =
let k = decode (get_displacement displacement i) in
get_data data (k + j)
| null | https://raw.githubusercontent.com/coccinelle/herodotos/5da230a18962ca445ed2368bc21abe0a8402e00f/menhirLib/rowDisplacement.ml | ocaml | ************************************************************************
et en Automatique. All rights reserved. This file is distributed
special exception on linking described in file LICENSE.
************************************************************************
A compressed table is represented as a pair of arrays. The
displacement array is an array of offsets into the data array.
displacement
data
In order to avoid producing negative displacements, we simply use the
least significant bit as the sign bit. This is implemented by [encode]
and [decode] below.
One could also think, say, of adding [n] to every displacement, so as
to ensure that all displacements are nonnegative. This would work, but
would require [n] to be published, for use by the decoder.
It is reasonable to assume that, as matrices grow large, their
density becomes low, i.e., they have many insignificant entries.
As a result, it is important to work with a sparse data structure
for rows. We internally represent a row as a list of its
significant entries, where each entry is a pair of a [j] index and
an element.
Be defensive.
This turns a row-as-array into a row-as-sparse-list.
Define the rank of a row as its number of significant entries.
index, rank, row
Determine whether [row] fits at offset [k] within the current [data]
array, up to extension of this array.
Note that this check always succeeds when [k] equals the length of
the [data] array. Indeed, the loop is then skipped. This property
guarantees the termination of the recursive function [fit] below.
[x] is a significant element.
By hypothesis, [k + j] is nonnegative. If it is greater than or
equal to the current length of the data array, stop -- the row
fits.
We now know that [k + j] is within bounds of the data
array. Check whether it is compatible with the element [y] found
there. If it is, continue. If it isn't, stop -- the row does not
fit.
Find the leftmost position where a row fits.
If the leftmost significant element in this row is at offset [j],
then we can hope to fit as far left as [-j] -- so this element
lands at offset [0] in the data array.
Note that displacements may be negative. This means that, for
insignificant elements, accesses to the data array could fail: they could
be out of bounds, either towards the left or towards the right. This is
not a problem, as long as [get] is invoked only at significant
elements.
irrelevant
Write [row] at (compatible) offset [k].
Iterate over the sorted list of rows. Fit and write each row at
the leftmost compatible offset. Update the displacement table.
if [row] has leading insignificant elements, then [k] can be negative
Return the compressed tables.
[get ct i j] returns the value found at indices [i] and [j] in the
compressed table [ct]. This function call is permitted only if the
value found at indices [i] and [j] in the original table is
significant -- otherwise, it could fail abruptly.
Together, [compress] and [get] have the property that, if the value
found at indices [i] and [j] in an uncompressed table [t] is
significant, then [get (compress t) i j] is equal to that value.
failure of this assertion indicates an attempt to access an
insignificant element that happens to be mapped out of the bounds
of the [data] array. | Menhir
, INRIA Rocquencourt
, PPS , Université Paris Diderot
Copyright 2005 - 2008 Institut National de Recherche en Informatique
under the terms of the GNU Library General Public License , with the
This module compresses a two - dimensional table , where some values
are considered insignificant , via row displacement .
are considered insignificant, via row displacement. *)
This idea reportedly appears in and 's ` ` Principles
of Compiler Design '' ( 1977 ) . It is evaluated in Tarjan and 's
` ` Storing a Sparse Table '' ( 1979 ) and in , , and Heuft 's
` ` Optimization of Parser Tables for Portable Compilers '' ( 1984 ) .
of Compiler Design'' (1977). It is evaluated in Tarjan and Yao's
``Storing a Sparse Table'' (1979) and in Dencker, Dürre, and Heuft's
``Optimization of Parser Tables for Portable Compilers'' (1984). *)
type 'a table =
In a natural version of this algorithm , displacements would be greater
than ( or equal to ) [ -n ] . However , in the particular setting of Menhir ,
both arrays are intended to be compressed with [ PackedIntArray ] , which
does not efficiently support negative numbers . For this reason , we are
careful not to produce negative displacements .
than (or equal to) [-n]. However, in the particular setting of Menhir,
both arrays are intended to be compressed with [PackedIntArray], which
does not efficiently support negative numbers. For this reason, we are
careful not to produce negative displacements. *)
let encode (displacement : int) : int =
if displacement >= 0 then
displacement lsl 1
else
(-displacement) lsl 1 + 1
let decode (displacement : int) : int =
if displacement land 1 = 0 then
displacement lsr 1
else
-(displacement lsr 1)
type 'a row =
(int * 'a) list
[ compress equal insignificant dummy m n t ] turns the two - dimensional table
[ t ] into a compressed table . The parameter [ equal ] is equality of data
values . The parameter [ wildcard ] tells which data values are insignificant ,
and can thus be overwritten with other values . The parameter [ dummy ] is
used to fill holes in the data array . [ m ] and [ n ] are the integer
dimensions of the table [ t ] .
[t] into a compressed table. The parameter [equal] is equality of data
values. The parameter [wildcard] tells which data values are insignificant,
and can thus be overwritten with other values. The parameter [dummy] is
used to fill holes in the data array. [m] and [n] are the integer
dimensions of the table [t]. *)
let compress
(equal : 'a -> 'a -> bool)
(insignificant : 'a -> bool)
(dummy : 'a)
(m : int) (n : int)
(t : 'a array array)
: 'a table =
assert (Array.length t = m);
assert begin
for i = 0 to m - 1 do
assert (Array.length t.(i) = n)
done;
true
end;
let sparse (line : 'a array) : 'a row =
let rec loop (j : int) (row : 'a row) =
if j < 0 then
row
else
let x = line.(j) in
loop
(j - 1)
(if insignificant x then row else (j, x) :: row)
in
loop (n - 1) []
in
let rank (row : 'a row) : int =
List.length row
in
Construct a list of all rows , together with their index and rank .
Array.to_list (
Array.mapi (fun i line ->
let row = sparse line in
i, rank row, row
) t
)
in
Sort this list by decreasing rank . This does not have any impact
on correctness , but reportedly improves compression . The
intuitive idea is that rows with few significant elements are
easy to fit , so they should be inserted last , after the problem
has become quite constrained by fitting the heavier rows . This
heuristic is attributed to .
on correctness, but reportedly improves compression. The
intuitive idea is that rows with few significant elements are
easy to fit, so they should be inserted last, after the problem
has become quite constrained by fitting the heavier rows. This
heuristic is attributed to Ziegler. *)
let rows =
List.sort (fun (_, rank1, _) (_, rank2, _) ->
compare rank2 rank1
) rows
in
Allocate a one - dimensional array of displacements .
let displacement : int array =
Array.make m 0
in
Allocate a one - dimensional , infinite array of values . Indices
into this array are written [ k ] .
into this array are written [k]. *)
let data : 'a InfiniteArray.t =
InfiniteArray.make dummy
in
let fits k (row : 'a row) : bool =
let d = InfiniteArray.extent data in
let rec loop = function
| [] ->
true
| (j, x) :: row ->
assert (k + j >= 0);
if k + j >= d then
true
else
let y = InfiniteArray.get data (k + j) in
if insignificant y || equal x y then
loop row
else
false
in
loop row
in
let rec fit k row : int =
if fits k row then
k
else
fit (k + 1) row
in
let fit row =
match row with
| [] ->
| (j, _) :: _ ->
fit (-j) row
in
let rec write k = function
| [] ->
()
| (j, x) :: row ->
InfiniteArray.set data (k + j) x;
write k row
in
let () =
List.iter (fun (i, _, row) ->
write k row;
displacement.(i) <- encode k
) rows
in
displacement, InfiniteArray.domain data
let get (displacement, data) i j =
assert (0 <= i && i < Array.length displacement);
let k = decode displacement.(i) in
assert (0 <= k + j && k + j < Array.length data);
data.(k + j)
[ getget ] is a variant of [ get ] which only requires read access ,
via accessors , to the two components of the table .
via accessors, to the two components of the table. *)
let getget get_displacement get_data (displacement, data) i j =
let k = decode (get_displacement displacement i) in
get_data data (k + j)
|
e762f8d9449c79bfd5a1f40f0f800958b798c4d2227ad07b372d7bfd9365b658 | mzp/bs-lwt | ppx_lwt.mli | Lightweight thread library for OCaml
*
* Module Ppx_lwt
* Copyright ( C ) 2014 , .
*
* This program is free software ; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as
* published by the Free Software Foundation , with linking exceptions ;
* either version 2.1 of the License , or ( at your option ) any later
* version . See COPYING file for details .
*
* This program is distributed in the hope that it will be useful , but
* WITHOUT ANY WARRANTY ; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the GNU
* Lesser General Public License for more details .
*
* You should have received a copy of the GNU Lesser General Public
* License along with this program ; if not , write to the Free Software
* Foundation , Inc. , 59 Temple Place - Suite 330 , Boston , MA
* 02111 - 1307 , USA .
*
* Module Ppx_lwt
* Copyright (C) 2014 Gabriel Radanne, Peter Zotov.
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as
* published by the Free Software Foundation, with linking exceptions;
* either version 2.1 of the License, or (at your option) any later
* version. See COPYING file for details.
*
* This program is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA
* 02111-1307, USA.
*)
(** Ppx syntax extension for Lwt *)
* { 2 Ppx extensions }
This Ppx extension adds various syntactic shortcut for lwt programming .
It needs OCaml > = 4.02 and { { : }ppx_tools } .
To use it , simply use the ocamlfind package [ lwt.ppx ] .
This extension adds the following syntax :
- lwt - binding :
{ [
let%lwt ch = in
code
] }
is the same as [ bind ( get_char stdin ) ( fun ch - > code ) ] .
Moreover , it supports parallel binding :
{ [
let%lwt x = do_something1 ( )
and y = do_something2 in
code
] }
will run [ do_something1 ( ) ] and [ do_something2 ( ) ] , then
bind their results to [ x ] and [ y ] . It is the same as :
{ [
let t1 = do_something1
and t2 = do_something2 in
bind t1 ( fun x - > bind t2 ( fun y - > code ) )
] }
- exception catching :
{ [
try%lwt
< expr >
with
< branches >
] }
For example :
{ [
try%lwt
f x
with
| Failure msg - >
prerr_endline msg ;
return ( )
] }
is expanded to :
{ [
catch ( fun ( ) - > f x )
( function
| Failure msg - >
prerr_endline msg ;
return ( )
| exn - >
Lwt.fail exn )
] }
Note that the [ exn - > Lwt.fail exn ] branch is automatically added
when needed .
- finalizer :
{ [
( < expr > ) [ % finally < expr > ]
] }
You can use [ [ % lwt.finally ... ] ] instead of [ [ % finally ... ] ] .
- assertion :
{ [
assert%lwt < expr >
] }
- for loop :
{ [
for%lwt i = < expr > to < expr > do
< expr >
done
] }
and :
{ [
for%lwt i = < expr > downto < expr > do
< expr >
done
] }
- while loop :
{ [
while%lwt < expr > do
< expr >
done
] }
- pattern matching :
{ [
match%lwt < expr > with
| < patt_1 > - > < expr_1 >
...
| < patt_n > - > < expr_n >
] }
Exception cases are also supported :
{ [
match%lwt < expr > with
| exception < exn > - > < expr_1 >
| < patt_2 > - > < expr_2 >
...
| < patt_n > - > < expr_n >
] }
- conditional :
{ [
< expr > then
< expr_1 >
else
< expr_2 >
] }
and
{ [
< expr > then < expr_1 >
] }
- exception raising :
For all other expression , the construct
{ [
[ % lwt < expr > ]
] }
is expanded to :
{ [
Lwt.catch ( fun ( ) - > < expr > ) Lwt.fail
] }
It allows to encode the old [ raise_lwt < e > ] as [ [ % lwt raise < e > ] ] , and offers a convenient way to interact with non - Lwt code .
{ 2 Debug }
By default , the debug mode is enabled . This means that the [ backtrace ] versions of the [ bind ] , [ finalize ] and [ catch ] functions are used , enabling proper backtraces for the Lwt exceptions .
The debug mode can be disabled with the option [ -no - debug ] :
{ v
$ ocamlfind ocamlc -package lwt.ppx \
-ppxopt lwt.ppx,-no - debug -linkpkg -o foo foo.ml
v }
{ 2 Sequence }
It is also possible to sequence Lwt operations with the [ > > ] operator :
{ [
write stdout " Hello , " > > write stdout " world ! "
] }
By default , each operation must return [ unit . This constraint can be
lifted with the option [ -no - strict - sequence ] . The operator can be disabled
with the option [ -no - sequence ] .
If you are mixing ` > > ` and ` ; ` , you need to use parentheses or ` begin`/`end `
to get the result you expect :
{ [
write stdout " Hello , " > > ( ignore ( ) ; write stdout " world ! " )
] }
Note that unlike [ > > =] , [ > > ] is not an OCaml value . it is a piece of syntax
added by the ppx rewriter - i.e. , you can not refer to [ ( > > ) ] .
{ 2 Logging }
The logging syntax extension is enabled with [ -log ] .
It will replace expressions of the form :
{ [
Lwt_log.info_f ~section " x = % d " x
] }
by
{ [
if Lwt_log.Section.level section < = Lwt_log . Info then
Lwt_log.info_f ~section " x = % d " x
else
return ( )
] }
Notes :
- The application must be complete . For example : [ Log.info " % d " ]
will make compilation fail .
- Debug messages are removed if the option [ -no - debug ] is passed .
This Ppx extension adds various syntactic shortcut for lwt programming.
It needs OCaml >= 4.02 and {{:}ppx_tools}.
To use it, simply use the ocamlfind package [lwt.ppx].
This extension adds the following syntax:
- lwt-binding:
{[
let%lwt ch = get_char stdin in
code
]}
is the same as [bind (get_char stdin) (fun ch -> code)].
Moreover, it supports parallel binding:
{[
let%lwt x = do_something1 ()
and y = do_something2 in
code
]}
will run [do_something1 ()] and [do_something2 ()], then
bind their results to [x] and [y]. It is the same as:
{[
let t1 = do_something1
and t2 = do_something2 in
bind t1 (fun x -> bind t2 (fun y -> code))
]}
- exception catching:
{[
try%lwt
<expr>
with
<branches>
]}
For example:
{[
try%lwt
f x
with
| Failure msg ->
prerr_endline msg;
return ()
]}
is expanded to:
{[
catch (fun () -> f x)
(function
| Failure msg ->
prerr_endline msg;
return ()
| exn ->
Lwt.fail exn)
]}
Note that the [exn -> Lwt.fail exn] branch is automatically added
when needed.
- finalizer:
{[
(<expr>) [%finally <expr>]
]}
You can use [[%lwt.finally ...]] instead of [[%finally ...]].
- assertion:
{[
assert%lwt <expr>
]}
- for loop:
{[
for%lwt i = <expr> to <expr> do
<expr>
done
]}
and:
{[
for%lwt i = <expr> downto <expr> do
<expr>
done
]}
- while loop:
{[
while%lwt <expr> do
<expr>
done
]}
- pattern matching:
{[
match%lwt <expr> with
| <patt_1> -> <expr_1>
...
| <patt_n> -> <expr_n>
]}
Exception cases are also supported:
{[
match%lwt <expr> with
| exception <exn> -> <expr_1>
| <patt_2> -> <expr_2>
...
| <patt_n> -> <expr_n>
]}
- conditional:
{[
if%lwt <expr> then
<expr_1>
else
<expr_2>
]}
and
{[
if%lwt <expr> then <expr_1>
]}
- exception raising:
For all other expression, the construct
{[
[%lwt <expr>]
]}
is expanded to:
{[
Lwt.catch (fun () -> <expr>) Lwt.fail
]}
It allows to encode the old [raise_lwt <e>] as [[%lwt raise <e>]], and offers a convenient way to interact with non-Lwt code.
{2 Debug}
By default, the debug mode is enabled. This means that the [backtrace] versions of the [bind], [finalize] and [catch] functions are used, enabling proper backtraces for the Lwt exceptions.
The debug mode can be disabled with the option [-no-debug]:
{v
$ ocamlfind ocamlc -package lwt.ppx \
-ppxopt lwt.ppx,-no-debug -linkpkg -o foo foo.ml
v}
{2 Sequence}
It is also possible to sequence Lwt operations with the [>>] operator:
{[
write stdout "Hello, " >> write stdout "world!"
]}
By default, each operation must return [unit Lwt.t]. This constraint can be
lifted with the option [-no-strict-sequence]. The operator can be disabled
with the option [-no-sequence].
If you are mixing `>>` and `;`, you need to use parentheses or `begin`/`end`
to get the result you expect:
{[
write stdout "Hello, " >> (ignore (); write stdout "world!")
]}
Note that unlike [>>=], [>>] is not an OCaml value. it is a piece of syntax
added by the ppx rewriter - i.e., you cannot refer to [(>>)].
{2 Logging}
The logging syntax extension is enabled with [-log].
It will replace expressions of the form:
{[
Lwt_log.info_f ~section "x = %d" x
]}
by
{[
if Lwt_log.Section.level section <= Lwt_log.Info then
Lwt_log.info_f ~section "x = %d" x
else
return ()
]}
Notes:
- The application must be complete. For example: [Log.info "%d"]
will make compilation fail.
- Debug messages are removed if the option [-no-debug] is passed.
*)
| null | https://raw.githubusercontent.com/mzp/bs-lwt/f37a3c47d038f4efcd65912c41fab95d1e6633ce/lwt/src/ppx/ppx_lwt.mli | ocaml | * Ppx syntax extension for Lwt | Lightweight thread library for OCaml
*
* Module Ppx_lwt
* Copyright ( C ) 2014 , .
*
* This program is free software ; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as
* published by the Free Software Foundation , with linking exceptions ;
* either version 2.1 of the License , or ( at your option ) any later
* version . See COPYING file for details .
*
* This program is distributed in the hope that it will be useful , but
* WITHOUT ANY WARRANTY ; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the GNU
* Lesser General Public License for more details .
*
* You should have received a copy of the GNU Lesser General Public
* License along with this program ; if not , write to the Free Software
* Foundation , Inc. , 59 Temple Place - Suite 330 , Boston , MA
* 02111 - 1307 , USA .
*
* Module Ppx_lwt
* Copyright (C) 2014 Gabriel Radanne, Peter Zotov.
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU Lesser General Public License as
* published by the Free Software Foundation, with linking exceptions;
* either version 2.1 of the License, or (at your option) any later
* version. See COPYING file for details.
*
* This program is distributed in the hope that it will be useful, but
* WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
* Lesser General Public License for more details.
*
* You should have received a copy of the GNU Lesser General Public
* License along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place - Suite 330, Boston, MA
* 02111-1307, USA.
*)
* { 2 Ppx extensions }
This Ppx extension adds various syntactic shortcut for lwt programming .
It needs OCaml > = 4.02 and { { : }ppx_tools } .
To use it , simply use the ocamlfind package [ lwt.ppx ] .
This extension adds the following syntax :
- lwt - binding :
{ [
let%lwt ch = in
code
] }
is the same as [ bind ( get_char stdin ) ( fun ch - > code ) ] .
Moreover , it supports parallel binding :
{ [
let%lwt x = do_something1 ( )
and y = do_something2 in
code
] }
will run [ do_something1 ( ) ] and [ do_something2 ( ) ] , then
bind their results to [ x ] and [ y ] . It is the same as :
{ [
let t1 = do_something1
and t2 = do_something2 in
bind t1 ( fun x - > bind t2 ( fun y - > code ) )
] }
- exception catching :
{ [
try%lwt
< expr >
with
< branches >
] }
For example :
{ [
try%lwt
f x
with
| Failure msg - >
prerr_endline msg ;
return ( )
] }
is expanded to :
{ [
catch ( fun ( ) - > f x )
( function
| Failure msg - >
prerr_endline msg ;
return ( )
| exn - >
Lwt.fail exn )
] }
Note that the [ exn - > Lwt.fail exn ] branch is automatically added
when needed .
- finalizer :
{ [
( < expr > ) [ % finally < expr > ]
] }
You can use [ [ % lwt.finally ... ] ] instead of [ [ % finally ... ] ] .
- assertion :
{ [
assert%lwt < expr >
] }
- for loop :
{ [
for%lwt i = < expr > to < expr > do
< expr >
done
] }
and :
{ [
for%lwt i = < expr > downto < expr > do
< expr >
done
] }
- while loop :
{ [
while%lwt < expr > do
< expr >
done
] }
- pattern matching :
{ [
match%lwt < expr > with
| < patt_1 > - > < expr_1 >
...
| < patt_n > - > < expr_n >
] }
Exception cases are also supported :
{ [
match%lwt < expr > with
| exception < exn > - > < expr_1 >
| < patt_2 > - > < expr_2 >
...
| < patt_n > - > < expr_n >
] }
- conditional :
{ [
< expr > then
< expr_1 >
else
< expr_2 >
] }
and
{ [
< expr > then < expr_1 >
] }
- exception raising :
For all other expression , the construct
{ [
[ % lwt < expr > ]
] }
is expanded to :
{ [
Lwt.catch ( fun ( ) - > < expr > ) Lwt.fail
] }
It allows to encode the old [ raise_lwt < e > ] as [ [ % lwt raise < e > ] ] , and offers a convenient way to interact with non - Lwt code .
{ 2 Debug }
By default , the debug mode is enabled . This means that the [ backtrace ] versions of the [ bind ] , [ finalize ] and [ catch ] functions are used , enabling proper backtraces for the Lwt exceptions .
The debug mode can be disabled with the option [ -no - debug ] :
{ v
$ ocamlfind ocamlc -package lwt.ppx \
-ppxopt lwt.ppx,-no - debug -linkpkg -o foo foo.ml
v }
{ 2 Sequence }
It is also possible to sequence Lwt operations with the [ > > ] operator :
{ [
write stdout " Hello , " > > write stdout " world ! "
] }
By default , each operation must return [ unit . This constraint can be
lifted with the option [ -no - strict - sequence ] . The operator can be disabled
with the option [ -no - sequence ] .
If you are mixing ` > > ` and ` ; ` , you need to use parentheses or ` begin`/`end `
to get the result you expect :
{ [
write stdout " Hello , " > > ( ignore ( ) ; write stdout " world ! " )
] }
Note that unlike [ > > =] , [ > > ] is not an OCaml value . it is a piece of syntax
added by the ppx rewriter - i.e. , you can not refer to [ ( > > ) ] .
{ 2 Logging }
The logging syntax extension is enabled with [ -log ] .
It will replace expressions of the form :
{ [
Lwt_log.info_f ~section " x = % d " x
] }
by
{ [
if Lwt_log.Section.level section < = Lwt_log . Info then
Lwt_log.info_f ~section " x = % d " x
else
return ( )
] }
Notes :
- The application must be complete . For example : [ Log.info " % d " ]
will make compilation fail .
- Debug messages are removed if the option [ -no - debug ] is passed .
This Ppx extension adds various syntactic shortcut for lwt programming.
It needs OCaml >= 4.02 and {{:}ppx_tools}.
To use it, simply use the ocamlfind package [lwt.ppx].
This extension adds the following syntax:
- lwt-binding:
{[
let%lwt ch = get_char stdin in
code
]}
is the same as [bind (get_char stdin) (fun ch -> code)].
Moreover, it supports parallel binding:
{[
let%lwt x = do_something1 ()
and y = do_something2 in
code
]}
will run [do_something1 ()] and [do_something2 ()], then
bind their results to [x] and [y]. It is the same as:
{[
let t1 = do_something1
and t2 = do_something2 in
bind t1 (fun x -> bind t2 (fun y -> code))
]}
- exception catching:
{[
try%lwt
<expr>
with
<branches>
]}
For example:
{[
try%lwt
f x
with
| Failure msg ->
prerr_endline msg;
return ()
]}
is expanded to:
{[
catch (fun () -> f x)
(function
| Failure msg ->
prerr_endline msg;
return ()
| exn ->
Lwt.fail exn)
]}
Note that the [exn -> Lwt.fail exn] branch is automatically added
when needed.
- finalizer:
{[
(<expr>) [%finally <expr>]
]}
You can use [[%lwt.finally ...]] instead of [[%finally ...]].
- assertion:
{[
assert%lwt <expr>
]}
- for loop:
{[
for%lwt i = <expr> to <expr> do
<expr>
done
]}
and:
{[
for%lwt i = <expr> downto <expr> do
<expr>
done
]}
- while loop:
{[
while%lwt <expr> do
<expr>
done
]}
- pattern matching:
{[
match%lwt <expr> with
| <patt_1> -> <expr_1>
...
| <patt_n> -> <expr_n>
]}
Exception cases are also supported:
{[
match%lwt <expr> with
| exception <exn> -> <expr_1>
| <patt_2> -> <expr_2>
...
| <patt_n> -> <expr_n>
]}
- conditional:
{[
if%lwt <expr> then
<expr_1>
else
<expr_2>
]}
and
{[
if%lwt <expr> then <expr_1>
]}
- exception raising:
For all other expression, the construct
{[
[%lwt <expr>]
]}
is expanded to:
{[
Lwt.catch (fun () -> <expr>) Lwt.fail
]}
It allows to encode the old [raise_lwt <e>] as [[%lwt raise <e>]], and offers a convenient way to interact with non-Lwt code.
{2 Debug}
By default, the debug mode is enabled. This means that the [backtrace] versions of the [bind], [finalize] and [catch] functions are used, enabling proper backtraces for the Lwt exceptions.
The debug mode can be disabled with the option [-no-debug]:
{v
$ ocamlfind ocamlc -package lwt.ppx \
-ppxopt lwt.ppx,-no-debug -linkpkg -o foo foo.ml
v}
{2 Sequence}
It is also possible to sequence Lwt operations with the [>>] operator:
{[
write stdout "Hello, " >> write stdout "world!"
]}
By default, each operation must return [unit Lwt.t]. This constraint can be
lifted with the option [-no-strict-sequence]. The operator can be disabled
with the option [-no-sequence].
If you are mixing `>>` and `;`, you need to use parentheses or `begin`/`end`
to get the result you expect:
{[
write stdout "Hello, " >> (ignore (); write stdout "world!")
]}
Note that unlike [>>=], [>>] is not an OCaml value. it is a piece of syntax
added by the ppx rewriter - i.e., you cannot refer to [(>>)].
{2 Logging}
The logging syntax extension is enabled with [-log].
It will replace expressions of the form:
{[
Lwt_log.info_f ~section "x = %d" x
]}
by
{[
if Lwt_log.Section.level section <= Lwt_log.Info then
Lwt_log.info_f ~section "x = %d" x
else
return ()
]}
Notes:
- The application must be complete. For example: [Log.info "%d"]
will make compilation fail.
- Debug messages are removed if the option [-no-debug] is passed.
*)
|
bfff548b8d3cd17fea88463ed87048ece6156f718b895568ad160d3197a1994f | racket/web-server | soft.rkt | #lang racket/base
(require racket/contract
racket/match
racket/local
racket/serialize)
(define-serializable-struct soft-state-record (thnk))
(define-struct some (value))
(define *soft-state-cache*
(make-weak-hash))
(define (make-soft-state thnk)
(make-soft-state-record thnk))
(define (soft-state-ref ss)
(match ss
[(struct soft-state-record (thnk))
(define the-weak-box
(hash-ref! *soft-state-cache* ss (lambda () (make-weak-box (make-some (thnk))))))
(define the-val
(weak-box-value the-weak-box))
(if (some? the-val)
(some-value the-val)
(local [(define real-val (thnk))]
(hash-set! *soft-state-cache* ss (make-weak-box (make-some real-val)))
real-val))]))
(define soft-state? soft-state-record?)
(define-syntax-rule (soft-state expr ...)
(make-soft-state (lambda () expr ...)))
(provide
soft-state)
(provide/contract
[soft-state? (any/c . -> . boolean?)]
[make-soft-state ((-> any/c) . -> . soft-state?)]
[soft-state-ref (soft-state? . -> . any/c)])
| null | https://raw.githubusercontent.com/racket/web-server/f718800b5b3f407f7935adf85dfa663c4bba1651/web-server-lib/web-server/lang/soft.rkt | racket | #lang racket/base
(require racket/contract
racket/match
racket/local
racket/serialize)
(define-serializable-struct soft-state-record (thnk))
(define-struct some (value))
(define *soft-state-cache*
(make-weak-hash))
(define (make-soft-state thnk)
(make-soft-state-record thnk))
(define (soft-state-ref ss)
(match ss
[(struct soft-state-record (thnk))
(define the-weak-box
(hash-ref! *soft-state-cache* ss (lambda () (make-weak-box (make-some (thnk))))))
(define the-val
(weak-box-value the-weak-box))
(if (some? the-val)
(some-value the-val)
(local [(define real-val (thnk))]
(hash-set! *soft-state-cache* ss (make-weak-box (make-some real-val)))
real-val))]))
(define soft-state? soft-state-record?)
(define-syntax-rule (soft-state expr ...)
(make-soft-state (lambda () expr ...)))
(provide
soft-state)
(provide/contract
[soft-state? (any/c . -> . boolean?)]
[make-soft-state ((-> any/c) . -> . soft-state?)]
[soft-state-ref (soft-state? . -> . any/c)])
| |
0043ad785ee4348942628c295f68fb896adef2e03da04008b8b4317e1c2c52ae | b0-system/brzo | a.ml |
let revolt () = print_endline "Revolt!"
let () = match !Sys.interactive with
| true -> ()
| false ->
let open Cmdliner in
let revolt_t = Term.(const revolt $ const ()) in
Term.exit @@ Term.eval (revolt_t, Term.info "revolt")
| null | https://raw.githubusercontent.com/b0-system/brzo/79d316a5024025a0112a8569a6335241b4620da8/examples/ocaml-ext-deps/a.ml | ocaml |
let revolt () = print_endline "Revolt!"
let () = match !Sys.interactive with
| true -> ()
| false ->
let open Cmdliner in
let revolt_t = Term.(const revolt $ const ()) in
Term.exit @@ Term.eval (revolt_t, Term.info "revolt")
| |
f208d6f4b4576ccfba05b29eade163e72becb872c6cf9c10b94aac1cd6a8bc0c | racket/typed-racket | no-bound-fl.rkt | #lang typed/racket/optional
(: fold-left (All (a b ...) ((a b ... -> a) a (Listof b) ... -> a)))
(define (fold-left f a . bss)
(if (ormap null? bss)
a
(apply fold-left
f
(apply f a (map car bss))
(map cdr bss))))
| null | https://raw.githubusercontent.com/racket/typed-racket/1dde78d165472d67ae682b68622d2b7ee3e15e1e/typed-racket-test/succeed/optional/no-bound-fl.rkt | racket | #lang typed/racket/optional
(: fold-left (All (a b ...) ((a b ... -> a) a (Listof b) ... -> a)))
(define (fold-left f a . bss)
(if (ormap null? bss)
a
(apply fold-left
f
(apply f a (map car bss))
(map cdr bss))))
| |
8eb01ebada1295f6454c4de81ef9211870499bfcc8d72ffbd0979b758981dfa1 | smallhadroncollider/taskell | Utility.hs | module Taskell.Data.Utility where
updateLast :: (a -> a) -> [a] -> [a]
updateLast _ [] = []
updateLast f [k] = [f k]
updateLast f (y : ys) = y : updateLast f ys | null | https://raw.githubusercontent.com/smallhadroncollider/taskell/fb7feee61a4538869b76060651cf5c3bc2fcf3fd/src/Taskell/Data/Utility.hs | haskell | module Taskell.Data.Utility where
updateLast :: (a -> a) -> [a] -> [a]
updateLast _ [] = []
updateLast f [k] = [f k]
updateLast f (y : ys) = y : updateLast f ys | |
845a354a641cdc9a23c8cb2795f36d8098ea81f33613840c159b21be3564000d | well-typed-lightbulbs/ocaml-esp32 | cow.ml | type t = int
let c = 1
let moo _t = () [@@inline never]
| null | https://raw.githubusercontent.com/well-typed-lightbulbs/ocaml-esp32/c24fcbfbee0e3aa6bb71c9b467c60c6bac326cc7/testsuite/tests/lib-dynlink-private/plugin2b/cow.ml | ocaml | type t = int
let c = 1
let moo _t = () [@@inline never]
| |
2498150b6de7d184654c82e3f7a681aaedfc78e9ae5f50b961cb5021512b3ed9 | Smoltbob/Caml-Est-Belle | farmgen.ml | let print_arm l =
let s = (Fparser.toplevel Flexer.token l) in
print_string (Fsyntax.to_arm_top s); print_newline ()
let file f =
let inchan = open_in f in
try
print_arm (Lexing.from_channel inchan);
close_in inchan
with e -> (close_in inchan; raise e)
let () =
let files = ref [] in
Arg.parse
[ ]
(fun s -> files := !files @ [s])
(Printf.sprintf "usage: %s filenames" Sys.argv.(0));
List.iter
(fun f -> ignore (file f))
!files
| null | https://raw.githubusercontent.com/Smoltbob/Caml-Est-Belle/3d6f53d4e8e01bbae57a0a402b7c0f02f4ed767c/compiler/farmgen.ml | ocaml | let print_arm l =
let s = (Fparser.toplevel Flexer.token l) in
print_string (Fsyntax.to_arm_top s); print_newline ()
let file f =
let inchan = open_in f in
try
print_arm (Lexing.from_channel inchan);
close_in inchan
with e -> (close_in inchan; raise e)
let () =
let files = ref [] in
Arg.parse
[ ]
(fun s -> files := !files @ [s])
(Printf.sprintf "usage: %s filenames" Sys.argv.(0));
List.iter
(fun f -> ignore (file f))
!files
| |
0698f1191e62ccea63f758355b72c3895d11b297dea147e7c3296495c68a07f9 | thi-ng/ws-ldn-1 | webgl.cljs | (ns ws-ldn-1.ui.day3.webgl
"WebGL reagent example: Interactive gear mesh generator & animation
Demonstrates usage of React component lifecycle stages and how to
dynamically update parts of the WebGL scene via UI controls."
(:require
[reagent.core :as r]
[thi.ng.geom.webgl.core :as gl]
[thi.ng.geom.webgl.animator :as anim]
[thi.ng.geom.webgl.buffers :as buf]
[thi.ng.geom.webgl.shaders :as sh]
[thi.ng.geom.webgl.utils :as glu]
[thi.ng.geom.core :as g]
[thi.ng.geom.core.vector :as v :refer [vec2 vec3]]
[thi.ng.geom.core.matrix :as mat :refer [M44]]
[thi.ng.geom.webgl.shaders.phong :as phong]
[thi.ng.geom.circle :as c]
[thi.ng.geom.polygon :as poly]
[thi.ng.geom.basicmesh :refer [basic-mesh]]
[thi.ng.geom.gmesh :refer [gmesh]]
[thi.ng.geom.mesh.io :as mio]
[thi.ng.geom.mesh.subdivision :as sd]
[thi.ng.typedarrays.core :as arrays]
[thi.ng.math.core :as m :refer [PI HALF_PI TWO_PI]]
[thi.ng.dstruct.streams :as streams]
[thi.ng.strf.core :as f]))
the app state will be updated when the webgl - canvas component ( below ) initializes
(defonce app-state (r/atom {:solid? true :depth 0.1 :teeth 10 :inner 0.8}))
(defn save-mesh
"Triggers download of the mesh (in STL format) stored under the :mesh key
in the app-state atom to the user's drive."
[]
(let [out (mio/wrapped-output-stream (streams/output-stream))]
(mio/write-stl out (g/tessellate (:mesh @app-state)))
(let [url (streams/as-data-url out)]
(js/setTimeout (fn [] (set! (.-href js/location) @url)) 500))))
(defn generate-mesh
"Defines a gear mesh with given number of teeth, inner radius for
profile shape and extrusion depth. The gear is initially a 2d polygon
which is then extruded as 3d mesh. Updates :solid?, :teeth, :inner,
:depth, :mesh and :model keys in app-state atom.
Returns WebGL model structure (a map)."
[gl solid? teeth inner depth]
(let [poly (poly/cog 0.5 teeth [inner 1 1 inner])
mesh (if solid?
(g/extrude
poly {:mesh (gmesh) :depth depth :scale (- 1 depth)})
(g/extrude-shell
poly {:mesh (gmesh) :depth depth :inset 0.025 :wall 0.015 :bottom? true}))
model (-> mesh
(gl/as-webgl-buffer-spec {})
(buf/make-attribute-buffers-in-spec gl gl/static-draw))]
(swap! app-state
(fn [state]
(-> state
(assoc :solid? solid? :teeth teeth :inner inner :depth depth)
(assoc :mesh mesh)
(update :model merge model))))
model))
(defn checkbox
"HTML checkbox with label"
[opts label]
[:div [:input (assoc opts :type "checkbox") label]])
(defn slider
"HTML5 slider component"
[opts label]
[:div
[:input (assoc opts :type "range")] " "
(or (:defaultValue opts) (:value opts)) " "
label])
(defn event-value
"Helper fn to retrieve an event's target DOM element value attrib"
[e] (-> e .-target .-value))
(defn webgl-canvas
"Defines a WebGL canvas component using reagent.core/create-class,
which allows us to use the various React lifecycle methods.
The :component-did-mount fn is only run once to initialize the component
and here used to do the following:
- setup all WebGL elements (context, mesh & shader)
- store the generated mesh instance & webgl model structure in the app-state atom
- attach an update loop/function triggered during every render cycle
to animate the scene
The :reagent-render function creates the component's canvas element and various
UI controls (sub-components) to allow user to adjust mesh parameters."
[id]
(r/create-class
{:component-did-mount
#(let [gl (gl/gl-context id)
view-rect (gl/get-viewport-rect gl)
;; record current time stamp (used as reference for animation)
t0 (.getTime (js/Date.))
animation fn , triggered at each React render cycle
update (fn update []
(let [;; extract keys from app state
{:keys [solid? teeth model inner depth]} @app-state
compute elapsed time ( in seconds )
t (* (- (.getTime (js/Date.)) t0) 0.001)
;; new timebased rotation matrix (base for both gears)
M44 is the 4x4 identity matrix
rot (g/rotate-y M44 (* t 1.))
matrix ( coordinate system ) for 1st gear
;; multiplying matrices = transforming coordinate systems
offset (+ (/ (inc inner) 4) (if solid? (* teeth 0.0002) (* depth 0.2)))
tx1 (g/* rot (-> M44
(g/translate (- offset) 0 0)
(g/rotate-y 0.3)
(g/rotate-z t)))
matrix for 2nd gear
tx2 (g/* rot (-> M44
(g/translate offset 0 0)
(g/rotate-y -0.3)
(g/rotate-z (- (+ t (/ HALF_PI teeth))))))]
;; clear background & depth buffer
(gl/clear-color-buffer gl 1.0 1.0 1.0 1.0)
(gl/clear-depth-buffer gl 1.0)
draw 1st gear
(phong/draw
gl (assoc-in model [:uniforms :model] tx1))
draw 2nd gear with modified transform matrix & color
(phong/draw
gl (-> model
(assoc-in [:uniforms :model] tx2)
(assoc-in [:uniforms :diffuseCol] 0x2277ff)))
;; retrigger update fn in next render cycle
(r/next-tick update)))
{:keys [solid? depth teeth inner]} @app-state]
setup complete WebGL data structure for gear mesh
(swap! app-state assoc :model
(-> (generate-mesh gl solid? teeth inner depth)
(assoc :shader (sh/make-shader-from-spec gl phong/shader-spec))
(update-in [:uniforms] merge
{:view (mat/look-at (vec3 0 0 2) (vec3) v/V3Y)
:proj (gl/perspective 45 view-rect 0.1 10.0)
:lightPos (vec3 0.1 0 1)
:ambientCol 0x111111
:diffuseCol 0xff3310
:specularCol 0xcccccc
:shininess 100
:wrap 0
:useBlinnPhong true})))
;; setup viewport
(gl/set-viewport gl view-rect)
(gl/enable gl gl/depth-test)
;; kick off update loop
(r/next-tick update))
;; :display-name is used for React.js developer tools
:display-name id
;; the render fn merely constructs the canvas
;; width & height could be passed in arguments to the parent webgl-canvas fn
;; or could be kept in app-state and referenced from there
;; the latter is useful when creating a full-window canvas which needs to be resizable
;; left as exercise for the reader...
:reagent-render
(fn []
(let [{:keys [solid? teeth inner depth]} @app-state
gl (gl/gl-context id)]
[:div
[:canvas {:key id :id id :width 640 :height 480}]
[checkbox
{:checked solid?
:on-change #(generate-mesh gl (-> % .-target .-checked) teeth inner depth)}
"solid mesh"]
[slider
{:min 4 :max 20 :step 2 :defaultValue teeth
:on-change #(generate-mesh gl solid? (f/parse-int (event-value %) 10) inner depth)}
"teeth"]
[slider
{:min 0.6 :max 0.9 :step 0.05 :defaultValue inner
:on-change #(generate-mesh gl solid? teeth (f/parse-float (event-value %)) depth)}
"inner radius"]
[slider
{:min 0.1 :max 0.5 :step 0.01 :defaultValue depth
:on-change #(generate-mesh gl solid? teeth inner (f/parse-float (event-value %)))}
"extrusion"]]))}))
(defn app-component
"Main React root/application component"
[]
[:div
[webgl-canvas "main"]
[:div
[:p "Download the gear 3d model as STL (the downloaded file
should be renamed with the .stl file extension -
can't be specified via JS)"]
[:p "Use " [:a {:href ""} "Meshlab"] " to view the file."]
[:p [:button {:on-click save-mesh} "Download STL"]]]])
(defn main
"App entry point"
[]
(r/render-component [app-component] (.-body js/document)))
(main) | null | https://raw.githubusercontent.com/thi-ng/ws-ldn-1/090c851cc699f0f994779ddc09275fc1718400a8/src-cljs-day3-2/ws_ldn_1/ui/day3/webgl.cljs | clojure | record current time stamp (used as reference for animation)
extract keys from app state
new timebased rotation matrix (base for both gears)
multiplying matrices = transforming coordinate systems
clear background & depth buffer
retrigger update fn in next render cycle
setup viewport
kick off update loop
:display-name is used for React.js developer tools
the render fn merely constructs the canvas
width & height could be passed in arguments to the parent webgl-canvas fn
or could be kept in app-state and referenced from there
the latter is useful when creating a full-window canvas which needs to be resizable
left as exercise for the reader... | (ns ws-ldn-1.ui.day3.webgl
"WebGL reagent example: Interactive gear mesh generator & animation
Demonstrates usage of React component lifecycle stages and how to
dynamically update parts of the WebGL scene via UI controls."
(:require
[reagent.core :as r]
[thi.ng.geom.webgl.core :as gl]
[thi.ng.geom.webgl.animator :as anim]
[thi.ng.geom.webgl.buffers :as buf]
[thi.ng.geom.webgl.shaders :as sh]
[thi.ng.geom.webgl.utils :as glu]
[thi.ng.geom.core :as g]
[thi.ng.geom.core.vector :as v :refer [vec2 vec3]]
[thi.ng.geom.core.matrix :as mat :refer [M44]]
[thi.ng.geom.webgl.shaders.phong :as phong]
[thi.ng.geom.circle :as c]
[thi.ng.geom.polygon :as poly]
[thi.ng.geom.basicmesh :refer [basic-mesh]]
[thi.ng.geom.gmesh :refer [gmesh]]
[thi.ng.geom.mesh.io :as mio]
[thi.ng.geom.mesh.subdivision :as sd]
[thi.ng.typedarrays.core :as arrays]
[thi.ng.math.core :as m :refer [PI HALF_PI TWO_PI]]
[thi.ng.dstruct.streams :as streams]
[thi.ng.strf.core :as f]))
the app state will be updated when the webgl - canvas component ( below ) initializes
(defonce app-state (r/atom {:solid? true :depth 0.1 :teeth 10 :inner 0.8}))
(defn save-mesh
"Triggers download of the mesh (in STL format) stored under the :mesh key
in the app-state atom to the user's drive."
[]
(let [out (mio/wrapped-output-stream (streams/output-stream))]
(mio/write-stl out (g/tessellate (:mesh @app-state)))
(let [url (streams/as-data-url out)]
(js/setTimeout (fn [] (set! (.-href js/location) @url)) 500))))
(defn generate-mesh
"Defines a gear mesh with given number of teeth, inner radius for
profile shape and extrusion depth. The gear is initially a 2d polygon
which is then extruded as 3d mesh. Updates :solid?, :teeth, :inner,
:depth, :mesh and :model keys in app-state atom.
Returns WebGL model structure (a map)."
[gl solid? teeth inner depth]
(let [poly (poly/cog 0.5 teeth [inner 1 1 inner])
mesh (if solid?
(g/extrude
poly {:mesh (gmesh) :depth depth :scale (- 1 depth)})
(g/extrude-shell
poly {:mesh (gmesh) :depth depth :inset 0.025 :wall 0.015 :bottom? true}))
model (-> mesh
(gl/as-webgl-buffer-spec {})
(buf/make-attribute-buffers-in-spec gl gl/static-draw))]
(swap! app-state
(fn [state]
(-> state
(assoc :solid? solid? :teeth teeth :inner inner :depth depth)
(assoc :mesh mesh)
(update :model merge model))))
model))
(defn checkbox
"HTML checkbox with label"
[opts label]
[:div [:input (assoc opts :type "checkbox") label]])
(defn slider
"HTML5 slider component"
[opts label]
[:div
[:input (assoc opts :type "range")] " "
(or (:defaultValue opts) (:value opts)) " "
label])
(defn event-value
"Helper fn to retrieve an event's target DOM element value attrib"
[e] (-> e .-target .-value))
(defn webgl-canvas
"Defines a WebGL canvas component using reagent.core/create-class,
which allows us to use the various React lifecycle methods.
The :component-did-mount fn is only run once to initialize the component
and here used to do the following:
- setup all WebGL elements (context, mesh & shader)
- store the generated mesh instance & webgl model structure in the app-state atom
- attach an update loop/function triggered during every render cycle
to animate the scene
The :reagent-render function creates the component's canvas element and various
UI controls (sub-components) to allow user to adjust mesh parameters."
[id]
(r/create-class
{:component-did-mount
#(let [gl (gl/gl-context id)
view-rect (gl/get-viewport-rect gl)
t0 (.getTime (js/Date.))
animation fn , triggered at each React render cycle
update (fn update []
{:keys [solid? teeth model inner depth]} @app-state
compute elapsed time ( in seconds )
t (* (- (.getTime (js/Date.)) t0) 0.001)
M44 is the 4x4 identity matrix
rot (g/rotate-y M44 (* t 1.))
matrix ( coordinate system ) for 1st gear
offset (+ (/ (inc inner) 4) (if solid? (* teeth 0.0002) (* depth 0.2)))
tx1 (g/* rot (-> M44
(g/translate (- offset) 0 0)
(g/rotate-y 0.3)
(g/rotate-z t)))
matrix for 2nd gear
tx2 (g/* rot (-> M44
(g/translate offset 0 0)
(g/rotate-y -0.3)
(g/rotate-z (- (+ t (/ HALF_PI teeth))))))]
(gl/clear-color-buffer gl 1.0 1.0 1.0 1.0)
(gl/clear-depth-buffer gl 1.0)
draw 1st gear
(phong/draw
gl (assoc-in model [:uniforms :model] tx1))
draw 2nd gear with modified transform matrix & color
(phong/draw
gl (-> model
(assoc-in [:uniforms :model] tx2)
(assoc-in [:uniforms :diffuseCol] 0x2277ff)))
(r/next-tick update)))
{:keys [solid? depth teeth inner]} @app-state]
setup complete WebGL data structure for gear mesh
(swap! app-state assoc :model
(-> (generate-mesh gl solid? teeth inner depth)
(assoc :shader (sh/make-shader-from-spec gl phong/shader-spec))
(update-in [:uniforms] merge
{:view (mat/look-at (vec3 0 0 2) (vec3) v/V3Y)
:proj (gl/perspective 45 view-rect 0.1 10.0)
:lightPos (vec3 0.1 0 1)
:ambientCol 0x111111
:diffuseCol 0xff3310
:specularCol 0xcccccc
:shininess 100
:wrap 0
:useBlinnPhong true})))
(gl/set-viewport gl view-rect)
(gl/enable gl gl/depth-test)
(r/next-tick update))
:display-name id
:reagent-render
(fn []
(let [{:keys [solid? teeth inner depth]} @app-state
gl (gl/gl-context id)]
[:div
[:canvas {:key id :id id :width 640 :height 480}]
[checkbox
{:checked solid?
:on-change #(generate-mesh gl (-> % .-target .-checked) teeth inner depth)}
"solid mesh"]
[slider
{:min 4 :max 20 :step 2 :defaultValue teeth
:on-change #(generate-mesh gl solid? (f/parse-int (event-value %) 10) inner depth)}
"teeth"]
[slider
{:min 0.6 :max 0.9 :step 0.05 :defaultValue inner
:on-change #(generate-mesh gl solid? teeth (f/parse-float (event-value %)) depth)}
"inner radius"]
[slider
{:min 0.1 :max 0.5 :step 0.01 :defaultValue depth
:on-change #(generate-mesh gl solid? teeth inner (f/parse-float (event-value %)))}
"extrusion"]]))}))
(defn app-component
"Main React root/application component"
[]
[:div
[webgl-canvas "main"]
[:div
[:p "Download the gear 3d model as STL (the downloaded file
should be renamed with the .stl file extension -
can't be specified via JS)"]
[:p "Use " [:a {:href ""} "Meshlab"] " to view the file."]
[:p [:button {:on-click save-mesh} "Download STL"]]]])
(defn main
"App entry point"
[]
(r/render-component [app-component] (.-body js/document)))
(main) |
869521bbd84ca0d3617d91c47a90a24fb2d551b88e85124c0aadc8eca91239e8 | PeterDWhite/Osker | UserTrap.hs | Copyright ( C ) , 2001 , 2002 , 2003
Copyright ( c ) OHSU , 2001 , 2002 , 2003
module UserTrap ( UserTrap (..) ) where
----------------------------------------------------------------------
-- The trap message defines the requests and response of the trap
-- handler thread
----------------------------------------------------------------------
-- Haskell imports
import qualified Dynamic as DYN
import Dynamic ( TyCon, Typeable, mkTyCon, typeOf, mkAppTy )
imports
import qualified SystemCall as SC
import qualified SystemCallOptions as SCO
-- For inputs from the user process
data UserTrap tid =
UserTrap { utSysReq :: SC.SystemRequest
, utTid :: tid
, utCallType :: SCO.SystemCallOptions
} deriving (Show)
utCon :: TyCon
utCon = mkTyCon "User Trap"
instance Typeable (UserTrap tid) where
typeOf _ = mkAppTy utCon []
| null | https://raw.githubusercontent.com/PeterDWhite/Osker/301e1185f7c08c62c2929171cc0469a159ea802f/Posix/UserTrap.hs | haskell | --------------------------------------------------------------------
The trap message defines the requests and response of the trap
handler thread
--------------------------------------------------------------------
Haskell imports
For inputs from the user process | Copyright ( C ) , 2001 , 2002 , 2003
Copyright ( c ) OHSU , 2001 , 2002 , 2003
module UserTrap ( UserTrap (..) ) where
import qualified Dynamic as DYN
import Dynamic ( TyCon, Typeable, mkTyCon, typeOf, mkAppTy )
imports
import qualified SystemCall as SC
import qualified SystemCallOptions as SCO
data UserTrap tid =
UserTrap { utSysReq :: SC.SystemRequest
, utTid :: tid
, utCallType :: SCO.SystemCallOptions
} deriving (Show)
utCon :: TyCon
utCon = mkTyCon "User Trap"
instance Typeable (UserTrap tid) where
typeOf _ = mkAppTy utCon []
|
2143065f71b07e58d5a95d4aec5e523224a9dae477b564a54e613a8885f0f092 | RefactoringTools/HaRe | Where2.hs | module LiftToToplevel.Where1 where
import Data.Tree.DUAL.Internal
import Data.Semigroup
import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.List.NonEmpty as NEL
unpack = undefined
foldDUALNE :: (Semigroup d, Monoid d)
=> (d -> l -> r) -- ^ Process a leaf datum along with the
-- accumulation of @d@ values along the
-- path from the root
^ Replace @LeafU@ nodes
-> (NonEmpty r -> r) -- ^ Combine results at a branch node
-> (d -> r -> r) -- ^ Process an internal d node
-> (a -> r -> r) -- ^ Process an internal datum
-> DUALTreeNE d u a l -> r
foldDUALNE = foldDUALNE' (Option Nothing)
where
foldDUALNE' dacc lf lfU con down ann (Concat ts)
= con (NEL.map (foldDUALNE' dacc lf lfU con down ann . snd . unpack) ts)
| null | https://raw.githubusercontent.com/RefactoringTools/HaRe/ef5dee64c38fb104e6e5676095946279fbce381c/test/testdata/LiftToToplevel/Where2.hs | haskell | ^ Process a leaf datum along with the
accumulation of @d@ values along the
path from the root
^ Combine results at a branch node
^ Process an internal d node
^ Process an internal datum | module LiftToToplevel.Where1 where
import Data.Tree.DUAL.Internal
import Data.Semigroup
import Data.List.NonEmpty (NonEmpty (..))
import qualified Data.List.NonEmpty as NEL
unpack = undefined
foldDUALNE :: (Semigroup d, Monoid d)
^ Replace @LeafU@ nodes
-> DUALTreeNE d u a l -> r
foldDUALNE = foldDUALNE' (Option Nothing)
where
foldDUALNE' dacc lf lfU con down ann (Concat ts)
= con (NEL.map (foldDUALNE' dacc lf lfU con down ann . snd . unpack) ts)
|
fff5d5d3b4c37b599cca219b3445b8834ac1fd5f6703de43b6a7b3ccdefd06c5 | jvf/scalaris | yaws_revproxy.erl | %%%-------------------------------------------------------------------
%%% File : yaws_revproxy.erl
%%% Author : <>
%%% Description : reverse proxy
%%%
Created : 3 Dec 2003 by < >
%%%-------------------------------------------------------------------
-module(yaws_revproxy).
-include("../include/yaws.hrl").
-include("../include/yaws_api.hrl").
-include("yaws_debug.hrl").
-export([out/1]).
%% reverse proxy implementation.
the revproxy internal state
-record(revproxy, {srvsock, %% the socket opened on the backend server
the socket type : ssl | nossl
cliconn_status, %% "Connection:" header value:
srvconn_status, %% "keep-alive' or "close"
state, %% revproxy state:
%% sendheaders | sendcontent | sendchunk |
%% recvheaders | recvcontent | recvchunk |
%% terminate
prefix, %% The prefix to strip and add
url, %% the url we're proxying to
r_meth, %% what req method are we processing
r_host, %% and value of Host: for the cli request
resp, %% response received from the server
headers, %% and associated headers
srvdata, %% the server data
is_chunked, %% true if the response is chunked
intercept_mod %% revproxy request/response intercept module
}).
%% TODO: Activate proxy keep-alive with a new option ?
-define(proxy_keepalive, false).
Initialize the connection to the backend server . If an error occurred , return
an error 404 .
out(#arg{req=Req, headers=Hdrs, state=#proxy_cfg{url=URL}=State}=Arg) ->
case connect(URL) of
{ok, Sock, Type} ->
?Debug("Connection established on ~p: Socket=~p, Type=~p~n",
[URL, Sock, Type]),
RPState = #revproxy{srvsock = Sock,
type = Type,
state = sendheaders,
prefix = State#proxy_cfg.prefix,
url = URL,
r_meth = Req#http_request.method,
r_host = Hdrs#headers.host,
intercept_mod = State#proxy_cfg.intercept_mod},
out(Arg#arg{state=RPState});
_ERR ->
?Debug("Connection failed: ~p~n", [_ERR]),
out404(Arg)
end;
%% Send the client request to the server then check if the request content is
%% chunked or not
out(#arg{state=#revproxy{}=RPState}=Arg)
when RPState#revproxy.state == sendheaders ->
?Debug("Send request headers to backend server: ~n"
" - ~s~n", [?format_record(Arg#arg.req, http_request)]),
Req = rewrite_request(RPState, Arg#arg.req),
Hdrs0 = Arg#arg.headers,
Hdrs = rewrite_client_headers(RPState, Hdrs0),
{NewReq, NewHdrs} = case RPState#revproxy.intercept_mod of
undefined ->
{Req, Hdrs};
InterceptMod ->
case catch InterceptMod:rewrite_request(
Req, Hdrs) of
{ok, NewReq0, NewHdrs0} ->
{NewReq0, NewHdrs0};
InterceptError ->
error_logger:error_msg(
"revproxy intercept module ~p:"
"rewrite_request failed: ~p~n",
[InterceptMod, InterceptError]),
exit({error, intercept_mod})
end
end,
ReqStr = yaws_api:reformat_request(NewReq),
HdrsStr = yaws:headers_to_str(NewHdrs),
case send(RPState, [ReqStr, "\r\n", HdrsStr, "\r\n"]) of
ok ->
TE = yaws:to_lower(Hdrs#headers.transfer_encoding),
RPState1 = if
(Hdrs#headers.content_length == undefined andalso
TE == "chunked") ->
?Debug("Request content is chunked~n", []),
RPState#revproxy{state=sendchunk};
true ->
RPState#revproxy{state=sendcontent}
end,
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
%% Send the request content to the server. Here the content is not chunked. But
%% it can be split because of 'partial_post_size' value.
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == sendcontent ->
case Arg#arg.clidata of
{partial, Bin} ->
?Debug("Send partial content to backend server: ~p bytes~n",
[size(Bin)]),
case send(RPState, Bin) of
ok ->
{get_more, undefined, RPState};
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
Bin when is_binary(Bin), Bin /= <<>> ->
?Debug("Send content to backend server: ~p bytes~n", [size(Bin)]),
case send(RPState, Bin) of
ok ->
RPState1 = RPState#revproxy{state=recvheaders},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
_ ->
?Debug("no content found~n", []),
RPState1 = RPState#revproxy{state=recvheaders},
out(Arg#arg{state=RPState1})
end;
%% Send the request content to the server. Here the content is chunked, so we
%% must rebuild the chunk before sending it. Chunks can have different size than
%% the original request because of 'partial_post_size' value.
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == sendchunk ->
case Arg#arg.clidata of
{partial, Bin} ->
?Debug("Send chunked content to backend server: ~p bytes~n",
[size(Bin)]),
Res = send(RPState,
[yaws:integer_to_hex(size(Bin)),"\r\n",Bin,"\r\n"]),
case Res of
ok ->
{get_more, undefined, RPState};
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
<<>> ->
?Debug("Send last chunk to backend server~n", []),
case send(RPState, "0\r\n\r\n") of
ok ->
RPState1 = RPState#revproxy{state=recvheaders},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end
end;
%% The request and its content were sent. Now, we try to read the response
%% headers. Then we check if the response content is chunked or not.
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == recvheaders ->
Res = yaws:http_get_headers(RPState#revproxy.srvsock,
RPState#revproxy.type),
case Res of
{error, {too_many_headers, _Resp}} ->
?Debug("Response headers too large from backend server~n", []),
close(RPState),
outXXX(500, Arg);
{Resp0, RespHdrs0} when is_record(Resp0, http_response) ->
?Debug("Response headers received from backend server:~n"
" - ~s~n - ~s~n", [?format_record(Resp0, http_response),
?format_record(RespHdrs0, headers)]),
{Resp, RespHdrs} =
case RPState#revproxy.intercept_mod of
undefined ->
{Resp0, RespHdrs0};
InterceptMod ->
case catch InterceptMod:rewrite_response(
Resp0, RespHdrs0) of
{ok, NewResp, NewRespHdrs} ->
{NewResp, NewRespHdrs};
InterceptError ->
error_logger:error_msg(
"revproxy intercept module ~p:"
"rewrite_response failure: ~p~n",
[InterceptMod, InterceptError]),
exit({error, intercept_mod})
end
end,
{CliConn, SrvConn} = get_connection_status(
(Arg#arg.req)#http_request.version,
Arg#arg.headers, RespHdrs
),
RPState1 = RPState#revproxy{cliconn_status = CliConn,
srvconn_status = SrvConn,
resp = Resp,
headers = RespHdrs},
if
RPState1#revproxy.r_meth =:= 'HEAD' ->
RPState2 = RPState1#revproxy{state=terminate},
out(Arg#arg{state=RPState2});
Resp#http_response.status =:= 100 orelse
Resp#http_response.status =:= 204 orelse
Resp#http_response.status =:= 205 orelse
Resp#http_response.status =:= 304 orelse
Resp#http_response.status =:= 406 ->
RPState2 = RPState1#revproxy{state=terminate},
out(Arg#arg{state=RPState2});
true ->
RPState2 =
case RespHdrs#headers.content_length of
undefined ->
TE = yaws:to_lower(
RespHdrs#headers.transfer_encoding),
case TE of
"chunked" ->
?Debug("Response content is chunked~n",
[]),
RPState1#revproxy{state=recvchunk};
_ ->
RPState1#revproxy{
cliconn_status="close",
srvconn_status="close",
state=recvcontent}
end;
_ ->
RPState1#revproxy{state=recvcontent}
end,
out(Arg#arg{state=RPState2})
end;
{_R, _H} ->
%% bad_request
?Debug("Bad response received from backend server: ~p~n", [_R]),
close(RPState),
outXXX(500, Arg);
closed ->
?Debug("TCP error: ~p~n", [closed]),
outXXX(500, Arg)
end;
%% The response content is not chunked.
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == recvcontent ->
Len = case (RPState#revproxy.headers)#headers.content_length of
undefined -> undefined;
CLen -> list_to_integer(CLen)
end,
SC=get(sc),
if
is_integer(Len) andalso Len =< SC#sconf.partial_post_size ->
case read(RPState, Len) of
{ok, Data} ->
?Debug("Response content received from the backend server: "
"~p bytes~n", [size(Data)]),
RPState1 = RPState#revproxy{state = terminate,
is_chunked = false,
srvdata = {content, Data}},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
is_integer(Len) ->
Here partial_post_size is always an integer
BlockSize = SC#sconf.partial_post_size,
BlockCount = Len div BlockSize,
LastBlock = Len rem BlockSize,
SrvData = {block, BlockCount, BlockSize, LastBlock},
RPState1 = RPState#revproxy{state = terminate,
is_chunked = true,
srvdata = SrvData},
out(Arg#arg{state=RPState1});
true ->
SrvData = {block, undefined, undefined, undefined},
RPState1 = RPState#revproxy{state = terminate,
is_chunked = true,
srvdata = SrvData},
out(Arg#arg{state=RPState1})
end;
The response content is chunked . Read the first chunk here and spawn a
%% process to read others.
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == recvchunk ->
case read_chunk(RPState) of
{ok, Data} ->
?Debug("First chunk received from the backend server : "
"~p bytes~n", [size(Data)]),
RPState1 = RPState#revproxy{state = terminate,
is_chunked = (Data /= <<>>),
srvdata = {stream, Data}},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
%% Now, we return the result and we let yaws_server deals with it. If it is
%% possible, we try to cache the connection.
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == terminate ->
case RPState#revproxy.srvconn_status of
"close" when RPState#revproxy.is_chunked == false -> close(RPState);
"close" -> ok;
_ -> cache_connection(RPState)
end,
AllHdrs = [{header, H} || H <- yaws_api:reformat_header(
rewrite_server_headers(RPState)
)],
?Debug("~p~n", [AllHdrs]),
Res = [
{status, (RPState#revproxy.resp)#http_response.status},
{allheaders, AllHdrs}
],
case RPState#revproxy.srvdata of
{content, <<>>} ->
Res;
{content, Data} ->
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{content, MimeType, Data}];
{stream, <<>>} ->
Chunked response with only the last empty chunk : do not spawn a
%% process to manage chunks
yaws_api:stream_chunk_end(self()),
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{streamcontent, MimeType, <<>>}];
{stream, Chunk} ->
Self = self(),
GC = get(gc),
spawn(fun() -> put(gc, GC), recv_next_chunk(Self, Arg) end),
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{streamcontent, MimeType, Chunk}];
{block, BlockCnt, BlockSz, LastBlock} ->
GC = get(gc),
Pid = spawn(fun() ->
put(gc, GC),
receive
{ok, YawsPid} ->
recv_blocks(YawsPid, Arg, BlockCnt,
BlockSz, LastBlock);
{discard, YawsPid} ->
recv_blocks(YawsPid, Arg, 0, BlockSz, 0)
end
end),
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{streamcontent_from_pid, MimeType, Pid}];
_ ->
Res
end;
Catch unexpected state by sending an error 500
out(#arg{state=RPState}=Arg) ->
?Debug("Unexpected revproxy state:~n - ~s~n",
[?format_record(RPState, revproxy)]),
case RPState#revproxy.srvsock of
undefined -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg).
%%==========================================================================
out404(Arg) ->
SC=get(sc),
(SC#sconf.errormod_404):out404(Arg,get(gc),SC).
outXXX(Code, _Arg) ->
Content = ["<html><h1>", integer_to_list(Code), $\ ,
yaws_api:code_to_phrase(Code), "</h1></html>"],
[
{status, Code},
{header, {connection, "close"}},
{content, "text/html", Content}
].
%%==========================================================================
%% This function is used to read a chunk and to stream it to the client.
recv_next_chunk(YawsPid, #arg{state=RPState}=Arg) ->
case read_chunk(RPState) of
{ok, <<>>} ->
?Debug("Last chunk received from the backend server~n", []),
yaws_api:stream_chunk_end(YawsPid),
case RPState#revproxy.srvconn_status of
"close" -> close(RPState);
_ -> ok %% Cached by the main process
end;
{ok, Data} ->
?Debug("Next chunk received from the backend server : "
"~p bytes~n", [size(Data)]),
yaws_api:stream_chunk_deliver(YawsPid, Data),
recv_next_chunk(YawsPid, Arg);
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
yaws_api:stream_chunk_end(YawsPid),
case Reason of
closed -> ok;
_ -> close(RPState)
end
end.
%%==========================================================================
%% This function reads blocks from the server and streams them to the client.
recv_blocks(YawsPid, #arg{state=RPState}=Arg,
undefined, undefined, undefined) ->
case read(RPState) of
{ok, <<>>} ->
no data , wait 100 msec to avoid time - consuming loop and retry
timer:sleep(100),
recv_blocks(YawsPid, Arg, undefined, undefined, undefined);
{ok, Data} ->
?Debug("Response content received from the backend server : "
"~p bytes~n", [size(Data)]),
ok = yaws_api:stream_process_deliver(Arg#arg.clisock, Data),
recv_blocks(YawsPid, Arg, undefined, undefined, undefined);
{error, closed} ->
yaws_api:stream_process_end(closed, YawsPid);
{error, _Reason} ->
?Debug("TCP error: ~p~n", [_Reason]),
yaws_api:stream_process_end(closed, YawsPid),
close(RPState)
end;
recv_blocks(YawsPid, #arg{state=RPState}=Arg, 0, _, 0) ->
yaws_api:stream_process_end(Arg#arg.clisock, YawsPid),
case RPState#revproxy.srvconn_status of
"close" -> close(RPState);
_ -> ok %% Cached by the main process
end;
recv_blocks(YawsPid, #arg{state=RPState}=Arg, 0, _, LastBlock) ->
Sock = Arg#arg.clisock,
case read(RPState, LastBlock) of
{ok, Data} ->
?Debug("Response content received from the backend server : "
"~p bytes~n", [size(Data)]),
ok = yaws_api:stream_process_deliver(Sock, Data),
yaws_api:stream_process_end(Sock, YawsPid),
case RPState#revproxy.srvconn_status of
"close" -> close(RPState);
_ -> ok %% Cached by the main process
end;
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
yaws_api:stream_process_end(closed, YawsPid),
case Reason of
closed -> ok;
_ -> close(RPState)
end
end;
recv_blocks(YawsPid, #arg{state=RPState}=Arg, BlockCnt, BlockSz, LastBlock) ->
case read(RPState, BlockSz) of
{ok, Data} ->
?Debug("Response content received from the backend server : "
"~p bytes~n", [size(Data)]),
ok = yaws_api:stream_process_deliver(Arg#arg.clisock, Data),
recv_blocks(YawsPid, Arg, BlockCnt-1, BlockSz, LastBlock);
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
yaws_api:stream_process_end(closed, YawsPid),
case Reason of
closed -> ok;
_ -> close(RPState)
end
end.
%%==========================================================================
%% TODO: find a better way to cache connections to backend servers. Here we can
have 1 connection per gserv process for each backend server .
get_cached_connection(URL) ->
Key = lists:flatten(yaws_api:reformat_url(URL)),
case erase(Key) of
undefined ->
undefined;
{Sock, nossl} ->
case gen_tcp:recv(Sock, 0, 1) of
{error, closed} ->
?Debug("Invalid cached connection~n", []),
undefined;
_ ->
?Debug("Found cached connection to ~s~n", [Key]),
{ok, Sock, nossl}
end;
{Sock, ssl} ->
case ssl:recv(Sock, 0, 1) of
{error, closed} ->
?Debug("Invalid cached connection~n", []),
undefined;
_ ->
?Debug("Found cached connection to ~s~n", [Key]),
{ok, Sock, ssl}
end
end.
cache_connection(RPState) ->
Key = lists:flatten(yaws_api:reformat_url(RPState#revproxy.url)),
?Debug("Cache connection to ~s~n", [Key]),
InitDB0 = get(init_db),
InitDB1 = lists:keystore(
Key, 1, InitDB0,
{Key, {RPState#revproxy.srvsock, RPState#revproxy.type}}
),
put(init_db, InitDB1),
ok.
%%==========================================================================
connect(URL) ->
case get_cached_connection(URL) of
{ok, Sock, Type} -> {ok, Sock, Type};
undefined -> do_connect(URL)
end.
do_connect(URL) ->
Opts = [
binary,
{packet, raw},
{active, false},
{reuseaddr, true}
],
case URL#url.scheme of
http ->
Port = case URL#url.port of
undefined -> 80;
P -> P
end,
case yaws:tcp_connect(URL#url.host, Port, Opts) of
{ok, S} -> {ok, S, nossl};
Err -> Err
end;
https ->
Port = case URL#url.port of
undefined -> 443;
P -> P
end,
case yaws:ssl_connect(URL#url.host, Port, Opts) of
{ok, S} -> {ok, S, ssl};
Err -> Err
end;
_ ->
{error, unsupported_protocol}
end.
send(#revproxy{srvsock=Sock, type=ssl}, Data) ->
ssl:send(Sock, Data);
send(#revproxy{srvsock=Sock, type=nossl}, Data) ->
gen_tcp:send(Sock, Data).
read(#revproxy{srvsock=Sock, type=Type}) ->
yaws:setopts(Sock, [{packet, raw}, binary], Type),
yaws:do_recv(Sock, 0, Type).
read(RPState, Len) ->
yaws:setopts(RPState#revproxy.srvsock, [{packet, raw}, binary],
RPState#revproxy.type),
read(RPState, Len, []).
read(_, 0, Data) ->
{ok, iolist_to_binary(lists:reverse(Data))};
read(RPState = #revproxy{srvsock=Sock, type=Type}, Len, Data) ->
case yaws:do_recv(Sock, Len, Type) of
{ok, Bin} -> read(RPState, Len-size(Bin), [Bin|Data]);
{error, Reason} -> {error, Reason}
end.
read_chunk(#revproxy{srvsock=Sock, type=Type}) ->
try
yaws:setopts(Sock, [binary, {packet, line}], Type),
%% Ignore chunk extentions
{Len, _Exts} = yaws:get_chunk_header(Sock, Type),
yaws:setopts(Sock, [binary, {packet, raw}], Type),
if
Len == 0 ->
%% Ignore chunk trailer
yaws:get_chunk_trailer(Sock, Type),
{ok, <<>>};
true ->
B = yaws:get_chunk(Sock, Len, 0, Type),
ok = yaws:eat_crnl(Sock, Type),
{ok, iolist_to_binary(B)}
end
catch
_:Reason ->
{error, Reason}
end.
close(#revproxy{srvsock=Sock, type=ssl}) ->
ssl:close(Sock);
close(#revproxy{srvsock=Sock, type=nossl}) ->
gen_tcp:close(Sock).
get_connection_status(Version, ReqHdrs, RespHdrs) ->
CliConn = case Version of
{0,9} ->
"close";
{1, 0} ->
case ReqHdrs#headers.connection of
undefined -> "close";
C1 -> yaws:to_lower(C1)
end;
{1, 1} ->
case ReqHdrs#headers.connection of
undefined -> "keep-alive";
C1 -> yaws:to_lower(C1)
end
end,
?Debug("Client Connection header: ~p~n", [CliConn]),
%% below, ignore dialyzer warning:
%% "The pattern 'true' can never match the type 'false'"
SrvConn = case ?proxy_keepalive of
true ->
case RespHdrs#headers.connection of
undefined -> CliConn;
C2 -> yaws:to_lower(C2)
end;
false ->
"close"
end,
?Debug("Server Connection header: ~p~n", [SrvConn]),
{CliConn, SrvConn}.
%%==========================================================================
rewrite_request(RPState, Req) ->
?Debug("Request path to rewrite: ~p~n", [Req#http_request.path]),
{abs_path, Path} = Req#http_request.path,
NewPath = strip_prefix(Path, RPState#revproxy.prefix),
?Debug("New Request path: ~p~n", [NewPath]),
Req#http_request{path = {abs_path, NewPath}}.
rewrite_client_headers(RPState, Hdrs) ->
?Debug("Host header to rewrite: ~p~n", [Hdrs#headers.host]),
Host = case Hdrs#headers.host of
undefined ->
undefined;
_ ->
ProxyUrl = RPState#revproxy.url,
[ProxyUrl#url.host,
case ProxyUrl#url.port of
undefined -> [];
P -> [$:|integer_to_list(P)]
end]
end,
?Debug("New Host header: ~p~n", [Host]),
Hdrs#headers{host = Host}.
rewrite_server_headers(RPState) ->
Hdrs = RPState#revproxy.headers,
?Debug("Location header to rewrite: ~p~n", [Hdrs#headers.location]),
Loc = case Hdrs#headers.location of
undefined ->
undefined;
L ->
?Debug("parse_url(~p)~n", [L]),
LocUrl = (catch yaws_api:parse_url(L)),
ProxyUrl = RPState#revproxy.url,
if
LocUrl#url.scheme == ProxyUrl#url.scheme andalso
LocUrl#url.host == ProxyUrl#url.host andalso
LocUrl#url.port == ProxyUrl#url.port ->
rewrite_loc_url(RPState, LocUrl);
element(1, L) == 'EXIT' ->
rewrite_loc_rel(RPState, L);
true ->
L
end
end,
?Debug("New Location header: ~p~n", [Loc]),
%% FIXME: And we also should do cookies here ...
Hdrs#headers{location = Loc, connection = RPState#revproxy.cliconn_status}.
Rewrite a properly formatted location redir
rewrite_loc_url(RPState, LocUrl) ->
SC=get(sc),
Scheme = yaws:redirect_scheme(SC),
RedirHost = yaws:redirect_host(SC, RPState#revproxy.r_host),
[Scheme, RedirHost, slash_append(RPState#revproxy.prefix, LocUrl#url.path)].
%% This is the case for broken webservers that reply with
%% Location: /path
%% or even worse, Location: path
rewrite_loc_rel(RPState, Loc) ->
SC=get(sc),
Scheme = yaws:redirect_scheme(SC),
RedirHost = yaws:redirect_host(SC, RPState#revproxy.r_host),
[Scheme, RedirHost, Loc].
strip_prefix("", "") ->
"/";
strip_prefix(P, "") ->
P;
strip_prefix(P, "/") ->
P;
strip_prefix([H|T1], [H|T2]) ->
strip_prefix(T1, T2).
slash_append("/", [$/|T]) ->
[$/|T];
slash_append("/", T) ->
[$/|T];
slash_append([], [$/|T]) ->
[$/|T];
slash_append([], T) ->
[$/|T];
slash_append([H|T], X) ->
[H | slash_append(T, X)].
| null | https://raw.githubusercontent.com/jvf/scalaris/c069f44cf149ea6c69e24bdb08714bda242e7ee0/contrib/yaws/src/yaws_revproxy.erl | erlang | -------------------------------------------------------------------
File : yaws_revproxy.erl
Author : <>
Description : reverse proxy
-------------------------------------------------------------------
reverse proxy implementation.
the socket opened on the backend server
"Connection:" header value:
"keep-alive' or "close"
revproxy state:
sendheaders | sendcontent | sendchunk |
recvheaders | recvcontent | recvchunk |
terminate
The prefix to strip and add
the url we're proxying to
what req method are we processing
and value of Host: for the cli request
response received from the server
and associated headers
the server data
true if the response is chunked
revproxy request/response intercept module
TODO: Activate proxy keep-alive with a new option ?
Send the client request to the server then check if the request content is
chunked or not
Send the request content to the server. Here the content is not chunked. But
it can be split because of 'partial_post_size' value.
Send the request content to the server. Here the content is chunked, so we
must rebuild the chunk before sending it. Chunks can have different size than
the original request because of 'partial_post_size' value.
The request and its content were sent. Now, we try to read the response
headers. Then we check if the response content is chunked or not.
bad_request
The response content is not chunked.
process to read others.
Now, we return the result and we let yaws_server deals with it. If it is
possible, we try to cache the connection.
process to manage chunks
==========================================================================
==========================================================================
This function is used to read a chunk and to stream it to the client.
Cached by the main process
==========================================================================
This function reads blocks from the server and streams them to the client.
Cached by the main process
Cached by the main process
==========================================================================
TODO: find a better way to cache connections to backend servers. Here we can
==========================================================================
Ignore chunk extentions
Ignore chunk trailer
below, ignore dialyzer warning:
"The pattern 'true' can never match the type 'false'"
==========================================================================
FIXME: And we also should do cookies here ...
This is the case for broken webservers that reply with
Location: /path
or even worse, Location: path | Created : 3 Dec 2003 by < >
-module(yaws_revproxy).
-include("../include/yaws.hrl").
-include("../include/yaws_api.hrl").
-include("yaws_debug.hrl").
-export([out/1]).
the revproxy internal state
the socket type : ssl | nossl
}).
-define(proxy_keepalive, false).
Initialize the connection to the backend server . If an error occurred , return
an error 404 .
out(#arg{req=Req, headers=Hdrs, state=#proxy_cfg{url=URL}=State}=Arg) ->
case connect(URL) of
{ok, Sock, Type} ->
?Debug("Connection established on ~p: Socket=~p, Type=~p~n",
[URL, Sock, Type]),
RPState = #revproxy{srvsock = Sock,
type = Type,
state = sendheaders,
prefix = State#proxy_cfg.prefix,
url = URL,
r_meth = Req#http_request.method,
r_host = Hdrs#headers.host,
intercept_mod = State#proxy_cfg.intercept_mod},
out(Arg#arg{state=RPState});
_ERR ->
?Debug("Connection failed: ~p~n", [_ERR]),
out404(Arg)
end;
out(#arg{state=#revproxy{}=RPState}=Arg)
when RPState#revproxy.state == sendheaders ->
?Debug("Send request headers to backend server: ~n"
" - ~s~n", [?format_record(Arg#arg.req, http_request)]),
Req = rewrite_request(RPState, Arg#arg.req),
Hdrs0 = Arg#arg.headers,
Hdrs = rewrite_client_headers(RPState, Hdrs0),
{NewReq, NewHdrs} = case RPState#revproxy.intercept_mod of
undefined ->
{Req, Hdrs};
InterceptMod ->
case catch InterceptMod:rewrite_request(
Req, Hdrs) of
{ok, NewReq0, NewHdrs0} ->
{NewReq0, NewHdrs0};
InterceptError ->
error_logger:error_msg(
"revproxy intercept module ~p:"
"rewrite_request failed: ~p~n",
[InterceptMod, InterceptError]),
exit({error, intercept_mod})
end
end,
ReqStr = yaws_api:reformat_request(NewReq),
HdrsStr = yaws:headers_to_str(NewHdrs),
case send(RPState, [ReqStr, "\r\n", HdrsStr, "\r\n"]) of
ok ->
TE = yaws:to_lower(Hdrs#headers.transfer_encoding),
RPState1 = if
(Hdrs#headers.content_length == undefined andalso
TE == "chunked") ->
?Debug("Request content is chunked~n", []),
RPState#revproxy{state=sendchunk};
true ->
RPState#revproxy{state=sendcontent}
end,
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == sendcontent ->
case Arg#arg.clidata of
{partial, Bin} ->
?Debug("Send partial content to backend server: ~p bytes~n",
[size(Bin)]),
case send(RPState, Bin) of
ok ->
{get_more, undefined, RPState};
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
Bin when is_binary(Bin), Bin /= <<>> ->
?Debug("Send content to backend server: ~p bytes~n", [size(Bin)]),
case send(RPState, Bin) of
ok ->
RPState1 = RPState#revproxy{state=recvheaders},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
_ ->
?Debug("no content found~n", []),
RPState1 = RPState#revproxy{state=recvheaders},
out(Arg#arg{state=RPState1})
end;
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == sendchunk ->
case Arg#arg.clidata of
{partial, Bin} ->
?Debug("Send chunked content to backend server: ~p bytes~n",
[size(Bin)]),
Res = send(RPState,
[yaws:integer_to_hex(size(Bin)),"\r\n",Bin,"\r\n"]),
case Res of
ok ->
{get_more, undefined, RPState};
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
<<>> ->
?Debug("Send last chunk to backend server~n", []),
case send(RPState, "0\r\n\r\n") of
ok ->
RPState1 = RPState#revproxy{state=recvheaders},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end
end;
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == recvheaders ->
Res = yaws:http_get_headers(RPState#revproxy.srvsock,
RPState#revproxy.type),
case Res of
{error, {too_many_headers, _Resp}} ->
?Debug("Response headers too large from backend server~n", []),
close(RPState),
outXXX(500, Arg);
{Resp0, RespHdrs0} when is_record(Resp0, http_response) ->
?Debug("Response headers received from backend server:~n"
" - ~s~n - ~s~n", [?format_record(Resp0, http_response),
?format_record(RespHdrs0, headers)]),
{Resp, RespHdrs} =
case RPState#revproxy.intercept_mod of
undefined ->
{Resp0, RespHdrs0};
InterceptMod ->
case catch InterceptMod:rewrite_response(
Resp0, RespHdrs0) of
{ok, NewResp, NewRespHdrs} ->
{NewResp, NewRespHdrs};
InterceptError ->
error_logger:error_msg(
"revproxy intercept module ~p:"
"rewrite_response failure: ~p~n",
[InterceptMod, InterceptError]),
exit({error, intercept_mod})
end
end,
{CliConn, SrvConn} = get_connection_status(
(Arg#arg.req)#http_request.version,
Arg#arg.headers, RespHdrs
),
RPState1 = RPState#revproxy{cliconn_status = CliConn,
srvconn_status = SrvConn,
resp = Resp,
headers = RespHdrs},
if
RPState1#revproxy.r_meth =:= 'HEAD' ->
RPState2 = RPState1#revproxy{state=terminate},
out(Arg#arg{state=RPState2});
Resp#http_response.status =:= 100 orelse
Resp#http_response.status =:= 204 orelse
Resp#http_response.status =:= 205 orelse
Resp#http_response.status =:= 304 orelse
Resp#http_response.status =:= 406 ->
RPState2 = RPState1#revproxy{state=terminate},
out(Arg#arg{state=RPState2});
true ->
RPState2 =
case RespHdrs#headers.content_length of
undefined ->
TE = yaws:to_lower(
RespHdrs#headers.transfer_encoding),
case TE of
"chunked" ->
?Debug("Response content is chunked~n",
[]),
RPState1#revproxy{state=recvchunk};
_ ->
RPState1#revproxy{
cliconn_status="close",
srvconn_status="close",
state=recvcontent}
end;
_ ->
RPState1#revproxy{state=recvcontent}
end,
out(Arg#arg{state=RPState2})
end;
{_R, _H} ->
?Debug("Bad response received from backend server: ~p~n", [_R]),
close(RPState),
outXXX(500, Arg);
closed ->
?Debug("TCP error: ~p~n", [closed]),
outXXX(500, Arg)
end;
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == recvcontent ->
Len = case (RPState#revproxy.headers)#headers.content_length of
undefined -> undefined;
CLen -> list_to_integer(CLen)
end,
SC=get(sc),
if
is_integer(Len) andalso Len =< SC#sconf.partial_post_size ->
case read(RPState, Len) of
{ok, Data} ->
?Debug("Response content received from the backend server: "
"~p bytes~n", [size(Data)]),
RPState1 = RPState#revproxy{state = terminate,
is_chunked = false,
srvdata = {content, Data}},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
is_integer(Len) ->
Here partial_post_size is always an integer
BlockSize = SC#sconf.partial_post_size,
BlockCount = Len div BlockSize,
LastBlock = Len rem BlockSize,
SrvData = {block, BlockCount, BlockSize, LastBlock},
RPState1 = RPState#revproxy{state = terminate,
is_chunked = true,
srvdata = SrvData},
out(Arg#arg{state=RPState1});
true ->
SrvData = {block, undefined, undefined, undefined},
RPState1 = RPState#revproxy{state = terminate,
is_chunked = true,
srvdata = SrvData},
out(Arg#arg{state=RPState1})
end;
The response content is chunked . Read the first chunk here and spawn a
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == recvchunk ->
case read_chunk(RPState) of
{ok, Data} ->
?Debug("First chunk received from the backend server : "
"~p bytes~n", [size(Data)]),
RPState1 = RPState#revproxy{state = terminate,
is_chunked = (Data /= <<>>),
srvdata = {stream, Data}},
out(Arg#arg{state=RPState1});
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
case Reason of
closed -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg)
end;
out(#arg{state=RPState}=Arg) when RPState#revproxy.state == terminate ->
case RPState#revproxy.srvconn_status of
"close" when RPState#revproxy.is_chunked == false -> close(RPState);
"close" -> ok;
_ -> cache_connection(RPState)
end,
AllHdrs = [{header, H} || H <- yaws_api:reformat_header(
rewrite_server_headers(RPState)
)],
?Debug("~p~n", [AllHdrs]),
Res = [
{status, (RPState#revproxy.resp)#http_response.status},
{allheaders, AllHdrs}
],
case RPState#revproxy.srvdata of
{content, <<>>} ->
Res;
{content, Data} ->
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{content, MimeType, Data}];
{stream, <<>>} ->
Chunked response with only the last empty chunk : do not spawn a
yaws_api:stream_chunk_end(self()),
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{streamcontent, MimeType, <<>>}];
{stream, Chunk} ->
Self = self(),
GC = get(gc),
spawn(fun() -> put(gc, GC), recv_next_chunk(Self, Arg) end),
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{streamcontent, MimeType, Chunk}];
{block, BlockCnt, BlockSz, LastBlock} ->
GC = get(gc),
Pid = spawn(fun() ->
put(gc, GC),
receive
{ok, YawsPid} ->
recv_blocks(YawsPid, Arg, BlockCnt,
BlockSz, LastBlock);
{discard, YawsPid} ->
recv_blocks(YawsPid, Arg, 0, BlockSz, 0)
end
end),
MimeType = (RPState#revproxy.headers)#headers.content_type,
Res ++ [{streamcontent_from_pid, MimeType, Pid}];
_ ->
Res
end;
Catch unexpected state by sending an error 500
out(#arg{state=RPState}=Arg) ->
?Debug("Unexpected revproxy state:~n - ~s~n",
[?format_record(RPState, revproxy)]),
case RPState#revproxy.srvsock of
undefined -> ok;
_ -> close(RPState)
end,
outXXX(500, Arg).
out404(Arg) ->
SC=get(sc),
(SC#sconf.errormod_404):out404(Arg,get(gc),SC).
outXXX(Code, _Arg) ->
Content = ["<html><h1>", integer_to_list(Code), $\ ,
yaws_api:code_to_phrase(Code), "</h1></html>"],
[
{status, Code},
{header, {connection, "close"}},
{content, "text/html", Content}
].
recv_next_chunk(YawsPid, #arg{state=RPState}=Arg) ->
case read_chunk(RPState) of
{ok, <<>>} ->
?Debug("Last chunk received from the backend server~n", []),
yaws_api:stream_chunk_end(YawsPid),
case RPState#revproxy.srvconn_status of
"close" -> close(RPState);
end;
{ok, Data} ->
?Debug("Next chunk received from the backend server : "
"~p bytes~n", [size(Data)]),
yaws_api:stream_chunk_deliver(YawsPid, Data),
recv_next_chunk(YawsPid, Arg);
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
yaws_api:stream_chunk_end(YawsPid),
case Reason of
closed -> ok;
_ -> close(RPState)
end
end.
recv_blocks(YawsPid, #arg{state=RPState}=Arg,
undefined, undefined, undefined) ->
case read(RPState) of
{ok, <<>>} ->
no data , wait 100 msec to avoid time - consuming loop and retry
timer:sleep(100),
recv_blocks(YawsPid, Arg, undefined, undefined, undefined);
{ok, Data} ->
?Debug("Response content received from the backend server : "
"~p bytes~n", [size(Data)]),
ok = yaws_api:stream_process_deliver(Arg#arg.clisock, Data),
recv_blocks(YawsPid, Arg, undefined, undefined, undefined);
{error, closed} ->
yaws_api:stream_process_end(closed, YawsPid);
{error, _Reason} ->
?Debug("TCP error: ~p~n", [_Reason]),
yaws_api:stream_process_end(closed, YawsPid),
close(RPState)
end;
recv_blocks(YawsPid, #arg{state=RPState}=Arg, 0, _, 0) ->
yaws_api:stream_process_end(Arg#arg.clisock, YawsPid),
case RPState#revproxy.srvconn_status of
"close" -> close(RPState);
end;
recv_blocks(YawsPid, #arg{state=RPState}=Arg, 0, _, LastBlock) ->
Sock = Arg#arg.clisock,
case read(RPState, LastBlock) of
{ok, Data} ->
?Debug("Response content received from the backend server : "
"~p bytes~n", [size(Data)]),
ok = yaws_api:stream_process_deliver(Sock, Data),
yaws_api:stream_process_end(Sock, YawsPid),
case RPState#revproxy.srvconn_status of
"close" -> close(RPState);
end;
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
yaws_api:stream_process_end(closed, YawsPid),
case Reason of
closed -> ok;
_ -> close(RPState)
end
end;
recv_blocks(YawsPid, #arg{state=RPState}=Arg, BlockCnt, BlockSz, LastBlock) ->
case read(RPState, BlockSz) of
{ok, Data} ->
?Debug("Response content received from the backend server : "
"~p bytes~n", [size(Data)]),
ok = yaws_api:stream_process_deliver(Arg#arg.clisock, Data),
recv_blocks(YawsPid, Arg, BlockCnt-1, BlockSz, LastBlock);
{error, Reason} ->
?Debug("TCP error: ~p~n", [Reason]),
yaws_api:stream_process_end(closed, YawsPid),
case Reason of
closed -> ok;
_ -> close(RPState)
end
end.
have 1 connection per gserv process for each backend server .
get_cached_connection(URL) ->
Key = lists:flatten(yaws_api:reformat_url(URL)),
case erase(Key) of
undefined ->
undefined;
{Sock, nossl} ->
case gen_tcp:recv(Sock, 0, 1) of
{error, closed} ->
?Debug("Invalid cached connection~n", []),
undefined;
_ ->
?Debug("Found cached connection to ~s~n", [Key]),
{ok, Sock, nossl}
end;
{Sock, ssl} ->
case ssl:recv(Sock, 0, 1) of
{error, closed} ->
?Debug("Invalid cached connection~n", []),
undefined;
_ ->
?Debug("Found cached connection to ~s~n", [Key]),
{ok, Sock, ssl}
end
end.
cache_connection(RPState) ->
Key = lists:flatten(yaws_api:reformat_url(RPState#revproxy.url)),
?Debug("Cache connection to ~s~n", [Key]),
InitDB0 = get(init_db),
InitDB1 = lists:keystore(
Key, 1, InitDB0,
{Key, {RPState#revproxy.srvsock, RPState#revproxy.type}}
),
put(init_db, InitDB1),
ok.
connect(URL) ->
case get_cached_connection(URL) of
{ok, Sock, Type} -> {ok, Sock, Type};
undefined -> do_connect(URL)
end.
do_connect(URL) ->
Opts = [
binary,
{packet, raw},
{active, false},
{reuseaddr, true}
],
case URL#url.scheme of
http ->
Port = case URL#url.port of
undefined -> 80;
P -> P
end,
case yaws:tcp_connect(URL#url.host, Port, Opts) of
{ok, S} -> {ok, S, nossl};
Err -> Err
end;
https ->
Port = case URL#url.port of
undefined -> 443;
P -> P
end,
case yaws:ssl_connect(URL#url.host, Port, Opts) of
{ok, S} -> {ok, S, ssl};
Err -> Err
end;
_ ->
{error, unsupported_protocol}
end.
send(#revproxy{srvsock=Sock, type=ssl}, Data) ->
ssl:send(Sock, Data);
send(#revproxy{srvsock=Sock, type=nossl}, Data) ->
gen_tcp:send(Sock, Data).
read(#revproxy{srvsock=Sock, type=Type}) ->
yaws:setopts(Sock, [{packet, raw}, binary], Type),
yaws:do_recv(Sock, 0, Type).
read(RPState, Len) ->
yaws:setopts(RPState#revproxy.srvsock, [{packet, raw}, binary],
RPState#revproxy.type),
read(RPState, Len, []).
read(_, 0, Data) ->
{ok, iolist_to_binary(lists:reverse(Data))};
read(RPState = #revproxy{srvsock=Sock, type=Type}, Len, Data) ->
case yaws:do_recv(Sock, Len, Type) of
{ok, Bin} -> read(RPState, Len-size(Bin), [Bin|Data]);
{error, Reason} -> {error, Reason}
end.
read_chunk(#revproxy{srvsock=Sock, type=Type}) ->
try
yaws:setopts(Sock, [binary, {packet, line}], Type),
{Len, _Exts} = yaws:get_chunk_header(Sock, Type),
yaws:setopts(Sock, [binary, {packet, raw}], Type),
if
Len == 0 ->
yaws:get_chunk_trailer(Sock, Type),
{ok, <<>>};
true ->
B = yaws:get_chunk(Sock, Len, 0, Type),
ok = yaws:eat_crnl(Sock, Type),
{ok, iolist_to_binary(B)}
end
catch
_:Reason ->
{error, Reason}
end.
close(#revproxy{srvsock=Sock, type=ssl}) ->
ssl:close(Sock);
close(#revproxy{srvsock=Sock, type=nossl}) ->
gen_tcp:close(Sock).
get_connection_status(Version, ReqHdrs, RespHdrs) ->
CliConn = case Version of
{0,9} ->
"close";
{1, 0} ->
case ReqHdrs#headers.connection of
undefined -> "close";
C1 -> yaws:to_lower(C1)
end;
{1, 1} ->
case ReqHdrs#headers.connection of
undefined -> "keep-alive";
C1 -> yaws:to_lower(C1)
end
end,
?Debug("Client Connection header: ~p~n", [CliConn]),
SrvConn = case ?proxy_keepalive of
true ->
case RespHdrs#headers.connection of
undefined -> CliConn;
C2 -> yaws:to_lower(C2)
end;
false ->
"close"
end,
?Debug("Server Connection header: ~p~n", [SrvConn]),
{CliConn, SrvConn}.
rewrite_request(RPState, Req) ->
?Debug("Request path to rewrite: ~p~n", [Req#http_request.path]),
{abs_path, Path} = Req#http_request.path,
NewPath = strip_prefix(Path, RPState#revproxy.prefix),
?Debug("New Request path: ~p~n", [NewPath]),
Req#http_request{path = {abs_path, NewPath}}.
rewrite_client_headers(RPState, Hdrs) ->
?Debug("Host header to rewrite: ~p~n", [Hdrs#headers.host]),
Host = case Hdrs#headers.host of
undefined ->
undefined;
_ ->
ProxyUrl = RPState#revproxy.url,
[ProxyUrl#url.host,
case ProxyUrl#url.port of
undefined -> [];
P -> [$:|integer_to_list(P)]
end]
end,
?Debug("New Host header: ~p~n", [Host]),
Hdrs#headers{host = Host}.
rewrite_server_headers(RPState) ->
Hdrs = RPState#revproxy.headers,
?Debug("Location header to rewrite: ~p~n", [Hdrs#headers.location]),
Loc = case Hdrs#headers.location of
undefined ->
undefined;
L ->
?Debug("parse_url(~p)~n", [L]),
LocUrl = (catch yaws_api:parse_url(L)),
ProxyUrl = RPState#revproxy.url,
if
LocUrl#url.scheme == ProxyUrl#url.scheme andalso
LocUrl#url.host == ProxyUrl#url.host andalso
LocUrl#url.port == ProxyUrl#url.port ->
rewrite_loc_url(RPState, LocUrl);
element(1, L) == 'EXIT' ->
rewrite_loc_rel(RPState, L);
true ->
L
end
end,
?Debug("New Location header: ~p~n", [Loc]),
Hdrs#headers{location = Loc, connection = RPState#revproxy.cliconn_status}.
Rewrite a properly formatted location redir
rewrite_loc_url(RPState, LocUrl) ->
SC=get(sc),
Scheme = yaws:redirect_scheme(SC),
RedirHost = yaws:redirect_host(SC, RPState#revproxy.r_host),
[Scheme, RedirHost, slash_append(RPState#revproxy.prefix, LocUrl#url.path)].
rewrite_loc_rel(RPState, Loc) ->
SC=get(sc),
Scheme = yaws:redirect_scheme(SC),
RedirHost = yaws:redirect_host(SC, RPState#revproxy.r_host),
[Scheme, RedirHost, Loc].
strip_prefix("", "") ->
"/";
strip_prefix(P, "") ->
P;
strip_prefix(P, "/") ->
P;
strip_prefix([H|T1], [H|T2]) ->
strip_prefix(T1, T2).
slash_append("/", [$/|T]) ->
[$/|T];
slash_append("/", T) ->
[$/|T];
slash_append([], [$/|T]) ->
[$/|T];
slash_append([], T) ->
[$/|T];
slash_append([H|T], X) ->
[H | slash_append(T, X)].
|
8b404c028044b20c42469ede107d57ab3dd6cff1a96647292a7a578a76a09268 | xsc/claro | error_test.clj | (ns claro.data.error-test
(:require [clojure.test.check
[properties :as prop]
[generators :as gen]
[clojure-test :refer [defspec]]]
[claro.data.error :refer :all]))
;; ## Operations
(def err
#(error (str %2) {:current-value %1}))
(def operations
{`+ +
`- -
`* *
`err err})
(def gen-operation
(gen/let [op (gen/elements (keys operations))
arg gen/int]
(list op arg)))
(def gen-operations
(gen/vector gen-operation))
(defn expected-result
[ops call-fn value]
(if-let [[[op arg] & rst] (seq ops)]
(let [f (get operations op)
result (call-fn f value arg)]
(if (error? result)
result
(recur rst call-fn result)))
value))
(defn equals?
[v1 v2]
(if (and (error? v1) (error? v2))
(and (= (error-message v1) (error-message v2))
(= (error-data v1) (error-data v2)))
(= v1 v2)))
;; ## Tests
(defspec t-unless-error-> 100
(prop/for-all
[initial-value gen/int
operations gen-operations]
(equals?
(expected-result operations #(%1 %2 %3) initial-value)
(eval `(unless-error-> ~initial-value ~@operations)))))
(defspec t-unless-error->> 100
(prop/for-all
[initial-value gen/int
operations gen-operations]
(equals?
(expected-result operations #(%1 %3 %2) initial-value)
(eval `(unless-error->> ~initial-value ~@operations)))))
| null | https://raw.githubusercontent.com/xsc/claro/16db75b7a775a14f3b656362e8ee4f65dd8b0d49/test/claro/data/error_test.clj | clojure | ## Operations
## Tests | (ns claro.data.error-test
(:require [clojure.test.check
[properties :as prop]
[generators :as gen]
[clojure-test :refer [defspec]]]
[claro.data.error :refer :all]))
(def err
#(error (str %2) {:current-value %1}))
(def operations
{`+ +
`- -
`* *
`err err})
(def gen-operation
(gen/let [op (gen/elements (keys operations))
arg gen/int]
(list op arg)))
(def gen-operations
(gen/vector gen-operation))
(defn expected-result
[ops call-fn value]
(if-let [[[op arg] & rst] (seq ops)]
(let [f (get operations op)
result (call-fn f value arg)]
(if (error? result)
result
(recur rst call-fn result)))
value))
(defn equals?
[v1 v2]
(if (and (error? v1) (error? v2))
(and (= (error-message v1) (error-message v2))
(= (error-data v1) (error-data v2)))
(= v1 v2)))
(defspec t-unless-error-> 100
(prop/for-all
[initial-value gen/int
operations gen-operations]
(equals?
(expected-result operations #(%1 %2 %3) initial-value)
(eval `(unless-error-> ~initial-value ~@operations)))))
(defspec t-unless-error->> 100
(prop/for-all
[initial-value gen/int
operations gen-operations]
(equals?
(expected-result operations #(%1 %3 %2) initial-value)
(eval `(unless-error->> ~initial-value ~@operations)))))
|
dd131c0c1a7b505da55f35835aa9dc0a0b0928f707395551069b3492137cf5b6 | aspiwack/porcupine | TaskPipeline.hs | module System.TaskPipeline
( module System.TaskPipeline.PTask
, module System.TaskPipeline.PorcupineTree
, module System.TaskPipeline.VirtualFileAccess
, module System.TaskPipeline.Options
, module System.TaskPipeline.Repetition
, module Data.Locations.LogAndErrors
) where
import Data.Locations.LogAndErrors
import System.TaskPipeline.Options
import System.TaskPipeline.PorcupineTree
import System.TaskPipeline.PTask
import System.TaskPipeline.Repetition
import System.TaskPipeline.VirtualFileAccess
| null | https://raw.githubusercontent.com/aspiwack/porcupine/23dcba1523626af0fdf6085f4107987d4bf718d7/porcupine-core/src/System/TaskPipeline.hs | haskell | module System.TaskPipeline
( module System.TaskPipeline.PTask
, module System.TaskPipeline.PorcupineTree
, module System.TaskPipeline.VirtualFileAccess
, module System.TaskPipeline.Options
, module System.TaskPipeline.Repetition
, module Data.Locations.LogAndErrors
) where
import Data.Locations.LogAndErrors
import System.TaskPipeline.Options
import System.TaskPipeline.PorcupineTree
import System.TaskPipeline.PTask
import System.TaskPipeline.Repetition
import System.TaskPipeline.VirtualFileAccess
| |
79a177233578631d0f70fc67d2be790558345e43fdc4a30b5e219cbfad309990 | jafingerhut/clojure-benchmarks | knucleotide.clj-14-for-clj13.clj | The Computer Language Benchmarks Game
;; /
contributed by
(ns knucleotide
(:gen-class))
(set! *warn-on-reflection* true)
;; Handle slight difference in function name between Clojure 1.2.0 and
;; 1.3.0-alpha* ability to use type hints to infer fast bit
;; operations.
(defmacro my-unchecked-inc-int [& args]
(if (and (== (*clojure-version* :major) 1)
(== (*clojure-version* :minor) 2))
`(unchecked-inc ~@args)
`(unchecked-inc-int ~@args)))
(defmacro key-type [num]
(if (and (== (*clojure-version* :major) 1)
(== (*clojure-version* :minor) 2))
num
`(long ~num)))
(definterface ITallyCounter
(^int get_count [])
(inc_BANG_ []))
(deftype TallyCounter [^{:unsynchronized-mutable true :tag int} cnt]
ITallyCounter
(get-count [this] cnt)
(inc! [this]
(set! cnt (my-unchecked-inc-int cnt))))
(defn my-lazy-map [f coll]
(lazy-seq
(when-let [s (seq coll)]
(cons (f (first s)) (my-lazy-map f (rest s))))))
modified - pmap is like pmap from Clojure 1.1 , but with only as much
;; parallelism as specified by the parameter num-threads. Uses
;; my-lazy-map instead of map from core.clj, since that version of map
;; can use unwanted additional parallelism for chunked collections,
;; like ranges.
(defn modified-pmap
([num-threads f coll]
(if (== num-threads 1)
(map f coll)
(let [n (if (>= num-threads 2) (dec num-threads) 1)
rets (my-lazy-map #(future (f %)) coll)
step (fn step [[x & xs :as vs] fs]
(lazy-seq
(if-let [s (seq fs)]
(cons (deref x) (step xs (rest s)))
(map deref vs))))]
(step rets (drop n rets)))))
([num-threads f coll & colls]
(let [step (fn step [cs]
(lazy-seq
(let [ss (my-lazy-map seq cs)]
(when (every? identity ss)
(cons (my-lazy-map first ss)
(step (my-lazy-map rest ss)))))))]
(modified-pmap num-threads #(apply f %) (step (cons coll colls))))))
;; Return true when the line l is a FASTA description line
(defn fasta-description-line [l]
(= \> (first (seq l))))
;; Return true when the line l is a FASTA description line that begins
;; with the string desc-str.
(defn fasta-description-line-beginning [desc-str l]
(and (fasta-description-line l)
(= desc-str (subs l 1 (min (count l) (inc (count desc-str)))))))
Take a sequence of lines from a FASTA format file , and a string
;; desc-str. Look for a FASTA record with a description that begins
;; with desc-str, and if one is found, return its DNA sequence as a
;; single (potentially quite long) string. If input file is big,
;; you'll save lots of memory if you call this function in a with-open
;; for the file, and don't hold on to the head of the lines parameter.
(defn fasta-dna-str-with-desc-beginning [desc-str lines]
(when-let [x (drop-while
(fn [l] (not (fasta-description-line-beginning desc-str l)))
lines)]
(when-let [x (seq x)]
(let [y (take-while (fn [l] (not (fasta-description-line l)))
(map (fn [#^java.lang.String s] (.toUpperCase s))
(rest x)))]
(apply str y)))))
(def dna-char-to-code-val {\A 0, \C 1, \T 2, \G 3})
(def code-val-to-dna-char {0 \A, 1 \C, 2 \T, 3 \G})
;; In the hash map 'tally' in tally-dna-subs-with-len, it is more
straightforward to use a Clojure string ( same as a Java string ) as
;; the key, but such a key is significantly bigger than it needs to
;; be, increasing memory and time required to hash the value. By
;; converting a string of A, C, T, and G characters down to an integer
;; that contains only 2 bits for each character, we make a value that
;; is significantly smaller and faster to use as a key in the map.
;; most least
;; significant significant
;; bits of int bits of int
;; | |
;; V V
;; code code code .... code code
;; ^ ^
;; | |
;; code for code for
;; *latest* *earliest*
;; char in char in
;; sequence sequence
;; Note: Given Clojure 1.2's implementation of bit-shift-left/right
operations , when the value being shifted is larger than a 32 - bit
;; int, they are faster when the shift amount is a compile time
;; constant.
(defn dna-str-to-key [s]
;; Accessing a local let binding is much faster than accessing a var
(let [dna-char-to-code-val dna-char-to-code-val]
(loop [key 0
offset (int (dec (count s)))]
(if (neg? offset)
key
(let [c (nth s offset)
code (int (dna-char-to-code-val c))
new-key (+ (bit-shift-left key 2) code)]
(recur new-key (dec offset)))))))
(defn key-to-dna-str [k len]
(apply str (map code-val-to-dna-char
(map (fn [pos] (bit-and 3 (bit-shift-right k pos)))
(range 0 (* 2 len) 2)))))
(defn tally-dna-subs-with-len [len dna-str]
(let [mask-width (* 2 len)
mask (key-type (dec (bit-shift-left 1 mask-width)))
dna-char-to-code-val dna-char-to-code-val]
(loop [offset (int (- (count dna-str) len))
key (key-type (dna-str-to-key (subs dna-str offset (+ offset len))))
tally (let [h (java.util.HashMap.)
one (TallyCounter. (int 1))]
(.put h key one)
h)]
(if (zero? offset)
tally
(let [new-offset (dec offset)
new-first-char-code (dna-char-to-code-val
(nth dna-str new-offset))
new-key (key-type (bit-and mask (+ (bit-shift-left key 2)
new-first-char-code)))]
(if-let [^TallyCounter cur-count (get tally new-key)]
(.inc! cur-count)
(let [one (TallyCounter. (int 1))]
(.put tally new-key one)))
(recur new-offset new-key tally))))))
(defn getcnt [^TallyCounter tc]
(.get-count tc))
(defn all-tally-to-str [tally fn-key-to-str]
(with-out-str
(let [total (reduce + (map getcnt (vals tally)))
cmp-keys (fn [k1 k2]
;; Return negative integer if k1 should come earlier
;; in the sort order than k2, 0 if they are equal,
;; otherwise a positive integer.
(let [cnt1 (int (getcnt (get tally k1)))
cnt2 (int (getcnt (get tally k2)))]
(if (not= cnt1 cnt2)
(- cnt2 cnt1)
(let [^String s1 (fn-key-to-str k1)
^String s2 (fn-key-to-str k2)]
(.compareTo s1 s2)))))]
(doseq [k (sort cmp-keys (keys tally))]
(printf "%s %.3f\n" (fn-key-to-str k)
(double (* 100 (/ (getcnt (get tally k)) total))))))))
(defn one-tally-to-str [dna-str tally]
(let [zerotc (TallyCounter. 0)]
(format "%d\t%s" (getcnt (get tally (dna-str-to-key dna-str) zerotc))
dna-str)))
(defn compute-one-part [dna-str part]
[part
(condp = part
0 (all-tally-to-str (tally-dna-subs-with-len 1 dna-str)
(fn [k] (key-to-dna-str k 1)))
1 (all-tally-to-str (tally-dna-subs-with-len 2 dna-str)
(fn [k] (key-to-dna-str k 2)))
2 (one-tally-to-str "GGT"
(tally-dna-subs-with-len 3 dna-str))
3 (one-tally-to-str "GGTA"
(tally-dna-subs-with-len 4 dna-str))
4 (one-tally-to-str "GGTATT"
(tally-dna-subs-with-len 6 dna-str))
5 (one-tally-to-str "GGTATTTTAATT"
(tally-dna-subs-with-len 12 dna-str))
6 (one-tally-to-str "GGTATTTTAATTTATAGT"
(tally-dna-subs-with-len 18 dna-str)))])
(def +default-modified-pmap-num-threads+
(+ 2 (.. Runtime getRuntime availableProcessors)))
(defn -main [& args]
(def num-threads
(if (and (>= (count args) 1)
(re-matches #"^\d+$" (nth args 0)))
(let [n (. Integer valueOf (nth args 0) 10)]
(if (== n 0)
+default-modified-pmap-num-threads+
n))
+default-modified-pmap-num-threads+))
(with-open [br (java.io.BufferedReader. *in*)]
(let [dna-str (fasta-dna-str-with-desc-beginning "THREE" (line-seq br))
;; Select the order of computing parts such that it is
unlikely that parts 5 and 6 will be computed concurrently .
Those are the two that take the most memory . It would be
nice if we could specify a DAG for which jobs should finish
before others begin -- then we could prevent those two
;; parts from running simultaneously.
results (map second
(sort #(< (first %1) (first %2))
(modified-pmap num-threads
#(compute-one-part dna-str %)
'(0 5 6 1 2 3 4)
)))]
(doseq [r results]
(println r)
(flush))))
(shutdown-agents))
| null | https://raw.githubusercontent.com/jafingerhut/clojure-benchmarks/474a8a4823727dd371f1baa9809517f9e0b508d4/knucleotide/knucleotide.clj-14-for-clj13.clj | clojure | /
Handle slight difference in function name between Clojure 1.2.0 and
1.3.0-alpha* ability to use type hints to infer fast bit
operations.
parallelism as specified by the parameter num-threads. Uses
my-lazy-map instead of map from core.clj, since that version of map
can use unwanted additional parallelism for chunked collections,
like ranges.
Return true when the line l is a FASTA description line
Return true when the line l is a FASTA description line that begins
with the string desc-str.
desc-str. Look for a FASTA record with a description that begins
with desc-str, and if one is found, return its DNA sequence as a
single (potentially quite long) string. If input file is big,
you'll save lots of memory if you call this function in a with-open
for the file, and don't hold on to the head of the lines parameter.
In the hash map 'tally' in tally-dna-subs-with-len, it is more
the key, but such a key is significantly bigger than it needs to
be, increasing memory and time required to hash the value. By
converting a string of A, C, T, and G characters down to an integer
that contains only 2 bits for each character, we make a value that
is significantly smaller and faster to use as a key in the map.
most least
significant significant
bits of int bits of int
| |
V V
code code code .... code code
^ ^
| |
code for code for
*latest* *earliest*
char in char in
sequence sequence
Note: Given Clojure 1.2's implementation of bit-shift-left/right
int, they are faster when the shift amount is a compile time
constant.
Accessing a local let binding is much faster than accessing a var
Return negative integer if k1 should come earlier
in the sort order than k2, 0 if they are equal,
otherwise a positive integer.
Select the order of computing parts such that it is
parts from running simultaneously. | The Computer Language Benchmarks Game
contributed by
(ns knucleotide
(:gen-class))
(set! *warn-on-reflection* true)
(defmacro my-unchecked-inc-int [& args]
(if (and (== (*clojure-version* :major) 1)
(== (*clojure-version* :minor) 2))
`(unchecked-inc ~@args)
`(unchecked-inc-int ~@args)))
(defmacro key-type [num]
(if (and (== (*clojure-version* :major) 1)
(== (*clojure-version* :minor) 2))
num
`(long ~num)))
(definterface ITallyCounter
(^int get_count [])
(inc_BANG_ []))
(deftype TallyCounter [^{:unsynchronized-mutable true :tag int} cnt]
ITallyCounter
(get-count [this] cnt)
(inc! [this]
(set! cnt (my-unchecked-inc-int cnt))))
(defn my-lazy-map [f coll]
(lazy-seq
(when-let [s (seq coll)]
(cons (f (first s)) (my-lazy-map f (rest s))))))
modified - pmap is like pmap from Clojure 1.1 , but with only as much
(defn modified-pmap
([num-threads f coll]
(if (== num-threads 1)
(map f coll)
(let [n (if (>= num-threads 2) (dec num-threads) 1)
rets (my-lazy-map #(future (f %)) coll)
step (fn step [[x & xs :as vs] fs]
(lazy-seq
(if-let [s (seq fs)]
(cons (deref x) (step xs (rest s)))
(map deref vs))))]
(step rets (drop n rets)))))
([num-threads f coll & colls]
(let [step (fn step [cs]
(lazy-seq
(let [ss (my-lazy-map seq cs)]
(when (every? identity ss)
(cons (my-lazy-map first ss)
(step (my-lazy-map rest ss)))))))]
(modified-pmap num-threads #(apply f %) (step (cons coll colls))))))
(defn fasta-description-line [l]
(= \> (first (seq l))))
(defn fasta-description-line-beginning [desc-str l]
(and (fasta-description-line l)
(= desc-str (subs l 1 (min (count l) (inc (count desc-str)))))))
Take a sequence of lines from a FASTA format file , and a string
(defn fasta-dna-str-with-desc-beginning [desc-str lines]
(when-let [x (drop-while
(fn [l] (not (fasta-description-line-beginning desc-str l)))
lines)]
(when-let [x (seq x)]
(let [y (take-while (fn [l] (not (fasta-description-line l)))
(map (fn [#^java.lang.String s] (.toUpperCase s))
(rest x)))]
(apply str y)))))
(def dna-char-to-code-val {\A 0, \C 1, \T 2, \G 3})
(def code-val-to-dna-char {0 \A, 1 \C, 2 \T, 3 \G})
straightforward to use a Clojure string ( same as a Java string ) as
operations , when the value being shifted is larger than a 32 - bit
(defn dna-str-to-key [s]
(let [dna-char-to-code-val dna-char-to-code-val]
(loop [key 0
offset (int (dec (count s)))]
(if (neg? offset)
key
(let [c (nth s offset)
code (int (dna-char-to-code-val c))
new-key (+ (bit-shift-left key 2) code)]
(recur new-key (dec offset)))))))
(defn key-to-dna-str [k len]
(apply str (map code-val-to-dna-char
(map (fn [pos] (bit-and 3 (bit-shift-right k pos)))
(range 0 (* 2 len) 2)))))
(defn tally-dna-subs-with-len [len dna-str]
(let [mask-width (* 2 len)
mask (key-type (dec (bit-shift-left 1 mask-width)))
dna-char-to-code-val dna-char-to-code-val]
(loop [offset (int (- (count dna-str) len))
key (key-type (dna-str-to-key (subs dna-str offset (+ offset len))))
tally (let [h (java.util.HashMap.)
one (TallyCounter. (int 1))]
(.put h key one)
h)]
(if (zero? offset)
tally
(let [new-offset (dec offset)
new-first-char-code (dna-char-to-code-val
(nth dna-str new-offset))
new-key (key-type (bit-and mask (+ (bit-shift-left key 2)
new-first-char-code)))]
(if-let [^TallyCounter cur-count (get tally new-key)]
(.inc! cur-count)
(let [one (TallyCounter. (int 1))]
(.put tally new-key one)))
(recur new-offset new-key tally))))))
(defn getcnt [^TallyCounter tc]
(.get-count tc))
(defn all-tally-to-str [tally fn-key-to-str]
(with-out-str
(let [total (reduce + (map getcnt (vals tally)))
cmp-keys (fn [k1 k2]
(let [cnt1 (int (getcnt (get tally k1)))
cnt2 (int (getcnt (get tally k2)))]
(if (not= cnt1 cnt2)
(- cnt2 cnt1)
(let [^String s1 (fn-key-to-str k1)
^String s2 (fn-key-to-str k2)]
(.compareTo s1 s2)))))]
(doseq [k (sort cmp-keys (keys tally))]
(printf "%s %.3f\n" (fn-key-to-str k)
(double (* 100 (/ (getcnt (get tally k)) total))))))))
(defn one-tally-to-str [dna-str tally]
(let [zerotc (TallyCounter. 0)]
(format "%d\t%s" (getcnt (get tally (dna-str-to-key dna-str) zerotc))
dna-str)))
(defn compute-one-part [dna-str part]
[part
(condp = part
0 (all-tally-to-str (tally-dna-subs-with-len 1 dna-str)
(fn [k] (key-to-dna-str k 1)))
1 (all-tally-to-str (tally-dna-subs-with-len 2 dna-str)
(fn [k] (key-to-dna-str k 2)))
2 (one-tally-to-str "GGT"
(tally-dna-subs-with-len 3 dna-str))
3 (one-tally-to-str "GGTA"
(tally-dna-subs-with-len 4 dna-str))
4 (one-tally-to-str "GGTATT"
(tally-dna-subs-with-len 6 dna-str))
5 (one-tally-to-str "GGTATTTTAATT"
(tally-dna-subs-with-len 12 dna-str))
6 (one-tally-to-str "GGTATTTTAATTTATAGT"
(tally-dna-subs-with-len 18 dna-str)))])
(def +default-modified-pmap-num-threads+
(+ 2 (.. Runtime getRuntime availableProcessors)))
(defn -main [& args]
(def num-threads
(if (and (>= (count args) 1)
(re-matches #"^\d+$" (nth args 0)))
(let [n (. Integer valueOf (nth args 0) 10)]
(if (== n 0)
+default-modified-pmap-num-threads+
n))
+default-modified-pmap-num-threads+))
(with-open [br (java.io.BufferedReader. *in*)]
(let [dna-str (fasta-dna-str-with-desc-beginning "THREE" (line-seq br))
unlikely that parts 5 and 6 will be computed concurrently .
Those are the two that take the most memory . It would be
nice if we could specify a DAG for which jobs should finish
before others begin -- then we could prevent those two
results (map second
(sort #(< (first %1) (first %2))
(modified-pmap num-threads
#(compute-one-part dna-str %)
'(0 5 6 1 2 3 4)
)))]
(doseq [r results]
(println r)
(flush))))
(shutdown-agents))
|
07833a8489b36df4f791569108718e2fa848fa5287cba894fe15f8c4dbed2096 | erlang/erlide_kernel | erlide_user.erl | ` ` The contents of this file are subject to the Erlang Public License ,
Version 1.1 , ( the " License " ) ; you may not use this file except in
%% compliance with the License. You should have received a copy of the
%% Erlang Public License along with this software. If not, it can be
%% retrieved via the world wide web at /.
%%
Software distributed under the License is distributed on an " AS IS "
%% basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See
%% the License for the specific language governing rights and limitations
%% under the License.
%%
The Initial Developer of the Original Code is Ericsson Utvecklings AB .
Portions created by Ericsson are Copyright 1999 ,
%% AB. All Rights Reserved.''
%%
%% $Id$
%%
-module(erlide_user).
%% Basic standard i/o server for user interface port.
-export([start/0, start/1, start_out/0]).
-export([interfaces/1]).
-define(NAME, user).
%% Internal exports
-export([server/1, server/2]).
%% Defines for control ops
-define(CTRL_OP_GET_WINSIZE,100).
%%
%% The basic server and start-up.
%%
start() ->
start_port([eof,binary]).
start([Mod,Fun|Args]) ->
Mod , Fun , should return a pid . That process is supposed to act
%% as the io port.
Pid = apply(Mod, Fun, Args), % This better work!
Id = spawn(?MODULE, server, [Pid]),
register(?NAME, Id),
Id.
start_out() ->
%% Output-only version of start/0
start_port([out,binary]).
start_port(PortSettings) ->
Id = spawn(?MODULE,server,[{fd,0,1},PortSettings]),
register(?NAME,Id),
Id.
%% Return the pid of the shell process.
%% Note: We can't ask the user process for this info since it
%% may be busy waiting for data from the port.
interfaces(User) ->
case process_info(User, dictionary) of
{dictionary,Dict} ->
case lists:keysearch(shell, 1, Dict) of
{value,Sh={shell,Shell}} when is_pid(Shell) ->
[Sh];
_ ->
[]
end;
_ ->
[]
end.
server(Pid) when is_pid(Pid) ->
process_flag(trap_exit, true),
link(Pid),
run(Pid).
server(PortName,PortSettings) ->
process_flag(trap_exit, true),
Port = open_port(PortName,PortSettings),
run(Port).
run(P) ->
put(read_mode,list),
case init:get_argument(noshell) of
non - empty list - > noshell
{ok, [_|_]} ->
put(shell, noshell),
server_loop(P, queue:new());
_ ->
group_leader(self(), self()),
catch_loop(P, start_init_shell())
end.
catch_loop(Port, Shell) ->
catch_loop(Port, Shell, queue:new()).
catch_loop(Port, Shell, Q) ->
case catch server_loop(Port, Q) of
new_shell ->
exit(Shell, kill),
catch_loop(Port, start_new_shell());
{unknown_exit,{Shell,Reason},_} -> % shell has exited
case Reason of
normal ->
put_chars("*** ", Port, []);
_ ->
put_chars("*** ERROR: ", Port, [])
end,
put_chars("Shell process terminated! ***\n", Port, []),
catch_loop(Port, start_new_shell());
{unknown_exit,_,Q1} ->
catch_loop(Port, Shell, Q1);
{'EXIT',R} ->
exit(R)
end.
link_and_save_shell(Shell) ->
link(Shell),
put(shell, Shell),
Shell.
start_init_shell() ->
link_and_save_shell(shell:start(init)).
start_new_shell() ->
link_and_save_shell(shell:start()).
server_loop(Port, Q) ->
receive
{Port,{data,Bytes}} ->
case get(shell) of
noshell ->
server_loop(Port, queue:snoc(Q, Bytes));
_ ->
case contains_ctrl_g_or_ctrl_c(Bytes) of
false ->
server_loop(Port, queue:snoc(Q, Bytes));
_ ->
throw(new_shell)
end
end;
{io_request,From,ReplyAs,Request}=Msg when is_pid(From) ->
erlide_log:logp(Msg),
server_loop(Port, do_io_request(Request, From, ReplyAs, Port, Q));
{Port, eof} ->
put(eof, true),
server_loop(Port, Q);
%% Ignore messages from port here.
{'EXIT',Port,badsig} -> % Ignore badsig errors
server_loop(Port, Q);
{'EXIT',Port,What} -> % Port has exited
exit(What);
%% Check if shell has exited
{'EXIT',SomePid,What} ->
case get(shell) of
noshell ->
server_loop(Port, Q); % Ignore
_ ->
throw({unknown_exit,{SomePid,What},Q})
end;
_Other -> % Ignore other messages
server_loop(Port, Q)
end.
get_fd_geometry(Port) ->
case (catch port_control(Port,?CTRL_OP_GET_WINSIZE,[])) of
List when is_list(List), length(List) =:= 8 ->
<<W:32/native,H:32/native>> = list_to_binary(List),
{W,H};
_ ->
error
end.
NewSaveBuffer , FromPid , ReplyAs , Port , SaveBuffer )
do_io_request(Req, From, ReplyAs, Port, Q0) ->
case io_request(Req, Port, Q0) of
{_Status,Reply,Q1} ->
io_reply(From, ReplyAs, Reply),
Q1;
{exit,What} ->
send_port(Port, close),
exit(What)
end.
io_request({put_chars,Chars}, Port, Q) -> % Binary new in R9C
put_chars(Chars, Port, Q);
io_request({put_chars,Mod,Func,Args}, Port, Q) ->
put_chars(catch apply(Mod,Func,Args), Port, Q);
io_request({get_chars,Prompt,N}, Port, Q) -> % New in R9C
get_chars(Prompt, io_lib, collect_chars, N, Port, Q);
%% New in R12
io_request({get_geometry,columns},Port,Q) ->
case get_fd_geometry(Port) of
{W,_H} ->
{ok,W,Q};
_ ->
{error,{error,enotsup},Q}
end;
io_request({get_geometry,rows},Port,Q) ->
case get_fd_geometry(Port) of
{_W,H} ->
{ok,H,Q};
_ ->
{error,{error,enotsup},Q}
end;
%% These are new in R9C
io_request({get_chars,Prompt,Mod,Func,XtraArg}, Port, Q) ->
: , Q } ) ,
get_chars(Prompt, Mod, Func, XtraArg, Port, Q);
io_request({get_line,Prompt}, Port, Q) ->
: , Q } ) ,
get_chars(Prompt, io_lib, collect_line, [], Port, Q);
io_request({setopts,Opts}, Port, Q) when is_list(Opts) ->
setopts(Opts, Port, Q);
%% End of new in R9C
io_request({get_until,Prompt,M,F,As}, Port, Q) ->
get_chars(Prompt, io_lib, get_until, {M,F,As}, Port, Q);
io_request({requests,Reqs}, Port, Q) ->
io_requests(Reqs, {ok,ok,Q}, Port);
io_request(R, _Port, Q) -> %Unknown request
{error,{error,{request,R}},Q}. %Ignore but give error (?)
Status = io_requests(RequestList , PrevStat , Port )
%% Process a list of output requests as long as the previous status is 'ok'.
io_requests([R|Rs], {ok,_Res,Q}, Port) ->
io_requests(Rs, io_request(R, Port, Q), Port);
io_requests([_|_], Error, _) ->
Error;
io_requests([], Stat, _) ->
Stat.
put_port(DeepList , Port )
%% Take a deep list of characters, flatten and output them to the
%% port.
put_port(List, Port) ->
send_port(Port, {command, List}).
%% send_port(Port, Command)
send_port(Port, Command) ->
Port ! {self(),Command}.
io_reply(From , , Reply )
%% The function for sending i/o command acknowledgement.
%% The ACK contains the return value.
io_reply(From, ReplyAs, Reply) ->
From ! {io_reply,ReplyAs,Reply}.
%% put_chars
put_chars(Chars, Port, Q) when is_binary(Chars) ->
put_port(Chars, Port),
{ok,ok,Q};
put_chars(Chars, Port, Q) ->
case catch list_to_binary(Chars) of
Binary when is_binary(Binary) ->
put_chars(Binary, Port, Q);
_ ->
{error,{error,put_chars},Q}
end.
%% setopts
setopts(Opts0, _Port, Q) ->
Opts = proplists:substitute_negations([{list,binary}], Opts0),
case proplists:get_value(binary, Opts) of
true ->
put(read_mode,binary),
{ok,ok,Q};
false ->
put(read_mode,list),
{ok,ok,Q};
_ ->
{error,{error,badarg},Q}
end.
get_chars(Prompt , Module , Function , XtraArg , Port , Queue )
%% Gets characters from the input port until the applied function
returns { stop , Result , RestBuf } . Does not block output until input
%% has been received.
%% Returns:
{ Status , Result , NewQueue }
%% {exit,Reason}
%% Entry function.
get_chars(Prompt, M, F, Xa, Port, Q) ->
prompt(Port, Prompt),
case {get(eof),queue:is_empty(Q)} of
{true,true} ->
{ok,eof,Q};
_ ->
get_chars(Prompt, M, F, Xa, Port, Q, start)
end.
First loop . Wait for port data . Respond to output requests .
get_chars(Prompt, M, F, Xa, Port, Q, State) ->
case queue:is_empty(Q) of
true ->
receive
{Port,{data,Bytes}} ->
get_chars_bytes(State, M, F, Xa, Port, Q, Bytes);
{Port, eof} ->
put(eof, true),
{ok, eof, []};
{ io_request , From , ReplyAs , Request } when is_pid(From ) - >
get_chars_req(Prompt , M , F , Xa , Port , queue : new ( ) , State ,
Request , From , ) ;
{io_request,From,ReplyAs,{get_geometry,_}=Req} when is_pid(From) ->
do_io_request(Req, From, ReplyAs, Port,
queue:new()), %Keep Q over this call
%% No prompt.
get_chars(Prompt, M, F, Xa, Port, Q, State);
{io_request,From,ReplyAs,Request} when is_pid(From) ->
get_chars_req(Prompt, M, F, Xa, Port, Q, State,
Request, From, ReplyAs);
{'EXIT',From,What} when node(From) =:= node() ->
{exit,What}
end;
false ->
get_chars_apply(State, M, F, Xa, Port, Q)
end.
get_chars_req(Prompt, M, F, XtraArg, Port, Q, State,
Req, From, ReplyAs) ->
do_io_request(Req, From, ReplyAs, Port, queue:new()), %Keep Q over this call
prompt(Port, Prompt),
get_chars(Prompt, M, F, XtraArg, Port, Q, State).
Second loop . Pass data to client as long as it wants more .
%% A ^G in data interrupts loop if 'noshell' is not undefined.
get_chars_bytes(State, M, F, Xa, Port, Q, Bytes) ->
case get(shell) of
noshell ->
get_chars_apply(State, M, F, Xa, Port, queue:snoc(Q, Bytes));
_ ->
case contains_ctrl_g_or_ctrl_c(Bytes) of
false ->
get_chars_apply(State, M, F, Xa, Port,
queue:snoc(Q, Bytes));
_ ->
throw(new_shell)
end
end.
get_chars_apply(State0, M, F, Xa, Port, Q) ->
case catch M:F(State0, cast(queue:head(Q)), Xa) of
{stop,Result,<<>>} ->
{ok,Result,queue:tail(Q)};
{stop,Result,[]} ->
{ok,Result,queue:tail(Q)};
{stop,Result,eof} ->
{ok,Result,queue:tail(Q)};
{stop,Result,Buf} ->
{ok,Result,queue:cons(Buf, queue:tail(Q))};
{'EXIT',_} ->
{error,{error,err_func(M, F, Xa)},[]};
State1 ->
get_chars_more(State1, M, F, Xa, Port, queue:tail(Q))
end.
get_chars_more(State, M, F, Xa, Port, Q) ->
case queue:is_empty(Q) of
true ->
case get(eof) of
undefined ->
receive
{Port,{data,Bytes}} ->
get_chars_bytes(State, M, F, Xa, Port, Q, Bytes);
{Port,eof} ->
put(eof, true),
get_chars_apply(State, M, F, Xa, Port,
queue:snoc(Q, eof));
{'EXIT',From,What} when node(From) =:= node() ->
{exit,What}
end;
_ ->
get_chars_apply(State, M, F, Xa, Port, queue:snoc(Q, eof))
end;
false ->
get_chars_apply(State, M, F, Xa, Port, Q)
end.
%% prompt(Port, Prompt)
%% Print Prompt onto Port
common case , reduces execution time by 20 %
prompt(_Port, '') -> ok;
prompt(Port, Prompt) ->
put_port(io_lib:format_prompt(Prompt), Port).
Convert error code to make it look as before
err_func(io_lib, get_until, {_,F,_}) ->
F;
err_func(_, F, _) ->
F.
using regexp reduces execution time by > 50 % compared to old code
running two regexps in sequence is much faster than \\x03|\\x07
contains_ctrl_g_or_ctrl_c(BinOrList)->
case {re:run(BinOrList, <<3>>),re:run(BinOrList, <<7>>)} of
{nomatch, nomatch} -> false;
_ -> true
end.
%% Convert a buffer between list and binary
cast(Data) ->
cast(Data, get(read_mode)).
cast(List, binary) when is_list(List) ->
list_to_binary(List);
cast(Binary, list) when is_binary(Binary) ->
binary_to_list(Binary);
cast(Data, _) ->
Data.
| null | https://raw.githubusercontent.com/erlang/erlide_kernel/763a7fe47213f374b59862fd5a17d5dcc2811c7b/common/apps/erlide_common/src/erlide_user.erl | erlang | compliance with the License. You should have received a copy of the
Erlang Public License along with this software. If not, it can be
retrieved via the world wide web at /.
basis, WITHOUT WARRANTY OF ANY KIND, either express or implied. See
the License for the specific language governing rights and limitations
under the License.
AB. All Rights Reserved.''
$Id$
Basic standard i/o server for user interface port.
Internal exports
Defines for control ops
The basic server and start-up.
as the io port.
This better work!
Output-only version of start/0
Return the pid of the shell process.
Note: We can't ask the user process for this info since it
may be busy waiting for data from the port.
shell has exited
Ignore messages from port here.
Ignore badsig errors
Port has exited
Check if shell has exited
Ignore
Ignore other messages
Binary new in R9C
New in R9C
New in R12
These are new in R9C
End of new in R9C
Unknown request
Ignore but give error (?)
Process a list of output requests as long as the previous status is 'ok'.
Take a deep list of characters, flatten and output them to the
port.
send_port(Port, Command)
The function for sending i/o command acknowledgement.
The ACK contains the return value.
put_chars
setopts
Gets characters from the input port until the applied function
has been received.
Returns:
{exit,Reason}
Entry function.
Keep Q over this call
No prompt.
Keep Q over this call
A ^G in data interrupts loop if 'noshell' is not undefined.
prompt(Port, Prompt)
Print Prompt onto Port
compared to old code
Convert a buffer between list and binary | ` ` The contents of this file are subject to the Erlang Public License ,
Version 1.1 , ( the " License " ) ; you may not use this file except in
Software distributed under the License is distributed on an " AS IS "
The Initial Developer of the Original Code is Ericsson Utvecklings AB .
Portions created by Ericsson are Copyright 1999 ,
-module(erlide_user).
-export([start/0, start/1, start_out/0]).
-export([interfaces/1]).
-define(NAME, user).
-export([server/1, server/2]).
-define(CTRL_OP_GET_WINSIZE,100).
start() ->
start_port([eof,binary]).
start([Mod,Fun|Args]) ->
Mod , Fun , should return a pid . That process is supposed to act
Id = spawn(?MODULE, server, [Pid]),
register(?NAME, Id),
Id.
start_out() ->
start_port([out,binary]).
start_port(PortSettings) ->
Id = spawn(?MODULE,server,[{fd,0,1},PortSettings]),
register(?NAME,Id),
Id.
interfaces(User) ->
case process_info(User, dictionary) of
{dictionary,Dict} ->
case lists:keysearch(shell, 1, Dict) of
{value,Sh={shell,Shell}} when is_pid(Shell) ->
[Sh];
_ ->
[]
end;
_ ->
[]
end.
server(Pid) when is_pid(Pid) ->
process_flag(trap_exit, true),
link(Pid),
run(Pid).
server(PortName,PortSettings) ->
process_flag(trap_exit, true),
Port = open_port(PortName,PortSettings),
run(Port).
run(P) ->
put(read_mode,list),
case init:get_argument(noshell) of
non - empty list - > noshell
{ok, [_|_]} ->
put(shell, noshell),
server_loop(P, queue:new());
_ ->
group_leader(self(), self()),
catch_loop(P, start_init_shell())
end.
catch_loop(Port, Shell) ->
catch_loop(Port, Shell, queue:new()).
catch_loop(Port, Shell, Q) ->
case catch server_loop(Port, Q) of
new_shell ->
exit(Shell, kill),
catch_loop(Port, start_new_shell());
case Reason of
normal ->
put_chars("*** ", Port, []);
_ ->
put_chars("*** ERROR: ", Port, [])
end,
put_chars("Shell process terminated! ***\n", Port, []),
catch_loop(Port, start_new_shell());
{unknown_exit,_,Q1} ->
catch_loop(Port, Shell, Q1);
{'EXIT',R} ->
exit(R)
end.
link_and_save_shell(Shell) ->
link(Shell),
put(shell, Shell),
Shell.
start_init_shell() ->
link_and_save_shell(shell:start(init)).
start_new_shell() ->
link_and_save_shell(shell:start()).
server_loop(Port, Q) ->
receive
{Port,{data,Bytes}} ->
case get(shell) of
noshell ->
server_loop(Port, queue:snoc(Q, Bytes));
_ ->
case contains_ctrl_g_or_ctrl_c(Bytes) of
false ->
server_loop(Port, queue:snoc(Q, Bytes));
_ ->
throw(new_shell)
end
end;
{io_request,From,ReplyAs,Request}=Msg when is_pid(From) ->
erlide_log:logp(Msg),
server_loop(Port, do_io_request(Request, From, ReplyAs, Port, Q));
{Port, eof} ->
put(eof, true),
server_loop(Port, Q);
server_loop(Port, Q);
exit(What);
{'EXIT',SomePid,What} ->
case get(shell) of
noshell ->
_ ->
throw({unknown_exit,{SomePid,What},Q})
end;
server_loop(Port, Q)
end.
get_fd_geometry(Port) ->
case (catch port_control(Port,?CTRL_OP_GET_WINSIZE,[])) of
List when is_list(List), length(List) =:= 8 ->
<<W:32/native,H:32/native>> = list_to_binary(List),
{W,H};
_ ->
error
end.
NewSaveBuffer , FromPid , ReplyAs , Port , SaveBuffer )
do_io_request(Req, From, ReplyAs, Port, Q0) ->
case io_request(Req, Port, Q0) of
{_Status,Reply,Q1} ->
io_reply(From, ReplyAs, Reply),
Q1;
{exit,What} ->
send_port(Port, close),
exit(What)
end.
put_chars(Chars, Port, Q);
io_request({put_chars,Mod,Func,Args}, Port, Q) ->
put_chars(catch apply(Mod,Func,Args), Port, Q);
get_chars(Prompt, io_lib, collect_chars, N, Port, Q);
io_request({get_geometry,columns},Port,Q) ->
case get_fd_geometry(Port) of
{W,_H} ->
{ok,W,Q};
_ ->
{error,{error,enotsup},Q}
end;
io_request({get_geometry,rows},Port,Q) ->
case get_fd_geometry(Port) of
{_W,H} ->
{ok,H,Q};
_ ->
{error,{error,enotsup},Q}
end;
io_request({get_chars,Prompt,Mod,Func,XtraArg}, Port, Q) ->
: , Q } ) ,
get_chars(Prompt, Mod, Func, XtraArg, Port, Q);
io_request({get_line,Prompt}, Port, Q) ->
: , Q } ) ,
get_chars(Prompt, io_lib, collect_line, [], Port, Q);
io_request({setopts,Opts}, Port, Q) when is_list(Opts) ->
setopts(Opts, Port, Q);
io_request({get_until,Prompt,M,F,As}, Port, Q) ->
get_chars(Prompt, io_lib, get_until, {M,F,As}, Port, Q);
io_request({requests,Reqs}, Port, Q) ->
io_requests(Reqs, {ok,ok,Q}, Port);
Status = io_requests(RequestList , PrevStat , Port )
io_requests([R|Rs], {ok,_Res,Q}, Port) ->
io_requests(Rs, io_request(R, Port, Q), Port);
io_requests([_|_], Error, _) ->
Error;
io_requests([], Stat, _) ->
Stat.
put_port(DeepList , Port )
put_port(List, Port) ->
send_port(Port, {command, List}).
send_port(Port, Command) ->
Port ! {self(),Command}.
io_reply(From , , Reply )
io_reply(From, ReplyAs, Reply) ->
From ! {io_reply,ReplyAs,Reply}.
put_chars(Chars, Port, Q) when is_binary(Chars) ->
put_port(Chars, Port),
{ok,ok,Q};
put_chars(Chars, Port, Q) ->
case catch list_to_binary(Chars) of
Binary when is_binary(Binary) ->
put_chars(Binary, Port, Q);
_ ->
{error,{error,put_chars},Q}
end.
setopts(Opts0, _Port, Q) ->
Opts = proplists:substitute_negations([{list,binary}], Opts0),
case proplists:get_value(binary, Opts) of
true ->
put(read_mode,binary),
{ok,ok,Q};
false ->
put(read_mode,list),
{ok,ok,Q};
_ ->
{error,{error,badarg},Q}
end.
get_chars(Prompt , Module , Function , XtraArg , Port , Queue )
returns { stop , Result , RestBuf } . Does not block output until input
{ Status , Result , NewQueue }
get_chars(Prompt, M, F, Xa, Port, Q) ->
prompt(Port, Prompt),
case {get(eof),queue:is_empty(Q)} of
{true,true} ->
{ok,eof,Q};
_ ->
get_chars(Prompt, M, F, Xa, Port, Q, start)
end.
First loop . Wait for port data . Respond to output requests .
get_chars(Prompt, M, F, Xa, Port, Q, State) ->
case queue:is_empty(Q) of
true ->
receive
{Port,{data,Bytes}} ->
get_chars_bytes(State, M, F, Xa, Port, Q, Bytes);
{Port, eof} ->
put(eof, true),
{ok, eof, []};
{ io_request , From , ReplyAs , Request } when is_pid(From ) - >
get_chars_req(Prompt , M , F , Xa , Port , queue : new ( ) , State ,
Request , From , ) ;
{io_request,From,ReplyAs,{get_geometry,_}=Req} when is_pid(From) ->
do_io_request(Req, From, ReplyAs, Port,
get_chars(Prompt, M, F, Xa, Port, Q, State);
{io_request,From,ReplyAs,Request} when is_pid(From) ->
get_chars_req(Prompt, M, F, Xa, Port, Q, State,
Request, From, ReplyAs);
{'EXIT',From,What} when node(From) =:= node() ->
{exit,What}
end;
false ->
get_chars_apply(State, M, F, Xa, Port, Q)
end.
get_chars_req(Prompt, M, F, XtraArg, Port, Q, State,
Req, From, ReplyAs) ->
prompt(Port, Prompt),
get_chars(Prompt, M, F, XtraArg, Port, Q, State).
Second loop . Pass data to client as long as it wants more .
get_chars_bytes(State, M, F, Xa, Port, Q, Bytes) ->
case get(shell) of
noshell ->
get_chars_apply(State, M, F, Xa, Port, queue:snoc(Q, Bytes));
_ ->
case contains_ctrl_g_or_ctrl_c(Bytes) of
false ->
get_chars_apply(State, M, F, Xa, Port,
queue:snoc(Q, Bytes));
_ ->
throw(new_shell)
end
end.
get_chars_apply(State0, M, F, Xa, Port, Q) ->
case catch M:F(State0, cast(queue:head(Q)), Xa) of
{stop,Result,<<>>} ->
{ok,Result,queue:tail(Q)};
{stop,Result,[]} ->
{ok,Result,queue:tail(Q)};
{stop,Result,eof} ->
{ok,Result,queue:tail(Q)};
{stop,Result,Buf} ->
{ok,Result,queue:cons(Buf, queue:tail(Q))};
{'EXIT',_} ->
{error,{error,err_func(M, F, Xa)},[]};
State1 ->
get_chars_more(State1, M, F, Xa, Port, queue:tail(Q))
end.
get_chars_more(State, M, F, Xa, Port, Q) ->
case queue:is_empty(Q) of
true ->
case get(eof) of
undefined ->
receive
{Port,{data,Bytes}} ->
get_chars_bytes(State, M, F, Xa, Port, Q, Bytes);
{Port,eof} ->
put(eof, true),
get_chars_apply(State, M, F, Xa, Port,
queue:snoc(Q, eof));
{'EXIT',From,What} when node(From) =:= node() ->
{exit,What}
end;
_ ->
get_chars_apply(State, M, F, Xa, Port, queue:snoc(Q, eof))
end;
false ->
get_chars_apply(State, M, F, Xa, Port, Q)
end.
prompt(_Port, '') -> ok;
prompt(Port, Prompt) ->
put_port(io_lib:format_prompt(Prompt), Port).
Convert error code to make it look as before
err_func(io_lib, get_until, {_,F,_}) ->
F;
err_func(_, F, _) ->
F.
running two regexps in sequence is much faster than \\x03|\\x07
contains_ctrl_g_or_ctrl_c(BinOrList)->
case {re:run(BinOrList, <<3>>),re:run(BinOrList, <<7>>)} of
{nomatch, nomatch} -> false;
_ -> true
end.
cast(Data) ->
cast(Data, get(read_mode)).
cast(List, binary) when is_list(List) ->
list_to_binary(List);
cast(Binary, list) when is_binary(Binary) ->
binary_to_list(Binary);
cast(Data, _) ->
Data.
|
59f4008201d12488540f81f6dfbc0281ed7375911dda1ca07739524a46ab7204 | opencog/learn | disjunct-stats-p6.scm | ;
; disjunct-stats-p6.scm
;
; Assorted ad-hoc collection of tools for understanding the
word - disjunct MI distribution . This is an extract of the
earlier , 2017 - era disjunct-stats.scm file , redone for Part Six
; of the diary, i.e. with newer datasets.
;
Copyright ( c ) 2022 Linas Vepstas
(use-modules (srfi srfi-1))
(define pca (make-pseudo-cset-api))
(define psa (add-pair-stars pca))
(define psf (add-pair-freq-api psa))
(define psu (add-support-api psa))
(psa 'fetch-pairs)
; ---------------------------------------------------------------------
; Ranking and printing utilities
;
Assign each item a score , using SCORE - FN
(define (score SCORE-FN ITEM-LIST)
(map (lambda (wrd) (cons (SCORE-FN wrd) wrd)) ITEM-LIST))
(define (prt-hist bins fname)
(define outport (open-file fname "w"))
(print-bincounts-tsv bins outport)
(close outport))
; ---------------------------------------------------------------------
Bin - count word - disjunct pairs according to thier fractional MI .
(define all-sects (psa 'get-all-elts))
(define (sect-mi SECT) (psf 'pair-fmi SECT))
(define scored-sect-mi (score sect-mi all-sects))
(define binned-sect-mi (bin-count-simple scored-sect-mi 200))
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-10-4-2.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-5-2-2.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-2-2-2.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-1-1-1.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-0-0-0.dat")
(define (sect-freq SECT) (psf 'pair-freq SECT))
(define weighted-sect-mi
(bin-count-weighted scored-sect-mi 200
(lambda (scored-item) (sect-freq (cdr scored-item)))))
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-10-4-2.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-5-2-2.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-2-2-2.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-1-1-1.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-0-0-0.dat" )
; -------------
| null | https://raw.githubusercontent.com/opencog/learn/64eb5a2d27b1fc4bd2ff76e6909f7b73a003a723/learn-lang-diary/utils/disjunct-stats-p6.scm | scheme |
disjunct-stats-p6.scm
Assorted ad-hoc collection of tools for understanding the
of the diary, i.e. with newer datasets.
---------------------------------------------------------------------
Ranking and printing utilities
---------------------------------------------------------------------
------------- | word - disjunct MI distribution . This is an extract of the
earlier , 2017 - era disjunct-stats.scm file , redone for Part Six
Copyright ( c ) 2022 Linas Vepstas
(use-modules (srfi srfi-1))
(define pca (make-pseudo-cset-api))
(define psa (add-pair-stars pca))
(define psf (add-pair-freq-api psa))
(define psu (add-support-api psa))
(psa 'fetch-pairs)
Assign each item a score , using SCORE - FN
(define (score SCORE-FN ITEM-LIST)
(map (lambda (wrd) (cons (SCORE-FN wrd) wrd)) ITEM-LIST))
(define (prt-hist bins fname)
(define outport (open-file fname "w"))
(print-bincounts-tsv bins outport)
(close outport))
Bin - count word - disjunct pairs according to thier fractional MI .
(define all-sects (psa 'get-all-elts))
(define (sect-mi SECT) (psf 'pair-fmi SECT))
(define scored-sect-mi (score sect-mi all-sects))
(define binned-sect-mi (bin-count-simple scored-sect-mi 200))
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-10-4-2.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-5-2-2.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-2-2-2.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-1-1-1.dat")
(prt-hist binned-sect-mi "/tmp/r4-sect-mi-0-0-0.dat")
(define (sect-freq SECT) (psf 'pair-freq SECT))
(define weighted-sect-mi
(bin-count-weighted scored-sect-mi 200
(lambda (scored-item) (sect-freq (cdr scored-item)))))
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-10-4-2.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-5-2-2.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-2-2-2.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-1-1-1.dat" )
(prt-hist weighted-sect-mi "/tmp/r4-wei-mi-0-0-0.dat" )
|
c34f1cf631792e8a2c2fe224ccea0420ee2b6c5746be7c246bad826d2c0f5ee3 | RedPRL/algaett | Locate.ml | open Earley_core
let lexing_position i c =
Lexing.{
pos_fname = Input.filename i;
pos_lnum = Input.line_num i;
pos_bol = Input.line_offset i;
pos_cnum = Input.line_offset i + c;
(* XXX: How to do UTF-8 here? Should be possible with Input.utf8_col_num. *)
}
let locate i1 c1 i2 c2 =
Elaborator.Syntax.{
start = lexing_position i1 c1;
stop = lexing_position i2 c2;
}
let located p =
p |> Earley.apply_position @@ fun i1 c1 i2 c2 x ->
Elaborator.Syntax.{
node = x;
loc = Some (locate i1 c1 i2 c2);
}
| null | https://raw.githubusercontent.com/RedPRL/algaett/5685e49608aaf230f5691f3e339cd2bd3acb6ec9/src/parser/Locate.ml | ocaml | XXX: How to do UTF-8 here? Should be possible with Input.utf8_col_num. | open Earley_core
let lexing_position i c =
Lexing.{
pos_fname = Input.filename i;
pos_lnum = Input.line_num i;
pos_bol = Input.line_offset i;
pos_cnum = Input.line_offset i + c;
}
let locate i1 c1 i2 c2 =
Elaborator.Syntax.{
start = lexing_position i1 c1;
stop = lexing_position i2 c2;
}
let located p =
p |> Earley.apply_position @@ fun i1 c1 i2 c2 x ->
Elaborator.Syntax.{
node = x;
loc = Some (locate i1 c1 i2 c2);
}
|
9595a2935daa331dd5021910f4389c6fc32670e4ccb5fc85d6de5072e588303c | nikomatsakis/a-mir-formality | crate-decl.rkt | #lang racket
(require redex/reduction-semantics
"../grammar.rkt"
"crate-item.rkt"
)
(provide crate-decl-rules
)
(define-metafunction formality-decl
Generate the complete set of rules that result from ` CrateDecl `
when checking the crate ` CrateId ` .
;;
NB : This assumes that we can compile to a complete set of
;; clauses. This will eventually not suffice, e.g., with
;; auto traits. But this helper is private, so we can refactor
;; that later.
crate-decl-rules : CrateDecls CrateDecl CrateId -> (Clauses Invariants)
[(crate-decl-rules CrateDecls (crate CrateId_0 (CrateItemDecl ...)) CrateId_1)
((Clause ... ...) (Invariant ... ...))
(where/error (((Clause ...) (Invariant ...)) ...)
((crate-item-decl-rules CrateDecls CrateId_0 CrateItemDecl) ...))
]
) | null | https://raw.githubusercontent.com/nikomatsakis/a-mir-formality/71be4d5c4bd5e91d326277eaedd19a7abe3ac76a/racket-src/decl/decl-to-clause/crate-decl.rkt | racket |
clauses. This will eventually not suffice, e.g., with
auto traits. But this helper is private, so we can refactor
that later. | #lang racket
(require redex/reduction-semantics
"../grammar.rkt"
"crate-item.rkt"
)
(provide crate-decl-rules
)
(define-metafunction formality-decl
Generate the complete set of rules that result from ` CrateDecl `
when checking the crate ` CrateId ` .
NB : This assumes that we can compile to a complete set of
crate-decl-rules : CrateDecls CrateDecl CrateId -> (Clauses Invariants)
[(crate-decl-rules CrateDecls (crate CrateId_0 (CrateItemDecl ...)) CrateId_1)
((Clause ... ...) (Invariant ... ...))
(where/error (((Clause ...) (Invariant ...)) ...)
((crate-item-decl-rules CrateDecls CrateId_0 CrateItemDecl) ...))
]
) |
5692c1c99493420c69d23ba0e26b8655a8f7330368f353878516d2a9b3f049e6 | hasktorch/hasktorch | Autograd.hs | # LANGUAGE DeriveGeneric #
# LANGUAGE FlexibleInstances #
# LANGUAGE MultiParamTypeClasses #
# LANGUAGE ScopedTypeVariables #
module Torch.Autograd where
import Foreign.ForeignPtr
import GHC.Generics
import System.IO.Unsafe
import Torch.Internal.Cast
import Torch.Internal.Class
import qualified Torch.Internal.Managed.Autograd
import qualified Torch.Internal.Managed.Type.Tensor as ATen
import qualified Torch.Internal.Type as ATen
import Torch.Tensor
| Note : to create an ` IndependentTensor ` use ` makeIndependent ` ;
| otherwise , Torch will complain the parameter does not require a gradient .
newtype IndependentTensor = IndependentTensor
{ toDependent :: Tensor
}
deriving (Show, Generic)
grad :: Tensor -> [IndependentTensor] -> [Tensor]
grad y inputs = unsafePerformIO $ cast2 Torch.Internal.Managed.Autograd.grad y (map toDependent inputs)
requiresGrad :: Tensor -> Bool
requiresGrad t = unsafePerformIO $ cast1 ATen.tensor_requires_grad t
setRequiresGrad :: Bool -> Tensor -> Tensor
setRequiresGrad flag t = unsafePerformIO $ cast2 ATen.tensor_set_requires_grad_b t flag
makeIndependent :: Tensor -> IO IndependentTensor
makeIndependent tensor = makeIndependentWithRequiresGrad tensor True
makeIndependentWithRequiresGrad :: Tensor -> Bool -> IO IndependentTensor
makeIndependentWithRequiresGrad tensor requires_grad = IndependentTensor <$> cast2 Torch.Internal.Managed.Autograd.makeIndependent tensor requires_grad
| null | https://raw.githubusercontent.com/hasktorch/hasktorch/0e942d6af90585662463c913acfbbe8036644cb1/hasktorch/src/Torch/Autograd.hs | haskell | # LANGUAGE DeriveGeneric #
# LANGUAGE FlexibleInstances #
# LANGUAGE MultiParamTypeClasses #
# LANGUAGE ScopedTypeVariables #
module Torch.Autograd where
import Foreign.ForeignPtr
import GHC.Generics
import System.IO.Unsafe
import Torch.Internal.Cast
import Torch.Internal.Class
import qualified Torch.Internal.Managed.Autograd
import qualified Torch.Internal.Managed.Type.Tensor as ATen
import qualified Torch.Internal.Type as ATen
import Torch.Tensor
| Note : to create an ` IndependentTensor ` use ` makeIndependent ` ;
| otherwise , Torch will complain the parameter does not require a gradient .
newtype IndependentTensor = IndependentTensor
{ toDependent :: Tensor
}
deriving (Show, Generic)
grad :: Tensor -> [IndependentTensor] -> [Tensor]
grad y inputs = unsafePerformIO $ cast2 Torch.Internal.Managed.Autograd.grad y (map toDependent inputs)
requiresGrad :: Tensor -> Bool
requiresGrad t = unsafePerformIO $ cast1 ATen.tensor_requires_grad t
setRequiresGrad :: Bool -> Tensor -> Tensor
setRequiresGrad flag t = unsafePerformIO $ cast2 ATen.tensor_set_requires_grad_b t flag
makeIndependent :: Tensor -> IO IndependentTensor
makeIndependent tensor = makeIndependentWithRequiresGrad tensor True
makeIndependentWithRequiresGrad :: Tensor -> Bool -> IO IndependentTensor
makeIndependentWithRequiresGrad tensor requires_grad = IndependentTensor <$> cast2 Torch.Internal.Managed.Autograd.makeIndependent tensor requires_grad
| |
98d3a3e34820451b460a002e87eef501b78ed4535c106448d6052eaea4fb3641 | erlangonrails/devdb | tester_value_collector.erl | Copyright 2010 fuer Informationstechnik Berlin
%
Licensed under the Apache License , Version 2.0 ( the " License " ) ;
% you may not use this file except in compliance with the License.
% You may obtain a copy of the License at
%
% -2.0
%
% Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an " AS IS " BASIS ,
% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
% See the License for the specific language governing permissions and
% limitations under the License.
%%%-------------------------------------------------------------------
%%% File : tester.erl
Author : < >
%%% Description : value collector for test generator
%%%
Created : 30 April 2010 by < >
%%%-------------------------------------------------------------------
@author < >
2010 fuer Informationstechnik Berlin
@version $ I d : tester_value_collector.erl 906 2010 - 07 - 23 14:09:20Z schuett $
-module(tester_value_collector).
-author('').
-vsn('$Id: tester_value_collector.erl 906 2010-07-23 14:09:20Z schuett $').
-include_lib("unittest.hrl").
-export([parse_expression/2]).
parse_expression(Clauses, ParseState) when is_list(Clauses) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({call, _, Fun, Parameters}, ParseState) ->
parse_expression(Parameters, parse_expression(Fun, ParseState));
parse_expression({'case', _, Value, Clauses}, ParseState) ->
parse_expression(Clauses, parse_expression(Value, ParseState));
parse_expression({clause, _Line, Pattern, _, Clause}, ParseState) ->
parse_expression(Clause, parse_expression(Pattern, ParseState));
parse_expression({cons, _, Head, Tail}, ParseState) ->
parse_expression(Tail, parse_expression(Head, ParseState));
parse_expression({'fun', _, {clauses, Clauses}}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({'fun', _, {function, _Name, _Arity}}, ParseState) ->
ParseState;
parse_expression({'fun', _, {function, _Module, _Name, _Arity}}, ParseState) ->
ParseState;
parse_expression({generate, _, Expression, L}, ParseState) ->
parse_expression(L, parse_expression(Expression, ParseState));
parse_expression({'if', _, Clauses}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({lc, _, Expression, Qualifiers}, ParseState) ->
parse_expression(Qualifiers, parse_expression(Expression, ParseState));
parse_expression({match, _, Left, Right}, ParseState) ->
parse_expression(Left, parse_expression(Right, ParseState));
parse_expression({op, _, _, Value}, ParseState) ->
parse_expression(Value, ParseState);
parse_expression({op, _, _, Left, Right}, ParseState) ->
parse_expression(Left, parse_expression(Right, ParseState));
parse_expression({'receive', _, Clauses}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({'receive', _, Clauses, Timeout, AfterBody}, ParseState) ->
ParseState2 = lists:foldl(fun parse_expression/2, ParseState, Clauses),
ParseState3 = parse_expression(Timeout, ParseState2),
parse_expression(AfterBody, ParseState3);
parse_expression({record, _, _Name, Fields}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Fields);
parse_expression({record, _, _Variable, _Name, Fields}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Fields);
parse_expression({record_field, _, Name, Value}, ParseState) ->
parse_expression(Value, parse_expression(Name, ParseState));
parse_expression({record_field, _, Name, _RecordType, Value}, ParseState) ->
parse_expression(Value, parse_expression(Name, ParseState));
parse_expression({'try', _, Body, CaseClauses, CatchClauses, AfterBody}, ParseState) ->
ParseState2 = parse_expression(Body, ParseState),
ParseState3 = lists:foldl(fun parse_expression/2, ParseState2, CaseClauses),
ParseState4 = lists:foldl(fun parse_expression/2, ParseState3, CatchClauses),
parse_expression(AfterBody, ParseState4);
parse_expression({'catch', _, Expression}, ParseState) ->
parse_expression(Expression, ParseState);
parse_expression({block, _, ExpressionList}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, ExpressionList);
parse_expression({tuple, _, Values}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Values);
parse_expression({remote, _, Module, Fun}, ParseState) ->
parse_expression(Module, parse_expression(Fun, ParseState));
parse_expression({var, _, _Variable}, ParseState) ->
ParseState;
parse_expression({atom, _, Atom}, ParseState) ->
tester_parse_state:add_atom(Atom, ParseState);
parse_expression({bin, _, Binary}, ParseState) ->
tester_parse_state:add_binary(Binary, ParseState);
parse_expression({float, _, Float}, ParseState) ->
tester_parse_state:add_float(Float, ParseState);
parse_expression({char, _, _Char}, ParseState) ->
% @todo
ParseState;
parse_expression({integer, _, Integer}, ParseState) ->
tester_parse_state:add_integer(Integer, ParseState);
parse_expression({nil, _}, ParseState) ->
tester_parse_state:add_atom(nil, ParseState);
parse_expression({string, _, String}, ParseState) ->
tester_parse_state:add_string(String, ParseState);
parse_expression(Expression, _ParseState) ->
?ct_fail("unknown expression: ~w", [Expression]).
| null | https://raw.githubusercontent.com/erlangonrails/devdb/0e7eaa6bd810ec3892bfc3d933439560620d0941/dev/scalaris/test/tester_value_collector.erl | erlang |
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
-2.0
Unless required by applicable law or agreed to in writing, software
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-------------------------------------------------------------------
File : tester.erl
Description : value collector for test generator
-------------------------------------------------------------------
@todo | Copyright 2010 fuer Informationstechnik Berlin
Licensed under the Apache License , Version 2.0 ( the " License " ) ;
distributed under the License is distributed on an " AS IS " BASIS ,
Author : < >
Created : 30 April 2010 by < >
@author < >
2010 fuer Informationstechnik Berlin
@version $ I d : tester_value_collector.erl 906 2010 - 07 - 23 14:09:20Z schuett $
-module(tester_value_collector).
-author('').
-vsn('$Id: tester_value_collector.erl 906 2010-07-23 14:09:20Z schuett $').
-include_lib("unittest.hrl").
-export([parse_expression/2]).
parse_expression(Clauses, ParseState) when is_list(Clauses) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({call, _, Fun, Parameters}, ParseState) ->
parse_expression(Parameters, parse_expression(Fun, ParseState));
parse_expression({'case', _, Value, Clauses}, ParseState) ->
parse_expression(Clauses, parse_expression(Value, ParseState));
parse_expression({clause, _Line, Pattern, _, Clause}, ParseState) ->
parse_expression(Clause, parse_expression(Pattern, ParseState));
parse_expression({cons, _, Head, Tail}, ParseState) ->
parse_expression(Tail, parse_expression(Head, ParseState));
parse_expression({'fun', _, {clauses, Clauses}}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({'fun', _, {function, _Name, _Arity}}, ParseState) ->
ParseState;
parse_expression({'fun', _, {function, _Module, _Name, _Arity}}, ParseState) ->
ParseState;
parse_expression({generate, _, Expression, L}, ParseState) ->
parse_expression(L, parse_expression(Expression, ParseState));
parse_expression({'if', _, Clauses}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({lc, _, Expression, Qualifiers}, ParseState) ->
parse_expression(Qualifiers, parse_expression(Expression, ParseState));
parse_expression({match, _, Left, Right}, ParseState) ->
parse_expression(Left, parse_expression(Right, ParseState));
parse_expression({op, _, _, Value}, ParseState) ->
parse_expression(Value, ParseState);
parse_expression({op, _, _, Left, Right}, ParseState) ->
parse_expression(Left, parse_expression(Right, ParseState));
parse_expression({'receive', _, Clauses}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Clauses);
parse_expression({'receive', _, Clauses, Timeout, AfterBody}, ParseState) ->
ParseState2 = lists:foldl(fun parse_expression/2, ParseState, Clauses),
ParseState3 = parse_expression(Timeout, ParseState2),
parse_expression(AfterBody, ParseState3);
parse_expression({record, _, _Name, Fields}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Fields);
parse_expression({record, _, _Variable, _Name, Fields}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Fields);
parse_expression({record_field, _, Name, Value}, ParseState) ->
parse_expression(Value, parse_expression(Name, ParseState));
parse_expression({record_field, _, Name, _RecordType, Value}, ParseState) ->
parse_expression(Value, parse_expression(Name, ParseState));
parse_expression({'try', _, Body, CaseClauses, CatchClauses, AfterBody}, ParseState) ->
ParseState2 = parse_expression(Body, ParseState),
ParseState3 = lists:foldl(fun parse_expression/2, ParseState2, CaseClauses),
ParseState4 = lists:foldl(fun parse_expression/2, ParseState3, CatchClauses),
parse_expression(AfterBody, ParseState4);
parse_expression({'catch', _, Expression}, ParseState) ->
parse_expression(Expression, ParseState);
parse_expression({block, _, ExpressionList}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, ExpressionList);
parse_expression({tuple, _, Values}, ParseState) ->
lists:foldl(fun parse_expression/2, ParseState, Values);
parse_expression({remote, _, Module, Fun}, ParseState) ->
parse_expression(Module, parse_expression(Fun, ParseState));
parse_expression({var, _, _Variable}, ParseState) ->
ParseState;
parse_expression({atom, _, Atom}, ParseState) ->
tester_parse_state:add_atom(Atom, ParseState);
parse_expression({bin, _, Binary}, ParseState) ->
tester_parse_state:add_binary(Binary, ParseState);
parse_expression({float, _, Float}, ParseState) ->
tester_parse_state:add_float(Float, ParseState);
parse_expression({char, _, _Char}, ParseState) ->
ParseState;
parse_expression({integer, _, Integer}, ParseState) ->
tester_parse_state:add_integer(Integer, ParseState);
parse_expression({nil, _}, ParseState) ->
tester_parse_state:add_atom(nil, ParseState);
parse_expression({string, _, String}, ParseState) ->
tester_parse_state:add_string(String, ParseState);
parse_expression(Expression, _ParseState) ->
?ct_fail("unknown expression: ~w", [Expression]).
|
896a357c6d7760a5583daa5061e3996eaf8a7e9557916360e7e969b7516e037c | Opetushallitus/ataru | url_helper.clj | (ns ataru.config.url-helper
(:require [ataru.config.core :refer [config]]
[clojure.pprint :as pprint]
[taoensso.timbre :as log])
(:import (fi.vm.sade.properties OphProperties)))
TODO component if not too much work ?
(def ^fi.vm.sade.properties.OphProperties url-properties (atom nil))
(defn- add-default! [oph-properties name value]
(if (some? value)
(doto oph-properties (.addDefault name value))
(throw (ex-info (str "Default value for oph-properties property '" name "' is missing.")
{:property-name name
:error "missing default value"}))))
(defn- pretty-print [config]
(with-out-str
(binding [pprint/*print-right-margin* 120]
(pprint/pprint config))))
(defn- load-config
[]
(let [{:keys [virkailija-host
hakija-host
editor-url
liiteri-url
valinta-tulos-service-base-url
organisaatio-service-base-url
koodisto-service-base-url
ohjausparametrit-service-base-url
valintalaskenta-ui-service-base-url]} (:urls config)]
(log/info "load-config: url-properties default values:\n" (pretty-print (:urls config)))
(reset! url-properties
(doto (OphProperties. (into-array String ["/ataru-oph.properties"]))
(add-default! "host-virkailija" virkailija-host)
(add-default! "host-hakija" hakija-host)
(add-default! "url-editor" editor-url)
(add-default! "url-liiteri" liiteri-url)
(add-default! "url-liiteri" liiteri-url)
(add-default! "baseurl-valinta-tulos-service" valinta-tulos-service-base-url)
(add-default! "baseurl-organisaatio-service" organisaatio-service-base-url)
(add-default! "baseurl-koodisto-service" koodisto-service-base-url)
(add-default! "baseurl-ohjausparametrit-service" ohjausparametrit-service-base-url)
(add-default! "baseurl-valintalaskenta-ui-service" valintalaskenta-ui-service-base-url)))))
(defn resolve-url
[key & params]
(when (nil? @url-properties)
(load-config))
(.url @url-properties (name key) (to-array (or params []))))
(defn front-json
[]
(when (nil? @url-properties)
(load-config))
(.frontPropertiesToJson @url-properties))
| null | https://raw.githubusercontent.com/Opetushallitus/ataru/2d8ef1d3f972621e301a3818567d4e11219d2e82/src/clj/ataru/config/url_helper.clj | clojure | (ns ataru.config.url-helper
(:require [ataru.config.core :refer [config]]
[clojure.pprint :as pprint]
[taoensso.timbre :as log])
(:import (fi.vm.sade.properties OphProperties)))
TODO component if not too much work ?
(def ^fi.vm.sade.properties.OphProperties url-properties (atom nil))
(defn- add-default! [oph-properties name value]
(if (some? value)
(doto oph-properties (.addDefault name value))
(throw (ex-info (str "Default value for oph-properties property '" name "' is missing.")
{:property-name name
:error "missing default value"}))))
(defn- pretty-print [config]
(with-out-str
(binding [pprint/*print-right-margin* 120]
(pprint/pprint config))))
(defn- load-config
[]
(let [{:keys [virkailija-host
hakija-host
editor-url
liiteri-url
valinta-tulos-service-base-url
organisaatio-service-base-url
koodisto-service-base-url
ohjausparametrit-service-base-url
valintalaskenta-ui-service-base-url]} (:urls config)]
(log/info "load-config: url-properties default values:\n" (pretty-print (:urls config)))
(reset! url-properties
(doto (OphProperties. (into-array String ["/ataru-oph.properties"]))
(add-default! "host-virkailija" virkailija-host)
(add-default! "host-hakija" hakija-host)
(add-default! "url-editor" editor-url)
(add-default! "url-liiteri" liiteri-url)
(add-default! "url-liiteri" liiteri-url)
(add-default! "baseurl-valinta-tulos-service" valinta-tulos-service-base-url)
(add-default! "baseurl-organisaatio-service" organisaatio-service-base-url)
(add-default! "baseurl-koodisto-service" koodisto-service-base-url)
(add-default! "baseurl-ohjausparametrit-service" ohjausparametrit-service-base-url)
(add-default! "baseurl-valintalaskenta-ui-service" valintalaskenta-ui-service-base-url)))))
(defn resolve-url
[key & params]
(when (nil? @url-properties)
(load-config))
(.url @url-properties (name key) (to-array (or params []))))
(defn front-json
[]
(when (nil? @url-properties)
(load-config))
(.frontPropertiesToJson @url-properties))
| |
a2656c182d31f11a720528fe2d21bd225adf0adea7d152dca648d25e48a36e72 | kmi/irs | saved-new.lisp | Mode : Lisp ; Package :
Author :
The Open University
(in-package "OCML")
(in-ontology british-ethnic-groups-ontology)
| null | https://raw.githubusercontent.com/kmi/irs/e1b8d696f61c6b6878c0e92d993ed549fee6e7dd/ontologies/domains/british-ethnic-groups-ontology/saved-new.lisp | lisp | Package : |
Author :
The Open University
(in-package "OCML")
(in-ontology british-ethnic-groups-ontology)
|
91f7182f86d18125ca1373ad84967f27d3353f9cd2bdd56ca4b07068c7ef8cef | orionsbelt-battlegrounds/obb-rules | boozer.cljc | (ns ^{:added "1.9"
:author "Pedro Santos"}
obb-rules.units.boozer
"Metadata information for the Boozer unit")
(def metadata
{:name "boozer"
:code "bz"
:attack 3200
:after-hit [[:strikeback]]
:defense 2800
:range 5
:value 68
:bonus {:attack {:displacement {:air 4000}}}
:type :mechanic
:category :heavy
:displacement :ground
:movement-type :front
:movement-cost 4})
| null | https://raw.githubusercontent.com/orionsbelt-battlegrounds/obb-rules/97fad6506eb81142f74f4722aca58b80d618bf45/src/obb_rules/units/boozer.cljc | clojure | (ns ^{:added "1.9"
:author "Pedro Santos"}
obb-rules.units.boozer
"Metadata information for the Boozer unit")
(def metadata
{:name "boozer"
:code "bz"
:attack 3200
:after-hit [[:strikeback]]
:defense 2800
:range 5
:value 68
:bonus {:attack {:displacement {:air 4000}}}
:type :mechanic
:category :heavy
:displacement :ground
:movement-type :front
:movement-cost 4})
| |
a1a39065967f76a6b40b8578b2e11d2f4493a1f066afab61ede92e5ccc7c0bfe | instedd/planwise | auth.clj | (ns planwise.boundary.auth)
(defprotocol Auth
(find-auth-token [service scope user-ident])
(create-jwe-token [service user-ident]))
| null | https://raw.githubusercontent.com/instedd/planwise/1bc2a5742ae3dc377dddf1f9e9bb60f0d2f59084/src/planwise/boundary/auth.clj | clojure | (ns planwise.boundary.auth)
(defprotocol Auth
(find-auth-token [service scope user-ident])
(create-jwe-token [service user-ident]))
| |
aa9352631b48f0dfabb2bffbe3f4f62bd3d10cdd2b2885bb9d00f0be2a81af93 | Kappa-Dev/KappaTools | list_tokens.mli | val local_trace:bool
module Int_Set_and_Map:SetMap.S with type elt = int
val scan_compil:
Remanent_parameters_sig.parameters ->
Exception_without_parameter.method_handler ->
(Ckappa_sig.agent, Ckappa_sig.mixture, Ckappa_sig.mixture, 'a,
Ckappa_sig.mixture Ckappa_sig.rule)
Ast.compil ->
Exception_without_parameter.method_handler *
Cckappa_sig.kappa_handler
val empty_handler:
Remanent_parameters_sig.parameters ->
Exception_without_parameter.method_handler ->
Exception_without_parameter.method_handler *
Cckappa_sig.kappa_handler
| null | https://raw.githubusercontent.com/Kappa-Dev/KappaTools/b34ac13c13faf0b784d328ad61c9fc2338ae2fc6/core/KaSa_rep/frontend/list_tokens.mli | ocaml | val local_trace:bool
module Int_Set_and_Map:SetMap.S with type elt = int
val scan_compil:
Remanent_parameters_sig.parameters ->
Exception_without_parameter.method_handler ->
(Ckappa_sig.agent, Ckappa_sig.mixture, Ckappa_sig.mixture, 'a,
Ckappa_sig.mixture Ckappa_sig.rule)
Ast.compil ->
Exception_without_parameter.method_handler *
Cckappa_sig.kappa_handler
val empty_handler:
Remanent_parameters_sig.parameters ->
Exception_without_parameter.method_handler ->
Exception_without_parameter.method_handler *
Cckappa_sig.kappa_handler
| |
f9177dbb4d23082b22d23b9728ed11d9a78dab1329ac516d73f3f2c135d7380c | kingcons/advent-of-code | day11.lisp | (mgl-pax:define-package :aoc.2021.11
(:nicknames :2021.11)
(:use :cl :aoc.util :mgl-pax)
(:import-from :aoc.parsers #:parse-grid)
(:import-from :alexandria
#:define-constant
#:hash-table-keys)
(:import-from :serapeum #:op))
(in-package :2021.11)
(defsummary (:title "Dumbo Octopuses"))
(define-constant +adjacents+
'((0 1) (0 -1) (1 0) (-1 0) (1 1) (-1 -1) (1 -1) (-1 1))
:test #'equal)
(defun build-data (&optional input)
(flet ((build-grid (input)
(parse-grid input :container :hash
:transform (lambda (x row col)
(declare (ignore row col))
(- (char-code x) 48)))))
(read-day-input #'build-grid :whole t :input input)))
(defun neighbors (position grid)
(flet ((new-coord (x)
(cons (+ (first x) (car position))
(+ (second x) (cdr position))))
(valid? (x)
(gethash x grid)))
(let ((coordinates (mapcar #'new-coord +adjacents+)))
(remove-if-not #'valid? coordinates))))
(defun tick (grid)
(let ((flashed '()))
(loop with to-process = (hash-table-keys grid)
until (null to-process)
do (let* ((current (pop to-process))
(new-value (incf (gethash current grid))))
(when (= new-value 10)
(setf to-process (nconc to-process (neighbors current grid)))
(push current flashed))))
(dolist (octopus flashed)
(setf (gethash octopus grid) 0))
(length flashed)))
(defun count-flashes (grid steps)
(let ((count 0))
(dotimes (i steps)
(incf count (tick grid)))
count))
(defun part-1 (&optional (data (build-data)))
(count-flashes data 100))
(defun step-until-n-flashes (grid n)
(loop for i = 1 then (1+ i)
for flashes = (tick grid)
until (= flashes n)
finally (return i)))
(defun part-2 (&optional (data (build-data)))
(step-until-n-flashes data 100))
| null | https://raw.githubusercontent.com/kingcons/advent-of-code/aba6714d47760ba720fd9a6ae27d25588237c149/src/2021/day11.lisp | lisp | (mgl-pax:define-package :aoc.2021.11
(:nicknames :2021.11)
(:use :cl :aoc.util :mgl-pax)
(:import-from :aoc.parsers #:parse-grid)
(:import-from :alexandria
#:define-constant
#:hash-table-keys)
(:import-from :serapeum #:op))
(in-package :2021.11)
(defsummary (:title "Dumbo Octopuses"))
(define-constant +adjacents+
'((0 1) (0 -1) (1 0) (-1 0) (1 1) (-1 -1) (1 -1) (-1 1))
:test #'equal)
(defun build-data (&optional input)
(flet ((build-grid (input)
(parse-grid input :container :hash
:transform (lambda (x row col)
(declare (ignore row col))
(- (char-code x) 48)))))
(read-day-input #'build-grid :whole t :input input)))
(defun neighbors (position grid)
(flet ((new-coord (x)
(cons (+ (first x) (car position))
(+ (second x) (cdr position))))
(valid? (x)
(gethash x grid)))
(let ((coordinates (mapcar #'new-coord +adjacents+)))
(remove-if-not #'valid? coordinates))))
(defun tick (grid)
(let ((flashed '()))
(loop with to-process = (hash-table-keys grid)
until (null to-process)
do (let* ((current (pop to-process))
(new-value (incf (gethash current grid))))
(when (= new-value 10)
(setf to-process (nconc to-process (neighbors current grid)))
(push current flashed))))
(dolist (octopus flashed)
(setf (gethash octopus grid) 0))
(length flashed)))
(defun count-flashes (grid steps)
(let ((count 0))
(dotimes (i steps)
(incf count (tick grid)))
count))
(defun part-1 (&optional (data (build-data)))
(count-flashes data 100))
(defun step-until-n-flashes (grid n)
(loop for i = 1 then (1+ i)
for flashes = (tick grid)
until (= flashes n)
finally (return i)))
(defun part-2 (&optional (data (build-data)))
(step-until-n-flashes data 100))
| |
24764d12ed9e5514af8f757a2b4884dfa3f9a03542d117d355d49be264887063 | kirkedal/rfun-interp | MainRFun.hs | ---------------------------------------------------------------------------
--
Module :
Copyright : , 2017
-- License : AllRightsReserved
--
Maintainer : < >
-- Stability : none?
-- Portability : ?
--
-- |Main execution of RFun17 interpreter
--
-----------------------------------------------------------------------------
module Main (main) where
import Parser
import Ast
import PrettyPrinter
import TypeCheck
import Interp
import System.Environment
import System.Exit
main :: IO ()
main =
do
args <- getArgs
case args of
(filename : program : values) ->
do p <- parseProgram filename
vs <- parseValues values
typecheckProgram p
case interp p program vs of
(Left err) -> putStrLn "Run-time error:" >> (putStrLn $ err)
(Right val) -> putStrLn $ ppValue val
[filename] -> parseProgram filename >>= typecheckProgram >>= prettyPrintProgram
_ -> putStrLn "Wrong number of arguments.\nUsage:\n \"rfun\" programfile startfunc startvalue+\nor to stop before interpretation:\n \"rfun\" programfile "
typecheckProgram :: Program -> IO Program
typecheckProgram p =
case typecheck p of
Nothing -> return p
(Just e) -> putStrLn e >> (exitWith $ ExitFailure 1)
prettyPrintProgram :: Program -> IO ()
prettyPrintProgram = putStrLn.ppProgram
loadFile :: String -> IO String
loadFile "-" = getContents
loadFile filename = readFile filename
parseProgram :: String -> IO Program
parseProgram filename = loadFile filename >>= parseFromString >>= fromParserError
parseValues :: [String] -> IO [Value]
parseValues strV =
do l <- fromParserError $ mapM parseFromValue strV
return $ concat l
fromParserError :: Either ParserError a -> IO a
fromParserError (Left err) = (putStr (prettyParseError err)) >> (exitWith $ ExitFailure 1)
fromParserError (Right a) = return a
| null | https://raw.githubusercontent.com/kirkedal/rfun-interp/c5297be7ab07c92e9d489c642cd987ed646e78c8/src/MainRFun.hs | haskell | -------------------------------------------------------------------------
License : AllRightsReserved
Stability : none?
Portability : ?
|Main execution of RFun17 interpreter
--------------------------------------------------------------------------- | Module :
Copyright : , 2017
Maintainer : < >
module Main (main) where
import Parser
import Ast
import PrettyPrinter
import TypeCheck
import Interp
import System.Environment
import System.Exit
main :: IO ()
main =
do
args <- getArgs
case args of
(filename : program : values) ->
do p <- parseProgram filename
vs <- parseValues values
typecheckProgram p
case interp p program vs of
(Left err) -> putStrLn "Run-time error:" >> (putStrLn $ err)
(Right val) -> putStrLn $ ppValue val
[filename] -> parseProgram filename >>= typecheckProgram >>= prettyPrintProgram
_ -> putStrLn "Wrong number of arguments.\nUsage:\n \"rfun\" programfile startfunc startvalue+\nor to stop before interpretation:\n \"rfun\" programfile "
typecheckProgram :: Program -> IO Program
typecheckProgram p =
case typecheck p of
Nothing -> return p
(Just e) -> putStrLn e >> (exitWith $ ExitFailure 1)
prettyPrintProgram :: Program -> IO ()
prettyPrintProgram = putStrLn.ppProgram
loadFile :: String -> IO String
loadFile "-" = getContents
loadFile filename = readFile filename
parseProgram :: String -> IO Program
parseProgram filename = loadFile filename >>= parseFromString >>= fromParserError
parseValues :: [String] -> IO [Value]
parseValues strV =
do l <- fromParserError $ mapM parseFromValue strV
return $ concat l
fromParserError :: Either ParserError a -> IO a
fromParserError (Left err) = (putStr (prettyParseError err)) >> (exitWith $ ExitFailure 1)
fromParserError (Right a) = return a
|
a4c6c509752ea6c7ef0e5b1254aaba5af3abfeee9d19e59827c44011d9063402 | caribou/caribou-core | assets.clj | (ns caribou.test.assets
(:require [caribou.asset :as asset]
[aws.sdk.s3 :as s3]
[clojure.test :as test :refer [is testing deftest]]
[clojure.string :as string]
[caribou.config :as config]
[clojure.java.io :as io]))
;;; integration test of our s3 compatibility
(deftest sanity
(is true))
(defn to-stream [s] (io/input-stream (.getBytes s)))
(defn get-aws-config
[]
(try
(config/read-config (io/resource "config/test-aws.clj"))
(catch Exception e {:aws {:bucket "your.bucket"
:credentials {:access-key "REPLACE ME"
:secret-key "ME TOO"}}
:assets {:dir "app"
:prefix "caribou/test"}})))
(def payload "HELLO WORLD!")
(defn construct-loc
"for s3 level tests, so we can separate s3 issues from caibou issues"
[prefix fname]
(str prefix "/" (asset/asset-dir {}) "/" fname))
(defn upload-download
[upload]
(let [config (get-aws-config)
aws (:aws config)
creds (:credentials aws)
existing (s3/list-objects creds (:bucket aws))
prefix (-> config :assets :prefix)
prefixed (fn [s] (and (string? s) (.startsWith s (str prefix))))
test-uploads (fn [objects] (filter (comp prefixed :key) objects))
enumerated (count (test-uploads (:objects existing)))
fname (str "test-content-" (.getTime (java.util.Date.)))
_ (upload prefix fname (to-stream payload) config)
incremented (s3/list-objects creds (:bucket aws))
expanded (count (test-uploads (:objects incremented)))
contents (try (slurp (str "http://" (:bucket aws) ".s3.amazonaws.com/"
(construct-loc prefix fname)))
(catch Throwable t t))]
(testing "did something get uploaded?"
(is (= expanded (inc enumerated))))
(testing "does it have the expected contents?"
(is (= payload contents)))
(testing "delete that shit"
(is (nil?
(s3/delete-object creds (:bucket aws)
(construct-loc prefix fname)))))))
(deftest s3-level
(upload-download
(fn [prefix fname stream config]
(s3/put-object (:credentials (:aws config))
(:bucket (:aws config))
(construct-loc prefix fname)
stream
{:content-type "text/plain"
:content-length (count payload)}
(s3/grant :all-users :read)))))
(deftest caribou-asset-level
(upload-download
(fn [prefix fname stream config]
(config/with-config config
(let [asset {:filename fname :tempfile stream}
location (asset/asset-upload-path asset)]
(asset/upload-to-s3 location stream (count payload)))))))
(deftest caribou-asset-highlevel
(upload-download
(fn [prefix fname stream config]
(config/with-config config
(let [asset {:filename fname :size (count payload)}]
(asset/put-asset stream asset))))))
(defn get-disk-config
[]
{:assets {:dir "app"}})
(deftest disk-ops
(config/with-config (get-disk-config)
(let [asset {:id 1 :filename (str "hello" (.getTime (java.util.Date.)))}
location (string/join "/" [(config/draw :assets :dir)
(asset/asset-dir asset)])
destination (str (config/draw :assets :dir) \/
(asset/asset-path asset))
test-dir (io/file location)
pre-count (count (.list test-dir))
_ (asset/put-asset (to-stream "HELLO WORLD") asset)
post-count (count (.list test-dir))]
(is (= post-count (inc pre-count)))
(is (= "HELLO WORLD" (slurp destination))))))
| null | https://raw.githubusercontent.com/caribou/caribou-core/6ebd9db4e14cddb1d6b4e152e771e016fa9c55f6/test/caribou/test/assets.clj | clojure | integration test of our s3 compatibility | (ns caribou.test.assets
(:require [caribou.asset :as asset]
[aws.sdk.s3 :as s3]
[clojure.test :as test :refer [is testing deftest]]
[clojure.string :as string]
[caribou.config :as config]
[clojure.java.io :as io]))
(deftest sanity
(is true))
(defn to-stream [s] (io/input-stream (.getBytes s)))
(defn get-aws-config
[]
(try
(config/read-config (io/resource "config/test-aws.clj"))
(catch Exception e {:aws {:bucket "your.bucket"
:credentials {:access-key "REPLACE ME"
:secret-key "ME TOO"}}
:assets {:dir "app"
:prefix "caribou/test"}})))
(def payload "HELLO WORLD!")
(defn construct-loc
"for s3 level tests, so we can separate s3 issues from caibou issues"
[prefix fname]
(str prefix "/" (asset/asset-dir {}) "/" fname))
(defn upload-download
[upload]
(let [config (get-aws-config)
aws (:aws config)
creds (:credentials aws)
existing (s3/list-objects creds (:bucket aws))
prefix (-> config :assets :prefix)
prefixed (fn [s] (and (string? s) (.startsWith s (str prefix))))
test-uploads (fn [objects] (filter (comp prefixed :key) objects))
enumerated (count (test-uploads (:objects existing)))
fname (str "test-content-" (.getTime (java.util.Date.)))
_ (upload prefix fname (to-stream payload) config)
incremented (s3/list-objects creds (:bucket aws))
expanded (count (test-uploads (:objects incremented)))
contents (try (slurp (str "http://" (:bucket aws) ".s3.amazonaws.com/"
(construct-loc prefix fname)))
(catch Throwable t t))]
(testing "did something get uploaded?"
(is (= expanded (inc enumerated))))
(testing "does it have the expected contents?"
(is (= payload contents)))
(testing "delete that shit"
(is (nil?
(s3/delete-object creds (:bucket aws)
(construct-loc prefix fname)))))))
(deftest s3-level
(upload-download
(fn [prefix fname stream config]
(s3/put-object (:credentials (:aws config))
(:bucket (:aws config))
(construct-loc prefix fname)
stream
{:content-type "text/plain"
:content-length (count payload)}
(s3/grant :all-users :read)))))
(deftest caribou-asset-level
(upload-download
(fn [prefix fname stream config]
(config/with-config config
(let [asset {:filename fname :tempfile stream}
location (asset/asset-upload-path asset)]
(asset/upload-to-s3 location stream (count payload)))))))
(deftest caribou-asset-highlevel
(upload-download
(fn [prefix fname stream config]
(config/with-config config
(let [asset {:filename fname :size (count payload)}]
(asset/put-asset stream asset))))))
(defn get-disk-config
[]
{:assets {:dir "app"}})
(deftest disk-ops
(config/with-config (get-disk-config)
(let [asset {:id 1 :filename (str "hello" (.getTime (java.util.Date.)))}
location (string/join "/" [(config/draw :assets :dir)
(asset/asset-dir asset)])
destination (str (config/draw :assets :dir) \/
(asset/asset-path asset))
test-dir (io/file location)
pre-count (count (.list test-dir))
_ (asset/put-asset (to-stream "HELLO WORLD") asset)
post-count (count (.list test-dir))]
(is (= post-count (inc pre-count)))
(is (= "HELLO WORLD" (slurp destination))))))
|
075866bf285e138ee6e07a81985745a886d0e38c7654422c6b4eb52e0304a424 | jonasseglare/geex | jvm.clj | (ns geex.core.jvm
"Platform specific code needed by the compiler"
(:require [bluebell.utils.ebmd :as ebmd]
[bluebell.utils.ebmd.ops :as eops]
[bluebell.utils.ebmd.type :as type]
[geex.ebmd.type :as etype]
[geex.core.seed :as seed]))
------- Common type signatures for JVM platforms -------
(ebmd/declare-poly get-type-signature)
(ebmd/def-arg-spec nil-arg {:pred nil?
:pos [nil]
:neg [:a]})
(ebmd/def-poly get-type-signature
[etype/seed-with-class x]
(seed/datatype x))
(ebmd/def-poly get-type-signature
[etype/nothing-seed x]
Void/TYPE)
(ebmd/def-poly get-type-signature
[nil-arg x]
Void/TYPE)
(ebmd/def-poly get-type-signature
[etype/class-arg x]
x)
(ebmd/def-poly get-type-signature
[type/map x]
clojure.lang.IPersistentMap)
(ebmd/def-poly get-type-signature
[type/set x]
clojure.lang.IPersistentSet)
(ebmd/def-poly get-type-signature
[type/any x]
(if (vector? x)
clojure.lang.IPersistentVector
java.lang.Object))
;; Get a type signature that can be compiled
(ebmd/declare-poly get-compilable-type-signature)
(ebmd/def-poly get-compilable-type-signature
[type/any x]
(get-type-signature x))
(ebmd/def-poly get-compilable-type-signature
[(eops/and etype/seed-with-class
(eops/not etype/compilable-seed)) x]
clojure.lang.IPersistentMap)
| null | https://raw.githubusercontent.com/jonasseglare/geex/f1a48c14c983c054c91fb221b91f42de5fa8eee0/src/clj/geex/core/jvm.clj | clojure | Get a type signature that can be compiled | (ns geex.core.jvm
"Platform specific code needed by the compiler"
(:require [bluebell.utils.ebmd :as ebmd]
[bluebell.utils.ebmd.ops :as eops]
[bluebell.utils.ebmd.type :as type]
[geex.ebmd.type :as etype]
[geex.core.seed :as seed]))
------- Common type signatures for JVM platforms -------
(ebmd/declare-poly get-type-signature)
(ebmd/def-arg-spec nil-arg {:pred nil?
:pos [nil]
:neg [:a]})
(ebmd/def-poly get-type-signature
[etype/seed-with-class x]
(seed/datatype x))
(ebmd/def-poly get-type-signature
[etype/nothing-seed x]
Void/TYPE)
(ebmd/def-poly get-type-signature
[nil-arg x]
Void/TYPE)
(ebmd/def-poly get-type-signature
[etype/class-arg x]
x)
(ebmd/def-poly get-type-signature
[type/map x]
clojure.lang.IPersistentMap)
(ebmd/def-poly get-type-signature
[type/set x]
clojure.lang.IPersistentSet)
(ebmd/def-poly get-type-signature
[type/any x]
(if (vector? x)
clojure.lang.IPersistentVector
java.lang.Object))
(ebmd/declare-poly get-compilable-type-signature)
(ebmd/def-poly get-compilable-type-signature
[type/any x]
(get-type-signature x))
(ebmd/def-poly get-compilable-type-signature
[(eops/and etype/seed-with-class
(eops/not etype/compilable-seed)) x]
clojure.lang.IPersistentMap)
|
1619cc7772ebc20f986c0027bb85527d88a2273ba419b29b0508104108855238 | cedlemo/OCaml-GI-ctypes-bindings-generator | Proxy_address.ml | open Ctypes
open Foreign
type t = unit ptr
let t_typ : t typ = ptr void
let create =
foreign "g_proxy_address_new" (ptr Inet_address.t_typ @-> uint16_t @-> string @-> string @-> uint16_t @-> string_opt @-> string_opt @-> returning (ptr Socket_address.t_typ))
let get_destination_hostname =
foreign "g_proxy_address_get_destination_hostname" (t_typ @-> returning (string_opt))
let get_destination_port =
foreign "g_proxy_address_get_destination_port" (t_typ @-> returning (uint16_t))
let get_destination_protocol =
foreign "g_proxy_address_get_destination_protocol" (t_typ @-> returning (string_opt))
let get_password =
foreign "g_proxy_address_get_password" (t_typ @-> returning (string_opt))
let get_protocol =
foreign "g_proxy_address_get_protocol" (t_typ @-> returning (string_opt))
let get_uri =
foreign "g_proxy_address_get_uri" (t_typ @-> returning (string_opt))
let get_username =
foreign "g_proxy_address_get_username" (t_typ @-> returning (string_opt))
| null | https://raw.githubusercontent.com/cedlemo/OCaml-GI-ctypes-bindings-generator/21a4d449f9dbd6785131979b91aa76877bad2615/tools/Gio/Proxy_address.ml | ocaml | open Ctypes
open Foreign
type t = unit ptr
let t_typ : t typ = ptr void
let create =
foreign "g_proxy_address_new" (ptr Inet_address.t_typ @-> uint16_t @-> string @-> string @-> uint16_t @-> string_opt @-> string_opt @-> returning (ptr Socket_address.t_typ))
let get_destination_hostname =
foreign "g_proxy_address_get_destination_hostname" (t_typ @-> returning (string_opt))
let get_destination_port =
foreign "g_proxy_address_get_destination_port" (t_typ @-> returning (uint16_t))
let get_destination_protocol =
foreign "g_proxy_address_get_destination_protocol" (t_typ @-> returning (string_opt))
let get_password =
foreign "g_proxy_address_get_password" (t_typ @-> returning (string_opt))
let get_protocol =
foreign "g_proxy_address_get_protocol" (t_typ @-> returning (string_opt))
let get_uri =
foreign "g_proxy_address_get_uri" (t_typ @-> returning (string_opt))
let get_username =
foreign "g_proxy_address_get_username" (t_typ @-> returning (string_opt))
| |
829df6284f5733d32a4ea62c02e0a5f34d6684b1ead8fe866020f29d1285d28e | AeneasVerif/aeneas | Assumed.ml | * This module contains various utilities for the assumed functions .
Note that [ Box::free ] is peculiar : we do n't really handle it as a function ,
because it is legal to free a box whose boxed value is [ ⊥ ] ( it often
happens that we move a value out of a box before freeing this box ) .
Semantically speaking , we thus handle [ Box::free ] as a value drop and
not as a function call , and thus never need its signature .
TODO : implementing the concrete evaluation functions for the
assumed functions is really annoying ( see
[ InterpreterStatements.eval_non_local_function_call_concrete ] ) ,
I think it should be possible , in most situations , to write bodies which
model the behaviour of those unsafe functions . For instance , [ Box::deref_mut ]
should simply be :
{ [
fn deref_mut<'a , T>(x : & ' a mut Box < T > ) - > & ' a mut T {
& mut ( * x ) // box dereferencement is a primitive operation
}
] }
For vectors , we could " cheat " by using the index as a field index ( vectors
would be encoded as ADTs with a variable number of fields ) . Of course , it
would require a bit of engineering , but it would probably be quite lightweight
in the end .
{ [
Vec::get_mut<'a , T>(v : & ' a mut Vec < T > , i : usize ) - > & ' a mut T {
& mut ( ( * x ) .i )
}
] }
Note that [Box::free] is peculiar: we don't really handle it as a function,
because it is legal to free a box whose boxed value is [⊥] (it often
happens that we move a value out of a box before freeing this box).
Semantically speaking, we thus handle [Box::free] as a value drop and
not as a function call, and thus never need its signature.
TODO: implementing the concrete evaluation functions for the
assumed functions is really annoying (see
[InterpreterStatements.eval_non_local_function_call_concrete]),
I think it should be possible, in most situations, to write bodies which
model the behaviour of those unsafe functions. For instance, [Box::deref_mut]
should simply be:
{[
fn deref_mut<'a, T>(x : &'a mut Box<T>) -> &'a mut T {
&mut ( *x ) // box dereferencement is a primitive operation
}
]}
For vectors, we could "cheat" by using the index as a field index (vectors
would be encoded as ADTs with a variable number of fields). Of course, it
would require a bit of engineering, but it would probably be quite lightweight
in the end.
{[
Vec::get_mut<'a,T>(v : &'a mut Vec<T>, i : usize) -> &'a mut T {
&mut ( ( *x ).i )
}
]}
*)
open Names
open TypesUtils
module T = Types
module A = LlbcAst
module Sig = struct
(** A few utilities *)
let rvar_id_0 = T.RegionVarId.of_int 0
let rvar_0 : T.RegionVarId.id T.region = T.Var rvar_id_0
let rg_id_0 = T.RegionGroupId.of_int 0
let tvar_id_0 = T.TypeVarId.of_int 0
let tvar_0 : T.sty = T.TypeVar tvar_id_0
(** Region 'a of id 0 *)
let region_param_0 : T.region_var = { T.index = rvar_id_0; name = Some "'a" }
* Region group : [ { parent= { } ; 0 } } ]
let region_group_0 : T.region_var_group =
{ T.id = rg_id_0; regions = [ rvar_id_0 ]; parents = [] }
(** Type parameter [T] of id 0 *)
let type_param_0 : T.type_var = { T.index = tvar_id_0; name = "T" }
let mk_ref_ty (r : T.RegionVarId.id T.region) (ty : T.sty) (is_mut : bool) :
T.sty =
let ref_kind = if is_mut then T.Mut else T.Shared in
mk_ref_ty r ty ref_kind
(** [fn<T>(&'a mut T, T) -> T] *)
let mem_replace_sig : A.fun_sig =
(* The signature fields *)
let region_params = [ region_param_0 ] (* <'a> *) in
let regions_hierarchy = [ region_group_0 ] (* [{<'a>}] *) in
let type_params = [ type_param_0 ] (* <T> *) in
let inputs =
[ mk_ref_ty rvar_0 tvar_0 true (* &'a mut T *); tvar_0 (* T *) ]
in
let output = tvar_0 (* T *) in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
(** [fn<T>(T) -> Box<T>] *)
let box_new_sig : A.fun_sig =
{
region_params = [];
num_early_bound_regions = 0;
regions_hierarchy = [];
type_params = [ type_param_0 ] (* <T> *);
inputs = [ tvar_0 (* T *) ];
output = mk_box_ty tvar_0 (* Box<T> *);
}
* [ fn < < T > ) - > ( ) ]
let box_free_sig : A.fun_sig =
{
region_params = [];
num_early_bound_regions = 0;
regions_hierarchy = [];
type_params = [ type_param_0 ] (* <T> *);
inputs = [ mk_box_ty tvar_0 (* Box<T> *) ];
output = mk_unit_ty (* () *);
}
(** Helper for [Box::deref_shared] and [Box::deref_mut].
Returns:
[fn<'a, T>(&'a (mut) Box<T>) -> &'a (mut) T]
*)
let box_deref_gen_sig (is_mut : bool) : A.fun_sig =
(* The signature fields *)
let region_params = [ region_param_0 ] in
let regions_hierarchy = [ region_group_0 ] (* <'a> *) in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params = [ type_param_0 ] (* <T> *);
inputs =
[ mk_ref_ty rvar_0 (mk_box_ty tvar_0) is_mut (* &'a (mut) Box<T> *) ];
output = mk_ref_ty rvar_0 tvar_0 is_mut (* &'a (mut) T *);
}
* [ fn<'a , T>(&'a Box < T > ) - > & ' a T ]
let box_deref_shared_sig = box_deref_gen_sig false
* [ fn<'a , T>(&'a mut Box < T > ) - > & ' a mut T ]
let box_deref_mut_sig = box_deref_gen_sig true
* [ fn < T > ( ) - > < T > ]
let vec_new_sig : A.fun_sig =
let region_params = [] in
let regions_hierarchy = [] in
let type_params = [ type_param_0 ] (* <T> *) in
let inputs = [] in
< T >
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
(** [fn<T>(&'a mut Vec<T>, T)] *)
let vec_push_sig : A.fun_sig =
(* The signature fields *)
let region_params = [ region_param_0 ] in
let regions_hierarchy = [ region_group_0 ] (* <'a> *) in
let type_params = [ type_param_0 ] (* <T> *) in
let inputs =
[
mk_ref_ty rvar_0 (mk_vec_ty tvar_0) true (* &'a mut Vec<T> *);
tvar_0 (* T *);
]
in
let output = mk_unit_ty (* () *) in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
(** [fn<T>(&'a mut Vec<T>, usize, T)] *)
let vec_insert_sig : A.fun_sig =
(* The signature fields *)
let region_params = [ region_param_0 ] in
let regions_hierarchy = [ region_group_0 ] (* <'a> *) in
let type_params = [ type_param_0 ] (* <T> *) in
let inputs =
[
mk_ref_ty rvar_0 (mk_vec_ty tvar_0) true (* &'a mut Vec<T> *);
mk_usize_ty (* usize *);
tvar_0 (* T *);
]
in
let output = mk_unit_ty (* () *) in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
(** [fn<T>(&'a Vec<T>) -> usize] *)
let vec_len_sig : A.fun_sig =
(* The signature fields *)
let region_params = [ region_param_0 ] in
let regions_hierarchy = [ region_group_0 ] (* <'a> *) in
let type_params = [ type_param_0 ] (* <T> *) in
let inputs =
& ' a < T >
in
let output = mk_usize_ty (* usize *) in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
* Helper :
[ fn < T>(&'a ( mut ) < T > , usize ) - > & ' a ( mut ) T ]
[fn<T>(&'a (mut) Vec<T>, usize) -> &'a (mut) T]
*)
let vec_index_gen_sig (is_mut : bool) : A.fun_sig =
(* The signature fields *)
let region_params = [ region_param_0 ] in
let regions_hierarchy = [ region_group_0 ] (* <'a> *) in
let type_params = [ type_param_0 ] (* <T> *) in
let inputs =
[
& ' a ( mut ) < T >
mk_usize_ty (* usize *);
]
in
let output = mk_ref_ty rvar_0 tvar_0 is_mut (* &'a (mut) T *) in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
(** [fn<T>(&'a Vec<T>, usize) -> &'a T] *)
let vec_index_shared_sig : A.fun_sig = vec_index_gen_sig false
(** [fn<T>(&'a mut Vec<T>, usize) -> &'a mut T] *)
let vec_index_mut_sig : A.fun_sig = vec_index_gen_sig true
end
type assumed_info = A.assumed_fun_id * A.fun_sig * bool * name
* The list of assumed functions and all their information :
- their signature
- a boolean indicating whether the function can fail or not
- their name
. : following what is written above , we do n't include [ Box::free ] .
Remark about the vector functions : for [ Vec::len ] to be correct and return
a [ usize ] , we have to make sure that vectors are bounded by the usize .
Followingly , [ Vec::push ] is monadic .
- their signature
- a boolean indicating whether the function can fail or not
- their name
Rk.: following what is written above, we don't include [Box::free].
Remark about the vector functions: for [Vec::len] to be correct and return
a [usize], we have to make sure that vectors are bounded by the max usize.
Followingly, [Vec::push] is monadic.
*)
let assumed_infos : assumed_info list =
let deref_pre = [ "core"; "ops"; "deref" ] in
let vec_pre = [ "alloc"; "vec"; "Vec" ] in
let index_pre = [ "core"; "ops"; "index" ] in
[
(A.Replace, Sig.mem_replace_sig, false, to_name [ "core"; "mem"; "replace" ]);
(BoxNew, Sig.box_new_sig, false, to_name [ "alloc"; "boxed"; "Box"; "new" ]);
( BoxFree,
Sig.box_free_sig,
false,
to_name [ "alloc"; "boxed"; "Box"; "free" ] );
( BoxDeref,
Sig.box_deref_shared_sig,
false,
to_name (deref_pre @ [ "Deref"; "deref" ]) );
( BoxDerefMut,
Sig.box_deref_mut_sig,
false,
to_name (deref_pre @ [ "DerefMut"; "deref_mut" ]) );
(VecNew, Sig.vec_new_sig, false, to_name (vec_pre @ [ "new" ]));
(VecPush, Sig.vec_push_sig, true, to_name (vec_pre @ [ "push" ]));
(VecInsert, Sig.vec_insert_sig, true, to_name (vec_pre @ [ "insert" ]));
(VecLen, Sig.vec_len_sig, false, to_name (vec_pre @ [ "len" ]));
( VecIndex,
Sig.vec_index_shared_sig,
true,
to_name (index_pre @ [ "Index"; "index" ]) );
( VecIndexMut,
Sig.vec_index_mut_sig,
true,
to_name (index_pre @ [ "IndexMut"; "index_mut" ]) );
]
let get_assumed_info (id : A.assumed_fun_id) : assumed_info =
match List.find_opt (fun (id', _, _, _) -> id = id') assumed_infos with
| Some info -> info
| None ->
raise
(Failure ("get_assumed_info: not found: " ^ A.show_assumed_fun_id id))
let get_assumed_sig (id : A.assumed_fun_id) : A.fun_sig =
let _, sg, _, _ = get_assumed_info id in
sg
let get_assumed_name (id : A.assumed_fun_id) : fun_name =
let _, _, _, name = get_assumed_info id in
name
let assumed_can_fail (id : A.assumed_fun_id) : bool =
let _, _, b, _ = get_assumed_info id in
b
| null | https://raw.githubusercontent.com/AeneasVerif/aeneas/b191070501ceafdd49c999385c4410848249fe18/compiler/Assumed.ml | ocaml | * A few utilities
* Region 'a of id 0
* Type parameter [T] of id 0
* [fn<T>(&'a mut T, T) -> T]
The signature fields
<'a>
[{<'a>}]
<T>
&'a mut T
T
T
* [fn<T>(T) -> Box<T>]
<T>
T
Box<T>
<T>
Box<T>
()
* Helper for [Box::deref_shared] and [Box::deref_mut].
Returns:
[fn<'a, T>(&'a (mut) Box<T>) -> &'a (mut) T]
The signature fields
<'a>
<T>
&'a (mut) Box<T>
&'a (mut) T
<T>
* [fn<T>(&'a mut Vec<T>, T)]
The signature fields
<'a>
<T>
&'a mut Vec<T>
T
()
* [fn<T>(&'a mut Vec<T>, usize, T)]
The signature fields
<'a>
<T>
&'a mut Vec<T>
usize
T
()
* [fn<T>(&'a Vec<T>) -> usize]
The signature fields
<'a>
<T>
usize
The signature fields
<'a>
<T>
usize
&'a (mut) T
* [fn<T>(&'a Vec<T>, usize) -> &'a T]
* [fn<T>(&'a mut Vec<T>, usize) -> &'a mut T] | * This module contains various utilities for the assumed functions .
Note that [ Box::free ] is peculiar : we do n't really handle it as a function ,
because it is legal to free a box whose boxed value is [ ⊥ ] ( it often
happens that we move a value out of a box before freeing this box ) .
Semantically speaking , we thus handle [ Box::free ] as a value drop and
not as a function call , and thus never need its signature .
TODO : implementing the concrete evaluation functions for the
assumed functions is really annoying ( see
[ InterpreterStatements.eval_non_local_function_call_concrete ] ) ,
I think it should be possible , in most situations , to write bodies which
model the behaviour of those unsafe functions . For instance , [ Box::deref_mut ]
should simply be :
{ [
fn deref_mut<'a , T>(x : & ' a mut Box < T > ) - > & ' a mut T {
& mut ( * x ) // box dereferencement is a primitive operation
}
] }
For vectors , we could " cheat " by using the index as a field index ( vectors
would be encoded as ADTs with a variable number of fields ) . Of course , it
would require a bit of engineering , but it would probably be quite lightweight
in the end .
{ [
Vec::get_mut<'a , T>(v : & ' a mut Vec < T > , i : usize ) - > & ' a mut T {
& mut ( ( * x ) .i )
}
] }
Note that [Box::free] is peculiar: we don't really handle it as a function,
because it is legal to free a box whose boxed value is [⊥] (it often
happens that we move a value out of a box before freeing this box).
Semantically speaking, we thus handle [Box::free] as a value drop and
not as a function call, and thus never need its signature.
TODO: implementing the concrete evaluation functions for the
assumed functions is really annoying (see
[InterpreterStatements.eval_non_local_function_call_concrete]),
I think it should be possible, in most situations, to write bodies which
model the behaviour of those unsafe functions. For instance, [Box::deref_mut]
should simply be:
{[
fn deref_mut<'a, T>(x : &'a mut Box<T>) -> &'a mut T {
&mut ( *x ) // box dereferencement is a primitive operation
}
]}
For vectors, we could "cheat" by using the index as a field index (vectors
would be encoded as ADTs with a variable number of fields). Of course, it
would require a bit of engineering, but it would probably be quite lightweight
in the end.
{[
Vec::get_mut<'a,T>(v : &'a mut Vec<T>, i : usize) -> &'a mut T {
&mut ( ( *x ).i )
}
]}
*)
open Names
open TypesUtils
module T = Types
module A = LlbcAst
module Sig = struct
let rvar_id_0 = T.RegionVarId.of_int 0
let rvar_0 : T.RegionVarId.id T.region = T.Var rvar_id_0
let rg_id_0 = T.RegionGroupId.of_int 0
let tvar_id_0 = T.TypeVarId.of_int 0
let tvar_0 : T.sty = T.TypeVar tvar_id_0
let region_param_0 : T.region_var = { T.index = rvar_id_0; name = Some "'a" }
* Region group : [ { parent= { } ; 0 } } ]
let region_group_0 : T.region_var_group =
{ T.id = rg_id_0; regions = [ rvar_id_0 ]; parents = [] }
let type_param_0 : T.type_var = { T.index = tvar_id_0; name = "T" }
let mk_ref_ty (r : T.RegionVarId.id T.region) (ty : T.sty) (is_mut : bool) :
T.sty =
let ref_kind = if is_mut then T.Mut else T.Shared in
mk_ref_ty r ty ref_kind
let mem_replace_sig : A.fun_sig =
let inputs =
in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
let box_new_sig : A.fun_sig =
{
region_params = [];
num_early_bound_regions = 0;
regions_hierarchy = [];
}
* [ fn < < T > ) - > ( ) ]
let box_free_sig : A.fun_sig =
{
region_params = [];
num_early_bound_regions = 0;
regions_hierarchy = [];
}
let box_deref_gen_sig (is_mut : bool) : A.fun_sig =
let region_params = [ region_param_0 ] in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
inputs =
}
* [ fn<'a , T>(&'a Box < T > ) - > & ' a T ]
let box_deref_shared_sig = box_deref_gen_sig false
* [ fn<'a , T>(&'a mut Box < T > ) - > & ' a mut T ]
let box_deref_mut_sig = box_deref_gen_sig true
* [ fn < T > ( ) - > < T > ]
let vec_new_sig : A.fun_sig =
let region_params = [] in
let regions_hierarchy = [] in
let inputs = [] in
< T >
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
let vec_push_sig : A.fun_sig =
let region_params = [ region_param_0 ] in
let inputs =
[
]
in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
let vec_insert_sig : A.fun_sig =
let region_params = [ region_param_0 ] in
let inputs =
[
]
in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
let vec_len_sig : A.fun_sig =
let region_params = [ region_param_0 ] in
let inputs =
& ' a < T >
in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
* Helper :
[ fn < T>(&'a ( mut ) < T > , usize ) - > & ' a ( mut ) T ]
[fn<T>(&'a (mut) Vec<T>, usize) -> &'a (mut) T]
*)
let vec_index_gen_sig (is_mut : bool) : A.fun_sig =
let region_params = [ region_param_0 ] in
let inputs =
[
& ' a ( mut ) < T >
]
in
{
region_params;
num_early_bound_regions = 0;
regions_hierarchy;
type_params;
inputs;
output;
}
let vec_index_shared_sig : A.fun_sig = vec_index_gen_sig false
let vec_index_mut_sig : A.fun_sig = vec_index_gen_sig true
end
type assumed_info = A.assumed_fun_id * A.fun_sig * bool * name
* The list of assumed functions and all their information :
- their signature
- a boolean indicating whether the function can fail or not
- their name
. : following what is written above , we do n't include [ Box::free ] .
Remark about the vector functions : for [ Vec::len ] to be correct and return
a [ usize ] , we have to make sure that vectors are bounded by the usize .
Followingly , [ Vec::push ] is monadic .
- their signature
- a boolean indicating whether the function can fail or not
- their name
Rk.: following what is written above, we don't include [Box::free].
Remark about the vector functions: for [Vec::len] to be correct and return
a [usize], we have to make sure that vectors are bounded by the max usize.
Followingly, [Vec::push] is monadic.
*)
let assumed_infos : assumed_info list =
let deref_pre = [ "core"; "ops"; "deref" ] in
let vec_pre = [ "alloc"; "vec"; "Vec" ] in
let index_pre = [ "core"; "ops"; "index" ] in
[
(A.Replace, Sig.mem_replace_sig, false, to_name [ "core"; "mem"; "replace" ]);
(BoxNew, Sig.box_new_sig, false, to_name [ "alloc"; "boxed"; "Box"; "new" ]);
( BoxFree,
Sig.box_free_sig,
false,
to_name [ "alloc"; "boxed"; "Box"; "free" ] );
( BoxDeref,
Sig.box_deref_shared_sig,
false,
to_name (deref_pre @ [ "Deref"; "deref" ]) );
( BoxDerefMut,
Sig.box_deref_mut_sig,
false,
to_name (deref_pre @ [ "DerefMut"; "deref_mut" ]) );
(VecNew, Sig.vec_new_sig, false, to_name (vec_pre @ [ "new" ]));
(VecPush, Sig.vec_push_sig, true, to_name (vec_pre @ [ "push" ]));
(VecInsert, Sig.vec_insert_sig, true, to_name (vec_pre @ [ "insert" ]));
(VecLen, Sig.vec_len_sig, false, to_name (vec_pre @ [ "len" ]));
( VecIndex,
Sig.vec_index_shared_sig,
true,
to_name (index_pre @ [ "Index"; "index" ]) );
( VecIndexMut,
Sig.vec_index_mut_sig,
true,
to_name (index_pre @ [ "IndexMut"; "index_mut" ]) );
]
let get_assumed_info (id : A.assumed_fun_id) : assumed_info =
match List.find_opt (fun (id', _, _, _) -> id = id') assumed_infos with
| Some info -> info
| None ->
raise
(Failure ("get_assumed_info: not found: " ^ A.show_assumed_fun_id id))
let get_assumed_sig (id : A.assumed_fun_id) : A.fun_sig =
let _, sg, _, _ = get_assumed_info id in
sg
let get_assumed_name (id : A.assumed_fun_id) : fun_name =
let _, _, _, name = get_assumed_info id in
name
let assumed_can_fail (id : A.assumed_fun_id) : bool =
let _, _, b, _ = get_assumed_info id in
b
|
9ab0f9de56396757e194fc95c6ce7edfb1b3119a75bb547fd38c0b10673a42bc | cyverse-archive/DiscoveryEnvironmentBackend | app.clj | (ns metadactyl.routes.domain.app
(:use [common-swagger-api.schema :only [->optional-param describe]]
[metadactyl.routes.params]
[metadactyl.routes.domain.app.rating]
[metadactyl.routes.domain.tool :only [Tool]]
[schema.core :only [Any defschema optional-key recursive]])
(:import [java.util UUID Date]))
(def AppIdParam (describe UUID "A UUID that is used to identify the App"))
(def StringAppIdParam (describe String "The App identifier."))
(def OptionalIdParam (describe UUID "An optional UUID identifier"))
(def AppDocParam (describe String "The App's documentation"))
(def AppDocUrlParam (describe String "The App's documentation URL"))
(def AppReferencesParam (describe [String] "The App's references"))
(def AppDeletedParam (describe Boolean "Whether the App is marked as deleted"))
(def AppDisabledParam (describe Boolean "Whether the App is marked as disabled"))
(def AppPublicParam (describe Boolean
"Whether the App has been published and is viewable by all users"))
(def OptionalGroupsKey (optional-key :groups))
(def OptionalParametersKey (optional-key :parameters))
(def OptionalParameterArgumentsKey (optional-key :arguments))
(def ToolListDocs "The tools used to execute the App")
(def GroupListDocs "The list of Parameter Groups associated with the App")
(def ParameterListDocs "The list of Parameters in this Group")
(def ListItemOrTreeDocs
"The List Parameter's arguments. Only used in cases where the user is given a fixed number of
values to choose from. This can occur for Parameters such as `TextSelection` or
`IntegerSelection` Parameters")
(def TreeSelectorParameterListDocs "The TreeSelector root's arguments")
(def TreeSelectorGroupListDocs "The TreeSelector root's groups")
(def TreeSelectorGroupParameterListDocs "The TreeSelector Group's arguments")
(def TreeSelectorGroupGroupListDocs "The TreeSelector Group's groups")
(defschema AppParameterListItem
{:id (describe UUID "A UUID that is used to identify the List Item")
(optional-key :name) (describe String "The List Item's name")
(optional-key :value) (describe String "The List Item's value")
(optional-key :description) (describe String "The List Item's description")
(optional-key :display) (describe String "The List Item's display label")
(optional-key :isDefault) (describe Boolean "Flags this Item as the List's default selection")})
(defschema AppParameterListGroup
(merge AppParameterListItem
{OptionalParameterArgumentsKey
(describe [AppParameterListItem] TreeSelectorGroupParameterListDocs)
OptionalGroupsKey
(describe [(recursive #'AppParameterListGroup)] TreeSelectorGroupGroupListDocs)}))
(defschema AppParameterListItemOrTree
(merge AppParameterListItem
{(optional-key :isSingleSelect)
(describe Boolean "The TreeSelector root's single-selection flag")
(optional-key :selectionCascade)
(describe String "The TreeSelector root's cascace option")
OptionalParameterArgumentsKey
(describe [AppParameterListItem] TreeSelectorParameterListDocs)
OptionalGroupsKey
(describe [AppParameterListGroup] TreeSelectorGroupListDocs)}))
(defschema AppParameterValidator
{:type
(describe String
"The validation rule's type, which describes how a property value should be validated. For
example, if the type is `IntAbove` then the property value entered by the user must be an
integer above a specific value, which is specified in the parameter list. You can use the
`rule-types` endpoint to get a list of validation rule types")
:params
(describe [Any]
"The list of parameters to use when validating a Parameter value. For example, to ensure that a
Parameter contains a value that is an integer greater than zero, you would use a validation
rule of type `IntAbove` along with a parameter list of `[0]`")})
(defschema AppFileParameters
{(optional-key :format)
(describe String "The Input/Output Parameter's file format")
(optional-key :file_info_type)
(describe String "The Input/Output Parameter's info type")
(optional-key :is_implicit)
(describe Boolean
"Whether the Output Parameter name is specified on the command line (but still be referenced in
Pipelines), or implicitly determined by the app itself. If the output file name is implicit
then the output file name either must always be the same or it must follow a naming convention
that can easily be matched with a glob pattern")
(optional-key :repeat_option_flag)
(describe Boolean
"Whether or not the command-line option flag should preceed each file of a MultiFileSelector
on the command line when the App is run")
(optional-key :data_source)
(describe String "The Output Parameter's source")
(optional-key :retain)
(describe Boolean
"Whether or not the Input should be copied back to the job output directory in iRODS")})
(defschema AppParameter
{:id
(describe UUID "A UUID that is used to identify the Parameter")
(optional-key :name)
(describe String
"The Parameter's name. In most cases, this field indicates the command-line option used to
identify the Parameter on the command line. In these cases, the Parameter is assumed to be
positional and no command-line option is used if the name is blank. For Parameters that
specify a limited set of selection values, however, this is not the case. Instead, the
Parameter arguments specify both the command-line flag and the Parameter value to use for each
option that is selected")
(optional-key :defaultValue)
(describe Any "The Parameter's default value")
(optional-key :value)
(describe Any "The Parameter's value, used for previewing this parameter on the command-line.")
(optional-key :label)
(describe String "The Parameter's prompt to display in the UI")
(optional-key :description)
(describe String "The Parameter's description")
(optional-key :order)
(describe Long
"The relative command-line order for the Parameter. If this field is not specified then the
arguments will appear on the command-line in the order in which they appear in the import JSON.
If you're not specifying the order, please be sure that the argument order is unimportant for
the tool being integrated")
(optional-key :required)
(describe Boolean "Whether or not a value is required for this Parameter")
(optional-key :isVisible)
(describe Boolean "The Parameter's intended visibility in the job submission UI")
(optional-key :omit_if_blank)
(describe Boolean
"Whether the command-line option should be omitted if the Parameter value is blank. This is
most useful for optional arguments that use command-line flags in conjunction with a value. In
this case, it is an error to include the command-line flag without a corresponding value. This
flag indicates that the command-line flag should be omitted if the value is blank. This can
also be used for positional arguments, but this flag tends to be useful only for trailing
positional arguments")
:type
(describe String
"The Parameter's type name. Must contain the name of one of the Parameter types defined in the
database. You can get the list of defined and undeprecated Parameter types using the
`parameter-types` endpoint")
(optional-key :file_parameters)
(describe AppFileParameters "The File Parameter specific details")
OptionalParameterArgumentsKey
(describe [AppParameterListItemOrTree] ListItemOrTreeDocs)
(optional-key :validators)
(describe [AppParameterValidator]
"The Parameter's validation rules, which contains a list of rules that can be used to verify
that Parameter values entered by a user are valid. Note that in cases where the user is given
a list of possibilities to choose from, no validation rules are required because the selection
list itself can be used to validate the Parameter value")})
(defschema AppGroup
{:id
(describe UUID "A UUID that is used to identify the Parameter Group")
(optional-key :name)
(describe String "The Parameter Group's name")
(optional-key :description)
(describe String "The Parameter Group's description")
:label
(describe String "The label used to identify the Parameter Group in the UI")
(optional-key :isVisible)
(describe Boolean "The Parameter Group's intended visibility in the job submission UI")
OptionalParametersKey
(describe [AppParameter] ParameterListDocs)})
(defschema AppBase
{:id AppIdParam
:name (describe String "The App's name")
:description (describe String "The App's description")
(optional-key :integration_date) (describe Date "The App's Date of public submission")
(optional-key :edited_date) (describe Date "The App's Date of its last edit")})
(defschema App
(merge AppBase
{(optional-key :tools) (describe [Tool] ToolListDocs)
(optional-key :references) AppReferencesParam
OptionalGroupsKey (describe [AppGroup] GroupListDocs)}))
(defschema AppFileParameterDetails
{:id (describe String "The Parameter's ID")
:name (describe String "The Parameter's name")
:description (describe String "The Parameter's description")
:label (describe String "The Input Parameter's label or the Output Parameter's value")
:format (describe String "The Parameter's file format")
:required (describe Boolean "Whether or not a value is required for this Parameter")})
(defschema AppTask
{:id (describe String "The Task's ID")
:name (describe String "The Task's name")
:description (describe String "The Task's description")
:inputs (describe [AppFileParameterDetails] "The Task's input parameters")
:outputs (describe [AppFileParameterDetails] "The Task's output parameters")})
(defschema AppTaskListing
(assoc AppBase
:id (describe String "The App's ID.")
:tasks (describe [AppTask] "The App's tasks")))
(defschema AppParameterJobView
(assoc AppParameter
:id
(describe String
"A string consisting of the App's step ID and the Parameter ID separated by an underscore.
Both identifiers are necessary because the same task may be associated with a single App,
which would cause duplicate keys in the job submission JSON. The step ID is prepended to
the Parameter ID in order to ensure that all parameter value keys are unique.")))
(defschema AppGroupJobView
(assoc AppGroup
:id (describe String "The app group ID.")
:step_number (describe Long "The step number associated with this parameter group")
OptionalParametersKey (describe [AppParameterJobView] ParameterListDocs)))
(defschema AppJobView
(assoc AppBase
:app_type (describe String "DE or External.")
:id (describe String "The app ID.")
:label (describe String "An alias for the App's name")
:deleted AppDeletedParam
:disabled AppDisabledParam
OptionalGroupsKey (describe [AppGroupJobView] GroupListDocs)))
(defschema AppDetailCategory
{:id AppCategoryIdPathParam
:name (describe String "The App Category's name")})
(defschema AppDetailsTool
(assoc Tool
:id (describe String "The tool identifier.")))
(defschema AppDetails
(merge AppBase
{:id
(describe String "The app identifier.")
:tools
(describe [AppDetailsTool] ToolListDocs)
:deleted
AppDeletedParam
:disabled
AppDisabledParam
:integrator_email
(describe String "The App integrator's email address.")
:integrator_name
(describe String "The App integrator's full name.")
(optional-key :wiki_url)
AppDocUrlParam
:references
AppReferencesParam
:categories
(describe [AppDetailCategory]
"The list of Categories associated with the App")
:suggested_categories
(describe [AppDetailCategory]
"The list of Categories the integrator wishes to associate with the App")}))
(defschema AppDocumentation
{(optional-key :app_id)
StringAppIdParam
:documentation
AppDocParam
:references
AppReferencesParam
(optional-key :created_on)
(describe Date "The Date the App's documentation was created")
(optional-key :modified_on)
(describe Date "The Date the App's documentation was last modified")
(optional-key :created_by)
(describe String "The user that created the App's documentation")
(optional-key :modified_by)
(describe String "The user that last modified the App's documentation")})
(defschema AppDocumentationRequest
(dissoc AppDocumentation :references))
(defschema PipelineEligibility
{:is_valid (describe Boolean "Whether the App can be used in a Pipeline")
:reason (describe String "The reason an App cannot be used in a Pipeline")})
(defschema AppListingDetail
(merge AppBase
{:id
(describe String "The app ID.")
:app_type
(describe String "The type ID of the App")
:can_favor
(describe Boolean "Whether the current user can favorite this App")
:can_rate
(describe Boolean "Whether the current user can rate this App")
:can_run
(describe Boolean
"This flag is calculated by comparing the number of steps in the app to the number of steps
that have a tool associated with them. If the numbers are different then this flag is set to
`false`. The idea is that every step in the analysis has to have, at the very least, a tool
associated with it in order to run successfully")
:deleted
AppDeletedParam
:disabled
AppDisabledParam
:integrator_email
(describe String "The App integrator's email address")
:integrator_name
(describe String "The App integrator's full name")
(optional-key :is_favorite)
(describe Boolean "Whether the current user has marked the App as a favorite")
:is_public
AppPublicParam
:pipeline_eligibility
(describe PipelineEligibility "Whether the App can be used in a Pipeline")
:rating
(describe Rating "The App's rating details")
:step_count
(describe Long "The number of Tasks this App executes")
(optional-key :wiki_url)
AppDocUrlParam}))
(defschema AppListing
{:app_count (describe Long "The total number of Apps in the listing")
:apps (describe [AppListingDetail] "A listing of App details")})
(defschema AppIdList
{:app_ids (describe [UUID] "A List of UUIDs used to identify Apps")})
(defschema AppDeletionRequest
(merge AppIdList
{(optional-key :root_deletion_request)
(describe Boolean "Set to `true` to delete one or more public apps")}))
(defschema AppParameterListItemRequest
(->optional-param AppParameterListItem :id))
(defschema AppParameterListGroupRequest
(-> AppParameterListGroup
(->optional-param :id)
(assoc OptionalParameterArgumentsKey
(describe [AppParameterListItemRequest] TreeSelectorGroupParameterListDocs)
OptionalGroupsKey
(describe [(recursive #'AppParameterListGroupRequest)] TreeSelectorGroupGroupListDocs))))
(defschema AppParameterListItemOrTreeRequest
(-> AppParameterListItemOrTree
(->optional-param :id)
(assoc OptionalParameterArgumentsKey
(describe [AppParameterListItemRequest] TreeSelectorParameterListDocs))
(assoc OptionalGroupsKey
(describe [AppParameterListGroupRequest] TreeSelectorGroupListDocs))))
(defschema AppParameterRequest
(-> AppParameter
(->optional-param :id)
(assoc OptionalParameterArgumentsKey
(describe [AppParameterListItemOrTreeRequest] ListItemOrTreeDocs))))
(defschema AppGroupRequest
(-> AppGroup
(->optional-param :id)
(assoc OptionalParametersKey (describe [AppParameterRequest] ParameterListDocs))))
(defschema AppRequest
(-> App
(->optional-param :id)
(assoc OptionalGroupsKey (describe [AppGroupRequest] GroupListDocs))))
(defschema AppPreviewRequest
(-> App
(->optional-param :id)
(->optional-param :name)
(->optional-param :description)
(assoc OptionalGroupsKey (describe [AppGroupRequest] GroupListDocs)
(optional-key :is_public) AppPublicParam)))
(defschema AppCategoryIdListing
{:categories (describe [UUID] "A listing of App Category IDs")})
(defschema PublishAppRequest
(-> AppBase
(->optional-param :id)
(->optional-param :name)
(->optional-param :description)
(assoc :documentation AppDocParam
:references AppReferencesParam)
(merge AppCategoryIdListing)))
(defschema AdminAppPatchRequest
(-> AppBase
(->optional-param :id)
(->optional-param :name)
(->optional-param :description)
(assoc (optional-key :wiki_url) AppDocUrlParam
(optional-key :references) AppReferencesParam
(optional-key :deleted) AppDeletedParam
(optional-key :disabled) AppDisabledParam
OptionalGroupsKey (describe [AppGroup] GroupListDocs))))
| null | https://raw.githubusercontent.com/cyverse-archive/DiscoveryEnvironmentBackend/7f6177078c1a1cb6d11e62f12cfe2e22d669635b/services/metadactyl-clj/src/metadactyl/routes/domain/app.clj | clojure | (ns metadactyl.routes.domain.app
(:use [common-swagger-api.schema :only [->optional-param describe]]
[metadactyl.routes.params]
[metadactyl.routes.domain.app.rating]
[metadactyl.routes.domain.tool :only [Tool]]
[schema.core :only [Any defschema optional-key recursive]])
(:import [java.util UUID Date]))
(def AppIdParam (describe UUID "A UUID that is used to identify the App"))
(def StringAppIdParam (describe String "The App identifier."))
(def OptionalIdParam (describe UUID "An optional UUID identifier"))
(def AppDocParam (describe String "The App's documentation"))
(def AppDocUrlParam (describe String "The App's documentation URL"))
(def AppReferencesParam (describe [String] "The App's references"))
(def AppDeletedParam (describe Boolean "Whether the App is marked as deleted"))
(def AppDisabledParam (describe Boolean "Whether the App is marked as disabled"))
(def AppPublicParam (describe Boolean
"Whether the App has been published and is viewable by all users"))
(def OptionalGroupsKey (optional-key :groups))
(def OptionalParametersKey (optional-key :parameters))
(def OptionalParameterArgumentsKey (optional-key :arguments))
(def ToolListDocs "The tools used to execute the App")
(def GroupListDocs "The list of Parameter Groups associated with the App")
(def ParameterListDocs "The list of Parameters in this Group")
(def ListItemOrTreeDocs
"The List Parameter's arguments. Only used in cases where the user is given a fixed number of
values to choose from. This can occur for Parameters such as `TextSelection` or
`IntegerSelection` Parameters")
(def TreeSelectorParameterListDocs "The TreeSelector root's arguments")
(def TreeSelectorGroupListDocs "The TreeSelector root's groups")
(def TreeSelectorGroupParameterListDocs "The TreeSelector Group's arguments")
(def TreeSelectorGroupGroupListDocs "The TreeSelector Group's groups")
(defschema AppParameterListItem
{:id (describe UUID "A UUID that is used to identify the List Item")
(optional-key :name) (describe String "The List Item's name")
(optional-key :value) (describe String "The List Item's value")
(optional-key :description) (describe String "The List Item's description")
(optional-key :display) (describe String "The List Item's display label")
(optional-key :isDefault) (describe Boolean "Flags this Item as the List's default selection")})
(defschema AppParameterListGroup
(merge AppParameterListItem
{OptionalParameterArgumentsKey
(describe [AppParameterListItem] TreeSelectorGroupParameterListDocs)
OptionalGroupsKey
(describe [(recursive #'AppParameterListGroup)] TreeSelectorGroupGroupListDocs)}))
(defschema AppParameterListItemOrTree
(merge AppParameterListItem
{(optional-key :isSingleSelect)
(describe Boolean "The TreeSelector root's single-selection flag")
(optional-key :selectionCascade)
(describe String "The TreeSelector root's cascace option")
OptionalParameterArgumentsKey
(describe [AppParameterListItem] TreeSelectorParameterListDocs)
OptionalGroupsKey
(describe [AppParameterListGroup] TreeSelectorGroupListDocs)}))
(defschema AppParameterValidator
{:type
(describe String
"The validation rule's type, which describes how a property value should be validated. For
example, if the type is `IntAbove` then the property value entered by the user must be an
integer above a specific value, which is specified in the parameter list. You can use the
`rule-types` endpoint to get a list of validation rule types")
:params
(describe [Any]
"The list of parameters to use when validating a Parameter value. For example, to ensure that a
Parameter contains a value that is an integer greater than zero, you would use a validation
rule of type `IntAbove` along with a parameter list of `[0]`")})
(defschema AppFileParameters
{(optional-key :format)
(describe String "The Input/Output Parameter's file format")
(optional-key :file_info_type)
(describe String "The Input/Output Parameter's info type")
(optional-key :is_implicit)
(describe Boolean
"Whether the Output Parameter name is specified on the command line (but still be referenced in
Pipelines), or implicitly determined by the app itself. If the output file name is implicit
then the output file name either must always be the same or it must follow a naming convention
that can easily be matched with a glob pattern")
(optional-key :repeat_option_flag)
(describe Boolean
"Whether or not the command-line option flag should preceed each file of a MultiFileSelector
on the command line when the App is run")
(optional-key :data_source)
(describe String "The Output Parameter's source")
(optional-key :retain)
(describe Boolean
"Whether or not the Input should be copied back to the job output directory in iRODS")})
(defschema AppParameter
{:id
(describe UUID "A UUID that is used to identify the Parameter")
(optional-key :name)
(describe String
"The Parameter's name. In most cases, this field indicates the command-line option used to
identify the Parameter on the command line. In these cases, the Parameter is assumed to be
positional and no command-line option is used if the name is blank. For Parameters that
specify a limited set of selection values, however, this is not the case. Instead, the
Parameter arguments specify both the command-line flag and the Parameter value to use for each
option that is selected")
(optional-key :defaultValue)
(describe Any "The Parameter's default value")
(optional-key :value)
(describe Any "The Parameter's value, used for previewing this parameter on the command-line.")
(optional-key :label)
(describe String "The Parameter's prompt to display in the UI")
(optional-key :description)
(describe String "The Parameter's description")
(optional-key :order)
(describe Long
"The relative command-line order for the Parameter. If this field is not specified then the
arguments will appear on the command-line in the order in which they appear in the import JSON.
If you're not specifying the order, please be sure that the argument order is unimportant for
the tool being integrated")
(optional-key :required)
(describe Boolean "Whether or not a value is required for this Parameter")
(optional-key :isVisible)
(describe Boolean "The Parameter's intended visibility in the job submission UI")
(optional-key :omit_if_blank)
(describe Boolean
"Whether the command-line option should be omitted if the Parameter value is blank. This is
most useful for optional arguments that use command-line flags in conjunction with a value. In
this case, it is an error to include the command-line flag without a corresponding value. This
flag indicates that the command-line flag should be omitted if the value is blank. This can
also be used for positional arguments, but this flag tends to be useful only for trailing
positional arguments")
:type
(describe String
"The Parameter's type name. Must contain the name of one of the Parameter types defined in the
database. You can get the list of defined and undeprecated Parameter types using the
`parameter-types` endpoint")
(optional-key :file_parameters)
(describe AppFileParameters "The File Parameter specific details")
OptionalParameterArgumentsKey
(describe [AppParameterListItemOrTree] ListItemOrTreeDocs)
(optional-key :validators)
(describe [AppParameterValidator]
"The Parameter's validation rules, which contains a list of rules that can be used to verify
that Parameter values entered by a user are valid. Note that in cases where the user is given
a list of possibilities to choose from, no validation rules are required because the selection
list itself can be used to validate the Parameter value")})
(defschema AppGroup
{:id
(describe UUID "A UUID that is used to identify the Parameter Group")
(optional-key :name)
(describe String "The Parameter Group's name")
(optional-key :description)
(describe String "The Parameter Group's description")
:label
(describe String "The label used to identify the Parameter Group in the UI")
(optional-key :isVisible)
(describe Boolean "The Parameter Group's intended visibility in the job submission UI")
OptionalParametersKey
(describe [AppParameter] ParameterListDocs)})
(defschema AppBase
{:id AppIdParam
:name (describe String "The App's name")
:description (describe String "The App's description")
(optional-key :integration_date) (describe Date "The App's Date of public submission")
(optional-key :edited_date) (describe Date "The App's Date of its last edit")})
(defschema App
(merge AppBase
{(optional-key :tools) (describe [Tool] ToolListDocs)
(optional-key :references) AppReferencesParam
OptionalGroupsKey (describe [AppGroup] GroupListDocs)}))
(defschema AppFileParameterDetails
{:id (describe String "The Parameter's ID")
:name (describe String "The Parameter's name")
:description (describe String "The Parameter's description")
:label (describe String "The Input Parameter's label or the Output Parameter's value")
:format (describe String "The Parameter's file format")
:required (describe Boolean "Whether or not a value is required for this Parameter")})
(defschema AppTask
{:id (describe String "The Task's ID")
:name (describe String "The Task's name")
:description (describe String "The Task's description")
:inputs (describe [AppFileParameterDetails] "The Task's input parameters")
:outputs (describe [AppFileParameterDetails] "The Task's output parameters")})
(defschema AppTaskListing
(assoc AppBase
:id (describe String "The App's ID.")
:tasks (describe [AppTask] "The App's tasks")))
(defschema AppParameterJobView
(assoc AppParameter
:id
(describe String
"A string consisting of the App's step ID and the Parameter ID separated by an underscore.
Both identifiers are necessary because the same task may be associated with a single App,
which would cause duplicate keys in the job submission JSON. The step ID is prepended to
the Parameter ID in order to ensure that all parameter value keys are unique.")))
(defschema AppGroupJobView
(assoc AppGroup
:id (describe String "The app group ID.")
:step_number (describe Long "The step number associated with this parameter group")
OptionalParametersKey (describe [AppParameterJobView] ParameterListDocs)))
(defschema AppJobView
(assoc AppBase
:app_type (describe String "DE or External.")
:id (describe String "The app ID.")
:label (describe String "An alias for the App's name")
:deleted AppDeletedParam
:disabled AppDisabledParam
OptionalGroupsKey (describe [AppGroupJobView] GroupListDocs)))
(defschema AppDetailCategory
{:id AppCategoryIdPathParam
:name (describe String "The App Category's name")})
(defschema AppDetailsTool
(assoc Tool
:id (describe String "The tool identifier.")))
(defschema AppDetails
(merge AppBase
{:id
(describe String "The app identifier.")
:tools
(describe [AppDetailsTool] ToolListDocs)
:deleted
AppDeletedParam
:disabled
AppDisabledParam
:integrator_email
(describe String "The App integrator's email address.")
:integrator_name
(describe String "The App integrator's full name.")
(optional-key :wiki_url)
AppDocUrlParam
:references
AppReferencesParam
:categories
(describe [AppDetailCategory]
"The list of Categories associated with the App")
:suggested_categories
(describe [AppDetailCategory]
"The list of Categories the integrator wishes to associate with the App")}))
(defschema AppDocumentation
{(optional-key :app_id)
StringAppIdParam
:documentation
AppDocParam
:references
AppReferencesParam
(optional-key :created_on)
(describe Date "The Date the App's documentation was created")
(optional-key :modified_on)
(describe Date "The Date the App's documentation was last modified")
(optional-key :created_by)
(describe String "The user that created the App's documentation")
(optional-key :modified_by)
(describe String "The user that last modified the App's documentation")})
(defschema AppDocumentationRequest
(dissoc AppDocumentation :references))
(defschema PipelineEligibility
{:is_valid (describe Boolean "Whether the App can be used in a Pipeline")
:reason (describe String "The reason an App cannot be used in a Pipeline")})
(defschema AppListingDetail
(merge AppBase
{:id
(describe String "The app ID.")
:app_type
(describe String "The type ID of the App")
:can_favor
(describe Boolean "Whether the current user can favorite this App")
:can_rate
(describe Boolean "Whether the current user can rate this App")
:can_run
(describe Boolean
"This flag is calculated by comparing the number of steps in the app to the number of steps
that have a tool associated with them. If the numbers are different then this flag is set to
`false`. The idea is that every step in the analysis has to have, at the very least, a tool
associated with it in order to run successfully")
:deleted
AppDeletedParam
:disabled
AppDisabledParam
:integrator_email
(describe String "The App integrator's email address")
:integrator_name
(describe String "The App integrator's full name")
(optional-key :is_favorite)
(describe Boolean "Whether the current user has marked the App as a favorite")
:is_public
AppPublicParam
:pipeline_eligibility
(describe PipelineEligibility "Whether the App can be used in a Pipeline")
:rating
(describe Rating "The App's rating details")
:step_count
(describe Long "The number of Tasks this App executes")
(optional-key :wiki_url)
AppDocUrlParam}))
(defschema AppListing
{:app_count (describe Long "The total number of Apps in the listing")
:apps (describe [AppListingDetail] "A listing of App details")})
(defschema AppIdList
{:app_ids (describe [UUID] "A List of UUIDs used to identify Apps")})
(defschema AppDeletionRequest
(merge AppIdList
{(optional-key :root_deletion_request)
(describe Boolean "Set to `true` to delete one or more public apps")}))
(defschema AppParameterListItemRequest
(->optional-param AppParameterListItem :id))
(defschema AppParameterListGroupRequest
(-> AppParameterListGroup
(->optional-param :id)
(assoc OptionalParameterArgumentsKey
(describe [AppParameterListItemRequest] TreeSelectorGroupParameterListDocs)
OptionalGroupsKey
(describe [(recursive #'AppParameterListGroupRequest)] TreeSelectorGroupGroupListDocs))))
(defschema AppParameterListItemOrTreeRequest
(-> AppParameterListItemOrTree
(->optional-param :id)
(assoc OptionalParameterArgumentsKey
(describe [AppParameterListItemRequest] TreeSelectorParameterListDocs))
(assoc OptionalGroupsKey
(describe [AppParameterListGroupRequest] TreeSelectorGroupListDocs))))
(defschema AppParameterRequest
(-> AppParameter
(->optional-param :id)
(assoc OptionalParameterArgumentsKey
(describe [AppParameterListItemOrTreeRequest] ListItemOrTreeDocs))))
(defschema AppGroupRequest
(-> AppGroup
(->optional-param :id)
(assoc OptionalParametersKey (describe [AppParameterRequest] ParameterListDocs))))
(defschema AppRequest
(-> App
(->optional-param :id)
(assoc OptionalGroupsKey (describe [AppGroupRequest] GroupListDocs))))
(defschema AppPreviewRequest
(-> App
(->optional-param :id)
(->optional-param :name)
(->optional-param :description)
(assoc OptionalGroupsKey (describe [AppGroupRequest] GroupListDocs)
(optional-key :is_public) AppPublicParam)))
(defschema AppCategoryIdListing
{:categories (describe [UUID] "A listing of App Category IDs")})
(defschema PublishAppRequest
(-> AppBase
(->optional-param :id)
(->optional-param :name)
(->optional-param :description)
(assoc :documentation AppDocParam
:references AppReferencesParam)
(merge AppCategoryIdListing)))
(defschema AdminAppPatchRequest
(-> AppBase
(->optional-param :id)
(->optional-param :name)
(->optional-param :description)
(assoc (optional-key :wiki_url) AppDocUrlParam
(optional-key :references) AppReferencesParam
(optional-key :deleted) AppDeletedParam
(optional-key :disabled) AppDisabledParam
OptionalGroupsKey (describe [AppGroup] GroupListDocs))))
| |
15b242462e166adbee41448e012f02c242a7ed6f5cd46ec9a297f0a5c0c4c592 | gedge-platform/gs-broker | eetcd_cluster.erl | -module(eetcd_cluster).
%% API
-include("eetcd.hrl").
-export([new/1, with_timeout/2]).
-export([member_list/1]).
-export([member_add/2, member_add_as_learner/2]).
-export([member_remove/2]).
-export([member_update/3]).
-export([member_promote/2]).
@doc MemberList lists the current cluster membership .
%%% <dl>
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : member_list(ConnName)'</dd >
< dt > 2.elixir < /dt >
%%% <dd>
%%% ```
%%% :eetcd_cluster.new(connName)
%%% |> :eetcd_cluster.with_timeout(6000)
%%% |> :eetcd_cluster.member_list()
%%% '''
%%% </dd> </dl>
%%% {@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
%%% @end
-spec member_list(context()|name()) ->
{ok,router_pb:'Etcd.MemberListResponse'()}|{error,eetcd_error()}.
member_list(Context) -> eetcd_cluster_gen:member_list(new(Context)).
@doc MemberAdd adds a new member into the cluster .
%%% <dl>
< dt > 1.base < /dt >
%%% <dd> `eetcd_cluster:member_add(ConnName, [":2380"])'</dd>
< dt > 2.elixir < /dt >
%%% <dd>
%%% ```
%%% :eetcd_cluster.new(connName)
%%% |> :eetcd_cluster.with_timeout(6000)
%%% |> :eetcd_cluster.member_add([":2380"])
%%% '''
%%% </dd> </dl>
%%% {@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
%%% @end
-spec member_add(context()|name(), PeerURLs) ->
{ok,router_pb:'Etcd.MemberListResponse'()}
| {error, {'grpc_error', non_neg_integer(), binary()}} | {error, term()}
when PeerURLs :: [iodata()].
member_add(Context, PeerAddrs) when is_list(PeerAddrs) ->
C1 = new(Context),
C2 = maps:put(peerURLs, PeerAddrs, C1),
C3 = maps:put(isLearner, false, C2),
eetcd_cluster_gen:member_add(C3).
@doc MemberAddAsLearner adds a new learner member into the cluster .
%%% <dl>
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : , [ " :2380"])'</dd >
< dt > 2.elixir < /dt >
%%% <dd>
%%% ```
%%% :eetcd_cluster.new(connName)
%%% |> :eetcd_cluster.with_timeout(6000)
%%% |> :eetcd_cluster.member_add_as_learner([":2380"])
%%% '''
%%% </dd> </dl>
%%% {@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
%%% @end
-spec member_add_as_learner(context()|name(), PeerURLs) ->
{ok,router_pb:'Etcd.MemberListResponse'()}
| {error, {'grpc_error', non_neg_integer(), binary()}} | {error, term()}
when PeerURLs :: [iodata()].
member_add_as_learner(Context, PeerAddrs) when is_list(PeerAddrs) ->
C1 = new(Context),
C2 = maps:put(peerURLs, PeerAddrs, C1),
C3 = maps:put(isLearner, true, C2),
eetcd_cluster_gen:member_add(C3).
@doc MemberRemove removes an existing member from the cluster .
%%% <dl>
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : member_remove(ConnName , Id)'</dd >
< dt > 2.elixir < /dt >
%%% <dd>
%%% ```
%%% :eetcd_cluster.new(connName)
%%% |> :eetcd_cluster.with_timeout(6000)
%%% |> :eetcd_cluster.member_remove(id)
%%% '''
%%% </dd> </dl>
%%% {@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
%%% @end
-spec member_remove(context()|name(), pos_integer()) ->
{ok,router_pb:'Etcd.MemberRemoveResponse'()}|{error,eetcd_error()}.
member_remove(Context, Id) when is_integer(Id) ->
C1 = new(Context),
C2 = maps:put('ID', Id, C1),
eetcd_cluster_gen:member_remove(C2).
@doc MemberUpdate updates the peer addresses of the member .
%%% <dl>
< dt > 1.base < /dt >
%%% <dd> `eetcd_cluster:member_update(ConnName, Id, PeerAddrs)'</dd>
< dt > 2.elixir < /dt >
%%% <dd>
%%% ```
%%% :eetcd_cluster.new(connName)
%%% |> :eetcd_cluster.with_timeout(6000)
| > : eetcd_cluster.member_remove(id , )
%%% '''
%%% </dd> </dl>
%%% {@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
%%% @end
-spec member_update(context()|name(), pos_integer(), [list()]) ->
{ok,router_pb:'Etcd.MemberUpdateResponse'()}|{error,eetcd_error()}.
member_update(Context, Id, PeerAddrs)
when is_integer(Id) andalso is_list(PeerAddrs) ->
C1 = new(Context),
C2 = maps:put('ID', Id, C1),
C3 = maps:put(peerURLs, PeerAddrs, C2),
eetcd_cluster_gen:member_update(C3).
@doc MemberPromote promotes a member from raft learner ( non - voting ) to raft voting member .
%%% <dl>
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : member_promote(ConnName , Id)'</dd >
< dt > 2.elixir < /dt >
%%% <dd>
%%% ```
%%% :eetcd_cluster.new(connName)
%%% |> :eetcd_cluster.with_timeout(6000)
%%% |> :eetcd_cluster.member_promote(id)
%%% '''
%%% </dd> </dl>
%%% {@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
%%% @end
-spec member_promote(context()|name(), pos_integer()) ->
{ok,router_pb:'Etcd.MemberPromoteResponse'()}|{error,eetcd_error()}.
member_promote(Context, Id) when is_integer(Id) ->
C1 = new(Context),
C2 = maps:put('ID', Id, C1),
eetcd_cluster_gen:member_promote(C2).
%%% @doc Create context for request.
-spec new(atom()|reference()) -> context().
new(Context) -> eetcd:new(Context).
@doc Timeout is an integer greater than zero which specifies how many milliseconds to wait for a reply ,
or the atom infinity to wait indefinitely . Default value is 5000 .
%% If no reply is received within the specified time, the function call fails with `{error, timeout}'.
-spec with_timeout(context(), pos_integer()|infinity) -> context().
with_timeout(Context, Timeout) -> eetcd:with_timeout(Context, Timeout).
| null | https://raw.githubusercontent.com/gedge-platform/gs-broker/c4c1ad39e563537d46553eae1363317cf75aff26/broker-server/deps/eetcd/src/eetcd_cluster.erl | erlang | API
<dl>
<dd>
```
:eetcd_cluster.new(connName)
|> :eetcd_cluster.with_timeout(6000)
|> :eetcd_cluster.member_list()
'''
</dd> </dl>
{@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
@end
<dl>
<dd> `eetcd_cluster:member_add(ConnName, [":2380"])'</dd>
<dd>
```
:eetcd_cluster.new(connName)
|> :eetcd_cluster.with_timeout(6000)
|> :eetcd_cluster.member_add([":2380"])
'''
</dd> </dl>
{@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
@end
<dl>
<dd>
```
:eetcd_cluster.new(connName)
|> :eetcd_cluster.with_timeout(6000)
|> :eetcd_cluster.member_add_as_learner([":2380"])
'''
</dd> </dl>
{@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
@end
<dl>
<dd>
```
:eetcd_cluster.new(connName)
|> :eetcd_cluster.with_timeout(6000)
|> :eetcd_cluster.member_remove(id)
'''
</dd> </dl>
{@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
@end
<dl>
<dd> `eetcd_cluster:member_update(ConnName, Id, PeerAddrs)'</dd>
<dd>
```
:eetcd_cluster.new(connName)
|> :eetcd_cluster.with_timeout(6000)
'''
</dd> </dl>
{@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
@end
<dl>
<dd>
```
:eetcd_cluster.new(connName)
|> :eetcd_cluster.with_timeout(6000)
|> :eetcd_cluster.member_promote(id)
'''
</dd> </dl>
{@link eetcd_cluster:with_timeout/2} {@link eetcd_cluster:new/1}
@end
@doc Create context for request.
If no reply is received within the specified time, the function call fails with `{error, timeout}'.
| -module(eetcd_cluster).
-include("eetcd.hrl").
-export([new/1, with_timeout/2]).
-export([member_list/1]).
-export([member_add/2, member_add_as_learner/2]).
-export([member_remove/2]).
-export([member_update/3]).
-export([member_promote/2]).
@doc MemberList lists the current cluster membership .
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : member_list(ConnName)'</dd >
< dt > 2.elixir < /dt >
-spec member_list(context()|name()) ->
{ok,router_pb:'Etcd.MemberListResponse'()}|{error,eetcd_error()}.
member_list(Context) -> eetcd_cluster_gen:member_list(new(Context)).
@doc MemberAdd adds a new member into the cluster .
< dt > 1.base < /dt >
< dt > 2.elixir < /dt >
-spec member_add(context()|name(), PeerURLs) ->
{ok,router_pb:'Etcd.MemberListResponse'()}
| {error, {'grpc_error', non_neg_integer(), binary()}} | {error, term()}
when PeerURLs :: [iodata()].
member_add(Context, PeerAddrs) when is_list(PeerAddrs) ->
C1 = new(Context),
C2 = maps:put(peerURLs, PeerAddrs, C1),
C3 = maps:put(isLearner, false, C2),
eetcd_cluster_gen:member_add(C3).
@doc MemberAddAsLearner adds a new learner member into the cluster .
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : , [ " :2380"])'</dd >
< dt > 2.elixir < /dt >
-spec member_add_as_learner(context()|name(), PeerURLs) ->
{ok,router_pb:'Etcd.MemberListResponse'()}
| {error, {'grpc_error', non_neg_integer(), binary()}} | {error, term()}
when PeerURLs :: [iodata()].
member_add_as_learner(Context, PeerAddrs) when is_list(PeerAddrs) ->
C1 = new(Context),
C2 = maps:put(peerURLs, PeerAddrs, C1),
C3 = maps:put(isLearner, true, C2),
eetcd_cluster_gen:member_add(C3).
@doc MemberRemove removes an existing member from the cluster .
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : member_remove(ConnName , Id)'</dd >
< dt > 2.elixir < /dt >
-spec member_remove(context()|name(), pos_integer()) ->
{ok,router_pb:'Etcd.MemberRemoveResponse'()}|{error,eetcd_error()}.
member_remove(Context, Id) when is_integer(Id) ->
C1 = new(Context),
C2 = maps:put('ID', Id, C1),
eetcd_cluster_gen:member_remove(C2).
@doc MemberUpdate updates the peer addresses of the member .
< dt > 1.base < /dt >
< dt > 2.elixir < /dt >
| > : eetcd_cluster.member_remove(id , )
-spec member_update(context()|name(), pos_integer(), [list()]) ->
{ok,router_pb:'Etcd.MemberUpdateResponse'()}|{error,eetcd_error()}.
member_update(Context, Id, PeerAddrs)
when is_integer(Id) andalso is_list(PeerAddrs) ->
C1 = new(Context),
C2 = maps:put('ID', Id, C1),
C3 = maps:put(peerURLs, PeerAddrs, C2),
eetcd_cluster_gen:member_update(C3).
@doc MemberPromote promotes a member from raft learner ( non - voting ) to raft voting member .
< dt > 1.base < /dt >
< dd > ` eetcd_cluster : member_promote(ConnName , Id)'</dd >
< dt > 2.elixir < /dt >
-spec member_promote(context()|name(), pos_integer()) ->
{ok,router_pb:'Etcd.MemberPromoteResponse'()}|{error,eetcd_error()}.
member_promote(Context, Id) when is_integer(Id) ->
C1 = new(Context),
C2 = maps:put('ID', Id, C1),
eetcd_cluster_gen:member_promote(C2).
-spec new(atom()|reference()) -> context().
new(Context) -> eetcd:new(Context).
@doc Timeout is an integer greater than zero which specifies how many milliseconds to wait for a reply ,
or the atom infinity to wait indefinitely . Default value is 5000 .
-spec with_timeout(context(), pos_integer()|infinity) -> context().
with_timeout(Context, Timeout) -> eetcd:with_timeout(Context, Timeout).
|
65a7f31d296c6983406d86223ebcc022c24299c6cdb7fc3963e20d70332c7c1d | fluree/ledger | collections.clj | (ns fluree.db.ledger.docs.schema.collections
(:require [clojure.test :refer :all]
[fluree.db.test-helpers :as test]
[fluree.db.ledger.docs.getting-started.basic-schema :as basic]
[fluree.db.api :as fdb]
[clojure.core.async :as async]
[clojure.string :as str]))
(use-fixtures :once test/test-system-deprecated)
(deftest add-collection-long-desc
(testing "Add long description to collections.")
(let [long-desc-txn [{:_id "_predicate", :name "_collection/longDescription", :type "string"}]
res (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
long-desc-txn
{:timeout 120000}))
add-long-desc-txn [{:_id ["_collection/name" "person"],
:longDescription "I have a lot to say about this collection, so this is a longer description about the person collection"}
{:_id "_collection",
:name "animal",
:longDescription "I have a lot to say about this collection, so this is a longer description about the animal collection"}]
add-long-desc-res (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
add-long-desc-txn
{:timeout 120000}))]
(is (= 200 (:status res)))
(is (= 200 (:status add-long-desc-res)))
(is (= 1 (-> res :tempids count)))
(is (= 1 (-> add-long-desc-res :tempids count)))
(is (= 9 (-> res :flakes count)))
(is (= 10 (-> add-long-desc-res :flakes count)))))
(deftest query-collection-name-predicate
(testing "Query the _predicate/name predicate")
(let [query-collection-name {:select ["*"]
:from ["_predicate/name" "_collection/name"]}
db (basic/get-db test/ledger-chat)
res (-> (async/<!! (fdb/query-async db query-collection-name))
first)]
(is (= "_collection/name" (get res "_predicate/name")))))
(deftest collection-upsert
(testing "Attempt to upsert _collection/name, then set upsert")
(let [txn [{:_id "_collection", :name "_user", :doc "The user's collection"}]
res (-> (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
txn
{:timeout 120000}))
test/safe-Throwable->map
:cause)
set-upsert [{:_id ["_predicate/name" "_collection/name"], :upsert true}]
upsertRes (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
set-upsert
{:timeout 120000}))
attemptToUpsertRes (async/<!! (fdb/transact-async (basic/get-conn) test/ledger-chat txn))]
(is (= res "Unique predicate _collection/name with value: _user matched an existing subject: 17592186044421."))
(is (= 200 (:status upsertRes)))
(is (= 200 (:status attemptToUpsertRes)))
(is (= 9 (-> attemptToUpsertRes :flakes count)))))
(deftest collections-test
(add-collection-long-desc)
(query-collection-name-predicate)
(collection-upsert))
(deftest tests-independent
(basic/add-collections*)
(basic/add-predicates)
(basic/add-sample-data)
(basic/graphql-txn)
(collections-test))
| null | https://raw.githubusercontent.com/fluree/ledger/254a5a6e03291f81db1837e2a8b13e447d6342ce/test/fluree/db/ledger/docs/schema/collections.clj | clojure | (ns fluree.db.ledger.docs.schema.collections
(:require [clojure.test :refer :all]
[fluree.db.test-helpers :as test]
[fluree.db.ledger.docs.getting-started.basic-schema :as basic]
[fluree.db.api :as fdb]
[clojure.core.async :as async]
[clojure.string :as str]))
(use-fixtures :once test/test-system-deprecated)
(deftest add-collection-long-desc
(testing "Add long description to collections.")
(let [long-desc-txn [{:_id "_predicate", :name "_collection/longDescription", :type "string"}]
res (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
long-desc-txn
{:timeout 120000}))
add-long-desc-txn [{:_id ["_collection/name" "person"],
:longDescription "I have a lot to say about this collection, so this is a longer description about the person collection"}
{:_id "_collection",
:name "animal",
:longDescription "I have a lot to say about this collection, so this is a longer description about the animal collection"}]
add-long-desc-res (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
add-long-desc-txn
{:timeout 120000}))]
(is (= 200 (:status res)))
(is (= 200 (:status add-long-desc-res)))
(is (= 1 (-> res :tempids count)))
(is (= 1 (-> add-long-desc-res :tempids count)))
(is (= 9 (-> res :flakes count)))
(is (= 10 (-> add-long-desc-res :flakes count)))))
(deftest query-collection-name-predicate
(testing "Query the _predicate/name predicate")
(let [query-collection-name {:select ["*"]
:from ["_predicate/name" "_collection/name"]}
db (basic/get-db test/ledger-chat)
res (-> (async/<!! (fdb/query-async db query-collection-name))
first)]
(is (= "_collection/name" (get res "_predicate/name")))))
(deftest collection-upsert
(testing "Attempt to upsert _collection/name, then set upsert")
(let [txn [{:_id "_collection", :name "_user", :doc "The user's collection"}]
res (-> (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
txn
{:timeout 120000}))
test/safe-Throwable->map
:cause)
set-upsert [{:_id ["_predicate/name" "_collection/name"], :upsert true}]
upsertRes (async/<!! (fdb/transact-async (basic/get-conn)
test/ledger-chat
set-upsert
{:timeout 120000}))
attemptToUpsertRes (async/<!! (fdb/transact-async (basic/get-conn) test/ledger-chat txn))]
(is (= res "Unique predicate _collection/name with value: _user matched an existing subject: 17592186044421."))
(is (= 200 (:status upsertRes)))
(is (= 200 (:status attemptToUpsertRes)))
(is (= 9 (-> attemptToUpsertRes :flakes count)))))
(deftest collections-test
(add-collection-long-desc)
(query-collection-name-predicate)
(collection-upsert))
(deftest tests-independent
(basic/add-collections*)
(basic/add-predicates)
(basic/add-sample-data)
(basic/graphql-txn)
(collections-test))
| |
1aac554a1c55df6540a183d89fdf150a87dbff70f6cb5c18276b2f9a138e8047 | input-output-hk/cardano-addresses | ValidationSpec.hs | # LANGUAGE FlexibleContexts #
# LANGUAGE LambdaCase #
# LANGUAGE QuasiQuotes #
module Command.Script.ValidationSpec
( spec
) where
import Prelude
import Cardano.Address.Script
( ErrRecommendedValidateScript (..)
, ErrValidateScript (..)
, ValidationLevel (..)
, prettyErrValidateScript
)
import Data.String.Interpolate
( iii )
import Test.Hspec
( Spec, SpecWith, it, shouldBe, shouldContain )
import Test.Utils
( cli, describeCmd )
spec :: Spec
spec = do
describeCmd [ "script", "validate"] $ do
specScriptValidated RequiredValidation
[iii|#{verKeyH1}|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, #{verKeyH3} ]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, active_from 10, active_until 25]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, active_from 6, active_until 15]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, active_from 6, active_until 25]|]
specScriptValidated RequiredValidation
[iii|all []|]
specScriptNotValidated Malformed RequiredValidation
[iii|any [ #{verKeyH1}, #{verKeyH2}, active_from a]|]
specScriptNotValidated (NotRecommended EmptyList) RecommendedValidation
[iii|all []|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ active_from 11, active_until 25]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, active_from 11, active_until 25]|]
specScriptNotValidated (NotRecommended ListTooSmall) RecommendedValidation
[iii|at_least 2 [ #{verKeyH1}, active_from 11, active_until 25]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, active_from 11, active_until 25, active_until 30]|]
specScriptNotValidated (NotRecommended RedundantTimelocks) RecommendedValidation
[iii|at_least 1 [ #{verKeyH1}, active_from 11, active_until 25, active_until 30]|]
specScriptValidated RequiredValidation
[iii|any [ #{verKeyH1}, #{verKeyH2}, #{verKeyH2}]|]
specScriptNotValidated (NotRecommended DuplicateSignatures) RecommendedValidation
[iii|any [ #{verKeyH1}, #{verKeyH2}, #{verKeyH2}]|]
specScriptValidated RequiredValidation
[iii|at_least 0 [ #{verKeyH1}, #{verKeyH2} ]|]
specScriptNotValidated (NotRecommended MZero) RecommendedValidation
[iii|at_least 0 [ #{verKeyH1}, #{verKeyH2} ]|]
specScriptNotValidated NotUniformKeyType RequiredValidation
[iii|any [ #{verKeyH1}, #{verKeyH4}]|]
specScriptNotValidated NotUniformKeyType RecommendedValidation
[iii|at_least 1 [ #{verKeyH1}, #{verKeyH4} ]|]
levelStr :: ValidationLevel -> String
levelStr = \case
RequiredValidation -> "--required"
RecommendedValidation -> "--recommended"
specScriptValidated :: ValidationLevel -> String -> SpecWith ()
specScriptValidated level script = it (script <> " => Validated.") $ do
out <- cli (["script", "validate", levelStr level, script]) ""
out `shouldBe` "Validated.\n"
specScriptNotValidated :: ErrValidateScript -> ValidationLevel -> String -> SpecWith ()
specScriptNotValidated errMsg level script = it (script <> " => " <> show errMsg) $ do
(out, err) <- cli (["script", "validate", levelStr level, script]) ""
out `shouldBe` ("" :: String)
err `shouldContain` prettyErrValidateScript errMsg
verKeyH1 :: String
verKeyH1 = "addr_shared_vkh1zxt0uvrza94h3hv4jpv0ttddgnwkvdgeyq8jf9w30mcs6y8w3nq"
verKeyH2 :: String
verKeyH2 = "addr_shared_vkh1y3zl4nqgm96ankt96dsdhc86vd5geny0wr7hu8cpzdfcqskq2cp"
verKeyH3 :: String
verKeyH3 = "addr_shared_vkh175wsm9ckhm3snwcsn72543yguxeuqm7v9r6kl6gx57h8gdydcd9"
verKeyH4 :: String
verKeyH4 = "stake_shared_vkh1nqc00hvlc6cq0sfhretk0rmzw8dywmusp8retuqnnxzajtzhjg5"
| null | https://raw.githubusercontent.com/input-output-hk/cardano-addresses/27eed933b67064542879729cb8a34b8a4ae69ed2/command-line/test/Command/Script/ValidationSpec.hs | haskell | # LANGUAGE FlexibleContexts #
# LANGUAGE LambdaCase #
# LANGUAGE QuasiQuotes #
module Command.Script.ValidationSpec
( spec
) where
import Prelude
import Cardano.Address.Script
( ErrRecommendedValidateScript (..)
, ErrValidateScript (..)
, ValidationLevel (..)
, prettyErrValidateScript
)
import Data.String.Interpolate
( iii )
import Test.Hspec
( Spec, SpecWith, it, shouldBe, shouldContain )
import Test.Utils
( cli, describeCmd )
spec :: Spec
spec = do
describeCmd [ "script", "validate"] $ do
specScriptValidated RequiredValidation
[iii|#{verKeyH1}|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, #{verKeyH3} ]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, active_from 10, active_until 25]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, active_from 6, active_until 15]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, #{verKeyH2}, active_from 6, active_until 25]|]
specScriptValidated RequiredValidation
[iii|all []|]
specScriptNotValidated Malformed RequiredValidation
[iii|any [ #{verKeyH1}, #{verKeyH2}, active_from a]|]
specScriptNotValidated (NotRecommended EmptyList) RecommendedValidation
[iii|all []|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ active_from 11, active_until 25]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, active_from 11, active_until 25]|]
specScriptNotValidated (NotRecommended ListTooSmall) RecommendedValidation
[iii|at_least 2 [ #{verKeyH1}, active_from 11, active_until 25]|]
specScriptValidated RequiredValidation
[iii|at_least 2 [ #{verKeyH1}, active_from 11, active_until 25, active_until 30]|]
specScriptNotValidated (NotRecommended RedundantTimelocks) RecommendedValidation
[iii|at_least 1 [ #{verKeyH1}, active_from 11, active_until 25, active_until 30]|]
specScriptValidated RequiredValidation
[iii|any [ #{verKeyH1}, #{verKeyH2}, #{verKeyH2}]|]
specScriptNotValidated (NotRecommended DuplicateSignatures) RecommendedValidation
[iii|any [ #{verKeyH1}, #{verKeyH2}, #{verKeyH2}]|]
specScriptValidated RequiredValidation
[iii|at_least 0 [ #{verKeyH1}, #{verKeyH2} ]|]
specScriptNotValidated (NotRecommended MZero) RecommendedValidation
[iii|at_least 0 [ #{verKeyH1}, #{verKeyH2} ]|]
specScriptNotValidated NotUniformKeyType RequiredValidation
[iii|any [ #{verKeyH1}, #{verKeyH4}]|]
specScriptNotValidated NotUniformKeyType RecommendedValidation
[iii|at_least 1 [ #{verKeyH1}, #{verKeyH4} ]|]
levelStr :: ValidationLevel -> String
levelStr = \case
RequiredValidation -> "--required"
RecommendedValidation -> "--recommended"
specScriptValidated :: ValidationLevel -> String -> SpecWith ()
specScriptValidated level script = it (script <> " => Validated.") $ do
out <- cli (["script", "validate", levelStr level, script]) ""
out `shouldBe` "Validated.\n"
specScriptNotValidated :: ErrValidateScript -> ValidationLevel -> String -> SpecWith ()
specScriptNotValidated errMsg level script = it (script <> " => " <> show errMsg) $ do
(out, err) <- cli (["script", "validate", levelStr level, script]) ""
out `shouldBe` ("" :: String)
err `shouldContain` prettyErrValidateScript errMsg
verKeyH1 :: String
verKeyH1 = "addr_shared_vkh1zxt0uvrza94h3hv4jpv0ttddgnwkvdgeyq8jf9w30mcs6y8w3nq"
verKeyH2 :: String
verKeyH2 = "addr_shared_vkh1y3zl4nqgm96ankt96dsdhc86vd5geny0wr7hu8cpzdfcqskq2cp"
verKeyH3 :: String
verKeyH3 = "addr_shared_vkh175wsm9ckhm3snwcsn72543yguxeuqm7v9r6kl6gx57h8gdydcd9"
verKeyH4 :: String
verKeyH4 = "stake_shared_vkh1nqc00hvlc6cq0sfhretk0rmzw8dywmusp8retuqnnxzajtzhjg5"
| |
d7fb1609ca702c32ce9f4b28cab421a069c7d660e61c35224c1779812f2ead1e | yesodweb/path-pieces | main.hs | {-# Language ScopedTypeVariables #-}
# OPTIONS_GHC -fno - warn - orphans #
module Main where
import Test.Hspec
import Test.Hspec.QuickCheck(prop)
import Test.QuickCheck
import Web.PathPieces
import qualified Data.Text as T
import Data.Maybe (fromJust)
import FileLocation ( debug )
instance Arbitrary T.Text where
arbitrary = fmap T.pack arbitrary
main :: IO ()
main = hspec spec
spec :: Spec
spec = do
describe "PathPiece" $ do
prop "toPathPiece <=> fromPathPiece String" $ \(p::String) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> null p
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Text" $ \(p::T.Text) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> T.null p
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Int" $ \(p::Int) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> False
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Bool" $ \(p::Bool) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> False
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Maybe String" $ \(p::Maybe String) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> False
Just pConverted -> p == pConverted
describe "PathMultiPiece" $ do
prop "toPathMultiPiece <=> fromPathMultiPiece String" $ \(p::[String]) ->
p == (fromJust . fromPathMultiPiece . toPathMultiPiece) p
prop "toPathMultiPiece <=> fromPathMultiPiece Text" $ \(p::[T.Text]) ->
p == (fromJust . fromPathMultiPiece . toPathMultiPiece) p
it "bad ints are rejected" $ fromPathPiece (T.pack "123hello")
`shouldBe` (Nothing :: Maybe Int)
| null | https://raw.githubusercontent.com/yesodweb/path-pieces/d6052afa5f6b26c127dbb8361e5042d73529a2a9/test/main.hs | haskell | # Language ScopedTypeVariables # | # OPTIONS_GHC -fno - warn - orphans #
module Main where
import Test.Hspec
import Test.Hspec.QuickCheck(prop)
import Test.QuickCheck
import Web.PathPieces
import qualified Data.Text as T
import Data.Maybe (fromJust)
import FileLocation ( debug )
instance Arbitrary T.Text where
arbitrary = fmap T.pack arbitrary
main :: IO ()
main = hspec spec
spec :: Spec
spec = do
describe "PathPiece" $ do
prop "toPathPiece <=> fromPathPiece String" $ \(p::String) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> null p
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Text" $ \(p::T.Text) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> T.null p
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Int" $ \(p::Int) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> False
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Bool" $ \(p::Bool) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> False
Just pConverted -> p == pConverted
prop "toPathPiece <=> fromPathPiece Maybe String" $ \(p::Maybe String) ->
case (fromPathPiece . toPathPiece) p of
Nothing -> False
Just pConverted -> p == pConverted
describe "PathMultiPiece" $ do
prop "toPathMultiPiece <=> fromPathMultiPiece String" $ \(p::[String]) ->
p == (fromJust . fromPathMultiPiece . toPathMultiPiece) p
prop "toPathMultiPiece <=> fromPathMultiPiece Text" $ \(p::[T.Text]) ->
p == (fromJust . fromPathMultiPiece . toPathMultiPiece) p
it "bad ints are rejected" $ fromPathPiece (T.pack "123hello")
`shouldBe` (Nothing :: Maybe Int)
|
dd4ba69c2e4e7c39b3da80d80fb51823e100ea0c7d0abd1bd56efbfd77c2152f | aeolus-project/zephyrus | json_v1_v.ml | Auto - generated from " json_v1.atd "
(** Type definition for syntax version. *)
(** Type definitions for naming. *)
type version = Json_versions_t.version
type component_type_name = Json_v1_t.component_type_name
type port_name = Json_v1_t.port_name
type component_name = Json_v1_t.component_name
type package_name = Json_v1_t.package_name
type repository_name = Json_v1_t.repository_name
type location_name = Json_v1_t.location_name
* Type definitions for Universe .
type resource_name = Json_v1_t.resource_name
type provide_arity = Json_v1_t.provide_arity
type require_arity = Json_v1_t.require_arity
type resource_consumption = Json_v1_t.resource_consumption
type resource_provide_arity = Json_v1_t.resource_provide_arity
type component_type = Json_v1_t.component_type = {
atd name
atd provide
atd require
component_type_conflict (*atd conflict *): port_name list;
atd consume
(resource_name * resource_consumption) list
}
type component_types = Json_v1_t.component_types
type package = Json_v1_t.package = {
atd name
atd depend
package_conflict (*atd conflict *): package_name list;
atd consume
(resource_name * resource_consumption) list
}
type packages = Json_v1_t.packages
type repository = Json_v1_t.repository = {
atd name
atd packages
}
type repositories = Json_v1_t.repositories
type package_names = Json_v1_t.package_names
(** Type definitions for Configuration. *)
type universe = Json_v1_t.universe = {
atd version
atd component_types
universe_implementation (*atd implementation *):
(component_type_name * package_names) list;
universe_repositories (*atd repositories *): repositories
}
type resources_provided = Json_v1_t.resources_provided
type location_cost = Json_v1_t.location_cost
type location = Json_v1_t.location = {
atd name
location_provide_resources (*atd provide_resources *): resources_provided;
atd repository
location_packages_installed (*atd packages_installed *): package_name list;
location_cost (*atd cost *): location_cost
}
type component = Json_v1_t.component = {
atd name
component_type (*atd component_type_workaround *): component_type_name;
atd location
}
type binding = Json_v1_t.binding = {
atd port
binding_requirer (*atd requirer *): component_name;
binding_provider (*atd provider *): component_name
}
type configuration = Json_v1_t.configuration = {
atd version
configuration_locations (*atd locations *): location list;
atd components
configuration_bindings (*atd bindings *): binding list
}
let validate_version = (
Json_versions_v.validate_version
)
let validate_component_type_name = (
(fun _ _ -> None)
)
let validate_port_name = (
(fun _ _ -> None)
)
let validate_component_name = (
(fun _ _ -> None)
)
let validate_package_name = (
(fun _ _ -> None)
)
let validate_repository_name = (
(fun _ _ -> None)
)
let validate_location_name = (
(fun _ _ -> None)
)
let validate_resource_name = (
(fun _ _ -> None)
)
let validate_provide_arity = (
(fun _ _ -> None)
)
let validate_require_arity = (
(fun _ _ -> None)
)
let validate_resource_consumption = (
(fun _ _ -> None)
)
let validate_resource_provide_arity = (
(fun _ _ -> None)
)
let validate__1 = (
fun _ _ -> None
)
let validate__2 = (
fun _ _ -> None
)
let validate__3 = (
fun _ _ -> None
)
let validate__4 = (
fun _ _ -> None
)
let validate_component_type = (
fun _ _ -> None
)
let validate__5 = (
fun _ _ -> None
)
let validate_component_types = (
validate__5
)
let validate__6 = (
fun _ _ -> None
)
let validate__7 = (
fun _ _ -> None
)
let validate_package = (
fun _ _ -> None
)
let validate__8 = (
fun _ _ -> None
)
let validate_packages = (
validate__8
)
let validate_repository = (
fun _ _ -> None
)
let validate__9 = (
fun _ _ -> None
)
let validate_repositories = (
validate__9
)
let validate__10 = (
fun _ _ -> None
)
let validate_package_names = (
validate__10
)
let validate__11 = (
fun _ _ -> None
)
let validate_universe = (
fun path x ->
(
validate_version
) (`Field "universe_version" :: path) x.universe_version
)
let validate__12 = (
fun _ _ -> None
)
let validate_resources_provided = (
validate__12
)
let validate_location_cost = (
(fun _ _ -> None)
)
let validate_location = (
fun _ _ -> None
)
let validate_component = (
fun _ _ -> None
)
let validate_binding = (
fun _ _ -> None
)
let validate__13 = (
fun _ _ -> None
)
let validate__14 = (
fun _ _ -> None
)
let validate__15 = (
fun _ _ -> None
)
let validate_configuration = (
fun path x ->
(
validate_version
) (`Field "configuration_version" :: path) x.configuration_version
)
let create_component_type
~component_type_name
?(component_type_provide = [])
?(component_type_require = [])
?(component_type_conflict = [])
?(component_type_consume = [])
() =
{
component_type_name = component_type_name;
component_type_provide = component_type_provide;
component_type_require = component_type_require;
component_type_conflict = component_type_conflict;
component_type_consume = component_type_consume;
}
let create_package
~package_name
?(package_depend = [])
?(package_conflict = [])
?(package_consume = [])
() =
{
package_name = package_name;
package_depend = package_depend;
package_conflict = package_conflict;
package_consume = package_consume;
}
let create_repository
~repository_name
?(repository_packages = [])
() =
{
repository_name = repository_name;
repository_packages = repository_packages;
}
let create_universe
~universe_version
?(universe_component_types = [])
?(universe_implementation = [])
?(universe_repositories = [])
() =
{
universe_version = universe_version;
universe_component_types = universe_component_types;
universe_implementation = universe_implementation;
universe_repositories = universe_repositories;
}
let create_location
~location_name
?(location_provide_resources = [])
~location_repository
?(location_packages_installed = [])
?(location_cost = 1)
() =
{
location_name = location_name;
location_provide_resources = location_provide_resources;
location_repository = location_repository;
location_packages_installed = location_packages_installed;
location_cost = location_cost;
}
let create_component
~component_name
~component_type
~component_location
() =
{
component_name = component_name;
component_type = component_type;
component_location = component_location;
}
let create_binding
~binding_port
~binding_requirer
~binding_provider
() =
{
binding_port = binding_port;
binding_requirer = binding_requirer;
binding_provider = binding_provider;
}
let create_configuration
~configuration_version
?(configuration_locations = [])
?(configuration_components = [])
?(configuration_bindings = [])
() =
{
configuration_version = configuration_version;
configuration_locations = configuration_locations;
configuration_components = configuration_components;
configuration_bindings = configuration_bindings;
}
| null | https://raw.githubusercontent.com/aeolus-project/zephyrus/0b52de4038bbab724e6a9628430165a7f09f77ae/src/atd/json_v1_v.ml | ocaml | * Type definition for syntax version.
* Type definitions for naming.
atd conflict
atd conflict
* Type definitions for Configuration.
atd implementation
atd repositories
atd provide_resources
atd packages_installed
atd cost
atd component_type_workaround
atd requirer
atd provider
atd locations
atd bindings | Auto - generated from " json_v1.atd "
type version = Json_versions_t.version
type component_type_name = Json_v1_t.component_type_name
type port_name = Json_v1_t.port_name
type component_name = Json_v1_t.component_name
type package_name = Json_v1_t.package_name
type repository_name = Json_v1_t.repository_name
type location_name = Json_v1_t.location_name
* Type definitions for Universe .
type resource_name = Json_v1_t.resource_name
type provide_arity = Json_v1_t.provide_arity
type require_arity = Json_v1_t.require_arity
type resource_consumption = Json_v1_t.resource_consumption
type resource_provide_arity = Json_v1_t.resource_provide_arity
type component_type = Json_v1_t.component_type = {
atd name
atd provide
atd require
atd consume
(resource_name * resource_consumption) list
}
type component_types = Json_v1_t.component_types
type package = Json_v1_t.package = {
atd name
atd depend
atd consume
(resource_name * resource_consumption) list
}
type packages = Json_v1_t.packages
type repository = Json_v1_t.repository = {
atd name
atd packages
}
type repositories = Json_v1_t.repositories
type package_names = Json_v1_t.package_names
type universe = Json_v1_t.universe = {
atd version
atd component_types
(component_type_name * package_names) list;
}
type resources_provided = Json_v1_t.resources_provided
type location_cost = Json_v1_t.location_cost
type location = Json_v1_t.location = {
atd name
atd repository
}
type component = Json_v1_t.component = {
atd name
atd location
}
type binding = Json_v1_t.binding = {
atd port
}
type configuration = Json_v1_t.configuration = {
atd version
atd components
}
let validate_version = (
Json_versions_v.validate_version
)
let validate_component_type_name = (
(fun _ _ -> None)
)
let validate_port_name = (
(fun _ _ -> None)
)
let validate_component_name = (
(fun _ _ -> None)
)
let validate_package_name = (
(fun _ _ -> None)
)
let validate_repository_name = (
(fun _ _ -> None)
)
let validate_location_name = (
(fun _ _ -> None)
)
let validate_resource_name = (
(fun _ _ -> None)
)
let validate_provide_arity = (
(fun _ _ -> None)
)
let validate_require_arity = (
(fun _ _ -> None)
)
let validate_resource_consumption = (
(fun _ _ -> None)
)
let validate_resource_provide_arity = (
(fun _ _ -> None)
)
let validate__1 = (
fun _ _ -> None
)
let validate__2 = (
fun _ _ -> None
)
let validate__3 = (
fun _ _ -> None
)
let validate__4 = (
fun _ _ -> None
)
let validate_component_type = (
fun _ _ -> None
)
let validate__5 = (
fun _ _ -> None
)
let validate_component_types = (
validate__5
)
let validate__6 = (
fun _ _ -> None
)
let validate__7 = (
fun _ _ -> None
)
let validate_package = (
fun _ _ -> None
)
let validate__8 = (
fun _ _ -> None
)
let validate_packages = (
validate__8
)
let validate_repository = (
fun _ _ -> None
)
let validate__9 = (
fun _ _ -> None
)
let validate_repositories = (
validate__9
)
let validate__10 = (
fun _ _ -> None
)
let validate_package_names = (
validate__10
)
let validate__11 = (
fun _ _ -> None
)
let validate_universe = (
fun path x ->
(
validate_version
) (`Field "universe_version" :: path) x.universe_version
)
let validate__12 = (
fun _ _ -> None
)
let validate_resources_provided = (
validate__12
)
let validate_location_cost = (
(fun _ _ -> None)
)
let validate_location = (
fun _ _ -> None
)
let validate_component = (
fun _ _ -> None
)
let validate_binding = (
fun _ _ -> None
)
let validate__13 = (
fun _ _ -> None
)
let validate__14 = (
fun _ _ -> None
)
let validate__15 = (
fun _ _ -> None
)
let validate_configuration = (
fun path x ->
(
validate_version
) (`Field "configuration_version" :: path) x.configuration_version
)
let create_component_type
~component_type_name
?(component_type_provide = [])
?(component_type_require = [])
?(component_type_conflict = [])
?(component_type_consume = [])
() =
{
component_type_name = component_type_name;
component_type_provide = component_type_provide;
component_type_require = component_type_require;
component_type_conflict = component_type_conflict;
component_type_consume = component_type_consume;
}
let create_package
~package_name
?(package_depend = [])
?(package_conflict = [])
?(package_consume = [])
() =
{
package_name = package_name;
package_depend = package_depend;
package_conflict = package_conflict;
package_consume = package_consume;
}
let create_repository
~repository_name
?(repository_packages = [])
() =
{
repository_name = repository_name;
repository_packages = repository_packages;
}
let create_universe
~universe_version
?(universe_component_types = [])
?(universe_implementation = [])
?(universe_repositories = [])
() =
{
universe_version = universe_version;
universe_component_types = universe_component_types;
universe_implementation = universe_implementation;
universe_repositories = universe_repositories;
}
let create_location
~location_name
?(location_provide_resources = [])
~location_repository
?(location_packages_installed = [])
?(location_cost = 1)
() =
{
location_name = location_name;
location_provide_resources = location_provide_resources;
location_repository = location_repository;
location_packages_installed = location_packages_installed;
location_cost = location_cost;
}
let create_component
~component_name
~component_type
~component_location
() =
{
component_name = component_name;
component_type = component_type;
component_location = component_location;
}
let create_binding
~binding_port
~binding_requirer
~binding_provider
() =
{
binding_port = binding_port;
binding_requirer = binding_requirer;
binding_provider = binding_provider;
}
let create_configuration
~configuration_version
?(configuration_locations = [])
?(configuration_components = [])
?(configuration_bindings = [])
() =
{
configuration_version = configuration_version;
configuration_locations = configuration_locations;
configuration_components = configuration_components;
configuration_bindings = configuration_bindings;
}
|
e8da853e59392b3205752d98e6443d273f3e336776772f9e7ec46ba4c04d4b53 | kolmodin/hinotify | Utils.hs | {-# LANGUAGE OverloadedStrings #-}
module Utils where
import Control.Concurrent.Chan
import Control.Exception
import qualified Data.ByteString as B
import qualified Data.ByteString.Char8 as BC8
import Data.String
import System.Directory ( removeDirectoryRecursive )
import System.Environment
import System.Exit
import System.INotify
import System.Posix.ByteString.FilePath
import System.Posix.Directory.ByteString
import System.Posix.Files.ByteString
testName :: IO RawFilePath
testName = do
n <- getProgName
return (fromString n `B.append` "-playground")
withTempDir :: (RawFilePath -> IO a) -> IO a
withTempDir f = do
path <- testName
bracket
( createDirectory path ownerModes >> return path )
( removeDirectoryRecursive . fromString . BC8.unpack )
f
withWatch :: INotify -> [EventVariety] -> RawFilePath -> (Event -> IO ()) -> IO a -> IO a
withWatch inot events path action f =
bracket
( addWatch inot events path action )
removeWatch
( const f )
inTestEnviron :: [EventVariety] -> (FilePath -> IO a) -> ([Event] -> IO b) -> IO b
inTestEnviron events action f =
withTempDir $ \testPath -> do
inot <- initINotify
chan <- newChan
withWatch inot events testPath (writeChan chan) $ do
_ <- action (fromString . BC8.unpack $ testPath)
events' <- getChanContents chan
f events'
(~=) :: Eq a => [a] -> [a] -> Bool
[] ~= _ = True
(x:xs) ~= (y:ys) = x == y && xs ~= ys
_ ~= _ = False
asMany :: [a] -> [a] -> [a]
asMany xs ys = take (length xs) ys
explainFailure :: Show a => [a] -> [a] -> String
explainFailure expected reality = unlines $
[ "Expected:" ] ++
[ "> " ++ show x | x <- expected ] ++
[ "But got:" ] ++
[ "< " ++ show x | x <- asMany expected reality ]
testFailure, testSuccess :: IO a
testFailure = exitFailure
testSuccess = exitSuccess
| null | https://raw.githubusercontent.com/kolmodin/hinotify/d225a1aacce290f054917177c17ce5f097421ec0/tests/Utils.hs | haskell | # LANGUAGE OverloadedStrings # | module Utils where
import Control.Concurrent.Chan
import Control.Exception
import qualified Data.ByteString as B
import qualified Data.ByteString.Char8 as BC8
import Data.String
import System.Directory ( removeDirectoryRecursive )
import System.Environment
import System.Exit
import System.INotify
import System.Posix.ByteString.FilePath
import System.Posix.Directory.ByteString
import System.Posix.Files.ByteString
testName :: IO RawFilePath
testName = do
n <- getProgName
return (fromString n `B.append` "-playground")
withTempDir :: (RawFilePath -> IO a) -> IO a
withTempDir f = do
path <- testName
bracket
( createDirectory path ownerModes >> return path )
( removeDirectoryRecursive . fromString . BC8.unpack )
f
withWatch :: INotify -> [EventVariety] -> RawFilePath -> (Event -> IO ()) -> IO a -> IO a
withWatch inot events path action f =
bracket
( addWatch inot events path action )
removeWatch
( const f )
inTestEnviron :: [EventVariety] -> (FilePath -> IO a) -> ([Event] -> IO b) -> IO b
inTestEnviron events action f =
withTempDir $ \testPath -> do
inot <- initINotify
chan <- newChan
withWatch inot events testPath (writeChan chan) $ do
_ <- action (fromString . BC8.unpack $ testPath)
events' <- getChanContents chan
f events'
(~=) :: Eq a => [a] -> [a] -> Bool
[] ~= _ = True
(x:xs) ~= (y:ys) = x == y && xs ~= ys
_ ~= _ = False
asMany :: [a] -> [a] -> [a]
asMany xs ys = take (length xs) ys
explainFailure :: Show a => [a] -> [a] -> String
explainFailure expected reality = unlines $
[ "Expected:" ] ++
[ "> " ++ show x | x <- expected ] ++
[ "But got:" ] ++
[ "< " ++ show x | x <- asMany expected reality ]
testFailure, testSuccess :: IO a
testFailure = exitFailure
testSuccess = exitSuccess
|
75149fe65a4d6ce0c23eeac7d0e9d0582ed20050b539dbe391d686181bc77f24 | haskell-servant/example-servant-minimal | App.hs | # LANGUAGE DataKinds #
# LANGUAGE DeriveGeneric #
# LANGUAGE LambdaCase #
# LANGUAGE TypeOperators #
module App where
import Data.Aeson
import GHC.Generics
import Network.Wai
import Network.Wai.Handler.Warp
import Servant
import System.IO
-- * api
type ItemApi =
"item" :> Get '[JSON] [Item] :<|>
"item" :> Capture "itemId" Integer :> Get '[JSON] Item
itemApi :: Proxy ItemApi
itemApi = Proxy
-- * app
run :: IO ()
run = do
let port = 3000
settings =
setPort port $
setBeforeMainLoop (hPutStrLn stderr ("listening on port " ++ show port)) $
defaultSettings
runSettings settings =<< mkApp
mkApp :: IO Application
mkApp = return $ serve itemApi server
server :: Server ItemApi
server =
getItems :<|>
getItemById
getItems :: Handler [Item]
getItems = return [exampleItem]
getItemById :: Integer -> Handler Item
getItemById = \ case
0 -> return exampleItem
_ -> throwError err404
exampleItem :: Item
exampleItem = Item 0 "example item"
-- * item
data Item
= Item {
itemId :: Integer,
itemText :: String
}
deriving (Eq, Show, Generic)
instance ToJSON Item
instance FromJSON Item
data a + b = Foo a b
type X = Int + Bool
| null | https://raw.githubusercontent.com/haskell-servant/example-servant-minimal/9df20dd272dd0b6ed73191b62afc489b6d30f95e/src/App.hs | haskell | * api
* app
* item | # LANGUAGE DataKinds #
# LANGUAGE DeriveGeneric #
# LANGUAGE LambdaCase #
# LANGUAGE TypeOperators #
module App where
import Data.Aeson
import GHC.Generics
import Network.Wai
import Network.Wai.Handler.Warp
import Servant
import System.IO
type ItemApi =
"item" :> Get '[JSON] [Item] :<|>
"item" :> Capture "itemId" Integer :> Get '[JSON] Item
itemApi :: Proxy ItemApi
itemApi = Proxy
run :: IO ()
run = do
let port = 3000
settings =
setPort port $
setBeforeMainLoop (hPutStrLn stderr ("listening on port " ++ show port)) $
defaultSettings
runSettings settings =<< mkApp
mkApp :: IO Application
mkApp = return $ serve itemApi server
server :: Server ItemApi
server =
getItems :<|>
getItemById
getItems :: Handler [Item]
getItems = return [exampleItem]
getItemById :: Integer -> Handler Item
getItemById = \ case
0 -> return exampleItem
_ -> throwError err404
exampleItem :: Item
exampleItem = Item 0 "example item"
data Item
= Item {
itemId :: Integer,
itemText :: String
}
deriving (Eq, Show, Generic)
instance ToJSON Item
instance FromJSON Item
data a + b = Foo a b
type X = Int + Bool
|
00ec36267bc5eca45f42880979cbb404945d874c3dcd886d8b33f23db64e8796 | wdebeaum/step | pajamas.lisp | ;;;;
;;;; w::pajamas
;;;;
(define-words :pos W::n
:words (
(w::pajamas
(senses
((LF-PARENT ONT::attire)
(meta-data :origin caloy3 :entry-date 20070330 :change-date nil :comments nil)
;(TEMPL mass-PRED-TEMPL)
(TEMPL COUNT-PRED-3p-TEMPL)
(syntax (W::morph (:forms (-none))))
)
)
)
))
| null | https://raw.githubusercontent.com/wdebeaum/step/f38c07d9cd3a58d0e0183159d4445de9a0eafe26/src/LexiconManager/Data/new/pajamas.lisp | lisp |
w::pajamas
(TEMPL mass-PRED-TEMPL) |
(define-words :pos W::n
:words (
(w::pajamas
(senses
((LF-PARENT ONT::attire)
(meta-data :origin caloy3 :entry-date 20070330 :change-date nil :comments nil)
(TEMPL COUNT-PRED-3p-TEMPL)
(syntax (W::morph (:forms (-none))))
)
)
)
))
|
8d2ff9cbb7bb79a23b0a5db5c6f189934849d4fd0dae897b92955f33905d435b | ocaml-multicore/parafuzz | functors.ml | (* TEST
* setup-ocamlc.byte-build-env
** ocamlc.byte
flags = "-dlambda -dno-unique-ids"
*** check-ocamlc.byte-output
*)
module type S = sig
val foo : int -> int
end
module O (X : S) = struct
let cow x = X.foo x
let sheep x = 1 + cow x
end [@@inline always]
module F (X : S) (Y : S) = struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
module type S1 = sig
val bar : int -> int
val foo : int -> int
end
module type T = sig
val sheep : int -> int
end
module F1 (X : S) (Y : S) : T = struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
module F2 : S1 -> S1 -> T = functor (X : S) -> functor (Y : S) -> struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
module M : sig
module F (X : S1) (Y : S1) : T
end = struct
module F (X : S) (Y : S) = struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
end
| null | https://raw.githubusercontent.com/ocaml-multicore/parafuzz/6a92906f1ba03287ffcb433063bded831a644fd5/testsuite/tests/functors/functors.ml | ocaml | TEST
* setup-ocamlc.byte-build-env
** ocamlc.byte
flags = "-dlambda -dno-unique-ids"
*** check-ocamlc.byte-output
|
module type S = sig
val foo : int -> int
end
module O (X : S) = struct
let cow x = X.foo x
let sheep x = 1 + cow x
end [@@inline always]
module F (X : S) (Y : S) = struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
module type S1 = sig
val bar : int -> int
val foo : int -> int
end
module type T = sig
val sheep : int -> int
end
module F1 (X : S) (Y : S) : T = struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
module F2 : S1 -> S1 -> T = functor (X : S) -> functor (Y : S) -> struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
module M : sig
module F (X : S1) (Y : S1) : T
end = struct
module F (X : S) (Y : S) = struct
let cow x = Y.foo (X.foo x)
let sheep x = 1 + cow x
end [@@inline always]
end
|
ab1add92f9937a77133cb170ecd0717974289e5ad9c48eae3eef7805080bf5d0 | strymonas/strymonas-ocaml | benchmark_streaming.ml |
# require " streaming " ; ;
open Streaming ; ;
let st = Source.array [ |1;2;3;4;5;6;7;8;9;10| ] in
Stream.from ( Source.zip st st )
| > Stream.fold ( fun ( z1,z2 ) ( ) - > ( z1+x1 , z2+x2 ) ) ( 0,0 )
- -labs.github.io/streaming/streaming/index.html#what's-the-difference-between-sources-and-streams ?
" In general , streams offer better performance than sources for
the most common operations ( including concatenation ) and offer integration with
sinks and flows . On the other hand , sources are easier to create , and support zipping . "
- -labs.github.io/streaming/streaming/Streaming/Stream/index.html
" Streams are built to be compatible with sources , sinks and flows .
To create a stream that produces all elements from a source use
Stream.from , to consume a stream with a sink use Stream.into and
to transform stream elements with a flow use Stream.via .
For more sophisticated pipelines that might have source leftovers ,
run can be used . "
In conclusion , streaming can not zip a nested stream .
#require "streaming";;
open Streaming;;
let st = Source.array [|1;2;3;4;5;6;7;8;9;10|] in
Stream.from (Source.zip st st)
|> Stream.fold (fun (z1,z2) (x1,x2) -> (z1+x1, z2+x2) ) (0,0)
- -labs.github.io/streaming/streaming/index.html#what's-the-difference-between-sources-and-streams?
"In general, streams offer better performance than sources for
the most common operations (including concatenation) and offer integration with
sinks and flows. On the other hand, sources are easier to create, and support zipping."
- -labs.github.io/streaming/streaming/Streaming/Stream/index.html
"Streams are built to be compatible with sources, sinks and flows.
To create a stream that produces all elements from a source use
Stream.from, to consume a stream with a sink use Stream.into and
to transform stream elements with a flow use Stream.via.
For more sophisticated pipelines that might have source leftovers,
run can be used."
In conclusion, streaming cannot zip a nested stream.
*)
module Streaming_intf = struct
open Streaming.Stream
type 'a cde = 'a code
type 'a stream_raw = 'a t
type 'a stream = 'a t cde
let lift_tr1 : (('a -> 'b ) -> 'a stream_raw -> 'c stream_raw) cde
-> ('a cde -> 'b cde) -> 'a stream -> 'c stream =
fun tr f st -> .<.~tr (fun x -> .~(f .<x>.)) .~st>.
let lift_tr2 : (('a -> 'b -> 'c) -> ('a stream_raw -> 'b stream_raw -> 'c stream_raw) )cde
-> ('a cde -> 'b cde -> 'c cde) -> 'a stream -> 'b stream -> 'c stream =
fun tr f st1 st2 -> .<.~tr (fun x y -> .~(f .<x>. .<y>.)) .~st1 .~st2>.
let of_arr : 'a array cde -> 'a stream = fun x -> .<of_array .~x>.
let fold : ('z cde -> 'a cde -> 'z cde) -> 'z cde -> 'a stream -> 'z cde =
fun f z st -> .<fold (fun z x -> .~(f .<z>. .<x>.)) .~z .~st>.
let map : ('a cde -> 'b cde) -> 'a stream -> 'b stream =
fun f st -> lift_tr1 .<map>. f st
let flat_map : ('a cde -> 'b stream) -> 'a stream -> 'b stream =
fun f st -> lift_tr1 .<flat_map>. f st
let filter : ('a cde -> bool cde) -> 'a stream -> 'a stream =
fun f st -> lift_tr1 .<filter>. f st
let take : int cde -> 'a stream -> 'a stream =
fun n st -> .<take .~n .~st>.
let zip_with : ('a cde -> 'b cde -> 'c cde) -> ('a stream -> 'b stream -> 'c stream) =
fun f st1 st2 -> failwith "unusable"
type byte = int
let byte_max = 255
let decode = fun st -> failwith "unusable"
end
module Benchmark_streaming = struct
open Benchmark_types
open Benchmark
open Streaming_intf
module C = Benchmark_abstract.CodeBasic
open Benchmark_abstract.Benchmark(C)(Streaming_intf)
open Streaming
let of_arr arr = .<Source.array .~arr>.
let map f st = .<Stream.from (Source.map (fun x -> .~(f .<x>.)) .~st)>.
let filter f st = .<Source.filter (fun x -> .~(f .<x>.)) .~st>.
let zip_with f st1 st2 =
.<Stream.from (Source.zip_with (fun x y -> .~(f .<x>. .<y>.)) .~st1 .~st2)>.
let dotProduct : (int array code * int array code) -> int code
= C.to_code2 @@ fun (arr1, arr2) ->
zip_with C.( * ) (of_arr arr1) (of_arr arr2)
|> sum_int
let flatMap_after_zipWith : (int array code * int array code) -> int code
= C.to_code2 @@ fun (arr1, arr2) ->
zip_with C.( + ) (of_arr arr1) (of_arr arr1)
|> flat_map (fun x -> of_arr arr2 |> map C.(fun el -> el + x))
|> sum_int
let zip_filter_filter :(int array code * int array code) -> int code
= C.to_code2 @@ fun (arr1, arr2) ->
zip_with C.( + )
(of_arr arr1 |> filter C.(fun x -> x > int 7))
(of_arr arr2 |> filter C.(fun x -> x > int 5))
|> sum_int
(* Arrays used for benchmarking *)
let v = .< Array.init 100_000_000 (fun i -> i mod 10) >.;;
let vHi = .< Array.init 10_000_000 (fun i -> i mod 10) >.;;
let vLo = .< Array.init 10 (fun i -> i mod 10) >.;;
let vFaZ = .< Array.init 10_000 (fun i -> i) >.;;
let vZaF = .< Array.init 10_000_000 (fun i -> i) >.;;
let options = {
repetitions = 20;
final_f = (fun _ -> .<()>.);
}
let pr_int = {options with
final_f = fun x -> .<Printf.printf ""; Printf.printf "Result %d\n" .~x>.}
let check_int n = {options with
final_f = fun x -> .<Printf.printf ""; assert (.~x = n) >.}
let script =[|
perfS "sum_streaming" v sum options;
perfS "sumOfSquares_streaming" v sumOfSquares options;
perfS "sumOfSquaresEven_streaming" v sumOfSquaresEven options;
perfS "mapsMegamorphic_streaming" v maps options;
perfS "filtersMegamorphic_streaming" v filters options;
perfS2 "cart_streaming" vHi vLo cart options;
perfS2 "dotProduct_streaming" vHi vHi dotProduct options;
perfS2 "flatMapAfterZip_streaming" vFaZ vFaZ
flatMap_after_zipWith options;
(* perfS2 "zipAfterFlatMap_streaming" vZaF vZaF
zipWith_after_flatMap options; *)
perfS2 "flatMapTake_streaming" vHi vLo
flat_map_take options;
perfS2 "zipFilterFilter_streaming" v vHi
zip_filter_filter options;
perfS2 " zipFlatMapFlatMap_streaming " v vLo
zip_flat_flat options ;
zip_flat_flat options; *)
(* perfS2 "runLengthDecoding_streaming" v v
decoding options; *)
|];;
let test = .<
print_endline "Last checked: Jun 2, 2022";
assert (.~(sum v) == 450000000);
assert (.~(sumOfSquares v) == 2850000000);
assert (.~(sumOfSquaresEven v) == 1200000000);
assert (.~(maps v) == 2268000000000);
assert (.~(filters v) == 170000000);
assert (.~(cart (vHi, vLo)) == 2025000000);
assert (.~(dotProduct (vHi, vHi)) == 285000000);
assert (.~(flatMap_after_zipWith (vFaZ, vFaZ)) == 1499850000000);
assert ( .~(zipWith_after_flatMap ( vZaF , vZaF ) ) = = 99999990000000 ) ;
assert (.~(flat_map_take (vHi, vLo)) == 405000000);
assert (.~(zip_filter_filter (v, vHi)) == 64000000);
assert ( .~(zip_flat_flat ( v , ) ) = = 3250000000 ) ;
assert ( .~(decoding ( v , v ) ) = = 100000000 ) ;
print_endline "All done"
>.
end;;
module M = Benchmark_streaming
let main () =
let compiler = "ocamlfind ocamlopt -O2 -unsafe -nodynlink -package streaming -linkpkg util.cmx" in
match Sys.argv with
| [|_;"test"|] ->
Benchmark.run_natively M.test
~compiler
(* ~save:true *)
| _ ->
Benchmark.run_script M.script
~compiler
let _ = main ()
| null | https://raw.githubusercontent.com/strymonas/strymonas-ocaml/b45ab87c62e5bb845ff4a989064f30ed6b468f6d/benchmarks/benchmark_streaming.ml | ocaml | Arrays used for benchmarking
perfS2 "zipAfterFlatMap_streaming" vZaF vZaF
zipWith_after_flatMap options;
perfS2 "runLengthDecoding_streaming" v v
decoding options;
~save:true |
# require " streaming " ; ;
open Streaming ; ;
let st = Source.array [ |1;2;3;4;5;6;7;8;9;10| ] in
Stream.from ( Source.zip st st )
| > Stream.fold ( fun ( z1,z2 ) ( ) - > ( z1+x1 , z2+x2 ) ) ( 0,0 )
- -labs.github.io/streaming/streaming/index.html#what's-the-difference-between-sources-and-streams ?
" In general , streams offer better performance than sources for
the most common operations ( including concatenation ) and offer integration with
sinks and flows . On the other hand , sources are easier to create , and support zipping . "
- -labs.github.io/streaming/streaming/Streaming/Stream/index.html
" Streams are built to be compatible with sources , sinks and flows .
To create a stream that produces all elements from a source use
Stream.from , to consume a stream with a sink use Stream.into and
to transform stream elements with a flow use Stream.via .
For more sophisticated pipelines that might have source leftovers ,
run can be used . "
In conclusion , streaming can not zip a nested stream .
#require "streaming";;
open Streaming;;
let st = Source.array [|1;2;3;4;5;6;7;8;9;10|] in
Stream.from (Source.zip st st)
|> Stream.fold (fun (z1,z2) (x1,x2) -> (z1+x1, z2+x2) ) (0,0)
- -labs.github.io/streaming/streaming/index.html#what's-the-difference-between-sources-and-streams?
"In general, streams offer better performance than sources for
the most common operations (including concatenation) and offer integration with
sinks and flows. On the other hand, sources are easier to create, and support zipping."
- -labs.github.io/streaming/streaming/Streaming/Stream/index.html
"Streams are built to be compatible with sources, sinks and flows.
To create a stream that produces all elements from a source use
Stream.from, to consume a stream with a sink use Stream.into and
to transform stream elements with a flow use Stream.via.
For more sophisticated pipelines that might have source leftovers,
run can be used."
In conclusion, streaming cannot zip a nested stream.
*)
module Streaming_intf = struct
open Streaming.Stream
type 'a cde = 'a code
type 'a stream_raw = 'a t
type 'a stream = 'a t cde
let lift_tr1 : (('a -> 'b ) -> 'a stream_raw -> 'c stream_raw) cde
-> ('a cde -> 'b cde) -> 'a stream -> 'c stream =
fun tr f st -> .<.~tr (fun x -> .~(f .<x>.)) .~st>.
let lift_tr2 : (('a -> 'b -> 'c) -> ('a stream_raw -> 'b stream_raw -> 'c stream_raw) )cde
-> ('a cde -> 'b cde -> 'c cde) -> 'a stream -> 'b stream -> 'c stream =
fun tr f st1 st2 -> .<.~tr (fun x y -> .~(f .<x>. .<y>.)) .~st1 .~st2>.
let of_arr : 'a array cde -> 'a stream = fun x -> .<of_array .~x>.
let fold : ('z cde -> 'a cde -> 'z cde) -> 'z cde -> 'a stream -> 'z cde =
fun f z st -> .<fold (fun z x -> .~(f .<z>. .<x>.)) .~z .~st>.
let map : ('a cde -> 'b cde) -> 'a stream -> 'b stream =
fun f st -> lift_tr1 .<map>. f st
let flat_map : ('a cde -> 'b stream) -> 'a stream -> 'b stream =
fun f st -> lift_tr1 .<flat_map>. f st
let filter : ('a cde -> bool cde) -> 'a stream -> 'a stream =
fun f st -> lift_tr1 .<filter>. f st
let take : int cde -> 'a stream -> 'a stream =
fun n st -> .<take .~n .~st>.
let zip_with : ('a cde -> 'b cde -> 'c cde) -> ('a stream -> 'b stream -> 'c stream) =
fun f st1 st2 -> failwith "unusable"
type byte = int
let byte_max = 255
let decode = fun st -> failwith "unusable"
end
module Benchmark_streaming = struct
open Benchmark_types
open Benchmark
open Streaming_intf
module C = Benchmark_abstract.CodeBasic
open Benchmark_abstract.Benchmark(C)(Streaming_intf)
open Streaming
let of_arr arr = .<Source.array .~arr>.
let map f st = .<Stream.from (Source.map (fun x -> .~(f .<x>.)) .~st)>.
let filter f st = .<Source.filter (fun x -> .~(f .<x>.)) .~st>.
let zip_with f st1 st2 =
.<Stream.from (Source.zip_with (fun x y -> .~(f .<x>. .<y>.)) .~st1 .~st2)>.
let dotProduct : (int array code * int array code) -> int code
= C.to_code2 @@ fun (arr1, arr2) ->
zip_with C.( * ) (of_arr arr1) (of_arr arr2)
|> sum_int
let flatMap_after_zipWith : (int array code * int array code) -> int code
= C.to_code2 @@ fun (arr1, arr2) ->
zip_with C.( + ) (of_arr arr1) (of_arr arr1)
|> flat_map (fun x -> of_arr arr2 |> map C.(fun el -> el + x))
|> sum_int
let zip_filter_filter :(int array code * int array code) -> int code
= C.to_code2 @@ fun (arr1, arr2) ->
zip_with C.( + )
(of_arr arr1 |> filter C.(fun x -> x > int 7))
(of_arr arr2 |> filter C.(fun x -> x > int 5))
|> sum_int
let v = .< Array.init 100_000_000 (fun i -> i mod 10) >.;;
let vHi = .< Array.init 10_000_000 (fun i -> i mod 10) >.;;
let vLo = .< Array.init 10 (fun i -> i mod 10) >.;;
let vFaZ = .< Array.init 10_000 (fun i -> i) >.;;
let vZaF = .< Array.init 10_000_000 (fun i -> i) >.;;
let options = {
repetitions = 20;
final_f = (fun _ -> .<()>.);
}
let pr_int = {options with
final_f = fun x -> .<Printf.printf ""; Printf.printf "Result %d\n" .~x>.}
let check_int n = {options with
final_f = fun x -> .<Printf.printf ""; assert (.~x = n) >.}
let script =[|
perfS "sum_streaming" v sum options;
perfS "sumOfSquares_streaming" v sumOfSquares options;
perfS "sumOfSquaresEven_streaming" v sumOfSquaresEven options;
perfS "mapsMegamorphic_streaming" v maps options;
perfS "filtersMegamorphic_streaming" v filters options;
perfS2 "cart_streaming" vHi vLo cart options;
perfS2 "dotProduct_streaming" vHi vHi dotProduct options;
perfS2 "flatMapAfterZip_streaming" vFaZ vFaZ
flatMap_after_zipWith options;
perfS2 "flatMapTake_streaming" vHi vLo
flat_map_take options;
perfS2 "zipFilterFilter_streaming" v vHi
zip_filter_filter options;
perfS2 " zipFlatMapFlatMap_streaming " v vLo
zip_flat_flat options ;
zip_flat_flat options; *)
|];;
let test = .<
print_endline "Last checked: Jun 2, 2022";
assert (.~(sum v) == 450000000);
assert (.~(sumOfSquares v) == 2850000000);
assert (.~(sumOfSquaresEven v) == 1200000000);
assert (.~(maps v) == 2268000000000);
assert (.~(filters v) == 170000000);
assert (.~(cart (vHi, vLo)) == 2025000000);
assert (.~(dotProduct (vHi, vHi)) == 285000000);
assert (.~(flatMap_after_zipWith (vFaZ, vFaZ)) == 1499850000000);
assert ( .~(zipWith_after_flatMap ( vZaF , vZaF ) ) = = 99999990000000 ) ;
assert (.~(flat_map_take (vHi, vLo)) == 405000000);
assert (.~(zip_filter_filter (v, vHi)) == 64000000);
assert ( .~(zip_flat_flat ( v , ) ) = = 3250000000 ) ;
assert ( .~(decoding ( v , v ) ) = = 100000000 ) ;
print_endline "All done"
>.
end;;
module M = Benchmark_streaming
let main () =
let compiler = "ocamlfind ocamlopt -O2 -unsafe -nodynlink -package streaming -linkpkg util.cmx" in
match Sys.argv with
| [|_;"test"|] ->
Benchmark.run_natively M.test
~compiler
| _ ->
Benchmark.run_script M.script
~compiler
let _ = main ()
|
bb344ee9449d7af24d00c5f0ab3b6dd44c9542c22a6ec152b906905058c70157 | hyperfiddle/electric | ui_inputs.cljc | (ns geoffrey.ui-inputs
(:require [hyperfiddle.api :as hf]))
(def set-input!
#?(:cljs (fn [!needle]
(fn [^js event]
(reset! !needle (.. event -target -value))))))
(def *inputs (volatile! {}))
(defn get-input! [input]
(get @*inputs (.-id input)))
(defn new-input! [initial-value onChange]
(let [id #?(:clj (java.util.UUID/randomUUID)
:cljs (random-uuid))
input (hf/->Input id initial-value onChange)]
#?(:cljs (vswap! *inputs assoc id input))
input))
| null | https://raw.githubusercontent.com/hyperfiddle/electric/e633dc635cf84e0a2320b664ba722b696ce0067b/scratch/geoffrey/2021/ui_inputs.cljc | clojure | (ns geoffrey.ui-inputs
(:require [hyperfiddle.api :as hf]))
(def set-input!
#?(:cljs (fn [!needle]
(fn [^js event]
(reset! !needle (.. event -target -value))))))
(def *inputs (volatile! {}))
(defn get-input! [input]
(get @*inputs (.-id input)))
(defn new-input! [initial-value onChange]
(let [id #?(:clj (java.util.UUID/randomUUID)
:cljs (random-uuid))
input (hf/->Input id initial-value onChange)]
#?(:cljs (vswap! *inputs assoc id input))
input))
| |
1e0c5acd0466fc087d2d9ccc1c2d24cdcaf20c21a8129920bf5647b9b38ee921 | PEZ/reagent-bidi-accountant-example | user.clj | (ns routing-example.user
(:use [figwheel-sidecar.repl-api :as ra]))
(defn start []
(ra/start-figwheel!)
(ra/cljs-repl "dev"))
(defn stop []
(ra/stop-figwheel!))
| null | https://raw.githubusercontent.com/PEZ/reagent-bidi-accountant-example/de0ffe9ca5eb0a83e6bedea741b4335feb0ee255/dev/routing_example/user.clj | clojure | (ns routing-example.user
(:use [figwheel-sidecar.repl-api :as ra]))
(defn start []
(ra/start-figwheel!)
(ra/cljs-repl "dev"))
(defn stop []
(ra/stop-figwheel!))
| |
d2fd03b9fb25c984719c73e87a58401c9a36f5ef97d6d1b7ca11235db26abfd9 | Kraks/MyPLZoo | stlc-infer.rkt | #lang racket
;; Type Inference for Simply Typed Lambda Calculus
< >
(require rackunit)
(require racket/set)
(require "share.rkt")
;; Expressions
(struct NumE (n) #:transparent)
(struct BoolE (b) #:transparent)
(struct IdE (id) #:transparent)
(struct PlusE (l r) #:transparent)
(struct MultE (l r) #:transparent)
(struct LamE (arg body) #:transparent)
(struct AppE (fun arg) #:transparent)
;; Types
(struct NumT () #:transparent)
(struct BoolT () #:transparent)
(struct VarT (name) #:transparent)
(struct ArrowT (arg result) #:transparent)
;; Values
(struct NumV (n) #:transparent)
(struct BoolV (b) #:transparent)
(struct ClosureV (arg body env) #:transparent)
;; Environment & Type Environment
(struct Binding (name val) #:transparent)
(define lookup (make-lookup 'lookup Binding? Binding-name Binding-val))
(define ext-env cons)
(struct TypeBinding (name type) #:transparent)
(define type-lookup (make-lookup 'type-lookup TypeBinding? TypeBinding-name TypeBinding-type))
(define ext-tenv cons)
Parsers
(define (parse s)
(match s
[(? number? x) (NumE x)]
['true (BoolE #t)]
['false (BoolE #f)]
[(? symbol? x) (IdE x)]
[`(+ ,l ,r) (PlusE (parse l) (parse r))]
[`(* ,l ,r) (MultE (parse l) (parse r))]
[`(let ([,var ,val]) ,body)
(AppE (LamE var (parse body)) (parse val))]
[`(λ (,var) ,body) (LamE var (parse body))]
[`(,fun ,arg) (AppE (parse fun) (parse arg))]
[else (error 'parse "invalid expression")]))
;; Fresh Number Generator
(define (counter)
(define count 0)
(define (inner)
(set! count (add1 count))
count)
inner)
(define fresh-n (counter))
;; Type Inference
(struct Eq (fst snd) #:transparent)
(define (type-subst in src dst)
(match in
[(NumT) in]
[(BoolT) in]
[(VarT x) (if (equal? src in) dst in)]
[(ArrowT t1 t2) (ArrowT (type-subst t1 src dst)
(type-subst t2 src dst))]))
(define (unify/subst eqs src dst)
(cond [(empty? eqs) eqs]
[else (define eq (first eqs))
(define eqfst (Eq-fst eq))
(define eqsnd (Eq-snd eq))
(cons (Eq (type-subst eqfst src dst)
(type-subst eqsnd src dst))
(unify/subst (rest eqs) src dst))]))
(define (occurs? t in)
(match in
[(NumT) #f]
[(ArrowT at rt) (or (occurs? t at) (occurs? t rt))]
[(VarT x) (equal? t in)]))
(define not-occurs? (compose not occurs?))
(define (unify-error t1 t2)
(error 'type-error "can not unify: ~a and ~a" t1 t2))
(define (unify/helper substs result)
(match substs
['() result]
[(list (Eq fst snd) rest ...)
(match* (fst snd)
[((VarT x) t)
(if (not-occurs? fst snd)
(unify/helper (unify/subst rest fst snd) (cons (Eq fst snd) result))
(unify-error fst snd))]
[(t (VarT x))
(if (not-occurs? snd fst)
(unify/helper (unify/subst rest snd fst) (cons (Eq snd fst) result))
(unify-error snd fst))]
[((ArrowT t1 t2) (ArrowT t3 t4))
(unify/helper `(,(Eq t1 t3) ,(Eq t2 t4) ,@rest) result)]
[(x x) (unify/helper rest result)]
[(_ _) (unify-error fst snd)])]))
(define (unify substs) (unify/helper (set->list substs) (list)))
(define (type-infer exp tenv const)
(match exp
[(NumE n) (values (NumT) const)]
[(BoolE b) (values (BoolT) const)]
[(PlusE l r)
(define-values (lty lconst) (type-infer l tenv (set)))
(define-values (rty rconst) (type-infer r tenv (set)))
(values (NumT)
(set-add (set-add (set-union lconst rconst) (Eq lty (NumT))) (Eq rty (NumT))))]
[(MultE l r)
(define-values (lty lconst) (type-infer l tenv (set)))
(define-values (rty rconst) (type-infer r tenv (set)))
(values (NumT)
(set-add (set-add (set-union lconst rconst) (Eq lty (NumT))) (Eq rty (NumT))))]
[(IdE x)
(values (type-lookup x tenv) const)]
[(LamE arg body)
(define new-tvar (VarT (fresh-n)))
(define-values (bty bconst)
(type-infer body (ext-tenv (TypeBinding arg new-tvar) tenv) const))
(values (ArrowT new-tvar bty) bconst)]
[(AppE fun arg)
(define-values (funty funconst) (type-infer fun tenv (set)))
(define-values (argty argconst) (type-infer arg tenv (set)))
(define new-tvar (VarT (fresh-n)))
(values new-tvar (set-add (set-union funconst argconst) (Eq funty (ArrowT argty new-tvar))))]))
(define (reify substs ty)
(define (lookup/default x sts)
(match sts
['() x]
[(list (Eq fst snd) rest ...)
(if (equal? fst x)
(lookup/default snd substs)
(lookup/default x rest))]))
(match ty
[(NumT) (NumT)]
[(BoolT) (BoolT)]
[(VarT x)
(define ans (lookup/default ty substs))
(if (ArrowT? ans) (reify substs ans) ans)]
[(ArrowT t1 t2)
(ArrowT (reify substs t1) (reify substs t2))]))
(define (typecheck exp tenv)
(set! fresh-n (counter))
(define-values (ty constraints) (type-infer exp tenv (set)))
(reify (unify constraints) ty))
;; Interpreter
(define (interp expr env)
(match expr
[(IdE x) (lookup x env)]
[(NumE n) (NumV n)]
[(BoolE b) (BoolV b)]
[(PlusE l r) (NumV (+ (NumV-n (interp l env)) (NumV-n (interp r env))))]
[(MultE l r) (NumV (* (NumV-n (interp l env)) (NumV-n (interp r env))))]
[(LamE arg body) (ClosureV arg body env)]
[(AppE fun arg)
(match (interp fun env)
[(ClosureV n body env*) (interp body (ext-env (Binding n (interp arg env)) env*))])]))
(define mt-env empty)
(define mt-tenv empty)
(define (run prog)
(define prog* (parse prog))
(typecheck prog* mt-tenv)
(interp prog* mt-env))
;; Tests
(module+ test
(check-equal? (type-subst (VarT 'x) (VarT 'x) (NumT))
(NumT))
(check-equal? (unify/subst (list (Eq (VarT 'a) (NumT))) (VarT 'a) (NumT))
(list (Eq (NumT) (NumT))))
(check-equal? (unify/subst (list (Eq (VarT 'a) (VarT 'a))) (VarT 'a) (NumT))
(list (Eq (NumT) (NumT))))
(check-equal? (unify/subst (list (Eq (VarT 'b) (VarT 'a))) (VarT 'a) (NumT))
(list (Eq (VarT 'b) (NumT))))
(check-equal? (unify/helper (list (Eq (ArrowT (VarT 't1) (VarT 't1))
(ArrowT (NumT) (VarT 't2))))
(list))
(list (Eq (VarT 't2) (NumT)) (Eq (VarT 't1) (NumT))))
(check-equal? (unify/helper (list (Eq (VarT 'a1) (ArrowT (NumT) (VarT 'a2)))
(Eq (ArrowT (VarT 'a1) (VarT 'a2))
(ArrowT (ArrowT (VarT 'a3) (VarT 'a3)) (VarT 'a4))))
(list))
(list (Eq (VarT 'a4) (NumT)) (Eq (VarT 'a2) (NumT))
(Eq (VarT 'a3) (NumT)) (Eq (VarT 'a1) (ArrowT (NumT) (VarT 'a2)))))
(check-exn exn:fail?
(λ () (unify (list (Eq (VarT 'a1) (ArrowT (VarT 'a1) (VarT 'a2)))))))
(check-values-equal? (type-infer (parse '{λ {x} {+ x 1}}) empty (set))
(values (ArrowT (VarT 1) (NumT))
(set (Eq (NumT) (NumT)) (Eq (VarT 1) (NumT)))))
(check-values-equal? (type-infer (parse '{λ {x} {λ {y} {+ x y}}}) empty (set))
(values (ArrowT (VarT 2) (ArrowT (VarT 3) (NumT)))
(set (Eq (VarT 3) (NumT)) (Eq (VarT 2) (NumT)))))
(check-values-equal? (type-infer (parse '{{λ {x} x} 1}) empty (set))
(values (VarT 5)
(set (Eq (ArrowT (VarT 4) (VarT 4)) (ArrowT (NumT) (VarT 5))))))
(check-values-equal? (type-infer (parse '{{λ {f} {f 0}} {λ {x} x}}) empty (set))
(values (VarT 9)
(set (Eq (VarT 6) (ArrowT (NumT) (VarT 7)))
(Eq (ArrowT (VarT 6) (VarT 7))
(ArrowT (ArrowT (VarT 8) (VarT 8)) (VarT 9))))))
(check-values-equal? (type-infer (parse '{λ {x} x}) empty (set))
(values (ArrowT (VarT 10) (VarT 10))
(set)))
(check-equal? (typecheck (parse '{{λ {f} {f 0}} {λ {x} x}}) mt-tenv)
(NumT))
(check-equal? (typecheck (parse '{λ {x} {λ {y} {+ x y}}}) mt-tenv)
(ArrowT (NumT) (ArrowT (NumT) (NumT))))
; λf.λu.u (f u) :: ((a -> b) -> a) -> (a -> b) -> b
(check-equal? (typecheck (parse '{λ {f} {λ {u} {u {f u}}}}) mt-tenv)
(ArrowT (ArrowT (ArrowT (VarT 3) (VarT 4)) (VarT 3))
(ArrowT (ArrowT (VarT 3) (VarT 4)) (VarT 4))))
; λx.λy.x (x y) :: (a -> a) -> a -> a
(check-equal? (typecheck (parse '{λ {x} {λ {y} {x {x y}}}}) mt-tenv)
(ArrowT (ArrowT (VarT 2) (VarT 2))
(ArrowT (VarT 2) (VarT 2))))
; λx.λy.x (y x) :: (a -> b) -> ((a -> b) -> a) -> b
(check-equal? (typecheck (parse '{λ {x} {λ {y} {x {y x}}}}) mt-tenv)
(ArrowT
(ArrowT (VarT 3) (VarT 4))
(ArrowT (ArrowT (ArrowT (VarT 3) (VarT 4)) (VarT 3))
(VarT 4))))
;; λx.λy.y (y x) :: a -> (a -> a) -> a
(check-equal? (typecheck (parse '{λ {x} {λ {y} {y {y x}}}}) mt-tenv)
(ArrowT (VarT 4) (ArrowT (ArrowT (VarT 4) (VarT 4)) (VarT 4))))
(check-equal? (run '{{{λ {x} {λ {y} {+ x y}}} 3} 7})
(NumV 10))
;; (a -> (b -> c)) -> (a -> b) -> (a -> c)
(define S '{λ {x} {λ {y} {λ {z} {{x z} {y z}}}}})
(check-equal? (typecheck (parse S) mt-tenv)
(ArrowT (ArrowT (VarT 3) (ArrowT (VarT 5) (VarT 6)))
(ArrowT (ArrowT (VarT 3) (VarT 5))
(ArrowT (VarT 3) (VarT 6)))))
;; a -> b -> a
(define K '{λ {x} {λ {y} x}})
(check-equal? (typecheck (parse K) mt-tenv)
(ArrowT (VarT 1) (ArrowT (VarT 2) (VarT 1))))
;; (a -> b) -> (a -> a)
(check-equal? (typecheck (parse `(,S ,K)) mt-tenv)
(ArrowT (ArrowT (VarT 6) (VarT 5)) (ArrowT (VarT 6) (VarT 6))))
;; a -> a
(check-equal? (typecheck (parse `((,S ,K) ,K)) mt-tenv)
(ArrowT (VarT 6) (VarT 6)))
(check-exn exn:fail? (λ () (typecheck (parse '{{λ {id} {{id id} 3}} {λ {x} x}}) mt-tenv)))
(check-exn exn:fail? (λ () (typecheck (parse '{λ {x} {λ {y} {{x y} x}}}) mt-tenv)))
(check-exn exn:fail? (λ () (run '{{λ {x} {x x}} {λ {x} {x x}}})))
(check-exn exn:fail? (λ () (run '{+ 3 true})))
)
| null | https://raw.githubusercontent.com/Kraks/MyPLZoo/60203b7be9dafde04065eadf5a17200fc360cf26/stlc-infer.rkt | racket | Type Inference for Simply Typed Lambda Calculus
Expressions
Types
Values
Environment & Type Environment
Fresh Number Generator
Type Inference
Interpreter
Tests
λf.λu.u (f u) :: ((a -> b) -> a) -> (a -> b) -> b
λx.λy.x (x y) :: (a -> a) -> a -> a
λx.λy.x (y x) :: (a -> b) -> ((a -> b) -> a) -> b
λx.λy.y (y x) :: a -> (a -> a) -> a
(a -> (b -> c)) -> (a -> b) -> (a -> c)
a -> b -> a
(a -> b) -> (a -> a)
a -> a | #lang racket
< >
(require rackunit)
(require racket/set)
(require "share.rkt")
(struct NumE (n) #:transparent)
(struct BoolE (b) #:transparent)
(struct IdE (id) #:transparent)
(struct PlusE (l r) #:transparent)
(struct MultE (l r) #:transparent)
(struct LamE (arg body) #:transparent)
(struct AppE (fun arg) #:transparent)
(struct NumT () #:transparent)
(struct BoolT () #:transparent)
(struct VarT (name) #:transparent)
(struct ArrowT (arg result) #:transparent)
(struct NumV (n) #:transparent)
(struct BoolV (b) #:transparent)
(struct ClosureV (arg body env) #:transparent)
(struct Binding (name val) #:transparent)
(define lookup (make-lookup 'lookup Binding? Binding-name Binding-val))
(define ext-env cons)
(struct TypeBinding (name type) #:transparent)
(define type-lookup (make-lookup 'type-lookup TypeBinding? TypeBinding-name TypeBinding-type))
(define ext-tenv cons)
Parsers
(define (parse s)
(match s
[(? number? x) (NumE x)]
['true (BoolE #t)]
['false (BoolE #f)]
[(? symbol? x) (IdE x)]
[`(+ ,l ,r) (PlusE (parse l) (parse r))]
[`(* ,l ,r) (MultE (parse l) (parse r))]
[`(let ([,var ,val]) ,body)
(AppE (LamE var (parse body)) (parse val))]
[`(λ (,var) ,body) (LamE var (parse body))]
[`(,fun ,arg) (AppE (parse fun) (parse arg))]
[else (error 'parse "invalid expression")]))
(define (counter)
(define count 0)
(define (inner)
(set! count (add1 count))
count)
inner)
(define fresh-n (counter))
(struct Eq (fst snd) #:transparent)
(define (type-subst in src dst)
(match in
[(NumT) in]
[(BoolT) in]
[(VarT x) (if (equal? src in) dst in)]
[(ArrowT t1 t2) (ArrowT (type-subst t1 src dst)
(type-subst t2 src dst))]))
(define (unify/subst eqs src dst)
(cond [(empty? eqs) eqs]
[else (define eq (first eqs))
(define eqfst (Eq-fst eq))
(define eqsnd (Eq-snd eq))
(cons (Eq (type-subst eqfst src dst)
(type-subst eqsnd src dst))
(unify/subst (rest eqs) src dst))]))
(define (occurs? t in)
(match in
[(NumT) #f]
[(ArrowT at rt) (or (occurs? t at) (occurs? t rt))]
[(VarT x) (equal? t in)]))
(define not-occurs? (compose not occurs?))
(define (unify-error t1 t2)
(error 'type-error "can not unify: ~a and ~a" t1 t2))
(define (unify/helper substs result)
(match substs
['() result]
[(list (Eq fst snd) rest ...)
(match* (fst snd)
[((VarT x) t)
(if (not-occurs? fst snd)
(unify/helper (unify/subst rest fst snd) (cons (Eq fst snd) result))
(unify-error fst snd))]
[(t (VarT x))
(if (not-occurs? snd fst)
(unify/helper (unify/subst rest snd fst) (cons (Eq snd fst) result))
(unify-error snd fst))]
[((ArrowT t1 t2) (ArrowT t3 t4))
(unify/helper `(,(Eq t1 t3) ,(Eq t2 t4) ,@rest) result)]
[(x x) (unify/helper rest result)]
[(_ _) (unify-error fst snd)])]))
(define (unify substs) (unify/helper (set->list substs) (list)))
(define (type-infer exp tenv const)
(match exp
[(NumE n) (values (NumT) const)]
[(BoolE b) (values (BoolT) const)]
[(PlusE l r)
(define-values (lty lconst) (type-infer l tenv (set)))
(define-values (rty rconst) (type-infer r tenv (set)))
(values (NumT)
(set-add (set-add (set-union lconst rconst) (Eq lty (NumT))) (Eq rty (NumT))))]
[(MultE l r)
(define-values (lty lconst) (type-infer l tenv (set)))
(define-values (rty rconst) (type-infer r tenv (set)))
(values (NumT)
(set-add (set-add (set-union lconst rconst) (Eq lty (NumT))) (Eq rty (NumT))))]
[(IdE x)
(values (type-lookup x tenv) const)]
[(LamE arg body)
(define new-tvar (VarT (fresh-n)))
(define-values (bty bconst)
(type-infer body (ext-tenv (TypeBinding arg new-tvar) tenv) const))
(values (ArrowT new-tvar bty) bconst)]
[(AppE fun arg)
(define-values (funty funconst) (type-infer fun tenv (set)))
(define-values (argty argconst) (type-infer arg tenv (set)))
(define new-tvar (VarT (fresh-n)))
(values new-tvar (set-add (set-union funconst argconst) (Eq funty (ArrowT argty new-tvar))))]))
(define (reify substs ty)
(define (lookup/default x sts)
(match sts
['() x]
[(list (Eq fst snd) rest ...)
(if (equal? fst x)
(lookup/default snd substs)
(lookup/default x rest))]))
(match ty
[(NumT) (NumT)]
[(BoolT) (BoolT)]
[(VarT x)
(define ans (lookup/default ty substs))
(if (ArrowT? ans) (reify substs ans) ans)]
[(ArrowT t1 t2)
(ArrowT (reify substs t1) (reify substs t2))]))
(define (typecheck exp tenv)
(set! fresh-n (counter))
(define-values (ty constraints) (type-infer exp tenv (set)))
(reify (unify constraints) ty))
(define (interp expr env)
(match expr
[(IdE x) (lookup x env)]
[(NumE n) (NumV n)]
[(BoolE b) (BoolV b)]
[(PlusE l r) (NumV (+ (NumV-n (interp l env)) (NumV-n (interp r env))))]
[(MultE l r) (NumV (* (NumV-n (interp l env)) (NumV-n (interp r env))))]
[(LamE arg body) (ClosureV arg body env)]
[(AppE fun arg)
(match (interp fun env)
[(ClosureV n body env*) (interp body (ext-env (Binding n (interp arg env)) env*))])]))
(define mt-env empty)
(define mt-tenv empty)
(define (run prog)
(define prog* (parse prog))
(typecheck prog* mt-tenv)
(interp prog* mt-env))
(module+ test
(check-equal? (type-subst (VarT 'x) (VarT 'x) (NumT))
(NumT))
(check-equal? (unify/subst (list (Eq (VarT 'a) (NumT))) (VarT 'a) (NumT))
(list (Eq (NumT) (NumT))))
(check-equal? (unify/subst (list (Eq (VarT 'a) (VarT 'a))) (VarT 'a) (NumT))
(list (Eq (NumT) (NumT))))
(check-equal? (unify/subst (list (Eq (VarT 'b) (VarT 'a))) (VarT 'a) (NumT))
(list (Eq (VarT 'b) (NumT))))
(check-equal? (unify/helper (list (Eq (ArrowT (VarT 't1) (VarT 't1))
(ArrowT (NumT) (VarT 't2))))
(list))
(list (Eq (VarT 't2) (NumT)) (Eq (VarT 't1) (NumT))))
(check-equal? (unify/helper (list (Eq (VarT 'a1) (ArrowT (NumT) (VarT 'a2)))
(Eq (ArrowT (VarT 'a1) (VarT 'a2))
(ArrowT (ArrowT (VarT 'a3) (VarT 'a3)) (VarT 'a4))))
(list))
(list (Eq (VarT 'a4) (NumT)) (Eq (VarT 'a2) (NumT))
(Eq (VarT 'a3) (NumT)) (Eq (VarT 'a1) (ArrowT (NumT) (VarT 'a2)))))
(check-exn exn:fail?
(λ () (unify (list (Eq (VarT 'a1) (ArrowT (VarT 'a1) (VarT 'a2)))))))
(check-values-equal? (type-infer (parse '{λ {x} {+ x 1}}) empty (set))
(values (ArrowT (VarT 1) (NumT))
(set (Eq (NumT) (NumT)) (Eq (VarT 1) (NumT)))))
(check-values-equal? (type-infer (parse '{λ {x} {λ {y} {+ x y}}}) empty (set))
(values (ArrowT (VarT 2) (ArrowT (VarT 3) (NumT)))
(set (Eq (VarT 3) (NumT)) (Eq (VarT 2) (NumT)))))
(check-values-equal? (type-infer (parse '{{λ {x} x} 1}) empty (set))
(values (VarT 5)
(set (Eq (ArrowT (VarT 4) (VarT 4)) (ArrowT (NumT) (VarT 5))))))
(check-values-equal? (type-infer (parse '{{λ {f} {f 0}} {λ {x} x}}) empty (set))
(values (VarT 9)
(set (Eq (VarT 6) (ArrowT (NumT) (VarT 7)))
(Eq (ArrowT (VarT 6) (VarT 7))
(ArrowT (ArrowT (VarT 8) (VarT 8)) (VarT 9))))))
(check-values-equal? (type-infer (parse '{λ {x} x}) empty (set))
(values (ArrowT (VarT 10) (VarT 10))
(set)))
(check-equal? (typecheck (parse '{{λ {f} {f 0}} {λ {x} x}}) mt-tenv)
(NumT))
(check-equal? (typecheck (parse '{λ {x} {λ {y} {+ x y}}}) mt-tenv)
(ArrowT (NumT) (ArrowT (NumT) (NumT))))
(check-equal? (typecheck (parse '{λ {f} {λ {u} {u {f u}}}}) mt-tenv)
(ArrowT (ArrowT (ArrowT (VarT 3) (VarT 4)) (VarT 3))
(ArrowT (ArrowT (VarT 3) (VarT 4)) (VarT 4))))
(check-equal? (typecheck (parse '{λ {x} {λ {y} {x {x y}}}}) mt-tenv)
(ArrowT (ArrowT (VarT 2) (VarT 2))
(ArrowT (VarT 2) (VarT 2))))
(check-equal? (typecheck (parse '{λ {x} {λ {y} {x {y x}}}}) mt-tenv)
(ArrowT
(ArrowT (VarT 3) (VarT 4))
(ArrowT (ArrowT (ArrowT (VarT 3) (VarT 4)) (VarT 3))
(VarT 4))))
(check-equal? (typecheck (parse '{λ {x} {λ {y} {y {y x}}}}) mt-tenv)
(ArrowT (VarT 4) (ArrowT (ArrowT (VarT 4) (VarT 4)) (VarT 4))))
(check-equal? (run '{{{λ {x} {λ {y} {+ x y}}} 3} 7})
(NumV 10))
(define S '{λ {x} {λ {y} {λ {z} {{x z} {y z}}}}})
(check-equal? (typecheck (parse S) mt-tenv)
(ArrowT (ArrowT (VarT 3) (ArrowT (VarT 5) (VarT 6)))
(ArrowT (ArrowT (VarT 3) (VarT 5))
(ArrowT (VarT 3) (VarT 6)))))
(define K '{λ {x} {λ {y} x}})
(check-equal? (typecheck (parse K) mt-tenv)
(ArrowT (VarT 1) (ArrowT (VarT 2) (VarT 1))))
(check-equal? (typecheck (parse `(,S ,K)) mt-tenv)
(ArrowT (ArrowT (VarT 6) (VarT 5)) (ArrowT (VarT 6) (VarT 6))))
(check-equal? (typecheck (parse `((,S ,K) ,K)) mt-tenv)
(ArrowT (VarT 6) (VarT 6)))
(check-exn exn:fail? (λ () (typecheck (parse '{{λ {id} {{id id} 3}} {λ {x} x}}) mt-tenv)))
(check-exn exn:fail? (λ () (typecheck (parse '{λ {x} {λ {y} {{x y} x}}}) mt-tenv)))
(check-exn exn:fail? (λ () (run '{{λ {x} {x x}} {λ {x} {x x}}})))
(check-exn exn:fail? (λ () (run '{+ 3 true})))
)
|
0def5a56e9a9593f593e2508ae3f92951ffda54254e19b06f63323f25733dd2b | jeffshrager/biobike | help-utils.lisp | -*- Package : help ; mode : lisp ; base : 10 ; Syntax : Common - Lisp ; -*-
(in-package :help)
;;; +=========================================================================+
| Copyright ( c ) 2002 - 2006 JP , , |
;;; | |
;;; | Permission is hereby granted, free of charge, to any person obtaining |
;;; | a copy of this software and associated documentation files (the |
| " Software " ) , to deal in the Software without restriction , including |
;;; | without limitation the rights to use, copy, modify, merge, publish, |
| distribute , sublicense , and/or sell copies of the Software , and to |
| permit persons to whom the Software is furnished to do so , subject to |
;;; | the following conditions: |
;;; | |
;;; | The above copyright notice and this permission notice shall be included |
| in all copies or substantial portions of the Software . |
;;; | |
| THE SOFTWARE IS PROVIDED " AS IS " , WITHOUT WARRANTY OF ANY KIND , |
;;; | EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF |
;;; | MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. |
;;; | IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY |
| CLAIM , DAMAGES OR OTHER LIABILITY , WHETHER IN AN ACTION OF CONTRACT , |
;;; | TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE |
;;; | SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
;;; +=========================================================================+
Authors : JP Massar , .
;; Loop through all existing documentation items and find those that
satisfy a function F , returning for each satisfied entry a list of
;; the documentation type (e.g., help::module), the name of the documentation
;; item (e.g., "commands"), the documentation instance itself, and
;; the result of calling F on the documentation instance (this might
;; be a score instead of just T or NIL). Then sort the results based on
;; this score if a SORT-PREDICATE is provided.
(defun find-doc-items-if (f &key (sort-predicate nil) (type nil))
(let ((results nil))
(maphash
(lambda (key subhash)
(maphash
(lambda (name doc-item)
(flet ((doit (x)
(vwhen (result (funcall f x))
(push
(make-help-match
:doc-type key :name name :ref x
:score result :type type)
results
))))
(setq doc-item (ensure-list doc-item))
(if (not *help-debug*)
(mapcar #'doit doc-item)
(mapcar
(lambda (x)
(handler-case (doit x)
(error
(c)
(print
(list 'name name 'doc-item (help:name doc-item) 'key key))
(error c))))
doc-item
))))
subhash
))
help::*documentation*
)
(if (null sort-predicate)
(nreverse results)
(sort results sort-predicate :key 'fourth)
)))
;; Loop through a subset of symbols as defined by SCOPE, find all such
symbols that satisfy a function F. For each such symbol , create
;; some number of HELP-MATCH records for it, depending on how it is used.
;; Remove any duplicate HELP-MATCH records, then sort the results by the
result value from F.
;; This should be obsolete, but might still be useful someday.
#+obsolete
(defun find-symbol-items-if
(f &key (scope :user-external) (sort-predicate nil) (type nil))
(let ((results nil))
(labels ((doit (symbol)
(vwhen (result (funcall f symbol))
(loop for symdoc in (maybe-create-symbol-docs symbol) do
(push
(make-help-match
:doc-type :symbol :name symbol :ref symdoc
:score result :type type)
results
))))
(maybe-push (symbol)
(if (not *help-debug*)
(doit symbol)
(handler-case (doit symbol)
(error
(c)
(print (list 'symbol symbol))
(error c)))))
(search-external (packages)
(loop for p in packages do
(do-external-symbols (s p) (maybe-push s))))
(search-internal (packages)
(loop for p in packages do (do-symbols (s p) (maybe-push s)))))
(ecase scope
(:user-external
(let ((user-package (find-package wb::*username*)))
(search-internal (list user-package))
))
((:system-external :biobike-external)
(search-external cl-user::*biobike-packages*))
((:system-all :biobike-all)
(search-internal cl-user::*biobike-packages*)))
(setq results
(purge-duplicates
results
:key
(lambda (x)
(cons (help-match-name x) (help::dtype (help-match-ref x))))
:test 'equal
:hash-threshold 20
)))
(if (null sort-predicate)
(nreverse results)
(sort results sort-predicate :key 'help-match-score)
)))
(defun define-function-p (symbol) (get symbol :define-function-parse))
(defun split-at-whitespace (s)
(loop for ch in *whitespace* do (setq s (substitute #\Space ch s)))
(remove-if
(lambda (x) (zerop (length x)))
(string-split s)))
(defun arglist-to-help-display-string (arglist &optional (limit 40))
(labels ((all-keywords (x)
(cond
((null x) nil)
((symbolp x) (keywordize x))
((listp x) (mapcar #'all-keywords x))
(t x)
)))
(if (null arglist)
"()"
(let ((s (limited-string (formatn "~A" (all-keywords arglist)) limit)))
(if (eql (lastelem s) #\)) s (one-string s ")"))
))))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
(defun symbol->doc-url
(symbol
&key
(hyperspec-first? nil)
(if-no-documentation? nil)
(if-only-docstring? nil)
)
(let ((fdocobj (find-documentation symbol 'function-documentation))
(sdocobj (find-documentation symbol 'symbol-doc)))
(cond
((and (eq (find-package :common-lisp) (symbol-package symbol))
(external-symbol-of-package? symbol :common-lisp))
(if hyperspec-first?
(wb::common-lisp-external-symbol-url symbol)
(if fdocobj
(docobj->url fdocobj)
(wb::common-lisp-external-symbol-url symbol)
)))
(fdocobj (docobj->url fdocobj))
(sdocobj (docobj->url (first sdocobj)))
((documentation symbol 'function)
(ecase if-only-docstring?
((nil) nil)
))
((documentation symbol 'variable)
(ecase if-only-docstring?
((nil) nil)
))
(t
(ecase if-no-documentation?
((nil) nil)
)))))
| null | https://raw.githubusercontent.com/jeffshrager/biobike/5313ec1fe8e82c21430d645e848ecc0386436f57/BioLisp/Help/help-utils.lisp | lisp | mode : lisp ; base : 10 ; Syntax : Common - Lisp ; -*-
+=========================================================================+
| |
| Permission is hereby granted, free of charge, to any person obtaining |
| a copy of this software and associated documentation files (the |
| without limitation the rights to use, copy, modify, merge, publish, |
| the following conditions: |
| |
| The above copyright notice and this permission notice shall be included |
| |
| EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF |
| MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. |
| IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY |
| TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE |
| SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. |
+=========================================================================+
Loop through all existing documentation items and find those that
the documentation type (e.g., help::module), the name of the documentation
item (e.g., "commands"), the documentation instance itself, and
the result of calling F on the documentation instance (this might
be a score instead of just T or NIL). Then sort the results based on
this score if a SORT-PREDICATE is provided.
Loop through a subset of symbols as defined by SCOPE, find all such
some number of HELP-MATCH records for it, depending on how it is used.
Remove any duplicate HELP-MATCH records, then sort the results by the
This should be obsolete, but might still be useful someday.
|
(in-package :help)
| Copyright ( c ) 2002 - 2006 JP , , |
| " Software " ) , to deal in the Software without restriction , including |
| distribute , sublicense , and/or sell copies of the Software , and to |
| permit persons to whom the Software is furnished to do so , subject to |
| in all copies or substantial portions of the Software . |
| THE SOFTWARE IS PROVIDED " AS IS " , WITHOUT WARRANTY OF ANY KIND , |
| CLAIM , DAMAGES OR OTHER LIABILITY , WHETHER IN AN ACTION OF CONTRACT , |
Authors : JP Massar , .
satisfy a function F , returning for each satisfied entry a list of
(defun find-doc-items-if (f &key (sort-predicate nil) (type nil))
(let ((results nil))
(maphash
(lambda (key subhash)
(maphash
(lambda (name doc-item)
(flet ((doit (x)
(vwhen (result (funcall f x))
(push
(make-help-match
:doc-type key :name name :ref x
:score result :type type)
results
))))
(setq doc-item (ensure-list doc-item))
(if (not *help-debug*)
(mapcar #'doit doc-item)
(mapcar
(lambda (x)
(handler-case (doit x)
(error
(c)
(print
(list 'name name 'doc-item (help:name doc-item) 'key key))
(error c))))
doc-item
))))
subhash
))
help::*documentation*
)
(if (null sort-predicate)
(nreverse results)
(sort results sort-predicate :key 'fourth)
)))
symbols that satisfy a function F. For each such symbol , create
result value from F.
#+obsolete
(defun find-symbol-items-if
(f &key (scope :user-external) (sort-predicate nil) (type nil))
(let ((results nil))
(labels ((doit (symbol)
(vwhen (result (funcall f symbol))
(loop for symdoc in (maybe-create-symbol-docs symbol) do
(push
(make-help-match
:doc-type :symbol :name symbol :ref symdoc
:score result :type type)
results
))))
(maybe-push (symbol)
(if (not *help-debug*)
(doit symbol)
(handler-case (doit symbol)
(error
(c)
(print (list 'symbol symbol))
(error c)))))
(search-external (packages)
(loop for p in packages do
(do-external-symbols (s p) (maybe-push s))))
(search-internal (packages)
(loop for p in packages do (do-symbols (s p) (maybe-push s)))))
(ecase scope
(:user-external
(let ((user-package (find-package wb::*username*)))
(search-internal (list user-package))
))
((:system-external :biobike-external)
(search-external cl-user::*biobike-packages*))
((:system-all :biobike-all)
(search-internal cl-user::*biobike-packages*)))
(setq results
(purge-duplicates
results
:key
(lambda (x)
(cons (help-match-name x) (help::dtype (help-match-ref x))))
:test 'equal
:hash-threshold 20
)))
(if (null sort-predicate)
(nreverse results)
(sort results sort-predicate :key 'help-match-score)
)))
(defun define-function-p (symbol) (get symbol :define-function-parse))
(defun split-at-whitespace (s)
(loop for ch in *whitespace* do (setq s (substitute #\Space ch s)))
(remove-if
(lambda (x) (zerop (length x)))
(string-split s)))
(defun arglist-to-help-display-string (arglist &optional (limit 40))
(labels ((all-keywords (x)
(cond
((null x) nil)
((symbolp x) (keywordize x))
((listp x) (mapcar #'all-keywords x))
(t x)
)))
(if (null arglist)
"()"
(let ((s (limited-string (formatn "~A" (all-keywords arglist)) limit)))
(if (eql (lastelem s) #\)) s (one-string s ")"))
))))
(defun symbol->doc-url
(symbol
&key
(hyperspec-first? nil)
(if-no-documentation? nil)
(if-only-docstring? nil)
)
(let ((fdocobj (find-documentation symbol 'function-documentation))
(sdocobj (find-documentation symbol 'symbol-doc)))
(cond
((and (eq (find-package :common-lisp) (symbol-package symbol))
(external-symbol-of-package? symbol :common-lisp))
(if hyperspec-first?
(wb::common-lisp-external-symbol-url symbol)
(if fdocobj
(docobj->url fdocobj)
(wb::common-lisp-external-symbol-url symbol)
)))
(fdocobj (docobj->url fdocobj))
(sdocobj (docobj->url (first sdocobj)))
((documentation symbol 'function)
(ecase if-only-docstring?
((nil) nil)
))
((documentation symbol 'variable)
(ecase if-only-docstring?
((nil) nil)
))
(t
(ecase if-no-documentation?
((nil) nil)
)))))
|
b1109a888ba8f719fb633ec730885dfb3e783f35b0add26ac42e5a8c3fb81f48 | facebook/flow | file_sig_sig.ml |
* Copyright ( c ) Meta Platforms , Inc. and affiliates .
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree .
* Copyright (c) Meta Platforms, Inc. and affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*)
module type S = sig
module L : Loc_sig.S
(* In Flow, every file creates a single module, but may also include declared
* modules. This data structure describes all such modules.
*
* If a declared module with the same name appears twice, the last one will be
* represented here.
*
* This representation is a bit broad, because implementation files generally
* should not contain declare modules and declaration files (libdefs) are all
* coalesced into a single module (builtins). *)
type t = {
module_sig: module_sig;
declare_modules: (L.t * module_sig) SMap.t;
}
and options = {
module_ref_prefix: string option;
module_ref_prefix_LEGACY_INTEROP: string option;
enable_enums: bool;
enable_relay_integration: bool;
relay_integration_module_prefix: string option;
}
(* We can extract the observable interface of a module by extracting information
* about what it requires and what it exports. *)
and module_sig = {
requires: require list;
module_kind: module_kind;
}
We track information about dependencies for each unique module reference in a
* file . For example , ` import X from " foo " ` and ` require("foo " ) ` both induce
* dependencies on the same module and have the same module ref .
*
* Note that different refs can point to the same module , but we have n't
* resolved modules yet , so we do n't know where the ref actually points .
* file. For example, `import X from "foo"` and `require("foo")` both induce
* dependencies on the same module and have the same module ref.
*
* Note that different refs can point to the same module, but we haven't
* resolved modules yet, so we don't know where the ref actually points.
*)
and require =
(* require('foo'); *)
| Require of {
(* location of module ref *)
source: L.t Flow_ast_utils.source;
require_loc: L.t;
(* Note: These are best-effort.
* DO NOT use these for typechecking. *)
bindings: require_bindings option;
}
(* import('foo').then(...) *)
| ImportDynamic of {
source: L.t Flow_ast_utils.source;
import_loc: L.t;
}
(* import declaration without specifiers
*
* Note that this is equivalent to the Import variant below with all fields
* empty, but modeled as a separate variant to ensure use sites handle this
* case if necessary. *)
| Import0 of { source: L.t Flow_ast_utils.source }
(* import declaration with specifiers *)
| Import of {
import_loc: L.t;
(* location of module ref *)
source: L.t Flow_ast_utils.source;
(* map from remote name to local names of value imports
* source: import {A, B as C} from "foo";
* result: {A:{A:{[ImportedLocs {_}]}}, B:{C:{[ImportedLocs {_}]}}}
*
* Multiple locations for a given (remoteName, localName) pair are not typical, but they can
* occur e.g. with the code `import {foo, foo} from 'bar';`. This code would cause an error
* later because of the duplicate local name, but we should handle it here since it does parse.
*)
named: imported_locs Nel.t SMap.t SMap.t;
(* optional pair of location of namespace import and local name
* source: import * as X from "foo";
* result: loc, X *)
ns: L.t Flow_ast_utils.ident option;
(* map from remote name to local names of type imports
* source: import type {A, B as C} from "foo";
* source: import {type A, type B as C} from "foo";
* result: {A:{A:{[ImportedLocs {_}]}}, B:{C:{[ImportedLocs {_}]}}} *)
types: imported_locs Nel.t SMap.t SMap.t;
(* map from remote name to local names of typeof imports
* source: import typeof {A, B as C} from "foo";
* source: import {typeof A, typeof B as C} from "foo";
* result: {A:{A:{[ImportedLocs {_}]}}, B:{C:{[ImportedLocs {_}]}}} *)
typesof: imported_locs Nel.t SMap.t SMap.t;
(* optional pair of location of namespace typeof import and local name
* source: import typeof * as X from "foo";
* result: loc, X *)
typesof_ns: L.t Flow_ast_utils.ident option;
}
| ExportFrom of { source: L.t Flow_ast_utils.source }
and imported_locs = {
remote_loc: L.t;
local_loc: L.t;
}
and require_bindings =
(* source: const bar = require('./foo');
* result: bar *)
| BindIdent of L.t Flow_ast_utils.ident
(* map from remote name to local names of requires
* source: const {a, b: c} = require('./foo');
* result: {a: (a_loc, a), b: (c_loc, c)} *)
| BindNamed of (L.t Flow_ast_utils.ident * require_bindings) list
(* All modules are assumed to be CommonJS to start with, but if we see an ES
* module-style export, we switch to ES. *)
and module_kind =
| CommonJS of { mod_exp_loc: L.t option }
| ES
[@@deriving show]
type tolerable_error =
| IndeterminateModuleType of L.t
e.g. ` module.exports.foo = 4 ` when not at the top level
| BadExportPosition of L.t
(* e.g. `foo(module)`, dangerous because `module` is aliased *)
| BadExportContext of string (* offending identifier *) * L.t
| SignatureVerificationError of L.t Signature_error.t
[@@deriving show]
type tolerable_t = t * tolerable_error list
val empty : t
val default_opts : options
val program : ast:(L.t, L.t) Flow_ast.Program.t -> opts:options -> tolerable_t
(* Use for debugging; not for exposing info to the end user *)
val to_string : t -> string
val require_loc_map : module_sig -> L.t Nel.t SMap.t
(* Only the keys returned by `require_loc_map` *)
val require_set : module_sig -> SSet.t
class mapper :
object
method file_sig : t -> t
method ident : L.t Flow_ast_utils.ident -> L.t Flow_ast_utils.ident
method source : L.t Flow_ast_utils.source -> L.t Flow_ast_utils.source
method imported_locs : imported_locs -> imported_locs
method loc : L.t -> L.t
method module_kind : module_kind -> module_kind
method module_sig : module_sig -> module_sig
method require : require -> require
method require_bindings : require_bindings -> require_bindings
method tolerable_error : tolerable_error -> tolerable_error
end
end
| null | https://raw.githubusercontent.com/facebook/flow/f7d50bb772462888b27b5dbf9acf7d079eb1ff5f/src/parser_utils/file_sig_sig.ml | ocaml | In Flow, every file creates a single module, but may also include declared
* modules. This data structure describes all such modules.
*
* If a declared module with the same name appears twice, the last one will be
* represented here.
*
* This representation is a bit broad, because implementation files generally
* should not contain declare modules and declaration files (libdefs) are all
* coalesced into a single module (builtins).
We can extract the observable interface of a module by extracting information
* about what it requires and what it exports.
require('foo');
location of module ref
Note: These are best-effort.
* DO NOT use these for typechecking.
import('foo').then(...)
import declaration without specifiers
*
* Note that this is equivalent to the Import variant below with all fields
* empty, but modeled as a separate variant to ensure use sites handle this
* case if necessary.
import declaration with specifiers
location of module ref
map from remote name to local names of value imports
* source: import {A, B as C} from "foo";
* result: {A:{A:{[ImportedLocs {_}]}}, B:{C:{[ImportedLocs {_}]}}}
*
* Multiple locations for a given (remoteName, localName) pair are not typical, but they can
* occur e.g. with the code `import {foo, foo} from 'bar';`. This code would cause an error
* later because of the duplicate local name, but we should handle it here since it does parse.
optional pair of location of namespace import and local name
* source: import * as X from "foo";
* result: loc, X
map from remote name to local names of type imports
* source: import type {A, B as C} from "foo";
* source: import {type A, type B as C} from "foo";
* result: {A:{A:{[ImportedLocs {_}]}}, B:{C:{[ImportedLocs {_}]}}}
map from remote name to local names of typeof imports
* source: import typeof {A, B as C} from "foo";
* source: import {typeof A, typeof B as C} from "foo";
* result: {A:{A:{[ImportedLocs {_}]}}, B:{C:{[ImportedLocs {_}]}}}
optional pair of location of namespace typeof import and local name
* source: import typeof * as X from "foo";
* result: loc, X
source: const bar = require('./foo');
* result: bar
map from remote name to local names of requires
* source: const {a, b: c} = require('./foo');
* result: {a: (a_loc, a), b: (c_loc, c)}
All modules are assumed to be CommonJS to start with, but if we see an ES
* module-style export, we switch to ES.
e.g. `foo(module)`, dangerous because `module` is aliased
offending identifier
Use for debugging; not for exposing info to the end user
Only the keys returned by `require_loc_map` |
* Copyright ( c ) Meta Platforms , Inc. and affiliates .
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree .
* Copyright (c) Meta Platforms, Inc. and affiliates.
*
* This source code is licensed under the MIT license found in the
* LICENSE file in the root directory of this source tree.
*)
module type S = sig
module L : Loc_sig.S
type t = {
module_sig: module_sig;
declare_modules: (L.t * module_sig) SMap.t;
}
and options = {
module_ref_prefix: string option;
module_ref_prefix_LEGACY_INTEROP: string option;
enable_enums: bool;
enable_relay_integration: bool;
relay_integration_module_prefix: string option;
}
and module_sig = {
requires: require list;
module_kind: module_kind;
}
We track information about dependencies for each unique module reference in a
* file . For example , ` import X from " foo " ` and ` require("foo " ) ` both induce
* dependencies on the same module and have the same module ref .
*
* Note that different refs can point to the same module , but we have n't
* resolved modules yet , so we do n't know where the ref actually points .
* file. For example, `import X from "foo"` and `require("foo")` both induce
* dependencies on the same module and have the same module ref.
*
* Note that different refs can point to the same module, but we haven't
* resolved modules yet, so we don't know where the ref actually points.
*)
and require =
| Require of {
source: L.t Flow_ast_utils.source;
require_loc: L.t;
bindings: require_bindings option;
}
| ImportDynamic of {
source: L.t Flow_ast_utils.source;
import_loc: L.t;
}
| Import0 of { source: L.t Flow_ast_utils.source }
| Import of {
import_loc: L.t;
source: L.t Flow_ast_utils.source;
named: imported_locs Nel.t SMap.t SMap.t;
ns: L.t Flow_ast_utils.ident option;
types: imported_locs Nel.t SMap.t SMap.t;
typesof: imported_locs Nel.t SMap.t SMap.t;
typesof_ns: L.t Flow_ast_utils.ident option;
}
| ExportFrom of { source: L.t Flow_ast_utils.source }
and imported_locs = {
remote_loc: L.t;
local_loc: L.t;
}
and require_bindings =
| BindIdent of L.t Flow_ast_utils.ident
| BindNamed of (L.t Flow_ast_utils.ident * require_bindings) list
and module_kind =
| CommonJS of { mod_exp_loc: L.t option }
| ES
[@@deriving show]
type tolerable_error =
| IndeterminateModuleType of L.t
e.g. ` module.exports.foo = 4 ` when not at the top level
| BadExportPosition of L.t
| SignatureVerificationError of L.t Signature_error.t
[@@deriving show]
type tolerable_t = t * tolerable_error list
val empty : t
val default_opts : options
val program : ast:(L.t, L.t) Flow_ast.Program.t -> opts:options -> tolerable_t
val to_string : t -> string
val require_loc_map : module_sig -> L.t Nel.t SMap.t
val require_set : module_sig -> SSet.t
class mapper :
object
method file_sig : t -> t
method ident : L.t Flow_ast_utils.ident -> L.t Flow_ast_utils.ident
method source : L.t Flow_ast_utils.source -> L.t Flow_ast_utils.source
method imported_locs : imported_locs -> imported_locs
method loc : L.t -> L.t
method module_kind : module_kind -> module_kind
method module_sig : module_sig -> module_sig
method require : require -> require
method require_bindings : require_bindings -> require_bindings
method tolerable_error : tolerable_error -> tolerable_error
end
end
|
e2e3ebd3f326d136dad7270027c24107d0acffce4e27ea91691cc08e8c156c81 | Haskell-Things/HSlice | Util.hs | {- ORMOLU_DISABLE -}
HSlice .
- Copyright 2020
-
- This program is free software : you can redistribute it and/or modify
- it under the terms of the GNU Affero General Public License as published by
- the Free Software Foundation , either version 3 of the License , or
- ( at your option ) any later version .
-
- This program is distributed in the hope that it will be useful ,
- but WITHOUT ANY WARRANTY ; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the
- GNU Affero General Public License for more details .
- You should have received a copy of the GNU Affero General Public License
- along with this program . If not , see < / > .
- Copyright 2020 Julia Longtin
-
- This program is free software: you can redistribute it and/or modify
- it under the terms of the GNU Affero General Public License as published by
- the Free Software Foundation, either version 3 of the License, or
- (at your option) any later version.
-
- This program is distributed in the hope that it will be useful,
- but WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- GNU Affero General Public License for more details.
- You should have received a copy of the GNU Affero General Public License
- along with this program. If not, see </>.
-}
Shamelessly stolen from ImplicitCAD .
------------------------------------------------------------------------------
-- | Construct a golden test for rendering the given object to javascript
-- suitable for dropping into /
--
On the first run of this test , it will render the object and cache the
-- results. Subsequent test runs will compare their result to the cached one.
-- This is valuable for ensuring representations of structures don't break
-- across commits.
--
The objects are cached under @tests / golden/@ , with the given name . Deleting
-- this file is sufficient to update the test if changs in the structures are
-- intended.
{-# LANGUAGE BangPatterns #-}
# LANGUAGE LambdaCase #
module GoldenSpec.Util (golden, goldens) where
import Prelude (IO, FilePath, Bool (True, False), String, pure, (==), readFile, writeFile, (>>=), (<>), ($))
import Control.Monad.IO.Class (liftIO)
import System.Directory (getTemporaryDirectory, doesFileExist)
import System.IO (hClose, openTempFile)
import Test.Hspec (it, shouldBe, SpecWith)
import Graphics.Slicer.Math.Ganja(GanjaAble, dumpGanja, dumpGanjas)
golden :: (GanjaAble a) => String -> a -> SpecWith ()
golden name object = it (name <> " (golden)") $ do
(res, cached) <- liftIO $ do
temp_fp <- getTemporaryFilePath "ganja.js"
-- Output the rendered mesh
writeFile temp_fp $ dumpGanja object
!res <- readFile temp_fp
let golden_fp = "./tests/golden/" <> name <> ".ganja.js"
-- Check if the cached results already exist.
doesFileExist golden_fp >>= \case
True -> pure ()
-- If not, save the mesh we just created in the cache.
False -> writeFile golden_fp res
!cached <- readFile golden_fp
pure (res, cached)
Finally , check if the two meshes are equal .
if res == cached
then pure ()
else False `shouldBe` True
goldens :: String -> [String -> (String, String)] -> SpecWith ()
goldens name objects = it (name <> " (golden)") $ do
(res, cached) <- liftIO $ do
temp_fp <- getTemporaryFilePath "ganja.js"
-- Output the rendered mesh
writeFile temp_fp $ dumpGanjas objects
!res <- readFile temp_fp
let golden_fp = "./tests/golden/" <> name <> ".ganja.js"
-- Check if the cached results already exist.
doesFileExist golden_fp >>= \case
True -> pure ()
-- If not, save the mesh we just created in the cache.
False -> writeFile golden_fp res
!cached <- readFile golden_fp
pure (res, cached)
Finally , check if the two meshes are equal .
if res == cached
then pure ()
else False `shouldBe` True
------------------------------------------------------------------------------
-- | Get a temporary filepath with the desired extension. On unix systems, this
is a file under @/tmp@. Useful for tests that need to write files .
getTemporaryFilePath
:: String -- ^ File extension
-> IO FilePath
getTemporaryFilePath ext = do
tempdir <- getTemporaryDirectory
-- The only means available to us for getting a temporary filename also opens
-- its file handle. Because the 'writeSTL' function opens the file handle
itself , we must first close our handle .
(fp, h) <- openTempFile tempdir "implicit-golden"
hClose h
pure $ fp <> "." <> ext
| null | https://raw.githubusercontent.com/Haskell-Things/HSlice/3d1f6ebcac919d1c774a581fa41233ff4fe60ee4/tests/GoldenSpec/Util.hs | haskell | ORMOLU_DISABLE
----------------------------------------------------------------------------
| Construct a golden test for rendering the given object to javascript
suitable for dropping into /
results. Subsequent test runs will compare their result to the cached one.
This is valuable for ensuring representations of structures don't break
across commits.
this file is sufficient to update the test if changs in the structures are
intended.
# LANGUAGE BangPatterns #
Output the rendered mesh
Check if the cached results already exist.
If not, save the mesh we just created in the cache.
Output the rendered mesh
Check if the cached results already exist.
If not, save the mesh we just created in the cache.
----------------------------------------------------------------------------
| Get a temporary filepath with the desired extension. On unix systems, this
^ File extension
The only means available to us for getting a temporary filename also opens
its file handle. Because the 'writeSTL' function opens the file handle |
HSlice .
- Copyright 2020
-
- This program is free software : you can redistribute it and/or modify
- it under the terms of the GNU Affero General Public License as published by
- the Free Software Foundation , either version 3 of the License , or
- ( at your option ) any later version .
-
- This program is distributed in the hope that it will be useful ,
- but WITHOUT ANY WARRANTY ; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the
- GNU Affero General Public License for more details .
- You should have received a copy of the GNU Affero General Public License
- along with this program . If not , see < / > .
- Copyright 2020 Julia Longtin
-
- This program is free software: you can redistribute it and/or modify
- it under the terms of the GNU Affero General Public License as published by
- the Free Software Foundation, either version 3 of the License, or
- (at your option) any later version.
-
- This program is distributed in the hope that it will be useful,
- but WITHOUT ANY WARRANTY; without even the implied warranty of
- MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
- GNU Affero General Public License for more details.
- You should have received a copy of the GNU Affero General Public License
- along with this program. If not, see </>.
-}
Shamelessly stolen from ImplicitCAD .
On the first run of this test , it will render the object and cache the
The objects are cached under @tests / golden/@ , with the given name . Deleting
# LANGUAGE LambdaCase #
module GoldenSpec.Util (golden, goldens) where
import Prelude (IO, FilePath, Bool (True, False), String, pure, (==), readFile, writeFile, (>>=), (<>), ($))
import Control.Monad.IO.Class (liftIO)
import System.Directory (getTemporaryDirectory, doesFileExist)
import System.IO (hClose, openTempFile)
import Test.Hspec (it, shouldBe, SpecWith)
import Graphics.Slicer.Math.Ganja(GanjaAble, dumpGanja, dumpGanjas)
golden :: (GanjaAble a) => String -> a -> SpecWith ()
golden name object = it (name <> " (golden)") $ do
(res, cached) <- liftIO $ do
temp_fp <- getTemporaryFilePath "ganja.js"
writeFile temp_fp $ dumpGanja object
!res <- readFile temp_fp
let golden_fp = "./tests/golden/" <> name <> ".ganja.js"
doesFileExist golden_fp >>= \case
True -> pure ()
False -> writeFile golden_fp res
!cached <- readFile golden_fp
pure (res, cached)
Finally , check if the two meshes are equal .
if res == cached
then pure ()
else False `shouldBe` True
goldens :: String -> [String -> (String, String)] -> SpecWith ()
goldens name objects = it (name <> " (golden)") $ do
(res, cached) <- liftIO $ do
temp_fp <- getTemporaryFilePath "ganja.js"
writeFile temp_fp $ dumpGanjas objects
!res <- readFile temp_fp
let golden_fp = "./tests/golden/" <> name <> ".ganja.js"
doesFileExist golden_fp >>= \case
True -> pure ()
False -> writeFile golden_fp res
!cached <- readFile golden_fp
pure (res, cached)
Finally , check if the two meshes are equal .
if res == cached
then pure ()
else False `shouldBe` True
is a file under @/tmp@. Useful for tests that need to write files .
getTemporaryFilePath
-> IO FilePath
getTemporaryFilePath ext = do
tempdir <- getTemporaryDirectory
itself , we must first close our handle .
(fp, h) <- openTempFile tempdir "implicit-golden"
hClose h
pure $ fp <> "." <> ext
|
2a665dc467466b37d2da1b0eb7b0fcff8274f1c6261fd4ab6094cc836ebd516e | meagtan/99-lisp-problems | p95-p96.lisp | ;;;; Language processing
(in-package #:99-lisp-problems)
;;; p95
(defun full-words (number &optional (base 10))
"Print NUMBER in full words. Example: (full-words 175) => \"one-seven-five\"."
(format nil "~{~a~^-~}" (mapcar #'digit-word (digits number base))))
(defun digits (number &optional (base 10) acc)
"Return the digits of NUMBER as a list."
(if (< number base)
(cons number acc)
(multiple-value-bind (div rem) (truncate number base)
(digits div base (cons rem acc)))))
(defun digit-word (digit)
"Return the word corresponding to DIGIT."
(cdr (assoc number *digit-words*)))
(defparameter *digit-words*
(mapcar (lambda (d)
(cons d (format nil "~r" d)))
(range 0 9)) ;p22
"An alist mapping each digit to their word representation.")
Addition : printing numbers in words including teens , hundreds and thousands , not digit by digit
(defun number-to-word (number &aux (base 10))
"Return the word representation of NUMBER ≥ 0. Example: (number-to-word 175) => \"one hundred seventy-five\"."
(if (and (< number *hundreds-limit*) (/= (rem number (expt base 3)) 0))
(multiple-value-bind (div rem) (truncate number (expt base 2))
;; could be made into a format directive, but clearer this way
(if (= div 0)
(tens-to-word rem)
(format nil "~a hundred~@[ ~a~]"
(tens-to-word div)
(when (> rem 0)
(tens-to-word rem)))))
(do ((n number)
(c 0 (1+ c))
res)
((= n 0)
(format nil "~{~a~^ ~}" res))
(multiple-value-bind (div rem) (truncate n (expt base 3))
;; add rem to list only if nonzero
(when (/= rem 0)
(when (> c 0)
(push (nth (1- c) *thousands*) res))
(push (number-to-word rem) res))
if have n't reached the end of thousands
(if (< c (1- (length *thousands*)))
(setf n div)
(progn
(push (car (last *thousands*)) res)
(push (number-to-word div) res)
(setf n 0)))))))
(defun tens-to-word (number &aux (base 10))
"Convert 0 ≤ NUMBER ≤ 99 into its word representation. Example: (tens-to-word 43) => \"forty-three\"."
(multiple-value-bind (div rem) (truncate number base)
(format nil "~[~a~;~*~a~:;~2*~a-~1@*~a~]"
div
ones , print if div = 0
teens , print if div = 1
tens , print ( with ones ) if div > = 2
(defparameter *hundreds-limit* 2000
"Print all numbers less than *HUNDREDS-LIMIT*, except for multiples of 1000, using hundreds instead of thousands.
Example: \"eleven hundred\" for 1100, instead of \"one thousand one hundred\".")
(defparameter *thousands*
'("thousand" "million" "billion" "trillion" "quadrillion" "quintillion" "sextillion" "septillion" "octillion" "nonillion")
"List of powers of 1000 up to 1000^30.")
(defparameter *digits*
(mapcar (lambda (d)
(list d
(format nil "~r" d)
(format nil "~r" (+ 10 d))
(format nil "~r" (* 10 d))))
(range 0 9)) ;p22
"Alist mapping every digit D to the word representation of D, 1D and D0.")
;;; p96
(defun identifier-p (string &aux (list (coerce string 'list)))
"Return T if STRING represents a valid identifier, represented by the quasi-regex syntax \"<alpha>*(_?[<alpha>|<digit>])*\"."
(labels ((rest-p (list)
(or (null list)
(and (alphanumericp (car list))
(rest-p (cdr list)))
(and (char= (car list) #\_)
(cdr list)
(alphanumericp (cadr list))
(rest-p (cddr list))))))
(and list
(alpha-char-p (car list))
(rest-p (cdr list)))))
;; Extending the above predicate for general syntax diagrams as in the problem
;; A syntax diagram can be represented as a directed graph, expressed in readable list form, that transitions from state to
;; state based on a recognizer predicate that returns the rest of the given sequence if it is true, and NIL otherwise.
(defun alt-identifier-p (string)
"Alternate implementation of IDENTIFIER-P using general syntax diagrams."
(funcall (recognizer-predicate
(make-recognizer
`((start end ,(predicate-recognizer #'alpha-char-p))
(end b ,(at-most-one (lambda (x) (eq x #\_))))
(b end ,(predicate-recognizer #'alphanumericp)))))
string))
(defun make-recognizer (syntax-list &aux (graph (d-readable-graph syntax-list))
(start (start-state graph))
(end (end-state graph)))
"Create a recognizer predicate from the given list of directed edges of a syntax diagram.
A valid syntax diagram is composed of nodes, with a unique start node named START and end node named END,
and directed, labeled edges with recognizer predicates as labels."
(and start end
(lambda (string)
;; breadth-first graph search, allowing for loops as long as the predicate matches
(do ((node start)
(string string)
queue)
((null queue)
(if (eq node end)
string))
(let ((added (loop for e in (edges node graph)
for res = (funcall (edge-weight e) string)
if res
collect (cons (end-node e) res))))
(setf queue (append queue added)))
(let ((pair (pop queue)))
(setf node (car pair)
string (cdr pair)))))))
(defun start-state (syntax-graph)
"Return the starting state of a given syntax diagram, if any."
(let ((start (remove 'start (graph-nodes syntax-graph) :test-not #'eq)))
(if start
(car start))))
(defun end-state (syntax-graph)
"Return the ending state of a given syntax diagram, if any."
(let ((end (remove 'end (graph-nodes syntax-graph) :test-not #'eq)))
(if end
(car end))))
(defun predicate-recognizer (pred)
"Generate a recognizer predicate for strings from a predicate for characters."
(lambda (string)
(when (funcall pred (char string 0))
(subseq string 1))))
(defun recognizer-predicate (recognizer)
"Generate a predicate that tests for exactly the strings recognized by RECOGNIZER."
(lambda (string &aux (res (funcall recognizer string)))
(and res (zerop (length res)))))
(defun at-most-one (pred)
"Generate a recognizer predicate that checks at most one instance for the given predicate for characters."
(lambda (string)
(if (funcall pred (char string 0))
(subseq string 1)
string)))
(defun kleene-star (pred)
"Generate a recognizer predicate that checks any number of instances for the given predicate for characters."
(lambda (string)
(if (funcall pred (char string 0))
(funcall (kleene-star pred) (subseq string 1))
string)))
(defconstant id-recognizer #'identity "A recognizer predicate that does nothing.")
(defun recognizer-union (arg1 arg2)
"Return the union of two recognizers."
(lambda (string &aux (res (funcall arg1 string)))
(if res
res
(funcall arg2 string))))
(defun recognizer-compose (arg1 arg2)
"Return the composition of two recognizers."
(lambda (string &aux (res (funcall arg1 string)))
(when res
(funcall arg2 res))))
(defun literal-recognizer (literal)
"Generate a recognizer for a literal string."
(if (characterp literal)
(setf literal (make-string 1 :initial-element literal)))
(lambda (string)
(and (<= (length literal) (length string))
(string= literal (subseq string 0 (length literal)))
(subseq string (length literal)))))
;; using recursion to parse context-free and not just regular languages
(defun parentheses (string)
"Recognize a balanced set of parentheses."
(funcall (make-recognizer `((start end ,#'identity)
(start a ,(literal-recognizer "("))
(a b ,#'parentheses)
(b end ,(literal-recognizer ")"))))))
| null | https://raw.githubusercontent.com/meagtan/99-lisp-problems/82bd35b1c161aaba37306332505700c26721296c/src/90-99%20misc/p95-p96.lisp | lisp | Language processing
p95
p22
could be made into a format directive, but clearer this way
add rem to list only if nonzero
p22
p96
Extending the above predicate for general syntax diagrams as in the problem
A syntax diagram can be represented as a directed graph, expressed in readable list form, that transitions from state to
state based on a recognizer predicate that returns the rest of the given sequence if it is true, and NIL otherwise.
breadth-first graph search, allowing for loops as long as the predicate matches
using recursion to parse context-free and not just regular languages |
(in-package #:99-lisp-problems)
(defun full-words (number &optional (base 10))
"Print NUMBER in full words. Example: (full-words 175) => \"one-seven-five\"."
(format nil "~{~a~^-~}" (mapcar #'digit-word (digits number base))))
(defun digits (number &optional (base 10) acc)
"Return the digits of NUMBER as a list."
(if (< number base)
(cons number acc)
(multiple-value-bind (div rem) (truncate number base)
(digits div base (cons rem acc)))))
(defun digit-word (digit)
"Return the word corresponding to DIGIT."
(cdr (assoc number *digit-words*)))
(defparameter *digit-words*
(mapcar (lambda (d)
(cons d (format nil "~r" d)))
"An alist mapping each digit to their word representation.")
Addition : printing numbers in words including teens , hundreds and thousands , not digit by digit
(defun number-to-word (number &aux (base 10))
"Return the word representation of NUMBER ≥ 0. Example: (number-to-word 175) => \"one hundred seventy-five\"."
(if (and (< number *hundreds-limit*) (/= (rem number (expt base 3)) 0))
(multiple-value-bind (div rem) (truncate number (expt base 2))
(if (= div 0)
(tens-to-word rem)
(format nil "~a hundred~@[ ~a~]"
(tens-to-word div)
(when (> rem 0)
(tens-to-word rem)))))
(do ((n number)
(c 0 (1+ c))
res)
((= n 0)
(format nil "~{~a~^ ~}" res))
(multiple-value-bind (div rem) (truncate n (expt base 3))
(when (/= rem 0)
(when (> c 0)
(push (nth (1- c) *thousands*) res))
(push (number-to-word rem) res))
if have n't reached the end of thousands
(if (< c (1- (length *thousands*)))
(setf n div)
(progn
(push (car (last *thousands*)) res)
(push (number-to-word div) res)
(setf n 0)))))))
(defun tens-to-word (number &aux (base 10))
"Convert 0 ≤ NUMBER ≤ 99 into its word representation. Example: (tens-to-word 43) => \"forty-three\"."
(multiple-value-bind (div rem) (truncate number base)
(format nil "~[~a~;~*~a~:;~2*~a-~1@*~a~]"
div
ones , print if div = 0
teens , print if div = 1
tens , print ( with ones ) if div > = 2
(defparameter *hundreds-limit* 2000
"Print all numbers less than *HUNDREDS-LIMIT*, except for multiples of 1000, using hundreds instead of thousands.
Example: \"eleven hundred\" for 1100, instead of \"one thousand one hundred\".")
(defparameter *thousands*
'("thousand" "million" "billion" "trillion" "quadrillion" "quintillion" "sextillion" "septillion" "octillion" "nonillion")
"List of powers of 1000 up to 1000^30.")
(defparameter *digits*
(mapcar (lambda (d)
(list d
(format nil "~r" d)
(format nil "~r" (+ 10 d))
(format nil "~r" (* 10 d))))
"Alist mapping every digit D to the word representation of D, 1D and D0.")
(defun identifier-p (string &aux (list (coerce string 'list)))
"Return T if STRING represents a valid identifier, represented by the quasi-regex syntax \"<alpha>*(_?[<alpha>|<digit>])*\"."
(labels ((rest-p (list)
(or (null list)
(and (alphanumericp (car list))
(rest-p (cdr list)))
(and (char= (car list) #\_)
(cdr list)
(alphanumericp (cadr list))
(rest-p (cddr list))))))
(and list
(alpha-char-p (car list))
(rest-p (cdr list)))))
(defun alt-identifier-p (string)
"Alternate implementation of IDENTIFIER-P using general syntax diagrams."
(funcall (recognizer-predicate
(make-recognizer
`((start end ,(predicate-recognizer #'alpha-char-p))
(end b ,(at-most-one (lambda (x) (eq x #\_))))
(b end ,(predicate-recognizer #'alphanumericp)))))
string))
(defun make-recognizer (syntax-list &aux (graph (d-readable-graph syntax-list))
(start (start-state graph))
(end (end-state graph)))
"Create a recognizer predicate from the given list of directed edges of a syntax diagram.
A valid syntax diagram is composed of nodes, with a unique start node named START and end node named END,
and directed, labeled edges with recognizer predicates as labels."
(and start end
(lambda (string)
(do ((node start)
(string string)
queue)
((null queue)
(if (eq node end)
string))
(let ((added (loop for e in (edges node graph)
for res = (funcall (edge-weight e) string)
if res
collect (cons (end-node e) res))))
(setf queue (append queue added)))
(let ((pair (pop queue)))
(setf node (car pair)
string (cdr pair)))))))
(defun start-state (syntax-graph)
"Return the starting state of a given syntax diagram, if any."
(let ((start (remove 'start (graph-nodes syntax-graph) :test-not #'eq)))
(if start
(car start))))
(defun end-state (syntax-graph)
"Return the ending state of a given syntax diagram, if any."
(let ((end (remove 'end (graph-nodes syntax-graph) :test-not #'eq)))
(if end
(car end))))
(defun predicate-recognizer (pred)
"Generate a recognizer predicate for strings from a predicate for characters."
(lambda (string)
(when (funcall pred (char string 0))
(subseq string 1))))
(defun recognizer-predicate (recognizer)
"Generate a predicate that tests for exactly the strings recognized by RECOGNIZER."
(lambda (string &aux (res (funcall recognizer string)))
(and res (zerop (length res)))))
(defun at-most-one (pred)
"Generate a recognizer predicate that checks at most one instance for the given predicate for characters."
(lambda (string)
(if (funcall pred (char string 0))
(subseq string 1)
string)))
(defun kleene-star (pred)
"Generate a recognizer predicate that checks any number of instances for the given predicate for characters."
(lambda (string)
(if (funcall pred (char string 0))
(funcall (kleene-star pred) (subseq string 1))
string)))
(defconstant id-recognizer #'identity "A recognizer predicate that does nothing.")
(defun recognizer-union (arg1 arg2)
"Return the union of two recognizers."
(lambda (string &aux (res (funcall arg1 string)))
(if res
res
(funcall arg2 string))))
(defun recognizer-compose (arg1 arg2)
"Return the composition of two recognizers."
(lambda (string &aux (res (funcall arg1 string)))
(when res
(funcall arg2 res))))
(defun literal-recognizer (literal)
"Generate a recognizer for a literal string."
(if (characterp literal)
(setf literal (make-string 1 :initial-element literal)))
(lambda (string)
(and (<= (length literal) (length string))
(string= literal (subseq string 0 (length literal)))
(subseq string (length literal)))))
(defun parentheses (string)
"Recognize a balanced set of parentheses."
(funcall (make-recognizer `((start end ,#'identity)
(start a ,(literal-recognizer "("))
(a b ,#'parentheses)
(b end ,(literal-recognizer ")"))))))
|
8ae4be745be0a1344418e2f741b979039162d650b70715587572bd24820d749e | racket/web-server | add-formlets2.rkt | #lang racket/base
(require web-server/servlet
web-server/formlets)
(provide (all-defined-out))
(define interface-version 'v1)
(define timeout +inf.0)
; request-number : str -> num
(define (request-number which-number)
(send/formlet
(formlet
(#%# "Enter the " ,which-number " number to add: "
,{input-int . => . the-number}
(input ([type "submit"] [name "enter"] [value "Enter"])))
the-number)
#:method
"POST"
#:wrap
(lambda (f-expr)
`(html (head (title "Enter a Number to Add"))
(body ([bgcolor "white"])
,f-expr)))))
(define (start initial-request)
(response/xexpr
`(html (head (title "Sum"))
(body ([bgcolor "white"])
(p "The answer is "
,(number->string (+ (request-number "first") (request-number "second"))))))))
| null | https://raw.githubusercontent.com/racket/web-server/f718800b5b3f407f7935adf85dfa663c4bba1651/web-server-lib/web-server/default-web-root/htdocs/servlets/examples/add-formlets2.rkt | racket | request-number : str -> num | #lang racket/base
(require web-server/servlet
web-server/formlets)
(provide (all-defined-out))
(define interface-version 'v1)
(define timeout +inf.0)
(define (request-number which-number)
(send/formlet
(formlet
(#%# "Enter the " ,which-number " number to add: "
,{input-int . => . the-number}
(input ([type "submit"] [name "enter"] [value "Enter"])))
the-number)
#:method
"POST"
#:wrap
(lambda (f-expr)
`(html (head (title "Enter a Number to Add"))
(body ([bgcolor "white"])
,f-expr)))))
(define (start initial-request)
(response/xexpr
`(html (head (title "Sum"))
(body ([bgcolor "white"])
(p "The answer is "
,(number->string (+ (request-number "first") (request-number "second"))))))))
|
4a7c69c7ff65454c23df4c5a617f00db6b070322cf1b5707f623a3df4ef86f8f | taybin/lethink | lethink_sup.erl | @private
-module(lethink_sup).
-behaviour(supervisor).
%% API
-export([start_link/0]).
%% Supervisor callbacks
-export([init/1]).
%% Helper macro for declaring children of supervisor
-define(CHILD(I, Type), {I, {I, start_link, []}, permanent, 5000, Type, [I]}).
%% ===================================================================
%% API functions
%% ===================================================================
-spec start_link() -> any().
start_link() ->
supervisor:start_link({local, ?MODULE}, ?MODULE, []).
%% ===================================================================
%% Supervisor callbacks
%% ===================================================================
-spec init([]) -> {ok,{{supervisor:strategy(),non_neg_integer(), non_neg_integer()},[supervisor:child_spec()]}}.
init([]) ->
lethink_server = ets:new(lethink_server, [
ordered_set, public, named_table, {read_concurrency, true}]),
Procs = [
{lethink_server, {lethink_server, start_link, []},
permanent, 5000, worker, [lethink_server]}
],
{ok, {{one_for_one, 10, 10}, Procs}}.
| null | https://raw.githubusercontent.com/taybin/lethink/f90986dde34f5910c82d5e6bb5e541d7a6fd0c03/src/lethink_sup.erl | erlang | API
Supervisor callbacks
Helper macro for declaring children of supervisor
===================================================================
API functions
===================================================================
===================================================================
Supervisor callbacks
=================================================================== | @private
-module(lethink_sup).
-behaviour(supervisor).
-export([start_link/0]).
-export([init/1]).
-define(CHILD(I, Type), {I, {I, start_link, []}, permanent, 5000, Type, [I]}).
-spec start_link() -> any().
start_link() ->
supervisor:start_link({local, ?MODULE}, ?MODULE, []).
-spec init([]) -> {ok,{{supervisor:strategy(),non_neg_integer(), non_neg_integer()},[supervisor:child_spec()]}}.
init([]) ->
lethink_server = ets:new(lethink_server, [
ordered_set, public, named_table, {read_concurrency, true}]),
Procs = [
{lethink_server, {lethink_server, start_link, []},
permanent, 5000, worker, [lethink_server]}
],
{ok, {{one_for_one, 10, 10}, Procs}}.
|
4a5cf63b944f74300c4f861cfa54900799a1d52ef3ee899b624cbd8d6b4e2137 | tezos/tezos-mirror | node.mli | (*****************************************************************************)
(* *)
(* Open Source License *)
Copyright ( c ) 2018 Dynamic Ledger Solutions , Inc. < >
Copyright ( c ) 2018 - 2021 Nomadic Labs , < >
(* *)
(* Permission is hereby granted, free of charge, to any person obtaining a *)
(* copy of this software and associated documentation files (the "Software"),*)
to deal in the Software without restriction , including without limitation
(* the rights to use, copy, modify, merge, publish, distribute, sublicense, *)
and/or sell copies of the Software , and to permit persons to whom the
(* Software is furnished to do so, subject to the following conditions: *)
(* *)
(* The above copyright notice and this permission notice shall be included *)
(* in all copies or substantial portions of the Software. *)
(* *)
THE SOFTWARE IS PROVIDED " AS IS " , WITHOUT WARRANTY OF ANY KIND , EXPRESS OR
(* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, *)
(* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL *)
(* THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER*)
LIABILITY , WHETHER IN AN ACTION OF CONTRACT , TORT OR OTHERWISE , ARISING
(* FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER *)
(* DEALINGS IN THE SOFTWARE. *)
(* *)
(*****************************************************************************)
type t
type config = {
genesis : Genesis.t;
chain_name : Distributed_db_version.Name.t;
sandboxed_chain_name : Distributed_db_version.Name.t;
user_activated_upgrades : User_activated.upgrades;
user_activated_protocol_overrides : User_activated.protocol_overrides;
operation_metadata_size_limit : Shell_limits.operation_metadata_size_limit;
data_dir : string;
external_validator_log_config : External_validation.log_config;
store_root : string;
context_root : string;
protocol_root : string;
patch_context :
(Tezos_protocol_environment.Context.t ->
Tezos_protocol_environment.Context.t tzresult Lwt.t)
option;
p2p : (P2p.config * P2p_limits.t) option;
target : (Block_hash.t * int32) option;
disable_mempool : bool;
(** If [true], all non-empty mempools will be ignored. *)
enable_testchain : bool;
(** If [false], testchain related messages will be ignored. *)
dal : Tezos_crypto_dal.Cryptobox.Config.t;
}
val create :
?sandboxed:bool ->
?sandbox_parameters:Data_encoding.json ->
singleprocess:bool ->
config ->
Shell_limits.peer_validator_limits ->
Shell_limits.block_validator_limits ->
Shell_limits.prevalidator_limits ->
Shell_limits.chain_validator_limits ->
History_mode.t option ->
t tzresult Lwt.t
val shutdown : t -> unit Lwt.t
val build_rpc_directory : t -> unit Tezos_rpc.Directory.t
| null | https://raw.githubusercontent.com/tezos/tezos-mirror/213f7dc936ae6c63d4b20b2ea895fcffcefd8002/src/lib_shell/node.mli | ocaml | ***************************************************************************
Open Source License
Permission is hereby granted, free of charge, to any person obtaining a
copy of this software and associated documentation files (the "Software"),
the rights to use, copy, modify, merge, publish, distribute, sublicense,
Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included
in all copies or substantial portions of the Software.
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
DEALINGS IN THE SOFTWARE.
***************************************************************************
* If [true], all non-empty mempools will be ignored.
* If [false], testchain related messages will be ignored. | Copyright ( c ) 2018 Dynamic Ledger Solutions , Inc. < >
Copyright ( c ) 2018 - 2021 Nomadic Labs , < >
to deal in the Software without restriction , including without limitation
and/or sell copies of the Software , and to permit persons to whom the
THE SOFTWARE IS PROVIDED " AS IS " , WITHOUT WARRANTY OF ANY KIND , EXPRESS OR
LIABILITY , WHETHER IN AN ACTION OF CONTRACT , TORT OR OTHERWISE , ARISING
type t
type config = {
genesis : Genesis.t;
chain_name : Distributed_db_version.Name.t;
sandboxed_chain_name : Distributed_db_version.Name.t;
user_activated_upgrades : User_activated.upgrades;
user_activated_protocol_overrides : User_activated.protocol_overrides;
operation_metadata_size_limit : Shell_limits.operation_metadata_size_limit;
data_dir : string;
external_validator_log_config : External_validation.log_config;
store_root : string;
context_root : string;
protocol_root : string;
patch_context :
(Tezos_protocol_environment.Context.t ->
Tezos_protocol_environment.Context.t tzresult Lwt.t)
option;
p2p : (P2p.config * P2p_limits.t) option;
target : (Block_hash.t * int32) option;
disable_mempool : bool;
enable_testchain : bool;
dal : Tezos_crypto_dal.Cryptobox.Config.t;
}
val create :
?sandboxed:bool ->
?sandbox_parameters:Data_encoding.json ->
singleprocess:bool ->
config ->
Shell_limits.peer_validator_limits ->
Shell_limits.block_validator_limits ->
Shell_limits.prevalidator_limits ->
Shell_limits.chain_validator_limits ->
History_mode.t option ->
t tzresult Lwt.t
val shutdown : t -> unit Lwt.t
val build_rpc_directory : t -> unit Tezos_rpc.Directory.t
|
2ac1238cf55bd046b75224a536537855411f735cc745d7aeae1bb1e44d2f1c44 | metabase/metabase | common.clj | (ns metabase.sync.sync-metadata.fields.common
"Schemas and functions shared by different `metabase.sync.sync-metadata.fields.*` namespaces."
(:require
[metabase.sync.interface :as i]
[metabase.sync.util :as sync-util]
[metabase.util :as u]
[metabase.util.i18n :refer [trs]]
[metabase.util.schema :as su]
[schema.core :as s]))
(def ParentID
"Schema for the `parent-id` of a Field, i.e. an optional ID."
(s/maybe su/IntGreaterThanZero))
(def TableMetadataFieldWithID
"Schema for `TableMetadataField` with an included ID of the corresponding Metabase Field object.
`our-metadata` is always returned in this format. (The ID is needed in certain places so we know which Fields to
retire, and the parent ID of any nested-fields.)"
(assoc i/TableMetadataField
:id su/IntGreaterThanZero
(s/optional-key :nested-fields) #{(s/recursive #'TableMetadataFieldWithID)}))
(def TableMetadataFieldWithOptionalID
"Schema for either `i/TableMetadataField` (`db-metadata`) or `TableMetadataFieldWithID` (`our-metadata`)."
(assoc i/TableMetadataField
(s/optional-key :id) su/IntGreaterThanZero
(s/optional-key :nested-fields) #{(s/recursive #'TableMetadataFieldWithOptionalID)}))
(s/defn field-metadata-name-for-logging :- s/Str
"Return a 'name for logging' for a map that conforms to the `TableMetadataField` schema.
- > \"Table ' venues ' Field ' name'\ " "
[table :- i/TableInstance field-metadata :- TableMetadataFieldWithOptionalID]
(format "%s %s '%s'" (sync-util/name-for-logging table) (trs "Field") (:name field-metadata)))
(defn canonical-name
"Return the lower-cased 'canonical' name that should be used to uniquely identify `field` -- this is done to ignore
case differences when syncing, e.g. we will consider `field` and `field` to mean the same thing."
[field]
(u/lower-case-en (:name field)))
(s/defn semantic-type :- (s/maybe su/FieldSemanticOrRelationType)
"Determine a the appropriate `semantic-type` for a Field with `field-metadata`."
[field-metadata :- (s/maybe i/TableMetadataField)]
(and field-metadata
(or (:semantic-type field-metadata)
(when (:pk? field-metadata) :type/PK))))
(s/defn matching-field-metadata :- (s/maybe TableMetadataFieldWithOptionalID)
"Find Metadata that matches `field-metadata` from a set of `other-metadata`, if any exists. Useful for finding the
corresponding Metabase Field for field metadata from the DB, or vice versa."
[field-metadata :- TableMetadataFieldWithOptionalID
other-metadata :- #{TableMetadataFieldWithOptionalID}]
(some
(fn [other-field-metadata]
(when (= (canonical-name field-metadata)
(canonical-name other-field-metadata))
other-field-metadata))
other-metadata))
| null | https://raw.githubusercontent.com/metabase/metabase/e6aad38a1fd2ed55f61bf9c0b1ae3b3e57f543b1/src/metabase/sync/sync_metadata/fields/common.clj | clojure | (ns metabase.sync.sync-metadata.fields.common
"Schemas and functions shared by different `metabase.sync.sync-metadata.fields.*` namespaces."
(:require
[metabase.sync.interface :as i]
[metabase.sync.util :as sync-util]
[metabase.util :as u]
[metabase.util.i18n :refer [trs]]
[metabase.util.schema :as su]
[schema.core :as s]))
(def ParentID
"Schema for the `parent-id` of a Field, i.e. an optional ID."
(s/maybe su/IntGreaterThanZero))
(def TableMetadataFieldWithID
"Schema for `TableMetadataField` with an included ID of the corresponding Metabase Field object.
`our-metadata` is always returned in this format. (The ID is needed in certain places so we know which Fields to
retire, and the parent ID of any nested-fields.)"
(assoc i/TableMetadataField
:id su/IntGreaterThanZero
(s/optional-key :nested-fields) #{(s/recursive #'TableMetadataFieldWithID)}))
(def TableMetadataFieldWithOptionalID
"Schema for either `i/TableMetadataField` (`db-metadata`) or `TableMetadataFieldWithID` (`our-metadata`)."
(assoc i/TableMetadataField
(s/optional-key :id) su/IntGreaterThanZero
(s/optional-key :nested-fields) #{(s/recursive #'TableMetadataFieldWithOptionalID)}))
(s/defn field-metadata-name-for-logging :- s/Str
"Return a 'name for logging' for a map that conforms to the `TableMetadataField` schema.
- > \"Table ' venues ' Field ' name'\ " "
[table :- i/TableInstance field-metadata :- TableMetadataFieldWithOptionalID]
(format "%s %s '%s'" (sync-util/name-for-logging table) (trs "Field") (:name field-metadata)))
(defn canonical-name
"Return the lower-cased 'canonical' name that should be used to uniquely identify `field` -- this is done to ignore
case differences when syncing, e.g. we will consider `field` and `field` to mean the same thing."
[field]
(u/lower-case-en (:name field)))
(s/defn semantic-type :- (s/maybe su/FieldSemanticOrRelationType)
"Determine a the appropriate `semantic-type` for a Field with `field-metadata`."
[field-metadata :- (s/maybe i/TableMetadataField)]
(and field-metadata
(or (:semantic-type field-metadata)
(when (:pk? field-metadata) :type/PK))))
(s/defn matching-field-metadata :- (s/maybe TableMetadataFieldWithOptionalID)
"Find Metadata that matches `field-metadata` from a set of `other-metadata`, if any exists. Useful for finding the
corresponding Metabase Field for field metadata from the DB, or vice versa."
[field-metadata :- TableMetadataFieldWithOptionalID
other-metadata :- #{TableMetadataFieldWithOptionalID}]
(some
(fn [other-field-metadata]
(when (= (canonical-name field-metadata)
(canonical-name other-field-metadata))
other-field-metadata))
other-metadata))
| |
ed64c394ba23c393661a0e328b1ec2b3f564a7bf66cbfc18d53abb48e0308ed8 | mmottl/lacaml | utils.ml | File : utils.ml
Copyright ( C ) 2001-
email :
WWW :
email :
WWW : /~liam
email :
WWW : /
email :
WWW : none
This library is free software ; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation ; either
version 2.1 of the License , or ( at your option ) any later version .
This library is distributed in the hope that it will be useful ,
but WITHOUT ANY WARRANTY ; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the GNU
Lesser General Public License for more details .
You should have received a copy of the GNU Lesser General Public
License along with this library ; if not , write to the Free Software
Foundation , Inc. , 51 Franklin Street , Fifth Floor , Boston , MA 02110 - 1301 USA
Copyright (C) 2001-
Markus Mottl
email:
WWW:
Liam Stewart
email:
WWW: /~liam
Christophe Troestler
email:
WWW: /
Florent Hoareau
email:
WWW: none
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
*)
(** General auxiliary functions *)
open Printf
open Bigarray
open Common
Zero - sized dummy vector ( int )
let empty_int32_vec = create_int32_vec 0
indicating type of norm to retrieve for XlanYY routines
let get_norm_char = function `M -> 'M' | `O -> 'O' | `I -> 'I' | `F -> 'F'
indicating whether the " U"pper or " L"ower triangle of a matrix
is stored
is stored *)
let get_uplo_char up = if up then 'U' else 'L'
indicating whether some operation operates on a " N"ormal ,
" T"ransposed or " C"onjugated transposed matrix .
"T"ransposed or "C"onjugated transposed matrix. *)
let get_trans_char = function `N -> 'N' | `T -> 'T' | `C -> 'C'
indicating which side of the matrix B matrix A should be on
let get_side_char = function `L -> 'L' | `R -> 'R'
indicating whether a diagonal is unit or non - unit
let get_diag_char = function `U -> 'U' | `N -> 'N'
indicating whether / how the left / right singular vectors
should be computed
should be computed *)
let get_s_d_job_char = function `A -> 'A' | `S -> 'S' | `O -> 'O' | `N -> 'N'
indicating whether the eigen"V"ectors are computed or " N"ot
let get_job_char = function true -> 'V' | _ -> 'N'
let job_char_true = get_job_char true
let job_char_false = get_job_char false
(** Preallocated strings (names) *)
let a_str = "a"
let ab_str = "ab"
let alphas_str = "alphas"
let ap_str = "ap"
let b_str = "b"
let br_str = "br"
let bc_str = "bc"
let c_str = "c"
let cr_str = "cr"
let cc_str = "cc"
let d_str = "d"
let dl_str = "dl"
let du_str = "du"
let e_str = "e"
let ipiv_str = "ipiv"
let iseed_str = "iseed"
let k_str = "k"
let ka_str = "ka"
let kb_str = "kb"
let work_str = "work"
let lwork_str = "lwork"
let liwork_str = "liwork"
let k1_str = "k1"
let k2_str = "k2"
let kd_str = "kd"
let kl_str = "kl"
let ku_str = "ku"
let m_str = "m"
let n_str = "n"
let nrhs_str = "nrhs"
let ofs_str = "ofs"
let r_str = "r"
let s_str = "s"
let tau_str = "tau"
let u_str = "u"
let um_str = "um"
let un_str = "un"
let vm_str = "vm"
let vn_str = "vn"
let vs_str = "vs"
let vsr_str = "vsr"
let vsc_str = "vsc"
let vt_str = "vt"
let w_str = "w"
let wi_str = "wi"
let wr_str = "wr"
let x_str = "x"
let y_str = "y"
let z_str = "z"
(** Range checking *)
* [ var ] @raise Invalid_argument to indicate
that integer variable [ var ] with name [ name ] at location [ loc ] is lower
than [ 0 ] .
that integer variable [var] with name [name] at location [loc] is lower
than [0]. *)
let raise_var_lt0 ~loc ~name var =
invalid_arg (sprintf "%s: %s < 0: %d" loc name var)
* [ check_var_lt0 ~loc ~name var ] checks whether integer variable [ var ] with
name [ name ] at location [ loc ] is lower than [ 0 ] . @raise Invalid_argument
in that case .
name [name] at location [loc] is lower than [0]. @raise Invalid_argument
in that case. *)
let check_var_lt0 ~loc ~name var = if var < 0 then raise_var_lt0 ~loc ~name var
let check_var_within loc var_name var lb ub c =
if var < lb then
invalid_arg (sprintf "%s: %s %s < %s" loc var_name (c var) (c lb))
else if var > ub then
invalid_arg (sprintf "%s: %s %s > %s" loc var_name (c var) (c ub))
else ()
(** Valueless vector checking and allocation functions (do not require a
vector value as argument *)
(** [calc_vec_min_dim ~n ~ofs ~inc] @return minimum vector dimension given
offset [ofs], increment [inc], and operation size [n] for a vector. *)
let calc_vec_min_dim ~n ~ofs ~inc =
if n = 0 then ofs - 1 else ofs + (n - 1) * abs inc
* [ raise_vec_min_dim ~loc ~vec_name ~dim ~min_dim ] @raise Invalid_argument
to indicate that dimension [ dim ] of a vector with name [ vec_name ]
exceeds the minimum [ min_dim ] at location [ loc ] .
to indicate that dimension [dim] of a vector with name [vec_name]
exceeds the minimum [min_dim] at location [loc]. *)
let raise_vec_min_dim ~loc ~vec_name ~dim ~min_dim =
invalid_arg (
sprintf "%s: dim(%s): valid=[%d..[ got=%d" loc vec_name min_dim dim)
* [ check_vec_min_dim ~loc ~vec_name ~dim ~min_dim ] checks whether vector
with name [ vec_name ] and dimension [ dim ] satisfies minimum dimension
[ min_dim ] . @raise Invalid_argument otherwise .
with name [vec_name] and dimension [dim] satisfies minimum dimension
[min_dim]. @raise Invalid_argument otherwise. *)
let check_vec_min_dim ~loc ~vec_name ~dim ~min_dim =
if dim < min_dim then raise_vec_min_dim ~loc ~vec_name ~dim ~min_dim
* [ raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs ] @raise Invalid_argument
to indicate that vector offset [ ofs ] is invalid ( i.e. is outside of
[ 1 .. max_ofs ] ) .
to indicate that vector offset [ofs] is invalid (i.e. is outside of
[1..max_ofs]). *)
let raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs =
invalid_arg (
sprintf "%s: ofs%s: valid=[1..%d] got=%d" loc vec_name max_ofs ofs)
* [ bad_n ~n ~max_n ] @return [ true ] iff [ n ] is smaller than zero or larger
than [ max_n ] .
than [max_n]. *)
let bad_n ~n ~max_n = n < 0 || n > max_n
* [ bad_ofs ~ofs ~max_ofs ] @return [ true ] iff [ ofs ] is smaller than one or
exceeds [ max_ofs ] .
exceeds [max_ofs]. *)
let bad_ofs ~ofs ~max_ofs = ofs < 1 || ofs > max_ofs
(** [bad_inc inc] @return [true] iff [inc] is illegal. *)
let bad_inc inc = inc = 0
* [ check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs ] checks whether vector
offset [ ofs ] for vector of name [ vec_name ] is invalid ( i.e. outside of
[ 1 .. max_ofs ] ) . @raise Invalid_argument in that case .
offset [ofs] for vector of name [vec_name] is invalid (i.e. outside of
[1..max_ofs]). @raise Invalid_argument in that case. *)
let check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs =
if bad_ofs ~ofs ~max_ofs then raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs
(** [check_vec_inc ~loc ~vec_name inc] checks whether vector increment [inc]
for vector of name [vec_name] is invalid (i.e. [0]). @raise
Invalid_argument in that case. *)
let check_vec_inc ~loc ~vec_name inc =
if bad_inc inc then invalid_arg (sprintf "%s: inc%s = 0" loc vec_name)
* [ calc_vec_max_n ~dim ~ofs ~inc ] @return maximum operation length [ n ]
for a vector given the dimension [ dim ] of the vector , the offset [ ofs ] ,
and increment [ inc ] . Assumes that the offset has already been validated
to not exceed [ dim ] , i.e. the returned [ max_n ] is at least [ 1 ] .
for a vector given the dimension [dim] of the vector, the offset [ofs],
and increment [inc]. Assumes that the offset has already been validated
to not exceed [dim], i.e. the returned [max_n] is at least [1]. *)
let calc_vec_max_n ~dim ~ofs ~inc = 1 + (dim - ofs) / abs inc
* [ calc_vec_opt_max_n ? ofs ? inc dim ] @return maximum operation length [ n ]
for a vector given the dimension [ dim ] of the vector , the optional offset
[ ofs ] , and optional increment [ inc ] . Assumes that the offset has already
been validated to not exceed [ dim ] , i.e. the returned [ max_n ] is at least
[ 1 ] .
for a vector given the dimension [dim] of the vector, the optional offset
[ofs], and optional increment [inc]. Assumes that the offset has already
been validated to not exceed [dim], i.e. the returned [max_n] is at least
[1]. *)
let calc_vec_opt_max_n ?(ofs = 1) ?(inc = 1) dim = calc_vec_max_n ~dim ~ofs ~inc
(** [raise_max_len ~loc ~len_name ~len ~max_len] @raise Invalid_argument
that the maximum operation size (e.g. [m] or [n] for vectors and matrices)
has been exceeded. *)
let raise_max_len ~loc ~len_name ~len ~max_len =
invalid_arg (sprintf "%s: %s: valid=[0..%d] got=%d" loc len_name max_len len)
* [ check_vec_dim ~loc ~vec_name ~dim ~ofs ~inc ~n_name ~n ] checks the vector
operation length in parameter [ n ] with name [ n_name ] at location [ loc ]
for vector with name [ vec_name ] and dimension [ dim ] given the operation
offset [ ofs ] and increment [ inc ] . @raise Invalid_argument if any
arguments are invalid .
operation length in parameter [n] with name [n_name] at location [loc]
for vector with name [vec_name] and dimension [dim] given the operation
offset [ofs] and increment [inc]. @raise Invalid_argument if any
arguments are invalid. *)
let check_vec_dim ~loc ~vec_name ~dim ~ofs ~inc ~n_name ~n =
check_vec_inc ~loc ~vec_name inc;
check_var_lt0 ~loc ~name:n_name n;
if n = 0 then check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs:(dim + 1)
else begin
check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs:dim;
let max_n = calc_vec_max_n ~dim ~ofs ~inc in
if n > max_n then raise_max_len ~loc ~len_name:n_name ~len:n ~max_len:max_n
end
* [ get_vec_n ~loc ~vec_name ~dim ~ofs ~inc ~n_name n ] checks or infers
the vector operation length in the option parameter [ n ] with name [ n_name ]
at location [ loc ] for vector with name [ vec_name ] and dimension [ dim ] given
the operation offset [ ofs ] and increment [ inc ] . @raise Invalid_argument
if any arguments are invalid .
the vector operation length in the option parameter [n] with name [n_name]
at location [loc] for vector with name [vec_name] and dimension [dim] given
the operation offset [ofs] and increment [inc]. @raise Invalid_argument
if any arguments are invalid. *)
let get_vec_n ~loc ~vec_name ~dim ~ofs ~inc ~n_name = function
| None when dim = 0 ->
check_vec_inc ~loc ~vec_name inc;
if ofs = 1 then dim else raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs:1
| None ->
check_vec_inc ~loc ~vec_name inc;
if ofs = dim + 1 then 0
else begin
check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs:dim;
calc_vec_max_n ~dim ~ofs ~inc
end
| Some n -> check_vec_dim ~loc ~vec_name ~dim ~ofs ~inc ~n_name ~n; n
* [ get_vec_min_dim ~loc ~vec_name ~ofs ~inc ~n ] @return minimum vector
dimension given offset [ ofs ] , increment [ inc ] , and operation size [ n ]
for a vector named [ vec_name ] at location [ loc ] . @raise Invalid_argument
if any of the parameters are illegal .
dimension given offset [ofs], increment [inc], and operation size [n]
for a vector named [vec_name] at location [loc]. @raise Invalid_argument
if any of the parameters are illegal. *)
let get_vec_min_dim ~loc ~vec_name ~ofs ~inc ~n =
check_vec_inc ~loc ~vec_name inc;
if ofs >= 1 then calc_vec_min_dim ~ofs ~inc ~n
else invalid_arg (sprintf "%s: ofs%s: valid=[1..] got=%d" loc vec_name ofs)
(** [get_vec_start_stop ~ofsx ~incx ~n] @return [(start, stop)] where [start]
and [stop] reflect the start and stop of an iteration respectively. *)
let get_vec_start_stop ~ofsx ~incx ~n =
if n = 0 then 0, 0
else
if incx > 0 then ofsx, ofsx + n * incx
else ofsx - (n - 1) * incx, ofsx + incx
(** Valueless matrix checking and allocation functions (do not require a
matrix value as argument *)
* [ raise_bad_mat_ofs ~loc ~name ~ofs_name ~ofs ~max_ofs ] @raise
Invalid_argument to indicate that a matrix offset [ ofs ] named [ ofs_name ]
for a matrix having [ name ] is invalid ( i.e. is outside of [ 1 .. max_ofs ] ) .
Invalid_argument to indicate that a matrix offset [ofs] named [ofs_name]
for a matrix having [name] is invalid (i.e. is outside of [1..max_ofs]). *)
let raise_bad_mat_ofs ~loc ~name ~ofs_name ~ofs ~max_ofs =
invalid_arg (
sprintf "%s: %s%s: valid=[1..%d] got=%d" loc name ofs_name max_ofs ofs)
* [ raise_mat_bad_r ~loc ~mat_name ~r ~max_r ] @raise Invalid_argument
to indicate that matrix row offset [ r ] is invalid ( i.e. is outside of
[ 1 .. max_r ] ) .
to indicate that matrix row offset [r] is invalid (i.e. is outside of
[1..max_r]). *)
let raise_mat_bad_r ~loc ~mat_name ~r ~max_r =
raise_bad_mat_ofs ~loc ~name:mat_name ~ofs_name:r_str ~ofs:r ~max_ofs:max_r
* [ raise_mat_bad_c ~loc ~mat_name ~c ~max_c ] @raise Invalid_argument
to indicate that matrix column offset [ c ] is invalid ( i.e. is outside of
[ 1 .. max_c ] ) .
to indicate that matrix column offset [c] is invalid (i.e. is outside of
[1..max_c]). *)
let raise_mat_bad_c ~loc ~mat_name ~c ~max_c =
raise_bad_mat_ofs ~loc ~name:mat_name ~ofs_name:c_str ~ofs:c ~max_ofs:max_c
* [ check_mat_r ~loc ~vec_name ~r ~max_r ] checks whether matrix row
offset [ r ] for vector of name [ vec_name ] is invalid ( i.e. outside of
[ 1 .. max_r ] ) . @raise Invalid_argument in that case .
offset [r] for vector of name [vec_name] is invalid (i.e. outside of
[1..max_r]). @raise Invalid_argument in that case. *)
let check_mat_r ~loc ~mat_name ~r ~max_r =
if r < 1 || r > max_r then raise_mat_bad_r ~loc ~mat_name ~r ~max_r
* [ check_mat_c ~loc ~vec_name ~c ~max_c ] checks whether matrix column
offset [ c ] for vector of name [ vec_name ] is invalid ( i.e. outside of
[ 1 .. max_c ] ) . @raise Invalid_argument in that case .
offset [c] for vector of name [vec_name] is invalid (i.e. outside of
[1..max_c]). @raise Invalid_argument in that case. *)
let check_mat_c ~loc ~mat_name ~c ~max_c =
if c < 1 || c > max_c then raise_mat_bad_c ~loc ~mat_name ~c ~max_c
* [ calc_mat_max_rows ~dim1 ~r ] @return maximum row operation length [ m ] for a
matrix given the dimension [ dim1 ] of the matrix and the start row [ r ] .
matrix given the dimension [dim1] of the matrix and the start row [r]. *)
let calc_mat_max_rows ~dim1 ~r = dim1 - r + 1
* [ calc_mat_opt_max_rows ? r dim1 ] @return maximum row operation length
[ m ] for a matrix given the dimension [ dim1 ] of the matrix and the optional
start row [ r ] . Assumes that the offset has already been validated to
not exceed [ dim1 ] , i.e. the returned [ max_m ] is at least [ 1 ] .
[m] for a matrix given the dimension [dim1] of the matrix and the optional
start row [r]. Assumes that the offset has already been validated to
not exceed [dim1], i.e. the returned [max_m] is at least [1]. *)
let calc_mat_opt_max_rows ?(r = 1) dim1 = calc_mat_max_rows ~dim1 ~r
* [ calc_mat_max_cols ~dim2 ~c ] @return maximum column operation length
[ n ] for a matrix given the dimension [ dim1 ] of the matrix and the start
column [ c ] .
[n] for a matrix given the dimension [dim1] of the matrix and the start
column [c]. *)
let calc_mat_max_cols ~dim2 ~c = dim2 - c + 1
* [ calc_mat_opt_max_cols ? c dim1 ] @return maximum column operation length
[ m ] for a matrix given the dimension [ dim2 ] of the matrix and the optional
start column [ c ] . Assumes that the offset has already been validated to
not exceed [ dim2 ] , i.e. the returned [ max_n ] is at least [ 1 ] .
[m] for a matrix given the dimension [dim2] of the matrix and the optional
start column [c]. Assumes that the offset has already been validated to
not exceed [dim2], i.e. the returned [max_n] is at least [1]. *)
let calc_mat_opt_max_cols ?(c = 1) dim2 = calc_mat_max_cols ~dim2 ~c
* [ check_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name ] checks the matrix
row operation length in parameter [ p ] with name [ param_name ] at
location [ loc ] for matrix with name [ mat_name ] and dimension [ dim1 ]
given the operation row [ r ] . @raise Invalid_argument if any arguments
are invalid .
row operation length in parameter [p] with name [param_name] at
location [loc] for matrix with name [mat_name] and dimension [dim1]
given the operation row [r]. @raise Invalid_argument if any arguments
are invalid. *)
let check_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name =
check_var_lt0 ~loc ~name:param_name p;
if p = 0 then check_mat_r ~loc ~mat_name ~r ~max_r:(dim1 + 1)
else begin
check_mat_r ~loc ~mat_name ~r ~max_r:dim1;
let max_rows = calc_mat_max_rows ~dim1 ~r in
if p > max_rows then
raise_max_len ~loc ~len_name:param_name ~len:p ~max_len:max_rows
end
* [ check_mat_m ~loc ~mat_name ~dim1 ~r ~m ] checks the matrix row operation
length in parameter [ m ] at location [ loc ] for matrix with name [ mat_name ]
and dimension [ dim1 ] given the operation row [ r ] . @raise Invalid_argument
if any arguments are invalid .
length in parameter [m] at location [loc] for matrix with name [mat_name]
and dimension [dim1] given the operation row [r]. @raise Invalid_argument
if any arguments are invalid. *)
let check_mat_m ~loc ~mat_name ~dim1 ~r ~m =
check_mat_rows ~loc ~mat_name ~dim1 ~r ~p:m ~param_name:m_str
(** [check_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name] checks the
matrix column operation length in parameter [p] with name [param_name]
at location [loc] for matrix with name [mat_name] and dimension [dim2]
given the operation column [c]. @raise Invalid_argument if any arguments
are invalid. *)
let check_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name =
check_var_lt0 ~loc ~name:param_name p;
if p = 0 then check_mat_c ~loc ~mat_name ~c ~max_c:(dim2 + 1)
else begin
check_mat_c ~loc ~mat_name ~c ~max_c:dim2;
let max_cols = calc_mat_max_cols ~dim2 ~c in
if p > max_cols then
raise_max_len ~loc ~len_name:param_name ~len:p ~max_len:max_cols
end
(** [check_mat_n ~loc ~mat_name ~dim2 ~c ~n] checks the matrix column
operation length in parameter [n] at location [loc] for matrix with
name [mat_name] and dimension [dim2] given the operation column [c].
@raise Invalid_argument if any arguments are invalid. *)
let check_mat_n ~loc ~mat_name ~dim2 ~c ~n =
check_mat_cols ~loc ~mat_name ~dim2 ~c ~p:n ~param_name:n_str
* [ check_mat_mn ~loc ~mat_name ~dim1 ~c ~m ~n ] checks the matrix
operation lengths in parameters [ m ] and [ n ] at location [ loc ] for matrix
with name [ mat_name ] and dimensions [ dim1 ] and [ dim2 ] given the operation
row [ r ] and column [ c ] . @raise Invalid_argument if any arguments are
invalid .
operation lengths in parameters [m] and [n] at location [loc] for matrix
with name [mat_name] and dimensions [dim1] and [dim2] given the operation
row [r] and column [c]. @raise Invalid_argument if any arguments are
invalid. *)
let check_mat_mn ~loc ~mat_name ~dim1 ~dim2 ~r ~c ~m ~n =
check_mat_m ~loc ~mat_name ~dim1 ~r ~m;
check_mat_n ~loc ~mat_name ~dim2 ~c ~n
* [ get_mat_rows ~loc ~mat_name ~dim1 ~r p ~param_name ] checks or infers
the matrix row operation length in the option parameter [ p ] with
name [ param_name ] at location [ loc ] for matrix with name [ mat_name ]
and dimension [ dim1 ] given the row operation offset [ r ] . @raise
Invalid_argument if any arguments are invalid .
the matrix row operation length in the option parameter [p] with
name [param_name] at location [loc] for matrix with name [mat_name]
and dimension [dim1] given the row operation offset [r]. @raise
Invalid_argument if any arguments are invalid. *)
let get_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name =
match p with
| None when dim1 = 0 ->
if r = 1 then dim1 else raise_mat_bad_r ~loc ~mat_name ~r ~max_r:1
| None ->
let max_r = dim1 + 1 in
check_mat_r ~loc ~mat_name ~r ~max_r;
max_r - r
| Some p -> check_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name; p
* [ get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m ~m_name ] checks or infers the
matrix row operation length in the option parameter [ m ] with name [ m_name ]
at location [ loc ] for matrix with name [ mat_name ] and dimension [ dim1 ]
given the row operation offset [ r ] . @raise Invalid_argument if any
arguments are invalid .
matrix row operation length in the option parameter [m] with name [m_name]
at location [loc] for matrix with name [mat_name] and dimension [dim1]
given the row operation offset [r]. @raise Invalid_argument if any
arguments are invalid. *)
let get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m ~m_name =
get_mat_rows ~loc ~mat_name ~dim1 ~r ~p:m ~param_name:m_name
* [ get_mat_m ~loc ~mat_name ~dim1 ~r ~m ] checks or infers the matrix row
operation length in the option parameter [ m ] at location [ loc ] for matrix
with name [ mat_name ] and dimension [ dim1 ] given the row operation offset
[ r ] . @raise Invalid_argument if any arguments are invalid .
operation length in the option parameter [m] at location [loc] for matrix
with name [mat_name] and dimension [dim1] given the row operation offset
[r]. @raise Invalid_argument if any arguments are invalid. *)
let get_mat_m ~loc ~mat_name ~dim1 ~r ~m =
get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m_name:m_str ~m
(** [get_mat_cols ~loc ~mat_name ~dim2 ~c ~param_name p] checks or infers
the matrix column operation length in the option parameter [p] with
name [param_name] at location [loc] for matrix with name [mat_name]
and dimension [dim2] given the column operation offset [c]. @raise
Invalid_argument if any arguments are invalid. *)
let get_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name =
match p with
| None when dim2 = 0 ->
if c = 1 then dim2 else raise_mat_bad_c ~loc ~mat_name ~c ~max_c:1
| None ->
let max_c = dim2 + 1 in
check_mat_c ~loc ~mat_name ~c ~max_c;
max_c - c
| Some p -> check_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name; p
(** [get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name] checks or infers the
matrix column operation length in the option parameter [n] with name
[n_name] at location [loc] for matrix with name [mat_name] and dimension
[dim2] given the column operation offset [c]. @raise Invalid_argument
if any arguments are invalid. *)
let get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name =
get_mat_cols ~loc ~mat_name ~dim2 ~c ~p:n ~param_name:n_name
(** [get_mat_n ~loc ~mat_name ~dim2 ~c ~n] checks or infers the matrix column
operation length in the option parameter [n] at location [loc] for matrix
with name [mat_name] and dimension [dim2] given the column operation
offset [c]. @raise Invalid_argument if any arguments are invalid. *)
let get_mat_n ~loc ~mat_name ~dim2 ~c ~n =
get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name:n_str
* [ get_mat_min_dim1 ~loc ~mat_name ~r ~m ] @return the minimum row dimension
of a matrix with name [ mat_name ] at location [ loc ] given row [ r ] and
row operation length [ m ] . @raise Invalid_argument if any arguments
are invalid .
of a matrix with name [mat_name] at location [loc] given row [r] and
row operation length [m]. @raise Invalid_argument if any arguments
are invalid. *)
let get_mat_min_dim1 ~loc ~mat_name ~r ~m =
if r > 0 then r + m - 1
else invalid_arg (sprintf "%s: %sr < 1: %d" loc mat_name r)
(** [get_mat_min_dim2 ~loc ~mat_name ~c ~n] @return the minimum column
dimension of a matrix with name [mat_name] at location [loc] given column
[c] and row operation length [n]. @raise Invalid_argument if any
arguments are invalid. *)
let get_mat_min_dim2 ~loc ~mat_name ~c ~n =
if c > 0 then c + n - 1
else invalid_arg (sprintf "%s: %sc < 1: %d" loc mat_name c)
* [ check_mat_min_dim1 ~loc ~mat_name ~dim1 ~min_dim1 ] checks the minimum
row dimension [ min_dim1 ] of a matrix with name [ mat_name ] at location
[ loc ] given its row dimension [ dim1 ] . @raise Invalid_argument if
any arguments are invalid .
row dimension [min_dim1] of a matrix with name [mat_name] at location
[loc] given its row dimension [dim1]. @raise Invalid_argument if
any arguments are invalid. *)
let check_mat_min_dim1 ~loc ~mat_name ~dim1 ~min_dim1 =
if dim1 < min_dim1 then
invalid_arg (
sprintf "%s: dim1(%s): valid=[%d..[ got=%d" loc mat_name min_dim1 dim1)
(** [check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2] checks the minimum
column dimension [min_dim2] of a matrix with name [mat_name] at location
[loc] given its column dimension [dim2]. @raise Invalid_argument if
any arguments are invalid. *)
let check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2 =
if dim2 < min_dim2 then
invalid_arg (
sprintf "%s: dim2(%s): valid=[%d..[ got=%d" loc mat_name min_dim2 dim2)
(** [check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2] checks the minimum
column dimension [min_dim2] of a matrix with name [mat_name] at location
[loc] given its column dimension [dim2]. @raise Invalid_argument if
any arguments are invalid. *)
let check_mat_min_dims ~loc ~mat_name ~dim1 ~dim2 ~min_dim1 ~min_dim2 =
check_mat_min_dim1 ~loc ~mat_name ~dim1 ~min_dim1;
check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2
(** (Old) Vector checking and allocation functions *)
let check_vec loc vec_name vec min_dim =
check_vec_min_dim ~loc ~vec_name ~dim:(Array1.dim vec) ~min_dim
* [ check_vec_is_perm loc vec_name vec n ] checks whether [ vec ]
is a valid permutation vector .
is a valid permutation vector. *)
let check_vec_is_perm loc vec_name vec n =
let dim = Array1.dim vec in
if dim <> n then
invalid_arg (sprintf "%s: dim(%s): valid=%d got=%d" loc vec_name n dim)
else
let ub = Int32.of_int n in
for i = 1 to dim do
let r = Array1.get vec i in
check_var_within loc (sprintf "%s(%d)" k_str i) r 1l ub Int32.to_string
done
let get_vec loc vec_name vec ofs inc n vec_create =
let min_dim = get_vec_min_dim ~loc ~vec_name ~ofs ~inc ~n in
match vec with
| Some vec -> check_vec loc vec_name vec min_dim; vec
| None -> vec_create min_dim
* [ inc vec n_name n ] if the dimension [ n ]
is given , check that the vector [ vec ] is big enough , otherwise return
the maximal [ n ] for the given vector [ vec ] .
is given, check that the vector [vec] is big enough, otherwise return
the maximal [n] for the given vector [vec]. *)
let get_dim_vec loc vec_name ofs inc vec n_name n =
get_vec_n ~loc ~vec_name ~dim:(Array1.dim vec) ~ofs ~inc ~n_name n
let check_vec_empty ~loc ~vec_name ~dim =
if dim = 0 then
invalid_arg (sprintf "%s: dimension of vector %s is zero" loc vec_name)
else ()
(** (Old) Matrix checking and allocation functions *)
let get_mat loc mat_name mat_create r c mat m n =
let min_dim1 = get_mat_min_dim1 ~loc ~mat_name ~r ~m in
let min_dim2 = get_mat_min_dim2 ~loc ~mat_name ~c ~n in
match mat with
| None -> mat_create min_dim1 min_dim2
| Some mat ->
let dim1 = Array2.dim1 mat in
let dim2 = Array2.dim2 mat in
check_mat_min_dims ~loc ~mat_name ~dim1 ~dim2 ~min_dim1 ~min_dim2;
mat
let check_dim1_mat loc mat_name mat mat_r m_name m =
let dim1 = Array2.dim1 mat in
check_mat_rows ~loc ~mat_name ~dim1 ~r:mat_r ~p:m ~param_name:m_name
let check_dim2_mat loc mat_name mat mat_c n_name n =
let dim2 = Array2.dim2 mat in
check_mat_cols ~loc ~mat_name ~dim2 ~c:mat_c ~p:n ~param_name:n_name
let check_dim_mat loc mat_name mat_r mat_c mat m n =
check_dim1_mat loc mat_name mat mat_r m_str m;
check_dim2_mat loc mat_name mat mat_c n_str n
let get_dim1_mat loc mat_name mat r m_name m =
let dim1 = Array2.dim1 mat in
get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m ~m_name
let get_dim2_mat loc mat_name mat c n_name n =
let dim2 = Array2.dim2 mat in
get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name
let check_mat_empty ~loc ~mat_name ~dim1 ~dim2 =
if dim1 = 0 then
invalid_arg (sprintf "%s: dim1 of matrix %s is zero" loc mat_name)
else if dim2 = 0 then
invalid_arg (sprintf "%s: dim2 of matrix %s is zero" loc mat_name)
else ()
let get_vec_inc loc vec_name = function
| Some inc -> check_vec_inc ~loc ~vec_name inc; inc
| None -> 1
let get_vec_ofs loc var = function
| Some ofs when ofs < 1 -> invalid_arg (sprintf "%s: ofs%s < 1" loc var)
| Some ofs -> ofs
| None -> 1
(**)
(* Dealing with pattern arguments in matrices *)
module Mat_patt = struct
type kind = Upper | Lower
let check_upent ~loc ~l ~m =
if l <= 0 then
failwith (sprintf "%s: illegal initial rows (%d) of upper pentagon" loc l)
else if l > m then
failwith (
sprintf
"%s: initial rows (%d) of upper pentagon exceed maximum [m] (%d)"
loc l m)
let check_lpent ~loc ~l ~n =
if l <= 0 then
failwith (
sprintf "%s: illegal initial columns (%d) of lower pentagon" loc l)
else if l > n then
failwith (
sprintf
"%s: initial columns (%d) of lower pentagon exceed maximum [n] (%d)"
loc l n)
let check_args ~loc ~m ~n : Types.Mat.patt option -> unit= function
| None | Some `Full | Some `Utr | Some `Ltr -> ()
| Some `Upent l -> check_upent ~loc ~l ~m
| Some `Lpent l -> check_lpent ~loc ~l ~n
let normalize_args ~loc ~m ~n : Types.Mat.patt option -> kind * int = function
| None | Some `Full -> Lower, n
| Some `Utr -> Upper, 1
| Some `Ltr -> Lower, 1
| Some `Upent l -> check_upent ~loc ~l ~m; Upper, l
| Some `Lpent l -> check_lpent ~loc ~l ~n; Lower, l
let patt_of_uplo ~(uplo : [`U | `L] option) ~(patt : Types.Mat.patt option) =
match uplo with
| Some `U -> Some `Utr
| Some `L -> Some `Ltr
| None -> patt
let patt_of_up ~up ~(patt : Types.Mat.patt option) =
match up with
| Some true -> Some `Utr
| Some false -> Some `Ltr
| None -> patt
end (* Mat_patt *)
(**)
problem - dependent parameters for LAPACK - functions
external ilaenv :
(int [@untagged]) ->
string ->
string ->
(int [@untagged]) ->
(int [@untagged]) ->
(int [@untagged]) ->
(int [@untagged]) ->
(int [@untagged])
= "lacaml_ilaenv_stub_bc" "lacaml_ilaenv_stub" [@@noalloc]
(* Get a work array *)
let get_work loc vec_create work min_lwork opt_lwork lwork_str =
match work with
| Some work ->
let lwork = Array1.dim work in
if lwork < min_lwork then
invalid_arg (
sprintf "%s: %s: valid=[%d..[ got=%d" loc lwork_str min_lwork lwork)
else work, lwork
| None -> vec_create opt_lwork, opt_lwork
let calc_unpacked_dim loc n_vec =
let n = truncate (sqrt (float (8 * n_vec + 1)) *. 0.5) in
if (n * n + n) / 2 <> n_vec then
failwith (sprintf "%s: illegal vector length: %d" loc n_vec)
else n
(* Calculate the dimension of a packed square matrix given the vector length *)
let get_unpacked_dim loc ?n n_vec =
match n with
| None -> calc_unpacked_dim loc n_vec
| Some n ->
let n_unpacked = calc_unpacked_dim loc n_vec in
if n < 0 || n > n_unpacked then
invalid_arg (sprintf "%s: n: valid=[0..%d] got=%d" loc n_unpacked n)
else n
let get_vec_geom loc var ofs inc =
get_vec_ofs loc var ofs, get_vec_inc loc var inc
A symmetric band ( SB ) or triangular band ( TB ) matrix has physical size
[ k+1]*[n ] for a logical matrix of size [ n]*[n ] . Check and return the [ k ]
( possibly also given by the optional argument [ k ] ) .
[k+1]*[n] for a logical matrix of size [n]*[n]. Check and return the [k]
(possibly also given by the optional argument [k]). *)
let get_k_mat_sb loc mat_name mat mat_r k_name k =
let dim1 = Array2.dim1 mat in
let max_k = dim1 - mat_r in
if mat_r < 1 || max_k < 0 then
invalid_arg (
sprintf "%s: mat_r(%s): valid=[1..%d] got=%d" loc mat_name dim1 mat_r);
match k with
| None -> max_k
| Some k ->
if k < 0 || max_k < k then
invalid_arg (
sprintf "%s: %s(%s): valid=[0..%d] got=%d"
loc k_name mat_name max_k k)
else k
let get_dim_mat_packed loc mat_name ofsmat mat n_name n =
let dim = Array1.dim mat in
match n with
| Some n ->
let n1 = ofsmat + (n - 1)*(n + 2)/2 (* ?overflow? *) in
if n < 0 || dim < n1 then
invalid_arg (sprintf "%s: %s(%s): valid=[0..%d] got=%d"
loc n_name mat_name dim n1)
else n
| None -> (* the greater n s.t. ofsmat - 1 + n(n+1)/2 <= dim mat *)
max 0 (truncate((sqrt(9. +. 8. *. float(dim - ofsmat)) -. 1.) /. 2.))
(* Makes sure that [mat] is a square matrix and [n] is within range *)
let get_n_of_square loc mat_name r c mat n =
let n = get_dim2_mat loc mat_name mat c n_str n in
check_dim1_mat loc mat_name mat r n_str n;
n
let get_n_of_a loc ar ac a n = get_n_of_square loc a_str ar ac a n
let get_nrhs_of_b loc n br bc b nrhs =
let nrhs = get_dim2_mat loc b_str b bc nrhs_str nrhs in
check_dim1_mat loc b_str b br n_str n;
nrhs
(* ORGQR - Auxiliary Functions *)
let orgqr_err ~loc ~m ~n ~k ~work ~a ~err =
let msg =
match err with
| -1 -> sprintf "m: valid=[0..[ got=%d" m
| -2 -> sprintf "n: valid=[0..%d] got=%d" m n
| -3 -> sprintf "k: valid=[0..%d] got=%d" n k
| -5 -> sprintf "dim2(a): valid=[%d..[ got=%d" n (Array2.dim2 a)
| -8 ->
sprintf "dim1(work): valid=[%d..[ got=%d" (max 1 n) (Array1.dim work)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n))
in
invalid_arg (sprintf "%s: %s" loc msg)
let orgqr_get_params loc ?m ?n ?k ~tau ~ar ~ac a =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
if m < n then invalid_arg (sprintf "%s: m(%d) < n(%d)" loc m n)
else
let k = get_dim_vec loc tau_str 1 1 tau k_str k in
m, n, k
(* ORMQR - Auxiliary Functions *)
let ormqr_err ~loc ~side ~m ~n ~k ~lwork ~a ~c ~err =
let nq, nw =
match side with
| `L -> m, n
| `R -> n, m
in
let msg =
match err with
| -3 -> sprintf "m: valid=[0..[ got=%d" m
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "k: valid=[0..%d] got=%d" k nq
| -7 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 nq) (Array2.dim1 a)
| -10 -> sprintf "dim1(c): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 c)
| -12 ->
let min_lwork = max 1 nw in
sprintf "lwork: valid=[%d..[ got=%d" min_lwork lwork
| _ -> raise (InternalError (sprintf "%s: error code %d" loc err))
in
invalid_arg (sprintf "%s: %s" loc msg)
let ormqr_get_params loc ~side ?m ?n ?k ~tau ~ar ~ac a ~cr ~cc c =
let m = get_dim1_mat loc c_str c cr m_str m in
let n = get_dim2_mat loc c_str c cc n_str n in
let k = get_dim2_mat loc a_str a ac k_str k in
begin match side with
| `L ->
if m < k then failwith (sprintf "%s: m(%d) < k(%d)" loc m k);
check_dim1_mat loc a_str a ar m_str (max 1 m)
| `R ->
if n < k then failwith (sprintf "%s: n(%d) < k(%d)" loc n k);
check_dim1_mat loc a_str a ar n_str (max 1 n)
end;
check_vec loc tau_str tau k;
m, n, k
(* GELS? - Auxiliary Functions *)
let gelsX_err loc gelsX_min_work ar a m n lwork nrhs br b err =
if err > 0 then
failwith
(sprintf "%s: failed to converge on off-diagonal element %d" loc err)
else
let msg =
match err with
| -1 -> sprintf "m: valid=[0..[ got=%d" m
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -5 ->
sprintf "dim1(a): valid=[%d..[ got=%d"
(max 1 m + ar - 1) (Array2.dim1 a)
| -7 ->
let min_dim = max 1 (max m n) + br - 1 in
sprintf "dim1(b): valid=[%d..[ got=%d" min_dim (Array2.dim1 b)
| -12 ->
let min_lwork = gelsX_min_work ~m ~n ~nrhs in
sprintf "lwork: valid=[%d..[ got=%d" min_lwork lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let gelsX_get_s vec_create loc min_dim ofss = function
| Some s ->
let dim_s = Array1.dim s in
let min_dim_ofs = ofss - 1 + min_dim in
if dim_s < min_dim_ofs then
invalid_arg (sprintf "%s: s: valid=[%d..[ got=%d" loc min_dim_ofs dim_s)
else s
| None -> vec_create min_dim
let gelsX_get_params loc ar ac a m n nrhs br bc b =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
let nrhs = get_dim2_mat loc b_str b bc nrhs_str nrhs in
check_dim1_mat loc b_str b br m_str (max m n);
m, n, nrhs
(* ??ev -- auxiliary functions *)
let xxev_get_params loc ar ac a n vectors up =
let n = get_n_of_a loc ar ac a n in
let jobz = get_job_char vectors in
let uplo = get_uplo_char up in
n, jobz, uplo
let xxev_get_wx vec_create loc wname ofsw w n =
match w with
| None -> vec_create (ofsw - 1 + n)
| Some w -> check_vec loc wname w (ofsw - 1 + n); w
(* geev -- auxiliary functions *)
let geev_get_job_side loc mat_empty mat_create mat_name n r c mat_opt =
match mat_opt with
| None ->
if r < 1 then failwith (sprintf "%s: %sr < 1" loc mat_name)
else if c < 1 then failwith (sprintf "%s: %sc < 1" loc mat_name)
else r, c, mat_create (n + r - 1) (n + c - 1), job_char_true, true
| Some None -> 1, 1, mat_empty, job_char_false, false
| Some (Some mat) ->
check_dim1_mat loc mat_name mat r n_str n;
check_dim2_mat loc mat_name mat c n_str n;
r, c, mat, job_char_true, true
let geev_gen_get_params loc mat_empty mat_create ar ac a n
leftr leftc left rightr rightc right =
let n = get_n_of_a loc ar ac a n in
let leftr, leftc, vl, jobvl, lvs =
geev_get_job_side loc mat_empty mat_create "vl" n leftr leftc left in
let rightr, rightc, vr, jobvr, rvs =
geev_get_job_side loc mat_empty mat_create "vr" n rightr rightc right in
n, leftr, leftc, vl, jobvl, rightr, rightc, vr, jobvr, lvs || rvs
(* g?mv -- auxiliary functions *)
let gXmv_get_params loc vec_create m n ofsx incx x ofsy incy y trans =
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
let ofsy, incy = get_vec_geom loc y_str ofsy incy in
let lx, ly, trans_char =
let trans_char = get_trans_char trans in
if trans = `N then n, m, trans_char else m, n, trans_char in
check_vec loc x_str x (ofsx + (lx - 1) * abs incx);
let y = get_vec loc y_str y ofsy incy ly vec_create in
ofsx, incx, ofsy, incy, y, trans_char
(* symv -- auxiliary functions *)
let symv_get_params loc vec_create ar ac a n ofsx incx x ofsy incy y up =
let n = get_dim1_mat loc a_str a ar n_str n in
check_dim2_mat loc a_str a ac n_str n;
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
let ofsy, incy = get_vec_geom loc y_str ofsy incy in
check_vec loc x_str x (ofsx + (n - 1) * abs incx);
let y = get_vec loc y_str y ofsy incy n vec_create in
check_vec loc y_str y (ofsy + (n - 1) * abs incy);
n, ofsx, incx, ofsy, incy, y, get_uplo_char up
(* tr?v -- auxiliary functions *)
let trXv_get_params loc ar ac a n ofsx incx x up trans unit_triangular =
let n = get_dim1_mat loc a_str a ar n_str n in
check_dim2_mat loc a_str a ac n_str n;
let trans_char = get_trans_char trans in
let diag_char = get_diag_char unit_triangular in
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
check_vec loc x_str x (ofsx + (n - 1) * abs incx);
n, ofsx, incx, get_uplo_char up, trans_char, diag_char
(* tp?v -- auxiliary functions *)
let tpXv_get_params loc ofsap ap ?n ofsx incx x up trans unit_triangular =
let ofsap = get_vec_ofs loc ap_str ofsap in
let n = get_unpacked_dim loc ?n (Array1.dim ap - ofsap + 1) in
let trans_char = get_trans_char trans in
let diag_char = get_diag_char unit_triangular in
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
check_vec loc x_str x (ofsx + (n - 1) * abs incx);
n, ofsap, ofsx, incx, get_uplo_char up, trans_char, diag_char
(* gemm -- auxiliary functions *)
let get_c loc mat_create cr cc c m n = get_mat loc c_str mat_create cr cc c m n
let get_rows_mat_tr loc mat_str mat mat_r mat_c transp dim_str dim =
match transp with
| `N -> get_dim1_mat loc mat_str mat mat_r dim_str dim
| _ -> get_dim2_mat loc mat_str mat mat_c dim_str dim
let get_cols_mat_tr loc mat_str mat mat_r mat_c transp dim_str dim =
match transp with
| `N -> get_dim2_mat loc mat_str mat mat_c dim_str dim
| _ -> get_dim1_mat loc mat_str mat mat_r dim_str dim
let get_inner_dim loc mat1_str mat1 mat1_r mat1_c tr1
mat2_str mat2 mat2_r mat2_c tr2 dim_str k =
let k1 = get_cols_mat_tr loc mat1_str mat1 mat1_r mat1_c tr1 dim_str k in
let k2 = get_rows_mat_tr loc mat2_str mat2 mat2_r mat2_c tr2 dim_str k in
if k = None && k1 <> k2 then
failwith (
sprintf "%s: inner dimensions of matrices do not match (%d,%d)"
loc k1 k2)
else k1
let gemm_get_params loc mat_create ar ac a transa br bc b cr transb cc c m n k =
let m = get_rows_mat_tr loc a_str a ar ac transa m_str m in
let n = get_cols_mat_tr loc b_str b br bc transb n_str n in
let k = get_inner_dim loc a_str a ar ac transa b_str b br bc transb k_str k in
let transa = get_trans_char transa in
let transb = get_trans_char transb in
let c = get_c loc mat_create cr cc c m n in
m, n, k, transa, transb, c
(* symm -- auxiliary functions *)
let check_mat_square loc mat_str mat mat_r mat_c n =
check_dim1_mat loc mat_str mat mat_r n_str n;
check_dim2_mat loc mat_str mat mat_c n_str n
let symm_get_params loc mat_create ar ac a br bc b cr cc c m n side up =
let m = get_dim1_mat loc b_str b br m_str m in
let n = get_dim2_mat loc b_str b bc n_str n in
if side = `L then check_mat_square loc a_str a ar ac m
else check_mat_square loc a_str a ar ac n;
let side_char = get_side_char side in
let uplo_char = get_uplo_char up in
let c = get_c loc mat_create cr cc c m n in
m, n, side_char, uplo_char, c
(* trmm -- auxiliary functions *)
let trXm_get_params loc ar ac a br bc b m n side up transa diag =
let m = get_dim1_mat loc b_str b br m_str m in
let n = get_dim2_mat loc b_str b bc n_str n in
if side = `L then check_mat_square loc a_str a ar ac m
else check_mat_square loc a_str a ar ac n;
let side_char = get_side_char side in
let uplo_char = get_uplo_char up in
let transa = get_trans_char transa in
let diag_char = get_diag_char diag in
m, n, side_char, uplo_char, transa, diag_char
(* syrk -- auxiliary functions *)
let syrk_get_params loc mat_create ar ac a cr cc c n k up trans =
let n = get_rows_mat_tr loc a_str a ar ac trans n_str n in
let k = get_cols_mat_tr loc a_str a ar ac trans k_str k in
let trans_char = get_trans_char trans in
let uplo_char = get_uplo_char up in
let c = get_c loc mat_create cr cc c n n in
n, k, uplo_char, trans_char, c
(* syr2k -- auxiliary functions *)
let syr2k_get_params loc mat_create ar ac a br bc b cr cc c n k up trans =
let n = get_rows_mat_tr loc a_str a ar ac trans n_str n in
let k = get_cols_mat_tr loc a_str a ar ac trans k_str k in
begin match trans with
| `N ->
check_dim1_mat loc b_str b br n_str n;
check_dim2_mat loc b_str b bc k_str k;
| _ ->
check_dim1_mat loc b_str b br k_str k;
check_dim2_mat loc b_str b bc n_str n;
end;
let trans_char = get_trans_char trans in
let uplo_char = get_uplo_char up in
let c = get_c loc mat_create cr cc c n n in
n, k, uplo_char, trans_char, c
(* ?lange -- auxiliary functions *)
let xlange_get_params loc m n ar ac a =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
m, n
? ? -- auxiliary functions
let xxtrs_get_params loc ar ac a n br bc b nrhs =
let n = get_n_of_a loc ar ac a n in
let nrhs = get_nrhs_of_b loc n br bc b nrhs in
n, nrhs
let xxtrs_err loc n nrhs a b err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -8 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* ??tri -- auxiliary functions *)
let xxtri_singular_err loc err =
failwith (sprintf "%s: singular on index %i" loc err)
let xxtri_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* ??con -- auxiliary functions *)
let xxcon_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
geXrf -- auxiliary functions
let geXrf_get_params loc m n ar ac a =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
m, n
(* getrf -- auxiliary functions *)
let getrf_err loc m n a err =
let msg =
match err with
| -1 -> sprintf "n: valid=[0..[ got=%d" n
| -2 -> sprintf "m: valid=[0..[ got=%d" m
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let getrf_lu_err loc err =
failwith (sprintf "%s: U(%i,%i)=0 in the LU factorization" loc err err)
let getrf_get_ipiv loc ipiv m n =
match ipiv with
| None -> create_int32_vec (min m n)
| Some ipiv ->
check_vec loc ipiv_str ipiv (min m n);
ipiv
(* sytrf -- auxiliary functions *)
let sytrf_get_ipiv loc ipiv n =
match ipiv with
| None -> create_int32_vec n
| Some ipiv ->
check_vec loc ipiv_str ipiv n;
ipiv
let sytrf_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let sytrf_fact_err loc err =
failwith (sprintf "%s: D(%i,%i)=0 in the factorization" loc err err)
(* potrf -- auxiliary functions *)
let potrf_chol_err loc err =
failwith (
sprintf "%s: leading minor of order %d is not positive definite" loc err)
let potrf_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| _ -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* potrs -- auxiliary functions *)
let potrs_err loc n nrhs a b err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -7 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* trtrs -- auxiliary functions *)
let trtrs_err loc n nrhs a b err =
let msg =
match err with
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -7 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -9 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* tbtrs -- auxiliary functions *)
let tbtrs_err loc n nrhs kd ab b err =
let msg =
match err with
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "kd: valid=[0..[ got=%d" kd
| -6 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -8 -> sprintf "dim1(ab): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 ab)
| -10 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* getri -- auxiliary functions *)
let getri_err loc getri_min_lwork n a lwork err =
let msg =
match err with
| -1 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -6 ->
let min_lwork = getri_min_lwork n in
sprintf "lwork: valid=[%d..[ got=%d" min_lwork lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* trtri -- auxiliary functions *)
let trtri_err loc n a err =
let msg =
match err with
| -3 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* geqrf -- auxiliary functions *)
let geqrf_err loc m n a err =
let msg =
match err with
| -1 -> sprintf "m: valid=[0..[ got=%d" m
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
gecon -- auxiliary functions
let gecon_err loc norm_char n a err =
let msg =
match err with
| -1 -> sprintf "norm: valid=['O', I'] got='%c'" norm_char
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
(* gees -- auxiliary functions *)
let gees_err loc n err jobvs sort =
if err > 0 && err <= n then
failwith (sprintf "%s: %d eigenvalue elements did not converge" loc err)
else if err = n + 1 then
failwith (
sprintf "%s: eigenvalues not reordered, too close to separate" loc)
else if err = n + 2 then
failwith (
sprintf "%s: after reordering, roundoff changed values of some \
complex eigenvalues so that leading eigenvalues in \
the Schur form no longer satisfy SELECT" loc)
else
let msg =
match err with
| -1 -> sprintf "JOBVS: valid=['N', V'] got='%c'" jobvs
| -2 -> sprintf "SORT: valid=['N', S'] got='%c'" sort
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| n -> raise (InternalError (sprintf "%s: error code %d" loc n))
in
invalid_arg (sprintf "%s: %s" loc msg)
let dummy_select_fun _ = false
let gees_get_params_generic
loc mat_create mat_empty jobvs sort n ar ac a vsr vsc vs =
let n = get_n_of_a loc ar ac a n in
let jobvs, min_ldvs =
match jobvs with
| `No_Schur_vectors -> 'N', 1
| `Compute_Schur_vectors -> 'V', n
in
let vs =
match vs with
| Some vs ->
check_dim1_mat loc vs_str vs vsr vsr_str min_ldvs;
check_dim2_mat loc vs_str vs vsc vsc_str n;
vs
| None when jobvs = 'N' -> mat_empty
| None -> mat_create min_ldvs n
in
let sort, select, select_fun =
match sort with
| `No_sort -> 'N', 0, dummy_select_fun
| `Select_left_plane -> 'S', 0, dummy_select_fun
| `Select_right_plane -> 'S', 1, dummy_select_fun
| `Select_interior_disk -> 'S', 2, dummy_select_fun
| `Select_exterior_disk -> 'S', 3, dummy_select_fun
| `Select_custom select_fun -> 'S', 4, select_fun
in
jobvs, sort, select, select_fun, n, vs
let gees_get_params_real
loc vec_create mat_create mat_empty
jobvs sort n ar ac a wr wi vsr vsc vs =
let jobvs, sort, select, select_fun, n, vs =
gees_get_params_generic
loc mat_create mat_empty jobvs sort n ar ac a vsr vsc vs
in
let wr =
match wr with
| None -> vec_create n
| Some wr -> check_vec loc wr_str wr n; wr
in
let wi =
match wi with
| None -> vec_create n
| Some wi -> check_vec loc wi_str wi n; wi
in
jobvs, sort, select, select_fun, n, vs, wr, wi
let gees_get_params_complex
loc vec_create mat_create mat_empty jobvs sort n ar ac a w vsr vsc vs =
let jobvs, sort, select, select_fun, n, vs =
gees_get_params_generic
loc mat_create mat_empty jobvs sort n ar ac a vsr vsc vs
in
let w =
match w with
| None -> vec_create n
| Some w -> check_vec loc w_str w n; w
in
jobvs, sort, select, select_fun, n, vs, w
(* gesvd -- auxiliary functions *)
let gesvd_err loc jobu jobvt m n a u vt lwork err =
if err > 0 then
failwith
(sprintf "%s: %d off-diagonal elements did not converge" loc err)
else
let msg =
match err with
| -3 -> sprintf "m: valid=[0..[ got=%d" m
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -6 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 a)
| -9 ->
sprintf "dim1(u): valid=[%d..[ got=%d"
(match jobu with 'A' | 'S' -> max 1 m | _ -> 1)
(Array2.dim1 u)
| -11 ->
sprintf "dim1(vt): valid=[%d..[ got=%d"
(
match jobvt with
| 'A' -> max 1 n
| 'S' -> max 1 (min m n)
| _ -> 1
)
(Array2.dim1 vt)
| -13 -> sprintf "lwork: valid=[%d..[ got=%d" 1 lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let gesvd_get_params
loc vec_create mat_create jobu jobvt m n ar ac a s ur uc u vtr vtc vt =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
let s = get_vec loc s_str s 1 1 (min m n) vec_create in
let um, un =
match jobu with
| `A -> m, m
| `S -> m, min m n
LDU > = 1 even when U not referenced
let u =
match u with
| Some u ->
check_dim1_mat loc u_str u ur um_str um;
check_dim2_mat loc u_str u uc un_str un;
u
| None -> mat_create um un in
let vm, vn =
match jobvt with
| `A -> n, n
| `S -> min m n, n
LDVT > = 1 even when VT not referenced
let vt =
match vt with
| Some vt ->
check_dim1_mat loc vt_str vt vtr vm_str vm;
check_dim2_mat loc vt_str vt vtc vn_str vn;
vt
| None -> mat_create vm vn in
let jobu_c = get_s_d_job_char jobu in
let jobvt_c = get_s_d_job_char jobvt in
jobu_c, jobvt_c, m, n, s, u, vt
gesdd -- auxiliary functions
let gesdd_err loc jobz m n a u vt lwork err =
if err > 0 then
failwith (
sprintf "%s: %d DBDSDC did not converge, updating process failed" loc err)
else
let msg =
match err with
| -2 -> sprintf "m: valid=[0..[ got=%d" m
| -3 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 a)
| -8 ->
sprintf "dim1(u): valid=[%d..[ got=%d"
(
if jobz = 'A' || jobz = 'S' || (jobz = 'O' && m < n)
then max 1 m
else 1
)
(Array2.dim1 u)
| -10 ->
sprintf "dim1(vt): valid=[%d..[ got=%d"
(
if jobz = 'A' || (jobz = 'O' && m >= n) then max 1 n
else if jobz = 'S' then max 1 (min m n)
else 1
)
(Array2.dim1 vt)
| -12 -> sprintf "lwork: valid=[%d..[ got=%d" 1 lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let gesdd_get_params
loc vec_create mat_create jobz m n ar ac a s ur uc u vtr vtc vt =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
let min_m_n = min m n in
let s = get_vec loc s_str s 1 1 min_m_n vec_create in
let um, un, vm, vn =
match jobz with
| `A -> m, m, n, n
| `S -> m, min_m_n, min_m_n, n
| `O -> if m >= n then 1, 1, n, n else m, m, m, n
LDU > = 1 even when U not referenced
let u =
match u with
| Some u ->
check_dim1_mat loc u_str u ur um_str um;
check_dim2_mat loc u_str u uc un_str un;
u
| None -> mat_create um un in
let vt =
match vt with
| Some vt ->
check_dim1_mat loc vt_str vt vtr vm_str vm;
check_dim2_mat loc vt_str vt vtc vn_str vn;
vt
| None -> mat_create vm vn in
let jobz_c = get_s_d_job_char jobz in
jobz_c, m, n, s, u, vt
(* ??sv -- auxiliary functions *)
let xxsv_err loc n nrhs b err =
let msg =
match err with
| -1 -> sprintf "n: valid=[0..[ got=%d" n
| -2 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -7 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let xxsv_lu_err loc err =
failwith (sprintf "%s: U(%i,%i)=0 in the LU factorization" loc err err)
let xxsv_pos_err loc err =
let msg =
sprintf
"%s: the leading minor of order %i is not positive definite" loc err in
failwith msg
let xxsv_ind_err loc err =
let msg =
sprintf
"%s: D(%i,%i)=0 in the diagonal pivoting factorization" loc err err in
failwith msg
let xxsv_a_err loc a n =
let msg =
sprintf "%s: dim1(a): valid=[%d..[ got=%d" loc (max 1 n) (Array2.dim1 a) in
invalid_arg msg
let xxsv_work_err loc lwork =
invalid_arg (sprintf "%s: dim(work): valid=[1..[ got=%d" loc lwork)
let xxsv_get_ipiv loc ipiv n =
match ipiv with
| None -> create_int32_vec n
| Some ipiv ->
check_vec loc ipiv_str ipiv n;
ipiv
let xxsv_get_params loc ar ac a n br bc b nrhs =
let n = get_n_of_a loc ar ac a n in
let nrhs = get_nrhs_of_b loc n br bc b nrhs in
n, nrhs
| null | https://raw.githubusercontent.com/mmottl/lacaml/2e01c0747e740e54ab9a23ea59b29ea0d929b50f/src/utils.ml | ocaml | * General auxiliary functions
* Preallocated strings (names)
* Range checking
* Valueless vector checking and allocation functions (do not require a
vector value as argument
* [calc_vec_min_dim ~n ~ofs ~inc] @return minimum vector dimension given
offset [ofs], increment [inc], and operation size [n] for a vector.
* [bad_inc inc] @return [true] iff [inc] is illegal.
* [check_vec_inc ~loc ~vec_name inc] checks whether vector increment [inc]
for vector of name [vec_name] is invalid (i.e. [0]). @raise
Invalid_argument in that case.
* [raise_max_len ~loc ~len_name ~len ~max_len] @raise Invalid_argument
that the maximum operation size (e.g. [m] or [n] for vectors and matrices)
has been exceeded.
* [get_vec_start_stop ~ofsx ~incx ~n] @return [(start, stop)] where [start]
and [stop] reflect the start and stop of an iteration respectively.
* Valueless matrix checking and allocation functions (do not require a
matrix value as argument
* [check_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name] checks the
matrix column operation length in parameter [p] with name [param_name]
at location [loc] for matrix with name [mat_name] and dimension [dim2]
given the operation column [c]. @raise Invalid_argument if any arguments
are invalid.
* [check_mat_n ~loc ~mat_name ~dim2 ~c ~n] checks the matrix column
operation length in parameter [n] at location [loc] for matrix with
name [mat_name] and dimension [dim2] given the operation column [c].
@raise Invalid_argument if any arguments are invalid.
* [get_mat_cols ~loc ~mat_name ~dim2 ~c ~param_name p] checks or infers
the matrix column operation length in the option parameter [p] with
name [param_name] at location [loc] for matrix with name [mat_name]
and dimension [dim2] given the column operation offset [c]. @raise
Invalid_argument if any arguments are invalid.
* [get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name] checks or infers the
matrix column operation length in the option parameter [n] with name
[n_name] at location [loc] for matrix with name [mat_name] and dimension
[dim2] given the column operation offset [c]. @raise Invalid_argument
if any arguments are invalid.
* [get_mat_n ~loc ~mat_name ~dim2 ~c ~n] checks or infers the matrix column
operation length in the option parameter [n] at location [loc] for matrix
with name [mat_name] and dimension [dim2] given the column operation
offset [c]. @raise Invalid_argument if any arguments are invalid.
* [get_mat_min_dim2 ~loc ~mat_name ~c ~n] @return the minimum column
dimension of a matrix with name [mat_name] at location [loc] given column
[c] and row operation length [n]. @raise Invalid_argument if any
arguments are invalid.
* [check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2] checks the minimum
column dimension [min_dim2] of a matrix with name [mat_name] at location
[loc] given its column dimension [dim2]. @raise Invalid_argument if
any arguments are invalid.
* [check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2] checks the minimum
column dimension [min_dim2] of a matrix with name [mat_name] at location
[loc] given its column dimension [dim2]. @raise Invalid_argument if
any arguments are invalid.
* (Old) Vector checking and allocation functions
* (Old) Matrix checking and allocation functions
Dealing with pattern arguments in matrices
Mat_patt
Get a work array
Calculate the dimension of a packed square matrix given the vector length
?overflow?
the greater n s.t. ofsmat - 1 + n(n+1)/2 <= dim mat
Makes sure that [mat] is a square matrix and [n] is within range
ORGQR - Auxiliary Functions
ORMQR - Auxiliary Functions
GELS? - Auxiliary Functions
??ev -- auxiliary functions
geev -- auxiliary functions
g?mv -- auxiliary functions
symv -- auxiliary functions
tr?v -- auxiliary functions
tp?v -- auxiliary functions
gemm -- auxiliary functions
symm -- auxiliary functions
trmm -- auxiliary functions
syrk -- auxiliary functions
syr2k -- auxiliary functions
?lange -- auxiliary functions
??tri -- auxiliary functions
??con -- auxiliary functions
getrf -- auxiliary functions
sytrf -- auxiliary functions
potrf -- auxiliary functions
potrs -- auxiliary functions
trtrs -- auxiliary functions
tbtrs -- auxiliary functions
getri -- auxiliary functions
trtri -- auxiliary functions
geqrf -- auxiliary functions
gees -- auxiliary functions
gesvd -- auxiliary functions
??sv -- auxiliary functions | File : utils.ml
Copyright ( C ) 2001-
email :
WWW :
email :
WWW : /~liam
email :
WWW : /
email :
WWW : none
This library is free software ; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation ; either
version 2.1 of the License , or ( at your option ) any later version .
This library is distributed in the hope that it will be useful ,
but WITHOUT ANY WARRANTY ; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the GNU
Lesser General Public License for more details .
You should have received a copy of the GNU Lesser General Public
License along with this library ; if not , write to the Free Software
Foundation , Inc. , 51 Franklin Street , Fifth Floor , Boston , MA 02110 - 1301 USA
Copyright (C) 2001-
Markus Mottl
email:
WWW:
Liam Stewart
email:
WWW: /~liam
Christophe Troestler
email:
WWW: /
Florent Hoareau
email:
WWW: none
This library is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 2.1 of the License, or (at your option) any later version.
This library is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details.
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
*)
open Printf
open Bigarray
open Common
Zero - sized dummy vector ( int )
let empty_int32_vec = create_int32_vec 0
indicating type of norm to retrieve for XlanYY routines
let get_norm_char = function `M -> 'M' | `O -> 'O' | `I -> 'I' | `F -> 'F'
indicating whether the " U"pper or " L"ower triangle of a matrix
is stored
is stored *)
let get_uplo_char up = if up then 'U' else 'L'
indicating whether some operation operates on a " N"ormal ,
" T"ransposed or " C"onjugated transposed matrix .
"T"ransposed or "C"onjugated transposed matrix. *)
let get_trans_char = function `N -> 'N' | `T -> 'T' | `C -> 'C'
indicating which side of the matrix B matrix A should be on
let get_side_char = function `L -> 'L' | `R -> 'R'
indicating whether a diagonal is unit or non - unit
let get_diag_char = function `U -> 'U' | `N -> 'N'
indicating whether / how the left / right singular vectors
should be computed
should be computed *)
let get_s_d_job_char = function `A -> 'A' | `S -> 'S' | `O -> 'O' | `N -> 'N'
indicating whether the eigen"V"ectors are computed or " N"ot
let get_job_char = function true -> 'V' | _ -> 'N'
let job_char_true = get_job_char true
let job_char_false = get_job_char false
let a_str = "a"
let ab_str = "ab"
let alphas_str = "alphas"
let ap_str = "ap"
let b_str = "b"
let br_str = "br"
let bc_str = "bc"
let c_str = "c"
let cr_str = "cr"
let cc_str = "cc"
let d_str = "d"
let dl_str = "dl"
let du_str = "du"
let e_str = "e"
let ipiv_str = "ipiv"
let iseed_str = "iseed"
let k_str = "k"
let ka_str = "ka"
let kb_str = "kb"
let work_str = "work"
let lwork_str = "lwork"
let liwork_str = "liwork"
let k1_str = "k1"
let k2_str = "k2"
let kd_str = "kd"
let kl_str = "kl"
let ku_str = "ku"
let m_str = "m"
let n_str = "n"
let nrhs_str = "nrhs"
let ofs_str = "ofs"
let r_str = "r"
let s_str = "s"
let tau_str = "tau"
let u_str = "u"
let um_str = "um"
let un_str = "un"
let vm_str = "vm"
let vn_str = "vn"
let vs_str = "vs"
let vsr_str = "vsr"
let vsc_str = "vsc"
let vt_str = "vt"
let w_str = "w"
let wi_str = "wi"
let wr_str = "wr"
let x_str = "x"
let y_str = "y"
let z_str = "z"
* [ var ] @raise Invalid_argument to indicate
that integer variable [ var ] with name [ name ] at location [ loc ] is lower
than [ 0 ] .
that integer variable [var] with name [name] at location [loc] is lower
than [0]. *)
let raise_var_lt0 ~loc ~name var =
invalid_arg (sprintf "%s: %s < 0: %d" loc name var)
* [ check_var_lt0 ~loc ~name var ] checks whether integer variable [ var ] with
name [ name ] at location [ loc ] is lower than [ 0 ] . @raise Invalid_argument
in that case .
name [name] at location [loc] is lower than [0]. @raise Invalid_argument
in that case. *)
let check_var_lt0 ~loc ~name var = if var < 0 then raise_var_lt0 ~loc ~name var
let check_var_within loc var_name var lb ub c =
if var < lb then
invalid_arg (sprintf "%s: %s %s < %s" loc var_name (c var) (c lb))
else if var > ub then
invalid_arg (sprintf "%s: %s %s > %s" loc var_name (c var) (c ub))
else ()
let calc_vec_min_dim ~n ~ofs ~inc =
if n = 0 then ofs - 1 else ofs + (n - 1) * abs inc
* [ raise_vec_min_dim ~loc ~vec_name ~dim ~min_dim ] @raise Invalid_argument
to indicate that dimension [ dim ] of a vector with name [ vec_name ]
exceeds the minimum [ min_dim ] at location [ loc ] .
to indicate that dimension [dim] of a vector with name [vec_name]
exceeds the minimum [min_dim] at location [loc]. *)
let raise_vec_min_dim ~loc ~vec_name ~dim ~min_dim =
invalid_arg (
sprintf "%s: dim(%s): valid=[%d..[ got=%d" loc vec_name min_dim dim)
* [ check_vec_min_dim ~loc ~vec_name ~dim ~min_dim ] checks whether vector
with name [ vec_name ] and dimension [ dim ] satisfies minimum dimension
[ min_dim ] . @raise Invalid_argument otherwise .
with name [vec_name] and dimension [dim] satisfies minimum dimension
[min_dim]. @raise Invalid_argument otherwise. *)
let check_vec_min_dim ~loc ~vec_name ~dim ~min_dim =
if dim < min_dim then raise_vec_min_dim ~loc ~vec_name ~dim ~min_dim
* [ raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs ] @raise Invalid_argument
to indicate that vector offset [ ofs ] is invalid ( i.e. is outside of
[ 1 .. max_ofs ] ) .
to indicate that vector offset [ofs] is invalid (i.e. is outside of
[1..max_ofs]). *)
let raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs =
invalid_arg (
sprintf "%s: ofs%s: valid=[1..%d] got=%d" loc vec_name max_ofs ofs)
* [ bad_n ~n ~max_n ] @return [ true ] iff [ n ] is smaller than zero or larger
than [ max_n ] .
than [max_n]. *)
let bad_n ~n ~max_n = n < 0 || n > max_n
* [ bad_ofs ~ofs ~max_ofs ] @return [ true ] iff [ ofs ] is smaller than one or
exceeds [ max_ofs ] .
exceeds [max_ofs]. *)
let bad_ofs ~ofs ~max_ofs = ofs < 1 || ofs > max_ofs
let bad_inc inc = inc = 0
* [ check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs ] checks whether vector
offset [ ofs ] for vector of name [ vec_name ] is invalid ( i.e. outside of
[ 1 .. max_ofs ] ) . @raise Invalid_argument in that case .
offset [ofs] for vector of name [vec_name] is invalid (i.e. outside of
[1..max_ofs]). @raise Invalid_argument in that case. *)
let check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs =
if bad_ofs ~ofs ~max_ofs then raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs
let check_vec_inc ~loc ~vec_name inc =
if bad_inc inc then invalid_arg (sprintf "%s: inc%s = 0" loc vec_name)
* [ calc_vec_max_n ~dim ~ofs ~inc ] @return maximum operation length [ n ]
for a vector given the dimension [ dim ] of the vector , the offset [ ofs ] ,
and increment [ inc ] . Assumes that the offset has already been validated
to not exceed [ dim ] , i.e. the returned [ max_n ] is at least [ 1 ] .
for a vector given the dimension [dim] of the vector, the offset [ofs],
and increment [inc]. Assumes that the offset has already been validated
to not exceed [dim], i.e. the returned [max_n] is at least [1]. *)
let calc_vec_max_n ~dim ~ofs ~inc = 1 + (dim - ofs) / abs inc
* [ calc_vec_opt_max_n ? ofs ? inc dim ] @return maximum operation length [ n ]
for a vector given the dimension [ dim ] of the vector , the optional offset
[ ofs ] , and optional increment [ inc ] . Assumes that the offset has already
been validated to not exceed [ dim ] , i.e. the returned [ max_n ] is at least
[ 1 ] .
for a vector given the dimension [dim] of the vector, the optional offset
[ofs], and optional increment [inc]. Assumes that the offset has already
been validated to not exceed [dim], i.e. the returned [max_n] is at least
[1]. *)
let calc_vec_opt_max_n ?(ofs = 1) ?(inc = 1) dim = calc_vec_max_n ~dim ~ofs ~inc
let raise_max_len ~loc ~len_name ~len ~max_len =
invalid_arg (sprintf "%s: %s: valid=[0..%d] got=%d" loc len_name max_len len)
* [ check_vec_dim ~loc ~vec_name ~dim ~ofs ~inc ~n_name ~n ] checks the vector
operation length in parameter [ n ] with name [ n_name ] at location [ loc ]
for vector with name [ vec_name ] and dimension [ dim ] given the operation
offset [ ofs ] and increment [ inc ] . @raise Invalid_argument if any
arguments are invalid .
operation length in parameter [n] with name [n_name] at location [loc]
for vector with name [vec_name] and dimension [dim] given the operation
offset [ofs] and increment [inc]. @raise Invalid_argument if any
arguments are invalid. *)
let check_vec_dim ~loc ~vec_name ~dim ~ofs ~inc ~n_name ~n =
check_vec_inc ~loc ~vec_name inc;
check_var_lt0 ~loc ~name:n_name n;
if n = 0 then check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs:(dim + 1)
else begin
check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs:dim;
let max_n = calc_vec_max_n ~dim ~ofs ~inc in
if n > max_n then raise_max_len ~loc ~len_name:n_name ~len:n ~max_len:max_n
end
* [ get_vec_n ~loc ~vec_name ~dim ~ofs ~inc ~n_name n ] checks or infers
the vector operation length in the option parameter [ n ] with name [ n_name ]
at location [ loc ] for vector with name [ vec_name ] and dimension [ dim ] given
the operation offset [ ofs ] and increment [ inc ] . @raise Invalid_argument
if any arguments are invalid .
the vector operation length in the option parameter [n] with name [n_name]
at location [loc] for vector with name [vec_name] and dimension [dim] given
the operation offset [ofs] and increment [inc]. @raise Invalid_argument
if any arguments are invalid. *)
let get_vec_n ~loc ~vec_name ~dim ~ofs ~inc ~n_name = function
| None when dim = 0 ->
check_vec_inc ~loc ~vec_name inc;
if ofs = 1 then dim else raise_vec_bad_ofs ~loc ~vec_name ~ofs ~max_ofs:1
| None ->
check_vec_inc ~loc ~vec_name inc;
if ofs = dim + 1 then 0
else begin
check_vec_ofs ~loc ~vec_name ~ofs ~max_ofs:dim;
calc_vec_max_n ~dim ~ofs ~inc
end
| Some n -> check_vec_dim ~loc ~vec_name ~dim ~ofs ~inc ~n_name ~n; n
* [ get_vec_min_dim ~loc ~vec_name ~ofs ~inc ~n ] @return minimum vector
dimension given offset [ ofs ] , increment [ inc ] , and operation size [ n ]
for a vector named [ vec_name ] at location [ loc ] . @raise Invalid_argument
if any of the parameters are illegal .
dimension given offset [ofs], increment [inc], and operation size [n]
for a vector named [vec_name] at location [loc]. @raise Invalid_argument
if any of the parameters are illegal. *)
let get_vec_min_dim ~loc ~vec_name ~ofs ~inc ~n =
check_vec_inc ~loc ~vec_name inc;
if ofs >= 1 then calc_vec_min_dim ~ofs ~inc ~n
else invalid_arg (sprintf "%s: ofs%s: valid=[1..] got=%d" loc vec_name ofs)
let get_vec_start_stop ~ofsx ~incx ~n =
if n = 0 then 0, 0
else
if incx > 0 then ofsx, ofsx + n * incx
else ofsx - (n - 1) * incx, ofsx + incx
* [ raise_bad_mat_ofs ~loc ~name ~ofs_name ~ofs ~max_ofs ] @raise
Invalid_argument to indicate that a matrix offset [ ofs ] named [ ofs_name ]
for a matrix having [ name ] is invalid ( i.e. is outside of [ 1 .. max_ofs ] ) .
Invalid_argument to indicate that a matrix offset [ofs] named [ofs_name]
for a matrix having [name] is invalid (i.e. is outside of [1..max_ofs]). *)
let raise_bad_mat_ofs ~loc ~name ~ofs_name ~ofs ~max_ofs =
invalid_arg (
sprintf "%s: %s%s: valid=[1..%d] got=%d" loc name ofs_name max_ofs ofs)
* [ raise_mat_bad_r ~loc ~mat_name ~r ~max_r ] @raise Invalid_argument
to indicate that matrix row offset [ r ] is invalid ( i.e. is outside of
[ 1 .. max_r ] ) .
to indicate that matrix row offset [r] is invalid (i.e. is outside of
[1..max_r]). *)
let raise_mat_bad_r ~loc ~mat_name ~r ~max_r =
raise_bad_mat_ofs ~loc ~name:mat_name ~ofs_name:r_str ~ofs:r ~max_ofs:max_r
* [ raise_mat_bad_c ~loc ~mat_name ~c ~max_c ] @raise Invalid_argument
to indicate that matrix column offset [ c ] is invalid ( i.e. is outside of
[ 1 .. max_c ] ) .
to indicate that matrix column offset [c] is invalid (i.e. is outside of
[1..max_c]). *)
let raise_mat_bad_c ~loc ~mat_name ~c ~max_c =
raise_bad_mat_ofs ~loc ~name:mat_name ~ofs_name:c_str ~ofs:c ~max_ofs:max_c
* [ check_mat_r ~loc ~vec_name ~r ~max_r ] checks whether matrix row
offset [ r ] for vector of name [ vec_name ] is invalid ( i.e. outside of
[ 1 .. max_r ] ) . @raise Invalid_argument in that case .
offset [r] for vector of name [vec_name] is invalid (i.e. outside of
[1..max_r]). @raise Invalid_argument in that case. *)
let check_mat_r ~loc ~mat_name ~r ~max_r =
if r < 1 || r > max_r then raise_mat_bad_r ~loc ~mat_name ~r ~max_r
* [ check_mat_c ~loc ~vec_name ~c ~max_c ] checks whether matrix column
offset [ c ] for vector of name [ vec_name ] is invalid ( i.e. outside of
[ 1 .. max_c ] ) . @raise Invalid_argument in that case .
offset [c] for vector of name [vec_name] is invalid (i.e. outside of
[1..max_c]). @raise Invalid_argument in that case. *)
let check_mat_c ~loc ~mat_name ~c ~max_c =
if c < 1 || c > max_c then raise_mat_bad_c ~loc ~mat_name ~c ~max_c
* [ calc_mat_max_rows ~dim1 ~r ] @return maximum row operation length [ m ] for a
matrix given the dimension [ dim1 ] of the matrix and the start row [ r ] .
matrix given the dimension [dim1] of the matrix and the start row [r]. *)
let calc_mat_max_rows ~dim1 ~r = dim1 - r + 1
* [ calc_mat_opt_max_rows ? r dim1 ] @return maximum row operation length
[ m ] for a matrix given the dimension [ dim1 ] of the matrix and the optional
start row [ r ] . Assumes that the offset has already been validated to
not exceed [ dim1 ] , i.e. the returned [ max_m ] is at least [ 1 ] .
[m] for a matrix given the dimension [dim1] of the matrix and the optional
start row [r]. Assumes that the offset has already been validated to
not exceed [dim1], i.e. the returned [max_m] is at least [1]. *)
let calc_mat_opt_max_rows ?(r = 1) dim1 = calc_mat_max_rows ~dim1 ~r
* [ calc_mat_max_cols ~dim2 ~c ] @return maximum column operation length
[ n ] for a matrix given the dimension [ dim1 ] of the matrix and the start
column [ c ] .
[n] for a matrix given the dimension [dim1] of the matrix and the start
column [c]. *)
let calc_mat_max_cols ~dim2 ~c = dim2 - c + 1
* [ calc_mat_opt_max_cols ? c dim1 ] @return maximum column operation length
[ m ] for a matrix given the dimension [ dim2 ] of the matrix and the optional
start column [ c ] . Assumes that the offset has already been validated to
not exceed [ dim2 ] , i.e. the returned [ max_n ] is at least [ 1 ] .
[m] for a matrix given the dimension [dim2] of the matrix and the optional
start column [c]. Assumes that the offset has already been validated to
not exceed [dim2], i.e. the returned [max_n] is at least [1]. *)
let calc_mat_opt_max_cols ?(c = 1) dim2 = calc_mat_max_cols ~dim2 ~c
* [ check_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name ] checks the matrix
row operation length in parameter [ p ] with name [ param_name ] at
location [ loc ] for matrix with name [ mat_name ] and dimension [ dim1 ]
given the operation row [ r ] . @raise Invalid_argument if any arguments
are invalid .
row operation length in parameter [p] with name [param_name] at
location [loc] for matrix with name [mat_name] and dimension [dim1]
given the operation row [r]. @raise Invalid_argument if any arguments
are invalid. *)
let check_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name =
check_var_lt0 ~loc ~name:param_name p;
if p = 0 then check_mat_r ~loc ~mat_name ~r ~max_r:(dim1 + 1)
else begin
check_mat_r ~loc ~mat_name ~r ~max_r:dim1;
let max_rows = calc_mat_max_rows ~dim1 ~r in
if p > max_rows then
raise_max_len ~loc ~len_name:param_name ~len:p ~max_len:max_rows
end
* [ check_mat_m ~loc ~mat_name ~dim1 ~r ~m ] checks the matrix row operation
length in parameter [ m ] at location [ loc ] for matrix with name [ mat_name ]
and dimension [ dim1 ] given the operation row [ r ] . @raise Invalid_argument
if any arguments are invalid .
length in parameter [m] at location [loc] for matrix with name [mat_name]
and dimension [dim1] given the operation row [r]. @raise Invalid_argument
if any arguments are invalid. *)
let check_mat_m ~loc ~mat_name ~dim1 ~r ~m =
check_mat_rows ~loc ~mat_name ~dim1 ~r ~p:m ~param_name:m_str
let check_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name =
check_var_lt0 ~loc ~name:param_name p;
if p = 0 then check_mat_c ~loc ~mat_name ~c ~max_c:(dim2 + 1)
else begin
check_mat_c ~loc ~mat_name ~c ~max_c:dim2;
let max_cols = calc_mat_max_cols ~dim2 ~c in
if p > max_cols then
raise_max_len ~loc ~len_name:param_name ~len:p ~max_len:max_cols
end
let check_mat_n ~loc ~mat_name ~dim2 ~c ~n =
check_mat_cols ~loc ~mat_name ~dim2 ~c ~p:n ~param_name:n_str
* [ check_mat_mn ~loc ~mat_name ~dim1 ~c ~m ~n ] checks the matrix
operation lengths in parameters [ m ] and [ n ] at location [ loc ] for matrix
with name [ mat_name ] and dimensions [ dim1 ] and [ dim2 ] given the operation
row [ r ] and column [ c ] . @raise Invalid_argument if any arguments are
invalid .
operation lengths in parameters [m] and [n] at location [loc] for matrix
with name [mat_name] and dimensions [dim1] and [dim2] given the operation
row [r] and column [c]. @raise Invalid_argument if any arguments are
invalid. *)
let check_mat_mn ~loc ~mat_name ~dim1 ~dim2 ~r ~c ~m ~n =
check_mat_m ~loc ~mat_name ~dim1 ~r ~m;
check_mat_n ~loc ~mat_name ~dim2 ~c ~n
* [ get_mat_rows ~loc ~mat_name ~dim1 ~r p ~param_name ] checks or infers
the matrix row operation length in the option parameter [ p ] with
name [ param_name ] at location [ loc ] for matrix with name [ mat_name ]
and dimension [ dim1 ] given the row operation offset [ r ] . @raise
Invalid_argument if any arguments are invalid .
the matrix row operation length in the option parameter [p] with
name [param_name] at location [loc] for matrix with name [mat_name]
and dimension [dim1] given the row operation offset [r]. @raise
Invalid_argument if any arguments are invalid. *)
let get_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name =
match p with
| None when dim1 = 0 ->
if r = 1 then dim1 else raise_mat_bad_r ~loc ~mat_name ~r ~max_r:1
| None ->
let max_r = dim1 + 1 in
check_mat_r ~loc ~mat_name ~r ~max_r;
max_r - r
| Some p -> check_mat_rows ~loc ~mat_name ~dim1 ~r ~p ~param_name; p
* [ get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m ~m_name ] checks or infers the
matrix row operation length in the option parameter [ m ] with name [ m_name ]
at location [ loc ] for matrix with name [ mat_name ] and dimension [ dim1 ]
given the row operation offset [ r ] . @raise Invalid_argument if any
arguments are invalid .
matrix row operation length in the option parameter [m] with name [m_name]
at location [loc] for matrix with name [mat_name] and dimension [dim1]
given the row operation offset [r]. @raise Invalid_argument if any
arguments are invalid. *)
let get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m ~m_name =
get_mat_rows ~loc ~mat_name ~dim1 ~r ~p:m ~param_name:m_name
* [ get_mat_m ~loc ~mat_name ~dim1 ~r ~m ] checks or infers the matrix row
operation length in the option parameter [ m ] at location [ loc ] for matrix
with name [ mat_name ] and dimension [ dim1 ] given the row operation offset
[ r ] . @raise Invalid_argument if any arguments are invalid .
operation length in the option parameter [m] at location [loc] for matrix
with name [mat_name] and dimension [dim1] given the row operation offset
[r]. @raise Invalid_argument if any arguments are invalid. *)
let get_mat_m ~loc ~mat_name ~dim1 ~r ~m =
get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m_name:m_str ~m
let get_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name =
match p with
| None when dim2 = 0 ->
if c = 1 then dim2 else raise_mat_bad_c ~loc ~mat_name ~c ~max_c:1
| None ->
let max_c = dim2 + 1 in
check_mat_c ~loc ~mat_name ~c ~max_c;
max_c - c
| Some p -> check_mat_cols ~loc ~mat_name ~dim2 ~c ~p ~param_name; p
let get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name =
get_mat_cols ~loc ~mat_name ~dim2 ~c ~p:n ~param_name:n_name
let get_mat_n ~loc ~mat_name ~dim2 ~c ~n =
get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name:n_str
* [ get_mat_min_dim1 ~loc ~mat_name ~r ~m ] @return the minimum row dimension
of a matrix with name [ mat_name ] at location [ loc ] given row [ r ] and
row operation length [ m ] . @raise Invalid_argument if any arguments
are invalid .
of a matrix with name [mat_name] at location [loc] given row [r] and
row operation length [m]. @raise Invalid_argument if any arguments
are invalid. *)
let get_mat_min_dim1 ~loc ~mat_name ~r ~m =
if r > 0 then r + m - 1
else invalid_arg (sprintf "%s: %sr < 1: %d" loc mat_name r)
let get_mat_min_dim2 ~loc ~mat_name ~c ~n =
if c > 0 then c + n - 1
else invalid_arg (sprintf "%s: %sc < 1: %d" loc mat_name c)
* [ check_mat_min_dim1 ~loc ~mat_name ~dim1 ~min_dim1 ] checks the minimum
row dimension [ min_dim1 ] of a matrix with name [ mat_name ] at location
[ loc ] given its row dimension [ dim1 ] . @raise Invalid_argument if
any arguments are invalid .
row dimension [min_dim1] of a matrix with name [mat_name] at location
[loc] given its row dimension [dim1]. @raise Invalid_argument if
any arguments are invalid. *)
let check_mat_min_dim1 ~loc ~mat_name ~dim1 ~min_dim1 =
if dim1 < min_dim1 then
invalid_arg (
sprintf "%s: dim1(%s): valid=[%d..[ got=%d" loc mat_name min_dim1 dim1)
let check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2 =
if dim2 < min_dim2 then
invalid_arg (
sprintf "%s: dim2(%s): valid=[%d..[ got=%d" loc mat_name min_dim2 dim2)
let check_mat_min_dims ~loc ~mat_name ~dim1 ~dim2 ~min_dim1 ~min_dim2 =
check_mat_min_dim1 ~loc ~mat_name ~dim1 ~min_dim1;
check_mat_min_dim2 ~loc ~mat_name ~dim2 ~min_dim2
let check_vec loc vec_name vec min_dim =
check_vec_min_dim ~loc ~vec_name ~dim:(Array1.dim vec) ~min_dim
* [ check_vec_is_perm loc vec_name vec n ] checks whether [ vec ]
is a valid permutation vector .
is a valid permutation vector. *)
let check_vec_is_perm loc vec_name vec n =
let dim = Array1.dim vec in
if dim <> n then
invalid_arg (sprintf "%s: dim(%s): valid=%d got=%d" loc vec_name n dim)
else
let ub = Int32.of_int n in
for i = 1 to dim do
let r = Array1.get vec i in
check_var_within loc (sprintf "%s(%d)" k_str i) r 1l ub Int32.to_string
done
let get_vec loc vec_name vec ofs inc n vec_create =
let min_dim = get_vec_min_dim ~loc ~vec_name ~ofs ~inc ~n in
match vec with
| Some vec -> check_vec loc vec_name vec min_dim; vec
| None -> vec_create min_dim
* [ inc vec n_name n ] if the dimension [ n ]
is given , check that the vector [ vec ] is big enough , otherwise return
the maximal [ n ] for the given vector [ vec ] .
is given, check that the vector [vec] is big enough, otherwise return
the maximal [n] for the given vector [vec]. *)
let get_dim_vec loc vec_name ofs inc vec n_name n =
get_vec_n ~loc ~vec_name ~dim:(Array1.dim vec) ~ofs ~inc ~n_name n
let check_vec_empty ~loc ~vec_name ~dim =
if dim = 0 then
invalid_arg (sprintf "%s: dimension of vector %s is zero" loc vec_name)
else ()
let get_mat loc mat_name mat_create r c mat m n =
let min_dim1 = get_mat_min_dim1 ~loc ~mat_name ~r ~m in
let min_dim2 = get_mat_min_dim2 ~loc ~mat_name ~c ~n in
match mat with
| None -> mat_create min_dim1 min_dim2
| Some mat ->
let dim1 = Array2.dim1 mat in
let dim2 = Array2.dim2 mat in
check_mat_min_dims ~loc ~mat_name ~dim1 ~dim2 ~min_dim1 ~min_dim2;
mat
let check_dim1_mat loc mat_name mat mat_r m_name m =
let dim1 = Array2.dim1 mat in
check_mat_rows ~loc ~mat_name ~dim1 ~r:mat_r ~p:m ~param_name:m_name
let check_dim2_mat loc mat_name mat mat_c n_name n =
let dim2 = Array2.dim2 mat in
check_mat_cols ~loc ~mat_name ~dim2 ~c:mat_c ~p:n ~param_name:n_name
let check_dim_mat loc mat_name mat_r mat_c mat m n =
check_dim1_mat loc mat_name mat mat_r m_str m;
check_dim2_mat loc mat_name mat mat_c n_str n
let get_dim1_mat loc mat_name mat r m_name m =
let dim1 = Array2.dim1 mat in
get_mat_dim1 ~loc ~mat_name ~dim1 ~r ~m ~m_name
let get_dim2_mat loc mat_name mat c n_name n =
let dim2 = Array2.dim2 mat in
get_mat_dim2 ~loc ~mat_name ~dim2 ~c ~n ~n_name
let check_mat_empty ~loc ~mat_name ~dim1 ~dim2 =
if dim1 = 0 then
invalid_arg (sprintf "%s: dim1 of matrix %s is zero" loc mat_name)
else if dim2 = 0 then
invalid_arg (sprintf "%s: dim2 of matrix %s is zero" loc mat_name)
else ()
let get_vec_inc loc vec_name = function
| Some inc -> check_vec_inc ~loc ~vec_name inc; inc
| None -> 1
let get_vec_ofs loc var = function
| Some ofs when ofs < 1 -> invalid_arg (sprintf "%s: ofs%s < 1" loc var)
| Some ofs -> ofs
| None -> 1
module Mat_patt = struct
type kind = Upper | Lower
let check_upent ~loc ~l ~m =
if l <= 0 then
failwith (sprintf "%s: illegal initial rows (%d) of upper pentagon" loc l)
else if l > m then
failwith (
sprintf
"%s: initial rows (%d) of upper pentagon exceed maximum [m] (%d)"
loc l m)
let check_lpent ~loc ~l ~n =
if l <= 0 then
failwith (
sprintf "%s: illegal initial columns (%d) of lower pentagon" loc l)
else if l > n then
failwith (
sprintf
"%s: initial columns (%d) of lower pentagon exceed maximum [n] (%d)"
loc l n)
let check_args ~loc ~m ~n : Types.Mat.patt option -> unit= function
| None | Some `Full | Some `Utr | Some `Ltr -> ()
| Some `Upent l -> check_upent ~loc ~l ~m
| Some `Lpent l -> check_lpent ~loc ~l ~n
let normalize_args ~loc ~m ~n : Types.Mat.patt option -> kind * int = function
| None | Some `Full -> Lower, n
| Some `Utr -> Upper, 1
| Some `Ltr -> Lower, 1
| Some `Upent l -> check_upent ~loc ~l ~m; Upper, l
| Some `Lpent l -> check_lpent ~loc ~l ~n; Lower, l
let patt_of_uplo ~(uplo : [`U | `L] option) ~(patt : Types.Mat.patt option) =
match uplo with
| Some `U -> Some `Utr
| Some `L -> Some `Ltr
| None -> patt
let patt_of_up ~up ~(patt : Types.Mat.patt option) =
match up with
| Some true -> Some `Utr
| Some false -> Some `Ltr
| None -> patt
problem - dependent parameters for LAPACK - functions
external ilaenv :
(int [@untagged]) ->
string ->
string ->
(int [@untagged]) ->
(int [@untagged]) ->
(int [@untagged]) ->
(int [@untagged]) ->
(int [@untagged])
= "lacaml_ilaenv_stub_bc" "lacaml_ilaenv_stub" [@@noalloc]
let get_work loc vec_create work min_lwork opt_lwork lwork_str =
match work with
| Some work ->
let lwork = Array1.dim work in
if lwork < min_lwork then
invalid_arg (
sprintf "%s: %s: valid=[%d..[ got=%d" loc lwork_str min_lwork lwork)
else work, lwork
| None -> vec_create opt_lwork, opt_lwork
let calc_unpacked_dim loc n_vec =
let n = truncate (sqrt (float (8 * n_vec + 1)) *. 0.5) in
if (n * n + n) / 2 <> n_vec then
failwith (sprintf "%s: illegal vector length: %d" loc n_vec)
else n
let get_unpacked_dim loc ?n n_vec =
match n with
| None -> calc_unpacked_dim loc n_vec
| Some n ->
let n_unpacked = calc_unpacked_dim loc n_vec in
if n < 0 || n > n_unpacked then
invalid_arg (sprintf "%s: n: valid=[0..%d] got=%d" loc n_unpacked n)
else n
let get_vec_geom loc var ofs inc =
get_vec_ofs loc var ofs, get_vec_inc loc var inc
A symmetric band ( SB ) or triangular band ( TB ) matrix has physical size
[ k+1]*[n ] for a logical matrix of size [ n]*[n ] . Check and return the [ k ]
( possibly also given by the optional argument [ k ] ) .
[k+1]*[n] for a logical matrix of size [n]*[n]. Check and return the [k]
(possibly also given by the optional argument [k]). *)
let get_k_mat_sb loc mat_name mat mat_r k_name k =
let dim1 = Array2.dim1 mat in
let max_k = dim1 - mat_r in
if mat_r < 1 || max_k < 0 then
invalid_arg (
sprintf "%s: mat_r(%s): valid=[1..%d] got=%d" loc mat_name dim1 mat_r);
match k with
| None -> max_k
| Some k ->
if k < 0 || max_k < k then
invalid_arg (
sprintf "%s: %s(%s): valid=[0..%d] got=%d"
loc k_name mat_name max_k k)
else k
let get_dim_mat_packed loc mat_name ofsmat mat n_name n =
let dim = Array1.dim mat in
match n with
| Some n ->
if n < 0 || dim < n1 then
invalid_arg (sprintf "%s: %s(%s): valid=[0..%d] got=%d"
loc n_name mat_name dim n1)
else n
max 0 (truncate((sqrt(9. +. 8. *. float(dim - ofsmat)) -. 1.) /. 2.))
let get_n_of_square loc mat_name r c mat n =
let n = get_dim2_mat loc mat_name mat c n_str n in
check_dim1_mat loc mat_name mat r n_str n;
n
let get_n_of_a loc ar ac a n = get_n_of_square loc a_str ar ac a n
let get_nrhs_of_b loc n br bc b nrhs =
let nrhs = get_dim2_mat loc b_str b bc nrhs_str nrhs in
check_dim1_mat loc b_str b br n_str n;
nrhs
let orgqr_err ~loc ~m ~n ~k ~work ~a ~err =
let msg =
match err with
| -1 -> sprintf "m: valid=[0..[ got=%d" m
| -2 -> sprintf "n: valid=[0..%d] got=%d" m n
| -3 -> sprintf "k: valid=[0..%d] got=%d" n k
| -5 -> sprintf "dim2(a): valid=[%d..[ got=%d" n (Array2.dim2 a)
| -8 ->
sprintf "dim1(work): valid=[%d..[ got=%d" (max 1 n) (Array1.dim work)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n))
in
invalid_arg (sprintf "%s: %s" loc msg)
let orgqr_get_params loc ?m ?n ?k ~tau ~ar ~ac a =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
if m < n then invalid_arg (sprintf "%s: m(%d) < n(%d)" loc m n)
else
let k = get_dim_vec loc tau_str 1 1 tau k_str k in
m, n, k
let ormqr_err ~loc ~side ~m ~n ~k ~lwork ~a ~c ~err =
let nq, nw =
match side with
| `L -> m, n
| `R -> n, m
in
let msg =
match err with
| -3 -> sprintf "m: valid=[0..[ got=%d" m
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "k: valid=[0..%d] got=%d" k nq
| -7 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 nq) (Array2.dim1 a)
| -10 -> sprintf "dim1(c): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 c)
| -12 ->
let min_lwork = max 1 nw in
sprintf "lwork: valid=[%d..[ got=%d" min_lwork lwork
| _ -> raise (InternalError (sprintf "%s: error code %d" loc err))
in
invalid_arg (sprintf "%s: %s" loc msg)
let ormqr_get_params loc ~side ?m ?n ?k ~tau ~ar ~ac a ~cr ~cc c =
let m = get_dim1_mat loc c_str c cr m_str m in
let n = get_dim2_mat loc c_str c cc n_str n in
let k = get_dim2_mat loc a_str a ac k_str k in
begin match side with
| `L ->
if m < k then failwith (sprintf "%s: m(%d) < k(%d)" loc m k);
check_dim1_mat loc a_str a ar m_str (max 1 m)
| `R ->
if n < k then failwith (sprintf "%s: n(%d) < k(%d)" loc n k);
check_dim1_mat loc a_str a ar n_str (max 1 n)
end;
check_vec loc tau_str tau k;
m, n, k
let gelsX_err loc gelsX_min_work ar a m n lwork nrhs br b err =
if err > 0 then
failwith
(sprintf "%s: failed to converge on off-diagonal element %d" loc err)
else
let msg =
match err with
| -1 -> sprintf "m: valid=[0..[ got=%d" m
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -5 ->
sprintf "dim1(a): valid=[%d..[ got=%d"
(max 1 m + ar - 1) (Array2.dim1 a)
| -7 ->
let min_dim = max 1 (max m n) + br - 1 in
sprintf "dim1(b): valid=[%d..[ got=%d" min_dim (Array2.dim1 b)
| -12 ->
let min_lwork = gelsX_min_work ~m ~n ~nrhs in
sprintf "lwork: valid=[%d..[ got=%d" min_lwork lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let gelsX_get_s vec_create loc min_dim ofss = function
| Some s ->
let dim_s = Array1.dim s in
let min_dim_ofs = ofss - 1 + min_dim in
if dim_s < min_dim_ofs then
invalid_arg (sprintf "%s: s: valid=[%d..[ got=%d" loc min_dim_ofs dim_s)
else s
| None -> vec_create min_dim
let gelsX_get_params loc ar ac a m n nrhs br bc b =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
let nrhs = get_dim2_mat loc b_str b bc nrhs_str nrhs in
check_dim1_mat loc b_str b br m_str (max m n);
m, n, nrhs
let xxev_get_params loc ar ac a n vectors up =
let n = get_n_of_a loc ar ac a n in
let jobz = get_job_char vectors in
let uplo = get_uplo_char up in
n, jobz, uplo
let xxev_get_wx vec_create loc wname ofsw w n =
match w with
| None -> vec_create (ofsw - 1 + n)
| Some w -> check_vec loc wname w (ofsw - 1 + n); w
let geev_get_job_side loc mat_empty mat_create mat_name n r c mat_opt =
match mat_opt with
| None ->
if r < 1 then failwith (sprintf "%s: %sr < 1" loc mat_name)
else if c < 1 then failwith (sprintf "%s: %sc < 1" loc mat_name)
else r, c, mat_create (n + r - 1) (n + c - 1), job_char_true, true
| Some None -> 1, 1, mat_empty, job_char_false, false
| Some (Some mat) ->
check_dim1_mat loc mat_name mat r n_str n;
check_dim2_mat loc mat_name mat c n_str n;
r, c, mat, job_char_true, true
let geev_gen_get_params loc mat_empty mat_create ar ac a n
leftr leftc left rightr rightc right =
let n = get_n_of_a loc ar ac a n in
let leftr, leftc, vl, jobvl, lvs =
geev_get_job_side loc mat_empty mat_create "vl" n leftr leftc left in
let rightr, rightc, vr, jobvr, rvs =
geev_get_job_side loc mat_empty mat_create "vr" n rightr rightc right in
n, leftr, leftc, vl, jobvl, rightr, rightc, vr, jobvr, lvs || rvs
let gXmv_get_params loc vec_create m n ofsx incx x ofsy incy y trans =
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
let ofsy, incy = get_vec_geom loc y_str ofsy incy in
let lx, ly, trans_char =
let trans_char = get_trans_char trans in
if trans = `N then n, m, trans_char else m, n, trans_char in
check_vec loc x_str x (ofsx + (lx - 1) * abs incx);
let y = get_vec loc y_str y ofsy incy ly vec_create in
ofsx, incx, ofsy, incy, y, trans_char
let symv_get_params loc vec_create ar ac a n ofsx incx x ofsy incy y up =
let n = get_dim1_mat loc a_str a ar n_str n in
check_dim2_mat loc a_str a ac n_str n;
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
let ofsy, incy = get_vec_geom loc y_str ofsy incy in
check_vec loc x_str x (ofsx + (n - 1) * abs incx);
let y = get_vec loc y_str y ofsy incy n vec_create in
check_vec loc y_str y (ofsy + (n - 1) * abs incy);
n, ofsx, incx, ofsy, incy, y, get_uplo_char up
let trXv_get_params loc ar ac a n ofsx incx x up trans unit_triangular =
let n = get_dim1_mat loc a_str a ar n_str n in
check_dim2_mat loc a_str a ac n_str n;
let trans_char = get_trans_char trans in
let diag_char = get_diag_char unit_triangular in
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
check_vec loc x_str x (ofsx + (n - 1) * abs incx);
n, ofsx, incx, get_uplo_char up, trans_char, diag_char
let tpXv_get_params loc ofsap ap ?n ofsx incx x up trans unit_triangular =
let ofsap = get_vec_ofs loc ap_str ofsap in
let n = get_unpacked_dim loc ?n (Array1.dim ap - ofsap + 1) in
let trans_char = get_trans_char trans in
let diag_char = get_diag_char unit_triangular in
let ofsx, incx = get_vec_geom loc x_str ofsx incx in
check_vec loc x_str x (ofsx + (n - 1) * abs incx);
n, ofsap, ofsx, incx, get_uplo_char up, trans_char, diag_char
let get_c loc mat_create cr cc c m n = get_mat loc c_str mat_create cr cc c m n
let get_rows_mat_tr loc mat_str mat mat_r mat_c transp dim_str dim =
match transp with
| `N -> get_dim1_mat loc mat_str mat mat_r dim_str dim
| _ -> get_dim2_mat loc mat_str mat mat_c dim_str dim
let get_cols_mat_tr loc mat_str mat mat_r mat_c transp dim_str dim =
match transp with
| `N -> get_dim2_mat loc mat_str mat mat_c dim_str dim
| _ -> get_dim1_mat loc mat_str mat mat_r dim_str dim
let get_inner_dim loc mat1_str mat1 mat1_r mat1_c tr1
mat2_str mat2 mat2_r mat2_c tr2 dim_str k =
let k1 = get_cols_mat_tr loc mat1_str mat1 mat1_r mat1_c tr1 dim_str k in
let k2 = get_rows_mat_tr loc mat2_str mat2 mat2_r mat2_c tr2 dim_str k in
if k = None && k1 <> k2 then
failwith (
sprintf "%s: inner dimensions of matrices do not match (%d,%d)"
loc k1 k2)
else k1
let gemm_get_params loc mat_create ar ac a transa br bc b cr transb cc c m n k =
let m = get_rows_mat_tr loc a_str a ar ac transa m_str m in
let n = get_cols_mat_tr loc b_str b br bc transb n_str n in
let k = get_inner_dim loc a_str a ar ac transa b_str b br bc transb k_str k in
let transa = get_trans_char transa in
let transb = get_trans_char transb in
let c = get_c loc mat_create cr cc c m n in
m, n, k, transa, transb, c
let check_mat_square loc mat_str mat mat_r mat_c n =
check_dim1_mat loc mat_str mat mat_r n_str n;
check_dim2_mat loc mat_str mat mat_c n_str n
let symm_get_params loc mat_create ar ac a br bc b cr cc c m n side up =
let m = get_dim1_mat loc b_str b br m_str m in
let n = get_dim2_mat loc b_str b bc n_str n in
if side = `L then check_mat_square loc a_str a ar ac m
else check_mat_square loc a_str a ar ac n;
let side_char = get_side_char side in
let uplo_char = get_uplo_char up in
let c = get_c loc mat_create cr cc c m n in
m, n, side_char, uplo_char, c
let trXm_get_params loc ar ac a br bc b m n side up transa diag =
let m = get_dim1_mat loc b_str b br m_str m in
let n = get_dim2_mat loc b_str b bc n_str n in
if side = `L then check_mat_square loc a_str a ar ac m
else check_mat_square loc a_str a ar ac n;
let side_char = get_side_char side in
let uplo_char = get_uplo_char up in
let transa = get_trans_char transa in
let diag_char = get_diag_char diag in
m, n, side_char, uplo_char, transa, diag_char
let syrk_get_params loc mat_create ar ac a cr cc c n k up trans =
let n = get_rows_mat_tr loc a_str a ar ac trans n_str n in
let k = get_cols_mat_tr loc a_str a ar ac trans k_str k in
let trans_char = get_trans_char trans in
let uplo_char = get_uplo_char up in
let c = get_c loc mat_create cr cc c n n in
n, k, uplo_char, trans_char, c
let syr2k_get_params loc mat_create ar ac a br bc b cr cc c n k up trans =
let n = get_rows_mat_tr loc a_str a ar ac trans n_str n in
let k = get_cols_mat_tr loc a_str a ar ac trans k_str k in
begin match trans with
| `N ->
check_dim1_mat loc b_str b br n_str n;
check_dim2_mat loc b_str b bc k_str k;
| _ ->
check_dim1_mat loc b_str b br k_str k;
check_dim2_mat loc b_str b bc n_str n;
end;
let trans_char = get_trans_char trans in
let uplo_char = get_uplo_char up in
let c = get_c loc mat_create cr cc c n n in
n, k, uplo_char, trans_char, c
let xlange_get_params loc m n ar ac a =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
m, n
? ? -- auxiliary functions
let xxtrs_get_params loc ar ac a n br bc b nrhs =
let n = get_n_of_a loc ar ac a n in
let nrhs = get_nrhs_of_b loc n br bc b nrhs in
n, nrhs
let xxtrs_err loc n nrhs a b err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -8 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let xxtri_singular_err loc err =
failwith (sprintf "%s: singular on index %i" loc err)
let xxtri_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let xxcon_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
geXrf -- auxiliary functions
let geXrf_get_params loc m n ar ac a =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
m, n
let getrf_err loc m n a err =
let msg =
match err with
| -1 -> sprintf "n: valid=[0..[ got=%d" n
| -2 -> sprintf "m: valid=[0..[ got=%d" m
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let getrf_lu_err loc err =
failwith (sprintf "%s: U(%i,%i)=0 in the LU factorization" loc err err)
let getrf_get_ipiv loc ipiv m n =
match ipiv with
| None -> create_int32_vec (min m n)
| Some ipiv ->
check_vec loc ipiv_str ipiv (min m n);
ipiv
let sytrf_get_ipiv loc ipiv n =
match ipiv with
| None -> create_int32_vec n
| Some ipiv ->
check_vec loc ipiv_str ipiv n;
ipiv
let sytrf_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let sytrf_fact_err loc err =
failwith (sprintf "%s: D(%i,%i)=0 in the factorization" loc err err)
let potrf_chol_err loc err =
failwith (
sprintf "%s: leading minor of order %d is not positive definite" loc err)
let potrf_err loc n a err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| _ -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let potrs_err loc n nrhs a b err =
let msg =
match err with
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -7 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let trtrs_err loc n nrhs a b err =
let msg =
match err with
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -7 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -9 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let tbtrs_err loc n nrhs kd ab b err =
let msg =
match err with
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "kd: valid=[0..[ got=%d" kd
| -6 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -8 -> sprintf "dim1(ab): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 ab)
| -10 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let getri_err loc getri_min_lwork n a lwork err =
let msg =
match err with
| -1 -> sprintf "n: valid=[0..[ got=%d" n
| -3 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| -6 ->
let min_lwork = getri_min_lwork n in
sprintf "lwork: valid=[%d..[ got=%d" min_lwork lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let trtri_err loc n a err =
let msg =
match err with
| -3 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let geqrf_err loc m n a err =
let msg =
match err with
| -1 -> sprintf "m: valid=[0..[ got=%d" m
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
gecon -- auxiliary functions
let gecon_err loc norm_char n a err =
let msg =
match err with
| -1 -> sprintf "norm: valid=['O', I'] got='%c'" norm_char
| -2 -> sprintf "n: valid=[0..[ got=%d" n
| -4 -> sprintf "dim1(a): valid=%d..[ got=%d" (max 1 n) (Array2.dim1 a)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let gees_err loc n err jobvs sort =
if err > 0 && err <= n then
failwith (sprintf "%s: %d eigenvalue elements did not converge" loc err)
else if err = n + 1 then
failwith (
sprintf "%s: eigenvalues not reordered, too close to separate" loc)
else if err = n + 2 then
failwith (
sprintf "%s: after reordering, roundoff changed values of some \
complex eigenvalues so that leading eigenvalues in \
the Schur form no longer satisfy SELECT" loc)
else
let msg =
match err with
| -1 -> sprintf "JOBVS: valid=['N', V'] got='%c'" jobvs
| -2 -> sprintf "SORT: valid=['N', S'] got='%c'" sort
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| n -> raise (InternalError (sprintf "%s: error code %d" loc n))
in
invalid_arg (sprintf "%s: %s" loc msg)
let dummy_select_fun _ = false
let gees_get_params_generic
loc mat_create mat_empty jobvs sort n ar ac a vsr vsc vs =
let n = get_n_of_a loc ar ac a n in
let jobvs, min_ldvs =
match jobvs with
| `No_Schur_vectors -> 'N', 1
| `Compute_Schur_vectors -> 'V', n
in
let vs =
match vs with
| Some vs ->
check_dim1_mat loc vs_str vs vsr vsr_str min_ldvs;
check_dim2_mat loc vs_str vs vsc vsc_str n;
vs
| None when jobvs = 'N' -> mat_empty
| None -> mat_create min_ldvs n
in
let sort, select, select_fun =
match sort with
| `No_sort -> 'N', 0, dummy_select_fun
| `Select_left_plane -> 'S', 0, dummy_select_fun
| `Select_right_plane -> 'S', 1, dummy_select_fun
| `Select_interior_disk -> 'S', 2, dummy_select_fun
| `Select_exterior_disk -> 'S', 3, dummy_select_fun
| `Select_custom select_fun -> 'S', 4, select_fun
in
jobvs, sort, select, select_fun, n, vs
let gees_get_params_real
loc vec_create mat_create mat_empty
jobvs sort n ar ac a wr wi vsr vsc vs =
let jobvs, sort, select, select_fun, n, vs =
gees_get_params_generic
loc mat_create mat_empty jobvs sort n ar ac a vsr vsc vs
in
let wr =
match wr with
| None -> vec_create n
| Some wr -> check_vec loc wr_str wr n; wr
in
let wi =
match wi with
| None -> vec_create n
| Some wi -> check_vec loc wi_str wi n; wi
in
jobvs, sort, select, select_fun, n, vs, wr, wi
let gees_get_params_complex
loc vec_create mat_create mat_empty jobvs sort n ar ac a w vsr vsc vs =
let jobvs, sort, select, select_fun, n, vs =
gees_get_params_generic
loc mat_create mat_empty jobvs sort n ar ac a vsr vsc vs
in
let w =
match w with
| None -> vec_create n
| Some w -> check_vec loc w_str w n; w
in
jobvs, sort, select, select_fun, n, vs, w
let gesvd_err loc jobu jobvt m n a u vt lwork err =
if err > 0 then
failwith
(sprintf "%s: %d off-diagonal elements did not converge" loc err)
else
let msg =
match err with
| -3 -> sprintf "m: valid=[0..[ got=%d" m
| -4 -> sprintf "n: valid=[0..[ got=%d" n
| -6 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 a)
| -9 ->
sprintf "dim1(u): valid=[%d..[ got=%d"
(match jobu with 'A' | 'S' -> max 1 m | _ -> 1)
(Array2.dim1 u)
| -11 ->
sprintf "dim1(vt): valid=[%d..[ got=%d"
(
match jobvt with
| 'A' -> max 1 n
| 'S' -> max 1 (min m n)
| _ -> 1
)
(Array2.dim1 vt)
| -13 -> sprintf "lwork: valid=[%d..[ got=%d" 1 lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let gesvd_get_params
loc vec_create mat_create jobu jobvt m n ar ac a s ur uc u vtr vtc vt =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
let s = get_vec loc s_str s 1 1 (min m n) vec_create in
let um, un =
match jobu with
| `A -> m, m
| `S -> m, min m n
LDU > = 1 even when U not referenced
let u =
match u with
| Some u ->
check_dim1_mat loc u_str u ur um_str um;
check_dim2_mat loc u_str u uc un_str un;
u
| None -> mat_create um un in
let vm, vn =
match jobvt with
| `A -> n, n
| `S -> min m n, n
LDVT > = 1 even when VT not referenced
let vt =
match vt with
| Some vt ->
check_dim1_mat loc vt_str vt vtr vm_str vm;
check_dim2_mat loc vt_str vt vtc vn_str vn;
vt
| None -> mat_create vm vn in
let jobu_c = get_s_d_job_char jobu in
let jobvt_c = get_s_d_job_char jobvt in
jobu_c, jobvt_c, m, n, s, u, vt
gesdd -- auxiliary functions
let gesdd_err loc jobz m n a u vt lwork err =
if err > 0 then
failwith (
sprintf "%s: %d DBDSDC did not converge, updating process failed" loc err)
else
let msg =
match err with
| -2 -> sprintf "m: valid=[0..[ got=%d" m
| -3 -> sprintf "n: valid=[0..[ got=%d" n
| -5 -> sprintf "dim1(a): valid=[%d..[ got=%d" (max 1 m) (Array2.dim1 a)
| -8 ->
sprintf "dim1(u): valid=[%d..[ got=%d"
(
if jobz = 'A' || jobz = 'S' || (jobz = 'O' && m < n)
then max 1 m
else 1
)
(Array2.dim1 u)
| -10 ->
sprintf "dim1(vt): valid=[%d..[ got=%d"
(
if jobz = 'A' || (jobz = 'O' && m >= n) then max 1 n
else if jobz = 'S' then max 1 (min m n)
else 1
)
(Array2.dim1 vt)
| -12 -> sprintf "lwork: valid=[%d..[ got=%d" 1 lwork
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let gesdd_get_params
loc vec_create mat_create jobz m n ar ac a s ur uc u vtr vtc vt =
let m = get_dim1_mat loc a_str a ar m_str m in
let n = get_dim2_mat loc a_str a ac n_str n in
let min_m_n = min m n in
let s = get_vec loc s_str s 1 1 min_m_n vec_create in
let um, un, vm, vn =
match jobz with
| `A -> m, m, n, n
| `S -> m, min_m_n, min_m_n, n
| `O -> if m >= n then 1, 1, n, n else m, m, m, n
LDU > = 1 even when U not referenced
let u =
match u with
| Some u ->
check_dim1_mat loc u_str u ur um_str um;
check_dim2_mat loc u_str u uc un_str un;
u
| None -> mat_create um un in
let vt =
match vt with
| Some vt ->
check_dim1_mat loc vt_str vt vtr vm_str vm;
check_dim2_mat loc vt_str vt vtc vn_str vn;
vt
| None -> mat_create vm vn in
let jobz_c = get_s_d_job_char jobz in
jobz_c, m, n, s, u, vt
let xxsv_err loc n nrhs b err =
let msg =
match err with
| -1 -> sprintf "n: valid=[0..[ got=%d" n
| -2 -> sprintf "nrhs: valid=[0..[ got=%d" nrhs
| -7 -> sprintf "dim1(b): valid=[%d..[ got=%d" (max 1 n) (Array2.dim1 b)
| n -> raise (InternalError (sprintf "%s: error code %d" loc n)) in
invalid_arg (sprintf "%s: %s" loc msg)
let xxsv_lu_err loc err =
failwith (sprintf "%s: U(%i,%i)=0 in the LU factorization" loc err err)
let xxsv_pos_err loc err =
let msg =
sprintf
"%s: the leading minor of order %i is not positive definite" loc err in
failwith msg
let xxsv_ind_err loc err =
let msg =
sprintf
"%s: D(%i,%i)=0 in the diagonal pivoting factorization" loc err err in
failwith msg
let xxsv_a_err loc a n =
let msg =
sprintf "%s: dim1(a): valid=[%d..[ got=%d" loc (max 1 n) (Array2.dim1 a) in
invalid_arg msg
let xxsv_work_err loc lwork =
invalid_arg (sprintf "%s: dim(work): valid=[1..[ got=%d" loc lwork)
let xxsv_get_ipiv loc ipiv n =
match ipiv with
| None -> create_int32_vec n
| Some ipiv ->
check_vec loc ipiv_str ipiv n;
ipiv
let xxsv_get_params loc ar ac a n br bc b nrhs =
let n = get_n_of_a loc ar ac a n in
let nrhs = get_nrhs_of_b loc n br bc b nrhs in
n, nrhs
|
5b6bf378672e76e38255bab5753dfbc47777ca6ddaffaa66b85d2f3eca51deba | davidlazar/ocaml-semantics | match-or04.ml | match (2, 7) with _, (2|6) -> 1 | (1|2), (3|7) -> 0 | (2,7) -> 2 | _ -> 3
| null | https://raw.githubusercontent.com/davidlazar/ocaml-semantics/6f302c6b9cced0407d501d70ad25c2d2aefbb77d/tests/unit/match-or04.ml | ocaml | match (2, 7) with _, (2|6) -> 1 | (1|2), (3|7) -> 0 | (2,7) -> 2 | _ -> 3
| |
84d8e44eee6bd7a059c48af2e6fd770bd47150f9ea4a1983d0eb0f4384bc3cc6 | garrigue/labltk | timer.mli | (***********************************************************************)
(* *)
MLTk , Tcl / Tk interface of OCaml
(* *)
, , and
projet Cristal , INRIA Rocquencourt
, Kyoto University RIMS
(* *)
Copyright 2002 Institut National de Recherche en Informatique et
en Automatique and Kyoto University . All rights reserved .
This file is distributed under the terms of the GNU Library
General Public License , with the special exception on linking
(* described in file LICENSE found in the OCaml source tree. *)
(* *)
(***********************************************************************)
$ Id$
type t
val add : ms:int -> callback:(unit -> unit) -> t
val set : ms:int -> callback:(unit -> unit) -> unit
val remove : t -> unit
| null | https://raw.githubusercontent.com/garrigue/labltk/c7f50b4faed57f1ac03cb3c9aedc35b10d36bdb6/support/timer.mli | ocaml | *********************************************************************
described in file LICENSE found in the OCaml source tree.
********************************************************************* | MLTk , Tcl / Tk interface of OCaml
, , and
projet Cristal , INRIA Rocquencourt
, Kyoto University RIMS
Copyright 2002 Institut National de Recherche en Informatique et
en Automatique and Kyoto University . All rights reserved .
This file is distributed under the terms of the GNU Library
General Public License , with the special exception on linking
$ Id$
type t
val add : ms:int -> callback:(unit -> unit) -> t
val set : ms:int -> callback:(unit -> unit) -> unit
val remove : t -> unit
|
acb3bd507d9d771e28eaac60a68a2c2d343819dbf7f8fc8b2913f5fe77117477 | Clozure/ccl-tests | fill.lsp | ;-*- Mode: Lisp -*-
Author :
Created : Sat Oct 12 19:44:45 2002
Contains : Tests on FILL
(in-package :cl-test)
(deftest fill.error.1
(signals-error (fill 'a 'b) type-error)
t)
(deftest fill.error.2
(signals-error (fill) program-error)
t)
(deftest fill.error.3
(signals-error (fill (list 'a 'b)) program-error)
t)
(deftest fill.error.4
(signals-error (fill (list 'a 'b) 'c :bad t) program-error)
t)
(deftest fill.error.5
(signals-error (fill (list 'a 'b) 'c :bad t :allow-other-keys nil)
program-error)
t)
(deftest fill.error.6
(signals-error (fill (list 'a 'b) 'c :start) program-error)
t)
(deftest fill.error.7
(signals-error (fill (list 'a 'b) 'c :end) program-error)
t)
(deftest fill.error.8
(signals-error (fill (list 'a 'b) 'c 1 2) program-error)
t)
(deftest fill.error.10
(signals-error (fill (list 'a 'b) 'c :bad t :allow-other-keys nil
:allow-other-keys t)
program-error)
t)
(deftest fill.error.11
(signals-error (locally (fill 'a 'b) t) type-error)
t)
;;; Fill on arrays
(deftest array-fill-1
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x)))
(values (eqt a b)
(map 'list #'identity a)))
t (x x x x x))
(deftest array-fill-2
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :start 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (a b x x x))
(deftest array-fill-3
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :end 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (x x c d e))
(deftest array-fill-4
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :start 1 :end 3)))
(values (eqt a b)
(map 'list #'identity a)))
t (a x x d e))
(deftest array-fill-5
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :start 1 :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (a x x x x))
(deftest array-fill-6
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (x x x x x))
(deftest array-fill-7
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :start -1))
type-error)
t)
(deftest array-fill-8
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :start 'a))
type-error)
t)
(deftest array-fill-9
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :end -1))
type-error)
t)
(deftest array-fill-10
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :end 'a))
type-error)
t)
;;; fill on arrays of fixnums
(deftest array-fixnum-fill-1
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 6)))
(values (eqt a b)
(map 'list #'identity a)))
t (6 6 6 6 6))
(deftest array-fixnum-fill-2
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 6 :start 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (1 2 6 6 6))
(deftest array-fixnum-fill-3
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 7 :end 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (7 7 3 4 5))
(deftest array-fixnum-fill-4
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 8 :start 1 :end 3)))
(values (eqt a b)
(map 'list #'identity a)))
t (1 8 8 4 5))
(deftest array-fixnum-fill-5
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 0 :start 1 :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (1 0 0 0 0))
(deftest array-fixnum-fill-6
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a -1 :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (-1 -1 -1 -1 -1))
(deftest array-fixnum-fill-7
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a 10 :start -1))
type-error)
t)
(deftest array-fixnum-fill-8
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a 100 :start 'a))
type-error)
t)
(deftest array-fixnum-fill-9
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a -5 :end -1))
type-error)
t)
(deftest array-fixnum-fill-10
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a 17 :end 'a))
type-error)
t)
fill on arrays of unsigned eight bit bytes
(deftest array-unsigned-byte8-fill-1
(array-unsigned-byte-fill-test-fn 8 6)
t (6 6 6 6 6))
(deftest array-unsigned-byte8-fill-2
(array-unsigned-byte-fill-test-fn 8 6 :start 2)
t (1 2 6 6 6))
(deftest array-unsigned-byte8-fill-3
(array-unsigned-byte-fill-test-fn 8 7 :end 2)
t (7 7 3 4 5))
(deftest array-unsigned-byte8-fill-4
(array-unsigned-byte-fill-test-fn 8 8 :start 1 :end 3)
t (1 8 8 4 5))
(deftest array-unsigned-byte8-fill-5
(array-unsigned-byte-fill-test-fn 8 9 :start 1 :end nil)
t (1 9 9 9 9))
(deftest array-unsigned-byte8-fill-6
(array-unsigned-byte-fill-test-fn 8 0 :end nil)
t (0 0 0 0 0))
(deftest array-unsigned-byte8-fill-7
(signals-error (array-unsigned-byte-fill-test-fn 8 0 :start -1)
type-error)
t)
(deftest array-unsigned-byte8-fill-8
(signals-error (array-unsigned-byte-fill-test-fn 8 100 :start 'a)
type-error)
t)
(deftest array-unsigned-byte8-fill-9
(signals-error (array-unsigned-byte-fill-test-fn 8 19 :end -1)
type-error)
t)
(deftest array-unsigned-byte8-fill-10
(signals-error (array-unsigned-byte-fill-test-fn 8 17 :end 'a)
type-error)
t)
;;; Tests on arrays with fill pointers
(deftest array-fill-pointer-fill.1
(let ((s1 (make-array '(10) :fill-pointer 5 :initial-element nil)))
(fill s1 'a)
(loop for i from 0 to 9 collect (aref s1 i)))
(a a a a a nil nil nil nil nil))
(deftest array-fill-pointer-fill.2
(let ((s1 (make-array '(10) :fill-pointer 5 :initial-element nil)))
(fill s1 'a :end nil)
(loop for i from 0 to 9 collect (aref s1 i)))
(a a a a a nil nil nil nil nil))
;;; Tests on strings
(deftest fill.string.1
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z)))
(values (eqt s1 s2) s2))
t
"zzzzz")
(deftest fill.string.2
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z :start 0 :end 1)))
(values (eqt s1 s2) s2))
t
"zbcde")
(deftest fill.string.3
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z :end 2)))
(values (eqt s1 s2) s2))
t
"zzcde")
(deftest fill.string.4
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z :end nil)))
(values (eqt s1 s2) s2))
t
"zzzzz")
(deftest fill.string.5
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for start from 0 to (1- len)
always
(loop for end from (1+ start) to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :start start :end end)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:start start :end end))
t)))))
t)
(deftest fill.string.6
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :start start)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:start start))
t))))
t)
(deftest fill.string.7
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :end nil :start start)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:end nil :start start))
t))))
t)
(deftest fill.string.8
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for end from 1 to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :end end)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:end end))
t))))
t)
(deftest fill.string.9
(let* ((s1 (make-array '(8) :element-type 'character
:initial-element #\z
:fill-pointer 4))
(s2 (fill s1 #\a)))
(and (eqt s1 s2)
(coerce (loop for i from 0 to 7 collect (aref s2 i))
'string)))
"aaaazzzz")
(deftest fill.string.10
(let* ((s1 (make-array '(8) :element-type 'base-char
:initial-element #\z
:fill-pointer 4))
(s2 (fill s1 #\a)))
(and (eqt s1 s2)
(coerce (loop for i from 0 to 7 collect (aref s2 i))
'base-string)))
"aaaazzzz")
;;; Tests for bit vectors
(deftest fill.bit-vector.1
(let* ((s1 (copy-seq #*01100))
(s2 (fill s1 0)))
(values (eqt s1 s2) s2))
t
#*00000)
(deftest fill.bit-vector.2
(let* ((s1 (copy-seq #*00100))
(s2 (fill s1 1 :start 0 :end 1)))
(values (eqt s1 s2) s2))
t
#*10100)
(deftest fill.bit-vector.3
(let* ((s1 (copy-seq #*00010))
(s2 (fill s1 1 :end 2)))
(values (eqt s1 s2) s2))
t
#*11010)
(deftest fill.bit-vector.4
(let* ((s1 (copy-seq #*00111))
(s2 (fill s1 0 :end nil)))
(values (eqt s1 s2) s2))
t
#*00000)
(deftest fill.bit-vector.5
(let* ((s1 #*00000000)
(len (length s1)))
(loop for start from 0 to (1- len)
always
(loop for end from (1+ start) to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 1 :start start :end end)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 1 (constantly t) s1
:start start :end end))
t)))))
t)
(deftest fill.bit-vector.6
(let* ((s1 #*11111111)
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 0 :start start)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 0 (constantly t) s1
:start start))
t))))
t)
(deftest fill.bit-vector.7
(let* ((s1 #*00000000)
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 1 :end nil :start start)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 1 (constantly t) s1
:end nil :start start))
t))))
t)
(deftest fill.bit-vector.8
(let* ((s1 #*11111111)
(len (length s1)))
(loop for end from 1 to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 0 :end end)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 0 (constantly t) s1
:end end))
t))))
t)
(deftest fill.bit-vector.9
(let* ((s1 (make-array '(8) :element-type 'bit
:initial-element 0
:fill-pointer 4))
(s2 (fill s1 1)))
(and (eqt s1 s2)
(coerce (loop for i from 0 to 7 collect (aref s2 i))
'bit-vector)))
#*11110000)
;;; Test of :allow-other-keys
(deftest fill.allow-other-keys.1
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys t)
(a a a a a))
(deftest fill.allow-other-keys.2
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys nil)
(a a a a a))
(deftest fill.allow-other-keys.3
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys t :bad t)
(a a a a a))
(deftest fill.allow-other-keys.4
(fill (list 'a 'b 'c 'd 'e) 'a :bad t :allow-other-keys t)
(a a a a a))
(deftest fill.allow-other-keys.5
(fill (list 'a 'b 'c 'd 'e) 'a 'bad t :allow-other-keys t)
(a a a a a))
(deftest fill.allow-other-keys.6
(fill (list 'a 'b 'c 'd 'e) 'a :bad t :allow-other-keys t
:allow-other-keys nil)
(a a a a a))
(deftest fill.allow-other-keys.7
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys t :allow-other-keys nil
:bad t)
(a a a a a))
;;; Tests of evaluation order
(deftest fill.order.1
(let ((i 0) x y (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z))
i x y))
#(z z z z) 2 1 2)
(deftest fill.order.2
(let ((i 0) x y z w (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z)
:start (progn (setf z (incf i)) 1)
:end (progn (setf w (incf i)) 3))
i x y z w))
#(a z z a) 4 1 2 3 4)
(deftest fill.order.3
(let ((i 0) x y z w (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z)
:end (progn (setf z (incf i)) 3)
:start (progn (setf w (incf i)) 1))
i x y z w))
#(a z z a) 4 1 2 3 4)
(deftest fill.order.4
(let ((i 0) x y z p q r s w (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z)
:end (progn (setf z (incf i)) 3)
:end (progn (setf p (incf i)) 1)
:end (progn (setf q (incf i)) 1)
:end (progn (setf r (incf i)) 1)
:start (progn (setf s (incf i)) 1)
:start (progn (setf w (incf i)) 0))
i x y z p q r s w))
#(a z z a) 8 1 2 3 4 5 6 7 8)
Specialized strings
(deftest fill.specialized-strings.1
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x)))
(assert (string= s "xxxxx")))
nil)
(deftest fill.specialized-strings.2
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x :start 2)))
(assert (string= s "abxxx")))
nil)
(deftest fill.specialized-strings.3
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x :end 3)))
(assert (string= s "xxxde")))
nil)
(deftest fill.specialized-strings.4
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x :start 1 :end 4)))
(assert (string= s "axxxe")))
nil)
Specialized vector tests
(deftest fill.specialized-vectors.1
(do-special-integer-vectors
(v #(0 1 1 0 1) nil)
(let ((etype (array-element-type v)))
(assert (eq v (fill v 0)))
(assert (equal (array-element-type v) etype)))
(assert (equalp v #(0 0 0 0 0))))
nil)
(deftest fill.specialized-vectors.2
(do-special-integer-vectors
(v #(0 -1 1 0 -1) nil)
(let ((etype (array-element-type v)))
(assert (eq v (fill v 1)))
(assert (equal (array-element-type v) etype)))
(assert (equalp v #(1 1 1 1 1))))
nil)
(deftest fill.specialized-vectors.3
(do-special-integer-vectors
(v #(1 1 1 1 0) nil)
(let ((etype (array-element-type v)))
(assert (eq v (fill v 0 :start 1 :end 3)))
(assert (equal (array-element-type v) etype)))
(assert (equalp v #(1 0 0 1 0))))
nil) | null | https://raw.githubusercontent.com/Clozure/ccl-tests/0478abddb34dbc16487a1975560d8d073a988060/ansi-tests/fill.lsp | lisp | -*- Mode: Lisp -*-
Fill on arrays
fill on arrays of fixnums
Tests on arrays with fill pointers
Tests on strings
Tests for bit vectors
Test of :allow-other-keys
Tests of evaluation order | Author :
Created : Sat Oct 12 19:44:45 2002
Contains : Tests on FILL
(in-package :cl-test)
(deftest fill.error.1
(signals-error (fill 'a 'b) type-error)
t)
(deftest fill.error.2
(signals-error (fill) program-error)
t)
(deftest fill.error.3
(signals-error (fill (list 'a 'b)) program-error)
t)
(deftest fill.error.4
(signals-error (fill (list 'a 'b) 'c :bad t) program-error)
t)
(deftest fill.error.5
(signals-error (fill (list 'a 'b) 'c :bad t :allow-other-keys nil)
program-error)
t)
(deftest fill.error.6
(signals-error (fill (list 'a 'b) 'c :start) program-error)
t)
(deftest fill.error.7
(signals-error (fill (list 'a 'b) 'c :end) program-error)
t)
(deftest fill.error.8
(signals-error (fill (list 'a 'b) 'c 1 2) program-error)
t)
(deftest fill.error.10
(signals-error (fill (list 'a 'b) 'c :bad t :allow-other-keys nil
:allow-other-keys t)
program-error)
t)
(deftest fill.error.11
(signals-error (locally (fill 'a 'b) t) type-error)
t)
(deftest array-fill-1
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x)))
(values (eqt a b)
(map 'list #'identity a)))
t (x x x x x))
(deftest array-fill-2
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :start 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (a b x x x))
(deftest array-fill-3
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :end 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (x x c d e))
(deftest array-fill-4
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :start 1 :end 3)))
(values (eqt a b)
(map 'list #'identity a)))
t (a x x d e))
(deftest array-fill-5
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :start 1 :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (a x x x x))
(deftest array-fill-6
(let* ((a (make-array '(5) :initial-contents '(a b c d e)))
(b (fill a 'x :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (x x x x x))
(deftest array-fill-7
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :start -1))
type-error)
t)
(deftest array-fill-8
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :start 'a))
type-error)
t)
(deftest array-fill-9
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :end -1))
type-error)
t)
(deftest array-fill-10
(signals-error
(let* ((a (make-array '(5))))
(fill a 'x :end 'a))
type-error)
t)
(deftest array-fixnum-fill-1
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 6)))
(values (eqt a b)
(map 'list #'identity a)))
t (6 6 6 6 6))
(deftest array-fixnum-fill-2
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 6 :start 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (1 2 6 6 6))
(deftest array-fixnum-fill-3
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 7 :end 2)))
(values (eqt a b)
(map 'list #'identity a)))
t (7 7 3 4 5))
(deftest array-fixnum-fill-4
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 8 :start 1 :end 3)))
(values (eqt a b)
(map 'list #'identity a)))
t (1 8 8 4 5))
(deftest array-fixnum-fill-5
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a 0 :start 1 :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (1 0 0 0 0))
(deftest array-fixnum-fill-6
(let* ((a (make-array '(5) :element-type 'fixnum :initial-contents '(1 2 3 4 5)))
(b (fill a -1 :end nil)))
(values (eqt a b)
(map 'list #'identity a)))
t (-1 -1 -1 -1 -1))
(deftest array-fixnum-fill-7
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a 10 :start -1))
type-error)
t)
(deftest array-fixnum-fill-8
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a 100 :start 'a))
type-error)
t)
(deftest array-fixnum-fill-9
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a -5 :end -1))
type-error)
t)
(deftest array-fixnum-fill-10
(signals-error
(let* ((a (make-array '(5) :element-type 'fixnum)))
(fill a 17 :end 'a))
type-error)
t)
fill on arrays of unsigned eight bit bytes
(deftest array-unsigned-byte8-fill-1
(array-unsigned-byte-fill-test-fn 8 6)
t (6 6 6 6 6))
(deftest array-unsigned-byte8-fill-2
(array-unsigned-byte-fill-test-fn 8 6 :start 2)
t (1 2 6 6 6))
(deftest array-unsigned-byte8-fill-3
(array-unsigned-byte-fill-test-fn 8 7 :end 2)
t (7 7 3 4 5))
(deftest array-unsigned-byte8-fill-4
(array-unsigned-byte-fill-test-fn 8 8 :start 1 :end 3)
t (1 8 8 4 5))
(deftest array-unsigned-byte8-fill-5
(array-unsigned-byte-fill-test-fn 8 9 :start 1 :end nil)
t (1 9 9 9 9))
(deftest array-unsigned-byte8-fill-6
(array-unsigned-byte-fill-test-fn 8 0 :end nil)
t (0 0 0 0 0))
(deftest array-unsigned-byte8-fill-7
(signals-error (array-unsigned-byte-fill-test-fn 8 0 :start -1)
type-error)
t)
(deftest array-unsigned-byte8-fill-8
(signals-error (array-unsigned-byte-fill-test-fn 8 100 :start 'a)
type-error)
t)
(deftest array-unsigned-byte8-fill-9
(signals-error (array-unsigned-byte-fill-test-fn 8 19 :end -1)
type-error)
t)
(deftest array-unsigned-byte8-fill-10
(signals-error (array-unsigned-byte-fill-test-fn 8 17 :end 'a)
type-error)
t)
(deftest array-fill-pointer-fill.1
(let ((s1 (make-array '(10) :fill-pointer 5 :initial-element nil)))
(fill s1 'a)
(loop for i from 0 to 9 collect (aref s1 i)))
(a a a a a nil nil nil nil nil))
(deftest array-fill-pointer-fill.2
(let ((s1 (make-array '(10) :fill-pointer 5 :initial-element nil)))
(fill s1 'a :end nil)
(loop for i from 0 to 9 collect (aref s1 i)))
(a a a a a nil nil nil nil nil))
(deftest fill.string.1
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z)))
(values (eqt s1 s2) s2))
t
"zzzzz")
(deftest fill.string.2
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z :start 0 :end 1)))
(values (eqt s1 s2) s2))
t
"zbcde")
(deftest fill.string.3
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z :end 2)))
(values (eqt s1 s2) s2))
t
"zzcde")
(deftest fill.string.4
(let* ((s1 (copy-seq "abcde"))
(s2 (fill s1 #\z :end nil)))
(values (eqt s1 s2) s2))
t
"zzzzz")
(deftest fill.string.5
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for start from 0 to (1- len)
always
(loop for end from (1+ start) to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :start start :end end)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:start start :end end))
t)))))
t)
(deftest fill.string.6
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :start start)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:start start))
t))))
t)
(deftest fill.string.7
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :end nil :start start)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:end nil :start start))
t))))
t)
(deftest fill.string.8
(let* ((s1 "aaaaaaaa")
(len (length s1)))
(loop for end from 1 to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 #\z :end end)))
(and (eqt s2 s3)
(string= s3
(substitute-if #\z (constantly t) s1
:end end))
t))))
t)
(deftest fill.string.9
(let* ((s1 (make-array '(8) :element-type 'character
:initial-element #\z
:fill-pointer 4))
(s2 (fill s1 #\a)))
(and (eqt s1 s2)
(coerce (loop for i from 0 to 7 collect (aref s2 i))
'string)))
"aaaazzzz")
(deftest fill.string.10
(let* ((s1 (make-array '(8) :element-type 'base-char
:initial-element #\z
:fill-pointer 4))
(s2 (fill s1 #\a)))
(and (eqt s1 s2)
(coerce (loop for i from 0 to 7 collect (aref s2 i))
'base-string)))
"aaaazzzz")
(deftest fill.bit-vector.1
(let* ((s1 (copy-seq #*01100))
(s2 (fill s1 0)))
(values (eqt s1 s2) s2))
t
#*00000)
(deftest fill.bit-vector.2
(let* ((s1 (copy-seq #*00100))
(s2 (fill s1 1 :start 0 :end 1)))
(values (eqt s1 s2) s2))
t
#*10100)
(deftest fill.bit-vector.3
(let* ((s1 (copy-seq #*00010))
(s2 (fill s1 1 :end 2)))
(values (eqt s1 s2) s2))
t
#*11010)
(deftest fill.bit-vector.4
(let* ((s1 (copy-seq #*00111))
(s2 (fill s1 0 :end nil)))
(values (eqt s1 s2) s2))
t
#*00000)
(deftest fill.bit-vector.5
(let* ((s1 #*00000000)
(len (length s1)))
(loop for start from 0 to (1- len)
always
(loop for end from (1+ start) to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 1 :start start :end end)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 1 (constantly t) s1
:start start :end end))
t)))))
t)
(deftest fill.bit-vector.6
(let* ((s1 #*11111111)
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 0 :start start)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 0 (constantly t) s1
:start start))
t))))
t)
(deftest fill.bit-vector.7
(let* ((s1 #*00000000)
(len (length s1)))
(loop for start from 0 to (1- len)
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 1 :end nil :start start)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 1 (constantly t) s1
:end nil :start start))
t))))
t)
(deftest fill.bit-vector.8
(let* ((s1 #*11111111)
(len (length s1)))
(loop for end from 1 to len
always
(let* ((s2 (copy-seq s1))
(s3 (fill s2 0 :end end)))
(and (eqt s2 s3)
(equalp s3
(substitute-if 0 (constantly t) s1
:end end))
t))))
t)
(deftest fill.bit-vector.9
(let* ((s1 (make-array '(8) :element-type 'bit
:initial-element 0
:fill-pointer 4))
(s2 (fill s1 1)))
(and (eqt s1 s2)
(coerce (loop for i from 0 to 7 collect (aref s2 i))
'bit-vector)))
#*11110000)
(deftest fill.allow-other-keys.1
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys t)
(a a a a a))
(deftest fill.allow-other-keys.2
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys nil)
(a a a a a))
(deftest fill.allow-other-keys.3
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys t :bad t)
(a a a a a))
(deftest fill.allow-other-keys.4
(fill (list 'a 'b 'c 'd 'e) 'a :bad t :allow-other-keys t)
(a a a a a))
(deftest fill.allow-other-keys.5
(fill (list 'a 'b 'c 'd 'e) 'a 'bad t :allow-other-keys t)
(a a a a a))
(deftest fill.allow-other-keys.6
(fill (list 'a 'b 'c 'd 'e) 'a :bad t :allow-other-keys t
:allow-other-keys nil)
(a a a a a))
(deftest fill.allow-other-keys.7
(fill (list 'a 'b 'c 'd 'e) 'a :allow-other-keys t :allow-other-keys nil
:bad t)
(a a a a a))
(deftest fill.order.1
(let ((i 0) x y (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z))
i x y))
#(z z z z) 2 1 2)
(deftest fill.order.2
(let ((i 0) x y z w (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z)
:start (progn (setf z (incf i)) 1)
:end (progn (setf w (incf i)) 3))
i x y z w))
#(a z z a) 4 1 2 3 4)
(deftest fill.order.3
(let ((i 0) x y z w (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z)
:end (progn (setf z (incf i)) 3)
:start (progn (setf w (incf i)) 1))
i x y z w))
#(a z z a) 4 1 2 3 4)
(deftest fill.order.4
(let ((i 0) x y z p q r s w (a (copy-seq #(a a a a))))
(values
(fill (progn (setf x (incf i)) a)
(progn (setf y (incf i)) 'z)
:end (progn (setf z (incf i)) 3)
:end (progn (setf p (incf i)) 1)
:end (progn (setf q (incf i)) 1)
:end (progn (setf r (incf i)) 1)
:start (progn (setf s (incf i)) 1)
:start (progn (setf w (incf i)) 0))
i x y z p q r s w))
#(a z z a) 8 1 2 3 4 5 6 7 8)
Specialized strings
(deftest fill.specialized-strings.1
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x)))
(assert (string= s "xxxxx")))
nil)
(deftest fill.specialized-strings.2
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x :start 2)))
(assert (string= s "abxxx")))
nil)
(deftest fill.specialized-strings.3
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x :end 3)))
(assert (string= s "xxxde")))
nil)
(deftest fill.specialized-strings.4
(do-special-strings
(s (copy-seq "abcde") nil)
(assert (string= s "abcde"))
(assert (eq s (fill s #\x :start 1 :end 4)))
(assert (string= s "axxxe")))
nil)
Specialized vector tests
(deftest fill.specialized-vectors.1
(do-special-integer-vectors
(v #(0 1 1 0 1) nil)
(let ((etype (array-element-type v)))
(assert (eq v (fill v 0)))
(assert (equal (array-element-type v) etype)))
(assert (equalp v #(0 0 0 0 0))))
nil)
(deftest fill.specialized-vectors.2
(do-special-integer-vectors
(v #(0 -1 1 0 -1) nil)
(let ((etype (array-element-type v)))
(assert (eq v (fill v 1)))
(assert (equal (array-element-type v) etype)))
(assert (equalp v #(1 1 1 1 1))))
nil)
(deftest fill.specialized-vectors.3
(do-special-integer-vectors
(v #(1 1 1 1 0) nil)
(let ((etype (array-element-type v)))
(assert (eq v (fill v 0 :start 1 :end 3)))
(assert (equal (array-element-type v) etype)))
(assert (equalp v #(1 0 0 1 0))))
nil) |
4bccd1baf5e01e42c950d034cb77d12d878339f077e4bc4f07242a30891425fa | jbclements/sxml | parse-error.rkt | #lang racket/base
(require racket/contract/base
syntax/readerr
"errors-and-warnings.rkt")
(provide/contract
[parser-error
(->* (port?) () #:rest list? any)]
[ssax:warn
(->* (port?) () #:rest list? any)])
; This code provides informative error messages
for SSAX ( S)XML parser .
;==============================================================================
; Error handler
; According to the SSAX convention this function
accepts the port as its first argument which is used for
; location of the error in input file.
; Other parameters are considered as error messages,
; they are printed to stderr as is.
NB : updated to signal a racket error rather than printing to stdout .
(define (parser-error p . args)
(let-values ([(line col pos) (port-next-location p)])
(raise-read-error (format "SXML parser error: ~a" (args->display-string args))
(object-name p)
line col pos #f)))
;; map args to their display representations, glue them together:
(define (args->display-string args)
(apply string-append (map (lambda (x) (format "~a" x)) args)))
(define (ssax:warn p . args)
(sxml:warn 'ssax:warn
"warning at position ~a: ~a"
(file-position p)
(args->display-string args)))
| null | https://raw.githubusercontent.com/jbclements/sxml/d3b8570cf7287c4e06636e17634f0f5c39203d52/sxml/ssax/parse-error.rkt | racket | This code provides informative error messages
==============================================================================
Error handler
According to the SSAX convention this function
location of the error in input file.
Other parameters are considered as error messages,
they are printed to stderr as is.
map args to their display representations, glue them together: | #lang racket/base
(require racket/contract/base
syntax/readerr
"errors-and-warnings.rkt")
(provide/contract
[parser-error
(->* (port?) () #:rest list? any)]
[ssax:warn
(->* (port?) () #:rest list? any)])
for SSAX ( S)XML parser .
accepts the port as its first argument which is used for
NB : updated to signal a racket error rather than printing to stdout .
(define (parser-error p . args)
(let-values ([(line col pos) (port-next-location p)])
(raise-read-error (format "SXML parser error: ~a" (args->display-string args))
(object-name p)
line col pos #f)))
(define (args->display-string args)
(apply string-append (map (lambda (x) (format "~a" x)) args)))
(define (ssax:warn p . args)
(sxml:warn 'ssax:warn
"warning at position ~a: ~a"
(file-position p)
(args->display-string args)))
|
0bf31292fffa02a7f48ed1c236d5ccb9d2f8c1df199473a19668bb2d63408d2e | klarna/jesse | jesse_error.erl | %%%=============================================================================
Copyright 2014 Klarna AB
%%
Licensed under the Apache License , Version 2.0 ( the " License " ) ;
%% you may not use this file except in compliance with the License.
%% You may obtain a copy of the License at
%%
%% -2.0
%%
%% Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an " AS IS " BASIS ,
%% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
%% See the License for the specific language governing permissions and
%% limitations under the License.
%%
@doc Json schema validation module .
%%
This module is the core of jesse , it implements the validation functionality
%% according to the standard.
%% @end
%%%=============================================================================
-module(jesse_error).
-export([ default_error_handler/3
, handle_data_invalid/3
, handle_schema_invalid/2
]).
-export_type([ error/0
]).
-type error() :: {error, [error_reason()]}.
-type error_reason() :: { 'schema_invalid'
, Schema :: jesse:json_term()
, Error :: error_type()
}
| { 'data_invalid'
, Schema :: jesse:json_term()
, Error :: error_type()
, Data :: jesse:json_term()
, Path :: [binary()]
}.
-type error_type() :: {'missing_id_field', binary()}
| {'missing_required_property', binary()}
| {'missing_dependency', binary()}
| 'no_match'
| 'no_extra_properties_allowed'
| 'no_extra_items_allowed'
| 'not_enought_items'
| 'not_allowed'
| {'not_unique', jesse:json_term()}
| 'not_in_range'
| 'not_divisible'
| 'wrong_type'
| {'wrong_type_items', jesse:json_term()}
| {'wrong_type_dependency', jesse:json_term()}
| 'wrong_size'
| 'wrong_length'
| 'wrong_format'
| {'schema_unsupported', binary()}.
%% Includes
-include("jesse_schema_validator.hrl").
%% @doc Implements the default error handler.
If the length of ` ErrorList ' exceeds ` AllowedErrors ' then the function
%% throws an exeption, otherwise adds a new element to the list and returs it.
-spec default_error_handler( Error :: error_reason()
, ErrorList :: [error_reason()]
, AllowedErrors :: non_neg_integer()
) -> [error_reason()] | no_return().
default_error_handler(Error, ErrorList, AllowedErrors) ->
case AllowedErrors > length(ErrorList) orelse AllowedErrors =:= 'infinity' of
true -> [Error | ErrorList];
false -> throw([Error | ErrorList])
end.
%% @doc Generates a new data error and returns the updated state.
-spec handle_data_invalid( Info :: error_type()
, Value :: jesse:json_term()
, State :: jesse_state:state()
) -> jesse_state:state().
handle_data_invalid(Info, Value, State) ->
Error = { ?data_invalid
, jesse_state:get_current_schema(State)
, Info
, Value
, lists:reverse(jesse_state:get_current_path(State))
},
handle_error(Error, State).
%% @doc Generates a new schema error and returns the updated state.
-spec handle_schema_invalid( Info :: error_type()
, State :: jesse_state:state()
) -> jesse_state:state().
handle_schema_invalid(Info, State) ->
Error = { ?schema_invalid
, jesse_state:get_current_schema(State)
, Info
},
handle_error(Error, State).
Internal functions
@private
handle_error(Error, State) ->
ErrorHandler = jesse_state:get_error_handler(State),
ErrorList = jesse_state:get_error_list(State),
AllowedErrors = jesse_state:get_allowed_errors(State),
NewErrorList = ErrorHandler(Error, ErrorList, AllowedErrors),
jesse_state:set_error_list(State, NewErrorList).
%%% Local Variables:
erlang - indent - level : 2
%%% End:
| null | https://raw.githubusercontent.com/klarna/jesse/830738e8a03413851cc48a55a5e70f459391cc00/src/jesse_error.erl | erlang | =============================================================================
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
-2.0
Unless required by applicable law or agreed to in writing, software
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
according to the standard.
@end
=============================================================================
Includes
@doc Implements the default error handler.
throws an exeption, otherwise adds a new element to the list and returs it.
@doc Generates a new data error and returns the updated state.
@doc Generates a new schema error and returns the updated state.
Local Variables:
End: | Copyright 2014 Klarna AB
Licensed under the Apache License , Version 2.0 ( the " License " ) ;
distributed under the License is distributed on an " AS IS " BASIS ,
@doc Json schema validation module .
This module is the core of jesse , it implements the validation functionality
-module(jesse_error).
-export([ default_error_handler/3
, handle_data_invalid/3
, handle_schema_invalid/2
]).
-export_type([ error/0
]).
-type error() :: {error, [error_reason()]}.
-type error_reason() :: { 'schema_invalid'
, Schema :: jesse:json_term()
, Error :: error_type()
}
| { 'data_invalid'
, Schema :: jesse:json_term()
, Error :: error_type()
, Data :: jesse:json_term()
, Path :: [binary()]
}.
-type error_type() :: {'missing_id_field', binary()}
| {'missing_required_property', binary()}
| {'missing_dependency', binary()}
| 'no_match'
| 'no_extra_properties_allowed'
| 'no_extra_items_allowed'
| 'not_enought_items'
| 'not_allowed'
| {'not_unique', jesse:json_term()}
| 'not_in_range'
| 'not_divisible'
| 'wrong_type'
| {'wrong_type_items', jesse:json_term()}
| {'wrong_type_dependency', jesse:json_term()}
| 'wrong_size'
| 'wrong_length'
| 'wrong_format'
| {'schema_unsupported', binary()}.
-include("jesse_schema_validator.hrl").
If the length of ` ErrorList ' exceeds ` AllowedErrors ' then the function
-spec default_error_handler( Error :: error_reason()
, ErrorList :: [error_reason()]
, AllowedErrors :: non_neg_integer()
) -> [error_reason()] | no_return().
default_error_handler(Error, ErrorList, AllowedErrors) ->
case AllowedErrors > length(ErrorList) orelse AllowedErrors =:= 'infinity' of
true -> [Error | ErrorList];
false -> throw([Error | ErrorList])
end.
-spec handle_data_invalid( Info :: error_type()
, Value :: jesse:json_term()
, State :: jesse_state:state()
) -> jesse_state:state().
handle_data_invalid(Info, Value, State) ->
Error = { ?data_invalid
, jesse_state:get_current_schema(State)
, Info
, Value
, lists:reverse(jesse_state:get_current_path(State))
},
handle_error(Error, State).
-spec handle_schema_invalid( Info :: error_type()
, State :: jesse_state:state()
) -> jesse_state:state().
handle_schema_invalid(Info, State) ->
Error = { ?schema_invalid
, jesse_state:get_current_schema(State)
, Info
},
handle_error(Error, State).
Internal functions
@private
handle_error(Error, State) ->
ErrorHandler = jesse_state:get_error_handler(State),
ErrorList = jesse_state:get_error_list(State),
AllowedErrors = jesse_state:get_allowed_errors(State),
NewErrorList = ErrorHandler(Error, ErrorList, AllowedErrors),
jesse_state:set_error_list(State, NewErrorList).
erlang - indent - level : 2
|
87d11e38ac557cc656c456b6f5180cd3dbfe30d5dfa0edc26c6aea635b9e367d | evilmartians/foundry | udb_data.ml | let data : Codepoint.t array = [||]
| null | https://raw.githubusercontent.com/evilmartians/foundry/ce947c7dcca79ab7a7ce25870e9fc0eb15e9c2bd/vendor/ucs/lib/udb_data.ml | ocaml | let data : Codepoint.t array = [||]
| |
28a95df3fcb224e0ccb5444b6b8c4df39355a538892cc1510d17fa8eff615843 | dongcarl/guix | stenography.scm | ;;; GNU Guix --- Functional package management for GNU
Copyright © 2020 < >
Copyright © 2021 < >
;;;
;;; This file is part of GNU Guix.
;;;
GNU is free software ; you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by
the Free Software Foundation ; either version 3 of the License , or ( at
;;; your option) any later version.
;;;
;;; GNU Guix is distributed in the hope that it will be useful, but
;;; WITHOUT ANY WARRANTY; without even the implied warranty of
;;; MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
;;; GNU General Public License for more details.
;;;
You should have received a copy of the GNU General Public License
along with GNU . If not , see < / > .
(define-module (gnu packages stenography)
#:use-module (guix build-system python)
#:use-module (guix git-download)
#:use-module ((guix licenses) #:prefix license:)
#:use-module (guix packages)
#:use-module (gnu packages)
#:use-module (gnu packages qt)
#:use-module (gnu packages check)
#:use-module (gnu packages libusb)
#:use-module (gnu packages python)
#:use-module (gnu packages python-xyz)
#:use-module (gnu packages wxwidgets))
(define-public plover
(package
(name "plover")
(version "4.0.0.dev8")
(source
(origin
(method git-fetch)
(uri (git-reference
(url "")
(commit (string-append "v" version))))
(file-name (git-file-name name version))
(sha256
(base32 "1b2ys77bkjsdmyg97i7lq3lj45q56bycvsm06d4rs656kxhvc0a3"))))
(build-system python-build-system)
(native-inputs
`(("python-mock" ,python-mock)
("python-pytest" ,python-pytest)
("python-setuptools-scm" ,python-setuptools-scm)))
(inputs
`(("python-appdirs" ,python-appdirs)
("python-pyqt" ,python-pyqt)
("python-babel" ,python-babel)
("python-dbus" ,python-dbus)
("python-hidapi" ,python-hidapi)
("python-pyserial" ,python-pyserial)
("python-wxpython" ,python-wxpython)
("python-xlib" ,python-xlib)))
(home-page "/")
(synopsis "Stenography engine")
(description
"Plover (rhymes with @emph{lover}) is a desktop application that
allows anyone to use stenography to write on their computer, up to
speeds of 200WPM and beyond.")
(license license:gpl2+)))
| null | https://raw.githubusercontent.com/dongcarl/guix/82543e9649da2da9a5285ede4ec4f718fd740fcb/gnu/packages/stenography.scm | scheme | GNU Guix --- Functional package management for GNU
This file is part of GNU Guix.
you can redistribute it and/or modify it
either version 3 of the License , or ( at
your option) any later version.
GNU Guix is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
| Copyright © 2020 < >
Copyright © 2021 < >
under the terms of the GNU General Public License as published by
You should have received a copy of the GNU General Public License
along with GNU . If not , see < / > .
(define-module (gnu packages stenography)
#:use-module (guix build-system python)
#:use-module (guix git-download)
#:use-module ((guix licenses) #:prefix license:)
#:use-module (guix packages)
#:use-module (gnu packages)
#:use-module (gnu packages qt)
#:use-module (gnu packages check)
#:use-module (gnu packages libusb)
#:use-module (gnu packages python)
#:use-module (gnu packages python-xyz)
#:use-module (gnu packages wxwidgets))
(define-public plover
(package
(name "plover")
(version "4.0.0.dev8")
(source
(origin
(method git-fetch)
(uri (git-reference
(url "")
(commit (string-append "v" version))))
(file-name (git-file-name name version))
(sha256
(base32 "1b2ys77bkjsdmyg97i7lq3lj45q56bycvsm06d4rs656kxhvc0a3"))))
(build-system python-build-system)
(native-inputs
`(("python-mock" ,python-mock)
("python-pytest" ,python-pytest)
("python-setuptools-scm" ,python-setuptools-scm)))
(inputs
`(("python-appdirs" ,python-appdirs)
("python-pyqt" ,python-pyqt)
("python-babel" ,python-babel)
("python-dbus" ,python-dbus)
("python-hidapi" ,python-hidapi)
("python-pyserial" ,python-pyserial)
("python-wxpython" ,python-wxpython)
("python-xlib" ,python-xlib)))
(home-page "/")
(synopsis "Stenography engine")
(description
"Plover (rhymes with @emph{lover}) is a desktop application that
allows anyone to use stenography to write on their computer, up to
speeds of 200WPM and beyond.")
(license license:gpl2+)))
|
82d3ab949d8f564fb34dff17feceee69b362cd7369167bfc66dd1acd20fcd141 | ptal/AbSolute | box.mli | Copyright 2019 AbSolute Team
This program is free software ; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation ; either
version 3 of the License , or ( at your option ) any later version .
This program is distributed in the hope that it will be useful ,
but WITHOUT ANY WARRANTY ; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the GNU
Lesser General Public License for more details .
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details. *)
module Box_split = Box_split
module Box_interpretation = Box_interpretation
module Var_store = Var_store
open Bounds
open Vardom
open Domains.Abstract_domain
(** Box are an event-based abstract domain and must be encapsulated in an `Event_loop`.
The `closure` operator is the identity function since it is decomposed in many tasks
handled by `Event_loop`. *)
module type Box_sig =
sig
module Vardom: Vardom_sig.S
type vardom = Vardom.t
include Abstract_domain
(** `project_vardom box v` projects the domain of the variable `v`. *)
val project_vardom: t -> I.var_id -> vardom
end
module type Box_functor = functor (B: Bound_sig.S) -> Box_sig
with module B = B and module Vardom.B = B
module Make
(B: Bound_sig.S)
(VARDOM: Vardom_sig.Vardom_functor)
(SPLIT: Box_split.Box_split_sig) : Box_sig
module Box_base(SPLIT: Box_split.Box_split_sig) : Box_functor
| null | https://raw.githubusercontent.com/ptal/AbSolute/469159d87e3a717499573c1e187e5cfa1b569829/src/domains/box/box.mli | ocaml | * Box are an event-based abstract domain and must be encapsulated in an `Event_loop`.
The `closure` operator is the identity function since it is decomposed in many tasks
handled by `Event_loop`.
* `project_vardom box v` projects the domain of the variable `v`. | Copyright 2019 AbSolute Team
This program is free software ; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation ; either
version 3 of the License , or ( at your option ) any later version .
This program is distributed in the hope that it will be useful ,
but WITHOUT ANY WARRANTY ; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE . See the GNU
Lesser General Public License for more details .
This program is free software; you can redistribute it and/or
modify it under the terms of the GNU Lesser General Public
License as published by the Free Software Foundation; either
version 3 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU
Lesser General Public License for more details. *)
module Box_split = Box_split
module Box_interpretation = Box_interpretation
module Var_store = Var_store
open Bounds
open Vardom
open Domains.Abstract_domain
module type Box_sig =
sig
module Vardom: Vardom_sig.S
type vardom = Vardom.t
include Abstract_domain
val project_vardom: t -> I.var_id -> vardom
end
module type Box_functor = functor (B: Bound_sig.S) -> Box_sig
with module B = B and module Vardom.B = B
module Make
(B: Bound_sig.S)
(VARDOM: Vardom_sig.Vardom_functor)
(SPLIT: Box_split.Box_split_sig) : Box_sig
module Box_base(SPLIT: Box_split.Box_split_sig) : Box_functor
|
2aefd95e7052585cd94158ea163cb21a64a40c5d4c3f5012238c791eb5a479c1 | korya/efuns | xdebug.mli | (***********************************************************************)
(* *)
(* ____ *)
(* *)
Fabrice Le Fessant , projet Para / SOR , INRIA Rocquencourt
(* *)
Copyright 1999 Institut National de Recherche en Informatique et
Automatique . Distributed only by permission .
(* *)
(***********************************************************************)
val debug_flag : bool ref
val print : string -> unit
| null | https://raw.githubusercontent.com/korya/efuns/78b21d9dff45b7eec764c63132c7a564f5367c30/xlib/xdebug.mli | ocaml | *********************************************************************
____
********************************************************************* | Fabrice Le Fessant , projet Para / SOR , INRIA Rocquencourt
Copyright 1999 Institut National de Recherche en Informatique et
Automatique . Distributed only by permission .
val debug_flag : bool ref
val print : string -> unit
|
b41f1ed80687b79815c9116e044658b4e3d3edcd68d47f4039c9b7641f4f43df | geophf/1HaskellADay | Endpoint.hs | module Wikidata.Query.Endpoint where
-
WHEW ! Has querying the wikidata endpoint gotten easier or what ? ! ?
Just submit your SPARQL query using the sparql function and you 'll ge back
a ByteString response as JSON . Decoding the JSON is on you .
-
WHEW! Has querying the wikidata endpoint gotten easier or what?!?
Just submit your SPARQL query using the sparql function and you'll ge back
a ByteString response as JSON. Decoding the JSON is on you.
--}
import Data.ByteString.Lazy (ByteString)
import Network.HTTP.Conduit (simpleHttp)
import Network.HTTP (urlEncode)
endpoint :: FilePath
endpoint = "="
sparql :: String -> IO ByteString
sparql = simpleHttp . (endpoint ++) . urlEncode
| null | https://raw.githubusercontent.com/geophf/1HaskellADay/514792071226cd1e2ba7640af942667b85601006/exercises/HAD/Wikidata/Query/Endpoint.hs | haskell | } | module Wikidata.Query.Endpoint where
-
WHEW ! Has querying the wikidata endpoint gotten easier or what ? ! ?
Just submit your SPARQL query using the sparql function and you 'll ge back
a ByteString response as JSON . Decoding the JSON is on you .
-
WHEW! Has querying the wikidata endpoint gotten easier or what?!?
Just submit your SPARQL query using the sparql function and you'll ge back
a ByteString response as JSON. Decoding the JSON is on you.
import Data.ByteString.Lazy (ByteString)
import Network.HTTP.Conduit (simpleHttp)
import Network.HTTP (urlEncode)
endpoint :: FilePath
endpoint = "="
sparql :: String -> IO ByteString
sparql = simpleHttp . (endpoint ++) . urlEncode
|
e2407bdd6d141895ca058741896b8bb7c4f68eaaca0026660421daa14b79f4f6 | 45deg/distributed-xfrp | graphviz.ml | open Syntax
open Module
TODO : Fix this after ` codegen ` works .
let concat_map = Codegen.concat_map
let indent = Codegen.indent
let concat_map = Codegen.concat_map
let indent = Codegen.indent
*)
let concat_map s f l = String.concat s (List.map f l)
let indent n s = String.make n '\t' ^ s
let colors =
["#1f77b4"; "#ff7f0e"; "#2ca02c"; "#d62728"; "#9467bd"; "#8c564b"; "#e377c2"; "#7f7f7f"; "#bcbd22"; "#17becf"]
module PairS = Set.Make(struct type t = string * string let compare = compare end)
let loop_table loops =
List.fold_left (fun set loop ->
List.fold_left2 (fun set s t -> PairS.add (s,t) set) set loop (List.tl loop @ [List.hd loop])
) PairS.empty loops
let of_xmodule xmod =
let graph = Dependency.get_graph xmod in
let deps = Dependency.M.bindings graph in
let loops = loop_table (Dependency.find_loop xmod.source graph) in
let def key =
key ^ " [label=\"" ^ key ^ "\"" ^
(if List.mem key xmod.source then ", shape = \"invhouse\"" else "") ^
(if List.mem key xmod.sink then
", style = filled, shape = invtriangle, fillcolor = \"#e4e4e4\"" else "") ^
"];" in
let def_subgraph key label color nodes =
["subgraph " ^ key ^ " {"] @
indent 1 ("label=\"" ^ label ^ "\"; color=\"" ^ color ^ "\"; fontcolor=\"" ^ color ^ "\";") ::
List.map (indent 1) nodes @
["}"] in
let edge (key, dep) =
List.map (fun i ->
if (PairS.mem (i,key) loops) then
i ^ " -> " ^ key ^ " [color = red];"
else
i ^ " -> " ^ key ^ ";"
) (dep.Dependency.input_current) @
List.map (fun i -> i ^ " -> " ^ key ^ " [style = dashed];") (dep.Dependency.input_last)
in
"digraph " ^ xmod.id ^ " {\n" ^
concat_map " \n " ( indent 1 ) ( List.map ) ^ " \n\n " ^
String.concat "\n" (List.mapi (fun i (host, ids) ->
concat_map "\n" (indent 1) @@
def_subgraph
("cluster_" ^ string_of_int i) (string_of_host host)
(List.nth colors (i mod 10)) (List.map def ids)
) xmod.hostinfo) ^ "\n" ^
concat_map "\n" (indent 1) (List.map edge deps |> List.flatten) ^
(*
indent 1 "{ rank = source; " ^ String.concat "; " xmod.source ^ "; }\n" ^
indent 1 "{ rank = sink; " ^ String.concat "; " xmod.sink ^ "; }" ^
*)
"\n}" | null | https://raw.githubusercontent.com/45deg/distributed-xfrp/e1d4594e27c89c9ba4147e0bbf5ef44e39630409/src/graphviz.ml | ocaml |
indent 1 "{ rank = source; " ^ String.concat "; " xmod.source ^ "; }\n" ^
indent 1 "{ rank = sink; " ^ String.concat "; " xmod.sink ^ "; }" ^
| open Syntax
open Module
TODO : Fix this after ` codegen ` works .
let concat_map = Codegen.concat_map
let indent = Codegen.indent
let concat_map = Codegen.concat_map
let indent = Codegen.indent
*)
let concat_map s f l = String.concat s (List.map f l)
let indent n s = String.make n '\t' ^ s
let colors =
["#1f77b4"; "#ff7f0e"; "#2ca02c"; "#d62728"; "#9467bd"; "#8c564b"; "#e377c2"; "#7f7f7f"; "#bcbd22"; "#17becf"]
module PairS = Set.Make(struct type t = string * string let compare = compare end)
let loop_table loops =
List.fold_left (fun set loop ->
List.fold_left2 (fun set s t -> PairS.add (s,t) set) set loop (List.tl loop @ [List.hd loop])
) PairS.empty loops
let of_xmodule xmod =
let graph = Dependency.get_graph xmod in
let deps = Dependency.M.bindings graph in
let loops = loop_table (Dependency.find_loop xmod.source graph) in
let def key =
key ^ " [label=\"" ^ key ^ "\"" ^
(if List.mem key xmod.source then ", shape = \"invhouse\"" else "") ^
(if List.mem key xmod.sink then
", style = filled, shape = invtriangle, fillcolor = \"#e4e4e4\"" else "") ^
"];" in
let def_subgraph key label color nodes =
["subgraph " ^ key ^ " {"] @
indent 1 ("label=\"" ^ label ^ "\"; color=\"" ^ color ^ "\"; fontcolor=\"" ^ color ^ "\";") ::
List.map (indent 1) nodes @
["}"] in
let edge (key, dep) =
List.map (fun i ->
if (PairS.mem (i,key) loops) then
i ^ " -> " ^ key ^ " [color = red];"
else
i ^ " -> " ^ key ^ ";"
) (dep.Dependency.input_current) @
List.map (fun i -> i ^ " -> " ^ key ^ " [style = dashed];") (dep.Dependency.input_last)
in
"digraph " ^ xmod.id ^ " {\n" ^
concat_map " \n " ( indent 1 ) ( List.map ) ^ " \n\n " ^
String.concat "\n" (List.mapi (fun i (host, ids) ->
concat_map "\n" (indent 1) @@
def_subgraph
("cluster_" ^ string_of_int i) (string_of_host host)
(List.nth colors (i mod 10)) (List.map def ids)
) xmod.hostinfo) ^ "\n" ^
concat_map "\n" (indent 1) (List.map edge deps |> List.flatten) ^
"\n}" |
316b48640f48ebf38c94ad520697b374a1b0d50596f04ac81695c6d568c492fd | jkk/formative | parse_test.cljc | (ns formative.parse-test
#?(:cljs (:require-macros [cemerick.cljs.test :refer [is are deftest testing run-tests]]
[formative.macros :refer [with-fallback]]))
(:require [formative.parse :as fp]
[formative.util :as fu]
#?(:clj [formative.parse :refer [with-fallback]])
#?(:cljs [cemerick.cljs.test :as t])
#?(:clj [clojure.test :refer [is are deftest testing run-tests]])))
(def form1
{:fields [{:name :f-default}
{:name :f-int :type :hidden :datatype :int}
{:name :f-long :datatype :long}
{:name :f-boolean :datatype :boolean}
{:name :f-float :datatype :float}
{:name :f-double :datatype :double}
{:name :f-decimal :datatype :decimal}
{:name :f-bigint :datatype :bigint}
{:name :f-date :datatype :date}
{:name :f-time :datatype :time}
{:name :f-instant :datatype :instant}
{:name :f-ints :datatype :ints}
{:name :f-longs :datatype :longs}
{:name :f-booleans :datatype :booleans}
{:name :f-floats :datatype :floats}
{:name :f-doubles :datatype :doubles}
{:name :f-decimals :datatype :decimals}
{:name :f-bigints :datatype :bigints}
{:name :f-dates :datatype :dates}
{:name :f-times :datatype :times}
{:name :f-instants :datatype :instants}
{:name :f-textarea :type :textarea}
{:name :f-select1 :type :select :options ["foo" "bar" "baz"]}
{:name :f-select2 :type :select :datatype :boolean
:options [{:label "Foo" :value true}
{:label "Bar" :value false}]}
{:name :f-checkbox1 :type :checkbox}
{:name :f-checkbox2 :type :checkbox :checked-value "foo"
:unchecked-value "bar"}
{:name :f-checkboxes1 :type :checkboxes
:options ["foo" "bar" "baz"]}
{:name :f-checkboxes2 :type :checkboxes :datatype :booleans
:options [{:label "Foo" :value true}
{:label "Bar" :value false}]}
{:name :f-checkboxes3 :type :checkboxes :datatype :ints
:options [{:label "Foo" :value 1}
{:label "Bar" :value 2}
{:label "Baz" :value 3}]}
{:name :f-radios1 :type :radios
:options ["foo" "bar" "baz"]}
{:name :f-radios2 :type :radios :datatype :boolean
:options [{:label "Foo" :value true}
{:label "Bar" :value false}]}
{:name :f-us-state :type :us-state}
{:name :f-ca-state :type :ca-state}
{:name :f-country :type :country}
{:name :f-us-tel :type :us-tel}
{:name :f-date-select :type :date-select}
{:name :f-year-select :type :year-select}
{:name :f-month-select :type :month-select}
{:name :f-time-select :type :time-select}
{:name :f-datetime-select :type :datetime-select
:timezone "America/New_York"}
{:name :f-currency :type :currency}
{:name :f-heading :type :heading}
{:name :f-labeled-html :type :labeled-html}
{:name :f-html :type :html}
{:name "foo[bar][baz]" :datatype :int}
{:name :foo2.bar.baz :datatype :int}]})
(def good-params
{:f-default "foo"
:f-int "123"
:f-long "123"
:f-boolean "true"
:f-float "123.45"
:f-double "123.45"
:f-decimal "123.45"
:f-bigint "13918723981723918723987129387198273198273918273"
:f-date "2012-12-25"
:f-time "23:06"
:f-instant "2012-12-25T23:06:00"
:f-ints ["123" "456" "789"]
:f-longs "123,456, 789"
:f-booleans ["true" "true" "false"]
:f-floats ["123.45" "678.90"]
:f-doubles ["123.45" "678.90"]
:f-decimals ["123.45" "678.90"]
:f-bigints ["13918723981723918723987129387198273198273918273"
"29038402938402938402983409283049203948209384209"]
:f-dates ["2012-01-01" "2012-02-03" "2012-10-04"]
:f-times ["0:01" "23:02" "12:00"]
:f-instants ["2012-01-01T00:01:00" "2012-02-03T23:02:00" "2012-10-04T12:00:00"]
:f-textarea "foo"
:f-select1 "bar"
:f-select2 "true"
:f-checkbox1 "false"
:f-checkbox2 "bar"
:f-checkboxes1 ["" "foo" "bar"]
:f-checkboxes2 ["" "true" "false"]
:f-checkboxes3 ["" "2" "3"]
:f-radios1 "foo"
:f-radios2 "false"
:f-us-state "NY"
:f-ca-state "ON"
:f-country "US"
:f-us-tel "(234) 567-8901x123"
:f-date-select {:month "12" :day "25" :year "2012"}
:f-year-select "2012"
:f-month-select "12"
:f-time-select {:h "12" :m "0" :ampm "pm"}
:f-datetime-select {:year "2012" :month "12" :day "25" :h "6" :m "0" :ampm "pm"}
:f-currency "123.45"
:f-heading "foo"
:f-labeled-html "foo"
:f-html "foo"
:foo {:bar {:baz "1"}}
:foo2 {:bar {:baz "1"}}})
(def good-values
{:f-default "foo"
:f-int 123
:f-long 123
:f-boolean true
:f-float 123.45
:f-double 123.45
:f-decimal #?(:clj 123.45M :cljs "123.45")
:f-bigint #?(:clj 13918723981723918723987129387198273198273918273N
:cljs "13918723981723918723987129387198273198273918273")
:f-date (fu/to-date (fu/utc-date 2012 12 25))
:f-time (fu/to-time (fu/parse-time "23:06"))
:f-instant (fu/to-date (fu/utc-date 2012 12 25 23 6))
:f-ints [123 456 789]
:f-longs [123 456 789]
:f-booleans [true true false]
:f-floats [123.45 678.90]
:f-doubles [123.45 678.90]
:f-decimals #?(:clj [123.45M 678.90M] :cljs ["123.45" "678.90"])
:f-bigints #?(:clj [13918723981723918723987129387198273198273918273N
29038402938402938402983409283049203948209384209N]
:cljs ["13918723981723918723987129387198273198273918273"
"29038402938402938402983409283049203948209384209"])
:f-dates [(fu/to-date (fu/utc-date 2012 1 1))
(fu/to-date (fu/utc-date 2012 2 3))
(fu/to-date (fu/utc-date 2012 10 4))]
:f-times [(fu/to-time (fu/parse-time "00:01"))
(fu/to-time (fu/parse-time "23:02"))
(fu/to-time (fu/parse-time "12:00"))]
:f-instants [(fu/to-date (fu/utc-date 2012 1 1 0 1))
(fu/to-date (fu/utc-date 2012 2 3 23 2))
(fu/to-date (fu/utc-date 2012 10 4 12 0))]
:f-textarea "foo"
:f-select1 "bar"
:f-select2 true
:f-checkbox1 false
:f-checkbox2 "bar"
:f-checkboxes1 ["foo" "bar"]
:f-checkboxes2 [true false]
:f-checkboxes3 [2 3]
:f-radios1 "foo"
:f-radios2 false
:f-us-state "NY"
:f-ca-state "ON"
:f-country "US"
:f-us-tel "2345678901x123"
:f-date-select (fu/to-date (fu/utc-date 2012 12 25))
:f-year-select 2012
:f-month-select 12
:f-time-select (fu/to-time (fu/parse-time "12:00"))
:f-datetime-select (fu/to-date (fu/utc-date 2012 12 25 #?(:clj 23 :cljs 18) 0))
:f-currency #?(:clj 123.45M :cljs "123.45")
:foo {:bar {:baz 1}}
:foo2 {:bar {:baz 1}}})
(deftest parse-test
(testing "Known-good params"
(let [values (fp/parse-params form1 good-params)]
(is (= values good-values))))
(testing "Unparsed Ring params"
(is (= (fp/parse-params form1 {"f-date-select[year]" "2012"
"f-date-select[month]" "12"
"f-date-select[day]" "25"
"f-checkboxes2[]" ["" "true" "false"]})
{:f-date-select (fu/to-date (fu/utc-date 2012 12 25))
:f-checkboxes2 [true false]})))
(testing "Unparsed form data"
(is (= (fp/parse-params form1 (str "f-date-select[year]=2012"
"&f-date-select[month]=12"
"&f-date-select[day]=25"
"&f-checkboxes2[]="
"&f-checkboxes2[]=true"
"&f-checkboxes2[]=false"))
{:f-date-select (fu/to-date (fu/utc-date 2012 12 25))
:f-checkboxes2 [true false]})))
(testing "Failed parsing"
(let [values (fp/parse-params form1 {:f-int "xxx"}
:validate false)]
(is (instance? formative.parse.ParseError (:f-int values))))
(let [ex (try
(fp/parse-params form1 {:f-int "xxx"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= [{:keys [:f-int] :msg "must be a number"}]
(:problems (ex-data ex)))))
(let [ex (try
(fp/parse-params (assoc form1
:validations
[[:required [:f-us-state
:f-ca-state
:f-country]]])
{:f-int "123"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= [{:keys [:f-us-state :f-ca-state :f-country]
:msg "must not be blank"}]
(:problems (ex-data ex)))))))
(def form2
{:fields [{:name :a :datatype :int :datatype-error "foobar"}]
:validations [[:int :a "nope"]]})
(deftest validate-types-test
(testing ":validate-types true (default)"
(let [ex (try
(fp/parse-params form2 {:a "x"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= '({:keys (:a), :msg "foobar"}
{:keys (:a), :msg "nope"})
(:problems (ex-data ex))))))
(testing ":validate-types false"
(let [ex (try
(fp/parse-params (assoc form2 :validate-types false)
{:a "x"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= '({:keys (:a), :msg "nope"})
(:problems (ex-data ex))))))
(testing ":validate-types false, without :validations"
(is (= (fp/parse-params (-> form2
(dissoc :validations)
(assoc :validate-types false))
{:a "x"})
{:a (fp/->ParseError "x")}))))
(deftest with-fallback-test
(= [{:msg "hi"}]
(with-fallback identity
(throw (ex-info "Boo" {:problems [{:msg "hi"}]})))))
;;(run-tests)
| null | https://raw.githubusercontent.com/jkk/formative/e1b69161c05438a48d3186bd2eb5126377ffe2e3/test/formative/parse_test.cljc | clojure | (run-tests) | (ns formative.parse-test
#?(:cljs (:require-macros [cemerick.cljs.test :refer [is are deftest testing run-tests]]
[formative.macros :refer [with-fallback]]))
(:require [formative.parse :as fp]
[formative.util :as fu]
#?(:clj [formative.parse :refer [with-fallback]])
#?(:cljs [cemerick.cljs.test :as t])
#?(:clj [clojure.test :refer [is are deftest testing run-tests]])))
(def form1
{:fields [{:name :f-default}
{:name :f-int :type :hidden :datatype :int}
{:name :f-long :datatype :long}
{:name :f-boolean :datatype :boolean}
{:name :f-float :datatype :float}
{:name :f-double :datatype :double}
{:name :f-decimal :datatype :decimal}
{:name :f-bigint :datatype :bigint}
{:name :f-date :datatype :date}
{:name :f-time :datatype :time}
{:name :f-instant :datatype :instant}
{:name :f-ints :datatype :ints}
{:name :f-longs :datatype :longs}
{:name :f-booleans :datatype :booleans}
{:name :f-floats :datatype :floats}
{:name :f-doubles :datatype :doubles}
{:name :f-decimals :datatype :decimals}
{:name :f-bigints :datatype :bigints}
{:name :f-dates :datatype :dates}
{:name :f-times :datatype :times}
{:name :f-instants :datatype :instants}
{:name :f-textarea :type :textarea}
{:name :f-select1 :type :select :options ["foo" "bar" "baz"]}
{:name :f-select2 :type :select :datatype :boolean
:options [{:label "Foo" :value true}
{:label "Bar" :value false}]}
{:name :f-checkbox1 :type :checkbox}
{:name :f-checkbox2 :type :checkbox :checked-value "foo"
:unchecked-value "bar"}
{:name :f-checkboxes1 :type :checkboxes
:options ["foo" "bar" "baz"]}
{:name :f-checkboxes2 :type :checkboxes :datatype :booleans
:options [{:label "Foo" :value true}
{:label "Bar" :value false}]}
{:name :f-checkboxes3 :type :checkboxes :datatype :ints
:options [{:label "Foo" :value 1}
{:label "Bar" :value 2}
{:label "Baz" :value 3}]}
{:name :f-radios1 :type :radios
:options ["foo" "bar" "baz"]}
{:name :f-radios2 :type :radios :datatype :boolean
:options [{:label "Foo" :value true}
{:label "Bar" :value false}]}
{:name :f-us-state :type :us-state}
{:name :f-ca-state :type :ca-state}
{:name :f-country :type :country}
{:name :f-us-tel :type :us-tel}
{:name :f-date-select :type :date-select}
{:name :f-year-select :type :year-select}
{:name :f-month-select :type :month-select}
{:name :f-time-select :type :time-select}
{:name :f-datetime-select :type :datetime-select
:timezone "America/New_York"}
{:name :f-currency :type :currency}
{:name :f-heading :type :heading}
{:name :f-labeled-html :type :labeled-html}
{:name :f-html :type :html}
{:name "foo[bar][baz]" :datatype :int}
{:name :foo2.bar.baz :datatype :int}]})
(def good-params
{:f-default "foo"
:f-int "123"
:f-long "123"
:f-boolean "true"
:f-float "123.45"
:f-double "123.45"
:f-decimal "123.45"
:f-bigint "13918723981723918723987129387198273198273918273"
:f-date "2012-12-25"
:f-time "23:06"
:f-instant "2012-12-25T23:06:00"
:f-ints ["123" "456" "789"]
:f-longs "123,456, 789"
:f-booleans ["true" "true" "false"]
:f-floats ["123.45" "678.90"]
:f-doubles ["123.45" "678.90"]
:f-decimals ["123.45" "678.90"]
:f-bigints ["13918723981723918723987129387198273198273918273"
"29038402938402938402983409283049203948209384209"]
:f-dates ["2012-01-01" "2012-02-03" "2012-10-04"]
:f-times ["0:01" "23:02" "12:00"]
:f-instants ["2012-01-01T00:01:00" "2012-02-03T23:02:00" "2012-10-04T12:00:00"]
:f-textarea "foo"
:f-select1 "bar"
:f-select2 "true"
:f-checkbox1 "false"
:f-checkbox2 "bar"
:f-checkboxes1 ["" "foo" "bar"]
:f-checkboxes2 ["" "true" "false"]
:f-checkboxes3 ["" "2" "3"]
:f-radios1 "foo"
:f-radios2 "false"
:f-us-state "NY"
:f-ca-state "ON"
:f-country "US"
:f-us-tel "(234) 567-8901x123"
:f-date-select {:month "12" :day "25" :year "2012"}
:f-year-select "2012"
:f-month-select "12"
:f-time-select {:h "12" :m "0" :ampm "pm"}
:f-datetime-select {:year "2012" :month "12" :day "25" :h "6" :m "0" :ampm "pm"}
:f-currency "123.45"
:f-heading "foo"
:f-labeled-html "foo"
:f-html "foo"
:foo {:bar {:baz "1"}}
:foo2 {:bar {:baz "1"}}})
(def good-values
{:f-default "foo"
:f-int 123
:f-long 123
:f-boolean true
:f-float 123.45
:f-double 123.45
:f-decimal #?(:clj 123.45M :cljs "123.45")
:f-bigint #?(:clj 13918723981723918723987129387198273198273918273N
:cljs "13918723981723918723987129387198273198273918273")
:f-date (fu/to-date (fu/utc-date 2012 12 25))
:f-time (fu/to-time (fu/parse-time "23:06"))
:f-instant (fu/to-date (fu/utc-date 2012 12 25 23 6))
:f-ints [123 456 789]
:f-longs [123 456 789]
:f-booleans [true true false]
:f-floats [123.45 678.90]
:f-doubles [123.45 678.90]
:f-decimals #?(:clj [123.45M 678.90M] :cljs ["123.45" "678.90"])
:f-bigints #?(:clj [13918723981723918723987129387198273198273918273N
29038402938402938402983409283049203948209384209N]
:cljs ["13918723981723918723987129387198273198273918273"
"29038402938402938402983409283049203948209384209"])
:f-dates [(fu/to-date (fu/utc-date 2012 1 1))
(fu/to-date (fu/utc-date 2012 2 3))
(fu/to-date (fu/utc-date 2012 10 4))]
:f-times [(fu/to-time (fu/parse-time "00:01"))
(fu/to-time (fu/parse-time "23:02"))
(fu/to-time (fu/parse-time "12:00"))]
:f-instants [(fu/to-date (fu/utc-date 2012 1 1 0 1))
(fu/to-date (fu/utc-date 2012 2 3 23 2))
(fu/to-date (fu/utc-date 2012 10 4 12 0))]
:f-textarea "foo"
:f-select1 "bar"
:f-select2 true
:f-checkbox1 false
:f-checkbox2 "bar"
:f-checkboxes1 ["foo" "bar"]
:f-checkboxes2 [true false]
:f-checkboxes3 [2 3]
:f-radios1 "foo"
:f-radios2 false
:f-us-state "NY"
:f-ca-state "ON"
:f-country "US"
:f-us-tel "2345678901x123"
:f-date-select (fu/to-date (fu/utc-date 2012 12 25))
:f-year-select 2012
:f-month-select 12
:f-time-select (fu/to-time (fu/parse-time "12:00"))
:f-datetime-select (fu/to-date (fu/utc-date 2012 12 25 #?(:clj 23 :cljs 18) 0))
:f-currency #?(:clj 123.45M :cljs "123.45")
:foo {:bar {:baz 1}}
:foo2 {:bar {:baz 1}}})
(deftest parse-test
(testing "Known-good params"
(let [values (fp/parse-params form1 good-params)]
(is (= values good-values))))
(testing "Unparsed Ring params"
(is (= (fp/parse-params form1 {"f-date-select[year]" "2012"
"f-date-select[month]" "12"
"f-date-select[day]" "25"
"f-checkboxes2[]" ["" "true" "false"]})
{:f-date-select (fu/to-date (fu/utc-date 2012 12 25))
:f-checkboxes2 [true false]})))
(testing "Unparsed form data"
(is (= (fp/parse-params form1 (str "f-date-select[year]=2012"
"&f-date-select[month]=12"
"&f-date-select[day]=25"
"&f-checkboxes2[]="
"&f-checkboxes2[]=true"
"&f-checkboxes2[]=false"))
{:f-date-select (fu/to-date (fu/utc-date 2012 12 25))
:f-checkboxes2 [true false]})))
(testing "Failed parsing"
(let [values (fp/parse-params form1 {:f-int "xxx"}
:validate false)]
(is (instance? formative.parse.ParseError (:f-int values))))
(let [ex (try
(fp/parse-params form1 {:f-int "xxx"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= [{:keys [:f-int] :msg "must be a number"}]
(:problems (ex-data ex)))))
(let [ex (try
(fp/parse-params (assoc form1
:validations
[[:required [:f-us-state
:f-ca-state
:f-country]]])
{:f-int "123"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= [{:keys [:f-us-state :f-ca-state :f-country]
:msg "must not be blank"}]
(:problems (ex-data ex)))))))
(def form2
{:fields [{:name :a :datatype :int :datatype-error "foobar"}]
:validations [[:int :a "nope"]]})
(deftest validate-types-test
(testing ":validate-types true (default)"
(let [ex (try
(fp/parse-params form2 {:a "x"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= '({:keys (:a), :msg "foobar"}
{:keys (:a), :msg "nope"})
(:problems (ex-data ex))))))
(testing ":validate-types false"
(let [ex (try
(fp/parse-params (assoc form2 :validate-types false)
{:a "x"})
(catch #?(:clj Exception :cljs js/Error) ex
ex))]
(is (= '({:keys (:a), :msg "nope"})
(:problems (ex-data ex))))))
(testing ":validate-types false, without :validations"
(is (= (fp/parse-params (-> form2
(dissoc :validations)
(assoc :validate-types false))
{:a "x"})
{:a (fp/->ParseError "x")}))))
(deftest with-fallback-test
(= [{:msg "hi"}]
(with-fallback identity
(throw (ex-info "Boo" {:problems [{:msg "hi"}]})))))
|
b78b0a07ab8c9ba4fbf217a9951bc206b9a07a383c92d26b227b2410d7ec5406 | rescript-lang/rescript-compiler | mlSyntax.ml | test ml file
let () = print_endline "hello world"
let unicode = "🙈 😅 🙌"
let d = {|Sehr Schön|}
| null | https://raw.githubusercontent.com/rescript-lang/rescript-compiler/e87d6b52b7a78207c1fecfaba672523003e1dc3e/res_syntax/tests/api/mlSyntax.ml | ocaml | test ml file
let () = print_endline "hello world"
let unicode = "🙈 😅 🙌"
let d = {|Sehr Schön|}
| |
0629484cb7e993f631012b024ee1901d442d2b45bd85d46c25f55d2817fae52c | ocharles/zero-to-quake-3 | Input.hs | # language LambdaCase #
# language RecordWildCards #
module Quake3.Input where
sdl2
import qualified SDL
import qualified SDL.Event
zero - to - quake-3
import qualified Quake3.Model
eventToAction :: SDL.Event -> Maybe Quake3.Model.Action
eventToAction =
eventPayloadToAction . SDL.Event.eventPayload
eventPayloadToAction :: SDL.EventPayload -> Maybe Quake3.Model.Action
eventPayloadToAction = \case
SDL.Event.KeyboardEvent e ->
keyboardEventToAction e
SDL.Event.MouseMotionEvent e ->
mouseMotionEventToAction e
_ ->
Nothing
keyboardEventToAction :: SDL.KeyboardEventData -> Maybe Quake3.Model.Action
keyboardEventToAction SDL.Event.KeyboardEventData{..} =
case SDL.keysymScancode keyboardEventKeysym of
SDL.ScancodeW ->
Just
( Quake3.Model.ToggleRunForward
( keyboardEventKeyMotion == SDL.Event.Pressed )
)
_ ->
Nothing
mouseMotionEventToAction :: SDL.MouseMotionEventData -> Maybe Quake3.Model.Action
mouseMotionEventToAction SDL.MouseMotionEventData{..} =
Just
( Quake3.Model.TurnBy
( fmap ( ( / 100 ) . fromIntegral ) mouseMotionEventRelMotion )
)
| null | https://raw.githubusercontent.com/ocharles/zero-to-quake-3/6fe4ef61955b8a816369cf70c0edd2cacfd911db/src/Quake3/Input.hs | haskell | # language LambdaCase #
# language RecordWildCards #
module Quake3.Input where
sdl2
import qualified SDL
import qualified SDL.Event
zero - to - quake-3
import qualified Quake3.Model
eventToAction :: SDL.Event -> Maybe Quake3.Model.Action
eventToAction =
eventPayloadToAction . SDL.Event.eventPayload
eventPayloadToAction :: SDL.EventPayload -> Maybe Quake3.Model.Action
eventPayloadToAction = \case
SDL.Event.KeyboardEvent e ->
keyboardEventToAction e
SDL.Event.MouseMotionEvent e ->
mouseMotionEventToAction e
_ ->
Nothing
keyboardEventToAction :: SDL.KeyboardEventData -> Maybe Quake3.Model.Action
keyboardEventToAction SDL.Event.KeyboardEventData{..} =
case SDL.keysymScancode keyboardEventKeysym of
SDL.ScancodeW ->
Just
( Quake3.Model.ToggleRunForward
( keyboardEventKeyMotion == SDL.Event.Pressed )
)
_ ->
Nothing
mouseMotionEventToAction :: SDL.MouseMotionEventData -> Maybe Quake3.Model.Action
mouseMotionEventToAction SDL.MouseMotionEventData{..} =
Just
( Quake3.Model.TurnBy
( fmap ( ( / 100 ) . fromIntegral ) mouseMotionEventRelMotion )
)
| |
2eb72d6f314d05fa68663bd710a9fb8eecfb23427b227537bceba2c947c794c9 | zyrolasting/polyglot | vcomps.rkt | #lang racket/base
(require racket/list polyglot)
(provide (all-defined-out))
(define (page title initial-page-tx)
`(html (head (title ,title)
(meta ((charset "utf-8")))
(meta ((name "viewport") (content "width=device-width, initial-scale=1")))
; polyglot counts styles.css as a dependency.
(link ((rel "stylesheet") (type "text/css") (href "styles.css"))))
(body
(h1 ,title)
. ,(get-elements (findf-txexpr initial-page-tx
(λ (x) (tag-equal? 'body x)))))))
(define (add-newlines lines)
(map (λ (l) (format "~a~n" l)) lines))
; Note that this returns an application element. `polyglot` will do another pass if
; it sees that application elements produce more application elements. This makes
; it possible to generate more sophisticated content in terms of code...
(define (meta-script id . lines)
`(script ((type "application/racket") (id ,id))
. ,lines))
; ...such as code samples combined with their actual output.
(define (rackdown-code-sample name . lines)
`(div ((class "code-sample"))
(pre . ,(add-newlines lines))
(output ,(apply meta-script (cons name lines)))))
| null | https://raw.githubusercontent.com/zyrolasting/polyglot/d27ca7fe90fd4ba2a6c5bcd921fce89e72d2c408/polyglot-lib/polyglot/private/skel/functional/vcomps.rkt | racket | polyglot counts styles.css as a dependency.
Note that this returns an application element. `polyglot` will do another pass if
it sees that application elements produce more application elements. This makes
it possible to generate more sophisticated content in terms of code...
...such as code samples combined with their actual output. | #lang racket/base
(require racket/list polyglot)
(provide (all-defined-out))
(define (page title initial-page-tx)
`(html (head (title ,title)
(meta ((charset "utf-8")))
(meta ((name "viewport") (content "width=device-width, initial-scale=1")))
(link ((rel "stylesheet") (type "text/css") (href "styles.css"))))
(body
(h1 ,title)
. ,(get-elements (findf-txexpr initial-page-tx
(λ (x) (tag-equal? 'body x)))))))
(define (add-newlines lines)
(map (λ (l) (format "~a~n" l)) lines))
(define (meta-script id . lines)
`(script ((type "application/racket") (id ,id))
. ,lines))
(define (rackdown-code-sample name . lines)
`(div ((class "code-sample"))
(pre . ,(add-newlines lines))
(output ,(apply meta-script (cons name lines)))))
|
54949225160f5f04a24d290b841a70b89a71ad3f89a556a3bef4b4a9ea1bbbed | ghcjs/jsaddle-dom | SVGFEMergeNodeElement.hs | # LANGUAGE PatternSynonyms #
-- For HasCallStack compatibility
{-# LANGUAGE ImplicitParams, ConstraintKinds, KindSignatures #-}
# OPTIONS_GHC -fno - warn - unused - imports #
module JSDOM.Generated.SVGFEMergeNodeElement
(getIn1, SVGFEMergeNodeElement(..), gTypeSVGFEMergeNodeElement)
where
import Prelude ((.), (==), (>>=), return, IO, Int, Float, Double, Bool(..), Maybe, maybe, fromIntegral, round, realToFrac, fmap, Show, Read, Eq, Ord, Maybe(..))
import qualified Prelude (error)
import Data.Typeable (Typeable)
import Data.Traversable (mapM)
import Language.Javascript.JSaddle (JSM(..), JSVal(..), JSString, strictEqual, toJSVal, valToStr, valToNumber, valToBool, js, jss, jsf, jsg, function, asyncFunction, new, array, jsUndefined, (!), (!!))
import Data.Int (Int64)
import Data.Word (Word, Word64)
import JSDOM.Types
import Control.Applicative ((<$>))
import Control.Monad (void)
import Control.Lens.Operators ((^.))
import JSDOM.EventTargetClosures (EventName, unsafeEventName, unsafeEventNameAsync)
import JSDOM.Enums
| < -US/docs/Web/API/SVGFEMergeNodeElement.in1 Mozilla SVGFEMergeNodeElement.in1 documentation >
getIn1 ::
(MonadDOM m) => SVGFEMergeNodeElement -> m SVGAnimatedString
getIn1 self = liftDOM ((self ^. js "in1") >>= fromJSValUnchecked)
| null | https://raw.githubusercontent.com/ghcjs/jsaddle-dom/5f5094277d4b11f3dc3e2df6bb437b75712d268f/src/JSDOM/Generated/SVGFEMergeNodeElement.hs | haskell | For HasCallStack compatibility
# LANGUAGE ImplicitParams, ConstraintKinds, KindSignatures # | # LANGUAGE PatternSynonyms #
# OPTIONS_GHC -fno - warn - unused - imports #
module JSDOM.Generated.SVGFEMergeNodeElement
(getIn1, SVGFEMergeNodeElement(..), gTypeSVGFEMergeNodeElement)
where
import Prelude ((.), (==), (>>=), return, IO, Int, Float, Double, Bool(..), Maybe, maybe, fromIntegral, round, realToFrac, fmap, Show, Read, Eq, Ord, Maybe(..))
import qualified Prelude (error)
import Data.Typeable (Typeable)
import Data.Traversable (mapM)
import Language.Javascript.JSaddle (JSM(..), JSVal(..), JSString, strictEqual, toJSVal, valToStr, valToNumber, valToBool, js, jss, jsf, jsg, function, asyncFunction, new, array, jsUndefined, (!), (!!))
import Data.Int (Int64)
import Data.Word (Word, Word64)
import JSDOM.Types
import Control.Applicative ((<$>))
import Control.Monad (void)
import Control.Lens.Operators ((^.))
import JSDOM.EventTargetClosures (EventName, unsafeEventName, unsafeEventNameAsync)
import JSDOM.Enums
| < -US/docs/Web/API/SVGFEMergeNodeElement.in1 Mozilla SVGFEMergeNodeElement.in1 documentation >
getIn1 ::
(MonadDOM m) => SVGFEMergeNodeElement -> m SVGAnimatedString
getIn1 self = liftDOM ((self ^. js "in1") >>= fromJSValUnchecked)
|
d017cf5b0285b1b803d7b6439598f838d55066d52f665ab718208c4a3da9f075 | arbor/antiope | Orphans.hs | # OPTIONS_GHC -fno - warn - orphans #
module Antiope.Orphans where
import Control.Monad.Trans (lift)
import Control.Monad.Trans.Resource (ResourceT)
import Network.AWS (MonadAWS (..))
instance MonadAWS m => MonadAWS (ResourceT m) where
liftAWS = lift . liftAWS
| null | https://raw.githubusercontent.com/arbor/antiope/86ad3df07b8d3fd5d2c8bef4111a73b85850e1ba/antiope-core/src/Antiope/Orphans.hs | haskell | # OPTIONS_GHC -fno - warn - orphans #
module Antiope.Orphans where
import Control.Monad.Trans (lift)
import Control.Monad.Trans.Resource (ResourceT)
import Network.AWS (MonadAWS (..))
instance MonadAWS m => MonadAWS (ResourceT m) where
liftAWS = lift . liftAWS
| |
fb7803c5e2c652a5e6ffedeeea907e744f380415ccc35aad5303b3b513e9556c | roehst/tapl-implementations | syntax.mli | module Syntax : syntax trees and associated support functions
open Support.Pervasive
open Support.Error
(* Data type definitions *)
type ty =
TyTop
| TyArr of ty * ty
| TyRecord of (string * ty) list
| TyBool
type term =
TmVar of info * int * int
| TmAbs of info * string * ty * term
| TmApp of info * term * term
| TmRecord of info * (string * term) list
| TmProj of info * term * string
| TmTrue of info
| TmFalse of info
| TmIf of info * term * term * term
type binding =
NameBind
| VarBind of ty
type command =
| Eval of info * term
| Bind of info * string * binding
(* Contexts *)
type context
val emptycontext : context
val ctxlength : context -> int
val addbinding : context -> string -> binding -> context
val addname: context -> string -> context
val index2name : info -> context -> int -> string
val getbinding : info -> context -> int -> binding
val name2index : info -> context -> string -> int
val isnamebound : context -> string -> bool
val getTypeFromContext : info -> context -> int -> ty
(* Shifting and substitution *)
val termShift: int -> term -> term
val termSubstTop: term -> term -> term
(* Printing *)
val printtm: context -> term -> unit
val printtm_ATerm: bool -> context -> term -> unit
val printty : ty -> unit
val prbinding : context -> binding -> unit
(* Misc *)
val tmInfo: term -> info
| null | https://raw.githubusercontent.com/roehst/tapl-implementations/23c0dc505a8c0b0a797201a7e4e3e5b939dd8fdb/joinexercise/syntax.mli | ocaml | Data type definitions
Contexts
Shifting and substitution
Printing
Misc | module Syntax : syntax trees and associated support functions
open Support.Pervasive
open Support.Error
type ty =
TyTop
| TyArr of ty * ty
| TyRecord of (string * ty) list
| TyBool
type term =
TmVar of info * int * int
| TmAbs of info * string * ty * term
| TmApp of info * term * term
| TmRecord of info * (string * term) list
| TmProj of info * term * string
| TmTrue of info
| TmFalse of info
| TmIf of info * term * term * term
type binding =
NameBind
| VarBind of ty
type command =
| Eval of info * term
| Bind of info * string * binding
type context
val emptycontext : context
val ctxlength : context -> int
val addbinding : context -> string -> binding -> context
val addname: context -> string -> context
val index2name : info -> context -> int -> string
val getbinding : info -> context -> int -> binding
val name2index : info -> context -> string -> int
val isnamebound : context -> string -> bool
val getTypeFromContext : info -> context -> int -> ty
val termShift: int -> term -> term
val termSubstTop: term -> term -> term
val printtm: context -> term -> unit
val printtm_ATerm: bool -> context -> term -> unit
val printty : ty -> unit
val prbinding : context -> binding -> unit
val tmInfo: term -> info
|
6b1f80f7bcb19f143570b8a3f859d70e28d767688819643a8bcde15849e0a68b | stevana/coroutine-state-machines | Main.hs | # LANGUAGE DeriveFoldable #
# LANGUAGE DeriveFunctor #
# LANGUAGE DerivingStrategies #
# LANGUAGE ScopedTypeVariables #
module Main where
import Control.Concurrent
import Control.Concurrent.Async
import Control.Concurrent.STM
import Control.Concurrent.STM.TQueue
import Control.Monad
import Control.Monad.IO.Class
import Data.List (permutations)
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Tree
import System.Random
import Test.QuickCheck
import Test.QuickCheck.Monadic
import Test.Tasty
import Test.Tasty.QuickCheck
import KeyValueApp (keyValueMain)
import KeyValueClient
------------------------------------------------------------------------
data Command = WriteCmd String Int | ReadCmd String
deriving stock (Eq, Show)
data Response = Unit () | MaybeInt (Maybe Int)
deriving stock (Eq, Show)
type Model = Map String Int
initModel :: Model
initModel = Map.empty
step :: Model -> Command -> (Model, Response)
step m cmd = case cmd of
WriteCmd k v -> Unit <$> fakeWrite m k v
ReadCmd k -> MaybeInt <$> fakeRead m k
where
fakeWrite :: Model -> String -> Int -> (Model, ())
fakeWrite m k v = (Map.insert k v m, ())
fakeRead :: Model -> String -> (Model, Maybe Int)
fakeRead m k = (m, m Map.!? k)
newtype Program = Program [Command]
deriving stock Show
genProgram :: Model -> Gen Program
genProgram _m = Program <$> listOf genCommand
genCommand :: Gen Command
genCommand = oneof [WriteCmd <$> genKey <*> arbitrary, ReadCmd <$> genKey]
where
genKey :: Gen String
genKey = elements ["a", "b", "c"]
shrinkProgram :: Program -> [Program]
shrinkProgram (Program cmds) = [ Program cmds' | cmds' <- shrinkList shrinkCommand cmds ]
shrinkCommand :: Command -> [Command]
shrinkCommand (WriteCmd k v) = [ WriteCmd k v' | v' <- shrink v ]
shrinkCommand (ReadCmd k) = []
exec :: KeyValueClient -> Command -> IO Response
exec c cmd = case cmd of
WriteCmd k v -> Unit <$> kvWrite c k v
ReadCmd k -> MaybeInt <$> kvRead c k
type Trace = [Step]
data Step = Step
{ sModelBefore :: Model
, sCommand :: Command
, sResponse :: Response
, sModelAfter :: Model
}
showTrace :: Trace -> String
showTrace ss0 = "\n\n" ++ go ss0
where
go [] = ""
go (Step m cmd resp m' : ss) = show m ++ "\n == " ++ show cmd ++ " ==> " ++ show resp ++ "\n" ++ show m' ++ "\n\n" ++ go ss
coverage :: Trace -> Property -> Property
coverage hist = classifyLength hist
where
classifyLength xs = classify (length xs == 0) "0 length"
. classify (0 < length xs && length xs <= 10) "1-10 length"
. classify (10 < length xs && length xs <= 50) "11-50 length"
. classify (50 < length xs && length xs <= 100) "51-100 length"
. classify (100 < length xs && length xs <= 300) "101-300 length"
. classify (300 < length xs && length xs <= 500) "301-500 length"
-- NOTE: Assumes that the server is running.
prop_sequential :: Int -> Property
prop_sequential port = forallPrograms $ \prog -> monadicIO $ do
kvc <- run (newKeyValueClient port)
run (kvReset kvc)
let m = initModel
(mce, hist) <- runProgram kvc m prog
monitor (coverage hist)
case mce of
Nothing -> return True
Just ce -> do
monitor (counterexample (showTrace hist))
monitor (counterexample ce)
return False
forallPrograms :: (Program -> Property) -> Property
forallPrograms p =
forAllShrink (genProgram initModel) shrinkProgram p
runProgram :: MonadIO m => KeyValueClient -> Model -> Program -> m (Maybe String, Trace)
runProgram c0 m0 (Program cmds0) = go c0 m0 [] cmds0
where
go _c _m hist [] = return (Nothing, reverse hist)
go c m hist (cmd : cmds) = do
resp <- liftIO (exec c cmd)
let (m', resp') = step m cmd
if resp == resp'
then go c m' (Step m cmd resp m' : hist) cmds
else return (Just (show resp ++ " /= " ++ show resp'), reverse (Step m cmd resp m' : hist))
newtype ConcProgram = ConcProgram { unConcProgram :: [[Command]] }
deriving stock Show
forAllConcProgram :: (ConcProgram -> Property) -> Property
forAllConcProgram k =
forAllShrinkShow (genConcProgram m) (shrinkConcProgram m) prettyConcProgram k
where
m = initModel
genConcProgram :: Model -> Gen ConcProgram
genConcProgram m0 = sized (go m0 [])
where
go :: Model -> [[Command]] -> Int -> Gen ConcProgram
go m acc sz | sz <= 0 = return (ConcProgram (reverse acc))
| otherwise = do
n <- chooseInt (2, 5)
cmds <- vectorOf n genCommand `suchThat` concSafe m
go (advanceModel m cmds) (cmds : acc) (sz - n)
advanceModel :: Model -> [Command] -> Model
advanceModel m cmds = foldl (\ih cmd -> fst (step ih cmd)) m cmds
concSafe :: Model -> [Command] -> Bool
concSafe m = all (validProgram m) . permutations
validProgram :: Model -> [Command] -> Bool
validProgram _model _cmds = True
validConcProgram :: Model -> ConcProgram -> Bool
validConcProgram m0 (ConcProgram cmdss0) = go m0 True cmdss0
where
go :: Model -> Bool -> [[Command]] -> Bool
go _m False _ = False
go _m acc [] = acc
go m _acc (cmds : cmdss) = go (advanceModel m cmds) (concSafe m cmds) cmdss
shrinkConcProgram :: Model -> ConcProgram -> [ConcProgram]
shrinkConcProgram m
= filter (validConcProgram m)
. map ConcProgram
. filter (not . null)
. shrinkList (shrinkList shrinkCommand)
. unConcProgram
prettyConcProgram :: ConcProgram -> String
prettyConcProgram = show
newtype History' cmd resp = History [Operation' cmd resp]
deriving stock (Show, Functor, Foldable)
type History = History' Command Response
newtype Pid = Pid Int
deriving stock (Eq, Ord, Show)
data Operation' cmd resp
= Invoke Pid cmd
| Ok Pid resp
deriving stock (Show, Functor, Foldable)
type Operation = Operation' Command Response
toPid :: ThreadId -> Pid
toPid tid = Pid (read (drop (length ("ThreadId " :: String)) (show tid)))
appendHistory :: TQueue (Operation' cmd resp) -> Operation' cmd resp -> IO ()
appendHistory hist op = atomically (writeTQueue hist op)
concExec :: TQueue Operation -> KeyValueClient -> Command -> IO ()
concExec queue kvc cmd = do
pid <- toPid <$> myThreadId
appendHistory queue (Invoke pid cmd)
-- Adds some entropy to the possible interleavings.
sleep <- randomRIO (0, 5)
threadDelay sleep
resp <- exec kvc cmd
atomically (writeTQueue queue (Ok pid resp))
interleavings :: History' cmd resp -> Forest (cmd, resp)
interleavings (History []) = []
interleavings (History ops0) =
[ Node (cmd, resp) (interleavings (History ops'))
| (tid, cmd) <- takeInvocations ops0
, (resp, ops') <- findResponse tid
(filter1 (not . matchInvocation tid) ops0)
]
where
takeInvocations :: [Operation' cmd resp] -> [(Pid, cmd)]
takeInvocations [] = []
takeInvocations ((Invoke pid cmd) : ops) = (pid, cmd) : takeInvocations ops
takeInvocations ((Ok _pid _resp) : _) = []
findResponse :: Pid -> [Operation' cmd resp] -> [(resp, [Operation' cmd resp])]
findResponse _pid [] = []
findResponse pid ((Ok pid' resp) : ops) | pid == pid' = [(resp, ops)]
findResponse pid (op : ops) =
[ (resp, op : ops') | (resp, ops') <- findResponse pid ops ]
matchInvocation :: Pid -> Operation' cmd resp -> Bool
matchInvocation pid (Invoke pid' _cmd) = pid == pid'
matchInvocation _ _ = False
filter1 :: (a -> Bool) -> [a] -> [a]
filter1 _ [] = []
filter1 p (x : xs) | p x = x : filter1 p xs
| otherwise = xs
linearisable :: forall model cmd resp. Eq resp
=> (model -> cmd -> (model, resp)) -> model -> Forest (cmd, resp) -> Bool
linearisable step0 model0 = any' (go model0)
where
go :: model -> Tree (cmd, resp) -> Bool
go model (Node (cmd, resp) ts) =
let
(model', resp') = step0 model cmd
in
resp == resp' && any' (go model') ts
any' :: (a -> Bool) -> [a] -> Bool
any' _p [] = True
any' p xs = any p xs
-- NOTE: Assumes that the server is running.
prop_concurrent :: Int -> Property
prop_concurrent port = mapSize (min 20) $
forAllConcProgram $ \(ConcProgram cmdss) -> monadicIO $ do
kvc <- run (newKeyValueClient port)
monitor (classifyCommandsLength (concat cmdss))
-- Rerun a couple of times, to avoid being lucky with the interleavings.
monitor (tabulate "Commands" (map constructorString (concat cmdss)))
monitor (tabulate "Number of concurrent commands" (map (show . length) cmdss))
replicateM_ 10 $ do
run (kvReset kvc)
queue <- run newTQueueIO
run (mapM_ (mapConcurrently (concExec queue kvc)) cmdss)
hist <- History <$> run (atomically (flushTQueue queue))
assertWithFail (linearisable step initModel (interleavings hist)) (prettyHistory hist)
where
constructorString :: Command -> String
constructorString WriteCmd {} = "Write"
constructorString ReadCmd {} = "Read"
assertWithFail :: Monad m => Bool -> String -> PropertyM m ()
assertWithFail condition msg = do
unless condition $
monitor (counterexample ("Failed: " ++ msg))
assert condition
classifyCommandsLength :: [cmd] -> Property -> Property
classifyCommandsLength cmds
= classify (length cmds == 0) "length commands: 0"
. classify (0 < length cmds && length cmds <= 10) "length commands: 1-10"
. classify (10 < length cmds && length cmds <= 50) "length commands: 11-50"
. classify (50 < length cmds && length cmds <= 100) "length commands: 51-100"
. classify (100 < length cmds && length cmds <= 200) "length commands: 101-200"
. classify (200 < length cmds && length cmds <= 500) "length commands: 201-500"
. classify (500 < length cmds) "length commands: >501"
prettyHistory :: (Show cmd, Show resp) => History' cmd resp -> String
prettyHistory = show
main :: IO ()
main = defaultMain tests
tests :: TestTree
tests = testGroup "Key-value store"
[ withResource (async (keyValueMain 8080)) cancel
(\_ -> testProperty "sequential" (noShrinking (prop_sequential 8080)))
, withResource (async (keyValueMain 8081)) cancel
(\_ -> testProperty "concurrent" (prop_concurrent 8081))
]
| null | https://raw.githubusercontent.com/stevana/coroutine-state-machines/c832a54415d5e91b93fa1c6cc677d757bd4b2c76/test/Main.hs | haskell | ----------------------------------------------------------------------
NOTE: Assumes that the server is running.
Adds some entropy to the possible interleavings.
NOTE: Assumes that the server is running.
Rerun a couple of times, to avoid being lucky with the interleavings. | # LANGUAGE DeriveFoldable #
# LANGUAGE DeriveFunctor #
# LANGUAGE DerivingStrategies #
# LANGUAGE ScopedTypeVariables #
module Main where
import Control.Concurrent
import Control.Concurrent.Async
import Control.Concurrent.STM
import Control.Concurrent.STM.TQueue
import Control.Monad
import Control.Monad.IO.Class
import Data.List (permutations)
import Data.Map (Map)
import qualified Data.Map as Map
import Data.Tree
import System.Random
import Test.QuickCheck
import Test.QuickCheck.Monadic
import Test.Tasty
import Test.Tasty.QuickCheck
import KeyValueApp (keyValueMain)
import KeyValueClient
data Command = WriteCmd String Int | ReadCmd String
deriving stock (Eq, Show)
data Response = Unit () | MaybeInt (Maybe Int)
deriving stock (Eq, Show)
type Model = Map String Int
initModel :: Model
initModel = Map.empty
step :: Model -> Command -> (Model, Response)
step m cmd = case cmd of
WriteCmd k v -> Unit <$> fakeWrite m k v
ReadCmd k -> MaybeInt <$> fakeRead m k
where
fakeWrite :: Model -> String -> Int -> (Model, ())
fakeWrite m k v = (Map.insert k v m, ())
fakeRead :: Model -> String -> (Model, Maybe Int)
fakeRead m k = (m, m Map.!? k)
newtype Program = Program [Command]
deriving stock Show
genProgram :: Model -> Gen Program
genProgram _m = Program <$> listOf genCommand
genCommand :: Gen Command
genCommand = oneof [WriteCmd <$> genKey <*> arbitrary, ReadCmd <$> genKey]
where
genKey :: Gen String
genKey = elements ["a", "b", "c"]
shrinkProgram :: Program -> [Program]
shrinkProgram (Program cmds) = [ Program cmds' | cmds' <- shrinkList shrinkCommand cmds ]
shrinkCommand :: Command -> [Command]
shrinkCommand (WriteCmd k v) = [ WriteCmd k v' | v' <- shrink v ]
shrinkCommand (ReadCmd k) = []
exec :: KeyValueClient -> Command -> IO Response
exec c cmd = case cmd of
WriteCmd k v -> Unit <$> kvWrite c k v
ReadCmd k -> MaybeInt <$> kvRead c k
type Trace = [Step]
data Step = Step
{ sModelBefore :: Model
, sCommand :: Command
, sResponse :: Response
, sModelAfter :: Model
}
showTrace :: Trace -> String
showTrace ss0 = "\n\n" ++ go ss0
where
go [] = ""
go (Step m cmd resp m' : ss) = show m ++ "\n == " ++ show cmd ++ " ==> " ++ show resp ++ "\n" ++ show m' ++ "\n\n" ++ go ss
coverage :: Trace -> Property -> Property
coverage hist = classifyLength hist
where
classifyLength xs = classify (length xs == 0) "0 length"
. classify (0 < length xs && length xs <= 10) "1-10 length"
. classify (10 < length xs && length xs <= 50) "11-50 length"
. classify (50 < length xs && length xs <= 100) "51-100 length"
. classify (100 < length xs && length xs <= 300) "101-300 length"
. classify (300 < length xs && length xs <= 500) "301-500 length"
prop_sequential :: Int -> Property
prop_sequential port = forallPrograms $ \prog -> monadicIO $ do
kvc <- run (newKeyValueClient port)
run (kvReset kvc)
let m = initModel
(mce, hist) <- runProgram kvc m prog
monitor (coverage hist)
case mce of
Nothing -> return True
Just ce -> do
monitor (counterexample (showTrace hist))
monitor (counterexample ce)
return False
forallPrograms :: (Program -> Property) -> Property
forallPrograms p =
forAllShrink (genProgram initModel) shrinkProgram p
runProgram :: MonadIO m => KeyValueClient -> Model -> Program -> m (Maybe String, Trace)
runProgram c0 m0 (Program cmds0) = go c0 m0 [] cmds0
where
go _c _m hist [] = return (Nothing, reverse hist)
go c m hist (cmd : cmds) = do
resp <- liftIO (exec c cmd)
let (m', resp') = step m cmd
if resp == resp'
then go c m' (Step m cmd resp m' : hist) cmds
else return (Just (show resp ++ " /= " ++ show resp'), reverse (Step m cmd resp m' : hist))
newtype ConcProgram = ConcProgram { unConcProgram :: [[Command]] }
deriving stock Show
forAllConcProgram :: (ConcProgram -> Property) -> Property
forAllConcProgram k =
forAllShrinkShow (genConcProgram m) (shrinkConcProgram m) prettyConcProgram k
where
m = initModel
genConcProgram :: Model -> Gen ConcProgram
genConcProgram m0 = sized (go m0 [])
where
go :: Model -> [[Command]] -> Int -> Gen ConcProgram
go m acc sz | sz <= 0 = return (ConcProgram (reverse acc))
| otherwise = do
n <- chooseInt (2, 5)
cmds <- vectorOf n genCommand `suchThat` concSafe m
go (advanceModel m cmds) (cmds : acc) (sz - n)
advanceModel :: Model -> [Command] -> Model
advanceModel m cmds = foldl (\ih cmd -> fst (step ih cmd)) m cmds
concSafe :: Model -> [Command] -> Bool
concSafe m = all (validProgram m) . permutations
validProgram :: Model -> [Command] -> Bool
validProgram _model _cmds = True
validConcProgram :: Model -> ConcProgram -> Bool
validConcProgram m0 (ConcProgram cmdss0) = go m0 True cmdss0
where
go :: Model -> Bool -> [[Command]] -> Bool
go _m False _ = False
go _m acc [] = acc
go m _acc (cmds : cmdss) = go (advanceModel m cmds) (concSafe m cmds) cmdss
shrinkConcProgram :: Model -> ConcProgram -> [ConcProgram]
shrinkConcProgram m
= filter (validConcProgram m)
. map ConcProgram
. filter (not . null)
. shrinkList (shrinkList shrinkCommand)
. unConcProgram
prettyConcProgram :: ConcProgram -> String
prettyConcProgram = show
newtype History' cmd resp = History [Operation' cmd resp]
deriving stock (Show, Functor, Foldable)
type History = History' Command Response
newtype Pid = Pid Int
deriving stock (Eq, Ord, Show)
data Operation' cmd resp
= Invoke Pid cmd
| Ok Pid resp
deriving stock (Show, Functor, Foldable)
type Operation = Operation' Command Response
toPid :: ThreadId -> Pid
toPid tid = Pid (read (drop (length ("ThreadId " :: String)) (show tid)))
appendHistory :: TQueue (Operation' cmd resp) -> Operation' cmd resp -> IO ()
appendHistory hist op = atomically (writeTQueue hist op)
concExec :: TQueue Operation -> KeyValueClient -> Command -> IO ()
concExec queue kvc cmd = do
pid <- toPid <$> myThreadId
appendHistory queue (Invoke pid cmd)
sleep <- randomRIO (0, 5)
threadDelay sleep
resp <- exec kvc cmd
atomically (writeTQueue queue (Ok pid resp))
interleavings :: History' cmd resp -> Forest (cmd, resp)
interleavings (History []) = []
interleavings (History ops0) =
[ Node (cmd, resp) (interleavings (History ops'))
| (tid, cmd) <- takeInvocations ops0
, (resp, ops') <- findResponse tid
(filter1 (not . matchInvocation tid) ops0)
]
where
takeInvocations :: [Operation' cmd resp] -> [(Pid, cmd)]
takeInvocations [] = []
takeInvocations ((Invoke pid cmd) : ops) = (pid, cmd) : takeInvocations ops
takeInvocations ((Ok _pid _resp) : _) = []
findResponse :: Pid -> [Operation' cmd resp] -> [(resp, [Operation' cmd resp])]
findResponse _pid [] = []
findResponse pid ((Ok pid' resp) : ops) | pid == pid' = [(resp, ops)]
findResponse pid (op : ops) =
[ (resp, op : ops') | (resp, ops') <- findResponse pid ops ]
matchInvocation :: Pid -> Operation' cmd resp -> Bool
matchInvocation pid (Invoke pid' _cmd) = pid == pid'
matchInvocation _ _ = False
filter1 :: (a -> Bool) -> [a] -> [a]
filter1 _ [] = []
filter1 p (x : xs) | p x = x : filter1 p xs
| otherwise = xs
linearisable :: forall model cmd resp. Eq resp
=> (model -> cmd -> (model, resp)) -> model -> Forest (cmd, resp) -> Bool
linearisable step0 model0 = any' (go model0)
where
go :: model -> Tree (cmd, resp) -> Bool
go model (Node (cmd, resp) ts) =
let
(model', resp') = step0 model cmd
in
resp == resp' && any' (go model') ts
any' :: (a -> Bool) -> [a] -> Bool
any' _p [] = True
any' p xs = any p xs
prop_concurrent :: Int -> Property
prop_concurrent port = mapSize (min 20) $
forAllConcProgram $ \(ConcProgram cmdss) -> monadicIO $ do
kvc <- run (newKeyValueClient port)
monitor (classifyCommandsLength (concat cmdss))
monitor (tabulate "Commands" (map constructorString (concat cmdss)))
monitor (tabulate "Number of concurrent commands" (map (show . length) cmdss))
replicateM_ 10 $ do
run (kvReset kvc)
queue <- run newTQueueIO
run (mapM_ (mapConcurrently (concExec queue kvc)) cmdss)
hist <- History <$> run (atomically (flushTQueue queue))
assertWithFail (linearisable step initModel (interleavings hist)) (prettyHistory hist)
where
constructorString :: Command -> String
constructorString WriteCmd {} = "Write"
constructorString ReadCmd {} = "Read"
assertWithFail :: Monad m => Bool -> String -> PropertyM m ()
assertWithFail condition msg = do
unless condition $
monitor (counterexample ("Failed: " ++ msg))
assert condition
classifyCommandsLength :: [cmd] -> Property -> Property
classifyCommandsLength cmds
= classify (length cmds == 0) "length commands: 0"
. classify (0 < length cmds && length cmds <= 10) "length commands: 1-10"
. classify (10 < length cmds && length cmds <= 50) "length commands: 11-50"
. classify (50 < length cmds && length cmds <= 100) "length commands: 51-100"
. classify (100 < length cmds && length cmds <= 200) "length commands: 101-200"
. classify (200 < length cmds && length cmds <= 500) "length commands: 201-500"
. classify (500 < length cmds) "length commands: >501"
prettyHistory :: (Show cmd, Show resp) => History' cmd resp -> String
prettyHistory = show
main :: IO ()
main = defaultMain tests
tests :: TestTree
tests = testGroup "Key-value store"
[ withResource (async (keyValueMain 8080)) cancel
(\_ -> testProperty "sequential" (noShrinking (prop_sequential 8080)))
, withResource (async (keyValueMain 8081)) cancel
(\_ -> testProperty "concurrent" (prop_concurrent 8081))
]
|
0fd99cedf38d2e4c3c88152e8387d60b5af9956e732e4914ea862fcf75a2634b | Eonblast/Scalaxis | api_dht.erl | 2007 - 2011 Zuse Institute Berlin
Licensed under the Apache License , Version 2.0 ( the " License " ) ;
% you may not use this file except in compliance with the License.
% You may obtain a copy of the License at
%
% -2.0
%
% Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an " AS IS " BASIS ,
% WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
% See the License for the specific language governing permissions and
% limitations under the License.
@author < >
%% @doc API for access to the non replicated DHT items.
%% @end
%% @version $Id$
-module(api_dht).
-author('').
-vsn('$Id$').
| null | https://raw.githubusercontent.com/Eonblast/Scalaxis/10287d11428e627dca8c41c818745763b9f7e8d4/src/api_dht.erl | erlang | you may not use this file except in compliance with the License.
You may obtain a copy of the License at
-2.0
Unless required by applicable law or agreed to in writing, software
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@doc API for access to the non replicated DHT items.
@end
@version $Id$ | 2007 - 2011 Zuse Institute Berlin
Licensed under the Apache License , Version 2.0 ( the " License " ) ;
distributed under the License is distributed on an " AS IS " BASIS ,
@author < >
-module(api_dht).
-author('').
-vsn('$Id$').
|
ed6fa8b5656e8b71a7d6d95e30061a6f0fa303248dd3c9b6d849eeab6b9d277f | ejgallego/coq-serapi | ser_xml_datatype.mli | (************************************************************************)
v * The Coq Proof Assistant / The Coq Development Team
< O _ _ _ , , * INRIA - CNRS - LIX - LRI - PPS - Copyright 1999 - 2016
\VV/ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
(* // * This file is distributed under the terms of the *)
(* * GNU Lesser General Public License Version 2.1 *)
(************************************************************************)
(************************************************************************)
(* Coq serialization API/Plugin *)
Copyright 2016 MINES ParisTech
(************************************************************************)
(* Status: Very Experimental *)
(************************************************************************)
open Sexplib
type 'a gxml = 'a Xml_datatype.gxml
val gxml_of_sexp : (Sexp.t -> 'a) -> Sexp.t -> 'a gxml
val sexp_of_gxml : ('a -> Sexp.t) -> 'a gxml -> Sexp.t
val gxml_of_yojson : (Yojson.Safe.t -> ('a, string) Result.result) -> Yojson.Safe.t -> ('a gxml, string) Result.result
val gxml_to_yojson : ('a -> Yojson.Safe.t) -> 'a gxml -> Yojson.Safe.t
type xml = Xml_datatype.xml
val xml_of_sexp : Sexp.t -> xml
val sexp_of_xml : xml -> Sexp.t
val xml_of_yojson : Yojson.Safe.t -> (xml, string) Result.result
val xml_to_yojson : xml -> Yojson.Safe.t
| null | https://raw.githubusercontent.com/ejgallego/coq-serapi/61d2a5c092c1918312b8a92f43a374639d1786f9/serlib/ser_xml_datatype.mli | ocaml | **********************************************************************
// * This file is distributed under the terms of the
* GNU Lesser General Public License Version 2.1
**********************************************************************
**********************************************************************
Coq serialization API/Plugin
**********************************************************************
Status: Very Experimental
********************************************************************** | v * The Coq Proof Assistant / The Coq Development Team
< O _ _ _ , , * INRIA - CNRS - LIX - LRI - PPS - Copyright 1999 - 2016
\VV/ * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
Copyright 2016 MINES ParisTech
open Sexplib
type 'a gxml = 'a Xml_datatype.gxml
val gxml_of_sexp : (Sexp.t -> 'a) -> Sexp.t -> 'a gxml
val sexp_of_gxml : ('a -> Sexp.t) -> 'a gxml -> Sexp.t
val gxml_of_yojson : (Yojson.Safe.t -> ('a, string) Result.result) -> Yojson.Safe.t -> ('a gxml, string) Result.result
val gxml_to_yojson : ('a -> Yojson.Safe.t) -> 'a gxml -> Yojson.Safe.t
type xml = Xml_datatype.xml
val xml_of_sexp : Sexp.t -> xml
val sexp_of_xml : xml -> Sexp.t
val xml_of_yojson : Yojson.Safe.t -> (xml, string) Result.result
val xml_to_yojson : xml -> Yojson.Safe.t
|
e8d5f87bda43d530c93bad52802262343e30265bb8f4ff288e69cc3939c19d70 | albertoruiz/easyVision | combi2.hs | import EasyVision
drift r a b = r .* a |+| (1-r) .* b
main = run $ camera
~> float . grayscale
~~> scanl1 (drift 0.9)
>>= observe "drift" id
| null | https://raw.githubusercontent.com/albertoruiz/easyVision/26bb2efaa676c902cecb12047560a09377a969f2/projects/old/tutorial/combi2.hs | haskell | import EasyVision
drift r a b = r .* a |+| (1-r) .* b
main = run $ camera
~> float . grayscale
~~> scanl1 (drift 0.9)
>>= observe "drift" id
| |
e0ff5b9066928a687035c161f577954180b38b38e2ba6963478d1aa965d1e1ca | hipsleek/hipsleek | minisat.ml | #include "xdebug.cppo"
open VarGen
open Globals
open GlobProver
open Gen.Basic
open Cpure
(* open Rtc_new_stable *)
(* open Rtc_new_algorithm *)
open Rtc_algorithm
module StringSet = Set.Make(String)
(* Global settings *)
let minisat_timeout_limit = 15.0
let test_number = ref 0
let last_test_number = ref 0
let minisat_restart_interval = ref (-1)
let log_all_flag = ref false
let is_minisat_running = ref false
default timeout is 15 seconds
let minisat_call_count: int ref = ref 0
let log_file = open_log_out ("allinput.minisat")
valid value is : " file " or " stdin "
(*minisat*)
let minisat_path = "/usr/local/bin/minisat"
let minisat_name = "minisat"
let minisat_arg = "-pre"(*"-pre"*)
let minisat_path_crypt = "/home/bachle/improve_rtc_algo/sleekex"
let minisat_name_crypt = "cryptominisat"
let minisat_arg_crypt = "--no-simplify --nosatelite --gaussuntil=3"
" /home / bachle / improve_rtc_algo / sleekex / cryptominisat "
let minisat_name2 = ( "
let minisat_name2 = (*"cryptominisat"*)"minisat"
let minisat_arg2 = ""(*"-pre"*)
let eq_path = "equality_logic"
let eq_name = "equality_logic"
let eq_arg = "equality_logic"
let minisat_input_format = "cnf" (* valid value is: cnf *)
let number_clauses = ref 1
let number_vars = ref 0
let len=1000
let bcl= ref [ ]
let sat= ref true
let minisat_process = ref { name = "minisat";
pid = 0;
inchannel = stdin;
outchannel = stdout;
errchannel = stdin
}
(***************************************************************
TRANSLATE CPURE FORMULA TO PROBLEM IN CNF FORMAT
**************************************************************)
(*minisat*)
let de_morgan f=match f with
|Not (And(f1,f2,_),l1,l2)-> Or(Not(f1,l1,l2), Not (f2,l1,l2),l1,l2)
|Not (Or(f1,f2,_,_),l1,l2)-> And(Not(f1,l1,l2),Not(f2,l1,l2),l2)
|_->f
let double_negative f= match f with
|Not (Not(f1,_,_),_,_)->f1
|_->f
let minisat_cnf_of_spec_var sv = let ident=Cpure.name_of_spec_var sv in ident
let rec minisat_of_exp e0 = match e0 with
| Null _ -> "null_var"
| Var (sv, _) -> minisat_cnf_of_spec_var sv
| IConst (i, _) -> string_of_int i
| AConst (i, _) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Add (a1, a2, _) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Subtract (a1, a2, _) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Mult (a1, a2, l) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Div (a1, a2, l) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Max _
| Min _ -> illegal_format ("eq_logic.eq_logic_of_exp: min/max should not appear here")
| TypeCast _ -> illegal_format ("eq_logic.eq_logic_of_exp: TypeCast should not appear here")
| FConst _ -> illegal_format ("eq_logic.eq_logic_of_exp: FConst")
| Func _ -> "0" (* TODO: Need to handle *)
| _ -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
(*-------------------------------Functions are used for generating cnf of CNF formula--------------------*)
(* let addBooleanConst v = *)
let _ = print_endline ( " length of string_of_int ( ! ) ) in ( * Debug - bach
let index= ref 0 in
(* begin *)
for i=0 to ( List.length ! do
(* ( *)
if v=(List.nth ! bcl i ) then ( index:=i+len )
(* ) *)
(* done; *)
(* if(!index>0) then string_of_int !index *)
else let _ = bcl:= ! bcl@[v ] in ( string_of_int ( ( List.length ! bcl)+len-1 ) )
(* end *)
let minisat_cnf_of_p_formula (pf : Cpure.p_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t) =
match pf with
| Frm (sv, _) -> ""
| LexVar _ -> ""
let _ = print_endline ( " minisat_cnf_of_p_formula_for_helper BConst EXIT ! " ) in
WN : weakening
| BVar (sv, pos) ->
let _= x_binfo_hp (add_str "minisat_cnf_of_p_formula Bvar" minisat_cnf_of_spec_var) sv pos
in ""
| Lt _ -> ""
| Lte _ -> ""
| Gt _ -> ""
| Gte _ -> ""
| SubAnn _ -> ""
| Eq (e1, e2, _) ->
(*Handle here*)let li=minisat_of_exp e1 and ri=minisat_of_exp e2 in
(* let () = print_endline("minisat of e1: "^li^" minisat of e2: "^ri) in *)
(* if(li=ri) then *)
(* begin *)
(* let index=addBooleanConst (li) in index *)
(* end *)
(* else(*add xx to the set of boolean constants *) *)
let eq_edge=G.E.create li () ri in
let _= G.add_edge_e ge eq_edge in
let mem = Glabel.mem_edge allvars in
(* let _=if(mem=false)then *)
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
let lr=get_var li ri allvars in
lr
| Neq (e1, e2, _) -> (*Handle here*)let li=minisat_of_exp e1 and ri=minisat_of_exp e2 in
(* if(li=ri) then (let index=addBooleanConst (li) in ("-"^index)) *)
(* else(*add xx to the set of boolean constants *) *)
let diseq_edge=G.E.create li () ri in
let _= G.add_edge_e gd diseq_edge in
let mem = Glabel.mem_edge allvars in
(* let _=if(mem=false)then *)
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
in "-"^lr
| EqMax _ -> ""
| EqMin _ -> ""
(* bag formulas *)
| BagIn _
| BagNotIn _
| BagSub _
| BagMin _
| BagMax _ -> ""
(* list formulas *)
| ListIn _
| ListNotIn _
| ListAllN _
| ListPerm _
| RelForm _ -> ""
(* | VarPerm _ -> Error.report_no_pattern () *)
let minisat_cnf_of_b_formula (bf : Cpure.b_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t)=
match bf with
| (pf, _) -> minisat_cnf_of_p_formula pf allvars ge gd
let minisat_cnf_of_not_of_p_formula (pf : Cpure.p_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t) =
match pf with
| Frm _ -> ""
| LexVar _ -> ""
| BConst (c, _) -> (*let _=print_endline ("minisat_cnf_of_not_of_p_formula_for_helper BConst EXIT!") in*) ""
| BVar (sv, _) -> (*let _=print_endline ("minisat_cnf_of_not_of_p_formula_for_helper Bvar EXIT!") in*) ""
| Lt _ -> ""
| Lte _ -> ""
| Gt _ -> ""
| Gte _ -> ""
| SubAnn _ -> ""
| Eq (e1, e2, _) -> (*Handle here*)let li=minisat_of_exp e1 and ri=minisat_of_exp e2 in
if(li = ri ) then
(* begin *)
(* let index=addBooleanConst (li) in ("-"^index)(*add -xx to the set of boolean constants *) *)
(* end *)
(* else *)
let diseq_edge=G.E.create li () ri in
let _= G.add_edge_e gd diseq_edge in
let mem = Glabel.mem_edge allvars in
(* let _=if(mem=false)then *)
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
"-"^lr
| Neq (e1, e2, _) -> (*Handle here*)let li=minisat_of_exp e1 and ri=minisat_of_exp e2 in
(* if(li=ri) then (let index=addBooleanConst li in index ) (*add xx to the set of boolean constants *) *)
(* else *)
let eq_edge=G.E.create li () ri in
let _= G.add_edge_e ge eq_edge in
let mem = Glabel.mem_edge allvars in
(* let _=if(mem=false)then *)
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
lr
| EqMax _ -> ""
| EqMin _ -> ""
(* bag formulas *)
| BagIn _
| BagNotIn _
| BagSub _
| BagMin _
| BagMax _ -> ""
(* list formulas *)
| ListIn _
| ListNotIn _
| ListAllN _
| ListPerm _
| RelForm _ -> ""
| XPure _ (* | VarPerm _ *) -> Error.report_no_pattern ()
let minisat_cnf_of_not_of_b_formula (bf : Cpure.b_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t) =
match bf with
| (pf, _) -> minisat_cnf_of_not_of_p_formula pf allvars ge gd
(*----------------------------------Functions are used for generating T-----------------------------------*)
(*---------------------------------------CNF conversion here-----------------------------------*)
let return_pure bf f= match bf with
| (pf,_)-> match pf with
| Eq _ -> f
| Neq _ -> f
| Frm _ -> f
| BConst(a,_)->f (*let _=if(a) then print_endline ("TRUE") else print_endline ("FALSE") in*)
| BVar(_,_)->f
| XPure _ | LexVar _ | Lt _ | Lte _ | Gt _ | Gte _ | SubAnn _ | EqMax _ | EqMin _ | BagIn _ | BagNotIn _ | BagSub _
| BagMin _ | BagMax _ (* | VarPerm _ *) | ListIn _ | ListNotIn _ | ListAllN _ | ListPerm _ | RelForm _ -> Error.report_no_pattern ()
For converting to NNF -- no need??--
let rec minisat_cnf_of_formula f =
match f with
| BForm (b, _) -> (*return_pure b *)f
| And (f1, f2, l1) -> And(minisat_cnf_of_formula f1,minisat_cnf_of_formula f2,l1)
| Or (f1, f2, l1, l2) -> Or(minisat_cnf_of_formula f1,minisat_cnf_of_formula f2,l1,l2)
| Not (BForm(b,_), _, _) -> return_pure b f
| _ -> minisat_cnf_of_formula (de_morgan (double_negative f));;
(*let rec cnf_to_string f = *)
(* match f with *)
(* |BForm (b,_)-> minisat_cnf_of_b_formula b *)
|Not ( f1,_,_)->"-"^cnf_to_string f1
( f1 , f2 , _ ) - > " ( " ^(cnf_to_string f1)^"&"^(cnf_to_string f2)^ " ) "
(* |Or (f1, f2, _, _)->"("^(cnf_to_string f1)^"v"^(cnf_to_string f2)^")"*)
(* let incr_cls= number_clauses:=1 + !number_clauses *)
(*let rec cnf_to_string_to_file f (map: spec_var list)= *)
(* match f with *)
(* |BForm (b,_)-> let var=minisat_cnf_of_b_formula b map in check_inmap var map *)
|Not ( f1,_,_)->"-"^cnf_to_string_to_file f1 map
( f1 , f2 , _ ) - > let _ = incr_cls in ( cnf_to_string_to_file f1 map)^ " 0"^"\n"^(cnf_to_string_to_file f2 map )
(* |Or (f1, f2, _, _)-> (cnf_to_string_to_file f1 map)^" "^(cnf_to_string_to_file f2 map) *)
For CNF conversion
let unsat_in_cnf (bf : Cpure.b_formula) =
match bf with
| (pf, _) -> match pf with
| Neq(e1,e2,_)->let li=minisat_of_exp e1 and ri=minisat_of_exp e2 in
| _->()
let rec has_and f =
match f with
|BForm _ -> false
|And(_,_,_)->true
|Or(f1,f2,_,_) -> if(has_and f1) then true else if (has_and f2) then true else false
| _->false
and is_cnf_old2 f =
match f with
| BForm _ -> true
| Or (f1,f2,_,_)-> if(has_and f1) then false else if (has_and f2) then false else true
| And (BForm(b,_),f2,_)->let _=unsat_in_cnf b in if(!sat=true) then is_cnf f2 else true
| And (f1,BForm(b,_),_)->let _=unsat_in_cnf b in if(!sat=true) then is_cnf f1 else true
| And (f1,f2,_)-> if(is_cnf f1) then is_cnf f2 else false
| AndList _ | Not _ | Forall _ | Exists _ -> Error.report_no_pattern ()
and is_cnf_old1 f = (*Should use heuristic in CNF*)
match f with
| BForm _ -> true
| Or (f1,f2,_,_)-> if(has_and f1) then false else if (has_and f2) then false else true
| And (BForm(b,_),f2,_)->is_cnf f2
| And (f1,BForm(b,_),_)->is_cnf f1
| And (f1,f2,_)-> if(is_cnf f1) then is_cnf f2 else false
| AndList _ | Not _ | Forall _ | Exists _ -> Error.report_no_pattern()
and is_cnf f = (*Should use heuristic in CNF*)
match f with
| BForm _ -> true
| Or (f1,f2,_,_)-> if(has_and f1) then false else if (has_and f2) then false else true
| And (BForm(b,_),f2,_)->is_cnf f2
| And (f1,BForm(b,_),_)->is_cnf f1
| And (f1,f2,_)-> if(is_cnf f1) then is_cnf f2 else false
| _-> let _=print_endline_quiet ("CNF conv here: "^Cprinter.string_of_pure_formula f) in true
distributive law 1 - ( f & k ) v ( g & h ) - > ( f v g ) & ( f v h ) & ( k v g ) & ( k v h )
let dist_1 f =
using heuristic for the first one
And(Or(f1 , f2,l1,l2 ) , Or(f1 , f3,l1,l2),l2 )
, f2 , _ ) , And(f3 , f4,_),l1,l2 ) ->And(And(Or(f1 , f3,l1,l2 ) , Or(f1 , f4,l1,l2),l2 ) , And(Or(f2 , f3,l1,l2 ) , Or(f2 , )
| Or(And(f2, f3,_), f1,l1,l2) -> And(Or(f1, f2,l1,l2), Or(f1, f3,l1,l2),l2)
| _ -> f
let dist_no_slicing f =
match f with
| Or(f1, And(f2, f3,_),l1,l2) -> And(Or(f1, f2,l1,l2), Or(f1, f3,l1,l2),l2) (*The main here- when using slicing*)
, f2 , _ ) , And(f3 , f4,_),l1,l2 ) ->And(And(Or(f1 , f3,l1,l2 ) , Or(f1 , f4,l1,l2),l2 ) , And(Or(f2 , f3,l1,l2 ) , Or(f2 , )
| Or(And(f2, f3,_), f1,l1,l2) -> And(Or(f1, f2,l1,l2), Or(f1, f3,l1,l2),l2)
| _ -> f
let rec nnf_to_xxx f rule =
let nf = match f with
BForm (b,_) -> return_pure b f
| Not (f1,l1,l2) -> Not ((nnf_to_xxx f1 rule),l1,l2)
| And (f1, f2,l1) -> And (nnf_to_xxx f1 rule, nnf_to_xxx f2 rule,l1)
| Or (f1, f2,l1,l2) -> Or (nnf_to_xxx f1 rule, nnf_to_xxx f2 rule,l1,l2)
| Exists (_,f1,_,_) -> nnf_to_xxx f1 rule
(* let _=print_endline ("CNF form: "^Cprinter.string_of_pure_formula f1) in let _= print_endline ("[minisat.ml exit 0] Please use the option '--enable-slicing'") in exit 0 *)
(* | Exists _ -> *)
| AndList _ | Forall _ -> Error.report_no_pattern()
in
rule nf
let nnf_to_cnf f= nnf_to_xxx f dist_1
let nnf_to_cnf_no_slicing f= nnf_to_xxx f dist_no_slicing
(*let to_cnf f = nnf_to_cnf (minisat_cnf_of_formula f)*)
The old CNF conversion
let rec to_cnf f =
let cnf_form=(nnf_to_cnf_no_slicing f) in
if(is_cnf cnf_form) then cnf_form else to_cnf cnf_form(*(to_cnf cnf_form)*)
in
let _=print_endline_quiet ("CNF form: "^Cprinter.string_of_pure_formula res) in
res
let to_cnf_no_slicing f=
let _=print_endline_quiet ("Orig: "^Cprinter.string_of_pure_formula f) in
let nnf= minisat_cnf_of_formula f in
let _=print_endline_quiet ("NNF here: "^Cprinter.string_of_pure_formula nnf) in
to_cnf nnf
The no need CNF conversion adapt to slicing , we just need the distributive law
(* let minisat_cnf_of_formula f = *)
Debug.no_1 " minisat_of_formula " Cprinter.string_of_pure_formula pr_id minisat_cnf_of_formula f
bach - minisat
(*************************************************************)
(* Check whether minisat can handle the expression, formula... *)
let rec can_minisat_handle_expression (exp: Cpure.exp) : bool =
match exp with
| Cpure.Null _ -> false
| Cpure.Var _ -> false
| Cpure.IConst _ -> false
| Cpure.FConst _ -> false
| Cpure.AConst _ -> false
| Cpure.NegInfConst _
| Cpure.InfConst _ -> false
(* arithmetic expressions *)
| Cpure.Add _
| Cpure.Subtract _
| Cpure.Mult _
| Cpure.Div _
| Cpure.Max _
| Cpure.Min _
| Cpure.TypeCast _ -> false
(* bag expressions *)
| Cpure.Bag _
| Cpure.BagUnion _
| Cpure.BagIntersect _
| Cpure.BagDiff _ -> false
(* list expressions *)
| Cpure.List _
| Cpure.ListCons _
| Cpure.ListHead _
| Cpure.ListTail _
| Cpure.ListLength _
| Cpure.ListAppend _
| Cpure.ListReverse _ -> false
(* array expressions *)
| Cpure.ArrayAt _ -> false
| Cpure.Func _ -> false
| Cpure.Template _ -> false
| Cpure.Level _
| Cpure.Tsconst _ -> Error.report_no_pattern()
| Cpure.Tup2 _ -> Error.report_no_pattern()
| Cpure.Bptriple _ -> Error.report_no_pattern()
and can_minisat_handle_p_formula (pf : Cpure.p_formula) : bool =
match pf with
| Frm _ -> false
| LexVar _ -> false
| BConst (a,_) -> true (*true*)
| BVar _ -> false (*true*)
| Lt _ -> false
| Lte _ -> false
| Gt _ -> false
| Gte _ -> false
| SubAnn (ex1, ex2, _) -> false
| Eq (ex1, ex2, _) -> true
| Neq (ex1, ex2, _) -> true
| EqMax _ -> false
| EqMin _ -> false
(* bag formulars *)
| BagIn _
| BagNotIn _
| BagSub _
| BagMin _
| BagMax _ -> false
(* list formulas *)
| ListIn _
| ListNotIn _
| ListAllN _
| ListPerm _
| RelForm _ -> false
| XPure _ (* | VarPerm _ *) -> Error.report_no_pattern()
and can_minisat_handle_b_formula (bf : Cpure.b_formula) : bool =
match bf with
| (pf, _) -> can_minisat_handle_p_formula pf
and can_minisat_handle_formula (f: Cpure.formula) : bool =
match f with
| BForm (bf, _) -> can_minisat_handle_b_formula bf
| And (f1, f2, _) -> (can_minisat_handle_formula f1) && (can_minisat_handle_formula f2)
| Or (f1, f2, _, _) -> (can_minisat_handle_formula f1) && (can_minisat_handle_formula f2)
| Not (f, _, _) -> can_minisat_handle_formula f
| Forall (_, f, _, _) -> can_minisat_handle_formula f
| Exists (_, f, _, _) -> can_minisat_handle_formula f
| AndList _ -> Error.report_no_pattern()
(***************************************************************
INTERACTION
**************************************************************)
let rec collect_output (chn: in_channel) : (string * bool) =
try
let line = input_line chn in
(* let () = print_endline (" -- output: " ^ line) in *)
if line = "SATISFIABLE" then
(line, true)
else if (line = "c SAT") then
("SATISFIABLE",true)
else
collect_output chn
with
| End_of_file -> ("", false)
read the output stream of minisat prover , return ( conclusion * reason )
(* TODO: this function need to be optimized *)
let get_prover_result (output : string) :bool =
if !Globals.print_original_solver_output then
begin
print_endline_quiet "MINISAT OUTPUT";
print_endline_quiet "--------------";
print_endline_quiet output;
print_endline_quiet "--------------";
end;
let validity =
if (output="SATISFIABLE") then
(* let _=print_endline output in*)
true
else
(* let _=print_endline output in*)
false in
validity
(* output: - prover_output
- the running status of prover: true if running, otherwise false *)
let get_answer (chn: in_channel) : (bool * bool)=
let (output, running_state) = collect_output chn in
let
validity_result = get_prover_result output;
in
(validity_result, running_state)
let remove_file filename =
try Sys.remove filename;
with e -> ignore e
let set_process (proc: prover_process_t) =
minisat_process := proc
let start () =
if not !is_minisat_running then (
print_endline_quiet ("Starting minisat... \n");
last_test_number := !test_number;
let prelude () = () in
if (minisat_input_format = "cnf") then (
Procutils.PrvComms.start !log_all_flag log_file (minisat_name, minisat_path, [|minisat_arg|]) set_process prelude;
is_minisat_running := true;
)
)
(* stop minisat system *)
let stop () =
if !is_minisat_running then (
let num_tasks = !test_number - !last_test_number in
print_string_if !Globals.enable_count_stats ("\nStop minisat... " ^ (string_of_int !minisat_call_count) ^ " invocations "); flush stdout;
let () = Procutils.PrvComms.stop !log_all_flag log_file !minisat_process num_tasks Sys.sigkill (fun () -> ()) in
is_minisat_running := false;
)
restart Omega system
let restart reason =
if !is_minisat_running then (
let () = print_string_if !Globals.enable_count_stats (reason ^ " Restarting minisat after ... " ^ (string_of_int !minisat_call_count) ^ " invocations ") in
Procutils.PrvComms.restart !log_all_flag log_file reason "minisat" start stop
)
else (
let () = print_string_if !Globals.enable_count_stats (reason ^ " not restarting minisat ... " ^ (string_of_int !minisat_call_count) ^ " invocations ") in ()
)
(* Runs the specified prover and returns output *)
let check_problem_through_file (input: string) (timeout: float) : bool =
(* debug *)
(* let () = print_endline "** In function minisat.check_problem" in *)
let file_suffix = "bach_eq_minisat" in
let infile =(file_suffix) ^ ".cnf" in
(* let () = print_endline ("-- input: " ^ input^"\n") in *)
if !Globals.print_original_solver_input then
begin
print_endline_quiet "MINISAT INPUT";
print_endline_quiet "--------------";
print_endline_quiet input;
print_endline_quiet "--------------";
end;
let out_stream = open_out infile in
output_string out_stream input;
close_out out_stream;
let minisat_result="minisatres.txt" in
let set_process proc = minisat_process := proc in
let fnc () =
if (minisat_input_format = "cnf") then (
(* let tstartlog = Gen.Profiling.get_time () in *)
(* let ch = Unix.open_process_in "/usr/local/bin/ minisat22 bach_eq_minisat.cnf" in *)
(* let ch = Unix.execvp "/usr/local/bin/minisat22" [|"minisat22";"bach_eq_minisat.cnf"|] in *)
Procutils.PrvComms.start false stdout (minisat_name2, minisat_path2, [|minisat_arg2;infile;minisat_result|]) set_process (fun () -> ());
(* let status = Unix.close_process_in ch in *)
minisat_call_count := !minisat_call_count + 1;
let (prover_output, running_state) = get_answer !minisat_process.inchannel in
is_minisat_running := running_state;
let tstoplog = Gen. ( ) in
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
prover_output;
)
else illegal_format "[minisat.ml] The value of minisat_input_format is invalid!" in
let res =
try
let res = Procutils.PrvComms.maybe_raise_timeout fnc () timeout in
res
with _ -> ((* exception : return the safe result to ensure soundness *)
print_backtrace_quiet ();
print_endline_quiet ("WARNING: Restarting prover due to timeout");
Unix.kill !minisat_process.pid 9;
ignore (Unix.waitpid [] !minisat_process.pid);
false
)
in
let () = Procutils.PrvComms.stop false stdout !minisat_process 0 9 (fun () -> ()) in
let (*tstoplog*)_ = Gen.Profiling.get_time () in
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
remove_file infile;
res
let check_problem_through_file (input: string) (timeout: float) : bool =
Debug.no_1 "check_problem_through_file (minisat)"
(fun s -> s) string_of_bool
(fun f -> check_problem_through_file f timeout) input
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
FOR IMPLICATION / SATISFIABILITY CHECKING
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
GENERATE CNF INPUT FOR IMPLICATION / SATISFIABILITY CHECKING
**************************************************************)
(* minisat: output for cnf format *)
let rtc_generate_B (f:Cpure.formula) =
ge is eq graph and gd is diseq graph
(*let () = print_endline("INSIDE rtc_generate_B, f=="^Cprinter.string_of_pure_formula f) in*)
Aiming to get ge and gd and cnf string of the given CNF formula
match f with
|BForm (b,_)-> minisat_cnf_of_b_formula b gr_e ge gd
|And (f1, f2, _) ->cnf_to_string_to_file f1 ^" 0"^"\n"^ cnf_to_string_to_file f2
|Or (f1, f2, _, _)->cnf_to_string_to_file f1 ^" "^ cnf_to_string_to_file f2
|Not ((BForm(b,_)),_,_)-> minisat_cnf_of_not_of_b_formula b gr_e ge gd
| _->
let _=
x_tinfo_hp (add_str "imply Final Formula :" Cprinter.string_of_pure_formula) f no_pos
in ""
in
let cnf_str =cnf_to_string_to_file f in
(cnf_str,ge,gd,gr_e)
let get_cnf_from_cache ge gd gr_e=
let testRTC= new rTC in
let cache= testRTC#rtc_v2 ge gd gr_e !number_vars in
cache
let to_minisat_cnf (ante: Cpure.formula) =
(*let () = "** In function Spass.to_minisat_cnf" in*)
(*let _=print_endline ("imply Final Formula :" ^ (Cprinter.string_of_pure_formula ante))in*)
(*let () = read_line() in*)
(*let _=print_endline ("CNF Formula :" ^ (Cprinter.string_of_pure_formula (to_cnf ante)))in*)
(* let () = print_endline("INSIDE to_minisat_cnf"^Cprinter.string_of_pure_formula ante) in *)
let _= number_vars := 0 in
(* let _=Gen.Profiling.push_time("stat_CNF_ori_conversion") in *)
let ante_cnf = to_cnf ante the given formula in to CNF here
let cnf_ante=nnf_to_cnf ante
in
let _ = print_endline ( " To minisat cnf : " ^ ( Cprinter.string_of_pure_formula cnf_ante))in
match ante with
| BForm ((BConst (a,_),_),_)->
let () = print_endline_quiet ("BForm:\n ") in
if (a)
then (false,"t",G.create(),G.create(),Glabel.create())
else (false,"f",G.create(),G.create(),Glabel.create())
| _ ->
(* let () = print_endline ("other\n") in *)
(* let _=Gen.Profiling.pop_time("stat_CNF_ori_conversion") in *)
(* let _=print_endline "sat true" in*)
(* let _=Gen.Profiling.push_time("stat_CNF_generation_of_B") in *)
let (ante_str,ge,gd,gr_e)=rtc_generate_B cnf_ante in
let () = Debug.ninfo_hprint (add_str "ante_str == " pr_id) ante_str no_pos in
(*start generating cnf for the given CNF formula*)
let temp= if(ante_str <> "0" && ante_str <> "") then (ante_str^" 0") else "p cnf 0 0" in
let final_res= temp(*result*)^"\n" in
let _ = Gen. " ) in
(true,final_res,ge,gd,gr_e)
bach
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
FOR IMPLICATION / SATISFIABILITY CHECKING
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
GENERATE CNF INPUT FOR IMPLICATION / SATISFIABILITY CHECKING
**************************************************************)
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
MAIN INTERFACE : CHECKING IMPLICATION AND SATISFIABILITY
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
MAIN INTERFACE : CHECKING IMPLICATION AND SATISFIABILITY
*************************************************************)
(**
* Test for satisfiability
* We also consider unknown is the same as sat
*)
(* minisat *)
let minisat_is_sat (f : Cpure.formula) (sat_no : string) timeout : bool =
to check sat of f , check the validity of negative(f ) or ( f = > None )
(* let tstartlog = Gen.Profiling.get_time () in *)
let ( ) = print_endline ( " here"^Cprinter.string_of_pure_formula f ) in
let (flag,minisat_input,ge,gd,gr_e) = to_minisat_cnf f in
(* let tstoplog = Gen.Profiling.get_time () in *)
let _ = Globals.minisat_time_cnf_conv : = ! Globals.minisat_time_cnf_conv + . ( tstoplog - . ) in
if(flag = false ) then
begin
if(minisat_input = "t") then true
else
if(minisat_input = "f") then false
else false
end
else
(* let validity = *)
( * if ( ( List.length ! ) then
(* (* let _=Gen.Profiling.push_time("stat_check_sat_1") in *) *)
(* let res=check_problem_through_file minisat_input timeout in res *)
(* (* let _=Gen.Profiling.pop_time("stat_check_sat_1") in res *) *)
(* (* else true *) *)
(* in check_problem_through_file *)
(* if(validity=false) then *)
let _ = print_endline " check " in
(* validity *)
(* else *)
(* let _= print_endline "check sat2" in*)
(* let _=Gen.Profiling.push_time("stat_generation_of_T") in *)
(* let tstartlog = Gen.Profiling.get_time () in *)
(* let _= print_endline ("ori cnf form: "^minisat_input) in *)
(* let tstartlog = Gen.Profiling.get_time () in *)
let cnf_T = get_cnf_from_cache ge gd gr_e in
(*let () = print_endline("get_cnf_from_cache "^cnf_T^"\n") in *)
(* let tstoplog = Gen.Profiling.get_time () in *)
let _ = Globals.minisat_time_BCC : = ! . ( tstoplog - . ) in
(* let tstoplog = Gen.Profiling.get_time () in *)
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
(* let _=Gen.Profiling.pop_time("stat_generation_of_T") in *)
(* let _=Gen.Profiling.push_time("stat_check_sat_2") in *)
let all_input=if(cnf_T <> "") then cnf_T^minisat_input else minisat_input in
(*let () = print_endline("cnf_T:"^cnf_T^" minisat_input:"^minisat_input^"\n") in*)
(* let _=print_endline ("All input: \n"^all_input) in *)
(* let tstartlog = Gen.Profiling.get_time () in *)
(* let () = print_endline("all_input: "^all_input^"\n") in *)
let res= check_problem_through_file (all_input) timeout in
(* let tstoplog = Gen.Profiling.get_time () in *)
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
res
let _ = Gen. " ) in res
(* minisat *)
let minisat_is_sat (f : Cpure.formula) (sat_no : string) : bool =
minisat_is_sat f sat_no minisat_timeout_limit
(* minisat *)
let minisat_is_sat (f : Cpure.formula) (sat_no : string) : bool =
let pr = Cprinter.string_of_pure_formula in
let result = Debug.no_1 "minisat_is_sat" pr string_of_bool (fun _ -> minisat_is_sat f sat_no) f in
(* let omega_result = Omega.is_sat f sat_no in
let () = print_endline ("-- minisat_is_sat result: " ^ (if result then "TRUE" else "FALSE")) in
let () = print_endline ("-- Omega.is_sat result: " ^ (if omega_result then "TRUE" else "FALSE")) in *)
result
(* see imply *)
let is_sat (f: Cpure.formula) (sat_no: string) : bool =
(* debug *)
(* let () = print_endline "** In function minisat.is_sat: " in *)
minisat_is_sat f sat_no
let is_sat_with_check (pe : Cpure.formula) sat_no : bool option =
Cpure.do_with_check "" (fun x -> is_sat x sat_no) pe
let is_sat f sat_no = Debug.loop_2_no " is_sat " ( ! print_pure ) ( fun x->x )
(* string_of_bool is_sat f sat_no *)
let is_sat (pe : Cpure.formula) (sat_no: string) : bool =
(* let () = print_endline "** In function minisat.is_sat: " in *)
try
is_sat pe sat_no;
with Illegal_Prover_Format s -> (
print_endline_quiet ("\nWARNING : Illegal_Prover_Format for :" ^ s);
print_endline_quiet ("Apply minisat.is_sat on formula :" ^ (Cprinter.string_of_pure_formula pe));
flush stdout;
failwith s
)
(**
* Test for validity
* To check the implication P -> Q, we check the satisfiability of
* P /\ not Q
* If it is satisfiable, then the original implication is false.
* If it is unsatisfiable, the original implication is true.
* We also consider unknown is the same as sat
*)
let imply (ante: Cpure.formula) (conseq: Cpure.formula) (timeout: float) : bool =
(*let () = print_endline "** In function minisat.imply:" in *)
let _ = ( fun x- > print_endline ( minisat_cnf_of_spec_var x ) ) all in
let cons= (mkNot_s conseq) in
let imply_f= mkAnd ante cons no_pos in
(* x_tinfo_pp "hello\n" no_pos; *)
let res =is_sat imply_f ""
in
let _ = if(res ) then print_endline ( " SAT " ) else print_endline ( " UNSAT " ) in
if(res) then false else true
let imply (ante : Cpure.formula) (conseq : Cpure.formula) (timeout: float) : bool =
(* let () = print_endline "** In function minisat.imply:" in *)
try
let result = imply ante conseq timeout in
bach - test
result
with Illegal_Prover_Format s -> (
print_endline_quiet ("\nWARNING : Illegal_Prover_Format for :" ^ s);
print_endline_quiet ("Apply minisat.imply on ante Formula :" ^ (Cprinter.string_of_pure_formula ante));
print_endline_quiet ("and conseq Formula :" ^ (Cprinter.string_of_pure_formula conseq));
flush stdout;
failwith s
)
let imply (ante : Cpure.formula) (conseq : Cpure.formula) (timeout: float) : bool =
(* let () = pint_endline "** In function minisat.imply:" in *)
let pr = Cprinter.string_of_pure_formula in
Debug.no_2(* _loop *) "minisat.imply" (add_str "ante" pr) (add_str "conseq" pr) string_of_bool
(fun _ _ -> imply ante conseq timeout) ante conseq
let imply_with_check (ante : Cpure.formula) (conseq : Cpure.formula) (imp_no : string) (timeout: float) : bool option =
let ( ) = print_endline " * * In function : " in
Cpure.do_with_check2 "" (fun a c -> imply a c timeout) ante conseq
(**
* To be implemented
*)
let simplify (f: Cpure.formula) : Cpure.formula =
(* debug *)
(* let () = print_endline "** In function minisat.simplify" in *)
try (Omega.simplify f) with _ -> f
let simplify (pe : Cpure.formula) : Cpure.formula =
match (Cpure.do_with_check "" simplify pe) with
| None -> pe
| Some f -> f
let hull (f: Cpure.formula) : Cpure.formula = f
let pairwisecheck (f: Cpure.formula): Cpure.formula = f
| null | https://raw.githubusercontent.com/hipsleek/hipsleek/596f7fa7f67444c8309da2ca86ba4c47d376618c/bef_indent/minisat.ml | ocaml | open Rtc_new_stable
open Rtc_new_algorithm
Global settings
minisat
"-pre"
"cryptominisat"
"-pre"
valid value is: cnf
**************************************************************
TRANSLATE CPURE FORMULA TO PROBLEM IN CNF FORMAT
*************************************************************
minisat
TODO: Need to handle
-------------------------------Functions are used for generating cnf of CNF formula--------------------
let addBooleanConst v =
begin
(
)
done;
if(!index>0) then string_of_int !index
end
Handle here
let () = print_endline("minisat of e1: "^li^" minisat of e2: "^ri) in
if(li=ri) then
begin
let index=addBooleanConst (li) in index
end
else(*add xx to the set of boolean constants
let _=if(mem=false)then
Handle here
if(li=ri) then (let index=addBooleanConst (li) in ("-"^index))
else(*add xx to the set of boolean constants
let _=if(mem=false)then
bag formulas
list formulas
| VarPerm _ -> Error.report_no_pattern ()
let _=print_endline ("minisat_cnf_of_not_of_p_formula_for_helper BConst EXIT!") in
let _=print_endline ("minisat_cnf_of_not_of_p_formula_for_helper Bvar EXIT!") in
Handle here
begin
let index=addBooleanConst (li) in ("-"^index)(*add -xx to the set of boolean constants
end
else
let _=if(mem=false)then
Handle here
if(li=ri) then (let index=addBooleanConst li in index ) (*add xx to the set of boolean constants
else
let _=if(mem=false)then
bag formulas
list formulas
| VarPerm _
----------------------------------Functions are used for generating T-----------------------------------
---------------------------------------CNF conversion here-----------------------------------
let _=if(a) then print_endline ("TRUE") else print_endline ("FALSE") in
| VarPerm _
return_pure b
let rec cnf_to_string f =
match f with
|BForm (b,_)-> minisat_cnf_of_b_formula b
|Or (f1, f2, _, _)->"("^(cnf_to_string f1)^"v"^(cnf_to_string f2)^")"
let incr_cls= number_clauses:=1 + !number_clauses
let rec cnf_to_string_to_file f (map: spec_var list)=
match f with
|BForm (b,_)-> let var=minisat_cnf_of_b_formula b map in check_inmap var map
|Or (f1, f2, _, _)-> (cnf_to_string_to_file f1 map)^" "^(cnf_to_string_to_file f2 map)
Should use heuristic in CNF
Should use heuristic in CNF
The main here- when using slicing
let _=print_endline ("CNF form: "^Cprinter.string_of_pure_formula f1) in let _= print_endline ("[minisat.ml exit 0] Please use the option '--enable-slicing'") in exit 0
| Exists _ ->
let to_cnf f = nnf_to_cnf (minisat_cnf_of_formula f)
(to_cnf cnf_form)
let minisat_cnf_of_formula f =
***********************************************************
Check whether minisat can handle the expression, formula...
arithmetic expressions
bag expressions
list expressions
array expressions
true
true
bag formulars
list formulas
| VarPerm _
**************************************************************
INTERACTION
*************************************************************
let () = print_endline (" -- output: " ^ line) in
TODO: this function need to be optimized
let _=print_endline output in
let _=print_endline output in
output: - prover_output
- the running status of prover: true if running, otherwise false
stop minisat system
Runs the specified prover and returns output
debug
let () = print_endline "** In function minisat.check_problem" in
let () = print_endline ("-- input: " ^ input^"\n") in
let tstartlog = Gen.Profiling.get_time () in
let ch = Unix.open_process_in "/usr/local/bin/ minisat22 bach_eq_minisat.cnf" in
let ch = Unix.execvp "/usr/local/bin/minisat22" [|"minisat22";"bach_eq_minisat.cnf"|] in
let status = Unix.close_process_in ch in
exception : return the safe result to ensure soundness
tstoplog
minisat: output for cnf format
let () = print_endline("INSIDE rtc_generate_B, f=="^Cprinter.string_of_pure_formula f) in
let () = "** In function Spass.to_minisat_cnf" in
let _=print_endline ("imply Final Formula :" ^ (Cprinter.string_of_pure_formula ante))in
let () = read_line() in
let _=print_endline ("CNF Formula :" ^ (Cprinter.string_of_pure_formula (to_cnf ante)))in
let () = print_endline("INSIDE to_minisat_cnf"^Cprinter.string_of_pure_formula ante) in
let _=Gen.Profiling.push_time("stat_CNF_ori_conversion") in
let () = print_endline ("other\n") in
let _=Gen.Profiling.pop_time("stat_CNF_ori_conversion") in
let _=print_endline "sat true" in
let _=Gen.Profiling.push_time("stat_CNF_generation_of_B") in
start generating cnf for the given CNF formula
result
*
* Test for satisfiability
* We also consider unknown is the same as sat
minisat
let tstartlog = Gen.Profiling.get_time () in
let tstoplog = Gen.Profiling.get_time () in
let validity =
(* let _=Gen.Profiling.push_time("stat_check_sat_1") in
let res=check_problem_through_file minisat_input timeout in res
(* let _=Gen.Profiling.pop_time("stat_check_sat_1") in res
(* else true
in check_problem_through_file
if(validity=false) then
validity
else
let _= print_endline "check sat2" in
let _=Gen.Profiling.push_time("stat_generation_of_T") in
let tstartlog = Gen.Profiling.get_time () in
let _= print_endline ("ori cnf form: "^minisat_input) in
let tstartlog = Gen.Profiling.get_time () in
let () = print_endline("get_cnf_from_cache "^cnf_T^"\n") in
let tstoplog = Gen.Profiling.get_time () in
let tstoplog = Gen.Profiling.get_time () in
let _=Gen.Profiling.pop_time("stat_generation_of_T") in
let _=Gen.Profiling.push_time("stat_check_sat_2") in
let () = print_endline("cnf_T:"^cnf_T^" minisat_input:"^minisat_input^"\n") in
let _=print_endline ("All input: \n"^all_input) in
let tstartlog = Gen.Profiling.get_time () in
let () = print_endline("all_input: "^all_input^"\n") in
let tstoplog = Gen.Profiling.get_time () in
minisat
minisat
let omega_result = Omega.is_sat f sat_no in
let () = print_endline ("-- minisat_is_sat result: " ^ (if result then "TRUE" else "FALSE")) in
let () = print_endline ("-- Omega.is_sat result: " ^ (if omega_result then "TRUE" else "FALSE")) in
see imply
debug
let () = print_endline "** In function minisat.is_sat: " in
string_of_bool is_sat f sat_no
let () = print_endline "** In function minisat.is_sat: " in
*
* Test for validity
* To check the implication P -> Q, we check the satisfiability of
* P /\ not Q
* If it is satisfiable, then the original implication is false.
* If it is unsatisfiable, the original implication is true.
* We also consider unknown is the same as sat
let () = print_endline "** In function minisat.imply:" in
x_tinfo_pp "hello\n" no_pos;
let () = print_endline "** In function minisat.imply:" in
let () = pint_endline "** In function minisat.imply:" in
_loop
*
* To be implemented
debug
let () = print_endline "** In function minisat.simplify" in | #include "xdebug.cppo"
open VarGen
open Globals
open GlobProver
open Gen.Basic
open Cpure
open Rtc_algorithm
module StringSet = Set.Make(String)
let minisat_timeout_limit = 15.0
let test_number = ref 0
let last_test_number = ref 0
let minisat_restart_interval = ref (-1)
let log_all_flag = ref false
let is_minisat_running = ref false
default timeout is 15 seconds
let minisat_call_count: int ref = ref 0
let log_file = open_log_out ("allinput.minisat")
valid value is : " file " or " stdin "
let minisat_path = "/usr/local/bin/minisat"
let minisat_name = "minisat"
let minisat_path_crypt = "/home/bachle/improve_rtc_algo/sleekex"
let minisat_name_crypt = "cryptominisat"
let minisat_arg_crypt = "--no-simplify --nosatelite --gaussuntil=3"
" /home / bachle / improve_rtc_algo / sleekex / cryptominisat "
let minisat_name2 = ( "
let eq_path = "equality_logic"
let eq_name = "equality_logic"
let eq_arg = "equality_logic"
let number_clauses = ref 1
let number_vars = ref 0
let len=1000
let bcl= ref [ ]
let sat= ref true
let minisat_process = ref { name = "minisat";
pid = 0;
inchannel = stdin;
outchannel = stdout;
errchannel = stdin
}
let de_morgan f=match f with
|Not (And(f1,f2,_),l1,l2)-> Or(Not(f1,l1,l2), Not (f2,l1,l2),l1,l2)
|Not (Or(f1,f2,_,_),l1,l2)-> And(Not(f1,l1,l2),Not(f2,l1,l2),l2)
|_->f
let double_negative f= match f with
|Not (Not(f1,_,_),_,_)->f1
|_->f
let minisat_cnf_of_spec_var sv = let ident=Cpure.name_of_spec_var sv in ident
let rec minisat_of_exp e0 = match e0 with
| Null _ -> "null_var"
| Var (sv, _) -> minisat_cnf_of_spec_var sv
| IConst (i, _) -> string_of_int i
| AConst (i, _) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Add (a1, a2, _) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Subtract (a1, a2, _) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Mult (a1, a2, l) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Div (a1, a2, l) -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
| Max _
| Min _ -> illegal_format ("eq_logic.eq_logic_of_exp: min/max should not appear here")
| TypeCast _ -> illegal_format ("eq_logic.eq_logic_of_exp: TypeCast should not appear here")
| FConst _ -> illegal_format ("eq_logic.eq_logic_of_exp: FConst")
| _ -> illegal_format ("eq_logic.eq_logic_of_exp: array, bag or list constraint")
let _ = print_endline ( " length of string_of_int ( ! ) ) in ( * Debug - bach
let index= ref 0 in
for i=0 to ( List.length ! do
if v=(List.nth ! bcl i ) then ( index:=i+len )
else let _ = bcl:= ! bcl@[v ] in ( string_of_int ( ( List.length ! bcl)+len-1 ) )
let minisat_cnf_of_p_formula (pf : Cpure.p_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t) =
match pf with
| Frm (sv, _) -> ""
| LexVar _ -> ""
let _ = print_endline ( " minisat_cnf_of_p_formula_for_helper BConst EXIT ! " ) in
WN : weakening
| BVar (sv, pos) ->
let _= x_binfo_hp (add_str "minisat_cnf_of_p_formula Bvar" minisat_cnf_of_spec_var) sv pos
in ""
| Lt _ -> ""
| Lte _ -> ""
| Gt _ -> ""
| Gte _ -> ""
| SubAnn _ -> ""
| Eq (e1, e2, _) ->
let eq_edge=G.E.create li () ri in
let _= G.add_edge_e ge eq_edge in
let mem = Glabel.mem_edge allvars in
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
let lr=get_var li ri allvars in
lr
let diseq_edge=G.E.create li () ri in
let _= G.add_edge_e gd diseq_edge in
let mem = Glabel.mem_edge allvars in
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
in "-"^lr
| EqMax _ -> ""
| EqMin _ -> ""
| BagIn _
| BagNotIn _
| BagSub _
| BagMin _
| BagMax _ -> ""
| ListIn _
| ListNotIn _
| ListAllN _
| ListPerm _
| RelForm _ -> ""
let minisat_cnf_of_b_formula (bf : Cpure.b_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t)=
match bf with
| (pf, _) -> minisat_cnf_of_p_formula pf allvars ge gd
let minisat_cnf_of_not_of_p_formula (pf : Cpure.p_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t) =
match pf with
| Frm _ -> ""
| LexVar _ -> ""
| Lt _ -> ""
| Lte _ -> ""
| Gt _ -> ""
| Gte _ -> ""
| SubAnn _ -> ""
if(li = ri ) then
let diseq_edge=G.E.create li () ri in
let _= G.add_edge_e gd diseq_edge in
let mem = Glabel.mem_edge allvars in
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
"-"^lr
let eq_edge=G.E.create li () ri in
let _= G.add_edge_e ge eq_edge in
let mem = Glabel.mem_edge allvars in
let _=
begin
let _=number_vars := !number_vars+1 in
let cx=Glabel.E.create li (ref (string_of_int !number_vars)) ri in
Glabel.add_edge_e allvars cx
end
in
let rtc = new rTC in
lr
| EqMax _ -> ""
| EqMin _ -> ""
| BagIn _
| BagNotIn _
| BagSub _
| BagMin _
| BagMax _ -> ""
| ListIn _
| ListNotIn _
| ListAllN _
| ListPerm _
| RelForm _ -> ""
let minisat_cnf_of_not_of_b_formula (bf : Cpure.b_formula) (allvars:Glabel.t) (ge:G.t) (gd:G.t) =
match bf with
| (pf, _) -> minisat_cnf_of_not_of_p_formula pf allvars ge gd
let return_pure bf f= match bf with
| (pf,_)-> match pf with
| Eq _ -> f
| Neq _ -> f
| Frm _ -> f
| BVar(_,_)->f
| XPure _ | LexVar _ | Lt _ | Lte _ | Gt _ | Gte _ | SubAnn _ | EqMax _ | EqMin _ | BagIn _ | BagNotIn _ | BagSub _
For converting to NNF -- no need??--
let rec minisat_cnf_of_formula f =
match f with
| And (f1, f2, l1) -> And(minisat_cnf_of_formula f1,minisat_cnf_of_formula f2,l1)
| Or (f1, f2, l1, l2) -> Or(minisat_cnf_of_formula f1,minisat_cnf_of_formula f2,l1,l2)
| Not (BForm(b,_), _, _) -> return_pure b f
| _ -> minisat_cnf_of_formula (de_morgan (double_negative f));;
|Not ( f1,_,_)->"-"^cnf_to_string f1
( f1 , f2 , _ ) - > " ( " ^(cnf_to_string f1)^"&"^(cnf_to_string f2)^ " ) "
|Not ( f1,_,_)->"-"^cnf_to_string_to_file f1 map
( f1 , f2 , _ ) - > let _ = incr_cls in ( cnf_to_string_to_file f1 map)^ " 0"^"\n"^(cnf_to_string_to_file f2 map )
For CNF conversion
let unsat_in_cnf (bf : Cpure.b_formula) =
match bf with
| (pf, _) -> match pf with
| Neq(e1,e2,_)->let li=minisat_of_exp e1 and ri=minisat_of_exp e2 in
| _->()
let rec has_and f =
match f with
|BForm _ -> false
|And(_,_,_)->true
|Or(f1,f2,_,_) -> if(has_and f1) then true else if (has_and f2) then true else false
| _->false
and is_cnf_old2 f =
match f with
| BForm _ -> true
| Or (f1,f2,_,_)-> if(has_and f1) then false else if (has_and f2) then false else true
| And (BForm(b,_),f2,_)->let _=unsat_in_cnf b in if(!sat=true) then is_cnf f2 else true
| And (f1,BForm(b,_),_)->let _=unsat_in_cnf b in if(!sat=true) then is_cnf f1 else true
| And (f1,f2,_)-> if(is_cnf f1) then is_cnf f2 else false
| AndList _ | Not _ | Forall _ | Exists _ -> Error.report_no_pattern ()
match f with
| BForm _ -> true
| Or (f1,f2,_,_)-> if(has_and f1) then false else if (has_and f2) then false else true
| And (BForm(b,_),f2,_)->is_cnf f2
| And (f1,BForm(b,_),_)->is_cnf f1
| And (f1,f2,_)-> if(is_cnf f1) then is_cnf f2 else false
| AndList _ | Not _ | Forall _ | Exists _ -> Error.report_no_pattern()
match f with
| BForm _ -> true
| Or (f1,f2,_,_)-> if(has_and f1) then false else if (has_and f2) then false else true
| And (BForm(b,_),f2,_)->is_cnf f2
| And (f1,BForm(b,_),_)->is_cnf f1
| And (f1,f2,_)-> if(is_cnf f1) then is_cnf f2 else false
| _-> let _=print_endline_quiet ("CNF conv here: "^Cprinter.string_of_pure_formula f) in true
distributive law 1 - ( f & k ) v ( g & h ) - > ( f v g ) & ( f v h ) & ( k v g ) & ( k v h )
let dist_1 f =
using heuristic for the first one
And(Or(f1 , f2,l1,l2 ) , Or(f1 , f3,l1,l2),l2 )
, f2 , _ ) , And(f3 , f4,_),l1,l2 ) ->And(And(Or(f1 , f3,l1,l2 ) , Or(f1 , f4,l1,l2),l2 ) , And(Or(f2 , f3,l1,l2 ) , Or(f2 , )
| Or(And(f2, f3,_), f1,l1,l2) -> And(Or(f1, f2,l1,l2), Or(f1, f3,l1,l2),l2)
| _ -> f
let dist_no_slicing f =
match f with
, f2 , _ ) , And(f3 , f4,_),l1,l2 ) ->And(And(Or(f1 , f3,l1,l2 ) , Or(f1 , f4,l1,l2),l2 ) , And(Or(f2 , f3,l1,l2 ) , Or(f2 , )
| Or(And(f2, f3,_), f1,l1,l2) -> And(Or(f1, f2,l1,l2), Or(f1, f3,l1,l2),l2)
| _ -> f
let rec nnf_to_xxx f rule =
let nf = match f with
BForm (b,_) -> return_pure b f
| Not (f1,l1,l2) -> Not ((nnf_to_xxx f1 rule),l1,l2)
| And (f1, f2,l1) -> And (nnf_to_xxx f1 rule, nnf_to_xxx f2 rule,l1)
| Or (f1, f2,l1,l2) -> Or (nnf_to_xxx f1 rule, nnf_to_xxx f2 rule,l1,l2)
| Exists (_,f1,_,_) -> nnf_to_xxx f1 rule
| AndList _ | Forall _ -> Error.report_no_pattern()
in
rule nf
let nnf_to_cnf f= nnf_to_xxx f dist_1
let nnf_to_cnf_no_slicing f= nnf_to_xxx f dist_no_slicing
The old CNF conversion
let rec to_cnf f =
let cnf_form=(nnf_to_cnf_no_slicing f) in
in
let _=print_endline_quiet ("CNF form: "^Cprinter.string_of_pure_formula res) in
res
let to_cnf_no_slicing f=
let _=print_endline_quiet ("Orig: "^Cprinter.string_of_pure_formula f) in
let nnf= minisat_cnf_of_formula f in
let _=print_endline_quiet ("NNF here: "^Cprinter.string_of_pure_formula nnf) in
to_cnf nnf
The no need CNF conversion adapt to slicing , we just need the distributive law
Debug.no_1 " minisat_of_formula " Cprinter.string_of_pure_formula pr_id minisat_cnf_of_formula f
bach - minisat
let rec can_minisat_handle_expression (exp: Cpure.exp) : bool =
match exp with
| Cpure.Null _ -> false
| Cpure.Var _ -> false
| Cpure.IConst _ -> false
| Cpure.FConst _ -> false
| Cpure.AConst _ -> false
| Cpure.NegInfConst _
| Cpure.InfConst _ -> false
| Cpure.Add _
| Cpure.Subtract _
| Cpure.Mult _
| Cpure.Div _
| Cpure.Max _
| Cpure.Min _
| Cpure.TypeCast _ -> false
| Cpure.Bag _
| Cpure.BagUnion _
| Cpure.BagIntersect _
| Cpure.BagDiff _ -> false
| Cpure.List _
| Cpure.ListCons _
| Cpure.ListHead _
| Cpure.ListTail _
| Cpure.ListLength _
| Cpure.ListAppend _
| Cpure.ListReverse _ -> false
| Cpure.ArrayAt _ -> false
| Cpure.Func _ -> false
| Cpure.Template _ -> false
| Cpure.Level _
| Cpure.Tsconst _ -> Error.report_no_pattern()
| Cpure.Tup2 _ -> Error.report_no_pattern()
| Cpure.Bptriple _ -> Error.report_no_pattern()
and can_minisat_handle_p_formula (pf : Cpure.p_formula) : bool =
match pf with
| Frm _ -> false
| LexVar _ -> false
| Lt _ -> false
| Lte _ -> false
| Gt _ -> false
| Gte _ -> false
| SubAnn (ex1, ex2, _) -> false
| Eq (ex1, ex2, _) -> true
| Neq (ex1, ex2, _) -> true
| EqMax _ -> false
| EqMin _ -> false
| BagIn _
| BagNotIn _
| BagSub _
| BagMin _
| BagMax _ -> false
| ListIn _
| ListNotIn _
| ListAllN _
| ListPerm _
| RelForm _ -> false
and can_minisat_handle_b_formula (bf : Cpure.b_formula) : bool =
match bf with
| (pf, _) -> can_minisat_handle_p_formula pf
and can_minisat_handle_formula (f: Cpure.formula) : bool =
match f with
| BForm (bf, _) -> can_minisat_handle_b_formula bf
| And (f1, f2, _) -> (can_minisat_handle_formula f1) && (can_minisat_handle_formula f2)
| Or (f1, f2, _, _) -> (can_minisat_handle_formula f1) && (can_minisat_handle_formula f2)
| Not (f, _, _) -> can_minisat_handle_formula f
| Forall (_, f, _, _) -> can_minisat_handle_formula f
| Exists (_, f, _, _) -> can_minisat_handle_formula f
| AndList _ -> Error.report_no_pattern()
let rec collect_output (chn: in_channel) : (string * bool) =
try
let line = input_line chn in
if line = "SATISFIABLE" then
(line, true)
else if (line = "c SAT") then
("SATISFIABLE",true)
else
collect_output chn
with
| End_of_file -> ("", false)
read the output stream of minisat prover , return ( conclusion * reason )
let get_prover_result (output : string) :bool =
if !Globals.print_original_solver_output then
begin
print_endline_quiet "MINISAT OUTPUT";
print_endline_quiet "--------------";
print_endline_quiet output;
print_endline_quiet "--------------";
end;
let validity =
if (output="SATISFIABLE") then
true
else
false in
validity
let get_answer (chn: in_channel) : (bool * bool)=
let (output, running_state) = collect_output chn in
let
validity_result = get_prover_result output;
in
(validity_result, running_state)
let remove_file filename =
try Sys.remove filename;
with e -> ignore e
let set_process (proc: prover_process_t) =
minisat_process := proc
let start () =
if not !is_minisat_running then (
print_endline_quiet ("Starting minisat... \n");
last_test_number := !test_number;
let prelude () = () in
if (minisat_input_format = "cnf") then (
Procutils.PrvComms.start !log_all_flag log_file (minisat_name, minisat_path, [|minisat_arg|]) set_process prelude;
is_minisat_running := true;
)
)
let stop () =
if !is_minisat_running then (
let num_tasks = !test_number - !last_test_number in
print_string_if !Globals.enable_count_stats ("\nStop minisat... " ^ (string_of_int !minisat_call_count) ^ " invocations "); flush stdout;
let () = Procutils.PrvComms.stop !log_all_flag log_file !minisat_process num_tasks Sys.sigkill (fun () -> ()) in
is_minisat_running := false;
)
restart Omega system
let restart reason =
if !is_minisat_running then (
let () = print_string_if !Globals.enable_count_stats (reason ^ " Restarting minisat after ... " ^ (string_of_int !minisat_call_count) ^ " invocations ") in
Procutils.PrvComms.restart !log_all_flag log_file reason "minisat" start stop
)
else (
let () = print_string_if !Globals.enable_count_stats (reason ^ " not restarting minisat ... " ^ (string_of_int !minisat_call_count) ^ " invocations ") in ()
)
let check_problem_through_file (input: string) (timeout: float) : bool =
let file_suffix = "bach_eq_minisat" in
let infile =(file_suffix) ^ ".cnf" in
if !Globals.print_original_solver_input then
begin
print_endline_quiet "MINISAT INPUT";
print_endline_quiet "--------------";
print_endline_quiet input;
print_endline_quiet "--------------";
end;
let out_stream = open_out infile in
output_string out_stream input;
close_out out_stream;
let minisat_result="minisatres.txt" in
let set_process proc = minisat_process := proc in
let fnc () =
if (minisat_input_format = "cnf") then (
Procutils.PrvComms.start false stdout (minisat_name2, minisat_path2, [|minisat_arg2;infile;minisat_result|]) set_process (fun () -> ());
minisat_call_count := !minisat_call_count + 1;
let (prover_output, running_state) = get_answer !minisat_process.inchannel in
is_minisat_running := running_state;
let tstoplog = Gen. ( ) in
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
prover_output;
)
else illegal_format "[minisat.ml] The value of minisat_input_format is invalid!" in
let res =
try
let res = Procutils.PrvComms.maybe_raise_timeout fnc () timeout in
res
print_backtrace_quiet ();
print_endline_quiet ("WARNING: Restarting prover due to timeout");
Unix.kill !minisat_process.pid 9;
ignore (Unix.waitpid [] !minisat_process.pid);
false
)
in
let () = Procutils.PrvComms.stop false stdout !minisat_process 0 9 (fun () -> ()) in
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
remove_file infile;
res
let check_problem_through_file (input: string) (timeout: float) : bool =
Debug.no_1 "check_problem_through_file (minisat)"
(fun s -> s) string_of_bool
(fun f -> check_problem_through_file f timeout) input
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
FOR IMPLICATION / SATISFIABILITY CHECKING
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
GENERATE CNF INPUT FOR IMPLICATION / SATISFIABILITY CHECKING
**************************************************************)
let rtc_generate_B (f:Cpure.formula) =
ge is eq graph and gd is diseq graph
Aiming to get ge and gd and cnf string of the given CNF formula
match f with
|BForm (b,_)-> minisat_cnf_of_b_formula b gr_e ge gd
|And (f1, f2, _) ->cnf_to_string_to_file f1 ^" 0"^"\n"^ cnf_to_string_to_file f2
|Or (f1, f2, _, _)->cnf_to_string_to_file f1 ^" "^ cnf_to_string_to_file f2
|Not ((BForm(b,_)),_,_)-> minisat_cnf_of_not_of_b_formula b gr_e ge gd
| _->
let _=
x_tinfo_hp (add_str "imply Final Formula :" Cprinter.string_of_pure_formula) f no_pos
in ""
in
let cnf_str =cnf_to_string_to_file f in
(cnf_str,ge,gd,gr_e)
let get_cnf_from_cache ge gd gr_e=
let testRTC= new rTC in
let cache= testRTC#rtc_v2 ge gd gr_e !number_vars in
cache
let to_minisat_cnf (ante: Cpure.formula) =
let _= number_vars := 0 in
let ante_cnf = to_cnf ante the given formula in to CNF here
let cnf_ante=nnf_to_cnf ante
in
let _ = print_endline ( " To minisat cnf : " ^ ( Cprinter.string_of_pure_formula cnf_ante))in
match ante with
| BForm ((BConst (a,_),_),_)->
let () = print_endline_quiet ("BForm:\n ") in
if (a)
then (false,"t",G.create(),G.create(),Glabel.create())
else (false,"f",G.create(),G.create(),Glabel.create())
| _ ->
let (ante_str,ge,gd,gr_e)=rtc_generate_B cnf_ante in
let () = Debug.ninfo_hprint (add_str "ante_str == " pr_id) ante_str no_pos in
let temp= if(ante_str <> "0" && ante_str <> "") then (ante_str^" 0") else "p cnf 0 0" in
let _ = Gen. " ) in
(true,final_res,ge,gd,gr_e)
bach
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
FOR IMPLICATION / SATISFIABILITY CHECKING
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
GENERATE CNF INPUT FOR IMPLICATION / SATISFIABILITY CHECKING
**************************************************************)
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
MAIN INTERFACE : CHECKING IMPLICATION AND SATISFIABILITY
* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *
MAIN INTERFACE : CHECKING IMPLICATION AND SATISFIABILITY
*************************************************************)
let minisat_is_sat (f : Cpure.formula) (sat_no : string) timeout : bool =
to check sat of f , check the validity of negative(f ) or ( f = > None )
let ( ) = print_endline ( " here"^Cprinter.string_of_pure_formula f ) in
let (flag,minisat_input,ge,gd,gr_e) = to_minisat_cnf f in
let _ = Globals.minisat_time_cnf_conv : = ! Globals.minisat_time_cnf_conv + . ( tstoplog - . ) in
if(flag = false ) then
begin
if(minisat_input = "t") then true
else
if(minisat_input = "f") then false
else false
end
else
( * if ( ( List.length ! ) then
let _ = print_endline " check " in
let cnf_T = get_cnf_from_cache ge gd gr_e in
let _ = Globals.minisat_time_BCC : = ! . ( tstoplog - . ) in
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
let all_input=if(cnf_T <> "") then cnf_T^minisat_input else minisat_input in
let res= check_problem_through_file (all_input) timeout in
let _ = Globals.minisat_time_T : = ! Globals.minisat_time_T + . ( tstoplog - . ) in
res
let _ = Gen. " ) in res
let minisat_is_sat (f : Cpure.formula) (sat_no : string) : bool =
minisat_is_sat f sat_no minisat_timeout_limit
let minisat_is_sat (f : Cpure.formula) (sat_no : string) : bool =
let pr = Cprinter.string_of_pure_formula in
let result = Debug.no_1 "minisat_is_sat" pr string_of_bool (fun _ -> minisat_is_sat f sat_no) f in
result
let is_sat (f: Cpure.formula) (sat_no: string) : bool =
minisat_is_sat f sat_no
let is_sat_with_check (pe : Cpure.formula) sat_no : bool option =
Cpure.do_with_check "" (fun x -> is_sat x sat_no) pe
let is_sat f sat_no = Debug.loop_2_no " is_sat " ( ! print_pure ) ( fun x->x )
let is_sat (pe : Cpure.formula) (sat_no: string) : bool =
try
is_sat pe sat_no;
with Illegal_Prover_Format s -> (
print_endline_quiet ("\nWARNING : Illegal_Prover_Format for :" ^ s);
print_endline_quiet ("Apply minisat.is_sat on formula :" ^ (Cprinter.string_of_pure_formula pe));
flush stdout;
failwith s
)
let imply (ante: Cpure.formula) (conseq: Cpure.formula) (timeout: float) : bool =
let _ = ( fun x- > print_endline ( minisat_cnf_of_spec_var x ) ) all in
let cons= (mkNot_s conseq) in
let imply_f= mkAnd ante cons no_pos in
let res =is_sat imply_f ""
in
let _ = if(res ) then print_endline ( " SAT " ) else print_endline ( " UNSAT " ) in
if(res) then false else true
let imply (ante : Cpure.formula) (conseq : Cpure.formula) (timeout: float) : bool =
try
let result = imply ante conseq timeout in
bach - test
result
with Illegal_Prover_Format s -> (
print_endline_quiet ("\nWARNING : Illegal_Prover_Format for :" ^ s);
print_endline_quiet ("Apply minisat.imply on ante Formula :" ^ (Cprinter.string_of_pure_formula ante));
print_endline_quiet ("and conseq Formula :" ^ (Cprinter.string_of_pure_formula conseq));
flush stdout;
failwith s
)
let imply (ante : Cpure.formula) (conseq : Cpure.formula) (timeout: float) : bool =
let pr = Cprinter.string_of_pure_formula in
(fun _ _ -> imply ante conseq timeout) ante conseq
let imply_with_check (ante : Cpure.formula) (conseq : Cpure.formula) (imp_no : string) (timeout: float) : bool option =
let ( ) = print_endline " * * In function : " in
Cpure.do_with_check2 "" (fun a c -> imply a c timeout) ante conseq
let simplify (f: Cpure.formula) : Cpure.formula =
try (Omega.simplify f) with _ -> f
let simplify (pe : Cpure.formula) : Cpure.formula =
match (Cpure.do_with_check "" simplify pe) with
| None -> pe
| Some f -> f
let hull (f: Cpure.formula) : Cpure.formula = f
let pairwisecheck (f: Cpure.formula): Cpure.formula = f
|
7f4dc1e45bc5c7d4316769cb12c2ba2563676155c4763d1d6ad3de11ea991f07 | input-output-hk/ouroboros-network | ExitPolicy.hs | {-# LANGUAGE DerivingStrategies #-}
{-# LANGUAGE DerivingVia #-}
# LANGUAGE GeneralisedNewtypeDeriving #
# LANGUAGE NamedFieldPuns #
module Ouroboros.Network.ExitPolicy
( ReconnectDelay (..)
, ExitPolicy (..)
, stdExitPolicy
, ReturnPolicy
, alwaysCleanReturnPolicy
) where
import Control.Monad.Class.MonadTime
import Data.Semigroup (Max (..))
newtype ReconnectDelay = ReconnectDelay { reconnectDelay :: DiffTime }
deriving (Eq, Ord)
deriving newtype Num
deriving newtype Fractional
deriving Semigroup via Max DiffTime
-- It ought to be derived via 'Quiet' but 'Difftime' lacks 'Generic' instance.
instance Show ReconnectDelay where
show (ReconnectDelay d) = "ReconnectDelay " ++ show d
type ReturnPolicy a = a -> ReconnectDelay
-- | 'ReturnPolicy' allows to compute reconnection delay from value return by
-- a mini-protocol. If a mini-protocol returned with an error 'epErrorDelay'
-- is used.
data ExitPolicy a =
ExitPolicy {
-- | Compute 'ReturnCommand' from return value.
--
epReturnDelay :: ReturnPolicy a,
-- | The delay when a mini-protocol returned with an error.
--
epErrorDelay :: ReconnectDelay
}
alwaysCleanReturnPolicy :: ReconnectDelay -- ^ reconnection delay on error
-> ExitPolicy a
alwaysCleanReturnPolicy = ExitPolicy $ \_ -> 0
| ' ExitPolicy ' with 10s error delay .
--
stdExitPolicy :: ReturnPolicy a -> ExitPolicy a
stdExitPolicy epReturnDelay =
ExitPolicy {
epReturnDelay,
epErrorDelay = 10
}
| null | https://raw.githubusercontent.com/input-output-hk/ouroboros-network/b9959087bfd72837c02b8b33ba04175e4ff8b186/ouroboros-network/src/Ouroboros/Network/ExitPolicy.hs | haskell | # LANGUAGE DerivingStrategies #
# LANGUAGE DerivingVia #
It ought to be derived via 'Quiet' but 'Difftime' lacks 'Generic' instance.
| 'ReturnPolicy' allows to compute reconnection delay from value return by
a mini-protocol. If a mini-protocol returned with an error 'epErrorDelay'
is used.
| Compute 'ReturnCommand' from return value.
| The delay when a mini-protocol returned with an error.
^ reconnection delay on error
| # LANGUAGE GeneralisedNewtypeDeriving #
# LANGUAGE NamedFieldPuns #
module Ouroboros.Network.ExitPolicy
( ReconnectDelay (..)
, ExitPolicy (..)
, stdExitPolicy
, ReturnPolicy
, alwaysCleanReturnPolicy
) where
import Control.Monad.Class.MonadTime
import Data.Semigroup (Max (..))
newtype ReconnectDelay = ReconnectDelay { reconnectDelay :: DiffTime }
deriving (Eq, Ord)
deriving newtype Num
deriving newtype Fractional
deriving Semigroup via Max DiffTime
instance Show ReconnectDelay where
show (ReconnectDelay d) = "ReconnectDelay " ++ show d
type ReturnPolicy a = a -> ReconnectDelay
data ExitPolicy a =
ExitPolicy {
epReturnDelay :: ReturnPolicy a,
epErrorDelay :: ReconnectDelay
}
-> ExitPolicy a
alwaysCleanReturnPolicy = ExitPolicy $ \_ -> 0
| ' ExitPolicy ' with 10s error delay .
stdExitPolicy :: ReturnPolicy a -> ExitPolicy a
stdExitPolicy epReturnDelay =
ExitPolicy {
epReturnDelay,
epErrorDelay = 10
}
|
0f2a04a1c0a30379fe1d5e5c4aba37c5c8bfd106ae58c6a99a1a867648af83de | FranklinChen/learn-you-some-erlang | cases_tests.erl | -module(cases_tests).
-include_lib("eunit/include/eunit.hrl").
insert_test_() ->
[?_assertEqual([1], cases:insert(1,[])),
?_assertEqual([1], cases:insert(1,[1])),
?_assertEqual([1,2], cases:insert(1,[2]))].
beach_test_() ->
[?_assertEqual('favorable', cases:beach({celsius, 20})),
?_assertEqual('favorable', cases:beach({celsius, 45})),
?_assertEqual('avoid beach', cases:beach({celsius, 46})),
?_assertEqual('avoid beach', cases:beach({celsius, 19})),
?_assertEqual('scientifically favorable', cases:beach({kelvin, 293})),
?_assertEqual('scientifically favorable', cases:beach({kelvin, 318})),
?_assertEqual('avoid beach', cases:beach({kelvin, 292})),
?_assertEqual('avoid beach', cases:beach({celsius, 319})),
?_assertEqual('favorable in the US',
cases:beach({fahrenheit, 68})),
?_assertEqual('favorable in the US',
cases:beach({fahrenheit, 113})),
?_assertEqual('avoid beach', cases:beach({fahrenheit, 67})),
?_assertEqual('avoid beach', cases:beach({fahrenheit, 114})),
?_assertEqual('avoid beach', cases:beach(cat))].
| null | https://raw.githubusercontent.com/FranklinChen/learn-you-some-erlang/878c8bc2011a12862fe72dd7fdc6c921348c79d6/tests/cases_tests.erl | erlang | -module(cases_tests).
-include_lib("eunit/include/eunit.hrl").
insert_test_() ->
[?_assertEqual([1], cases:insert(1,[])),
?_assertEqual([1], cases:insert(1,[1])),
?_assertEqual([1,2], cases:insert(1,[2]))].
beach_test_() ->
[?_assertEqual('favorable', cases:beach({celsius, 20})),
?_assertEqual('favorable', cases:beach({celsius, 45})),
?_assertEqual('avoid beach', cases:beach({celsius, 46})),
?_assertEqual('avoid beach', cases:beach({celsius, 19})),
?_assertEqual('scientifically favorable', cases:beach({kelvin, 293})),
?_assertEqual('scientifically favorable', cases:beach({kelvin, 318})),
?_assertEqual('avoid beach', cases:beach({kelvin, 292})),
?_assertEqual('avoid beach', cases:beach({celsius, 319})),
?_assertEqual('favorable in the US',
cases:beach({fahrenheit, 68})),
?_assertEqual('favorable in the US',
cases:beach({fahrenheit, 113})),
?_assertEqual('avoid beach', cases:beach({fahrenheit, 67})),
?_assertEqual('avoid beach', cases:beach({fahrenheit, 114})),
?_assertEqual('avoid beach', cases:beach(cat))].
| |
915190d2018a2f9e142b1ed0e6cf1cce76687919ec78bc6ab92885625960fe86 | pforpallav/school | lab1pset.rkt | The first three lines of this file were inserted by . They record metadata
;; about the language level of this file in a form that our tools can easily process.
#reader(lib "htdp-beginner-reader.ss" "lang")((modname pset1) (read-case-sensitive #t) (teachpacks ()) (htdp-settings #(#t constructor repeating-decimal #f #t none #f ())))
;;
CPSC 110 , 2010 - 2011 T1 , Problem Set 1
;;
NAME : _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
;;
;;
(require 2htdp/universe)
(require 2htdp/image)
;;
Problem 1
;;
(* 3 5 7)
(* (* 3 5) 7)
;;
Problem 2
;;
(beside (above (square 20 'solid 'red) (square 20 'solid 'blue))
(above (square 20 'solid 'blue) (square 20 'solid 'red)))
;;
Problem 3
;;
(string-append "accio" " " "firebolt")
;;
Problem 4
;;
(define (accio x)
(if (string? x) (string-append "accio " x) "not a string"))
| null | https://raw.githubusercontent.com/pforpallav/school/60f19dce52f6fb8e416df73c924132da6ac7fa01/CPSC110/lab1pset.rkt | racket | about the language level of this file in a form that our tools can easily process.
| The first three lines of this file were inserted by . They record metadata
#reader(lib "htdp-beginner-reader.ss" "lang")((modname pset1) (read-case-sensitive #t) (teachpacks ()) (htdp-settings #(#t constructor repeating-decimal #f #t none #f ())))
CPSC 110 , 2010 - 2011 T1 , Problem Set 1
NAME : _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
(require 2htdp/universe)
(require 2htdp/image)
Problem 1
(* 3 5 7)
(* (* 3 5) 7)
Problem 2
(beside (above (square 20 'solid 'red) (square 20 'solid 'blue))
(above (square 20 'solid 'blue) (square 20 'solid 'red)))
Problem 3
(string-append "accio" " " "firebolt")
Problem 4
(define (accio x)
(if (string? x) (string-append "accio " x) "not a string"))
|
81fd10b7b56f079320501d6141b7d5deb2dd2fe5782ae447be1fb40c4c24b983 | plumatic/grab-bag | pubsub.clj | (ns web.pubsub
(:use plumbing.core)
(:require
[plumbing.accumulators :as accumulators]
[plumbing.error :as err]
[plumbing.graph :as graph]
[plumbing.logging :as log]
[plumbing.new-time :as new-time]
[plumbing.parallel :as parallel]
[plumbing.resource :as resource]
[store.bucket :as bucket]
[store.s3 :as s3]
[web.client :as client]
[web.data :as data]
[web.server :as server]))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;; Private helpers
(defn topic-bucket [bucket topic] (s3/sub-bucket bucket (str topic "/")))
(defn env-topic-name [topic env]
(str topic "-" (name env)))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;; Broker (shared between pub and sub)
(def broker-bucket
"Bucket where subscribers write files under topic/service-name with info
about where to reach them.
Not an env-bucket, since we actually want cross-env talk currently
(e.g. doc-poller prod --> index-master-stage).
Suffixed \"-prod\" only because of a historical mistake, which
will require a migration to fix."
(graph/instance bucket/bucket-resource []
{:type :s3
:name "grabbag-unreliable-pubsub-prod"}))
(def pubsub-resources
"Overall resource, to be included whenever pubbing or subbing is going on"
(graph/graph
:broker broker-bucket
:sub-callbacks bucket/bucket-resource ;; topic --> [callback]
))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;; Subscribing
(def subscribers-resources
"Resource shared between all subscribers"
(graph/graph
:pubsub-subscriber-port
(fnk [] (server/get-available-port))
:subscriber-server
(graph/instance server/server-resource [[:pubsub sub-callbacks] pubsub-subscriber-port]
{:port pubsub-subscriber-port
:join? false
:root-handler (fnk [uri body]
(let [^String raw-topic uri
topic (subs raw-topic 1)
body (data/decode-serialized-content body)]
(assert (.startsWith raw-topic "/"))
(if-let [callbacks (bucket/get sub-callbacks topic)]
(do (doseq [x body
c callbacks]
(c x))
{:body "\"OK\""})
(log/throw+ {:client-message "Topic not found"}))))})
:put-subscriber!
(graph/instance parallel/scheduled-work-now
[[:pubsub broker sub-callbacks]
[:instance service-name] server-host pubsub-subscriber-port]
{:period-secs 60
:f (fn put-subscriber! []
(doseq [topic (bucket/keys sub-callbacks)]
(bucket/put
(topic-bucket broker topic) service-name
{:host server-host
:port pubsub-subscriber-port
:id service-name
:topic topic
:uri (str "/" topic)
:date (millis)})))})))
(defn sync! [subscribers-graph]
(parallel/refresh! (safe-get subscribers-graph :put-subscriber!)))
(defnk raw-subscriber
"A subscriber to a specific raw-topic (which will not be env-d)"
[subscriber*
[:instance service-name]
[:pubsub sub-callbacks]
pubsub-stats
raw-topic]
(bucket/update
sub-callbacks raw-topic
(fn [x]
(assert (not x))
[(fn [x] (pubsub-stats :sub raw-topic))]))
(sync! subscriber*)
(fn [f] (bucket/update sub-callbacks raw-topic #(conj % f))))
(def subscriber
"A subscriber to a topic, which will be auto-namespaced with env"
(graph/instance raw-subscriber [env topic]
{:raw-topic (env-topic-name topic env)}))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;; Publishing
(defnk publish-resource [init-spec drain drain-trash-size]
(let [spec-atom (atom init-spec)
f (fn [msgs] (client/serialized-post (client/build-url @spec-atom) msgs))]
(if drain
(let [df (accumulators/time-batching-fn
{:secs drain
:f f
:on-error (fn [t] (log/warn t {:message (str "Error publishing" (pr-str @spec-atom))}))
:max-queue-size drain-trash-size})]
{:spec-atom spec-atom
:f #(accumulators/offer-to df %)
:res df})
{:spec-atom spec-atom
:res nil
:f (comp f list)})))
(defn- mirror-subscribers! [broker pub-args pub-resources]
(doseq [[topic topic-pub-args] (bucket/seq pub-args)]
(let [broker-bucket (topic-bucket broker topic)
specs (for-map [[id spec] (bucket/seq broker-bucket)
:let [dead? (when (and (:date spec)
(< (:date spec) (new-time/time-ago 10 :minutes)))
(do (log/infof "Deleting old-ass subscriber %s %s" id spec)
(bucket/delete broker-bucket id)
true))]
:when (not dead?)]
id spec)]
;; ensure entries in local bucket
(doseq [[id spec] specs]
(when-not (contains? (bucket/get pub-resources topic) id)
(bucket/update
pub-resources topic
(fn-> (assoc id (publish-resource (assoc (bucket/get pub-args topic) :init-spec spec))))))
(reset! (safe-get-in (bucket/get pub-resources topic) [id :spec-atom]) spec))
;; delete stale entries
(doseq [[id pub-resource] (bucket/get pub-resources topic)]
(when-not (contains? specs id)
(log/infof "Deleting subscriber %s" id)
(resource/close (safe-get pub-resource :res))
(bucket/update pub-resources topic (fn-> (dissoc id))))))))
(def publishers-resources
"Resource shared between all publishers"
(graph/graph
:pub-args
bucket/bucket-resource ;; topic --> pub-args
:pub-resources
bucket/bucket-resource ;; topic --> {service-id {:f fn (optional-key :res) PCloseable :spec-atom ...}}
:shutdown-pub-resources
(fnk [pub-resources]
(reify resource/PCloseable
(close [this]
(doseq [[topic pub-map] (bucket/seq pub-resources)
[service-id pub-resource] pub-map
:let [res (safe-get pub-resource :res)]]
(when res
(err/?error
(format "Error shutting down publisher %s to %s" topic service-id)
(resource/close res)))))))
:mirror-subscribers!
(graph/instance parallel/scheduled-work-now
[[:pubsub broker] pub-args pub-resources {refresh 30}]
{:period-secs refresh
:f (fn [] (mirror-subscribers! broker pub-args pub-resources))})))
(defnk raw-publisher
"A publisher to a specific raw-topic (which will not be env-d)"
[[:pubsub sub-callbacks]
[:instance service-name]
[:publisher* pub-args pub-resources mirror-subscribers!]
pubsub-stats
raw-topic
{drain 10} {drain-trash-size nil} {refresh 30} {force-remote? false}]
(assert (not (bucket/get pub-args raw-topic)))
(bucket/put pub-args raw-topic {:drain drain :drain-trash-size drain-trash-size})
(parallel/refresh! mirror-subscribers!)
(fn [msg]
(pubsub-stats :pub raw-topic)
(when-not force-remote?
(doseq [sub (bucket/get sub-callbacks raw-topic)]
(sub msg)))
(doseq [[id pub-resource] (bucket/get pub-resources raw-topic)]
(when (and msg (or force-remote? (not= id service-name)))
((safe-get pub-resource :f) msg)))))
(def publisher
"A publisher to a topic, which will be auto-namespaced with env"
(graph/instance raw-publisher [env topic]
{:raw-topic (env-topic-name topic env)}))
;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;
;;; A few async tools for combining and transforming subscribers.
(defn- make-subscriber
"Make a subscriber from a function that takes its publish fn and somehow schedules
publications on it."
[sub-pub-fn!]
(let [my-subscribers (atom [])]
(sub-pub-fn! (fn [m] (doseq [s @my-subscribers] (s m))))
(fn [s] (swap! my-subscribers conj s))))
(defn subscriber-concat
"The asynchrounous version of concat. Takes a sequence of input subscribers, and produces
a subscriber that publishes one message for every message received on any input-subscriber."
[& input-subscribers]
(make-subscriber
(fn [pub-fn!]
(doseq [s input-subscribers]
(s pub-fn!)))))
| null | https://raw.githubusercontent.com/plumatic/grab-bag/a15e943322fbbf6f00790ce5614ba6f90de1a9b5/lib/web/src/web/pubsub.clj | clojure |
Private helpers
Broker (shared between pub and sub)
topic --> [callback]
Subscribing
Publishing
ensure entries in local bucket
delete stale entries
topic --> pub-args
topic --> {service-id {:f fn (optional-key :res) PCloseable :spec-atom ...}}
A few async tools for combining and transforming subscribers. | (ns web.pubsub
(:use plumbing.core)
(:require
[plumbing.accumulators :as accumulators]
[plumbing.error :as err]
[plumbing.graph :as graph]
[plumbing.logging :as log]
[plumbing.new-time :as new-time]
[plumbing.parallel :as parallel]
[plumbing.resource :as resource]
[store.bucket :as bucket]
[store.s3 :as s3]
[web.client :as client]
[web.data :as data]
[web.server :as server]))
(defn topic-bucket [bucket topic] (s3/sub-bucket bucket (str topic "/")))
(defn env-topic-name [topic env]
(str topic "-" (name env)))
(def broker-bucket
"Bucket where subscribers write files under topic/service-name with info
about where to reach them.
Not an env-bucket, since we actually want cross-env talk currently
(e.g. doc-poller prod --> index-master-stage).
Suffixed \"-prod\" only because of a historical mistake, which
will require a migration to fix."
(graph/instance bucket/bucket-resource []
{:type :s3
:name "grabbag-unreliable-pubsub-prod"}))
(def pubsub-resources
"Overall resource, to be included whenever pubbing or subbing is going on"
(graph/graph
:broker broker-bucket
))
(def subscribers-resources
"Resource shared between all subscribers"
(graph/graph
:pubsub-subscriber-port
(fnk [] (server/get-available-port))
:subscriber-server
(graph/instance server/server-resource [[:pubsub sub-callbacks] pubsub-subscriber-port]
{:port pubsub-subscriber-port
:join? false
:root-handler (fnk [uri body]
(let [^String raw-topic uri
topic (subs raw-topic 1)
body (data/decode-serialized-content body)]
(assert (.startsWith raw-topic "/"))
(if-let [callbacks (bucket/get sub-callbacks topic)]
(do (doseq [x body
c callbacks]
(c x))
{:body "\"OK\""})
(log/throw+ {:client-message "Topic not found"}))))})
:put-subscriber!
(graph/instance parallel/scheduled-work-now
[[:pubsub broker sub-callbacks]
[:instance service-name] server-host pubsub-subscriber-port]
{:period-secs 60
:f (fn put-subscriber! []
(doseq [topic (bucket/keys sub-callbacks)]
(bucket/put
(topic-bucket broker topic) service-name
{:host server-host
:port pubsub-subscriber-port
:id service-name
:topic topic
:uri (str "/" topic)
:date (millis)})))})))
(defn sync! [subscribers-graph]
(parallel/refresh! (safe-get subscribers-graph :put-subscriber!)))
(defnk raw-subscriber
"A subscriber to a specific raw-topic (which will not be env-d)"
[subscriber*
[:instance service-name]
[:pubsub sub-callbacks]
pubsub-stats
raw-topic]
(bucket/update
sub-callbacks raw-topic
(fn [x]
(assert (not x))
[(fn [x] (pubsub-stats :sub raw-topic))]))
(sync! subscriber*)
(fn [f] (bucket/update sub-callbacks raw-topic #(conj % f))))
(def subscriber
"A subscriber to a topic, which will be auto-namespaced with env"
(graph/instance raw-subscriber [env topic]
{:raw-topic (env-topic-name topic env)}))
(defnk publish-resource [init-spec drain drain-trash-size]
(let [spec-atom (atom init-spec)
f (fn [msgs] (client/serialized-post (client/build-url @spec-atom) msgs))]
(if drain
(let [df (accumulators/time-batching-fn
{:secs drain
:f f
:on-error (fn [t] (log/warn t {:message (str "Error publishing" (pr-str @spec-atom))}))
:max-queue-size drain-trash-size})]
{:spec-atom spec-atom
:f #(accumulators/offer-to df %)
:res df})
{:spec-atom spec-atom
:res nil
:f (comp f list)})))
(defn- mirror-subscribers! [broker pub-args pub-resources]
(doseq [[topic topic-pub-args] (bucket/seq pub-args)]
(let [broker-bucket (topic-bucket broker topic)
specs (for-map [[id spec] (bucket/seq broker-bucket)
:let [dead? (when (and (:date spec)
(< (:date spec) (new-time/time-ago 10 :minutes)))
(do (log/infof "Deleting old-ass subscriber %s %s" id spec)
(bucket/delete broker-bucket id)
true))]
:when (not dead?)]
id spec)]
(doseq [[id spec] specs]
(when-not (contains? (bucket/get pub-resources topic) id)
(bucket/update
pub-resources topic
(fn-> (assoc id (publish-resource (assoc (bucket/get pub-args topic) :init-spec spec))))))
(reset! (safe-get-in (bucket/get pub-resources topic) [id :spec-atom]) spec))
(doseq [[id pub-resource] (bucket/get pub-resources topic)]
(when-not (contains? specs id)
(log/infof "Deleting subscriber %s" id)
(resource/close (safe-get pub-resource :res))
(bucket/update pub-resources topic (fn-> (dissoc id))))))))
(def publishers-resources
"Resource shared between all publishers"
(graph/graph
:pub-args
:pub-resources
:shutdown-pub-resources
(fnk [pub-resources]
(reify resource/PCloseable
(close [this]
(doseq [[topic pub-map] (bucket/seq pub-resources)
[service-id pub-resource] pub-map
:let [res (safe-get pub-resource :res)]]
(when res
(err/?error
(format "Error shutting down publisher %s to %s" topic service-id)
(resource/close res)))))))
:mirror-subscribers!
(graph/instance parallel/scheduled-work-now
[[:pubsub broker] pub-args pub-resources {refresh 30}]
{:period-secs refresh
:f (fn [] (mirror-subscribers! broker pub-args pub-resources))})))
(defnk raw-publisher
"A publisher to a specific raw-topic (which will not be env-d)"
[[:pubsub sub-callbacks]
[:instance service-name]
[:publisher* pub-args pub-resources mirror-subscribers!]
pubsub-stats
raw-topic
{drain 10} {drain-trash-size nil} {refresh 30} {force-remote? false}]
(assert (not (bucket/get pub-args raw-topic)))
(bucket/put pub-args raw-topic {:drain drain :drain-trash-size drain-trash-size})
(parallel/refresh! mirror-subscribers!)
(fn [msg]
(pubsub-stats :pub raw-topic)
(when-not force-remote?
(doseq [sub (bucket/get sub-callbacks raw-topic)]
(sub msg)))
(doseq [[id pub-resource] (bucket/get pub-resources raw-topic)]
(when (and msg (or force-remote? (not= id service-name)))
((safe-get pub-resource :f) msg)))))
(def publisher
"A publisher to a topic, which will be auto-namespaced with env"
(graph/instance raw-publisher [env topic]
{:raw-topic (env-topic-name topic env)}))
(defn- make-subscriber
"Make a subscriber from a function that takes its publish fn and somehow schedules
publications on it."
[sub-pub-fn!]
(let [my-subscribers (atom [])]
(sub-pub-fn! (fn [m] (doseq [s @my-subscribers] (s m))))
(fn [s] (swap! my-subscribers conj s))))
(defn subscriber-concat
"The asynchrounous version of concat. Takes a sequence of input subscribers, and produces
a subscriber that publishes one message for every message received on any input-subscriber."
[& input-subscribers]
(make-subscriber
(fn [pub-fn!]
(doseq [s input-subscribers]
(s pub-fn!)))))
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.