175 Commits

Author SHA1 Message Date
Tom Alexander
482d5ecfa3 Switch to local-path-provisioner.
Some checks failed
format Build format has succeeded
clippy Build clippy has failed
rust-test Build rust-test has succeeded
foreign-document-test Build foreign-document-test has succeeded
2025-08-31 19:51:03 -04:00
Tom Alexander
84b8ddb582 Merge branch 'add_jump_to_line_number' 2025-02-01 21:55:43 -05:00
Tom Alexander
113bb5888a Add a test with a tramp link.
Some checks failed
rust-test Build rust-test has started
format Build format has succeeded
clippy Build clippy has failed
2025-02-01 19:19:05 -05:00
Tom Alexander
bf5fe6920b Add a test for jump to line number.
Some checks failed
format Build format has succeeded
clippy Build clippy has failed
rust-test Build rust-test has failed
2025-02-01 18:42:10 -05:00
Tom Alexander
4b52ed0d2a Fix clippy lint.
Some checks failed
format Build format has succeeded
clippy Build clippy has failed
foreign-document-test Build foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2024-10-21 00:25:45 -04:00
Tom Alexander
d2c558ccfa Merge branch 'buildkit'
Some checks failed
format Build format has succeeded
clippy Build clippy has failed
rust-test Build rust-test has succeeded
foreign-document-test Build foreign-document-test has succeeded
2024-10-21 00:13:02 -04:00
Tom Alexander
a01f78b510 Update dockerfiles to take advantage of BuildKit.
Some checks failed
format Build format has succeeded
clippy Build clippy has failed
rust-test Build rust-test has succeeded
2024-10-20 23:13:07 -04:00
Tom Alexander
d80b473fae Switch to using BuildKit instead of Kaniko to build docker images. 2024-10-20 22:55:22 -04:00
Tom Alexander
e6b4bc3d94 Merge branch 'webhook_bridge'
All checks were successful
format Build format has succeeded
clippy Build clippy has succeeded
foreign-document-test Build foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2024-09-30 17:34:52 -04:00
Tom Alexander
c6cde8db74 Switch to using webhook_bridge instead of lighthouse for triggering the CI. 2024-09-30 17:33:54 -04:00
Tom Alexander
841a348dd0 Publish version 0.1.16.
All checks were successful
format Build format has succeeded
rust-test Build rust-test has succeeded
build-organic Build build-organic has succeeded
clippy Build clippy has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
2024-04-11 21:34:22 -04:00
Tom Alexander
b46fae331b Fix clippy errors.
All checks were successful
format Build format has succeeded
clippy Build clippy has succeeded
build-organic Build build-organic has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2024-04-11 21:03:50 -04:00
Tom Alexander
7223e08df3 Merge branch 'fix_docker'
Some checks failed
format Build format has failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
build-organic Build build-organic has succeeded
2024-04-11 20:24:06 -04:00
Tom Alexander
8321f83dac Inline the foreign document test. 2024-04-11 20:21:50 -04:00
Tom Alexander
bd441a0293 Serialize the build job to try to reduce disruption on the cluster.
Some checks failed
rust-test Build rust-test has failed
format Build format has succeeded
clippy Build clippy has failed
build-organic Build build-organic has succeeded
2024-04-10 23:41:47 -04:00
Tom Alexander
f5a07e0d70 Reduce memory requests to make builds less disruptive.
Some checks failed
rust-test Build rust-test has failed
format Build format has succeeded
clippy Build clippy has failed
build-organic Build build-organic has succeeded
2024-04-10 23:25:06 -04:00
Tom Alexander
9d750ed5e1 Fix workflows for new targets.
Some checks failed
rust-test Build rust-test has failed
format Build format has succeeded
clippy Build clippy has failed
build-organic Build build-organic has succeeded
2024-04-10 22:48:04 -04:00
Tom Alexander
9f111fe445 Rework the makefiles.
Some checks failed
clippy Build clippy has failed
build-organic Build build-organic has failed
format Build format has failed
2024-04-10 20:48:11 -04:00
Tom Alexander
a4e433dab1 Inline build pipeline. 2024-04-06 21:44:54 -04:00
Tom Alexander
4e9f1e4fac Inline the format pipeline.
Some checks failed
rust-foreign-document-test Build rust-foreign-document-test has failed
rust-test Build rust-test has failed
rust-build Build rust-build has failed
format Build format has succeeded
clippy Build clippy has failed
2024-04-06 12:00:07 -04:00
Tom Alexander
4dee130873 Add cranelift.
Some checks failed
rust-test Build rust-test has failed
rust-foreign-document-test Build rust-foreign-document-test has failed
rust-build Build rust-build has failed
clippy Build clippy has failed
2024-04-06 11:45:28 -04:00
Tom Alexander
8e712532e1 Add an organic_development image.
Some checks failed
rust-foreign-document-test Build rust-foreign-document-test has failed
rust-test Build rust-test has failed
rust-build Build rust-build has failed
clippy Build clippy has failed
This image will be shared by CI jobs rather than having a separate image for each, mirroring the developments I've done in natter.
2024-04-06 11:39:47 -04:00
Tom Alexander
4b85236c5f Inline clippy pipeline spec.
Some checks failed
rust-foreign-document-test Build rust-foreign-document-test has failed
rust-test Build rust-test has failed
clippy Build clippy has failed
rust-build Build rust-build has failed
2024-04-06 11:17:54 -04:00
Tom Alexander
66f003e6fd Update docker images to latest alpine.
Some checks failed
rust-foreign-document-test Build rust-foreign-document-test has failed
rust-test Build rust-test has failed
clippy Build clippy has failed
rust-build Build rust-build has failed
2024-04-06 10:25:22 -04:00
Tom Alexander
b35a2d5f5a Fix debug assert.
Some checks failed
rustfmt Build rustfmt has failed
rust-foreign-document-test Build rust-foreign-document-test has failed
rust-test Build rust-test has failed
clippy Build clippy has failed
rust-build Build rust-build has failed
2024-04-06 10:12:06 -04:00
Tom Alexander
320b5f8568 Publish version 0.1.15.
All checks were successful
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
clippy Build clippy has succeeded
rustfmt Build rustfmt has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
2024-01-28 17:12:45 -05:00
Tom Alexander
99b2af6c99 Fix clippy. 2024-01-28 17:11:18 -05:00
Tom Alexander
6e71acdb7d Update README.
Some checks failed
clippy Build clippy has failed
rustfmt Build rustfmt has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2024-01-28 14:25:57 -05:00
Tom Alexander
8406d37991 Switch to using JSON for wasm.
Some checks failed
rust-build Build rust-build has failed
clippy Build clippy has failed
rustfmt Build rustfmt has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
serde_wasm_bindgen was silently dropping many attributes (I suspect this is triggered by serde's flatten), so this switches to serializing to JSON for passing values from wasm to js.
2024-01-27 16:13:17 -05:00
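The pattern described in the commit above can be sketched as follows. This is a minimal, dependency-free illustration, not Organic's real API: the struct and function names are hypothetical, and where the real crate would call serde_json, a tiny JSON formatter is hand-rolled here. The point is only the boundary change: instead of handing a structured value to serde_wasm_bindgen, serialize the whole node to a JSON string and let the JS side JSON.parse it.

```rust
// Hypothetical stand-in for one wasm AST node (not Organic's real type).
struct WasmParagraph {
    standard_properties: (usize, usize), // begin/end offsets into the source
    children: Vec<String>,               // simplified: real children are nodes
}

// In the real crate this would be serde_json::to_string; hand-rolled here
// so the sketch stays dependency-free.
fn to_json(node: &WasmParagraph) -> String {
    let children: Vec<String> = node
        .children
        .iter()
        .map(|c| format!("\"{}\"", c))
        .collect();
    format!(
        "{{\"ast-node\":\"paragraph\",\"begin\":{},\"end\":{},\"children\":[{}]}}",
        node.standard_properties.0,
        node.standard_properties.1,
        children.join(",")
    )
}

fn main() {
    let node = WasmParagraph {
        standard_properties: (0, 11),
        children: vec!["hello world".to_string()],
    };
    // The string crosses the wasm boundary intact; nothing is silently dropped.
    println!("{}", to_json(&node));
}
```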
Tom Alexander
64bb597908 Build bundler wasm target by default. 2024-01-26 21:17:40 -05:00
Tom Alexander
068864ea87 Publish version 0.1.14.
All checks were successful
clippy Build clippy has succeeded
rustfmt Build rustfmt has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2024-01-03 23:59:51 -05:00
Tom Alexander
03a3ddbd63 Merge branch 'wasm'
All checks were successful
clippy Build clippy has succeeded
rustfmt Build rustfmt has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2024-01-03 23:56:35 -05:00
Tom Alexander
122adee23b Hide the wasm module.
All checks were successful
clippy Build clippy has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2024-01-03 23:38:04 -05:00
Tom Alexander
556afecbb8 Hide the util module. 2024-01-03 23:04:47 -05:00
Tom Alexander
e4407cbdd1 Hide the event_count module.
All checks were successful
clippy Build clippy has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
By placing the code for the parse executable inside a module inside the organic library, we only need to expose the entrypoint publicly rather than all of the functions it calls. This hides the event_count module, but I will be expanding the practice to the rest of the code base shortly. This is important to avoid inadvertently promising semver stability for essentially-internal functions that exist for development tools.

It was the parse binary, not compare.
2024-01-03 21:17:44 -05:00
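The visibility pattern from the commit above can be sketched in a few lines. The module and function names here are illustrative, not Organic's actual ones: the binary's logic lives in a private module inside the library, and only a single entrypoint is exported, so internal helpers never become part of the crate's public (semver-relevant) API.

```rust
mod event_count {
    // pub(crate): callable from within the crate, invisible to downstream users,
    // so renaming or deleting it is never a semver-breaking change.
    pub(crate) fn count_events(input: &str) -> usize {
        input.lines().filter(|l| !l.trim().is_empty()).count()
    }
}

// The one public entrypoint the hypothetical `parse` binary would call.
pub fn parse_entrypoint(input: &str) -> usize {
    event_count::count_events(input)
}

fn main() {
    // "a", blank line, "b" -> two non-empty lines.
    assert_eq!(parse_entrypoint("a\n\nb\n"), 2);
}
```

A downstream crate can call `parse_entrypoint` but cannot name `event_count::count_events` at all, which is exactly the stability boundary the commit describes.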
Tom Alexander
f57d60dab0 Add a doc target to the Makefile. 2024-01-03 19:55:22 -05:00
Tom Alexander
0aa3939a75 Format. 2024-01-01 18:34:10 -05:00
Tom Alexander
52cb81e75e Cleanup.
All checks were successful
clippy Build clippy has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-31 12:02:02 -05:00
Tom Alexander
945121202d Remove wasm_test's dependency on compare module.
All checks were successful
clippy Build clippy has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-31 11:11:25 -05:00
Tom Alexander
f4e0dddd9d Fix clippy.
All checks were successful
clippy Build clippy has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-30 23:14:40 -05:00
Tom Alexander
6b62176fd0 Run cargo fix.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-30 22:22:32 -05:00
Tom Alexander
44483b4d54 Break util up into modules. 2023-12-30 22:19:16 -05:00
Tom Alexander
48d3de77fe Move elisp fact to util. 2023-12-30 21:37:25 -05:00
Tom Alexander
680b176501 Fix src block value.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has succeeded
2023-12-30 21:30:08 -05:00
Tom Alexander
dc0338e978 Handle nil in object tree. 2023-12-30 21:28:25 -05:00
Tom Alexander
ff3e0a50af Implement dynamic block.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-30 20:56:35 -05:00
Tom Alexander
03c8c07fe0 Implement quote block. 2023-12-30 20:53:20 -05:00
Tom Alexander
3a6fc5b669 Support noop on all token types. 2023-12-30 20:50:28 -05:00
Tom Alexander
d258cdb839 Support null vs noop comparison. 2023-12-30 20:47:17 -05:00
Tom Alexander
aa5629354e Implement special block. 2023-12-30 20:43:01 -05:00
Tom Alexander
efc4a04829 Implement center block. 2023-12-30 20:38:08 -05:00
Tom Alexander
dd611ea64a Fix plain list item. 2023-12-30 20:35:27 -05:00
Tom Alexander
4bd5f3bec7 Implement node property. 2023-12-30 19:01:07 -05:00
Tom Alexander
c2b3509b6a Implement property drawer. 2023-12-30 18:59:52 -05:00
Tom Alexander
7f3f5fb889 Implement table cell.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-30 18:52:52 -05:00
Tom Alexander
e0fbf17226 Implement table row. 2023-12-30 18:52:51 -05:00
Tom Alexander
4e18cbafba Implement table. 2023-12-30 18:52:51 -05:00
Tom Alexander
46c36d7f3e Implement babel call. 2023-12-30 18:15:58 -05:00
Tom Alexander
c46a935cfc Implement clock.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-30 18:12:09 -05:00
Tom Alexander
f50415cb32 Implement drawer. 2023-12-30 18:05:46 -05:00
Tom Alexander
4f1a151e97 Implement diary sexp. 2023-12-30 18:03:57 -05:00
Tom Alexander
c8e3fdba51 Implement horizontal rule. 2023-12-30 18:02:38 -05:00
Tom Alexander
4b3fc20c62 Fix order of reading optional pair values from elisp. 2023-12-30 18:00:26 -05:00
Tom Alexander
3131f8ac64 Implement example block and export block. 2023-12-30 17:55:56 -05:00
Tom Alexander
60a4835590 Implement comment block. 2023-12-30 17:44:32 -05:00
Tom Alexander
172d72aa46 Implement src block.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-30 17:40:15 -05:00
Tom Alexander
b4fcc6500b Implement verse block. 2023-12-30 16:47:24 -05:00
Tom Alexander
ddb6f31562 Implement angle link. 2023-12-30 16:41:55 -05:00
Tom Alexander
dc080b30fc Implement citation reference. 2023-12-30 16:35:01 -05:00
Tom Alexander
9901e17437 Implement citation. 2023-12-30 16:33:02 -05:00
Tom Alexander
ea000894f0 Implement entity. 2023-12-30 16:24:51 -05:00
Tom Alexander
e7742b529a Implement export snippet.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-30 16:18:08 -05:00
Tom Alexander
8eba0c4923 Implement footnote definition. 2023-12-30 16:13:54 -05:00
Tom Alexander
e0c0070a13 Implement footnote reference. 2023-12-30 16:05:41 -05:00
Tom Alexander
65ce116998 Implement inline babel call. 2023-12-30 15:52:48 -05:00
Tom Alexander
e348e7d4e3 Implement inline source block. 2023-12-30 13:13:35 -05:00
Tom Alexander
492090470c Implement latex environment. 2023-12-30 13:07:16 -05:00
Tom Alexander
3ec900c8df Implement latex fragment.
Some checks failed
clippy Build clippy has failed
rust-build Build rust-build has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has failed
2023-12-30 13:00:07 -05:00
Tom Alexander
d0a008ed22 Implement org macro. 2023-12-30 13:00:07 -05:00
Tom Alexander
f2292f1c07 Implement target. 2023-12-30 12:22:23 -05:00
Tom Alexander
44392cfcca Implement radio target. 2023-12-30 12:17:04 -05:00
Tom Alexander
110630d230 Implement radio link and regular link. 2023-12-30 12:14:03 -05:00
Tom Alexander
ebe12d96c1 Implement subscript and superscript.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-29 23:46:47 -05:00
Tom Alexander
24c8ac8e21 Implement all the text markup.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-29 23:41:15 -05:00
Tom Alexander
259ad6e242 Implement line break. 2023-12-29 23:27:37 -05:00
Tom Alexander
dd1f7c7777 Support a no-op for headline pre-blank.
Some checks failed
clippy Build clippy has failed
rust-build Build rust-build has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has failed
2023-12-29 23:21:30 -05:00
Tom Alexander
c1b471208d Implement plain list item.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-29 23:06:45 -05:00
Tom Alexander
606bab9e6d Fix handling of optval. 2023-12-29 22:58:32 -05:00
Tom Alexander
0edf5620a2 Implement plain list. 2023-12-29 22:04:34 -05:00
Tom Alexander
cdf87641c5 Implement comment. 2023-12-29 21:59:45 -05:00
Tom Alexander
eb2995dd3b Support list with empty string as only element for empty list. 2023-12-29 21:56:31 -05:00
Tom Alexander
cd6a64c015 Implement keyword. 2023-12-29 21:36:52 -05:00
Tom Alexander
a4a83d047d Fix node name getting chopped off.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has failed
2023-12-29 21:33:17 -05:00
Tom Alexander
a4414369ce Remove unnecessary additional properties in the already-implemented types. 2023-12-29 21:04:31 -05:00
Tom Alexander
83e4b72307 Implement timestamp. 2023-12-29 20:55:01 -05:00
Tom Alexander
34b3e4fa7b Implement statistics cookie. 2023-12-29 20:26:12 -05:00
Tom Alexander
c0e879dc1e Implement headline. 2023-12-29 20:26:11 -05:00
Tom Alexander
fa31b001f4 Implement fixed width area. 2023-12-29 19:21:35 -05:00
Tom Alexander
0897061ff6 Add wasm tests to the CI. 2023-12-29 19:07:07 -05:00
Tom Alexander
28a3e1bc7b Implement bold. 2023-12-29 18:56:29 -05:00
Tom Alexander
3fd3d20722 Merge branch 'test_wasm_json' into wasm
Some checks failed
clippy Build clippy has failed
rust-build Build rust-build has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2023-12-29 18:54:58 -05:00
Tom Alexander
90735586b5 Add special case for object trees. 2023-12-29 18:54:41 -05:00
Tom Alexander
78befc7665 Remove old code. 2023-12-29 17:31:14 -05:00
Tom Alexander
ef549d3b19 Compare quoted strings. 2023-12-29 17:29:13 -05:00
Tom Alexander
777c756a7f Compare plain text AST nodes. 2023-12-29 17:24:38 -05:00
Tom Alexander
037caf369c Standardize parameter order. 2023-12-29 16:56:02 -05:00
Tom Alexander
54085b5833 Implement compare optional pair. 2023-12-29 16:51:52 -05:00
Tom Alexander
2bfa8e59e7 Add code to compare children. 2023-12-29 16:06:07 -05:00
Tom Alexander
5d31db39a4 Remove some underscores from wasm schema to match elisp. 2023-12-29 15:41:41 -05:00
Tom Alexander
adcd0de7e4 Compare standard properties. 2023-12-29 15:38:18 -05:00
Tom Alexander
c2f9789a64 Placeholder for comparing quoted strings. 2023-12-29 15:09:54 -05:00
Tom Alexander
579cbb5d11 Switch everything over to the new to_wasm macro. 2023-12-29 15:03:36 -05:00
Tom Alexander
cad2be43bf Implement a new to_wasm macro that uses the WasmAstNodeWrapper. 2023-12-29 14:06:10 -05:00
Tom Alexander
a0a4f0eb90 Remove lifetimes from wasm ast nodes. 2023-12-29 12:49:43 -05:00
Tom Alexander
9f4f8e79ce Implement a wrapper type for AST nodes.
This makes it impossible for real attribute names to collide with the attributes I've added for structure (like children and ast_node).
2023-12-29 11:58:46 -05:00
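The wrapper idea above can be sketched like this. Field names beyond `ast_node` and `children` are illustrative, and in the real crate the inner value would be merged into the serialized output via serde's flatten; the derive is omitted here so the sketch stays dependency-free. Structural fields live on the wrapper while a node's real attributes live on the inner type, so the two namespaces can never collide.

```rust
// Hypothetical wrapper: structure-only fields live here.
struct WasmAstNodeWrapper<T> {
    ast_node: String,      // structural: the node's type tag
    children: Vec<String>, // structural: child nodes (simplified to strings)
    properties: T,         // the node's own attributes, kept in a separate namespace
}

// A node type is free to use any attribute names it wants, even ones that
// would otherwise clash with structural names.
struct Bold {
    post_blank: usize,
}

fn main() {
    let node = WasmAstNodeWrapper {
        ast_node: "bold".to_string(),
        children: vec!["text".to_string()],
        properties: Bold { post_blank: 0 },
    };
    assert_eq!(node.ast_node, "bold");
    assert_eq!(node.children.len(), 1);
    assert_eq!(node.properties.post_blank, 0);
}
```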
Tom Alexander
77e0dbb42e Start working on a version of compare based on json values.
This will be a better test because it verifies that what we export to JSON is equivalent to the elisp AST generated by emacs. With these tests in place, we could also confidently convert the wasm structure to elisp.
2023-12-29 11:37:30 -05:00
Tom Alexander
eff5cdbf40 Flatten some structures.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-29 10:04:59 -05:00
Tom Alexander
eef3571299 Add compare logic for optional pair.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-27 21:23:06 -05:00
Tom Alexander
f227d8405e Implement compare for list of quoted strings.
Some checks failed
clippy Build clippy has failed
rust-build Build rust-build has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2023-12-27 21:00:12 -05:00
Tom Alexander
9520e5814b Add conversion for affiliated keywords to wasm additional properties. 2023-12-27 20:36:35 -05:00
Tom Alexander
28ad4fd046 Add conversion to WasmAstNode for wasm Objects. 2023-12-27 19:53:07 -05:00
Tom Alexander
7626a69fa1 Add default implementations for WasmElispCompare. 2023-12-27 19:42:45 -05:00
Tom Alexander
121c0ce516 Move the logic functions into their own module. 2023-12-27 19:22:43 -05:00
Tom Alexander
5a64db98fe Move wasm diff structs to their own module. 2023-12-27 19:15:39 -05:00
Tom Alexander
abfae9c6c0 Compare section.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
rust-build Build rust-build has succeeded
2023-12-27 19:10:43 -05:00
Tom Alexander
5272e2f1b4 Start adding paragraph. 2023-12-27 18:47:59 -05:00
Tom Alexander
90d4b11922 Switch to a formatted print of the wasm compare status.
Some checks failed
clippy Build clippy has failed
rust-build Build rust-build has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2023-12-27 18:39:10 -05:00
Tom Alexander
d552ef6569 Compare the additional properties. 2023-12-27 18:20:23 -05:00
Tom Alexander
f050e9b6a8 Taking into account additional property names but not comparing their values. 2023-12-27 18:01:56 -05:00
Tom Alexander
a5e108bc37 Compare the standard properties.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-27 17:07:42 -05:00
Tom Alexander
58290515b5 Enable child checking. 2023-12-27 16:47:02 -05:00
Tom Alexander
423f65046e Record the property comparisons. 2023-12-27 16:40:55 -05:00
Tom Alexander
badeaf8246 Add compare for document category. 2023-12-27 16:34:04 -05:00
Tom Alexander
d38100581c Add a script to run the wasm test inside docker. 2023-12-27 16:32:06 -05:00
Tom Alexander
f4eff5ca56 Fix wasm build.
Some checks failed
clippy Build clippy has failed
rust-build Build rust-build has succeeded
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2023-12-27 16:00:16 -05:00
Tom Alexander
5b02c21ebf Progress on comparing properties in the wasm_compare macro. 2023-12-27 15:58:31 -05:00
Tom Alexander
5f1668702a Starting the wasm_compare macro.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has succeeded
2023-12-27 15:38:30 -05:00
Tom Alexander
1faaeeebf1 Simplify wasm diff result types.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-27 14:19:25 -05:00
Tom Alexander
20a7c89084 Improving WasmElispCompare. 2023-12-27 13:21:20 -05:00
Tom Alexander
e83417b243 Introducing a trait for running compares.
This should enable us to invoke compares without needing a reference ast node type.
2023-12-27 12:38:21 -05:00
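A compare trait along the lines described above can be sketched as follows. The trait and method names are illustrative rather than Organic's real API: once the comparison lives behind a trait, a generic driver can compare any value against the reference elisp token without naming a concrete AST node type.

```rust
// Stand-in for a parsed elisp token from the reference output.
struct Token(String);

trait WasmElispCompare {
    fn compare(&self, elisp: &Token) -> bool;
}

impl WasmElispCompare for String {
    fn compare(&self, elisp: &Token) -> bool {
        *self == elisp.0
    }
}

impl WasmElispCompare for usize {
    fn compare(&self, elisp: &Token) -> bool {
        elisp.0.parse::<usize>().map(|n| n == *self).unwrap_or(false)
    }
}

// Generic driver: no reference to any concrete AST node type.
fn run_compare<T: WasmElispCompare>(value: &T, elisp: &Token) -> bool {
    value.compare(elisp)
}

fn main() {
    assert!(run_compare(&"bold".to_string(), &Token("bold".to_string())));
    assert!(run_compare(&42usize, &Token("42".to_string())));
    assert!(!run_compare(&7usize, &Token("42".to_string())));
}
```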
Tom Alexander
36b80dc093 Separate out rust parsing step to support references to values stored in the parsed state.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-27 12:24:21 -05:00
Tom Alexander
1812b1a56e Remove phantom data. 2023-12-27 12:24:21 -05:00
Tom Alexander
1a70b3d2c0 Add a lifetime for data in the parsed result but not from the source. 2023-12-27 12:24:21 -05:00
Tom Alexander
abf066701e Add category and path to WasmDocument. 2023-12-27 11:31:35 -05:00
Tom Alexander
4984ea4179 More of the test structure. 2023-12-27 11:10:40 -05:00
Tom Alexander
3cb251ea6c Move terminal colors to the shared util module. 2023-12-27 10:57:40 -05:00
Tom Alexander
4bfea41291 Add more structure to the wasm compare. 2023-12-27 10:52:59 -05:00
Tom Alexander
99376515ef Invoking wasm_compare_document.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-27 09:31:54 -05:00
Tom Alexander
23f4ba4205 Serialize to wasm during wasm compare.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has failed
rust-test Build rust-test has succeeded
2023-12-27 08:57:56 -05:00
Tom Alexander
55ad136283 Fix imports for wasm. 2023-12-27 08:49:34 -05:00
Tom Alexander
c717541099 Move the parsing of the elisp to the util module. 2023-12-27 08:46:18 -05:00
Tom Alexander
c2e921c2dc Move wasm test to a top-level module.
For some unknown reason, this makes rust-analyzer not angry.
2023-12-27 08:42:13 -05:00
Tom Alexander
e499169f0e Fix imports for wasm_test.
Some checks failed
clippy Build clippy has failed
rust-build Build rust-build has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-test Build rust-test has succeeded
2023-12-27 08:37:34 -05:00
Tom Alexander
84c088df67 Add wasm targets to the build test in the CI.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-27 08:04:03 -05:00
Tom Alexander
f210f95f99 Use a temporary folder for the builds. 2023-12-26 21:23:20 -05:00
Tom Alexander
17b81c7c72 Add a script to build every possible feature combination. 2023-12-26 21:05:40 -05:00
Tom Alexander
2911fce7cc Put util under the library. 2023-12-26 20:18:41 -05:00
Tom Alexander
e622d9fa6b Remove the old implementation of print_versions.
Some checks failed
clippy Build clippy has failed
rust-foreign-document-test Build rust-foreign-document-test has succeeded
rust-build Build rust-build has succeeded
rust-test Build rust-test has succeeded
2023-12-26 19:15:02 -05:00
Tom Alexander
8186fbb8b3 Move print_versions into a util crate. 2023-12-26 19:06:12 -05:00
Tom Alexander
68ccff74fa Outline for the wasm compare function. 2023-12-26 18:55:28 -05:00
Tom Alexander
9a13cb72c6 Make the wasm test binary async. 2023-12-25 14:32:01 -05:00
Tom Alexander
65abaa332f Separate out the wasm test into its own feature/binary. 2023-12-25 13:12:32 -05:00
Tom Alexander
67e5829fd9 Populating document's children. 2023-12-25 12:55:48 -05:00
Tom Alexander
995b41e697 Remove deserialize to support borrows. 2023-12-25 12:42:38 -05:00
Tom Alexander
eb51bdfe2f Add original field name to wasm macro. 2023-12-25 12:32:35 -05:00
Tom Alexander
bbb9ec637a Add code to test the wasm code path without actually dropping into wasm. 2023-12-25 12:14:50 -05:00
Tom Alexander
dc012b49f5 Add a generic WasmAstNode enum. 2023-12-25 11:51:39 -05:00
Tom Alexander
13863a68f7 Add placeholders for all the wasm ast nodes. 2023-12-25 11:33:43 -05:00
Tom Alexander
2962f76c81 Add lifetime to wasm objects. 2023-12-25 11:19:09 -05:00
Tom Alexander
b9b3ef6e74 Populate standard properties. 2023-12-25 10:47:10 -05:00
Tom Alexander
310ab2eab2 Add standard properties to wasm. 2023-12-24 15:26:45 -05:00
Tom Alexander
53320070da Define a wasm document. 2023-12-24 15:17:41 -05:00
Tom Alexander
2d5593681f Start defining the return type. 2023-12-24 13:02:34 -05:00
Tom Alexander
b3f97dbb40 Add wasm-bindgen. 2023-12-24 00:59:41 -05:00
Tom Alexander
a48d76321e Building basic wasm. 2023-12-24 00:47:32 -05:00
132 changed files with 7422 additions and 1714 deletions


@@ -2,3 +2,4 @@
target
Cargo.lock
notes/
.lighthouse/


@@ -1,191 +0,0 @@
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
  name: clippy
spec:
  pipelineSpec:
    params:
      - name: image-name
        description: The name for the built image
        type: string
      - name: path-to-image-context
        description: The path to the build context
        type: string
      - name: path-to-dockerfile
        description: The path to the Dockerfile
        type: string
      - name: GIT_USER_NAME
        description: The username for git
        type: string
        default: "fluxcdbot"
      - name: GIT_USER_EMAIL
        description: The email for git
        type: string
        default: "fluxcdbot@users.noreply.github.com"
    tasks:
      - name: do-stuff
        taskSpec:
          metadata: {}
          stepTemplate:
            image: alpine:3.18
            name: ""
            resources:
              requests:
                cpu: 10m
                memory: 600Mi
            workingDir: /workspace/source
          steps:
            - image: alpine:3.18
              name: do-stuff-step
              script: |
                #!/usr/bin/env sh
                echo "hello world"
      - name: report-pending
        taskRef:
          name: gitea-set-status
        runAfter:
          - fetch-repository
        params:
          - name: CONTEXT
            value: "$(params.JOB_NAME)"
          - name: REPO_FULL_NAME
            value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
          - name: GITEA_HOST_URL
            value: code.fizz.buzz
          - name: SHA
            value: "$(tasks.fetch-repository.results.commit)"
          - name: DESCRIPTION
            value: "Build $(params.JOB_NAME) has started"
          - name: STATE
            value: pending
          - name: TARGET_URL
            value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
      - name: fetch-repository
        taskRef:
          name: git-clone
        workspaces:
          - name: output
            workspace: git-source
        params:
          - name: url
            value: $(params.REPO_URL)
          - name: revision
            value: $(params.PULL_BASE_SHA)
          - name: deleteExisting
            value: "true"
      - name: build-image
        taskRef:
          name: kaniko
        params:
          - name: IMAGE
            value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
          - name: CONTEXT
            value: $(params.path-to-image-context)
          - name: DOCKERFILE
            value: $(params.path-to-dockerfile)
          - name: BUILDER_IMAGE
            value: "gcr.io/kaniko-project/executor:v1.12.1"
          - name: EXTRA_ARGS
            value:
              - --cache=true
              - --cache-copy-layers
              - --cache-repo=harbor.fizz.buzz/kanikocache/cache
              - --use-new-run # Should result in a speed-up
              - --reproducible # To remove timestamps so layer caching works.
              - --snapshot-mode=redo
              - --skip-unused-stages=true
              - --registry-mirror=dockerhub.dockerhub.svc.cluster.local
        workspaces:
          - name: source
            workspace: git-source
          - name: dockerconfig
            workspace: docker-credentials
        runAfter:
          - fetch-repository
      - name: clippy
        taskRef:
          name: run-docker-image
        workspaces:
          - name: source
            workspace: git-source
          - name: cargo-cache
            workspace: cargo-cache
        runAfter:
          - build-image
        params:
          - name: docker-image
            value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
    finally:
      - name: report-success
        when:
          - input: "$(tasks.status)"
            operator: in
            values: ["Succeeded", "Completed"]
        taskRef:
          name: gitea-set-status
        params:
          - name: CONTEXT
            value: "$(params.JOB_NAME)"
          - name: REPO_FULL_NAME
            value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
          - name: GITEA_HOST_URL
            value: code.fizz.buzz
          - name: SHA
            value: "$(tasks.fetch-repository.results.commit)"
          - name: DESCRIPTION
            value: "Build $(params.JOB_NAME) has succeeded"
          - name: STATE
            value: success
          - name: TARGET_URL
            value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
      - name: report-failure
        when:
          - input: "$(tasks.status)"
            operator: in
            values: ["Failed"]
        taskRef:
          name: gitea-set-status
        params:
          - name: CONTEXT
            value: "$(params.JOB_NAME)"
          - name: REPO_FULL_NAME
            value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
          - name: GITEA_HOST_URL
            value: code.fizz.buzz
          - name: SHA
            value: "$(tasks.fetch-repository.results.commit)"
          - name: DESCRIPTION
            value: "Build $(params.JOB_NAME) has failed"
          - name: STATE
            value: failure
          - name: TARGET_URL
            value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
    workspaces:
      - name: git-source
      - name: docker-credentials
  workspaces:
    - name: git-source
      volumeClaimTemplate:
        spec:
          storageClassName: "nfs-client"
          accessModes:
            - ReadWriteOnce
          resources:
            requests:
              storage: 10Gi
      subPath: rust-source
    - name: cargo-cache
      persistentVolumeClaim:
        claimName: organic-cargo-cache-clippy
    - name: docker-credentials
      secret:
        secretName: harbor-plain
  serviceAccountName: build-bot
  timeout: 240h0m0s
  params:
    - name: image-name
      value: "harbor.fizz.buzz/private/organic-clippy"
    - name: path-to-image-context
      value: docker/organic_clippy/
    - name: path-to-dockerfile
      value: docker/organic_clippy/Dockerfile


@@ -1,203 +0,0 @@
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
name: rust-foreign-document-test
spec:
pipelineSpec:
timeouts:
pipeline: "2h0m0s"
tasks: "1h0m40s"
finally: "0h30m0s"
params:
- name: image-name
description: The name for the built image
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: do-stuff
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
steps:
- image: alpine:3.18
name: do-stuff-step
script: |
#!/usr/bin/env sh
echo "hello world"
- name: report-pending
taskRef:
name: gitea-set-status
runAfter:
- fetch-repository
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
name: git-clone
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: build-image
taskRef:
name: kaniko
params:
- name: IMAGE
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: BUILDER_IMAGE
value: "gcr.io/kaniko-project/executor:v1.12.1"
- name: EXTRA_ARGS
value:
- --target=foreign-document-test
- --cache=true
- --cache-copy-layers
- --cache-repo=harbor.fizz.buzz/kanikocache/cache
- --use-new-run # Should result in a speed-up
- --reproducible # To remove timestamps so layer caching works.
- --snapshot-mode=redo
- --skip-unused-stages=true
- --registry-mirror=dockerhub.dockerhub.svc.cluster.local
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: run-image
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- build-image
params:
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
params:
- name: command
value: [cargo, cache, --autoclean]
- name: args
value: []
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "nfs-client"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-test-foreign-document
- name: docker-credentials
secret:
secretName: harbor-plain
serviceAccountName: build-bot
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-test-foreign-document"
- name: path-to-image-context
value: docker/organic_test/
- name: path-to-dockerfile
value: docker/organic_test/Dockerfile


@@ -1,284 +0,0 @@
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
name: rust-build
spec:
pipelineSpec:
params:
- name: image-name
description: The name for the built image
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: report-pending
taskRef:
name: gitea-set-status
runAfter:
- fetch-repository
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
name: git-clone
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: build-image
taskRef:
name: kaniko
params:
- name: IMAGE
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: BUILDER_IMAGE
value: "gcr.io/kaniko-project/executor:v1.12.1"
- name: EXTRA_ARGS
value:
- --cache=true
- --cache-copy-layers
- --cache-repo=harbor.fizz.buzz/kanikocache/cache
- --use-new-run # Should result in a speed-up
- --reproducible # To remove timestamps so layer caching works.
- --snapshot-mode=redo
- --skip-unused-stages=true
- --registry-mirror=dockerhub.dockerhub.svc.cluster.local
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: run-image-none
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- build-image
params:
- name: args
value: ["--no-default-features"]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: run-image-tracing
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- run-image-none
params:
- name: args
value: ["--no-default-features", "--features", "tracing"]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: run-image-compare
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- run-image-tracing
params:
- name: args
value: ["--no-default-features", "--features", "compare"]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: run-image-default
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- run-image-compare
params:
- name: args
value: []
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: run-image-tracing-compare
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- run-image-default
params:
- name: args
value: ["--no-default-features", "--features", "tracing,compare"]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: run-image-compare-foreign
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- run-image-tracing-compare
params:
- name: args
value:
[
"--no-default-features",
"--features",
"compare,foreign_document_test",
]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: run-image-all
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- run-image-compare-foreign
params:
- name: args
value:
[
"--no-default-features",
"--features",
"tracing,compare,foreign_document_test",
]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
params:
- name: command
value: [cargo, cache, --autoclean]
- name: args
value: []
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "nfs-client"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-build
- name: docker-credentials
secret:
secretName: harbor-plain
serviceAccountName: build-bot
timeout: 240h0m0s
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-build"
- name: path-to-image-context
value: docker/organic_build/
- name: path-to-dockerfile
value: docker/organic_build/Dockerfile


@@ -1,214 +0,0 @@
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
name: rust-test
spec:
pipelineSpec:
timeouts:
pipeline: "2h0m0s"
tasks: "1h0m40s"
finally: "0h30m0s"
params:
- name: image-name
description: The name for the built image
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: do-stuff
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
steps:
- image: alpine:3.18
name: do-stuff-step
script: |
#!/usr/bin/env sh
echo "hello world"
- name: report-pending
taskRef:
name: gitea-set-status
runAfter:
- fetch-repository
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
name: git-clone
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: build-image
taskRef:
name: kaniko
params:
- name: IMAGE
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: BUILDER_IMAGE
value: "gcr.io/kaniko-project/executor:v1.12.1"
- name: EXTRA_ARGS
value:
- --target=tester
- --cache=true
- --cache-copy-layers
- --cache-repo=harbor.fizz.buzz/kanikocache/cache
- --use-new-run # Should result in a speed-up
- --reproducible # To remove timestamps so layer caching works.
- --snapshot-mode=redo
- --skip-unused-stages=true
- --registry-mirror=dockerhub.dockerhub.svc.cluster.local
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: run-image
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- build-image
params:
- name: args
value:
[
--no-default-features,
--features,
compare,
--no-fail-fast,
--lib,
--test,
test_loader,
]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
params:
- name: command
value: [cargo, cache, --autoclean]
- name: args
value: []
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "nfs-client"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-test
- name: docker-credentials
secret:
secretName: harbor-plain
serviceAccountName: build-bot
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-test"
- name: path-to-image-context
value: docker/organic_test/
- name: path-to-dockerfile
value: docker/organic_test/Dockerfile


@@ -1,230 +0,0 @@
apiVersion: tekton.dev/v1beta1
kind: PipelineRun
metadata:
name: rustfmt
spec:
pipelineSpec:
params:
- name: image-name
description: The name for the built image
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
- name: GIT_USER_NAME
description: The username for git
type: string
default: "fluxcdbot"
- name: GIT_USER_EMAIL
description: The email for git
type: string
default: "fluxcdbot@users.noreply.github.com"
tasks:
- name: do-stuff
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
steps:
- image: alpine:3.18
name: do-stuff-step
script: |
#!/usr/bin/env sh
echo "hello world"
- name: report-pending
taskRef:
name: gitea-set-status
runAfter:
- fetch-repository
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
name: git-clone
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: build-image
taskRef:
name: kaniko
params:
- name: IMAGE
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: BUILDER_IMAGE
value: "gcr.io/kaniko-project/executor:v1.12.1"
- name: EXTRA_ARGS
value:
- --cache=true
- --cache-copy-layers
- --cache-repo=harbor.fizz.buzz/kanikocache/cache
- --use-new-run # Should result in a speed-up
- --reproducible # To remove timestamps so layer caching works.
- --snapshot-mode=redo
- --skip-unused-stages=true
- --registry-mirror=dockerhub.dockerhub.svc.cluster.local
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: rustfmt
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
runAfter:
- build-image
params:
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: cargo-fix
taskRef:
name: run-docker-image
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- rustfmt
params:
- name: command
value: ["cargo", "fix"]
- name: args
value: ["--allow-dirty"]
- name: docker-image
value: "$(params.image-name):$(tasks.fetch-repository.results.commit)"
- name: commit-changes
taskRef:
name: git-cli
params:
- name: GIT_USER_NAME
value: $(params.GIT_USER_NAME)
- name: GIT_USER_EMAIL
value: $(params.GIT_USER_EMAIL)
- name: GIT_SCRIPT
value: |
pwd
git config --global --add safe.directory /workspace/source
git_status=$(git status --porcelain)
if [ -n "$git_status" ]; then
git commit -a -m "CI: autofix rust code."
git push origin HEAD:$(params.PULL_BASE_REF)
else
echo "No changes to commit."
fi
workspaces:
- name: source
workspace: git-source
runAfter:
- cargo-fix
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
name: gitea-set-status
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
workspaces:
- name: git-source
- name: docker-credentials
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "nfs-client"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-fmt
- name: docker-credentials
secret:
secretName: harbor-plain
serviceAccountName: build-bot
timeout: 240h0m0s
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-fmt"
- name: path-to-image-context
value: docker/cargo_fmt/
- name: path-to-dockerfile
value: docker/cargo_fmt/Dockerfile


@@ -1,39 +0,0 @@
apiVersion: config.lighthouse.jenkins-x.io/v1alpha1
kind: TriggerConfig
spec:
postsubmits:
- name: rustfmt
source: "pipeline-rustfmt.yaml"
# Override https-based url from lighthouse events.
clone_uri: "git@code.fizz.buzz:talexander/organic.git"
branches:
- ^main$
- ^master$
- name: rust-test
source: "pipeline-rust-test.yaml"
# Override https-based url from lighthouse events.
clone_uri: "git@code.fizz.buzz:talexander/organic.git"
skip_branches:
      # We already run on every commit, so running again when semver tags are pushed causes needless double-processing.
- "^v[0-9]+\\.[0-9]+\\.[0-9]+$"
- name: rust-foreign-document-test
source: "pipeline-foreign-document-test.yaml"
# Override https-based url from lighthouse events.
clone_uri: "git@code.fizz.buzz:talexander/organic.git"
skip_branches:
      # We already run on every commit, so running again when semver tags are pushed causes needless double-processing.
- "^v[0-9]+\\.[0-9]+\\.[0-9]+$"
- name: rust-build
source: "pipeline-rust-build.yaml"
# Override https-based url from lighthouse events.
clone_uri: "git@code.fizz.buzz:talexander/organic.git"
skip_branches:
      # We already run on every commit, so running again when semver tags are pushed causes needless double-processing.
- "^v[0-9]+\\.[0-9]+\\.[0-9]+$"
- name: clippy
source: "pipeline-clippy.yaml"
# Override https-based url from lighthouse events.
clone_uri: "git@code.fizz.buzz:talexander/organic.git"
skip_branches:
      # We already run on every commit, so running again when semver tags are pushed causes needless double-processing.
- "^v[0-9]+\\.[0-9]+\\.[0-9]+$"

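The `skip_branches` entries in the TriggerConfig above use the pattern `^v[0-9]+\.[0-9]+\.[0-9]+$` (with the backslash doubled for YAML quoting) to suppress duplicate runs on semver tag pushes. A minimal sketch of how that pattern behaves, using hypothetical ref names:

```python
import re

# The semver-tag pattern from the TriggerConfig, written as a raw Python
# string (YAML's "\\." becomes "\." once the string is parsed).
semver_tag = re.compile(r"^v[0-9]+\.[0-9]+\.[0-9]+$")

# Plain release tags match, so those pushes are skipped.
assert semver_tag.match("v1.2.3")
assert semver_tag.match("v10.0.1")

# Branch names and pre-release tags do not match, so they still trigger CI.
assert not semver_tag.match("main")
assert not semver_tag.match("v1.2.3-rc1")
```

Because the pattern is anchored at both ends, only refs that are exactly a `v`-prefixed three-part version are skipped; anything with a suffix or prefix still runs.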

@@ -0,0 +1,701 @@
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
name: build
spec:
timeouts:
pipeline: "2h0m0s"
tasks: "1h0m0s"
finally: "0h30m0s"
taskRunTemplate:
serviceAccountName: build-bot
pipelineSpec:
params:
- name: image-name
description: The name for the built image
type: string
- name: target-name
description: The dockerfile target to build
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: report-pending
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/git-clone/0.9/git-clone.yaml
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: get-git-commit-time
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: "$(workspaces.repo.path)"
results:
- name: unix-time
description: The time of the git commit in unix timestamp format.
steps:
- image: alpine/git:v2.34.2
name: detect-tag-step
script: |
#!/usr/bin/env sh
set -euo pipefail
echo -n "$(git log -1 --pretty=%ct)" | tee $(results.unix-time.path)
workspaces:
- name: repo
workspace: git-source
runAfter:
- fetch-repository
- name: build-image
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/talexander/personal_tekton_catalog.git
- name: revision
value: 7ee31a185243ee6da13dcd26a592c585b64c80e5
- name: pathInRepo
value: task/buildkit-rootless-daemonless/0.1/buildkit-rootless-daemonless.yaml
params:
- name: OUTPUT
value: >-
type=image,"name=$(params.image-name):latest,$(params.image-name):$(tasks.fetch-repository.results.commit)",push=true,compression=zstd,compression-level=22,oci-mediatypes=true
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: EXTRA_ARGS
value:
- "--opt"
- "target=$(params.target-name)"
- --import-cache
- "type=registry,ref=$(params.image-name):buildcache"
- --export-cache
- "type=registry,ref=$(params.image-name):buildcache,mode=max,compression=zstd,compression-level=22,rewrite-timestamp=true,image-manifest=true,oci-mediatypes=true"
- --opt
- build-arg:SOURCE_DATE_EPOCH=$(tasks.get-git-commit-time.results.unix-time)
- name: BUILDKITD_TOML
value: |
debug = true
[registry."docker.io"]
mirrors = ["dockerhub.dockerhub.svc.cluster.local"]
[registry."dockerhub.dockerhub.svc.cluster.local"]
http = true
insecure = true
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
#############
- name: run-image-none
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args: ["--no-default-features"]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: none
runAfter:
- build-image
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-tracing
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args: ["--no-default-features", "--features", "tracing"]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: tracing
runAfter:
- run-image-none
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-compare
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args: ["--no-default-features", "--features", "compare"]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: compare
runAfter:
- run-image-tracing
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-default
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args: []
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: default
runAfter:
- run-image-compare
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-tracing-compare
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args: ["--no-default-features", "--features", "tracing,compare"]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: tracing-compare
runAfter:
- run-image-default
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-compare-foreign
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
resources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args:
[
"--no-default-features",
"--features",
"compare,foreign_document_test",
]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: compare-foreign
runAfter:
- run-image-tracing-compare
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-all
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
computeResources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args:
[
"--no-default-features",
"--features",
"tracing,compare,foreign_document_test",
]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: all
runAfter:
- run-image-compare-foreign
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-wasm
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
computeResources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args:
[
"--target",
"wasm32-unknown-unknown",
"--profile",
"wasm",
"--bin",
"wasm",
"--no-default-features",
"--features",
"wasm",
]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: wasm
runAfter:
- run-image-all
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
- name: run-image-wasm-test
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
stepTemplate:
image: alpine:3.18
name: ""
computeResources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "build"]
args:
[
"--bin",
"wasm_test",
"--no-default-features",
"--features",
"wasm_test",
]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: wasm-test
runAfter:
- run-image-wasm
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
#############
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.18
- name: cache-subdir
type: string
description: subPath used in the persistent volume for the cargo cache.
stepTemplate:
image: alpine:3.18
name: ""
computeResources:
requests:
cpu: 10m
memory: 60Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: [cargo, cache, --autoclean]
args: []
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
subPath: $(params.cache-subdir)
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
- name: cache-subdir
value: none
# matrix:
# params:
# - name: cache-subdir
# value:
# - none
# - tracing
# - compare
# - default
# - tracing-compare
# - compare-foreign
# - all
# - wasm
# - wasm-test
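# A sketch (untested assumption) of how the commented-out matrix above
# could fan cargo-cache-autoclean out across every cache subPath, replacing
# the static cache-subdir param below:
#
#   - name: cargo-cache-autoclean
#     matrix:
#       params:
#         - name: cache-subdir
#           value: [none, tracing, compare, default, tracing-compare,
#                   compare-foreign, all, wasm, wasm-test]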
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "local-path"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-build
- name: docker-credentials
secret:
secretName: harbor-plain
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-development-wasm"
- name: target-name
value: "wasm"
- name: path-to-image-context
value: .
- name: path-to-dockerfile
value: docker/organic_development/

---
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
name: foreign-document-test
spec:
timeouts:
pipeline: "2h0m0s"
tasks: "1h0m40s"
finally: "0h30m0s"
taskRunTemplate:
serviceAccountName: build-bot
pipelineSpec:
params:
- name: image-name
description: The name for the built image
type: string
- name: target-name
description: The dockerfile target to build
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: report-pending
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/git-clone/0.9/git-clone.yaml
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: get-git-commit-time
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: "$(workspaces.repo.path)"
results:
- name: unix-time
description: The git commit time as a Unix timestamp.
steps:
- image: alpine/git:v2.34.2
name: detect-tag-step
script: |
#!/usr/bin/env sh
set -euo pipefail
echo -n "$(git log -1 --pretty=%ct)" | tee $(results.unix-time.path)
workspaces:
- name: repo
workspace: git-source
runAfter:
- fetch-repository
- name: build-image
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/talexander/personal_tekton_catalog.git
- name: revision
value: 7ee31a185243ee6da13dcd26a592c585b64c80e5
- name: pathInRepo
value: task/buildkit-rootless-daemonless/0.1/buildkit-rootless-daemonless.yaml
params:
- name: OUTPUT
value: >-
type=image,"name=$(params.image-name):latest,$(params.image-name):$(tasks.fetch-repository.results.commit)",push=true,compression=zstd,compression-level=22,oci-mediatypes=true
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: EXTRA_ARGS
value:
- "--opt"
- "target=$(params.target-name)"
- --import-cache
- "type=registry,ref=$(params.image-name):buildcache"
- --export-cache
- "type=registry,ref=$(params.image-name):buildcache,mode=max,compression=zstd,compression-level=22,rewrite-timestamp=true,image-manifest=true,oci-mediatypes=true"
- --opt
- build-arg:SOURCE_DATE_EPOCH=$(tasks.get-git-commit-time.results.unix-time)
- name: BUILDKITD_TOML
value: |
debug = true
[registry."docker.io"]
mirrors = ["dockerhub.dockerhub.svc.cluster.local"]
[registry."dockerhub.dockerhub.svc.cluster.local"]
http = true
insecure = true
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: run-test
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- build-image
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: [cargo, cache, --autoclean]
args: []
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "local-path"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-test-foreign-document
- name: docker-credentials
secret:
secretName: harbor-plain
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-test-foreign-document"
- name: target-name
value: "foreign-document"
- name: path-to-image-context
value: docker/organic_test/
- name: path-to-dockerfile
value: docker/organic_test/

---
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
name: rust-format
spec:
timeouts:
pipeline: "2h0m0s"
tasks: "1h0m0s"
finally: "0h30m0s"
taskRunTemplate:
serviceAccountName: build-bot
pipelineSpec:
params:
- name: image-name
description: The name for the built image
type: string
- name: target-name
description: The dockerfile target to build
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: report-pending
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/git-clone/0.9/git-clone.yaml
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: get-git-commit-time
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: "$(workspaces.repo.path)"
results:
- name: unix-time
description: The git commit time as a Unix timestamp.
steps:
- image: alpine/git:v2.34.2
name: detect-tag-step
script: |
#!/usr/bin/env sh
set -euo pipefail
echo -n "$(git log -1 --pretty=%ct)" | tee $(results.unix-time.path)
workspaces:
- name: repo
workspace: git-source
runAfter:
- fetch-repository
- name: build-image
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/talexander/personal_tekton_catalog.git
- name: revision
value: 7ee31a185243ee6da13dcd26a592c585b64c80e5
- name: pathInRepo
value: task/buildkit-rootless-daemonless/0.1/buildkit-rootless-daemonless.yaml
params:
- name: OUTPUT
value: >-
type=image,"name=$(params.image-name):latest,$(params.image-name):$(tasks.fetch-repository.results.commit)",push=true,compression=zstd,compression-level=22,oci-mediatypes=true
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: EXTRA_ARGS
value:
- "--opt"
- "target=$(params.target-name)"
- --import-cache
- "type=registry,ref=$(params.image-name):buildcache"
- --export-cache
- "type=registry,ref=$(params.image-name):buildcache,mode=max,compression=zstd,compression-level=22,rewrite-timestamp=true,image-manifest=true,oci-mediatypes=true"
- --opt
- build-arg:SOURCE_DATE_EPOCH=$(tasks.get-git-commit-time.results.unix-time)
- name: BUILDKITD_TOML
value: |
debug = true
[registry."docker.io"]
mirrors = ["dockerhub.dockerhub.svc.cluster.local"]
[registry."dockerhub.dockerhub.svc.cluster.local"]
http = true
insecure = true
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: run-cargo-fmt
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: ["cargo", "fmt"]
args: []
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- build-image
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
- name: commit-changes
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/git-cli/0.4/git-cli.yaml
params:
- name: GIT_USER_NAME
value: fluxcdbot
- name: GIT_USER_EMAIL
value: "fluxcdbot@users.noreply.github.com"
- name: GIT_SCRIPT
value: |
pwd
git config --global --add safe.directory /workspace/source
git_status=$(git status --porcelain)
if [ -n "$git_status" ]; then
git commit -a -m "CI: autofix rust code."
git push origin HEAD:$(params.PULL_BASE_REF)
else
echo "No changes to commit."
fi
workspaces:
- name: source
workspace: git-source
runAfter:
- run-cargo-fmt
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: [cargo, cache, --autoclean]
args: []
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "local-path"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-fmt
- name: docker-credentials
secret:
secretName: harbor-plain
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-development-format"
- name: target-name
value: "format"
- name: path-to-image-context
value: docker/organic_development/
- name: path-to-dockerfile
value: docker/organic_development/

---
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
name: rust-clippy
spec:
taskRunTemplate:
serviceAccountName: build-bot
timeouts:
pipeline: "2h0m0s"
tasks: "1h0m40s"
finally: "0h30m0s"
pipelineSpec:
params:
- name: image-name
description: The name for the built image
type: string
- name: target-name
description: The dockerfile target to build
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: report-pending
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/git-clone/0.9/git-clone.yaml
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: get-git-commit-time
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: "$(workspaces.repo.path)"
results:
- name: unix-time
description: The git commit time as a Unix timestamp.
steps:
- image: alpine/git:v2.34.2
name: detect-tag-step
script: |
#!/usr/bin/env sh
set -euo pipefail
echo -n "$(git log -1 --pretty=%ct)" | tee $(results.unix-time.path)
workspaces:
- name: repo
workspace: git-source
runAfter:
- fetch-repository
- name: build-image
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/talexander/personal_tekton_catalog.git
- name: revision
value: 7ee31a185243ee6da13dcd26a592c585b64c80e5
- name: pathInRepo
value: task/buildkit-rootless-daemonless/0.1/buildkit-rootless-daemonless.yaml
params:
- name: OUTPUT
value: >-
type=image,"name=$(params.image-name):latest,$(params.image-name):$(tasks.fetch-repository.results.commit)",push=true,compression=zstd,compression-level=22,oci-mediatypes=true
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: EXTRA_ARGS
value:
- "--opt"
- "target=$(params.target-name)"
- --import-cache
- "type=registry,ref=$(params.image-name):buildcache"
- --export-cache
- "type=registry,ref=$(params.image-name):buildcache,mode=max,compression=zstd,compression-level=22,rewrite-timestamp=true,image-manifest=true,oci-mediatypes=true"
- --opt
- build-arg:SOURCE_DATE_EPOCH=$(tasks.get-git-commit-time.results.unix-time)
- name: BUILDKITD_TOML
value: |
debug = true
[registry."docker.io"]
mirrors = ["dockerhub.dockerhub.svc.cluster.local"]
[registry."dockerhub.dockerhub.svc.cluster.local"]
http = true
insecure = true
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: run-cargo-clippy
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command:
[
"cargo",
"clippy",
"--no-deps",
"--all-targets",
"--all-features",
"--",
"-D",
"warnings",
]
args: []
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- build-image
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: [cargo, cache, --autoclean]
args: []
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "local-path"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-clippy
- name: docker-credentials
secret:
secretName: harbor-plain
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-development-clippy"
- name: target-name
value: "clippy"
- name: path-to-image-context
value: docker/organic_development/
- name: path-to-dockerfile
value: docker/organic_development/
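# The run-cargo-clippy task above can be reproduced locally with the same
# flags (a sketch; assumes a matching Rust toolchain is installed):
#
#   cargo clippy --no-deps --all-targets --all-features -- -D warnings
#
# "-D warnings" promotes every lint to an error, which is what makes this
# CI gate fail the build on any warning.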

---
apiVersion: tekton.dev/v1
kind: PipelineRun
metadata:
name: rust-test
spec:
timeouts:
pipeline: "2h0m0s"
tasks: "1h0m40s"
finally: "0h30m0s"
taskRunTemplate:
serviceAccountName: build-bot
pipelineSpec:
params:
- name: image-name
description: The name for the built image
type: string
- name: target-name
description: The dockerfile target to build
type: string
- name: path-to-image-context
description: The path to the build context
type: string
- name: path-to-dockerfile
description: The path to the Dockerfile
type: string
tasks:
- name: report-pending
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has started"
- name: STATE
value: pending
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: fetch-repository
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/git-clone/0.9/git-clone.yaml
workspaces:
- name: output
workspace: git-source
params:
- name: url
value: $(params.REPO_URL)
- name: revision
value: $(params.PULL_BASE_SHA)
- name: deleteExisting
value: "true"
- name: get-git-commit-time
taskSpec:
metadata: {}
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: "$(workspaces.repo.path)"
results:
- name: unix-time
description: The git commit time as a Unix timestamp.
steps:
- image: alpine/git:v2.34.2
name: detect-tag-step
script: |
#!/usr/bin/env sh
set -euo pipefail
echo -n "$(git log -1 --pretty=%ct)" | tee $(results.unix-time.path)
workspaces:
- name: repo
workspace: git-source
runAfter:
- fetch-repository
- name: build-image
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/talexander/personal_tekton_catalog.git
- name: revision
value: 7ee31a185243ee6da13dcd26a592c585b64c80e5
- name: pathInRepo
value: task/buildkit-rootless-daemonless/0.1/buildkit-rootless-daemonless.yaml
params:
- name: OUTPUT
value: >-
type=image,"name=$(params.image-name):latest,$(params.image-name):$(tasks.fetch-repository.results.commit)",push=true,compression=zstd,compression-level=22,oci-mediatypes=true
- name: CONTEXT
value: $(params.path-to-image-context)
- name: DOCKERFILE
value: $(params.path-to-dockerfile)
- name: EXTRA_ARGS
value:
- "--opt"
- "target=$(params.target-name)"
- --import-cache
- "type=registry,ref=$(params.image-name):buildcache"
- --export-cache
- "type=registry,ref=$(params.image-name):buildcache,mode=max,compression=zstd,compression-level=22,rewrite-timestamp=true,image-manifest=true,oci-mediatypes=true"
- --opt
- build-arg:SOURCE_DATE_EPOCH=$(tasks.get-git-commit-time.results.unix-time)
- name: BUILDKITD_TOML
value: |
debug = true
[registry."docker.io"]
mirrors = ["dockerhub.dockerhub.svc.cluster.local"]
[registry."dockerhub.dockerhub.svc.cluster.local"]
http = true
insecure = true
workspaces:
- name: source
workspace: git-source
- name: dockerconfig
workspace: docker-credentials
runAfter:
- fetch-repository
- name: run-cargo-test
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: [cargo, test]
args:
[
--no-default-features,
--features,
"compare,wasm_test",
--no-fail-fast,
--lib,
--test,
test_loader,
]
env:
- name: CARGO_TARGET_DIR
value: /target
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
runAfter:
- build-image
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
finally:
- name: report-success
when:
- input: "$(tasks.status)"
operator: in
values: ["Succeeded", "Completed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has succeeded"
- name: STATE
value: success
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: report-failure
when:
- input: "$(tasks.status)"
operator: in
values: ["Failed"]
taskRef:
resolver: git
params:
- name: url
value: https://code.fizz.buzz/mirror/catalog.git # mirror of https://github.com/tektoncd/catalog.git
- name: revision
value: df36b3853a5657fd883015cdbf07ad6466918acf
- name: pathInRepo
value: task/gitea-set-status/0.1/gitea-set-status.yaml
params:
- name: CONTEXT
value: "$(params.JOB_NAME)"
- name: REPO_FULL_NAME
value: "$(params.REPO_OWNER)/$(params.REPO_NAME)"
- name: GITEA_HOST_URL
value: code.fizz.buzz
- name: SHA
value: "$(tasks.fetch-repository.results.commit)"
- name: DESCRIPTION
value: "Build $(params.JOB_NAME) has failed"
- name: STATE
value: failure
- name: TARGET_URL
value: "https://tekton.fizz.buzz/#/namespaces/$(context.pipelineRun.namespace)/pipelineruns/$(context.pipelineRun.name)"
- name: cargo-cache-autoclean
taskSpec:
metadata: {}
params:
- name: docker-image
type: string
description: Docker image to run.
default: alpine:3.20
stepTemplate:
image: alpine:3.20
computeResources:
requests:
cpu: 10m
memory: 600Mi
workingDir: /workspace/source
workspaces:
- name: source
mountPath: /source
- name: cargo-cache
mountPath: /usr/local/cargo/registry
optional: true
steps:
- name: run
image: $(params.docker-image)
workingDir: "$(workspaces.source.path)"
command: [cargo, cache, --autoclean]
args: []
workspaces:
- name: source
workspace: git-source
- name: cargo-cache
workspace: cargo-cache
params:
- name: docker-image
value: "$(tasks.build-image.results.IMAGE_URL[1])"
workspaces:
- name: git-source
- name: docker-credentials
- name: cargo-cache
workspaces:
- name: git-source
volumeClaimTemplate:
spec:
storageClassName: "local-path"
accessModes:
- ReadWriteOnce
resources:
requests:
storage: 10Gi
subPath: rust-source
- name: cargo-cache
persistentVolumeClaim:
claimName: organic-cargo-cache-test
- name: docker-credentials
secret:
secretName: harbor-plain
params:
- name: image-name
value: "harbor.fizz.buzz/private/organic-test"
- name: target-name
value: "tester"
- name: path-to-image-context
value: docker/organic_test/
- name: path-to-dockerfile
value: docker/organic_test/


@@ -0,0 +1,31 @@
version = "0.0.1"
[[push]]
name = "rust-test"
source = "pipeline-rust-test.yaml"
clone_uri = "git@code.fizz.buzz:talexander/organic.git"
skip_branches = [ "^v[0-9]+\\.[0-9]+\\.[0-9]+$" ]
[[push]]
name = "foreign-document-test"
source = "pipeline-foreign-document-test.yaml"
clone_uri = "git@code.fizz.buzz:talexander/organic.git"
branches = [ "^main$", "^master$" ]
[[push]]
name = "clippy"
source = "pipeline-rust-clippy.yaml"
clone_uri = "git@code.fizz.buzz:talexander/organic.git"
skip_branches = [ "^v[0-9]+\\.[0-9]+\\.[0-9]+$" ]
[[push]]
name = "format"
source = "pipeline-format.yaml"
clone_uri = "git@code.fizz.buzz:talexander/organic.git"
skip_branches = [ "^v[0-9]+\\.[0-9]+\\.[0-9]+$" ]
[[push]]
name = "build"
source = "pipeline-build-hash.yaml"
clone_uri = "git@code.fizz.buzz:talexander/organic.git"
branches = [ "^main$", "^master$" ]
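The `skip_branches` patterns above are anchored regexes, so release tags such as `v0.1.16` are excluded from those pipelines while ordinary branch names still trigger them. A minimal sketch of how that pattern behaves (assuming POSIX `grep -E`; not part of the config):

```shell
# The version-tag pattern used by skip_branches above.
pattern='^v[0-9]+\.[0-9]+\.[0-9]+$'

# A release tag matches the pattern, so the pipeline is skipped.
echo 'v0.1.16' | grep -Eq "$pattern" && echo 'skipped'

# A branch name does not match, so the pipeline runs.
echo 'main' | grep -Eq "$pattern" || echo 'run'
```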


@@ -1,8 +1,9 @@
# cargo-features = ["profile-rustflags"]
cargo-features = ["codegen-backend"]
[package]
name = "organic"
version = "0.1.13"
version = "0.1.16"
authors = ["Tom Alexander <tom@fizz.buzz>"]
description = "An org-mode parser."
edition = "2021"
@@ -39,17 +40,33 @@ path = "src/lib.rs"
path = "src/bin_foreign_document_test.rs"
required-features = ["foreign_document_test"]
[[bin]]
name = "wasm"
path = "src/bin_wasm.rs"
required-features = ["wasm"]
[[bin]]
# This bin exists for development purposes only. The real target of this crate is the library.
name = "wasm_test"
path = "src/bin_wasm_test.rs"
required-features = ["wasm_test"]
[dependencies]
futures = { version = "0.3.28", optional = true }
gloo-utils = "0.2.0"
nom = "7.1.1"
opentelemetry = { version = "0.20.0", optional = true, default-features = false, features = ["trace", "rt-tokio"] }
opentelemetry-otlp = { version = "0.13.0", optional = true }
opentelemetry-semantic-conventions = { version = "0.12.0", optional = true }
serde = { version = "1.0.193", optional = true, features = ["derive"] }
serde-wasm-bindgen = { version = "0.6.3", optional = true }
serde_json = { version = "1.0.108", optional = true }
tokio = { version = "1.30.0", optional = true, default-features = false, features = ["rt", "rt-multi-thread"] }
tracing = { version = "0.1.37", optional = true }
tracing-opentelemetry = { version = "0.20.0", optional = true }
tracing-subscriber = { version = "0.3.17", optional = true, features = ["env-filter"] }
walkdir = { version = "2.3.3", optional = true }
wasm-bindgen = { version = "0.2.89", optional = true }
[build-dependencies]
walkdir = "2.3.3"
@@ -60,6 +77,8 @@ compare = ["tokio/process", "tokio/macros"]
foreign_document_test = ["compare", "dep:futures", "tokio/sync", "dep:walkdir", "tokio/process"]
tracing = ["dep:opentelemetry", "dep:opentelemetry-otlp", "dep:opentelemetry-semantic-conventions", "dep:tokio", "dep:tracing", "dep:tracing-opentelemetry", "dep:tracing-subscriber"]
event_count = []
wasm = ["dep:serde", "dep:wasm-bindgen", "dep:serde-wasm-bindgen"]
wasm_test = ["wasm", "dep:serde_json", "tokio/process", "tokio/macros"]
# Optimized build for any sort of release.
[profile.release-lto]
@@ -79,3 +98,15 @@ strip = "symbols"
inherits = "release"
lto = true
debug = true
[profile.wasm]
inherits = "release"
lto = true
strip = true
[profile.dev]
codegen-backend = "cranelift"
[profile.dev.package."*"]
codegen-backend = "llvm"
opt-level = 3


@@ -7,6 +7,7 @@ MAKEFLAGS += --no-builtin-rules
TESTJOBS := 4
OS:=$(shell uname -s)
RELEASEFLAGS :=
WASMTARGET := bundler # or web
ifeq ($(OS),Linux)
TESTJOBS:=$(shell nproc)
@@ -21,51 +22,83 @@ ifeq ($(origin .RECIPEPREFIX), undefined)
endif
.RECIPEPREFIX = >
.PHONY: help
help: ## List the available make targets.
> @grep -h "##" $(MAKEFILE_LIST) | grep -v grep | sed -E 's/^([^:]*): *## */\1: /'
.PHONY: build
build:
build: ## Make a debug build of the project.
> cargo build
.PHONY: release
release:
release: ## Make an optimized build of the project.
> cargo build --release $(RELEASEFLAGS)
.PHONY: wasm
wasm: ## Build the parser as wasm.
> cargo build --target=wasm32-unknown-unknown --profile wasm --bin wasm --features wasm
> wasm-bindgen --target $(WASMTARGET) --out-dir target/wasm32-unknown-unknown/js target/wasm32-unknown-unknown/wasm/wasm.wasm
.PHONY: clean
clean:
clean: ## Delete the built binaries.
> cargo clean
> $(MAKE) -C docker/organic_development TARGET=builder clean
> $(MAKE) -C docker/organic_development TARGET=format clean
> $(MAKE) -C docker/organic_development TARGET=clippy clean
> $(MAKE) -C docker/organic_development TARGET=wasm clean
> $(MAKE) -C docker/organic_test TARGET=tester build
.PHONY: format
format:
> $(MAKE) -C docker/cargo_fmt run
format: ## Format the code.
> cargo fmt
.PHONY: dockerclippy
dockerclippy:
> $(MAKE) -C docker/organic_clippy run
.PHONY: docker_format
docker_format: ## Format the code using docker.
> $(MAKE) -C docker/organic_development TARGET=format build
> docker run --rm -i -t --mount type=tmpfs,destination=/tmp -v "$(shell readlink -f .):/source" --workdir=/source --env CARGO_TARGET_DIR=/target -v "organic-cargo-registry:/usr/local/cargo/registry" organic-development-format cargo fmt
.PHONY: docker_clippy
docker_clippy: ## Lint the code using docker.
> $(MAKE) -C docker/organic_development TARGET=clippy build
> docker run --rm -i -t --mount type=tmpfs,destination=/tmp -v "$(shell readlink -f .):/source" --workdir=/source --env CARGO_TARGET_DIR=/target -v "organic-cargo-registry:/usr/local/cargo/registry" organic-development-clippy cargo clippy --no-deps --all-targets --all-features -- -D warnings
.PHONY: clippy
clippy:
clippy: ## Lint the code.
> cargo clippy --no-deps --all-targets --all-features -- -D warnings
.PHONY: test
test:
test: ## Run the test suite.
> cargo test --no-default-features --features compare --no-fail-fast --lib --test test_loader -- --test-threads $(TESTJOBS)
.PHONY: dockertest
dockertest:
> $(MAKE) -C docker/organic_test
.PHONY: doc
doc: ## Generate documentation.
> cargo doc --no-deps --open --lib --release --all-features
.PHONY: docker_test
docker_test: ## Run the test suite using docker.
> $(MAKE) -C docker/organic_test TARGET=tester build
> docker run --init --rm -i -t --read-only -v "$$(readlink -f ./):/source:ro" --mount type=tmpfs,destination=/tmp --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target -w /source organic-test --no-default-features --features compare --no-fail-fast --lib --test test_loader -- --test-threads $(TESTJOBS)
.PHONY: buildtest
buildtest:
.PHONY: docker_wasm_test
docker_wasm_test: ## Run the test suite with wasm tests.
> $(MAKE) -C docker/organic_test TARGET=tester build
> docker run --init --rm -i -t --read-only -v "$$(readlink -f ./):/source:ro" --mount type=tmpfs,destination=/tmp --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target -w /source organic-test --no-default-features --features compare,wasm_test --no-fail-fast --lib --test test_loader autogen_wasm_ -- --test-threads $(TESTJOBS)
.PHONY: build_test
build_test:
> cargo build --no-default-features
> cargo build --no-default-features --features compare
> cargo build --no-default-features --features tracing
> cargo build --no-default-features --features compare,tracing
> cargo build --no-default-features --features compare,foreign_document_test
> cargo build --no-default-features --features compare,tracing,foreign_document_test
> cargo build --target wasm32-unknown-unknown --profile wasm --bin wasm --no-default-features --features wasm
> cargo build --bin wasm_test --no-default-features --features wasm_test
.PHONY: foreign_document_test
foreign_document_test:
> $(MAKE) -C docker/organic_test run_foreign_document_test
> $(MAKE) -C docker/organic_test TARGET=foreign-document build
> docker run --init --rm -i -t --read-only -v "$$(readlink -f ./):/source:ro" --mount type=tmpfs,destination=/tmp --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target -w /source organic-test-foreign-document
.PHONY: dockerclean
dockerclean:


@@ -10,17 +10,16 @@ Currently, Organic parses most documents the same as the official org-mode parse
### Project Goals
- We aim to provide perfect parity with the emacs org-mode parser. In that regard, any document that parses differently between Emacs and Organic is considered a bug.
- The parser should have minimal dependencies. This should reduce effort w.r.t.: security audits, legal compliance, portability.
- The parser should be usable everywhere. In the interest of getting org-mode used in as many places as possible, this parser should be usable by everyone everywhere. This means:
- The parser should have minimal dependencies.
- The parser should be usable everywhere. In the interest of getting org used in as many places as possible, this parser should be usable by everyone everywhere. This means:
- It must have a permissive license.
- We will investigate compiling to WASM. This is an important goal of the project and will definitely happen, but only after the parser has a more stable API.
- It compiles both natively and to wasm.
- We will investigate compiling to a C library for native linking to other code. This is more of a maybe-goal for the project.
### Project Non-Goals
- This project will not include an elisp engine since that would drastically increase the complexity of the code. Any features requiring an elisp engine will not be implemented (for example, Emacs supports embedded eval expressions in documents but this parser will never support that).
- This project is exclusively an org-mode **parser**. This limits its scope to roughly the output of `(org-element-parse-buffer)`. It will not render org-mode documents in other formats like HTML or LaTeX.
### Project Maybe-Goals
- table.el support. Currently we support org-mode tables but org-mode also allows table.el tables. So far, their use in org-mode documents seems rather uncommon, so this is a low-priority feature.
- Document editing support. I do not anticipate any advanced editing features to make editing ergonomic, but it should be relatively easy to be able to parse an org-mode document and serialize it back into org-mode. This would enable cool features to be built on top of the library, like auto-formatters. To accomplish this feature, we'd have to capture all of the various separators and whitespace that we are currently simply throwing away. This would add many additional fields to the parsed structs and more noise to the parsers themselves, so I do not want to approach this feature until the parser is more complete, since it would make modifications and refactoring more difficult.
### Supported Versions
This project targets the version of Emacs and Org-mode that are built into the [organic-test docker image](docker/organic_test/Dockerfile). This is newer than the version of Org-mode that shipped with Emacs 29.1. The parser itself does not depend on Emacs or Org-mode though, so this only matters for development purposes when running the automated tests that compare against upstream Org-mode.


@@ -26,7 +26,7 @@ fn main() {
dir_entry.file_type().is_file()
&& Path::new(dir_entry.file_name())
.extension()
.map(|ext| ext.to_ascii_lowercase() == "org")
.map(|ext| ext.eq_ignore_ascii_case("org"))
.unwrap_or(false)
}
Err(_) => true,


@@ -1,6 +0,0 @@
FROM rustlang/rust:nightly-alpine3.17
RUN apk add --no-cache musl-dev
RUN rustup component add rustfmt
ENTRYPOINT ["cargo", "fmt"]


@@ -1,36 +0,0 @@
IMAGE_NAME:=cargo-fmt
# REMOTE_REPO:=harbor.fizz.buzz/private
.PHONY: all
all: build push
.PHONY: build
build:
docker build -t $(IMAGE_NAME) -f Dockerfile .
.PHONY: push
push:
ifdef REMOTE_REPO
docker tag $(IMAGE_NAME) $(REMOTE_REPO)/$(IMAGE_NAME)
docker push $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not pushing to a remote repo."
endif
.PHONY: clean
clean:
docker rmi $(IMAGE_NAME)
ifdef REMOTE_REPO
docker rmi $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not removing from remote repo."
endif
# NOTE: This target will write to folders underneath the git-root
.PHONY: run
run: build
docker run --rm --init --read-only --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source" --workdir=/source $(IMAGE_NAME)
.PHONY: shell
shell: build
docker run --rm -i -t --entrypoint /bin/sh --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source" --workdir=/source $(IMAGE_NAME)


@@ -1,6 +0,0 @@
FROM rustlang/rust:nightly-alpine3.17
RUN apk add --no-cache musl-dev
RUN cargo install --locked --no-default-features --features ci-autoclean cargo-cache
ENTRYPOINT ["cargo", "build"]


@@ -1,37 +0,0 @@
IMAGE_NAME:=organic-build
# REMOTE_REPO:=harbor.fizz.buzz/private
.PHONY: all
all: build push
.PHONY: build
build:
docker build -t $(IMAGE_NAME) -f Dockerfile .
.PHONY: push
push:
ifdef REMOTE_REPO
docker tag $(IMAGE_NAME) $(REMOTE_REPO)/$(IMAGE_NAME)
docker push $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not pushing to a remote repo."
endif
.PHONY: clean
clean:
docker rmi $(IMAGE_NAME)
ifdef REMOTE_REPO
docker rmi $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not removing from remote repo."
endif
docker volume rm cargo-cache
# NOTE: This target will write to folders underneath the git-root
.PHONY: run
run: build
docker run --rm --init --read-only --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source" --workdir=/source --mount source=cargo-cache,target=/usr/local/cargo/registry $(IMAGE_NAME)
.PHONY: shell
shell: build
docker run --rm -i -t --entrypoint /bin/sh --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source" --workdir=/source --mount source=cargo-cache,target=/usr/local/cargo/registry $(IMAGE_NAME)


@@ -1,5 +0,0 @@
FROM rustlang/rust:nightly-alpine3.17
RUN apk add --no-cache musl-dev
ENTRYPOINT ["cargo", "clippy", "--no-deps", "--all-targets", "--all-features", "--", "-D", "warnings"]


@@ -1,37 +0,0 @@
IMAGE_NAME:=organic-clippy
# REMOTE_REPO:=harbor.fizz.buzz/private
.PHONY: all
all: build push
.PHONY: build
build:
docker build -t $(IMAGE_NAME) -f Dockerfile .
.PHONY: push
push:
ifdef REMOTE_REPO
docker tag $(IMAGE_NAME) $(REMOTE_REPO)/$(IMAGE_NAME)
docker push $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not pushing to a remote repo."
endif
.PHONY: clean
clean:
docker rmi $(IMAGE_NAME)
ifdef REMOTE_REPO
docker rmi $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not removing from remote repo."
endif
docker volume rm rust-cache cargo-cache
# NOTE: This target will write to folders underneath the git-root
.PHONY: run
run: build
docker run --rm --init --read-only --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source:ro" --workdir=/source --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target $(IMAGE_NAME)
.PHONY: shell
shell: build
docker run --rm -i -t --entrypoint /bin/sh --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source:ro" --workdir=/source --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target $(IMAGE_NAME)


@@ -0,0 +1,20 @@
# syntax=docker/dockerfile:1
ARG ALPINE_VERSION="3.20"
FROM rustlang/rust:nightly-alpine$ALPINE_VERSION AS builder
RUN apk add --no-cache musl-dev
RUN --mount=type=tmpfs,target=/tmp --mount=type=cache,target=/usr/local/cargo/registry,sharing=locked cargo install --locked --no-default-features --features ci-autoclean cargo-cache
RUN rustup component add rustc-codegen-cranelift
FROM builder AS format
RUN rustup component add rustfmt
FROM builder AS clippy
RUN rustup component add clippy
FROM builder AS wasm
RUN rustup target add wasm32-unknown-unknown


@@ -0,0 +1,36 @@
SHELL := bash
.ONESHELL:
.SHELLFLAGS := -eu -o pipefail -c
.DELETE_ON_ERROR:
MAKEFLAGS += --warn-undefined-variables
MAKEFLAGS += --no-builtin-rules
ifeq ($(origin .RECIPEPREFIX), undefined)
$(error This Make does not support .RECIPEPREFIX. Please use GNU Make 4.0 or later)
endif
.RECIPEPREFIX = >
TARGET := builder
IMAGE_NAME := organic-development
ifneq ($(TARGET),builder)
IMAGE_NAME := $(IMAGE_NAME)-$(TARGET)
endif
.PHONY: help
help:
> @grep -h "##" $(MAKEFILE_LIST) | grep -v grep | sed -E 's/^([^:]*): *## */\1: /'
.PHONY: build
build: ## Build the docker image.
> docker build --tag $(IMAGE_NAME) --target=$(TARGET) --file Dockerfile .
> docker volume create organic-cargo-registry
.PHONY: shell
shell: ## Launch an interactive shell inside the docker image with the source repository mounted at /source.
shell: build
> docker run --rm -i -t --entrypoint /bin/sh --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source" --workdir=/source --env CARGO_TARGET_DIR=/target -v "organic-cargo-registry:/usr/local/cargo/registry" $(IMAGE_NAME)
.PHONY: clean
clean: ## Remove the docker image and volume.
> docker rmi $(IMAGE_NAME)
> docker volume rm organic-cargo-registry
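The `help` target above self-documents the Makefile by extracting `##` comments from its own rules. A standalone sketch of that grep/sed pipeline applied to one sample line:

```shell
# Extract the target name and its ## description, as the help target does.
line='build: ## Build the docker image.'
echo "$line" | grep '##' | grep -v grep | sed -E 's/^([^:]*): *## */\1: /'
```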


@@ -1,10 +1,25 @@
FROM alpine:3.17 AS build
# syntax=docker/dockerfile:1
ARG ALPINE_VERSION="3.20"
# ARG EMACS_REPO=https://git.savannah.gnu.org/git/emacs.git
ARG EMACS_REPO=https://code.fizz.buzz/mirror/emacs.git
ARG EMACS_VERSION=emacs-29.1
# ARG ORG_MODE_REPO=https://git.savannah.gnu.org/git/emacs/org-mode.git
ARG ORG_MODE_REPO=https://code.fizz.buzz/mirror/org-mode.git
ARG ORG_VERSION=abf5156096c06ee5aa05795c3dc5a065f76ada97
FROM alpine:$ALPINE_VERSION AS build
RUN apk add --no-cache build-base musl-dev git autoconf make texinfo gnutls-dev ncurses-dev gawk libgccjit-dev
FROM build AS build-emacs
ARG EMACS_VERSION=emacs-29.1
RUN git clone --depth 1 --branch $EMACS_VERSION https://git.savannah.gnu.org/git/emacs.git /root/emacs
ARG EMACS_VERSION
ARG EMACS_REPO
RUN git clone --depth 1 --branch $EMACS_VERSION $EMACS_REPO /root/emacs
WORKDIR /root/emacs
RUN mkdir /root/dist
RUN ./autogen.sh
@@ -14,23 +29,25 @@ RUN make DESTDIR="/root/dist" install
FROM build AS build-org-mode
ARG ORG_VERSION=abf5156096c06ee5aa05795c3dc5a065f76ada97
COPY --from=build-emacs /root/dist/ /
ARG ORG_VERSION
ARG ORG_MODE_REPO
COPY --link --from=build-emacs /root/dist/ /
RUN mkdir /root/dist
# Savannah does not allow fetching specific revisions, so we're going to have to put unnecessary load on their server by cloning main and then checking out the revision we want.
RUN git clone https://git.savannah.gnu.org/git/emacs/org-mode.git /root/org-mode && git -C /root/org-mode checkout $ORG_VERSION
# RUN mkdir /root/org-mode && git -C /root/org-mode init --initial-branch=main && git -C /root/org-mode remote add origin https://git.savannah.gnu.org/git/emacs/org-mode.git && git -C /root/org-mode fetch origin $ORG_VERSION && git -C /root/org-mode checkout FETCH_HEAD
RUN git clone $ORG_MODE_REPO /root/org-mode && git -C /root/org-mode checkout $ORG_VERSION
# RUN mkdir /root/org-mode && git -C /root/org-mode init --initial-branch=main && git -C /root/org-mode remote add origin $ORG_MODE_REPO && git -C /root/org-mode fetch origin $ORG_VERSION && git -C /root/org-mode checkout FETCH_HEAD
WORKDIR /root/org-mode
RUN make compile
RUN make DESTDIR="/root/dist" install
FROM rustlang/rust:nightly-alpine3.17 AS tester
FROM rustlang/rust:nightly-alpine$ALPINE_VERSION AS tester
ENV LANG=en_US.UTF-8
RUN apk add --no-cache musl-dev ncurses gnutls libgccjit
RUN cargo install --locked --no-default-features --features ci-autoclean cargo-cache
COPY --from=build-emacs /root/dist/ /
COPY --from=build-org-mode /root/dist/ /
RUN --mount=type=tmpfs,target=/tmp --mount=type=cache,target=/usr/local/cargo/registry,sharing=locked cargo install --locked --no-default-features --features ci-autoclean cargo-cache
RUN rustup component add rustc-codegen-cranelift
COPY --link --from=build-emacs /root/dist/ /
COPY --link --from=build-org-mode /root/dist/ /
ENTRYPOINT ["cargo", "test"]
@@ -100,13 +117,13 @@ RUN mkdir -p $LITERATE_BUILD_EMACS_PATH && git -C $LITERATE_BUILD_EMACS_PATH ini
# unused/aws.org contains invalid paths for setupfile which causes both upstream org-mode and Organic to error out.
RUN rm $LITERATE_BUILD_EMACS_PATH/unused/aws.org
FROM tester as foreign-document-test
FROM tester as foreign-document
RUN apk add --no-cache bash coreutils
RUN mkdir /foreign_documents
COPY --from=foreign-document-gather /foreign_documents/howardabrams /foreign_documents/howardabrams
COPY --from=foreign-document-gather /foreign_documents/doomemacs /foreign_documents/doomemacs
COPY --from=foreign-document-gather /foreign_documents/worg /foreign_documents/worg
COPY --from=foreign-document-gather /foreign_documents/literate_build_emacs /foreign_documents/literate_build_emacs
COPY --from=build-org-mode /root/org-mode /foreign_documents/org-mode
COPY --from=build-emacs /root/emacs /foreign_documents/emacs
COPY --link --from=foreign-document-gather /foreign_documents/howardabrams /foreign_documents/howardabrams
COPY --link --from=foreign-document-gather /foreign_documents/doomemacs /foreign_documents/doomemacs
COPY --link --from=foreign-document-gather /foreign_documents/worg /foreign_documents/worg
COPY --link --from=foreign-document-gather /foreign_documents/literate_build_emacs /foreign_documents/literate_build_emacs
COPY --link --from=build-org-mode /root/org-mode /foreign_documents/org-mode
COPY --link --from=build-emacs /root/emacs /foreign_documents/emacs
ENTRYPOINT ["cargo", "run", "--bin", "foreign_document_test", "--features", "compare,foreign_document_test", "--profile", "release-lto"]


@@ -1,44 +1,36 @@
IMAGE_NAME:=organic-test
# REMOTE_REPO:=harbor.fizz.buzz/private
SHELL := bash
.ONESHELL:
.SHELLFLAGS := -eu -o pipefail -c
.DELETE_ON_ERROR:
MAKEFLAGS += --warn-undefined-variables
MAKEFLAGS += --no-builtin-rules
.PHONY: all
all: build push
ifeq ($(origin .RECIPEPREFIX), undefined)
$(error This Make does not support .RECIPEPREFIX. Please use GNU Make 4.0 or later)
endif
.RECIPEPREFIX = >
TARGET := tester
IMAGE_NAME := organic-test
ifneq ($(TARGET),tester)
IMAGE_NAME := $(IMAGE_NAME)-$(TARGET)
endif
.PHONY: help
help:
> @grep -h "##" $(MAKEFILE_LIST) | grep -v grep | sed -E 's/^([^:]*): *## */\1: /'
.PHONY: build
build:
docker build -t $(IMAGE_NAME) -f Dockerfile --target tester .
.PHONY: build_foreign_document_test
build_foreign_document_test:
docker build -t $(IMAGE_NAME)-foreign-document -f Dockerfile --target foreign-document-test .
.PHONY: push
push:
ifdef REMOTE_REPO
docker tag $(IMAGE_NAME) $(REMOTE_REPO)/$(IMAGE_NAME)
docker push $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not pushing to a remote repo."
endif
.PHONY: clean
clean:
docker rmi $(IMAGE_NAME)
ifdef REMOTE_REPO
docker rmi $(REMOTE_REPO)/$(IMAGE_NAME)
else
@echo "REMOTE_REPO not defined, not removing from remote repo."
endif
docker volume rm rust-cache cargo-cache
.PHONY: run
run: build
docker run --rm --init --read-only --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source:ro" --workdir=/source --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target $(IMAGE_NAME) --no-default-features --features compare --no-fail-fast --lib --test test_loader
build: ## Build the docker image.
> docker build --tag $(IMAGE_NAME) --target=$(TARGET) --file Dockerfile .
> docker volume create organic-cargo-registry
.PHONY: shell
shell: ## Launch an interactive shell inside the docker image with the source repository mounted at /source.
shell: build
docker run --rm -i -t --entrypoint /bin/sh --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source:ro" --workdir=/source --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target $(IMAGE_NAME)
> docker run --rm -i -t --entrypoint /bin/sh --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source" --workdir=/source --env CARGO_TARGET_DIR=/target -v "organic-cargo-registry:/usr/local/cargo/registry" $(IMAGE_NAME)
.PHONY: run_foreign_document_test
run_foreign_document_test: build_foreign_document_test
docker run --rm --init --read-only --mount type=tmpfs,destination=/tmp -v "$$(readlink -f ../../):/source:ro" --workdir=/source --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target $(IMAGE_NAME)-foreign-document
.PHONY: clean
clean: ## Remove the docker image and volume.
> docker rmi $(IMAGE_NAME)
> docker volume rm organic-cargo-registry


@@ -0,0 +1 @@
[[file:simple.org::2]]


@@ -0,0 +1 @@
[[/ssh:admin@test.example:important/file.pdf]]


@@ -0,0 +1,76 @@
#!/usr/bin/env bash
#
# Time running a single parse without invoking a compare with emacs.
set -euo pipefail
IFS=$'\n\t'
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
: ${PROFILE:="debug"}
############## Setup #########################
function cleanup {
for f in "${folders[@]}"; do
log "Deleting $f"
rm -rf "$f"
done
}
folders=()
for sig in EXIT INT QUIT HUP TERM; do
trap "set +e; cleanup" "$sig"
done
function die {
local status_code="$1"
shift
(>&2 echo "${@}")
exit "$status_code"
}
function log {
(>&2 echo "${@}")
}
############## Program #########################
function main {
if [ "$#" -gt 0 ]; then
export CARGO_TARGET_DIR="$1"
else
local work_directory=$(mktemp -d -t 'organic.XXXXXX')
folders+=("$work_directory")
export CARGO_TARGET_DIR="$work_directory"
fi
local features=(compare foreign_document_test tracing event_count wasm wasm_test)
ENABLED_FEATURES= for_each_combination "${features[@]}"
}
function for_each_combination {
local additional_flags=()
if [ "$PROFILE" = "dev" ] || [ "$PROFILE" = "debug" ]; then
PROFILE="debug"
else
additional_flags+=(--profile "$PROFILE")
fi
local flag=$1
shift
if [ "$#" -gt 0 ]; then
ENABLED_FEATURES="$ENABLED_FEATURES" for_each_combination "${@}"
elif [ -z "$ENABLED_FEATURES" ]; then
(cd "$DIR/../" && printf "\n\n\n========== no features ==========\n\n\n" && set -x && cargo build "${additional_flags[@]}" --no-default-features)
else
(cd "$DIR/../" && printf "\n\n\n========== %s ==========\n\n\n" "${ENABLED_FEATURES:1}" && set -x && cargo build "${additional_flags[@]}" --no-default-features --features "${ENABLED_FEATURES:1}")
fi
ENABLED_FEATURES="$ENABLED_FEATURES,$flag"
if [ "$#" -gt 0 ]; then
ENABLED_FEATURES="$ENABLED_FEATURES" for_each_combination "${@}"
else
(cd "$DIR/../" && printf "\n\n\n========== %s ==========\n\n\n" "${ENABLED_FEATURES:1}" && set -x && cargo build "${additional_flags[@]}" --no-default-features --features "${ENABLED_FEATURES:1}")
fi
}
main "${@}"
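`for_each_combination` above recurses once with each feature disabled and once enabled, so the leaves of the recursion cover every subset of the feature list. With the six features declared in `main`, that is 2^6 = 64 `cargo build` invocations, including the no-features build. A quick sanity check of that count (plain POSIX arithmetic, not part of the script):

```shell
# Six features => 2^6 subsets, i.e. 64 build invocations.
n_features=6
echo $(( 1 << n_features ))
```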


@@ -0,0 +1,111 @@
#!/usr/bin/env bash
# Build the test container and run the wasm_test comparison inside it.
set -euo pipefail
IFS=$'\n\t'
DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
: "${SHELL:=NO}" # or YES to launch a shell instead of running the test
: "${TRACE:=NO}" # or YES to send traces to jaeger
: "${BACKTRACE:=NO}" # or YES to print a rust backtrace when panicking
: "${NO_COLOR:=}" # Set to anything to disable color output
: "${PROFILE:=debug}"
REALPATH=$(command -v uu-realpath || command -v realpath)
MAKE=$(command -v gmake || command -v make)
############## Setup #########################
function die {
local status_code="$1"
shift
(>&2 echo "${@}")
exit "$status_code"
}
function log {
(>&2 echo "${@}")
}
############## Program #########################
function main {
build_container
launch_container "${@}"
}
function build_container {
$MAKE -C "$DIR/../docker/organic_test"
}
function launch_container {
local additional_flags=()
local features=(wasm_test)
if [ "$NO_COLOR" != "" ]; then
additional_flags+=(--env "NO_COLOR=$NO_COLOR")
fi
if [ "$TRACE" = "YES" ]; then
# We use the host network so it can talk to jaeger hosted at 127.0.0.1
additional_flags+=(--network=host --env RUST_LOG=debug)
features+=(tracing)
fi
if [ "$SHELL" != "YES" ]; then
additional_flags+=(--read-only)
else
additional_flags+=(-t)
fi
if [ "$BACKTRACE" = "YES" ]; then
additional_flags+=(--env RUST_BACKTRACE=full)
fi
if [ "$SHELL" = "YES" ]; then
exec docker run "${additional_flags[@]}" --init --rm -i --mount type=tmpfs,destination=/tmp -v "/:/input:ro" -v "$($REALPATH "$DIR/../"):/source:ro" --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target -w /source --entrypoint "" organic-test /bin/sh
fi
local features_joined
features_joined=$(IFS=","; echo "${features[*]}")
local build_flags=()
if [ "$PROFILE" = "dev" ] || [ "$PROFILE" = "debug" ]; then
PROFILE="debug"
else
build_flags+=(--profile "$PROFILE")
fi
if [ $# -gt 0 ]; then
# If we passed in args, we need to forward them along
for path in "${@}"; do
local full_path
full_path=$($REALPATH "$path")
init_script=$(cat <<EOF
set -euo pipefail
IFS=\$'\n\t'
cargo build --bin wasm_test --no-default-features --features "$features_joined" ${build_flags[@]}
exec /target/${PROFILE}/wasm_test "/input${full_path}"
EOF
)
docker run "${additional_flags[@]}" --init --rm -i --mount type=tmpfs,destination=/tmp -v "/:/input:ro" -v "$($REALPATH "$DIR/../"):/source:ro" --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target -w /source --entrypoint "" organic-test sh -c "$init_script"
done
else
local current_directory init_script
current_directory=$(pwd)
init_script=$(cat <<EOF
set -euo pipefail
IFS=\$'\n\t'
cargo build --bin wasm_test --no-default-features --features "$features_joined" ${build_flags[@]}
cd /input${current_directory}
exec /target/${PROFILE}/wasm_test
EOF
)
docker run "${additional_flags[@]}" --init --rm -i --mount type=tmpfs,destination=/tmp -v "/:/input:ro" -v "$($REALPATH "$DIR/../"):/source:ro" --mount source=cargo-cache,target=/usr/local/cargo/registry --mount source=rust-cache,target=/target --env CARGO_TARGET_DIR=/target -w /source --entrypoint "" organic-test sh -c "$init_script"
fi
}
main "${@}"
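`features_joined=$(IFS=","; echo "${features[*]}")` joins the feature array with commas for `cargo build --features`. The same join in the project's own Rust (a sketch, not repository code):

```rust
fn main() {
    let features = ["wasm_test", "tracing"];
    // Same result as the script's `(IFS=","; echo "${features[*]}")`.
    let joined = features.join(",");
    assert_eq!(joined, "wasm_test,tracing");
    println!("--features {joined}");
}
```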


@@ -1,3 +1,4 @@
#![feature(exit_status_error)]
#![feature(round_char_boundary)]
#![feature(exact_size_is_empty)]
use std::io::Read;

src/bin_wasm.rs Normal file

@@ -0,0 +1,10 @@
use wasm_bindgen::prelude::wasm_bindgen;
#[wasm_bindgen]
pub fn parse_org(org_contents: &str) -> wasm_bindgen::JsValue {
organic::wasm_cli::parse_org(org_contents)
}
fn main() -> Result<(), Box<dyn std::error::Error>> {
Ok(())
}

src/bin_wasm_test.rs Normal file

@@ -0,0 +1,62 @@
#![feature(exact_size_is_empty)]
#![feature(exit_status_error)]
use std::io::Read;
use organic::wasm_test::wasm_run_anonymous_compare;
use organic::wasm_test::wasm_run_compare_on_file;
#[cfg(feature = "tracing")]
use crate::init_tracing::init_telemetry;
#[cfg(feature = "tracing")]
use crate::init_tracing::shutdown_telemetry;
#[cfg(feature = "tracing")]
mod init_tracing;
#[cfg(not(feature = "tracing"))]
fn main() -> Result<(), Box<dyn std::error::Error>> {
let rt = tokio::runtime::Runtime::new()?;
rt.block_on(async {
let main_body_result = main_body().await;
main_body_result
})
}
#[cfg(feature = "tracing")]
fn main() -> Result<(), Box<dyn std::error::Error>> {
let rt = tokio::runtime::Runtime::new()?;
rt.block_on(async {
init_telemetry()?;
let main_body_result = main_body().await;
shutdown_telemetry()?;
main_body_result
})
}
#[cfg_attr(feature = "tracing", tracing::instrument(ret, level = "debug"))]
async fn main_body() -> Result<(), Box<dyn std::error::Error>> {
let args = std::env::args().skip(1);
if args.is_empty() {
let org_contents = read_stdin_to_string()?;
if !wasm_run_anonymous_compare(org_contents).await? {
Err("Diff results do not match.")?;
}
Ok(())
} else {
for arg in args {
if !wasm_run_compare_on_file(arg).await? {
Err("Diff results do not match.")?;
}
}
Ok(())
}
}
fn read_stdin_to_string() -> Result<String, Box<dyn std::error::Error>> {
let mut stdin_contents = String::new();
std::io::stdin()
.lock()
.read_to_string(&mut stdin_contents)?;
Ok(stdin_contents)
}


@@ -1,16 +1,16 @@
use std::path::Path;
use crate::compare::diff::compare_document;
use crate::compare::diff::DiffResult;
use crate::compare::parse::emacs_parse_anonymous_org_document;
use crate::compare::parse::emacs_parse_file_org_document;
use crate::compare::parse::get_emacs_version;
use crate::compare::parse::get_org_mode_version;
use crate::compare::sexp::sexp;
use crate::context::GlobalSettings;
use crate::context::LocalFileAccessInterface;
use crate::parser::parse_file_with_settings;
use crate::parser::parse_with_settings;
use crate::util::cli::emacs_parse_anonymous_org_document;
use crate::util::cli::emacs_parse_file_org_document;
use crate::util::cli::print_versions;
use crate::util::elisp::sexp;
use crate::util::terminal::foreground_color;
use crate::util::terminal::reset_color;
pub async fn run_anonymous_compare<P: AsRef<str>>(
org_contents: P,
@@ -68,8 +68,8 @@ pub async fn run_anonymous_compare_with_settings<'g, 's, P: AsRef<str>>(
} else if !silent {
println!(
"{color}Entire document passes.{reset}",
color = DiffResult::foreground_color(0, 255, 0),
reset = DiffResult::reset_color(),
color = foreground_color(0, 255, 0),
reset = reset_color(),
);
}
@@ -121,19 +121,10 @@ pub async fn run_compare_on_file_with_settings<'g, 's, P: AsRef<Path>>(
} else if !silent {
println!(
"{color}Entire document passes.{reset}",
color = DiffResult::foreground_color(0, 255, 0),
reset = DiffResult::reset_color(),
color = foreground_color(0, 255, 0),
reset = reset_color(),
);
}
Ok(true)
}
async fn print_versions() -> Result<(), Box<dyn std::error::Error>> {
eprintln!("Using emacs version: {}", get_emacs_version().await?.trim());
eprintln!(
"Using org-mode version: {}",
get_org_mode_version().await?.trim()
);
Ok(())
}


@@ -9,8 +9,6 @@ use super::diff::artificial_owned_diff_scope;
use super::diff::compare_ast_node;
use super::diff::DiffEntry;
use super::diff::DiffStatus;
use super::sexp::unquote;
use super::sexp::Token;
use super::util::get_property;
use super::util::get_property_numeric;
use super::util::get_property_quoted_string;
@@ -20,6 +18,8 @@ use crate::types::CharOffsetInLine;
use crate::types::LineNumber;
use crate::types::RetainLabels;
use crate::types::SwitchNumberLines;
use crate::util::elisp::unquote;
use crate::util::elisp::Token;
#[derive(Debug)]
pub(crate) enum EmacsField<'s> {


@@ -16,10 +16,6 @@ use super::compare_field::compare_property_retain_labels;
use super::compare_field::compare_property_set_of_quoted_string;
use super::compare_field::compare_property_single_ast_node;
use super::compare_field::compare_property_unquoted_atom;
use super::elisp_fact::ElispFact;
use super::elisp_fact::GetElispFact;
use super::sexp::unquote;
use super::sexp::Token;
use super::util::affiliated_keywords_names;
use super::util::assert_no_children;
use super::util::compare_additional_properties;
@@ -109,6 +105,12 @@ use crate::types::Verbatim;
use crate::types::VerseBlock;
use crate::types::WarningDelayType;
use crate::types::Year;
use crate::util::elisp::unquote;
use crate::util::elisp::Token;
use crate::util::elisp_fact::ElispFact;
use crate::util::elisp_fact::GetElispFact;
use crate::util::terminal::foreground_color;
use crate::util::terminal::reset_color;
#[derive(Debug)]
pub enum DiffEntry<'b, 's> {
@@ -200,21 +202,21 @@ impl<'b, 's> DiffResult<'b, 's> {
if self.has_bad_children() {
format!(
"{color}BADCHILD{reset}",
color = DiffResult::foreground_color(255, 255, 0),
reset = DiffResult::reset_color(),
color = foreground_color(255, 255, 0),
reset = reset_color(),
)
} else {
format!(
"{color}GOOD{reset}",
color = DiffResult::foreground_color(0, 255, 0),
reset = DiffResult::reset_color(),
color = foreground_color(0, 255, 0),
reset = reset_color(),
)
}
}
DiffStatus::Bad => format!(
"{color}BAD{reset}",
color = DiffResult::foreground_color(255, 0, 0),
reset = DiffResult::reset_color(),
color = foreground_color(255, 0, 0),
reset = reset_color(),
),
}
};
@@ -239,45 +241,6 @@ impl<'b, 's> DiffResult<'b, 's> {
.iter()
.any(|child| child.is_immediately_bad() || child.has_bad_children())
}
pub(crate) fn foreground_color(red: u8, green: u8, blue: u8) -> String {
if DiffResult::should_use_color() {
format!(
"\x1b[38;2;{red};{green};{blue}m",
red = red,
green = green,
blue = blue
)
} else {
String::new()
}
}
#[allow(dead_code)]
pub(crate) fn background_color(red: u8, green: u8, blue: u8) -> String {
if DiffResult::should_use_color() {
format!(
"\x1b[48;2;{red};{green};{blue}m",
red = red,
green = green,
blue = blue
)
} else {
String::new()
}
}
pub(crate) fn reset_color() -> &'static str {
if DiffResult::should_use_color() {
"\x1b[0m"
} else {
""
}
}
fn should_use_color() -> bool {
!std::env::var("NO_COLOR").is_ok_and(|val| !val.is_empty())
}
}
impl<'b, 's> DiffLayer<'b, 's> {
@@ -295,14 +258,14 @@ impl<'b, 's> DiffLayer<'b, 's> {
let status_text = if self.has_bad_children() {
format!(
"{color}BADCHILD{reset}",
color = DiffResult::foreground_color(255, 255, 0),
reset = DiffResult::reset_color(),
color = foreground_color(255, 255, 0),
reset = reset_color(),
)
} else {
format!(
"{color}GOOD{reset}",
color = DiffResult::foreground_color(0, 255, 0),
reset = DiffResult::reset_color(),
color = foreground_color(0, 255, 0),
reset = reset_color(),
)
};
println!(


@@ -2,10 +2,7 @@
mod compare;
mod compare_field;
mod diff;
mod elisp_fact;
mod macros;
mod parse;
mod sexp;
mod util;
pub use compare::run_anonymous_compare;
pub use compare::run_anonymous_compare_with_settings;


@@ -8,14 +8,15 @@ use super::compare_field::compare_property_quoted_string;
use super::compare_field::ComparePropertiesResult;
use super::diff::DiffEntry;
use super::diff::DiffStatus;
use super::elisp_fact::GetElispFact;
use super::sexp::Token;
use crate::compare::diff::compare_ast_node;
use crate::compare::sexp::unquote;
use crate::types::AffiliatedKeywordValue;
use crate::types::AstNode;
use crate::types::GetAffiliatedKeywords;
use crate::types::StandardProperties;
use crate::util::elisp::get_emacs_standard_properties;
use crate::util::elisp::unquote;
use crate::util::elisp::Token;
use crate::util::elisp_fact::GetElispFact;
/// Check if the child string slice is a slice of the parent string slice.
fn is_slice_of(parent: &str, child: &str) -> bool {
@@ -145,80 +146,6 @@ fn assert_post_blank<'b, 's, S: StandardProperties<'s> + ?Sized>(
Ok(())
}
struct EmacsStandardProperties {
begin: Option<usize>,
#[allow(dead_code)]
post_affiliated: Option<usize>,
#[allow(dead_code)]
contents_begin: Option<usize>,
#[allow(dead_code)]
contents_end: Option<usize>,
end: Option<usize>,
#[allow(dead_code)]
post_blank: Option<usize>,
}
fn get_emacs_standard_properties(
emacs: &Token<'_>,
) -> Result<EmacsStandardProperties, Box<dyn std::error::Error>> {
let children = emacs.as_list()?;
let attributes_child = children.get(1).ok_or("Should have an attributes child.")?;
let attributes_map = attributes_child.as_map()?;
let standard_properties = attributes_map.get(":standard-properties");
Ok(if standard_properties.is_some() {
let mut std_props = standard_properties
.expect("if statement proves it's Some")
.as_vector()?
.iter();
let begin = maybe_token_to_usize(std_props.next())?;
let post_affiliated = maybe_token_to_usize(std_props.next())?;
let contents_begin = maybe_token_to_usize(std_props.next())?;
let contents_end = maybe_token_to_usize(std_props.next())?;
let end = maybe_token_to_usize(std_props.next())?;
let post_blank = maybe_token_to_usize(std_props.next())?;
EmacsStandardProperties {
begin,
post_affiliated,
contents_begin,
contents_end,
end,
post_blank,
}
} else {
let begin = maybe_token_to_usize(attributes_map.get(":begin").copied())?;
let end = maybe_token_to_usize(attributes_map.get(":end").copied())?;
let contents_begin = maybe_token_to_usize(attributes_map.get(":contents-begin").copied())?;
let contents_end = maybe_token_to_usize(attributes_map.get(":contents-end").copied())?;
let post_blank = maybe_token_to_usize(attributes_map.get(":post-blank").copied())?;
let post_affiliated =
maybe_token_to_usize(attributes_map.get(":post-affiliated").copied())?;
EmacsStandardProperties {
begin,
post_affiliated,
contents_begin,
contents_end,
end,
post_blank,
}
})
}
fn maybe_token_to_usize(
token: Option<&Token<'_>>,
) -> Result<Option<usize>, Box<dyn std::error::Error>> {
Ok(token
.map(|token| token.as_atom())
.map_or(Ok(None), |r| r.map(Some))?
.and_then(|val| {
if val == "nil" {
None
} else {
Some(val.parse::<usize>())
}
})
.map_or(Ok(None), |r| r.map(Some))?)
}
/// Get a named property from the emacs token.
///
/// Returns Ok(None) if value is nil or absent.
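The hunk above keeps `is_slice_of` but elides its body. A pointer-range containment check is one plausible implementation of the documented behavior (hypothetical sketch; the actual function may differ):

```rust
// A &str is a slice of another &str iff its address range lies
// entirely within the parent's address range.
fn is_slice_of(parent: &str, child: &str) -> bool {
    let parent_start = parent.as_ptr() as usize;
    let child_start = child.as_ptr() as usize;
    child_start >= parent_start && child_start + child.len() <= parent_start + parent.len()
}

fn main() {
    let parent = "hello world";
    let child = &parent[6..]; // borrows from parent's buffer
    assert!(is_slice_of(parent, child));
    let elsewhere = String::from("copied");
    assert!(!is_slice_of(parent, &elsewhere)); // separate heap allocation
}
```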


@@ -71,9 +71,7 @@ pub struct EntityDefinition<'a> {
impl<'g, 's> GlobalSettings<'g, 's> {
fn new() -> GlobalSettings<'g, 's> {
debug_assert!(
DEFAULT_ORG_ENTITIES.is_sorted_by(|a, b| b.name.len().partial_cmp(&a.name.len()))
);
debug_assert!(DEFAULT_ORG_ENTITIES.is_sorted_by(|a, b| a.name.len() >= b.name.len()));
GlobalSettings {
radio_targets: Vec::new(),
file_access: &LocalFileAccessInterface {


@@ -6,9 +6,9 @@ pub(crate) type Res<T, U> = IResult<T, U, CustomError>;
#[derive(Debug)]
pub enum CustomError {
Static(&'static str),
IO(std::io::Error),
Parser(ErrorKind),
Static(#[allow(dead_code)] &'static str),
IO(#[allow(dead_code)] std::io::Error),
Parser(#[allow(dead_code)] ErrorKind),
}
impl<I: std::fmt::Debug> ParseError<I> for CustomError {


@@ -24,7 +24,7 @@ pub(crate) fn record_event(event_type: EventType, input: OrgSource<'_>) {
*db.entry(key).or_insert(0) += 1;
}
pub fn report(original_document: &str) {
pub(crate) fn report(original_document: &str) {
let mut db = GLOBAL_DATA.lock().unwrap();
let db = db.get_or_insert_with(HashMap::new);
let mut results: Vec<_> = db.iter().collect();


@@ -2,5 +2,5 @@ mod database;
mod event_type;
pub(crate) use database::record_event;
pub use database::report;
pub(crate) use database::report;
pub(crate) use event_type::EventType;


@@ -1,8 +1,9 @@
#![feature(exit_status_error)]
#![feature(trait_alias)]
#![feature(path_file_prefix)]
#![feature(is_sorted)]
#![feature(test)]
#![feature(iter_intersperse)]
#![feature(exact_size_is_empty)]
// TODO: #![warn(missing_docs)]
#![allow(clippy::bool_assert_comparison)] // Sometimes you want the long form because it's easier to see at a glance.
@@ -10,11 +11,20 @@ extern crate test;
#[cfg(feature = "compare")]
pub mod compare;
pub mod parse_cli;
#[cfg(any(feature = "compare", feature = "wasm", feature = "wasm_test"))]
mod util;
#[cfg(any(feature = "wasm", feature = "wasm_test"))]
mod wasm;
#[cfg(any(feature = "wasm", feature = "wasm_test"))]
pub mod wasm_cli;
#[cfg(feature = "wasm_test")]
pub mod wasm_test;
mod context;
mod error;
#[cfg(feature = "event_count")]
pub mod event_count;
mod event_count;
mod iter;
pub mod parser;
pub mod types;


@@ -1,12 +1,4 @@
#![feature(round_char_boundary)]
#![feature(exact_size_is_empty)]
use std::io::Read;
use std::path::Path;
use ::organic::parser::parse;
use organic::parser::parse_with_settings;
use organic::settings::GlobalSettings;
use organic::settings::LocalFileAccessInterface;
use organic::parse_cli::main_body;
#[cfg(feature = "tracing")]
use crate::init_tracing::init_telemetry;
@@ -30,55 +22,3 @@ fn main() -> Result<(), Box<dyn std::error::Error>> {
main_body_result
})
}
#[cfg_attr(feature = "tracing", tracing::instrument(ret, level = "debug"))]
fn main_body() -> Result<(), Box<dyn std::error::Error>> {
let args = std::env::args().skip(1);
if args.is_empty() {
let org_contents = read_stdin_to_string()?;
run_anonymous_parse(org_contents)
} else {
for arg in args {
run_parse_on_file(arg)?
}
Ok(())
}
}
fn read_stdin_to_string() -> Result<String, Box<dyn std::error::Error>> {
let mut stdin_contents = String::new();
std::io::stdin()
.lock()
.read_to_string(&mut stdin_contents)?;
Ok(stdin_contents)
}
fn run_anonymous_parse<P: AsRef<str>>(org_contents: P) -> Result<(), Box<dyn std::error::Error>> {
let org_contents = org_contents.as_ref();
let rust_parsed = parse(org_contents)?;
println!("{:#?}", rust_parsed);
#[cfg(feature = "event_count")]
organic::event_count::report(org_contents);
Ok(())
}
fn run_parse_on_file<P: AsRef<Path>>(org_path: P) -> Result<(), Box<dyn std::error::Error>> {
let org_path = org_path.as_ref();
let parent_directory = org_path
.parent()
.ok_or("Should be contained inside a directory.")?;
let org_contents = std::fs::read_to_string(org_path)?;
let org_contents = org_contents.as_str();
let file_access_interface = LocalFileAccessInterface {
working_directory: Some(parent_directory.to_path_buf()),
};
let global_settings = GlobalSettings {
file_access: &file_access_interface,
..Default::default()
};
let rust_parsed = parse_with_settings(org_contents, &global_settings)?;
println!("{:#?}", rust_parsed);
#[cfg(feature = "event_count")]
organic::event_count::report(org_contents);
Ok(())
}

src/parse_cli/mod.rs Normal file

@@ -0,0 +1,59 @@
use std::io::Read;
use std::path::Path;
use crate::parser::parse;
use crate::parser::parse_with_settings;
use crate::settings::GlobalSettings;
use crate::settings::LocalFileAccessInterface;
#[cfg_attr(feature = "tracing", tracing::instrument(ret, level = "debug"))]
pub fn main_body() -> Result<(), Box<dyn std::error::Error>> {
let args = std::env::args().skip(1);
if args.is_empty() {
let org_contents = read_stdin_to_string()?;
run_anonymous_parse(org_contents)
} else {
for arg in args {
run_parse_on_file(arg)?
}
Ok(())
}
}
fn read_stdin_to_string() -> Result<String, Box<dyn std::error::Error>> {
let mut stdin_contents = String::new();
std::io::stdin()
.lock()
.read_to_string(&mut stdin_contents)?;
Ok(stdin_contents)
}
fn run_anonymous_parse<P: AsRef<str>>(org_contents: P) -> Result<(), Box<dyn std::error::Error>> {
let org_contents = org_contents.as_ref();
let rust_parsed = parse(org_contents)?;
println!("{:#?}", rust_parsed);
#[cfg(feature = "event_count")]
crate::event_count::report(org_contents);
Ok(())
}
fn run_parse_on_file<P: AsRef<Path>>(org_path: P) -> Result<(), Box<dyn std::error::Error>> {
let org_path = org_path.as_ref();
let parent_directory = org_path
.parent()
.ok_or("Should be contained inside a directory.")?;
let org_contents = std::fs::read_to_string(org_path)?;
let org_contents = org_contents.as_str();
let file_access_interface = LocalFileAccessInterface {
working_directory: Some(parent_directory.to_path_buf()),
};
let global_settings = GlobalSettings {
file_access: &file_access_interface,
..Default::default()
};
let rust_parsed = parse_with_settings(org_contents, &global_settings)?;
println!("{:#?}", rust_parsed);
#[cfg(feature = "event_count")]
crate::event_count::report(org_contents);
Ok(())
}


@@ -106,7 +106,6 @@ mod tests {
use super::*;
use crate::context::bind_context;
use crate::context::Context;
use crate::context::ContextElement;
use crate::context::GlobalSettings;
use crate::context::List;


@@ -77,9 +77,9 @@ fn _heading<'b, 'g, 'r, 's>(
// If the section has a planning then the timestamp values are copied to the heading.
if let DocumentElement::Section(inner_section) = &section {
if let Some(Element::Planning(planning)) = inner_section.children.first() {
scheduled = planning.scheduled.clone();
deadline = planning.deadline.clone();
closed = planning.closed.clone();
scheduled.clone_from(&planning.scheduled);
deadline.clone_from(&planning.deadline);
closed.clone_from(&planning.closed);
}
}
children.insert(0, section);
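The change from `x = y.clone()` to `x.clone_from(&y)` is behaviorally identical, but `clone_from` lets the destination reuse its existing allocation instead of dropping it and allocating a fresh clone (the kind of change a clippy lint such as `assigning_clones` suggests, though that motivation is an assumption here). A standalone sketch:

```rust
fn main() {
    let planning = vec![1, 2, 3];
    let mut scheduled = vec![0; 8];
    // Equivalent to `scheduled = planning.clone()`, but clone_from may
    // reuse scheduled's existing buffer rather than allocating a new one.
    scheduled.clone_from(&planning);
    assert_eq!(scheduled, vec![1, 2, 3]);
}
```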


@@ -224,7 +224,6 @@ mod tests {
use test::Bencher;
use super::*;
use crate::parser::OrgSource;
#[bench]
fn bench_affiliated_keyword(b: &mut Bencher) {


@@ -175,9 +175,7 @@ pub(crate) trait RematchObject<'x> {
#[cfg(test)]
mod tests {
use super::*;
use crate::context::Context;
use crate::context::GlobalSettings;
use crate::context::List;
use crate::parser::element_parser::element;
use crate::types::Bold;
use crate::types::Element;


@@ -47,6 +47,7 @@ pub struct Section<'s> {
}
#[derive(Debug)]
#[allow(clippy::large_enum_variant)]
pub enum DocumentElement<'s> {
Heading(Heading<'s>),
Section(Section<'s>),


@@ -6,7 +6,6 @@ mod greater_element;
mod lesser_element;
mod macros;
mod object;
mod remove_trailing;
mod standard_properties;
mod util;
pub use affiliated_keyword::AffiliatedKeyword;


@@ -332,19 +332,19 @@ pub type HourInner = u8;
pub type MinuteInner = u8;
#[derive(Debug, Clone)]
pub struct Year(YearInner);
pub struct Year(pub YearInner);
#[derive(Debug, Clone)]
pub struct Month(MonthInner);
pub struct Month(pub MonthInner);
#[derive(Debug, Clone)]
pub struct DayOfMonth(DayOfMonthInner);
pub struct DayOfMonth(pub DayOfMonthInner);
#[derive(Debug, Clone)]
pub struct Hour(HourInner);
pub struct Hour(pub HourInner);
#[derive(Debug, Clone)]
pub struct Minute(MinuteInner);
pub struct Minute(pub MinuteInner);
impl Year {
// TODO: Make a real error type instead of a boxed any error.


@@ -1,56 +0,0 @@
pub(crate) trait RemoveTrailing: Iterator + Sized {
fn remove_trailing<R: Into<usize>>(self, amount_to_remove: R) -> RemoveTrailingIter<Self>;
}
impl<I> RemoveTrailing for I
where
I: Iterator,
{
fn remove_trailing<R: Into<usize>>(self, amount_to_remove: R) -> RemoveTrailingIter<Self> {
RemoveTrailingIter {
inner: self,
buffer: Vec::new(),
next_to_pop: 0,
amount_to_remove: amount_to_remove.into(),
}
}
}
pub(crate) struct RemoveTrailingIter<I: Iterator> {
inner: I,
buffer: Vec<I::Item>,
next_to_pop: usize,
amount_to_remove: usize,
}
impl<I: Iterator> Iterator for RemoveTrailingIter<I> {
type Item = I::Item;
fn next(&mut self) -> Option<Self::Item> {
if self.buffer.len() < self.amount_to_remove {
self.buffer.reserve_exact(self.amount_to_remove);
}
while self.buffer.len() < self.amount_to_remove {
if let Some(elem) = self.inner.next() {
self.buffer.push(elem);
} else {
// The inner was smaller than amount_to_remove, so never return anything.
return None;
}
}
let new_value = self.inner.next();
if self.amount_to_remove == 0 {
return new_value;
}
if let Some(new_value) = new_value {
let ret = std::mem::replace(&mut self.buffer[self.next_to_pop], new_value);
self.next_to_pop = (self.next_to_pop + 1) % self.amount_to_remove;
Some(ret)
} else {
// We have exactly as many items in the buffer as we wanted to cut off, so stop returning values.
None
}
}
}


@@ -2,28 +2,53 @@ use std::path::Path;
use tokio::process::Command;
use crate::context::HeadlineLevelFilter;
use crate::settings::GlobalSettings;
use crate::settings::HeadlineLevelFilter;
/// Generate elisp to configure org-mode parsing settings
///
/// Currently org-list-allow-alphabetical, tab-width, and org-odd-levels-only are supported.
fn global_settings_elisp(global_settings: &GlobalSettings) -> String {
// This string concatenation is wildly inefficient but it's only called in tests 🤷.
let mut ret = "".to_owned();
if global_settings.list_allow_alphabetical {
ret += "(setq org-list-allow-alphabetical t)\n"
}
if global_settings.tab_width != crate::settings::DEFAULT_TAB_WIDTH {
ret += format!("(setq-default tab-width {})", global_settings.tab_width).as_str();
}
if global_settings.odd_levels_only != HeadlineLevelFilter::default() {
ret += match global_settings.odd_levels_only {
HeadlineLevelFilter::Odd => "(setq org-odd-levels-only t)\n",
HeadlineLevelFilter::OddEven => "(setq org-odd-levels-only nil)\n",
};
}
ret
pub async fn print_versions() -> Result<(), Box<dyn std::error::Error>> {
eprintln!("Using emacs version: {}", get_emacs_version().await?.trim());
eprintln!(
"Using org-mode version: {}",
get_org_mode_version().await?.trim()
);
Ok(())
}
pub(crate) async fn get_emacs_version() -> Result<String, Box<dyn std::error::Error>> {
let elisp_script = r#"(progn
(message "%s" (version))
)"#;
let mut cmd = Command::new("emacs");
let cmd = cmd
.arg("-q")
.arg("--no-site-file")
.arg("--no-splash")
.arg("--batch")
.arg("--eval")
.arg(elisp_script);
let out = cmd.output().await?;
out.status.exit_ok()?;
Ok(String::from_utf8(out.stderr)?)
}
pub(crate) async fn get_org_mode_version() -> Result<String, Box<dyn std::error::Error>> {
let elisp_script = r#"(progn
(org-mode)
(message "%s" (org-version nil t nil))
)"#;
let mut cmd = Command::new("emacs");
let cmd = cmd
.arg("-q")
.arg("--no-site-file")
.arg("--no-splash")
.arg("--batch")
.arg("--eval")
.arg(elisp_script);
let out = cmd.output().await?;
out.status.exit_ok()?;
Ok(String::from_utf8(out.stderr)?)
}
pub(crate) async fn emacs_parse_anonymous_org_document<'g, 's, C>(
@@ -144,39 +169,23 @@ where
output
}
pub async fn get_emacs_version() -> Result<String, Box<dyn std::error::Error>> {
let elisp_script = r#"(progn
(message "%s" (version))
)"#;
let mut cmd = Command::new("emacs");
let cmd = cmd
.arg("-q")
.arg("--no-site-file")
.arg("--no-splash")
.arg("--batch")
.arg("--eval")
.arg(elisp_script);
let out = cmd.output().await?;
out.status.exit_ok()?;
Ok(String::from_utf8(out.stderr)?)
}
pub async fn get_org_mode_version() -> Result<String, Box<dyn std::error::Error>> {
let elisp_script = r#"(progn
(org-mode)
(message "%s" (org-version nil t nil))
)"#;
let mut cmd = Command::new("emacs");
let cmd = cmd
.arg("-q")
.arg("--no-site-file")
.arg("--no-splash")
.arg("--batch")
.arg("--eval")
.arg(elisp_script);
let out = cmd.output().await?;
out.status.exit_ok()?;
Ok(String::from_utf8(out.stderr)?)
/// Generate elisp to configure org-mode parsing settings
///
/// Currently org-list-allow-alphabetical, tab-width, and org-odd-levels-only are supported.
fn global_settings_elisp(global_settings: &GlobalSettings) -> String {
// This string concatenation is wildly inefficient but it's only called in tests 🤷.
let mut ret = "".to_owned();
if global_settings.list_allow_alphabetical {
ret += "(setq org-list-allow-alphabetical t)\n"
}
if global_settings.tab_width != crate::settings::DEFAULT_TAB_WIDTH {
ret += format!("(setq-default tab-width {})", global_settings.tab_width).as_str();
}
if global_settings.odd_levels_only != HeadlineLevelFilter::default() {
ret += match global_settings.odd_levels_only {
HeadlineLevelFilter::Odd => "(setq org-odd-levels-only t)\n",
HeadlineLevelFilter::OddEven => "(setq org-odd-levels-only nil)\n",
};
}
ret
}

src/util/elisp/mod.rs Normal file

@@ -0,0 +1,14 @@
mod sexp;
mod util;
pub use sexp::sexp;
pub(crate) use sexp::unquote;
#[cfg(feature = "wasm_test")]
pub(crate) use sexp::TextWithProperties;
pub use sexp::Token;
#[cfg(feature = "compare")]
pub(crate) use util::get_emacs_standard_properties;
#[cfg(feature = "wasm_test")]
pub(crate) use util::maybe_token_to_usize;
#[cfg(feature = "wasm_test")]
pub(crate) use util::EmacsStandardProperties;


@@ -61,6 +61,7 @@ impl<'s> Token<'s> {
}?)
}
#[cfg(feature = "compare")]
pub(crate) fn as_text<'p>(
&'p self,
) -> Result<&'p TextWithProperties<'s>, Box<dyn std::error::Error>> {

src/util/elisp/util.rs Normal file

@@ -0,0 +1,76 @@
use super::Token;
pub(crate) fn maybe_token_to_usize(
token: Option<&Token<'_>>,
) -> Result<Option<usize>, Box<dyn std::error::Error>> {
Ok(token
.map(|token| token.as_atom())
.map_or(Ok(None), |r| r.map(Some))?
.and_then(|val| {
if val == "nil" {
None
} else {
Some(val.parse::<usize>())
}
})
.map_or(Ok(None), |r| r.map(Some))?)
}
pub(crate) struct EmacsStandardProperties {
pub(crate) begin: Option<usize>,
#[allow(dead_code)]
pub(crate) post_affiliated: Option<usize>,
#[allow(dead_code)]
pub(crate) contents_begin: Option<usize>,
#[allow(dead_code)]
pub(crate) contents_end: Option<usize>,
pub(crate) end: Option<usize>,
#[allow(dead_code)]
pub(crate) post_blank: Option<usize>,
}
#[cfg(feature = "compare")]
pub(crate) fn get_emacs_standard_properties(
emacs: &Token<'_>,
) -> Result<EmacsStandardProperties, Box<dyn std::error::Error>> {
let children = emacs.as_list()?;
let attributes_child = children.get(1).ok_or("Should have an attributes child.")?;
let attributes_map = attributes_child.as_map()?;
let standard_properties = attributes_map.get(":standard-properties");
Ok(if standard_properties.is_some() {
let mut std_props = standard_properties
.expect("if statement proves it's Some")
.as_vector()?
.iter();
let begin = maybe_token_to_usize(std_props.next())?;
let post_affiliated = maybe_token_to_usize(std_props.next())?;
let contents_begin = maybe_token_to_usize(std_props.next())?;
let contents_end = maybe_token_to_usize(std_props.next())?;
let end = maybe_token_to_usize(std_props.next())?;
let post_blank = maybe_token_to_usize(std_props.next())?;
EmacsStandardProperties {
begin,
post_affiliated,
contents_begin,
contents_end,
end,
post_blank,
}
} else {
let begin = maybe_token_to_usize(attributes_map.get(":begin").copied())?;
let end = maybe_token_to_usize(attributes_map.get(":end").copied())?;
let contents_begin = maybe_token_to_usize(attributes_map.get(":contents-begin").copied())?;
let contents_end = maybe_token_to_usize(attributes_map.get(":contents-end").copied())?;
let post_blank = maybe_token_to_usize(attributes_map.get(":post-blank").copied())?;
let post_affiliated =
maybe_token_to_usize(attributes_map.get(":post-affiliated").copied())?;
EmacsStandardProperties {
begin,
post_affiliated,
contents_begin,
contents_end,
end,
post_blank,
}
})
}
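`maybe_token_to_usize` implements a simple rule: an absent token or the atom `nil` becomes `None`, a digit atom parses to `Some(usize)`, and anything else is an error. The rule in isolation (a sketch over a plain `&str` instead of `Token`):

```rust
// "nil" => None, digits => Some(n), anything else => parse error.
fn parse_maybe(atom: &str) -> Result<Option<usize>, std::num::ParseIntError> {
    if atom == "nil" {
        Ok(None)
    } else {
        atom.parse::<usize>().map(Some)
    }
}

fn main() {
    assert_eq!(parse_maybe("nil").unwrap(), None);
    assert_eq!(parse_maybe("42").unwrap(), Some(42));
    assert!(parse_maybe("not-a-number").is_err());
}
```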

src/util/mod.rs Normal file

@@ -0,0 +1,8 @@
#[cfg(any(feature = "compare", feature = "wasm_test"))]
pub mod cli;
#[cfg(any(feature = "compare", feature = "wasm_test"))]
pub mod elisp;
#[cfg(any(feature = "compare", feature = "wasm", feature = "wasm_test"))]
pub mod elisp_fact;
#[cfg(any(feature = "compare", feature = "wasm_test"))]
pub mod terminal;

src/util/terminal.rs Normal file

@@ -0,0 +1,42 @@
use std::borrow::Cow;
fn should_use_color() -> bool {
!std::env::var("NO_COLOR").is_ok_and(|val| !val.is_empty())
}
pub(crate) fn foreground_color(red: u8, green: u8, blue: u8) -> Cow<'static, str> {
if should_use_color() {
format!(
"\x1b[38;2;{red};{green};{blue}m",
red = red,
green = green,
blue = blue
)
.into()
} else {
Cow::from("")
}
}
#[allow(dead_code)]
pub(crate) fn background_color(red: u8, green: u8, blue: u8) -> Cow<'static, str> {
if should_use_color() {
format!(
"\x1b[48;2;{red};{green};{blue}m",
red = red,
green = green,
blue = blue
)
.into()
} else {
Cow::from("")
}
}
pub(crate) fn reset_color() -> &'static str {
if should_use_color() {
"\x1b[0m"
} else {
""
}
}
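The new module emits 24-bit (truecolor) SGR escape sequences and honors the NO_COLOR convention (any non-empty value of `NO_COLOR` disables color). The escape format on its own, as a sketch:

```rust
// 24-bit foreground color: ESC[38;2;<r>;<g>;<b>m, reset: ESC[0m.
fn fg(red: u8, green: u8, blue: u8) -> String {
    format!("\x1b[38;2;{red};{green};{blue}m")
}
const RESET: &str = "\x1b[0m";

fn main() {
    let line = format!("{}GOOD{}", fg(0, 255, 0), RESET);
    assert!(line.starts_with("\x1b[38;2;0;255;0m"));
    assert!(line.ends_with(RESET));
    println!("{line}");
}
```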


@@ -0,0 +1,95 @@
use std::collections::HashMap;
use serde::Deserialize;
use serde::Serialize;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::WasmAstNode;
use crate::types::AffiliatedKeywordValue;
use crate::types::AffiliatedKeywords;
#[derive(Debug, Serialize, Deserialize)]
#[serde(untagged)]
pub enum AdditionalPropertyValue {
SingleString(String),
ListOfStrings(Vec<String>),
OptionalPair {
optval: Option<String>,
val: String,
},
ObjectTree {
#[serde(rename = "object-tree")]
object_tree: Vec<(Option<Vec<WasmAstNode>>, Vec<WasmAstNode>)>,
},
}
#[derive(Debug, Serialize, Deserialize, Default)]
pub struct AdditionalProperties {
#[serde(flatten)]
pub(crate) properties: HashMap<String, AdditionalPropertyValue>,
}
to_wasm!(
AdditionalProperties,
AffiliatedKeywords<'s>,
original,
wasm_context,
{
let mut additional_properties = AdditionalProperties::default();
for (name, val) in original.keywords.iter() {
let converted_val = match val {
AffiliatedKeywordValue::SingleString(val) => {
AdditionalPropertyValue::SingleString((*val).to_owned())
}
AffiliatedKeywordValue::ListOfStrings(val) => {
AdditionalPropertyValue::ListOfStrings(
val.iter().map(|s| (*s).to_owned()).collect(),
)
}
AffiliatedKeywordValue::OptionalPair { optval, val } => {
AdditionalPropertyValue::OptionalPair {
optval: optval.map(|s| (*s).to_owned()),
val: (*val).to_owned(),
}
}
AffiliatedKeywordValue::ObjectTree(val) => {
let mut ret = Vec::with_capacity(val.len());
for (optval, value) in val {
let converted_optval = if let Some(optval) = optval {
Some(
optval
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?,
)
} else {
None
};
let converted_value = value
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
ret.push((converted_optval, converted_value));
}
AdditionalPropertyValue::ObjectTree { object_tree: ret }
}
};
additional_properties
.properties
.insert(name.clone(), converted_val);
}
Ok(additional_properties)
}
);
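With `#[serde(untagged)]` on `AdditionalPropertyValue` and `#[serde(flatten)]` on the map, each affiliated keyword serializes as whichever shape its variant carries, with no discriminator tag: a bare string, an array of strings, an `{optval, val}` pair, or an `{"object-tree": …}` wrapper. A sketch of the resulting JSON (the keyword names here are illustrative, not taken from the crate):

```json
{
  "NAME": "my-table",
  "ATTR_HTML": [":width 100%", ":align left"],
  "CAPTION": { "optval": "short", "val": "A longer caption" }
}
```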

src/wasm/angle_link.rs (new file, +54)

@@ -0,0 +1,54 @@
use std::borrow::Cow;
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::AngleLink;
use crate::types::LinkType;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
#[serde(tag = "format")]
#[serde(rename = "angle")]
pub struct WasmAngleLink {
#[serde(rename = "type")]
pub(crate) link_type: String,
pub(crate) path: String,
#[serde(rename = "raw-link")]
pub(crate) raw_link: String,
pub(crate) application: Option<String>,
#[serde(rename = "search-option")]
pub(crate) search_option: Option<String>,
}
to_wasm!(
WasmAngleLink,
AngleLink<'s>,
original,
wasm_context,
{ WasmAstNode::AngleLink(original) },
{ "link".into() },
{
Ok((
Vec::new(),
WasmAngleLink {
link_type: match &original.link_type {
LinkType::File => "file".to_owned(),
LinkType::Protocol(protocol) => protocol.clone().into_owned(),
LinkType::Id => "id".to_owned(),
LinkType::CustomId => "custom-id".to_owned(),
LinkType::CodeRef => "coderef".to_owned(),
LinkType::Fuzzy => "fuzzy".to_owned(),
},
path: original.get_path().into_owned(),
raw_link: original.raw_link.to_owned(),
application: original.application.map(|c| c.to_owned()),
search_option: original.get_search_option().map(Cow::into_owned),
},
))
}
);

src/wasm/ast_node.rs (new file, +142)

@@ -0,0 +1,142 @@
use serde::Deserialize;
use serde::Serialize;
use super::angle_link::WasmAngleLink;
use super::babel_call::WasmBabelCall;
use super::bold::WasmBold;
use super::center_block::WasmCenterBlock;
use super::citation::WasmCitation;
use super::citation_reference::WasmCitationReference;
use super::clock::WasmClock;
use super::code::WasmCode;
use super::comment::WasmComment;
use super::comment_block::WasmCommentBlock;
use super::diary_sexp::WasmDiarySexp;
use super::document::WasmDocument;
use super::drawer::WasmDrawer;
use super::dynamic_block::WasmDynamicBlock;
use super::entity::WasmEntity;
use super::example_block::WasmExampleBlock;
use super::export_block::WasmExportBlock;
use super::export_snippet::WasmExportSnippet;
use super::fixed_width_area::WasmFixedWidthArea;
use super::footnote_definition::WasmFootnoteDefinition;
use super::footnote_reference::WasmFootnoteReference;
use super::headline::WasmHeadline;
use super::horizontal_rule::WasmHorizontalRule;
use super::inline_babel_call::WasmInlineBabelCall;
use super::inline_source_block::WasmInlineSourceBlock;
use super::italic::WasmItalic;
use super::keyword::WasmKeyword;
use super::latex_environment::WasmLatexEnvironment;
use super::latex_fragment::WasmLatexFragment;
use super::line_break::WasmLineBreak;
use super::node_property::WasmNodeProperty;
use super::org_macro::WasmOrgMacro;
use super::paragraph::WasmParagraph;
use super::plain_link::WasmPlainLink;
use super::plain_list::WasmPlainList;
use super::plain_list_item::WasmPlainListItem;
use super::plain_text::WasmPlainText;
use super::planning::WasmPlanning;
use super::property_drawer::WasmPropertyDrawer;
use super::quote_block::WasmQuoteBlock;
use super::radio_link::WasmRadioLink;
use super::radio_target::WasmRadioTarget;
use super::regular_link::WasmRegularLink;
use super::section::WasmSection;
use super::special_block::WasmSpecialBlock;
use super::src_block::WasmSrcBlock;
use super::statistics_cookie::WasmStatisticsCookie;
use super::strike_through::WasmStrikeThrough;
use super::subscript::WasmSubscript;
use super::superscript::WasmSuperscript;
use super::table::WasmTable;
use super::table_cell::WasmTableCell;
use super::table_row::WasmTableRow;
use super::target::WasmTarget;
use super::timestamp::WasmTimestamp;
use super::underline::WasmUnderline;
use super::verbatim::WasmVerbatim;
use super::verse_block::WasmVerseBlock;
use super::WasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmAstNodeWrapper<I> {
#[serde(rename = "ast-node")]
pub(crate) ast_node: String,
#[serde(rename = "standard-properties")]
pub(crate) standard_properties: WasmStandardProperties,
#[serde(rename = "children")]
pub(crate) children: Vec<WasmAstNode>,
#[serde(rename = "properties")]
pub(crate) properties: I,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(untagged)]
pub enum WasmAstNode {
// Document Nodes
Document(WasmAstNodeWrapper<WasmDocument>),
Headline(WasmAstNodeWrapper<WasmHeadline>),
Section(WasmAstNodeWrapper<WasmSection>),
// Elements
Paragraph(WasmAstNodeWrapper<WasmParagraph>),
PlainList(WasmAstNodeWrapper<WasmPlainList>),
PlainListItem(WasmAstNodeWrapper<WasmPlainListItem>),
CenterBlock(WasmAstNodeWrapper<WasmCenterBlock>),
QuoteBlock(WasmAstNodeWrapper<WasmQuoteBlock>),
SpecialBlock(WasmAstNodeWrapper<WasmSpecialBlock>),
DynamicBlock(WasmAstNodeWrapper<WasmDynamicBlock>),
FootnoteDefinition(WasmAstNodeWrapper<WasmFootnoteDefinition>),
Comment(WasmAstNodeWrapper<WasmComment>),
Drawer(WasmAstNodeWrapper<WasmDrawer>),
PropertyDrawer(WasmAstNodeWrapper<WasmPropertyDrawer>),
NodeProperty(WasmAstNodeWrapper<WasmNodeProperty>),
Table(WasmAstNodeWrapper<WasmTable>),
TableRow(WasmAstNodeWrapper<WasmTableRow>),
VerseBlock(WasmAstNodeWrapper<WasmVerseBlock>),
CommentBlock(WasmAstNodeWrapper<WasmCommentBlock>),
ExampleBlock(WasmAstNodeWrapper<WasmExampleBlock>),
ExportBlock(WasmAstNodeWrapper<WasmExportBlock>),
SrcBlock(WasmAstNodeWrapper<WasmSrcBlock>),
Clock(WasmAstNodeWrapper<WasmClock>),
DiarySexp(WasmAstNodeWrapper<WasmDiarySexp>),
Planning(WasmAstNodeWrapper<WasmPlanning>),
FixedWidthArea(WasmAstNodeWrapper<WasmFixedWidthArea>),
HorizontalRule(WasmAstNodeWrapper<WasmHorizontalRule>),
Keyword(WasmAstNodeWrapper<WasmKeyword>),
BabelCall(WasmAstNodeWrapper<WasmBabelCall>),
LatexEnvironment(WasmAstNodeWrapper<WasmLatexEnvironment>),
// Objects
Bold(WasmAstNodeWrapper<WasmBold>),
Italic(WasmAstNodeWrapper<WasmItalic>),
Underline(WasmAstNodeWrapper<WasmUnderline>),
StrikeThrough(WasmAstNodeWrapper<WasmStrikeThrough>),
Code(WasmAstNodeWrapper<WasmCode>),
Verbatim(WasmAstNodeWrapper<WasmVerbatim>),
PlainText(WasmAstNodeWrapper<WasmPlainText>),
RegularLink(WasmAstNodeWrapper<WasmRegularLink>),
RadioLink(WasmAstNodeWrapper<WasmRadioLink>),
RadioTarget(WasmAstNodeWrapper<WasmRadioTarget>),
PlainLink(WasmAstNodeWrapper<WasmPlainLink>),
AngleLink(WasmAstNodeWrapper<WasmAngleLink>),
OrgMacro(WasmAstNodeWrapper<WasmOrgMacro>),
Entity(WasmAstNodeWrapper<WasmEntity>),
LatexFragment(WasmAstNodeWrapper<WasmLatexFragment>),
ExportSnippet(WasmAstNodeWrapper<WasmExportSnippet>),
FootnoteReference(WasmAstNodeWrapper<WasmFootnoteReference>),
Citation(WasmAstNodeWrapper<WasmCitation>),
CitationReference(WasmAstNodeWrapper<WasmCitationReference>),
InlineBabelCall(WasmAstNodeWrapper<WasmInlineBabelCall>),
InlineSourceBlock(WasmAstNodeWrapper<WasmInlineSourceBlock>),
LineBreak(WasmAstNodeWrapper<WasmLineBreak>),
Target(WasmAstNodeWrapper<WasmTarget>),
StatisticsCookie(WasmAstNodeWrapper<WasmStatisticsCookie>),
Subscript(WasmAstNodeWrapper<WasmSubscript>),
Superscript(WasmAstNodeWrapper<WasmSuperscript>),
TableCell(WasmAstNodeWrapper<WasmTableCell>),
Timestamp(WasmAstNodeWrapper<WasmTimestamp>),
}
impl WasmAstNode {}

src/wasm/babel_call.rs (new file, +50)

@@ -0,0 +1,50 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::BabelCall;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmBabelCall {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) call: Option<String>,
#[serde(rename = "inside-header")]
pub(crate) inside_header: Option<String>,
pub(crate) arguments: Option<String>,
#[serde(rename = "end-header")]
pub(crate) end_header: Option<String>,
pub(crate) value: String,
}
to_wasm!(
WasmBabelCall,
BabelCall<'s>,
original,
wasm_context,
{ WasmAstNode::BabelCall(original) },
{ "babel-call".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmBabelCall {
additional_properties,
call: original.call.map(|s| s.to_owned()),
inside_header: original.inside_header.map(|s| s.to_owned()),
arguments: original.arguments.map(|s| s.to_owned()),
end_header: original.end_header.map(|s| s.to_owned()),
value: original.value.to_owned(),
},
))
}
);

src/wasm/bold.rs (new file, +34)

@@ -0,0 +1,34 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Bold;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmBold {}
to_wasm!(
WasmBold,
Bold<'s>,
original,
wasm_context,
{ WasmAstNode::Bold(original) },
{ "bold".into() },
{
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((children, WasmBold {}))
}
);

src/wasm/center_block.rs (new file, +48)

@@ -0,0 +1,48 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::CenterBlock;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmCenterBlock {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
}
to_wasm!(
WasmCenterBlock,
CenterBlock<'s>,
original,
wasm_context,
{ WasmAstNode::CenterBlock(original) },
{ "center-block".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmCenterBlock {
additional_properties,
},
))
}
);

src/wasm/citation.rs (new file, +71)

@@ -0,0 +1,71 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Citation;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmCitation {
pub(crate) style: Option<String>,
pub(crate) prefix: Option<Vec<WasmAstNode>>,
pub(crate) suffix: Option<Vec<WasmAstNode>>,
}
to_wasm!(
WasmCitation,
Citation<'s>,
original,
wasm_context,
{ WasmAstNode::Citation(original) },
{ "citation".into() },
{
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
let prefix = original
.prefix
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
let suffix = original
.suffix
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmCitation {
style: original.style.map(|s| s.to_owned()),
prefix: if prefix.is_empty() {
None
} else {
Some(prefix)
},
suffix: if suffix.is_empty() {
None
} else {
Some(suffix)
},
},
))
}
);
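The empty-to-`None` conversions for `prefix` and `suffix` repeat the same pattern; it reduces to a small helper (`none_if_empty` is a hypothetical name, not part of the crate):

```rust
// Hypothetical helper: collapse an empty Vec to None, matching the
// prefix/suffix conversions above.
fn none_if_empty<T>(v: Vec<T>) -> Option<Vec<T>> {
    if v.is_empty() {
        None
    } else {
        Some(v)
    }
}

fn main() {
    assert_eq!(none_if_empty::<u8>(Vec::new()), None);
    assert_eq!(none_if_empty(vec![1, 2]), Some(vec![1, 2]));
}
```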


@@ -0,0 +1,62 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::CitationReference;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmCitationReference {
pub(crate) key: String,
pub(crate) prefix: Option<Vec<WasmAstNode>>,
pub(crate) suffix: Option<Vec<WasmAstNode>>,
}
to_wasm!(
WasmCitationReference,
CitationReference<'s>,
original,
wasm_context,
{ WasmAstNode::CitationReference(original) },
{ "citation-reference".into() },
{
let prefix = original
.prefix
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
let suffix = original
.suffix
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
Vec::new(),
WasmCitationReference {
key: original.key.to_owned(),
prefix: if prefix.is_empty() {
None
} else {
Some(prefix)
},
suffix: if suffix.is_empty() {
None
} else {
Some(suffix)
},
},
))
}
);

src/wasm/clock.rs (new file, +43)

@@ -0,0 +1,43 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Clock;
use crate::types::ClockStatus;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmClock {
#[serde(rename = "value")]
pub(crate) timestamp: Box<WasmAstNode>,
pub(crate) duration: Option<String>,
pub(crate) status: String,
}
to_wasm!(
WasmClock,
Clock<'s>,
original,
wasm_context,
{ WasmAstNode::Clock(original) },
{ "clock".into() },
{
Ok((
Vec::new(),
WasmClock {
timestamp: Box::new(Into::<WasmAstNode>::into(
original.timestamp.to_wasm(wasm_context.clone())?,
)),
duration: original.duration.map(|s| s.to_owned()),
status: match original.status {
ClockStatus::Running => "running",
ClockStatus::Closed => "closed",
}
.to_owned(),
},
))
}
);

src/wasm/code.rs (new file, +31)

@@ -0,0 +1,31 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Code;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmCode {
pub(crate) value: String,
}
to_wasm!(
WasmCode,
Code<'s>,
original,
wasm_context,
{ WasmAstNode::Code(original) },
{ "code".into() },
{
Ok((
Vec::new(),
WasmCode {
value: original.contents.to_owned(),
},
))
}
);

src/wasm/comment.rs (new file, +31)

@@ -0,0 +1,31 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Comment;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmComment {
pub(crate) value: String,
}
to_wasm!(
WasmComment,
Comment<'s>,
original,
wasm_context,
{ WasmAstNode::Comment(original) },
{ "comment".into() },
{
Ok((
Vec::new(),
WasmComment {
value: original.get_value(),
},
))
}
);

src/wasm/comment_block.rs (new file, +40)

@@ -0,0 +1,40 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::CommentBlock;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmCommentBlock {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) value: String,
}
to_wasm!(
WasmCommentBlock,
CommentBlock<'s>,
original,
wasm_context,
{ WasmAstNode::CommentBlock(original) },
{ "comment-block".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmCommentBlock {
additional_properties,
value: original.contents.to_owned(),
},
))
}
);

src/wasm/diary_sexp.rs (new file, +40)

@@ -0,0 +1,40 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::DiarySexp;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmDiarySexp {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) value: String,
}
to_wasm!(
WasmDiarySexp,
DiarySexp<'s>,
original,
wasm_context,
{ WasmAstNode::DiarySexp(original) },
{ "diary-sexp".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmDiarySexp {
additional_properties,
value: original.value.to_owned(),
},
))
}
);

src/wasm/document.rs (new file, +69)

@@ -0,0 +1,69 @@
use std::path::PathBuf;
use serde::Deserialize;
use serde::Serialize;
use super::additional_property::AdditionalProperties;
use super::additional_property::AdditionalPropertyValue;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Document;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmDocument {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
#[serde(rename = "CATEGORY")]
pub(crate) category: Option<String>,
pub(crate) path: Option<PathBuf>,
}
to_wasm!(
WasmDocument,
Document<'s>,
original,
wasm_context,
{ WasmAstNode::Document(original) },
{ "org-data".into() },
{
let category = original.category.as_deref();
let path = original.path.clone();
let mut additional_properties = AdditionalProperties::default();
for (name, val) in original.get_additional_properties().map(|node_property| {
(
node_property.property_name.to_uppercase(),
AdditionalPropertyValue::SingleString(node_property.value.unwrap_or("").to_owned()),
)
}) {
additional_properties.properties.insert(name, val);
}
let children = original
.zeroth_section
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.chain(original.children.iter().map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
}))
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmDocument {
additional_properties,
category: category.map(str::to_owned),
path,
},
))
}
);

src/wasm/drawer.rs (new file, +51)

@@ -0,0 +1,51 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::Drawer;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmDrawer {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
#[serde(rename = "drawer-name")]
pub(crate) drawer_name: String,
}
to_wasm!(
WasmDrawer,
Drawer<'s>,
original,
wasm_context,
{ WasmAstNode::Drawer(original) },
{ "drawer".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmDrawer {
additional_properties,
drawer_name: original.drawer_name.to_owned(),
},
))
}
);

src/wasm/dynamic_block.rs (new file, +53)

@@ -0,0 +1,53 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::DynamicBlock;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmDynamicBlock {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
#[serde(rename = "block-name")]
pub(crate) block_name: String,
pub(crate) arguments: Option<String>,
}
to_wasm!(
WasmDynamicBlock,
DynamicBlock<'s>,
original,
wasm_context,
{ WasmAstNode::DynamicBlock(original) },
{ "dynamic-block".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmDynamicBlock {
additional_properties,
block_name: original.block_name.to_owned(),
arguments: original.parameters.map(|s| s.to_owned()),
},
))
}
);

src/wasm/entity.rs (new file, +49)

@@ -0,0 +1,49 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::headline::Noop;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Entity;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmEntity {
pub(crate) name: String,
pub(crate) latex: String,
#[serde(rename = "latex-math-p")]
pub(crate) latex_math_mode: bool,
pub(crate) html: String,
pub(crate) ascii: String,
pub(crate) latin1: Noop,
#[serde(rename = "utf-8")]
pub(crate) utf8: String,
#[serde(rename = "use-brackets-p")]
pub(crate) use_brackets: bool,
}
to_wasm!(
WasmEntity,
Entity<'s>,
original,
wasm_context,
{ WasmAstNode::Entity(original) },
{ "entity".into() },
{
Ok((
Vec::new(),
WasmEntity {
name: original.name.to_owned(),
latex: original.latex.to_owned(),
latex_math_mode: original.latex_math_mode,
html: original.html.to_owned(),
ascii: original.ascii.to_owned(),
latin1: Noop {},
utf8: original.utf8.to_owned(),
use_brackets: original.use_brackets,
},
))
}
);

src/wasm/example_block.rs (new file, +75)

@@ -0,0 +1,75 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::src_block::WasmNumberLines;
use super::src_block::WasmNumberLinesWrapper;
use super::src_block::WasmRetainLabels;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::CharOffsetInLine;
use crate::types::ExampleBlock;
use crate::types::GetAffiliatedKeywords;
use crate::types::RetainLabels;
use crate::types::SwitchNumberLines;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmExampleBlock {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) value: String,
pub(crate) switches: Option<String>,
#[serde(rename = "number-lines")]
pub(crate) number_lines: Option<WasmNumberLinesWrapper>,
#[serde(rename = "preserve-indent")]
pub(crate) preserve_indent: Option<CharOffsetInLine>,
#[serde(rename = "retain-labels")]
pub(crate) retain_labels: WasmRetainLabels,
#[serde(rename = "use-labels")]
pub(crate) use_labels: bool,
#[serde(rename = "label-fmt")]
pub(crate) label_format: Option<String>,
}
to_wasm!(
WasmExampleBlock,
ExampleBlock<'s>,
original,
wasm_context,
{ WasmAstNode::ExampleBlock(original) },
{ "example-block".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmExampleBlock {
additional_properties,
value: original.get_value().into_owned(),
switches: original.switches.map(|s| s.to_owned()),
number_lines: match original.number_lines {
None => None,
Some(SwitchNumberLines::New(n)) => Some(WasmNumberLinesWrapper {
inner: WasmNumberLines::New(n),
}),
Some(SwitchNumberLines::Continued(n)) => Some(WasmNumberLinesWrapper {
inner: WasmNumberLines::Continued(n),
}),
},
preserve_indent: original.preserve_indent,
retain_labels: match original.retain_labels {
RetainLabels::No => WasmRetainLabels::YesNo(false),
RetainLabels::Yes => WasmRetainLabels::YesNo(true),
RetainLabels::Keep(n) => WasmRetainLabels::Keep(n),
},
use_labels: original.use_labels,
label_format: original.label_format.map(|s| s.to_owned()),
},
))
}
);

src/wasm/export_block.rs (new file, +43)

@@ -0,0 +1,43 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::ExportBlock;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmExportBlock {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
#[serde(rename = "type")]
pub(crate) export_type: Option<String>,
pub(crate) value: String,
}
to_wasm!(
WasmExportBlock,
ExportBlock<'s>,
original,
wasm_context,
{ WasmAstNode::ExportBlock(original) },
{ "export-block".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmExportBlock {
additional_properties,
export_type: original.get_export_type(),
value: original.get_value().into_owned(),
},
))
}
);


@@ -0,0 +1,34 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::ExportSnippet;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmExportSnippet {
#[serde(rename = "back-end")]
pub(crate) backend: String,
pub(crate) value: Option<String>,
}
to_wasm!(
WasmExportSnippet,
ExportSnippet<'s>,
original,
wasm_context,
{ WasmAstNode::ExportSnippet(original) },
{ "export-snippet".into() },
{
Ok((
Vec::new(),
WasmExportSnippet {
backend: original.backend.to_owned(),
value: original.contents.map(|s| s.to_owned()),
},
))
}
);


@@ -0,0 +1,42 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::FixedWidthArea;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmFixedWidthArea {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) value: String,
}
to_wasm!(
WasmFixedWidthArea,
FixedWidthArea<'s>,
original,
wasm_context,
{ WasmAstNode::FixedWidthArea(original) },
{ "fixed-width".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
let value = original.get_value();
Ok((
Vec::new(),
WasmFixedWidthArea {
additional_properties,
value,
},
))
}
);


@@ -0,0 +1,54 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::headline::Noop;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::FootnoteDefinition;
use crate::types::GetAffiliatedKeywords;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmFootnoteDefinition {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) label: String,
#[serde(rename = "pre-blank")]
pub(crate) pre_blank: Noop,
}
to_wasm!(
WasmFootnoteDefinition,
FootnoteDefinition<'s>,
original,
wasm_context,
{ WasmAstNode::FootnoteDefinition(original) },
{ "footnote-definition".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmFootnoteDefinition {
additional_properties,
label: original.label.to_owned(),
pre_blank: Noop {},
},
))
}
);


@@ -0,0 +1,49 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::FootnoteReference;
use crate::types::FootnoteReferenceType;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmFootnoteReference {
pub(crate) label: Option<String>,
#[serde(rename = "type")]
pub(crate) footnote_reference_type: String,
}
to_wasm!(
WasmFootnoteReference,
FootnoteReference<'s>,
original,
wasm_context,
{ WasmAstNode::FootnoteReference(original) },
{ "footnote-reference".into() },
{
let children = original
.definition
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmFootnoteReference {
label: original.label.map(|s| s.to_owned()),
footnote_reference_type: match original.get_type() {
FootnoteReferenceType::Standard => "standard",
FootnoteReferenceType::Inline => "inline",
}
.to_owned(),
},
))
}
);

src/wasm/headline.rs (new file, +140)

@@ -0,0 +1,140 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use super::AdditionalPropertyValue;
use crate::types::Heading;
use crate::types::HeadlineLevel;
use crate::types::PriorityCookie;
use crate::types::TodoKeywordType;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmHeadline {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) level: HeadlineLevel,
pub(crate) tags: Vec<String>,
#[serde(rename = "todo-keyword")]
pub(crate) todo_keyword: Option<String>,
#[serde(rename = "todo-type")]
pub(crate) todo_type: Option<String>,
pub(crate) title: Vec<WasmAstNode>,
pub(crate) priority: Option<PriorityCookie>,
#[serde(rename = "archivedp")]
pub(crate) is_archived: bool,
#[serde(rename = "commentedp")]
pub(crate) is_comment: bool,
#[serde(rename = "raw-value")]
pub(crate) raw_value: String,
#[serde(rename = "footnote-section-p")]
pub(crate) is_footnote_section: bool,
pub(crate) scheduled: Option<Box<WasmAstNode>>,
pub(crate) deadline: Option<Box<WasmAstNode>>,
pub(crate) closed: Option<Box<WasmAstNode>>,
#[serde(rename = "pre-blank")]
pub(crate) pre_blank: Noop,
}
#[derive(Debug, Serialize, Deserialize)]
#[serde(tag = "noop")]
pub struct Noop {}
to_wasm!(
WasmHeadline,
Heading<'s>,
original,
wasm_context,
{ WasmAstNode::Headline(original) },
{ "headline".into() },
{
let mut additional_properties = AdditionalProperties::default();
for (name, val) in original.get_additional_properties().map(|node_property| {
(
node_property.property_name.to_uppercase(),
AdditionalPropertyValue::SingleString(node_property.value.unwrap_or("").to_owned()),
)
}) {
additional_properties.properties.insert(name, val);
}
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmHeadline {
additional_properties,
level: original.level,
tags: original.tags.iter().map(|tag| (*tag).to_owned()).collect(),
todo_keyword: original
.todo_keyword
.as_ref()
.map(|(_, keyword)| (*keyword).to_owned()),
todo_type: original
.todo_keyword
.as_ref()
.map(|(keyword, _)| match keyword {
TodoKeywordType::Done => "done".to_owned(),
TodoKeywordType::Todo => "todo".to_owned(),
}),
title: original
.title
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?,
priority: original.priority_cookie,
is_archived: original.is_archived,
is_comment: original.is_comment,
raw_value: original.get_raw_value(),
is_footnote_section: original.is_footnote_section,
scheduled: original
.scheduled
.as_ref()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.map_or(Ok(None), |r| r.map(Some))?
.map(Box::new),
deadline: original
.deadline
.as_ref()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.map_or(Ok(None), |r| r.map(Some))?
.map(Box::new),
closed: original
.closed
.as_ref()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.map_or(Ok(None), |r| r.map(Some))?
.map(Box::new),
pre_blank: Noop {},
},
))
}
);

src/wasm/horizontal_rule.rs

@@ -0,0 +1,38 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::GetAffiliatedKeywords;
use crate::types::HorizontalRule;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmHorizontalRule {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
}
to_wasm!(
WasmHorizontalRule,
HorizontalRule<'s>,
original,
wasm_context,
{ WasmAstNode::HorizontalRule(original) },
{ "horizontal-rule".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmHorizontalRule {
additional_properties,
},
))
}
);

src/wasm/inline_babel_call.rs

@@ -0,0 +1,41 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::InlineBabelCall;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmInlineBabelCall {
pub(crate) call: String,
#[serde(rename = "inside-header")]
pub(crate) inside_header: Option<String>,
pub(crate) arguments: Option<String>,
#[serde(rename = "end-header")]
pub(crate) end_header: Option<String>,
pub(crate) value: String,
}
to_wasm!(
WasmInlineBabelCall,
InlineBabelCall<'s>,
original,
wasm_context,
{ WasmAstNode::InlineBabelCall(original) },
{ "inline-babel-call".into() },
{
Ok((
Vec::new(),
WasmInlineBabelCall {
call: original.call.to_owned(),
inside_header: original.inside_header.map(|s| s.to_owned()),
arguments: original.arguments.map(|s| s.to_owned()),
end_header: original.end_header.map(|s| s.to_owned()),
value: original.value.to_owned(),
},
))
}
);

src/wasm/inline_source_block.rs

@@ -0,0 +1,35 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::InlineSourceBlock;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmInlineSourceBlock {
pub(crate) language: String,
pub(crate) value: String,
pub(crate) parameters: Option<String>,
}
to_wasm!(
WasmInlineSourceBlock,
InlineSourceBlock<'s>,
original,
wasm_context,
{ WasmAstNode::InlineSourceBlock(original) },
{ "inline-src-block".into() },
{
Ok((
Vec::new(),
WasmInlineSourceBlock {
language: original.language.to_owned(),
value: original.value.to_owned(),
parameters: original.parameters.map(str::to_owned),
},
))
}
);

src/wasm/italic.rs

@@ -0,0 +1,34 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::Italic;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmItalic {}
to_wasm!(
WasmItalic,
Italic<'s>,
original,
wasm_context,
{ WasmAstNode::Italic(original) },
{ "italic".into() },
{
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((children, WasmItalic {}))
}
);

src/wasm/keyword.rs

@@ -0,0 +1,42 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::GetAffiliatedKeywords;
use crate::types::Keyword;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmKeyword {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) key: String,
pub(crate) value: String,
}
to_wasm!(
WasmKeyword,
Keyword<'s>,
original,
wasm_context,
{ WasmAstNode::Keyword(original) },
{ "keyword".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmKeyword {
additional_properties,
key: original.key.to_uppercase(),
value: original.value.to_owned(),
},
))
}
);

src/wasm/latex_environment.rs

@@ -0,0 +1,40 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::GetAffiliatedKeywords;
use crate::types::LatexEnvironment;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmLatexEnvironment {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
pub(crate) value: String,
}
to_wasm!(
WasmLatexEnvironment,
LatexEnvironment<'s>,
original,
wasm_context,
{ WasmAstNode::LatexEnvironment(original) },
{ "latex-environment".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
Ok((
Vec::new(),
WasmLatexEnvironment {
additional_properties,
value: original.value.to_owned(),
},
))
}
);

src/wasm/latex_fragment.rs

@@ -0,0 +1,31 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::LatexFragment;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmLatexFragment {
pub(crate) value: String,
}
to_wasm!(
WasmLatexFragment,
LatexFragment<'s>,
original,
wasm_context,
{ WasmAstNode::LatexFragment(original) },
{ "latex-fragment".into() },
{
Ok((
Vec::new(),
WasmLatexFragment {
value: original.value.to_owned(),
},
))
}
);

src/wasm/line_break.rs

@@ -0,0 +1,22 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::LineBreak;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmLineBreak {}
to_wasm!(
WasmLineBreak,
LineBreak<'s>,
original,
wasm_context,
{ WasmAstNode::LineBreak(original) },
{ "line-break".into() },
{ Ok((Vec::new(), WasmLineBreak {},)) }
);

src/wasm/macros.rs

@@ -0,0 +1,59 @@
/// Write the `ToWasm` implementation for an intermediate AST node.
///
/// This exists to make changing the type signature easier.
macro_rules! to_wasm {
($ostruct:ty, $istruct:ty, $original:ident, $wasm_context:ident, $fnbody:tt) => {
impl<'s> ToWasm for $istruct {
type Output = $ostruct;
fn to_wasm(
&self,
$wasm_context: crate::wasm::to_wasm::ToWasmContext<'_>,
) -> Result<Self::Output, crate::error::CustomError> {
let $original = self;
#[allow(unused_braces)]
$fnbody
}
}
};
($ostruct:ty, $istruct:ty, $original:ident, $wasm_context:ident, $toastnodebody:tt, $elispnamebody:tt, $fnbody:tt) => {
impl<'s> ToWasm for $istruct {
type Output = crate::wasm::ast_node::WasmAstNodeWrapper<$ostruct>;
fn to_wasm(
&self,
$wasm_context: crate::wasm::to_wasm::ToWasmContext<'_>,
) -> Result<Self::Output, crate::error::CustomError> {
#[allow(unused_variables)]
let $original = self;
let standard_properties =
self.to_wasm_standard_properties($wasm_context.clone())?;
$fnbody.map(
|(children, inner)| crate::wasm::ast_node::WasmAstNodeWrapper {
ast_node: inner.get_elisp_name().into_owned(),
standard_properties,
children,
properties: inner,
},
)
}
}
impl From<crate::wasm::ast_node::WasmAstNodeWrapper<$ostruct>> for WasmAstNode {
fn from($original: crate::wasm::ast_node::WasmAstNodeWrapper<$ostruct>) -> Self {
let ret = $toastnodebody;
ret
}
}
impl<'s> crate::util::elisp_fact::ElispFact<'s> for $ostruct {
fn get_elisp_name<'b>(&'b self) -> std::borrow::Cow<'s, str> {
let ret = $elispnamebody;
ret
}
}
};
}
pub(crate) use to_wasm;

src/wasm/mod.rs

@@ -0,0 +1,76 @@
mod additional_property;
mod angle_link;
mod ast_node;
mod babel_call;
mod bold;
mod center_block;
mod citation;
mod citation_reference;
mod clock;
mod code;
mod comment;
mod comment_block;
mod diary_sexp;
mod document;
mod drawer;
mod dynamic_block;
mod entity;
mod example_block;
mod export_block;
mod export_snippet;
mod fixed_width_area;
mod footnote_definition;
mod footnote_reference;
mod headline;
mod horizontal_rule;
mod inline_babel_call;
mod inline_source_block;
mod italic;
mod keyword;
mod latex_environment;
mod latex_fragment;
mod line_break;
mod macros;
mod node_property;
mod org_macro;
mod paragraph;
mod parse_result;
mod plain_link;
mod plain_list;
mod plain_list_item;
mod plain_text;
mod planning;
mod property_drawer;
mod quote_block;
mod radio_link;
mod radio_target;
mod regular_link;
mod section;
mod special_block;
mod src_block;
mod standard_properties;
mod statistics_cookie;
mod strike_through;
mod subscript;
mod superscript;
mod table;
mod table_cell;
mod table_row;
mod target;
mod timestamp;
mod to_wasm;
mod underline;
mod verbatim;
mod verse_block;
pub use additional_property::AdditionalProperties;
pub use additional_property::AdditionalPropertyValue;
pub use ast_node::WasmAstNode;
#[cfg(feature = "wasm_test")]
pub use ast_node::WasmAstNodeWrapper;
#[cfg(feature = "wasm_test")]
pub use document::WasmDocument;
pub use parse_result::ParseResult;
pub(crate) use standard_properties::WasmStandardProperties;
pub use to_wasm::ToWasm;
pub use to_wasm::ToWasmContext;

src/wasm/node_property.rs

@@ -0,0 +1,33 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::NodeProperty;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmNodeProperty {
pub(crate) key: String,
pub(crate) value: Option<String>,
}
to_wasm!(
WasmNodeProperty,
NodeProperty<'s>,
original,
wasm_context,
{ WasmAstNode::NodeProperty(original) },
{ "node-property".into() },
{
Ok((
Vec::new(),
WasmNodeProperty {
key: original.property_name.to_owned(),
value: original.value.map(|s| s.to_owned()),
},
))
}
);

src/wasm/org_macro.rs

@@ -0,0 +1,35 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::OrgMacro;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmOrgMacro {
pub(crate) key: String,
pub(crate) value: String,
pub(crate) args: Vec<String>,
}
to_wasm!(
WasmOrgMacro,
OrgMacro<'s>,
original,
wasm_context,
{ WasmAstNode::OrgMacro(original) },
{ "macro".into() },
{
Ok((
Vec::new(),
WasmOrgMacro {
key: original.key.to_lowercase(),
value: original.value.to_owned(),
args: original.get_args().map(|s| s.into_owned()).collect(),
},
))
}
);

src/wasm/paragraph.rs

@@ -0,0 +1,48 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use super::AdditionalProperties;
use crate::types::GetAffiliatedKeywords;
use crate::types::Paragraph;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
pub struct WasmParagraph {
#[serde(flatten)]
pub(crate) additional_properties: AdditionalProperties,
}
to_wasm!(
WasmParagraph,
Paragraph<'s>,
original,
wasm_context,
{ WasmAstNode::Paragraph(original) },
{ "paragraph".into() },
{
let additional_properties = original
.get_affiliated_keywords()
.to_wasm(wasm_context.clone())?;
let children = original
.children
.iter()
.map(|child| {
child
.to_wasm(wasm_context.clone())
.map(Into::<WasmAstNode>::into)
})
.collect::<Result<Vec<_>, _>>()?;
Ok((
children,
WasmParagraph {
additional_properties,
},
))
}
);

src/wasm/parse_result.rs

@@ -0,0 +1,15 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNodeWrapper;
use super::document::WasmDocument;
#[derive(Debug, Serialize, Deserialize)]
#[serde(tag = "status", content = "content")]
pub enum ParseResult {
#[serde(rename = "success")]
Success(WasmAstNodeWrapper<WasmDocument>),
#[serde(rename = "error")]
Error(String),
}

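The `#[serde(tag = "status", content = "content")]` attributes select serde's adjacently tagged enum representation: the variant name lands under `status` and the payload under `content`. A sketch of the resulting JSON shape, built by hand for illustration (the real crate serializes through serde; the `String` payload here is a stand-in for the wrapped `WasmDocument`):

```rust
// Hand-rolled illustration of serde's adjacently tagged representation
// as configured on ParseResult: {"status": <variant>, "content": <payload>}.
enum ParseResult {
    Success(String), // stand-in for WasmAstNodeWrapper<WasmDocument>
    Error(String),
}

fn to_json(result: &ParseResult) -> String {
    match result {
        ParseResult::Success(doc) => {
            format!(r#"{{"status":"success","content":"{doc}"}}"#)
        }
        ParseResult::Error(message) => {
            format!(r#"{{"status":"error","content":"{message}"}}"#)
        }
    }
}

fn main() {
    let err = ParseResult::Error("oops".to_owned());
    assert_eq!(to_json(&err), r#"{"status":"error","content":"oops"}"#);
    let ok = ParseResult::Success("doc".to_owned());
    assert_eq!(to_json(&ok), r#"{"status":"success","content":"doc"}"#);
    println!("ok");
}
```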
src/wasm/plain_link.rs

@@ -0,0 +1,52 @@
use serde::Deserialize;
use serde::Serialize;
use super::ast_node::WasmAstNode;
use super::macros::to_wasm;
use super::to_wasm::ToWasm;
use crate::types::LinkType;
use crate::types::PlainLink;
use crate::util::elisp_fact::ElispFact;
use crate::wasm::to_wasm::ToWasmStandardProperties;
#[derive(Debug, Serialize, Deserialize)]
#[serde(tag = "format")]
#[serde(rename = "plain")]
pub struct WasmPlainLink {
#[serde(rename = "type")]
pub(crate) link_type: String,
pub(crate) path: String,
#[serde(rename = "raw-link")]
pub(crate) raw_link: String,
pub(crate) application: Option<String>,
#[serde(rename = "search-option")]
pub(crate) search_option: Option<String>,
}
to_wasm!(
WasmPlainLink,
PlainLink<'s>,
original,
wasm_context,
{ WasmAstNode::PlainLink(original) },
{ "link".into() },
{
Ok((
Vec::new(),
WasmPlainLink {
link_type: match &original.link_type {
LinkType::File => "file".to_owned(),
LinkType::Protocol(protocol) => protocol.clone().into_owned(),
LinkType::Id => "id".to_owned(),
LinkType::CustomId => "custom-id".to_owned(),
LinkType::CodeRef => "coderef".to_owned(),
LinkType::Fuzzy => "fuzzy".to_owned(),
},
path: original.path.to_owned(),
raw_link: original.raw_link.to_owned(),
application: original.application.map(str::to_owned),
search_option: original.search_option.map(str::to_owned),
},
))
}
);

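The `link_type` match above maps fixed variants to static elisp-style names while `LinkType::Protocol` carries its own string as a `Cow`, so `clone().into_owned()` yields the protocol name without an extra copy when it is already owned. An illustrative sketch (this trimmed `LinkType` is a stand-in, not the crate's full enum):

```rust
use std::borrow::Cow;

// Map a link-type enum to its string name: fixed variants get static
// names; the Protocol variant supplies its own, via Cow::into_owned.
enum LinkType<'s> {
    File,
    Protocol(Cow<'s, str>),
    Fuzzy,
}

fn elisp_name(link_type: &LinkType<'_>) -> String {
    match link_type {
        LinkType::File => "file".to_owned(),
        LinkType::Protocol(protocol) => protocol.clone().into_owned(),
        LinkType::Fuzzy => "fuzzy".to_owned(),
    }
}

fn main() {
    assert_eq!(elisp_name(&LinkType::File), "file");
    assert_eq!(
        elisp_name(&LinkType::Protocol(Cow::Borrowed("https"))),
        "https"
    );
    assert_eq!(elisp_name(&LinkType::Fuzzy), "fuzzy");
    println!("ok");
}
```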
Some files were not shown because too many files have changed in this diff.