mirror of
https://github.com/ksyasuda/SubMiner.git
synced 2026-03-07 03:22:17 -08:00
Add annotation websocket and texthooker startup config
- Add `texthooker.launchAtStartup` (default `true`) and wire startup behavior
- Add dedicated `annotationWebsocket` config/service path (default port `6678`) for texthooker annotations
- Regenerate config example/tests and update Yomitan patching/vendor assets
@@ -0,0 +1,49 @@
---
id: TASK-100
title: 'Add configurable texthooker startup launch'
status: Done
assignee: []
created_date: '2026-03-06 23:30'
updated_date: '2026-03-07 01:59'
labels: []
dependencies: []
priority: medium
ordinal: 10000
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->

Add a config option under `texthooker` to launch the built-in texthooker server automatically when SubMiner starts.

Scope:

- Add `texthooker.launchAtStartup`.
- Default to `true`.
- Start the existing texthooker server during normal app startup when enabled.
- Keep `texthooker.openBrowser` as separate behavior.
- Add regression coverage and update generated config docs/example.

<!-- SECTION:DESCRIPTION:END -->
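As a sketch, the generated config example is expected to gain the new key alongside the existing `openBrowser` key (comments illustrative):

```jsonc
"texthooker": {
  "launchAtStartup": true, // Launch texthooker server automatically when SubMiner starts. Values: true | false
  "openBrowser": true // Open browser setting; kept independent of startup launch. Values: true | false
}, // Configure texthooker startup launch and browser opening behavior.
```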

## Acceptance Criteria

<!-- AC:BEGIN -->

- [x] #1 Default config enables automatic texthooker startup.
- [x] #2 Config parser accepts valid boolean values and warns on invalid values.
- [x] #3 App-ready startup launches texthooker when enabled.
- [x] #4 Generated config template/example documents the new option.

<!-- AC:END -->

## Final Summary

<!-- SECTION:FINAL_SUMMARY:BEGIN -->

Added `texthooker.launchAtStartup` with a default of `true`, wired it through config defaults/validation/template generation, and started the existing texthooker server during app-ready startup without coupling it to browser auto-open behavior.

Also added regression coverage for config parsing/template output and app-ready dependency wiring, then regenerated the checked-in config example artifacts.

<!-- SECTION:FINAL_SUMMARY:END -->

@@ -0,0 +1,38 @@
---
id: TASK-103
title: Add dedicated annotation websocket for texthooker
status: Done
assignee:
- codex
created_date: '2026-03-07 02:20'
updated_date: '2026-03-07 02:20'
labels:
- texthooker
- websocket
- subtitle
dependencies: []
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Add a separate annotated-subtitle websocket for the bundled texthooker so that token/JLPT/frequency markup is available on a stable dedicated port, even when the regular websocket is in `auto` mode and is skipped because `mpv_websocket` is installed.
<!-- SECTION:DESCRIPTION:END -->
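A minimal sketch of the new config section, using the defaults named in this task (comments illustrative):

```jsonc
"annotationWebsocket": {
  "enabled": true, // Annotated subtitle websocket server enabled state. Values: true | false
  "port": 6678 // Stable dedicated port; independent of the regular websocket's auto mode.
}, // Dedicated annotated subtitle websocket for bundled texthooker and token-aware clients.
```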

## Acceptance Criteria

<!-- AC:BEGIN -->
- [x] #1 Regular `websocket.enabled: "auto"` behavior remains unchanged and still skips the regular websocket when `mpv_websocket` is installed.
- [x] #2 A separate `annotationWebsocket` config controls an independent annotated websocket with default port `6678`.
- [x] #3 Bundled texthooker is pointed at the annotation websocket when it is enabled.
- [x] #4 Focused regression tests cover config parsing, startup wiring, and texthooker bootstrap injection.
<!-- AC:END -->

## Final Summary

<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Added `annotationWebsocket.enabled`/`annotationWebsocket.port` with defaults of `true`/`6678`, started that websocket independently from the regular auto-managed websocket, and injected the bundled texthooker websocket URL so it connects to the annotation feed by default.
Also added focused regression coverage and regenerated the checked-in config examples.
<!-- SECTION:FINAL_SUMMARY:END -->
@@ -0,0 +1,43 @@
---
id: TASK-104
title: Mirror overlay annotation hover behavior in vendored texthooker
status: Done
assignee:
- codex
created_date: '2026-03-06 21:45'
updated_date: '2026-03-06 21:45'
labels:
- texthooker
- subtitle
- websocket
dependencies:
- TASK-103
references:
- /home/sudacode/projects/japanese/SubMiner/src/core/services/subtitle-ws.ts
- /home/sudacode/projects/japanese/SubMiner/vendor/texthooker-ui/src/components/App.svelte
- /home/sudacode/projects/japanese/SubMiner/vendor/texthooker-ui/src/line-markup.ts
- /home/sudacode/projects/japanese/SubMiner/vendor/texthooker-ui/src/app.css
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Bring the bundled texthooker's annotation rendering closer to the visible overlay. Keep the lightweight texthooker UX, but preserve token metadata for hover, match the overlay's color-precedence rules across known/N+1/name/frequency/JLPT, expose name-match highlighting as a toggle, and emit a structured annotation payload on the dedicated websocket so non-SubMiner clients can treat it as an API.
<!-- SECTION:DESCRIPTION:END -->
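A hedged sketch of the structured payload shape implied by this task — the top-level `version`/`text`/`sentence`/`tokens` keys come from this task's summary, while the per-token field names are hypothetical:

```jsonc
{
  "version": 1, // payload schema version
  "text": "...", // plain subtitle text
  "sentence": "<span>...</span>", // rendered annotation HTML for existing `sentence` consumers
  "tokens": [
    // per-token field names here are hypothetical, not the shipped schema
    { "surface": "...", "jlpt": "N3", "frequencyRank": 1200, "isName": false }
  ]
}
```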

## Acceptance Criteria

<!-- AC:BEGIN -->
- [x] #1 Annotation websocket payload includes both rendered `sentence` HTML and structured token metadata for generic clients.
- [x] #2 Vendored texthooker preserves annotation metadata attrs needed for hover labels and uses overlay-matching color precedence rules.
- [x] #3 Vendored texthooker supports character-name highlighting with a user-facing toggle and standalone-web note.
- [x] #4 Hovering annotated texthooker tokens reveals JLPT/frequency metadata without adding the full overlay popup workflow.
- [x] #5 Focused serializer, texthooker markup, socket parsing, CSS, and build verification pass.
<!-- AC:END -->

## Final Summary

<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Extended the dedicated annotation websocket payload to ship `version`, plain `text`, rendered `sentence`, and structured `tokens` metadata while keeping backward-compatible `sentence` consumers working. Updated the vendored texthooker to preserve hover metadata attrs, follow overlay color precedence for known/N+1/name/frequency/JLPT annotations, add a character-name highlight toggle plus standalone-web dictionary note, and render lightweight hover labels for frequency/JLPT metadata. Added focused regression coverage and rebuilt both the vendored texthooker bundle and SubMiner.
<!-- SECTION:FINAL_SUMMARY:END -->
@@ -0,0 +1,47 @@
---
id: TASK-105
title: Stop local docs artifact writes after docs repo split
status: Done
assignee: []
created_date: '2026-03-07 00:00'
updated_date: '2026-03-07 00:20'
labels: []
dependencies: []
priority: medium
ordinal: 10500
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->

Now that user-facing docs live in `../subminer-docs`, first-party scripts in this repo should not keep writing generated artifacts into the local `docs/` tree.

Scope:

- Audit first-party scripts/automation for writes to `docs/`.
- Keep repo-local outputs only where they are still intentionally owned by this repo.
- Repoint generated docs artifacts to `../subminer-docs` when that is the maintained source of truth.
- Add regression coverage for the config-example generation path contract.

<!-- SECTION:DESCRIPTION:END -->
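The sibling-repo detection described above could be sketched as follows; `resolveDocsTarget` and its layout assumptions are illustrative, not the actual `src/generate-config-example.ts` code:

```typescript
import * as fs from "node:fs";
import * as path from "node:path";

// Resolve where the generated config example should be written.
// Returns null when the sibling docs repo is absent, so the caller
// skips the write instead of falling back to a repo-local docs/ path.
function resolveDocsTarget(repoRoot: string): string | null {
  const docsRepo = path.resolve(repoRoot, "..", "subminer-docs");
  if (!fs.existsSync(docsRepo)) {
    return null;
  }
  return path.join(docsRepo, "public", "config.example.jsonc");
}
```

A guard like this keeps the output-path contract easy to cover with a regression test, without ever touching a real docs checkout.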

## Acceptance Criteria

<!-- AC:BEGIN -->

- [x] #1 The config-example generator no longer writes to `docs/public/config.example.jsonc` inside this repo.
- [x] #2 When `../subminer-docs` exists, the generator updates `../subminer-docs/public/config.example.jsonc`.
- [x] #3 Automated coverage guards the output-path contract so local docs writes do not regress.

<!-- AC:END -->

## Final Summary

<!-- SECTION:FINAL_SUMMARY:BEGIN -->

Removed the first-party local `docs/public` config-example write path from `src/generate-config-example.ts` and replaced it with sibling-docs-repo detection that targets `../subminer-docs/public/config.example.jsonc` only when that repo exists.
Added a project-local regression suite for output-path resolution and artifact writing, wired that suite into the maintained config test lane, and removed the stale generated `docs/public/config.example.jsonc` artifact from the working tree.
<!-- SECTION:FINAL_SUMMARY:END -->
@@ -5,6 +5,7 @@
* Copy to $XDG_CONFIG_HOME/SubMiner/config.jsonc (or ~/.config/SubMiner/config.jsonc) and edit as needed.
*/
{

// ==========================================
// Overlay Auto-Start
// When overlay connects to mpv, automatically show overlay and hide mpv subtitles.
@@ -13,11 +14,12 @@

// ==========================================
// Texthooker Server
// Control whether browser opens automatically for texthooker.
// Configure texthooker startup launch and browser opening behavior.
// ==========================================
"texthooker": {
"openBrowser": true, // Open browser setting. Values: true | false
}, // Control whether browser opens automatically for texthooker.
"launchAtStartup": true, // Launch texthooker server automatically when SubMiner starts. Values: true | false
"openBrowser": true // Open browser setting. Values: true | false
}, // Configure texthooker startup launch and browser opening behavior.

// ==========================================
// WebSocket Server
@@ -26,17 +28,41 @@
// ==========================================
"websocket": {
"enabled": "auto", // Built-in subtitle websocket server mode. Values: auto | true | false
"port": 6677, // Built-in subtitle websocket server port.
"port": 6677 // Built-in subtitle websocket server port.
}, // Built-in WebSocket server broadcasts subtitle text to connected clients.

// ==========================================
// Annotation WebSocket
// Dedicated annotated subtitle websocket for bundled texthooker and token-aware clients.
// Independent from websocket.auto and defaults to port 6678.
// ==========================================
"annotationWebsocket": {
"enabled": true, // Annotated subtitle websocket server enabled state. Values: true | false
"port": 6678 // Annotated subtitle websocket server port.
}, // Dedicated annotated subtitle websocket for bundled texthooker and token-aware clients.

// ==========================================
// Logging
// Controls logging verbosity.
// Set to debug for full runtime diagnostics.
// ==========================================
"logging": {
"level": "info", // Minimum log level for runtime logging. Values: debug | info | warn | error
}, // Controls logging verbosity. Keep this as an object; do not replace with a bare string.
"level": "info" // Minimum log level for runtime logging. Values: debug | info | warn | error
}, // Controls logging verbosity.

// ==========================================
// Startup Warmups
// Background warmup controls for MeCab, Yomitan, dictionaries, and Jellyfin session.
// Disable individual warmups to defer load until first real usage.
// lowPowerMode defers all warmups except Yomitan extension.
// ==========================================
"startupWarmups": {
"lowPowerMode": false, // Defer startup warmups except Yomitan extension. Values: true | false
"mecab": true, // Warm up MeCab tokenizer at startup. Values: true | false
"yomitanExtension": true, // Warm up Yomitan extension at startup. Values: true | false
"subtitleDictionaries": true, // Warm up subtitle dictionaries at startup. Values: true | false
"jellyfinRemoteSession": true // Warm up Jellyfin remote session at startup. Values: true | false
}, // Background warmup controls for MeCab, Yomitan, dictionaries, and Jellyfin session.

// ==========================================
// Keyboard Shortcuts
@@ -56,7 +82,7 @@
"toggleSecondarySub": "CommandOrControl+Shift+V", // Toggle secondary sub setting.
"markAudioCard": "CommandOrControl+Shift+A", // Mark audio card setting.
"openRuntimeOptions": "CommandOrControl+Shift+O", // Open runtime options setting.
"openJimaku": "Ctrl+Shift+J", // Open jimaku setting.
"openJimaku": "Ctrl+Shift+J" // Open jimaku setting.
}, // Overlay keyboard shortcuts. Set a shortcut to null to disable.

// ==========================================
@@ -76,7 +102,7 @@
"secondarySub": {
"secondarySubLanguages": [], // Secondary sub languages setting.
"autoLoadSecondarySub": false, // Auto load secondary sub setting. Values: true | false
"defaultMode": "hover", // Default mode setting.
"defaultMode": "hover" // Default mode setting.
}, // Dual subtitle track options.

// ==========================================
@@ -88,7 +114,7 @@
"alass_path": "", // Alass path setting.
"ffsubsync_path": "", // Ffsubsync path setting.
"ffmpeg_path": "", // Ffmpeg path setting.
"replace": true, // Replace active subtitle file when synchronization succeeds.
"replace": true // Replace the active subtitle file when sync completes. Values: true | false
}, // Subsync engine and executable paths.

// ==========================================
@@ -96,7 +122,7 @@
// Initial vertical subtitle position from the bottom.
// ==========================================
"subtitlePosition": {
"yPercent": 10, // Y percent setting.
"yPercent": 10 // Y percent setting.
}, // Initial vertical subtitle position from the bottom.

// ==========================================
@@ -108,8 +134,11 @@
"enableJlpt": false, // Enable JLPT vocabulary level underlines. When disabled, JLPT tagging lookup and underlines are skipped. Values: true | false
"preserveLineBreaks": false, // Preserve line breaks in visible overlay subtitle rendering. When false, line breaks are flattened to spaces for a single-line flow. Values: true | false
"autoPauseVideoOnHover": true, // Automatically pause mpv playback while hovering subtitle text, then resume on leave. Values: true | false
"autoPauseVideoOnYomitanPopup": false, // Automatically pause mpv playback while Yomitan popup is open, then resume when popup closes. Values: true | false
"hoverTokenColor": "#f4dbd6", // Hex color used for hovered subtitle token highlight in mpv.
"hoverTokenBackgroundColor": "rgba(54, 58, 79, 0.84)", // CSS color used for hovered subtitle token background highlight in mpv.
"nameMatchEnabled": true, // Enable subtitle token coloring for matches from the SubMiner character dictionary. Values: true | false
"nameMatchColor": "#f5bde6", // Hex color used when a subtitle token matches an entry from the SubMiner character dictionary.
"fontFamily": "M PLUS 1 Medium, Source Han Sans JP, Noto Sans CJK JP", // Font family setting.
"fontSize": 35, // Font size setting.
"fontColor": "#cad3f5", // Font color setting.
@@ -130,16 +159,22 @@
"N2": "#f5a97f", // N2 setting.
"N3": "#f9e2af", // N3 setting.
"N4": "#a6e3a1", // N4 setting.
"N5": "#8aadf4", // N5 setting.
"N5": "#8aadf4" // N5 setting.
}, // Jlpt colors setting.
"frequencyDictionary": {
"enabled": false, // Enable frequency-dictionary-based highlighting based on token rank. Values: true | false
"sourcePath": "", // Optional absolute path to a frequency dictionary directory. If empty, SubMiner searches installed/default frequency-dictionary locations.
"sourcePath": "", // Optional absolute path to a frequency dictionary directory. If empty, built-in discovery search paths are used.
"topX": 1000, // Only color tokens with frequency rank <= topX (default: 1000).
"mode": "single", // single: use one color for all matching tokens. banded: use color ramp by frequency band. Values: single | banded
"matchMode": "headword", // Frequency lookup text selection mode. Values: headword | surface
"matchMode": "headword", // headword: frequency lookup uses dictionary form. surface: lookup uses subtitle-visible token text. Values: headword | surface
"singleColor": "#f5a97f", // Color used when frequencyDictionary.mode is `single`.
"bandedColors": ["#ed8796", "#f5a97f", "#f9e2af", "#8bd5ca", "#8aadf4"], // Five colors used for rank bands when mode is `banded` (from most common to least within topX).
"bandedColors": [
"#ed8796",
"#f5a97f",
"#f9e2af",
"#8bd5ca",
"#8aadf4"
] // Five colors used for rank bands when mode is `banded` (from most common to least within topX).
}, // Frequency dictionary setting.
"secondary": {
"fontFamily": "Inter, Noto Sans, Helvetica Neue, sans-serif", // Font family setting.
@@ -154,8 +189,8 @@
"backgroundColor": "transparent", // Background color setting.
"backdropFilter": "blur(6px)", // Backdrop filter setting.
"fontWeight": "normal", // Font weight setting.
"fontStyle": "normal", // Font style setting.
}, // Secondary setting.
"fontStyle": "normal" // Font style setting.
} // Secondary setting.
}, // Primary and secondary subtitle styling.

// ==========================================
@@ -169,18 +204,20 @@
"url": "http://127.0.0.1:8765", // Url setting.
"pollingRate": 3000, // Polling interval in milliseconds.
"proxy": {
"enabled": false, // Enable local AnkiConnect-compatible proxy for push-based auto-enrichment. Values: true | false
"enabled": true, // Enable local AnkiConnect-compatible proxy for push-based auto-enrichment. Values: true | false
"host": "127.0.0.1", // Bind host for local AnkiConnect proxy.
"port": 8766, // Bind port for local AnkiConnect proxy.
"upstreamUrl": "http://127.0.0.1:8765", // Upstream AnkiConnect URL proxied by local AnkiConnect proxy.
"upstreamUrl": "http://127.0.0.1:8765" // Upstream AnkiConnect URL proxied by local AnkiConnect proxy.
}, // Proxy setting.
"tags": ["SubMiner"], // Tags to add to cards mined or updated by SubMiner. Provide an empty array to disable automatic tagging.
"tags": [
"SubMiner"
], // Tags to add to cards mined or updated by SubMiner. Provide an empty array to disable automatic tagging.
"fields": {
"audio": "ExpressionAudio", // Audio setting.
"image": "Picture", // Image setting.
"sentence": "Sentence", // Sentence setting.
"miscInfo": "MiscInfo", // Misc info setting.
"translation": "SelectionText", // Translation setting.
"translation": "SelectionText" // Translation setting.
}, // Fields setting.
"ai": {
"enabled": false, // Enabled setting. Values: true | false
@@ -189,7 +226,7 @@
"model": "openai/gpt-4o-mini", // Model setting.
"baseUrl": "https://openrouter.ai/api", // Base url setting.
"targetLanguage": "English", // Target language setting.
"systemPrompt": "You are a translation engine. Return only the translated text with no explanations.", // System prompt setting.
"systemPrompt": "You are a translation engine. Return only the translated text with no explanations." // System prompt setting.
}, // Ai setting.
"media": {
"generateAudio": true, // Generate audio setting. Values: true | false
@@ -202,7 +239,7 @@
"animatedCrf": 35, // Animated crf setting.
"audioPadding": 0.5, // Audio padding setting.
"fallbackDuration": 3, // Fallback duration setting.
"maxMediaDuration": 30, // Max media duration setting.
"maxMediaDuration": 30 // Max media duration setting.
}, // Media setting.
"behavior": {
"overwriteAudio": true, // Overwrite audio setting. Values: true | false
@@ -210,7 +247,7 @@
"mediaInsertMode": "append", // Media insert mode setting.
"highlightWord": true, // Highlight word setting. Values: true | false
"notificationType": "osd", // Notification type setting.
"autoUpdateNewCards": true, // Automatically update newly added cards. Values: true | false
"autoUpdateNewCards": true // Automatically update newly added cards. Values: true | false
}, // Behavior setting.
"nPlusOne": {
"highlightEnabled": false, // Enable fast local highlighting for words already known in Anki. Values: true | false
@@ -219,20 +256,20 @@
"decks": [], // Decks used for N+1 known-word cache scope. Supports one or more deck names.
"minSentenceWords": 3, // Minimum sentence word count required for N+1 targeting (default: 3).
"nPlusOne": "#c6a0f6", // Color used for the single N+1 target token highlight.
"knownWord": "#a6da95", // Color used for legacy known-word highlights.
"knownWord": "#a6da95" // Color used for legacy known-word highlights.
}, // N plus one setting.
"metadata": {
"pattern": "[SubMiner] %f (%t)", // Pattern setting.
"pattern": "[SubMiner] %f (%t)" // Pattern setting.
}, // Metadata setting.
"isLapis": {
"enabled": false, // Enabled setting. Values: true | false
"sentenceCardModel": "Japanese sentences", // Sentence card model setting.
"sentenceCardModel": "Japanese sentences" // Sentence card model setting.
}, // Is lapis setting.
"isKiku": {
"enabled": false, // Enabled setting. Values: true | false
"fieldGrouping": "disabled", // Kiku duplicate-card field grouping mode. Values: auto | manual | disabled
"deleteDuplicateInAuto": true, // Delete duplicate in auto setting. Values: true | false
}, // Is kiku setting.
"deleteDuplicateInAuto": true // Delete duplicate in auto setting. Values: true | false
} // Is kiku setting.
}, // Automatic Anki updates and media generation options.

// ==========================================
@@ -242,7 +279,7 @@
"jimaku": {
"apiBaseUrl": "https://jimaku.cc", // Api base url setting.
"languagePreference": "ja", // Preferred language used in Jimaku search. Values: ja | en | none
"maxEntryResults": 10, // Maximum Jimaku search results returned.
"maxEntryResults": 10 // Maximum Jimaku search results returned.
}, // Jimaku API configuration and defaults.

// ==========================================
@@ -253,16 +290,33 @@
"mode": "automatic", // YouTube subtitle generation mode for the launcher script. Values: automatic | preprocess | off
"whisperBin": "", // Path to whisper.cpp CLI used as fallback transcription engine.
"whisperModel": "", // Path to whisper model used for fallback transcription.
"primarySubLanguages": ["ja", "jpn"], // Comma-separated primary subtitle language priority used by the launcher.
"primarySubLanguages": [
"ja",
"jpn"
] // Comma-separated primary subtitle language priority used by the launcher.
}, // Defaults for subminer YouTube subtitle extraction/transcription mode.

// ==========================================
// Anilist
// Anilist API credentials and update behavior.
// Includes optional auto-sync for a merged MRU-based character dictionary in bundled Yomitan.
// Character dictionaries are keyed by AniList media ID (no season/franchise merge).
// ==========================================
"anilist": {
"enabled": false, // Enable AniList post-watch progress updates. Values: true | false
"accessToken": "", // Optional explicit AniList access token override; leave empty to use locally stored token from setup.
"characterDictionary": {
"enabled": false, // Enable automatic Yomitan character dictionary sync for currently watched AniList media. Values: true | false
"refreshTtlHours": 168, // Legacy setting; merged character dictionary retention is now usage-based and this value is ignored.
"maxLoaded": 3, // Maximum number of most-recently-used anime snapshots included in the merged Yomitan character dictionary.
"evictionPolicy": "delete", // Legacy setting; merged character dictionary eviction is usage-based and this value is ignored. Values: disable | delete
"profileScope": "all", // Yomitan profile scope for dictionary enable/disable updates. Values: all | active
"collapsibleSections": {
"description": false, // Open the Description section by default in character dictionary glossary entries. Values: true | false
"characterInformation": false, // Open the Character Information section by default in character dictionary glossary entries. Values: true | false
"voicedBy": false // Open the Voiced by section by default in character dictionary glossary entries. Values: true | false
} // Collapsible sections setting.
} // Character dictionary setting.
}, // Anilist API credentials and update behavior.

// ==========================================
@@ -286,8 +340,16 @@
"pullPictures": false, // Enable Jellyfin poster/icon fetching for launcher menus. Values: true | false
"iconCacheDir": "/tmp/subminer-jellyfin-icons", // Directory used by launcher for cached Jellyfin poster icons.
"directPlayPreferred": true, // Try direct play before server-managed transcoding when possible. Values: true | false
"directPlayContainers": ["mkv", "mp4", "webm", "mov", "flac", "mp3", "aac"], // Container allowlist for direct play decisions.
"transcodeVideoCodec": "h264", // Preferred transcode video codec when direct play is unavailable.
"directPlayContainers": [
"mkv",
"mp4",
"webm",
"mov",
"flac",
"mp3",
"aac"
], // Container allowlist for direct play decisions.
"transcodeVideoCodec": "h264" // Preferred transcode video codec when direct play is unavailable.
}, // Optional Jellyfin integration for auth, browsing, and playback launch.

// ==========================================
@@ -298,7 +360,7 @@
"discordPresence": {
"enabled": false, // Enable optional Discord Rich Presence updates. Values: true | false
"updateIntervalMs": 3000, // Minimum interval between presence payload updates.
"debounceMs": 750, // Debounce delay used to collapse bursty presence updates.
"debounceMs": 750 // Debounce delay used to collapse bursty presence updates.
}, // Optional Discord Rich Presence activity card updates for current playback/study session.

// ==========================================
@@ -320,7 +382,7 @@
"telemetryDays": 30, // Telemetry retention window in days.
"dailyRollupsDays": 365, // Daily rollup retention window in days.
"monthlyRollupsDays": 1825, // Monthly rollup retention window in days.
"vacuumIntervalDays": 7, // Minimum days between VACUUM runs.
}, // Retention setting.
}, // Enable/disable immersion tracking.
"vacuumIntervalDays": 7 // Minimum days between VACUUM runs.
} // Retention setting.
} // Enable/disable immersion tracking.
}

@@ -29,12 +29,38 @@ set -e

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
YOMITAN_DIR="${1:-$SCRIPT_DIR/../vendor/yomitan}"
YOMITAN_MANIFEST_PATH="$YOMITAN_DIR/manifest.json"

if [ ! -d "$YOMITAN_DIR" ]; then
    echo "Error: Yomitan directory not found: $YOMITAN_DIR"
    exit 1
fi

if [ ! -f "$YOMITAN_MANIFEST_PATH" ]; then
    echo "Error: manifest.json not found at $YOMITAN_MANIFEST_PATH"
    exit 1
fi

echo "Patching manifest.json..."
if node - "$YOMITAN_MANIFEST_PATH" <<'PATCH_EOF'
const fs = require('node:fs');
const path = process.argv[2];
const manifest = JSON.parse(fs.readFileSync(path, 'utf8'));
const stableKey = 'MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAxclvOy2sunfRa2UeSV/L9xyuMR9V65z85mbKCy0XvSLUkTBXM8BnvnrDu1DHhLjpidA3cBtetVt7rzwsJSA6/CzlMmtG6L6//3MOAH5Mhng8tXXWXbuNuJobLv/7MORPqoqYKZuoL1bnUvjdrf4Pb3BBDZtHN8LcDz13gOO4dnEFQbSE4F5RQ4mIQAGMkmbmlJkwFk5I022XyX+cWm/+9VvwPuEDA1Qf7X1G+4use3hGYWVPcRb6xTp7swXsO/fP7auE51gYQD0Ht36wr32UR6lfRmsahbHOX4RLe36S8B4ee74kk5C8iCsZf2fidWmevzLk7kK0GW15pv3dpGFpPQIDAQAB';
if (manifest.key === stableKey) {
    process.exit(0);
}
manifest.key = stableKey;
fs.writeFileSync(path, `${JSON.stringify(manifest, null, 4)}\n`, 'utf8');
process.exit(0);
PATCH_EOF
then
    echo " - Set stable manifest key in manifest.json"
else
    echo " - Failed to patch manifest.json"
    exit 1
fi

echo "Patching Yomitan in: $YOMITAN_DIR"

PERMISSIONS_UTIL="$YOMITAN_DIR/js/data/permissions-util.js"

@@ -16,6 +16,9 @@ test('loads defaults when config is missing', () => {
const service = new ConfigService(dir);
const config = service.getConfig();
assert.equal(config.websocket.port, DEFAULT_CONFIG.websocket.port);
assert.equal(config.annotationWebsocket.enabled, DEFAULT_CONFIG.annotationWebsocket.enabled);
assert.equal(config.annotationWebsocket.port, DEFAULT_CONFIG.annotationWebsocket.port);
assert.equal(config.texthooker.launchAtStartup, true);
assert.equal(config.ankiConnect.behavior.autoUpdateNewCards, true);
assert.deepEqual(config.ankiConnect.tags, ['SubMiner']);
assert.equal(config.anilist.enabled, false);
@@ -24,6 +27,9 @@ test('loads defaults when config is missing', () => {
assert.equal(config.anilist.characterDictionary.maxLoaded, 3);
assert.equal(config.anilist.characterDictionary.evictionPolicy, 'delete');
assert.equal(config.anilist.characterDictionary.profileScope, 'all');
assert.equal(config.anilist.characterDictionary.collapsibleSections.description, false);
assert.equal(config.anilist.characterDictionary.collapsibleSections.characterInformation, false);
assert.equal(config.anilist.characterDictionary.collapsibleSections.voicedBy, false);
assert.equal(config.jellyfin.remoteControlEnabled, true);
assert.equal(config.jellyfin.remoteControlAutoConnect, true);
assert.equal(config.jellyfin.autoAnnounce, false);
@@ -128,6 +134,94 @@ test('parses subtitleStyle.preserveLineBreaks and warns on invalid values', () =
);
});

test('parses texthooker.launchAtStartup and warns on invalid values', () => {
const validDir = makeTempDir();
fs.writeFileSync(
path.join(validDir, 'config.jsonc'),
`{
"texthooker": {
"launchAtStartup": false
}
}`,
'utf-8',
);

const validService = new ConfigService(validDir);
assert.equal(validService.getConfig().texthooker.launchAtStartup, false);

const invalidDir = makeTempDir();
fs.writeFileSync(
path.join(invalidDir, 'config.jsonc'),
`{
"texthooker": {
"launchAtStartup": "yes"
}
}`,
'utf-8',
);

const invalidService = new ConfigService(invalidDir);
assert.equal(
invalidService.getConfig().texthooker.launchAtStartup,
DEFAULT_CONFIG.texthooker.launchAtStartup,
);
assert.ok(
invalidService
.getWarnings()
.some((warning) => warning.path === 'texthooker.launchAtStartup'),
);
});

test('parses annotationWebsocket settings and warns on invalid values', () => {
const validDir = makeTempDir();
fs.writeFileSync(
path.join(validDir, 'config.jsonc'),
`{
"annotationWebsocket": {
"enabled": false,
"port": 7788
}
}`,
'utf-8',
);

const validService = new ConfigService(validDir);
assert.equal(validService.getConfig().annotationWebsocket.enabled, false);
assert.equal(validService.getConfig().annotationWebsocket.port, 7788);

const invalidDir = makeTempDir();
fs.writeFileSync(
path.join(invalidDir, 'config.jsonc'),
`{
"annotationWebsocket": {
"enabled": "yes",
"port": "bad"
}
}`,
'utf-8',
);

const invalidService = new ConfigService(invalidDir);
assert.equal(
invalidService.getConfig().annotationWebsocket.enabled,
DEFAULT_CONFIG.annotationWebsocket.enabled,
);
assert.equal(
invalidService.getConfig().annotationWebsocket.port,
DEFAULT_CONFIG.annotationWebsocket.port,
||||
);
|
||||
assert.ok(
|
||||
invalidService
|
||||
.getWarnings()
|
||||
.some((warning) => warning.path === 'annotationWebsocket.enabled'),
|
||||
);
|
||||
assert.ok(
|
||||
invalidService
|
||||
.getWarnings()
|
||||
.some((warning) => warning.path === 'annotationWebsocket.port'),
|
||||
);
|
||||
});
|
||||
|
||||
test('parses subtitleStyle.autoPauseVideoOnHover and warns on invalid values', () => {
|
||||
const validDir = makeTempDir();
|
||||
fs.writeFileSync(
|
||||
@@ -417,6 +511,39 @@ test('parses anilist.characterDictionary config with clamping and enum validatio
|
||||
assert.ok(warnings.some((warning) => warning.path === 'anilist.characterDictionary.profileScope'));
|
||||
});
|
||||
|
||||
test('parses anilist.characterDictionary.collapsibleSections booleans and warns on invalid values', () => {
|
||||
const dir = makeTempDir();
|
||||
fs.writeFileSync(
|
||||
path.join(dir, 'config.jsonc'),
|
||||
`{
|
||||
"anilist": {
|
||||
"characterDictionary": {
|
||||
"collapsibleSections": {
|
||||
"description": true,
|
||||
"characterInformation": "yes",
|
||||
"voicedBy": true
|
||||
}
|
||||
}
|
||||
}
|
||||
}`,
|
||||
'utf-8',
|
||||
);
|
||||
|
||||
const service = new ConfigService(dir);
|
||||
const config = service.getConfig();
|
||||
const warnings = service.getWarnings();
|
||||
|
||||
assert.equal(config.anilist.characterDictionary.collapsibleSections.description, true);
|
||||
assert.equal(config.anilist.characterDictionary.collapsibleSections.characterInformation, false);
|
||||
assert.equal(config.anilist.characterDictionary.collapsibleSections.voicedBy, true);
|
||||
assert.ok(
|
||||
warnings.some(
|
||||
(warning) =>
|
||||
warning.path === 'anilist.characterDictionary.collapsibleSections.characterInformation',
|
||||
),
|
||||
);
|
||||
});
|
||||
|
||||
test('parses jellyfin remote control fields', () => {
|
||||
const dir = makeTempDir();
|
||||
fs.writeFileSync(
|
||||
@@ -840,6 +967,10 @@ test('warning emission order is deterministic across reloads', () => {
|
||||
"enabled": "sometimes",
|
||||
"port": -1
|
||||
},
|
||||
"annotationWebsocket": {
|
||||
"enabled": "sometimes",
|
||||
"port": -1
|
||||
},
|
||||
"logging": {
|
||||
"level": "trace"
|
||||
}
|
||||
@@ -856,7 +987,14 @@ test('warning emission order is deterministic across reloads', () => {
|
||||
assert.deepEqual(secondWarnings, firstWarnings);
|
||||
assert.deepEqual(
|
||||
firstWarnings.map((warning) => warning.path),
|
||||
['unknownFeature', 'websocket.enabled', 'websocket.port', 'logging.level'],
|
||||
[
|
||||
'unknownFeature',
|
||||
'websocket.enabled',
|
||||
'websocket.port',
|
||||
'annotationWebsocket.enabled',
|
||||
'annotationWebsocket.port',
|
||||
'logging.level',
|
||||
],
|
||||
);
|
||||
});
|
||||
|
||||
@@ -1426,8 +1564,17 @@ test('template generator includes known keys', () => {
|
||||
output,
|
||||
/"enabled": "auto",? \/\/ Built-in subtitle websocket server mode\. Values: auto \| true \| false/,
|
||||
);
|
||||
assert.match(
|
||||
output,
|
||||
/"enabled": true,? \/\/ Annotated subtitle websocket server enabled state\. Values: true \| false/,
|
||||
);
|
||||
assert.match(output, /"port": 6678,? \/\/ Annotated subtitle websocket server port\./);
|
||||
assert.match(
|
||||
output,
|
||||
/"enabled": false,? \/\/ Enable AnkiConnect integration\. Values: true \| false/,
|
||||
);
|
||||
assert.match(
|
||||
output,
|
||||
/"launchAtStartup": true,? \/\/ Launch texthooker server automatically when SubMiner starts\. Values: true \| false/,
|
||||
);
|
||||
});
|
||||
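Taken together, the defaults asserted above map to a user-facing fragment of the generated config like the following sketch (comments paraphrase the option descriptions from the registry; the exact generated wording and ordering may differ):

```jsonc
{
  // Built-in subtitle websocket server.
  "websocket": {
    "enabled": "auto", // Values: auto | true | false
    "port": 6677
  },
  // Dedicated annotated subtitle websocket for the bundled texthooker
  // and token-aware clients; independent from websocket.auto.
  "annotationWebsocket": {
    "enabled": true,
    "port": 6678
  },
  "texthooker": {
    "launchAtStartup": true, // Launch texthooker server when SubMiner starts.
    "openBrowser": true
  }
}
```

Setting `texthooker.launchAtStartup` to `false` keeps the server off at startup while leaving `texthooker.openBrowser` behavior untouched.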

@@ -22,6 +22,7 @@ const {
  subtitlePosition,
  keybindings,
  websocket,
  annotationWebsocket,
  logging,
  texthooker,
  shortcuts,
@@ -39,6 +40,7 @@ export const DEFAULT_CONFIG: ResolvedConfig = {
  subtitlePosition,
  keybindings,
  websocket,
  annotationWebsocket,
  logging,
  texthooker,
  ankiConnect,

@@ -5,6 +5,7 @@ export const CORE_DEFAULT_CONFIG: Pick<
  | 'subtitlePosition'
  | 'keybindings'
  | 'websocket'
  | 'annotationWebsocket'
  | 'logging'
  | 'texthooker'
  | 'shortcuts'
@@ -19,10 +20,15 @@ export const CORE_DEFAULT_CONFIG: Pick<
    enabled: 'auto',
    port: 6677,
  },
  annotationWebsocket: {
    enabled: true,
    port: 6678,
  },
  logging: {
    level: 'info',
  },
  texthooker: {
    launchAtStartup: true,
    openBrowser: true,
  },
  shortcuts: {

@@ -18,11 +18,13 @@ test('config option registry includes critical paths and has unique entries', ()

  for (const requiredPath of [
    'logging.level',
    'annotationWebsocket.enabled',
    'startupWarmups.lowPowerMode',
    'subtitleStyle.enableJlpt',
    'subtitleStyle.autoPauseVideoOnYomitanPopup',
    'ankiConnect.enabled',
    'anilist.characterDictionary.enabled',
    'anilist.characterDictionary.collapsibleSections.description',
    'immersionTracking.enabled',
  ]) {
    assert.ok(paths.includes(requiredPath), `missing config path: ${requiredPath}`);
@@ -35,6 +37,7 @@ test('config template sections include expected domains and unique keys', () =>
  const keys = CONFIG_TEMPLATE_SECTIONS.map((section) => section.key);
  const requiredKeys: (typeof keys)[number][] = [
    'websocket',
    'annotationWebsocket',
    'startupWarmups',
    'subtitleStyle',
    'ankiConnect',

@@ -12,6 +12,12 @@ export function buildCoreConfigOptionRegistry(
      defaultValue: defaultConfig.logging.level,
      description: 'Minimum log level for runtime logging.',
    },
    {
      path: 'texthooker.launchAtStartup',
      kind: 'boolean',
      defaultValue: defaultConfig.texthooker.launchAtStartup,
      description: 'Launch texthooker server automatically when SubMiner starts.',
    },
    {
      path: 'websocket.enabled',
      kind: 'enum',
@@ -25,6 +31,18 @@ export function buildCoreConfigOptionRegistry(
      defaultValue: defaultConfig.websocket.port,
      description: 'Built-in subtitle websocket server port.',
    },
    {
      path: 'annotationWebsocket.enabled',
      kind: 'boolean',
      defaultValue: defaultConfig.annotationWebsocket.enabled,
      description: 'Annotated subtitle websocket server enabled state.',
    },
    {
      path: 'annotationWebsocket.port',
      kind: 'number',
      defaultValue: defaultConfig.annotationWebsocket.port,
      description: 'Annotated subtitle websocket server port.',
    },
    {
      path: 'subsync.defaultMode',
      kind: 'enum',

@@ -10,7 +10,7 @@ const CORE_TEMPLATE_SECTIONS: ConfigTemplateSection[] = [
  },
  {
    title: 'Texthooker Server',
    description: ['Control whether browser opens automatically for texthooker.'],
    description: ['Configure texthooker startup launch and browser opening behavior.'],
    key: 'texthooker',
  },
  {
@@ -21,6 +21,14 @@ const CORE_TEMPLATE_SECTIONS: ConfigTemplateSection[] = [
    ],
    key: 'websocket',
  },
  {
    title: 'Annotation WebSocket',
    description: [
      'Dedicated annotated subtitle websocket for bundled texthooker and token-aware clients.',
      'Independent from websocket.auto and defaults to port 6678.',
    ],
    key: 'annotationWebsocket',
  },
  {
    title: 'Logging',
    description: ['Controls logging verbosity.', 'Set to debug for full runtime diagnostics.'],

@@ -5,6 +5,18 @@ export function applyCoreDomainConfig(context: ResolveContext): void {
  const { src, resolved, warn } = context;

  if (isObject(src.texthooker)) {
    const launchAtStartup = asBoolean(src.texthooker.launchAtStartup);
    if (launchAtStartup !== undefined) {
      resolved.texthooker.launchAtStartup = launchAtStartup;
    } else if (src.texthooker.launchAtStartup !== undefined) {
      warn(
        'texthooker.launchAtStartup',
        src.texthooker.launchAtStartup,
        resolved.texthooker.launchAtStartup,
        'Expected boolean.',
      );
    }

    const openBrowser = asBoolean(src.texthooker.openBrowser);
    if (openBrowser !== undefined) {
      resolved.texthooker.openBrowser = openBrowser;
@@ -44,6 +56,32 @@ export function applyCoreDomainConfig(context: ResolveContext): void {
    }
  }

  if (isObject(src.annotationWebsocket)) {
    const enabled = asBoolean(src.annotationWebsocket.enabled);
    if (enabled !== undefined) {
      resolved.annotationWebsocket.enabled = enabled;
    } else if (src.annotationWebsocket.enabled !== undefined) {
      warn(
        'annotationWebsocket.enabled',
        src.annotationWebsocket.enabled,
        resolved.annotationWebsocket.enabled,
        'Expected boolean.',
      );
    }

    const port = asNumber(src.annotationWebsocket.port);
    if (port !== undefined && port > 0 && port <= 65535) {
      resolved.annotationWebsocket.port = Math.floor(port);
    } else if (src.annotationWebsocket.port !== undefined) {
      warn(
        'annotationWebsocket.port',
        src.annotationWebsocket.port,
        resolved.annotationWebsocket.port,
        'Expected integer between 1 and 65535.',
      );
    }
  }

  if (isObject(src.logging)) {
    const logLevel = asString(src.logging.level);
    if (
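In isolation, the clamp-or-warn port rule used above behaves like this standalone sketch (illustrative names only; the repo's actual `asNumber` and `warn` helpers are not reproduced here):

```typescript
// Illustrative standalone version of the port rule: accept numbers in
// (0, 65535], floor fractional values, otherwise keep the fallback and
// record the warning path.
function resolvePort(value: unknown, fallback: number, warnings: string[]): number {
  if (typeof value === 'number' && value > 0 && value <= 65535) {
    return Math.floor(value);
  }
  if (value !== undefined) {
    warnings.push('annotationWebsocket.port');
  }
  return fallback;
}

const warnings: string[] = [];
console.log(resolvePort(7788.9, 6678, warnings)); // in-range fractional value is floored
console.log(resolvePort('bad', 6678, warnings)); // invalid value falls back, warning recorded
```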
@@ -1,6 +1,10 @@
import test from 'node:test';
import assert from 'node:assert/strict';
import { serializeSubtitleMarkup, serializeSubtitleWebsocketMessage } from './subtitle-ws';
import {
  serializeInitialSubtitleWebsocketMessage,
  serializeSubtitleMarkup,
  serializeSubtitleWebsocketMessage,
} from './subtitle-ws';
import { PartOfSpeech, type SubtitleData } from '../../types';

const frequencyOptions = {
@@ -78,6 +82,51 @@ test('serializeSubtitleMarkup includes known, n+1, jlpt, and frequency classes',
  assert.match(markup, /word word-frequency-band-1/);
});

test('serializeSubtitleMarkup preserves tooltip attrs and name-match precedence', () => {
  const payload: SubtitleData = {
    text: 'ignored',
    tokens: [
      {
        surface: '無事',
        reading: 'ぶじ',
        headword: '無事',
        startPos: 0,
        endPos: 2,
        partOfSpeech: PartOfSpeech.other,
        isMerged: false,
        isKnown: true,
        isNPlusOneTarget: false,
        jlptLevel: 'N2',
        frequencyRank: 745,
      },
      {
        surface: 'アレクシア',
        reading: 'あれくしあ',
        headword: 'アレクシア',
        startPos: 2,
        endPos: 7,
        partOfSpeech: PartOfSpeech.other,
        isMerged: false,
        isKnown: false,
        isNPlusOneTarget: false,
        isNameMatch: true,
        frequencyRank: 12,
      },
    ],
  };

  const markup = serializeSubtitleMarkup(payload, frequencyOptions);
  assert.match(
    markup,
    /<span class="word word-known word-jlpt-n2" data-reading="ぶじ" data-headword="無事" data-frequency-rank="745" data-jlpt-level="N2">無事<\/span>/,
  );
  assert.match(
    markup,
    /<span class="word word-name-match" data-reading="あれくしあ" data-headword="アレクシア" data-frequency-rank="12">アレクシア<\/span>/,
  );
  assert.doesNotMatch(markup, /word-name-match word-known|word-known word-name-match/);
});

test('serializeSubtitleWebsocketMessage emits sentence payload', () => {
  const payload: SubtitleData = {
    text: '字幕',
@@ -85,5 +134,101 @@ test('serializeSubtitleWebsocketMessage emits sentence payload', () => {
  };

  const raw = serializeSubtitleWebsocketMessage(payload, frequencyOptions);
  assert.deepEqual(JSON.parse(raw), { sentence: '字幕' });
  assert.deepEqual(JSON.parse(raw), {
    version: 1,
    text: '字幕',
    sentence: '字幕',
    tokens: [],
  });
});

test('serializeSubtitleWebsocketMessage emits structured token api payload', () => {
  const payload: SubtitleData = {
    text: '無事',
    tokens: [
      {
        surface: '無事',
        reading: 'ぶじ',
        headword: '無事',
        startPos: 0,
        endPos: 2,
        partOfSpeech: PartOfSpeech.other,
        isMerged: false,
        isKnown: true,
        isNPlusOneTarget: false,
        jlptLevel: 'N2',
        frequencyRank: 745,
      },
    ],
  };

  const raw = serializeSubtitleWebsocketMessage(payload, frequencyOptions);
  assert.deepEqual(JSON.parse(raw), {
    version: 1,
    text: '無事',
    sentence:
      '<span class="word word-known word-jlpt-n2" data-reading="ぶじ" data-headword="無事" data-frequency-rank="745" data-jlpt-level="N2">無事</span>',
    tokens: [
      {
        surface: '無事',
        reading: 'ぶじ',
        headword: '無事',
        startPos: 0,
        endPos: 2,
        partOfSpeech: PartOfSpeech.other,
        isMerged: false,
        isKnown: true,
        isNPlusOneTarget: false,
        isNameMatch: false,
        jlptLevel: 'N2',
        frequencyRank: 745,
        className: 'word word-known word-jlpt-n2',
        frequencyRankLabel: '745',
        jlptLevelLabel: 'N2',
      },
    ],
  });
});

test('serializeInitialSubtitleWebsocketMessage keeps annotated current subtitle content', () => {
  const payload: SubtitleData = {
    text: 'ignored fallback',
    tokens: [
      {
        surface: '既知',
        reading: '',
        headword: '',
        startPos: 0,
        endPos: 2,
        partOfSpeech: PartOfSpeech.other,
        isMerged: false,
        isKnown: true,
        isNPlusOneTarget: false,
      },
    ],
  };

  const raw = serializeInitialSubtitleWebsocketMessage(payload, frequencyOptions);
  assert.deepEqual(JSON.parse(raw ?? ''), {
    version: 1,
    text: 'ignored fallback',
    sentence: '<span class="word word-known">既知</span>',
    tokens: [
      {
        surface: '既知',
        reading: '',
        headword: '',
        startPos: 0,
        endPos: 2,
        partOfSpeech: PartOfSpeech.other,
        isMerged: false,
        isKnown: true,
        isNPlusOneTarget: false,
        isNameMatch: false,
        className: 'word word-known',
        frequencyRankLabel: null,
        jlptLevelLabel: null,
      },
    ],
  });
});

@@ -18,6 +18,26 @@ export type SubtitleWebsocketFrequencyOptions = {
  mode: 'single' | 'banded';
};

type SerializedSubtitleToken = Pick<
  MergedToken,
  | 'surface'
  | 'reading'
  | 'headword'
  | 'startPos'
  | 'endPos'
  | 'partOfSpeech'
  | 'isMerged'
  | 'isKnown'
  | 'isNPlusOneTarget'
  | 'frequencyRank'
  | 'jlptLevel'
> & {
  isNameMatch: boolean;
  className: string;
  frequencyRankLabel: string | null;
  jlptLevelLabel: string | null;
};

function escapeHtml(text: string): string {
  return text
    .replaceAll('&', '&amp;')
@@ -46,11 +66,29 @@ function computeFrequencyClass(
  return 'word-frequency-single';
}

function getFrequencyRankLabel(
  token: MergedToken,
  options: SubtitleWebsocketFrequencyOptions,
): string | null {
  if (!options.enabled) return null;
  if (typeof token.frequencyRank !== 'number' || !Number.isFinite(token.frequencyRank)) return null;

  const rank = Math.max(1, Math.floor(token.frequencyRank));
  const topX = Math.max(1, Math.floor(options.topX));
  return rank <= topX ? String(rank) : null;
}

function getJlptLevelLabel(token: MergedToken): string | null {
  return token.jlptLevel ?? null;
}

function computeWordClass(token: MergedToken, options: SubtitleWebsocketFrequencyOptions): string {
  const classes = ['word'];

  if (token.isNPlusOneTarget) {
    classes.push('word-n-plus-one');
  } else if (token.isNameMatch) {
    classes.push('word-name-match');
  } else if (token.isKnown) {
    classes.push('word-known');
  }
@@ -59,7 +97,7 @@ function computeWordClass(token: MergedToken, options: SubtitleWebsocketFrequenc
    classes.push(`word-jlpt-${token.jlptLevel.toLowerCase()}`);
  }

  if (!token.isKnown && !token.isNPlusOneTarget) {
  if (!token.isKnown && !token.isNPlusOneTarget && !token.isNameMatch) {
    const frequencyClass = computeFrequencyClass(token, options);
    if (frequencyClass) {
      classes.push(frequencyClass);
@@ -69,6 +107,55 @@ function computeWordClass(token: MergedToken, options: SubtitleWebsocketFrequenc
  return classes.join(' ');
}

function serializeWordDataAttributes(
  token: MergedToken,
  options: SubtitleWebsocketFrequencyOptions,
): string {
  const attributes: string[] = [];

  if (token.reading) {
    attributes.push(`data-reading="${escapeHtml(token.reading)}"`);
  }
  if (token.headword) {
    attributes.push(`data-headword="${escapeHtml(token.headword)}"`);
  }

  const frequencyRankLabel = getFrequencyRankLabel(token, options);
  if (frequencyRankLabel) {
    attributes.push(`data-frequency-rank="${escapeHtml(frequencyRankLabel)}"`);
  }

  const jlptLevelLabel = getJlptLevelLabel(token);
  if (jlptLevelLabel) {
    attributes.push(`data-jlpt-level="${escapeHtml(jlptLevelLabel)}"`);
  }

  return attributes.length > 0 ? ` ${attributes.join(' ')}` : '';
}

function serializeSubtitleToken(
  token: MergedToken,
  options: SubtitleWebsocketFrequencyOptions,
): SerializedSubtitleToken {
  return {
    surface: token.surface,
    reading: token.reading,
    headword: token.headword,
    startPos: token.startPos,
    endPos: token.endPos,
    partOfSpeech: token.partOfSpeech,
    isMerged: token.isMerged,
    isKnown: token.isKnown,
    isNPlusOneTarget: token.isNPlusOneTarget,
    isNameMatch: token.isNameMatch ?? false,
    jlptLevel: token.jlptLevel,
    frequencyRank: token.frequencyRank,
    className: computeWordClass(token, options),
    frequencyRankLabel: getFrequencyRankLabel(token, options),
    jlptLevelLabel: getJlptLevelLabel(token),
  };
}

export function serializeSubtitleMarkup(
  payload: SubtitleData,
  options: SubtitleWebsocketFrequencyOptions,
@@ -80,11 +167,12 @@ export function serializeSubtitleMarkup(
  const chunks: string[] = [];
  for (const token of payload.tokens) {
    const klass = computeWordClass(token, options);
    const attrs = serializeWordDataAttributes(token, options);
    const parts = token.surface.split('\n');
    for (let index = 0; index < parts.length; index += 1) {
      const part = parts[index];
      if (part) {
        chunks.push(`<span class="${klass}">${escapeHtml(part)}</span>`);
        chunks.push(`<span class="${klass}"${attrs}>${escapeHtml(part)}</span>`);
      }
      if (index < parts.length - 1) {
        chunks.push('<br>');
@@ -99,7 +187,23 @@ export function serializeSubtitleWebsocketMessage(
  payload: SubtitleData,
  options: SubtitleWebsocketFrequencyOptions,
): string {
  return JSON.stringify({ sentence: serializeSubtitleMarkup(payload, options) });
  return JSON.stringify({
    version: 1,
    text: payload.text,
    sentence: serializeSubtitleMarkup(payload, options),
    tokens: payload.tokens?.map((token) => serializeSubtitleToken(token, options)) ?? [],
  });
}

export function serializeInitialSubtitleWebsocketMessage(
  payload: SubtitleData | null,
  options: SubtitleWebsocketFrequencyOptions,
): string | null {
  if (!payload || !payload.text.trim()) {
    return null;
  }

  return serializeSubtitleWebsocketMessage(payload, options);
}

export class SubtitleWebSocket {
@@ -114,7 +218,11 @@ export class SubtitleWebSocket {
    return (this.server?.clients.size ?? 0) > 0;
  }

  public start(port: number, getCurrentSubtitleText: () => string): void {
  public start(
    port: number,
    getCurrentSubtitleData: () => SubtitleData | null,
    getFrequencyOptions: () => SubtitleWebsocketFrequencyOptions,
  ): void {
    this.server = new WebSocket.Server({ port, host: '127.0.0.1' });

    this.server.on('connection', (ws: WebSocket) => {
@@ -124,9 +232,12 @@ export class SubtitleWebSocket {
        return;
      }

      const currentText = getCurrentSubtitleText();
      if (currentText) {
        ws.send(JSON.stringify({ sentence: currentText }));
      const currentMessage = serializeInitialSubtitleWebsocketMessage(
        getCurrentSubtitleData(),
        getFrequencyOptions(),
      );
      if (currentMessage) {
        ws.send(currentMessage);
      }
    });
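On the client side, the version-1 payload asserted in these tests can be consumed with a small handler like this sketch (the message shape comes from the serializer tests above; the handler and its name are hypothetical):

```typescript
// Shape of the version-1 annotated subtitle message as asserted in the
// serializer tests: pre-rendered HTML in `sentence` plus structured tokens.
type AnnotatedToken = {
  surface: string;
  className: string;
  frequencyRankLabel: string | null;
  jlptLevelLabel: string | null;
};

type AnnotatedSubtitleMessage = {
  version: number;
  text: string;
  sentence: string;
  tokens: AnnotatedToken[];
};

// Hypothetical helper: parse one websocket message and collect the surfaces
// of tokens the serializer did not mark as known.
function collectUnknownSurfaces(raw: string): string[] {
  const message = JSON.parse(raw) as AnnotatedSubtitleMessage;
  if (message.version !== 1) return [];
  return message.tokens
    .filter((token) => !token.className.split(' ').includes('word-known'))
    .map((token) => token.surface);
}
```

A token-aware client would call this in its `onmessage` handler instead of injecting the pre-rendered `sentence` HTML directly.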
src/core/services/texthooker.test.ts (new file, 27 lines)
@@ -0,0 +1,27 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { injectTexthookerBootstrapHtml } from './texthooker';

test('injectTexthookerBootstrapHtml injects websocket bootstrap before head close', () => {
  const html = '<html><head><title>Texthooker</title></head><body></body></html>';

  const actual = injectTexthookerBootstrapHtml(html, 'ws://127.0.0.1:6678');

  assert.match(
    actual,
    /window\.localStorage\.setItem\('bannou-texthooker-websocketUrl', "ws:\/\/127\.0\.0\.1:6678"\)/,
  );
  assert.ok(actual.indexOf('</script></head>') !== -1);
  assert.ok(actual.includes("bannou-texthooker-websocketUrl"));
  assert.ok(!actual.includes('bannou-texthooker-enableKnownWordColoring'));
  assert.ok(!actual.includes('bannou-texthooker-enableNPlusOneColoring'));
  assert.ok(!actual.includes('bannou-texthooker-enableNameMatchColoring'));
  assert.ok(!actual.includes('bannou-texthooker-enableFrequencyColoring'));
  assert.ok(!actual.includes('bannou-texthooker-enableJlptColoring'));
});

test('injectTexthookerBootstrapHtml leaves html unchanged without websocketUrl', () => {
  const html = '<html><head></head><body></body></html>';

  assert.equal(injectTexthookerBootstrapHtml(html), html);
});

@@ -5,6 +5,22 @@ import { createLogger } from '../../logger';

const logger = createLogger('main:texthooker');

export function injectTexthookerBootstrapHtml(html: string, websocketUrl?: string): string {
  if (!websocketUrl) {
    return html;
  }

  const bootstrapScript = `<script>window.localStorage.setItem('bannou-texthooker-websocketUrl', ${JSON.stringify(
    websocketUrl,
  )});</script>`;

  if (html.includes('</head>')) {
    return html.replace('</head>', `${bootstrapScript}</head>`);
  }

  return `${bootstrapScript}${html}`;
}

export class Texthooker {
  private server: http.Server | null = null;

@@ -12,7 +28,11 @@ export class Texthooker {
    return this.server !== null;
  }

  public start(port: number): http.Server | null {
  public start(port: number, websocketUrl?: string): http.Server | null {
    if (this.server) {
      return this.server;
    }

    const texthookerPath = this.getTexthookerPath();
    if (!texthookerPath) {
      logger.error('texthooker-ui not found');
@@ -42,8 +62,12 @@ export class Texthooker {
        res.end('Not found');
        return;
      }
      const responseData =
        urlPath === '/' || urlPath === '/index.html'
          ? Buffer.from(injectTexthookerBootstrapHtml(data.toString('utf-8'), websocketUrl))
          : data;
      res.writeHead(200, { 'Content-Type': mimeTypes[ext] || 'text/plain' });
      res.end(data);
      res.end(responseData);
    });
  });
src/generate-config-example.test.ts (new file, 82 lines)
@@ -0,0 +1,82 @@
import assert from 'node:assert/strict';
import fs from 'node:fs';
import path from 'node:path';
import test from 'node:test';
import {
  resolveConfigExampleOutputPaths,
  writeConfigExampleArtifacts,
} from './generate-config-example';

function createWorkspace(name: string): string {
  const baseDir = path.join(process.cwd(), '.tmp', 'generate-config-example-test');
  fs.mkdirSync(baseDir, { recursive: true });
  return fs.mkdtempSync(path.join(baseDir, `${name}-`));
}

test('resolveConfigExampleOutputPaths includes sibling docs repo and never local docs/public', () => {
  const workspace = createWorkspace('with-docs-repo');
  const projectRoot = path.join(workspace, 'SubMiner');
  const docsRepoRoot = path.join(workspace, 'subminer-docs');

  fs.mkdirSync(projectRoot, { recursive: true });
  fs.mkdirSync(docsRepoRoot, { recursive: true });

  try {
    const outputPaths = resolveConfigExampleOutputPaths({ cwd: projectRoot });

    assert.deepEqual(outputPaths, [
      path.join(projectRoot, 'config.example.jsonc'),
      path.join(docsRepoRoot, 'public', 'config.example.jsonc'),
    ]);
    assert.equal(
      outputPaths.includes(path.join(projectRoot, 'docs', 'public', 'config.example.jsonc')),
      false,
    );
  } finally {
    fs.rmSync(workspace, { recursive: true, force: true });
  }
});

test('resolveConfigExampleOutputPaths stays repo-local when sibling docs repo is absent', () => {
  const workspace = createWorkspace('without-docs-repo');
  const projectRoot = path.join(workspace, 'SubMiner');

  fs.mkdirSync(projectRoot, { recursive: true });

  try {
    const outputPaths = resolveConfigExampleOutputPaths({ cwd: projectRoot });

    assert.deepEqual(outputPaths, [path.join(projectRoot, 'config.example.jsonc')]);
  } finally {
    fs.rmSync(workspace, { recursive: true, force: true });
  }
});

test('writeConfigExampleArtifacts creates parent directories for resolved outputs', () => {
  const workspace = createWorkspace('write-artifacts');
  const projectRoot = path.join(workspace, 'SubMiner');
  const docsRepoRoot = path.join(workspace, 'subminer-docs');
  const template = '{\n "ok": true\n}\n';

  fs.mkdirSync(projectRoot, { recursive: true });
  fs.mkdirSync(docsRepoRoot, { recursive: true });

  try {
    const writtenPaths = writeConfigExampleArtifacts(template, {
      cwd: projectRoot,
      deps: { log: () => {} },
    });

    assert.deepEqual(writtenPaths, [
      path.join(projectRoot, 'config.example.jsonc'),
      path.join(docsRepoRoot, 'public', 'config.example.jsonc'),
    ]);
    assert.equal(fs.readFileSync(path.join(projectRoot, 'config.example.jsonc'), 'utf8'), template);
    assert.equal(
      fs.readFileSync(path.join(docsRepoRoot, 'public', 'config.example.jsonc'), 'utf8'),
      template,
    );
  } finally {
    fs.rmSync(workspace, { recursive: true, force: true });
  }
});
@@ -2,18 +2,62 @@ import * as fs from 'fs';
|
||||
import * as path from 'path';
|
||||
import * as fs from 'fs';
import * as path from 'path';
import { DEFAULT_CONFIG, generateConfigTemplate } from './config';

type ConfigExampleFsDeps = {
  existsSync?: (candidate: string) => boolean;
  mkdirSync?: (candidate: string, options: { recursive: true }) => void;
  writeFileSync?: (candidate: string, content: string, encoding: BufferEncoding) => void;
  log?: (message: string) => void;
};

export function resolveConfigExampleOutputPaths(options?: {
  cwd?: string;
  docsRepoName?: string;
  existsSync?: (candidate: string) => boolean;
}): string[] {
  const cwd = options?.cwd ?? process.cwd();
  const existsSync = options?.existsSync ?? fs.existsSync;
  const docsRepoName = options?.docsRepoName ?? 'subminer-docs';
  const outputPaths = [path.join(cwd, 'config.example.jsonc')];
  const docsRepoRoot = path.resolve(cwd, '..', docsRepoName);

  if (existsSync(docsRepoRoot)) {
    outputPaths.push(path.join(docsRepoRoot, 'public', 'config.example.jsonc'));
  }

  return outputPaths;
}

export function writeConfigExampleArtifacts(
  template: string,
  options?: {
    cwd?: string;
    docsRepoName?: string;
    deps?: ConfigExampleFsDeps;
  },
): string[] {
  const mkdirSync = options?.deps?.mkdirSync ?? fs.mkdirSync;
  const writeFileSync = options?.deps?.writeFileSync ?? fs.writeFileSync;
  const log = options?.deps?.log ?? console.log;
  const outputPaths = resolveConfigExampleOutputPaths({
    cwd: options?.cwd,
    docsRepoName: options?.docsRepoName,
    existsSync: options?.deps?.existsSync,
  });

  for (const outputPath of outputPaths) {
    mkdirSync(path.dirname(outputPath), { recursive: true });
    writeFileSync(outputPath, template, 'utf-8');
    log(`Generated ${outputPath}`);
  }

  return outputPaths;
}

function main(): void {
  const template = generateConfigTemplate(DEFAULT_CONFIG);
  writeConfigExampleArtifacts(template);
}

if (require.main === module) {
  main();
}
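The injected-dependency shape above makes the writer testable without touching the real filesystem. A minimal sketch of that idea (a simplified stand-in for `writeConfigExampleArtifacts`, inlined here so it is self-contained; the `/repo` path and `subminer-docs` sibling are illustrative, not taken from the repo):

```typescript
import * as path from 'path';

// Hypothetical fs-deps bundle mirroring ConfigExampleFsDeps from the script.
type FsDeps = {
  existsSync: (p: string) => boolean;
  mkdirSync: (p: string, opts: { recursive: true }) => void;
  writeFileSync: (p: string, content: string, enc: string) => void;
  log: (msg: string) => void;
};

// Simplified writer: one path in the repo root, plus a docs-repo copy when
// the sibling checkout exists (per the injected existsSync).
function writeArtifacts(template: string, cwd: string, deps: FsDeps): string[] {
  const outputPaths = [path.join(cwd, 'config.example.jsonc')];
  const docsRepoRoot = path.resolve(cwd, '..', 'subminer-docs');
  if (deps.existsSync(docsRepoRoot)) {
    outputPaths.push(path.join(docsRepoRoot, 'public', 'config.example.jsonc'));
  }
  for (const outputPath of outputPaths) {
    deps.mkdirSync(path.dirname(outputPath), { recursive: true });
    deps.writeFileSync(outputPath, template, 'utf-8');
    deps.log(`Generated ${outputPath}`);
  }
  return outputPaths;
}

// Fake deps: record writes instead of performing them.
const written: string[] = [];
const result = writeArtifacts('// template', '/repo', {
  existsSync: () => true, // pretend ../subminer-docs exists
  mkdirSync: () => {},
  writeFileSync: (p) => written.push(p),
  log: () => {},
});
console.log(result.length); // 2: repo-root copy plus docs-repo copy
```

Because every side effect flows through the deps object, a regression test can assert on the recorded paths alone.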
@@ -114,7 +114,13 @@ export interface WebSocketConfig {
     port?: number;
 }
 
+export interface AnnotationWebSocketConfig {
+    enabled?: boolean;
+    port?: number;
+}
+
 export interface TexthookerConfig {
+    launchAtStartup?: boolean;
     openBrowser?: boolean;
 }
 
@@ -479,6 +485,7 @@ export interface Config {
     subtitlePosition?: SubtitlePosition;
     keybindings?: Keybinding[];
     websocket?: WebSocketConfig;
+    annotationWebsocket?: AnnotationWebSocketConfig;
     texthooker?: TexthookerConfig;
     ankiConnect?: AnkiConnectConfig;
     shortcuts?: ShortcutsConfig;
@@ -504,6 +511,7 @@ export interface ResolvedConfig {
     subtitlePosition: SubtitlePosition;
     keybindings: Keybinding[];
     websocket: Required<WebSocketConfig>;
+    annotationWebsocket: Required<AnnotationWebSocketConfig>;
     texthooker: Required<TexthookerConfig>;
     ankiConnect: AnkiConnectConfig & {
         enabled: boolean;
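In a user's `config.jsonc`, the new interfaces surface roughly like this (key names come from the hunks above and the annotation default port `6678` from the commit message; the `websocket` port and other values are illustrative):

```jsonc
{
  // Existing subtitle websocket (port shown is illustrative)
  "websocket": { "enabled": true, "port": 6677 },
  // New: dedicated channel for texthooker annotations (default port 6678)
  "annotationWebsocket": { "enabled": true, "port": 6678 },
  "texthooker": {
    // New: start the built-in texthooker server with the app (default true)
    "launchAtStartup": true,
    "openBrowser": false
  }
}
```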
35
vendor/yomitan/js/language/it/italian-processors.js
vendored
Normal file
@@ -0,0 +1,35 @@
/*
 * Copyright (C) 2024-2025 Yomitan Authors
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program. If not, see <https://www.gnu.org/licenses/>.
 */

/** @type {import('language').TextProcessor} */
export const removeApostrophedWords = {
    name: 'Remove common apostrophed words',
    description: 'dell\'Italia > Italia, c\'erano > erano',
    process: (str) => [
        str,
        removeApostrophedWordsImpl(str),
    ],
};

/**
 * @param {string} word
 * @returns {string}
 */
export function removeApostrophedWordsImpl(word) {
    return word.replace(/(l|dell|all|dall|nell|sull|coll|un|quest|quell|c|n)['’]/g, '');
}
570
vendor/yomitan/lib/zstd-wasm.js
vendored
Normal file
@@ -0,0 +1,570 @@
// @ts-nocheck

// node_modules/@bokuweb/zstd-wasm/dist/web/zstd.js
var Module = typeof Module !== "undefined" ? Module : {};
var moduleOverrides = {};
var key;
for (key in Module) {
  if (Module.hasOwnProperty(key)) {
    moduleOverrides[key] = Module[key];
  }
}
var arguments_ = [];
var err = Module["printErr"] || console.warn.bind(console);
for (key in moduleOverrides) {
  if (moduleOverrides.hasOwnProperty(key)) {
    Module[key] = moduleOverrides[key];
  }
}
var quit_ = (status, toThrow) => {
  throw toThrow;
};
moduleOverrides = null;
if (Module["arguments"])
  arguments_ = Module["arguments"];
if (Module["thisProgram"])
  thisProgram = Module["thisProgram"];
if (Module["quit"])
  quit_ = Module["quit"];
if (typeof WebAssembly !== "object") {
  abort("no native wasm support detected");
}
var wasmMemory;
var ABORT = false;
var EXITSTATUS;
var HEAPU8;
var HEAP8;
function updateMemoryViews() {
  var b = wasmMemory.buffer;
  Module["HEAP8"] = HEAP8 = new Int8Array(b);
  Module["HEAPU8"] = HEAPU8 = new Uint8Array(b);
}
var __ATPRERUN__ = [];
var __ATINIT__ = [];
var __ATPOSTRUN__ = [];
var runtimeInitialized = false;
function preRun() {
  if (Module["preRun"]) {
    if (typeof Module["preRun"] == "function")
      Module["preRun"] = [Module["preRun"]];
    while (Module["preRun"].length) {
      addOnPreRun(Module["preRun"].shift());
    }
  }
  callRuntimeCallbacks(__ATPRERUN__);
}
function initRuntime() {
  runtimeInitialized = true;
  callRuntimeCallbacks(__ATINIT__);
}
function postRun() {
  if (Module["postRun"]) {
    if (typeof Module["postRun"] == "function")
      Module["postRun"] = [Module["postRun"]];
    while (Module["postRun"].length) {
      addOnPostRun(Module["postRun"].shift());
    }
  }
  callRuntimeCallbacks(__ATPOSTRUN__);
}
function addOnPreRun(cb) {
  __ATPRERUN__.unshift(cb);
}
function addOnInit(cb) {
  __ATINIT__.unshift(cb);
}
function addOnPostRun(cb) {
  __ATPOSTRUN__.unshift(cb);
}
var runDependencies = 0;
var dependenciesFulfilled = null;
function addRunDependency(id) {
  var _a;
  runDependencies++;
  (_a = Module["monitorRunDependencies"]) === null || _a === void 0 ? void 0 : _a.call(Module, runDependencies);
}
function removeRunDependency(id) {
  var _a;
  runDependencies--;
  (_a = Module["monitorRunDependencies"]) === null || _a === void 0 ? void 0 : _a.call(Module, runDependencies);
  if (runDependencies == 0) {
    if (dependenciesFulfilled) {
      var callback = dependenciesFulfilled;
      dependenciesFulfilled = null;
      callback();
    }
  }
}
function abort(what) {
  var _a;
  (_a = Module["onAbort"]) === null || _a === void 0 ? void 0 : _a.call(Module, what);
  what = "Aborted(" + what + ")";
  err(what);
  ABORT = true;
  what += ". Build with -sASSERTIONS for more info.";
  var e = new WebAssembly.RuntimeError(what);
  throw e;
}
function getWasmImports() {
  return { a: wasmImports };
}
function getBinaryPromise(url) {
  return fetch(url, { credentials: "same-origin" }).then(function(response) {
    if (!response["ok"]) {
      throw "failed to load wasm binary file at '" + url + "'";
    }
    return response["arrayBuffer"]();
  });
}
function init(filePathOrBuf) {
  var info = getWasmImports();
  function receiveInstance(instance, module) {
    wasmExports = instance.exports;
    wasmMemory = wasmExports["f"];
    updateMemoryViews();
    addOnInit(wasmExports["g"]);
    removeRunDependency("wasm-instantiate");
    return wasmExports;
  }
  addRunDependency("wasm-instantiate");
  function receiveInstantiationResult(result) {
    receiveInstance(result["instance"]);
  }
  function instantiateArrayBuffer(receiver) {
    return getBinaryPromise(filePathOrBuf).then(function(binary) {
      var result = WebAssembly.instantiate(binary, info);
      return result;
    }).then(receiver, function(reason) {
      err("failed to asynchronously prepare wasm: " + reason);
      abort(reason);
    });
  }
  function instantiateAsync() {
    if (filePathOrBuf && filePathOrBuf.byteLength > 0) {
      return WebAssembly.instantiate(filePathOrBuf, info).then(receiveInstantiationResult, function(reason) {
        err("wasm compile failed: " + reason);
      });
    } else if (typeof WebAssembly.instantiateStreaming === "function" && typeof filePathOrBuf === "string" && typeof fetch === "function") {
      return fetch(filePathOrBuf, { credentials: "same-origin" }).then(function(response) {
        var result = WebAssembly.instantiateStreaming(response, info);
        return result.then(receiveInstantiationResult, function(reason) {
          err("wasm streaming compile failed: " + reason);
          err("falling back to ArrayBuffer instantiation");
          return instantiateArrayBuffer(receiveInstantiationResult);
        });
      });
    } else {
      return instantiateArrayBuffer(receiveInstantiationResult);
    }
  }
  if (Module["instantiateWasm"]) {
    try {
      var exports = Module["instantiateWasm"](info, receiveInstance);
      return exports;
    } catch (e) {
      err("Module.instantiateWasm callback failed with error: " + e);
      return false;
    }
  }
  instantiateAsync();
  return {};
}
var ExitStatus = class {
  constructor(status) {
    this.name = "ExitStatus";
    this.message = `Program terminated with exit(${status})`;
    this.status = status;
  }
};
var callRuntimeCallbacks = (callbacks) => {
  while (callbacks.length > 0) {
    callbacks.shift()(Module);
  }
};
var noExitRuntime = Module["noExitRuntime"] || true;
var __abort_js = () => abort("");
var runtimeKeepaliveCounter = 0;
var __emscripten_runtime_keepalive_clear = () => {
  noExitRuntime = false;
  runtimeKeepaliveCounter = 0;
};
var timers = {};
var handleException = (e) => {
  if (e instanceof ExitStatus || e == "unwind") {
    return EXITSTATUS;
  }
  quit_(1, e);
};
var keepRuntimeAlive = () => noExitRuntime || runtimeKeepaliveCounter > 0;
var _proc_exit = (code) => {
  var _a;
  EXITSTATUS = code;
  if (!keepRuntimeAlive()) {
    (_a = Module["onExit"]) === null || _a === void 0 ? void 0 : _a.call(Module, code);
    ABORT = true;
  }
  quit_(code, new ExitStatus(code));
};
var exitJS = (status, implicit) => {
  EXITSTATUS = status;
  _proc_exit(status);
};
var _exit = exitJS;
var maybeExit = () => {
  if (!keepRuntimeAlive()) {
    try {
      _exit(EXITSTATUS);
    } catch (e) {
      handleException(e);
    }
  }
};
var callUserCallback = (func) => {
  if (ABORT) {
    return;
  }
  try {
    func();
    maybeExit();
  } catch (e) {
    handleException(e);
  }
};
var _emscripten_get_now = () => performance.now();
var __setitimer_js = (which, timeout_ms) => {
  if (timers[which]) {
    clearTimeout(timers[which].id);
    delete timers[which];
  }
  if (!timeout_ms)
    return 0;
  var id = setTimeout(() => {
    delete timers[which];
    callUserCallback(() => __emscripten_timeout(which, _emscripten_get_now()));
  }, timeout_ms);
  timers[which] = { id, timeout_ms };
  return 0;
};
var getHeapMax = () => 2147483648;
var alignMemory = (size, alignment) => Math.ceil(size / alignment) * alignment;
var growMemory = (size) => {
  var b = wasmMemory.buffer;
  var pages = (size - b.byteLength + 65535) / 65536 | 0;
  try {
    wasmMemory.grow(pages);
    updateMemoryViews();
    return 1;
  } catch (e) {
  }
};
var _emscripten_resize_heap = (requestedSize) => {
  var oldSize = HEAPU8.length;
  requestedSize >>>= 0;
  var maxHeapSize = getHeapMax();
  if (requestedSize > maxHeapSize) {
    return false;
  }
  for (var cutDown = 1; cutDown <= 4; cutDown *= 2) {
    var overGrownHeapSize = oldSize * (1 + 0.2 / cutDown);
    overGrownHeapSize = Math.min(overGrownHeapSize, requestedSize + 100663296);
    var newSize = Math.min(maxHeapSize, alignMemory(Math.max(requestedSize, overGrownHeapSize), 65536));
    var replacement = growMemory(newSize);
    if (replacement) {
      return true;
    }
  }
  return false;
};
var wasmImports = {
  c: __abort_js,
  b: __emscripten_runtime_keepalive_clear,
  d: __setitimer_js,
  e: _emscripten_resize_heap,
  a: _proc_exit
};
var wasmExports;
var _ZSTD_isError = Module["_ZSTD_isError"] = (a0) => (_ZSTD_isError = Module["_ZSTD_isError"] = wasmExports["h"])(a0);
var _ZSTD_compressBound = Module["_ZSTD_compressBound"] = (a0) => (_ZSTD_compressBound = Module["_ZSTD_compressBound"] = wasmExports["i"])(a0);
var _ZSTD_createCCtx = Module["_ZSTD_createCCtx"] = () => (_ZSTD_createCCtx = Module["_ZSTD_createCCtx"] = wasmExports["j"])();
var _ZSTD_freeCCtx = Module["_ZSTD_freeCCtx"] = (a0) => (_ZSTD_freeCCtx = Module["_ZSTD_freeCCtx"] = wasmExports["k"])(a0);
var _ZSTD_compress_usingDict = Module["_ZSTD_compress_usingDict"] = (a0, a1, a2, a3, a4, a5, a6, a7) => (_ZSTD_compress_usingDict = Module["_ZSTD_compress_usingDict"] = wasmExports["l"])(a0, a1, a2, a3, a4, a5, a6, a7);
var _ZSTD_compress = Module["_ZSTD_compress"] = (a0, a1, a2, a3, a4) => (_ZSTD_compress = Module["_ZSTD_compress"] = wasmExports["m"])(a0, a1, a2, a3, a4);
var _ZSTD_createDCtx = Module["_ZSTD_createDCtx"] = () => (_ZSTD_createDCtx = Module["_ZSTD_createDCtx"] = wasmExports["n"])();
var _ZSTD_freeDCtx = Module["_ZSTD_freeDCtx"] = (a0) => (_ZSTD_freeDCtx = Module["_ZSTD_freeDCtx"] = wasmExports["o"])(a0);
var _ZSTD_getFrameContentSize = Module["_ZSTD_getFrameContentSize"] = (a0, a1) => (_ZSTD_getFrameContentSize = Module["_ZSTD_getFrameContentSize"] = wasmExports["p"])(a0, a1);
var _ZSTD_decompress_usingDict = Module["_ZSTD_decompress_usingDict"] = (a0, a1, a2, a3, a4, a5, a6) => (_ZSTD_decompress_usingDict = Module["_ZSTD_decompress_usingDict"] = wasmExports["q"])(a0, a1, a2, a3, a4, a5, a6);
var _ZSTD_decompress = Module["_ZSTD_decompress"] = (a0, a1, a2, a3) => (_ZSTD_decompress = Module["_ZSTD_decompress"] = wasmExports["r"])(a0, a1, a2, a3);
var _malloc = Module["_malloc"] = (a0) => (_malloc = Module["_malloc"] = wasmExports["s"])(a0);
var _free = Module["_free"] = (a0) => (_free = Module["_free"] = wasmExports["t"])(a0);
var __emscripten_timeout = (a0, a1) => (__emscripten_timeout = wasmExports["v"])(a0, a1);
var calledRun;
dependenciesFulfilled = function runCaller() {
  if (!calledRun)
    run();
  if (!calledRun)
    dependenciesFulfilled = runCaller;
};
function run() {
  if (runDependencies > 0) {
    return;
  }
  preRun();
  if (runDependencies > 0) {
    return;
  }
  function doRun() {
    var _a;
    if (calledRun)
      return;
    calledRun = true;
    Module["calledRun"] = true;
    if (ABORT)
      return;
    initRuntime();
    (_a = Module["onRuntimeInitialized"]) === null || _a === void 0 ? void 0 : _a.call(Module);
    postRun();
  }
  if (Module["setStatus"]) {
    Module["setStatus"]("Running...");
    setTimeout(() => {
      setTimeout(() => Module["setStatus"](""), 1);
      doRun();
    }, 1);
  } else {
    doRun();
  }
}
Module["run"] = run;
if (Module["preInit"]) {
  if (typeof Module["preInit"] == "function")
    Module["preInit"] = [Module["preInit"]];
  while (Module["preInit"].length > 0) {
    Module["preInit"].pop()();
  }
}
Module["init"] = init;

// node_modules/@bokuweb/zstd-wasm/dist/web/module.js
var __awaiter = function(thisArg, _arguments, P, generator) {
  function adopt(value) {
    return value instanceof P ? value : new P(function(resolve) {
      resolve(value);
    });
  }
  return new (P || (P = Promise))(function(resolve, reject) {
    function fulfilled(value) {
      try {
        step(generator.next(value));
      } catch (e) {
        reject(e);
      }
    }
    function rejected(value) {
      try {
        step(generator["throw"](value));
      } catch (e) {
        reject(e);
      }
    }
    function step(result) {
      result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected);
    }
    step((generator = generator.apply(thisArg, _arguments || [])).next());
  });
};
var initialized = (() => new Promise((resolve) => {
  Module.onRuntimeInitialized = resolve;
}))();
var waitInitialized = () => __awaiter(void 0, void 0, void 0, function* () {
  yield initialized;
});

// node_modules/@bokuweb/zstd-wasm/dist/web/errors/index.js
var isError = (code) => {
  const _isError = Module["_ZSTD_isError"];
  return _isError(code);
};

// node_modules/@bokuweb/zstd-wasm/dist/web/simple/decompress.js
var getFrameContentSize = (src, size) => {
  const getSize = Module["_ZSTD_getFrameContentSize"];
  return getSize(src, size);
};
var decompress = (buf, opts = { defaultHeapSize: 1024 * 1024 }) => {
  const malloc = Module["_malloc"];
  const src = malloc(buf.byteLength);
  Module.HEAP8.set(buf, src);
  const contentSize = getFrameContentSize(src, buf.byteLength);
  const size = contentSize === -1 ? opts.defaultHeapSize : contentSize;
  const free = Module["_free"];
  const heap = malloc(size);
  try {
    const _decompress = Module["_ZSTD_decompress"];
    const sizeOrError = _decompress(heap, size, src, buf.byteLength);
    if (isError(sizeOrError)) {
      throw new Error(`Failed to compress with code ${sizeOrError}`);
    }
    const data = new Uint8Array(Module.HEAPU8.buffer, heap, sizeOrError).slice();
    free(heap, size);
    free(src, buf.byteLength);
    return data;
  } catch (e) {
    free(heap, size);
    free(src, buf.byteLength);
    throw e;
  }
};

// node_modules/@bokuweb/zstd-wasm/dist/web/simple/compress.js
var compressBound = (size) => {
  const bound = Module["_ZSTD_compressBound"];
  return bound(size);
};
var compress = (buf, level) => {
  const bound = compressBound(buf.byteLength);
  const malloc = Module["_malloc"];
  const compressed = malloc(bound);
  const src = malloc(buf.byteLength);
  Module.HEAP8.set(buf, src);
  const free = Module["_free"];
  try {
    const _compress = Module["_ZSTD_compress"];
    const sizeOrError = _compress(compressed, bound, src, buf.byteLength, level !== null && level !== void 0 ? level : 3);
    if (isError(sizeOrError)) {
      throw new Error(`Failed to compress with code ${sizeOrError}`);
    }
    const data = new Uint8Array(Module.HEAPU8.buffer, compressed, sizeOrError).slice();
    free(compressed, bound);
    free(src, buf.byteLength);
    return data;
  } catch (e) {
    free(compressed, bound);
    free(src, buf.byteLength);
    throw e;
  }
};

// node_modules/@bokuweb/zstd-wasm/dist/web/simple/decompress_using_dict.js
var getFrameContentSize2 = (src, size) => {
  const getSize = Module["_ZSTD_getFrameContentSize"];
  return getSize(src, size);
};
var createDCtx = () => {
  return Module["_ZSTD_createDCtx"]();
};
var freeDCtx = (dctx) => {
  return Module["_ZSTD_freeDCtx"](dctx);
};
var decompressUsingDict = (dctx, buf, dict, opts = { defaultHeapSize: 1024 * 1024 }) => {
  const malloc = Module["_malloc"];
  const src = malloc(buf.byteLength);
  Module.HEAP8.set(buf, src);
  const pdict = malloc(dict.byteLength);
  Module.HEAP8.set(dict, pdict);
  const contentSize = getFrameContentSize2(src, buf.byteLength);
  const size = contentSize === -1 ? opts.defaultHeapSize : contentSize;
  const free = Module["_free"];
  const heap = malloc(size);
  try {
    const _decompress = Module["_ZSTD_decompress_usingDict"];
    const sizeOrError = _decompress(dctx, heap, size, src, buf.byteLength, pdict, dict.byteLength);
    if (isError(sizeOrError)) {
      throw new Error(`Failed to compress with code ${sizeOrError}`);
    }
    const data = new Uint8Array(Module.HEAPU8.buffer, heap, sizeOrError).slice();
    free(heap, size);
    free(src, buf.byteLength);
    free(pdict, dict.byteLength);
    return data;
  } catch (e) {
    free(heap, size);
    free(src, buf.byteLength);
    free(pdict, dict.byteLength);
    throw e;
  }
};

// node_modules/@bokuweb/zstd-wasm/dist/web/simple/compress_using_dict.js
var compressBound2 = (size) => {
  const bound = Module["_ZSTD_compressBound"];
  return bound(size);
};
var createCCtx = () => {
  return Module["_ZSTD_createCCtx"]();
};
var freeCCtx = (cctx) => {
  return Module["_ZSTD_freeCCtx"](cctx);
};
var compressUsingDict = (cctx, buf, dict, level) => {
  const bound = compressBound2(buf.byteLength);
  const malloc = Module["_malloc"];
  const compressed = malloc(bound);
  const src = malloc(buf.byteLength);
  Module.HEAP8.set(buf, src);
  const pdict = malloc(dict.byteLength);
  Module.HEAP8.set(dict, pdict);
  const free = Module["_free"];
  try {
    const _compress = Module["_ZSTD_compress_usingDict"];
    const sizeOrError = _compress(cctx, compressed, bound, src, buf.byteLength, pdict, dict.byteLength, level !== null && level !== void 0 ? level : 3);
    if (isError(sizeOrError)) {
      throw new Error(`Failed to compress with code ${sizeOrError}`);
    }
    const data = new Uint8Array(Module.HEAPU8.buffer, compressed, sizeOrError).slice();
    free(compressed, bound);
    free(src, buf.byteLength);
    free(pdict, dict.byteLength);
    return data;
  } catch (e) {
    free(compressed, bound);
    free(src, buf.byteLength);
    free(pdict, dict.byteLength);
    throw e;
  }
};

// node_modules/@bokuweb/zstd-wasm/dist/web/index.web.js
var __awaiter2 = function(thisArg, _arguments, P, generator) {
  function adopt(value) {
    return value instanceof P ? value : new P(function(resolve) {
      resolve(value);
    });
  }
  return new (P || (P = Promise))(function(resolve, reject) {
    function fulfilled(value) {
      try {
        step(generator.next(value));
      } catch (e) {
        reject(e);
      }
    }
    function rejected(value) {
      try {
        step(generator["throw"](value));
      } catch (e) {
        reject(e);
      }
    }
    function step(result) {
      result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected);
    }
    step((generator = generator.apply(thisArg, _arguments || [])).next());
  });
};
var init2 = (path) => __awaiter2(void 0, void 0, void 0, function* () {
  const url = new URL(`./zstd.wasm`, import.meta.url).href;
  Module["init"](path !== null && path !== void 0 ? path : url);
  yield waitInitialized();
});
export {
  compress,
  compressUsingDict,
  createCCtx,
  createDCtx,
  decompress,
  decompressUsingDict,
  freeCCtx,
  freeDCtx,
  init2 as init
};
//# sourceMappingURL=zstd-wasm.js.map
7
vendor/yomitan/lib/zstd-wasm.js.map
vendored
Normal file
File diff suppressed because one or more lines are too long
120
vendor/yomitan/manifest_static.json
vendored
@@ -1,120 +0,0 @@
{
    "manifest_version": 3,
    "name": "Yomitan Popup Dictionary",
    "version": "25.9.29.0",
    "description": "Popup dictionary for language learning",
    "key": "likgccmbimhjbgkjambclfkhldnlhbnn",
    "author": {
        "email": "themoeway@googlegroups.com"
    },
    "icons": {
        "16": "images/icon16.png",
        "19": "images/icon19.png",
        "32": "images/icon32.png",
        "38": "images/icon38.png",
        "48": "images/icon48.png",
        "64": "images/icon64.png",
        "128": "images/icon128.png"
    },
    "action": {
        "default_icon": {
            "16": "images/icon16.png",
            "19": "images/icon19.png",
            "32": "images/icon32.png",
            "38": "images/icon38.png",
            "48": "images/icon48.png",
            "64": "images/icon64.png",
            "128": "images/icon128.png"
        },
        "default_title": "Yomitan",
        "default_popup": "action-popup.html"
    },
    "background": {
        "service_worker": "sw.js",
        "type": "module"
    },
    "omnibox": {
        "keyword": "yomi"
    },
    "content_scripts": [
        {
            "run_at": "document_idle",
            "matches": [
                "http://*/*",
                "https://*/*",
                "file://*/*"
            ],
            "match_about_blank": true,
            "all_frames": true,
            "js": [
                "js/app/content-script-wrapper.js"
            ]
        }
    ],
    "minimum_chrome_version": "102.0.0.0",
    "options_ui": {
        "page": "settings.html",
        "open_in_tab": true
    },
    "sandbox": {
        "pages": [
            "template-renderer.html"
        ]
    },
    "permissions": [
        "storage",
        "clipboardWrite",
        "unlimitedStorage",
        "declarativeNetRequest",
        "scripting",
        "offscreen",
        "contextMenus"
    ],
    "optional_permissions": [
        "clipboardRead",
        "nativeMessaging"
    ],
    "host_permissions": [
        "<all_urls>"
    ],
    "commands": {
        "toggleTextScanning": {
            "suggested_key": {
                "default": "Alt+Delete"
            },
            "description": "Toggle text scanning on/off"
        },
        "openInfoPage": {
            "description": "Open the info page"
        },
        "openSettingsPage": {
            "description": "Open the settings page"
        },
        "openSearchPage": {
            "suggested_key": {
                "default": "Alt+Insert"
            },
            "description": "Open the search page"
        },
        "openPopupWindow": {
            "description": "Open the popup window"
        }
    },
    "web_accessible_resources": [
        {
            "resources": [
                "popup.html",
                "template-renderer.html",
                "js/*",
                "lib/resvg.wasm"
            ],
            "matches": [
                "<all_urls>"
            ]
        }
    ],
    "content_security_policy": {
        "extension_pages": "default-src 'self'; script-src 'self' 'wasm-unsafe-eval'; img-src blob: 'self'; style-src 'self' 'unsafe-inline'; media-src *; connect-src *",
        "sandbox": "sandbox allow-scripts; default-src 'self'; script-src 'self' 'unsafe-eval'; style-src 'unsafe-inline'"
    }
}