Compare commits

...

7 Commits

Author SHA1 Message Date
a954f62f55 Decouple stats daemon and preserve final mine OSD status
- Run `subminer stats -b` as a dedicated daemon process, independent from the overlay app
- Stop Anki progress spinner before showing final `✓`/`x` mine result so it is not overwritten
- Keep grammar/noise subtitle tokens hoverable while stripping annotation metadata
2026-03-18 23:49:27 -07:00
4d96ebf5c0 fix: reduce prefetched subtitle annotation delay 2026-03-18 23:47:33 -07:00
7a0d7a488b docs: redesign README for cleaner layout and scannability
- Condense features into bold-label paragraphs instead of H3 subsections
- Collapse install instructions into <details> sections
- Remove redundant screenshots (annotations-key, stats-vocabulary)
- Add AUR version badge
- Merge first-launch and setup steps into a single paragraph
- Add horizontal rule dividers between major sections
2026-03-18 23:35:17 -07:00
f916b65d7f fix: sync texthooker-ui annotation overrides 2026-03-18 19:32:51 -07:00
36627bf87d fix(anki): avoid unnecessary known-word cache restarts 2026-03-18 19:29:47 -07:00
ad1f66a842 feat: sync animated anki images to sentence audio 2026-03-18 19:21:12 -07:00
f4cce31d4a fix: align texthooker and stats formatting with CI expectations 2026-03-18 19:01:29 -07:00
73 changed files with 3098 additions and 301 deletions

README.md

@@ -1,16 +1,20 @@
<div align="center">
<img src="assets/SubMiner.png" width="169" alt="SubMiner logo">
<h1>SubMiner</h1>
<strong>Look up words, mine to Anki, and enrich cards with context — without leaving mpv.</strong>
<br /><br />
<img src="assets/SubMiner.png" width="140" alt="SubMiner logo">
[![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)
[![Linux](https://img.shields.io/badge/platform-Linux%20%7C%20macOS%20%7C%20Windows-informational)]()
[![Docs](https://img.shields.io/badge/docs-docs.subminer.moe-blueviolet)](https://docs.subminer.moe)
# SubMiner
**Sentence-mine from mpv — look up words, one-key Anki export, immersion tracking.**
[![License: GPL v3](https://img.shields.io/badge/License-GPLv3-blue.svg)](https://www.gnu.org/licenses/gpl-3.0)
[![Linux](https://img.shields.io/badge/platform-Linux%20%7C%20macOS%20%7C%20Windows-informational)]()
[![Docs](https://img.shields.io/badge/docs-docs.subminer.moe-blueviolet)](https://docs.subminer.moe)
[![AUR](https://img.shields.io/aur/version/subminer-bin)](https://aur.archlinux.org/packages/subminer-bin)
</div>
<br />
---
SubMiner is an Electron overlay for [mpv](https://mpv.io) that turns video into a sentence-mining workstation. Look up any word with [Yomitan](https://github.com/yomidevs/yomitan), mine it to Anki with one key, and track your immersion over time.
<div align="center">
@@ -18,66 +22,42 @@
</div>
<br />
## What it does
SubMiner is an Electron overlay that sits on top of mpv. It turns your video player into a full sentence-mining workstation — look up any word with Yomitan, mine it to Anki with one key, and track your immersion progress over time.
## Features
### Dictionary Lookups While You Watch
**Dictionary lookups** — Yomitan runs inside the overlay. Hover or navigate to any word for full dictionary popups without leaving mpv.
Yomitan runs directly inside the overlay. Hover over any word in the subtitles or navigate with keyboard/controller to get full dictionary popups without pausing or switching windows.
### One-Key Anki Mining
Press a single key to send a word to Anki. SubMiner auto-fills the card with the sentence, audio clip, screenshot, and machine translation — all captured from the exact moment you looked it up.
**One-key Anki mining** — Press one key to create a card with the sentence, audio clip, screenshot, and machine translation from the exact playback moment.
<div align="center">
<img src="docs-site/public/screenshots/yomitan-lookup.png" width="800" alt="One-key Anki card creation — Yomitan popup with dictionary entry and mine button over annotated subtitles">
<img src="docs-site/public/screenshots/yomitan-lookup.png" width="800" alt="Yomitan popup with dictionary entry and mine button over annotated subtitles in mpv">
</div>
### Reading Annotations
Subtitles are annotated in real time with N+1 targeting, frequency-dictionary highlighting, JLPT level tags, and a character name dictionary for anime and manga proper nouns.
**Reading annotations** — Real-time subtitle annotations with N+1 targeting, frequency highlighting, JLPT tags, and a character name dictionary. Grammar-only tokens render as plain text.
<div align="center">
<img src="docs-site/public/screenshots/annotations.png" width="800" alt="Subtitle annotations with frequency highlighting, JLPT underlines, known words, N+1 targets, and character names">
<br/>
<img src="docs-site/public/screenshots/annotations-key.png" width="800" alt="Subtitle annotations with frequency highlighting, JLPT underlines, known words, N+1 targets, and character names">
<img src="docs-site/public/screenshots/annotations.png" width="800" alt="Annotated subtitles with frequency highlighting, JLPT underlines, known words, and N+1 targets">
</div>
### Immersion Dashboard
A local stats dashboard tracks your watch time, anime progress, vocabulary growth, mining throughput, and session history. Drill down into individual sessions or browse your full library.
**Immersion dashboard** — Local stats dashboard with watch time, anime progress, vocabulary growth, mining throughput, and session history.
<div align="center">
<img src="docs-site/public/screenshots/stats-overview.png" width="800" alt="Stats dashboard — overview with watch time, cards mined, streaks, and tracking snapshot">
<br /><br />
<img src="docs-site/public/screenshots/stats-vocabulary.png" width="800" alt="Vocabulary tab — unique words, known words, top repeated words, and unmined word list">
<br /><br />
<!-- <img src="docs-site/public/screenshots/stats-library.png" width="800" alt="Library tab — anime grid with episode counts and watch time"> -->
<img src="docs-site/public/screenshots/stats-overview.png" width="800" alt="Stats dashboard with watch time, cards mined, streaks, and tracking snapshot">
</div>
### External Integrations
- **AniList** — Automatic episode progress tracking
- **Jellyfin** — Remote playback, cast device mode
- **Subtitle tools** — Download from Jimaku, sync with alass/ffsubsync
- **Texthooker & API** — Custom texthooker page and annotated websocket feed for external clients
**Integrations** — AniList episode tracking, Jellyfin remote playback, Jimaku subtitle downloads, alass/ffsubsync, and an annotated websocket feed for external clients.
<div align="center">
<img src="docs-site/public/screenshots/texthooker.png" width="800" alt="Texthooker page with annotated subtitle lines — known words, N+1 targets, character names, and frequency highlighting">
<img src="docs-site/public/screenshots/texthooker.png" width="800" alt="Texthooker page with annotated subtitle lines and frequency highlighting">
</div>
## Quick start
---
### 1. Install
## Quick Start
**Arch Linux (AUR):**
### Install
Install [`subminer-bin`](https://aur.archlinux.org/packages/subminer-bin) from the AUR. It installs the packaged AppImage plus the `subminer` wrapper:
<details>
<summary><b>Arch Linux (AUR)</b></summary>
```bash
paru -S subminer-bin
@@ -86,84 +66,71 @@ paru -S subminer-bin
Or manually:
```bash
git clone https://aur.archlinux.org/subminer-bin.git
cd subminer-bin
makepkg -si
git clone https://aur.archlinux.org/subminer-bin.git && cd subminer-bin && makepkg -si
```
**Linux (AppImage):**
</details>
<details>
<summary><b>Linux (AppImage)</b></summary>
```bash
wget https://github.com/ksyasuda/SubMiner/releases/latest/download/SubMiner.AppImage -O ~/.local/bin/SubMiner.AppImage \
&& chmod +x ~/.local/bin/SubMiner.AppImage
wget https://github.com/ksyasuda/SubMiner/releases/latest/download/subminer -O ~/.local/bin/subminer \
&& chmod +x ~/.local/bin/subminer
```
> [!NOTE]
> The `subminer` wrapper uses a [Bun](https://bun.sh) shebang. Make sure `bun` is on your `PATH`.
**macOS (DMG/ZIP):** download the latest packaged build from [GitHub Releases](https://github.com/ksyasuda/SubMiner/releases/latest) and drag `SubMiner.app` into `/Applications`.
</details>
**Windows (Installer/ZIP):** download the latest `SubMiner-<version>.exe` installer or portable `.zip` from [GitHub Releases](https://github.com/ksyasuda/SubMiner/releases/latest). Keep `mpv` installed and available on `PATH`.
<details>
<summary><b>macOS / Windows / From source</b></summary>
**From source** — see [docs.subminer.moe/installation#from-source](https://docs.subminer.moe/installation#from-source).
**macOS** — Download the latest DMG/ZIP from [GitHub Releases](https://github.com/ksyasuda/SubMiner/releases/latest) and drag `SubMiner.app` into `/Applications`.
### 2. Launch the app once
**Windows** — Download the latest installer or portable `.zip` from [GitHub Releases](https://github.com/ksyasuda/SubMiner/releases/latest). Keep `mpv` on `PATH`.
**From source** — See [docs.subminer.moe/installation#from-source](https://docs.subminer.moe/installation#from-source).
</details>
### First Launch
Run `SubMiner.AppImage` (Linux), `SubMiner.app` (macOS), or `SubMiner.exe` (Windows). On first launch, SubMiner starts in the tray, creates a default config, and opens a setup popup where you can install the mpv plugin and configure Yomitan dictionaries.
### Mine
```bash
# Linux
SubMiner.AppImage
subminer video.mkv # auto-starts overlay + resumes playback
subminer --start video.mkv # explicit overlay start (if plugin auto_start=no)
subminer stats # open the immersion dashboard
```
On macOS, launch `SubMiner.app`. On Windows, launch `SubMiner.exe` from the Start menu or install directory.
On first launch, SubMiner:
- starts in the tray/background
- creates the default config directory and `config.jsonc`
- opens a compact setup popup
- can install the mpv plugin to the default mpv scripts location for you
- links directly to Yomitan settings so you can install dictionaries before finishing setup
### 3. Finish setup
- click `Install mpv plugin` if you want the default plugin auto-start flow
- click `Open Yomitan Settings` and install at least one dictionary
- click `Refresh status`
- click `Finish setup`
The mpv plugin step is optional. Yomitan must report at least one installed dictionary before setup can be completed.
### 4. Mine
```bash
subminer video.mkv # default plugin config auto-starts visible overlay + resumes playback when ready
subminer --start video.mkv # optional explicit overlay start when plugin auto_start=no
subminer stats # open the local stats dashboard in your browser
```
---
## Requirements
| Required | Optional |
| ------------------------------------------ | -------------------------------------------------- |
| `bun` (source builds, Linux `subminer`) | |
| `mpv` with IPC socket | `yt-dlp` |
| `ffmpeg` | `guessit` (better AniSkip title/episode detection) |
|---|---|
| [`mpv`](https://mpv.io) with IPC socket | `yt-dlp` |
| `ffmpeg` | `guessit` (AniSkip detection) |
| `mecab` + `mecab-ipadic` | `fzf` / `rofi` |
| Linux: `hyprctl` or `xdotool` + `xwininfo` | `chafa`, `ffmpegthumbnailer` |
| [`bun`](https://bun.sh) (source builds, Linux wrapper) | `chafa`, `ffmpegthumbnailer` |
| Linux: `hyprctl` or `xdotool` + `xwininfo` | |
| macOS: Accessibility permission | |
Windows builds use native window tracking and do not require the Linux compositor helper tools.
Windows uses native window tracking and does not need the Linux compositor tools.
## Documentation
For full guides on configuration, Anki, Jellyfin, immersion tracking/stats, and more, see [docs.subminer.moe](https://docs.subminer.moe). The VitePress source for that site lives in [`docs-site/`](./docs-site/).
Full guides on configuration, Anki, Jellyfin, immersion tracking, and more at **[docs.subminer.moe](https://docs.subminer.moe)**.
## Acknowledgments
Built on the shoulders of [GameSentenceMiner](https://github.com/bpwhelan/GameSentenceMiner), [Renji's Texthooker Page](https://github.com/Renji-XD/texthooker-ui), [Anacreon-Script](https://github.com/friedrich-de/Anacreon-Script), and [Bee's Character Dictionary](https://github.com/bee-san/Japanese_Character_Name_Dictionary). Subtitles powered by [Jimaku.cc](https://jimaku.cc). Dictionary lookups via [Yomitan](https://github.com/yomidevs/yomitan), and JLPT tags from [yomitan-jlpt-vocab](https://github.com/stephenmk/yomitan-jlpt-vocab).
Built on [GameSentenceMiner](https://github.com/bpwhelan/GameSentenceMiner), [Renji's Texthooker Page](https://github.com/Renji-XD/texthooker-ui), [Anacreon-Script](https://github.com/friedrich-de/Anacreon-Script), and [Bee's Character Dictionary](https://github.com/bee-san/Japanese_Character_Name_Dictionary). Subtitles from [Jimaku.cc](https://jimaku.cc). Lookups via [Yomitan](https://github.com/yomidevs/yomitan). JLPT tags from [yomitan-jlpt-vocab](https://github.com/stephenmk/yomitan-jlpt-vocab).
## License


@@ -0,0 +1,64 @@
---
id: TASK-195
title: Keep final card-mine OSD result from being overwritten by progress spinner
status: Done
assignee:
- Codex
created_date: '2026-03-18 19:40'
updated_date: '2026-03-18 19:49'
labels:
- anki
- ui
- bug
milestone: m-1
dependencies: []
references:
- src/anki-integration/ui-feedback.ts
- src/anki-integration.ts
- src/anki-integration/card-creation.ts
priority: medium
ordinal: 105610
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
When a card mine finishes, the mpv OSD currently tries to show the final status text, but the in-flight Anki progress spinner can immediately overwrite it on the next tick. Stop the spinner first, then show a single-line final result with a success/failure marker and the mined-word notification.
<!-- SECTION:DESCRIPTION:END -->
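The intended ordering can be sketched as follows. This is an illustrative TypeScript sketch only; the helper and type names are hypothetical and do not reflect SubMiner's actual `ui-feedback` API.

```typescript
// Hypothetical sketch: tear the spinner down before emitting the final OSD line,
// so a later spinner tick cannot reclaim (overwrite) the result.
type OsdEmitter = (line: string) => void;

interface ProgressSpinner {
  stop(): void; // cancels the tick timer; no further frames are drawn
}

function showFinalMineResult(
  spinner: ProgressSpinner,
  emitOsd: OsdEmitter,
  ok: boolean,
  message: string,
): void {
  spinner.stop(); // teardown first
  emitOsd(`${ok ? '✓' : 'x'} ${message}`); // single-line final result
}
```

The key design point is ordering: `stop()` must complete before the result line is emitted, which is what the regression test below locks in.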
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Successful mine/update OSD results render after the spinner is stopped and do not get overwritten by a later spinner tick.
- [x] #2 Failure results that replace the spinner show an `x` marker and stay visible on the same OSD line.
- [x] #3 Regression coverage locks the spinner teardown/result-notification ordering.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
1. Add a focused failing regression test around the Anki UI-feedback spinner/result helper.
2. Add a helper that stops progress before emitting the final OSD result line with `✓`/`x`.
3. Route mine/update result notifications through that helper, then run targeted verification.
<!-- SECTION:PLAN:END -->
## Outcome
<!-- SECTION:OUTCOME:BEGIN -->
Added a dedicated Anki UI-feedback result helper that force-clears the in-flight spinner state before emitting the final OSD result line. Successful card-update notifications now render as `✓ Updated card: ...`, and sentence-card creation failures now render as `x Sentence card failed: ...` without a later spinner tick reclaiming the line.
Verification:
- `bun test src/anki-integration/ui-feedback.test.ts`
- `bun test src/anki-integration/ui-feedback.test.ts src/anki-integration/note-update-workflow.test.ts src/anki-integration.test.ts src/core/services/mining.test.ts src/main/runtime/mining-actions.test.ts`
- `bun x prettier --check src/anki-integration/ui-feedback.ts src/anki-integration/ui-feedback.test.ts src/anki-integration.ts src/anki-integration/card-creation.ts "backlog/tasks/task-195 - Keep-final-card-mine-OSD-result-from-being-overwritten-by-progress-spinner.md" changes/2026-03-18-mine-osd-spinner-result.md`
- `bun run changelog:lint`
- `bash .agents/skills/subminer-change-verification/scripts/verify_subminer_change.sh --lane core src/anki-integration/ui-feedback.ts src/anki-integration/ui-feedback.test.ts src/anki-integration.ts src/anki-integration/card-creation.ts changes/2026-03-18-mine-osd-spinner-result.md`
- Verifier artifacts: `.tmp/skill-verification/subminer-verify-20260318-194614-uZMrAx/`
<!-- SECTION:OUTCOME:END -->


@@ -0,0 +1,43 @@
---
id: TASK-196
title: Fix subtitle prefetch cache-key mismatch and active-cue window
status: Done
assignee: []
created_date: '2026-03-18 16:05'
labels: []
dependencies: []
references:
- /home/sudacode/projects/japanese/SubMiner/src/core/services/subtitle-processing-controller.ts
- /home/sudacode/projects/japanese/SubMiner/src/core/services/subtitle-prefetch.ts
documentation: []
priority: high
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Investigate and fix file-backed subtitle annotation latency where prefetch should warm upcoming lines but live playback still tokenizes each subtitle line. Likely causes: cache-key mismatch between parsed cue text and mpv `sub-text`, and priority-window selection skipping the currently active cue during mid-line starts/seeks.
<!-- SECTION:DESCRIPTION:END -->
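The normalization half of the fix can be sketched like this. The function name is hypothetical; it only illustrates the kind of cache-key normalization described above (ASS `\N` markers, newline collapsing, surrounding whitespace), not SubMiner's actual implementation.

```typescript
// Hypothetical sketch: normalize subtitle text so parsed-cue and mpv
// `sub-text` variants of the same line produce the same cache key.
function normalizeSubtitleText(text: string): string {
  return text
    .replace(/\\N/g, '\n') // ASS hard line break -> real newline
    .replace(/\n+/g, '\n') // collapse repeated newlines
    .trim();               // drop surrounding whitespace
}
```

With both the prefetch path and the live mpv path keying the cache on the normalized text, a prefetched entry is reused even when the two sources differ only in these formatting details.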
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Prefetched subtitle entries are reused when live subtitle text differs only by normalization details such as ASS `\N`, newline collapsing, or surrounding whitespace.
- [x] #2 Priority-window selection includes the currently active cue when playback starts or seeks into the middle of a cue.
- [x] #3 Regression tests cover the cache-hit normalization path and active-cue priority-window behavior.
- [x] #4 Verification covers the touched prefetch/controller lane.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
1. Add failing regression tests in `subtitle-processing-controller.test.ts` and `subtitle-prefetch.test.ts`.
2. Normalize cache keys in the subtitle processing controller so prefetch/live paths share keys.
3. Adjust prefetch priority-window selection to include the active cue.
4. Run targeted tests, then SubMiner verification lane for touched files.
<!-- SECTION:PLAN:END -->
## Outcome
<!-- SECTION:OUTCOME:BEGIN -->
Normalized subtitle cache keys inside the processing controller so prefetched ASS/VTT/live subtitle text variants reuse the same cache entry, and changed priority-window selection to include the currently active cue based on cue end time. Added regression coverage for both paths and verified the change with the `core` lane.
<!-- SECTION:OUTCOME:END -->


@@ -0,0 +1,45 @@
---
id: TASK-197
title: Eliminate per-line plain subtitle flash on prefetch cache hit
status: Done
assignee: []
created_date: '2026-03-18 16:28'
labels: []
dependencies:
- TASK-196
references:
- /home/sudacode/projects/japanese/SubMiner/src/core/services/subtitle-processing-controller.ts
- /home/sudacode/projects/japanese/SubMiner/src/main/runtime/mpv-main-event-actions.ts
- /home/sudacode/projects/japanese/SubMiner/src/main/runtime/mpv-main-event-main-deps.ts
documentation: []
priority: high
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Remove the remaining small per-line subtitle annotation delay after prefetch warmup by avoiding the unconditional plain-subtitle broadcast on mpv subtitle-change events when a cached annotated payload already exists.
<!-- SECTION:DESCRIPTION:END -->
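The cache-hit/cache-miss branching can be sketched as below. All names here are illustrative stand-ins, not SubMiner's actual controller API; the point is the synchronous claim on a hit versus the plain-text (`tokens: null`) broadcast on a miss.

```typescript
// Hypothetical sketch of a synchronous cache-consume path.
interface AnnotatedPayload { text: string; tokens: unknown[] | null }

class SubtitleCache {
  private cache = new Map<string, AnnotatedPayload>();
  put(key: string, payload: AnnotatedPayload): void { this.cache.set(key, payload); }
  // Claim a cached payload synchronously so the subtitle-change handler
  // can skip the plain broadcast entirely on a hit.
  consume(key: string): AnnotatedPayload | null {
    const hit = this.cache.get(key);
    if (hit) this.cache.delete(key);
    return hit ?? null;
  }
}

function onSubtitleChange(
  cache: SubtitleCache,
  text: string,
  emit: (p: AnnotatedPayload) => void,
): void {
  const cached = cache.consume(text);
  if (cached) {
    emit(cached); // hit: annotated payload immediately, no plain flash
  } else {
    emit({ text, tokens: null }); // miss: plain text now, tokenize async
  }
}
```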
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 On a subtitle cache hit, the mpv subtitle-change path can emit annotated subtitle payload synchronously instead of first broadcasting `tokens: null`.
- [x] #2 Cache-miss behavior still preserves immediate plain-text subtitle display while async tokenization runs.
- [x] #3 Regression tests cover the controller cache-consume path and the mpv subtitle-change handler cache-hit branch.
- [x] #4 Verification covers the touched core/runtime lane.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
1. Add failing tests for controller cache consumption and mpv subtitle-change immediate annotated emission.
2. Add a controller method that consumes cached subtitle payload synchronously while updating internal latest/emitted state.
3. Wire the mpv subtitle-change handler to use the immediate cached payload when present, falling back to the existing plain-text path on misses.
4. Run focused tests and the cheapest sufficient verification lane.
<!-- SECTION:PLAN:END -->
## Outcome
<!-- SECTION:OUTCOME:BEGIN -->
Added `consumeCachedSubtitle` to the subtitle processing controller so cache hits can be claimed synchronously without reprocessing, then wired the mpv subtitle-change handler to emit cached annotated payloads immediately while preserving the existing plain-text fallback for misses. Verified with focused unit tests plus the `runtime-compat` lane.
<!-- SECTION:OUTCOME:END -->


@@ -0,0 +1,45 @@
---
id: TASK-199
title: Forward launcher log level into mpv plugin script opts
status: Done
assignee: []
created_date: '2026-03-18 21:16'
labels: []
dependencies:
- TASK-198
references:
- /home/sudacode/projects/japanese/SubMiner/launcher/aniskip-metadata.ts
- /home/sudacode/projects/japanese/SubMiner/launcher/mpv.ts
- /home/sudacode/projects/japanese/SubMiner/launcher/main.test.ts
- /home/sudacode/projects/japanese/SubMiner/launcher/aniskip-metadata.test.ts
documentation: []
priority: medium
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Make `subminer --log-level=debug ...` reach the mpv plugin auto-start path by forwarding the launcher log level into `--script-opts`, so plugin-started overlay and texthooker subprocesses inherit debug logging.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Launcher mpv playback includes `subminer-log_level=<level>` in `--script-opts` when a non-info CLI log level is used.
- [x] #2 Detached idle mpv launch uses the same script-opt forwarding.
- [x] #3 Regression tests cover launcher script-opt forwarding.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
1. Add a failing launcher regression test that captures mpv argv and expects `subminer-log_level=debug` inside `--script-opts`.
2. Extend the shared script-opt builder to accept launcher log level and emit `subminer-log_level` for non-info runs.
3. Reuse that builder in both normal mpv playback and detached idle mpv launch.
4. Run focused launcher tests and launcher-plugin verification.
<!-- SECTION:PLAN:END -->
## Outcome
<!-- SECTION:OUTCOME:BEGIN -->
Forwarded launcher log level into mpv plugin script opts via the shared builder and reused that builder for idle mpv launch. `subminer --log-level=debug ...` now gives the plugin `opts.log_level=debug`, so auto-started overlay and texthooker subprocesses include `--log-level debug` and the tokenizer timing logs can actually appear in the app log.
<!-- SECTION:OUTCOME:END -->


@@ -0,0 +1,4 @@
type: fixed
area: anki
- Fixed card-mine OSD feedback so the final mine result stops the Anki spinner first, then shows a single-line `✓`/`x` status without being overwritten by a later spinner tick.


@@ -0,0 +1,5 @@
type: fixed
area: stats
- `subminer stats -b` now runs as a standalone background stats daemon instead of reusing the main SubMiner app process, so the overlay app can still be launched separately for normal video watching.
- Dashboard word mining still works against the background daemon by using a short-lived hidden helper for the Yomitan add-note flow.


@@ -1,4 +1,5 @@
type: changed
area: overlay
- Excluded interjections and sound-effect tokens from subtitle annotation styling so they no longer inherit misleading lexical highlight treatment while still remaining visible and non-interactive in the subtitle line.
- Excluded interjections and sound-effect tokens from subtitle annotation styling so they no longer inherit misleading lexical highlight treatment while still remaining visible and hoverable as plain subtitle tokens.
- Expanded subtitle annotation noise filtering to also strip annotation metadata from standalone grammar-only helper tokens such as particles, auxiliaries, adnominals, common explanatory endings like `んです` / `のだ`, and merged trailing quote-particle forms like `...って` while keeping them tokenized for hover lookup.


@@ -319,7 +319,7 @@
"SubMiner"
], // Tags to add to cards mined or updated by SubMiner. Provide an empty array to disable automatic tagging.
"fields": {
"word": "Expression", // Word setting.
"word": "Expression", // Card field for the mined word or expression text.
"audio": "ExpressionAudio", // Audio setting.
"image": "Picture", // Image setting.
"sentence": "Sentence", // Sentence setting.
@@ -340,6 +340,7 @@
"animatedFps": 10, // Animated fps setting.
"animatedMaxWidth": 640, // Animated max width setting.
"animatedCrf": 35, // Animated crf setting.
"syncAnimatedImageToWordAudio": true, // For animated AVIF images, prepend a frozen first frame matching the existing word-audio duration so motion starts with sentence audio. Values: true | false
"audioPadding": 0.5, // Audio padding setting.
"fallbackDuration": 3, // Fallback duration setting.
"maxMediaDuration": 30 // Max media duration setting.


@@ -26,7 +26,7 @@ The same immersion data powers the stats dashboard.
- In-app overlay: focus the visible overlay, then press the key from `stats.toggleKey` (default: `` ` `` / `Backquote`).
- Launcher command: run `subminer stats` to start the local stats server on demand and open the dashboard in your browser.
- Background server: run `subminer stats -b` to start or reuse a dedicated background stats server without keeping the launcher attached, and `subminer stats -s` to stop that background server.
- Background server: run `subminer stats -b` to start or reuse a dedicated background stats daemon without keeping the launcher attached, and `subminer stats -s` to stop that daemon.
- Maintenance command: run `subminer stats cleanup` or `subminer stats cleanup -v` to backfill/repair vocabulary metadata (`headword`, `reading`, POS) and purge stale or excluded rows from `imm_words` on demand.
- Browser page: open `http://127.0.0.1:5175` directly if the local stats server is already running.
@@ -80,8 +80,9 @@ Stats server config lives under `stats`:
- `autoStartServer` starts the local stats HTTP server on launch once immersion tracking is active, or reuses the dedicated background stats server when one is already running.
- `autoOpenBrowser` controls whether `subminer stats` launches the dashboard URL in your browser after ensuring the server is running.
- `subminer stats` forces the dashboard server to start even when `autoStartServer` is `false`.
- `subminer stats -b` starts or reuses the dedicated background stats server and exits after startup acknowledgement.
- `subminer stats -s` stops the dedicated background stats server without closing any browser tabs.
- `subminer stats -b` starts or reuses the dedicated background stats daemon and exits after startup acknowledgement.
- The background stats daemon is separate from the normal SubMiner overlay app, so you can leave it running and still launch SubMiner later to watch or mine from video.
- `subminer stats -s` stops the dedicated background stats daemon without closing any browser tabs.
- `subminer stats` fails with an error when `immersionTracking.enabled` is `false`.
- `subminer stats cleanup` defaults to vocabulary cleanup, repairs stale `headword`, `reading`, and `part_of_speech` values, attempts best-effort MeCab backfill for legacy rows, and removes rows that still fail vocab filtering.
@@ -89,7 +90,7 @@ Stats server config lives under `stats`:
The Vocabulary tab's word detail panel shows example lines from your viewing history. Each example line with a valid source file offers three mining buttons:
- **Mine Word** — performs a full Yomitan dictionary lookup for the word (definition, reading, pitch accent, etc.) via the hidden search page, then enriches the card with sentence audio, a screenshot or animated AVIF clip, the highlighted sentence, and metadata extracted from the source video file. Requires Anki and Yomitan dictionaries to be loaded.
- **Mine Word** — performs a full Yomitan dictionary lookup for the word (definition, reading, pitch accent, etc.) via a short-lived hidden helper, then enriches the card with sentence audio, a screenshot or animated AVIF clip, the highlighted sentence, and metadata extracted from the source video file. Requires Anki and Yomitan dictionaries to be loaded.
- **Mine Sentence** — creates a sentence card directly with the `IsSentenceCard` flag set (for Lapis/Kiku workflows), along with audio, image, and translation from the secondary subtitle if available.
- **Mine Audio** — creates an audio-only card with the `IsAudioCard` flag, attaching only the sentence audio clip.


@@ -319,7 +319,7 @@
"SubMiner"
], // Tags to add to cards mined or updated by SubMiner. Provide an empty array to disable automatic tagging.
"fields": {
"word": "Expression", // Word setting.
"word": "Expression", // Card field for the mined word or expression text.
"audio": "ExpressionAudio", // Audio setting.
"image": "Picture", // Image setting.
"sentence": "Sentence", // Sentence setting.
@@ -340,6 +340,7 @@
"animatedFps": 10, // Animated fps setting.
"animatedMaxWidth": 640, // Animated max width setting.
"animatedCrf": 35, // Animated crf setting.
"syncAnimatedImageToWordAudio": true, // For animated AVIF images, prepend a frozen first frame matching the existing word-audio duration so motion starts with sentence audio. Values: true | false
"audioPadding": 0.5, // Audio padding setting.
"fallbackDuration": 3, // Fallback duration setting.
"maxMediaDuration": 30 // Max media duration setting.

Binary file not shown (before: 38 KiB, after: 42 KiB).


@@ -4,6 +4,8 @@ SubMiner annotates subtitle tokens in real time as they appear in the overlay. F
All four are opt-in and configured under `subtitleStyle`, `ankiConnect.knownWords`, and `ankiConnect.nPlusOne` in your config. They apply independently — you can enable any combination.
Before any of those layers render, SubMiner strips annotation metadata from tokens that are usually just subtitle glue or annotation noise. Standalone particles, auxiliaries, adnominals, common explanatory endings like `んです` / `のだ`, merged trailing quote-particle forms like `...って`, repeated kana interjections, and similar non-lexical helper tokens remain hoverable in the subtitle text, but they render as plain tokens without known-word, N+1, frequency, JLPT, or name-match annotation styling.
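The noise filter described above can be sketched as a token-level predicate. This is illustrative only; the POS names, the ending list, and the token shape are assumptions for the sketch, not SubMiner's actual tag set or data model.

```typescript
// Illustrative sketch of the annotation noise filter: grammar-only helper
// tokens stay in the line (hoverable) but lose annotation metadata.
interface SubToken {
  surface: string;
  pos: string;
  annotations?: Record<string, unknown>; // frequency/JLPT/known-word/etc.
}

const GRAMMAR_POS = new Set(['particle', 'auxiliary', 'adnominal']);
const EXPLANATORY_ENDINGS = new Set(['んです', 'のだ']);

function stripAnnotationMetadata(token: SubToken): SubToken {
  const isNoise =
    GRAMMAR_POS.has(token.pos) ||
    EXPLANATORY_ENDINGS.has(token.surface) ||
    token.surface.endsWith('って'); // merged trailing quote-particle forms
  return isNoise ? { surface: token.surface, pos: token.pos } : token;
}
```

Lexical tokens pass through untouched, so the four annotation layers only ever see tokens that survived this filter.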
## N+1 Word Highlighting
N+1 highlighting identifies sentences where you know every word except one, making them ideal mining targets. When enabled, SubMiner builds a local cache of your known vocabulary from Anki and highlights tokens accordingly.
@@ -80,6 +82,10 @@ When `sourcePath` is omitted, SubMiner searches default install/runtime location
Frequency highlighting skips tokens that look like non-lexical noise (kana reduplication, short kana endings like `っ`), even when dictionary ranks exist.
:::
::: info
Frequency, JLPT, and N+1 metadata are only shown for tokens that survive the subtitle-annotation noise filter. Standalone grammar tokens like `は`, `です`, and `この` are intentionally left unannotated even if a dictionary can assign them metadata.
:::
## JLPT Tagging
JLPT tagging adds colored underlines to tokens based on their JLPT level (N1–N5), giving you an at-a-glance sense of difficulty distribution in each subtitle line.


@@ -145,7 +145,10 @@ test('resolveAniSkipMetadataForFile emits missing_mal_id when MAL search misses'
});
test('buildSubminerScriptOpts includes aniskip payload fields', () => {
const opts = buildSubminerScriptOpts('/tmp/SubMiner.AppImage', '/tmp/subminer.sock', {
const opts = buildSubminerScriptOpts(
'/tmp/SubMiner.AppImage',
'/tmp/subminer.sock',
{
title: "Frieren: Beyond Journey's End",
season: 1,
episode: 5,
@@ -154,10 +157,13 @@ test('buildSubminerScriptOpts includes aniskip payload fields', () => {
introStart: 30.5,
introEnd: 62,
lookupStatus: 'ready',
});
},
'debug',
);
const payloadMatch = opts.match(/subminer-aniskip_payload=([^,]+)/);
assert.match(opts, /subminer-binary_path=\/tmp\/SubMiner\.AppImage/);
assert.match(opts, /subminer-socket_path=\/tmp\/subminer\.sock/);
assert.match(opts, /subminer-log_level=debug/);
assert.match(opts, /subminer-aniskip_title=Frieren: Beyond Journey's End/);
assert.match(opts, /subminer-aniskip_season=1/);
assert.match(opts, /subminer-aniskip_episode=5/);

View File

@@ -1,5 +1,6 @@
import path from 'node:path';
import { spawnSync } from 'node:child_process';
import type { LogLevel } from './types.js';
import { commandExists } from './util.js';
export type AniSkipLookupStatus =
@@ -551,11 +552,15 @@ export function buildSubminerScriptOpts(
appPath: string,
socketPath: string,
aniSkipMetadata: AniSkipMetadata | null,
logLevel: LogLevel = 'info',
): string {
const parts = [
`subminer-binary_path=${sanitizeScriptOptValue(appPath)}`,
`subminer-socket_path=${sanitizeScriptOptValue(socketPath)}`,
];
if (logLevel !== 'info') {
parts.push(`subminer-log_level=${sanitizeScriptOptValue(logLevel)}`);
}
if (aniSkipMetadata && aniSkipMetadata.title) {
parts.push(`subminer-aniskip_title=${sanitizeScriptOptValue(aniSkipMetadata.title)}`);
}

View File

@@ -150,16 +150,17 @@ test('stats command launches attached app command with response path', async ()
assert.equal(handled, true);
assert.deepEqual(forwarded, [
[
'--stats',
'--stats-daemon-start',
'--stats-response-path',
'/tmp/subminer-stats-test/response.json',
'--stats-daemon-open-browser',
'--log-level',
'debug',
],
]);
});
test('stats background command launches detached app command with response path', async () => {
test('stats background command launches attached daemon control command with response path', async () => {
const context = createContext();
context.args.stats = true;
(context.args as typeof context.args & { statsBackground?: boolean }).statsBackground = true;
@@ -168,11 +169,9 @@ test('stats background command launches detached app command with response path'
const handled = await runStatsCommand(context, {
createTempDir: () => '/tmp/subminer-stats-test',
joinPath: (...parts) => parts.join('/'),
runAppCommandAttached: async () => {
throw new Error('attached path should not run for stats -b');
},
launchAppCommandDetached: (_appPath, appArgs) => {
runAppCommandAttached: async (_appPath, appArgs) => {
forwarded.push(appArgs);
return 0;
},
waitForStatsResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
removeDir: () => {},
@@ -181,10 +180,9 @@ test('stats background command launches detached app command with response path'
assert.equal(handled, true);
assert.deepEqual(forwarded, [
[
'--stats',
'--stats-daemon-start',
'--stats-response-path',
'/tmp/subminer-stats-test/response.json',
'--stats-background',
],
]);
});
@@ -215,7 +213,12 @@ test('stats command returns after startup response even if app process stays run
const final = await statsCommand;
assert.equal(final, true);
assert.deepEqual(forwarded, [
['--stats', '--stats-response-path', '/tmp/subminer-stats-test/response.json'],
[
'--stats-daemon-start',
'--stats-response-path',
'/tmp/subminer-stats-test/response.json',
'--stats-daemon-open-browser',
],
]);
});
@@ -268,7 +271,11 @@ test('stats stop command forwards stop flag to the app', async () => {
assert.equal(handled, true);
assert.deepEqual(forwarded, [
['--stats', '--stats-response-path', '/tmp/subminer-stats-test/response.json', '--stats-stop'],
[
'--stats-daemon-stop',
'--stats-response-path',
'/tmp/subminer-stats-test/response.json',
],
]);
});

View File

@@ -1,7 +1,7 @@
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import { launchAppCommandDetached, runAppCommandAttached } from '../mpv.js';
import { runAppCommandAttached } from '../mpv.js';
import { sleep } from '../util.js';
import type { LauncherCommandContext } from './context.js';
@@ -20,12 +20,6 @@ type StatsCommandDeps = {
logLevel: LauncherCommandContext['args']['logLevel'],
label: string,
) => Promise<number>;
launchAppCommandDetached: (
appPath: string,
appArgs: string[],
logLevel: LauncherCommandContext['args']['logLevel'],
label: string,
) => void;
waitForStatsResponse: (responsePath: string) => Promise<StatsCommandResponse>;
removeDir: (targetPath: string) => void;
};
@@ -37,8 +31,6 @@ const defaultDeps: StatsCommandDeps = {
joinPath: (...parts) => path.join(...parts),
runAppCommandAttached: (appPath, appArgs, logLevel, label) =>
runAppCommandAttached(appPath, appArgs, logLevel, label),
launchAppCommandDetached: (appPath, appArgs, logLevel, label) =>
launchAppCommandDetached(appPath, appArgs, logLevel, label),
waitForStatsResponse: async (responsePath) => {
const deadline = Date.now() + STATS_STARTUP_RESPONSE_TIMEOUT_MS;
while (Date.now() < deadline) {
@@ -75,12 +67,15 @@ export async function runStatsCommand(
const responsePath = resolvedDeps.joinPath(tempDir, 'response.json');
try {
const forwarded = ['--stats', '--stats-response-path', responsePath];
if (args.statsBackground) {
forwarded.push('--stats-background');
}
if (args.statsStop) {
forwarded.push('--stats-stop');
const forwarded = args.statsCleanup
? ['--stats', '--stats-response-path', responsePath]
: [
args.statsStop ? '--stats-daemon-stop' : '--stats-daemon-start',
'--stats-response-path',
responsePath,
];
if (!args.statsCleanup && !args.statsBackground && !args.statsStop) {
forwarded.push('--stats-daemon-open-browser');
}
if (args.statsCleanup) {
forwarded.push('--stats-cleanup');
@@ -94,14 +89,6 @@ export async function runStatsCommand(
if (args.logLevel !== 'info') {
forwarded.push('--log-level', args.logLevel);
}
if (args.statsBackground) {
resolvedDeps.launchAppCommandDetached(appPath, forwarded, args.logLevel, 'stats');
const startupResult = await resolvedDeps.waitForStatsResponse(responsePath);
if (!startupResult.ok) {
throw new Error(startupResult.error || 'Stats dashboard failed to start.');
}
return true;
}
const attachedExitPromise = resolvedDeps.runAppCommandAttached(
appPath,
forwarded,

View File

@@ -276,6 +276,16 @@ export function parseCliPrograms(
if (statsBackground && statsStop) {
throw new Error('Stats background and stop flags cannot be combined.');
}
if (
normalizedAction &&
normalizedAction !== 'cleanup' &&
normalizedAction !== 'rebuild' &&
normalizedAction !== 'backfill'
) {
throw new Error(
'Invalid stats action. Valid values are cleanup, rebuild, or backfill.',
);
}
if (normalizedAction && (statsBackground || statsStop)) {
throw new Error('Stats background and stop flags cannot be combined with stats actions.');
}

View File

@@ -387,6 +387,76 @@ ${bunBinary} -e "const net=require('node:net'); const fs=require('node:fs'); con
});
});
test('launcher forwards non-info log level into mpv plugin script opts', { timeout: 15000 }, () => {
withTempDir((root) => {
const homeDir = path.join(root, 'home');
const xdgConfigHome = path.join(root, 'xdg');
const binDir = path.join(root, 'bin');
const appPath = path.join(root, 'fake-subminer.sh');
const videoPath = path.join(root, 'movie.mkv');
const mpvArgsPath = path.join(root, 'mpv-args.txt');
const socketPath = path.join(root, 'mpv.sock');
const bunBinary = JSON.stringify(process.execPath.replace(/\\/g, '/'));
fs.mkdirSync(binDir, { recursive: true });
fs.mkdirSync(path.join(xdgConfigHome, 'SubMiner'), { recursive: true });
fs.mkdirSync(path.join(xdgConfigHome, 'mpv', 'script-opts'), { recursive: true });
fs.writeFileSync(videoPath, 'fake video content');
fs.writeFileSync(
path.join(xdgConfigHome, 'SubMiner', 'setup-state.json'),
JSON.stringify({
version: 1,
status: 'completed',
completedAt: '2026-03-08T00:00:00.000Z',
completionSource: 'user',
lastSeenYomitanDictionaryCount: 0,
pluginInstallStatus: 'installed',
pluginInstallPathSummary: null,
}),
);
fs.writeFileSync(
path.join(xdgConfigHome, 'mpv', 'script-opts', 'subminer.conf'),
`socket_path=${socketPath}\nauto_start=yes\nauto_start_visible_overlay=yes\nauto_start_pause_until_ready=yes\n`,
);
fs.writeFileSync(appPath, '#!/bin/sh\nexit 0\n');
fs.chmodSync(appPath, 0o755);
fs.writeFileSync(
path.join(binDir, 'mpv'),
`#!/bin/sh
set -eu
printf '%s\\n' "$@" > "$SUBMINER_TEST_MPV_ARGS"
socket_path=""
for arg in "$@"; do
case "$arg" in
--input-ipc-server=*)
socket_path="\${arg#--input-ipc-server=}"
;;
esac
done
${bunBinary} -e "const net=require('node:net'); const fs=require('node:fs'); const path=require('node:path'); const socket=process.argv[1]||''; try{ if (socket) fs.mkdirSync(path.dirname(socket),{recursive:true}); }catch{} try{ if (socket) fs.rmSync(socket,{force:true}); }catch{} if(!socket) process.exit(0); const server=net.createServer((c)=>c.end()); server.on('error',()=>process.exit(0)); try{ server.listen(socket,()=>setTimeout(()=>server.close(()=>process.exit(0)),250)); } catch { process.exit(0); }" "$socket_path"
`,
'utf8',
);
fs.chmodSync(path.join(binDir, 'mpv'), 0o755);
const env = {
...makeTestEnv(homeDir, xdgConfigHome),
PATH: `${binDir}${path.delimiter}${process.env.Path || process.env.PATH || ''}`,
Path: `${binDir}${path.delimiter}${process.env.Path || process.env.PATH || ''}`,
SUBMINER_APPIMAGE_PATH: appPath,
SUBMINER_TEST_MPV_ARGS: mpvArgsPath,
};
const result = runLauncher(['--log-level', 'debug', videoPath], env);
assert.equal(result.status, 0, `stdout:\n${result.stdout}\nstderr:\n${result.stderr}`);
assert.match(
fs.readFileSync(mpvArgsPath, 'utf8'),
/--script-opts=.*subminer-log_level=debug/,
);
});
});
test('dictionary command forwards --dictionary and --dictionary-target to app command path', () => {
withTempDir((root) => {
const homeDir = path.join(root, 'home');
@@ -466,7 +536,7 @@ exit 0
assert.equal(result.status, 0, `stdout:\n${result.stdout}\nstderr:\n${result.stderr}`);
assert.match(
fs.readFileSync(capturePath, 'utf8'),
/^--stats\n--stats-response-path\n.+\n--log-level\ndebug\n$/,
/^--stats-daemon-start\n--stats-response-path\n.+\n--stats-daemon-open-browser\n--log-level\ndebug\n$/,
);
});
},

View File

@@ -45,6 +45,8 @@ export function parseMpvArgString(input: string): string[] {
let inSingleQuote = false;
let inDoubleQuote = false;
let escaping = false;
const canEscape = (nextChar: string | undefined): boolean =>
nextChar === undefined || nextChar === '"' || nextChar === "'" || nextChar === '\\' || /\s/.test(nextChar);
for (let i = 0; i < chars.length; i += 1) {
const ch = chars[i] || '';
@@ -65,7 +67,11 @@ export function parseMpvArgString(input: string): string[] {
if (inDoubleQuote) {
if (ch === '\\') {
if (canEscape(chars[i + 1])) {
escaping = true;
} else {
current += ch;
}
continue;
}
if (ch === '"') {
@@ -77,7 +83,11 @@ export function parseMpvArgString(input: string): string[] {
}
if (ch === '\\') {
if (canEscape(chars[i + 1])) {
escaping = true;
} else {
current += ch;
}
continue;
}
if (ch === "'") {
@@ -576,7 +586,7 @@ export async function startMpv(
const aniSkipMetadata = shouldResolveAniSkipMetadata(target, targetKind, preloadedSubtitles)
? await resolveAniSkipMetadataForFile(target)
: null;
const scriptOpts = buildSubminerScriptOpts(appPath, socketPath, aniSkipMetadata);
const scriptOpts = buildSubminerScriptOpts(appPath, socketPath, aniSkipMetadata, args.logLevel);
if (aniSkipMetadata) {
log(
'debug',
@@ -857,8 +867,14 @@ export function runAppCommandAttached(
proc.once('error', (error) => {
reject(error);
});
proc.once('exit', (code) => {
resolve(code ?? 0);
proc.once('exit', (code, signal) => {
if (code !== null) {
resolve(code);
} else if (signal) {
resolve(128);
} else {
resolve(0);
}
});
});
}
@@ -939,9 +955,7 @@ export function launchMpvIdleDetached(
mpvArgs.push(...parseMpvArgString(args.mpvArgs));
}
mpvArgs.push('--idle=yes');
mpvArgs.push(
`--script-opts=subminer-binary_path=${appPath},subminer-socket_path=${socketPath}`,
);
mpvArgs.push(`--script-opts=${buildSubminerScriptOpts(appPath, socketPath, null, args.logLevel)}`);
mpvArgs.push(`--log-file=${getMpvLogPath()}`);
mpvArgs.push(`--input-ipc-server=${socketPath}`);
const mpvTarget = resolveCommandInvocation('mpv', mpvArgs);

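The `canEscape` change in `parseMpvArgString` above can be reduced to a standalone sketch: a backslash escapes only quotes, another backslash, or whitespace; any other `\X` pair is kept literally. This is a simplified model of just the backslash rule, not the full quoting tokenizer:

```typescript
// Reduced model of the backslash handling in the diff above.
export function unescapeMpvChunk(input: string): string {
  const canEscape = (next: string | undefined): boolean =>
    next === undefined || next === '"' || next === "'" || next === '\\' || /\s/.test(next);
  let out = '';
  for (let i = 0; i < input.length; i += 1) {
    const ch = input[i];
    if (ch === '\\' && canEscape(input[i + 1])) {
      // Consume the backslash and emit the escaped character (if any).
      i += 1;
      if (input[i] !== undefined) out += input[i];
      continue;
    }
    out += ch;
  }
  return out;
}
```

Keeping non-escapable pairs literal means Windows-style paths like `C:\tmp` survive unquoted instead of silently losing their backslashes.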
View File

@@ -40,8 +40,10 @@ import { createLogger } from './logger';
import {
createUiFeedbackState,
beginUpdateProgress,
clearUpdateProgress,
endUpdateProgress,
showStatusNotification,
showUpdateResult,
withUpdateProgress,
UiFeedbackState,
} from './anki-integration/ui-feedback';
@@ -54,6 +56,7 @@ import { FieldGroupingService } from './anki-integration/field-grouping';
import { FieldGroupingMergeCollaborator } from './anki-integration/field-grouping-merge';
import { NoteUpdateWorkflow } from './anki-integration/note-update-workflow';
import { FieldGroupingWorkflow } from './anki-integration/field-grouping-workflow';
import { resolveAnimatedImageLeadInSeconds } from './anki-integration/animated-image-sync';
import { AnkiIntegrationRuntime, normalizeAnkiIntegrationConfig } from './anki-integration/runtime';
const log = createLogger('anki').child('integration');
@@ -190,7 +193,7 @@ export class AnkiIntegration {
this.resolveNoteFieldName(noteInfo, preferredName),
extractFields: (fields) => this.extractFields(fields),
processSentence: (mpvSentence, noteFields) => this.processSentence(mpvSentence, noteFields),
generateMediaForMerge: () => this.generateMediaForMerge(),
generateMediaForMerge: (noteInfo) => this.generateMediaForMerge(noteInfo),
warnFieldParseOnce: (fieldName, reason, detail) =>
this.warnFieldParseOnce(fieldName, reason, detail),
});
@@ -286,6 +289,7 @@ export class AnkiIntegration {
storeMediaFile: (filename, data) => this.client.storeMediaFile(filename, data),
findNotes: async (query, options) =>
(await this.client.findNotes(query, options)) as number[],
retrieveMediaFile: (filename) => this.client.retrieveMediaFile(filename),
},
mediaGenerator: {
generateAudio: (videoPath, startTime, endTime, audioPadding, audioStreamIndex) =>
@@ -308,6 +312,8 @@ export class AnkiIntegration {
),
},
showOsdNotification: (text: string) => this.showOsdNotification(text),
showUpdateResult: (message: string, success: boolean) =>
this.showUpdateResult(message, success),
showStatusNotification: (message: string) => this.showStatusNotification(message),
showNotification: (noteId, label, errorSuffix) =>
this.showNotification(noteId, label, errorSuffix),
@@ -319,6 +325,7 @@ export class AnkiIntegration {
this.resolveConfiguredFieldName(noteInfo, ...preferredNames),
resolveNoteFieldName: (noteInfo, preferredName) =>
this.resolveNoteFieldName(noteInfo, preferredName),
getAnimatedImageLeadInSeconds: (noteInfo) => this.getAnimatedImageLeadInSeconds(noteInfo),
extractFields: (fields) => this.extractFields(fields),
processSentence: (mpvSentence, noteFields) => this.processSentence(mpvSentence, noteFields),
setCardTypeFields: (updatedFields, availableFieldNames, cardKind) =>
@@ -407,12 +414,13 @@ export class AnkiIntegration {
this.resolveConfiguredFieldName(noteInfo, ...preferredNames),
getResolvedSentenceAudioFieldName: (noteInfo) =>
this.getResolvedSentenceAudioFieldName(noteInfo),
getAnimatedImageLeadInSeconds: (noteInfo) => this.getAnimatedImageLeadInSeconds(noteInfo),
mergeFieldValue: (existing, newValue, overwrite) =>
this.mergeFieldValue(existing, newValue, overwrite),
generateAudioFilename: () => this.generateAudioFilename(),
generateAudio: () => this.generateAudio(),
generateImageFilename: () => this.generateImageFilename(),
generateImage: () => this.generateImage(),
generateImage: (animatedLeadInSeconds) => this.generateImage(animatedLeadInSeconds),
formatMiscInfoPattern: (fallbackFilename, startTimeSeconds) =>
this.formatMiscInfoPattern(fallbackFilename, startTimeSeconds),
addConfiguredTagsToNote: (noteId) => this.addConfiguredTagsToNote(noteId),
@@ -637,7 +645,7 @@ export class AnkiIntegration {
);
}
private async generateImage(): Promise<Buffer | null> {
private async generateImage(animatedLeadInSeconds = 0): Promise<Buffer | null> {
if (!this.mpvClient || !this.mpvClient.currentVideoPath) {
return null;
}
@@ -665,6 +673,7 @@ export class AnkiIntegration {
maxWidth: this.config.media?.animatedMaxWidth,
maxHeight: this.config.media?.animatedMaxHeight,
crf: this.config.media?.animatedCrf,
leadingStillDuration: animatedLeadInSeconds,
},
);
} else {
@@ -768,6 +777,12 @@ export class AnkiIntegration {
});
}
private clearUpdateProgress(): void {
clearUpdateProgress(this.uiFeedbackState, (timer) => {
clearInterval(timer);
});
}
private async withUpdateProgress<T>(
initialMessage: string,
action: () => Promise<T>,
@@ -898,7 +913,9 @@ export class AnkiIntegration {
const type = this.config.behavior?.notificationType || 'osd';
if (type === 'osd' || type === 'both') {
this.showOsdNotification(message);
this.showUpdateResult(message, true);
} else {
this.clearUpdateProgress();
}
if ((type === 'system' || type === 'both') && this.notificationCallback) {
@@ -933,6 +950,21 @@ export class AnkiIntegration {
}
}
private showUpdateResult(message: string, success: boolean): void {
showUpdateResult(
this.uiFeedbackState,
{
clearProgressTimer: (timer) => {
clearInterval(timer);
},
showOsdNotification: (text) => {
this.showOsdNotification(text);
},
},
{ message, success },
);
}
private mergeFieldValue(existing: string, newValue: string, overwrite: boolean): string {
if (overwrite || !existing.trim()) {
return newValue;
@@ -1020,7 +1052,18 @@ export class AnkiIntegration {
return getPreferredWordValueFromExtractedFields(fields, this.config);
}
private async generateMediaForMerge(): Promise<{
private async getAnimatedImageLeadInSeconds(noteInfo: NoteInfo): Promise<number> {
return resolveAnimatedImageLeadInSeconds({
config: this.config,
noteInfo,
resolveConfiguredFieldName: (candidateNoteInfo, ...preferredNames) =>
this.resolveConfiguredFieldName(candidateNoteInfo, ...preferredNames),
retrieveMediaFileBase64: (filename) => this.client.retrieveMediaFile(filename),
logWarn: (message, ...args) => log.warn(message, ...args),
});
}
private async generateMediaForMerge(noteInfo?: NoteInfo): Promise<{
audioField?: string;
audioValue?: string;
imageField?: string;
@@ -1057,8 +1100,11 @@ export class AnkiIntegration {
if (this.config.media?.generateImage && this.mpvClient?.currentVideoPath) {
try {
const animatedLeadInSeconds = noteInfo
? await this.getAnimatedImageLeadInSeconds(noteInfo)
: 0;
const imageFilename = this.generateImageFilename();
const imageBuffer = await this.generateImage();
const imageBuffer = await this.generateImage(animatedLeadInSeconds);
if (imageBuffer) {
await this.client.storeMediaFile(imageFilename, imageBuffer);
result.imageField = this.config.fields?.image || DEFAULT_ANKI_CONNECT_CONFIG.fields.image;

View File

@@ -0,0 +1,82 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { resolveAnimatedImageLeadInSeconds, extractSoundFilenames } from './animated-image-sync';
test('extractSoundFilenames returns ordered sound filenames from an Anki field value', () => {
assert.deepEqual(
extractSoundFilenames('before [sound:word.mp3] middle [sound:alt.ogg] after'),
['word.mp3', 'alt.ogg'],
);
});
test('resolveAnimatedImageLeadInSeconds sums configured word audio durations for animated images', async () => {
const leadInSeconds = await resolveAnimatedImageLeadInSeconds({
config: {
fields: {
audio: 'ExpressionAudio',
},
media: {
imageType: 'avif',
syncAnimatedImageToWordAudio: true,
},
},
noteInfo: {
noteId: 42,
fields: {
ExpressionAudio: {
value: '[sound:word.mp3][sound:alt.ogg]',
},
},
},
resolveConfiguredFieldName: (noteInfo, ...preferredNames) => {
for (const preferredName of preferredNames) {
if (!preferredName) continue;
const resolved = Object.keys(noteInfo.fields).find(
(fieldName) => fieldName.toLowerCase() === preferredName.toLowerCase(),
);
if (resolved) return resolved;
}
return null;
},
retrieveMediaFileBase64: async (filename) =>
filename === 'word.mp3' ? 'd29yZA==' : filename === 'alt.ogg' ? 'YWx0' : '',
probeAudioDurationSeconds: async (_buffer, filename) =>
filename === 'word.mp3' ? 0.41 : filename === 'alt.ogg' ? 0.84 : null,
logWarn: () => undefined,
});
assert.equal(leadInSeconds, 1.25);
});
test('resolveAnimatedImageLeadInSeconds falls back to zero when sync is disabled', async () => {
const leadInSeconds = await resolveAnimatedImageLeadInSeconds({
config: {
fields: {
audio: 'ExpressionAudio',
},
media: {
imageType: 'avif',
syncAnimatedImageToWordAudio: false,
},
},
noteInfo: {
noteId: 42,
fields: {
ExpressionAudio: {
value: '[sound:word.mp3]',
},
},
},
resolveConfiguredFieldName: () => 'ExpressionAudio',
retrieveMediaFileBase64: async () => {
throw new Error('should not be called');
},
probeAudioDurationSeconds: async () => {
throw new Error('should not be called');
},
logWarn: () => undefined,
});
assert.equal(leadInSeconds, 0);
});

View File

@@ -0,0 +1,133 @@
import { execFile as nodeExecFile } from 'node:child_process';
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';
import { DEFAULT_ANKI_CONNECT_CONFIG } from '../config';
import type { AnkiConnectConfig } from '../types';
type NoteInfoLike = {
noteId: number;
fields: Record<string, { value: string }>;
};
interface ResolveAnimatedImageLeadInSecondsArgs<TNoteInfo extends NoteInfoLike> {
config: Pick<AnkiConnectConfig, 'fields' | 'media'>;
noteInfo: TNoteInfo;
resolveConfiguredFieldName: (
noteInfo: TNoteInfo,
...preferredNames: (string | undefined)[]
) => string | null;
retrieveMediaFileBase64: (filename: string) => Promise<string>;
probeAudioDurationSeconds?: (buffer: Buffer, filename: string) => Promise<number | null>;
logWarn?: (message: string, ...args: unknown[]) => void;
}
interface ProbeAudioDurationDeps {
execFile?: typeof nodeExecFile;
mkdtempSync?: typeof fs.mkdtempSync;
writeFileSync?: typeof fs.writeFileSync;
rmSync?: typeof fs.rmSync;
}
export function extractSoundFilenames(value: string): string[] {
const matches = value.matchAll(/\[sound:([^\]]+)\]/gi);
return Array.from(matches, (match) => match[1]?.trim() || '').filter((value) => value.length > 0);
}
function shouldSyncAnimatedImageToWordAudio(config: Pick<AnkiConnectConfig, 'media'>): boolean {
return (
config.media?.imageType === 'avif' && config.media?.syncAnimatedImageToWordAudio !== false
);
}
export async function probeAudioDurationSeconds(
buffer: Buffer,
filename: string,
deps: ProbeAudioDurationDeps = {},
): Promise<number | null> {
const execFile = deps.execFile ?? nodeExecFile;
const mkdtempSync = deps.mkdtempSync ?? fs.mkdtempSync;
const writeFileSync = deps.writeFileSync ?? fs.writeFileSync;
const rmSync = deps.rmSync ?? fs.rmSync;
const tempDir = mkdtempSync(path.join(os.tmpdir(), 'subminer-audio-probe-'));
const ext = path.extname(filename) || '.bin';
const tempPath = path.join(tempDir, `probe${ext}`);
writeFileSync(tempPath, buffer);
return new Promise((resolve) => {
execFile(
'ffprobe',
[
'-v',
'error',
'-show_entries',
'format=duration',
'-of',
'default=noprint_wrappers=1:nokey=1',
tempPath,
],
(error, stdout) => {
try {
if (error) {
resolve(null);
return;
}
const durationSeconds = Number.parseFloat((stdout || '').trim());
resolve(Number.isFinite(durationSeconds) && durationSeconds > 0 ? durationSeconds : null);
} finally {
rmSync(tempDir, { recursive: true, force: true });
}
},
);
});
}
export async function resolveAnimatedImageLeadInSeconds<TNoteInfo extends NoteInfoLike>({
config,
noteInfo,
resolveConfiguredFieldName,
retrieveMediaFileBase64,
probeAudioDurationSeconds: probeDuration = probeAudioDurationSeconds,
logWarn,
}: ResolveAnimatedImageLeadInSecondsArgs<TNoteInfo>): Promise<number> {
if (!shouldSyncAnimatedImageToWordAudio(config)) {
return 0;
}
const wordAudioFieldName = resolveConfiguredFieldName(
noteInfo,
config.fields?.audio,
DEFAULT_ANKI_CONNECT_CONFIG.fields.audio,
);
if (!wordAudioFieldName) {
return 0;
}
const wordAudioValue = noteInfo.fields[wordAudioFieldName]?.value || '';
const filenames = extractSoundFilenames(wordAudioValue);
if (filenames.length === 0) {
return 0;
}
let totalLeadInSeconds = 0;
for (const filename of filenames) {
const encoded = await retrieveMediaFileBase64(filename);
if (!encoded) {
logWarn?.('Animated image sync skipped: failed to retrieve word audio', filename);
return 0;
}
const durationSeconds = await probeDuration(Buffer.from(encoded, 'base64'), filename);
if (!(typeof durationSeconds === 'number' && Number.isFinite(durationSeconds))) {
logWarn?.('Animated image sync skipped: failed to probe word audio duration', filename);
return 0;
}
totalLeadInSeconds += durationSeconds;
}
return totalLeadInSeconds;
}

View File

@@ -30,6 +30,7 @@ interface CardCreationClient {
updateNoteFields(noteId: number, fields: Record<string, string>): Promise<void>;
storeMediaFile(filename: string, data: Buffer): Promise<void>;
findNotes(query: string, options?: { maxRetries?: number }): Promise<number[]>;
retrieveMediaFile(filename: string): Promise<string>;
}
interface CardCreationMediaGenerator {
@@ -60,6 +61,7 @@ interface CardCreationMediaGenerator {
maxWidth?: number;
maxHeight?: number;
crf?: number;
leadingStillDuration?: number;
},
): Promise<Buffer | null>;
}
@@ -73,6 +75,7 @@ interface CardCreationDeps {
client: CardCreationClient;
mediaGenerator: CardCreationMediaGenerator;
showOsdNotification: (text: string) => void;
showUpdateResult: (message: string, success: boolean) => void;
showStatusNotification: (message: string) => void;
showNotification: (noteId: number, label: string | number, errorSuffix?: string) => Promise<void>;
beginUpdateProgress: (initialMessage: string) => void;
@@ -83,6 +86,7 @@ interface CardCreationDeps {
...preferredNames: (string | undefined)[]
) => string | null;
resolveNoteFieldName: (noteInfo: CardCreationNoteInfo, preferredName?: string) => string | null;
getAnimatedImageLeadInSeconds: (noteInfo: CardCreationNoteInfo) => Promise<number>;
extractFields: (fields: Record<string, { value: string }>) => Record<string, string>;
processSentence: (mpvSentence: string, noteFields: Record<string, string>) => string;
setCardTypeFields: (
@@ -258,11 +262,13 @@ export class CardCreationService {
if (this.deps.getConfig().media?.generateImage) {
try {
const animatedLeadInSeconds = await this.deps.getAnimatedImageLeadInSeconds(noteInfo);
const imageFilename = this.generateImageFilename();
const imageBuffer = await this.generateImageBuffer(
mpvClient.currentVideoPath,
rangeStart,
rangeEnd,
animatedLeadInSeconds,
);
if (imageBuffer) {
@@ -414,11 +420,13 @@ export class CardCreationService {
if (this.deps.getConfig().media?.generateImage) {
try {
const animatedLeadInSeconds = await this.deps.getAnimatedImageLeadInSeconds(noteInfo);
const imageFilename = this.generateImageFilename();
const imageBuffer = await this.generateImageBuffer(
mpvClient.currentVideoPath,
startTime,
endTime,
animatedLeadInSeconds,
);
const imageField = this.deps.getConfig().fields?.image;
@@ -545,7 +553,7 @@ export class CardCreationService {
this.deps.trackLastAddedNoteId?.(noteId);
} catch (error) {
log.error('Failed to create sentence card:', (error as Error).message);
this.deps.showOsdNotification(`Sentence card failed: ${(error as Error).message}`);
this.deps.showUpdateResult(`Sentence card failed: ${(error as Error).message}`, false);
return false;
}
@@ -642,7 +650,7 @@ export class CardCreationService {
});
} catch (error) {
log.error('Error creating sentence card:', (error as Error).message);
this.deps.showOsdNotification(`Sentence card failed: ${(error as Error).message}`);
this.deps.showUpdateResult(`Sentence card failed: ${(error as Error).message}`, false);
return false;
}
}
@@ -679,6 +687,7 @@ export class CardCreationService {
videoPath: string,
startTime: number,
endTime: number,
animatedLeadInSeconds = 0,
): Promise<Buffer | null> {
const mpvClient = this.deps.getMpvClient();
if (!mpvClient) {
@@ -707,6 +716,7 @@ export class CardCreationService {
maxWidth: this.deps.getConfig().media?.animatedMaxWidth,
maxHeight: this.deps.getConfig().media?.animatedMaxHeight,
crf: this.deps.getConfig().media?.animatedCrf,
leadingStillDuration: animatedLeadInSeconds,
},
);
}

View File

@@ -28,7 +28,7 @@ interface FieldGroupingMergeDeps {
) => string | null;
extractFields: (fields: Record<string, { value: string }>) => Record<string, string>;
processSentence: (mpvSentence: string, noteFields: Record<string, string>) => string;
generateMediaForMerge: () => Promise<FieldGroupingMergeMedia>;
generateMediaForMerge: (noteInfo: FieldGroupingMergeNoteInfo) => Promise<FieldGroupingMergeMedia>;
warnFieldParseOnce: (fieldName: string, reason: string, detail?: string) => void;
}
@@ -132,7 +132,7 @@ export class FieldGroupingMergeCollaborator {
}
if (includeGeneratedMedia) {
const media = await this.deps.generateMediaForMerge();
const media = await this.deps.generateMediaForMerge(keepNoteInfo);
if (media.audioField && media.audioValue && !sourceFields[media.audioField]) {
sourceFields[media.audioField] = media.audioValue;
}

View File

@@ -62,6 +62,7 @@ function createWorkflowHarness() {
return names.find((name) => name.toLowerCase() === preferred.toLowerCase()) ?? null;
},
getResolvedSentenceAudioFieldName: () => null,
getAnimatedImageLeadInSeconds: async () => 0,
mergeFieldValue: (_existing: string, next: string, _overwrite: boolean) => next,
generateAudioFilename: () => 'audio_1.mp3',
generateAudio: async () => null,
@@ -163,3 +164,42 @@ test('NoteUpdateWorkflow updates note before auto field grouping merge', async (
assert.deepEqual(callOrder, ['update', 'auto']);
assert.equal(harness.updates.length, 1);
});
test('NoteUpdateWorkflow passes animated image lead-in when syncing avif to word audio', async () => {
const harness = createWorkflowHarness();
let receivedLeadInSeconds = 0;
harness.deps.client.notesInfo = async () =>
[
{
noteId: 42,
fields: {
Expression: { value: 'taberu' },
ExpressionAudio: { value: '[sound:word.mp3]' },
Sentence: { value: '' },
Picture: { value: '' },
},
},
] satisfies NoteUpdateWorkflowNoteInfo[];
harness.deps.getConfig = () => ({
fields: {
sentence: 'Sentence',
image: 'Picture',
},
media: {
generateImage: true,
imageType: 'avif',
syncAnimatedImageToWordAudio: true,
},
behavior: {},
});
harness.deps.getAnimatedImageLeadInSeconds = async () => 1.25;
harness.deps.generateImage = async (leadInSeconds?: number) => {
receivedLeadInSeconds = leadInSeconds ?? 0;
return Buffer.from('image');
};
await harness.workflow.execute(42);
assert.equal(receivedLeadInSeconds, 1.25);
});

View File

@@ -22,6 +22,8 @@ export interface NoteUpdateWorkflowDeps {
media?: {
generateAudio?: boolean;
generateImage?: boolean;
imageType?: 'static' | 'avif';
syncAnimatedImageToWordAudio?: boolean;
};
behavior?: {
overwriteAudio?: boolean;
@@ -60,11 +62,12 @@ export interface NoteUpdateWorkflowDeps {
...preferredNames: (string | undefined)[]
) => string | null;
getResolvedSentenceAudioFieldName: (noteInfo: NoteUpdateWorkflowNoteInfo) => string | null;
getAnimatedImageLeadInSeconds: (noteInfo: NoteUpdateWorkflowNoteInfo) => Promise<number>;
mergeFieldValue: (existing: string, newValue: string, overwrite: boolean) => string;
generateAudioFilename: () => string;
generateAudio: () => Promise<Buffer | null>;
generateImageFilename: () => string;
generateImage: () => Promise<Buffer | null>;
generateImage: (animatedLeadInSeconds?: number) => Promise<Buffer | null>;
formatMiscInfoPattern: (fallbackFilename: string, startTimeSeconds?: number) => string;
addConfiguredTagsToNote: (noteId: number) => Promise<void>;
showNotification: (noteId: number, label: string | number) => Promise<void>;
@@ -153,8 +156,9 @@ export class NoteUpdateWorkflow {
if (config.media?.generateImage) {
try {
const animatedLeadInSeconds = await this.deps.getAnimatedImageLeadInSeconds(noteInfo);
const imageFilename = this.deps.generateImageFilename();
const imageBuffer = await this.deps.generateImage();
const imageBuffer = await this.deps.generateImage(animatedLeadInSeconds);
if (imageBuffer) {
await this.deps.client.storeMediaFile(imageFilename, imageBuffer);

View File

@@ -59,6 +59,10 @@ test('AnkiIntegrationRuntime normalizes url and proxy defaults', () => {
normalized.media?.fallbackDuration,
DEFAULT_ANKI_CONNECT_CONFIG.media.fallbackDuration,
);
assert.equal(
normalized.media?.syncAnimatedImageToWordAudio,
DEFAULT_ANKI_CONNECT_CONFIG.media.syncAnimatedImageToWordAudio,
);
});
test('AnkiIntegrationRuntime starts proxy transport when proxy mode is enabled', () => {
@@ -106,3 +110,44 @@ test('AnkiIntegrationRuntime switches transports and clears known words when run
'proxy:start:127.0.0.1:8766:http://127.0.0.1:8765',
]);
});
test('AnkiIntegrationRuntime skips known-word lifecycle restart for unrelated runtime patches', () => {
const { runtime, calls } = createRuntime({
knownWords: {
highlightEnabled: true,
},
pollingRate: 250,
});
runtime.start();
calls.length = 0;
runtime.applyRuntimeConfigPatch({
behavior: {
autoUpdateNewCards: false,
},
});
assert.deepEqual(calls, []);
});
test('AnkiIntegrationRuntime restarts known-word lifecycle when known-word settings change', () => {
const { runtime, calls } = createRuntime({
knownWords: {
highlightEnabled: true,
refreshMinutes: 90,
},
pollingRate: 250,
});
runtime.start();
calls.length = 0;
runtime.applyRuntimeConfigPatch({
knownWords: {
refreshMinutes: 120,
},
});
assert.deepEqual(calls, ['known:start']);
});

View File

@@ -145,6 +145,9 @@ export class AnkiIntegrationRuntime {
applyRuntimeConfigPatch(patch: Partial<AnkiConnectConfig>): void {
const wasKnownWordCacheEnabled = this.config.knownWords?.highlightEnabled === true;
const previousKnownWordCacheConfig = wasKnownWordCacheEnabled
? this.getKnownWordCacheLifecycleConfig(this.config)
: null;
const previousTransportKey = this.getTransportConfigKey(this.config);
const mergedConfig: AnkiConnectConfig = {
@@ -191,11 +194,19 @@ export class AnkiIntegrationRuntime {
};
this.config = normalizeAnkiIntegrationConfig(mergedConfig);
this.deps.onConfigChanged?.(this.config);
const nextKnownWordCacheEnabled = this.config.knownWords?.highlightEnabled === true;
if (wasKnownWordCacheEnabled && this.config.knownWords?.highlightEnabled === false) {
this.deps.knownWordCache.stopLifecycle();
this.deps.knownWordCache.clearKnownWordCacheState();
} else {
} else if (!wasKnownWordCacheEnabled && nextKnownWordCacheEnabled) {
this.deps.knownWordCache.startLifecycle();
} else if (
wasKnownWordCacheEnabled &&
nextKnownWordCacheEnabled &&
previousKnownWordCacheConfig !== null &&
previousKnownWordCacheConfig !== this.getKnownWordCacheLifecycleConfig(this.config)
) {
this.deps.knownWordCache.startLifecycle();
}
@@ -206,6 +217,48 @@ export class AnkiIntegrationRuntime {
}
}
private getKnownWordCacheLifecycleConfig(config: AnkiConnectConfig): string {
return JSON.stringify({
refreshMinutes: this.getKnownWordRefreshIntervalMinutes(config),
scope: this.getKnownWordCacheScopeForConfig(config),
fieldsWord: trimToNonEmptyString(config.fields?.word) ?? '',
});
}
private getKnownWordRefreshIntervalMinutes(config: AnkiConnectConfig): number {
const refreshMinutes = config.knownWords?.refreshMinutes;
return typeof refreshMinutes === 'number' && Number.isFinite(refreshMinutes) && refreshMinutes > 0
? refreshMinutes
: DEFAULT_ANKI_CONNECT_CONFIG.knownWords.refreshMinutes;
}
private getKnownWordCacheScopeForConfig(config: AnkiConnectConfig): string {
const configuredDecks = config.knownWords?.decks;
if (configuredDecks && typeof configuredDecks === 'object' && !Array.isArray(configuredDecks)) {
const normalizedDecks = Object.entries(configuredDecks)
.map(([deckName, fields]) => {
const name = trimToNonEmptyString(deckName);
if (!name) return null;
const normalizedFields = Array.isArray(fields)
? [
...new Set(
fields.map(String).map(trimToNonEmptyString).filter((field): field is string => Boolean(field)),
),
].sort()
: [];
return [name, normalizedFields];
})
.filter((entry): entry is [string, string[]] => entry !== null)
.sort(([a], [b]) => a.localeCompare(b));
if (normalizedDecks.length > 0) {
return `decks:${JSON.stringify(normalizedDecks)}`;
}
}
const configuredDeck = trimToNonEmptyString(config.deck);
return configuredDeck ? `deck:${configuredDeck}` : 'is:note';
}
getOrCreateProxyServer(): AnkiIntegrationRuntimeProxyServer {
if (!this.proxyServer) {
this.proxyServer = this.deps.proxyServerFactory();

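The restart check in this hunk compares canonical string keys built from only the settings the known-word cache actually depends on, so unrelated runtime patches compare equal and skip the restart. A reduced standalone sketch of that idea (field names simplified, not the repo's actual config types):

```typescript
// Hypothetical reduced config: only fields that should trigger a
// known-word cache restart participate in the canonical key.
interface KnownWordsSketchConfig {
  refreshMinutes?: number;
  deck?: string;
}

// Serialize the relevant fields (with defaults applied) into one string;
// two configs restart the lifecycle only if their keys differ.
function lifecycleKey(config: KnownWordsSketchConfig): string {
  return JSON.stringify({
    refreshMinutes: config.refreshMinutes ?? 60,
    scope: config.deck ? `deck:${config.deck}` : 'is:note',
  });
}

// Unrelated patches produce identical keys, so no restart is needed.
console.log(lifecycleKey({ refreshMinutes: 90 }) === lifecycleKey({ refreshMinutes: 90 }));
// A changed refresh interval produces a different key, forcing a restart.
console.log(lifecycleKey({ refreshMinutes: 90 }) !== lifecycleKey({ refreshMinutes: 120 }));
```

Serializing a normalized subset to JSON sidesteps deep-equality checks while still ignoring fields (like `behavior.autoUpdateNewCards`) that the cache does not care about.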
View File

@@ -0,0 +1,67 @@
import test from 'node:test';
import assert from 'node:assert/strict';
import {
beginUpdateProgress,
createUiFeedbackState,
showProgressTick,
showUpdateResult,
} from './ui-feedback';
test('showUpdateResult stops spinner before success notification and suppresses stale ticks', () => {
const state = createUiFeedbackState();
const osdMessages: string[] = [];
beginUpdateProgress(state, 'Creating sentence card', () => {
showProgressTick(state, (text) => {
osdMessages.push(text);
});
});
showUpdateResult(
state,
{
clearProgressTimer: (timer) => {
clearInterval(timer);
},
showOsdNotification: (text) => {
osdMessages.push(text);
},
},
{ success: true, message: 'Updated card: taberu' },
);
showProgressTick(state, (text) => {
osdMessages.push(text);
});
assert.deepEqual(osdMessages, ['Creating sentence card |', '✓ Updated card: taberu']);
});
test('showUpdateResult renders failed updates with an x marker', () => {
const state = createUiFeedbackState();
const osdMessages: string[] = [];
beginUpdateProgress(state, 'Creating sentence card', () => {
showProgressTick(state, (text) => {
osdMessages.push(text);
});
});
showUpdateResult(
state,
{
clearProgressTimer: (timer) => {
clearInterval(timer);
},
showOsdNotification: (text) => {
osdMessages.push(text);
},
},
{ success: false, message: 'Sentence card failed: deck missing' },
);
assert.deepEqual(osdMessages, [
'Creating sentence card |',
'x Sentence card failed: deck missing',
]);
});

View File

@@ -7,6 +7,11 @@ export interface UiFeedbackState {
progressFrame: number;
}
export interface UiFeedbackResult {
success: boolean;
message: string;
}
export interface UiFeedbackNotificationContext {
getNotificationType: () => string | undefined;
showOsd: (text: string) => void;
@@ -66,6 +71,15 @@ export function endUpdateProgress(
state.progressDepth = Math.max(0, state.progressDepth - 1);
if (state.progressDepth > 0) return;
clearUpdateProgress(state, clearProgressTimer);
}
export function clearUpdateProgress(
state: UiFeedbackState,
clearProgressTimer: (timer: ReturnType<typeof setInterval>) => void,
): void {
state.progressDepth = 0;
if (state.progressTimer) {
clearProgressTimer(state.progressTimer);
state.progressTimer = null;
@@ -85,6 +99,19 @@ export function showProgressTick(
showOsdNotification(`${state.progressMessage} ${frame}`);
}
export function showUpdateResult(
state: UiFeedbackState,
options: {
clearProgressTimer: (timer: ReturnType<typeof setInterval>) => void;
showOsdNotification: (text: string) => void;
},
result: UiFeedbackResult,
): void {
clearUpdateProgress(state, options.clearProgressTimer);
const prefix = result.success ? '✓' : 'x';
options.showOsdNotification(`${prefix} ${result.message}`);
}
export async function withUpdateProgress<T>(
state: UiFeedbackState,
options: UiFeedbackOptions,

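The fix in this hunk is an ordering guarantee: clear the spinner timer before emitting the final `✓`/`x` line so a stale progress tick cannot overwrite it. A minimal self-contained sketch of that guarantee (simplified state, not the module's real API):

```typescript
// Hypothetical simplified feedback state: one timer, one message.
interface FeedbackSketchState {
  timer: ReturnType<typeof setInterval> | null;
  message: string;
}

function beginProgress(state: FeedbackSketchState, message: string, onTick: () => void): void {
  state.message = message;
  onTick(); // paint the first spinner frame immediately
  state.timer = setInterval(onTick, 250);
}

// Clearing the timer BEFORE emitting the result means no further tick
// can fire and overwrite the final marker on the OSD.
function showResult(
  state: FeedbackSketchState,
  show: (text: string) => void,
  success: boolean,
  message: string,
): void {
  if (state.timer) {
    clearInterval(state.timer);
    state.timer = null;
  }
  show(`${success ? '✓' : 'x'} ${message}`);
}

const osd: string[] = [];
const state: FeedbackSketchState = { timer: null, message: '' };
beginProgress(state, 'Creating sentence card', () => osd.push(`${state.message} |`));
showResult(state, (text) => osd.push(text), true, 'Updated card: taberu');
console.log(osd.join(' / '));
// → Creating sentence card | / ✓ Updated card: taberu
```

The interval never gets a chance to fire after the result, which is exactly what the `suppresses stale ticks` test above asserts against the real module.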
View File

@@ -47,6 +47,7 @@ export const INTEGRATIONS_DEFAULT_CONFIG: Pick<
animatedMaxWidth: 640,
animatedMaxHeight: undefined,
animatedCrf: 35,
syncAnimatedImageToWordAudio: true,
audioPadding: 0.5,
fallbackDuration: 3.0,
maxMediaDuration: 30,

View File

@@ -82,6 +82,13 @@ export function buildIntegrationConfigOptionRegistry(
description: 'Automatically update newly added cards.',
runtime: runtimeOptionById.get('anki.autoUpdateNewCards'),
},
{
path: 'ankiConnect.media.syncAnimatedImageToWordAudio',
kind: 'boolean',
defaultValue: defaultConfig.ankiConnect.media.syncAnimatedImageToWordAudio,
description:
'For animated AVIF images, prepend a frozen first frame matching the existing word-audio duration so motion starts with sentence audio.',
},
{
path: 'ankiConnect.knownWords.matchMode',
kind: 'enum',

View File

@@ -121,6 +121,22 @@ test('accepts configured ankiConnect.fields.word override', () => {
);
});
test('accepts ankiConnect.media.syncAnimatedImageToWordAudio override', () => {
const { context, warnings } = makeContext({
media: {
syncAnimatedImageToWordAudio: false,
},
});
applyAnkiConnectResolution(context);
assert.equal(context.resolved.ankiConnect.media.syncAnimatedImageToWordAudio, false);
assert.equal(
warnings.some((warning) => warning.path === 'ankiConnect.media.syncAnimatedImageToWordAudio'),
false,
);
});
test('maps legacy ankiConnect.wordField to modern ankiConnect.fields.word', () => {
const { context, warnings } = makeContext({
wordField: 'TargetWordLegacy',

View File

@@ -31,6 +31,7 @@ export function applyAnkiConnectResolution(context: ResolveContext): void {
'animatedMaxWidth',
'animatedMaxHeight',
'animatedCrf',
'syncAnimatedImageToWordAudio',
'audioPadding',
'fallbackDuration',
'maxMediaDuration',
@@ -536,6 +537,17 @@ export function applyAnkiConnectResolution(context: ResolveContext): void {
'Expected integer between 0 and 63.',
);
}
if (!hasOwn(media, 'syncAnimatedImageToWordAudio')) {
mapLegacy(
'syncAnimatedImageToWordAudio',
asBoolean,
(value) => {
context.resolved.ankiConnect.media.syncAnimatedImageToWordAudio = value;
},
context.resolved.ankiConnect.media.syncAnimatedImageToWordAudio,
'Expected boolean.',
);
}
if (!hasOwn(media, 'audioPadding')) {
mapLegacy(
'audioPadding',

View File

@@ -12,6 +12,12 @@ import {
getConfiguredWordFieldName,
getPreferredNoteFieldValue,
} from '../../anki-field-config.js';
import { resolveAnimatedImageLeadInSeconds } from '../../anki-integration/animated-image-sync.js';
type StatsServerNoteInfo = {
noteId: number;
fields: Record<string, { value: string }>;
};
function parseIntQuery(raw: string | undefined, fallback: number, maxLimit?: number): number {
if (raw === undefined) return fallback;
@@ -40,6 +46,20 @@ function parseEventTypesQuery(raw: string | undefined): number[] | undefined {
return parsed.length > 0 ? parsed : undefined;
}
function resolveStatsNoteFieldName(
noteInfo: StatsServerNoteInfo,
...preferredNames: (string | undefined)[]
): string | null {
for (const preferredName of preferredNames) {
if (!preferredName) continue;
const resolved = Object.keys(noteInfo.fields).find(
(fieldName) => fieldName.toLowerCase() === preferredName.toLowerCase(),
);
if (resolved) return resolved;
}
return null;
}
/** Load known words cache from disk into a Set. Returns null if unavailable. */
function loadKnownWordsSet(cachePath: string | undefined): Set<string> | null {
if (!cachePath || !existsSync(cachePath)) return null;
@@ -621,36 +641,41 @@ export function createStatsApp(
const generateAudio = ankiConfig.media?.generateAudio !== false;
const generateImage = ankiConfig.media?.generateImage !== false && mode !== 'audio';
const imageType = ankiConfig.media?.imageType ?? 'static';
const syncAnimatedImageToWordAudio =
imageType === 'avif' && ankiConfig.media?.syncAnimatedImageToWordAudio !== false;
const audioPromise = generateAudio
? mediaGen.generateAudio(sourcePath, startSec, clampedEndSec, audioPadding)
: Promise.resolve(null);
let imagePromise: Promise<Buffer | null>;
const createImagePromise = (animatedLeadInSeconds = 0): Promise<Buffer | null> => {
if (!generateImage) {
imagePromise = Promise.resolve(null);
} else if (imageType === 'avif') {
imagePromise = mediaGen.generateAnimatedImage(
sourcePath,
startSec,
clampedEndSec,
audioPadding,
{
return Promise.resolve(null);
}
if (imageType === 'avif') {
return mediaGen.generateAnimatedImage(sourcePath, startSec, clampedEndSec, audioPadding, {
fps: ankiConfig.media?.animatedFps ?? 10,
maxWidth: ankiConfig.media?.animatedMaxWidth ?? 640,
maxHeight: ankiConfig.media?.animatedMaxHeight,
crf: ankiConfig.media?.animatedCrf ?? 35,
},
);
} else {
leadingStillDuration: animatedLeadInSeconds,
});
}
const midpointSec = (startSec + clampedEndSec) / 2;
imagePromise = mediaGen.generateScreenshot(sourcePath, midpointSec, {
return mediaGen.generateScreenshot(sourcePath, midpointSec, {
format: ankiConfig.media?.imageFormat ?? 'jpg',
quality: ankiConfig.media?.imageQuality ?? 92,
maxWidth: ankiConfig.media?.imageMaxWidth,
maxHeight: ankiConfig.media?.imageMaxHeight,
});
}
};
const imagePromise =
mode === 'word' && syncAnimatedImageToWordAudio
? Promise.resolve<Buffer | null>(null)
: createImagePromise();
const errors: string[] = [];
let noteId: number;
@@ -677,12 +702,31 @@ export function createStatsApp(
noteId = yomitanResult.value;
const audioBuffer = audioResult.status === 'fulfilled' ? audioResult.value : null;
const imageBuffer = imageResult.status === 'fulfilled' ? imageResult.value : null;
if (audioResult.status === 'rejected')
errors.push(`audio: ${(audioResult.reason as Error).message}`);
if (imageResult.status === 'rejected')
errors.push(`image: ${(imageResult.reason as Error).message}`);
let imageBuffer = imageResult.status === 'fulfilled' ? imageResult.value : null;
if (syncAnimatedImageToWordAudio && generateImage) {
try {
const noteInfoResult = (await client.notesInfo([noteId])) as StatsServerNoteInfo[];
const noteInfo = noteInfoResult[0] ?? null;
const animatedLeadInSeconds = noteInfo
? await resolveAnimatedImageLeadInSeconds({
config: ankiConfig,
noteInfo,
resolveConfiguredFieldName: (candidateNoteInfo, ...preferredNames) =>
resolveStatsNoteFieldName(candidateNoteInfo, ...preferredNames),
retrieveMediaFileBase64: (filename) => client.retrieveMediaFile(filename),
})
: 0;
imageBuffer = await createImagePromise(animatedLeadInSeconds);
} catch (err) {
errors.push(`image: ${(err as Error).message}`);
}
}
const mediaFields: Record<string, string> = {};
const timestamp = Date.now();
const sentenceFieldName = ankiConfig.fields?.sentence ?? 'Sentence';

View File

@@ -17,18 +17,19 @@ test('computePriorityWindow returns next N cues from current position', () => {
const window = computePriorityWindow(cues, 12.0, 5);
assert.equal(window.length, 5);
// Position 12.0 falls during cue 2, so the window starts at cue 3 (startTime >= 12.0).
assert.equal(window[0]!.text, 'line-3');
assert.equal(window[4]!.text, 'line-7');
// Position 12.0 falls during cue 2, so the active cue should be warmed first.
assert.equal(window[0]!.text, 'line-2');
assert.equal(window[4]!.text, 'line-6');
});
test('computePriorityWindow clamps to remaining cues at end of file', () => {
const cues = makeCues(5);
const window = computePriorityWindow(cues, 18.0, 10);
// Position 18.0 is during cue 3 (start=15). Only cue 4 is ahead.
assert.equal(window.length, 1);
assert.equal(window[0]!.text, 'line-4');
// Position 18.0 is during cue 3 (start=15), so cue 3 and cue 4 remain.
assert.equal(window.length, 2);
assert.equal(window[0]!.text, 'line-3');
assert.equal(window[1]!.text, 'line-4');
});
test('computePriorityWindow returns empty when past all cues', () => {
@@ -45,6 +46,16 @@ test('computePriorityWindow at position 0 returns first N cues', () => {
assert.equal(window[0]!.text, 'line-0');
});
test('computePriorityWindow includes the active cue when current position is mid-line', () => {
const cues = makeCues(20);
const window = computePriorityWindow(cues, 18.0, 3);
assert.equal(window.length, 3);
assert.equal(window[0]!.text, 'line-3');
assert.equal(window[1]!.text, 'line-4');
assert.equal(window[2]!.text, 'line-5');
});
function flushMicrotasks(): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, 0));
}

View File

@@ -28,12 +28,12 @@ export function computePriorityWindow(
return [];
}
// Find the first cue whose start time is >= current position.
// This includes cues that start exactly at the current time (they haven't
// been displayed yet and should be prefetched).
// Find the first cue whose end time is after the current position.
// This includes the currently active cue when playback starts or seeks
// mid-line, while still skipping cues that have already finished.
let startIndex = -1;
for (let i = 0; i < cues.length; i += 1) {
if (cues[i]!.startTime >= currentTimeSeconds) {
if (cues[i]!.endTime > currentTimeSeconds) {
startIndex = i;
break;
}

View File
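The hunk above changes the window to start at the first cue whose end time is still ahead of the playhead, so a cue that is mid-display gets warmed instead of skipped. A standalone sketch of that rule (hypothetical minimal `Cue` shape, mirroring the diff's logic):

```typescript
// Hypothetical minimal cue shape for illustration.
interface Cue {
  startTime: number;
  endTime: number;
  text: string;
}

// The window starts at the first cue whose endTime is after the current
// position: the active cue is included, already-finished cues are skipped.
function computePriorityWindow(cues: Cue[], currentTimeSeconds: number, windowSize: number): Cue[] {
  let startIndex = -1;
  for (let i = 0; i < cues.length; i += 1) {
    if (cues[i]!.endTime > currentTimeSeconds) {
      startIndex = i;
      break;
    }
  }
  if (startIndex === -1) return [];
  return cues.slice(startIndex, startIndex + windowSize);
}

// Five 5-second cues: cue N spans [5N, 5N+5).
const cues: Cue[] = Array.from({ length: 5 }, (_, i) => ({
  startTime: i * 5,
  endTime: i * 5 + 5,
  text: `line-${i}`,
}));
console.log(computePriorityWindow(cues, 18.0, 10).map((c) => c.text).join(','));
// → line-3,line-4
```

This reproduces the updated test expectation above: position 18.0 falls during cue 3, so both cue 3 and cue 4 remain in the window.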

@@ -190,6 +190,48 @@ test('preCacheTokenization stores entry that is returned on next subtitle change
assert.deepEqual(emitted, [{ text: '予め', tokens: [] }]);
});
test('preCacheTokenization reuses normalized subtitle text across ASS linebreak variants', async () => {
const emitted: SubtitleData[] = [];
let tokenizeCalls = 0;
const controller = createSubtitleProcessingController({
tokenizeSubtitle: async (text) => {
tokenizeCalls += 1;
return { text, tokens: [] };
},
emitSubtitle: (payload) => emitted.push(payload),
});
controller.preCacheTokenization('一行目\\N二行目', { text: '一行目\n二行目', tokens: [] });
controller.onSubtitleChange('一行目\n二行目');
await flushMicrotasks();
assert.equal(tokenizeCalls, 0, 'should not call tokenize when normalized text matches');
assert.deepEqual(emitted, [{ text: '一行目\n二行目', tokens: [] }]);
});
test('consumeCachedSubtitle returns prefetched payload and prevents reprocessing same line', async () => {
const emitted: SubtitleData[] = [];
let tokenizeCalls = 0;
const controller = createSubtitleProcessingController({
tokenizeSubtitle: async (text) => {
tokenizeCalls += 1;
return { text, tokens: [] };
},
emitSubtitle: (payload) => emitted.push(payload),
});
controller.preCacheTokenization('猫\\Nです', { text: '猫\nです', tokens: [] });
const immediate = controller.consumeCachedSubtitle('猫\nです');
assert.deepEqual(immediate, { text: '猫\nです', tokens: [] });
controller.onSubtitleChange('猫\nです');
await flushMicrotasks();
assert.equal(tokenizeCalls, 0, 'same cached subtitle should not reprocess after immediate consume');
assert.deepEqual(emitted, []);
});
test('isCacheFull returns false when cache is below limit', () => {
const controller = createSubtitleProcessingController({
tokenizeSubtitle: async (text) => ({ text, tokens: null }),

View File

@@ -12,9 +12,14 @@ export interface SubtitleProcessingController {
refreshCurrentSubtitle: (textOverride?: string) => void;
invalidateTokenizationCache: () => void;
preCacheTokenization: (text: string, data: SubtitleData) => void;
consumeCachedSubtitle: (text: string) => SubtitleData | null;
isCacheFull: () => boolean;
}
function normalizeSubtitleCacheKey(text: string): string {
return text.replace(/\r\n/g, '\n').replace(/\\N/g, '\n').replace(/\\n/g, '\n').trim();
}
export function createSubtitleProcessingController(
deps: SubtitleProcessingControllerDeps,
): SubtitleProcessingController {
@@ -28,18 +33,19 @@ export function createSubtitleProcessingController(
const now = deps.now ?? (() => Date.now());
const getCachedTokenization = (text: string): SubtitleData | null => {
const cached = tokenizationCache.get(text);
const cacheKey = normalizeSubtitleCacheKey(text);
const cached = tokenizationCache.get(cacheKey);
if (!cached) {
return null;
}
tokenizationCache.delete(text);
tokenizationCache.set(text, cached);
tokenizationCache.delete(cacheKey);
tokenizationCache.set(cacheKey, cached);
return cached;
};
const setCachedTokenization = (text: string, payload: SubtitleData): void => {
tokenizationCache.set(text, payload);
tokenizationCache.set(normalizeSubtitleCacheKey(text), payload);
while (tokenizationCache.size > SUBTITLE_TOKENIZATION_CACHE_LIMIT) {
const firstKey = tokenizationCache.keys().next().value;
if (firstKey !== undefined) {
@@ -135,6 +141,17 @@ export function createSubtitleProcessingController(
preCacheTokenization: (text: string, data: SubtitleData) => {
setCachedTokenization(text, data);
},
consumeCachedSubtitle: (text: string) => {
const cached = getCachedTokenization(text);
if (!cached) {
return null;
}
latestText = text;
lastEmittedText = text;
refreshRequested = false;
return cached;
},
isCacheFull: () => {
return tokenizationCache.size >= SUBTITLE_TOKENIZATION_CACHE_LIMIT;
},

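The `normalizeSubtitleCacheKey` helper added in this hunk collapses CRLF and ASS-style `\N`/`\n` escape sequences into real newlines before keying the cache, which is what lets a line prefetched from an ASS track hit the cache when mpv later emits it with plain newlines. Exercised standalone (regex chain copied from the hunk):

```typescript
// CRLF and ASS linebreak escapes (\N, \n) all collapse to a real newline,
// and surrounding whitespace is trimmed, so equivalent subtitle text from
// different sources maps to the same cache key.
function normalizeSubtitleCacheKey(text: string): string {
  return text.replace(/\r\n/g, '\n').replace(/\\N/g, '\n').replace(/\\n/g, '\n').trim();
}

// ASS-escaped and plain-newline variants of the same line produce one key.
console.log(normalizeSubtitleCacheKey('一行目\\N二行目') === normalizeSubtitleCacheKey('一行目\n二行目'));
// → true
```

Normalizing at both write (`setCachedTokenization`) and read (`getCachedTokenization`) time, as the diff does, keeps the cache agnostic to which variant arrives first.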
View File

@@ -130,6 +130,30 @@ test('serializeSubtitleMarkup preserves tooltip attrs and name-match precedence'
assert.doesNotMatch(markup, /data-frequency-rank="12"|data-jlpt-level="N5"|word-jlpt-n5/);
});
test('serializeSubtitleMarkup keeps filtered tokens hoverable without annotation attrs', () => {
const payload: SubtitleData = {
text: 'は',
tokens: [
{
surface: 'は',
reading: 'は',
headword: 'は',
startPos: 0,
endPos: 1,
partOfSpeech: PartOfSpeech.particle,
pos1: '助詞',
isMerged: false,
isKnown: false,
isNPlusOneTarget: false,
isNameMatch: false,
},
],
};
const markup = serializeSubtitleMarkup(payload, frequencyOptions);
assert.equal(markup, '<span class="word" data-reading="は" data-headword="は">は</span>');
});
test('serializeSubtitleWebsocketMessage emits sentence payload', () => {
const payload: SubtitleData = {
text: '字幕',

View File

@@ -1,23 +1,72 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { injectTexthookerBootstrapHtml } from './texthooker';
import { injectTexthookerBootstrapHtml, type TexthookerBootstrapSettings } from './texthooker';
test('injectTexthookerBootstrapHtml injects websocket bootstrap before head close', () => {
const html = '<html><head><title>Texthooker</title></head><body></body></html>';
const actual = injectTexthookerBootstrapHtml(html, 'ws://127.0.0.1:6678');
const settings: TexthookerBootstrapSettings = {
enableKnownWordColoring: true,
enableNPlusOneColoring: true,
enableNameMatchColoring: true,
enableFrequencyColoring: true,
enableJlptColoring: true,
characterDictionaryEnabled: true,
knownWordColor: '#a6da95',
nPlusOneColor: '#c6a0f6',
nameMatchColor: '#f5bde6',
hoverTokenColor: '#f4dbd6',
hoverTokenBackgroundColor: 'rgba(54, 58, 79, 0.84)',
jlptColors: {
N1: '#ed8796',
N2: '#f5a97f',
N3: '#f9e2af',
N4: '#a6e3a1',
N5: '#8aadf4',
},
frequencyDictionary: {
singleColor: '#f5a97f',
bandedColors: ['#ed8796', '#f5a97f', '#f9e2af', '#8bd5ca', '#8aadf4'],
},
};
const actual = injectTexthookerBootstrapHtml(html, 'ws://127.0.0.1:6678', settings);
assert.match(
actual,
/window\.localStorage\.setItem\('bannou-texthooker-websocketUrl', "ws:\/\/127\.0\.0\.1:6678"\)/,
);
assert.match(
actual,
/window\.localStorage\.setItem\('bannou-texthooker-enableKnownWordColoring', "1"\)/,
);
assert.match(
actual,
/window\.localStorage\.setItem\('bannou-texthooker-enableNPlusOneColoring', "1"\)/,
);
assert.match(
actual,
/window\.localStorage\.setItem\('bannou-texthooker-enableNameMatchColoring', "1"\)/,
);
assert.match(
actual,
/window\.localStorage\.setItem\('bannou-texthooker-enableFrequencyColoring', "1"\)/,
);
assert.match(
actual,
/window\.localStorage\.setItem\('bannou-texthooker-enableJlptColoring', "1"\)/,
);
assert.match(
actual,
/window\.localStorage\.setItem\('bannou-texthooker-characterDictionaryEnabled', "1"\)/,
);
assert.match(actual, /--subminer-known-word-color:\s*#a6da95;/);
assert.match(actual, /--subminer-n-plus-one-color:\s*#c6a0f6;/);
assert.match(actual, /--subminer-name-match-color:\s*#f5bde6;/);
assert.match(actual, /--subminer-jlpt-n1-color:\s*#ed8796;/);
assert.match(actual, /--subminer-frequency-band-4-color:\s*#8bd5ca;/);
assert.match(actual, /--sm-token-hover-bg:\s*rgba\(54, 58, 79, 0\.84\);/);
assert.doesNotMatch(actual, /p \.word\.word-known\s*\{/);
assert.ok(actual.indexOf('</script></head>') !== -1);
assert.ok(actual.includes('bannou-texthooker-websocketUrl'));
assert.ok(!actual.includes('bannou-texthooker-enableKnownWordColoring'));
assert.ok(!actual.includes('bannou-texthooker-enableNPlusOneColoring'));
assert.ok(!actual.includes('bannou-texthooker-enableNameMatchColoring'));
assert.ok(!actual.includes('bannou-texthooker-enableFrequencyColoring'));
assert.ok(!actual.includes('bannou-texthooker-enableJlptColoring'));
});
test('injectTexthookerBootstrapHtml leaves html unchanged without websocketUrl', () => {

View File

@@ -5,23 +5,92 @@ import { createLogger } from '../../logger';
const logger = createLogger('main:texthooker');
export function injectTexthookerBootstrapHtml(html: string, websocketUrl?: string): string {
if (!websocketUrl) {
export type TexthookerBootstrapSettings = {
enableKnownWordColoring: boolean;
enableNPlusOneColoring: boolean;
enableNameMatchColoring: boolean;
enableFrequencyColoring: boolean;
enableJlptColoring: boolean;
characterDictionaryEnabled: boolean;
knownWordColor: string;
nPlusOneColor: string;
nameMatchColor: string;
hoverTokenColor: string;
hoverTokenBackgroundColor: string;
jlptColors: {
N1: string;
N2: string;
N3: string;
N4: string;
N5: string;
};
frequencyDictionary: {
singleColor: string;
bandedColors: readonly [string, string, string, string, string];
};
};
function buildTexthookerBootstrapScript(
websocketUrl?: string,
settings?: TexthookerBootstrapSettings,
): string {
const statements: string[] = [];
if (websocketUrl) {
statements.push(
`window.localStorage.setItem('bannou-texthooker-websocketUrl', ${JSON.stringify(websocketUrl)});`,
);
}
if (settings) {
const booleanStorageValue = (enabled: boolean): '"1"' | '"0"' => (enabled ? '"1"' : '"0"');
statements.push(
`window.localStorage.setItem('bannou-texthooker-enableKnownWordColoring', ${booleanStorageValue(settings.enableKnownWordColoring)});`,
`window.localStorage.setItem('bannou-texthooker-enableNPlusOneColoring', ${booleanStorageValue(settings.enableNPlusOneColoring)});`,
`window.localStorage.setItem('bannou-texthooker-enableNameMatchColoring', ${booleanStorageValue(settings.enableNameMatchColoring)});`,
`window.localStorage.setItem('bannou-texthooker-enableFrequencyColoring', ${booleanStorageValue(settings.enableFrequencyColoring)});`,
`window.localStorage.setItem('bannou-texthooker-enableJlptColoring', ${booleanStorageValue(settings.enableJlptColoring)});`,
`window.localStorage.setItem('bannou-texthooker-characterDictionaryEnabled', ${booleanStorageValue(settings.characterDictionaryEnabled)});`,
);
}
return statements.length > 0 ? `<script>${statements.join('')}</script>` : '';
}
function buildTexthookerBootstrapStyle(settings?: TexthookerBootstrapSettings): string {
if (!settings) {
return '';
}
const [band1, band2, band3, band4, band5] = settings.frequencyDictionary.bandedColors;
return `<style id="subminer-texthooker-bootstrap-style">:root{--subminer-known-word-color:${settings.knownWordColor};--subminer-n-plus-one-color:${settings.nPlusOneColor};--subminer-name-match-color:${settings.nameMatchColor};--subminer-jlpt-n1-color:${settings.jlptColors.N1};--subminer-jlpt-n2-color:${settings.jlptColors.N2};--subminer-jlpt-n3-color:${settings.jlptColors.N3};--subminer-jlpt-n4-color:${settings.jlptColors.N4};--subminer-jlpt-n5-color:${settings.jlptColors.N5};--subminer-frequency-single-color:${settings.frequencyDictionary.singleColor};--subminer-frequency-band-1-color:${band1};--subminer-frequency-band-2-color:${band2};--subminer-frequency-band-3-color:${band3};--subminer-frequency-band-4-color:${band4};--subminer-frequency-band-5-color:${band5};--sm-token-hover-bg:${settings.hoverTokenBackgroundColor};--sm-token-hover-text:${settings.hoverTokenColor};}</style>`;
}
export function injectTexthookerBootstrapHtml(
html: string,
websocketUrl?: string,
settings?: TexthookerBootstrapSettings,
): string {
const bootstrapStyle = buildTexthookerBootstrapStyle(settings);
const bootstrapScript = buildTexthookerBootstrapScript(websocketUrl, settings);
if (!bootstrapStyle && !bootstrapScript) {
return html;
}
const bootstrapScript = `<script>window.localStorage.setItem('bannou-texthooker-websocketUrl', ${JSON.stringify(
websocketUrl,
)});</script>`;
if (html.includes('</head>')) {
return html.replace('</head>', `${bootstrapScript}</head>`);
return html.replace('</head>', `${bootstrapStyle}${bootstrapScript}</head>`);
}
return `${bootstrapScript}${html}`;
return `${bootstrapStyle}${bootstrapScript}${html}`;
}
export class Texthooker {
constructor(
private readonly getBootstrapSettings?: () => TexthookerBootstrapSettings | undefined,
) {}
private server: http.Server | null = null;
public isRunning(): boolean {
@@ -62,9 +131,16 @@ export class Texthooker {
res.end('Not found');
return;
}
const bootstrapSettings = this.getBootstrapSettings?.();
const responseData =
urlPath === '/' || urlPath === '/index.html'
? Buffer.from(injectTexthookerBootstrapHtml(data.toString('utf-8'), websocketUrl))
? Buffer.from(
injectTexthookerBootstrapHtml(
data.toString('utf-8'),
websocketUrl,
bootstrapSettings,
),
)
: data;
res.writeHead(200, { 'Content-Type': mimeTypes[ext] || 'text/plain' });
res.end(responseData);

View File

@@ -1305,7 +1305,7 @@ test('tokenizeSubtitle ignores frequency lookup failures', async () => {
assert.equal(result.tokens?.[0]?.frequencyRank, undefined);
});
test('tokenizeSubtitle skips frequency rank when Yomitan token is enriched as particle by mecab pos1', async () => {
test('tokenizeSubtitle keeps standalone particle token hoverable while clearing annotation metadata', async () => {
const result = await tokenizeSubtitle(
'は',
makeDeps({
@@ -1350,9 +1350,33 @@ test('tokenizeSubtitle skips frequency rank when Yomitan token is enriched as pa
}),
);
assert.equal(result.tokens?.length, 1);
assert.equal(result.tokens?.[0]?.pos1, '助詞');
assert.equal(result.tokens?.[0]?.frequencyRank, undefined);
assert.equal(result.text, 'は');
assert.deepEqual(
result.tokens?.map((token) => ({
surface: token.surface,
reading: token.reading,
headword: token.headword,
pos1: token.pos1,
isKnown: token.isKnown,
isNPlusOneTarget: token.isNPlusOneTarget,
isNameMatch: token.isNameMatch,
jlptLevel: token.jlptLevel,
frequencyRank: token.frequencyRank,
})),
[
{
surface: 'は',
reading: 'は',
headword: 'は',
pos1: '助詞',
isKnown: false,
isNPlusOneTarget: false,
isNameMatch: false,
jlptLevel: undefined,
frequencyRank: undefined,
},
],
);
});
test('tokenizeSubtitle keeps frequency rank when mecab tags classify token as content-bearing', async () => {
@@ -1460,7 +1484,7 @@ test('tokenizeSubtitle skips JLPT level for excluded demonstratives', async () =
assert.equal(result.tokens?.[0]?.jlptLevel, undefined);
});
test('tokenizeSubtitle excludes repeated kana interjections from annotation payloads entirely', async () => {
test('tokenizeSubtitle keeps repeated kana interjections tokenized while clearing annotation metadata', async () => {
const result = await tokenizeSubtitle(
'ああ',
makeDeps({
@@ -1491,7 +1515,29 @@ test('tokenizeSubtitle excludes repeated kana interjections from annotation payl
}),
);
assert.deepEqual(result, { text: 'ああ', tokens: null });
assert.equal(result.text, 'ああ');
assert.deepEqual(
result.tokens?.map((token) => ({
surface: token.surface,
headword: token.headword,
reading: token.reading,
jlptLevel: token.jlptLevel,
frequencyRank: token.frequencyRank,
isKnown: token.isKnown,
isNPlusOneTarget: token.isNPlusOneTarget,
})),
[
{
surface: 'ああ',
headword: 'ああ',
reading: 'ああ',
jlptLevel: undefined,
frequencyRank: undefined,
isKnown: false,
isNPlusOneTarget: false,
},
],
);
});
test('tokenizeSubtitle assigns JLPT level to Yomitan tokens', async () => {
@@ -2578,7 +2624,15 @@ test('tokenizeSubtitle keeps correct MeCab pos1 enrichment when Yomitan offsets
const gaToken = result.tokens?.find((token) => token.surface === 'が');
const desuToken = result.tokens?.find((token) => token.surface === 'です');
assert.equal(gaToken?.pos1, '助詞');
assert.equal(gaToken?.isKnown, false);
assert.equal(gaToken?.isNPlusOneTarget, false);
assert.equal(gaToken?.jlptLevel, undefined);
assert.equal(gaToken?.frequencyRank, undefined);
assert.equal(desuToken?.pos1, '助動詞');
assert.equal(desuToken?.isKnown, false);
assert.equal(desuToken?.isNPlusOneTarget, false);
assert.equal(desuToken?.jlptLevel, undefined);
assert.equal(desuToken?.frequencyRank, undefined);
assert.equal(targets.length, 1);
assert.equal(targets[0]?.surface, '仮面');
});
@@ -3056,7 +3110,7 @@ test('tokenizeSubtitle excludes default non-independent pos2 from N+1 and freque
assert.equal(result.tokens?.[0]?.isNPlusOneTarget, false);
});
test('tokenizeSubtitle excludes mecab-tagged interjections from annotation payloads entirely', async () => {
test('tokenizeSubtitle keeps mecab-tagged interjections tokenized while clearing annotation metadata', async () => {
const result = await tokenizeSubtitle(
'ぐはっ',
makeDepsFromYomitanTokens([{ surface: 'ぐはっ', reading: 'ぐはっ', headword: 'ぐはっ' }], {
@@ -3080,10 +3134,34 @@ test('tokenizeSubtitle excludes mecab-tagged interjections from annotation paylo
}),
);
assert.deepEqual(result, { text: 'ぐはっ', tokens: null });
assert.equal(result.text, 'ぐはっ');
assert.deepEqual(
result.tokens?.map((token) => ({
surface: token.surface,
headword: token.headword,
reading: token.reading,
pos1: token.pos1,
jlptLevel: token.jlptLevel,
frequencyRank: token.frequencyRank,
isKnown: token.isKnown,
isNPlusOneTarget: token.isNPlusOneTarget,
})),
[
{
surface: 'ぐはっ',
headword: 'ぐはっ',
reading: 'ぐはっ',
pos1: '感動詞',
jlptLevel: undefined,
frequencyRank: undefined,
isKnown: false,
isNPlusOneTarget: false,
},
],
);
});
test('tokenizeSubtitle keeps visible text while excluding interjections from mixed annotation payloads', async () => {
test('tokenizeSubtitle keeps excluded interjections hoverable while clearing only their annotation metadata', async () => {
const result = await tokenizeSubtitle(
'ぐはっ 猫',
makeDeps({
@@ -3147,8 +3225,261 @@ test('tokenizeSubtitle keeps visible text while excluding interjections from mix
result.tokens?.map((token) => ({
surface: token.surface,
headword: token.headword,
frequencyRank: token.frequencyRank,
jlptLevel: token.jlptLevel,
})),
[{ surface: '猫', headword: '猫' }],
[
{ surface: 'ぐはっ', headword: 'ぐはっ', frequencyRank: undefined, jlptLevel: undefined },
{ surface: '猫', headword: '猫', frequencyRank: 11, jlptLevel: 'N5' },
],
);
});
test('tokenizeSubtitle keeps explanatory ending variants hoverable while clearing only their annotation metadata', async () => {
const result = await tokenizeSubtitle(
'猫んです',
makeDepsFromYomitanTokens(
[
{ surface: '猫', reading: 'ねこ', headword: '猫' },
{ surface: 'んです', reading: 'んです', headword: 'ん' },
],
{
getFrequencyDictionaryEnabled: () => true,
getFrequencyRank: (text) => (text === '猫' ? 11 : 500),
getJlptLevel: (text) => (text === '猫' ? 'N5' : null),
tokenizeWithMecab: async () => [
{
headword: '猫',
surface: '猫',
reading: 'ネコ',
startPos: 0,
endPos: 1,
partOfSpeech: PartOfSpeech.noun,
pos1: '名詞',
pos2: '一般',
isMerged: false,
isKnown: false,
isNPlusOneTarget: false,
},
{
headword: 'ん',
surface: 'ん',
reading: 'ン',
startPos: 1,
endPos: 2,
partOfSpeech: PartOfSpeech.other,
pos1: '名詞',
pos2: '非自立',
isMerged: false,
isKnown: false,
isNPlusOneTarget: false,
},
{
headword: 'です',
surface: 'です',
reading: 'デス',
startPos: 2,
endPos: 4,
partOfSpeech: PartOfSpeech.bound_auxiliary,
pos1: '助動詞',
isMerged: false,
isKnown: false,
isNPlusOneTarget: false,
},
],
},
),
);
assert.equal(result.text, '猫んです');
assert.deepEqual(
result.tokens?.map((token) => ({
surface: token.surface,
headword: token.headword,
jlptLevel: token.jlptLevel,
frequencyRank: token.frequencyRank,
})),
[
{ surface: '猫', headword: '猫', jlptLevel: 'N5', frequencyRank: 11 },
{ surface: 'んです', headword: 'ん', jlptLevel: undefined, frequencyRank: undefined },
],
);
});
test('tokenizeSubtitle keeps standalone grammar-only tokens hoverable while clearing only their annotation metadata', async () => {
const result = await tokenizeSubtitle(
'私はこの猫です',
makeDeps({
getFrequencyDictionaryEnabled: () => true,
getFrequencyRank: (text) => (text === '私' ? 50 : text === '猫' ? 11 : 500),
getJlptLevel: (text) => (text === '私' ? 'N5' : text === '猫' ? 'N5' : null),
getYomitanExt: () => ({ id: 'dummy-ext' }) as any,
getYomitanParserWindow: () =>
({
isDestroyed: () => false,
webContents: {
executeJavaScript: async (script: string) => {
if (script.includes('getTermFrequencies')) {
return [];
}
return [
{
source: 'scanning-parser',
index: 0,
content: [
[{ text: '私', reading: 'わたし', headwords: [[{ term: '私' }]] }],
[{ text: 'は', reading: 'は', headwords: [[{ term: 'は' }]] }],
[{ text: 'この', reading: 'この', headwords: [[{ term: 'この' }]] }],
[{ text: '猫', reading: 'ねこ', headwords: [[{ term: '猫' }]] }],
[{ text: 'です', reading: 'です', headwords: [[{ term: 'です' }]] }],
],
},
];
},
},
}) as unknown as Electron.BrowserWindow,
tokenizeWithMecab: async () => [
{
headword: '私',
surface: '私',
reading: 'ワタシ',
startPos: 0,
endPos: 1,
partOfSpeech: PartOfSpeech.noun,
pos1: '名詞',
pos2: '代名詞',
isMerged: true,
isKnown: false,
isNPlusOneTarget: false,
},
{
headword: 'は',
surface: 'は',
reading: 'ハ',
startPos: 1,
endPos: 2,
partOfSpeech: PartOfSpeech.particle,
pos1: '助詞',
pos2: '係助詞',
isMerged: true,
isKnown: false,
isNPlusOneTarget: false,
},
{
headword: 'この',
surface: 'この',
reading: 'コノ',
startPos: 2,
endPos: 4,
partOfSpeech: PartOfSpeech.other,
pos1: '連体詞',
isMerged: true,
isKnown: false,
isNPlusOneTarget: false,
},
{
headword: '猫',
surface: '猫',
reading: 'ネコ',
startPos: 4,
endPos: 5,
partOfSpeech: PartOfSpeech.noun,
pos1: '名詞',
pos2: '一般',
isMerged: true,
isKnown: false,
isNPlusOneTarget: false,
},
{
headword: 'です',
surface: 'です',
reading: 'デス',
startPos: 5,
endPos: 7,
partOfSpeech: PartOfSpeech.bound_auxiliary,
pos1: '助動詞',
isMerged: true,
isKnown: false,
isNPlusOneTarget: false,
},
],
}),
);
assert.equal(result.text, '私はこの猫です');
assert.deepEqual(
result.tokens?.map((token) => ({
surface: token.surface,
headword: token.headword,
frequencyRank: token.frequencyRank,
jlptLevel: token.jlptLevel,
})),
[
{ surface: '私', headword: '私', frequencyRank: 50, jlptLevel: 'N5' },
{ surface: 'は', headword: 'は', frequencyRank: undefined, jlptLevel: undefined },
{ surface: 'この', headword: 'この', frequencyRank: undefined, jlptLevel: undefined },
{ surface: '猫', headword: '猫', frequencyRank: 11, jlptLevel: 'N5' },
{ surface: 'です', headword: 'です', frequencyRank: undefined, jlptLevel: undefined },
],
);
});
test('tokenizeSubtitle keeps trailing quote-particle merged tokens hoverable while clearing only their annotation metadata', async () => {
const result = await tokenizeSubtitle(
'どうしてもって',
makeDepsFromYomitanTokens([{ surface: 'どうしてもって', reading: 'どうしてもって', headword: 'どうしても' }], {
getFrequencyDictionaryEnabled: () => true,
getFrequencyRank: (text) => (text === 'どうしても' ? 123 : null),
getJlptLevel: (text) => (text === 'どうしても' ? 'N3' : null),
tokenizeWithMecab: async () => [
{
headword: 'どうしても',
surface: 'どうしても',
reading: 'ドウシテモ',
startPos: 0,
endPos: 5,
partOfSpeech: PartOfSpeech.other,
pos1: '副詞',
pos2: '一般',
isMerged: false,
isKnown: false,
isNPlusOneTarget: false,
},
{
headword: 'って',
surface: 'って',
reading: 'ッテ',
startPos: 5,
endPos: 7,
partOfSpeech: PartOfSpeech.particle,
pos1: '助詞',
pos2: '格助詞',
isMerged: false,
isKnown: false,
isNPlusOneTarget: false,
},
],
getMinSentenceWordsForNPlusOne: () => 1,
}),
);
assert.equal(result.text, 'どうしてもって');
assert.deepEqual(
result.tokens?.map((token) => ({
surface: token.surface,
headword: token.headword,
jlptLevel: token.jlptLevel,
frequencyRank: token.frequencyRank,
})),
[
{
surface: 'どうしてもって',
headword: 'どうしても',
jlptLevel: undefined,
frequencyRank: undefined,
},
],
);
});


@@ -178,7 +178,7 @@ async function applyAnnotationStage(
);
}
async function filterSubtitleAnnotationTokens(tokens: MergedToken[]): Promise<MergedToken[]> {
async function stripSubtitleAnnotationMetadata(tokens: MergedToken[]): Promise<MergedToken[]> {
if (tokens.length === 0) {
return tokens;
}
@@ -188,9 +188,7 @@ async function filterSubtitleAnnotationTokens(tokens: MergedToken[]): Promise<Me
}
const annotationStage = await annotationStageModulePromise;
return tokens.filter(
(token) => !annotationStage.shouldExcludeTokenFromSubtitleAnnotations(token),
);
return tokens.map((token) => annotationStage.stripSubtitleAnnotationMetadata(token));
}
export function createTokenizerDepsRuntime(
@@ -721,12 +719,12 @@ export async function tokenizeSubtitle(
const yomitanTokens = await parseWithYomitanInternalParser(tokenizeText, deps, annotationOptions);
if (yomitanTokens && yomitanTokens.length > 0) {
const filteredTokens = await filterSubtitleAnnotationTokens(
const annotatedTokens = await stripSubtitleAnnotationMetadata(
await applyAnnotationStage(yomitanTokens, deps, annotationOptions),
);
return {
text: displayText,
tokens: filteredTokens.length > 0 ? filteredTokens : null,
tokens: annotatedTokens.length > 0 ? annotatedTokens : null,
};
}
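The behavioral change in this hunk — excluded tokens are now kept and demoted instead of dropped — can be sketched with a toy token shape (the `Tok` type and helper names below are illustrative, not the project's real `MergedToken` or exclusion rules):

```typescript
// Toy model: the old code filtered excluded tokens out entirely; the new code
// keeps them (so they stay hoverable) but clears their annotation metadata.
type Tok = { surface: string; jlptLevel?: string; frequencyRank?: number; isKnown: boolean };

// Stand-in exclusion rule for the sketch only.
const isGrammarOnly = (t: Tok): boolean => ['は', 'です'].includes(t.surface);

const strip = (t: Tok): Tok =>
  isGrammarOnly(t) ? { ...t, jlptLevel: undefined, frequencyRank: undefined, isKnown: false } : t;

const tokens: Tok[] = [
  { surface: '猫', jlptLevel: 'N5', frequencyRank: 11, isKnown: true },
  { surface: 'です', jlptLevel: 'N5', frequencyRank: 3, isKnown: true },
];

// Old: tokens.filter((t) => !isGrammarOnly(t)) -> 1 token; 'です' loses hover entirely.
// New: tokens.map(strip) -> 2 tokens; 'です' stays hoverable, annotations cleared.
const annotated = tokens.map(strip);
```

Mapping instead of filtering is what lets the renderer keep every subtitle token interactive while only the coloring/annotation payload changes.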


@@ -1,7 +1,12 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { MergedToken, PartOfSpeech } from '../../../types';
import { annotateTokens, AnnotationStageDeps } from './annotation-stage';
import {
annotateTokens,
AnnotationStageDeps,
shouldExcludeTokenFromSubtitleAnnotations,
stripSubtitleAnnotationMetadata,
} from './annotation-stage';
function makeToken(overrides: Partial<MergedToken> = {}): MergedToken {
return {
@@ -150,6 +155,170 @@ test('annotateTokens handles JLPT disabled and eligibility exclusion paths', ()
assert.equal(excludedLookupCalls, 0);
});
test('shouldExcludeTokenFromSubtitleAnnotations excludes explanatory ending variants', () => {
const tokens = [
makeToken({
surface: 'んです',
headword: 'ん',
reading: 'ンデス',
pos1: '名詞|助動詞',
pos2: '非自立',
}),
makeToken({
surface: 'のだ',
headword: 'の',
reading: 'ノダ',
pos1: '名詞|助動詞',
pos2: '非自立',
}),
makeToken({
surface: 'んだ',
headword: 'ん',
reading: 'ンダ',
pos1: '名詞|助動詞',
pos2: '非自立',
}),
makeToken({
surface: 'のです',
headword: 'の',
reading: 'ノデス',
pos1: '名詞|助動詞',
pos2: '非自立',
}),
makeToken({
surface: 'なんです',
headword: 'だ',
reading: 'ナンデス',
pos1: '助動詞|名詞|助動詞',
pos2: '|非自立',
}),
makeToken({
surface: 'んでした',
headword: 'ん',
reading: 'ンデシタ',
pos1: '助動詞|助動詞|助動詞',
}),
makeToken({
surface: 'のでは',
headword: 'の',
reading: 'ノデハ',
pos1: '助詞|接続詞',
}),
];
for (const token of tokens) {
assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), true, token.surface);
}
});
test('shouldExcludeTokenFromSubtitleAnnotations keeps lexical tokens outside explanatory ending family', () => {
const token = makeToken({
surface: '問題',
headword: '問題',
reading: 'モンダイ',
partOfSpeech: PartOfSpeech.noun,
pos1: '名詞',
pos2: '一般',
});
assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), false);
});
test('shouldExcludeTokenFromSubtitleAnnotations excludes standalone particles auxiliaries and adnominals', () => {
const tokens = [
makeToken({
surface: 'は',
headword: 'は',
reading: 'ハ',
partOfSpeech: PartOfSpeech.particle,
pos1: '助詞',
}),
makeToken({
surface: 'です',
headword: 'です',
reading: 'デス',
partOfSpeech: PartOfSpeech.bound_auxiliary,
pos1: '助動詞',
}),
makeToken({
surface: 'この',
headword: 'この',
reading: 'コノ',
partOfSpeech: PartOfSpeech.other,
pos1: '連体詞',
}),
];
for (const token of tokens) {
assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), true, token.surface);
}
});
test('shouldExcludeTokenFromSubtitleAnnotations keeps mixed content tokens with trailing helpers', () => {
const token = makeToken({
surface: '行きます',
headword: '行く',
reading: 'イキマス',
partOfSpeech: PartOfSpeech.verb,
pos1: '動詞|助動詞',
pos2: '自立',
});
assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), false);
});
test('shouldExcludeTokenFromSubtitleAnnotations excludes merged lexical tokens with trailing quote particles', () => {
const token = makeToken({
surface: 'どうしてもって',
headword: 'どうしても',
reading: 'ドウシテモッテ',
partOfSpeech: PartOfSpeech.other,
pos1: '副詞|助詞',
pos2: '一般|格助詞',
});
assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), true);
});
test('stripSubtitleAnnotationMetadata keeps token hover data while clearing annotation fields', () => {
const token = makeToken({
surface: 'は',
headword: 'は',
reading: 'ハ',
partOfSpeech: PartOfSpeech.particle,
pos1: '助詞',
isKnown: true,
isNPlusOneTarget: true,
isNameMatch: true,
jlptLevel: 'N5',
frequencyRank: 12,
});
assert.deepEqual(stripSubtitleAnnotationMetadata(token), {
...token,
isKnown: false,
isNPlusOneTarget: false,
isNameMatch: false,
jlptLevel: undefined,
frequencyRank: undefined,
});
});
test('stripSubtitleAnnotationMetadata leaves content tokens unchanged', () => {
const token = makeToken({
surface: '猫',
headword: '猫',
reading: 'ネコ',
partOfSpeech: PartOfSpeech.noun,
pos1: '名詞',
isKnown: true,
jlptLevel: 'N5',
frequencyRank: 42,
});
assert.strictEqual(stripSubtitleAnnotationMetadata(token), token);
});
test('annotateTokens prioritizes name matches over n+1, frequency, and JLPT when enabled', () => {
let jlptLookupCalls = 0;
const tokens = [


@@ -25,6 +25,45 @@ const SUBTITLE_ANNOTATION_EXCLUDED_TERMS = new Set([
'ふう',
'ほう',
]);
const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_PREFIXES = ['ん', 'の', 'なん', 'なの'];
const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_CORES = [
'だ',
'です',
'でした',
'だった',
'では',
'じゃ',
'でしょう',
'だろう',
] as const;
const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_TRAILING_PARTICLES = [
'',
'か',
'ね',
'よ',
'な',
'よね',
'かな',
'かね',
] as const;
const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDINGS = new Set(
SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_PREFIXES.flatMap((prefix) =>
SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_CORES.flatMap((core) =>
SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_TRAILING_PARTICLES.map(
(particle) => `${prefix}${core}${particle}`,
),
),
),
);
const SUBTITLE_ANNOTATION_EXCLUDED_TRAILING_PARTICLE_SUFFIXES = new Set([
'って',
'ってよ',
'ってね',
'ってな',
'ってさ',
'ってか',
'ってば',
]);
const jlptLevelLookupCaches = new WeakMap<
(text: string) => JlptLevel | null,
@@ -60,6 +99,7 @@ function normalizePos1Tag(pos1: string | undefined): string {
}
const SUBTITLE_ANNOTATION_EXCLUDED_POS1 = new Set(['感動詞']);
const SUBTITLE_ANNOTATION_GRAMMAR_ONLY_POS1 = new Set(['助詞', '助動詞', '連体詞']);
function splitNormalizedTagParts(normalizedTag: string): string[] {
if (!normalizedTag) {
@@ -84,7 +124,36 @@ function isExcludedByTagSet(normalizedTag: string, exclusions: ReadonlySet<strin
function isExcludedFromSubtitleAnnotationsByPos1(normalizedPos1: string): boolean {
const parts = splitNormalizedTagParts(normalizedPos1);
return parts.some((part) => SUBTITLE_ANNOTATION_EXCLUDED_POS1.has(part));
if (parts.some((part) => SUBTITLE_ANNOTATION_EXCLUDED_POS1.has(part))) {
return true;
}
return parts.length > 0 && parts.every((part) => SUBTITLE_ANNOTATION_GRAMMAR_ONLY_POS1.has(part));
}
function isExcludedTrailingParticleMergedToken(token: MergedToken): boolean {
const normalizedSurface = normalizeJlptTextForExclusion(token.surface);
const normalizedHeadword = normalizeJlptTextForExclusion(token.headword);
if (!normalizedSurface || !normalizedHeadword || !normalizedSurface.startsWith(normalizedHeadword)) {
return false;
}
const suffix = normalizedSurface.slice(normalizedHeadword.length);
if (!SUBTITLE_ANNOTATION_EXCLUDED_TRAILING_PARTICLE_SUFFIXES.has(suffix)) {
return false;
}
const pos1Parts = splitNormalizedTagParts(normalizePos1Tag(token.pos1));
if (pos1Parts.length < 2) {
return false;
}
const [leadingPos1, ...trailingPos1] = pos1Parts;
if (!leadingPos1 || SUBTITLE_ANNOTATION_GRAMMAR_ONLY_POS1.has(leadingPos1)) {
return false;
}
return trailingPos1.length > 0 && trailingPos1.every((part) => part === '助詞');
}
function resolvePos1Exclusions(options: AnnotationStageOptions): ReadonlySet<string> {
@@ -520,12 +589,7 @@ function isJlptEligibleToken(token: MergedToken): boolean {
}
function isExcludedFromSubtitleAnnotationsByTerm(token: MergedToken): boolean {
const candidates = [
resolveJlptLookupText(token),
token.surface,
token.headword,
token.reading,
].filter(
const candidates = [token.surface, token.reading, resolveJlptLookupText(token)].filter(
(candidate): candidate is string => typeof candidate === 'string' && candidate.length > 0,
);
@@ -542,7 +606,9 @@ function isExcludedFromSubtitleAnnotationsByTerm(token: MergedToken): boolean {
if (
SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(trimmedCandidate) ||
SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(normalizedCandidate)
SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(normalizedCandidate) ||
SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDINGS.has(trimmedCandidate) ||
SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDINGS.has(normalizedCandidate)
) {
return true;
}
@@ -565,9 +631,28 @@ export function shouldExcludeTokenFromSubtitleAnnotations(token: MergedToken): b
return true;
}
if (isExcludedTrailingParticleMergedToken(token)) {
return true;
}
return isExcludedFromSubtitleAnnotationsByTerm(token);
}
export function stripSubtitleAnnotationMetadata(token: MergedToken): MergedToken {
if (!shouldExcludeTokenFromSubtitleAnnotations(token)) {
return token;
}
return {
...token,
isKnown: false,
isNPlusOneTarget: false,
isNameMatch: false,
jlptLevel: undefined,
frequencyRank: undefined,
};
}
function computeTokenKnownStatus(
token: MergedToken,
isKnownWord: (text: string) => boolean,

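As a quick sanity check on the combinatorics of the exclusion set defined in this file, here is a standalone sketch that rebuilds it from the same prefix, core, and trailing-particle tables (local constant names are illustrative, not the module's exports):

```typescript
// Illustrative mirror of the explanatory-ending exclusion-set construction.
const PREFIXES = ['ん', 'の', 'なん', 'なの'];
const CORES = ['だ', 'です', 'でした', 'だった', 'では', 'じゃ', 'でしょう', 'だろう'];
const PARTICLES = ['', 'か', 'ね', 'よ', 'な', 'よね', 'かな', 'かね'];

const EXCLUDED_EXPLANATORY_ENDINGS = new Set(
  PREFIXES.flatMap((prefix) =>
    CORES.flatMap((core) => PARTICLES.map((particle) => `${prefix}${core}${particle}`)),
  ),
);

// 4 prefixes x 8 cores x 8 particles = 256 distinct surface forms,
// covering both the bare endings ('んです') and particle-suffixed
// variants ('なのでしょうかね').
```

Precomputing the full cross product into a `Set` keeps the per-token check a single O(1) membership test instead of a regex or nested loop on every subtitle line.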

@@ -11,6 +11,7 @@ import {
shouldDetachBackgroundLaunch,
shouldHandleHelpOnlyAtEntry,
shouldHandleLaunchMpvAtEntry,
shouldHandleStatsDaemonCommandAtEntry,
} from './main-entry-runtime';
test('normalizeStartupArgv defaults no-arg startup to --start --background on non-Windows', () => {
@@ -71,6 +72,25 @@ test('launch-mpv entry helpers detect and normalize targets', () => {
]);
});
test('stats-daemon entry helper detects internal daemon commands', () => {
assert.equal(
shouldHandleStatsDaemonCommandAtEntry(['SubMiner.AppImage', '--stats-daemon-start'], {}),
true,
);
assert.equal(
shouldHandleStatsDaemonCommandAtEntry(['SubMiner.AppImage', '--stats-daemon-stop'], {}),
true,
);
assert.equal(
shouldHandleStatsDaemonCommandAtEntry(
['SubMiner.AppImage', '--stats-daemon-start'],
{ ELECTRON_RUN_AS_NODE: '1' },
),
false,
);
assert.equal(shouldHandleStatsDaemonCommandAtEntry(['SubMiner.AppImage', '--start'], {}), false);
});
test('sanitizeStartupEnv suppresses warnings and lsfg layer', () => {
const env = sanitizeStartupEnv({
VK_INSTANCE_LAYERS: 'foo:lsfg-vk:bar',


@@ -112,6 +112,14 @@ export function shouldHandleLaunchMpvAtEntry(argv: string[], env: NodeJS.Process
return parseCliArgs(argv).launchMpv;
}
export function shouldHandleStatsDaemonCommandAtEntry(
argv: string[],
env: NodeJS.ProcessEnv,
): boolean {
if (env.ELECTRON_RUN_AS_NODE === '1') return false;
return argv.includes('--stats-daemon-start') || argv.includes('--stats-daemon-stop');
}
export function normalizeLaunchMpvTargets(argv: string[]): string[] {
return parseCliArgs(argv).launchMpvTargets;
}
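The entry-flag check above is small enough to exercise in isolation; this sketch mirrors its logic (the local `shouldHandleStatsDaemon` name is illustrative — the real export is `shouldHandleStatsDaemonCommandAtEntry`):

```typescript
// Illustrative mirror of the stats-daemon entry detection.
function shouldHandleStatsDaemon(
  argv: string[],
  env: Record<string, string | undefined>,
): boolean {
  // When Electron re-executes itself as plain Node, daemon flags belong to the
  // child process and must not be intercepted at the app entry point.
  if (env.ELECTRON_RUN_AS_NODE === '1') return false;
  return argv.includes('--stats-daemon-start') || argv.includes('--stats-daemon-stop');
}
```

Gating on `ELECTRON_RUN_AS_NODE` is what keeps the dedicated `subminer stats -b` daemon process from recursively treating its own argv as another daemon-control command.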


@@ -12,9 +12,11 @@ import {
shouldDetachBackgroundLaunch,
shouldHandleHelpOnlyAtEntry,
shouldHandleLaunchMpvAtEntry,
shouldHandleStatsDaemonCommandAtEntry,
} from './main-entry-runtime';
import { requestSingleInstanceLockEarly } from './main/early-single-instance';
import { createWindowsMpvLaunchDeps, launchWindowsMpv } from './main/runtime/windows-mpv-launch';
import { runStatsDaemonControlFromProcess } from './stats-daemon-entry';
const DEFAULT_TEXTHOOKER_PORT = 5174;
@@ -69,6 +71,11 @@ if (shouldHandleLaunchMpvAtEntry(process.argv, process.env)) {
);
app.exit(result.ok ? 0 : 1);
});
} else if (shouldHandleStatsDaemonCommandAtEntry(process.argv, process.env)) {
void app.whenReady().then(async () => {
const exitCode = await runStatsDaemonControlFromProcess(app.getPath('userData'));
app.exit(exitCode);
});
} else {
const gotSingleInstanceLock = requestSingleInstanceLockEarly(app);
if (!gotSingleInstanceLock) {


@@ -568,7 +568,40 @@ const anilistUpdateQueue = createAnilistUpdateQueue(
},
);
const isDev = process.argv.includes('--dev') || process.argv.includes('--debug');
const texthookerService = new Texthooker();
const texthookerService = new Texthooker(() => {
const config = getResolvedConfig();
const characterDictionaryEnabled =
config.anilist.characterDictionary.enabled && yomitanProfilePolicy.isCharacterDictionaryEnabled();
const knownAndNPlusOneEnabled = getRuntimeBooleanOption(
'subtitle.annotation.nPlusOne',
config.ankiConnect.knownWords.highlightEnabled,
);
return {
enableKnownWordColoring: knownAndNPlusOneEnabled,
enableNPlusOneColoring: knownAndNPlusOneEnabled,
enableNameMatchColoring: config.subtitleStyle.nameMatchEnabled && characterDictionaryEnabled,
enableFrequencyColoring: getRuntimeBooleanOption(
'subtitle.annotation.frequency',
config.subtitleStyle.frequencyDictionary.enabled,
),
enableJlptColoring: getRuntimeBooleanOption(
'subtitle.annotation.jlpt',
config.subtitleStyle.enableJlpt,
),
characterDictionaryEnabled,
knownWordColor: config.ankiConnect.knownWords.color,
nPlusOneColor: config.ankiConnect.nPlusOne.nPlusOne,
nameMatchColor: config.subtitleStyle.nameMatchColor,
hoverTokenColor: config.subtitleStyle.hoverTokenColor,
hoverTokenBackgroundColor: config.subtitleStyle.hoverTokenBackgroundColor,
jlptColors: config.subtitleStyle.jlptColors,
frequencyDictionary: {
singleColor: config.subtitleStyle.frequencyDictionary.singleColor,
bandedColors: config.subtitleStyle.frequencyDictionary.bandedColors,
},
};
});
const subtitleWsService = new SubtitleWebSocket();
const annotationSubtitleWsService = new SubtitleWebSocket();
const logger = createLogger('main');
@@ -1102,11 +1135,7 @@ function maybeSignalPluginAutoplayReady(
let appTray: Tray | null = null;
let tokenizeSubtitleDeferred: ((text: string) => Promise<SubtitleData>) | null = null;
const buildSubtitleProcessingControllerMainDepsHandler =
createBuildSubtitleProcessingControllerMainDepsHandler({
tokenizeSubtitle: async (text: string) =>
tokenizeSubtitleDeferred ? await tokenizeSubtitleDeferred(text) : { text, tokens: null },
emitSubtitle: (payload) => {
function emitSubtitlePayload(payload: SubtitleData): void {
appState.currentSubtitleData = payload;
broadcastToOverlayWindows('subtitle:set', payload);
subtitleWsService.broadcast(payload, {
@@ -1120,7 +1149,12 @@ const buildSubtitleProcessingControllerMainDepsHandler =
mode: getResolvedConfig().subtitleStyle.frequencyDictionary.mode,
});
subtitlePrefetchService?.resume();
},
}
const buildSubtitleProcessingControllerMainDepsHandler =
createBuildSubtitleProcessingControllerMainDepsHandler({
tokenizeSubtitle: async (text: string) =>
tokenizeSubtitleDeferred ? await tokenizeSubtitleDeferred(text) : { text, tokens: null },
emitSubtitle: (payload) => emitSubtitlePayload(payload),
logDebug: (message) => {
logger.debug(`[subtitle-processing] ${message}`);
},
@@ -3102,6 +3136,10 @@ const {
broadcastToOverlayWindows: (channel, payload) => {
broadcastToOverlayWindows(channel, payload);
},
getImmediateSubtitlePayload: (text) => subtitleProcessingController.consumeCachedSubtitle(text),
emitImmediateSubtitle: (payload) => {
emitSubtitlePayload(payload);
},
onSubtitleChange: (text) => {
subtitlePrefetchService?.pause();
subtitleProcessingController.onSubtitleChange(text);


@@ -16,6 +16,7 @@ test('subtitle change handler updates state, broadcasts, and forwards', () => {
const calls: string[] = [];
const handler = createHandleMpvSubtitleChangeHandler({
setCurrentSubText: (text) => calls.push(`set:${text}`),
getImmediateSubtitlePayload: () => null,
broadcastSubtitle: (payload) => calls.push(`broadcast:${payload.text}`),
onSubtitleChange: (text) => calls.push(`process:${text}`),
refreshDiscordPresence: () => calls.push('presence'),
@@ -25,6 +26,35 @@ test('subtitle change handler updates state, broadcasts, and forwards', () => {
assert.deepEqual(calls, ['set:line', 'broadcast:line', 'process:line', 'presence']);
});
test('subtitle change handler broadcasts cached annotated payload immediately when available', () => {
const payloads: Array<{ text: string; tokens: unknown[] | null }> = [];
const calls: string[] = [];
const handler = createHandleMpvSubtitleChangeHandler({
setCurrentSubText: (text) => calls.push(`set:${text}`),
getImmediateSubtitlePayload: (text) => {
calls.push(`lookup:${text}`);
return { text, tokens: [] };
},
broadcastSubtitle: (payload) => {
payloads.push(payload);
calls.push(`broadcast:${payload.tokens === null ? 'plain' : 'annotated'}`);
},
onSubtitleChange: (text) => calls.push(`process:${text}`),
refreshDiscordPresence: () => calls.push('presence'),
});
handler({ text: 'line' });
assert.deepEqual(payloads, [{ text: 'line', tokens: [] }]);
assert.deepEqual(calls, [
'set:line',
'lookup:line',
'broadcast:annotated',
'process:line',
'presence',
]);
});
test('subtitle ass change handler updates state and broadcasts', () => {
const calls: string[] = [];
const handler = createHandleMpvSubtitleAssChangeHandler({


@@ -1,12 +1,24 @@
import type { SubtitleData } from '../../types';
export function createHandleMpvSubtitleChangeHandler(deps: {
setCurrentSubText: (text: string) => void;
broadcastSubtitle: (payload: { text: string; tokens: null }) => void;
getImmediateSubtitlePayload?: (text: string) => SubtitleData | null;
emitImmediateSubtitle?: (payload: SubtitleData) => void;
broadcastSubtitle: (payload: SubtitleData) => void;
onSubtitleChange: (text: string) => void;
refreshDiscordPresence: () => void;
}) {
return ({ text }: { text: string }): void => {
deps.setCurrentSubText(text);
deps.broadcastSubtitle({ text, tokens: null });
const immediatePayload = deps.getImmediateSubtitlePayload?.(text) ?? null;
if (immediatePayload) {
(deps.emitImmediateSubtitle ?? deps.broadcastSubtitle)(immediatePayload);
} else {
deps.broadcastSubtitle({
text,
tokens: null,
});
}
deps.onSubtitleChange(text);
deps.refreshDiscordPresence();
};
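The fast path added in this handler — emit a cached annotated payload immediately, fall back to a plain broadcast otherwise — can be sketched independently of the Electron wiring (types and names below are illustrative, not the project's real deps interface):

```typescript
type SubtitlePayload = { text: string; tokens: unknown[] | null };

// Hypothetical mirror of the handler's new branch logic.
function makeHandler(deps: {
  getCached: (text: string) => SubtitlePayload | null;
  emit: (p: SubtitlePayload) => void;
}) {
  return (text: string): void => {
    // Prefer a prefetched annotated payload so the overlay never flashes
    // un-annotated text while tokenization runs in the background.
    const cached = deps.getCached(text);
    deps.emit(cached ?? { text, tokens: null });
  };
}

const emitted: SubtitlePayload[] = [];
const handler = makeHandler({
  getCached: (text) => (text === 'line' ? { text, tokens: [] } : null),
  emit: (p) => emitted.push(p),
});
handler('line'); // cache hit: annotated payload goes out immediately
handler('other'); // cache miss: plain payload, annotation arrives later
```

This is the mechanism behind the "reduce prefetched subtitle annotation delay" commit: a cache hit skips the plain-text broadcast entirely.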


@@ -1,3 +1,4 @@
import type { SubtitleData } from '../../types';
import {
createBindMpvClientEventHandlers,
createHandleMpvConnectionChangeHandler,
@@ -35,7 +36,9 @@ export function createBindMpvMainEventHandlersHandler(deps: {
logSubtitleTimingError: (message: string, error: unknown) => void;
setCurrentSubText: (text: string) => void;
broadcastSubtitle: (payload: { text: string; tokens: null }) => void;
getImmediateSubtitlePayload?: (text: string) => SubtitleData | null;
emitImmediateSubtitle?: (payload: SubtitleData) => void;
broadcastSubtitle: (payload: SubtitleData) => void;
onSubtitleChange: (text: string) => void;
refreshDiscordPresence: () => void;
@@ -89,6 +92,8 @@ export function createBindMpvMainEventHandlersHandler(deps: {
});
const handleMpvSubtitleChange = createHandleMpvSubtitleChangeHandler({
setCurrentSubText: (text) => deps.setCurrentSubText(text),
getImmediateSubtitlePayload: (text) => deps.getImmediateSubtitlePayload?.(text) ?? null,
emitImmediateSubtitle: (payload) => deps.emitImmediateSubtitle?.(payload),
broadcastSubtitle: (payload) => deps.broadcastSubtitle(payload),
onSubtitleChange: (text) => deps.onSubtitleChange(text),
refreshDiscordPresence: () => deps.refreshDiscordPresence(),


@@ -35,6 +35,8 @@ export function createBuildBindMpvMainEventHandlersMainDepsHandler(deps: {
maybeRunAnilistPostWatchUpdate: () => Promise<void>;
logSubtitleTimingError: (message: string, error: unknown) => void;
broadcastToOverlayWindows: (channel: string, payload: unknown) => void;
getImmediateSubtitlePayload?: (text: string) => SubtitleData | null;
emitImmediateSubtitle?: (payload: SubtitleData) => void;
onSubtitleChange: (text: string) => void;
onSubtitleTrackChange?: (sid: number | null) => void;
onSubtitleTrackListChange?: (trackList: unknown[] | null) => void;
@@ -102,7 +104,13 @@ export function createBuildBindMpvMainEventHandlersMainDepsHandler(deps: {
setCurrentSubText: (text: string) => {
deps.appState.currentSubText = text;
},
broadcastSubtitle: (payload: { text: string; tokens: null }) =>
getImmediateSubtitlePayload: deps.getImmediateSubtitlePayload
? (text: string) => deps.getImmediateSubtitlePayload!(text)
: undefined,
emitImmediateSubtitle: deps.emitImmediateSubtitle
? (payload: SubtitleData) => deps.emitImmediateSubtitle!(payload)
: undefined,
broadcastSubtitle: (payload: SubtitleData) =>
deps.broadcastToOverlayWindows('subtitle:set', payload),
onSubtitleChange: (text: string) => deps.onSubtitleChange(text),
onSubtitleTrackChange: deps.onSubtitleTrackChange


@@ -0,0 +1,15 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { buildAnimatedImageVideoFilter } from './media-generator';
test('buildAnimatedImageVideoFilter prepends a cloned first frame when lead-in is provided', () => {
assert.equal(
buildAnimatedImageVideoFilter({
fps: 10,
maxWidth: 640,
leadingStillDuration: 1.25,
}),
'tpad=start_duration=1.25:start_mode=clone,fps=10,scale=w=640:h=-2',
);
});
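The composition rules the test exercises can also be checked standalone; this sketch mirrors the filter-building logic from `buildAnimatedImageVideoFilter` (the local `buildFilter` name is illustrative):

```typescript
// Illustrative mirror of the ffmpeg -vf string composition: optional cloned
// lead-in frame, clamped fps, then a scale filter picked by which bounds are set.
function buildFilter(opts: {
  fps?: number;
  maxWidth?: number;
  maxHeight?: number;
  leadingStillDuration?: number;
}): string {
  const { fps = 10, maxWidth = 640, maxHeight, leadingStillDuration = 0 } = opts;
  const clampedFps = Math.max(1, Math.min(60, fps));
  const parts: string[] = [];
  if (leadingStillDuration > 0) {
    parts.push(`tpad=start_duration=${leadingStillDuration}:start_mode=clone`);
  }
  parts.push(`fps=${clampedFps}`);
  if (maxWidth && maxWidth > 0 && maxHeight && maxHeight > 0) {
    parts.push(`scale=w=${maxWidth}:h=${maxHeight}:force_original_aspect_ratio=decrease`);
  } else if (maxWidth && maxWidth > 0) {
    parts.push(`scale=w=${maxWidth}:h=-2`);
  } else if (maxHeight && maxHeight > 0) {
    parts.push(`scale=w=-2:h=${maxHeight}`);
  }
  return parts.join(',');
}

buildFilter({}); // 'fps=10,scale=w=640:h=-2'
buildFilter({ leadingStillDuration: 1.25 }); // matches the test expectation above
```

The `tpad` lead-in is what lets an animated Anki image hold its first frame while the sentence audio's padding plays, which is how the "sync animated anki images to sentence audio" commit lines the two up.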


@@ -24,6 +24,33 @@ import { createLogger } from './logger';
const log = createLogger('media');
export function buildAnimatedImageVideoFilter(options: {
fps?: number;
maxWidth?: number;
maxHeight?: number;
leadingStillDuration?: number;
}): string {
const { fps = 10, maxWidth = 640, maxHeight, leadingStillDuration = 0 } = options;
const clampedFps = Math.max(1, Math.min(60, fps));
const vfParts: string[] = [];
if (leadingStillDuration > 0) {
vfParts.push(`tpad=start_duration=${leadingStillDuration}:start_mode=clone`);
}
vfParts.push(`fps=${clampedFps}`);
if (maxWidth && maxWidth > 0 && maxHeight && maxHeight > 0) {
vfParts.push(`scale=w=${maxWidth}:h=${maxHeight}:force_original_aspect_ratio=decrease`);
} else if (maxWidth && maxWidth > 0) {
vfParts.push(`scale=w=${maxWidth}:h=-2`);
} else if (maxHeight && maxHeight > 0) {
vfParts.push(`scale=w=-2:h=${maxHeight}`);
}
return vfParts.join(',');
}
export class MediaGenerator {
private tempDir: string;
private notifyIconDir: string;
@@ -289,25 +316,15 @@ export class MediaGenerator {
maxWidth?: number;
maxHeight?: number;
crf?: number;
leadingStillDuration?: number;
} = {},
): Promise<Buffer> {
const start = Math.max(0, startTime - padding);
const duration = endTime - startTime + 2 * padding;
const { fps = 10, maxWidth = 640, maxHeight, crf = 35 } = options;
const { fps = 10, maxWidth = 640, maxHeight, crf = 35, leadingStillDuration = 0 } = options;
const clampedFps = Math.max(1, Math.min(60, fps));
const clampedCrf = Math.max(0, Math.min(63, crf));
const vfParts: string[] = [];
vfParts.push(`fps=${clampedFps}`);
if (maxWidth && maxWidth > 0 && maxHeight && maxHeight > 0) {
vfParts.push(`scale=w=${maxWidth}:h=${maxHeight}:force_original_aspect_ratio=decrease`);
} else if (maxWidth && maxWidth > 0) {
vfParts.push(`scale=w=${maxWidth}:h=-2`);
} else if (maxHeight && maxHeight > 0) {
vfParts.push(`scale=w=-2:h=${maxHeight}`);
}
const av1Encoder = await this.detectAv1Encoder();
if (!av1Encoder) {
throw new Error(
@@ -338,7 +355,12 @@ export class MediaGenerator {
'-i',
videoPath,
'-vf',
vfParts.join(','),
buildAnimatedImageVideoFilter({
fps,
maxWidth,
maxHeight,
leadingStillDuration,
}),
...encoderArgs,
'-y',
outputPath,


@@ -682,7 +682,7 @@ test('renderSubtitle preserves unsupported punctuation while keeping it non-inte
}
});
test('renderSubtitle keeps excluded interjection text visible while only rendering remaining tokens as interactive', () => {
test('renderSubtitle keeps excluded interjection tokens hoverable while rendering them without annotation styling', () => {
const restoreDocument = installFakeDocument();
try {
@@ -718,13 +718,19 @@ test('renderSubtitle keeps excluded interjection text visible while only renderi
renderer.renderSubtitle({
text: 'ぐはっ 猫',
tokens: [createToken({ surface: '猫', headword: '猫', reading: 'ねこ' })],
tokens: [
createToken({ surface: 'ぐはっ', headword: 'ぐはっ', reading: 'ぐはっ' }),
createToken({ surface: '猫', headword: '猫', reading: 'ねこ' }),
],
});
assert.equal(subtitleRoot.textContent, 'ぐはっ 猫');
assert.deepEqual(
collectWordNodes(subtitleRoot).map((node) => [node.textContent, node.dataset.tokenIndex]),
[['猫', '0']],
[
['ぐはっ', '0'],
['猫', '1'],
],
);
} finally {
restoreDocument();


@@ -0,0 +1,158 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { createRunStatsDaemonControlHandler } from './stats-daemon-control';
test('stats daemon control reuses live daemon and writes launcher response', async () => {
const calls: string[] = [];
const responses: Array<{ path: string; payload: { ok: boolean; url?: string; error?: string } }> =
[];
const handler = createRunStatsDaemonControlHandler({
statePath: '/tmp/stats-daemon.json',
readState: () => ({ pid: 4242, port: 5175, startedAtMs: 1 }),
removeState: () => {
calls.push('removeState');
},
isProcessAlive: (pid) => {
calls.push(`isProcessAlive:${pid}`);
return true;
},
resolveUrl: (state) => `http://127.0.0.1:${state.port}`,
spawnDaemon: async () => {
calls.push('spawnDaemon');
return 1;
},
waitForDaemonResponse: async () => {
calls.push('waitForDaemonResponse');
return { ok: true, url: 'http://127.0.0.1:5175' };
},
openExternal: async (url) => {
calls.push(`openExternal:${url}`);
},
writeResponse: (responsePath, payload) => {
responses.push({ path: responsePath, payload });
},
killProcess: () => {
calls.push('killProcess');
},
sleep: async () => {},
});
const exitCode = await handler({
action: 'start',
responsePath: '/tmp/response.json',
openBrowser: true,
daemonScriptPath: '/tmp/stats-daemon-runner.js',
userDataPath: '/tmp/SubMiner',
});
assert.equal(exitCode, 0);
assert.deepEqual(calls, ['isProcessAlive:4242', 'openExternal:http://127.0.0.1:5175']);
assert.deepEqual(responses, [
{
path: '/tmp/response.json',
payload: { ok: true, url: 'http://127.0.0.1:5175' },
},
]);
});
test('stats daemon control clears stale state, starts daemon, and waits for response', async () => {
const calls: string[] = [];
const handler = createRunStatsDaemonControlHandler({
statePath: '/tmp/stats-daemon.json',
readState: () => ({ pid: 4242, port: 5175, startedAtMs: 1 }),
removeState: () => {
calls.push('removeState');
},
isProcessAlive: (pid) => {
calls.push(`isProcessAlive:${pid}`);
return false;
},
resolveUrl: (state) => `http://127.0.0.1:${state.port}`,
spawnDaemon: async (options) => {
calls.push(`spawnDaemon:${options.scriptPath}:${options.responsePath}:${options.userDataPath}`);
return 999;
},
waitForDaemonResponse: async (responsePath) => {
calls.push(`waitForDaemonResponse:${responsePath}`);
return { ok: true, url: 'http://127.0.0.1:5175' };
},
openExternal: async (url) => {
calls.push(`openExternal:${url}`);
},
writeResponse: () => {
calls.push('writeResponse');
},
killProcess: () => {
calls.push('killProcess');
},
sleep: async () => {},
});
const exitCode = await handler({
action: 'start',
responsePath: '/tmp/response.json',
openBrowser: false,
daemonScriptPath: '/tmp/stats-daemon-runner.js',
userDataPath: '/tmp/SubMiner',
});
assert.equal(exitCode, 0);
assert.deepEqual(calls, [
'isProcessAlive:4242',
'removeState',
'spawnDaemon:/tmp/stats-daemon-runner.js:/tmp/response.json:/tmp/SubMiner',
'waitForDaemonResponse:/tmp/response.json',
]);
});
test('stats daemon control stops live daemon and treats stale state as success', async () => {
const responses: Array<{ path: string; payload: { ok: boolean; url?: string; error?: string } }> =
[];
const calls: string[] = [];
let aliveChecks = 0;
const handler = createRunStatsDaemonControlHandler({
statePath: '/tmp/stats-daemon.json',
readState: () => ({ pid: 4242, port: 5175, startedAtMs: 1 }),
removeState: () => {
calls.push('removeState');
},
isProcessAlive: (pid) => {
aliveChecks += 1;
calls.push(`isProcessAlive:${pid}:${aliveChecks}`);
return aliveChecks === 1;
},
resolveUrl: (state) => `http://127.0.0.1:${state.port}`,
spawnDaemon: async () => 1,
waitForDaemonResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
openExternal: async () => {},
writeResponse: (responsePath, payload) => {
responses.push({ path: responsePath, payload });
},
killProcess: (pid, signal) => {
calls.push(`killProcess:${pid}:${signal}`);
},
sleep: async () => {},
});
const exitCode = await handler({
action: 'stop',
responsePath: '/tmp/response.json',
openBrowser: false,
daemonScriptPath: '/tmp/stats-daemon-runner.js',
userDataPath: '/tmp/SubMiner',
});
assert.equal(exitCode, 0);
assert.deepEqual(calls, [
'isProcessAlive:4242:1',
'killProcess:4242:SIGTERM',
'isProcessAlive:4242:2',
'removeState',
]);
assert.deepEqual(responses, [
{
path: '/tmp/response.json',
payload: { ok: true },
},
]);
});

src/stats-daemon-control.ts

@@ -0,0 +1,102 @@
import type { BackgroundStatsServerState } from './main/runtime/stats-daemon';
import type { StatsCliCommandResponse } from './main/runtime/stats-cli-command';
export type StatsDaemonControlAction = 'start' | 'stop';
export type StatsDaemonControlArgs = {
action: StatsDaemonControlAction;
responsePath?: string;
openBrowser: boolean;
daemonScriptPath: string;
userDataPath: string;
};
type SpawnStatsDaemonOptions = {
scriptPath: string;
responsePath?: string;
userDataPath: string;
};
export function createRunStatsDaemonControlHandler(deps: {
statePath: string;
readState: () => BackgroundStatsServerState | null;
removeState: () => void;
isProcessAlive: (pid: number) => boolean;
resolveUrl: (state: Pick<BackgroundStatsServerState, 'port'>) => string;
spawnDaemon: (options: SpawnStatsDaemonOptions) => Promise<number> | number;
waitForDaemonResponse: (responsePath: string) => Promise<StatsCliCommandResponse>;
openExternal: (url: string) => Promise<unknown>;
writeResponse: (responsePath: string, payload: StatsCliCommandResponse) => void;
killProcess: (pid: number, signal: NodeJS.Signals) => void;
sleep: (ms: number) => Promise<void>;
}) {
const writeResponseSafe = (
responsePath: string | undefined,
payload: StatsCliCommandResponse,
): void => {
if (!responsePath) return;
deps.writeResponse(responsePath, payload);
};
return async (args: StatsDaemonControlArgs): Promise<number> => {
if (args.action === 'start') {
const state = deps.readState();
if (state) {
if (deps.isProcessAlive(state.pid)) {
const url = deps.resolveUrl(state);
writeResponseSafe(args.responsePath, { ok: true, url });
if (args.openBrowser) {
await deps.openExternal(url);
}
return 0;
}
deps.removeState();
}
if (!args.responsePath) {
throw new Error('Missing --stats-response-path for stats daemon start.');
}
await deps.spawnDaemon({
scriptPath: args.daemonScriptPath,
responsePath: args.responsePath,
userDataPath: args.userDataPath,
});
const response = await deps.waitForDaemonResponse(args.responsePath);
if (response.ok && args.openBrowser && response.url) {
await deps.openExternal(response.url);
}
return response.ok ? 0 : 1;
}
const state = deps.readState();
if (!state) {
deps.removeState();
writeResponseSafe(args.responsePath, { ok: true });
return 0;
}
if (!deps.isProcessAlive(state.pid)) {
deps.removeState();
writeResponseSafe(args.responsePath, { ok: true });
return 0;
}
deps.killProcess(state.pid, 'SIGTERM');
const deadline = Date.now() + 2_000;
while (Date.now() < deadline) {
if (!deps.isProcessAlive(state.pid)) {
deps.removeState();
writeResponseSafe(args.responsePath, { ok: true });
return 0;
}
await deps.sleep(50);
}
writeResponseSafe(args.responsePath, {
ok: false,
error: 'Timed out stopping background stats server.',
});
return 1;
};
}

src/stats-daemon-entry.ts

@@ -0,0 +1,135 @@
import fs from 'node:fs';
import path from 'node:path';
import { spawn } from 'node:child_process';
import { shell } from 'electron';
import { sanitizeStartupEnv } from './main-entry-runtime';
import {
isBackgroundStatsServerProcessAlive,
readBackgroundStatsServerState,
removeBackgroundStatsServerState,
resolveBackgroundStatsServerUrl,
} from './main/runtime/stats-daemon';
import {
createRunStatsDaemonControlHandler,
type StatsDaemonControlArgs,
} from './stats-daemon-control';
import {
type StatsCliCommandResponse,
writeStatsCliCommandResponse,
} from './main/runtime/stats-cli-command';
const STATS_DAEMON_RESPONSE_TIMEOUT_MS = 12_000;
function readFlagValue(argv: string[], flag: string): string | undefined {
for (let i = 0; i < argv.length; i += 1) {
const arg = argv[i];
if (!arg) continue;
if (arg === flag) {
const value = argv[i + 1];
if (value && !value.startsWith('--')) {
return value;
}
return undefined;
}
if (arg.startsWith(`${flag}=`)) {
return arg.split('=', 2)[1];
}
}
return undefined;
}
function hasFlag(argv: string[], flag: string): boolean {
return argv.includes(flag);
}
function parseControlArgs(argv: string[], userDataPath: string): StatsDaemonControlArgs {
return {
action: hasFlag(argv, '--stats-daemon-stop') ? 'stop' : 'start',
responsePath: readFlagValue(argv, '--stats-response-path'),
openBrowser: hasFlag(argv, '--stats-daemon-open-browser'),
daemonScriptPath: path.join(__dirname, 'stats-daemon-runner.js'),
userDataPath,
};
}
async function waitForDaemonResponse(responsePath: string): Promise<StatsCliCommandResponse> {
const deadline = Date.now() + STATS_DAEMON_RESPONSE_TIMEOUT_MS;
while (Date.now() < deadline) {
try {
if (fs.existsSync(responsePath)) {
return JSON.parse(fs.readFileSync(responsePath, 'utf8')) as StatsCliCommandResponse;
}
} catch {
// retry until timeout
}
await new Promise((resolve) => setTimeout(resolve, 100));
}
return {
ok: false,
error: 'Timed out waiting for stats daemon startup response.',
};
}
export async function runStatsDaemonControlFromProcess(userDataPath: string): Promise<number> {
const args = parseControlArgs(process.argv, userDataPath);
const statePath = path.join(userDataPath, 'stats-daemon.json');
const writeFailureResponse = (message: string): void => {
if (args.responsePath) {
try {
writeStatsCliCommandResponse(args.responsePath, {
ok: false,
error: message,
});
} catch {
// ignore secondary response-write failures
}
}
};
const handler = createRunStatsDaemonControlHandler({
statePath,
readState: () => readBackgroundStatsServerState(statePath),
removeState: () => {
removeBackgroundStatsServerState(statePath);
},
isProcessAlive: (pid) => isBackgroundStatsServerProcessAlive(pid),
resolveUrl: (state) => resolveBackgroundStatsServerUrl(state),
spawnDaemon: async (options) => {
const childArgs = [options.scriptPath, '--stats-user-data-path', options.userDataPath];
if (options.responsePath) {
childArgs.push('--stats-response-path', options.responsePath);
}
const logLevel = readFlagValue(process.argv, '--log-level');
if (logLevel) {
childArgs.push('--log-level', logLevel);
}
const child = spawn(process.execPath, childArgs, {
detached: true,
stdio: 'ignore',
env: {
...sanitizeStartupEnv(process.env),
ELECTRON_RUN_AS_NODE: '1',
},
});
child.unref();
return child.pid ?? 0;
},
waitForDaemonResponse,
openExternal: async (url) => shell.openExternal(url),
writeResponse: writeStatsCliCommandResponse,
killProcess: (pid, signal) => {
process.kill(pid, signal);
},
sleep: async (ms) => new Promise((resolve) => setTimeout(resolve, ms)),
});
try {
return await handler(args);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
writeFailureResponse(message);
return 1;
}
}
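The entry module above spawns the daemon detached and then polls for a JSON response file until a deadline passes. That file-based handshake can be sketched in isolation; this is a minimal illustration, not SubMiner's actual module, and all names here are illustrative:

```typescript
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

type HandshakeResponse = { ok: boolean; url?: string; error?: string };

// Poll for the response file until it appears or the deadline passes.
// A parse failure (e.g. a partially written file) is retried.
async function pollForResponse(file: string, timeoutMs: number): Promise<HandshakeResponse> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    try {
      if (fs.existsSync(file)) {
        return JSON.parse(fs.readFileSync(file, 'utf8')) as HandshakeResponse;
      }
    } catch {
      // retry until timeout
    }
    await new Promise((resolve) => setTimeout(resolve, 25));
  }
  return { ok: false, error: 'Timed out waiting for response file.' };
}

export async function demoHandshake(): Promise<HandshakeResponse> {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'handshake-demo-'));
  const file = path.join(dir, 'response.json');
  // Stand-in for the daemon: write the startup response shortly after "spawn".
  setTimeout(() => {
    fs.writeFileSync(file, JSON.stringify({ ok: true, url: 'http://127.0.0.1:5175' }), 'utf8');
  }, 50);
  const result = await pollForResponse(file, 2_000);
  fs.rmSync(dir, { recursive: true, force: true });
  return result;
}
```

Since the parent and the detached child share no pipe, the temp-file response is the only channel; the deadline keeps a crashed child from hanging the CLI.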

src/stats-daemon-runner.ts

@@ -0,0 +1,225 @@
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import { spawn } from 'node:child_process';
import { ConfigService } from './config/service';
import { createLogger, setLogLevel } from './logger';
import { ImmersionTrackerService } from './core/services/immersion-tracker-service';
import { createCoverArtFetcher } from './core/services/anilist/cover-art-fetcher';
import { createAnilistRateLimiter } from './core/services/anilist/rate-limiter';
import { startStatsServer } from './core/services/stats-server';
import {
removeBackgroundStatsServerState,
writeBackgroundStatsServerState,
} from './main/runtime/stats-daemon';
import { writeStatsCliCommandResponse } from './main/runtime/stats-cli-command';
import { createInvokeStatsWordHelperHandler, type StatsWordHelperResponse } from './stats-word-helper-client';
const logger = createLogger('stats-daemon');
const STATS_WORD_HELPER_RESPONSE_TIMEOUT_MS = 20_000;
function readFlagValue(argv: string[], flag: string): string | undefined {
for (let i = 0; i < argv.length; i += 1) {
const arg = argv[i];
if (!arg) continue;
if (arg === flag) {
const value = argv[i + 1];
if (value && !value.startsWith('--')) {
return value;
}
return undefined;
}
if (arg.startsWith(`${flag}=`)) {
return arg.split('=', 2)[1];
}
}
return undefined;
}
async function waitForWordHelperResponse(responsePath: string): Promise<StatsWordHelperResponse> {
const deadline = Date.now() + STATS_WORD_HELPER_RESPONSE_TIMEOUT_MS;
while (Date.now() < deadline) {
try {
if (fs.existsSync(responsePath)) {
return JSON.parse(fs.readFileSync(responsePath, 'utf8')) as StatsWordHelperResponse;
}
} catch {
// retry until timeout
}
await new Promise((resolve) => setTimeout(resolve, 100));
}
return {
ok: false,
error: 'Timed out waiting for stats word helper response.',
};
}
const invokeStatsWordHelper = createInvokeStatsWordHelperHandler({
createTempDir: (prefix) => fs.mkdtempSync(path.join(os.tmpdir(), prefix)),
joinPath: (...parts) => path.join(...parts),
spawnHelper: async (options) => {
const childArgs = [
options.scriptPath,
'--stats-word-helper-response-path',
options.responsePath,
'--stats-word-helper-user-data-path',
options.userDataPath,
'--stats-word-helper-word',
options.word,
];
const logLevel = readFlagValue(process.argv, '--log-level');
if (logLevel) {
childArgs.push('--log-level', logLevel);
}
const child = spawn(process.execPath, childArgs, {
stdio: 'ignore',
env: {
...process.env,
ELECTRON_RUN_AS_NODE: undefined,
},
});
return await new Promise<number>((resolve) => {
child.once('exit', (code) => resolve(code ?? 1));
child.once('error', () => resolve(1));
});
},
waitForResponse: waitForWordHelperResponse,
removeDir: (targetPath) => {
fs.rmSync(targetPath, { recursive: true, force: true });
},
});
const userDataPath = readFlagValue(process.argv, '--stats-user-data-path')?.trim();
const responsePath = readFlagValue(process.argv, '--stats-response-path')?.trim();
const logLevel = readFlagValue(process.argv, '--log-level');
if (logLevel) {
setLogLevel(logLevel, 'cli');
}
if (!userDataPath) {
if (responsePath) {
writeStatsCliCommandResponse(responsePath, {
ok: false,
error: 'Missing --stats-user-data-path for stats daemon runner.',
});
}
process.exit(1);
}
const daemonUserDataPath = userDataPath;
const statePath = path.join(userDataPath, 'stats-daemon.json');
const knownWordCachePath = path.join(userDataPath, 'known-words-cache.json');
const statsDistPath = path.join(__dirname, '..', 'stats', 'dist');
const wordHelperScriptPath = path.join(__dirname, 'stats-word-helper.js');
let tracker: ImmersionTrackerService | null = null;
let statsServer: ReturnType<typeof startStatsServer> | null = null;
function writeFailureResponse(message: string): void {
if (!responsePath) return;
writeStatsCliCommandResponse(responsePath, { ok: false, error: message });
}
function clearOwnedState(): void {
const rawState = (() => {
try {
return JSON.parse(fs.readFileSync(statePath, 'utf8')) as { pid?: number };
} catch {
return null;
}
})();
if (rawState?.pid === process.pid) {
removeBackgroundStatsServerState(statePath);
}
}
function shutdown(code = 0): void {
try {
statsServer?.close();
} catch {
// ignore
}
statsServer = null;
try {
tracker?.destroy();
} catch {
// ignore
}
tracker = null;
clearOwnedState();
process.exit(code);
}
process.on('SIGINT', () => shutdown(0));
process.on('SIGTERM', () => shutdown(0));
async function main(): Promise<void> {
try {
const configService = new ConfigService(daemonUserDataPath);
const config = configService.getConfig();
if (config.immersionTracking?.enabled === false) {
throw new Error('Immersion tracking is disabled in config.');
}
const configuredDbPath = config.immersionTracking?.dbPath?.trim() || '';
tracker = new ImmersionTrackerService({
dbPath: configuredDbPath || path.join(daemonUserDataPath, 'immersion.sqlite'),
policy: {
batchSize: config.immersionTracking.batchSize,
flushIntervalMs: config.immersionTracking.flushIntervalMs,
queueCap: config.immersionTracking.queueCap,
payloadCapBytes: config.immersionTracking.payloadCapBytes,
maintenanceIntervalMs: config.immersionTracking.maintenanceIntervalMs,
retention: {
eventsDays: config.immersionTracking.retention.eventsDays,
telemetryDays: config.immersionTracking.retention.telemetryDays,
sessionsDays: config.immersionTracking.retention.sessionsDays,
dailyRollupsDays: config.immersionTracking.retention.dailyRollupsDays,
monthlyRollupsDays: config.immersionTracking.retention.monthlyRollupsDays,
vacuumIntervalDays: config.immersionTracking.retention.vacuumIntervalDays,
},
},
});
tracker.setCoverArtFetcher(
createCoverArtFetcher(createAnilistRateLimiter(), createLogger('stats-daemon:cover-art')),
);
statsServer = startStatsServer({
port: config.stats.serverPort,
staticDir: statsDistPath,
tracker,
knownWordCachePath,
ankiConnectConfig: config.ankiConnect,
addYomitanNote: async (word: string) =>
await invokeStatsWordHelper({
helperScriptPath: wordHelperScriptPath,
userDataPath: daemonUserDataPath,
word,
}),
});
writeBackgroundStatsServerState(statePath, {
pid: process.pid,
port: config.stats.serverPort,
startedAtMs: Date.now(),
});
if (responsePath) {
writeStatsCliCommandResponse(responsePath, {
ok: true,
url: `http://127.0.0.1:${config.stats.serverPort}`,
});
}
logger.info(`Background stats daemon listening on http://127.0.0.1:${config.stats.serverPort}`);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
logger.error('Failed to start stats daemon', message);
writeFailureResponse(message);
shutdown(1);
}
}
void main();
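The `clearOwnedState` guard in the runner above deletes the shared state file only when it still records this process's pid, so a daemon that is shut down late never clobbers the state a newer daemon has already written. A standalone sketch of that pattern, with illustrative paths and names:

```typescript
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

// Remove the state file only when it still names `ownPid`;
// returns whether anything was deleted.
function clearOwnedState(statePath: string, ownPid: number): boolean {
  let recordedPid: number | undefined;
  try {
    recordedPid = (JSON.parse(fs.readFileSync(statePath, 'utf8')) as { pid?: number }).pid;
  } catch {
    return false; // missing or unreadable state: nothing owned to clear
  }
  if (recordedPid !== ownPid) return false; // another process owns it now
  fs.rmSync(statePath, { force: true });
  return true;
}

export function demoClearOwnedState(): [boolean, boolean, boolean] {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'state-demo-'));
  const statePath = path.join(dir, 'stats-daemon.json');
  fs.writeFileSync(statePath, JSON.stringify({ pid: 123 }), 'utf8');
  const clearedByStranger = clearOwnedState(statePath, 999); // not ours: file kept
  const survived = fs.existsSync(statePath);
  const clearedByOwner = clearOwnedState(statePath, 123); // ours: file removed
  fs.rmSync(dir, { recursive: true, force: true });
  return [clearedByStranger, survived, clearedByOwner];
}
```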


@@ -0,0 +1,57 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { createInvokeStatsWordHelperHandler } from './stats-word-helper-client';
test('word helper client returns note id when helper responds before exit', async () => {
const calls: string[] = [];
const handler = createInvokeStatsWordHelperHandler({
createTempDir: () => '/tmp/stats-word-helper',
joinPath: (...parts) => parts.join('/'),
spawnHelper: async (options) => {
calls.push(
`spawnHelper:${options.scriptPath}:${options.responsePath}:${options.userDataPath}:${options.word}`,
);
return new Promise<number>((resolve) => setTimeout(() => resolve(0), 20));
},
waitForResponse: async (responsePath) => {
calls.push(`waitForResponse:${responsePath}`);
return { ok: true, noteId: 123 };
},
removeDir: (targetPath) => {
calls.push(`removeDir:${targetPath}`);
},
});
const noteId = await handler({
helperScriptPath: '/tmp/stats-word-helper.js',
userDataPath: '/tmp/SubMiner',
word: '猫',
});
assert.equal(noteId, 123);
assert.deepEqual(calls, [
'spawnHelper:/tmp/stats-word-helper.js:/tmp/stats-word-helper/response.json:/tmp/SubMiner:猫',
'waitForResponse:/tmp/stats-word-helper/response.json',
'removeDir:/tmp/stats-word-helper',
]);
});
test('word helper client throws helper response errors', async () => {
const handler = createInvokeStatsWordHelperHandler({
createTempDir: () => '/tmp/stats-word-helper',
joinPath: (...parts) => parts.join('/'),
spawnHelper: async () => 0,
waitForResponse: async () => ({ ok: false, error: 'helper failed' }),
removeDir: () => {},
});
await assert.rejects(
async () =>
handler({
helperScriptPath: '/tmp/stats-word-helper.js',
userDataPath: '/tmp/SubMiner',
word: '猫',
}),
/helper failed/,
);
});


@@ -0,0 +1,62 @@
export type StatsWordHelperResponse = {
ok: boolean;
noteId?: number;
error?: string;
};
export function createInvokeStatsWordHelperHandler(deps: {
createTempDir: (prefix: string) => string;
joinPath: (...parts: string[]) => string;
spawnHelper: (options: {
scriptPath: string;
responsePath: string;
userDataPath: string;
word: string;
}) => Promise<number>;
waitForResponse: (responsePath: string) => Promise<StatsWordHelperResponse>;
removeDir: (targetPath: string) => void;
}) {
return async (options: {
helperScriptPath: string;
userDataPath: string;
word: string;
}): Promise<number> => {
const tempDir = deps.createTempDir('subminer-stats-word-helper-');
const responsePath = deps.joinPath(tempDir, 'response.json');
try {
const helperExitPromise = deps.spawnHelper({
scriptPath: options.helperScriptPath,
responsePath,
userDataPath: options.userDataPath,
word: options.word,
});
const startupResult = await Promise.race([
deps.waitForResponse(responsePath).then((response) => ({ kind: 'response' as const, response })),
helperExitPromise.then((status) => ({ kind: 'exit' as const, status })),
]);
let response: StatsWordHelperResponse;
if (startupResult.kind === 'response') {
response = startupResult.response;
} else {
if (startupResult.status !== 0) {
throw new Error(`Stats word helper exited before response (status ${startupResult.status}).`);
}
response = await deps.waitForResponse(responsePath);
}
const exitStatus = await helperExitPromise;
if (exitStatus !== 0) {
throw new Error(`Stats word helper exited with status ${exitStatus}.`);
}
if (!response.ok || typeof response.noteId !== 'number') {
throw new Error(response.error || 'Stats word helper failed.');
}
return response.noteId;
} finally {
deps.removeDir(tempDir);
}
};
}
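The `Promise.race` in the client above is the interesting part: whichever settles first — the response file appearing or the helper process exiting — decides the next step, so a helper that dies before writing anything fails fast instead of blocking on the full response timeout. A self-contained sketch of that shape, with illustrative names:

```typescript
type RaceOutcome =
  | { kind: 'response'; value: string }
  | { kind: 'exit'; status: number };

// Tag each promise so the winner of the race is identifiable.
async function raceResponseAgainstExit(
  response: Promise<string>,
  exit: Promise<number>,
): Promise<RaceOutcome> {
  return Promise.race([
    response.then((value) => ({ kind: 'response' as const, value })),
    exit.then((status) => ({ kind: 'exit' as const, status })),
  ]);
}

export async function demoRace(): Promise<[RaceOutcome, RaceOutcome]> {
  const delayed = <T>(value: T, ms: number) =>
    new Promise<T>((resolve) => setTimeout(() => resolve(value), ms));
  // Response arrives first: proceed even though the helper is still running.
  const fastResponse = await raceResponseAgainstExit(delayed('ok', 10), delayed(0, 100));
  // Helper exits first with a failure status: the caller can fail fast.
  const earlyExit = await raceResponseAgainstExit(delayed('never', 100), delayed(1, 10));
  return [fastResponse, earlyExit];
}
```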

src/stats-word-helper.ts

@@ -0,0 +1,193 @@
import fs from 'node:fs';
import path from 'node:path';
import { app, protocol } from 'electron';
import type { BrowserWindow, Extension, Session } from 'electron';
import { ConfigService } from './config/service';
import { createLogger, setLogLevel } from './logger';
import { loadYomitanExtension } from './core/services/yomitan-extension-loader';
import {
addYomitanNoteViaSearch,
syncYomitanDefaultAnkiServer,
} from './core/services/tokenizer/yomitan-parser-runtime';
import type { StatsWordHelperResponse } from './stats-word-helper-client';
import { clearYomitanExtensionRuntimeState } from './core/services/yomitan-extension-runtime-state';
protocol.registerSchemesAsPrivileged([
{
scheme: 'chrome-extension',
privileges: {
standard: true,
secure: true,
supportFetchAPI: true,
corsEnabled: true,
bypassCSP: true,
},
},
]);
const logger = createLogger('stats-word-helper');
function readFlagValue(argv: string[], flag: string): string | undefined {
for (let i = 0; i < argv.length; i += 1) {
const arg = argv[i];
if (!arg) continue;
if (arg === flag) {
const value = argv[i + 1];
if (value && !value.startsWith('--')) {
return value;
}
return undefined;
}
if (arg.startsWith(`${flag}=`)) {
return arg.split('=', 2)[1];
}
}
return undefined;
}
function writeResponse(responsePath: string | undefined, payload: StatsWordHelperResponse): void {
if (!responsePath) return;
fs.mkdirSync(path.dirname(responsePath), { recursive: true });
fs.writeFileSync(responsePath, JSON.stringify(payload, null, 2), 'utf8');
}
const responsePath = readFlagValue(process.argv, '--stats-word-helper-response-path')?.trim();
const userDataPath = readFlagValue(process.argv, '--stats-word-helper-user-data-path')?.trim();
const word = readFlagValue(process.argv, '--stats-word-helper-word');
const logLevel = readFlagValue(process.argv, '--log-level');
if (logLevel) {
setLogLevel(logLevel, 'cli');
}
if (!userDataPath || !word) {
writeResponse(responsePath, {
ok: false,
error: 'Missing stats word helper arguments.',
});
app.exit(1);
}
app.setName('SubMiner');
app.setPath('userData', userDataPath!);
let yomitanExt: Extension | null = null;
let yomitanSession: Session | null = null;
let yomitanParserWindow: BrowserWindow | null = null;
let yomitanParserReadyPromise: Promise<void> | null = null;
let yomitanParserInitPromise: Promise<boolean> | null = null;
function cleanup(): void {
clearYomitanExtensionRuntimeState({
getYomitanParserWindow: () => yomitanParserWindow,
setYomitanParserWindow: () => {
yomitanParserWindow = null;
},
setYomitanParserReadyPromise: () => {
yomitanParserReadyPromise = null;
},
setYomitanParserInitPromise: () => {
yomitanParserInitPromise = null;
},
setYomitanExtension: () => {
yomitanExt = null;
},
setYomitanSession: () => {
yomitanSession = null;
},
});
}
async function main(): Promise<void> {
try {
const configService = new ConfigService(userDataPath!);
const config = configService.getConfig();
const extension = await loadYomitanExtension({
userDataPath: userDataPath!,
getYomitanParserWindow: () => yomitanParserWindow,
setYomitanParserWindow: (window) => {
yomitanParserWindow = window;
},
setYomitanParserReadyPromise: (promise) => {
yomitanParserReadyPromise = promise;
},
setYomitanParserInitPromise: (promise) => {
yomitanParserInitPromise = promise;
},
setYomitanExtension: (extensionValue) => {
yomitanExt = extensionValue;
},
setYomitanSession: (sessionValue) => {
yomitanSession = sessionValue;
},
});
if (!extension) {
throw new Error('Yomitan extension failed to load.');
}
await syncYomitanDefaultAnkiServer(
config.ankiConnect?.url || 'http://127.0.0.1:8765',
{
getYomitanExt: () => yomitanExt,
getYomitanSession: () => yomitanSession,
getYomitanParserWindow: () => yomitanParserWindow,
setYomitanParserWindow: (window) => {
yomitanParserWindow = window;
},
getYomitanParserReadyPromise: () => yomitanParserReadyPromise,
setYomitanParserReadyPromise: (promise) => {
yomitanParserReadyPromise = promise;
},
getYomitanParserInitPromise: () => yomitanParserInitPromise,
setYomitanParserInitPromise: (promise) => {
yomitanParserInitPromise = promise;
},
},
logger,
{ forceOverride: true },
);
const noteId = await addYomitanNoteViaSearch(
word!,
{
getYomitanExt: () => yomitanExt,
getYomitanSession: () => yomitanSession,
getYomitanParserWindow: () => yomitanParserWindow,
setYomitanParserWindow: (window) => {
yomitanParserWindow = window;
},
getYomitanParserReadyPromise: () => yomitanParserReadyPromise,
setYomitanParserReadyPromise: (promise) => {
yomitanParserReadyPromise = promise;
},
getYomitanParserInitPromise: () => yomitanParserInitPromise,
setYomitanParserInitPromise: (promise) => {
yomitanParserInitPromise = promise;
},
},
logger,
);
if (typeof noteId !== 'number') {
throw new Error('Yomitan failed to create note.');
}
writeResponse(responsePath, {
ok: true,
noteId,
});
cleanup();
app.exit(0);
} catch (error) {
const message = error instanceof Error ? error.message : String(error);
logger.error('Stats word helper failed', message);
writeResponse(responsePath, {
ok: false,
error: message,
});
cleanup();
app.exit(1);
}
}
void app.whenReady().then(() => main());


@@ -241,6 +241,7 @@ export interface AnkiConnectConfig {
   animatedMaxWidth?: number;
   animatedMaxHeight?: number;
   animatedCrf?: number;
+  syncAnimatedImageToWordAudio?: boolean;
   audioPadding?: number;
   fallbackDuration?: number;
   maxMediaDuration?: number;
@@ -745,6 +746,7 @@ export interface ResolvedConfig {
   animatedMaxWidth: number;
   animatedMaxHeight?: number;
   animatedCrf: number;
+  syncAnimatedImageToWordAudio: boolean;
   audioPadding: number;
   fallbackDuration: number;
   maxMediaDuration: number;


@@ -55,7 +55,9 @@ export function MediaHeader({ detail, initialKnownWordsSummary = null }: MediaHe
         <div className="text-xs text-ctp-overlay2">total watch time</div>
       </div>
       <div>
-        <div className="text-ctp-cards-mined font-medium">{formatNumber(detail.totalCards)}</div>
+        <div className="text-ctp-cards-mined font-medium">
+          {formatNumber(detail.totalCards)}
+        </div>
         <div className="text-xs text-ctp-overlay2">cards mined</div>
       </div>
       <div>


@@ -1,4 +1,8 @@
-import { formatEventSeconds, type SessionChartMarker, type SessionEventNoteInfo } from '../../lib/session-events';
+import {
+  formatEventSeconds,
+  type SessionChartMarker,
+  type SessionEventNoteInfo,
+} from '../../lib/session-events';

 interface SessionEventPopoverProps {
   marker: SessionChartMarker;
@@ -83,7 +87,8 @@ export function SessionEventPopover({
       {marker.kind === 'seek' && (
         <div className="space-y-1 text-xs text-ctp-subtext0">
           <div>
-            From <span className="text-ctp-teal">{formatEventSeconds(marker.fromMs) ?? '\u2014'}</span>{' '}
+            From{' '}
+            <span className="text-ctp-teal">{formatEventSeconds(marker.fromMs) ?? '\u2014'}</span>{' '}
             to <span className="text-ctp-teal">{formatEventSeconds(marker.toMs) ?? '\u2014'}</span>
           </div>
           <div>
@@ -120,7 +125,9 @@ export function SessionEventPopover({
             ) : null}
           </div>
           {info?.expression ? (
-            <div className="mb-1 text-sm font-medium text-ctp-text">{info.expression}</div>
+            <div className="mb-1 text-sm font-medium text-ctp-text">
+              {info.expression}
+            </div>
           ) : null}
           {info?.context ? (
             <div className="mb-1 text-xs text-ctp-subtext0">{info.context}</div>


@@ -177,7 +177,12 @@ export function TrendsTab() {
           color="#8aadf4"
           type="bar"
         />
-        <TrendChart title="Cards Mined" data={data.activity.cards} color={cardsMinedColor} type="bar" />
+        <TrendChart
+          title="Cards Mined"
+          data={data.activity.cards}
+          color={cardsMinedColor}
+          type="bar"
+        />
         <TrendChart title="Tokens Seen" data={data.activity.words} color="#8bd5ca" type="bar" />
         <TrendChart title="Sessions" data={data.activity.sessions} color="#b7bdf8" type="bar" />
@@ -196,7 +201,12 @@ export function TrendsTab() {
           color="#c6a0f6"
           type="line"
         />
-        <TrendChart title="Cards Mined" data={data.progress.cards} color={cardsMinedColor} type="line" />
+        <TrendChart
+          title="Cards Mined"
+          data={data.progress.cards}
+          color={cardsMinedColor}
+          type="line"
+        />
         <TrendChart
           title="Episodes Watched"
           data={data.progress.episodes}


@@ -146,7 +146,9 @@ test('extractSessionEventNoteInfo ignores malformed notes without a numeric note
 });

 test('mergeSessionEventNoteInfos keys previews by both requested and returned note ids', () => {
-  const noteInfos = mergeSessionEventNoteInfos([111], [
+  const noteInfos = mergeSessionEventNoteInfos(
+    [111],
+    [
     {
       noteId: 222,
       fields: {
@@ -154,7 +156,8 @@ test('mergeSessionEventNoteInfos keys previews by both requested and returned no
         Sentence: { value: 'この剣は呪いだ' },
       },
     },
-  ]);
+    ],
+  );

   assert.deepEqual(noteInfos.get(111), {
     noteId: 222,


@@ -237,17 +237,16 @@ export function collectPendingSessionEventNoteIds(
   return next;
 }

-export function getSessionEventCardRequest(
-  marker: SessionChartMarker | null,
-): { noteIds: number[]; requestKey: string | null } {
+export function getSessionEventCardRequest(marker: SessionChartMarker | null): {
+  noteIds: number[];
+  requestKey: string | null;
+} {
   if (!marker || marker.kind !== 'card' || marker.noteIds.length === 0) {
     return { noteIds: [], requestKey: null };
   }
   const noteIds = Array.from(
-    new Set(
-      marker.noteIds.filter((noteId) => Number.isInteger(noteId) && noteId > 0),
-    ),
+    new Set(marker.noteIds.filter((noteId) => Number.isInteger(noteId) && noteId > 0)),
   );
   return {