mirror of
https://github.com/ksyasuda/SubMiner.git
synced 2026-03-20 12:11:28 -07:00
Decouple stats daemon and preserve final mine OSD status
- Run `subminer stats -b` as a dedicated daemon process, independent from the overlay app
- Stop Anki progress spinner before showing final `✓`/`x` mine result so it is not overwritten
- Keep grammar/noise subtitle tokens hoverable while stripping annotation metadata
@@ -0,0 +1,64 @@
---
id: TASK-195
title: Keep final card-mine OSD result from being overwritten by progress spinner
status: Done
assignee:
  - Codex
created_date: '2026-03-18 19:40'
updated_date: '2026-03-18 19:49'
labels:
  - anki
  - ui
  - bug
milestone: m-1
dependencies: []
references:
  - src/anki-integration/ui-feedback.ts
  - src/anki-integration.ts
  - src/anki-integration/card-creation.ts
priority: medium
ordinal: 105610
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->

When a card mine finishes, the mpv OSD currently tries to show the final status text, but the in-flight Anki progress spinner can immediately overwrite it on the next tick. Stop the spinner first, then show a single-line final result with a success/failure marker and the mined-word notification.

<!-- SECTION:DESCRIPTION:END -->
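The intended ordering can be sketched as follows (the state shape and helper names here are illustrative, not SubMiner's actual `ui-feedback` API):

```typescript
// Sketch of the fix: tear the spinner down first, then emit the final
// single-line result, so no later tick can reclaim the OSD line.
// State shape and names are hypothetical, not SubMiner's real API.
interface FeedbackState {
  progressTimer: ReturnType<typeof setInterval> | null;
}

function showFinalResult(
  state: FeedbackState,
  showOsd: (text: string) => void,
  message: string,
  success: boolean,
): void {
  // 1. Stop the spinner before rendering anything.
  if (state.progressTimer !== null) {
    clearInterval(state.progressTimer);
    state.progressTimer = null; // stale ticks can no longer fire
  }
  // 2. Emit one final line with a success/failure marker.
  showOsd(`${success ? '✓' : 'x'} ${message}`);
}
```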
## Acceptance Criteria

<!-- AC:BEGIN -->

- [x] #1 Successful mine/update OSD results render after the spinner is stopped and do not get overwritten by a later spinner tick.
- [x] #2 Failure results that replace the spinner show an `x` marker and stay visible on the same OSD line.
- [x] #3 Regression coverage locks the spinner teardown/result-notification ordering.

<!-- AC:END -->

## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->

1. Add a focused failing regression test around the Anki UI-feedback spinner/result helper.
2. Add a helper that stops progress before emitting the final OSD result line with `✓`/`x`.
3. Route mine/update result notifications through that helper, then run targeted verification.

<!-- SECTION:PLAN:END -->

## Outcome

<!-- SECTION:OUTCOME:BEGIN -->

Added a dedicated Anki UI-feedback result helper that force-clears the in-flight spinner state before emitting the final OSD result line. Successful card-update notifications now render as `✓ Updated card: ...`, and sentence-card creation failures now render as `x Sentence card failed: ...` without a later spinner tick reclaiming the line.

Verification:

- `bun test src/anki-integration/ui-feedback.test.ts`
- `bun test src/anki-integration/ui-feedback.test.ts src/anki-integration/note-update-workflow.test.ts src/anki-integration.test.ts src/core/services/mining.test.ts src/main/runtime/mining-actions.test.ts`
- `bun x prettier --check src/anki-integration/ui-feedback.ts src/anki-integration/ui-feedback.test.ts src/anki-integration.ts src/anki-integration/card-creation.ts "backlog/tasks/task-195 - Keep-final-card-mine-OSD-result-from-being-overwritten-by-progress-spinner.md" changes/2026-03-18-mine-osd-spinner-result.md`
- `bun run changelog:lint`
- `bash .agents/skills/subminer-change-verification/scripts/verify_subminer_change.sh --lane core src/anki-integration/ui-feedback.ts src/anki-integration/ui-feedback.test.ts src/anki-integration.ts src/anki-integration/card-creation.ts changes/2026-03-18-mine-osd-spinner-result.md`
- Verifier artifacts: `.tmp/skill-verification/subminer-verify-20260318-194614-uZMrAx/`

<!-- SECTION:OUTCOME:END -->
4
changes/2026-03-18-mine-osd-spinner-result.md
Normal file
@@ -0,0 +1,4 @@
type: fixed
area: anki

- Fixed card-mine OSD feedback so the final mine result stops the Anki spinner first, then shows a single-line `✓`/`x` status without being overwritten by a later spinner tick.
5
changes/2026-03-18-stats-daemon-decoupling.md
Normal file
@@ -0,0 +1,5 @@
type: fixed
area: stats

- `subminer stats -b` now runs as a standalone background stats daemon instead of reusing the main SubMiner app process, so the overlay app can still be launched separately for normal video watching.
- Dashboard word mining still works against the background daemon by using a short-lived hidden helper for the Yomitan add-note flow.
@@ -1,4 +1,5 @@
 type: changed
 area: overlay

-- Excluded interjections and sound-effect tokens from subtitle annotation styling so they no longer inherit misleading lexical highlight treatment while still remaining visible and non-interactive in the subtitle line.
+- Excluded interjections and sound-effect tokens from subtitle annotation styling so they no longer inherit misleading lexical highlight treatment while still remaining visible and hoverable as plain subtitle tokens.
+- Expanded subtitle annotation noise filtering to also strip annotation metadata from standalone grammar-only helper tokens such as particles, auxiliaries, adnominals, common explanatory endings like `んです` / `のだ`, and merged trailing quote-particle forms like `...って` while keeping them tokenized for hover lookup.
@@ -26,7 +26,7 @@ The same immersion data powers the stats dashboard.

 - In-app overlay: focus the visible overlay, then press the key from `stats.toggleKey` (default: `` ` `` / `Backquote`).
 - Launcher command: run `subminer stats` to start the local stats server on demand and open the dashboard in your browser.
-- Background server: run `subminer stats -b` to start or reuse a dedicated background stats server without keeping the launcher attached, and `subminer stats -s` to stop that background server.
+- Background server: run `subminer stats -b` to start or reuse a dedicated background stats daemon without keeping the launcher attached, and `subminer stats -s` to stop that daemon.
 - Maintenance command: run `subminer stats cleanup` or `subminer stats cleanup -v` to backfill/repair vocabulary metadata (`headword`, `reading`, POS) and purge stale or excluded rows from `imm_words` on demand.
 - Browser page: open `http://127.0.0.1:5175` directly if the local stats server is already running.

@@ -80,8 +80,9 @@ Stats server config lives under `stats`:
 - `autoStartServer` starts the local stats HTTP server on launch once immersion tracking is active, or reuses the dedicated background stats server when one is already running.
 - `autoOpenBrowser` controls whether `subminer stats` launches the dashboard URL in your browser after ensuring the server is running.
 - `subminer stats` forces the dashboard server to start even when `autoStartServer` is `false`.
-- `subminer stats -b` starts or reuses the dedicated background stats server and exits after startup acknowledgement.
-- `subminer stats -s` stops the dedicated background stats server without closing any browser tabs.
+- `subminer stats -b` starts or reuses the dedicated background stats daemon and exits after startup acknowledgement.
+- The background stats daemon is separate from the normal SubMiner overlay app, so you can leave it running and still launch SubMiner later to watch or mine from video.
+- `subminer stats -s` stops the dedicated background stats daemon without closing any browser tabs.
 - `subminer stats` fails with an error when `immersionTracking.enabled` is `false`.
 - `subminer stats cleanup` defaults to vocabulary cleanup, repairs stale `headword`, `reading`, and `part_of_speech` values, attempts best-effort MeCab backfill for legacy rows, and removes rows that still fail vocab filtering.

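For reference, a minimal config fragment exercising these options might look like the following (the surrounding file format and exact key placement are assumptions; only the option names come from the documentation above):

```json
{
  "stats": {
    "autoStartServer": true,
    "autoOpenBrowser": true
  },
  "immersionTracking": {
    "enabled": true
  }
}
```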
@@ -89,7 +90,7 @@ Stats server config lives under `stats`:

 The Vocabulary tab's word detail panel shows example lines from your viewing history. Each example line with a valid source file offers three mining buttons:

-- **Mine Word** — performs a full Yomitan dictionary lookup for the word (definition, reading, pitch accent, etc.) via the hidden search page, then enriches the card with sentence audio, a screenshot or animated AVIF clip, the highlighted sentence, and metadata extracted from the source video file. Requires Anki and Yomitan dictionaries to be loaded.
+- **Mine Word** — performs a full Yomitan dictionary lookup for the word (definition, reading, pitch accent, etc.) via a short-lived hidden helper, then enriches the card with sentence audio, a screenshot or animated AVIF clip, the highlighted sentence, and metadata extracted from the source video file. Requires Anki and Yomitan dictionaries to be loaded.
 - **Mine Sentence** — creates a sentence card directly with the `IsSentenceCard` flag set (for Lapis/Kiku workflows), along with audio, image, and translation from the secondary subtitle if available.
 - **Mine Audio** — creates an audio-only card with the `IsAudioCard` flag, attaching only the sentence audio clip.

Binary file not shown.
Before Width: | Height: | Size: 38 KiB After Width: | Height: | Size: 42 KiB |
@@ -4,6 +4,8 @@ SubMiner annotates subtitle tokens in real time as they appear in the overlay. F

 All four are opt-in and configured under `subtitleStyle`, `ankiConnect.knownWords`, and `ankiConnect.nPlusOne` in your config. They apply independently — you can enable any combination.

+Before any of those layers render, SubMiner strips annotation metadata from tokens that are usually just subtitle glue or annotation noise. Standalone particles, auxiliaries, adnominals, common explanatory endings like `んです` / `のだ`, merged trailing quote-particle forms like `...って`, repeated kana interjections, and similar non-lexical helper tokens remain hoverable in the subtitle text, but they render as plain tokens without known-word, N+1, frequency, JLPT, or name-match annotation styling.
+
 ## N+1 Word Highlighting

 N+1 highlighting identifies sentences where you know every word except one, making them ideal mining targets. When enabled, SubMiner builds a local cache of your known vocabulary from Anki and highlights tokens accordingly.
@@ -80,6 +82,10 @@ When `sourcePath` is omitted, SubMiner searches default install/runtime location
 Frequency highlighting skips tokens that look like non-lexical noise (kana reduplication, short kana endings like `っ`), even when dictionary ranks exist.
 :::

+::: info
+Frequency, JLPT, and N+1 metadata are only shown for tokens that survive the subtitle-annotation noise filter. Standalone grammar tokens like `は`, `です`, and `この` are intentionally left unannotated even if a dictionary can assign them metadata.
+:::
+
 ## JLPT Tagging

 JLPT tagging adds colored underlines to tokens based on their JLPT level (N1–N5), giving you an at-a-glance sense of difficulty distribution in each subtitle line.
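The noise filter described above can be sketched as a simple predicate plus a metadata-stripping pass. Everything here (type shape, part-of-speech labels, heuristics) is illustrative, not SubMiner's real implementation:

```typescript
// Hypothetical sketch of the subtitle-annotation noise filter: grammar-only
// helper tokens keep their place in the line (so they stay hoverable) but
// lose all annotation metadata. Names and POS labels are assumptions.
interface SubtitleToken {
  surface: string;
  pos: string; // part of speech from the tokenizer
  annotations?: { frequencyRank?: number; jlptLevel?: number };
}

const NOISE_POS = new Set(['particle', 'auxiliary', 'adnominal', 'interjection']);
const NOISE_SURFACES = new Set(['んです', 'のだ']);

function isAnnotationNoise(token: SubtitleToken): boolean {
  if (NOISE_POS.has(token.pos)) return true;
  if (NOISE_SURFACES.has(token.surface)) return true;
  // merged trailing quote-particle forms such as 「...って」(hypothetical heuristic)
  if (/って$/.test(token.surface) && token.surface.length <= 4) return true;
  return false;
}

// Strip annotation metadata but keep the token itself for hover lookup.
function stripNoiseAnnotations(tokens: SubtitleToken[]): SubtitleToken[] {
  return tokens.map((t) => (isAnnotationNoise(t) ? { ...t, annotations: undefined } : t));
}
```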
@@ -150,16 +150,17 @@ test('stats command launches attached app command with response path', async ()
   assert.equal(handled, true);
   assert.deepEqual(forwarded, [
     [
-      '--stats',
+      '--stats-daemon-start',
       '--stats-response-path',
       '/tmp/subminer-stats-test/response.json',
+      '--stats-daemon-open-browser',
       '--log-level',
       'debug',
     ],
   ]);
 });

-test('stats background command launches detached app command with response path', async () => {
+test('stats background command launches attached daemon control command with response path', async () => {
   const context = createContext();
   context.args.stats = true;
   (context.args as typeof context.args & { statsBackground?: boolean }).statsBackground = true;
@@ -168,11 +169,9 @@ test('stats background command launches detached app command with response path'
   const handled = await runStatsCommand(context, {
     createTempDir: () => '/tmp/subminer-stats-test',
     joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async () => {
-      throw new Error('attached path should not run for stats -b');
-    },
-    launchAppCommandDetached: (_appPath, appArgs) => {
+    runAppCommandAttached: async (_appPath, appArgs) => {
       forwarded.push(appArgs);
+      return 0;
     },
     waitForStatsResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
     removeDir: () => {},
@@ -181,10 +180,9 @@ test('stats background command launches detached app command with response path'
   assert.equal(handled, true);
   assert.deepEqual(forwarded, [
     [
-      '--stats',
+      '--stats-daemon-start',
       '--stats-response-path',
       '/tmp/subminer-stats-test/response.json',
-      '--stats-background',
     ],
   ]);
 });
@@ -215,7 +213,12 @@ test('stats command returns after startup response even if app process stays run
   const final = await statsCommand;
   assert.equal(final, true);
   assert.deepEqual(forwarded, [
-    ['--stats', '--stats-response-path', '/tmp/subminer-stats-test/response.json'],
+    [
+      '--stats-daemon-start',
+      '--stats-response-path',
+      '/tmp/subminer-stats-test/response.json',
+      '--stats-daemon-open-browser',
+    ],
   ]);
 });

@@ -268,7 +271,11 @@ test('stats stop command forwards stop flag to the app', async () => {

   assert.equal(handled, true);
   assert.deepEqual(forwarded, [
-    ['--stats', '--stats-response-path', '/tmp/subminer-stats-test/response.json', '--stats-stop'],
+    [
+      '--stats-daemon-stop',
+      '--stats-response-path',
+      '/tmp/subminer-stats-test/response.json',
+    ],
   ]);
 });

@@ -1,7 +1,7 @@
 import fs from 'node:fs';
 import os from 'node:os';
 import path from 'node:path';
-import { launchAppCommandDetached, runAppCommandAttached } from '../mpv.js';
+import { runAppCommandAttached } from '../mpv.js';
 import { sleep } from '../util.js';
 import type { LauncherCommandContext } from './context.js';

@@ -20,12 +20,6 @@ type StatsCommandDeps = {
     logLevel: LauncherCommandContext['args']['logLevel'],
     label: string,
   ) => Promise<number>;
-  launchAppCommandDetached: (
-    appPath: string,
-    appArgs: string[],
-    logLevel: LauncherCommandContext['args']['logLevel'],
-    label: string,
-  ) => void;
   waitForStatsResponse: (responsePath: string) => Promise<StatsCommandResponse>;
   removeDir: (targetPath: string) => void;
 };
@@ -37,8 +31,6 @@ const defaultDeps: StatsCommandDeps = {
   joinPath: (...parts) => path.join(...parts),
   runAppCommandAttached: (appPath, appArgs, logLevel, label) =>
     runAppCommandAttached(appPath, appArgs, logLevel, label),
-  launchAppCommandDetached: (appPath, appArgs, logLevel, label) =>
-    launchAppCommandDetached(appPath, appArgs, logLevel, label),
   waitForStatsResponse: async (responsePath) => {
     const deadline = Date.now() + STATS_STARTUP_RESPONSE_TIMEOUT_MS;
     while (Date.now() < deadline) {
@@ -75,12 +67,15 @@ export async function runStatsCommand(
   const responsePath = resolvedDeps.joinPath(tempDir, 'response.json');

   try {
-    const forwarded = ['--stats', '--stats-response-path', responsePath];
-    if (args.statsBackground) {
-      forwarded.push('--stats-background');
-    }
-    if (args.statsStop) {
-      forwarded.push('--stats-stop');
+    const forwarded = args.statsCleanup
+      ? ['--stats', '--stats-response-path', responsePath]
+      : [
+          args.statsStop ? '--stats-daemon-stop' : '--stats-daemon-start',
+          '--stats-response-path',
+          responsePath,
+        ];
+    if (!args.statsCleanup && !args.statsBackground && !args.statsStop) {
+      forwarded.push('--stats-daemon-open-browser');
     }
     if (args.statsCleanup) {
       forwarded.push('--stats-cleanup');
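The forwarding rules in the hunk above map each CLI mode to a distinct argument set. A standalone re-implementation sketch (logic and flag names taken directly from the diff; the `StatsArgs` shape is an assumption) makes the four modes explicit:

```typescript
// Sketch of the new launcher forwarding rules: cleanup keeps the legacy
// --stats entry point, every other mode addresses the daemon directly, and
// only the plain `subminer stats` mode asks the daemon to open a browser.
interface StatsArgs {
  statsCleanup?: boolean;
  statsBackground?: boolean;
  statsStop?: boolean;
}

function buildForwardedArgs(args: StatsArgs, responsePath: string): string[] {
  const forwarded = args.statsCleanup
    ? ['--stats', '--stats-response-path', responsePath]
    : [
        args.statsStop ? '--stats-daemon-stop' : '--stats-daemon-start',
        '--stats-response-path',
        responsePath,
      ];
  if (!args.statsCleanup && !args.statsBackground && !args.statsStop) {
    forwarded.push('--stats-daemon-open-browser');
  }
  return forwarded;
}
```

So `subminer stats` forwards `--stats-daemon-start ... --stats-daemon-open-browser`, `-b` forwards `--stats-daemon-start` without the browser flag, and `-s` forwards `--stats-daemon-stop`, matching the expectations in the updated tests.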
@@ -94,14 +89,6 @@ export async function runStatsCommand(
     if (args.logLevel !== 'info') {
       forwarded.push('--log-level', args.logLevel);
     }
-    if (args.statsBackground) {
-      resolvedDeps.launchAppCommandDetached(appPath, forwarded, args.logLevel, 'stats');
-      const startupResult = await resolvedDeps.waitForStatsResponse(responsePath);
-      if (!startupResult.ok) {
-        throw new Error(startupResult.error || 'Stats dashboard failed to start.');
-      }
-      return true;
-    }
     const attachedExitPromise = resolvedDeps.runAppCommandAttached(
       appPath,
       forwarded,
@@ -276,6 +276,16 @@ export function parseCliPrograms(
   if (statsBackground && statsStop) {
     throw new Error('Stats background and stop flags cannot be combined.');
   }
+  if (
+    normalizedAction &&
+    normalizedAction !== 'cleanup' &&
+    normalizedAction !== 'rebuild' &&
+    normalizedAction !== 'backfill'
+  ) {
+    throw new Error(
+      'Invalid stats action. Valid values are cleanup, rebuild, or backfill.',
+    );
+  }
   if (normalizedAction && (statsBackground || statsStop)) {
     throw new Error('Stats background and stop flags cannot be combined with stats actions.');
   }
@@ -536,7 +536,7 @@ exit 0
       assert.equal(result.status, 0, `stdout:\n${result.stdout}\nstderr:\n${result.stderr}`);
       assert.match(
         fs.readFileSync(capturePath, 'utf8'),
-        /^--stats\n--stats-response-path\n.+\n--log-level\ndebug\n$/,
+        /^--stats-daemon-start\n--stats-response-path\n.+\n--stats-daemon-open-browser\n--log-level\ndebug\n$/,
       );
     });
   },
@@ -45,6 +45,8 @@ export function parseMpvArgString(input: string): string[] {
   let inSingleQuote = false;
   let inDoubleQuote = false;
   let escaping = false;
+  const canEscape = (nextChar: string | undefined): boolean =>
+    nextChar === undefined || nextChar === '"' || nextChar === "'" || nextChar === '\\' || /\s/.test(nextChar);

   for (let i = 0; i < chars.length; i += 1) {
     const ch = chars[i] || '';
@@ -65,7 +67,11 @@ export function parseMpvArgString(input: string): string[] {

     if (inDoubleQuote) {
       if (ch === '\\') {
+        if (canEscape(chars[i + 1])) {
           escaping = true;
+        } else {
+          current += ch;
+        }
         continue;
       }
       if (ch === '"') {
@@ -77,7 +83,11 @@ export function parseMpvArgString(input: string): string[] {
     }

     if (ch === '\\') {
+      if (canEscape(chars[i + 1])) {
         escaping = true;
+      } else {
+        current += ch;
+      }
       continue;
     }
     if (ch === "'") {
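The effect of the `canEscape` guard above: a backslash now only acts as an escape when the next character is a quote, another backslash, whitespace, or end of input; any other backslash stays literal (so e.g. Windows-style paths survive). A self-contained sketch of that rule (simplified to a single unescaping pass, not SubMiner's full tokenizer):

```typescript
// Standalone sketch of the backslash rule: escape only quotes, backslashes,
// whitespace, or end of input; keep every other backslash as-is.
const canEscape = (next: string | undefined): boolean =>
  next === undefined || next === '"' || next === "'" || next === '\\' || /\s/.test(next);

function unescapeToken(token: string): string {
  let out = '';
  for (let i = 0; i < token.length; i += 1) {
    const ch = token[i];
    if (ch === '\\' && token[i + 1] !== undefined && canEscape(token[i + 1])) {
      out += token[i + 1]; // consume the escape, emit the escaped character
      i += 1;
    } else {
      out += ch; // literal backslash (e.g. in a Windows path) survives
    }
  }
  return out;
}
```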
@@ -857,8 +867,14 @@ export function runAppCommandAttached(
     proc.once('error', (error) => {
       reject(error);
     });
-    proc.once('exit', (code) => {
-      resolve(code ?? 0);
+    proc.once('exit', (code, signal) => {
+      if (code !== null) {
+        resolve(code);
+      } else if (signal) {
+        resolve(128);
+      } else {
+        resolve(0);
+      }
     });
   });
 }
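This change relies on the `exit` event contract of Node's `child_process`: `code` is `null` when the child was killed by a signal, in which case the second argument carries the signal name. A minimal standalone illustration (the `128` fallback mirrors the patch above; the helper name is ours):

```typescript
import { spawn } from 'node:child_process';

// Resolve a child's numeric exit status: real exit code when one exists,
// 128 as a catch-all for signal-terminated children, 0 otherwise.
function exitCodeOf(command: string, args: string[]): Promise<number> {
  return new Promise((resolve, reject) => {
    const proc = spawn(command, args);
    proc.once('error', reject);
    proc.once('exit', (code, signal) => {
      if (code !== null) resolve(code); // normal exit
      else if (signal) resolve(128); // killed by a signal, code is null
      else resolve(0);
    });
  });
}
```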
@@ -40,8 +40,10 @@ import { createLogger } from './logger';
 import {
   createUiFeedbackState,
   beginUpdateProgress,
+  clearUpdateProgress,
   endUpdateProgress,
   showStatusNotification,
+  showUpdateResult,
   withUpdateProgress,
   UiFeedbackState,
 } from './anki-integration/ui-feedback';
@@ -310,6 +312,8 @@ export class AnkiIntegration {
         ),
       },
       showOsdNotification: (text: string) => this.showOsdNotification(text),
+      showUpdateResult: (message: string, success: boolean) =>
+        this.showUpdateResult(message, success),
       showStatusNotification: (message: string) => this.showStatusNotification(message),
       showNotification: (noteId, label, errorSuffix) =>
         this.showNotification(noteId, label, errorSuffix),
@@ -773,6 +777,12 @@ export class AnkiIntegration {
     });
   }

+  private clearUpdateProgress(): void {
+    clearUpdateProgress(this.uiFeedbackState, (timer) => {
+      clearInterval(timer);
+    });
+  }
+
   private async withUpdateProgress<T>(
     initialMessage: string,
     action: () => Promise<T>,
@@ -903,7 +913,9 @@ export class AnkiIntegration {
     const type = this.config.behavior?.notificationType || 'osd';

     if (type === 'osd' || type === 'both') {
-      this.showOsdNotification(message);
+      this.showUpdateResult(message, true);
+    } else {
+      this.clearUpdateProgress();
     }

     if ((type === 'system' || type === 'both') && this.notificationCallback) {
@@ -938,6 +950,21 @@ export class AnkiIntegration {
     }
   }

+  private showUpdateResult(message: string, success: boolean): void {
+    showUpdateResult(
+      this.uiFeedbackState,
+      {
+        clearProgressTimer: (timer) => {
+          clearInterval(timer);
+        },
+        showOsdNotification: (text) => {
+          this.showOsdNotification(text);
+        },
+      },
+      { message, success },
+    );
+  }
+
   private mergeFieldValue(existing: string, newValue: string, overwrite: boolean): string {
     if (overwrite || !existing.trim()) {
       return newValue;
@@ -75,6 +75,7 @@ interface CardCreationDeps {
   client: CardCreationClient;
   mediaGenerator: CardCreationMediaGenerator;
   showOsdNotification: (text: string) => void;
+  showUpdateResult: (message: string, success: boolean) => void;
   showStatusNotification: (message: string) => void;
   showNotification: (noteId: number, label: string | number, errorSuffix?: string) => Promise<void>;
   beginUpdateProgress: (initialMessage: string) => void;
@@ -261,8 +262,7 @@ export class CardCreationService {

     if (this.deps.getConfig().media?.generateImage) {
       try {
-        const animatedLeadInSeconds =
-          await this.deps.getAnimatedImageLeadInSeconds(noteInfo);
+        const animatedLeadInSeconds = await this.deps.getAnimatedImageLeadInSeconds(noteInfo);
         const imageFilename = this.generateImageFilename();
         const imageBuffer = await this.generateImageBuffer(
           mpvClient.currentVideoPath,
@@ -420,8 +420,7 @@ export class CardCreationService {

     if (this.deps.getConfig().media?.generateImage) {
       try {
-        const animatedLeadInSeconds =
-          await this.deps.getAnimatedImageLeadInSeconds(noteInfo);
+        const animatedLeadInSeconds = await this.deps.getAnimatedImageLeadInSeconds(noteInfo);
         const imageFilename = this.generateImageFilename();
         const imageBuffer = await this.generateImageBuffer(
           mpvClient.currentVideoPath,
@@ -554,7 +553,7 @@ export class CardCreationService {
       this.deps.trackLastAddedNoteId?.(noteId);
     } catch (error) {
       log.error('Failed to create sentence card:', (error as Error).message);
-      this.deps.showOsdNotification(`Sentence card failed: ${(error as Error).message}`);
+      this.deps.showUpdateResult(`Sentence card failed: ${(error as Error).message}`, false);
       return false;
     }

@@ -651,7 +650,7 @@ export class CardCreationService {
|
|||||||
});
|
});
|
||||||
} catch (error) {
|
} catch (error) {
|
||||||
log.error('Error creating sentence card:', (error as Error).message);
|
log.error('Error creating sentence card:', (error as Error).message);
|
||||||
this.deps.showOsdNotification(`Sentence card failed: ${(error as Error).message}`);
|
this.deps.showUpdateResult(`Sentence card failed: ${(error as Error).message}`, false);
|
||||||
return false;
|
return false;
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|||||||
src/anki-integration/ui-feedback.test.ts (new file, 67 lines)
@@ -0,0 +1,67 @@
+import test from 'node:test';
+import assert from 'node:assert/strict';
+import {
+  beginUpdateProgress,
+  createUiFeedbackState,
+  showProgressTick,
+  showUpdateResult,
+} from './ui-feedback';
+
+test('showUpdateResult stops spinner before success notification and suppresses stale ticks', () => {
+  const state = createUiFeedbackState();
+  const osdMessages: string[] = [];
+
+  beginUpdateProgress(state, 'Creating sentence card', () => {
+    showProgressTick(state, (text) => {
+      osdMessages.push(text);
+    });
+  });
+
+  showUpdateResult(
+    state,
+    {
+      clearProgressTimer: (timer) => {
+        clearInterval(timer);
+      },
+      showOsdNotification: (text) => {
+        osdMessages.push(text);
+      },
+    },
+    { success: true, message: 'Updated card: taberu' },
+  );
+
+  showProgressTick(state, (text) => {
+    osdMessages.push(text);
+  });
+
+  assert.deepEqual(osdMessages, ['Creating sentence card |', '✓ Updated card: taberu']);
+});
+
+test('showUpdateResult renders failed updates with an x marker', () => {
+  const state = createUiFeedbackState();
+  const osdMessages: string[] = [];
+
+  beginUpdateProgress(state, 'Creating sentence card', () => {
+    showProgressTick(state, (text) => {
+      osdMessages.push(text);
+    });
+  });
+
+  showUpdateResult(
+    state,
+    {
+      clearProgressTimer: (timer) => {
+        clearInterval(timer);
+      },
+      showOsdNotification: (text) => {
+        osdMessages.push(text);
+      },
+    },
+    { success: false, message: 'Sentence card failed: deck missing' },
+  );
+
+  assert.deepEqual(osdMessages, [
+    'Creating sentence card |',
+    'x Sentence card failed: deck missing',
+  ]);
+});
@@ -7,6 +7,11 @@ export interface UiFeedbackState {
   progressFrame: number;
 }
 
+export interface UiFeedbackResult {
+  success: boolean;
+  message: string;
+}
+
 export interface UiFeedbackNotificationContext {
   getNotificationType: () => string | undefined;
   showOsd: (text: string) => void;
@@ -66,6 +71,15 @@ export function endUpdateProgress(
   state.progressDepth = Math.max(0, state.progressDepth - 1);
   if (state.progressDepth > 0) return;
 
+  clearUpdateProgress(state, clearProgressTimer);
+}
+
+export function clearUpdateProgress(
+  state: UiFeedbackState,
+  clearProgressTimer: (timer: ReturnType<typeof setInterval>) => void,
+): void {
+  state.progressDepth = 0;
+
   if (state.progressTimer) {
     clearProgressTimer(state.progressTimer);
     state.progressTimer = null;
@@ -85,6 +99,19 @@ export function showProgressTick(
   showOsdNotification(`${state.progressMessage} ${frame}`);
 }
 
+export function showUpdateResult(
+  state: UiFeedbackState,
+  options: {
+    clearProgressTimer: (timer: ReturnType<typeof setInterval>) => void;
+    showOsdNotification: (text: string) => void;
+  },
+  result: UiFeedbackResult,
+): void {
+  clearUpdateProgress(state, options.clearProgressTimer);
+  const prefix = result.success ? '✓' : 'x';
+  options.showOsdNotification(`${prefix} ${result.message}`);
+}
+
 export async function withUpdateProgress<T>(
   state: UiFeedbackState,
   options: UiFeedbackOptions,
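The teardown ordering in `showUpdateResult` is the core of this commit: clear the interval and null the timer handle before emitting the final line, so a queued spinner tick cannot repaint the OSD over the result. A reduced, self-contained sketch of that ordering (the `State` shape and frame list here are trimmed stand-ins, not the repo's real `UiFeedbackState`):

```typescript
// Minimal stand-ins for the ui-feedback state machine shown in this diff.
interface State {
  progressTimer: ReturnType<typeof setInterval> | null;
  progressMessage: string;
  progressFrame: number;
}

const frames = ['|', '/', '-', '\\'];
const osd: string[] = [];

function tick(state: State): void {
  // A tick arriving after teardown must not repaint over the final line.
  if (state.progressTimer === null) return;
  osd.push(`${state.progressMessage} ${frames[state.progressFrame++ % frames.length]}`);
}

function showResult(state: State, success: boolean, message: string): void {
  // Teardown first: once the handle is nulled, stale ticks become no-ops.
  if (state.progressTimer) {
    clearInterval(state.progressTimer);
    state.progressTimer = null;
  }
  osd.push(`${success ? '✓' : 'x'} ${message}`);
}

const state: State = {
  progressTimer: null,
  progressMessage: 'Creating sentence card',
  progressFrame: 0,
};
state.progressTimer = setInterval(() => tick(state), 1_000_000);

tick(state); // spinner renders once
showResult(state, true, 'Updated card: taberu');
tick(state); // stale tick: suppressed by the nulled handle
console.log(osd.join('\n'));
```

Flipping the two steps in `showResult` reproduces the original bug: the stale tick lands after the `✓` line.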
@@ -130,6 +130,30 @@ test('serializeSubtitleMarkup preserves tooltip attrs and name-match precedence'
   assert.doesNotMatch(markup, /data-frequency-rank="12"|data-jlpt-level="N5"|word-jlpt-n5/);
 });
 
+test('serializeSubtitleMarkup keeps filtered tokens hoverable without annotation attrs', () => {
+  const payload: SubtitleData = {
+    text: 'は',
+    tokens: [
+      {
+        surface: 'は',
+        reading: 'は',
+        headword: 'は',
+        startPos: 0,
+        endPos: 1,
+        partOfSpeech: PartOfSpeech.particle,
+        pos1: '助詞',
+        isMerged: false,
+        isKnown: false,
+        isNPlusOneTarget: false,
+        isNameMatch: false,
+      },
+    ],
+  };
+
+  const markup = serializeSubtitleMarkup(payload, frequencyOptions);
+  assert.equal(markup, '<span class="word" data-reading="は" data-headword="は">は</span>');
+});
+
 test('serializeSubtitleWebsocketMessage emits sentence payload', () => {
   const payload: SubtitleData = {
     text: '字幕',
@@ -1305,7 +1305,7 @@ test('tokenizeSubtitle ignores frequency lookup failures', async () => {
   assert.equal(result.tokens?.[0]?.frequencyRank, undefined);
 });
 
-test('tokenizeSubtitle skips frequency rank when Yomitan token is enriched as particle by mecab pos1', async () => {
+test('tokenizeSubtitle keeps standalone particle token hoverable while clearing annotation metadata', async () => {
   const result = await tokenizeSubtitle(
     'は',
     makeDeps({
@@ -1350,9 +1350,33 @@ test('tokenizeSubtitle skips frequency rank when Yomitan token is enriched as pa
     }),
   );
 
-  assert.equal(result.tokens?.length, 1);
-  assert.equal(result.tokens?.[0]?.pos1, '助詞');
-  assert.equal(result.tokens?.[0]?.frequencyRank, undefined);
+  assert.equal(result.text, 'は');
+  assert.deepEqual(
+    result.tokens?.map((token) => ({
+      surface: token.surface,
+      reading: token.reading,
+      headword: token.headword,
+      pos1: token.pos1,
+      isKnown: token.isKnown,
+      isNPlusOneTarget: token.isNPlusOneTarget,
+      isNameMatch: token.isNameMatch,
+      jlptLevel: token.jlptLevel,
+      frequencyRank: token.frequencyRank,
+    })),
+    [
+      {
+        surface: 'は',
+        reading: 'は',
+        headword: 'は',
+        pos1: '助詞',
+        isKnown: false,
+        isNPlusOneTarget: false,
+        isNameMatch: false,
+        jlptLevel: undefined,
+        frequencyRank: undefined,
+      },
+    ],
+  );
 });
 
 test('tokenizeSubtitle keeps frequency rank when mecab tags classify token as content-bearing', async () => {
@@ -1460,7 +1484,7 @@ test('tokenizeSubtitle skips JLPT level for excluded demonstratives', async () =
   assert.equal(result.tokens?.[0]?.jlptLevel, undefined);
 });
 
-test('tokenizeSubtitle excludes repeated kana interjections from annotation payloads entirely', async () => {
+test('tokenizeSubtitle keeps repeated kana interjections tokenized while clearing annotation metadata', async () => {
   const result = await tokenizeSubtitle(
     'ああ',
     makeDeps({
@@ -1491,7 +1515,29 @@ test('tokenizeSubtitle excludes repeated kana interjections from annotation payl
     }),
   );
 
-  assert.deepEqual(result, { text: 'ああ', tokens: null });
+  assert.equal(result.text, 'ああ');
+  assert.deepEqual(
+    result.tokens?.map((token) => ({
+      surface: token.surface,
+      headword: token.headword,
+      reading: token.reading,
+      jlptLevel: token.jlptLevel,
+      frequencyRank: token.frequencyRank,
+      isKnown: token.isKnown,
+      isNPlusOneTarget: token.isNPlusOneTarget,
+    })),
+    [
+      {
+        surface: 'ああ',
+        headword: 'ああ',
+        reading: 'ああ',
+        jlptLevel: undefined,
+        frequencyRank: undefined,
+        isKnown: false,
+        isNPlusOneTarget: false,
+      },
+    ],
+  );
 });
 
 test('tokenizeSubtitle assigns JLPT level to Yomitan tokens', async () => {
@@ -2578,7 +2624,15 @@ test('tokenizeSubtitle keeps correct MeCab pos1 enrichment when Yomitan offsets
   const gaToken = result.tokens?.find((token) => token.surface === 'が');
   const desuToken = result.tokens?.find((token) => token.surface === 'です');
   assert.equal(gaToken?.pos1, '助詞');
+  assert.equal(gaToken?.isKnown, false);
+  assert.equal(gaToken?.isNPlusOneTarget, false);
+  assert.equal(gaToken?.jlptLevel, undefined);
+  assert.equal(gaToken?.frequencyRank, undefined);
   assert.equal(desuToken?.pos1, '助動詞');
+  assert.equal(desuToken?.isKnown, false);
+  assert.equal(desuToken?.isNPlusOneTarget, false);
+  assert.equal(desuToken?.jlptLevel, undefined);
+  assert.equal(desuToken?.frequencyRank, undefined);
   assert.equal(targets.length, 1);
   assert.equal(targets[0]?.surface, '仮面');
 });
@@ -3056,7 +3110,7 @@ test('tokenizeSubtitle excludes default non-independent pos2 from N+1 and freque
   assert.equal(result.tokens?.[0]?.isNPlusOneTarget, false);
 });
 
-test('tokenizeSubtitle excludes mecab-tagged interjections from annotation payloads entirely', async () => {
+test('tokenizeSubtitle keeps mecab-tagged interjections tokenized while clearing annotation metadata', async () => {
   const result = await tokenizeSubtitle(
     'ぐはっ',
     makeDepsFromYomitanTokens([{ surface: 'ぐはっ', reading: 'ぐはっ', headword: 'ぐはっ' }], {
@@ -3080,10 +3134,34 @@ test('tokenizeSubtitle excludes mecab-tagged interjections from annotation paylo
     }),
   );
 
-  assert.deepEqual(result, { text: 'ぐはっ', tokens: null });
+  assert.equal(result.text, 'ぐはっ');
+  assert.deepEqual(
+    result.tokens?.map((token) => ({
+      surface: token.surface,
+      headword: token.headword,
+      reading: token.reading,
+      pos1: token.pos1,
+      jlptLevel: token.jlptLevel,
+      frequencyRank: token.frequencyRank,
+      isKnown: token.isKnown,
+      isNPlusOneTarget: token.isNPlusOneTarget,
+    })),
+    [
+      {
+        surface: 'ぐはっ',
+        headword: 'ぐはっ',
+        reading: 'ぐはっ',
+        pos1: '感動詞',
+        jlptLevel: undefined,
+        frequencyRank: undefined,
+        isKnown: false,
+        isNPlusOneTarget: false,
+      },
+    ],
+  );
 });
 
-test('tokenizeSubtitle keeps visible text while excluding interjections from mixed annotation payloads', async () => {
+test('tokenizeSubtitle keeps excluded interjections hoverable while clearing only their annotation metadata', async () => {
   const result = await tokenizeSubtitle(
     'ぐはっ 猫',
     makeDeps({
@@ -3147,8 +3225,261 @@ test('tokenizeSubtitle keeps visible text while excluding interjections from mix
     result.tokens?.map((token) => ({
       surface: token.surface,
       headword: token.headword,
+      frequencyRank: token.frequencyRank,
+      jlptLevel: token.jlptLevel,
     })),
-    [{ surface: '猫', headword: '猫' }],
+    [
+      { surface: 'ぐはっ', headword: 'ぐはっ', frequencyRank: undefined, jlptLevel: undefined },
+      { surface: '猫', headword: '猫', frequencyRank: 11, jlptLevel: 'N5' },
+    ],
+  );
+});
+
+test('tokenizeSubtitle keeps explanatory ending variants hoverable while clearing only their annotation metadata', async () => {
+  const result = await tokenizeSubtitle(
+    '猫んです',
+    makeDepsFromYomitanTokens(
+      [
+        { surface: '猫', reading: 'ねこ', headword: '猫' },
+        { surface: 'んです', reading: 'んです', headword: 'ん' },
+      ],
+      {
+        getFrequencyDictionaryEnabled: () => true,
+        getFrequencyRank: (text) => (text === '猫' ? 11 : 500),
+        getJlptLevel: (text) => (text === '猫' ? 'N5' : null),
+        tokenizeWithMecab: async () => [
+          {
+            headword: '猫',
+            surface: '猫',
+            reading: 'ネコ',
+            startPos: 0,
+            endPos: 1,
+            partOfSpeech: PartOfSpeech.noun,
+            pos1: '名詞',
+            pos2: '一般',
+            isMerged: false,
+            isKnown: false,
+            isNPlusOneTarget: false,
+          },
+          {
+            headword: 'ん',
+            surface: 'ん',
+            reading: 'ン',
+            startPos: 1,
+            endPos: 2,
+            partOfSpeech: PartOfSpeech.other,
+            pos1: '名詞',
+            pos2: '非自立',
+            isMerged: false,
+            isKnown: false,
+            isNPlusOneTarget: false,
+          },
+          {
+            headword: 'です',
+            surface: 'です',
+            reading: 'デス',
+            startPos: 2,
+            endPos: 4,
+            partOfSpeech: PartOfSpeech.bound_auxiliary,
+            pos1: '助動詞',
+            isMerged: false,
+            isKnown: false,
+            isNPlusOneTarget: false,
+          },
+        ],
+      },
+    ),
+  );
+
+  assert.equal(result.text, '猫んです');
+  assert.deepEqual(
+    result.tokens?.map((token) => ({
+      surface: token.surface,
+      headword: token.headword,
+      jlptLevel: token.jlptLevel,
+      frequencyRank: token.frequencyRank,
+    })),
+    [
+      { surface: '猫', headword: '猫', jlptLevel: 'N5', frequencyRank: 11 },
+      { surface: 'んです', headword: 'ん', jlptLevel: undefined, frequencyRank: undefined },
+    ],
+  );
+});
+
+test('tokenizeSubtitle keeps standalone grammar-only tokens hoverable while clearing only their annotation metadata', async () => {
+  const result = await tokenizeSubtitle(
+    '私はこの猫です',
+    makeDeps({
+      getFrequencyDictionaryEnabled: () => true,
+      getFrequencyRank: (text) => (text === '私' ? 50 : text === '猫' ? 11 : 500),
+      getJlptLevel: (text) => (text === '私' ? 'N5' : text === '猫' ? 'N5' : null),
+      getYomitanExt: () => ({ id: 'dummy-ext' }) as any,
+      getYomitanParserWindow: () =>
+        ({
+          isDestroyed: () => false,
+          webContents: {
+            executeJavaScript: async (script: string) => {
+              if (script.includes('getTermFrequencies')) {
+                return [];
+              }
+
+              return [
+                {
+                  source: 'scanning-parser',
+                  index: 0,
+                  content: [
+                    [{ text: '私', reading: 'わたし', headwords: [[{ term: '私' }]] }],
+                    [{ text: 'は', reading: 'は', headwords: [[{ term: 'は' }]] }],
+                    [{ text: 'この', reading: 'この', headwords: [[{ term: 'この' }]] }],
+                    [{ text: '猫', reading: 'ねこ', headwords: [[{ term: '猫' }]] }],
+                    [{ text: 'です', reading: 'です', headwords: [[{ term: 'です' }]] }],
+                  ],
+                },
+              ];
+            },
+          },
+        }) as unknown as Electron.BrowserWindow,
+      tokenizeWithMecab: async () => [
+        {
+          headword: '私',
+          surface: '私',
+          reading: 'ワタシ',
+          startPos: 0,
+          endPos: 1,
+          partOfSpeech: PartOfSpeech.noun,
+          pos1: '名詞',
+          pos2: '代名詞',
+          isMerged: true,
+          isKnown: false,
+          isNPlusOneTarget: false,
+        },
+        {
+          headword: 'は',
+          surface: 'は',
+          reading: 'ハ',
+          startPos: 1,
+          endPos: 2,
+          partOfSpeech: PartOfSpeech.particle,
+          pos1: '助詞',
+          pos2: '係助詞',
+          isMerged: true,
+          isKnown: false,
+          isNPlusOneTarget: false,
+        },
+        {
+          headword: 'この',
+          surface: 'この',
+          reading: 'コノ',
+          startPos: 2,
+          endPos: 4,
+          partOfSpeech: PartOfSpeech.other,
+          pos1: '連体詞',
+          isMerged: true,
+          isKnown: false,
+          isNPlusOneTarget: false,
+        },
+        {
+          headword: '猫',
+          surface: '猫',
+          reading: 'ネコ',
+          startPos: 4,
+          endPos: 5,
+          partOfSpeech: PartOfSpeech.noun,
+          pos1: '名詞',
+          pos2: '一般',
+          isMerged: true,
+          isKnown: false,
+          isNPlusOneTarget: false,
+        },
+        {
+          headword: 'です',
+          surface: 'です',
+          reading: 'デス',
+          startPos: 5,
+          endPos: 7,
+          partOfSpeech: PartOfSpeech.bound_auxiliary,
+          pos1: '助動詞',
+          isMerged: true,
+          isKnown: false,
+          isNPlusOneTarget: false,
+        },
+      ],
+    }),
+  );
+
+  assert.equal(result.text, '私はこの猫です');
+  assert.deepEqual(
+    result.tokens?.map((token) => ({
+      surface: token.surface,
+      headword: token.headword,
+      frequencyRank: token.frequencyRank,
+      jlptLevel: token.jlptLevel,
+    })),
+    [
+      { surface: '私', headword: '私', frequencyRank: 50, jlptLevel: 'N5' },
+      { surface: 'は', headword: 'は', frequencyRank: undefined, jlptLevel: undefined },
+      { surface: 'この', headword: 'この', frequencyRank: undefined, jlptLevel: undefined },
+      { surface: '猫', headword: '猫', frequencyRank: 11, jlptLevel: 'N5' },
+      { surface: 'です', headword: 'です', frequencyRank: undefined, jlptLevel: undefined },
+    ],
+  );
+});
+
+test('tokenizeSubtitle keeps trailing quote-particle merged tokens hoverable while clearing only their annotation metadata', async () => {
+  const result = await tokenizeSubtitle(
+    'どうしてもって',
+    makeDepsFromYomitanTokens([{ surface: 'どうしてもって', reading: 'どうしてもって', headword: 'どうしても' }], {
+      getFrequencyDictionaryEnabled: () => true,
+      getFrequencyRank: (text) => (text === 'どうしても' ? 123 : null),
+      getJlptLevel: (text) => (text === 'どうしても' ? 'N3' : null),
+      tokenizeWithMecab: async () => [
+        {
+          headword: 'どうしても',
+          surface: 'どうしても',
+          reading: 'ドウシテモ',
+          startPos: 0,
+          endPos: 5,
+          partOfSpeech: PartOfSpeech.other,
+          pos1: '副詞',
+          pos2: '一般',
+          isMerged: false,
+          isKnown: false,
+          isNPlusOneTarget: false,
+        },
+        {
+          headword: 'って',
+          surface: 'って',
+          reading: 'ッテ',
+          startPos: 5,
+          endPos: 7,
+          partOfSpeech: PartOfSpeech.particle,
+          pos1: '助詞',
+          pos2: '格助詞',
+          isMerged: false,
+          isKnown: false,
+          isNPlusOneTarget: false,
+        },
+      ],
+      getMinSentenceWordsForNPlusOne: () => 1,
+    }),
+  );
+
+  assert.equal(result.text, 'どうしてもって');
+  assert.deepEqual(
+    result.tokens?.map((token) => ({
+      surface: token.surface,
+      headword: token.headword,
+      jlptLevel: token.jlptLevel,
+      frequencyRank: token.frequencyRank,
+    })),
+    [
+      {
+        surface: 'どうしてもって',
+        headword: 'どうしても',
+        jlptLevel: undefined,
+        frequencyRank: undefined,
+      },
+    ],
   );
 });
@@ -178,7 +178,7 @@ async function applyAnnotationStage(
   );
 }
 
-async function filterSubtitleAnnotationTokens(tokens: MergedToken[]): Promise<MergedToken[]> {
+async function stripSubtitleAnnotationMetadata(tokens: MergedToken[]): Promise<MergedToken[]> {
   if (tokens.length === 0) {
     return tokens;
   }
@@ -188,9 +188,7 @@ async function filterSubtitleAnnotationTokens(tokens: MergedToken[]): Promise<Me
   }
 
   const annotationStage = await annotationStageModulePromise;
-  return tokens.filter(
-    (token) => !annotationStage.shouldExcludeTokenFromSubtitleAnnotations(token),
-  );
+  return tokens.map((token) => annotationStage.stripSubtitleAnnotationMetadata(token));
 }
 
 export function createTokenizerDepsRuntime(
@@ -721,12 +719,12 @@ export async function tokenizeSubtitle(
 
   const yomitanTokens = await parseWithYomitanInternalParser(tokenizeText, deps, annotationOptions);
   if (yomitanTokens && yomitanTokens.length > 0) {
-    const filteredTokens = await filterSubtitleAnnotationTokens(
+    const annotatedTokens = await stripSubtitleAnnotationMetadata(
       await applyAnnotationStage(yomitanTokens, deps, annotationOptions),
     );
     return {
       text: displayText,
-      tokens: filteredTokens.length > 0 ? filteredTokens : null,
+      tokens: annotatedTokens.length > 0 ? annotatedTokens : null,
     };
   }
 
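The rename from `filterSubtitleAnnotationTokens` to `stripSubtitleAnnotationMetadata` marks the behavioural change: grammar-only tokens used to be dropped from the payload (`filter`), now every token survives with its annotation fields blanked (`map`) so it stays hoverable. A minimal sketch of the two behaviours side by side (the `Token` shape and `GRAMMAR_ONLY_POS1` contents are reduced stand-ins for the repo's types):

```typescript
// Trimmed token shape; the real MergedToken carries many more fields.
interface Token {
  surface: string;
  pos1: string;
  jlptLevel?: string;
  frequencyRank?: number;
}

const GRAMMAR_ONLY_POS1 = new Set(['助詞', '助動詞', '連体詞']);

// A token is grammar-only when every pipe-separated pos1 part is a grammar tag.
const isGrammarOnly = (t: Token): boolean =>
  t.pos1.split('|').every((part) => GRAMMAR_ONLY_POS1.has(part));

// Old behaviour: grammar tokens vanish, so they were not hoverable at all.
const filterTokens = (tokens: Token[]): Token[] => tokens.filter((t) => !isGrammarOnly(t));

// New behaviour: every token survives; grammar tokens just lose their badges.
const stripMetadata = (tokens: Token[]): Token[] =>
  tokens.map((t) =>
    isGrammarOnly(t) ? { ...t, jlptLevel: undefined, frequencyRank: undefined } : t,
  );

const line: Token[] = [
  { surface: '猫', pos1: '名詞', jlptLevel: 'N5', frequencyRank: 11 },
  { surface: 'は', pos1: '助詞', jlptLevel: 'N5', frequencyRank: 12 },
];

console.log(filterTokens(line).map((t) => t.surface)); // old: only the noun survives
console.log(stripMetadata(line).map((t) => t.surface)); // new: both surfaces remain
```

The `map`-based version is what lets the subtitle overlay keep the particle hoverable while its JLPT and frequency badges disappear.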
@@ -1,7 +1,12 @@
 import assert from 'node:assert/strict';
 import test from 'node:test';
 import { MergedToken, PartOfSpeech } from '../../../types';
-import { annotateTokens, AnnotationStageDeps } from './annotation-stage';
+import {
+  annotateTokens,
+  AnnotationStageDeps,
+  shouldExcludeTokenFromSubtitleAnnotations,
+  stripSubtitleAnnotationMetadata,
+} from './annotation-stage';
 
 function makeToken(overrides: Partial<MergedToken> = {}): MergedToken {
   return {
@@ -150,6 +155,170 @@ test('annotateTokens handles JLPT disabled and eligibility exclusion paths', ()
   assert.equal(excludedLookupCalls, 0);
 });
 
+test('shouldExcludeTokenFromSubtitleAnnotations excludes explanatory ending variants', () => {
+  const tokens = [
+    makeToken({
+      surface: 'んです',
+      headword: 'ん',
+      reading: 'ンデス',
+      pos1: '名詞|助動詞',
+      pos2: '非自立',
+    }),
+    makeToken({
+      surface: 'のだ',
+      headword: 'の',
+      reading: 'ノダ',
+      pos1: '名詞|助動詞',
+      pos2: '非自立',
+    }),
+    makeToken({
+      surface: 'んだ',
+      headword: 'ん',
+      reading: 'ンダ',
+      pos1: '名詞|助動詞',
+      pos2: '非自立',
+    }),
+    makeToken({
+      surface: 'のです',
+      headword: 'の',
+      reading: 'ノデス',
+      pos1: '名詞|助動詞',
+      pos2: '非自立',
+    }),
+    makeToken({
+      surface: 'なんです',
+      headword: 'だ',
+      reading: 'ナンデス',
+      pos1: '助動詞|名詞|助動詞',
+      pos2: '|非自立',
+    }),
+    makeToken({
+      surface: 'んでした',
+      headword: 'ん',
+      reading: 'ンデシタ',
+      pos1: '助動詞|助動詞|助動詞',
+    }),
+    makeToken({
+      surface: 'のでは',
+      headword: 'の',
+      reading: 'ノデハ',
+      pos1: '助詞|接続詞',
+    }),
+  ];
+
+  for (const token of tokens) {
+    assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), true, token.surface);
+  }
+});
+
+test('shouldExcludeTokenFromSubtitleAnnotations keeps lexical tokens outside explanatory ending family', () => {
+  const token = makeToken({
+    surface: '問題',
+    headword: '問題',
+    reading: 'モンダイ',
+    partOfSpeech: PartOfSpeech.noun,
+    pos1: '名詞',
+    pos2: '一般',
+  });
+
+  assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), false);
+});
+
+test('shouldExcludeTokenFromSubtitleAnnotations excludes standalone particles auxiliaries and adnominals', () => {
+  const tokens = [
+    makeToken({
+      surface: 'は',
+      headword: 'は',
+      reading: 'ハ',
+      partOfSpeech: PartOfSpeech.particle,
+      pos1: '助詞',
+    }),
+    makeToken({
+      surface: 'です',
+      headword: 'です',
+      reading: 'デス',
+      partOfSpeech: PartOfSpeech.bound_auxiliary,
+      pos1: '助動詞',
+    }),
+    makeToken({
+      surface: 'この',
+      headword: 'この',
+      reading: 'コノ',
+      partOfSpeech: PartOfSpeech.other,
+      pos1: '連体詞',
+    }),
+  ];
+
+  for (const token of tokens) {
+    assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), true, token.surface);
+  }
+});
+
+test('shouldExcludeTokenFromSubtitleAnnotations keeps mixed content tokens with trailing helpers', () => {
+  const token = makeToken({
+    surface: '行きます',
+    headword: '行く',
+    reading: 'イキマス',
+    partOfSpeech: PartOfSpeech.verb,
+    pos1: '動詞|助動詞',
+    pos2: '自立',
+  });
+
+  assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), false);
+});
+
+test('shouldExcludeTokenFromSubtitleAnnotations excludes merged lexical tokens with trailing quote particles', () => {
+  const token = makeToken({
+    surface: 'どうしてもって',
+    headword: 'どうしても',
+    reading: 'ドウシテモッテ',
+    partOfSpeech: PartOfSpeech.other,
+    pos1: '副詞|助詞',
+    pos2: '一般|格助詞',
+  });
+
+  assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), true);
+});
+
+test('stripSubtitleAnnotationMetadata keeps token hover data while clearing annotation fields', () => {
+  const token = makeToken({
+    surface: 'は',
+    headword: 'は',
+    reading: 'ハ',
+    partOfSpeech: PartOfSpeech.particle,
+    pos1: '助詞',
+    isKnown: true,
+    isNPlusOneTarget: true,
+    isNameMatch: true,
+    jlptLevel: 'N5',
+    frequencyRank: 12,
+  });
+
+  assert.deepEqual(stripSubtitleAnnotationMetadata(token), {
+    ...token,
+    isKnown: false,
+    isNPlusOneTarget: false,
+    isNameMatch: false,
+    jlptLevel: undefined,
+    frequencyRank: undefined,
+  });
+});
+
+test('stripSubtitleAnnotationMetadata leaves content tokens unchanged', () => {
+  const token = makeToken({
+    surface: '猫',
+    headword: '猫',
+    reading: 'ネコ',
+    partOfSpeech: PartOfSpeech.noun,
+    pos1: '名詞',
+    isKnown: true,
+    jlptLevel: 'N5',
+    frequencyRank: 42,
+  });
+
+  assert.strictEqual(stripSubtitleAnnotationMetadata(token), token);
+});
+
 test('annotateTokens prioritizes name matches over n+1, frequency, and JLPT when enabled', () => {
   let jlptLookupCalls = 0;
   const tokens = [
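The `strictEqual` assertion above pins down a useful contract: `stripSubtitleAnnotationMetadata` returns the very same object reference when a token needs no clearing, so downstream reference-equality checks stay cheap. A hypothetical reduced implementation of that contract (not the repo's actual function, which consults the full exclusion rules):

```typescript
interface Token {
  surface: string;
  pos1: string;
  isKnown: boolean;
  jlptLevel?: string;
  frequencyRank?: number;
}

// Reduced stand-in for the pos1-based exclusion rules in annotation-stage.ts.
const EXCLUDED_POS1 = new Set(['助詞', '助動詞', '連体詞', '感動詞']);

function stripAnnotationMetadata(token: Token): Token {
  // Content tokens pass through by reference; only excluded tokens get a
  // shallow copy with the annotation fields cleared.
  if (!EXCLUDED_POS1.has(token.pos1)) return token;
  return { ...token, isKnown: false, jlptLevel: undefined, frequencyRank: undefined };
}

const particle: Token = { surface: 'は', pos1: '助詞', isKnown: true, jlptLevel: 'N5', frequencyRank: 12 };
const noun: Token = { surface: '猫', pos1: '名詞', isKnown: true, jlptLevel: 'N5', frequencyRank: 42 };

console.log(stripAnnotationMetadata(noun) === noun); // content token keeps its identity
console.log(stripAnnotationMetadata(particle).jlptLevel); // annotation field cleared
console.log(stripAnnotationMetadata(particle).surface); // hover data survives
```

Returning the original object for the common (content-token) case also means `assert.strictEqual` in the test catches any accidental cloning regression.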
@@ -25,6 +25,45 @@ const SUBTITLE_ANNOTATION_EXCLUDED_TERMS = new Set([
   'ふう',
   'ほう',
 ]);
+const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_PREFIXES = ['ん', 'の', 'なん', 'なの'];
+const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_CORES = [
+  'だ',
+  'です',
+  'でした',
+  'だった',
+  'では',
+  'じゃ',
+  'でしょう',
+  'だろう',
+] as const;
+const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_TRAILING_PARTICLES = [
+  '',
+  'か',
+  'ね',
+  'よ',
+  'な',
+  'よね',
+  'かな',
+  'かね',
+] as const;
+const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDINGS = new Set(
+  SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_PREFIXES.flatMap((prefix) =>
+    SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_CORES.flatMap((core) =>
+      SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_TRAILING_PARTICLES.map(
+        (particle) => `${prefix}${core}${particle}`,
+      ),
+    ),
+  ),
+);
+const SUBTITLE_ANNOTATION_EXCLUDED_TRAILING_PARTICLE_SUFFIXES = new Set([
+  'って',
+  'ってよ',
+  'ってね',
+  'ってな',
+  'ってさ',
+  'ってか',
+  'ってば',
+]);
 
 const jlptLevelLookupCaches = new WeakMap<
   (text: string) => JlptLevel | null,
@@ -60,6 +99,7 @@ function normalizePos1Tag(pos1: string | undefined): string {
 }
 
 const SUBTITLE_ANNOTATION_EXCLUDED_POS1 = new Set(['感動詞']);
+const SUBTITLE_ANNOTATION_GRAMMAR_ONLY_POS1 = new Set(['助詞', '助動詞', '連体詞']);
 
 function splitNormalizedTagParts(normalizedTag: string): string[] {
   if (!normalizedTag) {
@@ -84,7 +124,36 @@ function isExcludedByTagSet(normalizedTag: string, exclusions: ReadonlySet<strin
 
 function isExcludedFromSubtitleAnnotationsByPos1(normalizedPos1: string): boolean {
   const parts = splitNormalizedTagParts(normalizedPos1);
-  return parts.some((part) => SUBTITLE_ANNOTATION_EXCLUDED_POS1.has(part));
+  if (parts.some((part) => SUBTITLE_ANNOTATION_EXCLUDED_POS1.has(part))) {
+    return true;
+  }
+
+  return parts.length > 0 && parts.every((part) => SUBTITLE_ANNOTATION_GRAMMAR_ONLY_POS1.has(part));
+}
+
+function isExcludedTrailingParticleMergedToken(token: MergedToken): boolean {
+  const normalizedSurface = normalizeJlptTextForExclusion(token.surface);
+  const normalizedHeadword = normalizeJlptTextForExclusion(token.headword);
+  if (!normalizedSurface || !normalizedHeadword || !normalizedSurface.startsWith(normalizedHeadword)) {
+    return false;
+  }
+
+  const suffix = normalizedSurface.slice(normalizedHeadword.length);
+  if (!SUBTITLE_ANNOTATION_EXCLUDED_TRAILING_PARTICLE_SUFFIXES.has(suffix)) {
+    return false;
+  }
+
+  const pos1Parts = splitNormalizedTagParts(normalizePos1Tag(token.pos1));
+  if (pos1Parts.length < 2) {
+    return false;
+  }
+
+  const [leadingPos1, ...trailingPos1] = pos1Parts;
+  if (!leadingPos1 || SUBTITLE_ANNOTATION_GRAMMAR_ONLY_POS1.has(leadingPos1)) {
+    return false;
+  }
+
+  return trailingPos1.length > 0 && trailingPos1.every((part) => part === '助詞');
 }
 
 function resolvePos1Exclusions(options: AnnotationStageOptions): ReadonlySet<string> {
@@ -520,12 +589,7 @@ function isJlptEligibleToken(token: MergedToken): boolean {
 }
 
 function isExcludedFromSubtitleAnnotationsByTerm(token: MergedToken): boolean {
-  const candidates = [
-    resolveJlptLookupText(token),
-    token.surface,
-    token.headword,
-    token.reading,
-  ].filter(
+  const candidates = [token.surface, token.reading, resolveJlptLookupText(token)].filter(
     (candidate): candidate is string => typeof candidate === 'string' && candidate.length > 0,
   );
 
@@ -542,7 +606,9 @@ function isExcludedFromSubtitleAnnotationsByTerm(token: MergedToken): boolean {
 
     if (
       SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(trimmedCandidate) ||
-      SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(normalizedCandidate)
+      SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(normalizedCandidate) ||
+      SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDINGS.has(trimmedCandidate) ||
+      SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDINGS.has(normalizedCandidate)
     ) {
       return true;
     }
@@ -565,9 +631,28 @@ export function shouldExcludeTokenFromSubtitleAnnotations(token: MergedToken): b
     return true;
   }
 
+  if (isExcludedTrailingParticleMergedToken(token)) {
+    return true;
+  }
+
   return isExcludedFromSubtitleAnnotationsByTerm(token);
 }
+
+export function stripSubtitleAnnotationMetadata(token: MergedToken): MergedToken {
+  if (!shouldExcludeTokenFromSubtitleAnnotations(token)) {
+    return token;
+  }
+
+  return {
+    ...token,
+    isKnown: false,
+    isNPlusOneTarget: false,
+    isNameMatch: false,
+    jlptLevel: undefined,
+    frequencyRank: undefined,
+  };
+}
 
 function computeTokenKnownStatus(
   token: MergedToken,
   isKnownWord: (text: string) => boolean,
@@ -11,6 +11,7 @@ import {
   shouldDetachBackgroundLaunch,
   shouldHandleHelpOnlyAtEntry,
   shouldHandleLaunchMpvAtEntry,
+  shouldHandleStatsDaemonCommandAtEntry,
 } from './main-entry-runtime';
 
 test('normalizeStartupArgv defaults no-arg startup to --start --background on non-Windows', () => {
@@ -71,6 +72,25 @@ test('launch-mpv entry helpers detect and normalize targets', () => {
   ]);
 });
 
+test('stats-daemon entry helper detects internal daemon commands', () => {
+  assert.equal(
+    shouldHandleStatsDaemonCommandAtEntry(['SubMiner.AppImage', '--stats-daemon-start'], {}),
+    true,
+  );
+  assert.equal(
+    shouldHandleStatsDaemonCommandAtEntry(['SubMiner.AppImage', '--stats-daemon-stop'], {}),
+    true,
+  );
+  assert.equal(
+    shouldHandleStatsDaemonCommandAtEntry(
+      ['SubMiner.AppImage', '--stats-daemon-start'],
+      { ELECTRON_RUN_AS_NODE: '1' },
+    ),
+    false,
+  );
+  assert.equal(shouldHandleStatsDaemonCommandAtEntry(['SubMiner.AppImage', '--start'], {}), false);
+});
+
 test('sanitizeStartupEnv suppresses warnings and lsfg layer', () => {
   const env = sanitizeStartupEnv({
     VK_INSTANCE_LAYERS: 'foo:lsfg-vk:bar',
@@ -112,6 +112,14 @@ export function shouldHandleLaunchMpvAtEntry(argv: string[], env: NodeJS.Process
   return parseCliArgs(argv).launchMpv;
 }
 
+export function shouldHandleStatsDaemonCommandAtEntry(
+  argv: string[],
+  env: NodeJS.ProcessEnv,
+): boolean {
+  if (env.ELECTRON_RUN_AS_NODE === '1') return false;
+  return argv.includes('--stats-daemon-start') || argv.includes('--stats-daemon-stop');
+}
+
 export function normalizeLaunchMpvTargets(argv: string[]): string[] {
   return parseCliArgs(argv).launchMpvTargets;
 }
@@ -12,9 +12,11 @@ import {
   shouldDetachBackgroundLaunch,
   shouldHandleHelpOnlyAtEntry,
   shouldHandleLaunchMpvAtEntry,
+  shouldHandleStatsDaemonCommandAtEntry,
 } from './main-entry-runtime';
 import { requestSingleInstanceLockEarly } from './main/early-single-instance';
 import { createWindowsMpvLaunchDeps, launchWindowsMpv } from './main/runtime/windows-mpv-launch';
+import { runStatsDaemonControlFromProcess } from './stats-daemon-entry';
 
 const DEFAULT_TEXTHOOKER_PORT = 5174;
 
@@ -69,6 +71,11 @@ if (shouldHandleLaunchMpvAtEntry(process.argv, process.env)) {
     );
     app.exit(result.ok ? 0 : 1);
   });
+} else if (shouldHandleStatsDaemonCommandAtEntry(process.argv, process.env)) {
+  void app.whenReady().then(async () => {
+    const exitCode = await runStatsDaemonControlFromProcess(app.getPath('userData'));
+    app.exit(exitCode);
+  });
 } else {
   const gotSingleInstanceLock = requestSingleInstanceLockEarly(app);
   if (!gotSingleInstanceLock) {
@@ -682,7 +682,7 @@ test('renderSubtitle preserves unsupported punctuation while keeping it non-inte
   }
 });
 
-test('renderSubtitle keeps excluded interjection text visible while only rendering remaining tokens as interactive', () => {
+test('renderSubtitle keeps excluded interjection tokens hoverable while rendering them without annotation styling', () => {
   const restoreDocument = installFakeDocument();
 
   try {
@@ -718,13 +718,19 @@ test('renderSubtitle keeps excluded interjection text visible while only renderi
 
     renderer.renderSubtitle({
       text: 'ぐはっ 猫',
-      tokens: [createToken({ surface: '猫', headword: '猫', reading: 'ねこ' })],
+      tokens: [
+        createToken({ surface: 'ぐはっ', headword: 'ぐはっ', reading: 'ぐはっ' }),
+        createToken({ surface: '猫', headword: '猫', reading: 'ねこ' }),
+      ],
     });
 
     assert.equal(subtitleRoot.textContent, 'ぐはっ 猫');
     assert.deepEqual(
       collectWordNodes(subtitleRoot).map((node) => [node.textContent, node.dataset.tokenIndex]),
-      [['猫', '0']],
+      [
+        ['ぐはっ', '0'],
+        ['猫', '1'],
+      ],
     );
   } finally {
     restoreDocument();
158  src/stats-daemon-control.test.ts  Normal file
@@ -0,0 +1,158 @@
+import assert from 'node:assert/strict';
+import test from 'node:test';
+import { createRunStatsDaemonControlHandler } from './stats-daemon-control';
+
+test('stats daemon control reuses live daemon and writes launcher response', async () => {
+  const calls: string[] = [];
+  const responses: Array<{ path: string; payload: { ok: boolean; url?: string; error?: string } }> =
+    [];
+  const handler = createRunStatsDaemonControlHandler({
+    statePath: '/tmp/stats-daemon.json',
+    readState: () => ({ pid: 4242, port: 5175, startedAtMs: 1 }),
+    removeState: () => {
+      calls.push('removeState');
+    },
+    isProcessAlive: (pid) => {
+      calls.push(`isProcessAlive:${pid}`);
+      return true;
+    },
+    resolveUrl: (state) => `http://127.0.0.1:${state.port}`,
+    spawnDaemon: async () => {
+      calls.push('spawnDaemon');
+      return 1;
+    },
+    waitForDaemonResponse: async () => {
+      calls.push('waitForDaemonResponse');
+      return { ok: true, url: 'http://127.0.0.1:5175' };
+    },
+    openExternal: async (url) => {
+      calls.push(`openExternal:${url}`);
+    },
+    writeResponse: (responsePath, payload) => {
+      responses.push({ path: responsePath, payload });
+    },
+    killProcess: () => {
+      calls.push('killProcess');
+    },
+    sleep: async () => {},
+  });
+
+  const exitCode = await handler({
+    action: 'start',
+    responsePath: '/tmp/response.json',
+    openBrowser: true,
+    daemonScriptPath: '/tmp/stats-daemon-runner.js',
+    userDataPath: '/tmp/SubMiner',
+  });
+
+  assert.equal(exitCode, 0);
+  assert.deepEqual(calls, ['isProcessAlive:4242', 'openExternal:http://127.0.0.1:5175']);
+  assert.deepEqual(responses, [
+    {
+      path: '/tmp/response.json',
+      payload: { ok: true, url: 'http://127.0.0.1:5175' },
+    },
+  ]);
+});
+
+test('stats daemon control clears stale state, starts daemon, and waits for response', async () => {
+  const calls: string[] = [];
+  const handler = createRunStatsDaemonControlHandler({
+    statePath: '/tmp/stats-daemon.json',
+    readState: () => ({ pid: 4242, port: 5175, startedAtMs: 1 }),
+    removeState: () => {
+      calls.push('removeState');
+    },
+    isProcessAlive: (pid) => {
+      calls.push(`isProcessAlive:${pid}`);
+      return false;
+    },
+    resolveUrl: (state) => `http://127.0.0.1:${state.port}`,
+    spawnDaemon: async (options) => {
+      calls.push(`spawnDaemon:${options.scriptPath}:${options.responsePath}:${options.userDataPath}`);
+      return 999;
+    },
+    waitForDaemonResponse: async (responsePath) => {
+      calls.push(`waitForDaemonResponse:${responsePath}`);
+      return { ok: true, url: 'http://127.0.0.1:5175' };
+    },
+    openExternal: async (url) => {
+      calls.push(`openExternal:${url}`);
+    },
+    writeResponse: () => {
+      calls.push('writeResponse');
+    },
+    killProcess: () => {
+      calls.push('killProcess');
+    },
+    sleep: async () => {},
+  });
+
+  const exitCode = await handler({
+    action: 'start',
+    responsePath: '/tmp/response.json',
+    openBrowser: false,
+    daemonScriptPath: '/tmp/stats-daemon-runner.js',
+    userDataPath: '/tmp/SubMiner',
+  });
+
+  assert.equal(exitCode, 0);
+  assert.deepEqual(calls, [
+    'isProcessAlive:4242',
+    'removeState',
+    'spawnDaemon:/tmp/stats-daemon-runner.js:/tmp/response.json:/tmp/SubMiner',
+    'waitForDaemonResponse:/tmp/response.json',
+  ]);
+});
+
+test('stats daemon control stops live daemon and treats stale state as success', async () => {
+  const responses: Array<{ path: string; payload: { ok: boolean; url?: string; error?: string } }> =
+    [];
+  const calls: string[] = [];
+  let aliveChecks = 0;
+  const handler = createRunStatsDaemonControlHandler({
+    statePath: '/tmp/stats-daemon.json',
+    readState: () => ({ pid: 4242, port: 5175, startedAtMs: 1 }),
+    removeState: () => {
+      calls.push('removeState');
+    },
+    isProcessAlive: (pid) => {
+      aliveChecks += 1;
+      calls.push(`isProcessAlive:${pid}:${aliveChecks}`);
+      return aliveChecks === 1;
+    },
+    resolveUrl: (state) => `http://127.0.0.1:${state.port}`,
+    spawnDaemon: async () => 1,
+    waitForDaemonResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
+    openExternal: async () => {},
+    writeResponse: (responsePath, payload) => {
+      responses.push({ path: responsePath, payload });
+    },
+    killProcess: (pid, signal) => {
+      calls.push(`killProcess:${pid}:${signal}`);
+    },
+    sleep: async () => {},
+  });
+
+  const exitCode = await handler({
+    action: 'stop',
+    responsePath: '/tmp/response.json',
+    openBrowser: false,
+    daemonScriptPath: '/tmp/stats-daemon-runner.js',
+    userDataPath: '/tmp/SubMiner',
+  });
+
+  assert.equal(exitCode, 0);
+  assert.deepEqual(calls, [
+    'isProcessAlive:4242:1',
+    'killProcess:4242:SIGTERM',
+    'isProcessAlive:4242:2',
+    'removeState',
+  ]);
+  assert.deepEqual(responses, [
+    {
+      path: '/tmp/response.json',
+      payload: { ok: true },
+    },
+  ]);
+});
102  src/stats-daemon-control.ts  Normal file
@@ -0,0 +1,102 @@
+import type { BackgroundStatsServerState } from './main/runtime/stats-daemon';
+import type { StatsCliCommandResponse } from './main/runtime/stats-cli-command';
+
+export type StatsDaemonControlAction = 'start' | 'stop';
+
+export type StatsDaemonControlArgs = {
+  action: StatsDaemonControlAction;
+  responsePath?: string;
+  openBrowser: boolean;
+  daemonScriptPath: string;
+  userDataPath: string;
+};
+
+type SpawnStatsDaemonOptions = {
+  scriptPath: string;
+  responsePath?: string;
+  userDataPath: string;
+};
+
+export function createRunStatsDaemonControlHandler(deps: {
+  statePath: string;
+  readState: () => BackgroundStatsServerState | null;
+  removeState: () => void;
+  isProcessAlive: (pid: number) => boolean;
+  resolveUrl: (state: Pick<BackgroundStatsServerState, 'port'>) => string;
+  spawnDaemon: (options: SpawnStatsDaemonOptions) => Promise<number> | number;
+  waitForDaemonResponse: (responsePath: string) => Promise<StatsCliCommandResponse>;
+  openExternal: (url: string) => Promise<unknown>;
+  writeResponse: (responsePath: string, payload: StatsCliCommandResponse) => void;
+  killProcess: (pid: number, signal: NodeJS.Signals) => void;
+  sleep: (ms: number) => Promise<void>;
+}) {
+  const writeResponseSafe = (
+    responsePath: string | undefined,
+    payload: StatsCliCommandResponse,
+  ): void => {
+    if (!responsePath) return;
+    deps.writeResponse(responsePath, payload);
+  };
+
+  return async (args: StatsDaemonControlArgs): Promise<number> => {
+    if (args.action === 'start') {
+      const state = deps.readState();
+      if (state) {
+        if (deps.isProcessAlive(state.pid)) {
+          const url = deps.resolveUrl(state);
+          writeResponseSafe(args.responsePath, { ok: true, url });
+          if (args.openBrowser) {
+            await deps.openExternal(url);
+          }
+          return 0;
+        }
+        deps.removeState();
+      }
+
+      if (!args.responsePath) {
+        throw new Error('Missing --stats-response-path for stats daemon start.');
+      }
+
+      await deps.spawnDaemon({
+        scriptPath: args.daemonScriptPath,
+        responsePath: args.responsePath,
+        userDataPath: args.userDataPath,
+      });
+      const response = await deps.waitForDaemonResponse(args.responsePath);
+      if (response.ok && args.openBrowser && response.url) {
+        await deps.openExternal(response.url);
+      }
+      return response.ok ? 0 : 1;
+    }
+
+    const state = deps.readState();
+    if (!state) {
+      deps.removeState();
+      writeResponseSafe(args.responsePath, { ok: true });
+      return 0;
+    }
+
+    if (!deps.isProcessAlive(state.pid)) {
+      deps.removeState();
+      writeResponseSafe(args.responsePath, { ok: true });
+      return 0;
+    }
+
+    deps.killProcess(state.pid, 'SIGTERM');
+    const deadline = Date.now() + 2_000;
+    while (Date.now() < deadline) {
+      if (!deps.isProcessAlive(state.pid)) {
+        deps.removeState();
+        writeResponseSafe(args.responsePath, { ok: true });
+        return 0;
+      }
+      await deps.sleep(50);
+    }
+
+    writeResponseSafe(args.responsePath, {
+      ok: false,
+      error: 'Timed out stopping background stats server.',
+    });
+    return 1;
+  };
+}
135  src/stats-daemon-entry.ts  Normal file
@@ -0,0 +1,135 @@
+import fs from 'node:fs';
+import path from 'node:path';
+import { spawn } from 'node:child_process';
+import { shell } from 'electron';
+import { sanitizeStartupEnv } from './main-entry-runtime';
+import {
+  isBackgroundStatsServerProcessAlive,
+  readBackgroundStatsServerState,
+  removeBackgroundStatsServerState,
+  resolveBackgroundStatsServerUrl,
+} from './main/runtime/stats-daemon';
+import {
+  createRunStatsDaemonControlHandler,
+  type StatsDaemonControlArgs,
+} from './stats-daemon-control';
+import {
+  type StatsCliCommandResponse,
+  writeStatsCliCommandResponse,
+} from './main/runtime/stats-cli-command';
+
+const STATS_DAEMON_RESPONSE_TIMEOUT_MS = 12_000;
+
+function readFlagValue(argv: string[], flag: string): string | undefined {
+  for (let i = 0; i < argv.length; i += 1) {
+    const arg = argv[i];
+    if (!arg) continue;
+    if (arg === flag) {
+      const value = argv[i + 1];
+      if (value && !value.startsWith('--')) {
+        return value;
+      }
+      return undefined;
+    }
+    if (arg.startsWith(`${flag}=`)) {
+      return arg.split('=', 2)[1];
+    }
+  }
+  return undefined;
+}
+
+function hasFlag(argv: string[], flag: string): boolean {
+  return argv.includes(flag);
+}
+
+function parseControlArgs(argv: string[], userDataPath: string): StatsDaemonControlArgs {
+  return {
+    action: hasFlag(argv, '--stats-daemon-stop') ? 'stop' : 'start',
+    responsePath: readFlagValue(argv, '--stats-response-path'),
+    openBrowser: hasFlag(argv, '--stats-daemon-open-browser'),
+    daemonScriptPath: path.join(__dirname, 'stats-daemon-runner.js'),
+    userDataPath,
+  };
+}
+
+async function waitForDaemonResponse(responsePath: string): Promise<StatsCliCommandResponse> {
+  const deadline = Date.now() + STATS_DAEMON_RESPONSE_TIMEOUT_MS;
+  while (Date.now() < deadline) {
+    try {
+      if (fs.existsSync(responsePath)) {
+        return JSON.parse(fs.readFileSync(responsePath, 'utf8')) as StatsCliCommandResponse;
+      }
+    } catch {
+      // retry until timeout
+    }
+    await new Promise((resolve) => setTimeout(resolve, 100));
+  }
+
+  return {
+    ok: false,
+    error: 'Timed out waiting for stats daemon startup response.',
+  };
+}
+
+export async function runStatsDaemonControlFromProcess(userDataPath: string): Promise<number> {
+  const args = parseControlArgs(process.argv, userDataPath);
+  const statePath = path.join(userDataPath, 'stats-daemon.json');
+
+  const writeFailureResponse = (message: string): void => {
+    if (args.responsePath) {
+      try {
+        writeStatsCliCommandResponse(args.responsePath, {
+          ok: false,
+          error: message,
+        });
+      } catch {
+        // ignore secondary response-write failures
+      }
+    }
+  };
+
+  const handler = createRunStatsDaemonControlHandler({
+    statePath,
+    readState: () => readBackgroundStatsServerState(statePath),
+    removeState: () => {
+      removeBackgroundStatsServerState(statePath);
+    },
+    isProcessAlive: (pid) => isBackgroundStatsServerProcessAlive(pid),
+    resolveUrl: (state) => resolveBackgroundStatsServerUrl(state),
+    spawnDaemon: async (options) => {
+      const childArgs = [options.scriptPath, '--stats-user-data-path', options.userDataPath];
+      if (options.responsePath) {
+        childArgs.push('--stats-response-path', options.responsePath);
+      }
+      const logLevel = readFlagValue(process.argv, '--log-level');
+      if (logLevel) {
+        childArgs.push('--log-level', logLevel);
+      }
+      const child = spawn(process.execPath, childArgs, {
+        detached: true,
+        stdio: 'ignore',
+        env: {
+          ...sanitizeStartupEnv(process.env),
+          ELECTRON_RUN_AS_NODE: '1',
+        },
+      });
+      child.unref();
+      return child.pid ?? 0;
+    },
+    waitForDaemonResponse,
+    openExternal: async (url) => shell.openExternal(url),
+    writeResponse: writeStatsCliCommandResponse,
+    killProcess: (pid, signal) => {
+      process.kill(pid, signal);
+    },
+    sleep: async (ms) => new Promise((resolve) => setTimeout(resolve, ms)),
+  });
+
+  try {
+    return await handler(args);
+  } catch (error) {
+    const message = error instanceof Error ? error.message : String(error);
+    writeFailureResponse(message);
+    return 1;
+  }
+}
225  src/stats-daemon-runner.ts  Normal file
@@ -0,0 +1,225 @@
+import fs from 'node:fs';
+import os from 'node:os';
+import path from 'node:path';
+import { spawn } from 'node:child_process';
+import { ConfigService } from './config/service';
+import { createLogger, setLogLevel } from './logger';
+import { ImmersionTrackerService } from './core/services/immersion-tracker-service';
+import { createCoverArtFetcher } from './core/services/anilist/cover-art-fetcher';
+import { createAnilistRateLimiter } from './core/services/anilist/rate-limiter';
+import { startStatsServer } from './core/services/stats-server';
+import {
+  removeBackgroundStatsServerState,
+  writeBackgroundStatsServerState,
+} from './main/runtime/stats-daemon';
+import { writeStatsCliCommandResponse } from './main/runtime/stats-cli-command';
+import { createInvokeStatsWordHelperHandler, type StatsWordHelperResponse } from './stats-word-helper-client';
+
+const logger = createLogger('stats-daemon');
+const STATS_WORD_HELPER_RESPONSE_TIMEOUT_MS = 20_000;
+
+function readFlagValue(argv: string[], flag: string): string | undefined {
+  for (let i = 0; i < argv.length; i += 1) {
+    const arg = argv[i];
+    if (!arg) continue;
+    if (arg === flag) {
+      const value = argv[i + 1];
+      if (value && !value.startsWith('--')) {
+        return value;
+      }
+      return undefined;
+    }
+    if (arg.startsWith(`${flag}=`)) {
+      return arg.split('=', 2)[1];
+    }
+  }
+  return undefined;
+}
+
+async function waitForWordHelperResponse(responsePath: string): Promise<StatsWordHelperResponse> {
+  const deadline = Date.now() + STATS_WORD_HELPER_RESPONSE_TIMEOUT_MS;
+  while (Date.now() < deadline) {
+    try {
+      if (fs.existsSync(responsePath)) {
+        return JSON.parse(fs.readFileSync(responsePath, 'utf8')) as StatsWordHelperResponse;
+      }
+    } catch {
+      // retry until timeout
+    }
+    await new Promise((resolve) => setTimeout(resolve, 100));
+  }
+
+  return {
+    ok: false,
+    error: 'Timed out waiting for stats word helper response.',
+  };
+}
+
+const invokeStatsWordHelper = createInvokeStatsWordHelperHandler({
+  createTempDir: (prefix) => fs.mkdtempSync(path.join(os.tmpdir(), prefix)),
+  joinPath: (...parts) => path.join(...parts),
+  spawnHelper: async (options) => {
+    const childArgs = [
+      options.scriptPath,
+      '--stats-word-helper-response-path',
+      options.responsePath,
+      '--stats-word-helper-user-data-path',
+      options.userDataPath,
+      '--stats-word-helper-word',
+      options.word,
+    ];
+    const logLevel = readFlagValue(process.argv, '--log-level');
+    if (logLevel) {
+      childArgs.push('--log-level', logLevel);
+    }
+    const child = spawn(process.execPath, childArgs, {
+      stdio: 'ignore',
+      env: {
+        ...process.env,
+        ELECTRON_RUN_AS_NODE: undefined,
+      },
+    });
+    return await new Promise<number>((resolve) => {
+      child.once('exit', (code) => resolve(code ?? 1));
+      child.once('error', () => resolve(1));
+    });
+  },
+  waitForResponse: waitForWordHelperResponse,
+  removeDir: (targetPath) => {
+    fs.rmSync(targetPath, { recursive: true, force: true });
+  },
+});
+
+const userDataPath = readFlagValue(process.argv, '--stats-user-data-path')?.trim();
+const responsePath = readFlagValue(process.argv, '--stats-response-path')?.trim();
+const logLevel = readFlagValue(process.argv, '--log-level');
+
+if (logLevel) {
+  setLogLevel(logLevel, 'cli');
+}
+
+if (!userDataPath) {
+  if (responsePath) {
+    writeStatsCliCommandResponse(responsePath, {
+      ok: false,
+      error: 'Missing --stats-user-data-path for stats daemon runner.',
+    });
+  }
+  process.exit(1);
+}
+
+const daemonUserDataPath = userDataPath;
+
+const statePath = path.join(userDataPath, 'stats-daemon.json');
+const knownWordCachePath = path.join(userDataPath, 'known-words-cache.json');
+const statsDistPath = path.join(__dirname, '..', 'stats', 'dist');
+const wordHelperScriptPath = path.join(__dirname, 'stats-word-helper.js');
+
+let tracker: ImmersionTrackerService | null = null;
+let statsServer: ReturnType<typeof startStatsServer> | null = null;
+
+function writeFailureResponse(message: string): void {
+  if (!responsePath) return;
+  writeStatsCliCommandResponse(responsePath, { ok: false, error: message });
+}
+
+function clearOwnedState(): void {
+  const rawState = (() => {
+    try {
+      return JSON.parse(fs.readFileSync(statePath, 'utf8')) as { pid?: number };
+    } catch {
+      return null;
+    }
+  })();
+  if (rawState?.pid === process.pid) {
+    removeBackgroundStatsServerState(statePath);
+  }
+}
+
+function shutdown(code = 0): void {
+  try {
+    statsServer?.close();
+  } catch {
+    // ignore
+  }
+  statsServer = null;
+  try {
+    tracker?.destroy();
+  } catch {
+    // ignore
+  }
+  tracker = null;
+  clearOwnedState();
+  process.exit(code);
+}
+
+process.on('SIGINT', () => shutdown(0));
+process.on('SIGTERM', () => shutdown(0));
+
+async function main(): Promise<void> {
+  try {
+    const configService = new ConfigService(daemonUserDataPath);
|
const config = configService.getConfig();
|
||||||
|
if (config.immersionTracking?.enabled === false) {
|
||||||
|
throw new Error('Immersion tracking is disabled in config.');
|
||||||
|
}
|
||||||
|
|
||||||
|
const configuredDbPath = config.immersionTracking?.dbPath?.trim() || '';
|
||||||
|
tracker = new ImmersionTrackerService({
|
||||||
|
dbPath: configuredDbPath || path.join(daemonUserDataPath, 'immersion.sqlite'),
|
||||||
|
policy: {
|
||||||
|
batchSize: config.immersionTracking.batchSize,
|
||||||
|
flushIntervalMs: config.immersionTracking.flushIntervalMs,
|
||||||
|
queueCap: config.immersionTracking.queueCap,
|
||||||
|
payloadCapBytes: config.immersionTracking.payloadCapBytes,
|
||||||
|
maintenanceIntervalMs: config.immersionTracking.maintenanceIntervalMs,
|
||||||
|
retention: {
|
||||||
|
eventsDays: config.immersionTracking.retention.eventsDays,
|
||||||
|
telemetryDays: config.immersionTracking.retention.telemetryDays,
|
||||||
|
sessionsDays: config.immersionTracking.retention.sessionsDays,
|
||||||
|
dailyRollupsDays: config.immersionTracking.retention.dailyRollupsDays,
|
||||||
|
monthlyRollupsDays: config.immersionTracking.retention.monthlyRollupsDays,
|
||||||
|
vacuumIntervalDays: config.immersionTracking.retention.vacuumIntervalDays,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
});
|
||||||
|
tracker.setCoverArtFetcher(
|
||||||
|
createCoverArtFetcher(createAnilistRateLimiter(), createLogger('stats-daemon:cover-art')),
|
||||||
|
);
|
||||||
|
|
||||||
|
statsServer = startStatsServer({
|
||||||
|
port: config.stats.serverPort,
|
||||||
|
staticDir: statsDistPath,
|
||||||
|
tracker,
|
||||||
|
knownWordCachePath,
|
||||||
|
ankiConnectConfig: config.ankiConnect,
|
||||||
|
addYomitanNote: async (word: string) =>
|
||||||
|
await invokeStatsWordHelper({
|
||||||
|
helperScriptPath: wordHelperScriptPath,
|
||||||
|
userDataPath: daemonUserDataPath,
|
||||||
|
word,
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
|
||||||
|
writeBackgroundStatsServerState(statePath, {
|
||||||
|
pid: process.pid,
|
||||||
|
port: config.stats.serverPort,
|
||||||
|
startedAtMs: Date.now(),
|
||||||
|
});
|
||||||
|
|
||||||
|
if (responsePath) {
|
||||||
|
writeStatsCliCommandResponse(responsePath, {
|
||||||
|
ok: true,
|
||||||
|
url: `http://127.0.0.1:${config.stats.serverPort}`,
|
||||||
|
});
|
||||||
|
}
|
||||||
|
logger.info(`Background stats daemon listening on http://127.0.0.1:${config.stats.serverPort}`);
|
||||||
|
} catch (error) {
|
||||||
|
const message = error instanceof Error ? error.message : String(error);
|
||||||
|
logger.error('Failed to start stats daemon', message);
|
||||||
|
writeFailureResponse(message);
|
||||||
|
shutdown(1);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
void main();
|
57 src/stats-word-helper-client.test.ts Normal file
@@ -0,0 +1,57 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { createInvokeStatsWordHelperHandler } from './stats-word-helper-client';

test('word helper client returns note id when helper responds before exit', async () => {
  const calls: string[] = [];
  const handler = createInvokeStatsWordHelperHandler({
    createTempDir: () => '/tmp/stats-word-helper',
    joinPath: (...parts) => parts.join('/'),
    spawnHelper: async (options) => {
      calls.push(
        `spawnHelper:${options.scriptPath}:${options.responsePath}:${options.userDataPath}:${options.word}`,
      );
      return new Promise<number>((resolve) => setTimeout(() => resolve(0), 20));
    },
    waitForResponse: async (responsePath) => {
      calls.push(`waitForResponse:${responsePath}`);
      return { ok: true, noteId: 123 };
    },
    removeDir: (targetPath) => {
      calls.push(`removeDir:${targetPath}`);
    },
  });

  const noteId = await handler({
    helperScriptPath: '/tmp/stats-word-helper.js',
    userDataPath: '/tmp/SubMiner',
    word: '猫',
  });

  assert.equal(noteId, 123);
  assert.deepEqual(calls, [
    'spawnHelper:/tmp/stats-word-helper.js:/tmp/stats-word-helper/response.json:/tmp/SubMiner:猫',
    'waitForResponse:/tmp/stats-word-helper/response.json',
    'removeDir:/tmp/stats-word-helper',
  ]);
});

test('word helper client throws helper response errors', async () => {
  const handler = createInvokeStatsWordHelperHandler({
    createTempDir: () => '/tmp/stats-word-helper',
    joinPath: (...parts) => parts.join('/'),
    spawnHelper: async () => 0,
    waitForResponse: async () => ({ ok: false, error: 'helper failed' }),
    removeDir: () => {},
  });

  await assert.rejects(
    async () =>
      handler({
        helperScriptPath: '/tmp/stats-word-helper.js',
        userDataPath: '/tmp/SubMiner',
        word: '猫',
      }),
    /helper failed/,
  );
});
62 src/stats-word-helper-client.ts Normal file
@@ -0,0 +1,62 @@
export type StatsWordHelperResponse = {
  ok: boolean;
  noteId?: number;
  error?: string;
};

export function createInvokeStatsWordHelperHandler(deps: {
  createTempDir: (prefix: string) => string;
  joinPath: (...parts: string[]) => string;
  spawnHelper: (options: {
    scriptPath: string;
    responsePath: string;
    userDataPath: string;
    word: string;
  }) => Promise<number>;
  waitForResponse: (responsePath: string) => Promise<StatsWordHelperResponse>;
  removeDir: (targetPath: string) => void;
}) {
  return async (options: {
    helperScriptPath: string;
    userDataPath: string;
    word: string;
  }): Promise<number> => {
    const tempDir = deps.createTempDir('subminer-stats-word-helper-');
    const responsePath = deps.joinPath(tempDir, 'response.json');

    try {
      const helperExitPromise = deps.spawnHelper({
        scriptPath: options.helperScriptPath,
        responsePath,
        userDataPath: options.userDataPath,
        word: options.word,
      });

      const startupResult = await Promise.race([
        deps.waitForResponse(responsePath).then((response) => ({ kind: 'response' as const, response })),
        helperExitPromise.then((status) => ({ kind: 'exit' as const, status })),
      ]);

      let response: StatsWordHelperResponse;
      if (startupResult.kind === 'response') {
        response = startupResult.response;
      } else {
        if (startupResult.status !== 0) {
          throw new Error(`Stats word helper exited before response (status ${startupResult.status}).`);
        }
        response = await deps.waitForResponse(responsePath);
      }

      const exitStatus = await helperExitPromise;
      if (exitStatus !== 0) {
        throw new Error(`Stats word helper exited with status ${exitStatus}.`);
      }
      if (!response.ok || typeof response.noteId !== 'number') {
        throw new Error(response.error || 'Stats word helper failed.');
      }
      return response.noteId;
    } finally {
      deps.removeDir(tempDir);
    }
  };
}
193 src/stats-word-helper.ts Normal file
@@ -0,0 +1,193 @@
import fs from 'node:fs';
import path from 'node:path';
import { app, protocol } from 'electron';
import type { BrowserWindow, Extension, Session } from 'electron';
import { ConfigService } from './config/service';
import { createLogger, setLogLevel } from './logger';
import { loadYomitanExtension } from './core/services/yomitan-extension-loader';
import {
  addYomitanNoteViaSearch,
  syncYomitanDefaultAnkiServer,
} from './core/services/tokenizer/yomitan-parser-runtime';
import type { StatsWordHelperResponse } from './stats-word-helper-client';
import { clearYomitanExtensionRuntimeState } from './core/services/yomitan-extension-runtime-state';

protocol.registerSchemesAsPrivileged([
  {
    scheme: 'chrome-extension',
    privileges: {
      standard: true,
      secure: true,
      supportFetchAPI: true,
      corsEnabled: true,
      bypassCSP: true,
    },
  },
]);

const logger = createLogger('stats-word-helper');

function readFlagValue(argv: string[], flag: string): string | undefined {
  for (let i = 0; i < argv.length; i += 1) {
    const arg = argv[i];
    if (!arg) continue;
    if (arg === flag) {
      const value = argv[i + 1];
      if (value && !value.startsWith('--')) {
        return value;
      }
      return undefined;
    }
    if (arg.startsWith(`${flag}=`)) {
      return arg.split('=', 2)[1];
    }
  }
  return undefined;
}

function writeResponse(responsePath: string | undefined, payload: StatsWordHelperResponse): void {
  if (!responsePath) return;
  fs.mkdirSync(path.dirname(responsePath), { recursive: true });
  fs.writeFileSync(responsePath, JSON.stringify(payload, null, 2), 'utf8');
}

const responsePath = readFlagValue(process.argv, '--stats-word-helper-response-path')?.trim();
const userDataPath = readFlagValue(process.argv, '--stats-word-helper-user-data-path')?.trim();
const word = readFlagValue(process.argv, '--stats-word-helper-word');
const logLevel = readFlagValue(process.argv, '--log-level');

if (logLevel) {
  setLogLevel(logLevel, 'cli');
}

if (!userDataPath || !word) {
  writeResponse(responsePath, {
    ok: false,
    error: 'Missing stats word helper arguments.',
  });
  app.exit(1);
}

app.setName('SubMiner');
app.setPath('userData', userDataPath!);

let yomitanExt: Extension | null = null;
let yomitanSession: Session | null = null;
let yomitanParserWindow: BrowserWindow | null = null;
let yomitanParserReadyPromise: Promise<void> | null = null;
let yomitanParserInitPromise: Promise<boolean> | null = null;

function cleanup(): void {
  clearYomitanExtensionRuntimeState({
    getYomitanParserWindow: () => yomitanParserWindow,
    setYomitanParserWindow: () => {
      yomitanParserWindow = null;
    },
    setYomitanParserReadyPromise: () => {
      yomitanParserReadyPromise = null;
    },
    setYomitanParserInitPromise: () => {
      yomitanParserInitPromise = null;
    },
    setYomitanExtension: () => {
      yomitanExt = null;
    },
    setYomitanSession: () => {
      yomitanSession = null;
    },
  });
}

async function main(): Promise<void> {
  try {
    const configService = new ConfigService(userDataPath!);
    const config = configService.getConfig();
    const extension = await loadYomitanExtension({
      userDataPath: userDataPath!,
      getYomitanParserWindow: () => yomitanParserWindow,
      setYomitanParserWindow: (window) => {
        yomitanParserWindow = window;
      },
      setYomitanParserReadyPromise: (promise) => {
        yomitanParserReadyPromise = promise;
      },
      setYomitanParserInitPromise: (promise) => {
        yomitanParserInitPromise = promise;
      },
      setYomitanExtension: (extensionValue) => {
        yomitanExt = extensionValue;
      },
      setYomitanSession: (sessionValue) => {
        yomitanSession = sessionValue;
      },
    });
    if (!extension) {
      throw new Error('Yomitan extension failed to load.');
    }

    await syncYomitanDefaultAnkiServer(
      config.ankiConnect?.url || 'http://127.0.0.1:8765',
      {
        getYomitanExt: () => yomitanExt,
        getYomitanSession: () => yomitanSession,
        getYomitanParserWindow: () => yomitanParserWindow,
        setYomitanParserWindow: (window) => {
          yomitanParserWindow = window;
        },
        getYomitanParserReadyPromise: () => yomitanParserReadyPromise,
        setYomitanParserReadyPromise: (promise) => {
          yomitanParserReadyPromise = promise;
        },
        getYomitanParserInitPromise: () => yomitanParserInitPromise,
        setYomitanParserInitPromise: (promise) => {
          yomitanParserInitPromise = promise;
        },
      },
      logger,
      { forceOverride: true },
    );

    const noteId = await addYomitanNoteViaSearch(
      word!,
      {
        getYomitanExt: () => yomitanExt,
        getYomitanSession: () => yomitanSession,
        getYomitanParserWindow: () => yomitanParserWindow,
        setYomitanParserWindow: (window) => {
          yomitanParserWindow = window;
        },
        getYomitanParserReadyPromise: () => yomitanParserReadyPromise,
        setYomitanParserReadyPromise: (promise) => {
          yomitanParserReadyPromise = promise;
        },
        getYomitanParserInitPromise: () => yomitanParserInitPromise,
        setYomitanParserInitPromise: (promise) => {
          yomitanParserInitPromise = promise;
        },
      },
      logger,
    );

    if (typeof noteId !== 'number') {
      throw new Error('Yomitan failed to create note.');
    }

    writeResponse(responsePath, {
      ok: true,
      noteId,
    });
    cleanup();
    app.exit(0);
  } catch (error) {
    const message = error instanceof Error ? error.message : String(error);
    logger.error('Stats word helper failed', message);
    writeResponse(responsePath, {
      ok: false,
      error: message,
    });
    cleanup();
    app.exit(1);
  }
}

void app.whenReady().then(() => main());