Mirror of https://github.com/ksyasuda/SubMiner.git, synced 2026-04-11 16:19:27 -07:00
Compare commits: 8 commits, 24667ad6c9 ... af74102964

- af74102964
- bae2a49673
- 1342393035
- 9d109de8db
- bedeee22f5
- 1267085306
- 0ee150ed91
- 3e5671270e
@@ -1,10 +1,11 @@
 ---
 id: TASK-143
 title: Keep character dictionary auto-sync non-blocking during startup
-status: Done
-assignee: []
+status: In Progress
+assignee:
+- codex
 created_date: '2026-03-09 01:45'
-updated_date: '2026-03-18 05:28'
+updated_date: '2026-03-20 09:22'
 labels:
 - dictionary
 - startup
@@ -33,8 +34,20 @@ Keep character dictionary auto-sync running in parallel during startup without d
- [x] #3 Regression coverage verifies auto-sync builds before the gate and only mutates Yomitan after the gate resolves.
<!-- AC:END -->

## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
1. Add a regression test for startup autoplay release surviving delayed mpv readiness or late subtitle refresh after dictionary sync.
2. Harden the autoplay-ready release path so paused startup keeps retrying until mpv is actually released or media changes, without resuming user-paused playback later.
3. Keep the existing character-dictionary revisit fixes and paused-startup OSD fixes aligned with the autoplay change, then run targeted runtime tests and typecheck.
<!-- SECTION:PLAN:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
Added a small current-media tokenization gate in main runtime. Media changes reset the gate, the first tokenization-ready event marks it ready, and auto-sync now waits on that gate only before Yomitan dictionary inspection/import/settings updates. Snapshot generation and merged ZIP build still run immediately in parallel.
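The gate described above can be sketched as a small promise-based latch. This is an illustrative sketch, not SubMiner's actual code; the names `TokenizationGate`, `reset`, `markReady`, and `waitUntilReady` are assumptions:

```typescript
// Hypothetical sketch of a current-media tokenization gate.
// reset() runs on media change, markReady() on the first
// tokenization-ready event, and auto-sync awaits waitUntilReady()
// only before Yomitan-mutating steps.
class TokenizationGate {
  private ready!: Promise<void>;
  private release!: () => void;

  constructor() {
    this.reset();
  }

  // Media changed: discard the old latch and arm a fresh one.
  reset(): void {
    this.ready = new Promise((resolve) => {
      this.release = resolve;
    });
  }

  // First tokenization-ready event for the current media.
  markReady(): void {
    this.release();
  }

  // Awaited by auto-sync before Yomitan inspection/import/settings.
  waitUntilReady(): Promise<void> {
    return this.ready;
  }
}

async function demo(): Promise<string[]> {
  const gate = new TokenizationGate();
  const order: string[] = [];
  const sync = gate.waitUntilReady().then(() => {
    order.push('yomitan-import'); // gated work
  });
  order.push('snapshot-build'); // ungated work runs immediately in parallel
  gate.markReady();
  await sync;
  return order;
}
```

The point of the latch shape is that ungated work (snapshot generation, ZIP build) never touches the promise, so it proceeds regardless of tokenization state.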

2026-03-20: User reports startup remains paused after annotations/tokenization are visible and only resumes after character-dictionary generation/import finishes. Investigating autoplay-ready release regression vs dictionary sync completion refresh.

2026-03-20: Added startup autoplay retry-budget helper so paused startup retries cover the full plugin gate window instead of only ~2.8s. Verification: bun test src/main/runtime/startup-autoplay-release-policy.test.ts src/main/runtime/character-dictionary-auto-sync.test.ts src/main/runtime/startup-osd-sequencer.test.ts src/main/runtime/character-dictionary-auto-sync-completion.test.ts; bun run typecheck; bun run test:fast; bun run test:env; bun run build; bun run test:smoke:dist; runtime-compat verifier passed at .tmp/skill-verification/subminer-verify-20260320-022106-nM28Nk. Pending real installed-app/mpv validation.
<!-- SECTION:NOTES:END -->
@@ -0,0 +1,67 @@
---
id: TASK-192
title: Fix stale anime cover art after AniList reassignment
status: Done
assignee:
- codex
created_date: '2026-03-20 00:12'
updated_date: '2026-03-20 00:14'
labels:
- stats
- immersion-tracker
- anilist
milestone: m-1
dependencies: []
references:
- src/core/services/immersion-tracker-service.ts
- src/core/services/immersion-tracker/query.ts
- src/core/services/immersion-tracker-service.test.ts
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Fix the stats anime-detail cover image path so reassigning an anime to a different AniList entry replaces the stored cover art bytes instead of keeping the previous image blob under updated metadata.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Reassigning an anime to a different AniList entry stores the new cover art bytes for that anime's videos
- [x] #2 Shared blob deduplication still works when multiple videos in the anime use the same new cover image
- [x] #3 Focused regression coverage proves stale cover blobs are replaced on reassignment
<!-- AC:END -->

## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
1. Add a failing regression test that reassigns an anime twice with different downloaded cover bytes and asserts the resolved cover updates.
2. Update cover-art upsert logic so new blob bytes generate a new shared hash instead of reusing an existing hash for the row.
3. Run the focused immersion tracker service test file and record the result.
<!-- SECTION:PLAN:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
2026-03-20: Created during live debugging of a user-reported stale anime profile picture after changing the AniList entry from the stats UI.

2026-03-20: Root cause was in `upsertCoverArt(...)`. When a row already had `cover_blob_hash`, a later AniList reassignment with a freshly downloaded cover reused the existing hash instead of hashing the new bytes, so the blob store kept serving the old image while metadata changed.

2026-03-20: Added a regression in `src/core/services/immersion-tracker-service.test.ts` that reassigns the same anime twice with different fetched image bytes and asserts the resolved anime cover changes to the second blob while both videos still deduplicate to one shared hash.

2026-03-20: Fixed `src/core/services/immersion-tracker/query.ts` so incoming cover blob bytes compute a fresh hash before falling back to an existing row hash. Existing hashes are now reused only when no new bytes were fetched.
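The corrected precedence can be sketched as follows. This is a minimal illustration of the "hash new bytes first, fall back to the row hash only when nothing new was fetched" rule; `resolveCoverBlobHash` is a hypothetical name, not SubMiner's actual upsert code:

```typescript
import { createHash } from 'node:crypto';

// Sketch of the fixed precedence: freshly fetched bytes always produce a
// fresh content hash; the existing row hash is only a fallback when no new
// bytes were downloaded during the reassignment.
function resolveCoverBlobHash(
  fetchedBytes: Uint8Array | undefined,
  existingRowHash: string | undefined,
): string | undefined {
  if (fetchedBytes !== undefined) {
    // New download wins: hash the new bytes so the blob store serves the
    // new image, while identical images still dedupe to one shared hash.
    return createHash('sha256').update(fetchedBytes).digest('hex');
  }
  // No new bytes fetched: keep whatever the row already referenced.
  return existingRowHash;
}
```

Content-addressing by hash preserves acceptance criterion #2: two videos fetching the same new cover bytes resolve to the same hash and share one blob.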

2026-03-20: Verification commands run:

- `bun test src/core/services/immersion-tracker-service.test.ts`
- `bash .agents/skills/subminer-change-verification/scripts/classify_subminer_diff.sh src/core/services/immersion-tracker/query.ts src/core/services/immersion-tracker-service.test.ts`
- `bash .agents/skills/subminer-change-verification/scripts/verify_subminer_change.sh --lane core src/core/services/immersion-tracker/query.ts src/core/services/immersion-tracker-service.test.ts`

2026-03-20: Verification results:

- focused service test: passed
- verifier lane selection: `core`
- verifier result: passed (`bun run typecheck`, `bun run test:fast`)
- verifier artifacts: `.tmp/skill-verification/subminer-verify-20260320-001433-IZLFqs/`
<!-- SECTION:NOTES:END -->

## Final Summary

<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Fixed stale anime cover art after AniList reassignment by correcting cover-blob hash replacement in the immersion tracker storage layer. Reassignments now store the new fetched image bytes instead of reusing the previous blob hash from the row, while still deduplicating the updated image across videos in the same anime.

Added focused regression coverage that reproduces the exact failure mode: same anime reassigned twice with different cover downloads, with the second image expected to replace the first. Verified with the touched service test file plus the SubMiner `core` verification lane.
<!-- SECTION:FINAL_SUMMARY:END -->
@@ -0,0 +1,33 @@
---
id: TASK-211
title: Recover anime episode progress from subtitle timing when checkpoints are missing
status: Done
assignee:
- '@Codex'
created_date: '2026-03-20 10:15'
updated_date: '2026-03-20 10:22'
labels:
- stats
- bug
milestone: m-1
dependencies: []
references:
- src/core/services/immersion-tracker/query.ts
- src/core/services/immersion-tracker/__tests__/query.test.ts
---

## Description

Anime episode progress can still show `0%` for older sessions that have watch-time and subtitle timing but no persisted `ended_media_ms` checkpoint. Recover progress from the latest retained subtitle/event segment end so already-recorded sessions render a useful progress percentage.

## Acceptance Criteria

- [x] `getAnimeEpisodes` returns the latest known session position even when `ended_media_ms` is null but subtitle/event timing exists.
- [x] Existing ended-session metrics and aggregation totals do not regress.
- [x] Regression coverage locks the fallback behavior.

## Implementation Notes

Added a query-side fallback for anime episode progress: when the newest session for a video has no persisted `ended_media_ms`, `getAnimeEpisodes` now uses the latest retained subtitle-line or session-event `segment_end_ms` from that same session. This recovers useful progress for already-recorded sessions that have timing data but predate or missed checkpoint persistence.
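The fallback order can be sketched like this. The types and function names are illustrative assumptions (the real values come from SQL rows in `query.ts`); only the column names `ended_media_ms` and `segment_end_ms` are from the task:

```typescript
// Hypothetical shape of the timing data available per session.
type SessionTiming = {
  endedMediaMs: number | null; // persisted checkpoint, may be missing
  latestSegmentEndMs: number | null; // max subtitle/event segment_end_ms retained
};

// Prefer the persisted checkpoint; otherwise recover the position from the
// latest retained subtitle/event segment end of the same session.
function resolveSessionPositionMs(timing: SessionTiming): number | null {
  if (timing.endedMediaMs !== null) {
    return timing.endedMediaMs;
  }
  return timing.latestSegmentEndMs;
}

// Progress percentage, guarded against missing or zero duration.
function progressPercent(positionMs: number | null, durationMs: number | null): number {
  if (positionMs === null || durationMs === null || durationMs <= 0) {
    return 0;
  }
  return Math.min(100, Math.round((positionMs / durationMs) * 100));
}
```

Note the checkpoint always wins when present, which is what keeps existing ended-session metrics from regressing.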

Verification: `bun test src/core/services/immersion-tracker/__tests__/query.test.ts` passed. `bun run typecheck` passed.
@@ -0,0 +1,43 @@
---
id: TASK-212
title: Fix mac texthooker helper startup blocking mpv launch
status: In Progress
assignee: []
created_date: '2026-03-20 08:27'
updated_date: '2026-03-20 08:45'
labels:
- bug
- macos
- startup
dependencies: []
references:
- /Users/sudacode/projects/japanese/SubMiner/src/core/services/startup.ts
- /Users/sudacode/projects/japanese/SubMiner/src/main.ts
- /Users/sudacode/projects/japanese/SubMiner/plugin/subminer/process.lua
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
`subminer` mpv auto-start on mac can stall before the video is usable because the helper process launched with `--texthooker` still runs heavy app-ready startup. Recent logs show the helper loading the Yomitan Chromium extension, emitting `Permission 'contextMenus' is unknown` warnings, then hitting Chromium runtime errors before SubMiner signals readiness back to the mpv plugin. The texthooker helper should take the minimal startup path needed to serve texthooker traffic without loading overlay/window-only startup work that can crash or delay readiness.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Launching SubMiner with `--texthooker` avoids heavy app-ready startup work that is not required for texthooker helper mode.
- [x] #2 A regression test covers texthooker helper startup so it fails if Yomitan extension loading is reintroduced on that path.
- [x] #3 The change preserves existing startup behavior for non-texthooker app launches.
<!-- AC:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
Follow-up: user confirmed the root issue is the plugin auto-start ordering. Adjust mpv plugin sequencing so `--start` launches before any separate `--texthooker` helper, then verify plugin regressions still pass.
<!-- SECTION:NOTES:END -->

## Final Summary

<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Fixed the mac mpv startup hang caused by the `--texthooker` helper taking the full app-ready path. `runAppReadyRuntime` now fast-paths texthooker-only mode through minimal startup (`reloadConfig` plus CLI handling) so it no longer loads Yomitan or first-run setup work before serving texthooker traffic. Added regression coverage in `src/core/services/app-ready.test.ts`, then verified with `bun test src/core/services/app-ready.test.ts src/core/services/startup.test.ts`, `bun test src/cli/args.test.ts src/main/early-single-instance.test.ts src/main/runtime/stats-cli-command.test.ts`, and `bun run typecheck`.
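The branch shape of that fast path can be sketched as below. Everything here except the names `runAppReadyRuntime` and `reloadConfig` is an assumption made for illustration; the dependency bag and return values are invented, not SubMiner's real API:

```typescript
// Hypothetical startup dependencies, injected so the branch is testable.
type StartupDeps = {
  reloadConfig: () => Promise<void>;
  handleCliCommands: () => Promise<void>;
  loadYomitanExtension: () => Promise<void>; // overlay/window-only work
  runFirstRunSetup: () => Promise<void>; // overlay/window-only work
};

// Sketch of the texthooker-only fast path: both modes need config and CLI
// handling, but helper mode returns before any Yomitan or first-run work,
// so readiness can be signaled back to the mpv plugin quickly.
async function runAppReadyRuntimeSketch(
  texthookerOnly: boolean,
  deps: StartupDeps,
): Promise<'texthooker-minimal' | 'full-app'> {
  await deps.reloadConfig();
  await deps.handleCliCommands();
  if (texthookerOnly) {
    return 'texthooker-minimal';
  }
  await deps.loadYomitanExtension();
  await deps.runFirstRunSetup();
  return 'full-app';
}
```

Injecting the heavy steps as dependencies is also what makes the regression test in acceptance criterion #2 cheap: the test can assert the Yomitan loader was never called on the helper path.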
<!-- SECTION:FINAL_SUMMARY:END -->
@@ -0,0 +1,42 @@
---
id: TASK-213
title: Show character dictionary progress during paused startup waits
status: In Progress
assignee: []
created_date: '2026-03-20 08:59'
updated_date: '2026-03-20 09:22'
labels:
- bug
- ux
- dictionary
- startup
dependencies: []
references:
- >-
  /Users/sudacode/projects/japanese/SubMiner/src/main/runtime/startup-osd-sequencer.ts
- >-
  /Users/sudacode/projects/japanese/SubMiner/src/main/runtime/character-dictionary-auto-sync-notifications.ts
- /Users/sudacode/projects/japanese/SubMiner/src/main.ts
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
During startup on mpv auto-start, character dictionary regeneration/update can be active while playback remains paused. The current startup OSD sequencer buffers dictionary progress behind annotation-loading OSD, which leaves the user with no visible dictionary-specific progress while the pause is active. Adjust the startup OSD sequencing so dictionary progress can surface once tokenization is ready during the paused startup window, without regressing later ready/failure handling.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 When tokenization is ready during startup, later character dictionary progress updates are shown on OSD even if annotation-loading state is still active.
- [ ] #2 Startup OSD completion/failure behavior for character dictionary sync remains coherent after the new progress ordering.
- [ ] #3 Regression coverage exercises the paused startup sequencing for dictionary progress.
<!-- AC:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
2026-03-20: Confirmed issue is broader than OSD-only. Paused-startup OSD fixes remain relevant, but current user report also points at a regression in non-blocking startup playback release (tracked in TASK-143).

2026-03-20: OSD sequencing fix remains in local patch alongside TASK-143 regression fix. Covered by startup-osd-sequencer tests; pending installed-app/mpv validation before task finalization.
<!-- SECTION:NOTES:END -->
changes/2026-03-19-texthooker-docs-bundle-update.md (new file, +4)

@@ -0,0 +1,4 @@
type: changed
area: docs

- Refreshed the vendored Texthooker docs/index.html bundle to match the latest local build artifacts.
@@ -0,0 +1,4 @@
type: fixed
area: stats

- Anime episode progress now falls back to the latest retained subtitle/event timing when a session is missing a persisted playback-position checkpoint, so older watch sessions no longer get stuck at `0%` progress.
@@ -1,46 +0,0 @@
<!-- read_when: changing known-word cache lifecycle, stats cache semantics, or Anki sync behavior -->

# Incremental Known-Word Cache Sync

## Goal

Stop rebuilding the entire known-word cache on startup or routine refreshes. Keep the cache correct through incremental reconciliation on the configured sync cadence, with an immediate append path for freshly mined cards.

## Scope

- Persist per-note extracted known-word snapshots beside the existing global `words` list.
- Replace startup refresh with load-only behavior.
- Make timed refresh diff current Anki note IDs against cached note IDs, then apply add/remove/edit deltas.
- Add `ankiConnect.knownWords.addMinedWordsImmediately`, default `true`.
- Keep full rebuild out of normal lifecycle; reserve it for explicit doctor tooling.

## Data Model

Persist versioned cache state with:

- `words`: deduplicated global known-word set for stats/UI consumers
- `notes`: record of `noteId -> extractedWords[]`
- `refreshedAtMs`
- `scope`

The in-memory manager derives the global set from the per-note snapshots during sync updates so deletes and edits can remove stale words safely.

## Sync Behavior

- Startup: load persisted state only
- Interval tick or explicit refresh command: run incremental sync
- Incremental sync:
  - query tracked note IDs for configured deck scope
  - remove note snapshots for note IDs that disappeared
  - fetch `notesInfo` for note IDs that are new or need field reconciliation
  - compare extracted words per note and update the global set
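The reconciliation the removed design describes can be sketched as follows. This is a simplified illustration, not the design's actual manager: the types are invented, and it refetches every current note's words rather than only new or changed notes as the steps above specify:

```typescript
// Hypothetical per-note cache state from the design's data model.
type CacheState = {
  notes: Map<number, string[]>; // noteId -> extractedWords[]
};

// One incremental sync pass: drop vanished notes, refresh snapshots for the
// current scope, then rederive the global set from per-note snapshots so
// deletes and edits remove stale words safely.
function reconcile(
  state: CacheState,
  currentNoteIds: Set<number>,
  fetchWords: (noteId: number) => string[], // stands in for notesInfo extraction
): Set<string> {
  // Remove snapshots for notes that disappeared from the deck scope.
  for (const noteId of [...state.notes.keys()]) {
    if (!currentNoteIds.has(noteId)) {
      state.notes.delete(noteId);
    }
  }
  // Add or refresh snapshots for the notes still in scope.
  for (const noteId of currentNoteIds) {
    state.notes.set(noteId, fetchWords(noteId));
  }
  // Derive the deduplicated global set from the per-note snapshots.
  const words = new Set<string>();
  for (const list of state.notes.values()) {
    for (const word of list) {
      words.add(word);
    }
  }
  return words;
}
```

Deriving the global set from snapshots, rather than mutating it directly, is what makes deletions safe: a word disappears from `words` exactly when no remaining note still carries it.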

## Immediate Mining Path

When SubMiner already has fresh `noteInfo` after mining or updating a note, append/update that note snapshot immediately if `addMinedWordsImmediately` is enabled.

## Verification

- focused cache manager tests for add/delete/edit reconciliation
- focused integration/config tests for startup behavior and new config flag
- config verification lane because defaults/schema/example change
@@ -48,6 +48,64 @@ function createContext(overrides: Partial<LauncherCommandContext> = {}): Launche
   };
 }

+type StatsTestArgOverrides = {
+  stats?: boolean;
+  statsBackground?: boolean;
+  statsCleanup?: boolean;
+  statsCleanupVocab?: boolean;
+  statsCleanupLifetime?: boolean;
+  statsStop?: boolean;
+  logLevel?: LauncherCommandContext['args']['logLevel'];
+};
+
+function createStatsTestHarness(overrides: StatsTestArgOverrides = {}) {
+  const context = createContext();
+  const forwarded: string[][] = [];
+  const removedPaths: string[] = [];
+  const createTempDir = (_prefix: string) => {
+    const created = `/tmp/subminer-stats-test`;
+    return created;
+  };
+  const joinPath = (...parts: string[]) => parts.join('/');
+  const removeDir = (targetPath: string) => {
+    removedPaths.push(targetPath);
+  };
+  const runAppCommandAttachedStub = async (
+    _appPath: string,
+    appArgs: string[],
+    _logLevel: LauncherCommandContext['args']['logLevel'],
+    _label: string,
+  ) => {
+    forwarded.push(appArgs);
+    return 0;
+  };
+  const waitForStatsResponseStub = async () => ({ ok: true, url: 'http://127.0.0.1:5175' });
+
+  context.args = {
+    ...context.args,
+    stats: true,
+    ...overrides,
+  };
+
+  return {
+    context,
+    forwarded,
+    removedPaths,
+    createTempDir,
+    joinPath,
+    removeDir,
+    runAppCommandAttachedStub,
+    waitForStatsResponseStub,
+    commandDeps: {
+      createTempDir,
+      joinPath,
+      runAppCommandAttached: runAppCommandAttachedStub,
+      waitForStatsResponse: waitForStatsResponseStub,
+      removeDir,
+    },
+  };
+}
+
 test('config command writes newline-terminated path via process adapter', () => {
   const writes: string[] = [];
   const context = createContext();
@@ -157,24 +215,11 @@ test('dictionary command throws if app handoff unexpectedly returns', () => {
 });

 test('stats command launches attached app command with response path', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  context.args.logLevel = 'debug';
-  const forwarded: string[][] = [];
-
-  const handled = await runStatsCommand(context, {
-    createTempDir: () => '/tmp/subminer-stats-test',
-    joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async (_appPath, appArgs) => {
-      forwarded.push(appArgs);
-      return 0;
-    },
-    waitForStatsResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
-    removeDir: () => {},
-  });
+  const harness = createStatsTestHarness({ stats: true, logLevel: 'debug' });
+  const handled = await runStatsCommand(harness.context, harness.commandDeps);

   assert.equal(handled, true);
-  assert.deepEqual(forwarded, [
+  assert.deepEqual(harness.forwarded, [
     [
       '--stats',
       '--stats-response-path',
@@ -183,50 +228,34 @@ test('stats command launches attached app command with response path', async ()
       'debug',
     ],
   ]);
+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats background command launches attached daemon control command with response path', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  (context.args as typeof context.args & { statsBackground?: boolean }).statsBackground = true;
-  const forwarded: string[][] = [];
-
-  const handled = await runStatsCommand(context, {
-    createTempDir: () => '/tmp/subminer-stats-test',
-    joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async (_appPath, appArgs) => {
-      forwarded.push(appArgs);
-      return 0;
-    },
-    waitForStatsResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
-    removeDir: () => {},
-  } as Parameters<typeof runStatsCommand>[1]);
+  const harness = createStatsTestHarness({ stats: true, statsBackground: true });
+  const handled = await runStatsCommand(harness.context, harness.commandDeps);

   assert.equal(handled, true);
-  assert.deepEqual(forwarded, [
+  assert.deepEqual(harness.forwarded, [
     [
       '--stats-daemon-start',
       '--stats-response-path',
       '/tmp/subminer-stats-test/response.json',
     ],
   ]);
+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats command waits for attached app exit after startup response', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  const forwarded: string[][] = [];
+  const harness = createStatsTestHarness({ stats: true });
   const started = new Promise<number>((resolve) => setTimeout(() => resolve(0), 20));

-  const statsCommand = runStatsCommand(context, {
-    createTempDir: () => '/tmp/subminer-stats-test',
-    joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async (_appPath, appArgs) => {
-      forwarded.push(appArgs);
+  const statsCommand = runStatsCommand(harness.context, {
+    ...harness.commandDeps,
+    runAppCommandAttached: async (...args) => {
+      await harness.runAppCommandAttachedStub(...args);
       return started;
     },
-    waitForStatsResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
-    removeDir: () => {},
   });
   const result = await Promise.race([
     statsCommand.then(() => 'resolved'),
@@ -237,53 +266,46 @@ test('stats command waits for attached app exit after startup response', async (

   const final = await statsCommand;
   assert.equal(final, true);
-  assert.deepEqual(forwarded, [
+  assert.deepEqual(harness.forwarded, [
     [
       '--stats',
       '--stats-response-path',
       '/tmp/subminer-stats-test/response.json',
     ],
   ]);
+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats command throws when attached app exits non-zero after startup response', async () => {
-  const context = createContext();
-  context.args.stats = true;
+  const harness = createStatsTestHarness({ stats: true });

   await assert.rejects(async () => {
-    await runStatsCommand(context, {
-      createTempDir: () => '/tmp/subminer-stats-test',
-      joinPath: (...parts) => parts.join('/'),
-      runAppCommandAttached: async () => {
+    await runStatsCommand(harness.context, {
+      ...harness.commandDeps,
+      runAppCommandAttached: async (...args) => {
+        await harness.runAppCommandAttachedStub(...args);
         await new Promise((resolve) => setTimeout(resolve, 10));
         return 3;
       },
-      waitForStatsResponse: async () => ({ ok: true, url: 'http://127.0.0.1:5175' }),
-      removeDir: () => {},
     });
   }, /Stats app exited with status 3\./);

+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats cleanup command forwards cleanup vocab flags to the app', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  context.args.statsCleanup = true;
-  context.args.statsCleanupVocab = true;
-  const forwarded: string[][] = [];
-
-  const handled = await runStatsCommand(context, {
-    createTempDir: () => '/tmp/subminer-stats-test',
-    joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async (_appPath, appArgs) => {
-      forwarded.push(appArgs);
-      return 0;
-    },
+  const harness = createStatsTestHarness({
+    stats: true,
+    statsCleanup: true,
+    statsCleanupVocab: true,
+  });
+  const handled = await runStatsCommand(harness.context, {
+    ...harness.commandDeps,
     waitForStatsResponse: async () => ({ ok: true }),
-    removeDir: () => {},
   });

   assert.equal(handled, true);
-  assert.deepEqual(forwarded, [
+  assert.deepEqual(harness.forwarded, [
     [
       '--stats',
       '--stats-response-path',
@@ -292,76 +314,62 @@ test('stats cleanup command forwards cleanup vocab flags to the app', async () =
       '--stats-cleanup-vocab',
     ],
   ]);
+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats stop command forwards stop flag to the app', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  (context.args as typeof context.args & { statsStop?: boolean }).statsStop = true;
-  const forwarded: string[][] = [];
+  const harness = createStatsTestHarness({ stats: true, statsStop: true });

-  const handled = await runStatsCommand(context, {
-    createTempDir: () => '/tmp/subminer-stats-test',
-    joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async (_appPath, appArgs) => {
-      forwarded.push(appArgs);
-      return 0;
-    },
+  const handled = await runStatsCommand(harness.context, {
+    ...harness.commandDeps,
     waitForStatsResponse: async () => ({ ok: true }),
-    removeDir: () => {},
   });

   assert.equal(handled, true);
-  assert.deepEqual(forwarded, [
+  assert.deepEqual(harness.forwarded, [
     [
       '--stats-daemon-stop',
       '--stats-response-path',
       '/tmp/subminer-stats-test/response.json',
     ],
   ]);
+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats stop command exits on process exit without waiting for startup response', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  (context.args as typeof context.args & { statsStop?: boolean }).statsStop = true;
+  const harness = createStatsTestHarness({ stats: true, statsStop: true });
   let waitedForResponse = false;

-  const handled = await runStatsCommand(context, {
-    createTempDir: () => '/tmp/subminer-stats-test',
-    joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async () => 0,
+  const handled = await runStatsCommand(harness.context, {
+    ...harness.commandDeps,
+    runAppCommandAttached: async (...args) => {
+      await harness.runAppCommandAttachedStub(...args);
+      return 0;
+    },
     waitForStatsResponse: async () => {
       waitedForResponse = true;
       return { ok: true };
     },
-    removeDir: () => {},
   });

   assert.equal(handled, true);
   assert.equal(waitedForResponse, false);
+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats cleanup command forwards lifetime rebuild flag to the app', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  context.args.statsCleanup = true;
-  context.args.statsCleanupLifetime = true;
-  const forwarded: string[][] = [];
-
-  const handled = await runStatsCommand(context, {
-    createTempDir: () => '/tmp/subminer-stats-test',
-    joinPath: (...parts) => parts.join('/'),
-    runAppCommandAttached: async (_appPath, appArgs) => {
-      forwarded.push(appArgs);
-      return 0;
-    },
+  const harness = createStatsTestHarness({
+    stats: true,
+    statsCleanup: true,
+    statsCleanupLifetime: true,
+  });
+  const handled = await runStatsCommand(harness.context, {
+    ...harness.commandDeps,
     waitForStatsResponse: async () => ({ ok: true }),
-    removeDir: () => {},
   });

   assert.equal(handled, true);
-  assert.deepEqual(forwarded, [
+  assert.deepEqual(harness.forwarded, [
     [
       '--stats',
       '--stats-response-path',
@@ -370,56 +378,64 @@ test('stats cleanup command forwards lifetime rebuild flag to the app', async ()
       '--stats-cleanup-lifetime',
     ],
   ]);
+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats command throws when stats response reports an error', async () => {
-  const context = createContext();
-  context.args.stats = true;
+  const harness = createStatsTestHarness({ stats: true });

   await assert.rejects(async () => {
-    await runStatsCommand(context, {
-      createTempDir: () => '/tmp/subminer-stats-test',
-      joinPath: (...parts) => parts.join('/'),
-      runAppCommandAttached: async () => 0,
+    await runStatsCommand(harness.context, {
+      ...harness.commandDeps,
+      runAppCommandAttached: async (...args) => {
+        await harness.runAppCommandAttachedStub(...args);
+        return 0;
+      },
       waitForStatsResponse: async () => ({
         ok: false,
         error: 'Immersion tracking is disabled in config.',
       }),
-      removeDir: () => {},
     });
   }, /Immersion tracking is disabled in config\./);

+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats cleanup command fails if attached app exits before startup response', async () => {
-  const context = createContext();
-  context.args.stats = true;
-  context.args.statsCleanup = true;
-  context.args.statsCleanupVocab = true;
+  const harness = createStatsTestHarness({
+    stats: true,
+    statsCleanup: true,
+    statsCleanupVocab: true,
+  });

   await assert.rejects(async () => {
-    await runStatsCommand(context, {
-      createTempDir: () => '/tmp/subminer-stats-test',
-      joinPath: (...parts) => parts.join('/'),
-      runAppCommandAttached: async () => 2,
+    await runStatsCommand(harness.context, {
+      ...harness.commandDeps,
+      runAppCommandAttached: async (...args) => {
+        await harness.runAppCommandAttachedStub(...args);
+        return 2;
+      },
       waitForStatsResponse: async () => {
         await new Promise((resolve) => setTimeout(resolve, 25));
         return { ok: true, url: 'http://127.0.0.1:5175' };
       },
-      removeDir: () => {},
     });
   }, /Stats app exited before startup response \(status 2\)\./);

+  assert.equal(harness.removedPaths.length, 1);
 });

 test('stats command aborts pending response wait when app exits before startup response', async () => {
-  const context = createContext();
-  context.args.stats = true;
+  const harness = createStatsTestHarness({ stats: true });
   let aborted = false;

   await assert.rejects(async () => {
-    await runStatsCommand(context, {
-      createTempDir: () => '/tmp/subminer-stats-test',
-      joinPath: (...parts) => parts.join('/'),
-      runAppCommandAttached: async () => 2,
+    await runStatsCommand(harness.context, {
+      ...harness.commandDeps,
+      runAppCommandAttached: async (...args) => {
+        await harness.runAppCommandAttachedStub(...args);
+        return 2;
+      },
       waitForStatsResponse: async (_responsePath, signal) =>
         await new Promise((resolve) => {
           signal?.addEventListener(
@@ -431,25 +447,24 @@ test('stats command aborts pending response wait when app exits before startup r
|
||||
{ once: true },
|
||||
);
|
||||
}),
|
||||
removeDir: () => {},
|
||||
});
|
||||
}, /Stats app exited before startup response \(status 2\)\./);
|
||||
|
||||
assert.equal(aborted, true);
|
||||
assert.equal(harness.removedPaths.length, 1);
|
||||
});
|
||||
|
||||
test('stats command aborts pending response wait when attached app fails to spawn', async () => {
|
||||
const context = createContext();
|
||||
context.args.stats = true;
|
||||
const harness = createStatsTestHarness({ stats: true });
|
||||
const spawnError = new Error('spawn failed');
|
||||
let aborted = false;
|
||||
|
||||
await assert.rejects(
|
||||
async () => {
|
||||
await runStatsCommand(context, {
|
||||
createTempDir: () => '/tmp/subminer-stats-test',
|
||||
joinPath: (...parts) => parts.join('/'),
|
||||
runAppCommandAttached: async () => {
|
||||
await runStatsCommand(harness.context, {
|
||||
...harness.commandDeps,
|
||||
runAppCommandAttached: async (...args) => {
|
||||
await harness.runAppCommandAttachedStub(...args);
|
||||
throw spawnError;
|
||||
},
|
||||
waitForStatsResponse: async (_responsePath, signal) =>
|
||||
@@ -463,27 +478,30 @@ test('stats command aborts pending response wait when attached app fails to spaw
|
||||
{ once: true },
|
||||
);
|
||||
}),
|
||||
removeDir: () => {},
|
||||
});
|
||||
},
|
||||
(error: unknown) => error === spawnError,
|
||||
);
|
||||
|
||||
assert.equal(aborted, true);
|
||||
assert.equal(harness.removedPaths.length, 1);
|
||||
});
|
||||
|
||||
test('stats cleanup command aborts pending response wait when app exits before startup response', async () => {
|
||||
const context = createContext();
|
||||
context.args.stats = true;
|
||||
context.args.statsCleanup = true;
|
||||
context.args.statsCleanupVocab = true;
|
||||
const harness = createStatsTestHarness({
|
||||
stats: true,
|
||||
statsCleanup: true,
|
||||
statsCleanupVocab: true,
|
||||
});
|
||||
let aborted = false;
|
||||
|
||||
await assert.rejects(async () => {
|
||||
await runStatsCommand(context, {
|
||||
createTempDir: () => '/tmp/subminer-stats-test',
|
||||
joinPath: (...parts) => parts.join('/'),
|
||||
runAppCommandAttached: async () => 2,
|
||||
await runStatsCommand(harness.context, {
|
||||
...harness.commandDeps,
|
||||
runAppCommandAttached: async (...args) => {
|
||||
await harness.runAppCommandAttachedStub(...args);
|
||||
return 2;
|
||||
},
|
||||
waitForStatsResponse: async (_responsePath, signal) =>
|
||||
await new Promise((resolve) => {
|
||||
signal?.addEventListener(
|
||||
@@ -495,9 +513,9 @@ test('stats cleanup command aborts pending response wait when app exits before s
|
||||
{ once: true },
|
||||
);
|
||||
}),
|
||||
removeDir: () => {},
|
||||
});
|
||||
}, /Stats app exited before startup response \(status 2\)\./);
|
||||
|
||||
assert.equal(aborted, true);
|
||||
assert.equal(harness.removedPaths.length, 1);
|
||||
});
|
||||
|
||||
@@ -372,12 +372,9 @@ function M.create(ctx)
        end)
    end

    launch_overlay_with_retry(1)
    if texthooker_enabled then
        ensure_texthooker_running(function()
            launch_overlay_with_retry(1)
        end)
    else
        launch_overlay_with_retry(1)
        ensure_texthooker_running(function() end)
    end
end

@@ -481,31 +478,33 @@ function M.create(ctx)
    state.texthooker_running = false
    disarm_auto_play_ready_gate()

    ensure_texthooker_running(function()
        local start_args = build_command_args("start")
        subminer_log("info", "process", "Starting overlay: " .. table.concat(start_args, " "))
        local start_args = build_command_args("start")
        subminer_log("info", "process", "Starting overlay: " .. table.concat(start_args, " "))

        state.overlay_running = true
        mp.command_native_async({
            name = "subprocess",
            args = start_args,
            playback_only = false,
            capture_stdout = true,
            capture_stderr = true,
        }, function(success, result, error)
            if not success or (result and result.status ~= 0) then
                state.overlay_running = false
                subminer_log(
                    "error",
                    "process",
                    "Overlay start failed: " .. (error or (result and result.stderr) or "unknown error")
                )
                show_osd("Restart failed")
            else
                show_osd("Restarted successfully")
            end
        end)
        state.overlay_running = true
        mp.command_native_async({
            name = "subprocess",
            args = start_args,
            playback_only = false,
            capture_stdout = true,
            capture_stderr = true,
        }, function(success, result, error)
            if not success or (result and result.status ~= 0) then
                state.overlay_running = false
                subminer_log(
                    "error",
                    "process",
                    "Overlay start failed: " .. (error or (result and result.stderr) or "unknown error")
                )
                show_osd("Restart failed")
            else
                show_osd("Restarted successfully")
            end
        end)

        if opts.texthooker_enabled then
            ensure_texthooker_running(function() end)
        end
    end)
end


@@ -344,6 +344,27 @@ local function count_start_calls(async_calls)
    return count
end

local function find_texthooker_call(async_calls)
    for _, call in ipairs(async_calls) do
        local args = call.args or {}
        for i = 1, #args do
            if args[i] == "--texthooker" then
                return call
            end
        end
    end
    return nil
end

local function find_call_index(async_calls, target_call)
    for index, call in ipairs(async_calls) do
        if call == target_call then
            return index
        end
    end
    return nil
end

local function find_control_call(async_calls, flag)
    for _, call in ipairs(async_calls) do
        local args = call.args or {}
@@ -643,6 +664,8 @@ do
    fire_event(recorded, "file-loaded")
    local start_call = find_start_call(recorded.async_calls)
    assert_true(start_call ~= nil, "auto-start should issue --start command")
    local texthooker_call = find_texthooker_call(recorded.async_calls)
    assert_true(texthooker_call ~= nil, "auto-start should issue texthooker helper command when enabled")
    assert_true(
        call_has_arg(start_call, "--show-visible-overlay"),
        "auto-start with visible overlay enabled should include --show-visible-overlay on --start"
@@ -655,6 +678,10 @@ do
        find_control_call(recorded.async_calls, "--show-visible-overlay") ~= nil,
        "auto-start with visible overlay enabled should issue a separate --show-visible-overlay command"
    )
    assert_true(
        find_call_index(recorded.async_calls, start_call) < find_call_index(recorded.async_calls, texthooker_call),
        "auto-start should launch --start before separate --texthooker helper startup"
    )
    assert_true(
        not has_property_set(recorded.property_sets, "pause", true),
        "auto-start visible overlay should not force pause without explicit pause-until-ready option"

@@ -176,6 +176,22 @@ test('runAppReadyRuntime skips heavy startup when shouldSkipHeavyStartup returns
  assert.ok(calls.indexOf('handleFirstRunSetup') < calls.indexOf('handleInitialArgs'));
});

test('runAppReadyRuntime uses minimal startup for texthooker-only mode', async () => {
  const { deps, calls } = makeDeps({
    texthookerOnlyMode: true,
    reloadConfig: () => calls.push('reloadConfig'),
    handleInitialArgs: () => calls.push('handleInitialArgs'),
  });

  await runAppReadyRuntime(deps);

  assert.deepEqual(calls, [
    'ensureDefaultConfigBootstrap',
    'reloadConfig',
    'handleInitialArgs',
  ]);
});

test('runAppReadyRuntime skips Jellyfin remote startup when dependency is not wired', async () => {
  const { deps, calls } = makeDeps({
    startJellyfinRemoteSession: undefined,

@@ -2128,6 +2128,129 @@ test('reassignAnimeAnilist deduplicates cover blobs and getCoverArt remains comp
  }
});

test('reassignAnimeAnilist replaces stale cover blobs when the AniList cover changes', async () => {
  const dbPath = makeDbPath();
  let tracker: ImmersionTrackerService | null = null;
  const originalFetch = globalThis.fetch;
  const initialCoverBlob = Buffer.from([1, 2, 3, 4]);
  const replacementCoverBlob = Buffer.from([9, 8, 7, 6]);
  let fetchCallCount = 0;

  try {
    globalThis.fetch = async () => {
      fetchCallCount += 1;
      const blob = fetchCallCount === 1 ? initialCoverBlob : replacementCoverBlob;
      return new Response(new Uint8Array(blob), {
        status: 200,
        headers: { 'Content-Type': 'image/jpeg' },
      });
    };
    const Ctor = await loadTrackerCtor();
    tracker = new Ctor({ dbPath });
    const privateApi = tracker as unknown as { db: DatabaseSync };

    privateApi.db.exec(`
      INSERT INTO imm_anime (
        anime_id,
        normalized_title_key,
        canonical_title,
        CREATED_DATE,
        LAST_UPDATE_DATE
      ) VALUES (
        1,
        'little witch academia',
        'Little Witch Academia',
        1000,
        1000
      );
      INSERT INTO imm_videos (
        video_id,
        video_key,
        canonical_title,
        source_type,
        duration_ms,
        anime_id,
        CREATED_DATE,
        LAST_UPDATE_DATE
      ) VALUES
        (
          1,
          'local:/tmp/lwa-1.mkv',
          'Little Witch Academia S01E01',
          1,
          0,
          1,
          1000,
          1000
        ),
        (
          2,
          'local:/tmp/lwa-2.mkv',
          'Little Witch Academia S01E02',
          1,
          0,
          1,
          1000,
          1000
        );
    `);

    await tracker.reassignAnimeAnilist(1, {
      anilistId: 33489,
      titleRomaji: 'Little Witch Academia',
      coverUrl: 'https://example.com/lwa-old.jpg',
    });

    await tracker.reassignAnimeAnilist(1, {
      anilistId: 100526,
      titleRomaji: 'Otome Game Sekai wa Mob ni Kibishii Sekai desu',
      coverUrl: 'https://example.com/mobseka-new.jpg',
    });

    const mediaRows = privateApi.db
      .prepare(
        `
          SELECT
            video_id AS videoId,
            anilist_id AS anilistId,
            cover_url AS coverUrl,
            cover_blob_hash AS coverBlobHash
          FROM imm_media_art
          ORDER BY video_id ASC
        `,
      )
      .all() as Array<{
        videoId: number;
        anilistId: number | null;
        coverUrl: string | null;
        coverBlobHash: string | null;
      }>;
    const blobRows = privateApi.db
      .prepare('SELECT blob_hash AS blobHash, cover_blob AS coverBlob FROM imm_cover_art_blobs')
      .all() as Array<{ blobHash: string; coverBlob: Buffer }>;
    const resolvedCover = await tracker.getAnimeCoverArt(1);

    assert.equal(fetchCallCount, 2);
    assert.equal(mediaRows.length, 2);
    assert.equal(mediaRows[0]?.anilistId, 100526);
    assert.equal(mediaRows[0]?.coverUrl, 'https://example.com/mobseka-new.jpg');
    assert.equal(mediaRows[0]?.coverBlobHash, mediaRows[1]?.coverBlobHash);
    assert.equal(blobRows.length, 1);
    assert.deepEqual(
      new Uint8Array(blobRows[0]?.coverBlob ?? Buffer.alloc(0)),
      new Uint8Array(replacementCoverBlob),
    );
    assert.deepEqual(
      new Uint8Array(resolvedCover?.coverBlob ?? Buffer.alloc(0)),
      new Uint8Array(replacementCoverBlob),
    );
  } finally {
    globalThis.fetch = originalFetch;
    tracker?.destroy();
    cleanupDbPath(dbPath);
  }
});

test('reassignAnimeAnilist preserves existing description when description is omitted', async () => {
  const dbPath = makeDbPath();
  let tracker: ImmersionTrackerService | null = null;

@@ -207,6 +207,78 @@ test('getAnimeEpisodes prefers the latest session media position when the latest
  }
});

test('getAnimeEpisodes falls back to the latest subtitle segment end when session progress checkpoints are missing', () => {
  const dbPath = makeDbPath();
  const db = new Database(dbPath);

  try {
    ensureSchema(db);
    const stmts = createTrackerPreparedStatements(db);
    const videoId = getOrCreateVideoRecord(db, 'local:/tmp/subtitle-progress-fallback.mkv', {
      canonicalTitle: 'Subtitle Progress Fallback',
      sourcePath: '/tmp/subtitle-progress-fallback.mkv',
      sourceUrl: null,
      sourceType: SOURCE_TYPE_LOCAL,
    });
    const animeId = getOrCreateAnimeRecord(db, {
      parsedTitle: 'Subtitle Progress Fallback Anime',
      canonicalTitle: 'Subtitle Progress Fallback Anime',
      anilistId: null,
      titleRomaji: null,
      titleEnglish: null,
      titleNative: null,
      metadataJson: null,
    });
    linkVideoToAnimeRecord(db, videoId, {
      animeId,
      parsedBasename: 'subtitle-progress-fallback.mkv',
      parsedTitle: 'Subtitle Progress Fallback Anime',
      parsedSeason: 1,
      parsedEpisode: 1,
      parserSource: 'fallback',
      parserConfidence: 1,
      parseMetadataJson: '{"episode":1}',
    });
    db.prepare('UPDATE imm_videos SET duration_ms = ? WHERE video_id = ?').run(24_000, videoId);

    const startedAtMs = 1_100_000;
    const sessionId = startSessionRecord(db, videoId, startedAtMs).sessionId;
    db.prepare(
      `
        UPDATE imm_sessions
        SET
          ended_at_ms = ?,
          status = 2,
          active_watched_ms = ?,
          LAST_UPDATE_DATE = ?
        WHERE session_id = ?
      `,
    ).run(startedAtMs + 10_000, 10_000, startedAtMs + 10_000, sessionId);
    stmts.eventInsertStmt.run(
      sessionId,
      startedAtMs + 9_000,
      EVENT_SUBTITLE_LINE,
      1,
      18_000,
      21_000,
      5,
      0,
      '{"line":"progress fallback"}',
      startedAtMs + 9_000,
      startedAtMs + 9_000,
    );

    const [episode] = getAnimeEpisodes(db, animeId);
    assert.ok(episode);
    assert.equal(episode?.endedMediaMs, 21_000);
    assert.equal(episode?.totalSessions, 1);
    assert.equal(episode?.totalActiveMs, 10_000);
  } finally {
    db.close();
    cleanupDbPath(dbPath);
  }
});

test('getSessionTimeline returns the full session when no limit is provided', () => {
  const dbPath = makeDbPath();
  const db = new Database(dbPath);

@@ -1745,10 +1745,38 @@ export function getAnimeEpisodes(db: DatabaseSync, animeId: number): AnimeEpisod
        v.parsed_episode AS episode,
        v.duration_ms AS durationMs,
        (
          SELECT s_recent.ended_media_ms
          SELECT COALESCE(
            s_recent.ended_media_ms,
            (
              SELECT MAX(line.segment_end_ms)
              FROM imm_subtitle_lines line
              WHERE line.session_id = s_recent.session_id
                AND line.segment_end_ms IS NOT NULL
            ),
            (
              SELECT MAX(event.segment_end_ms)
              FROM imm_session_events event
              WHERE event.session_id = s_recent.session_id
                AND event.segment_end_ms IS NOT NULL
            )
          )
          FROM imm_sessions s_recent
          WHERE s_recent.video_id = v.video_id
            AND s_recent.ended_media_ms IS NOT NULL
            AND (
              s_recent.ended_media_ms IS NOT NULL
              OR EXISTS (
                SELECT 1
                FROM imm_subtitle_lines line
                WHERE line.session_id = s_recent.session_id
                  AND line.segment_end_ms IS NOT NULL
              )
              OR EXISTS (
                SELECT 1
                FROM imm_session_events event
                WHERE event.session_id = s_recent.session_id
                  AND event.segment_end_ms IS NOT NULL
              )
            )
          ORDER BY
            COALESCE(s_recent.ended_at_ms, s_recent.LAST_UPDATE_DATE, s_recent.started_at_ms) DESC,
            s_recent.session_id DESC
@@ -2289,10 +2317,13 @@ export function upsertCoverArt(
  const sharedCoverBlobHash = findSharedCoverBlobHash(db, videoId, art.anilistId, art.coverUrl);
  const nowMs = Date.now();
  const coverBlob = normalizeCoverBlobBytes(art.coverBlob);
  let coverBlobHash = sharedCoverBlobHash ?? existing?.coverBlobHash ?? null;
  let coverBlobHash = sharedCoverBlobHash ?? null;
  if (!coverBlobHash && coverBlob && coverBlob.length > 0) {
    coverBlobHash = createHash('sha256').update(coverBlob).digest('hex');
  }
  if (!coverBlobHash && (!coverBlob || coverBlob.length === 0)) {
    coverBlobHash = existing?.coverBlobHash ?? null;
  }

  if (coverBlobHash && coverBlob && coverBlob.length > 0 && !sharedCoverBlobHash) {
    db.prepare(

@@ -200,6 +200,12 @@ export async function runAppReadyRuntime(deps: AppReadyRuntimeDeps): Promise<voi
    return;
  }

  if (deps.texthookerOnlyMode) {
    deps.reloadConfig();
    deps.handleInitialArgs();
    return;
  }

  if (deps.shouldUseMinimalStartup?.()) {
    deps.reloadConfig();
    deps.handleInitialArgs();

@@ -3741,6 +3741,98 @@ test('tokenizeSubtitle clears all annotations for kana-only demonstrative helper
  );
});

test('tokenizeSubtitle clears all annotations for explanatory pondering endings', async () => {
  const result = await tokenizeSubtitle(
    '俺どうかしちゃったのかな',
    makeDepsFromYomitanTokens(
      [
        { surface: '俺', reading: 'おれ', headword: '俺' },
        { surface: 'どうかしちゃった', reading: 'どうかしちゃった', headword: 'どうかしちゃう' },
        { surface: 'のかな', reading: 'のかな', headword: 'の' },
      ],
      {
        getFrequencyDictionaryEnabled: () => true,
        getFrequencyRank: (text) => (text === '俺' ? 19 : text === 'どうかしちゃう' ? 3200 : 77),
        getJlptLevel: (text) =>
          text === '俺' ? 'N5' : text === 'どうかしちゃう' ? 'N3' : text === 'の' ? 'N5' : null,
        isKnownWord: (text) => text === '俺' || text === 'の',
        getMinSentenceWordsForNPlusOne: () => 1,
        tokenizeWithMecab: async () => [
          {
            headword: '俺',
            surface: '俺',
            reading: 'オレ',
            startPos: 0,
            endPos: 1,
            partOfSpeech: PartOfSpeech.noun,
            pos1: '名詞',
            pos2: '代名詞',
            isMerged: false,
            isKnown: false,
            isNPlusOneTarget: false,
          },
          {
            headword: 'どうかしちゃう',
            surface: 'どうかしちゃった',
            reading: 'ドウカシチャッタ',
            startPos: 1,
            endPos: 8,
            partOfSpeech: PartOfSpeech.verb,
            pos1: '動詞',
            pos2: '自立',
            isMerged: false,
            isKnown: false,
            isNPlusOneTarget: false,
          },
          {
            headword: 'の',
            surface: 'のかな',
            reading: 'ノカナ',
            startPos: 8,
            endPos: 11,
            partOfSpeech: PartOfSpeech.other,
            pos1: '名詞|助動詞',
            pos2: '非自立',
            isMerged: false,
            isKnown: false,
            isNPlusOneTarget: false,
          },
        ],
      },
    ),
  );

  assert.deepEqual(
    result.tokens?.map((token) => ({
      surface: token.surface,
      headword: token.headword,
      isKnown: token.isKnown,
      isNPlusOneTarget: token.isNPlusOneTarget,
      frequencyRank: token.frequencyRank,
      jlptLevel: token.jlptLevel,
    })),
    [
      { surface: '俺', headword: '俺', isKnown: true, isNPlusOneTarget: false, frequencyRank: 19, jlptLevel: 'N5' },
      {
        surface: 'どうかしちゃった',
        headword: 'どうかしちゃう',
        isKnown: false,
        isNPlusOneTarget: true,
        frequencyRank: 3200,
        jlptLevel: 'N3',
      },
      {
        surface: 'のかな',
        headword: 'の',
        isKnown: false,
        isNPlusOneTarget: false,
        frequencyRank: undefined,
        jlptLevel: undefined,
      },
    ],
  );
});

test('tokenizeSubtitle keeps frequency for content-led merged token with trailing colloquial suffixes', async () => {
  const result = await tokenizeSubtitle(
    '張り切ってんじゃ',

@@ -234,6 +234,18 @@ test('shouldExcludeTokenFromSubtitleAnnotations excludes explanatory ending vari
  }
});

test('shouldExcludeTokenFromSubtitleAnnotations excludes explanatory pondering endings', () => {
  const token = makeToken({
    surface: 'のかな',
    headword: 'の',
    reading: 'ノカナ',
    pos1: '名詞|助動詞',
    pos2: '非自立',
  });

  assert.equal(shouldExcludeTokenFromSubtitleAnnotations(token), true);
});

test('shouldExcludeTokenFromSubtitleAnnotations excludes auxiliary-stem そうだ grammar tails', () => {
  const token = makeToken({
    surface: 'そうだ',

@@ -45,6 +45,7 @@ const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_TRAILING_PARTICLES = [
  'かな',
  'かね',
] as const;
const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_THOUGHT_SUFFIXES = ['か', 'かな', 'かね'] as const;
const SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDINGS = new Set(
  SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_PREFIXES.flatMap((prefix) =>
    SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_CORES.flatMap((core) =>
@@ -258,6 +259,16 @@ function isExcludedByTerm(token: MergedToken): boolean {
      continue;
    }

    if (
      SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_PREFIXES.some((prefix) =>
        SUBTITLE_ANNOTATION_EXCLUDED_EXPLANATORY_ENDING_THOUGHT_SUFFIXES.some(
          (suffix) => normalized === `${prefix}${suffix}`,
        ),
      )
    ) {
      return true;
    }

    if (
      SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(trimmed) ||
      SUBTITLE_ANNOTATION_EXCLUDED_TERMS.has(normalized) ||

src/main.ts
@@ -315,6 +315,7 @@ import {
  createFirstRunSetupService,
  shouldAutoOpenFirstRunSetup,
} from './main/runtime/first-run-setup-service';
import { resolveAutoplayReadyMaxReleaseAttempts } from './main/runtime/startup-autoplay-release-policy';
import {
  buildFirstRunSetupHtml,
  createMaybeFocusExistingFirstRunSetupWindowHandler,
@@ -1096,8 +1097,11 @@ function maybeSignalPluginAutoplayReady(

  // Fallback: repeatedly try to release pause for a short window in case startup
  // gate arming and tokenization-ready signal arrive out of order.
  const maxReleaseAttempts = options?.forceWhilePaused === true ? 14 : 3;
  const releaseRetryDelayMs = 200;
  const maxReleaseAttempts = resolveAutoplayReadyMaxReleaseAttempts({
    forceWhilePaused: options?.forceWhilePaused === true,
    retryDelayMs: releaseRetryDelayMs,
  });
  const attemptRelease = (attempt: number): void => {
    void (async () => {
      if (
@@ -3037,10 +3041,11 @@ const { appReadyRuntimeRunner } = composeAppReadyRuntime({
    Boolean(appState.initialArgs && isHeadlessInitialCommand(appState.initialArgs)),
  shouldUseMinimalStartup: () =>
    Boolean(
      appState.initialArgs?.stats &&
        (appState.initialArgs?.statsCleanup ||
          appState.initialArgs?.statsBackground ||
          appState.initialArgs?.statsStop),
      appState.initialArgs?.texthooker ||
        (appState.initialArgs?.stats &&
          (appState.initialArgs?.statsCleanup ||
            appState.initialArgs?.statsBackground ||
            appState.initialArgs?.statsStop)),
    ),
  shouldSkipHeavyStartup: () =>
    Boolean(
@@ -3130,6 +3135,39 @@ void initializeDiscordPresenceService();
const handleCliCommand = createCliCommandRuntimeHandler({
  handleTexthookerOnlyModeTransitionMainDeps: {
    isTexthookerOnlyMode: () => appState.texthookerOnlyMode,
    ensureOverlayStartupPrereqs: () => {
      if (appState.subtitlePosition === null) {
        loadSubtitlePosition();
      }
      if (appState.keybindings.length === 0) {
        appState.keybindings = resolveKeybindings(getResolvedConfig(), DEFAULT_KEYBINDINGS);
      }
      if (!appState.mpvClient) {
        appState.mpvClient = createMpvClientRuntimeService();
      }
      if (!appState.runtimeOptionsManager) {
        appState.runtimeOptionsManager = new RuntimeOptionsManager(
          () => configService.getConfig().ankiConnect,
          {
            applyAnkiPatch: (patch) => {
              if (appState.ankiIntegration) {
                appState.ankiIntegration.applyRuntimeConfigPatch(patch);
              }
            },
            getSubtitleStyleConfig: () => configService.getConfig().subtitleStyle,
            onOptionsChanged: () => {
              subtitleProcessingController.invalidateTokenizationCache();
              subtitlePrefetchService?.onSeek(lastObservedTimePos);
              broadcastRuntimeOptionsChanged();
              refreshOverlayShortcuts();
            },
          },
        );
      }
      if (!appState.subtitleTimingTracker) {
        appState.subtitleTimingTracker = new SubtitleTimingTracker();
      }
    },
    setTexthookerOnlyMode: (enabled) => {
      appState.texthookerOnlyMode = enabled;
    },

@@ -150,6 +150,59 @@ test('auto sync skips rebuild/import on unchanged revisit when merged dictionary
  assert.deepEqual(imports, ['/tmp/merged.zip']);
});

test('auto sync does not emit updating progress for unchanged revisit when merged dictionary is current', async () => {
  const userDataPath = makeTempDir();
  let importedRevision: string | null = null;
  let currentRun: string[] = [];
  const phaseHistory: string[][] = [];

  const runtime = createCharacterDictionaryAutoSyncRuntimeService({
    userDataPath,
    getConfig: () => ({
      enabled: true,
      maxLoaded: 3,
      profileScope: 'all',
    }),
    getOrCreateCurrentSnapshot: async () => ({
      mediaId: 7,
      mediaTitle: 'Frieren',
      entryCount: 100,
      fromCache: true,
      updatedAt: 1000,
    }),
    buildMergedDictionary: async () => ({
      zipPath: '/tmp/merged.zip',
      revision: 'rev-7',
      dictionaryTitle: 'SubMiner Character Dictionary',
      entryCount: 100,
    }),
    getYomitanDictionaryInfo: async () =>
      importedRevision
        ? [{ title: 'SubMiner Character Dictionary', revision: importedRevision }]
        : [],
    importYomitanDictionary: async () => {
      importedRevision = 'rev-7';
      return true;
    },
    deleteYomitanDictionary: async () => true,
    upsertYomitanDictionarySettings: async () => false,
    now: () => 1000,
    onSyncStatus: (event) => {
      currentRun.push(event.phase);
    },
  });

  currentRun = [];
  await runtime.runSyncNow();
  phaseHistory.push([...currentRun]);
  currentRun = [];
  await runtime.runSyncNow();
  phaseHistory.push([...currentRun]);

  assert.deepEqual(phaseHistory[0], ['building', 'importing', 'ready']);
  assert.deepEqual(phaseHistory[1], ['ready']);
});

test('auto sync updates MRU order without rebuilding merged dictionary when membership is unchanged', async () => {
  const userDataPath = makeTempDir();
  const sequence = [1, 2, 1];
@@ -217,6 +270,63 @@ test('auto sync updates MRU order without rebuilding merged dictionary when memb
  assert.deepEqual(state.activeMediaIds, ['1 - Title 1', '2 - Title 2']);
});

test('auto sync reimports existing merged zip without rebuilding on unchanged revisit', async () => {
  const userDataPath = makeTempDir();
  const dictionariesDir = path.join(userDataPath, 'character-dictionaries');
  fs.mkdirSync(dictionariesDir, { recursive: true });
  fs.writeFileSync(path.join(dictionariesDir, 'merged.zip'), 'cached-zip', 'utf8');
  const mergedBuilds: number[][] = [];
  const imports: string[] = [];
  let importedRevision: string | null = null;

  const runtime = createCharacterDictionaryAutoSyncRuntimeService({
    userDataPath,
    getConfig: () => ({
      enabled: true,
      maxLoaded: 3,
      profileScope: 'all',
    }),
    getOrCreateCurrentSnapshot: async () => ({
      mediaId: 7,
      mediaTitle: 'Frieren',
      entryCount: 100,
      fromCache: true,
      updatedAt: 1000,
    }),
    buildMergedDictionary: async (mediaIds) => {
      mergedBuilds.push([...mediaIds]);
      return {
        zipPath: '/tmp/merged.zip',
        revision: 'rev-7',
        dictionaryTitle: 'SubMiner Character Dictionary',
        entryCount: 100,
      };
    },
    getYomitanDictionaryInfo: async () =>
      importedRevision
        ? [{ title: 'SubMiner Character Dictionary', revision: importedRevision }]
        : [],
    importYomitanDictionary: async (zipPath) => {
      imports.push(zipPath);
      importedRevision = 'rev-7';
      return true;
    },
    deleteYomitanDictionary: async () => true,
    upsertYomitanDictionarySettings: async () => true,
    now: () => 1000,
  });

  await runtime.runSyncNow();
  importedRevision = null;
  await runtime.runSyncNow();

  assert.deepEqual(mergedBuilds, [[7]]);
  assert.deepEqual(imports, [
    '/tmp/merged.zip',
    path.join(userDataPath, 'character-dictionaries', 'merged.zip'),
  ]);
});

test('auto sync evicts least recently used media from merged set', async () => {
  const userDataPath = makeTempDir();
  const sequence = [1, 2, 3, 4];
@@ -537,12 +647,6 @@ test('auto sync emits progress events for start import and completion', async ()
      mediaTitle: 'Rascal Does Not Dream of Bunny Girl Senpai',
      message: 'Generating character dictionary for Rascal Does Not Dream of Bunny Girl Senpai...',
    },
    {
      phase: 'syncing',
      mediaId: 101291,
      mediaTitle: 'Rascal Does Not Dream of Bunny Girl Senpai',
      message: 'Updating character dictionary for Rascal Does Not Dream of Bunny Girl Senpai...',
    },
    {
      phase: 'building',
      mediaId: 101291,

@@ -275,12 +275,6 @@ export function createCharacterDictionaryAutoSyncRuntimeService(
    });
    currentMediaId = snapshot.mediaId;
    currentMediaTitle = snapshot.mediaTitle;
    deps.onSyncStatus?.({
      phase: 'syncing',
      mediaId: snapshot.mediaId,
      mediaTitle: snapshot.mediaTitle,
      message: buildSyncingMessage(snapshot.mediaTitle),
    });
    const state = readAutoSyncState(statePath);
    const nextActiveMediaIds = [
      {
@@ -360,7 +354,17 @@ export function createCharacterDictionaryAutoSyncRuntimeService(
      );
    }
    if (merged === null) {
      merged = await deps.buildMergedDictionary(nextActiveMediaIdValues);
      const existingMergedZipPath = path.join(dictionariesDir, 'merged.zip');
      if (fs.existsSync(existingMergedZipPath)) {
        merged = {
          zipPath: existingMergedZipPath,
          revision,
          dictionaryTitle,
          entryCount: snapshot.entryCount,
        };
      } else {
        merged = await deps.buildMergedDictionary(nextActiveMediaIdValues);
      }
    }
    deps.logInfo?.(`[dictionary:auto-sync] importing merged dictionary: ${merged.zipPath}`);
    const imported = await withOperationTimeout(

@@ -8,6 +8,7 @@ test('cli prechecks main deps builder maps transition handlers', () => {
isTexthookerOnlyMode: () => true,
setTexthookerOnlyMode: (enabled) => calls.push(`set:${enabled}`),
commandNeedsOverlayRuntime: () => true,
+ensureOverlayStartupPrereqs: () => calls.push('prereqs'),
startBackgroundWarmups: () => calls.push('warmups'),
logInfo: (message) => calls.push(`info:${message}`),
})();
@@ -15,7 +16,8 @@ test('cli prechecks main deps builder maps transition handlers', () => {
assert.equal(deps.isTexthookerOnlyMode(), true);
assert.equal(deps.commandNeedsOverlayRuntime({} as never), true);
deps.setTexthookerOnlyMode(false);
+deps.ensureOverlayStartupPrereqs();
deps.startBackgroundWarmups();
deps.logInfo('x');
-assert.deepEqual(calls, ['set:false', 'warmups', 'info:x']);
+assert.deepEqual(calls, ['set:false', 'prereqs', 'warmups', 'info:x']);
});

@@ -4,6 +4,7 @@ export function createBuildHandleTexthookerOnlyModeTransitionMainDepsHandler(dep
isTexthookerOnlyMode: () => boolean;
setTexthookerOnlyMode: (enabled: boolean) => void;
commandNeedsOverlayRuntime: (args: CliArgs) => boolean;
+ensureOverlayStartupPrereqs: () => void;
startBackgroundWarmups: () => void;
logInfo: (message: string) => void;
}) {
@@ -11,6 +12,7 @@ export function createBuildHandleTexthookerOnlyModeTransitionMainDepsHandler(dep
isTexthookerOnlyMode: () => deps.isTexthookerOnlyMode(),
setTexthookerOnlyMode: (enabled: boolean) => deps.setTexthookerOnlyMode(enabled),
commandNeedsOverlayRuntime: (args: CliArgs) => deps.commandNeedsOverlayRuntime(args),
+ensureOverlayStartupPrereqs: () => deps.ensureOverlayStartupPrereqs(),
startBackgroundWarmups: () => deps.startBackgroundWarmups(),
logInfo: (message: string) => deps.logInfo(message),
});

@@ -8,6 +8,7 @@ test('texthooker precheck no-ops when mode is disabled', () => {
isTexthookerOnlyMode: () => false,
setTexthookerOnlyMode: () => {},
commandNeedsOverlayRuntime: () => true,
+ensureOverlayStartupPrereqs: () => {},
startBackgroundWarmups: () => {
warmups += 1;
},
@@ -22,12 +23,16 @@ test('texthooker precheck disables mode and warms up on start command', () => {
let mode = true;
let warmups = 0;
let logs = 0;
+let prereqs = 0;
const handlePrecheck = createHandleTexthookerOnlyModeTransitionHandler({
isTexthookerOnlyMode: () => mode,
setTexthookerOnlyMode: (enabled) => {
mode = enabled;
},
commandNeedsOverlayRuntime: () => false,
+ensureOverlayStartupPrereqs: () => {
+prereqs += 1;
+},
startBackgroundWarmups: () => {
warmups += 1;
},
@@ -38,6 +43,7 @@ test('texthooker precheck disables mode and warms up on start command', () => {

handlePrecheck({ start: true, texthooker: false } as never);
assert.equal(mode, false);
+assert.equal(prereqs, 1);
assert.equal(warmups, 1);
assert.equal(logs, 1);
});
@@ -50,6 +56,7 @@ test('texthooker precheck no-ops for texthooker command', () => {
mode = enabled;
},
commandNeedsOverlayRuntime: () => true,
+ensureOverlayStartupPrereqs: () => {},
startBackgroundWarmups: () => {},
logInfo: () => {},
});

@@ -4,6 +4,7 @@ export function createHandleTexthookerOnlyModeTransitionHandler(deps: {
isTexthookerOnlyMode: () => boolean;
setTexthookerOnlyMode: (enabled: boolean) => void;
commandNeedsOverlayRuntime: (args: CliArgs) => boolean;
+ensureOverlayStartupPrereqs: () => void;
startBackgroundWarmups: () => void;
logInfo: (message: string) => void;
}) {
@@ -13,6 +14,7 @@ export function createHandleTexthookerOnlyModeTransitionHandler(deps: {
!args.texthooker &&
(args.start || deps.commandNeedsOverlayRuntime(args))
) {
+deps.ensureOverlayStartupPrereqs();
deps.setTexthookerOnlyMode(false);
deps.logInfo('Disabling texthooker-only mode after overlay/start command.');
deps.startBackgroundWarmups();

@@ -9,6 +9,7 @@ test('cli command runtime handler applies precheck and forwards command with con
isTexthookerOnlyMode: () => true,
setTexthookerOnlyMode: () => calls.push('set-mode'),
commandNeedsOverlayRuntime: () => true,
+ensureOverlayStartupPrereqs: () => calls.push('prereqs'),
startBackgroundWarmups: () => calls.push('warmups'),
logInfo: (message) => calls.push(`log:${message}`),
},
@@ -24,6 +25,7 @@ test('cli command runtime handler applies precheck and forwards command with con
handler({ start: true } as never);

assert.deepEqual(calls, [
+'prereqs',
'set-mode',
'log:Disabling texthooker-only mode after overlay/start command.',
'warmups',

@@ -87,6 +87,7 @@ test('media path change handler reports stop for empty path and probes media key
maybeProbeAnilistDuration: (mediaKey) => calls.push(`probe:${mediaKey}`),
ensureAnilistMediaGuess: (mediaKey) => calls.push(`guess:${mediaKey}`),
syncImmersionMediaState: () => calls.push('sync'),
+flushPlaybackPositionOnMediaPathClear: () => calls.push('flush-playback'),
scheduleCharacterDictionarySync: () => calls.push('dict-sync'),
signalAutoplayReadyIfWarm: (path) => calls.push(`autoplay:${path}`),
refreshDiscordPresence: () => calls.push('presence'),
@@ -94,6 +95,7 @@ test('media path change handler reports stop for empty path and probes media key

handler({ path: '' });
assert.deepEqual(calls, [
+'flush-playback',
'path:',
'stopped',
'restore-mpv-sub',
@@ -116,6 +118,7 @@ test('media path change handler signals autoplay-ready fast path for warm non-em
maybeProbeAnilistDuration: (mediaKey) => calls.push(`probe:${mediaKey}`),
ensureAnilistMediaGuess: (mediaKey) => calls.push(`guess:${mediaKey}`),
syncImmersionMediaState: () => calls.push('sync'),
+flushPlaybackPositionOnMediaPathClear: () => calls.push('flush-playback'),
scheduleCharacterDictionarySync: () => calls.push('dict-sync'),
signalAutoplayReadyIfWarm: (path) => calls.push(`autoplay:${path}`),
refreshDiscordPresence: () => calls.push('presence'),
@@ -133,6 +136,35 @@ test('media path change handler signals autoplay-ready fast path for warm non-em
]);
});

+test('media path change handler ignores playback flush for non-empty path', () => {
+const calls: string[] = [];
+const handler = createHandleMpvMediaPathChangeHandler({
+updateCurrentMediaPath: (path) => calls.push(`path:${path}`),
+reportJellyfinRemoteStopped: () => calls.push('stopped'),
+restoreMpvSubVisibility: () => calls.push('restore-mpv-sub'),
+getCurrentAnilistMediaKey: () => null,
+resetAnilistMediaTracking: (mediaKey) => calls.push(`reset:${String(mediaKey)}`),
+maybeProbeAnilistDuration: (mediaKey) => calls.push(`probe:${mediaKey}`),
+ensureAnilistMediaGuess: (mediaKey) => calls.push(`guess:${mediaKey}`),
+syncImmersionMediaState: () => calls.push('sync'),
+flushPlaybackPositionOnMediaPathClear: () => calls.push('flush-playback'),
+scheduleCharacterDictionarySync: () => calls.push('dict-sync'),
+signalAutoplayReadyIfWarm: (path) => calls.push(`autoplay:${path}`),
+refreshDiscordPresence: () => calls.push('presence'),
+});
+
+handler({ path: '/tmp/video.mkv' });
+assert.ok(!calls.includes('flush-playback'));
+assert.deepEqual(calls, [
+'path:/tmp/video.mkv',
+'reset:null',
+'sync',
+'dict-sync',
+'autoplay:/tmp/video.mkv',
+'presence',
+]);
+});
+
test('media title change handler clears guess state without re-scheduling character dictionary sync', () => {
const calls: string[] = [];
const deps: Parameters<typeof createHandleMpvMediaTitleChangeHandler>[0] & {

@@ -53,10 +53,14 @@ export function createHandleMpvMediaPathChangeHandler(deps: {
syncImmersionMediaState: () => void;
scheduleCharacterDictionarySync?: () => void;
signalAutoplayReadyIfWarm?: (path: string) => void;
+flushPlaybackPositionOnMediaPathClear?: (mediaPath: string) => void;
refreshDiscordPresence: () => void;
}) {
return ({ path }: { path: string | null }): void => {
const normalizedPath = typeof path === 'string' ? path : '';
+if (!normalizedPath) {
+deps.flushPlaybackPositionOnMediaPathClear?.(normalizedPath);
+}
deps.updateCurrentMediaPath(normalizedPath);
if (!normalizedPath) {
deps.reportJellyfinRemoteStopped();

@@ -44,6 +44,7 @@ test('main mpv event binder wires callbacks through to runtime deps', () => {
maybeProbeAnilistDuration: (mediaKey) => calls.push(`probe:${mediaKey}`),
ensureAnilistMediaGuess: (mediaKey) => calls.push(`guess:${mediaKey}`),
syncImmersionMediaState: () => calls.push('sync-immersion'),
+flushPlaybackPositionOnMediaPathClear: () => calls.push('flush-playback'),

updateCurrentMediaTitle: (title) => calls.push(`media-title:${title}`),
resetAnilistMediaGuessState: () => calls.push('reset-guess-state'),
@@ -86,4 +87,6 @@ test('main mpv event binder wires callbacks through to runtime deps', () => {
assert.ok(calls.includes('progress:normal'));
assert.ok(calls.includes('progress:force'));
assert.ok(calls.includes('presence-refresh'));
+assert.ok(calls.includes('sync-immersion'));
+assert.ok(calls.includes('flush-playback'));
});

@@ -56,6 +56,7 @@ export function createBindMpvMainEventHandlersHandler(deps: {
ensureAnilistMediaGuess: (mediaKey: string) => void;
syncImmersionMediaState: () => void;
signalAutoplayReadyIfWarm?: (path: string) => void;
+flushPlaybackPositionOnMediaPathClear?: (mediaPath: string) => void;

updateCurrentMediaTitle: (title: string) => void;
resetAnilistMediaGuessState: () => void;
@@ -114,6 +115,8 @@ export function createBindMpvMainEventHandlersHandler(deps: {
maybeProbeAnilistDuration: (mediaKey) => deps.maybeProbeAnilistDuration(mediaKey),
ensureAnilistMediaGuess: (mediaKey) => deps.ensureAnilistMediaGuess(mediaKey),
syncImmersionMediaState: () => deps.syncImmersionMediaState(),
+flushPlaybackPositionOnMediaPathClear: (mediaPath) =>
+deps.flushPlaybackPositionOnMediaPathClear?.(mediaPath),
signalAutoplayReadyIfWarm: (path) => deps.signalAutoplayReadyIfWarm?.(path),
scheduleCharacterDictionarySync: () => deps.scheduleCharacterDictionarySync?.(),
refreshDiscordPresence: () => deps.refreshDiscordPresence(),

@@ -7,7 +7,11 @@ test('mpv main event main deps map app state updates and delegate callbacks', as
const appState = {
initialArgs: { jellyfinPlay: true },
overlayRuntimeInitialized: true,
-mpvClient: { connected: true },
+mpvClient: {
+connected: true,
+currentTimePos: 12.25,
+requestProperty: async () => 18.75,
+},
immersionTracker: {
recordSubtitleLine: (text: string) => calls.push(`immersion-sub:${text}`),
handleMediaTitleUpdate: (title: string) => calls.push(`immersion-title:${title}`),
@@ -92,6 +96,8 @@ test('mpv main event main deps map app state updates and delegate callbacks', as
deps.recordPauseState(true);
deps.updateSubtitleRenderMetrics({});
deps.setPreviousSecondarySubVisibility(true);
+deps.flushPlaybackPositionOnMediaPathClear?.('');
+await Promise.resolve();

assert.equal(appState.currentSubText, 'sub');
assert.equal(appState.currentSubAssText, 'ass');
@@ -106,4 +112,6 @@ test('mpv main event main deps map app state updates and delegate callbacks', as
assert.ok(calls.includes('metrics'));
assert.ok(calls.includes('presence-refresh'));
assert.ok(calls.includes('restore-mpv-sub'));
+assert.ok(calls.includes('immersion-time:12.25'));
+assert.ok(calls.includes('immersion-time:18.75'));
});

@@ -4,7 +4,14 @@ export function createBuildBindMpvMainEventHandlersMainDepsHandler(deps: {
appState: {
initialArgs?: { jellyfinPlay?: unknown } | null;
overlayRuntimeInitialized: boolean;
-mpvClient: { connected?: boolean; currentSecondarySubText?: string } | null;
+mpvClient:
+| {
+connected?: boolean;
+currentSecondarySubText?: string;
+currentTimePos?: number;
+requestProperty?: (name: string) => Promise<unknown>;
+}
+| null;
immersionTracker: {
recordSubtitleLine?: (
text: string,
@@ -21,6 +28,7 @@ export function createBuildBindMpvMainEventHandlersMainDepsHandler(deps: {
subtitleTimingTracker: {
recordSubtitle?: (text: string, start: number, end: number) => void;
} | null;
+currentMediaPath?: string | null;
currentSubText: string;
currentSubAssText: string;
currentSubtitleData?: SubtitleData | null;
@@ -58,6 +66,15 @@ export function createBuildBindMpvMainEventHandlersMainDepsHandler(deps: {
ensureImmersionTrackerInitialized: () => void;
tokenizeSubtitleForImmersion?: (text: string) => Promise<SubtitleData | null>;
}) {
+const writePlaybackPositionFromMpv = (timeSec: unknown): void => {
+const normalizedTimeSec = Number(timeSec);
+if (!Number.isFinite(normalizedTimeSec)) {
+return;
+}
+deps.ensureImmersionTrackerInitialized();
+deps.appState.immersionTracker?.recordPlaybackPosition?.(normalizedTimeSec);
+};
+
return () => ({
reportJellyfinRemoteStopped: () => deps.reportJellyfinRemoteStopped(),
syncOverlayMpvSubtitleSuppression: () => deps.syncOverlayMpvSubtitleSuppression(),
@@ -161,6 +178,25 @@ export function createBuildBindMpvMainEventHandlersMainDepsHandler(deps: {
deps.ensureImmersionTrackerInitialized();
deps.appState.immersionTracker?.recordPauseState?.(paused);
},
+flushPlaybackPositionOnMediaPathClear: (mediaPath: string) => {
+const mpvClient = deps.appState.mpvClient;
+const currentKnownTime = Number(mpvClient?.currentTimePos);
+writePlaybackPositionFromMpv(currentKnownTime);
+if (!mpvClient?.requestProperty) {
+return;
+}
+void mpvClient.requestProperty('time-pos').then((timePos) => {
+const currentPath = (deps.appState.currentMediaPath ?? '').trim();
+if (currentPath.length > 0 && currentPath !== mediaPath) {
+return;
+}
+const resolvedTime = Number(timePos);
+if (Number.isFinite(currentKnownTime) && Number.isFinite(resolvedTime) && currentKnownTime === resolvedTime) {
+return;
+}
+writePlaybackPositionFromMpv(resolvedTime);
+});
+},
updateSubtitleRenderMetrics: (patch: Record<string, unknown>) =>
deps.updateSubtitleRenderMetrics(patch),
setPreviousSecondarySubVisibility: (visible: boolean) => {

src/main/runtime/startup-autoplay-release-policy.test.ts (new file, 32 lines)
@@ -0,0 +1,32 @@
+import assert from 'node:assert/strict';
+import test from 'node:test';
+import {
+DEFAULT_AUTOPLAY_RELEASE_RETRY_DELAY_MS,
+resolveAutoplayReadyMaxReleaseAttempts,
+STARTUP_AUTOPLAY_RELEASE_TIMEOUT_MS,
+} from './startup-autoplay-release-policy';
+
+test('autoplay release keeps the short retry budget for normal playback signals', () => {
+assert.equal(resolveAutoplayReadyMaxReleaseAttempts(), 3);
+assert.equal(resolveAutoplayReadyMaxReleaseAttempts({ forceWhilePaused: false }), 3);
+});
+
+test('autoplay release uses the full startup timeout window while paused', () => {
+assert.equal(
+resolveAutoplayReadyMaxReleaseAttempts({ forceWhilePaused: true }),
+Math.ceil(
+STARTUP_AUTOPLAY_RELEASE_TIMEOUT_MS / DEFAULT_AUTOPLAY_RELEASE_RETRY_DELAY_MS,
+),
+);
+});
+
+test('autoplay release rounds up custom paused retry budgets to cover the timeout window', () => {
+assert.equal(
+resolveAutoplayReadyMaxReleaseAttempts({
+forceWhilePaused: true,
+retryDelayMs: 300,
+startupTimeoutMs: 1_000,
+}),
+4,
+);
+});
src/main/runtime/startup-autoplay-release-policy.ts (new file, 28 lines)
@@ -0,0 +1,28 @@
+const DEFAULT_AUTOPLAY_RELEASE_RETRY_DELAY_MS = 200;
+const STARTUP_AUTOPLAY_RELEASE_TIMEOUT_MS = 15_000;
+
+export function resolveAutoplayReadyMaxReleaseAttempts(options?: {
+forceWhilePaused?: boolean;
+retryDelayMs?: number;
+startupTimeoutMs?: number;
+}): number {
+if (options?.forceWhilePaused !== true) {
+return 3;
+}
+
+const retryDelayMs = Math.max(
+1,
+Math.floor(options.retryDelayMs ?? DEFAULT_AUTOPLAY_RELEASE_RETRY_DELAY_MS),
+);
+const startupTimeoutMs = Math.max(
+retryDelayMs,
+Math.floor(options.startupTimeoutMs ?? STARTUP_AUTOPLAY_RELEASE_TIMEOUT_MS),
+);
+
+return Math.max(3, Math.ceil(startupTimeoutMs / retryDelayMs));
+}
+
+export {
+DEFAULT_AUTOPLAY_RELEASE_RETRY_DELAY_MS,
+STARTUP_AUTOPLAY_RELEASE_TIMEOUT_MS,
+};
@@ -62,7 +62,10 @@ test('startup OSD buffers checking behind annotations and replaces it with later
makeDictionaryEvent('generating', 'Generating character dictionary for Frieren...'),
);

-assert.deepEqual(osdMessages, ['Loading subtitle annotations |']);
+assert.deepEqual(osdMessages, [
+'Loading subtitle annotations |',
+'Generating character dictionary for Frieren...',
+]);

sequencer.markAnnotationLoadingComplete('Subtitle annotations loaded');

@@ -154,3 +157,30 @@ test('startup OSD reset keeps tokenization ready after first warmup', () => {

assert.deepEqual(osdMessages, ['Updating character dictionary for Frieren...']);
});
+
+test('startup OSD shows later dictionary progress immediately once tokenization is ready', () => {
+const osdMessages: string[] = [];
+const sequencer = createStartupOsdSequencer({
+showOsd: (message) => {
+osdMessages.push(message);
+},
+});
+
+sequencer.showAnnotationLoading('Loading subtitle annotations |');
+sequencer.markTokenizationReady();
+sequencer.notifyCharacterDictionaryStatus(
+makeDictionaryEvent('generating', 'Generating character dictionary for Frieren...'),
+);
+
+assert.deepEqual(osdMessages, [
+'Loading subtitle annotations |',
+'Generating character dictionary for Frieren...',
+]);
+
+sequencer.markAnnotationLoadingComplete('Subtitle annotations loaded');
+
+assert.deepEqual(osdMessages, [
+'Loading subtitle annotations |',
+'Generating character dictionary for Frieren...',
+]);
+});

@@ -25,6 +25,9 @@ export function createStartupOsdSequencer(deps: { showOsd: (message: string) =>
return false;
}
if (pendingDictionaryProgress) {
+if (dictionaryProgressShown) {
+return true;
+}
deps.showOsd(pendingDictionaryProgress.message);
dictionaryProgressShown = true;
return true;
@@ -84,6 +87,9 @@ export function createStartupOsdSequencer(deps: { showOsd: (message: string) =>
if (canShowDictionaryStatus()) {
deps.showOsd(event.message);
dictionaryProgressShown = true;
+} else if (tokenizationReady) {
+deps.showOsd(event.message);
+dictionaryProgressShown = true;
}
return;
}

@@ -18,8 +18,8 @@ const summary: OverviewSummary = {
activeDays: 12,
totalSessions: 15,
lookupRate: {
-shortValue: '2.3 / 100 tokens',
-longValue: '2.3 lookups per 100 tokens',
+shortValue: '2.3 / 100 words',
+longValue: '2.3 lookups per 100 words',
},
todayTokens: 0,
newWordsToday: 0,
@@ -33,8 +33,8 @@ test('TrackingSnapshot renders Yomitan lookup rate copy on the homepage card', (
);

assert.match(markup, /Lookup Rate/);
-assert.match(markup, /2\.3 \/ 100 tokens/);
-assert.match(markup, /Lifetime Yomitan lookups normalized by total tokens seen/);
+assert.match(markup, /2\.3 \/ 100 words/);
+assert.match(markup, /Lifetime Yomitan lookups normalized by total words seen/);
});

test('TrackingSnapshot labels new words as unique headwords', () => {

@@ -183,7 +183,7 @@ export function TrendsTab() {
color={cardsMinedColor}
type="bar"
/>
-<TrendChart title="Tokens Seen" data={data.activity.words} color="#8bd5ca" type="bar" />
+<TrendChart title="Words Seen" data={data.activity.words} color="#8bd5ca" type="bar" />
<TrendChart title="Sessions" data={data.activity.sessions} color="#b7bdf8" type="bar" />

<SectionHeader>Period Trends</SectionHeader>
@@ -194,7 +194,7 @@ export function TrendsTab() {
type="line"
/>
<TrendChart title="Sessions" data={data.progress.sessions} color="#b7bdf8" type="line" />
-<TrendChart title="Tokens Seen" data={data.progress.words} color="#8bd5ca" type="line" />
+<TrendChart title="Words Seen" data={data.progress.words} color="#8bd5ca" type="line" />
<TrendChart
title="New Words Seen"
data={data.progress.newWords}
@@ -215,7 +215,7 @@ export function TrendsTab() {
/>
<TrendChart title="Lookups" data={data.progress.lookups} color="#f5bde6" type="line" />
<TrendChart
-title="Lookups / 100 Tokens"
+title="Lookups / 100 Words"
data={data.ratios.lookupsPerHundred}
color="#f5a97f"
type="line"
@@ -246,7 +246,7 @@ export function TrendsTab() {
data={filteredCardsPerAnime}
colorPalette={cardsMinedStackedColors}
/>
-<StackedTrendChart title="Tokens Seen per Anime" data={filteredWordsPerAnime} />
+<StackedTrendChart title="Words Seen per Anime" data={filteredWordsPerAnime} />
<StackedTrendChart title="Lookups per Anime" data={filteredLookupsPerAnime} />
<StackedTrendChart
title="Lookups/100w per Anime"
@@ -261,7 +261,7 @@ export function TrendsTab() {
data={filteredCardsProgress}
colorPalette={cardsMinedStackedColors}
/>
-<StackedTrendChart title="Tokens Seen Progress" data={filteredWordsProgress} />
+<StackedTrendChart title="Words Seen Progress" data={filteredWordsProgress} />

<SectionHeader>Patterns</SectionHeader>
<TrendChart

@@ -85,8 +85,8 @@ test('buildOverviewSummary aggregates tracked totals and recent windows', () =>
assert.equal(summary.activeDays, 2);
assert.equal(summary.totalSessions, 15);
assert.deepEqual(summary.lookupRate, {
-shortValue: '2.3 / 100 tokens',
-longValue: '2.3 lookups per 100 tokens',
+shortValue: '2.3 / 100 words',
+longValue: '2.3 lookups per 100 words',
});
});

@@ -35,6 +35,6 @@ test('MediaSessionList renders expandable session rows with delete affordance',
assert.match(markup, /Session History/);
assert.match(markup, /aria-expanded="true"/);
assert.match(markup, /Delete session Episode 7/);
-assert.match(markup, /tokens/);
-assert.match(markup, /No token data for this session/);
+assert.match(markup, /words/);
+assert.match(markup, /No word data for this session/);
});

@@ -30,7 +30,7 @@ test('SessionDetail omits the misleading new words metric', () => {
/>,
);

-assert.match(markup, /No token data/);
+assert.match(markup, /No word data/);
assert.doesNotMatch(markup, /New words/);
});

@@ -8,10 +8,10 @@ import { SessionRow } from '../components/sessions/SessionRow';
import { EventType, type SessionEvent } from '../types/stats';
import { buildLookupRateDisplay, getYomitanLookupEvents } from './yomitan-lookup';

-test('buildLookupRateDisplay formats lookups per 100 tokens in short and long forms', () => {
+test('buildLookupRateDisplay formats lookups per 100 words in short and long forms', () => {
assert.deepEqual(buildLookupRateDisplay(23, 1000), {
-shortValue: '2.3 / 100 tokens',
-longValue: '2.3 lookups per 100 tokens',
+shortValue: '2.3 / 100 words',
+longValue: '2.3 lookups per 100 words',
});
assert.equal(buildLookupRateDisplay(0, 0), null);
});
@@ -49,11 +49,11 @@ test('MediaHeader renders Yomitan lookup count and lookup rate copy', () => {
);

assert.match(markup, /23/);
-assert.match(markup, /2\.3 \/ 100 tokens/);
-assert.match(markup, /2\.3 lookups per 100 tokens/);
+assert.match(markup, /2\.3 \/ 100 words/);
+assert.match(markup, /2\.3 lookups per 100 words/);
});

-test('MediaHeader distinguishes token occurrences from known unique words', () => {
+test('MediaHeader distinguishes word occurrences from known unique words', () => {
const markup = renderToStaticMarkup(
<MediaHeader
detail={{
@@ -76,7 +76,7 @@ test('MediaHeader distinguishes token occurrences from known unique words', () =
/>,
);

-assert.match(markup, /token occurrences/);
+assert.match(markup, /word occurrences/);
assert.match(markup, /known unique words \(50%\)/);
assert.match(markup, /17 \/ 34/);
});
@@ -105,7 +105,7 @@ test('EpisodeList renders per-episode Yomitan lookup rate', () => {
);

assert.match(markup, /Lookup Rate/);
-assert.match(markup, /2\.0 \/ 100 tokens/);
+assert.match(markup, /2\.0 \/ 100 words/);
assert.match(markup, /6%/);
assert.doesNotMatch(markup, /90%/);
});
@@ -139,11 +139,11 @@ test('AnimeOverviewStats renders aggregate Yomitan lookup metrics', () => {

assert.match(markup, /Lookups/);
assert.match(markup, /16/);
-assert.match(markup, /2\.0 \/ 100 tokens/);
-assert.match(markup, /Yomitan lookups per 100 tokens seen/);
+assert.match(markup, /2\.0 \/ 100 words/);
+assert.match(markup, /Yomitan lookups per 100 words seen/);
});

-test('SessionRow prefers token-based word count when available', () => {
+test('SessionRow prefers word-based count when available', () => {
const markup = renderToStaticMarkup(
<SessionRow
session={{