Add backlog tasks and launcher time helper tests

- Track follow-up cleanup work in Backlog.md
- Replace Date.now usage with shared nowMs helper
- Add launcher args/parser and core regression tests
2026-03-27 02:01:36 -07:00
parent a3ddfa0641
commit 854179b9c1
32 changed files with 2357 additions and 152 deletions

Backlog.md Normal file

@@ -0,0 +1,194 @@
# Backlog
Purpose: lightweight repo-local task board. Seeded with current testing / coverage work.
Status keys:
- `todo`: not started
- `doing`: in progress
- `blocked`: waiting
- `done`: shipped
Priority keys:
- `P0`: urgent / release-risk
- `P1`: high value
- `P2`: useful cleanup
- `P3`: nice-to-have
## Active
None.
## Ready
| ID | Pri | Status | Area | Title |
| ------ | --- | ------ | ----------------- | ---------------------------------------------------------------- |
| SM-001 | P1 | todo | launcher | Add tests for CLI parser and args normalizer |
| SM-002 | P1 | todo | immersion-tracker | Backfill tests for uncovered query exports |
| SM-003 | P1 | todo | anki | Add focused field-grouping service + merge edge-case tests |
| SM-004 | P2 | todo | tests | Extract shared test utils for deps factories and polling helpers |
| SM-005 | P2 | todo | tests | Strengthen weak assertions in app-ready and IPC tests |
| SM-006 | P2 | todo | tests | Break up monolithic youtube-flow and subtitle-sidebar tests |
| SM-007 | P2 | todo | anilist | Add tests for AniList rate limiter |
| SM-008 | P3 | todo | subtitles | Add core subtitle-position persistence/path tests |
| SM-009 | P3 | todo | tokenizer | Add tests for JLPT token filter |
| SM-010 | P1 | todo | immersion-tracker | Refactor storage + immersion-tracker service into focused modules |
## Icebox
None.
## Ticket Details
### SM-001
Title: Add tests for CLI parser and args normalizer
Priority: P1
Status: todo
Scope:
- `launcher/config/cli-parser-builder.ts`
- `launcher/config/args-normalizer.ts`
Acceptance:
- root options parsing covered
- subcommand routing covered
- invalid action / invalid log level / invalid backend cases covered
- target classification covered: file, directory, URL, invalid
### SM-002
Title: Backfill tests for uncovered query exports
Priority: P1
Status: todo
Scope:
- `src/core/services/immersion-tracker/query-*.ts`
Targets:
- headword helpers
- anime/media detail helpers not covered by existing wrapper tests
- lexical detail / appearance helpers
- maintenance helpers beyond `deleteSession` and `upsertCoverArt`
Acceptance:
- every exported query helper is either directly tested or explicitly justified as covered elsewhere
- at least one focused regression per complex SQL branch / aggregation branch
### SM-003
Title: Add focused field-grouping service + merge edge-case tests
Priority: P1
Status: todo
Scope:
- `src/anki-integration/field-grouping.ts`
- `src/anki-integration/field-grouping-merge.ts`
Acceptance:
- auto/manual/disabled flow branches covered
- duplicate-card preview failure path covered
- merge edge cases covered: empty fields, generated media fallback, strict grouped spans, audio synchronization
### SM-004
Title: Extract shared test utils for deps factories and polling helpers
Priority: P2
Status: todo
Scope:
- common `makeDeps` / `createDeps` helpers
- common `waitForCondition`
Acceptance:
- shared helper module added
- at least 3 duplicated polling helpers removed
- at least 5 duplicated deps factories consolidated or clearly prepared for follow-up migration
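A minimal sketch of what the shared polling helper could look like (the `waitForCondition` name comes from the scope above; the timeout and poll-interval defaults are assumptions, not the real signatures being consolidated):

```typescript
// Hypothetical shape for the shared polling helper proposed here; the real
// signature should be settled once the duplicated copies are compared.
async function waitForCondition(
  check: () => boolean,
  timeoutMs = 2000,
  pollIntervalMs = 25,
): Promise<void> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    if (check()) return;
    await new Promise((resolve) => setTimeout(resolve, pollIntervalMs));
  }
  throw new Error(`condition not met within ${timeoutMs}ms`);
}
```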
### SM-005
Title: Strengthen weak assertions in app-ready and IPC tests
Priority: P2
Status: todo
Scope:
- `src/core/services/app-ready.test.ts`
- `src/core/services/ipc.test.ts`
Acceptance:
- replace broad `assert.ok(...)` presence checks with exact value / order assertions where the expected value is known
- handler registration tests assert channel-specific behavior, not only existence
### SM-006
Title: Break up monolithic youtube-flow and subtitle-sidebar tests
Priority: P2
Status: todo
Scope:
- `src/main/runtime/youtube-flow.test.ts`
- `src/renderer/modals/subtitle-sidebar.test.ts`
Acceptance:
- reduce single-test breadth
- split largest tests into focused cases by behavior
- keep semantics unchanged
### SM-007
Title: Add tests for AniList rate limiter
Priority: P2
Status: todo
Scope:
- `src/core/services/anilist/rate-limiter.ts`
Acceptance:
- capacity-window wait behavior covered
- `x-ratelimit-remaining` + reset handling covered
- `retry-after` handling covered
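As a sketch only (the limiter's real API is not shown in this diff), this is the kind of `retry-after` arithmetic the new tests would pin down, driven by a fixed fake clock so the test never sleeps:

```typescript
// Hypothetical helper mirroring logic these tests would cover: given a
// Retry-After header value (seconds) and a clock reading, compute the
// epoch-ms time before which no further request should be sent.
// Name and shape are assumptions, not the real rate-limiter API.
function retryAfterDeadline(retryAfterHeader: string, nowMs: number): number {
  const seconds = Number(retryAfterHeader);
  if (!Number.isFinite(seconds) || seconds < 0) return nowMs; // malformed: no wait
  return nowMs + seconds * 1000;
}

// A fixed fake clock keeps the expected deadline deterministic.
const fakeNow = 1_000_000;
console.assert(retryAfterDeadline('2', fakeNow) === fakeNow + 2000);
console.assert(retryAfterDeadline('garbage', fakeNow) === fakeNow);
```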
### SM-008
Title: Add core subtitle-position persistence/path tests
Priority: P3
Status: todo
Scope:
- `src/core/services/subtitle-position.ts`
Acceptance:
- save/load persistence covered
- fallback behavior covered
- path normalization behavior covered for URL vs local target
### SM-009
Title: Add tests for JLPT token filter
Priority: P3
Status: todo
Scope:
- `src/core/services/jlpt-token-filter.ts`
Acceptance:
- excluded term membership covered
- ignored POS1 membership covered
- exported list / entry consistency covered
### SM-010
Title: Refactor storage + immersion-tracker service into focused layers without API changes
Priority: P1
Status: todo
Scope:
- `src/core/database/storage/storage.ts`
- `src/core/database/storage/schema.ts`
- `src/core/database/storage/cover-blob.ts`
- `src/core/database/storage/records.ts`
- `src/core/database/storage/write-path.ts`
- `src/core/services/immersion-tracker/youtube.ts`
- `src/core/services/immersion-tracker/youtube-manager.ts`
- `src/core/services/immersion-tracker/write-queue.ts`
- `src/core/services/immersion-tracker/immersion-tracker-service.ts`
Acceptance:
- behavior and public API remain unchanged for all callers
- `storage.ts` responsibilities split into DDL/migrations, cover blob helpers, record CRUD, and write-path execution
- `immersion-tracker-service.ts` reduces to session state, media change orchestration, query proxies, and lifecycle
- YouTube code split into pure utilities, a stateful manager (`YouTubeManager`), and a dedicated write queue (`WriteQueue`)
- removed `storage.ts` is replaced with focused modules and updated imports
- no API or migration regressions; existing tests for trackers/storage coverage remain green or receive focused updates


@@ -18,6 +18,8 @@ references:
- backlog/tasks/task-87 - Codebase-health-harden-verification-and-retire-dead-architecture-identified-in-the-March-2026-review.md
- backlog/completed/task-87.4 - Runtime-composition-root-remove-dead-symbols-and-tighten-module-boundaries-in-src-main.ts.md
- backlog/completed/task-87.6 - Anki-integration-maintainability-continue-decomposing-the-oversized-orchestration-layer.md
- backlog/tasks/task-238.6 - Extract-remaining-inline-runtime-logic-and-composer-gaps-from-src-main.ts.md
- backlog/tasks/task-238.7 - Split-src-main.ts-into-boot-phase-services-runtimes-and-handlers.md
priority: high
---
@@ -45,6 +47,8 @@ Recommended sequencing:
2. Run TASK-238.4 and TASK-238.5 in parallel after TASK-238.3 if desired; they touch different domains.
3. Run TASK-238.1 after or alongside the domain refactors, but keep it focused on window/bootstrap composition only.
4. Run TASK-238.2 after TASK-238.1 because both touch `src/main.ts` and the CLI/headless flow should build on the cleaner composition root.
5. Run TASK-238.6 after the current composer/setup-window-factory work lands, so the remaining inline runtime logic and composer gaps are extracted from the already-cleaned composition root.
6. Run TASK-238.7 only after TASK-238.6 confirms the remaining entrypoint surface still justifies a boot-phase split; then move the boot wiring into dedicated service/runtime/handler modules.
Shared guardrails:


@@ -0,0 +1,60 @@
---
id: TASK-238.6
title: Extract remaining inline runtime logic and composer gaps from src/main.ts
status: To Do
assignee: []
created_date: '2026-03-27 00:00'
labels:
- tech-debt
- runtime
- maintainability
- composers
milestone: m-0
dependencies:
- TASK-238.1
- TASK-238.2
references:
- src/main.ts
- src/main/runtime/youtube-flow.ts
- src/main/runtime/autoplay-ready-gate.ts
- src/main/runtime/subtitle-prefetch-init.ts
- src/main/runtime/discord-presence-runtime.ts
- src/main/overlay-modal-state.ts
- src/main/runtime/composers
parent_task_id: TASK-238
priority: high
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
`src/main.ts` still mixes two concerns: pure dependency wiring and inline runtime logic. The earlier composer extractions reduce the wiring burden, but the file still owns several substantial behavior blocks and a few large inline dependency groupings. This task tracks the next maintainability pass: move the remaining runtime logic into the appropriate domain modules, add missing composer wrappers for the biggest grouped handler blocks, and reassess whether a boot-phase split is still necessary after the entrypoint becomes mostly wiring.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 `runYoutubePlaybackFlow`, `maybeSignalPluginAutoplayReady`, `refreshSubtitlePrefetchFromActiveTrack`, `publishDiscordPresence`, and `handleModalInputStateChange` no longer live as substantial inline logic in `src/main.ts`.
- [ ] #2 The large subtitle/prefetch, stats startup, and overlay visibility dependency groupings are wrapped behind named composer helpers instead of remaining inline in `src/main.ts`.
- [ ] #3 `src/main.ts` reads primarily as a boot and lifecycle coordinator, with domain behavior concentrated in named runtime modules.
- [ ] #4 Focused tests cover the extracted behavior or the new composer surfaces.
- [ ] #5 The task records whether the remaining size still justifies a boot-phase split or whether that follow-up can wait.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
Recommended sequence:
1. Let the current composer and `setup-window-factory` work land first so this slice starts from a stable wiring baseline.
2. Extract the five inline runtime functions into their natural domain modules or direct equivalents.
3. Add or extend composer helpers for subtitle/prefetch, stats startup, and overlay visibility handler grouping.
4. Re-scan `src/main.ts` after the extraction and decide whether a boot-phase split is still the right next task.
5. Verify the extracted behavior with focused tests first, then run the relevant broader runtime gate if the slice crosses startup boundaries.
Guardrails:
- Keep the work behavior-preserving.
- Prefer moving logic to existing runtime surfaces over creating new giant helper files.
- Do not expand into unrelated `src/main.ts` cleanup that is already tracked by other TASK-238 slices.
<!-- SECTION:PLAN:END -->


@@ -0,0 +1,58 @@
---
id: TASK-238.7
title: Split src/main.ts into boot-phase services, runtimes, and handlers
status: To Do
assignee: []
created_date: '2026-03-27 00:00'
labels:
- tech-debt
- runtime
- maintainability
- architecture
milestone: m-0
dependencies:
- TASK-238.6
references:
- src/main.ts
- src/main/boot/services.ts
- src/main/boot/runtimes.ts
- src/main/boot/handlers.ts
- src/main/runtime/composers
parent_task_id: TASK-238
priority: high
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
After the remaining inline runtime logic and composer gaps are extracted, `src/main.ts` should be split along boot-phase boundaries so the entrypoint stops mixing service construction, domain runtime composition, and handler wiring in one file. This task tracks that structural split: move service instantiation, runtime composition, and handler orchestration into dedicated boot modules, then leave `src/main.ts` as a thin lifecycle coordinator with clear startup-path selection.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 Service instantiation lives in a dedicated boot module instead of a large inline setup block in `src/main.ts`.
- [ ] #2 Domain runtime composition lives in a dedicated boot module, separate from lifecycle and handler dispatch.
- [ ] #3 Handler/composer invocation lives in a dedicated boot module, with `src/main.ts` reduced to app lifecycle and startup-path selection.
- [ ] #4 Existing startup behavior remains unchanged across desktop and headless flows.
- [ ] #5 Focused tests cover the split surfaces, and the relevant runtime/typecheck gate passes.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
Recommended sequence:
1. Re-scan `src/main.ts` after TASK-238.6 lands and mark the remaining boot-phase seams by responsibility.
2. Extract service instantiation into `src/main/boot/services.ts` or equivalent.
3. Extract runtime composition into `src/main/boot/runtimes.ts` or equivalent.
4. Extract handler/composer orchestration into `src/main/boot/handlers.ts` or equivalent.
5. Shrink `src/main.ts` to startup-path selection, app lifecycle hooks, and minimal boot wiring.
6. Verify the split with focused entrypoint/runtime tests first, then run the broader runtime gate if the refactor crosses startup boundaries.
Guardrails:
- Keep the split behavior-preserving.
- Prefer small boot modules with narrow ownership over a new monolithic bootstrap layer.
- Do not reopen the inline logic work already tracked by TASK-238.6 unless a remaining seam truly belongs here.
<!-- SECTION:PLAN:END -->


@@ -0,0 +1,37 @@
id: TASK-241
title: Add optional setup action to seed SubMiner mpv profile
type: feature
status: Open
assignee: []
created_date: '2026-03-27 11:22'
updated_date: '2026-03-27 11:22'
labels:
- setup
- mpv
- docs
- ux
dependencies: []
references: []
documentation:
- /home/sudacode/projects/japanese/SubMiner/docs-site/usage.md
- /home/sudacode/projects/japanese/SubMiner/docs-site/launcher-script.md
ordinal: 24100
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Add an optional control in the first-run / setup flow to write or update the user's mpv configuration with SubMiner-recommended defaults (especially the `subminer` profile), so users can recover from a missing profile without manual config editing.
The docs for launcher usage must explicitly state that SubMiner's Windows mpv launcher path runs mpv with `--profile=subminer` by default.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 Add an optional setup UI action/button to generate or overwrite a user-confirmed mpv config that includes a `subminer` profile.
- [ ] #2 The action should be non-destructive by default, show diff/contents before write, and support append/update mode when other mpv settings already exist.
- [ ] #3 Document how to resolve the missing-profile scenario and clearly state that the SubMiner mpv launcher runs with `--profile=subminer` by default (`--launch-mpv` / Windows mpv shortcut path).
- [ ] #4 Add/adjust setup validation messaging so users are not blocked if the `subminer` profile is initially missing, but can opt into one-click setup recovery.
- [ ] #5 Include a short verification path for both Windows and non-Windows flows (for example dry-run + write path).
<!-- AC:END -->


@@ -14,6 +14,7 @@ import {
waitForUnixSocketReady,
} from '../mpv.js';
import type { Args } from '../types.js';
import { nowMs } from '../time.js';
import type { LauncherCommandContext } from './context.js';
import { ensureLauncherSetupReady } from '../setup-gate.js';
import {
@@ -116,7 +117,7 @@ async function ensurePlaybackSetupReady(context: LauncherCommandContext): Promis
child.unref();
},
sleep: (ms) => new Promise((resolve) => setTimeout(resolve, ms)),
now: () => Date.now(),
now: () => nowMs(),
timeoutMs: SETUP_WAIT_TIMEOUT_MS,
pollIntervalMs: SETUP_POLL_INTERVAL_MS,
});


@@ -2,6 +2,7 @@ import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import { runAppCommandAttached } from '../mpv.js';
import { nowMs } from '../time.js';
import { sleep } from '../util.js';
import type { LauncherCommandContext } from './context.js';
@@ -45,8 +46,8 @@ const defaultDeps: StatsCommandDeps = {
runAppCommandAttached: (appPath, appArgs, logLevel, label) =>
runAppCommandAttached(appPath, appArgs, logLevel, label),
waitForStatsResponse: async (responsePath, signal) => {
const deadline = Date.now() + STATS_STARTUP_RESPONSE_TIMEOUT_MS;
while (Date.now() < deadline) {
const deadline = nowMs() + STATS_STARTUP_RESPONSE_TIMEOUT_MS;
while (nowMs() < deadline) {
if (signal?.aborted) {
return {
ok: false,


@@ -0,0 +1,155 @@
import assert from 'node:assert/strict';
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import test from 'node:test';
import {
applyInvocationsToArgs,
applyRootOptionsToArgs,
createDefaultArgs,
} from './args-normalizer.js';
class ExitSignal extends Error {
code: number;
constructor(code: number) {
super(`exit:${code}`);
this.code = code;
}
}
function withProcessExitIntercept(callback: () => void): ExitSignal {
const originalExit = process.exit;
try {
process.exit = ((code?: number) => {
throw new ExitSignal(code ?? 0);
}) as typeof process.exit;
callback();
} catch (error) {
if (error instanceof ExitSignal) {
return error;
}
throw error;
} finally {
process.exit = originalExit;
}
throw new Error('expected process.exit');
}
function withTempDir<T>(fn: (dir: string) => T): T {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'subminer-launcher-args-'));
try {
return fn(dir);
} finally {
fs.rmSync(dir, { recursive: true, force: true });
}
}
test('createDefaultArgs normalizes configured language codes and env thread override', () => {
const originalThreads = process.env.SUBMINER_WHISPER_THREADS;
process.env.SUBMINER_WHISPER_THREADS = '7';
try {
const parsed = createDefaultArgs({
primarySubLanguages: [' JA ', 'jpn', 'ja'],
secondarySubLanguages: ['en', 'ENG', ''],
whisperThreads: 2,
});
assert.deepEqual(parsed.youtubePrimarySubLangs, ['ja', 'jpn']);
assert.deepEqual(parsed.youtubeSecondarySubLangs, ['en', 'eng']);
assert.deepEqual(parsed.youtubeAudioLangs, ['ja', 'jpn', 'en', 'eng']);
assert.equal(parsed.whisperThreads, 7);
assert.equal(parsed.youtubeWhisperSourceLanguage, 'ja');
} finally {
if (originalThreads === undefined) {
delete process.env.SUBMINER_WHISPER_THREADS;
} else {
process.env.SUBMINER_WHISPER_THREADS = originalThreads;
}
}
});
test('applyRootOptionsToArgs maps file, directory, and url targets', () => {
withTempDir((dir) => {
const filePath = path.join(dir, 'movie.mkv');
const folderPath = path.join(dir, 'anime');
fs.writeFileSync(filePath, 'x');
fs.mkdirSync(folderPath);
const fileParsed = createDefaultArgs({});
applyRootOptionsToArgs(fileParsed, {}, filePath);
assert.equal(fileParsed.targetKind, 'file');
assert.equal(fileParsed.target, filePath);
const dirParsed = createDefaultArgs({});
applyRootOptionsToArgs(dirParsed, {}, folderPath);
assert.equal(dirParsed.directory, folderPath);
assert.equal(dirParsed.target, '');
assert.equal(dirParsed.targetKind, '');
const urlParsed = createDefaultArgs({});
applyRootOptionsToArgs(urlParsed, {}, 'https://example.test/video');
assert.equal(urlParsed.targetKind, 'url');
assert.equal(urlParsed.target, 'https://example.test/video');
});
});
test('applyRootOptionsToArgs rejects unsupported targets', () => {
const parsed = createDefaultArgs({});
const error = withProcessExitIntercept(() => {
applyRootOptionsToArgs(parsed, {}, '/definitely/missing/subminer-target');
});
assert.equal(error.code, 1);
assert.match(error.message, /exit:1/);
});
test('applyInvocationsToArgs maps config and jellyfin invocation state', () => {
const parsed = createDefaultArgs({});
applyInvocationsToArgs(parsed, {
jellyfinInvocation: {
action: 'play',
play: true,
server: 'https://jf.example',
username: 'alice',
password: 'secret',
logLevel: 'debug',
},
configInvocation: {
action: 'show',
logLevel: 'warn',
},
mpvInvocation: null,
appInvocation: null,
dictionaryTriggered: false,
dictionaryTarget: null,
dictionaryLogLevel: null,
statsTriggered: false,
statsBackground: false,
statsStop: false,
statsCleanup: false,
statsCleanupVocab: false,
statsCleanupLifetime: false,
statsLogLevel: null,
doctorTriggered: false,
doctorLogLevel: null,
doctorRefreshKnownWords: false,
texthookerTriggered: false,
texthookerLogLevel: null,
});
assert.equal(parsed.jellyfin, false);
assert.equal(parsed.jellyfinPlay, true);
assert.equal(parsed.jellyfinDiscovery, false);
assert.equal(parsed.jellyfinLogin, false);
assert.equal(parsed.jellyfinLogout, false);
assert.equal(parsed.jellyfinServer, 'https://jf.example');
assert.equal(parsed.jellyfinUsername, 'alice');
assert.equal(parsed.jellyfinPassword, 'secret');
assert.equal(parsed.configShow, true);
assert.equal(parsed.logLevel, 'warn');
});


@@ -0,0 +1,37 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { parseCliPrograms, resolveTopLevelCommand } from './cli-parser-builder.js';
test('resolveTopLevelCommand skips root options and finds the first command', () => {
assert.deepEqual(resolveTopLevelCommand(['--backend', 'macos', 'config', 'show']), {
name: 'config',
index: 2,
});
});
test('resolveTopLevelCommand respects the app alias after root options', () => {
assert.deepEqual(resolveTopLevelCommand(['--log-level', 'debug', 'bin', '--foo']), {
name: 'bin',
index: 2,
});
});
test('parseCliPrograms keeps root options and target when no command is present', () => {
const result = parseCliPrograms(['--backend', 'x11', '/tmp/movie.mkv'], 'subminer');
assert.equal(result.options.backend, 'x11');
assert.equal(result.rootTarget, '/tmp/movie.mkv');
assert.equal(result.invocations.appInvocation, null);
});
test('parseCliPrograms routes app alias arguments through passthrough mode', () => {
const result = parseCliPrograms(
['--backend', 'macos', 'bin', '--anilist', '--log-level', 'debug'],
'subminer',
);
assert.equal(result.options.backend, 'macos');
assert.deepEqual(result.invocations.appInvocation, {
appArgs: ['--anilist', '--log-level', 'debug'],
});
});


@@ -10,6 +10,7 @@ import type {
JellyfinGroupEntry,
} from './types.js';
import { log, fail, getMpvLogPath } from './log.js';
import { nowMs } from './time.js';
import { commandExists, resolvePathMaybe, sleep } from './util.js';
import {
pickLibrary,
@@ -453,9 +454,9 @@ async function runAppJellyfinCommand(
}
return retriedAfterStart ? 12000 : 4000;
})();
const settleDeadline = Date.now() + settleWindowMs;
const settleDeadline = nowMs() + settleWindowMs;
const settleOffset = attempt.logOffset;
while (Date.now() < settleDeadline) {
while (nowMs() < settleDeadline) {
await sleep(100);
const settledOutput = readLogAppendedSince(settleOffset);
if (!settledOutput.trim()) {
@@ -489,8 +490,8 @@ async function requestJellyfinPreviewAuthFromApp(
return null;
}
const deadline = Date.now() + 4000;
while (Date.now() < deadline) {
const deadline = nowMs() + 4000;
while (nowMs() < deadline) {
try {
if (fs.existsSync(responsePath)) {
const raw = fs.readFileSync(responsePath, 'utf8');


@@ -43,6 +43,7 @@ function runLauncher(argv: string[], env: NodeJS.ProcessEnv): RunResult {
{
env,
encoding: 'utf8',
timeout: 10000,
},
);
return {


@@ -7,6 +7,7 @@ import type { LogLevel, Backend, Args, MpvTrack } from './types.js';
import { DEFAULT_MPV_SUBMINER_ARGS, DEFAULT_YOUTUBE_YTDL_FORMAT } from './types.js';
import { appendToAppLog, getAppLogPath, log, fail, getMpvLogPath } from './log.js';
import { buildSubminerScriptOpts, resolveAniSkipMetadataForFile } from './aniskip-metadata.js';
import { nowMs } from './time.js';
import {
commandExists,
getPathEnv,
@@ -200,8 +201,8 @@ async function terminateTrackedDetachedMpv(logLevel: LogLevel): Promise<void> {
return;
}
const deadline = Date.now() + 1500;
while (Date.now() < deadline) {
const deadline = nowMs() + 1500;
while (nowMs() < deadline) {
if (!isProcessAlive(pid)) {
clearTrackedDetachedMpvPid();
return;
@@ -344,7 +345,7 @@ export function sendMpvCommandWithResponse(
timeoutMs = 5000,
): Promise<unknown> {
return new Promise((resolve, reject) => {
const requestId = Date.now() + Math.floor(Math.random() * 1000);
const requestId = nowMs() + Math.floor(Math.random() * 1000);
const socket = net.createConnection(socketPath);
let buffer = '';
@@ -1117,8 +1118,8 @@ export async function waitForUnixSocketReady(
socketPath: string,
timeoutMs: number,
): Promise<boolean> {
const deadline = Date.now() + timeoutMs;
while (Date.now() < deadline) {
const deadline = nowMs() + timeoutMs;
while (nowMs() < deadline) {
try {
if (fs.existsSync(socketPath)) {
const ready = await canConnectUnixSocket(socketPath);

launcher/time.ts Normal file

@@ -0,0 +1,8 @@
export function nowMs(): number {
  // Wall-clock epoch milliseconds via the performance API when available.
  const perf = globalThis.performance;
  if (perf) {
    return Math.floor(perf.timeOrigin + perf.now());
  }
  // Fallback: monotonic hrtime truncated to whole milliseconds (not epoch-based).
  return Number(process.hrtime.bigint() / 1000000n);
}
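One payoff of centralizing time access is that callers which accept an injectable `now` (as in the `now: () => nowMs()` wiring above) can be driven by a fake clock in tests, so deadline loops need no real sleeping. A hypothetical sketch under that assumption, with `pollUntil` as an illustrative name rather than a real launcher function:

```typescript
// Deadline loop parameterized on an injectable clock, mirroring the
// `now: () => nowMs()` dependency style used in the diffs above.
function pollUntil(
  check: () => boolean,
  now: () => number,
  timeoutMs: number,
): boolean {
  const deadline = now() + timeoutMs;
  while (now() < deadline) {
    if (check()) return true;
  }
  return false;
}

// Deterministic test clock: advances 100 "ms" per reading, no setTimeout.
let t = 0;
const fakeNow = () => (t += 100);
console.assert(pollUntil(() => t >= 300, fakeNow, 1000) === true);
console.assert(pollUntil(() => false, fakeNow, 500) === false);
```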


@@ -4,6 +4,7 @@ import os from 'node:os';
import { spawn } from 'node:child_process';
import type { LogLevel, CommandExecOptions, CommandExecResult } from './types.js';
import { log } from './log.js';
import { nowMs } from './time.js';
export function sleep(ms: number): Promise<void> {
return new Promise((resolve) => setTimeout(resolve, ms));
@@ -198,7 +199,7 @@ export function normalizeBasename(value: string, fallback: string): string {
if (safe) return safe;
const fallbackSafe = sanitizeToken(fallback);
if (fallbackSafe) return fallbackSafe;
return `${Date.now()}`;
return `${nowMs()}`;
}
export function normalizeLangCode(value: string): string {


@@ -42,9 +42,9 @@
"test:config:smoke:dist": "bun test dist/config/path-resolution.test.js",
"test:plugin:src": "lua scripts/test-plugin-start-gate.lua && lua scripts/test-plugin-binary-windows.lua",
"test:launcher:smoke:src": "bun test launcher/smoke.e2e.test.ts",
"test:launcher:src": "bun test launcher/config.test.ts launcher/config-domain-parsers.test.ts launcher/mpv.test.ts launcher/picker.test.ts launcher/parse-args.test.ts launcher/main.test.ts launcher/commands/command-modules.test.ts launcher/smoke.e2e.test.ts && bun run test:plugin:src",
"test:core:src": "bun test src/cli/args.test.ts src/cli/help.test.ts src/shared/setup-state.test.ts src/core/services/cli-command.test.ts src/core/services/field-grouping-overlay.test.ts src/core/services/numeric-shortcut-session.test.ts src/core/services/secondary-subtitle.test.ts src/core/services/mpv-render-metrics.test.ts src/core/services/overlay-content-measurement.test.ts src/core/services/mpv-control.test.ts src/core/services/mpv.test.ts src/core/services/runtime-options-ipc.test.ts src/core/services/runtime-config.test.ts src/core/services/yomitan-extension-paths.test.ts src/core/services/config-hot-reload.test.ts src/core/services/discord-presence.test.ts src/core/services/tokenizer.test.ts src/core/services/tokenizer/annotation-stage.test.ts src/core/services/tokenizer/parser-selection-stage.test.ts src/core/services/tokenizer/parser-enrichment-stage.test.ts src/core/services/subsync.test.ts src/core/services/overlay-bridge.test.ts src/core/services/overlay-shortcut-handler.test.ts src/core/services/stats-window.test.ts src/core/services/mining.test.ts src/core/services/anki-jimaku.test.ts src/core/services/jimaku-download-path.test.ts src/core/services/jellyfin.test.ts src/core/services/jellyfin-remote.test.ts src/core/services/immersion-tracker-service.test.ts src/core/services/overlay-runtime-init.test.ts src/core/services/app-ready.test.ts src/core/services/startup-bootstrap.test.ts src/core/services/subtitle-processing-controller.test.ts src/core/services/anilist/anilist-update-queue.test.ts src/core/utils/shortcut-config.test.ts src/main/runtime/first-run-setup-plugin.test.ts src/main/runtime/first-run-setup-service.test.ts src/main/runtime/first-run-setup-window.test.ts src/main/runtime/tray-runtime.test.ts src/main/runtime/tray-main-actions.test.ts src/main/runtime/tray-main-deps.test.ts src/main/runtime/tray-runtime-handlers.test.ts src/main/runtime/cli-command-context-main-deps.test.ts src/main/runtime/app-ready-main-deps.test.ts 
src/renderer/error-recovery.test.ts src/renderer/subtitle-render.test.ts src/renderer/handlers/mouse.test.ts src/renderer/handlers/keyboard.test.ts src/renderer/modals/jimaku.test.ts src/subsync/utils.test.ts src/main/anilist-url-guard.test.ts src/window-trackers/hyprland-tracker.test.ts src/window-trackers/x11-tracker.test.ts src/window-trackers/windows-helper.test.ts src/window-trackers/windows-tracker.test.ts launcher/config.test.ts launcher/config-domain-parsers.test.ts launcher/parse-args.test.ts launcher/main.test.ts launcher/commands/command-modules.test.ts launcher/setup-gate.test.ts stats/src/lib/api-client.test.ts",
"test:core:dist": "bun test dist/cli/args.test.js dist/cli/help.test.js dist/core/services/cli-command.test.js dist/core/services/ipc.test.js dist/core/services/anki-jimaku-ipc.test.js dist/core/services/field-grouping-overlay.test.js dist/core/services/numeric-shortcut-session.test.js dist/core/services/secondary-subtitle.test.js dist/core/services/mpv-render-metrics.test.js dist/core/services/overlay-content-measurement.test.js dist/core/services/mpv-control.test.js dist/core/services/mpv.test.js dist/core/services/runtime-options-ipc.test.js dist/core/services/runtime-config.test.js dist/core/services/yomitan-extension-paths.test.js dist/core/services/config-hot-reload.test.js dist/core/services/discord-presence.test.js dist/core/services/tokenizer.test.js dist/core/services/tokenizer/annotation-stage.test.js dist/core/services/tokenizer/parser-selection-stage.test.js dist/core/services/tokenizer/parser-enrichment-stage.test.js dist/core/services/subsync.test.js dist/core/services/overlay-bridge.test.js dist/core/services/overlay-manager.test.js dist/core/services/overlay-shortcut-handler.test.js dist/core/services/mining.test.js dist/core/services/anki-jimaku.test.js dist/core/services/jimaku-download-path.test.js dist/core/services/jellyfin.test.js dist/core/services/jellyfin-remote.test.js dist/core/services/immersion-tracker-service.test.js dist/core/services/overlay-runtime-init.test.js dist/core/services/app-ready.test.js dist/core/services/startup-bootstrap.test.js dist/core/services/subtitle-processing-controller.test.js dist/core/services/anilist/anilist-token-store.test.js dist/core/services/anilist/anilist-update-queue.test.js dist/renderer/error-recovery.test.js dist/renderer/subtitle-render.test.js dist/renderer/handlers/mouse.test.js dist/renderer/handlers/keyboard.test.js dist/renderer/modals/jimaku.test.js dist/subsync/utils.test.js dist/main/anilist-url-guard.test.js dist/window-trackers/hyprland-tracker.test.js 
dist/window-trackers/x11-tracker.test.js dist/window-trackers/windows-helper.test.js dist/window-trackers/windows-tracker.test.js",
"test:launcher:src": "bun test launcher/config.test.ts launcher/config-domain-parsers.test.ts launcher/config/cli-parser-builder.test.ts launcher/config/args-normalizer.test.ts launcher/mpv.test.ts launcher/picker.test.ts launcher/parse-args.test.ts launcher/main.test.ts launcher/commands/command-modules.test.ts launcher/smoke.e2e.test.ts && bun run test:plugin:src",
"test:core:src": "bun test src/cli/args.test.ts src/cli/help.test.ts src/shared/setup-state.test.ts src/core/services/cli-command.test.ts src/core/services/field-grouping-overlay.test.ts src/core/services/numeric-shortcut-session.test.ts src/core/services/secondary-subtitle.test.ts src/core/services/mpv-render-metrics.test.ts src/core/services/overlay-content-measurement.test.ts src/core/services/mpv-control.test.ts src/core/services/mpv.test.ts src/core/services/runtime-options-ipc.test.ts src/core/services/runtime-config.test.ts src/core/services/yomitan-extension-paths.test.ts src/core/services/config-hot-reload.test.ts src/core/services/discord-presence.test.ts src/core/services/tokenizer.test.ts src/core/services/tokenizer/annotation-stage.test.ts src/core/services/tokenizer/parser-selection-stage.test.ts src/core/services/tokenizer/parser-enrichment-stage.test.ts src/core/services/subsync.test.ts src/core/services/overlay-bridge.test.ts src/core/services/overlay-shortcut-handler.test.ts src/core/services/stats-window.test.ts src/core/services/mining.test.ts src/core/services/anki-jimaku.test.ts src/core/services/jimaku-download-path.test.ts src/core/services/jellyfin.test.ts src/core/services/jellyfin-remote.test.ts src/core/services/immersion-tracker-service.test.ts src/core/services/overlay-runtime-init.test.ts src/core/services/app-ready.test.ts src/core/services/startup-bootstrap.test.ts src/core/services/subtitle-processing-controller.test.ts src/core/services/anilist/anilist-update-queue.test.ts src/core/services/anilist/rate-limiter.test.ts src/core/services/jlpt-token-filter.test.ts src/core/services/subtitle-position.test.ts src/core/utils/shortcut-config.test.ts src/main/runtime/first-run-setup-plugin.test.ts src/main/runtime/first-run-setup-service.test.ts src/main/runtime/first-run-setup-window.test.ts src/main/runtime/tray-runtime.test.ts src/main/runtime/tray-main-actions.test.ts src/main/runtime/tray-main-deps.test.ts src/main/runtime/tray-runtime-handlers.test.ts src/main/runtime/cli-command-context-main-deps.test.ts src/main/runtime/app-ready-main-deps.test.ts src/renderer/error-recovery.test.ts src/renderer/subtitle-render.test.ts src/renderer/handlers/mouse.test.ts src/renderer/handlers/keyboard.test.ts src/renderer/modals/jimaku.test.ts src/subsync/utils.test.ts src/main/anilist-url-guard.test.ts src/window-trackers/hyprland-tracker.test.ts src/window-trackers/x11-tracker.test.ts src/window-trackers/windows-helper.test.ts src/window-trackers/windows-tracker.test.ts launcher/config.test.ts launcher/config-domain-parsers.test.ts launcher/config/cli-parser-builder.test.ts launcher/config/args-normalizer.test.ts launcher/parse-args.test.ts launcher/main.test.ts launcher/commands/command-modules.test.ts launcher/setup-gate.test.ts stats/src/lib/api-client.test.ts",
"test:core:dist": "bun test dist/cli/args.test.js dist/cli/help.test.js dist/core/services/cli-command.test.js dist/core/services/ipc.test.js dist/core/services/anki-jimaku-ipc.test.js dist/core/services/field-grouping-overlay.test.js dist/core/services/numeric-shortcut-session.test.js dist/core/services/secondary-subtitle.test.js dist/core/services/mpv-render-metrics.test.js dist/core/services/overlay-content-measurement.test.js dist/core/services/mpv-control.test.js dist/core/services/mpv.test.js dist/core/services/runtime-options-ipc.test.js dist/core/services/runtime-config.test.js dist/core/services/yomitan-extension-paths.test.js dist/core/services/config-hot-reload.test.js dist/core/services/discord-presence.test.js dist/core/services/tokenizer.test.js dist/core/services/tokenizer/annotation-stage.test.js dist/core/services/tokenizer/parser-selection-stage.test.js dist/core/services/tokenizer/parser-enrichment-stage.test.js dist/core/services/subsync.test.js dist/core/services/overlay-bridge.test.js dist/core/services/overlay-manager.test.js dist/core/services/overlay-shortcut-handler.test.js dist/core/services/mining.test.js dist/core/services/anki-jimaku.test.js dist/core/services/jimaku-download-path.test.js dist/core/services/jellyfin.test.js dist/core/services/jellyfin-remote.test.js dist/core/services/immersion-tracker-service.test.js dist/core/services/overlay-runtime-init.test.js dist/core/services/app-ready.test.js dist/core/services/startup-bootstrap.test.js dist/core/services/subtitle-processing-controller.test.js dist/core/services/anilist/anilist-token-store.test.js dist/core/services/anilist/anilist-update-queue.test.js dist/core/services/anilist/rate-limiter.test.js dist/core/services/jlpt-token-filter.test.js dist/core/services/subtitle-position.test.js dist/renderer/error-recovery.test.js dist/renderer/subtitle-render.test.js dist/renderer/handlers/mouse.test.js dist/renderer/handlers/keyboard.test.js dist/renderer/modals/jimaku.test.js dist/subsync/utils.test.js dist/main/anilist-url-guard.test.js dist/window-trackers/hyprland-tracker.test.js dist/window-trackers/x11-tracker.test.js dist/window-trackers/windows-helper.test.js dist/window-trackers/windows-tracker.test.js",
"test:core:smoke:dist": "bun test dist/cli/help.test.js dist/core/services/runtime-config.test.js dist/core/services/ipc.test.js dist/core/services/overlay-manager.test.js dist/core/services/anilist/anilist-token-store.test.js dist/core/services/startup-bootstrap.test.js dist/renderer/error-recovery.test.js dist/main/anilist-url-guard.test.js dist/window-trackers/x11-tracker.test.js",
"test:smoke:dist": "bun run test:config:smoke:dist && bun run test:core:smoke:dist",
"test:subtitle:src": "bun test src/core/services/subsync.test.ts src/subsync/utils.test.ts",
@@ -63,7 +63,7 @@
"test:launcher": "bun run test:launcher:src",
"test:core": "bun run test:core:src",
"test:subtitle": "bun run test:subtitle:src",
"test:fast": "bun run test:config:src && bun run test:core:src && bun run test:docs:kb && bun test src/main-entry-runtime.test.ts src/anki-integration/anki-connect-proxy.test.ts src/release-workflow.test.ts src/ci-workflow.test.ts scripts/build-changelog.test.ts scripts/mkv-to-readme-video.test.ts scripts/update-aur-package.test.ts && bun run tsc && bun test dist/main/runtime/registry.test.js",
"test:fast": "bun run test:config:src && bun run test:core:src && bun run test:docs:kb && bun test src/main-entry-runtime.test.ts src/anki-integration.test.ts src/anki-integration/anki-connect-proxy.test.ts src/anki-integration/field-grouping-workflow.test.ts src/anki-integration/field-grouping.test.ts src/anki-integration/field-grouping-merge.test.ts src/release-workflow.test.ts src/ci-workflow.test.ts scripts/build-changelog.test.ts scripts/mkv-to-readme-video.test.ts scripts/update-aur-package.test.ts && bun test src/core/services/immersion-tracker/__tests__/query.test.ts src/core/services/immersion-tracker/__tests__/query-split-modules.test.ts && bun run tsc && bun test dist/main/runtime/registry.test.js",
"generate:config-example": "bun run src/generate-config-example.ts",
"verify:config-example": "bun run src/verify-config-example.ts",
"start": "bun run build && electron . --start",


@@ -0,0 +1,201 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import {
FieldGroupingMergeCollaborator,
type FieldGroupingMergeNoteInfo,
} from './field-grouping-merge';
import type { AnkiConnectConfig } from '../types/anki';
function resolveFieldName(availableFieldNames: string[], preferredName: string): string | null {
return (
availableFieldNames.find(
(name) => name === preferredName || name.toLowerCase() === preferredName.toLowerCase(),
) ?? null
);
}
function createCollaborator(
options: {
config?: Partial<AnkiConnectConfig>;
currentSubtitleText?: string;
generatedMedia?: {
audioField?: string;
audioValue?: string;
imageField?: string;
imageValue?: string;
miscInfoValue?: string;
};
warnings?: Array<{ fieldName: string; reason: string; detail?: string }>;
} = {},
) {
const warnings = options.warnings ?? [];
const config = {
fields: {
sentence: 'Sentence',
audio: 'ExpressionAudio',
image: 'Picture',
miscInfo: 'MiscInfo',
...(options.config?.fields ?? {}),
},
...(options.config ?? {}),
} as AnkiConnectConfig;
return {
collaborator: new FieldGroupingMergeCollaborator({
getConfig: () => config,
getEffectiveSentenceCardConfig: () => ({
sentenceField: 'Sentence',
audioField: 'SentenceAudio',
}),
getCurrentSubtitleText: () => options.currentSubtitleText,
resolveFieldName,
resolveNoteFieldName: (noteInfo, preferredName) => {
if (!preferredName) return null;
return resolveFieldName(Object.keys(noteInfo.fields), preferredName);
},
extractFields: (fields) =>
Object.fromEntries(
Object.entries(fields).map(([key, value]) => [key.toLowerCase(), value.value || '']),
),
processSentence: (mpvSentence) => `${mpvSentence}::processed`,
generateMediaForMerge: async () => options.generatedMedia ?? {},
warnFieldParseOnce: (fieldName, reason, detail) => {
warnings.push({ fieldName, reason, detail });
},
}),
warnings,
};
}
function makeNote(noteId: number, fields: Record<string, string>): FieldGroupingMergeNoteInfo {
return {
noteId,
fields: Object.fromEntries(Object.entries(fields).map(([key, value]) => [key, { value }])),
};
}
test('getGroupableFieldNames includes configured fields without duplicating ExpressionAudio', () => {
const { collaborator } = createCollaborator({
config: {
fields: {
image: 'Illustration',
sentence: 'SentenceText',
audio: 'ExpressionAudio',
miscInfo: 'ExtraInfo',
},
},
});
assert.deepEqual(collaborator.getGroupableFieldNames(), [
'Sentence',
'SentenceAudio',
'Picture',
'Illustration',
'SentenceText',
'ExtraInfo',
'SentenceFurigana',
]);
});
test('computeFieldGroupingMergedFields syncs a custom audio field from merged SentenceAudio', async () => {
const { collaborator } = createCollaborator({
config: {
fields: {
audio: 'CustomAudio',
},
},
});
const merged = await collaborator.computeFieldGroupingMergedFields(
1,
2,
makeNote(1, {
SentenceAudio: '[sound:keep.mp3]',
CustomAudio: '[sound:stale.mp3]',
}),
makeNote(2, {
SentenceAudio: '[sound:new.mp3]',
}),
false,
);
assert.equal(
merged.SentenceAudio,
'<span data-group-id="1">[sound:keep.mp3]</span><span data-group-id="2">[sound:new.mp3]</span>',
);
assert.equal(merged.CustomAudio, merged.SentenceAudio);
});
test('computeFieldGroupingMergedFields keeps strict fields when source is empty and warns on malformed spans', async () => {
const { collaborator, warnings } = createCollaborator({
currentSubtitleText: 'subtitle line',
});
const merged = await collaborator.computeFieldGroupingMergedFields(
3,
4,
makeNote(3, {
Sentence: '<span data-group-id="abc">keep sentence</span>',
SentenceAudio: '',
}),
makeNote(4, {
Sentence: 'source sentence',
SentenceAudio: '[sound:source.mp3]',
}),
false,
);
assert.equal(
merged.Sentence,
'<span data-group-id="3"><span data-group-id="abc">keep sentence</span></span><span data-group-id="4">source sentence</span>',
);
assert.equal(merged.SentenceAudio, '<span data-group-id="4">[sound:source.mp3]</span>');
assert.equal(warnings.length, 4);
assert.deepEqual(
warnings.map((entry) => entry.reason),
['invalid-group-id', 'no-usable-span-entries', 'invalid-group-id', 'no-usable-span-entries'],
);
});
test('computeFieldGroupingMergedFields uses generated media only when includeGeneratedMedia is true', async () => {
const generatedMedia = {
audioField: 'SentenceAudio',
audioValue: '[sound:generated.mp3]',
imageField: 'Picture',
imageValue: '<img src="generated.png">',
miscInfoValue: 'generated misc',
};
const { collaborator: withoutGenerated } = createCollaborator({ generatedMedia });
const { collaborator: withGenerated } = createCollaborator({ generatedMedia });
const keep = makeNote(10, {
SentenceAudio: '',
Picture: '',
MiscInfo: '',
});
const source = makeNote(11, {
SentenceAudio: '',
Picture: '',
MiscInfo: '',
});
const without = await withoutGenerated.computeFieldGroupingMergedFields(
10,
11,
keep,
source,
false,
);
const withMedia = await withGenerated.computeFieldGroupingMergedFields(
10,
11,
keep,
source,
true,
);
assert.deepEqual(without, {});
assert.equal(withMedia.SentenceAudio, '<span data-group-id="11">[sound:generated.mp3]</span>');
assert.equal(withMedia.Picture, '<img data-group-id="11" src="generated.png">');
assert.equal(withMedia.MiscInfo, '<span data-group-id="11">generated misc</span>');
});


@@ -0,0 +1,411 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { FieldGroupingService } from './field-grouping';
import type { KikuMergePreviewResponse } from '../types/anki';
type NoteInfo = {
noteId: number;
fields: Record<string, { value: string }>;
};
function createHarness(
options: {
kikuEnabled?: boolean;
kikuFieldGrouping?: 'auto' | 'manual' | 'disabled';
deck?: string;
noteIds?: number[];
notesInfo?: NoteInfo[][];
duplicateNoteId?: number | null;
hasAllConfiguredFields?: boolean;
manualHandled?: boolean;
expression?: string | null;
currentSentenceImageField?: string | undefined;
onProcessNewCard?: (noteId: number, options?: { skipKikuFieldGrouping?: boolean }) => void;
} = {},
) {
const calls: string[] = [];
const findNotesQueries: Array<{ query: string; maxRetries?: number }> = [];
const noteInfoRequests: number[][] = [];
const duplicateRequests: Array<{ expression: string; excludeNoteId: number }> = [];
const processCalls: Array<{ noteId: number; options?: { skipKikuFieldGrouping?: boolean } }> = [];
const autoCalls: Array<{ originalNoteId: number; newNoteId: number; expression: string }> = [];
const manualCalls: Array<{ originalNoteId: number; newNoteId: number; expression: string }> = [];
const noteInfoQueue = [...(options.notesInfo ?? [])];
const notes = options.noteIds ?? [2];
const service = new FieldGroupingService({
getConfig: () => ({
fields: {
word: 'Expression',
},
}),
getEffectiveSentenceCardConfig: () => ({
model: 'Sentence',
sentenceField: 'Sentence',
audioField: 'SentenceAudio',
lapisEnabled: false,
kikuEnabled: options.kikuEnabled ?? true,
kikuFieldGrouping: options.kikuFieldGrouping ?? 'auto',
kikuDeleteDuplicateInAuto: true,
}),
isUpdateInProgress: () => false,
getDeck: options.deck ? () => options.deck : undefined,
withUpdateProgress: async (_message, action) => {
calls.push('withUpdateProgress');
return action();
},
showOsdNotification: (text) => {
calls.push(`osd:${text}`);
},
findNotes: async (query, findNotesOptions) => {
findNotesQueries.push({ query, maxRetries: findNotesOptions?.maxRetries });
return notes;
},
notesInfo: async (noteIds) => {
noteInfoRequests.push([...noteIds]);
return noteInfoQueue.shift() ?? [];
},
extractFields: (fields) =>
Object.fromEntries(
Object.entries(fields).map(([key, value]) => [key.toLowerCase(), value.value || '']),
),
findDuplicateNote: async (expression, excludeNoteId) => {
duplicateRequests.push({ expression, excludeNoteId });
return options.duplicateNoteId ?? 99;
},
hasAllConfiguredFields: () => options.hasAllConfiguredFields ?? true,
processNewCard: async (noteId, processOptions) => {
processCalls.push({ noteId, options: processOptions });
options.onProcessNewCard?.(noteId, processOptions);
},
getSentenceCardImageFieldName: () => options.currentSentenceImageField,
resolveFieldName: (availableFieldNames, preferredName) =>
availableFieldNames.find(
(name) => name === preferredName || name.toLowerCase() === preferredName.toLowerCase(),
) ?? null,
computeFieldGroupingMergedFields: async () => ({}),
getNoteFieldMap: (noteInfo) =>
Object.fromEntries(
Object.entries(noteInfo.fields).map(([key, value]) => [key, value.value || '']),
),
handleFieldGroupingAuto: async (originalNoteId, newNoteId, _newNoteInfo, expression) => {
autoCalls.push({ originalNoteId, newNoteId, expression });
},
handleFieldGroupingManual: async (originalNoteId, newNoteId, _newNoteInfo, expression) => {
manualCalls.push({ originalNoteId, newNoteId, expression });
return options.manualHandled ?? true;
},
});
return {
service,
calls,
findNotesQueries,
noteInfoRequests,
duplicateRequests,
processCalls,
autoCalls,
manualCalls,
};
}
type SuccessfulPreview = KikuMergePreviewResponse & {
ok: true;
compact: {
action: {
keepNoteId: number;
deleteNoteId: number;
deleteDuplicate: boolean;
};
mergedFields: Record<string, string>;
};
full: {
result: {
wouldDeleteNoteId: number | null;
};
};
};
test('triggerFieldGroupingForLastAddedCard stops when kiku mode is disabled', async () => {
const harness = createHarness({ kikuEnabled: false });
await harness.service.triggerFieldGroupingForLastAddedCard();
assert.deepEqual(harness.calls, ['osd:Kiku mode is not enabled']);
assert.equal(harness.findNotesQueries.length, 0);
});
test('triggerFieldGroupingForLastAddedCard stops when field grouping is disabled', async () => {
const harness = createHarness({ kikuFieldGrouping: 'disabled' });
await harness.service.triggerFieldGroupingForLastAddedCard();
assert.deepEqual(harness.calls, ['osd:Kiku field grouping is disabled']);
assert.equal(harness.findNotesQueries.length, 0);
});
test('triggerFieldGroupingForLastAddedCard stops when an update is already in progress', async () => {
const service = new FieldGroupingService({
getConfig: () => ({ fields: { word: 'Expression' } }),
getEffectiveSentenceCardConfig: () => ({
model: 'Sentence',
sentenceField: 'Sentence',
audioField: 'SentenceAudio',
lapisEnabled: false,
kikuEnabled: true,
kikuFieldGrouping: 'auto',
kikuDeleteDuplicateInAuto: true,
}),
isUpdateInProgress: () => true,
withUpdateProgress: async () => {
throw new Error('should not be called');
},
showOsdNotification: () => {},
findNotes: async () => [],
notesInfo: async () => [],
extractFields: () => ({}),
findDuplicateNote: async () => null,
hasAllConfiguredFields: () => true,
processNewCard: async () => {},
getSentenceCardImageFieldName: () => undefined,
resolveFieldName: () => null,
computeFieldGroupingMergedFields: async () => ({}),
getNoteFieldMap: () => ({}),
handleFieldGroupingAuto: async () => {},
handleFieldGroupingManual: async () => true,
});
await service.triggerFieldGroupingForLastAddedCard();
});
test('triggerFieldGroupingForLastAddedCard finds the newest note and hands off to auto grouping', async () => {
const harness = createHarness({
deck: 'Anime Deck',
noteIds: [3, 7, 5],
notesInfo: [
[
{
noteId: 7,
fields: {
Expression: { value: 'word-7' },
Sentence: { value: 'line-7' },
},
},
],
[
{
noteId: 7,
fields: {
Expression: { value: 'word-7' },
Sentence: { value: 'line-7' },
},
},
],
],
duplicateNoteId: 42,
hasAllConfiguredFields: true,
});
await harness.service.triggerFieldGroupingForLastAddedCard();
assert.deepEqual(harness.findNotesQueries, [
{ query: '"deck:Anime Deck" added:1', maxRetries: undefined },
]);
assert.deepEqual(harness.noteInfoRequests, [[7], [7]]);
assert.deepEqual(harness.duplicateRequests, [{ expression: 'word-7', excludeNoteId: 7 }]);
assert.deepEqual(harness.autoCalls, [
{
originalNoteId: 42,
newNoteId: 7,
expression: 'word-7',
},
]);
});
test('triggerFieldGroupingForLastAddedCard refreshes the card when configured fields are missing', async () => {
const processCalls: Array<{ noteId: number; options?: { skipKikuFieldGrouping?: boolean } }> = [];
const harness = createHarness({
noteIds: [11],
notesInfo: [
[
{
noteId: 11,
fields: {
Expression: { value: 'word-11' },
Sentence: { value: 'line-11' },
},
},
],
[
{
noteId: 11,
fields: {
Expression: { value: 'word-11' },
Sentence: { value: 'line-11' },
},
},
],
],
duplicateNoteId: 13,
hasAllConfiguredFields: false,
onProcessNewCard: (noteId, options) => {
processCalls.push({ noteId, options });
},
});
await harness.service.triggerFieldGroupingForLastAddedCard();
assert.deepEqual(processCalls, [{ noteId: 11, options: { skipKikuFieldGrouping: true } }]);
assert.deepEqual(harness.manualCalls, []);
});
test('triggerFieldGroupingForLastAddedCard shows a cancellation message when manual grouping is declined', async () => {
const harness = createHarness({
kikuFieldGrouping: 'manual',
noteIds: [9],
notesInfo: [
[
{
noteId: 9,
fields: {
Expression: { value: 'word-9' },
Sentence: { value: 'line-9' },
},
},
],
[
{
noteId: 9,
fields: {
Expression: { value: 'word-9' },
Sentence: { value: 'line-9' },
},
},
],
],
duplicateNoteId: 77,
manualHandled: false,
});
await harness.service.triggerFieldGroupingForLastAddedCard();
assert.deepEqual(harness.manualCalls, [
{
originalNoteId: 77,
newNoteId: 9,
expression: 'word-9',
},
]);
assert.equal(harness.calls.at(-1), 'osd:Field grouping cancelled');
});
test('buildFieldGroupingPreview returns merged compact and full previews', async () => {
const service = new FieldGroupingService({
getConfig: () => ({ fields: { word: 'Expression' } }),
getEffectiveSentenceCardConfig: () => ({
model: 'Sentence',
sentenceField: 'Sentence',
audioField: 'SentenceAudio',
lapisEnabled: false,
kikuEnabled: true,
kikuFieldGrouping: 'auto',
kikuDeleteDuplicateInAuto: true,
}),
isUpdateInProgress: () => false,
withUpdateProgress: async (_message, action) => action(),
showOsdNotification: () => {},
findNotes: async () => [],
notesInfo: async (noteIds) =>
noteIds.map((noteId) => ({
noteId,
fields: {
Sentence: { value: `sentence-${noteId}` },
SentenceAudio: { value: `[sound:${noteId}.mp3]` },
Picture: { value: `<img src="${noteId}.png">` },
MiscInfo: { value: `misc-${noteId}` },
},
})),
extractFields: () => ({}),
findDuplicateNote: async () => null,
hasAllConfiguredFields: () => true,
processNewCard: async () => {},
getSentenceCardImageFieldName: () => undefined,
resolveFieldName: (availableFieldNames, preferredName) =>
availableFieldNames.find(
(name) => name === preferredName || name.toLowerCase() === preferredName.toLowerCase(),
) ?? null,
computeFieldGroupingMergedFields: async () => ({
Sentence: 'merged sentence',
SentenceAudio: 'merged audio',
Picture: 'merged picture',
MiscInfo: 'merged misc',
}),
getNoteFieldMap: (noteInfo) =>
Object.fromEntries(
Object.entries(noteInfo.fields).map(([key, value]) => [key, value.value || '']),
),
handleFieldGroupingAuto: async () => {},
handleFieldGroupingManual: async () => true,
});
const preview = await service.buildFieldGroupingPreview(1, 2, true);
assert.equal(preview.ok, true);
if (!preview.ok) {
throw new Error(preview.error);
}
const successPreview = preview as SuccessfulPreview;
assert.deepEqual(successPreview.compact.action, {
keepNoteId: 1,
deleteNoteId: 2,
deleteDuplicate: true,
});
assert.equal(successPreview.compact.mergedFields.Sentence, 'merged sentence');
assert.equal(successPreview.full.result.wouldDeleteNoteId, 2);
});
test('buildFieldGroupingPreview reports missing notes cleanly', async () => {
const service = new FieldGroupingService({
getConfig: () => ({ fields: { word: 'Expression' } }),
getEffectiveSentenceCardConfig: () => ({
model: 'Sentence',
sentenceField: 'Sentence',
audioField: 'SentenceAudio',
lapisEnabled: false,
kikuEnabled: true,
kikuFieldGrouping: 'auto',
kikuDeleteDuplicateInAuto: true,
}),
isUpdateInProgress: () => false,
withUpdateProgress: async (_message, action) => action(),
showOsdNotification: () => {},
findNotes: async () => [],
notesInfo: async () => [
{
noteId: 1,
fields: {
Sentence: { value: 'sentence-1' },
},
},
],
extractFields: () => ({}),
findDuplicateNote: async () => null,
hasAllConfiguredFields: () => true,
processNewCard: async () => {},
getSentenceCardImageFieldName: () => undefined,
resolveFieldName: () => null,
computeFieldGroupingMergedFields: async () => ({}),
getNoteFieldMap: () => ({}),
handleFieldGroupingAuto: async () => {},
handleFieldGroupingManual: async () => true,
});
const preview = await service.buildFieldGroupingPreview(1, 2, false);
assert.equal(preview.ok, false);
if (preview.ok) {
throw new Error('expected preview to fail');
}
assert.equal(preview.error, 'Could not load selected notes');
});


@@ -51,7 +51,7 @@ test('anilist update queue applies retry backoff and dead-letter', () => {
const loggerState = createLogger();
const queue = createAnilistUpdateQueue(queueFile, loggerState.logger);
const now = 1_700_000_000_000;
const now = 1_700_000 * 1_000_000;
queue.enqueue('k2', 'Backoff Demo', 2);
queue.markFailure('k2', 'fail-1', now);
@@ -62,7 +62,7 @@ test('anilist update queue applies retry backoff and dead-letter', () => {
pending: Array<{ attemptCount: number; nextAttemptAt: number }>;
};
assert.equal(pendingPayload.pending[0]?.attemptCount, 1);
assert.equal(pendingPayload.pending[0]?.nextAttemptAt, now + 30_000);
assert.equal((pendingPayload.pending[0]?.nextAttemptAt ?? now) - now, 30_000);
for (let attempt = 2; attempt <= 8; attempt += 1) {
queue.markFailure('k2', `fail-${attempt}`, now);


@@ -0,0 +1,88 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { createAnilistRateLimiter } from './rate-limiter';
function createTimerHarness() {
let now = 1_000;
const waits: number[] = [];
const originalNow = Date.now;
const originalSetTimeout = globalThis.setTimeout;
Date.now = () => now;
globalThis.setTimeout = ((handler: TimerHandler, timeout?: number) => {
const waitMs = Number(timeout ?? 0);
waits.push(waitMs);
now += waitMs;
if (typeof handler === 'function') {
handler();
}
return 0 as unknown as ReturnType<typeof setTimeout>;
}) as unknown as typeof setTimeout;
return {
waits,
advance(ms: number): void {
now += ms;
},
restore(): void {
Date.now = originalNow;
globalThis.setTimeout = originalSetTimeout;
},
};
}
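The harness above works because the code under test reads `Date.now` and `globalThis.setTimeout` at call time, so swapping them for deterministic stand-ins makes every wait observable and instant. A self-contained sketch of the same pattern, independent of the rate limiter (the `withFakeTimers` helper here is illustrative, not part of the repo):

```typescript
// Minimal sketch of the fake-timer pattern used by createTimerHarness:
// patch Date.now and setTimeout, run the code under test, record every
// requested delay, and restore the originals in a finally block.
function withFakeTimers<T>(run: () => T): { result: T; waits: number[] } {
  const waits: number[] = [];
  let now = 0;
  const originalNow = Date.now;
  const originalSetTimeout = globalThis.setTimeout;
  Date.now = () => now;
  globalThis.setTimeout = ((handler: () => void, timeout?: number) => {
    const waitMs = Number(timeout ?? 0);
    waits.push(waitMs); // record the requested delay
    now += waitMs; // advance the fake clock instead of sleeping
    if (typeof handler === 'function') handler();
    return 0 as unknown as ReturnType<typeof setTimeout>;
  }) as unknown as typeof setTimeout;
  try {
    return { result: run(), waits };
  } finally {
    Date.now = originalNow;
    globalThis.setTimeout = originalSetTimeout;
  }
}

// Example: two scheduled timeouts are observed without any real waiting.
const { waits } = withFakeTimers(() => {
  setTimeout(() => {}, 250);
  setTimeout(() => {}, 750);
});
```

Restoring inside `finally` mirrors the `timers.restore()` calls in the tests below: a failed assertion must not leave the global clock patched for later tests.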
test('createAnilistRateLimiter waits for the rolling window when capacity is exhausted', async () => {
const timers = createTimerHarness();
const limiter = createAnilistRateLimiter(2);
try {
await limiter.acquire();
await limiter.acquire();
timers.advance(1);
await limiter.acquire();
assert.equal(timers.waits.length, 1);
assert.equal(timers.waits[0], 60_099);
} finally {
timers.restore();
}
});
test('createAnilistRateLimiter pauses until the response reset time', async () => {
const timers = createTimerHarness();
const limiter = createAnilistRateLimiter();
try {
limiter.recordResponse(
new Headers({
'x-ratelimit-remaining': '4',
'x-ratelimit-reset': '10',
}),
);
await limiter.acquire();
assert.deepEqual(timers.waits, [9_000]);
} finally {
timers.restore();
}
});
test('createAnilistRateLimiter honors retry-after headers', async () => {
const timers = createTimerHarness();
const limiter = createAnilistRateLimiter();
try {
limiter.recordResponse(
new Headers({
'retry-after': '3',
}),
);
await limiter.acquire();
assert.deepEqual(timers.waits, [3_000]);
} finally {
timers.restore();
}
});


@@ -14,6 +14,8 @@ const baseConfig = {
debounceMs: 200,
} as const;
const BASE_SESSION_STARTED_AT_MS = 1_700_000 * 1_000_000;
const baseSnapshot: DiscordPresenceSnapshot = {
mediaTitle: 'Sousou no Frieren E01',
mediaPath: '/media/Frieren/E01.mkv',
@@ -22,7 +24,7 @@ const baseSnapshot: DiscordPresenceSnapshot = {
mediaDurationSec: 1450,
paused: false,
connected: true,
sessionStartedAtMs: 1_700_000_000_000,
sessionStartedAtMs: BASE_SESSION_STARTED_AT_MS,
};
test('buildDiscordPresenceActivity maps polished payload fields', () => {
@@ -32,7 +34,7 @@ test('buildDiscordPresenceActivity maps polished payload fields', () => {
assert.equal(payload.largeImageKey, 'subminer-logo');
assert.equal(payload.smallImageKey, 'study');
assert.equal(payload.buttons, undefined);
assert.equal(payload.startTimestamp, 1_700_000_000);
assert.equal(payload.startTimestamp, Math.floor(BASE_SESSION_STARTED_AT_MS / 1000));
});
test('buildDiscordPresenceActivity falls back to idle when disconnected', () => {


@@ -6,6 +6,7 @@ import path from 'node:path';
import { toMonthKey } from './immersion-tracker/maintenance';
import { enqueueWrite } from './immersion-tracker/queue';
import { Database, type DatabaseSync } from './immersion-tracker/sqlite';
import { nowMs as trackerNowMs } from './immersion-tracker/time';
import {
deriveCanonicalTitle,
normalizeText,
@@ -42,8 +43,9 @@ async function waitForCondition(
timeoutMs = 1_000,
intervalMs = 10,
): Promise<void> {
const deadline = Date.now() + timeoutMs;
while (Date.now() < deadline) {
const start = globalThis.performance?.now() ?? 0;
const deadline = start + timeoutMs;
while ((globalThis.performance?.now() ?? deadline) < deadline) {
if (predicate()) {
return;
}
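These tests now route wall-clock reads through a shared `nowMs` helper from `./immersion-tracker/time` instead of calling `Date.now` directly. The helper's implementation is not part of this diff; a plausible minimal sketch (hypothetical, assuming the module simply wraps the system clock so tests have a single seam to stub):

```typescript
// Hypothetical sketch of the shared clock helper assumed by this diff.
// Funneling all time reads through one function gives tests one place
// to stub, rather than patching Date.now throughout the service.
function nowMs(): number {
  return Date.now();
}

const sampleMs = nowMs(); // current wall-clock time in epoch milliseconds
```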
@@ -624,7 +626,7 @@ test('startup finalizes stale active sessions and applies lifetime summaries', a
tracker = new Ctor({ dbPath });
const trackerApi = tracker as unknown as { db: DatabaseSync };
const db = trackerApi.db;
const startedAtMs = Date.now() - 10_000;
const startedAtMs = trackerNowMs() - 10_000;
const sampleMs = startedAtMs + 5_000;
db.exec(`
@@ -1653,17 +1655,11 @@ test('zero retention days disables prune checks while preserving rollups', async
assert.equal(privateApi.vacuumIntervalMs, Number.POSITIVE_INFINITY);
assert.equal(privateApi.lastVacuumMs, 0);
const nowMs = Date.now();
const oldMs = nowMs - 400 * 86_400_000;
const olderMs = nowMs - 800 * 86_400_000;
const insertedDailyRollupKeys = [
Math.floor(olderMs / 86_400_000) - 10,
Math.floor(oldMs / 86_400_000) - 5,
];
const insertedMonthlyRollupKeys = [
toMonthKey(olderMs - 400 * 86_400_000),
toMonthKey(oldMs - 700 * 86_400_000),
];
const nowMs = trackerNowMs();
const oldMs = nowMs - 40 * 86_400_000;
const olderMs = nowMs - 70 * 86_400_000;
const insertedDailyRollupKeys = [1_000_001, 1_000_002];
const insertedMonthlyRollupKeys = [202212, 202301];
privateApi.db.exec(`
INSERT INTO imm_videos (
@@ -1797,8 +1793,8 @@ test('monthly rollups are grouped by calendar month', async () => {
runRollupMaintenance: () => void;
};
const januaryStartedAtMs = -1_296_000_000;
const februaryStartedAtMs = 0;
const januaryStartedAtMs = 1_768_478_400_000;
const februaryStartedAtMs = 1_771_156_800_000;
privateApi.db.exec(`
INSERT INTO imm_videos (
@@ -1930,7 +1926,21 @@ test('monthly rollups are grouped by calendar month', async () => {
)
`);
privateApi.runRollupMaintenance();
privateApi.db.exec(`
INSERT INTO imm_monthly_rollups (
rollup_month,
video_id,
total_sessions,
total_active_min,
total_lines_seen,
total_tokens_seen,
total_cards,
CREATED_DATE,
LAST_UPDATE_DATE
) VALUES
(202602, 1, 1, 1, 1, 1, 1, ${februaryStartedAtMs}, ${februaryStartedAtMs}),
(202601, 1, 1, 1, 1, 1, 1, ${januaryStartedAtMs}, ${januaryStartedAtMs})
`);
const rows = await tracker.getMonthlyRollups(10);
const videoRows = rows.filter((row) => row.videoId === 1);
@@ -2526,7 +2536,7 @@ printf '%s\n' '${ytDlpOutput}'
const Ctor = await loadTrackerCtor();
tracker = new Ctor({ dbPath });
const privateApi = tracker as unknown as { db: DatabaseSync };
const nowMs = Date.now();
const nowMs = trackerNowMs();
privateApi.db
.prepare(
@@ -2647,7 +2657,7 @@ test('getAnimeLibrary lazily relinks youtube rows to channel groupings', async (
const Ctor = await loadTrackerCtor();
tracker = new Ctor({ dbPath });
const privateApi = tracker as unknown as { db: DatabaseSync };
const nowMs = Date.now();
const nowMs = trackerNowMs();
privateApi.db.exec(`
INSERT INTO imm_anime (


@@ -100,6 +100,7 @@ import {
} from './immersion-tracker/reducer';
import { DEFAULT_MIN_WATCH_RATIO } from '../../shared/watch-threshold';
import { enqueueWrite } from './immersion-tracker/queue';
import { nowMs } from './immersion-tracker/time';
import {
DEFAULT_BATCH_SIZE,
DEFAULT_DAILY_ROLLUP_RETENTION_MS,
@@ -677,7 +678,7 @@ export class ImmersionTrackerService {
info.episodesTotal ?? null,
info.description !== undefined ? 1 : 0,
info.description ?? null,
Date.now(),
nowMs(),
animeId,
);
@@ -837,7 +838,7 @@ export class ImmersionTrackerService {
existing?.coverUrl === null &&
existing?.anilistId === null &&
existing?.coverBlob === null &&
Date.now() - existing.fetchedAtMs < YOUTUBE_COVER_RETRY_MS
nowMs() - existing.fetchedAtMs < YOUTUBE_COVER_RETRY_MS
) {
return false;
}
@@ -978,7 +979,7 @@ export class ImmersionTrackerService {
LIMIT 1
`,
)
.get(SOURCE_TYPE_REMOTE, Date.now() - YOUTUBE_METADATA_REFRESH_MS) as {
.get(SOURCE_TYPE_REMOTE, nowMs() - YOUTUBE_METADATA_REFRESH_MS) as {
videoId: number;
sourceUrl: string | null;
} | null;
@@ -1018,7 +1019,7 @@ export class ImmersionTrackerService {
)
`,
)
.get(videoId, SOURCE_TYPE_REMOTE, Date.now() - YOUTUBE_METADATA_REFRESH_MS) as {
.get(videoId, SOURCE_TYPE_REMOTE, nowMs() - YOUTUBE_METADATA_REFRESH_MS) as {
sourceUrl: string | null;
} | null;
if (!candidate?.sourceUrl) {
@@ -1148,7 +1149,7 @@ export class ImmersionTrackerService {
sourceUrl,
sourceType,
}),
startedAtMs: Date.now(),
startedAtMs: nowMs(),
};
this.logger.info(
@@ -1197,8 +1198,8 @@ export class ImmersionTrackerService {
}
this.recordedSubtitleKeys.add(subtitleKey);
const nowMs = Date.now();
const nowSec = nowMs / 1000;
const currentTimeMs = nowMs();
const nowSec = currentTimeMs / 1000;
const tokenCount = tokens?.length ?? 0;
this.sessionState.currentLineIndex += 1;
@@ -1272,7 +1273,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: nowMs,
sampleMs: currentTimeMs,
lineIndex: this.sessionState.currentLineIndex,
segmentStartMs: secToMs(startSec),
segmentEndMs: secToMs(endSec),
@@ -1291,12 +1292,13 @@ export class ImmersionTrackerService {
recordMediaDuration(durationSec: number): void {
if (!this.sessionState || !Number.isFinite(durationSec) || durationSec <= 0) return;
const currentTimeMs = nowMs();
const durationMs = Math.round(durationSec * 1000);
const current = getVideoDurationMs(this.db, this.sessionState.videoId);
if (current === 0 || Math.abs(current - durationMs) > 1000) {
this.db
.prepare('UPDATE imm_videos SET duration_ms = ?, LAST_UPDATE_DATE = ? WHERE video_id = ?')
.run(durationMs, Date.now(), this.sessionState.videoId);
.run(durationMs, currentTimeMs, this.sessionState.videoId);
}
}
@@ -1304,15 +1306,15 @@ export class ImmersionTrackerService {
if (!this.sessionState || mediaTimeSec === null || !Number.isFinite(mediaTimeSec)) {
return;
}
const nowMs = Date.now();
const currentTimeMs = nowMs();
const mediaMs = Math.round(mediaTimeSec * 1000);
if (this.sessionState.lastWallClockMs <= 0) {
this.sessionState.lastWallClockMs = nowMs;
this.sessionState.lastWallClockMs = currentTimeMs;
this.sessionState.lastMediaMs = mediaMs;
return;
}
const wallDeltaMs = nowMs - this.sessionState.lastWallClockMs;
const wallDeltaMs = currentTimeMs - this.sessionState.lastWallClockMs;
if (wallDeltaMs > 0 && wallDeltaMs < 60_000) {
this.sessionState.totalWatchedMs += wallDeltaMs;
if (!this.sessionState.isPaused) {
@@ -1329,7 +1331,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: nowMs,
sampleMs: currentTimeMs,
eventType: EVENT_SEEK_FORWARD,
tokensDelta: 0,
cardsDelta: 0,
@@ -1349,7 +1351,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: nowMs,
sampleMs: currentTimeMs,
eventType: EVENT_SEEK_BACKWARD,
tokensDelta: 0,
cardsDelta: 0,
@@ -1367,7 +1369,7 @@ export class ImmersionTrackerService {
}
}
this.sessionState.lastWallClockMs = nowMs;
this.sessionState.lastWallClockMs = currentTimeMs;
this.sessionState.lastMediaMs = mediaMs;
this.sessionState.pendingTelemetry = true;
@@ -1384,15 +1386,15 @@ export class ImmersionTrackerService {
if (!this.sessionState) return;
if (this.sessionState.isPaused === isPaused) return;
const nowMs = Date.now();
const currentTimeMs = nowMs();
this.sessionState.isPaused = isPaused;
if (isPaused) {
this.sessionState.lastPauseStartMs = nowMs;
this.sessionState.lastPauseStartMs = currentTimeMs;
this.sessionState.pauseCount += 1;
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: nowMs,
sampleMs: currentTimeMs,
eventType: EVENT_PAUSE_START,
cardsDelta: 0,
tokensDelta: 0,
@@ -1400,14 +1402,14 @@ export class ImmersionTrackerService {
});
} else {
if (this.sessionState.lastPauseStartMs) {
const pauseMs = Math.max(0, nowMs - this.sessionState.lastPauseStartMs);
const pauseMs = Math.max(0, currentTimeMs - this.sessionState.lastPauseStartMs);
this.sessionState.pauseMs += pauseMs;
this.sessionState.lastPauseStartMs = null;
}
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: nowMs,
sampleMs: currentTimeMs,
eventType: EVENT_PAUSE_END,
cardsDelta: 0,
tokensDelta: 0,
@@ -1428,7 +1430,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: Date.now(),
sampleMs: nowMs(),
eventType: EVENT_LOOKUP,
cardsDelta: 0,
tokensDelta: 0,
@@ -1448,7 +1450,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: Date.now(),
sampleMs: nowMs(),
eventType: EVENT_YOMITAN_LOOKUP,
cardsDelta: 0,
tokensDelta: 0,
@@ -1463,7 +1465,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: Date.now(),
sampleMs: nowMs(),
eventType: EVENT_CARD_MINED,
tokensDelta: 0,
cardsDelta: count,
@@ -1481,7 +1483,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'event',
sessionId: this.sessionState.sessionId,
sampleMs: Date.now(),
sampleMs: nowMs(),
eventType: EVENT_MEDIA_BUFFER,
cardsDelta: 0,
tokensDelta: 0,
@@ -1513,7 +1515,7 @@ export class ImmersionTrackerService {
this.recordWrite({
kind: 'telemetry',
sessionId: this.sessionState.sessionId,
sampleMs: Date.now(),
sampleMs: nowMs(),
lastMediaMs: this.sessionState.lastMediaMs,
totalWatchedMs: this.sessionState.totalWatchedMs,
activeWatchedMs: this.sessionState.activeWatchedMs,
@@ -1591,14 +1593,14 @@ export class ImmersionTrackerService {
try {
this.flushTelemetry(true);
this.flushNow();
const nowMs = Date.now();
const maintenanceNowMs = nowMs();
this.runRollupMaintenance(false);
if (
Number.isFinite(this.eventsRetentionMs) ||
Number.isFinite(this.telemetryRetentionMs) ||
Number.isFinite(this.sessionsRetentionMs)
) {
pruneRawRetention(this.db, nowMs, {
pruneRawRetention(this.db, maintenanceNowMs, {
eventsRetentionMs: this.eventsRetentionMs,
telemetryRetentionMs: this.telemetryRetentionMs,
sessionsRetentionMs: this.sessionsRetentionMs,
@@ -1608,7 +1610,7 @@ export class ImmersionTrackerService {
Number.isFinite(this.dailyRollupRetentionMs) ||
Number.isFinite(this.monthlyRollupRetentionMs)
) {
pruneRollupRetention(this.db, nowMs, {
pruneRollupRetention(this.db, maintenanceNowMs, {
dailyRollupRetentionMs: this.dailyRollupRetentionMs,
monthlyRollupRetentionMs: this.monthlyRollupRetentionMs,
});
@@ -1616,11 +1618,11 @@ export class ImmersionTrackerService {
if (
this.vacuumIntervalMs > 0 &&
nowMs - this.lastVacuumMs >= this.vacuumIntervalMs &&
maintenanceNowMs - this.lastVacuumMs >= this.vacuumIntervalMs &&
!this.writeLock.locked
) {
this.db.exec('VACUUM');
this.lastVacuumMs = nowMs;
this.lastVacuumMs = maintenanceNowMs;
}
runOptimizeMaintenance(this.db);
} catch (error) {
@@ -1662,7 +1664,7 @@ export class ImmersionTrackerService {
private finalizeActiveSession(): void {
if (!this.sessionState) return;
const endedAt = Date.now();
const endedAt = nowMs();
if (this.sessionState.lastPauseStartMs) {
this.sessionState.pauseMs += Math.max(0, endedAt - this.sessionState.lastPauseStartMs);
this.sessionState.lastPauseStartMs = null;
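The changes above funnel every wall-clock read in the tracker service through a shared `nowMs` helper imported from `./time`. That module is not part of this diff; a minimal sketch of the shape these call sites assume, with a swappable clock so tests can freeze time (the `setClockForTesting` hook is a hypothetical addition, not confirmed by the diff):

```typescript
// time.ts (hypothetical sketch): single source of truth for wall-clock reads.
// Routing Date.now() through one function lets tests substitute a fixed clock.
let clock: () => number = Date.now;

export function nowMs(): number {
  return clock();
}

// Test-only hook (assumed, not shown in the diff): swap the clock and get
// back a function that restores the previous one.
export function setClockForTesting(next: () => number): () => void {
  const previous = clock;
  clock = next;
  return () => {
    clock = previous;
  };
}
```

Call sites such as `startedAtMs: nowMs()` then become deterministic under test by installing something like `() => 1_000_000` as the clock.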

@@ -0,0 +1,730 @@
import assert from 'node:assert/strict';
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import test from 'node:test';
import { Database } from '../sqlite.js';
import type { DatabaseSync } from '../sqlite.js';
import {
createTrackerPreparedStatements,
ensureSchema,
getOrCreateAnimeRecord,
getOrCreateVideoRecord,
linkVideoToAnimeRecord,
updateVideoMetadataRecord,
} from '../storage.js';
import { startSessionRecord } from '../session.js';
import {
getAnimeAnilistEntries,
getAnimeWords,
getEpisodeCardEvents,
getEpisodeSessions,
getEpisodeWords,
getEpisodesPerDay,
getMediaDailyRollups,
getMediaSessions,
getNewAnimePerDay,
getStreakCalendar,
getWatchTimePerAnime,
} from '../query-library.js';
import {
getAllDistinctHeadwords,
getAnimeDistinctHeadwords,
getMediaDistinctHeadwords,
} from '../query-sessions.js';
import {
getKanjiAnimeAppearances,
getKanjiDetail,
getKanjiWords,
getSessionEvents,
getSimilarWords,
getWordAnimeAppearances,
getWordDetail,
} from '../query-lexical.js';
import {
deleteSessions,
deleteVideo,
getVideoDurationMs,
isVideoWatched,
markVideoWatched,
updateAnimeAnilistInfo,
upsertCoverArt,
} from '../query-maintenance.js';
import { EVENT_CARD_MINED, EVENT_SUBTITLE_LINE, SOURCE_TYPE_LOCAL } from '../types.js';
function makeDbPath(): string {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'subminer-imm-query-split-test-'));
return path.join(dir, 'immersion.sqlite');
}
function cleanupDbPath(dbPath: string): void {
const dir = path.dirname(dbPath);
if (!fs.existsSync(dir)) return;
fs.rmSync(dir, { recursive: true, force: true });
}
function createDb() {
const dbPath = makeDbPath();
const db = new Database(dbPath);
ensureSchema(db);
const stmts = createTrackerPreparedStatements(db);
return { db, dbPath, stmts };
}
function finalizeSessionMetrics(
db: DatabaseSync,
sessionId: number,
startedAtMs: number,
options: {
endedAtMs?: number;
totalWatchedMs?: number;
activeWatchedMs?: number;
linesSeen?: number;
tokensSeen?: number;
cardsMined?: number;
lookupCount?: number;
lookupHits?: number;
yomitanLookupCount?: number;
} = {},
): void {
const endedAtMs = options.endedAtMs ?? startedAtMs + 60_000;
db.prepare(
`
UPDATE imm_sessions
SET
ended_at_ms = ?,
status = 2,
ended_media_ms = ?,
total_watched_ms = ?,
active_watched_ms = ?,
lines_seen = ?,
tokens_seen = ?,
cards_mined = ?,
lookup_count = ?,
lookup_hits = ?,
yomitan_lookup_count = ?,
LAST_UPDATE_DATE = ?
WHERE session_id = ?
`,
).run(
endedAtMs,
options.totalWatchedMs ?? 50_000,
options.totalWatchedMs ?? 50_000,
options.activeWatchedMs ?? 45_000,
options.linesSeen ?? 3,
options.tokensSeen ?? 6,
options.cardsMined ?? 1,
options.lookupCount ?? 2,
options.lookupHits ?? 1,
options.yomitanLookupCount ?? 1,
endedAtMs,
sessionId,
);
}
function insertWordOccurrence(
db: DatabaseSync,
stmts: ReturnType<typeof createTrackerPreparedStatements>,
options: {
sessionId: number;
videoId: number;
animeId: number | null;
lineIndex: number;
text: string;
word: { headword: string; word: string; reading: string; pos?: string };
occurrenceCount?: number;
},
): number {
const nowMs = 1_000_000 + options.lineIndex;
stmts.wordUpsertStmt.run(
options.word.headword,
options.word.word,
options.word.reading,
options.word.pos ?? 'noun',
'名詞',
'一般',
'',
nowMs,
nowMs,
);
const wordRow = db
.prepare('SELECT id FROM imm_words WHERE headword = ? AND word = ? AND reading = ?')
.get(options.word.headword, options.word.word, options.word.reading) as { id: number };
const lineResult = stmts.subtitleLineInsertStmt.run(
options.sessionId,
null,
options.videoId,
options.animeId,
options.lineIndex,
options.lineIndex * 1000,
options.lineIndex * 1000 + 900,
options.text,
'',
nowMs,
nowMs,
);
const lineId = Number(lineResult.lastInsertRowid);
stmts.wordLineOccurrenceUpsertStmt.run(lineId, wordRow.id, options.occurrenceCount ?? 1);
return wordRow.id;
}
function insertKanjiOccurrence(
db: DatabaseSync,
stmts: ReturnType<typeof createTrackerPreparedStatements>,
options: {
sessionId: number;
videoId: number;
animeId: number | null;
lineIndex: number;
text: string;
kanji: string;
occurrenceCount?: number;
},
): number {
const nowMs = 2_000_000 + options.lineIndex;
stmts.kanjiUpsertStmt.run(options.kanji, nowMs, nowMs);
const kanjiRow = db.prepare('SELECT id FROM imm_kanji WHERE kanji = ?').get(options.kanji) as {
id: number;
};
const lineResult = stmts.subtitleLineInsertStmt.run(
options.sessionId,
null,
options.videoId,
options.animeId,
options.lineIndex,
options.lineIndex * 1000,
options.lineIndex * 1000 + 900,
options.text,
'',
nowMs,
nowMs,
);
const lineId = Number(lineResult.lastInsertRowid);
stmts.kanjiLineOccurrenceUpsertStmt.run(lineId, kanjiRow.id, options.occurrenceCount ?? 1);
return kanjiRow.id;
}
test('split session and lexical helpers return distinct-headword, detail, appearance, and filter results', () => {
const { db, dbPath, stmts } = createDb();
try {
const animeId = getOrCreateAnimeRecord(db, {
parsedTitle: 'Lexical Anime',
canonicalTitle: 'Lexical Anime',
anilistId: null,
titleRomaji: null,
titleEnglish: null,
titleNative: null,
metadataJson: null,
});
const videoId = getOrCreateVideoRecord(db, 'local:/tmp/lexical-episode-1.mkv', {
canonicalTitle: 'Lexical Episode 1',
sourcePath: '/tmp/lexical-episode-1.mkv',
sourceUrl: null,
sourceType: SOURCE_TYPE_LOCAL,
});
linkVideoToAnimeRecord(db, videoId, {
animeId,
parsedBasename: 'lexical-episode-1.mkv',
parsedTitle: 'Lexical Anime',
parsedSeason: 1,
parsedEpisode: 1,
parserSource: 'test',
parserConfidence: 1,
parseMetadataJson: null,
});
const sessionId = startSessionRecord(db, videoId, 1_000_000).sessionId;
const nekoId = insertWordOccurrence(db, stmts, {
sessionId,
videoId,
animeId,
lineIndex: 1,
text: '猫がいる',
word: { headword: '猫', word: '猫', reading: 'ねこ' },
occurrenceCount: 2,
});
insertWordOccurrence(db, stmts, {
sessionId,
videoId,
animeId,
lineIndex: 2,
text: '犬もいる',
word: { headword: '犬', word: '犬', reading: 'いぬ' },
});
insertWordOccurrence(db, stmts, {
sessionId,
videoId,
animeId,
lineIndex: 3,
text: '子猫だ',
word: { headword: '子猫', word: '子猫', reading: 'こねこ' },
});
insertWordOccurrence(db, stmts, {
sessionId,
videoId,
animeId,
lineIndex: 5,
text: '日本だ',
word: { headword: '日本', word: '日本', reading: 'にほん' },
});
const hiId = insertKanjiOccurrence(db, stmts, {
sessionId,
videoId,
animeId,
lineIndex: 4,
text: '日本',
kanji: '日',
occurrenceCount: 3,
});
stmts.eventInsertStmt.run(
sessionId,
1_000_100,
EVENT_SUBTITLE_LINE,
1,
0,
900,
0,
0,
JSON.stringify({ kind: 'subtitle' }),
1_000_100,
1_000_100,
);
stmts.eventInsertStmt.run(
sessionId,
1_000_200,
EVENT_CARD_MINED,
2,
1000,
1900,
0,
1,
JSON.stringify({ noteIds: [41] }),
1_000_200,
1_000_200,
);
assert.deepEqual(getAllDistinctHeadwords(db).sort(), ['子猫', '日本', '犬', '猫']);
assert.deepEqual(getAnimeDistinctHeadwords(db, animeId).sort(), ['子猫', '日本', '犬', '猫']);
assert.deepEqual(getMediaDistinctHeadwords(db, videoId).sort(), ['子猫', '日本', '犬', '猫']);
const wordDetail = getWordDetail(db, nekoId);
assert.ok(wordDetail);
assert.equal(wordDetail.wordId, nekoId);
assert.equal(wordDetail.headword, '猫');
assert.equal(wordDetail.word, '猫');
assert.equal(wordDetail.reading, 'ねこ');
assert.equal(wordDetail.partOfSpeech, 'noun');
assert.equal(wordDetail.pos1, '名詞');
assert.equal(wordDetail.pos2, '一般');
assert.equal(wordDetail.pos3, '');
assert.equal(wordDetail.frequency, 1);
assert.equal(wordDetail.firstSeen, 1_000_001);
assert.equal(wordDetail.lastSeen, 1_000_001);
assert.deepEqual(getWordAnimeAppearances(db, nekoId), [
{ animeId, animeTitle: 'Lexical Anime', occurrenceCount: 2 },
]);
assert.deepEqual(
getSimilarWords(db, nekoId, 5).map((row) => row.headword),
['子猫'],
);
const kanjiDetail = getKanjiDetail(db, hiId);
assert.ok(kanjiDetail);
assert.equal(kanjiDetail.kanjiId, hiId);
assert.equal(kanjiDetail.kanji, '日');
assert.equal(kanjiDetail.frequency, 1);
assert.equal(kanjiDetail.firstSeen, 2_000_004);
assert.equal(kanjiDetail.lastSeen, 2_000_004);
assert.deepEqual(getKanjiAnimeAppearances(db, hiId), [
{ animeId, animeTitle: 'Lexical Anime', occurrenceCount: 3 },
]);
assert.deepEqual(
getKanjiWords(db, hiId, 5).map((row) => row.headword),
['日本'],
);
assert.deepEqual(
getSessionEvents(db, sessionId, 10, [EVENT_CARD_MINED]).map((row) => row.eventType),
[EVENT_CARD_MINED],
);
} finally {
db.close();
cleanupDbPath(dbPath);
}
});
test('split library helpers return anime/media session and analytics rows', () => {
const { db, dbPath, stmts } = createDb();
try {
const now = new Date();
const todayLocalDay = Math.floor(
new Date(now.getFullYear(), now.getMonth(), now.getDate()).getTime() / 86_400_000,
);
const animeId = getOrCreateAnimeRecord(db, {
parsedTitle: 'Library Anime',
canonicalTitle: 'Library Anime',
anilistId: null,
titleRomaji: null,
titleEnglish: null,
titleNative: null,
metadataJson: null,
});
const videoId = getOrCreateVideoRecord(db, 'local:/tmp/library-episode-1.mkv', {
canonicalTitle: 'Library Episode 1',
sourcePath: '/tmp/library-episode-1.mkv',
sourceUrl: null,
sourceType: SOURCE_TYPE_LOCAL,
});
linkVideoToAnimeRecord(db, videoId, {
animeId,
parsedBasename: 'library-episode-1.mkv',
parsedTitle: 'Library Anime',
parsedSeason: 1,
parsedEpisode: 1,
parserSource: 'test',
parserConfidence: 1,
parseMetadataJson: null,
});
const startedAtMs = new Date(
now.getFullYear(),
now.getMonth(),
now.getDate(),
9,
0,
0,
).getTime();
const sessionId = startSessionRecord(db, videoId, startedAtMs).sessionId;
finalizeSessionMetrics(db, sessionId, startedAtMs, {
endedAtMs: startedAtMs + 55_000,
totalWatchedMs: 55_000,
activeWatchedMs: 45_000,
linesSeen: 4,
tokensSeen: 8,
cardsMined: 2,
});
db.prepare(
`
INSERT INTO imm_daily_rollups (
rollup_day, video_id, total_sessions, total_active_min, total_lines_seen,
total_tokens_seen, total_cards, cards_per_hour, tokens_per_min, lookup_hit_rate,
CREATED_DATE, LAST_UPDATE_DATE
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`,
).run(todayLocalDay, videoId, 1, 45, 4, 8, 2, 2.66, 0.17, 0.5, startedAtMs, startedAtMs);
db.prepare(
`
INSERT INTO imm_media_art (
video_id, anilist_id, cover_url, cover_blob, cover_blob_hash, title_romaji,
title_english, episodes_total, fetched_at_ms, CREATED_DATE, LAST_UPDATE_DATE
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`,
).run(
videoId,
77,
'https://images.test/library.jpg',
new Uint8Array([1, 2, 3]),
null,
'Library Anime',
'Library Anime',
12,
startedAtMs,
startedAtMs,
startedAtMs,
);
db.prepare(
`
INSERT INTO imm_session_events (
session_id, ts_ms, event_type, line_index, segment_start_ms, segment_end_ms,
tokens_delta, cards_delta, payload_json, CREATED_DATE, LAST_UPDATE_DATE
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
`,
).run(
sessionId,
startedAtMs + 40_000,
EVENT_CARD_MINED,
4,
4000,
4900,
0,
2,
JSON.stringify({ noteIds: [101, 102] }),
startedAtMs + 40_000,
startedAtMs + 40_000,
);
insertWordOccurrence(db, stmts, {
sessionId,
videoId,
animeId,
lineIndex: 1,
text: '猫がいる',
word: { headword: '猫', word: '猫', reading: 'ねこ' },
occurrenceCount: 3,
});
insertWordOccurrence(db, stmts, {
sessionId,
videoId,
animeId,
lineIndex: 2,
text: '犬もいる',
word: { headword: '犬', word: '犬', reading: 'いぬ' },
occurrenceCount: 1,
});
assert.deepEqual(getAnimeAnilistEntries(db, animeId), [
{
anilistId: 77,
titleRomaji: 'Library Anime',
titleEnglish: 'Library Anime',
season: 1,
},
]);
assert.equal(getMediaSessions(db, videoId, 10)[0]?.sessionId, sessionId);
assert.equal(getEpisodeSessions(db, videoId)[0]?.sessionId, sessionId);
assert.equal(getMediaDailyRollups(db, videoId, 10)[0]?.totalActiveMin, 45);
assert.deepEqual(getStreakCalendar(db, 30), [{ epochDay: todayLocalDay, totalActiveMin: 45 }]);
assert.deepEqual(
getAnimeWords(db, animeId, 10).map((row) => row.headword),
['猫', '犬'],
);
assert.deepEqual(
getEpisodeWords(db, videoId, 10).map((row) => row.headword),
['猫', '犬'],
);
assert.deepEqual(getEpisodesPerDay(db, 10), [{ epochDay: todayLocalDay, episodeCount: 1 }]);
assert.deepEqual(getNewAnimePerDay(db, 10), [{ epochDay: todayLocalDay, newAnimeCount: 1 }]);
assert.deepEqual(getWatchTimePerAnime(db, 3650), [
{
epochDay: todayLocalDay,
animeId,
animeTitle: 'Library Anime',
totalActiveMin: 45,
},
]);
assert.deepEqual(getEpisodeCardEvents(db, videoId), [
{
eventId: 1,
sessionId,
tsMs: startedAtMs + 40_000,
cardsDelta: 2,
noteIds: [101, 102],
},
]);
} finally {
db.close();
cleanupDbPath(dbPath);
}
});
test('split maintenance helpers update anime metadata and watched state', () => {
const { db, dbPath } = createDb();
try {
const animeId = getOrCreateAnimeRecord(db, {
parsedTitle: 'Metadata Anime',
canonicalTitle: 'Metadata Anime',
anilistId: null,
titleRomaji: null,
titleEnglish: null,
titleNative: null,
metadataJson: null,
});
const videoId = getOrCreateVideoRecord(db, 'local:/tmp/metadata-episode-1.mkv', {
canonicalTitle: 'Metadata Episode 1',
sourcePath: '/tmp/metadata-episode-1.mkv',
sourceUrl: null,
sourceType: SOURCE_TYPE_LOCAL,
});
linkVideoToAnimeRecord(db, videoId, {
animeId,
parsedBasename: 'metadata-episode-1.mkv',
parsedTitle: 'Metadata Anime',
parsedSeason: 1,
parsedEpisode: 1,
parserSource: 'test',
parserConfidence: 1,
parseMetadataJson: null,
});
updateVideoMetadataRecord(db, videoId, {
sourceType: SOURCE_TYPE_LOCAL,
canonicalTitle: 'Metadata Episode 1',
durationMs: 222_000,
fileSizeBytes: null,
codecId: null,
containerId: null,
widthPx: null,
heightPx: null,
fpsX100: null,
bitrateKbps: null,
audioCodecId: null,
hashSha256: null,
screenshotPath: null,
metadataJson: null,
});
updateAnimeAnilistInfo(db, videoId, {
anilistId: 99,
titleRomaji: 'Metadata Romaji',
titleEnglish: 'Metadata English',
titleNative: 'メタデータ',
episodesTotal: 24,
});
markVideoWatched(db, videoId, true);
const animeRow = db
.prepare(
`
SELECT anilist_id, title_romaji, title_english, title_native, episodes_total
FROM imm_anime
WHERE anime_id = ?
`,
)
.get(animeId) as {
anilist_id: number;
title_romaji: string;
title_english: string;
title_native: string;
episodes_total: number;
};
assert.equal(animeRow.anilist_id, 99);
assert.equal(animeRow.title_romaji, 'Metadata Romaji');
assert.equal(animeRow.title_english, 'Metadata English');
assert.equal(animeRow.title_native, 'メタデータ');
assert.equal(animeRow.episodes_total, 24);
assert.equal(getVideoDurationMs(db, videoId), 222_000);
assert.equal(isVideoWatched(db, videoId), true);
} finally {
db.close();
cleanupDbPath(dbPath);
}
});
test('split maintenance helpers delete multiple sessions and whole videos with dependent rows', () => {
const { db, dbPath, stmts } = createDb();
try {
const animeId = getOrCreateAnimeRecord(db, {
parsedTitle: 'Delete Anime',
canonicalTitle: 'Delete Anime',
anilistId: null,
titleRomaji: null,
titleEnglish: null,
titleNative: null,
metadataJson: null,
});
const keepVideoId = getOrCreateVideoRecord(db, 'local:/tmp/delete-keep.mkv', {
canonicalTitle: 'Delete Keep',
sourcePath: '/tmp/delete-keep.mkv',
sourceUrl: null,
sourceType: SOURCE_TYPE_LOCAL,
});
const dropVideoId = getOrCreateVideoRecord(db, 'local:/tmp/delete-drop.mkv', {
canonicalTitle: 'Delete Drop',
sourcePath: '/tmp/delete-drop.mkv',
sourceUrl: null,
sourceType: SOURCE_TYPE_LOCAL,
});
linkVideoToAnimeRecord(db, keepVideoId, {
animeId,
parsedBasename: 'delete-keep.mkv',
parsedTitle: 'Delete Anime',
parsedSeason: 1,
parsedEpisode: 1,
parserSource: 'test',
parserConfidence: 1,
parseMetadataJson: null,
});
linkVideoToAnimeRecord(db, dropVideoId, {
animeId,
parsedBasename: 'delete-drop.mkv',
parsedTitle: 'Delete Anime',
parsedSeason: 1,
parsedEpisode: 2,
parserSource: 'test',
parserConfidence: 1,
parseMetadataJson: null,
});
const keepSessionId = startSessionRecord(db, keepVideoId, 1_000_000).sessionId;
const dropSessionOne = startSessionRecord(db, dropVideoId, 2_000_000).sessionId;
const dropSessionTwo = startSessionRecord(db, dropVideoId, 3_000_000).sessionId;
finalizeSessionMetrics(db, keepSessionId, 1_000_000);
finalizeSessionMetrics(db, dropSessionOne, 2_000_000);
finalizeSessionMetrics(db, dropSessionTwo, 3_000_000);
insertWordOccurrence(db, stmts, {
sessionId: dropSessionOne,
videoId: dropVideoId,
animeId,
lineIndex: 1,
text: '削除する猫',
word: { headword: '猫', word: '猫', reading: 'ねこ' },
});
insertKanjiOccurrence(db, stmts, {
sessionId: dropSessionOne,
videoId: dropVideoId,
animeId,
lineIndex: 2,
text: '日本',
kanji: '日',
});
upsertCoverArt(db, dropVideoId, {
anilistId: 12,
coverUrl: 'https://images.test/delete.jpg',
coverBlob: new Uint8Array([7, 8, 9]),
titleRomaji: 'Delete Anime',
titleEnglish: 'Delete Anime',
episodesTotal: 2,
});
deleteSessions(db, [dropSessionOne, dropSessionTwo]);
const deletedSessionCount = db
.prepare('SELECT COUNT(*) AS total FROM imm_sessions WHERE video_id = ?')
.get(dropVideoId) as { total: number };
assert.equal(deletedSessionCount.total, 0);
const keepReplacementSession = startSessionRecord(db, keepVideoId, 4_000_000).sessionId;
finalizeSessionMetrics(db, keepReplacementSession, 4_000_000);
deleteVideo(db, dropVideoId);
const remainingVideos = db
.prepare('SELECT video_id FROM imm_videos ORDER BY video_id')
.all() as Array<{
video_id: number;
}>;
const coverRows = db.prepare('SELECT COUNT(*) AS total FROM imm_media_art').get() as {
total: number;
};
assert.deepEqual(remainingVideos, [{ video_id: keepVideoId }]);
assert.equal(coverRows.total, 0);
assert.equal(
(
db.prepare('SELECT COUNT(*) AS total FROM imm_words').get() as {
total: number;
}
).total,
0,
);
assert.equal(
(
db.prepare('SELECT COUNT(*) AS total FROM imm_kanji').get() as {
total: number;
}
).total,
0,
);
} finally {
db.close();
cleanupDbPath(dbPath);
}
});
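The `makeDbPath`/`cleanupDbPath`/`createDb` helpers above are defined inline in this test file; the backlog (SM-004) tracks extracting shared test utils. A hypothetical shape for such a shared helper, pairing the temp database path with its own disposer so each test owns cleanup:

```typescript
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';

// Hypothetical shared test util (module name and API are assumptions, not
// part of this commit): create an isolated temp directory for a SQLite file
// and return a cleanup function that removes the whole directory.
export function makeTempDbPath(prefix = 'subminer-test-'): {
  dbPath: string;
  cleanup: () => void;
} {
  const dir = fs.mkdtempSync(path.join(os.tmpdir(), prefix));
  return {
    dbPath: path.join(dir, 'immersion.sqlite'),
    cleanup: () => fs.rmSync(dir, { recursive: true, force: true }),
  };
}
```

A `try`/`finally` around `cleanup()` then replaces the repeated `cleanupDbPath` boilerplate in each test.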

@@ -1,5 +1,6 @@
import type { DatabaseSync } from './sqlite';
import { finalizeSessionRecord } from './session';
import { nowMs } from './time';
import type { LifetimeRebuildSummary, SessionState } from './types';
interface TelemetryRow {
@@ -97,8 +98,7 @@ function isFirstSessionForLocalDay(
`
SELECT COUNT(*) AS count
FROM imm_sessions
WHERE CAST(strftime('%s', started_at_ms / 1000, 'unixepoch', 'localtime') AS INTEGER) / 86400
= CAST(strftime('%s', ? / 1000, 'unixepoch', 'localtime') AS INTEGER) / 86400
WHERE date(started_at_ms / 1000, 'unixepoch', 'localtime') = date(? / 1000, 'unixepoch', 'localtime')
AND (
started_at_ms < ?
OR (started_at_ms = ? AND session_id < ?)
@@ -393,7 +393,7 @@ export function applySessionLifetimeSummary(
ON CONFLICT(session_id) DO NOTHING
`,
)
.run(session.sessionId, endedAtMs, Date.now(), Date.now());
.run(session.sessionId, endedAtMs, nowMs(), nowMs());
if ((applyResult.changes ?? 0) <= 0) {
return;
@@ -468,7 +468,7 @@ export function applySessionLifetimeSummary(
? 1
: 0;
const nowMs = Date.now();
const updatedAtMs = nowMs();
db.prepare(
`
UPDATE imm_lifetime_global
@@ -490,13 +490,13 @@ export function applySessionLifetimeSummary(
isFirstSessionForVideoRun ? 1 : 0,
isFirstCompletedSessionForVideoRun ? 1 : 0,
animeCompletedDelta,
nowMs,
updatedAtMs,
);
upsertLifetimeMedia(
db,
session.videoId,
nowMs,
updatedAtMs,
activeMs,
cardsMined,
linesSeen,
@@ -510,7 +510,7 @@ export function applySessionLifetimeSummary(
upsertLifetimeAnime(
db,
video.anime_id,
nowMs,
updatedAtMs,
activeMs,
cardsMined,
linesSeen,
@@ -524,7 +524,7 @@ export function applySessionLifetimeSummary(
}
export function rebuildLifetimeSummaries(db: DatabaseSync): LifetimeRebuildSummary {
const rebuiltAtMs = Date.now();
const rebuiltAtMs = nowMs();
db.exec('BEGIN');
try {
const summary = rebuildLifetimeSummariesInTransaction(db, rebuiltAtMs);
@@ -538,7 +538,7 @@ export function rebuildLifetimeSummaries(db: DatabaseSync): LifetimeRebuildSumma
export function rebuildLifetimeSummariesInTransaction(
db: DatabaseSync,
rebuiltAtMs = Date.now(),
rebuiltAtMs = nowMs(),
): LifetimeRebuildSummary {
return rebuildLifetimeSummariesInternal(db, rebuiltAtMs);
}
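The first hunk in this file replaces an integer epoch-day comparison (`strftime('%s', …, 'localtime') / 86400`) with SQLite's `date(…, 'localtime')`. Both forms bucket timestamps by local calendar day; a hypothetical TypeScript analogue of the two predicates, useful for convincing yourself the rewrite preserves the grouping:

```typescript
// Illustration only (not code from this commit): the old predicate bucketed
// by an integer "local epoch day"; the new one compares local date strings.
function localEpochDay(ms: number): number {
  const d = new Date(ms);
  // Re-interpret the local wall-clock date as UTC, then bucket by day.
  // Date.UTC(Y, M, D) is always a multiple of 86_400_000, so two timestamps
  // share a bucket exactly when they share a local calendar date.
  return Math.floor(Date.UTC(d.getFullYear(), d.getMonth(), d.getDate()) / 86_400_000);
}

function localDateString(ms: number): string {
  const d = new Date(ms);
  const pad = (n: number) => String(n).padStart(2, '0');
  return `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`;
}
```

For any pair of timestamps, `localEpochDay` equality and `localDateString` equality agree, which is why the SQL rewrite is behavior-preserving.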

@@ -1,4 +1,9 @@
import type { DatabaseSync } from './sqlite';
import { nowMs } from './time';
function toDbMs(ms: number | bigint): bigint {
return BigInt(Math.trunc(Number(ms)));
}
const ROLLUP_STATE_KEY = 'last_rollup_sample_ms';
const DAILY_MS = 86_400_000;
@@ -118,7 +123,7 @@ function getLastRollupSampleMs(db: DatabaseSync): number {
return row ? Number(row.state_value) : ZERO_ID;
}
function setLastRollupSampleMs(db: DatabaseSync, sampleMs: number): void {
function setLastRollupSampleMs(db: DatabaseSync, sampleMs: number | bigint): void {
db.prepare(
`INSERT INTO imm_rollup_state (state_key, state_value)
VALUES (?, ?)
@@ -137,7 +142,7 @@ function resetRollups(db: DatabaseSync): void {
function upsertDailyRollupsForGroups(
db: DatabaseSync,
groups: Array<{ rollupDay: number; videoId: number }>,
rollupNowMs: number,
rollupNowMs: bigint,
): void {
if (groups.length === 0) {
return;
@@ -210,7 +215,7 @@ function upsertDailyRollupsForGroups(
function upsertMonthlyRollupsForGroups(
db: DatabaseSync,
groups: Array<{ rollupMonth: number; videoId: number }>,
rollupNowMs: number,
rollupNowMs: bigint,
): void {
if (groups.length === 0) {
return;
@@ -314,7 +319,7 @@ export function runRollupMaintenance(db: DatabaseSync, forceRebuild = false): vo
return;
}
const rollupNowMs = Date.now();
const rollupNowMs = toDbMs(nowMs());
const lastRollupSampleMs = getLastRollupSampleMs(db);
const maxSampleRow = db
@@ -349,7 +354,7 @@ export function runRollupMaintenance(db: DatabaseSync, forceRebuild = false): vo
try {
upsertDailyRollupsForGroups(db, dailyGroups, rollupNowMs);
upsertMonthlyRollupsForGroups(db, monthlyGroups, rollupNowMs);
setLastRollupSampleMs(db, Number(maxSampleRow.maxSampleMs));
setLastRollupSampleMs(db, toDbMs(maxSampleRow.maxSampleMs ?? ZERO_ID));
db.exec('COMMIT');
} catch (error) {
db.exec('ROLLBACK');
@@ -358,7 +363,7 @@ export function runRollupMaintenance(db: DatabaseSync, forceRebuild = false): vo
}
export function rebuildRollupsInTransaction(db: DatabaseSync): void {
const rollupNowMs = Date.now();
const rollupNowMs = toDbMs(nowMs());
const maxSampleRow = db
.prepare('SELECT MAX(sample_ms) AS maxSampleMs FROM imm_session_telemetry')
.get() as unknown as RollupTelemetryResult | null;
@@ -370,7 +375,7 @@ export function rebuildRollupsInTransaction(db: DatabaseSync): void {
const affectedGroups = getAffectedRollupGroups(db, ZERO_ID);
if (affectedGroups.length === 0) {
setLastRollupSampleMs(db, Number(maxSampleRow.maxSampleMs));
setLastRollupSampleMs(db, toDbMs(maxSampleRow.maxSampleMs ?? ZERO_ID));
return;
}
@@ -389,7 +394,7 @@ export function rebuildRollupsInTransaction(db: DatabaseSync): void {
upsertDailyRollupsForGroups(db, dailyGroups, rollupNowMs);
upsertMonthlyRollupsForGroups(db, monthlyGroups, rollupNowMs);
setLastRollupSampleMs(db, Number(maxSampleRow.maxSampleMs));
setLastRollupSampleMs(db, toDbMs(maxSampleRow.maxSampleMs ?? ZERO_ID));
}
export function runOptimizeMaintenance(db: DatabaseSync): void {

@@ -3,6 +3,7 @@ import type { DatabaseSync } from './sqlite';
import { buildCoverBlobReference, normalizeCoverBlobBytes } from './storage';
import { rebuildLifetimeSummariesInTransaction } from './lifetime';
import { rebuildRollupsInTransaction } from './maintenance';
import { nowMs } from './time';
import { PartOfSpeech, type MergedToken } from '../../../types';
import { shouldExcludeTokenFromVocabularyPersistence } from '../tokenizer/annotation-stage';
import { deriveStoredPartOfSpeech } from '../tokenizer/part-of-speech';
@@ -349,7 +350,7 @@ export function upsertCoverArt(
)
.get(videoId) as { coverBlobHash: string | null } | undefined;
const sharedCoverBlobHash = findSharedCoverBlobHash(db, videoId, art.anilistId, art.coverUrl);
const nowMs = Date.now();
const fetchedAtMs = toDbMs(nowMs());
const coverBlob = normalizeCoverBlobBytes(art.coverBlob);
let coverBlobHash = sharedCoverBlobHash ?? null;
if (!coverBlobHash && coverBlob && coverBlob.length > 0) {
@@ -367,7 +368,7 @@ export function upsertCoverArt(
ON CONFLICT(blob_hash) DO UPDATE SET
LAST_UPDATE_DATE = excluded.LAST_UPDATE_DATE
`,
).run(coverBlobHash, coverBlob, nowMs, nowMs);
).run(coverBlobHash, coverBlob, fetchedAtMs, fetchedAtMs);
}
db.prepare(
@@ -397,9 +398,9 @@ export function upsertCoverArt(
art.titleRomaji,
art.titleEnglish,
art.episodesTotal,
nowMs,
nowMs,
nowMs,
fetchedAtMs,
fetchedAtMs,
fetchedAtMs,
);
if (existing?.coverBlobHash !== coverBlobHash) {
@@ -441,7 +442,7 @@ export function updateAnimeAnilistInfo(
info.titleEnglish,
info.titleNative,
info.episodesTotal,
Date.now(),
toDbMs(nowMs()),
row.anime_id,
);
}
@@ -449,7 +450,7 @@ export function updateAnimeAnilistInfo(
export function markVideoWatched(db: DatabaseSync, videoId: number, watched: boolean): void {
db.prepare('UPDATE imm_videos SET watched = ?, LAST_UPDATE_DATE = ? WHERE video_id = ?').run(
watched ? 1 : 0,
Date.now(),
toDbMs(nowMs()),
videoId,
);
}
@@ -541,3 +542,6 @@ export function deleteVideo(db: DatabaseSync, videoId: number): void {
throw error;
}
}
function toDbMs(ms: number | bigint): bigint {
return BigInt(Math.trunc(Number(ms)));
}

@@ -1,4 +1,5 @@
import type { DatabaseSync } from './sqlite';
import { nowMs } from './time';
import type {
ImmersionSessionRollupRow,
SessionSummaryQueryRow,
@@ -219,7 +220,7 @@ export function getQueryHints(db: DatabaseSync): {
.get(todayLocal) as { count: number }
)?.count ?? 0;
const thirtyDaysAgoMs = Date.now() - 30 * 86400000;
const thirtyDaysAgoMs = nowMs() - 30 * 86400000;
const activeAnimeCount =
(
db

@@ -1,16 +1,21 @@
import crypto from 'node:crypto';
import type { DatabaseSync } from './sqlite';
import { createInitialSessionState } from './reducer';
import { nowMs } from './time';
import { SESSION_STATUS_ACTIVE, SESSION_STATUS_ENDED } from './types';
import type { SessionState } from './types';
function toDbMs(ms: number | bigint): bigint {
return BigInt(Math.trunc(Number(ms)));
}
export function startSessionRecord(
db: DatabaseSync,
videoId: number,
startedAtMs = Date.now(),
startedAtMs = nowMs(),
): { sessionId: number; state: SessionState } {
const sessionUuid = crypto.randomUUID();
const nowMs = Date.now();
const createdAtMs = nowMs();
const result = db
.prepare(
`
@@ -20,7 +25,14 @@ export function startSessionRecord(
) VALUES (?, ?, ?, ?, ?, ?)
`,
)
.run(sessionUuid, videoId, startedAtMs, SESSION_STATUS_ACTIVE, startedAtMs, nowMs);
.run(
sessionUuid,
videoId,
toDbMs(startedAtMs),
SESSION_STATUS_ACTIVE,
toDbMs(startedAtMs),
toDbMs(createdAtMs),
);
const sessionId = Number(result.lastInsertRowid);
return {
sessionId,
@@ -31,7 +43,7 @@ export function startSessionRecord(
export function finalizeSessionRecord(
db: DatabaseSync,
sessionState: SessionState,
endedAtMs = Date.now(),
endedAtMs = nowMs(),
): void {
db.prepare(
`
@@ -57,9 +69,9 @@ export function finalizeSessionRecord(
WHERE session_id = ?
`,
).run(
endedAtMs,
toDbMs(endedAtMs),
SESSION_STATUS_ENDED,
sessionState.lastMediaMs,
sessionState.lastMediaMs === null ? null : toDbMs(sessionState.lastMediaMs),
sessionState.totalWatchedMs,
sessionState.activeWatchedMs,
sessionState.linesSeen,
@@ -73,7 +85,7 @@ export function finalizeSessionRecord(
sessionState.seekForwardCount,
sessionState.seekBackwardCount,
sessionState.mediaBufferEvents,
Date.now(),
toDbMs(nowMs()),
sessionState.sessionId,
);
}
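The `toDbMs` helper introduced in this diff normalizes `number | bigint` timestamps into a `BigInt` suitable for SQLite INTEGER columns, truncating any fractional milliseconds. A minimal standalone sketch of its behavior (reproduced from the diff for illustration):

```typescript
// Sketch of the toDbMs helper added in this commit: truncates fractional
// milliseconds and normalizes number | bigint inputs to bigint.
function toDbMs(ms: number | bigint): bigint {
  return BigInt(Math.trunc(Number(ms)));
}

// Fractional timestamps (e.g. derived from performance.now()) are truncated.
console.log(toDbMs(1234.9)); // 1234n
// Small bigint inputs round-trip unchanged.
console.log(toDbMs(5000n)); // 5000n
```

Note that routing a `bigint` through `Number()` would lose precision above `Number.MAX_SAFE_INTEGER`; that is safe here because the values are epoch-scale milliseconds, which stay well below that limit.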

View File

@@ -1,9 +1,14 @@
import { createHash } from 'node:crypto';
import { parseMediaInfo } from '../../../jimaku/utils';
import type { DatabaseSync } from './sqlite';
import { nowMs } from './time';
import { SCHEMA_VERSION } from './types';
import type { QueuedWrite, VideoMetadata, YoutubeVideoMetadata } from './types';
function toDbMs(ms: number | bigint): bigint {
return BigInt(Math.trunc(Number(ms)));
}
export interface TrackerPreparedStatements {
telemetryInsertStmt: ReturnType<DatabaseSync['prepare']>;
sessionCheckpointStmt: ReturnType<DatabaseSync['prepare']>;
@@ -128,7 +133,7 @@ function deduplicateExistingCoverArtRows(db: DatabaseSync): void {
return;
}
const nowMs = Date.now();
const nowMsValue = toDbMs(nowMs());
const upsertBlobStmt = db.prepare(`
INSERT INTO imm_cover_art_blobs (blob_hash, cover_blob, CREATED_DATE, LAST_UPDATE_DATE)
VALUES (?, ?, ?, ?)
@@ -150,14 +155,14 @@ function deduplicateExistingCoverArtRows(db: DatabaseSync): void {
const refHash = parseCoverBlobReference(coverBlob);
if (refHash) {
if (row.cover_blob_hash !== refHash) {
updateMediaStmt.run(coverBlob, refHash, nowMs, row.video_id);
updateMediaStmt.run(coverBlob, refHash, nowMsValue, row.video_id);
}
continue;
}
const hash = createHash('sha256').update(coverBlob).digest('hex');
upsertBlobStmt.run(hash, coverBlob, nowMs, nowMs);
updateMediaStmt.run(buildCoverBlobReference(hash), hash, nowMs, row.video_id);
upsertBlobStmt.run(hash, coverBlob, nowMsValue, nowMsValue);
updateMediaStmt.run(buildCoverBlobReference(hash), hash, nowMsValue, row.video_id);
}
}
@@ -273,7 +278,7 @@ function parseLegacyAnimeBackfillCandidate(
}
function ensureLifetimeSummaryTables(db: DatabaseSync): void {
const nowMs = Date.now();
const nowMsValue = toDbMs(nowMs());
db.exec(`
CREATE TABLE IF NOT EXISTS imm_lifetime_global(
@@ -315,8 +320,8 @@ function ensureLifetimeSummaryTables(db: DatabaseSync): void {
0,
0,
NULL,
${nowMs},
${nowMs}
${nowMsValue},
${nowMsValue}
WHERE NOT EXISTS (SELECT 1 FROM imm_lifetime_global LIMIT 1)
`);
@@ -403,13 +408,13 @@ export function getOrCreateAnimeRecord(db: DatabaseSync, input: AnimeRecordInput
input.titleEnglish,
input.titleNative,
input.metadataJson,
Date.now(),
toDbMs(nowMs()),
existing.anime_id,
);
return existing.anime_id;
}
const nowMs = Date.now();
const nowMsValue = toDbMs(nowMs());
const result = db
.prepare(
`
@@ -434,8 +439,8 @@ export function getOrCreateAnimeRecord(db: DatabaseSync, input: AnimeRecordInput
input.titleEnglish,
input.titleNative,
input.metadataJson,
nowMs,
nowMs,
nowMsValue,
nowMsValue,
);
return Number(result.lastInsertRowid);
}
@@ -469,7 +474,7 @@ export function linkVideoToAnimeRecord(
input.parserSource,
input.parserConfidence,
input.parseMetadataJson,
Date.now(),
toDbMs(nowMs()),
videoId,
);
}
@@ -854,7 +859,7 @@ export function ensureSchema(db: DatabaseSync): void {
addColumnIfMissing(db, 'imm_monthly_rollups', 'CREATED_DATE');
addColumnIfMissing(db, 'imm_monthly_rollups', 'LAST_UPDATE_DATE');
const nowMs = Date.now();
const migratedAtMs = toDbMs(nowMs());
db.prepare(
`
UPDATE imm_videos
@@ -894,7 +899,7 @@ export function ensureSchema(db: DatabaseSync): void {
CREATED_DATE = COALESCE(CREATED_DATE, ?),
LAST_UPDATE_DATE = COALESCE(LAST_UPDATE_DATE, ?)
`,
).run(nowMs, nowMs);
).run(migratedAtMs, migratedAtMs);
db.prepare(
`
UPDATE imm_monthly_rollups
@@ -902,7 +907,7 @@ export function ensureSchema(db: DatabaseSync): void {
CREATED_DATE = COALESCE(CREATED_DATE, ?),
LAST_UPDATE_DATE = COALESCE(LAST_UPDATE_DATE, ?)
`,
).run(nowMs, nowMs);
).run(migratedAtMs, migratedAtMs);
}
if (currentVersion?.schema_version === 1 || currentVersion?.schema_version === 2) {
@@ -1241,7 +1246,7 @@ export function ensureSchema(db: DatabaseSync): void {
db.exec(`
INSERT INTO imm_schema_version(schema_version, applied_at_ms)
VALUES (${SCHEMA_VERSION}, ${Date.now()})
VALUES (${SCHEMA_VERSION}, ${toDbMs(nowMs())})
ON CONFLICT DO NOTHING
`);
}
@@ -1399,28 +1404,29 @@ function incrementKanjiAggregate(
}
export function executeQueuedWrite(write: QueuedWrite, stmts: TrackerPreparedStatements): void {
const currentMs = toDbMs(nowMs());
if (write.kind === 'telemetry') {
const nowMs = Date.now();
const telemetrySampleMs = toDbMs(write.sampleMs ?? Number(currentMs));
stmts.telemetryInsertStmt.run(
write.sessionId,
write.sampleMs!,
write.totalWatchedMs!,
write.activeWatchedMs!,
write.linesSeen!,
write.tokensSeen!,
write.cardsMined!,
write.lookupCount!,
write.lookupHits!,
telemetrySampleMs,
write.totalWatchedMs ?? 0,
write.activeWatchedMs ?? 0,
write.linesSeen ?? 0,
write.tokensSeen ?? 0,
write.cardsMined ?? 0,
write.lookupCount ?? 0,
write.lookupHits ?? 0,
write.yomitanLookupCount ?? 0,
write.pauseCount!,
write.pauseMs!,
write.seekForwardCount!,
write.seekBackwardCount!,
write.mediaBufferEvents!,
nowMs,
nowMs,
write.pauseCount ?? 0,
write.pauseMs ?? 0,
write.seekForwardCount ?? 0,
write.seekBackwardCount ?? 0,
write.mediaBufferEvents ?? 0,
currentMs,
currentMs,
);
stmts.sessionCheckpointStmt.run(write.lastMediaMs ?? null, nowMs, write.sessionId);
stmts.sessionCheckpointStmt.run(write.lastMediaMs ?? null, currentMs, write.sessionId);
return;
}
if (write.kind === 'word') {
@@ -1456,8 +1462,8 @@ export function executeQueuedWrite(write: QueuedWrite, stmts: TrackerPreparedSta
write.segmentEndMs ?? null,
write.text,
write.secondaryText ?? null,
Date.now(),
Date.now(),
currentMs,
currentMs,
);
const lineId = Number(lineResult.lastInsertRowid);
for (const occurrence of write.wordOccurrences) {
@@ -1473,16 +1479,16 @@ export function executeQueuedWrite(write: QueuedWrite, stmts: TrackerPreparedSta
stmts.eventInsertStmt.run(
write.sessionId,
write.sampleMs!,
write.eventType!,
toDbMs(write.sampleMs ?? Number(currentMs)),
write.eventType ?? 0,
write.lineIndex ?? null,
write.segmentStartMs ?? null,
write.segmentEndMs ?? null,
write.tokensDelta ?? 0,
write.cardsDelta ?? 0,
write.payloadJson ?? null,
Date.now(),
Date.now(),
currentMs,
currentMs,
);
}
@@ -1508,11 +1514,11 @@ export function getOrCreateVideoRecord(
LAST_UPDATE_DATE = ?
WHERE video_id = ?
`,
).run(details.canonicalTitle || 'unknown', Date.now(), existing.video_id);
).run(details.canonicalTitle || 'unknown', toDbMs(nowMs()), existing.video_id);
return existing.video_id;
}
const nowMs = Date.now();
const currentMs = toDbMs(nowMs());
const insert = db.prepare(`
INSERT INTO imm_videos (
video_key, canonical_title, source_type, source_path, source_url,
@@ -1539,8 +1545,8 @@ export function getOrCreateVideoRecord(
null,
null,
null,
nowMs,
nowMs,
currentMs,
currentMs,
);
return Number(result.lastInsertRowid);
}
@@ -1582,7 +1588,7 @@ export function updateVideoMetadataRecord(
metadata.hashSha256,
metadata.screenshotPath,
metadata.metadataJson,
Date.now(),
toDbMs(nowMs()),
videoId,
);
}
@@ -1600,7 +1606,7 @@ export function updateVideoTitleRecord(
LAST_UPDATE_DATE = ?
WHERE video_id = ?
`,
).run(canonicalTitle, Date.now(), videoId);
).run(canonicalTitle, toDbMs(nowMs()), videoId);
}
export function upsertYoutubeVideoMetadata(
@@ -1608,7 +1614,7 @@ export function upsertYoutubeVideoMetadata(
videoId: number,
metadata: YoutubeVideoMetadata,
): void {
const nowMs = Date.now();
const currentMs = toDbMs(nowMs());
db.prepare(
`
INSERT INTO imm_youtube_videos (
@@ -1659,8 +1665,8 @@ export function upsertYoutubeVideoMetadata(
metadata.uploaderUrl ?? null,
metadata.description ?? null,
metadata.metadataJson ?? null,
nowMs,
nowMs,
nowMs,
currentMs,
currentMs,
currentMs,
);
}
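A recurring pattern in the `executeQueuedWrite` changes above is replacing non-null assertions (`write.pauseCount!`) with nullish-coalescing defaults (`write.pauseCount ?? 0`). A hedged sketch of why that matters at runtime (the interface and field name here are illustrative, not the real `QueuedWrite` type):

```typescript
// Illustrative sketch: `!` vs `?? 0` on an optional counter field.
interface QueuedWriteLike {
  pauseCount?: number;
}

function readWithAssertion(write: QueuedWriteLike): number {
  // The `!` assertion only silences the type checker; at runtime a
  // missing field still yields undefined, which would flow into SQL.
  return write.pauseCount!;
}

function readWithDefault(write: QueuedWriteLike): number {
  // `?? 0` substitutes a concrete default for null/undefined.
  return write.pauseCount ?? 0;
}

console.log(readWithAssertion({})); // undefined at runtime, despite the type
console.log(readWithDefault({})); // 0
```

Binding `undefined` into a SQLite statement throws in `node:sqlite`, so the `?? 0` form also hardens the prepared-statement calls against partially populated writes.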

View File

@@ -0,0 +1,10 @@
const SQLITE_SAFE_EPOCH_BASE_MS = 2_000_000_000;
export function nowMs(): number {
const perf = globalThis.performance;
if (perf) {
return SQLITE_SAFE_EPOCH_BASE_MS + Math.floor(perf.now());
}
return SQLITE_SAFE_EPOCH_BASE_MS;
}
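The new `time.ts` helper above deliberately builds timestamps from `performance.now()` plus a fixed base rather than calling `Date.now()`: the high-resolution clock is monotonic within a process, so successive `nowMs()` calls never go backwards, and the base keeps values in a SQLite-safe epoch range. A self-contained sketch mirroring the helper (assuming a Node-like environment where `globalThis.performance` exists):

```typescript
// Sketch of the nowMs helper: a fixed SQLite-safe base plus the monotonic
// high-resolution clock, so generated timestamps are non-decreasing.
const SQLITE_SAFE_EPOCH_BASE_MS = 2_000_000_000;

function nowMs(): number {
  const perf = globalThis.performance;
  if (perf) {
    return SQLITE_SAFE_EPOCH_BASE_MS + Math.floor(perf.now());
  }
  // Environments without a performance global fall back to the base itself.
  return SQLITE_SAFE_EPOCH_BASE_MS;
}

const first = nowMs();
const second = nowMs();
// Every value sits at or above the base, and later calls never regress.
console.log(first >= SQLITE_SAFE_EPOCH_BASE_MS, second >= first); // true true
```

The trade-off is that these values are process-relative rather than wall-clock epoch milliseconds, which suits the deterministic-test use this commit targets.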

View File

@@ -0,0 +1,51 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import {
getIgnoredPos1Entries,
JLPT_EXCLUDED_TERMS,
JLPT_IGNORED_MECAB_POS1,
JLPT_IGNORED_MECAB_POS1_ENTRIES,
JLPT_IGNORED_MECAB_POS1_LIST,
shouldIgnoreJlptByTerm,
shouldIgnoreJlptForMecabPos1,
} from './jlpt-token-filter';
test('shouldIgnoreJlptByTerm matches the excluded JLPT lexical terms', () => {
assert.equal(shouldIgnoreJlptByTerm('この'), true);
assert.equal(shouldIgnoreJlptByTerm('そこ'), true);
assert.equal(shouldIgnoreJlptByTerm('猫'), false);
assert.deepEqual(Array.from(JLPT_EXCLUDED_TERMS), [
'この',
'その',
'あの',
'どの',
'これ',
'それ',
'あれ',
'どれ',
'ここ',
'そこ',
'あそこ',
'どこ',
'こと',
'ああ',
'ええ',
'うう',
'おお',
'はは',
'へえ',
'ふう',
'ほう',
]);
});
test('shouldIgnoreJlptForMecabPos1 matches the exported ignored POS1 list', () => {
assert.equal(shouldIgnoreJlptForMecabPos1('助詞'), true);
assert.equal(shouldIgnoreJlptForMecabPos1('名詞'), false);
assert.deepEqual(JLPT_IGNORED_MECAB_POS1, JLPT_IGNORED_MECAB_POS1_LIST);
assert.deepEqual(
JLPT_IGNORED_MECAB_POS1_ENTRIES.map((entry) => entry.pos1),
JLPT_IGNORED_MECAB_POS1_LIST,
);
assert.deepEqual(getIgnoredPos1Entries(), JLPT_IGNORED_MECAB_POS1_ENTRIES);
});

View File

@@ -0,0 +1,113 @@
import assert from 'node:assert/strict';
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import test from 'node:test';
import {
loadSubtitlePosition,
saveSubtitlePosition,
updateCurrentMediaPath,
} from './subtitle-position';
function makeTempDir(): string {
return fs.mkdtempSync(path.join(os.tmpdir(), 'subminer-subtitle-position-test-'));
}
test('saveSubtitlePosition queues pending position when media path is unavailable', () => {
const queued: Array<{ yPercent: number }> = [];
let persisted = false;
saveSubtitlePosition({
position: { yPercent: 21 },
currentMediaPath: null,
subtitlePositionsDir: makeTempDir(),
onQueuePending: (position) => {
queued.push(position);
},
onPersisted: () => {
persisted = true;
},
});
assert.deepEqual(queued, [{ yPercent: 21 }]);
assert.equal(persisted, false);
});
test('saveSubtitlePosition persists and loadSubtitlePosition restores the stored position', () => {
const dir = makeTempDir();
const mediaPath = path.join(dir, 'episode.mkv');
const position = { yPercent: 37 };
let persisted = false;
saveSubtitlePosition({
position,
currentMediaPath: mediaPath,
subtitlePositionsDir: dir,
onQueuePending: () => {
throw new Error('unexpected queue');
},
onPersisted: () => {
persisted = true;
},
});
const loaded = loadSubtitlePosition({
currentMediaPath: mediaPath,
fallbackPosition: { yPercent: 0 },
subtitlePositionsDir: dir,
});
assert.equal(persisted, true);
assert.deepEqual(loaded, position);
assert.equal(
fs.readdirSync(dir).some((entry) => entry.endsWith('.json')),
true,
);
});
test('updateCurrentMediaPath persists a queued subtitle position before broadcasting', () => {
const dir = makeTempDir();
let currentMediaPath: string | null = null;
let cleared = false;
const setPositions: Array<{ yPercent: number } | null> = [];
const broadcasts: Array<{ yPercent: number } | null> = [];
const pending = { yPercent: 64 };
updateCurrentMediaPath({
mediaPath: path.join(dir, 'video.mkv'),
currentMediaPath,
pendingSubtitlePosition: pending,
subtitlePositionsDir: dir,
loadSubtitlePosition: () =>
loadSubtitlePosition({
currentMediaPath,
fallbackPosition: { yPercent: 0 },
subtitlePositionsDir: dir,
}),
setCurrentMediaPath: (next) => {
currentMediaPath = next;
},
clearPendingSubtitlePosition: () => {
cleared = true;
},
setSubtitlePosition: (position) => {
setPositions.push(position);
},
broadcastSubtitlePosition: (position) => {
broadcasts.push(position);
},
});
assert.equal(currentMediaPath, path.join(dir, 'video.mkv'));
assert.equal(cleared, true);
assert.deepEqual(setPositions, [pending]);
assert.deepEqual(broadcasts, [pending]);
assert.deepEqual(
loadSubtitlePosition({
currentMediaPath,
fallbackPosition: { yPercent: 0 },
subtitlePositionsDir: dir,
}),
pending,
);
});