Mirror of https://github.com/ksyasuda/SubMiner.git (synced 2026-03-20 03:16:46 -07:00)
Harden stats APIs and fix Electron Yomitan debug runtime
- Validate stats session IDs/limits and add AnkiConnect request timeouts
- Stabilize stats window/runtime lifecycle and tighten window security defaults
- Fix Electron CLI debug startup by unsetting `ELECTRON_RUN_AS_NODE` and wiring Yomitan session state
- Expand regression coverage for tracker queries/events ordering and session aggregates
- Update docs for stats dashboard usage and Yomitan lookup troubleshooting
@@ -0,0 +1,53 @@
---
id: TASK-174
title: Fix missing frequency highlights for merged tokenizer tokens
status: In Progress
assignee:
  - codex
created_date: '2026-03-15 10:18'
updated_date: '2026-03-15 10:40'
labels:
  - bug
  - tokenizer
  - frequency-highlighting
dependencies: []
references:
  - /Users/sudacode/projects/japanese/SubMiner/src/core/services/tokenizer.ts
  - >-
    /Users/sudacode/projects/japanese/SubMiner/src/core/services/tokenizer/parser-selection-stage.ts
  - >-
    /Users/sudacode/projects/japanese/SubMiner/src/core/services/tokenizer/yomitan-parser-runtime.ts
  - /Users/sudacode/projects/japanese/SubMiner/scripts/get_frequency.ts
  - /Users/sudacode/projects/japanese/SubMiner/scripts/test-yomitan-parser.ts
priority: high
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Frequency highlighting can miss words that should be colored within the configured top-X limit when tokenizer candidate selection keeps merged Yomitan units that combine a content word with trailing function text. The annotation stage then conservatively clears frequency for the whole merged token, so visible high-frequency words lose highlighting. The standalone debug CLIs are also failing to initialize the shared Yomitan runtime, which blocks reliable repro for this class of bug.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 Tokenizer no longer drops frequency highlighting for content words in merged-token cases where a better scanning parse candidate would preserve highlightable tokens.
- [ ] #2 A regression test covers the reported sentence shape and fails before the fix.
- [ ] #3 The standalone frequency/parser debug path can initialize the shared Yomitan runtime well enough to reproduce tokenizer output instead of immediately reporting runtime/session wiring errors.
<!-- AC:END -->

## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
1. Add a regression test for the reported merged-token frequency miss, centered on Yomitan scanning candidate selection and downstream frequency annotation.
2. Update tokenizer candidate selection so merged content+function tokens do not win over candidates that preserve highlightable content tokens.
3. Repair the standalone frequency/parser debug scripts so their Electron/Yomitan runtime wiring matches current shared runtime expectations.
4. Verify with targeted tokenizer/parser tests and the standalone debug repro command.
<!-- SECTION:PLAN:END -->

## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
Initial triage: shared frequency class logic looks correct; the likely failure is upstream tokenizer candidate selection producing merged content+function tokens that annotation later excludes from frequency. Standalone debug scripts also fail to initialize a usable Electron/Yomitan runtime, blocking reliable repro from the current CLI path.

After fixing the standalone Electron wrapper, the repro does not support the original highlight claim for `誰でもいいから かかってこいよ`: the tokenizer reports `かかってこい` with `frequencyRank` 63098, so it correctly stays uncolored at `--color-top-x 10000` and becomes colorable once the threshold is raised above that rank. The concrete bug fixed in this pass is the standalone Electron debug path: package scripts now unset `ELECTRON_RUN_AS_NODE`, and the scripts normalize Electron imports/guards so `get-frequency:electron` can reach real Electron/Yomitan runtime state instead of immediately falling back to Node-mode diagnostics. `test-yomitan-parser:electron` still shows extension/service-worker issues against the existing profile and was not stabilized in this pass.
<!-- SECTION:NOTES:END -->
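The candidate-selection idea in TASK-174 can be sketched standalone. This is a hypothetical illustration, not the project's actual selection code: token shape, field names, and the scoring rule are assumptions; only the example strings and the 63098 rank come from the task notes.

```typescript
// Hypothetical token shape for illustrating the merged-token problem.
interface Token {
  text: string;
  isContentWord: boolean;
  frequencyRank: number | null; // cleared when units are merged
}

// Score a parse candidate by how many content tokens still carry a
// frequency rank; merged content+function tokens lose their rank.
function highlightScore(candidate: Token[]): number {
  return candidate.filter((t) => t.isContentWord && t.frequencyRank !== null).length;
}

// Prefer the candidate that preserves the most highlightable tokens.
function pickCandidate(candidates: Token[][]): Token[] {
  return candidates.reduce((best, next) =>
    highlightScore(next) > highlightScore(best) ? next : best,
  );
}

// Merged unit: content word fused with trailing function text, rank cleared.
const merged: Token[] = [{ text: 'かかってこいよ', isContentWord: true, frequencyRank: null }];
// Scanning candidate: content word kept separate, rank preserved.
const split: Token[] = [
  { text: 'かかってこい', isContentWord: true, frequencyRank: 63098 },
  { text: 'よ', isContentWord: false, frequencyRank: null },
];

console.log(pickCandidate([merged, split])[0].text); // かかってこい
```

Under this scoring, the scanning candidate wins and the content word remains eligible for frequency coloring once the top-X threshold allows its rank.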
@@ -0,0 +1,33 @@
---
id: TASK-175
title: Address latest PR 19 review comments
status: In Progress
assignee: []
created_date: '2026-03-15 10:25'
labels:
  - pr-review
  - stats-dashboard
dependencies: []
references:
  - src/core/services/ipc.ts
  - src/core/services/stats-server.ts
  - src/core/services/immersion-tracker/__tests__/query.test.ts
  - src/core/services/stats-window-runtime.ts
  - src/core/services/stats-window.test.ts
  - src/shared/ipc/contracts.ts
  - src/main.ts
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Validate the latest automated review comments on PR #19 against the current branch, implement the technically valid fixes, and document any items intentionally left unchanged.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 Validated the latest PR #19 review comments against current branch behavior and existing architecture
- [ ] #2 Implemented the accepted fixes with regression coverage where it fits
- [ ] #3 Documented which latest review items were intentionally not changed because they were already addressed or not technically warranted
<!-- AC:END -->
@@ -4,6 +4,23 @@ SubMiner can build a Yomitan-compatible character dictionary from AniList metada

The dictionary is generated per-media, merged across your recently-watched titles, and auto-imported into Yomitan. When a character name appears in a subtitle line, it gets highlighted and becomes available for hover-driven Yomitan profile lookup.

## Stats Dashboard

The character dictionary and stats dashboard both read from the same local immersion data.

- Open the dashboard from the overlay: press your configured `stats.toggleKey` (default: `` ` `` / `Backquote`).
- Open from the launcher/CLI: run `subminer stats`.
- Open directly: visit `http://127.0.0.1:<stats.serverPort>` when the local server is running.

Useful config keys:

- `stats.autoStartServer` — start the local stats server automatically once immersion tracking starts.
- `stats.serverPort` — local HTTP port for the dashboard and API.
- `stats.toggleKey` — key binding for the overlay dashboard toggle.
- `stats.autoOpenBrowser` — auto-open the dashboard in a browser for `subminer stats`.

The dashboard gives quick visibility into episode summaries, watch-time rollups, session timelines, and vocabulary/kanji drill-down from the same DB used by character matching.

## How It Works

The feature has three stages: **snapshot**, **merge**, and **match**.
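As a sketch, the keys above might sit together in a SubMiner config file like the following. Only the key names come from the list above; the surrounding file shape and the example values are assumptions:

```json
{
  "stats": {
    "autoStartServer": true,
    "serverPort": 3456,
    "toggleKey": "Backquote",
    "autoOpenBrowser": true
  }
}
```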
@@ -178,7 +178,7 @@ SubMiner does not load the source tree directly from `vendor/subminer-yomitan`;

If you installed from the AppImage and see this error, the package may be incomplete. Re-download the AppImage or place the unpacked Yomitan extension manually in `~/.config/SubMiner/yomitan`.

-**Yomitan popup does not appear when hovering words and triggering lookup**
+**Yomitan lookup popup does not appear when hovering words or triggering lookup**

- Verify Yomitan loaded successfully — check the terminal output for "Loaded Yomitan extension".
- Yomitan requires dictionaries to be installed. Open Yomitan settings (`Alt+Shift+Y` or `SubMiner.AppImage --settings`) and confirm at least one dictionary is imported.
@@ -8,9 +8,9 @@
    "typecheck": "tsc --noEmit -p tsconfig.typecheck.json",
    "typecheck:watch": "tsc --watch --preserveWatchOutput -p tsconfig.typecheck.json",
    "get-frequency": "bun run scripts/get_frequency.ts --pretty --color-top-x 10000 --yomitan-user-data ~/.config/SubMiner --colorized-line",
-   "get-frequency:electron": "bun run build:yomitan && bun build scripts/get_frequency.ts --format=cjs --target=node --outfile dist/scripts/get_frequency.js --external electron && electron dist/scripts/get_frequency.js --pretty --color-top-x 10000 --yomitan-user-data ~/.config/SubMiner --colorized-line",
+   "get-frequency:electron": "bun run build:yomitan && bun build scripts/get_frequency.ts --format=cjs --target=node --outfile dist/scripts/get_frequency.js --external electron && env -u ELECTRON_RUN_AS_NODE electron dist/scripts/get_frequency.js --pretty --color-top-x 10000 --yomitan-user-data ~/.config/SubMiner --colorized-line",
    "test-yomitan-parser": "bun run scripts/test-yomitan-parser.ts",
-   "test-yomitan-parser:electron": "bun run build:yomitan && bun build scripts/test-yomitan-parser.ts --format=cjs --target=node --outfile dist/scripts/test-yomitan-parser.js --external electron && electron dist/scripts/test-yomitan-parser.js",
+   "test-yomitan-parser:electron": "bun run build:yomitan && bun build scripts/test-yomitan-parser.ts --format=cjs --target=node --outfile dist/scripts/test-yomitan-parser.js --external electron && env -u ELECTRON_RUN_AS_NODE electron dist/scripts/test-yomitan-parser.js",
    "build:yomitan": "bun scripts/build-yomitan.mjs",
    "build:assets": "bun scripts/prepare-build-assets.mjs",
    "build:stats": "cd stats && bun run build",
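Why the `env -u ELECTRON_RUN_AS_NODE` prefix matters: when that variable is set, the Electron binary starts as plain Node and never exposes the Electron API, so the debug scripts could only ever reach their Node-mode fallback. The same idea in TypeScript, as a sketch (the `electronEnv` helper and the commented launch line are hypothetical, not project code):

```typescript
// Build a child environment without ELECTRON_RUN_AS_NODE so the electron
// binary starts as a real Electron app instead of plain Node.
function electronEnv(base: Record<string, string | undefined>): Record<string, string | undefined> {
  const { ELECTRON_RUN_AS_NODE, ...rest } = base;
  void ELECTRON_RUN_AS_NODE; // intentionally dropped
  return rest;
}

const env = electronEnv({ PATH: '/usr/bin', ELECTRON_RUN_AS_NODE: '1' });
console.log('ELECTRON_RUN_AS_NODE' in env); // false
console.log(env.PATH); // /usr/bin

// A hypothetical launcher would pass this environment to the child, which is
// what `env -u ELECTRON_RUN_AS_NODE electron …` achieves in the scripts:
// spawn('electron', ['dist/scripts/get_frequency.js'], { env: electronEnv(process.env) });
```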
@@ -482,6 +482,7 @@ function simplifyTokenWithVerbose(

interface YomitanRuntimeState {
  yomitanExt: unknown | null;
  yomitanSession: unknown | null;
  parserWindow: unknown | null;
  parserReadyPromise: Promise<void> | null;
  parserInitPromise: Promise<boolean> | null;
@@ -525,24 +526,38 @@ function destroyUnknownParserWindow(window: unknown): void {
  }
}

+async function loadElectronModule(): Promise<typeof import('electron') | null> {
+  try {
+    const electronImport = await import('electron');
+    return (electronImport.default ?? electronImport) as typeof import('electron');
+  } catch {
+    return null;
+  }
+}
+
async function createYomitanRuntimeState(
  userDataPath: string,
  extensionPath?: string,
): Promise<YomitanRuntimeState> {
  const state: YomitanRuntimeState = {
    yomitanExt: null,
    yomitanSession: null,
    parserWindow: null,
    parserReadyPromise: null,
    parserInitPromise: null,
    available: false,
  };

-  const electronImport = await import('electron').catch((error) => {
-    state.note = error instanceof Error ? error.message : 'unknown error';
-    return null;
-  });
-  if (!electronImport || !electronImport.app || !electronImport.app.whenReady) {
-    state.note = 'electron runtime not available in this process';
+  const electronImport = await loadElectronModule();
+  if (
+    !electronImport ||
+    !electronImport.app ||
+    typeof electronImport.app.whenReady !== 'function' ||
+    !electronImport.session
+  ) {
+    state.note = electronImport
+      ? 'electron runtime not available in this process'
+      : 'electron import failed';
    return state;
  }

@@ -557,6 +572,7 @@ async function createYomitanRuntimeState(
  setYomitanParserReadyPromise: (promise: Promise<void> | null) => void;
  setYomitanParserInitPromise: (promise: Promise<boolean> | null) => void;
  setYomitanExtension: (extension: unknown) => void;
  setYomitanSession: (session: unknown) => void;
}) => Promise<unknown>;

const extension = await loadYomitanExtension({
@@ -575,6 +591,9 @@ async function createYomitanRuntimeState(
  setYomitanExtension: (extension) => {
    state.yomitanExt = extension;
  },
+  setYomitanSession: (nextSession) => {
+    state.yomitanSession = nextSession;
+  },
});

if (!extension) {
@@ -768,8 +787,12 @@ async function main(): Promise<void> {
    );
  }

-  electronModule = await import('electron').catch(() => null);
-  if (electronModule && args.yomitanUserDataPath) {
+  electronModule = await loadElectronModule();
+  if (
+    electronModule?.app &&
+    typeof electronModule.app.setPath === 'function' &&
+    args.yomitanUserDataPath
+  ) {
    electronModule.app.setPath('userData', args.yomitanUserDataPath);
  }
  yomitanState = !args.forceMecabOnly
@@ -783,6 +806,7 @@ async function main(): Promise<void> {

  const deps = createTokenizerDepsRuntime({
    getYomitanExt: () => (useYomitan ? yomitanState!.yomitanExt : null) as never,
+   getYomitanSession: () => (useYomitan ? yomitanState!.yomitanSession : null) as never,
    getYomitanParserWindow: () => (useYomitan ? yomitanState!.parserWindow : null) as never,
    setYomitanParserWindow: (window) => {
      if (!useYomitan) {
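The `loadElectronModule` pattern generalizes: attempt a dynamic import, normalize a possible default export, and return `null` on failure so callers can record a diagnostic note instead of crashing. A generic sketch (the `tryImport` helper is hypothetical, not project code):

```typescript
// Try a dynamic import; normalize a default export; return null if the
// module cannot be loaded in the current process.
async function tryImport<T>(specifier: string): Promise<T | null> {
  try {
    const mod = (await import(specifier)) as { default?: T } & T;
    return (mod.default ?? mod) as T;
  } catch {
    return null;
  }
}

// In plain Node (or Electron started with ELECTRON_RUN_AS_NODE set), the
// 'electron' API module is unavailable, so the helper yields null.
tryImport<unknown>('electron').then((electron) => {
  console.log(electron !== null ? 'electron runtime loaded' : 'electron unavailable');
});
```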
@@ -379,6 +379,15 @@ function resolveYomitanExtensionPath(explicitPath?: string): string | null {
  });
}

+async function loadElectronModule(): Promise<typeof import('electron') | null> {
+  try {
+    const electronImport = await import('electron');
+    return (electronImport.default ?? electronImport) as typeof import('electron');
+  } catch {
+    return null;
+  }
+}
+
async function setupYomitanRuntime(options: CliOptions): Promise<YomitanRuntimeState> {
  const state: YomitanRuntimeState = {
    available: false,
@@ -394,16 +403,13 @@ async function setupYomitanRuntime(options: CliOptions): Promise<YomitanRuntimeS
    return state;
  }

-  const electronModule = await import('electron').catch((error) => {
-    state.note = error instanceof Error ? error.message : 'electron import failed';
-    return null;
-  });
+  const electronModule = await loadElectronModule();
  if (!electronModule?.app || !electronModule?.session) {
    state.note = 'electron runtime not available in this process';
    return state;
  }

-  if (options.yomitanUserDataPath) {
+  if (options.yomitanUserDataPath && typeof electronModule.app.setPath === 'function') {
    electronModule.app.setPath('userData', options.yomitanUserDataPath);
  }
  await electronModule.app.whenReady();
@@ -618,8 +618,8 @@ test('monthly rollups are grouped by calendar month', async () => {
    runRollupMaintenance: () => void;
  };

-  const januaryStartedAtMs = Date.UTC(2026, 0, 31, 23, 59, 59, 0);
-  const februaryStartedAtMs = Date.UTC(2026, 1, 1, 0, 0, 1, 0);
+  const januaryStartedAtMs = Date.UTC(2026, 0, 15, 12, 0, 0, 0);
+  const februaryStartedAtMs = Date.UTC(2026, 1, 15, 12, 0, 0, 0);

  privateApi.db.exec(`
    INSERT INTO imm_videos (
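The old fixtures sat one second from the January/February boundary; mid-month noon timestamps keep the "grouped by calendar month" assertion unambiguous however the rollup bins its timestamps. A minimal UTC month-keying sketch (the `utcMonthKey` helper is illustrative, not the tracker's rollup code):

```typescript
// Key a millisecond timestamp by its UTC calendar month, e.g. "2026-01".
function utcMonthKey(ms: number): string {
  const d = new Date(ms);
  return `${d.getUTCFullYear()}-${String(d.getUTCMonth() + 1).padStart(2, '0')}`;
}

const boundary = Date.UTC(2026, 0, 31, 23, 59, 59, 0); // one second before February (UTC)
const midMonth = Date.UTC(2026, 0, 15, 12, 0, 0, 0);   // unambiguously January

console.log(utcMonthKey(boundary)); // 2026-01
console.log(utcMonthKey(midMonth)); // 2026-01
```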
@@ -109,7 +109,13 @@ test('getSessionSummaries returns sessionId and canonicalTitle', () => {
    assert.equal(row.sessionId, sessionId);
    assert.equal(row.canonicalTitle, 'Query Test Episode');
    assert.equal(row.videoId, videoId);
-   assert.ok(row.linesSeen >= 5);
+   assert.equal(row.linesSeen, 5);
    assert.equal(row.totalWatchedMs, 3_000);
    assert.equal(row.activeWatchedMs, 2_500);
    assert.equal(row.wordsSeen, 10);
    assert.equal(row.tokensSeen, 10);
    assert.equal(row.lookupCount, 2);
    assert.equal(row.lookupHits, 1);
  } finally {
    db.close();
    cleanupDbPath(dbPath);
@@ -137,7 +143,12 @@ test('getSessionSummaries with no telemetry returns zero aggregates', () => {
    assert.ok(row, 'expected to find the session with no telemetry');
    assert.equal(row.canonicalTitle, 'No Telemetry');
    assert.equal(row.totalWatchedMs, 0);
    assert.equal(row.activeWatchedMs, 0);
    assert.equal(row.linesSeen, 0);
    assert.equal(row.wordsSeen, 0);
    assert.equal(row.tokensSeen, 0);
    assert.equal(row.lookupCount, 0);
    assert.equal(row.lookupHits, 0);
    assert.equal(row.cardsMined, 0);
  } finally {
    db.close();
@@ -153,13 +164,19 @@ test('getVocabularyStats returns rows ordered by frequency descending', () => {
    ensureSchema(db);
    const stmts = createTrackerPreparedStatements(db);

-   // Insert words: 猫 twice, 犬 once
+   // Insert words with the highest-frequency entry inserted after another word
+   stmts.wordUpsertStmt.run('犬', '犬', 'いぬ', 'noun', '名詞', '一般', '', 1_500, 1_500);
    stmts.wordUpsertStmt.run('猫', '猫', 'ねこ', 'noun', '名詞', '一般', '', 1_000, 2_000);
    stmts.wordUpsertStmt.run('猫', '猫', 'ねこ', 'noun', '名詞', '一般', '', 1_000, 3_000);
-   stmts.wordUpsertStmt.run('犬', '犬', 'いぬ', 'noun', '名詞', '一般', '', 1_500, 1_500);

    const rows = getVocabularyStats(db, 10);

-   assert.equal(rows.length, 2);
-   assert.equal(rows[0]?.headword, '猫');
-   assert.equal(rows[1]?.headword, '犬');
-   assert.equal(rows[0]?.frequency, 2);
-   assert.equal(rows[1]?.frequency, 1);
+   assert.ok(rows.length >= 2);
+   // First row should be 猫 (frequency 2)
+   const nekRow = rows.find((r) => r.headword === '猫');
@@ -432,13 +449,17 @@ test('getKanjiStats returns rows ordered by frequency descending', () => {
    ensureSchema(db);
    const stmts = createTrackerPreparedStatements(db);

-   // Insert kanji: 日 twice, 月 once
+   // Insert kanji with highest-frequency entry inserted after another character
+   stmts.kanjiUpsertStmt.run('月', 1_500, 1_500);
    stmts.kanjiUpsertStmt.run('日', 1_000, 2_000);
    stmts.kanjiUpsertStmt.run('日', 1_000, 3_000);
-   stmts.kanjiUpsertStmt.run('月', 1_500, 1_500);

    const rows = getKanjiStats(db, 10);

-   assert.equal(rows.length, 2);
-   assert.equal(rows[0]?.kanji, '日');
-   assert.equal(rows[1]?.kanji, '月');
+   assert.ok(rows.length >= 2);
+   const nichiRow = rows.find((r) => r.kanji === '日');
+   const tsukiRow = rows.find((r) => r.kanji === '月');
@@ -539,7 +560,16 @@ test('getSessionEvents returns empty array for session with no events', () => {

  try {
    ensureSchema(db);
-   const events = getSessionEvents(db, 9999, 50);
+   const videoId = getOrCreateVideoRecord(db, 'local:/tmp/events-empty.mkv', {
+     canonicalTitle: 'Events Empty',
+     sourcePath: '/tmp/events-empty.mkv',
+     sourceUrl: null,
+     sourceType: SOURCE_TYPE_LOCAL,
+   });
+   const { sessionId } = startSessionRecord(db, videoId, 6_000_000);
+
+   const events = getSessionEvents(db, sessionId, 50);
    assert.deepEqual(events, []);
  } finally {
    db.close();
@@ -547,6 +577,72 @@ test('getSessionEvents returns empty array for session with no events', () => {
  }
});

test('getSessionEvents filters events to the requested session id', () => {
  const dbPath = makeDbPath();
  const db = new Database(dbPath);

  try {
    ensureSchema(db);
    const stmts = createTrackerPreparedStatements(db);

    const decoyVideoId = getOrCreateVideoRecord(db, 'local:/tmp/events-filter-decoy.mkv', {
      canonicalTitle: 'Events Filter Decoy',
      sourcePath: '/tmp/events-filter-decoy.mkv',
      sourceUrl: null,
      sourceType: SOURCE_TYPE_LOCAL,
    });
    const targetVideoId = getOrCreateVideoRecord(db, 'local:/tmp/events-filter-target.mkv', {
      canonicalTitle: 'Events Filter Target',
      sourcePath: '/tmp/events-filter-target.mkv',
      sourceUrl: null,
      sourceType: SOURCE_TYPE_LOCAL,
    });

    const decoySession = startSessionRecord(db, decoyVideoId, 8_000_000);
    const targetSession = startSessionRecord(db, targetVideoId, 8_100_000);

    // Decoy session event
    stmts.eventInsertStmt.run(
      decoySession.sessionId,
      8_100_000 + 1,
      EVENT_SUBTITLE_LINE,
      1,
      0,
      500,
      1,
      0,
      '{"line":"decoy"}',
      8_100_000 + 1,
      8_100_000 + 1,
    );

    // Target session event
    stmts.eventInsertStmt.run(
      targetSession.sessionId,
      8_100_000 + 2,
      EVENT_SUBTITLE_LINE,
      2,
      0,
      600,
      1,
      0,
      '{"line":"target"}',
      8_100_000 + 2,
      8_100_000 + 2,
    );

    const events = getSessionEvents(db, targetSession.sessionId, 50);

    assert.equal(events.length, 1);
    assert.equal(events[0]?.payload, '{"line":"target"}');
    assert.equal(events[0]?.eventType, EVENT_SUBTITLE_LINE);
    assert.equal(events[0]?.tsMs, 8_100_002);
  } finally {
    db.close();
    cleanupDbPath(dbPath);
  }
});

test('getSessionEvents respects limit parameter', () => {
  const dbPath = makeDbPath();
  const db = new Database(dbPath);
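The ordering tests above pin down one property: rows come back sorted by frequency descending regardless of insert order. That aggregation can be sketched in isolation as count-then-sort (this is an illustrative reimplementation, not the SQL-backed `getVocabularyStats`):

```typescript
// Count occurrences per headword, then sort by count descending, mirroring
// the frequency-descending contract the tests assert.
function frequencyDesc(entries: string[]): Array<{ headword: string; frequency: number }> {
  const counts = new Map<string, number>();
  for (const e of entries) counts.set(e, (counts.get(e) ?? 0) + 1);
  return [...counts.entries()]
    .map(([headword, frequency]) => ({ headword, frequency }))
    .sort((a, b) => b.frequency - a.frequency);
}

// Insert order does not matter: 猫 appears twice, 犬 once.
const rows = frequencyDesc(['犬', '猫', '猫']);
console.log(rows[0].headword, rows[0].frequency); // 猫 2
```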
@@ -229,6 +229,13 @@ export function registerIpcHandlers(deps: IpcServiceDeps, ipc: IpcMainRegistrar
    return Math.min(value as number, maxValue);
  };

+  const parsePositiveInteger = (value: unknown): number | null => {
+    if (typeof value !== 'number' || !Number.isInteger(value) || value <= 0) {
+      return null;
+    }
+    return value;
+  };
+
  ipc.on(
    IPC_CHANNELS.command.setIgnoreMouseEvents,
    (event: unknown, ignore: unknown, options: unknown = {}) => {
@@ -474,18 +481,20 @@ export function registerIpcHandlers(deps: IpcServiceDeps, ipc: IpcMainRegistrar
  ipc.handle(
    IPC_CHANNELS.request.statsGetSessionTimeline,
    async (_event, sessionId: unknown, limit: unknown) => {
-     if (typeof sessionId !== 'number') return [];
+     const parsedSessionId = parsePositiveInteger(sessionId);
+     if (parsedSessionId === null) return [];
      const parsedLimit = parsePositiveIntLimit(limit, 200, 1000);
-     return deps.immersionTracker?.getSessionTimeline(sessionId, parsedLimit) ?? [];
+     return deps.immersionTracker?.getSessionTimeline(parsedSessionId, parsedLimit) ?? [];
    },
  );

  ipc.handle(
    IPC_CHANNELS.request.statsGetSessionEvents,
    async (_event, sessionId: unknown, limit: unknown) => {
-     if (typeof sessionId !== 'number') return [];
+     const parsedSessionId = parsePositiveInteger(sessionId);
+     if (parsedSessionId === null) return [];
      const parsedLimit = parsePositiveIntLimit(limit, 500, 1000);
-     return deps.immersionTracker?.getSessionEvents(sessionId, parsedLimit) ?? [];
+     return deps.immersionTracker?.getSessionEvents(parsedSessionId, parsedLimit) ?? [];
    },
  );
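The validation pair can be exercised standalone. `parsePositiveInteger` below is copied from the handler code; the `parsePositiveIntLimit` body is an assumption (only its `Math.min` clamp line appears in the hunk), sketched as "fall back to a default, cap at a max":

```typescript
// Session ids must be positive integers; anything else is rejected.
function parsePositiveInteger(value: unknown): number | null {
  if (typeof value !== 'number' || !Number.isInteger(value) || value <= 0) {
    return null;
  }
  return value;
}

// Assumed semantics for the limit helper: invalid input takes the default,
// and valid input is clamped to maxValue.
function parsePositiveIntLimit(value: unknown, defaultValue: number, maxValue: number): number {
  const parsed = parsePositiveInteger(value);
  return Math.min(parsed ?? defaultValue, maxValue);
}

console.log(parsePositiveInteger(-3)); // null
console.log(parsePositiveIntLimit(5000, 500, 1000)); // 1000
```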
@@ -37,6 +37,7 @@ const STATS_STATIC_CONTENT_TYPES: Record<string, string> = {
  '.woff': 'font/woff',
  '.woff2': 'font/woff2',
};
+const ANKI_CONNECT_FETCH_TIMEOUT_MS = 3_000;

function resolveStatsStaticPath(staticDir: string, requestPath: string): string | null {
  const normalizedPath = requestPath.replace(/^\/+/, '') || 'index.html';
@@ -130,7 +131,7 @@ export function createStatsApp(
  });

  app.get('/api/stats/sessions/:id/timeline', async (c) => {
-   const id = parseIntQuery(c.req.query('id') ?? c.req.param('id'), 0);
+   const id = parseIntQuery(c.req.param('id'), 0);
    if (id <= 0) return c.json([], 400);
    const limit = parseIntQuery(c.req.query('limit'), 200, 1000);
    const timeline = await tracker.getSessionTimeline(id, limit);
@@ -138,7 +139,7 @@ export function createStatsApp(
  });

  app.get('/api/stats/sessions/:id/events', async (c) => {
-   const id = parseIntQuery(c.req.query('id') ?? c.req.param('id'), 0);
+   const id = parseIntQuery(c.req.param('id'), 0);
    if (id <= 0) return c.json([], 400);
    const limit = parseIntQuery(c.req.query('limit'), 500, 1000);
    const events = await tracker.getSessionEvents(id, limit);
@@ -304,6 +305,7 @@ export function createStatsApp(
    const response = await fetch('http://127.0.0.1:8765', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
+     signal: AbortSignal.timeout(ANKI_CONNECT_FETCH_TIMEOUT_MS),
      body: JSON.stringify({ action: 'guiBrowse', version: 6, params: { query: `nid:${noteId}` } }),
    });
    const result = await response.json();
@@ -315,12 +317,17 @@ export function createStatsApp(

  app.post('/api/stats/anki/notesInfo', async (c) => {
    const body = await c.req.json().catch(() => null);
-   const noteIds = Array.isArray(body?.noteIds) ? body.noteIds.filter((id: unknown) => typeof id === 'number') : [];
+   const noteIds = Array.isArray(body?.noteIds)
+     ? body.noteIds.filter(
+         (id: unknown): id is number => typeof id === 'number' && Number.isInteger(id) && id > 0,
+       )
+     : [];
    if (noteIds.length === 0) return c.json([]);
    try {
      const response = await fetch('http://127.0.0.1:8765', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
+       signal: AbortSignal.timeout(ANKI_CONNECT_FETCH_TIMEOUT_MS),
        body: JSON.stringify({ action: 'notesInfo', version: 6, params: { notes: noteIds } }),
      });
      const result = await response.json() as { result?: Array<{ noteId: number; fields: Record<string, { value: string }> }> };
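The AnkiConnect calls now carry a 3-second timeout via `AbortSignal.timeout`, so a hung local Anki no longer stalls the stats API: `fetch` rejects with a `TimeoutError` once the signal aborts. A minimal demonstration of the signal behavior itself (requires Node 17.3+, where `AbortSignal.timeout` exists):

```typescript
// AbortSignal.timeout(ms) yields a signal that aborts after ms milliseconds;
// passing it to fetch turns a hung request into a prompt rejection.
const signal = AbortSignal.timeout(20);
console.log(signal.aborted); // false

setTimeout(() => {
  console.log(signal.aborted); // true once the timeout has fired
}, 60);
```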
@@ -36,7 +36,7 @@ export function buildStatsWindowOptions(options: {
    width: options.bounds?.width ?? DEFAULT_STATS_WINDOW_WIDTH,
    height: options.bounds?.height ?? DEFAULT_STATS_WINDOW_HEIGHT,
    frame: false,
-   transparent: false,
+   transparent: true,
    alwaysOnTop: true,
    resizable: false,
    skipTaskbar: true,
@@ -50,7 +50,7 @@ export function buildStatsWindowOptions(options: {
      nodeIntegration: false,
      contextIsolation: true,
      preload: options.preloadPath,
-     sandbox: false,
+     sandbox: true,
    },
  };
}
@@ -27,7 +27,7 @@ test('buildStatsWindowOptions uses tracked overlay bounds and preload-friendly w
  assert.equal(options.webPreferences?.preload, '/tmp/preload-stats.js');
  assert.equal(options.webPreferences?.contextIsolation, true);
  assert.equal(options.webPreferences?.nodeIntegration, false);
- assert.equal(options.webPreferences?.sandbox, false);
+ assert.equal(options.webPreferences?.sandbox, true);
});

test('shouldHideStatsWindowForInput matches Escape and configured bare toggle key', () => {
@@ -55,6 +55,58 @@ test('shouldHideStatsWindowForInput matches Escape and configured bare toggle ke
    true,
  );

  assert.equal(
    shouldHideStatsWindowForInput(
      {
        type: 'keyDown',
        key: '`',
        code: 'Backquote',
        control: true,
      } as Electron.Input,
      'Backquote',
    ),
    false,
  );

  assert.equal(
    shouldHideStatsWindowForInput(
      {
        type: 'keyDown',
        key: '`',
        code: 'Backquote',
        alt: true,
      } as Electron.Input,
      'Backquote',
    ),
    false,
  );

  assert.equal(
    shouldHideStatsWindowForInput(
      {
        type: 'keyDown',
        key: '`',
        code: 'Backquote',
        meta: true,
      } as Electron.Input,
      'Backquote',
    ),
    false,
  );

  assert.equal(
    shouldHideStatsWindowForInput(
      {
        type: 'keyDown',
        key: '`',
        code: 'Backquote',
        isAutoRepeat: true,
      } as Electron.Input,
      'Backquote',
    ),
    false,
  );

  assert.equal(
    shouldHideStatsWindowForInput(
      {
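The new cases pin down that modified or auto-repeated presses of the toggle key must not hide the window. A standalone sketch of such a predicate (a hypothetical reimplementation with an assumed input shape, not the project's `shouldHideStatsWindowForInput`):

```typescript
interface InputLike {
  type: string;
  key?: string;
  code?: string;
  control?: boolean;
  alt?: boolean;
  meta?: boolean;
  isAutoRepeat?: boolean;
}

// Hide on Escape or a bare press of the configured toggle key; ignore
// modified presses and key auto-repeat.
function shouldHide(input: InputLike, toggleCode: string): boolean {
  if (input.type !== 'keyDown' || input.isAutoRepeat) return false;
  if (input.key === 'Escape') return true;
  const modified = Boolean(input.control || input.alt || input.meta);
  return input.code === toggleCode && !modified;
}

console.log(shouldHide({ type: 'keyDown', code: 'Backquote' }, 'Backquote')); // true
console.log(shouldHide({ type: 'keyDown', code: 'Backquote', control: true }, 'Backquote')); // false
```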
31
src/main.ts
@@ -622,13 +622,19 @@ if (!fs.existsSync(USER_DATA_PATH)) {
app.setPath('userData', USER_DATA_PATH);

let forceQuitTimer: ReturnType<typeof setTimeout> | null = null;
+let statsServer: ReturnType<typeof startStatsServer> | null = null;

+function stopStatsServer(): void {
+  if (!statsServer) {
+    return;
+  }
+  statsServer.close();
+  statsServer = null;
+}
+
function requestAppQuit(): void {
  destroyStatsWindow();
- if (appState.statsServer) {
-   appState.statsServer.close();
-   appState.statsServer = null;
- }
+ stopStatsServer();
  if (!forceQuitTimer) {
    forceQuitTimer = setTimeout(() => {
      logger.warn('App quit timed out; forcing process exit.');
@@ -2376,6 +2382,8 @@ const {
  getSubtitleTimingTracker: () => appState.subtitleTimingTracker,
  getImmersionTracker: () => appState.immersionTracker,
  clearImmersionTracker: () => {
+   stopStatsServer();
+   appState.statsServer = null;
    appState.immersionTracker = null;
  },
  getAnkiIntegration: () => appState.ankiIntegration,
@@ -2427,13 +2435,15 @@ const ensureStatsServerStarted = (): string => {
  if (!tracker) {
    throw new Error('Immersion tracker failed to initialize.');
  }
- if (!appState.statsServer) {
-   appState.statsServer = startStatsServer({
+ if (!statsServer) {
+   statsServer = startStatsServer({
      port: getResolvedConfig().stats.serverPort,
      staticDir: statsDistPath,
      tracker,
    });
    appState.statsServer = statsServer;
  }
  appState.statsServer = statsServer;
  return `http://127.0.0.1:${getResolvedConfig().stats.serverPort}`;
};

@@ -2473,10 +2483,17 @@ const immersionTrackerStartupMainDeps: Parameters<
    resolveLegacyVocabularyPos,
  }),
  setTracker: (tracker) => {
+   const trackerHasChanged =
+     appState.immersionTracker !== null && appState.immersionTracker !== tracker;
+   if (trackerHasChanged && appState.statsServer) {
+     stopStatsServer();
+     appState.statsServer = null;
+   }
+
    appState.immersionTracker = tracker as ImmersionTrackerService | null;
    appState.immersionTracker?.setCoverArtFetcher(statsCoverArtFetcher);
    if (tracker) {
-     // Start HTTP stats server (once)
+     // Start HTTP stats server
      if (!appState.statsServer) {
        const config = getResolvedConfig();
        if (config.stats.autoStartServer) {
@@ -41,53 +41,38 @@ export function App() {
        <TabBar activeTab={activeTab} onTabChange={handleTabChange} />
      </header>
      <main className="flex-1 overflow-y-auto p-4">
-       <section
-         id="panel-overview"
-         role="tabpanel"
-         aria-labelledby="tab-overview"
-         hidden={activeTab !== 'overview'}
-       >
-         <OverviewTab />
-       </section>
-       <section
-         id="panel-anime"
-         role="tabpanel"
-         aria-labelledby="tab-anime"
-         hidden={activeTab !== 'anime'}
-       >
-         <AnimeTab
-           initialAnimeId={selectedAnimeId}
-           onClearInitialAnime={() => setSelectedAnimeId(null)}
-           onNavigateToWord={openWordDetail}
-         />
-       </section>
-       <section
-         id="panel-trends"
-         role="tabpanel"
-         aria-labelledby="tab-trends"
-         hidden={activeTab !== 'trends'}
-       >
-         <TrendsTab />
-       </section>
-       <section
-         id="panel-vocabulary"
-         role="tabpanel"
-         aria-labelledby="tab-vocabulary"
-         hidden={activeTab !== 'vocabulary'}
-       >
-         <VocabularyTab
-           onNavigateToAnime={navigateToAnime}
-           onOpenWordDetail={openWordDetail}
-         />
-       </section>
-       <section
-         id="panel-sessions"
-         role="tabpanel"
-         aria-labelledby="tab-sessions"
-         hidden={activeTab !== 'sessions'}
-       >
-         <SessionsTab />
-       </section>
+       {activeTab === 'overview' ? (
+         <section id="panel-overview" role="tabpanel" aria-labelledby="tab-overview">
+           <OverviewTab />
+         </section>
+       ) : null}
+       {activeTab === 'anime' ? (
+         <section id="panel-anime" role="tabpanel" aria-labelledby="tab-anime">
+           <AnimeTab
+             initialAnimeId={selectedAnimeId}
+             onClearInitialAnime={() => setSelectedAnimeId(null)}
+             onNavigateToWord={openWordDetail}
+           />
+         </section>
+       ) : null}
+       {activeTab === 'trends' ? (
+         <section id="panel-trends" role="tabpanel" aria-labelledby="tab-trends">
+           <TrendsTab />
+         </section>
+       ) : null}
+       {activeTab === 'vocabulary' ? (
+         <section id="panel-vocabulary" role="tabpanel" aria-labelledby="tab-vocabulary">
+           <VocabularyTab
+             onNavigateToAnime={navigateToAnime}
+             onOpenWordDetail={openWordDetail}
+           />
+         </section>
+       ) : null}
+       {activeTab === 'sessions' ? (
+         <section id="panel-sessions" role="tabpanel" aria-labelledby="tab-sessions">
+           <SessionsTab />
+         </section>
+       ) : null}
      </main>
      <WordDetailPanel
        wordId={globalWordId}
@@ -19,8 +19,20 @@ export function VocabularyTab({ onNavigateToAnime, onOpenWordDetail }: Vocabular
  const [selectedKanjiId, setSelectedKanjiId] = useState<number | null>(null);
  const [search, setSearch] = useState('');

- if (loading) return <div className="text-ctp-overlay2 p-4">Loading...</div>;
- if (error) return <div className="text-ctp-red p-4">Error: {error}</div>;
+ if (loading) {
+   return (
+     <div className="text-ctp-overlay2 p-4" role="status" aria-live="polite">
+       Loading...
+     </div>
+   );
+ }
+ if (error) {
+   return (
+     <div className="text-ctp-red p-4" role="alert" aria-live="assertive">
+       Error: {error}
+     </div>
+   );
+ }

  const summary = buildVocabularySummary(words, kanji);