Compare commits


1 Commit

Author SHA1 Message Date
35adf8299c Refactor startup, queries, and workflow into focused modules (#36)
* chore(backlog): add mining workflow milestone and tasks

* refactor: split character dictionary runtime modules

* refactor: split shared type entrypoints

* refactor: use bun serve for stats server

* feat: add repo-local subminer workflow plugin

* fix: add stats server node fallback

* refactor: split immersion tracker query modules

* chore: update backlog task records

* refactor: migrate shared type imports

* refactor: compose startup and setup window wiring

* Add backlog tasks and launcher time helper tests

- Track follow-up cleanup work in Backlog.md
- Replace Date.now usage with shared nowMs helper
- Add launcher args/parser and core regression tests

* test: increase launcher test timeout for CI stability

* fix: address CodeRabbit review feedback

* refactor(main): extract remaining inline runtime logic from main

* chore(backlog): update task notes and changelog fragment

* refactor: split main boot phases

* test: stabilize bun coverage reporting

* Switch plausible endpoint and harden coverage lane parsing

- update docs-site tracking to use the Plausible capture endpoint
- tighten coverage lane argument and LCOV parsing checks
- make script entrypoint use CommonJS main guard

* Restrict docs analytics and build coverage input

- limit Plausible init to docs.subminer.moe
- build Yomitan before src coverage lane

* fix(ci): normalize Windows shortcut paths for cross-platform tests

* Fix verification and immersion-tracker grouping

- isolate verifier artifacts and lease handling
- switch weekly/monthly tracker cutoffs to calendar boundaries
- tighten boot lifecycle and zip writer tests
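Calendar-boundary cutoffs of the kind mentioned above could be computed like this. This is only a sketch: the week-start convention (Monday) and local-midnight semantics are assumptions, not confirmed repo behavior.

```typescript
// Sketch only: assumes weeks start on Monday and cutoffs fall on
// local-time midnights; the tracker's actual convention may differ.
function startOfWeek(now: Date): Date {
  const d = new Date(now.getFullYear(), now.getMonth(), now.getDate());
  const daysSinceMonday = (d.getDay() + 6) % 7; // getDay(): Sunday = 0
  d.setDate(d.getDate() - daysSinceMonday);
  return d;
}

function startOfMonth(now: Date): Date {
  return new Date(now.getFullYear(), now.getMonth(), 1);
}
```

Unlike a rolling "last 7 days" window, these cutoffs snap to calendar boundaries, so weekly and monthly totals align with what users expect from a calendar.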

* fix: resolve CI type failures in boot and immersion query tests

* fix: remove strict spread usage in Date mocks

* fix: use explicit super args for MockDate constructors

* Factor out mock date helper in tracker tests

- reuse a shared `withMockDate` helper for date-sensitive query tests
- make monthly rollup assertions key off `videoId` instead of row order

* fix: use variadic array type for MockDate constructor args

TS2367: a fixed-length tuple type made the `args.length === 0` check unreachable.
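The two mock-date fixes above (a shared helper, plus variadic `number[]` constructor args with explicit super calls) might combine into something like this sketch; the repo's actual `withMockDate` signature is an assumption.

```typescript
// Sketch of a shared mock-date helper; names and shape are assumed.
function withMockDate<T>(fixedNowMs: number, fn: () => T): T {
  const RealDate = globalThis.Date;
  class MockDate extends RealDate {
    constructor(...args: number[]) {
      // Variadic number[] keeps args.length as plain `number`, so the
      // zero-args branch typechecks (a fixed tuple triggered TS2367).
      // Explicit per-arity super calls: spreading a plain array into
      // Date's overloaded constructor does not typecheck.
      if (args.length === 0) {
        super(fixedNowMs); // deterministic "now" for tests
      } else if (args.length === 1) {
        super(args[0]); // epoch-ms form
      } else {
        const [year = 0, month = 0, day = 1] = args;
        super(year, month, day); // preserve multi-arg construction
      }
    }
    static now(): number {
      return fixedNowMs;
    }
  }
  globalThis.Date = MockDate as unknown as DateConstructor;
  try {
    return fn();
  } finally {
    globalThis.Date = RealDate; // always restore the real Date
  }
}
```

The try/finally restore keeps one date-sensitive test from leaking a mocked clock into the next.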

* refactor: remove unused createMainBootRuntimes/Handlers aggregate functions

These functions were never called by production code — main.ts imports
the individual composeBoot* re-exports directly.

* refactor: remove boot re-export alias layer

main.ts now imports directly from the runtime/composers and runtime/domains
modules, eliminating the intermediate boot/ indirection.

* refactor: consolidate 3 near-identical setup window factories

Extract shared createSetupWindowHandler with a config parameter.
Public API unchanged.

* refactor: parameterize duplicated getAffected*Ids query helpers

Four structurally identical functions collapsed into two parameterized
helpers while preserving the existing public API.
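A parameterized collapse like the one described above might look as follows; the helper names, row shape, and media kinds here are illustrative assumptions, not the repo's real query code.

```typescript
// Hypothetical illustration of parameterizing near-identical helpers.
type SessionRow = { mediaId: string; mediaKind: "anime" | "youtube" };

function getAffectedIds(rows: SessionRow[], kind: SessionRow["mediaKind"]): string[] {
  // One parameterized helper replaces structurally identical per-kind copies.
  return [...new Set(rows.filter((r) => r.mediaKind === kind).map((r) => r.mediaId))];
}

// The existing public API survives as thin wrappers:
const getAffectedAnimeIds = (rows: SessionRow[]) => getAffectedIds(rows, "anime");
const getAffectedYoutubeIds = (rows: SessionRow[]) => getAffectedIds(rows, "youtube");
```

Keeping the wrappers means no call site changes, which is what lets the commit claim the public API is preserved.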

* refactor: inline identity composers (stats-startup, overlay-window)

composeStatsStartupRuntime was a no-op that returned its input.
composeOverlayWindowHandlers was a 1-line delegation.
Both removed in favor of direct usage.

* chore: remove unused token/queue file path constants from main.ts

* fix: replace any types in boot services with proper signatures

* refactor: deduplicate ensureDir into shared/fs-utils

5 copies of mkdir-p-if-not-exists consolidated into one shared module
with ensureDir (directory path) and ensureDirForFile (file path) variants.
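The two consolidated variants could be sketched like this; the names come from the commit message, but the exact signatures in `shared/fs-utils` are assumed.

```typescript
import { mkdirSync } from "node:fs";
import { dirname } from "node:path";

// Directory-path variant of the mkdir-p-if-not-exists helper.
export function ensureDir(dirPath: string): void {
  // recursive: true is a no-op when the directory already exists,
  // so callers never need a separate existence check.
  mkdirSync(dirPath, { recursive: true });
}

// File-path variant: creates the parent directory of a file.
export function ensureDirForFile(filePath: string): void {
  ensureDir(dirname(filePath));
}
```

Having both spellings in one module removes the temptation to re-implement the `dirname` dance at each call site that holds a file path.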

* fix: tighten type safety in boot services

- Add AppLifecycleShape and OverlayModalInputStateShape constraints
  so TAppLifecycleApp and TOverlayModalInputState generics are bounded
- Remove unsafe `as { handleModalInputStateChange? }` cast — now
  directly callable via the constraint
- Use `satisfies AppLifecycleShape` for structural validation on the
  appLifecycleApp object literal
- Document Electron App.on incompatibility with simple signatures

* refactor: inline subtitle-prefetch-runtime-composer

The composer was a pure pass-through that destructured an object and
reassembled it with the same fields. Inlined at the call site.

* chore: consolidate duplicate import paths in main.ts

* test: extract mpv composer test fixture factory to reduce duplication

* test: add behavioral assertions to composer tests

Upgrade 8 composer test files from shape-only typeof checks to behavioral
assertions that invoke returned handlers and verify injected dependencies are
actually called, following the mpv-runtime-composer pattern.
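The shape-only vs. behavioral distinction above can be sketched as follows; the composer, handler, and dependency names are illustrative, not the repo's actual APIs.

```typescript
// A minimal composer: wires an injected dependency into a handler.
type OverlayDeps = { showOverlay: () => void };

function composeOverlayHandlers(deps: OverlayDeps) {
  return { onToggle: () => deps.showOverlay() };
}

// Before: a shape-only check, e.g. typeof handlers.onToggle === "function".
// After: invoke the handler and verify the injected dependency actually ran.
let showCalls = 0;
const handlers = composeOverlayHandlers({
  showOverlay: () => {
    showCalls += 1;
  },
});
handlers.onToggle();
```

A shape-only check passes even if the handler body is wired to the wrong dependency; invoking it and counting calls catches that class of bug.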

* refactor: normalize import extensions in query modules

* refactor: consolidate toDbMs into query-shared.ts

* refactor: remove Node.js fallback from stats-server, use Bun only

* Fix monthly rollup test expectations

- Preserve multi-arg Date construction in mock helper
- Align rollup assertions with the correct videoId

* fix: address PR 36 CodeRabbit follow-ups

* fix: harden coverage lane cleanup

* fix(stats): fallback to node server when Bun.serve unavailable

* fix(ci): restore coverage lane compatibility

* chore(backlog): close TASK-242

* fix: address latest CodeRabbit review round

* fix: guard disabled immersion retention windows

* fix: migrate discord rpc wrapper

* fix(ci): add changelog fragment for PR 36

* fix: stabilize macOS visible overlay toggle

* fix: pin installed mpv plugin to current binary

* fix: strip inline subtitle markup from sidebar cues

* fix(renderer): restore subtitle sidebar mpv passthrough

* feat(discord): add configurable presence style presets

Replace the hardcoded "Mining and crafting (Anki cards)" meme message
with a preset system. New `discordPresence.presenceStyle` option
supports four presets: "default" (clean bilingual), "meme" (the OG
Minecraft joke), "japanese" (fully JP), and "minimal". The default
preset shows "Sentence Mining" with 日本語学習中 as the small image
tooltip. Existing users can set presenceStyle to "meme" to keep the
old behavior.
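A preset table for the option described above might be sketched like this. The "default" and "meme" strings are quoted from the commit message; the "japanese" and "minimal" entries here are placeholders, not the shipped values.

```typescript
type PresenceStyle = "default" | "meme" | "japanese" | "minimal";

interface PresencePreset {
  details: string;
  smallImageText?: string;
}

const PRESENCE_PRESETS: Record<PresenceStyle, PresencePreset> = {
  default: { details: "Sentence Mining", smallImageText: "日本語学習中" },
  meme: { details: "Mining and crafting (Anki cards)" }, // the OG joke
  japanese: { details: "文章マイニング中" }, // placeholder JP wording
  minimal: { details: "SubMiner" }, // placeholder
};

function resolvePresencePreset(style?: PresenceStyle): PresencePreset {
  // Unset config falls back to the clean bilingual default.
  return PRESENCE_PRESETS[style ?? "default"];
}
```

A lookup table like this keeps adding a fifth preset to a one-line change plus a union-member addition that the compiler enforces exhaustively.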

* fix: finalize v0.10.0 release prep

* docs: add subtitle sidebar guide and release note

* chore(backlog): mark docs task done

* fix: lazily resolve youtube playback socket path

* chore(release): build v0.10.0 changelog

* Revert "chore(release): build v0.10.0 changelog"

This reverts commit 9741c0f020.
2026-03-29 16:16:29 -07:00
91 changed files with 2727 additions and 1047 deletions

4
.gitignore vendored
View File

@@ -9,13 +9,11 @@ out/
dist/
release/
build/yomitan/
coverage/
# Launcher build artifact (produced by make build-launcher)
/subminer
# Coverage
coverage/
# Logs
*.log
npm-debug.log*

View File

@@ -20,7 +20,7 @@ Priority keys:
| ID | Pri | Status | Area | Title |
| ------ | --- | ------ | -------------- | --------------------------------------------------- |
-| SM-013 | P1 | doing | review-followup | Address PR #36 CodeRabbit action items |
+| SM-013 | P1 | done | review-followup | Address PR #36 CodeRabbit action items |
## Ready
@@ -241,7 +241,7 @@ Done:
Title: Address PR #36 CodeRabbit action items
Priority: P1
-Status: doing
+Status: done
Scope:
- `plugins/subminer-workflow/skills/subminer-change-verification/scripts/verify_subminer_change.sh`
@@ -251,7 +251,16 @@ Scope:
- `src/core/services/immersion-tracker/maintenance.ts`
- `src/main/boot/services.ts`
- `src/main/character-dictionary-runtime/zip.test.ts`
Acceptance:
Acceptance:
- fix valid open CodeRabbit findings on PR #36
- add focused regression coverage for behavior changes where practical
- verify touched tests plus typecheck stay green
Done:
- hardened `--artifact-dir` validation in the verification script
- fixed trend aggregation rounding and monthly ratio bucketing
- preserved unwatched anime episodes in episode queries
- restored seconds-based aggregate timestamps in shared maintenance
- fixed the startup refactor compile break by making the predicates local at the call site
- verified with `bun test src/core/services/immersion-tracker/__tests__/query.test.ts src/core/services/immersion-tracker/__tests__/query-split-modules.test.ts` and `bun run typecheck`

View File

@@ -1,5 +1,26 @@
# Changelog
## v0.10.0 (2026-03-29)
### Changed
- Integrations: Replaced the deprecated Discord Rich Presence wrapper with the maintained `@xhayper/discord-rpc` package.
### Fixed
- Stats: Fixed stats startup so the immersion tracker can run when `Bun.serve` is unavailable.
- Stats: Stats server now falls back to a Node `http` listener in Electron/runtime paths that do not expose Bun.
- Overlay: Fixed the macOS visible-overlay toggle path so manual hides stay hidden and the plugin uses the explicit visible-overlay toggle command.
- Subtitle Sidebar: Restored macOS mpv passthrough while the overlay subtitle sidebar is open so clicks outside the sidebar can refocus mpv and keep native keybindings working.
### Internal
- Release: Added a maintained source coverage lane that shards Bun coverage one test file at a time and merges LCOV output into `coverage/test-src/lcov.info`.
- Release: CI and release quality-gate now upload the merged source-lane LCOV artifact for inspection.
- Runtime: Extracted remaining inline runtime logic from `src/main.ts` into dedicated runtime modules and composer helpers.
- Runtime: Added focused regression tests for the extracted runtime/composer boundaries.
- Runtime: Updated task tracking notes to mark TASK-238.6 complete and confirm follow-on boot-phase split can be deferred.
- Runtime: Split `src/main.ts` boot wiring into dedicated `src/main/boot/services.ts`, `src/main/boot/runtimes.ts`, and `src/main/boot/handlers.ts` modules.
- Runtime: Added focused tests for the new boot-phase seams and kept the startup/typecheck/build verification lanes green.
- Runtime: Updated internal architecture/task docs to record the boot-phase split and new ownership boundary.
## v0.9.3 (2026-03-25)
### Changed

View File

@@ -0,0 +1,35 @@
---
id: TASK-243
title: 'Assess and address PR #36 latest CodeRabbit review round'
status: Done
assignee: []
created_date: '2026-03-29 07:39'
updated_date: '2026-03-29 07:41'
labels:
- code-review
- pr-36
dependencies: []
references:
- 'https://github.com/ksyasuda/SubMiner/pull/36'
priority: high
ordinal: 3600
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Inspect the latest CodeRabbit review round on PR #36, verify each actionable comment against the current branch, implement the confirmed fixes, and verify the touched paths.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 Confirmed review comments are implemented or explicitly deferred with rationale.
- [ ] #2 Touched paths are verified with the smallest sufficient test/build lane.
- [ ] #3 Current PR feedback is reduced to resolved or intentionally deferred suggestions.
<!-- AC:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Addressed the confirmed latest CodeRabbit review items on PR #36. `scripts/run-coverage-lane.ts` now uses the Bun-style `import.meta.main` entrypoint check with a local ts-ignore to preserve the repo's CommonJS typecheck settings. `src/core/services/immersion-tracker/maintenance.ts` no longer shadows the imported `nowMs` helper in retention functions. `src/main.ts` now centralizes the startup-mode predicates behind a shared helper and releases `resolvedSource.cleanup` on the cached-subtitle fast path so materialized sources do not leak.
<!-- SECTION:FINAL_SUMMARY:END -->

View File

@@ -0,0 +1,35 @@
---
id: TASK-244
title: 'Assess and address PR #36 latest CodeRabbit review round 2'
status: Done
assignee: []
created_date: '2026-03-29 08:09'
updated_date: '2026-03-29 08:10'
labels:
- code-review
- pr-36
dependencies: []
references:
- 'https://github.com/ksyasuda/SubMiner/pull/36'
priority: high
ordinal: 3610
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Inspect the newest CodeRabbit review round on PR #36, verify the actionable comment against the current branch, implement the confirmed fix, and verify the touched path.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 The actionable review comment is implemented or explicitly deferred with rationale.
- [ ] #2 Touched path is verified with the smallest sufficient test lane.
- [ ] #3 Current PR feedback is reduced to resolved or intentionally deferred suggestions.
<!-- AC:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Addressed the actionable latest CodeRabbit comment on PR #36. `src/core/services/immersion-tracker/maintenance.ts` now skips retention deletions when a window is disabled with `Infinity`, so `toDbMs(...)` is only called for finite retention values. Added a regression test in `maintenance.test.ts` that verifies disabled retention windows preserve session events, telemetry, and sessions while returning zero deletions.
<!-- SECTION:FINAL_SUMMARY:END -->
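The disabled-window guard summarized above might be sketched as follows; `toDbMs` and the row shape are illustrative stand-ins, not the real `maintenance.ts` code.

```typescript
// Assumed seconds-based DB timestamps, per the task record above.
function toDbMs(ms: number): number {
  return Math.floor(ms / 1000);
}

function pruneOlderThan(
  retentionMs: number,
  nowMs: number,
  rowTimestamps: number[],
): { kept: number[]; deleted: number } {
  // A window disabled with Infinity skips deletion entirely, so
  // toDbMs() is only ever called with finite values.
  if (!Number.isFinite(retentionMs)) {
    return { kept: rowTimestamps, deleted: 0 };
  }
  const cutoff = toDbMs(nowMs - retentionMs);
  const kept = rowTimestamps.filter((ts) => ts >= cutoff);
  return { kept, deleted: rowTimestamps.length - kept.length };
}
```

Without the finiteness guard, `nowMs - Infinity` produces `-Infinity`, and the cutoff arithmetic would silently delete (or keep) everything depending on comparison direction.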

View File

@@ -1,7 +1,7 @@
---
id: TASK-238.1
title: Extract main-window and overlay-window composition from src/main.ts
-status: Done
+status: To Do
assignee: []
created_date: '2026-03-26 20:49'
labels:
@@ -29,10 +29,10 @@ priority: high
## Acceptance Criteria
<!-- AC:BEGIN -->
-- [x] #1 At least the main overlay window path plus two other window/setup flows are extracted from direct `BrowserWindow` construction inside `src/main.ts`.
-- [x] #2 The extracted modules expose narrow factory/handler APIs that can be tested without booting the whole app.
-- [x] #3 `src/main.ts` becomes materially smaller and easier to scan, with window creation concentrated behind well-named runtime surfaces.
-- [x] #4 Relevant runtime/window tests pass, and new tests are added for any newly isolated window composition helpers.
+- [ ] #1 At least the main overlay window path plus two other window/setup flows are extracted from direct `BrowserWindow` construction inside `src/main.ts`.
+- [ ] #2 The extracted modules expose narrow factory/handler APIs that can be tested without booting the whole app.
+- [ ] #3 `src/main.ts` becomes materially smaller and easier to scan, with window creation concentrated behind well-named runtime surfaces.
+- [ ] #4 Relevant runtime/window tests pass, and new tests are added for any newly isolated window composition helpers.
<!-- AC:END -->
## Implementation Plan
@@ -43,11 +43,3 @@ priority: high
3. Update the composition root to consume the new modules and keep side effects/app state ownership explicit.
4. Verify with focused runtime/window tests plus `bun run typecheck`.
<!-- SECTION:PLAN:END -->
## Completion Notes
- Window composition now flows through `src/main/runtime/setup-window-factory.ts` and `src/main/runtime/overlay-window-factory.ts`, with `src/main/runtime/overlay-window-runtime-handlers.ts` composing the main/modal overlay entrypoints.
- `src/main.ts` keeps dependency wiring and state ownership, while the named runtime helpers own the reusable window-creation surfaces.
- Verification:
- `bun test src/main/runtime/overlay-window-factory.test.ts src/main/runtime/overlay-window-runtime-handlers.test.ts`
- `bun run typecheck` failed on unrelated existing errors in `src/core/services/immersion-tracker/lifetime.ts`, `src/core/services/immersion-tracker/maintenance.ts`, and `src/core/services/stats-server.ts`

View File

@@ -1,7 +1,7 @@
---
id: TASK-238.2
title: Extract CLI and headless command wiring from src/main.ts
-status: Done
+status: To Do
assignee: []
created_date: '2026-03-26 20:49'
labels:
@@ -30,10 +30,10 @@ priority: high
## Acceptance Criteria
<!-- AC:BEGIN -->
-- [x] #1 CLI parsing, initial-command dispatch, and headless command execution no longer live as large inline flows in `src/main.ts`.
-- [x] #2 The new modules make the desktop startup path and headless startup path visibly separate and easier to test.
-- [x] #3 Existing CLI behaviors remain unchanged, including help output and startup gating behavior.
-- [x] #4 Targeted CLI/runtime tests cover the extracted path, and `bun run typecheck` passes.
+- [ ] #1 CLI parsing, initial-command dispatch, and headless command execution no longer live as large inline flows in `src/main.ts`.
+- [ ] #2 The new modules make the desktop startup path and headless startup path visibly separate and easier to test.
+- [ ] #3 Existing CLI behaviors remain unchanged, including help output and startup gating behavior.
+- [ ] #4 Targeted CLI/runtime tests cover the extracted path, and `bun run typecheck` passes.
<!-- AC:END -->
## Implementation Plan
@@ -44,11 +44,3 @@ priority: high
3. Keep Electron app ownership in `src/main.ts`; move only CLI orchestration and context assembly.
4. Verify with CLI-focused tests plus `bun run typecheck`.
<!-- SECTION:PLAN:END -->
## Completion Notes
- CLI and headless startup wiring now lives behind `src/main/runtime/composers/cli-startup-composer.ts`, `src/main/runtime/cli-command-runtime-handler.ts`, `src/main/runtime/initial-args-handler.ts`, and `src/main/runtime/composers/headless-startup-composer.ts`.
- `src/main.ts` now passes CLI/context dependencies into those runtime surfaces instead of holding the full orchestration inline.
- Verification:
- `bun test src/main/runtime/composers/cli-startup-composer.test.ts src/main/runtime/initial-args-handler.test.ts src/main/runtime/cli-command-runtime-handler.test.ts`
- `bun run typecheck` failed on unrelated existing errors in `src/core/services/immersion-tracker/lifetime.ts`, `src/core/services/immersion-tracker/maintenance.ts`, and `src/core/services/stats-server.ts`

View File

@@ -0,0 +1,35 @@
---
id: TASK-242
title: Fix stats server Bun fallback in coverage lane
status: Done
assignee: []
created_date: '2026-03-29 07:31'
updated_date: '2026-03-29 07:37'
labels:
- ci
- bug
milestone: cleanup
dependencies: []
references:
- 'PR #36'
priority: high
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Coverage CI fails when `startStatsServer` reaches the Bun server seam under the maintained source lane. Add a runtime fallback that works when `Bun.serve` is unavailable and keep the stats-server startup path testable.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 `bun run test:coverage:src` passes in GitHub CI
- [x] #2 `startStatsServer` uses `Bun.serve` when present and a Node server fallback otherwise
- [x] #3 Regression coverage exists for the fallback startup path
<!-- AC:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Fixed the CI failure in the coverage lane by replacing the Bun-only stats server path with a Bun-or-Node `http` startup fallback and by normalizing setup window options so undefined BrowserWindow fields are omitted. Verified the exact coverage lane under Bun 1.3.5 and confirmed the GitHub Actions run for PR #36 completed successfully.
<!-- SECTION:FINAL_SUMMARY:END -->

View File

@@ -0,0 +1,68 @@
---
id: TASK-245
title: Cut minor release v0.10.0 for docs and release prep
status: Done
assignee:
- '@codex'
created_date: '2026-03-29 08:10'
updated_date: '2026-03-29 08:13'
labels:
- release
- docs
- minor
dependencies: []
references:
- /home/sudacode/projects/japanese/SubMiner/package.json
- /home/sudacode/projects/japanese/SubMiner/README.md
- /home/sudacode/projects/japanese/SubMiner/docs/RELEASING.md
- /home/sudacode/projects/japanese/SubMiner/docs/README.md
- /home/sudacode/projects/japanese/SubMiner/docs-site/changelog.md
- /home/sudacode/projects/japanese/SubMiner/CHANGELOG.md
- /home/sudacode/projects/japanese/SubMiner/release/release-notes.md
priority: high
ordinal: 54850
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Prepare the next 0-ver minor release cut as `v0.10.0`, keeping release-facing docs, backlog, and changelog artifacts aligned, then run the release-prep verification gate.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Repository version metadata is updated to `0.10.0`.
- [x] #2 Release-facing docs and public changelog surfaces are aligned for the `v0.10.0` cut.
- [x] #3 `CHANGELOG.md` and `release/release-notes.md` contain the committed `v0.10.0` section and any consumed fragments are removed.
- [x] #4 Release-prep verification passes for changelog, config example, typecheck, tests, and build.
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Completed:
- Bumped `package.json` from `0.9.3` to `0.10.0`.
- Ran `bun run changelog:build --version 0.10.0 --date 2026-03-29`, which generated `CHANGELOG.md` and `release/release-notes.md` and removed the queued `changes/*.md` fragments.
- Updated `docs-site/changelog.md` with the public-facing `v0.10.0` summary.
Verification:
- `bun run changelog:lint`
- `bun run changelog:check --version 0.10.0`
- `bun run verify:config-example`
- `bun run typecheck`
- `bunx bun@1.3.5 run test:fast`
- `bunx bun@1.3.5 run test:env`
- `bunx bun@1.3.5 run build`
- `bunx bun@1.3.5 run docs:test`
- `bunx bun@1.3.5 run docs:build`
Notes:
- The local `bun` binary is `1.3.11`, which tripped Bun's nested `node:test` handling in `test:fast`; rerunning with the repo-pinned `bun@1.3.5` cleared the issue.
- No README content change was necessary for this cut.
<!-- SECTION:NOTES:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Prepared the `v0.10.0` release cut locally. Bumped `package.json`, generated committed root changelog and release notes, updated the public docs changelog summary, and verified the release gate with the repo-pinned Bun `1.3.5` runtime. The release prep is green and ready for tagging/publishing when desired.
<!-- SECTION:FINAL_SUMMARY:END -->

View File

@@ -0,0 +1,55 @@
---
id: TASK-246
title: Migrate Discord Rich Presence to maintained RPC wrapper
status: Done
assignee: []
created_date: '2026-03-29 08:17'
updated_date: '2026-03-29 08:22'
labels:
- dependency
- discord
- presence
dependencies: []
priority: medium
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Replace the deprecated Discord Rich Presence wrapper with a maintained JavaScript alternative while preserving the current IPC-based presence behavior in the Electron main process.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 The app no longer depends on `discord-rpc`
- [x] #2 Discord Rich Presence still logs in and publishes activity updates from the main process
- [x] #3 Existing Discord presence tests continue to pass or are updated to cover the new client API
- [x] #4 The change is documented in the release notes or changelog fragment
<!-- AC:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Completed:
- Swapped the app's Discord RPC dependency from `discord-rpc` to `@xhayper/discord-rpc`.
- Extracted the client adapter into `src/main/runtime/discord-rpc-client.ts` so the main process can keep using a small wrapper around the maintained library.
- Added `src/main/runtime/discord-rpc-client.test.ts` to verify the adapter forwards login/activity/clear/destroy calls through `client.user`.
- Documented the dependency swap in `CHANGELOG.md`, `release/release-notes.md`, and `docs-site/changelog.md`.
Verification:
- `bunx bun@1.3.5 test src/main/runtime/discord-rpc-client.test.ts src/core/services/discord-presence.test.ts`
- `bunx bun@1.3.5 run changelog:lint`
- `bunx bun@1.3.5 run changelog:check --version 0.10.0`
- `bunx bun@1.3.5 run docs:test`
- `bunx bun@1.3.5 run docs:build`
Notes:
- The existing release prep artifacts for v0.10.0 were kept intact and updated in place.
- No README change was needed for this dependency swap.
<!-- SECTION:NOTES:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Replaced the deprecated `discord-rpc` dependency with the maintained `@xhayper/discord-rpc` wrapper while preserving the main-process rich presence flow. Added a focused runtime wrapper test, kept the existing Discord presence service tests green, and documented the dependency swap in the release notes and changelog.
<!-- SECTION:FINAL_SUMMARY:END -->

View File

@@ -0,0 +1,60 @@
---
id: TASK-247
title: Strip inline subtitle markup from subtitle sidebar cues
status: Done
assignee:
- codex
created_date: '2026-03-29 10:01'
updated_date: '2026-03-29 10:10'
labels: []
dependencies: []
references:
- src/core/services/subtitle-cue-parser.ts
- src/renderer/modals/subtitle-sidebar.ts
- src/core/services/subtitle-cue-parser.test.ts
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Subtitle sidebar should display readable subtitle text when loaded subtitle files include inline markup such as HTML-like font tags. Parsed cue text currently preserves markup, causing raw tags to appear in the sidebar instead of clean subtitle content.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Subtitle sidebar cue text omits inline subtitle markup such as HTML-like font tags while preserving visible subtitle content.
- [x] #2 Parsed subtitle cues used by the sidebar keep timing order and expected line-break behavior after markup sanitization.
- [x] #3 Regression tests cover markup-bearing subtitle cue parsing so raw tags do not reappear in the sidebar.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
1. Add regression tests in src/core/services/subtitle-cue-parser.test.ts for subtitle cues containing HTML-like font tags, including multi-line content.
2. Verify the new parser test fails against current behavior to confirm the bug is covered.
3. Update src/core/services/subtitle-cue-parser.ts to sanitize inline subtitle markup while preserving visible text and expected newline handling.
4. Re-run focused parser tests, then run broader verification commands required for handoff as practical.
5. Update task notes/acceptance criteria based on verified results and finalize the task record.
<!-- SECTION:PLAN:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
User approved implementation on 2026-03-29.
Implemented parser-level subtitle cue sanitization for HTML-like tags so loaded sidebar cues render readable text while preserving cue line breaks.
Added regression coverage for SRT and ASS cue parsing with <font ...> markup.
Verification: bun test src/core/services/subtitle-cue-parser.test.ts; bun run typecheck; bun run test:fast; bun run test:env; bun run build; bun run test:smoke:dist.
<!-- SECTION:NOTES:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Sanitized parsed subtitle cue text in src/core/services/subtitle-cue-parser.ts so HTML-like inline markup such as <font ...> is removed before cues reach the subtitle sidebar. The sanitizer is shared across SRT/VTT-style parsing and ASS parsing, while existing cue timing and line-break semantics remain intact.
Added regression tests in src/core/services/subtitle-cue-parser.test.ts covering markup-bearing SRT lines and ASS dialogue lines with \N breaks, and verified the original failure before implementing the fix.
Tests run: bun test src/core/services/subtitle-cue-parser.test.ts; bun run typecheck; bun run test:fast; bun run test:env; bun run build; bun run test:smoke:dist.
<!-- SECTION:FINAL_SUMMARY:END -->
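A minimal sketch of the sanitization summarized above; the real parser in `subtitle-cue-parser.ts` is more involved (shared across the SRT/VTT and ASS paths), and these regexes are illustrative assumptions.

```typescript
function stripInlineMarkup(cueText: string): string {
  return cueText
    .replace(/\\N/g, "\n") // ASS hard line break becomes a real newline
    .replace(/<\/?[a-zA-Z][^>]*>/g, ""); // drop HTML-like tags, e.g. <font ...>
}
```

Stripping at the parser level, rather than in the sidebar renderer, means every consumer of parsed cues sees clean text and timing/line-break semantics stay in one place.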

View File

@@ -0,0 +1,69 @@
---
id: TASK-248
title: Fix macOS visible overlay toggle getting immediately restored
status: Done
assignee: []
created_date: '2026-03-29 10:03'
updated_date: '2026-03-29 22:14'
labels: []
dependencies: []
references:
- /Users/sudacode/projects/japanese/SubMiner/plugin/subminer/process.lua
- /Users/sudacode/projects/japanese/SubMiner/plugin/subminer/ui.lua
- /Users/sudacode/projects/japanese/SubMiner/src/core/services/cli-command.ts
- /Users/sudacode/projects/japanese/SubMiner/src/main/overlay-visibility-runtime.ts
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Investigate and fix the visible overlay toggle path on macOS so the user can reliably hide the overlay after it has been shown. The current behavior can ignore the toggle or hide the overlay briefly before it is restored immediately.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Pressing the visible-overlay toggle hides the overlay when it is currently shown on macOS.
- [x] #2 A manual hide is not immediately undone by startup or readiness flows.
- [x] #3 The mpv/plugin toggle path matches the intended visible-overlay toggle behavior.
- [x] #4 Regression tests cover the failing toggle path.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
1. Reproduce the toggle/re-show logic from code paths around mpv plugin control commands and auto-play readiness.
2. Add regression coverage for manual toggle-off staying hidden through readiness completion.
3. Patch the plugin/control path so manual visible-overlay toggles are not undone by readiness auto-show.
4. Run targeted tests, then the relevant verification lane.
<!-- SECTION:PLAN:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Root cause: the mpv plugin readiness callback (`subminer-autoplay-ready`) could re-issue `--show-visible-overlay` after a manual toggle/hide. Initial fix only suppressed the next readiness restore, but repeated readiness callbacks in the same media session could still re-show the overlay. The plugin toggle path also still used legacy `--toggle` instead of the explicit visible-overlay command.
Implemented a session-scoped suppression flag in the Lua plugin so a manual hide/toggle during the pause-until-ready window blocks readiness auto-show for the rest of the current auto-start session, then resets on the next auto-start session.
Added Lua regression coverage for both behaviors: manual toggle-off stays hidden through readiness completion, repeated readiness callbacks in the same session stay suppressed, and `subminer-toggle` emits `--toggle-visible-overlay` rather than legacy `--toggle`.
Follow-up investigation found a second issue in `src/core/services/cli-command.ts`: pure visible-overlay toggle commands still ran the MPV connect/start path (`connectMpvClient`) because `--toggle` and `--toggle-visible-overlay` were classified as start-like commands. That side effect could retrigger startup visibility work even after the plugin-side fix.
Updated CLI command handling so only `--start` reconnects MPV. Pure toggle/show/hide overlay commands still initialize overlay runtime when needed, but they no longer restart/reconnect the MPV control path.
Renderer/modal follow-ups: restored focused-overlay mpv y-chord proxy in `src/renderer/handlers/keyboard.ts`, added a modal-close guard in `src/main/overlay-runtime.ts` so modal teardown does not re-show a manually hidden overlay, and added a duplicate-toggle debounce in `src/main/runtime/overlay-visibility-actions.ts` to ignore near-simultaneous toggle requests inside the main process.
2026-03-29: added regression for repeated subminer-autoplay-ready signals after manual y-t hide. Root cause: Lua plugin suppression only blocked the first ready-time restore, so later ready callbacks in the same media session could re-show the visible overlay. Updated plugin suppression to remain active for the full current auto-start session and reset on the next auto-start trigger.
2026-03-29: live mpv log showed repeated `subminer-autoplay-ready` script messages from Electron during paused startup, each triggering plugin `--show-visible-overlay` and immediate re-show. Fixed `src/main/runtime/autoplay-ready-gate.ts` so plugin readiness is signaled once per media while paused retry loops only re-issue `pause=false` instead of re-signaling readiness.
2026-03-29: Added window-level guard for stray visible-overlay re-show on macOS. `src/core/services/overlay-window.ts` now immediately re-hides the visible overlay window on `show` if overlay state is false, covering native/Electron re-show paths that bypass normal visibility actions. Regression: `src/core/services/overlay-window.test.ts`. Verified with full gate and rebuilt unsigned mac bundle.
2026-03-29: added a blur-path guard for the visible overlay window. `src/core/services/overlay-window.ts` now skips topmost restacking when a visible-overlay blur fires after overlay state already flipped off, covering a macOS hide-in-flight path that could immediately reassert the window. Regression coverage added in `src/core/services/overlay-window.test.ts`; verified with targeted overlay tests, full gate, and rebuilt unsigned mac bundle.
<!-- SECTION:NOTES:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Confirmed with user that macOS `y-t` now works. Cleaned the patch set down to the remaining justified fixes: explicit visible-overlay plugin toggle/suppression, pure-toggle CLI no longer reconnects MPV, autoplay-ready signaling only fires once per media, and the final visible-overlay blur guard that stops macOS restacking after a manual hide. Full gate passed again before commit `c939c580` (`fix: stabilize macOS visible overlay toggle`).
<!-- SECTION:FINAL_SUMMARY:END -->


@@ -0,0 +1,37 @@
---
id: TASK-249
title: Fix AniList token persistence on setup login
status: Done
assignee: []
created_date: '2026-03-29 10:08'
updated_date: '2026-03-29 19:42'
labels:
- anilist
- bug
dependencies: []
documentation:
- src/main/runtime/anilist-setup.ts
- src/core/services/anilist/anilist-token-store.ts
- src/main/runtime/anilist-token-refresh.ts
- docs-site/anilist-integration.md
priority: high
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
AniList setup can appear successful but the token is not persisted across restarts. Investigate the setup callback and token store path so the app either saves the token reliably or surfaces persistence failure instead of reopening setup on every launch.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [ ] #1 AniList setup login persists a usable token across app restarts when safeStorage works
- [ ] #2 If token persistence fails the setup flow reports the failure instead of pretending login succeeded
- [ ] #3 Regression coverage exists for the callback/save path and the refresh path that reopens setup when no token is available
<!-- AC:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Pinned installed mpv plugin configs to the current SubMiner binary so standalone mpv launches reuse the same app identity that saved AniList tokens. Added startup self-heal for existing blank binary_path configs, install-time binary_path writes for fresh plugin installs, regression tests for both paths, and docs updates describing the new behavior.
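The startup self-heal for blank `binary_path` entries can be sketched roughly as below, assuming a simple `key=value` plugin config format; the real config layout and helper names in SubMiner may differ:

```typescript
// Rewrite a blank binary_path to the current SubMiner binary, append one if
// missing, and leave an explicit user override untouched.
function healBinaryPath(configText: string, currentBinary: string): string {
  let found = false;
  const healed = configText.split("\n").map((line) => {
    const match = line.match(/^binary_path\s*=\s*(.*)$/);
    if (!match) return line;
    found = true;
    // Only self-heal blank values; respect a user-set path.
    return match[1].trim() === "" ? `binary_path=${currentBinary}` : line;
  });
  if (!found) healed.push(`binary_path=${currentBinary}`);
  return healed.join("\n");
}
```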
<!-- SECTION:FINAL_SUMMARY:END -->


@@ -0,0 +1,72 @@
---
id: TASK-250
title: Restore macOS mpv passthrough while overlay subtitle sidebar is open
status: Done
assignee:
- '@codex'
created_date: '2026-03-29 10:10'
updated_date: '2026-03-29 10:23'
labels:
- bug
- macos
- subtitle-sidebar
- overlay
- mpv
dependencies: []
references:
- >-
/Users/sudacode/projects/japanese/SubMiner/src/renderer/overlay-mouse-ignore.ts
- >-
/Users/sudacode/projects/japanese/SubMiner/src/renderer/modals/subtitle-sidebar.ts
- /Users/sudacode/projects/japanese/SubMiner/src/renderer/handlers/keyboard.ts
- >-
/Users/sudacode/projects/japanese/SubMiner/src/renderer/modals/subtitle-sidebar.test.ts
- >-
/Users/sudacode/projects/japanese/SubMiner/src/renderer/overlay-mouse-ignore.test.ts
priority: high
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
When the overlay-layout subtitle sidebar is open on macOS, users should still be able to click through outside the sidebar and return keyboard focus to mpv so native mpv keybindings continue to work. The sidebar should stay interactive when hovered or focused, but it must not make the whole visible overlay behave like a blocking modal.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 Opening the overlay-layout subtitle sidebar does not keep the entire visible overlay mouse-interactive outside sidebar hover or focus.
- [x] #2 With the subtitle sidebar open, clicking outside the sidebar can refocus mpv so native mpv keybindings continue to work.
- [x] #3 Focused regression coverage exists for overlay-layout sidebar passthrough behavior on mouse-ignore state changes.
<!-- AC:END -->
## Implementation Plan
<!-- SECTION:PLAN:BEGIN -->
1. Add renderer regression coverage for overlay-layout subtitle sidebar passthrough so open-but-unhovered sidebar no longer holds global mouse interaction.
2. Update overlay mouse-ignore gating to keep the subtitle sidebar interactive only while hovered or otherwise actively interacting, instead of treating overlay layout as a blocking modal.
3. Run focused renderer tests for subtitle sidebar and mouse-ignore behavior, then update task notes/criteria with the verified outcome.
<!-- SECTION:PLAN:END -->
## Implementation Notes
<!-- SECTION:NOTES:BEGIN -->
Confirmed the regression only affects the default overlay-layout subtitle sidebar: open sidebar state was treated as a blocking overlay modal, which prevented click-through outside the sidebar and stranded native mpv keybindings until focus was manually recovered.
Added a failing regression in src/renderer/modals/subtitle-sidebar.test.ts for overlay-layout passthrough before changing the gate.
Verification: bun test src/renderer/modals/subtitle-sidebar.test.ts src/renderer/overlay-mouse-ignore.test.ts; bun run typecheck
User reported the first renderer-only fix did not resolve the macOS issue in practice. Reopening investigation to trace visible-overlay window focus and hit-testing outside the renderer mouse-ignore gate.
Follow-up root cause: sidebar hover handlers were attached to the full-screen `.subtitle-sidebar-modal` shell instead of the actual sidebar panel. On the transparent visible overlay that shell spans the viewport, so sidebar-active state could persist outside the panel and keep the overlay interactive longer than intended.
Updated the sidebar modal to track hover/focus on `subtitleSidebarContent` and derive sidebar interaction state from panel hover or focus-within before recomputing mouse passthrough.
Verification refresh: bun test src/renderer/modals/subtitle-sidebar.test.ts src/renderer/overlay-mouse-ignore.test.ts; bun run typecheck
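The passthrough gating described above can be sketched as a pure state derivation. Names and shapes here are illustrative, not SubMiner's actual renderer types:

```typescript
// The sidebar counts as interactive only while the panel itself is hovered
// or contains focus — never merely because the full-screen modal shell exists.
interface SidebarPanelState {
  panelHovered: boolean;
  panelHasFocusWithin: boolean;
}

function isSidebarInteractive(state: SidebarPanelState): boolean {
  return state.panelHovered || state.panelHasFocusWithin;
}

// Mouse passthrough is enabled whenever no interactive surface needs input,
// so clicks outside the panel can refocus mpv.
function shouldIgnoreMouse(sidebarOpen: boolean, state: SidebarPanelState): boolean {
  return !(sidebarOpen && isSidebarInteractive(state));
}
```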
<!-- SECTION:NOTES:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Restored overlay subtitle sidebar passthrough in two layers. First, the visible overlay mouse-ignore gate no longer treats the subtitle sidebar as a global blocking modal. Second, the sidebar panel now tracks interaction on the real sidebar content instead of the full-screen modal shell, and keeps itself active only while the panel is hovered or focused. Added regressions for overlay-layout passthrough and focus-within behavior. Verification: `bun test src/renderer/modals/subtitle-sidebar.test.ts src/renderer/overlay-mouse-ignore.test.ts` and `bun run typecheck`.
<!-- SECTION:FINAL_SUMMARY:END -->


@@ -0,0 +1,32 @@
---
id: TASK-251
title: 'Docs: add subtitle sidebar and Jimaku integration pages'
status: Done
assignee: []
created_date: '2026-03-29 22:36'
updated_date: '2026-03-29 22:38'
labels:
- docs
dependencies: []
priority: medium
---
## Description
<!-- SECTION:DESCRIPTION:BEGIN -->
Track the docs-site update that adds a dedicated subtitle sidebar page, links Jimaku integration from the homepage/config docs, and refreshes the docs-site theme styling used by those pages.
<!-- SECTION:DESCRIPTION:END -->
## Acceptance Criteria
<!-- AC:BEGIN -->
- [x] #1 docs-site nav includes a Subtitle Sidebar entry
- [x] #2 Subtitle Sidebar page documents layout, shortcut, and config options
- [x] #3 Jimaku integration page and configuration docs link to the new docs page
- [x] #4 Changelog fragment exists for the user-visible docs release note
<!-- AC:END -->
## Final Summary
<!-- SECTION:FINAL_SUMMARY:BEGIN -->
Added the subtitle sidebar docs page and nav entry, linked Jimaku integration from the homepage/config docs, refreshed docs-site styling tokens, and recorded the release note fragment. Verified with `bun run changelog:lint`, `bun run docs:test`, `bun run docs:build`, and `bun run build`. Full repo test gate still has pre-existing failures in `bun run test:fast` and `bun run test:env` unrelated to these docs changes.
<!-- SECTION:FINAL_SUMMARY:END -->


@@ -7,9 +7,9 @@
"dependencies": {
"@fontsource-variable/geist": "^5.2.8",
"@fontsource-variable/geist-mono": "^5.2.7",
"@xhayper/discord-rpc": "^1.3.3",
"axios": "^1.13.5",
"commander": "^14.0.3",
"discord-rpc": "^4.0.1",
"hono": "^4.12.7",
"jsonc-parser": "^3.3.1",
"libsql": "^0.5.22",
@@ -37,6 +37,12 @@
"@develar/schema-utils": ["@develar/schema-utils@2.6.5", "", { "dependencies": { "ajv": "^6.12.0", "ajv-keywords": "^3.4.1" } }, "sha512-0cp4PsWQ/9avqTVMCtZ+GirikIA36ikvjtHweU4/j8yLtgObI0+JUPhYFScgwlteveGB1rt3Cm8UhN04XayDig=="],
"@discordjs/collection": ["@discordjs/collection@2.1.1", "", {}, "sha512-LiSusze9Tc7qF03sLCujF5iZp7K+vRNEDBZ86FT9aQAv3vxMLihUvKvpsCWiQ2DJq1tVckopKm1rxomgNUc9hg=="],
"@discordjs/rest": ["@discordjs/rest@2.6.1", "", { "dependencies": { "@discordjs/collection": "^2.1.1", "@discordjs/util": "^1.2.0", "@sapphire/async-queue": "^1.5.3", "@sapphire/snowflake": "^3.5.5", "@vladfrangu/async_event_emitter": "^2.4.6", "discord-api-types": "^0.38.40", "magic-bytes.js": "^1.13.0", "tslib": "^2.6.3", "undici": "6.24.1" } }, "sha512-wwQdgjeaoYFiaG+atbqx6aJDpqW7JHAo0HrQkBTbYzM3/PJ3GweQIpgElNcGZ26DCUOXMyawYd0YF7vtr+fZXg=="],
"@discordjs/util": ["@discordjs/util@1.2.0", "", { "dependencies": { "discord-api-types": "^0.38.33" } }, "sha512-3LKP7F2+atl9vJFhaBjn4nOaSWahZ/yWjOvA4e5pnXkt2qyXRCHLxoBQy81GFtLGCq7K9lPm9R517M1U+/90Qg=="],
"@electron/asar": ["@electron/asar@3.4.1", "", { "dependencies": { "commander": "^5.0.0", "glob": "^7.1.6", "minimatch": "^3.0.4" }, "bin": { "asar": "bin/asar.js" } }, "sha512-i4/rNPRS84t0vSRa2HorerGRXWyF4vThfHesw0dmcWHp+cspK743UanA0suA5Q5y8kzY2y6YKrvbIUn69BCAiA=="],
"@electron/fuses": ["@electron/fuses@1.8.0", "", { "dependencies": { "chalk": "^4.1.1", "fs-extra": "^9.0.1", "minimist": "^1.2.5" }, "bin": { "electron-fuses": "dist/bin.js" } }, "sha512-zx0EIq78WlY/lBb1uXlziZmDZI4ubcCXIMJ4uGjXzZW0nS19TjSPeXPAjzzTmKQlJUZm0SbmZhPKP7tuQ1SsEw=="],
@@ -143,6 +149,10 @@
"@pkgjs/parseargs": ["@pkgjs/parseargs@0.11.0", "", {}, "sha512-+1VkjdD0QBLPodGrJUeqarH8VAIvQODIbwh9XpP5Syisf7YoQgsJKPNFoqqLQlu+VQ/tVSshMR6loPMn8U+dPg=="],
"@sapphire/async-queue": ["@sapphire/async-queue@1.5.5", "", {}, "sha512-cvGzxbba6sav2zZkH8GPf2oGk9yYoD5qrNWdu9fRehifgnFZJMV+nuy2nON2roRO4yQQ+v7MK/Pktl/HgfsUXg=="],
"@sapphire/snowflake": ["@sapphire/snowflake@3.5.5", "", {}, "sha512-xzvBr1Q1c4lCe7i6sRnrofxeO1QTP/LKQ6A6qy0iB4x5yfiSfARMEQEghojzTNALDTcv8En04qYNIco9/K9eZQ=="],
"@sindresorhus/is": ["@sindresorhus/is@4.6.0", "", {}, "sha512-t09vSN3MdfsyCHoFcTRCH/iUtG7OJ0CsjzB8cjAmKc/va/kIgeDI/TxsigdncE/4be734m0cvIYwNaV4i2XqAw=="],
"@szmarczak/http-timer": ["@szmarczak/http-timer@4.0.6", "", { "dependencies": { "defer-to-connect": "^2.0.0" } }, "sha512-4BAffykYOgO+5nzBWYwE3W90sBgLJoUPRWWcL8wlyiM8IB8ipJz3UMJ9KXQd1RKQXpKp8Tutn80HZtWsu2u76w=="],
@@ -171,6 +181,10 @@
"@types/yauzl": ["@types/yauzl@2.10.3", "", { "dependencies": { "@types/node": "*" } }, "sha512-oJoftv0LSuaDZE3Le4DbKX+KS9G36NzOeSap90UIK0yMA/NhKJhqlSGtNDORNRaIbQfzjXDrQa0ytJ6mNRGz/Q=="],
"@vladfrangu/async_event_emitter": ["@vladfrangu/async_event_emitter@2.4.7", "", {}, "sha512-Xfe6rpCTxSxfbswi/W/Pz7zp1WWSNn4A0eW4mLkQUewCrXXtMj31lCg+iQyTkh/CkusZSq9eDflu7tjEDXUY6g=="],
"@xhayper/discord-rpc": ["@xhayper/discord-rpc@1.3.3", "", { "dependencies": { "@discordjs/rest": "^2.6.1", "@vladfrangu/async_event_emitter": "^2.4.7", "discord-api-types": "^0.38.42", "ws": "^8.20.0" } }, "sha512-Ih48GHiua7TtZgKO+f0uZPhCeQqb84fY2qUys/oMh8UbUfiUkUJLVCmd/v2AK0/pV33euh0aqSXo7+9LiPSwGw=="],
"@xmldom/xmldom": ["@xmldom/xmldom@0.8.11", "", {}, "sha512-cQzWCtO6C8TQiYl1ruKNn2U6Ao4o4WBBcbL61yJl84x+j5sOWWFU9X7DpND8XZG3daDppSsigMdfAIl2upQBRw=="],
"abbrev": ["abbrev@3.0.1", "", {}, "sha512-AO2ac6pjRB3SJmGJo+v5/aK6Omggp6fsLrs6wN9bd35ulu4cCwaAU9+7ZhXjeqHVkaHThLuzH0nZr0YpCDhygg=="],
@@ -209,8 +223,6 @@
"base64-js": ["base64-js@1.5.1", "", {}, "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA=="],
"bindings": ["bindings@1.5.0", "", { "dependencies": { "file-uri-to-path": "1.0.0" } }, "sha512-p2q/t/mhvuOj/UeLlV6566GD/guowlr0hHxClI0W9m7MWYkL1F0hLo+0Aexs9HSPCtR1SXQ0TD3MMKrXZajbiQ=="],
"bl": ["bl@4.1.0", "", { "dependencies": { "buffer": "^5.5.0", "inherits": "^2.0.4", "readable-stream": "^3.4.0" } }, "sha512-1W07cM9gS6DcLperZfFSj+bWLtaPGSOHWhPiGzXmvVJbRLdG82sH/Kn8EtW1VqWVA54AKf2h5k5BbnIbwF3h6w=="],
"boolean": ["boolean@3.2.0", "", {}, "sha512-d0II/GO9uf9lfUHH2BQsjxzRJZBdsjgsBiW4BvhWk/3qoKwQFjIDVN19PfX8F2D/r9PCMTtLWjYVCFrpeYUzsw=="],
@@ -293,7 +305,7 @@
"dir-compare": ["dir-compare@4.2.0", "", { "dependencies": { "minimatch": "^3.0.5", "p-limit": "^3.1.0 " } }, "sha512-2xMCmOoMrdQIPHdsTawECdNPwlVFB9zGcz3kuhmBO6U3oU+UQjsue0i8ayLKpgBcm+hcXPMVSGUN9d+pvJ6+VQ=="],
"discord-rpc": ["discord-rpc@4.0.1", "", { "dependencies": { "node-fetch": "^2.6.1", "ws": "^7.3.1" }, "optionalDependencies": { "register-scheme": "github:devsnek/node-register-scheme" } }, "sha512-HOvHpbq5STRZJjQIBzwoKnQ0jHplbEWFWlPDwXXKm/bILh4nzjcg7mNqll0UY7RsjFoaXA7e/oYb/4lvpda2zA=="],
"discord-api-types": ["discord-api-types@0.38.43", "", {}, "sha512-sSoBf/nK6m7BGtw65mi+QBuvEWaHE8MMziFLqWL+gT6ME/BLg34dRSVKS3Husx40uU06bvxUc3/X+D9Y6/zAbw=="],
"dmg-builder": ["dmg-builder@26.8.2", "", { "dependencies": { "app-builder-lib": "26.8.2", "builder-util": "26.8.1", "fs-extra": "^10.1.0", "iconv-lite": "^0.6.2", "js-yaml": "^4.1.0" }, "optionalDependencies": { "dmg-license": "^1.0.11" } }, "sha512-DaWI+p4DOqiFVZFMovdGYammBOyJAiHHFWUTQ0Z7gNc0twfdIN0LvyJ+vFsgZEDR1fjgbpCj690IVtbYIsZObQ=="],
@@ -359,8 +371,6 @@
"fdir": ["fdir@6.5.0", "", { "peerDependencies": { "picomatch": "^3 || ^4" }, "optionalPeers": ["picomatch"] }, "sha512-tIbYtZbucOs0BRGqPJkshJUYdL+SDH7dVM8gjy+ERp3WAUjLEFJE+02kanyHtwjWOnwrKYBiwAmM0p4kLJAnXg=="],
"file-uri-to-path": ["file-uri-to-path@1.0.0", "", {}, "sha512-0Zt+s3L7Vf1biwWZ29aARiVYLx7iMGnEUl9x33fbB/j3jR81u/O2LbqK+Bm1CDSNDKVtJ/YjwY7TUd5SkeLQLw=="],
"filelist": ["filelist@1.0.6", "", { "dependencies": { "minimatch": "^5.0.1" } }, "sha512-5giy2PkLYY1cP39p17Ech+2xlpTRL9HLspOfEgm0L6CwBXBTgsK5ou0JtzYuepxkaQ/tvhCFIJ5uXo0OrM2DxA=="],
"follow-redirects": ["follow-redirects@1.15.11", "", {}, "sha512-deG2P0JfjrTxl50XGCDyfI97ZGVCxIpfKYmfyrQ54n5FO/0gfIES8C/Psl6kWVDolizcaaxZJnTS0QSMxvnsBQ=="],
@@ -477,6 +487,8 @@
"lru-cache": ["lru-cache@6.0.0", "", { "dependencies": { "yallist": "^4.0.0" } }, "sha512-Jo6dJ04CmSjuznwJSS3pUeWmd/H0ffTlkXXgwZi+eq1UCmqQwCh+eLsYOYCwY991i2Fah4h1BEMCx4qThGbsiA=="],
"magic-bytes.js": ["magic-bytes.js@1.13.0", "", {}, "sha512-afO2mnxW7GDTXMm5/AoN1WuOcdoKhtgXjIvHmobqTD1grNplhGdv3PFOyjCVmrnOZBIT/gD/koDKpYG+0mvHcg=="],
"make-fetch-happen": ["make-fetch-happen@14.0.3", "", { "dependencies": { "@npmcli/agent": "^3.0.0", "cacache": "^19.0.1", "http-cache-semantics": "^4.1.1", "minipass": "^7.0.2", "minipass-fetch": "^4.0.0", "minipass-flush": "^1.0.5", "minipass-pipeline": "^1.2.4", "negotiator": "^1.0.0", "proc-log": "^5.0.0", "promise-retry": "^2.0.1", "ssri": "^12.0.0" } }, "sha512-QMjGbFTP0blj97EeidG5hk/QhKQ3T4ICckQGLgz38QF7Vgbk6e6FTARN8KhKxyBbWn8R0HU+bnw8aSoFPD4qtQ=="],
"matcher": ["matcher@3.0.0", "", { "dependencies": { "escape-string-regexp": "^4.0.0" } }, "sha512-OkeDaAZ/bQCxeFAozM55PKcKU0yJMPGifLwV4Qgjitu+5MoAfSQN4lsLJeXZ1b8w0x+/Emda6MZgXS1jvsapng=="],
@@ -523,8 +535,6 @@
"node-api-version": ["node-api-version@0.2.1", "", { "dependencies": { "semver": "^7.3.5" } }, "sha512-2xP/IGGMmmSQpI1+O/k72jF/ykvZ89JeuKX3TLJAYPDVLUalrshrLHkeVcCCZqG/eEa635cr8IBYzgnDvM2O8Q=="],
"node-fetch": ["node-fetch@2.7.0", "", { "dependencies": { "whatwg-url": "^5.0.0" }, "peerDependencies": { "encoding": "^0.1.0" }, "optionalPeers": ["encoding"] }, "sha512-c4FRfUm/dbcWZ7U+1Wq0AwCyFL+3nt2bEw05wfxSz+DWpWsitgmSgYmy2dQdWyKC1694ELPqMs/YzUSNozLt8A=="],
"node-gyp": ["node-gyp@11.5.0", "", { "dependencies": { "env-paths": "^2.2.0", "exponential-backoff": "^3.1.1", "graceful-fs": "^4.2.6", "make-fetch-happen": "^14.0.3", "nopt": "^8.0.0", "proc-log": "^5.0.0", "semver": "^7.3.5", "tar": "^7.4.3", "tinyglobby": "^0.2.12", "which": "^5.0.0" }, "bin": { "node-gyp": "bin/node-gyp.js" } }, "sha512-ra7Kvlhxn5V9Slyus0ygMa2h+UqExPqUIkfk7Pc8QTLT956JLSy51uWFwHtIYy0vI8cB4BDhc/S03+880My/LQ=="],
"nopt": ["nopt@8.1.0", "", { "dependencies": { "abbrev": "^3.0.0" }, "bin": { "nopt": "bin/nopt.js" } }, "sha512-ieGu42u/Qsa4TFktmaKEwM6MQH0pOWnaB3htzh0JRtx84+Mebc0cbZYN5bC+6WTZ4+77xrL9Pn5m7CV6VIkV7A=="],
@@ -587,8 +597,6 @@
"readable-stream": ["readable-stream@3.6.2", "", { "dependencies": { "inherits": "^2.0.3", "string_decoder": "^1.1.1", "util-deprecate": "^1.0.1" } }, "sha512-9u/sniCrY3D5WdsERHzHE4G2YCXqoG5FTHUiCC4SIbr6XcLZBY05ya9EKjYek9O5xOAwjGq+1JdGBAS7Q9ScoA=="],
"register-scheme": ["register-scheme@github:devsnek/node-register-scheme#e7cc9a6", { "dependencies": { "bindings": "^1.3.0", "node-addon-api": "^1.3.0" } }, "devsnek-node-register-scheme-e7cc9a6", "sha512-VwUWN3aKIg/yn7T8axW20Y1+4wGALIQectBmkmwSJfLrCycpVepGP/+KHjXSL/Ga8N1SmewL49kESgIhW7HbWg=="],
"require-directory": ["require-directory@2.1.1", "", {}, "sha512-fGxEI7+wsG9xrvdjsrlmL22OMTTiHRwAMroiEeMgq8gzoLC/PQr7RsRDSTLUg/bZAZtF+TVIkHc6/4RIKrui+Q=="],
"resedit": ["resedit@1.7.2", "", { "dependencies": { "pe-library": "^0.4.1" } }, "sha512-vHjcY2MlAITJhC0eRD/Vv8Vlgmu9Sd3LX9zZvtGzU5ZImdTN3+d6e/4mnTyV8vEbyf1sgNIrWxhWlrys52OkEA=="],
@@ -673,14 +681,16 @@
"tmp-promise": ["tmp-promise@3.0.3", "", { "dependencies": { "tmp": "^0.2.0" } }, "sha512-RwM7MoPojPxsOBYnyd2hy0bxtIlVrihNs9pj5SUvY8Zz1sQcQG2tG1hSr8PDxfgEB8RNKDhqbIlroIarSNDNsQ=="],
"tr46": ["tr46@0.0.3", "", {}, "sha512-N3WMsuqV66lT30CrXNbEjx4GEwlow3v6rr4mCcv6prnfwhS01rkgyFdjPNBYd9br7LpXV1+Emh01fHnq2Gdgrw=="],
"truncate-utf8-bytes": ["truncate-utf8-bytes@1.0.2", "", { "dependencies": { "utf8-byte-length": "^1.0.1" } }, "sha512-95Pu1QXQvruGEhv62XCMO3Mm90GscOCClvrIUwCM0PYOXK3kaF3l3sIHxx71ThJfcbM2O5Au6SO3AWCSEfW4mQ=="],
"tslib": ["tslib@2.8.1", "", {}, "sha512-oJFu94HQb+KVduSUQL7wnpmqnfmLsOA/nAh6b6EH0wCEoK0/mPeXU6c3wKDV83MkOuHPRHtSXKKU99IBazS/2w=="],
"type-fest": ["type-fest@0.13.1", "", {}, "sha512-34R7HTnG0XIJcBSn5XhDd7nNFPRcXYRZrBB2O2jdKqYODldSzBAqzsWoZYYvduky73toYS/ESqxPvkDf/F0XMg=="],
"typescript": ["typescript@5.9.3", "", { "bin": { "tsc": "bin/tsc", "tsserver": "bin/tsserver" } }, "sha512-jl1vZzPDinLr9eUt3J/t7V6FgNEw9QjvBPdysz9KfQDD41fQrC2Y4vKQdiaUpFT4bXlb1RHhLpp8wtm6M5TgSw=="],
"undici": ["undici@6.24.1", "", {}, "sha512-sC+b0tB1whOCzbtlx20fx3WgCXwkW627p4EA9uM+/tNNPkSS+eSEld6pAs9nDv7WbY1UUljBMYPtu9BCOrCWKA=="],
"undici-types": ["undici-types@7.18.2", "", {}, "sha512-AsuCzffGHJybSaRrmr5eHr81mwJU3kjw6M+uprWvCXiNeN9SOGwQ3Jn8jb8m3Z6izVgknn1R0FTCEAP2QrLY/w=="],
"unique-filename": ["unique-filename@4.0.0", "", { "dependencies": { "unique-slug": "^5.0.0" } }, "sha512-XSnEewXmQ+veP7xX2dS5Q4yZAvO40cBN2MWkJ7D/6sW4Dg6wYBNwM1Vrnz1FhH5AdeLIlUXRI9e28z1YZi71NQ=="],
@@ -699,10 +709,6 @@
"wcwidth": ["wcwidth@1.0.1", "", { "dependencies": { "defaults": "^1.0.3" } }, "sha512-XHPEwS0q6TaxcvG85+8EYkbiCux2XtWG2mkc47Ng2A77BQu9+DqIOJldST4HgPkuea7dvKSj5VgX3P1d4rW8Tg=="],
"webidl-conversions": ["webidl-conversions@3.0.1", "", {}, "sha512-2JAn3z8AR6rjK8Sm8orRC0h/bcl/DqL7tRPdGZ4I1CjdF+EaMLmYxBHyXuKL849eucPFhvBoxMsflfOb8kxaeQ=="],
"whatwg-url": ["whatwg-url@5.0.0", "", { "dependencies": { "tr46": "~0.0.3", "webidl-conversions": "^3.0.0" } }, "sha512-saE57nupxk6v3HY35+jzBwYa0rKSy0XR8JSxZPwgLr7ys0IBzhGviA1/TUGJLmSVqs8pb9AnvICXEuOHLprYTw=="],
"which": ["which@5.0.0", "", { "dependencies": { "isexe": "^3.1.1" }, "bin": { "node-which": "bin/which.js" } }, "sha512-JEdGzHwwkrbWoGOlIHqQ5gtprKGOenpDHpxE9zVR1bWbOtYRyPPHMe9FaP6x61CmNaTThSkb0DAJte5jD+DmzQ=="],
"wrap-ansi": ["wrap-ansi@7.0.0", "", { "dependencies": { "ansi-styles": "^4.0.0", "string-width": "^4.1.0", "strip-ansi": "^6.0.0" } }, "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q=="],
@@ -769,8 +775,6 @@
"cross-spawn/which": ["which@2.0.2", "", { "dependencies": { "isexe": "^2.0.0" }, "bin": { "node-which": "./bin/node-which" } }, "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA=="],
"discord-rpc/ws": ["ws@7.5.10", "", { "peerDependencies": { "bufferutil": "^4.0.1", "utf-8-validate": "^5.0.2" }, "optionalPeers": ["bufferutil", "utf-8-validate"] }, "sha512-+dbF1tHwZpXcbOJdVOkzLDxZP1ailvSxM6ZweXTegylPny803bFhA+vqBYw4s31NSAk4S2Qz+AKXK9a4wkdjcQ=="],
"electron/@types/node": ["@types/node@22.19.15", "", { "dependencies": { "undici-types": "~6.21.0" } }, "sha512-F0R/h2+dsy5wJAUe3tAU6oqa2qbWY5TpNfL/RGmo1y38hiyO1w3x2jPtt76wmuaJI4DQnOBu21cNXQ2STIUUWg=="],
"electron-winstaller/fs-extra": ["fs-extra@7.0.1", "", { "dependencies": { "graceful-fs": "^4.1.2", "jsonfile": "^4.0.0", "universalify": "^0.1.0" } }, "sha512-YJDaCJZEnBmcbw13fvdAM9AwNOJwOzrE4pqMqBq5nFiEqXUqHwlK4B+3pUw6JNvfSPtX05xFHtYy/1ni01eGCw=="],


@@ -1,5 +0,0 @@
type: internal
area: release
- Added a maintained source coverage lane that shards Bun coverage one test file at a time and merges LCOV output into `coverage/test-src/lcov.info`.
- CI and release quality-gate now upload the merged source-lane LCOV artifact for inspection.


@@ -1,5 +0,0 @@
type: fixed
area: stats
- Fixed stats startup so the immersion tracker can run when `Bun.serve` is unavailable.
- Stats server now falls back to a Node `http` listener in Electron/runtime paths that do not expose Bun.
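A fallback of this shape can be sketched as below: prefer `Bun.serve` when the Bun global exists, otherwise start a plain `node:http` listener. The handler is simplified; SubMiner's actual stats server wiring may differ:

```typescript
import { createServer } from "node:http";

// Start a tiny stats server, returning a stop function. Uses Bun.serve when
// available and falls back to node:http in runtimes that do not expose Bun.
function startStatsServer(port: number, body: string): () => void {
  const bun = (globalThis as any).Bun;
  if (bun?.serve) {
    const server = bun.serve({
      port,
      fetch: () => new Response(body, { headers: { "content-type": "text/plain" } }),
    });
    return () => server.stop();
  }
  // Electron/runtime paths without Bun: plain Node http listener.
  const server = createServer((_req, res) => {
    res.setHeader("content-type", "text/plain");
    res.end(body);
  });
  server.listen(port);
  return () => server.close();
}
```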


@@ -1,6 +0,0 @@
type: internal
area: runtime
- Extracted remaining inline runtime logic from `src/main.ts` into dedicated runtime modules and composer helpers.
- Added focused regression tests for the extracted runtime/composer boundaries.
- Updated task tracking notes to mark TASK-238.6 complete and confirm follow-on boot-phase split can be deferred.


@@ -1,6 +0,0 @@
type: internal
area: runtime
- Split `src/main.ts` boot wiring into dedicated `src/main/boot/services.ts`, `src/main/boot/runtimes.ts`, and `src/main/boot/handlers.ts` modules.
- Added focused tests for the new boot-phase seams and kept the startup/typecheck/build verification lanes green.
- Updated internal architecture/task docs to record the boot-phase split and new ownership boundary.


@@ -0,0 +1,6 @@
type: docs
area: docs-site
- Added a dedicated Subtitle Sidebar guide and linked it from the homepage and configuration docs.
- Linked Jimaku integration from the homepage to its dedicated docs page.
- Refreshed docs-site theme tokens and hover/selection styling for the updated pages.


@@ -0,0 +1,5 @@
type: fixed
area: main
- Resolve the YouTube playback socket path lazily so startup honors CLI and config overrides.
- Add regression coverage for the lazy socket-path lookup during Windows mpv startup.
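Lazy resolution of this kind can be sketched as follows; the accessor names are hypothetical, not SubMiner's actual API:

```typescript
// Compute the socket path at first use instead of at import/startup time,
// so CLI and config overrides applied during startup are honored.
function makeSocketPathResolver(
  getCliOverride: () => string | undefined,
  getConfigOverride: () => string | undefined,
  defaultPath: string,
): () => string {
  let cached: string | undefined;
  return () => {
    // Resolve once, on first call, after overrides had a chance to land.
    cached ??= getCliOverride() ?? getConfigOverride() ?? defaultPath;
    return cached;
  };
}
```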


@@ -498,6 +498,7 @@
// ==========================================
"discordPresence": {
"enabled": false, // Enable optional Discord Rich Presence updates. Values: true | false
"presenceStyle": "default", // Presence card text preset: "default" (clean bilingual), "meme" (Mining and crafting), "japanese" (fully JP), or "minimal".
"updateIntervalMs": 3000, // Minimum interval between presence payload updates.
"debounceMs": 750 // Debounce delay used to collapse bursty presence updates.
}, // Optional Discord Rich Presence activity card updates for current playback/study session.


@@ -74,7 +74,9 @@ export default {
{ text: 'Configuration', link: '/configuration' },
{ text: 'Keyboard Shortcuts', link: '/shortcuts' },
{ text: 'Subtitle Annotations', link: '/subtitle-annotations' },
{ text: 'Subtitle Sidebar', link: '/subtitle-sidebar' },
{ text: 'Immersion Tracking', link: '/immersion-tracking' },
{ text: 'JLPT Vocabulary Bundle', link: '/jlpt-vocab-bundle' },
{ text: 'Troubleshooting', link: '/troubleshooting' },
],
},


@@ -34,6 +34,25 @@
system-ui,
sans-serif;
--tui-transition: 180ms ease;
/* Theme-specific values — overridden in .dark below */
--tui-nav-bg: color-mix(in srgb, var(--vp-c-bg-alt) 88%, transparent);
--tui-table-hover-bg: color-mix(in srgb, var(--vp-c-bg-soft) 80%, transparent);
--tui-link-underline: color-mix(in srgb, var(--vp-c-brand-1) 40%, transparent);
--tui-selection-bg: hsla(267, 83%, 45%, 0.14);
--tui-hero-glow: hsla(267, 83%, 45%, 0.05);
--tui-step-hover-bg: var(--vp-c-bg-alt);
--tui-step-hover-glow: color-mix(in srgb, var(--vp-c-brand-1) 30%, transparent);
}
.dark {
--tui-nav-bg: hsla(232, 23%, 18%, 0.82);
--tui-table-hover-bg: hsla(232, 23%, 18%, 0.4);
--tui-link-underline: hsla(267, 83%, 80%, 0.3);
--tui-selection-bg: hsla(267, 83%, 80%, 0.22);
--tui-hero-glow: hsla(267, 83%, 80%, 0.06);
--tui-step-hover-bg: hsla(232, 23%, 18%, 0.6);
--tui-step-hover-glow: hsla(267, 83%, 80%, 0.3);
}
:root {
@@ -48,7 +67,7 @@
/* === Selection === */
::selection {
background: hsla(267, 83%, 80%, 0.22);
background: var(--tui-selection-bg);
color: var(--vp-c-text-1);
}
@@ -102,7 +121,7 @@ button,
}
.VPNav .VPNavBar:not(.has-sidebar) {
background: hsla(232, 23%, 18%, 0.82);
background: var(--tui-nav-bg);
}
.VPNav .VPNavBar.has-sidebar .content {
@@ -245,13 +264,13 @@ button,
}
.vp-doc table tr:hover td {
background: hsla(232, 23%, 18%, 0.4);
background: var(--tui-table-hover-bg);
}
/* === Links === */
.vp-doc a {
text-decoration: none;
border-bottom: 1px solid hsla(267, 83%, 80%, 0.3);
border-bottom: 1px solid var(--tui-link-underline);
transition: border-color var(--tui-transition), color var(--tui-transition);
}
@@ -653,7 +672,7 @@ body {
height: 400px;
background: radial-gradient(
ellipse at center,
hsla(267, 83%, 80%, 0.06) 0%,
var(--tui-hero-glow) 0%,
transparent 70%
);
pointer-events: none;


@@ -41,6 +41,28 @@ The update flow:
3. **Progress check** -- SubMiner fetches your current list entry for the matched media. If your recorded progress already meets or exceeds the detected episode, the update is skipped.
4. **Mutation** -- A `SaveMediaListEntry` mutation sets the new progress and marks the entry as `CURRENT`.
```mermaid
flowchart TB
classDef step fill:#c6a0f6,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef action fill:#8aadf4,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef result fill:#a6da95,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef enrich fill:#8bd5ca,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef ext fill:#eed49f,stroke:#494d64,color:#24273a,stroke-width:1.5px
Play["Media Plays"]:::step
Detect["Episode Detected"]:::action
Queue["Update Queue"]:::action
Rate["Rate Limiter"]:::enrich
GQL["GraphQL Mutation"]:::ext
Done["Progress Updated"]:::result
Play --> Detect
Detect --> Queue
Queue --> Rate
Rate --> GQL
GQL --> Done
```
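The mutation step above can be sketched roughly as below. `SaveMediaListEntry`, `mediaId`, `progress`, and `status` are real AniList GraphQL fields, but the exact query shape and error handling SubMiner uses may differ:

```typescript
// GraphQL mutation that sets progress and marks the entry CURRENT.
const SAVE_PROGRESS = `
  mutation ($mediaId: Int, $progress: Int) {
    SaveMediaListEntry(mediaId: $mediaId, progress: $progress, status: CURRENT) {
      id
      progress
      status
    }
  }
`;

async function saveProgress(token: string, mediaId: number, progress: number) {
  const res = await fetch("https://graphql.anilist.co", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({ query: SAVE_PROGRESS, variables: { mediaId, progress } }),
  });
  if (!res.ok) throw new Error(`AniList update failed: ${res.status}`);
  return res.json();
}
```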
## Update Queue and Retry
Failed AniList updates are persisted to a retry queue on disk and retried with exponential backoff.
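Exponential backoff of the kind described can be sketched as below; the base delay and cap are illustrative, not SubMiner's actual tuning:

```typescript
// Delay before retry attempt N: base doubles each attempt, capped at maxMs.
function backoffDelayMs(attempt: number, baseMs = 1_000, maxMs = 300_000): number {
  // attempt 0 → base, attempt 1 → 2×base, attempt 2 → 4×base, … up to maxMs
  return Math.min(baseMs * 2 ** attempt, maxMs);
}
```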


@@ -1,5 +1,12 @@
# Changelog
## v0.10.0 (2026-03-29)
- Fixed stats startup so the immersion tracker can run when `Bun.serve` is unavailable.
- Added a Node `http` fallback for Electron/runtime paths that do not expose Bun, so stats keeps working there too.
- Updated Discord Rich Presence to the maintained `@xhayper/discord-rpc` wrapper.
- Fixed the macOS visible-overlay toggle path so manual hides stay hidden and the plugin uses the explicit visible-overlay toggle command.
- Restored macOS mpv passthrough while the overlay subtitle sidebar is open so clicks outside the sidebar can refocus mpv and keep native keybindings working.
## v0.9.3 (2026-03-25)
- Moved YouTube primary subtitle language defaults to `youtube.primarySubLanguages`.
- Removed the placeholder YouTube subtitle retime step; downloaded primary subtitle tracks are now used directly.


@@ -31,6 +31,30 @@ The feature has three stages: **snapshot**, **merge**, and **match**.
3. **Match** — During subtitle rendering, Yomitan scans subtitle text against all loaded dictionaries including the character dictionary. Tokens that match a character entry are flagged with `isNameMatch` and highlighted in the overlay with a distinct color.
```mermaid
flowchart TB
classDef api fill:#a6da95,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef store fill:#8aadf4,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef build fill:#b7bdf8,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef dict fill:#c6a0f6,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef render fill:#8bd5ca,stroke:#494d64,color:#24273a,stroke-width:1.5px
AL["AniList API"]:::api
Snap["Snapshot JSON"]:::store
Merge["Merge"]:::build
ZIP["Yomitan ZIP"]:::dict
Yomi["Yomitan Import"]:::dict
Sub["Subtitle Scan"]:::render
HL["Name Highlight"]:::render
AL -->|"GraphQL"| Snap
Snap --> Merge
Merge --> ZIP
ZIP --> Yomi
Yomi --> Sub
Sub --> HL
```
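The match stage above can be illustrated with a small sketch. The `isNameMatch` flag comes from the docs; the token and dictionary shapes here are hypothetical:

```typescript
// Flag tokens that appear in the character dictionary so the overlay can
// highlight them with a distinct color.
interface ScannedToken {
  text: string;
  isNameMatch: boolean;
}

function scanSubtitle(tokens: string[], characterNames: Set<string>): ScannedToken[] {
  return tokens.map((token) => ({
    text: token,
    isNameMatch: characterNames.has(token),
  }));
}
```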
## Enabling the Feature
Character dictionary sync is disabled by default. To turn it on:


@@ -390,6 +390,8 @@ The sidebar is only available when the active subtitle source has been parsed in
`embedded` layout is intended to act like a split-pane view: it reserves player space with a right-side video margin and keeps interaction in both the player area and sidebar. If you see unexpected offset behavior in your environment, switch back to `overlay` to isolate sidebar placement.
For full details on layout modes, behavior, and the keyboard shortcut, see the [Subtitle Sidebar](/subtitle-sidebar) page.
`jlptColors` keys are:
| Key | Default | Description |
@@ -1197,30 +1199,38 @@ Discord Rich Presence is optional and disabled by default. When enabled, SubMine
{
"discordPresence": {
"enabled": true,
"presenceStyle": "default",
"updateIntervalMs": 3000,
"debounceMs": 750
}
}
```
| Option | Values | Description |
| ------------------ | --------------- | ---------------------------------------------------------- |
| `enabled` | `true`, `false` | Enable Discord Rich Presence updates (default: `false`) |
| `updateIntervalMs` | number | Minimum interval between activity updates in milliseconds |
| `debounceMs` | number | Debounce window for bursty playback events in milliseconds |
| Option | Values | Description |
| ------------------ | ------------------------------------------------- | ---------------------------------------------------------- |
| `enabled` | `true`, `false` | Enable Discord Rich Presence updates (default: `false`) |
| `presenceStyle` | `"default"`, `"meme"`, `"japanese"`, `"minimal"` | Card text preset (default: `"default"`) |
| `updateIntervalMs` | number | Minimum interval between activity updates in milliseconds |
| `debounceMs` | number | Debounce window for bursty playback events in milliseconds |
Setup steps:
1. Set `discordPresence.enabled` to `true`.
2. Restart SubMiner.
2. Optionally set `discordPresence.presenceStyle` to choose a card text preset.
3. Restart SubMiner.
SubMiner uses a fixed official activity card style for all users:
#### Presence style presets
- Details: current media title while playing (fallback: `Mining and crafting (Anki cards)` when idle/disconnected)
- State: `Playing mm:ss / mm:ss` or `Paused mm:ss / mm:ss` (fallback: `Idle`)
- Large image key/text: `subminer-logo` / `SubMiner`
- Small image key/text: `study` / `Sentence Mining`
- No activity button by default
While playing media, the **Details** line always shows the current media title and **State** shows `Playing mm:ss / mm:ss` or `Paused mm:ss / mm:ss`. The preset controls what appears when idle and the tooltip text on images.
| Preset | Idle details | Small image text | Vibe |
| ------------ | ----------------------------------- | ------------------ | --------------------------------------- |
| **`default`**| `Sentence Mining` | `日本語学習中` | Clean, bilingual flair |
| `meme` | `Mining and crafting (Anki cards)` | `Sentence Mining` | Minecraft-inspired joke |
| `japanese` | `文の採掘中` | `イマージョン学習` | Fully Japanese |
| `minimal` | `SubMiner` | *(none)* | Bare essentials, no small image overlay |
All presets use the `subminer-logo` large image with `SubMiner` tooltip. No activity button is shown by default.
Troubleshooting:


@@ -4,16 +4,7 @@ For internal architecture/workflow guidance, use `docs/README.md` at the repo ro
## Prerequisites
- Required for all contributor workflows:
- [Bun](https://bun.sh)
- `git` with submodule support
- Required by commands used on this page:
- `bash` for helper scripts such as `make dev-watch`, `bun run format:check:src`, and `bash scripts/verify-generated-launcher.sh`
- `unzip` on macOS/Linux for the bundled Yomitan build step inside `bun run build`
- `lua` for plugin/environment test lanes such as `bun run test:env` and `bun run test:launcher`
- Platform-specific / conditional:
- `swiftc` on macOS is optional. If absent, the build falls back to staging the Swift helper source instead of compiling the helper binary.
- Windows uses `powershell.exe` during the bundled Yomitan extraction step. A normal Windows install already provides it.
## Setup
@@ -30,8 +21,6 @@ bun install
`make deps` is still available as a convenience wrapper around the same dependency install flow.
If you only need the default TypeScript/unit lanes, Bun plus the checked-in dependencies is enough after install. The extra tools above are only needed when you run the commands that invoke them.
## Building
```bash
@@ -51,8 +40,6 @@ make build-launcher
`bun run build` includes the Yomitan build step. It builds the bundled Chrome extension directly from the `vendor/subminer-yomitan` submodule into `build/yomitan` using Bun.
On macOS/Linux, that build also shells out to `unzip` while extracting the Yomitan artifact. On macOS, the asset staging step will compile the helper with `swiftc` when available, then fall back to copying the `.swift` source if not.
## Launcher Artifact Workflow
- Source of truth: `launcher/*.ts`
@@ -73,8 +60,8 @@ bash scripts/verify-generated-launcher.sh
```bash
bun run dev # builds + launches with --start --dev
electron . --start --dev --log-level debug # equivalent Electron launch with verbose logging
electron . --background # tray/background mode, minimal default logging
make dev-start # build + launch via Makefile
make dev-watch # watch TS + renderer and launch Electron (faster edit loop)
make dev-watch-macos # same as dev-watch, forcing --backend macos
@@ -107,11 +94,6 @@ bun run test:subtitle # maintained alass/ffsubsync subtitle surface
- `bun run test:env` covers environment-sensitive checks: launcher smoke/plugin verification plus the Bun source SQLite lane.
- `bun run test:immersion:sqlite` is the reproducible persistence lane when you need real DB-backed SQLite coverage under Bun.
Command-specific test deps:
- `bun run test:env` and `bun run test:launcher` invoke Lua-based plugin checks, so `lua` must be installed.
- `bun run format:src` and `bun run format:check:src` invoke `bash scripts/prettier-scope.sh`.
The Bun-managed discovery lanes intentionally exclude a small compiled/runtime-focused set: `src/core/services/ipc.test.ts`, `src/core/services/anki-jimaku-ipc.test.ts`, `src/core/services/overlay-manager.test.ts`, `src/main/config-validation.test.ts`, `src/main/runtime/startup-config.test.ts`, and `src/main/runtime/registry.test.ts`. `bun run test:runtime:compat` keeps them in the standard workflow via `dist/**`.
Suggested local gate before handoff:

View File

@@ -67,7 +67,7 @@ features:
alt: Subtitle download icon
title: Subtitle Download & Sync
details: Search and pull subtitles from Jimaku, then auto-sync timing with alass or ffsubsync — all from the overlay.
link: /jimaku-integration
linkText: Jimaku integration
- icon:
src: /assets/tokenization.svg
@@ -223,12 +223,12 @@ const demoAssetVersion = '20260223-2';
}
.workflow-step:hover {
background: var(--tui-step-hover-bg);
}
.workflow-step:hover .step-number {
color: var(--vp-c-brand-1);
text-shadow: 0 0 12px var(--tui-step-hover-glow);
}
.workflow-connector {

View File

@@ -172,7 +172,7 @@ Install `mpv` separately and ensure `mpv.exe` is on `PATH`. `ffmpeg` is still re
### Windows Usage Notes
- Launch `SubMiner.exe` once to let the first-run setup flow seed `%APPDATA%\\SubMiner\\config.jsonc`, offer mpv plugin installation, open bundled Yomitan settings, and optionally create `SubMiner mpv` Start Menu/Desktop shortcuts.
- First-run mpv plugin installs pin `binary_path` to the current `SubMiner.exe` automatically. Manual plugin configs can leave `binary_path` empty unless SubMiner is installed in a non-standard location.
- Windows plugin installs rewrite `socket_path` to `\\.\pipe\subminer-socket`; do not keep `/tmp/subminer-socket` on Windows.
- Native window tracking is built in on Windows; no `xdotool`, `xwininfo`, or compositor-specific helper is required.
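The socket-path rule above can be sketched as a tiny helper (hypothetical name; the real plugin installer rewrites the Lua config file instead):

```typescript
// Hypothetical helper illustrating the platform rule above: Windows uses a
// named pipe, macOS/Linux use a Unix domain socket path.
export function defaultSocketPath(platform: string = process.platform): string {
  return platform === 'win32' ? '\\\\.\\pipe\\subminer-socket' : '/tmp/subminer-socket';
}
```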
@@ -201,6 +201,7 @@ mpv must be launched with `--input-ipc-server=/tmp/subminer-socket` for SubMiner
:::
On Windows, the packaged plugin config is rewritten to `socket_path=\\.\pipe\subminer-socket`.
First-run setup also pins `binary_path` to the current app binary so mpv launches the same SubMiner build that installed the plugin.
```bash
# Option 1: install from release assets bundle

View File

@@ -26,6 +26,31 @@ If no files match the current episode filter, a "Show all files" button lets you
| `Arrow Up` / `Arrow Down` | Navigate entries or files |
| `Escape` | Close modal |
### Flow
```mermaid
flowchart TD
classDef step fill:#c6a0f6,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef action fill:#8aadf4,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef result fill:#a6da95,stroke:#494d64,color:#24273a,stroke-width:1.5px
classDef enrich fill:#8bd5ca,stroke:#494d64,color:#24273a,stroke-width:1.5px
Open["Open Jimaku modal (Ctrl+Shift+J)"]:::step
Parse["Auto-fill title, season, episode from filename"]:::enrich
Search["Search Jimaku API"]:::action
Entries["Browse matching entries"]:::action
Files["Browse subtitle files"]:::action
Download["Download selected file"]:::action
Load["Load subtitle into mpv"]:::result
Open --> Parse
Parse --> Search
Search --> Entries
Entries --> Files
Files --> Download
Download --> Load
```
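The auto-fill step in the flow above can be approximated like this (a simplified sketch under assumed naming conventions; SubMiner's actual parser handles more release-name variants):

```typescript
// Simplified sketch: extract title, season, and episode from a release-style
// filename such as "Show Name S01E05 [1080p].mkv". Illustrative only.
export function parseEpisodeFilename(name: string): {
  title: string;
  season: number | null;
  episode: number | null;
} {
  // Drop the extension and any bracketed release tags.
  const base = name.replace(/\.[a-z0-9]+$/i, '').replace(/\[[^\]]*\]/g, ' ');
  const m = base.match(/S(\d{1,2})E(\d{1,3})/i);
  if (!m) return { title: base.trim(), season: null, episode: null };
  return {
    title: base.slice(0, m.index).replace(/[-_.]+/g, ' ').trim(),
    season: Number(m[1]),
    episode: Number(m[2]),
  };
}
```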
## Configuration
Add a `jimaku` section to your `config.jsonc`:
@@ -106,6 +131,6 @@ Verify mpv is running and connected via IPC. SubMiner loads the subtitle by issu
## Related
- [Configuration Reference](/configuration#jimaku) — full config options
- [Mining Workflow](/mining-workflow#jimaku-subtitle-search) — how Jimaku fits into the sentence mining loop
- [Troubleshooting](/troubleshooting#jimaku) — additional error guidance

View File

@@ -498,6 +498,7 @@
// ==========================================
"discordPresence": {
"enabled": false, // Enable optional Discord Rich Presence updates. Values: true | false
"presenceStyle": "default", // Presence card text preset: "default" (clean bilingual), "meme" (Mining and crafting), "japanese" (fully JP), or "minimal".
"updateIntervalMs": 3000, // Minimum interval between presence payload updates.
"debounceMs": 750 // Debounce delay used to collapse bursty presence updates.
}, // Optional Discord Rich Presence activity card updates for current playback/study session.

View File

@@ -0,0 +1,71 @@
# Subtitle Sidebar
The subtitle sidebar displays the full parsed cue list for the active subtitle file as a scrollable panel alongside mpv. It lets you review past and upcoming lines, click any cue to seek directly to that moment, and follow along without depending on the transient overlay subtitles.
The sidebar is opt-in and disabled by default. Enable it under `subtitleSidebar.enabled` in your config.
## How It Works
When SubMiner parses the active subtitle source into a cue list, the sidebar becomes available. Toggle it with the `\` key (configurable via `subtitleSidebar.toggleKey`). While open:
- The active cue is highlighted and kept in view as playback advances (when `autoScroll` is `true`).
- Clicking any cue seeks mpv to that timestamp.
- The sidebar stays synchronized with the overlay — media transitions and subtitle source changes update both simultaneously.
The sidebar only appears when a parsed cue list is available. External subtitle sources that SubMiner cannot parse (for example, embedded ASS tracks rendered directly by mpv) will not populate the sidebar.
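The click-to-seek behavior maps naturally onto mpv's JSON IPC protocol. A minimal sketch (the cue shape is assumed, not SubMiner's internal types):

```typescript
// Illustrative only: assumed Cue shape for the sidebar's parsed cue list.
interface Cue {
  startSec: number;
  text: string;
}

// mpv's JSON IPC accepts newline-terminated commands such as
// {"command": ["seek", <seconds>, "absolute"]}.
export function buildSeekCommand(cue: Cue): string {
  return JSON.stringify({ command: ['seek', cue.startSec, 'absolute'] }) + '\n';
}
```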
## Layout Modes
Two layout modes are available via `subtitleSidebar.layout`:
**`overlay`** (default) — The sidebar floats over mpv as a panel. It does not affect the player window size or position.
**`embedded`** — Reserves space on the right side of the player and shifts the video area to mimic a split-pane layout. Useful if you want the cue list visible without it covering the video. If you see unexpected positioning in your environment, switch back to `overlay` to isolate the issue.
## Configuration
Enable and configure the sidebar under `subtitleSidebar` in your config file:
```json
{
"subtitleSidebar": {
"enabled": false,
"autoOpen": false,
"layout": "overlay",
"toggleKey": "Backslash",
"pauseVideoOnHover": false,
"autoScroll": true,
"fontFamily": "\"M PLUS 1\", \"Noto Sans CJK JP\", sans-serif",
"fontSize": 16
}
}
```
| Option | Type | Default | Description |
| --------------------------- | ------- | ------------ | -------------------------------------------------------------------------------------------------- |
| `enabled` | boolean | `false` | Enable subtitle sidebar support |
| `autoOpen` | boolean | `false` | Open the sidebar automatically on overlay startup |
| `layout` | string | `"overlay"` | `"overlay"` floats over mpv; `"embedded"` reserves right-side player space |
| `toggleKey` | string | `"Backslash"` | `KeyboardEvent.code` for the toggle shortcut |
| `pauseVideoOnHover` | boolean | `false` | Pause playback while hovering the cue list |
| `autoScroll` | boolean | `true` | Keep the active cue in view during playback |
| `maxWidth` | number | `420` | Maximum sidebar width in CSS pixels |
| `opacity` | number | `0.95` | Sidebar opacity between `0` and `1` |
| `backgroundColor` | string | — | Sidebar shell background color |
| `textColor` | string | — | Default cue text color |
| `fontFamily` | string | — | CSS `font-family` applied to cue text |
| `fontSize` | number | `16` | Base cue font size in CSS pixels |
| `timestampColor` | string | — | Cue timestamp color |
| `activeLineColor` | string | — | Active cue text color |
| `activeLineBackgroundColor` | string | — | Active cue background color |
| `hoverLineBackgroundColor` | string | — | Hovered cue background color |
Default colors use Catppuccin Macchiato with a semi-transparent shell so the panel stays readable without feeling like a solid overlay.
## Keyboard Shortcut
| Key | Action | Config key |
| --- | ----------------------- | ------------------------------ |
| `\` | Toggle subtitle sidebar | `subtitleSidebar.toggleKey` |
The toggle is overlay-local and only opens when SubMiner has a parsed cue list for the active subtitle source. See [Keyboard Shortcuts](/shortcuts) for the full shortcut reference.

View File

@@ -17,6 +17,41 @@ When SubMiner detects a YouTube URL (or `ytsearch:` target), it pauses mpv at st
4. **Download** --- Selected tracks are fetched via direct URL when available, falling back to `yt-dlp --write-subs` / `--write-auto-subs`. YouTube TimedText XML formats (`srv1`/`srv2`/`srv3`) are converted to VTT on the fly. Auto-generated VTT captions are normalized to remove rolling-caption duplication.
5. **Load** --- Subtitle files are injected into mpv via `sub-add`. Playback resumes once the primary track is ready; secondary failures do not block.
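The TimedText-to-VTT conversion in step 4 boils down to rewriting cue timings. A simplified sketch of the timestamp math (helper names assumed, not the actual converter):

```typescript
// TimedText cues carry start/duration in milliseconds; WebVTT wants
// HH:MM:SS.mmm ranges. Simplified stand-in for illustration.
export function toVttTimestamp(ms: number): string {
  const h = Math.floor(ms / 3_600_000);
  const m = Math.floor((ms % 3_600_000) / 60_000);
  const s = Math.floor((ms % 60_000) / 1000);
  const frac = ms % 1000;
  const pad = (n: number, w = 2) => String(n).padStart(w, '0');
  return `${pad(h)}:${pad(m)}:${pad(s)}.${pad(frac, 3)}`;
}

export function toVttCue(startMs: number, durMs: number, text: string): string {
  return `${toVttTimestamp(startMs)} --> ${toVttTimestamp(startMs + durMs)}\n${text}\n`;
}
```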
## Pipeline Diagram
```mermaid
flowchart TD
classDef step fill:#c6a0f6,stroke:#494d64,color:#24273a
classDef action fill:#8aadf4,stroke:#494d64,color:#24273a
classDef result fill:#a6da95,stroke:#494d64,color:#24273a
classDef enrich fill:#8bd5ca,stroke:#494d64,color:#24273a
classDef ext fill:#eed49f,stroke:#494d64,color:#24273a
A[YouTube URL detected]:::step
B[yt-dlp probe]:::ext
C[Track discovery]:::action
D{Auto or manual selection?}:::step
E[Auto-select best tracks]:::action
F[Manual picker — Ctrl+Alt+C]:::action
G[Download subtitle files]:::action
H[Convert TimedText to VTT]:::enrich
I[Normalize auto-caption duplicates]:::enrich
K[sub-add into mpv]:::action
L[Overlay renders subtitles]:::result
A --> B
B --> C
C --> D
D -- startup --> E
D -- user request --> F
E --> G
F --> G
G --> H
H --> I
I --> K
K --> L
```
## Auto-Load Flow
On startup with a YouTube URL:

View File

@@ -1,6 +1,6 @@
{
"name": "subminer",
"version": "0.10.0",
"description": "All-in-one sentence mining overlay with AnkiConnect and dictionary integration",
"packageManager": "bun@1.3.5",
"main": "dist/main-entry.js",
@@ -100,9 +100,9 @@
"dependencies": {
"@fontsource-variable/geist": "^5.2.8",
"@fontsource-variable/geist-mono": "^5.2.7",
"@xhayper/discord-rpc": "^1.3.3",
"axios": "^1.13.5",
"commander": "^14.0.3",
"hono": "^4.12.7",
"jsonc-parser": "^3.3.1",
"libsql": "^0.5.22",

View File

@@ -153,6 +153,9 @@ function M.create(ctx)
local function notify_auto_play_ready()
release_auto_play_ready_gate("tokenization-ready")
if state.suppress_ready_overlay_restore then
return
end
if state.overlay_running and resolve_visible_overlay_startup() then
run_control_command_async("show-visible-overlay", {
socket_path = opts.socket_path,
@@ -287,6 +290,9 @@ function M.create(ctx)
local function start_overlay(overrides)
overrides = overrides or {}
if overrides.auto_start_trigger == true then
state.suppress_ready_overlay_restore = false
end
if not binary.ensure_binary_available() then
subminer_log("error", "binary", "SubMiner binary not found")
@@ -433,6 +439,7 @@ function M.create(ctx)
subminer_log("error", "binary", "SubMiner binary not found")
return
end
state.suppress_ready_overlay_restore = true
run_control_command_async("hide-visible-overlay", nil, function(ok, result)
if ok then
@@ -456,8 +463,9 @@ function M.create(ctx)
show_osd("Error: binary not found")
return
end
state.suppress_ready_overlay_restore = true
run_control_command_async("toggle-visible-overlay", nil, function(ok)
if not ok then
subminer_log("warn", "process", "Toggle command failed")
show_osd("Toggle failed")

View File

@@ -32,6 +32,7 @@ function M.new()
auto_play_ready_gate_armed = false,
auto_play_ready_timeout = nil,
auto_play_ready_osd_timer = nil,
suppress_ready_overlay_restore = false,
}
end

View File

@@ -239,19 +239,6 @@ release_real_runtime_lease() {
fi
}
compute_final_status() {
if [[ "$FAILED" == "1" ]]; then
FINAL_STATUS="failed"
@@ -403,6 +390,8 @@ REAL_RUNTIME_LEASE_DIR=""
REAL_RUNTIME_LEASE_ERROR=""
PATH_SELECTION_MODE="auto"
trap 'release_real_runtime_lease' EXIT
while [[ $# -gt 0 ]]; do
case "$1" in
--lane)
@@ -513,9 +502,6 @@ for lane in "${SELECTED_LANES[@]}"; do
record_blocked_step "$lane" "real-runtime-lease" "$REAL_RUNTIME_LEASE_ERROR"
continue
fi
helper=$(find_real_runtime_helper || true)
if [[ -z "${helper:-}" ]]; then
record_blocked_step "$lane" "real-runtime-helper" "no real-runtime helper script available in $SCRIPT_DIR"

View File

@@ -1,7 +1,8 @@
import assert from 'node:assert/strict';
import { resolve } from 'node:path';
import test from 'node:test';
import { mergeLcovReports } from './run-coverage-lane';
import { mergeLcovReports, resolveCoverageDir } from './run-coverage-lane';
test('mergeLcovReports combines duplicate source-file counters across shard outputs', () => {
const merged = mergeLcovReports([
@@ -59,3 +60,15 @@ test('mergeLcovReports keeps distinct source files as separate records', () => {
assert.match(merged, /SF:src\/a\.ts[\s\S]*end_of_record/);
assert.match(merged, /SF:src\/b\.ts[\s\S]*end_of_record/);
});
test('resolveCoverageDir keeps coverage output inside the repository', () => {
const repoRoot = resolve('/tmp', 'subminer-repo-root');
assert.equal(resolveCoverageDir(repoRoot, []), resolve(repoRoot, 'coverage'));
assert.equal(
resolveCoverageDir(repoRoot, ['--coverage-dir', 'coverage/test-src']),
resolve(repoRoot, 'coverage/test-src'),
);
assert.throws(() => resolveCoverageDir(repoRoot, ['--coverage-dir', '../escape']));
assert.throws(() => resolveCoverageDir(repoRoot, ['--coverage-dir', '/tmp/escape']));
});

View File

@@ -1,6 +1,6 @@
import { existsSync, mkdirSync, readFileSync, readdirSync, rmSync, writeFileSync } from 'node:fs';
import { spawnSync } from 'node:child_process';
import { join, relative, resolve } from 'node:path';
import { isAbsolute, join, relative, resolve } from 'node:path';
type LaneConfig = {
roots: string[];
@@ -85,6 +85,15 @@ function parseCoverageDirArg(argv: string[]): string {
return 'coverage';
}
export function resolveCoverageDir(repoRootDir: string, argv: string[]): string {
const candidate = resolve(repoRootDir, parseCoverageDirArg(argv));
const rel = relative(repoRootDir, candidate);
if (isAbsolute(rel) || rel.startsWith('..')) {
throw new Error(`--coverage-dir must be within repository: ${candidate}`);
}
return candidate;
}
function parseLcovReport(report: string): LcovRecord[] {
const records: LcovRecord[] = [];
let current: LcovRecord | null = null;
@@ -251,7 +260,7 @@ function runCoverageLane(): number {
return 1;
}
const coverageDir = resolveCoverageDir(repoRoot, process.argv.slice(3));
const shardRoot = join(coverageDir, '.shards');
mkdirSync(coverageDir, { recursive: true });
rmSync(shardRoot, { recursive: true, force: true });
@@ -260,39 +269,43 @@ function runCoverageLane(): number {
const files = getLaneFiles(laneName);
const reports: string[] = [];
try {
for (const [index, file] of files.entries()) {
const shardDir = join(shardRoot, `${String(index + 1).padStart(3, '0')}`);
const result = spawnSync(
'bun',
['test', '--coverage', '--coverage-reporter=lcov', '--coverage-dir', shardDir, `./${file}`],
{
cwd: repoRoot,
stdio: 'inherit',
},
);
if (result.error) {
throw result.error;
}
if ((result.status ?? 1) !== 0) {
return result.status ?? 1;
}
const lcovPath = join(shardDir, 'lcov.info');
if (!existsSync(lcovPath)) {
process.stdout.write(`Skipping empty coverage shard for ${file}\n`);
continue;
}
reports.push(readFileSync(lcovPath, 'utf8'));
}
writeFileSync(join(coverageDir, 'lcov.info'), mergeLcovReports(reports), 'utf8');
process.stdout.write(`Merged LCOV written to ${relative(repoRoot, join(coverageDir, 'lcov.info'))}\n`);
return 0;
} finally {
rmSync(shardRoot, { recursive: true, force: true });
}
}
// @ts-ignore Bun entrypoint detection; TS config for scripts still targets CommonJS.
if (import.meta.main) {
process.exit(runCoverageLane());
}

View File

@@ -822,6 +822,92 @@ do
)
end
do
local recorded, err = run_plugin_scenario({
process_list = "",
option_overrides = {
binary_path = binary_path,
auto_start = "yes",
auto_start_visible_overlay = "yes",
auto_start_pause_until_ready = "yes",
socket_path = "/tmp/subminer-socket",
},
input_ipc_server = "/tmp/subminer-socket",
media_title = "Random Movie",
files = {
[binary_path] = true,
},
})
assert_true(recorded ~= nil, "plugin failed to load for manual toggle-off ready scenario: " .. tostring(err))
fire_event(recorded, "file-loaded")
assert_true(recorded.script_messages["subminer-toggle"] ~= nil, "subminer-toggle script message not registered")
recorded.script_messages["subminer-toggle"]()
assert_true(
count_control_calls(recorded.async_calls, "--toggle-visible-overlay") == 1,
"manual toggle should use explicit visible-overlay toggle command"
)
recorded.script_messages["subminer-autoplay-ready"]()
assert_true(
count_control_calls(recorded.async_calls, "--show-visible-overlay") == 1,
"manual toggle-off before readiness should suppress ready-time visible overlay restore"
)
end
do
local recorded, err = run_plugin_scenario({
process_list = "",
option_overrides = {
binary_path = binary_path,
auto_start = "yes",
auto_start_visible_overlay = "yes",
auto_start_pause_until_ready = "yes",
socket_path = "/tmp/subminer-socket",
},
input_ipc_server = "/tmp/subminer-socket",
media_title = "Random Movie",
files = {
[binary_path] = true,
},
})
assert_true(
recorded ~= nil,
"plugin failed to load for repeated ready restore suppression scenario: " .. tostring(err)
)
fire_event(recorded, "file-loaded")
assert_true(recorded.script_messages["subminer-toggle"] ~= nil, "subminer-toggle script message not registered")
recorded.script_messages["subminer-toggle"]()
recorded.script_messages["subminer-autoplay-ready"]()
recorded.script_messages["subminer-autoplay-ready"]()
assert_true(
count_control_calls(recorded.async_calls, "--show-visible-overlay") == 1,
"manual toggle-off should suppress repeated ready-time visible overlay restores for the same session"
)
end
do
local recorded, err = run_plugin_scenario({
process_list = "",
option_overrides = {
binary_path = binary_path,
auto_start = "no",
},
files = {
[binary_path] = true,
},
})
assert_true(recorded ~= nil, "plugin failed to load for manual toggle command scenario: " .. tostring(err))
assert_true(recorded.script_messages["subminer-toggle"] ~= nil, "subminer-toggle script message not registered")
recorded.script_messages["subminer-toggle"]()
assert_true(
count_control_calls(recorded.async_calls, "--toggle-visible-overlay") == 1,
"script-message toggle should issue explicit visible-overlay toggle command"
)
assert_true(
count_control_calls(recorded.async_calls, "--toggle") == 0,
"script-message toggle should not issue legacy generic toggle command"
)
end
do
local recorded, err = run_plugin_scenario({
process_list = "",

View File

@@ -129,6 +129,7 @@ export const INTEGRATIONS_DEFAULT_CONFIG: Pick<
},
discordPresence: {
enabled: false,
presenceStyle: 'default' as const,
updateIntervalMs: 3_000,
debounceMs: 750,
},

View File

@@ -323,6 +323,13 @@ export function buildIntegrationConfigOptionRegistry(
defaultValue: defaultConfig.discordPresence.enabled,
description: 'Enable optional Discord Rich Presence updates.',
},
{
path: 'discordPresence.presenceStyle',
kind: 'string',
defaultValue: defaultConfig.discordPresence.presenceStyle,
description:
'Presence card text preset: "default" (clean bilingual), "meme" (Mining and crafting), "japanese" (fully JP), or "minimal".',
},
{
path: 'discordPresence.updateIntervalMs',
kind: 'number',

View File

@@ -443,13 +443,23 @@ test('handleCliCommand still runs non-start actions on second-instance', () => {
);
});
test('handleCliCommand does not connect MPV for pure toggle on second-instance', () => {
const { deps, calls } = createDeps();
handleCliCommand(makeArgs({ toggle: true }), 'second-instance', deps);
assert.ok(calls.includes('toggleVisibleOverlay'));
assert.equal(
calls.some((value) => value === 'connectMpvClient'),
false,
);
});
test('handleCliCommand does not connect MPV for explicit visible-overlay toggle', () => {
const { deps, calls } = createDeps();
handleCliCommand(makeArgs({ toggleVisibleOverlay: true }), 'second-instance', deps);
assert.ok(calls.includes('toggleVisibleOverlay'));
assert.equal(
calls.some((value) => value === 'connectMpvClient'),
false,
);
});

View File

@@ -271,7 +271,7 @@ export function handleCliCommand(
const reuseSecondInstanceStart =
source === 'second-instance' && args.start && deps.isOverlayRuntimeInitialized();
const shouldConnectMpv = args.start;
const needsOverlayRuntime = commandNeedsOverlayRuntime(args);
const shouldInitializeOverlayRuntime = needsOverlayRuntime || args.start;
@@ -302,7 +302,7 @@ export function handleCliCommand(
deps.initializeOverlayRuntime();
}
if (shouldConnectMpv && deps.hasMpvClient()) {
const socketPath = deps.getMpvSocketPath();
deps.setMpvClientSocketPath(socketPath);
deps.connectMpvClient();

View File

@@ -10,6 +10,7 @@ import {
const baseConfig = {
enabled: true,
presenceStyle: 'default' as const,
updateIntervalMs: 10_000,
debounceMs: 200,
} as const;
@@ -27,24 +28,67 @@ const baseSnapshot: DiscordPresenceSnapshot = {
sessionStartedAtMs: BASE_SESSION_STARTED_AT_MS,
};
test('buildDiscordPresenceActivity maps polished payload fields (default style)', () => {
const payload = buildDiscordPresenceActivity(baseConfig, baseSnapshot);
assert.equal(payload.details, 'Sousou no Frieren E01');
assert.equal(payload.state, 'Playing 01:35 / 24:10');
assert.equal(payload.largeImageKey, 'subminer-logo');
assert.equal(payload.smallImageKey, 'study');
assert.equal(payload.smallImageText, '日本語学習中');
assert.equal(payload.buttons, undefined);
assert.equal(payload.startTimestamp, Math.floor(BASE_SESSION_STARTED_AT_MS / 1000));
});
test('buildDiscordPresenceActivity falls back to idle with default style', () => {
const payload = buildDiscordPresenceActivity(baseConfig, {
...baseSnapshot,
connected: false,
mediaPath: null,
});
assert.equal(payload.state, 'Idle');
assert.equal(payload.details, 'Sentence Mining');
});
test('buildDiscordPresenceActivity uses meme style fallback', () => {
const memeConfig = { ...baseConfig, presenceStyle: 'meme' as const };
const payload = buildDiscordPresenceActivity(memeConfig, {
...baseSnapshot,
connected: false,
mediaPath: null,
});
assert.equal(payload.details, 'Mining and crafting (Anki cards)');
assert.equal(payload.smallImageText, 'Sentence Mining');
});
test('buildDiscordPresenceActivity uses japanese style', () => {
const jpConfig = { ...baseConfig, presenceStyle: 'japanese' as const };
const payload = buildDiscordPresenceActivity(jpConfig, {
...baseSnapshot,
connected: false,
mediaPath: null,
});
assert.equal(payload.details, '文の採掘中');
assert.equal(payload.smallImageText, 'イマージョン学習');
});
test('buildDiscordPresenceActivity uses minimal style', () => {
const minConfig = { ...baseConfig, presenceStyle: 'minimal' as const };
const payload = buildDiscordPresenceActivity(minConfig, {
...baseSnapshot,
connected: false,
mediaPath: null,
});
assert.equal(payload.details, 'SubMiner');
assert.equal(payload.smallImageKey, undefined);
assert.equal(payload.smallImageText, undefined);
});
test('buildDiscordPresenceActivity shows media title regardless of style', () => {
for (const presenceStyle of ['default', 'meme', 'japanese', 'minimal'] as const) {
const payload = buildDiscordPresenceActivity({ ...baseConfig, presenceStyle }, baseSnapshot);
assert.equal(payload.details, 'Sousou no Frieren E01');
assert.equal(payload.state, 'Playing 01:35 / 24:10');
}
});
test('service deduplicates identical updates and sends changed timeline', async () => {

View File

@@ -1,3 +1,4 @@
import type { DiscordPresenceStylePreset } from '../../types/integrations';
import type { ResolvedConfig } from '../../types';
export interface DiscordPresenceSnapshot {
@@ -33,15 +34,58 @@ type DiscordClient = {
type TimeoutLike = ReturnType<typeof setTimeout>;
interface PresenceStyleDefinition {
fallbackDetails: string;
largeImageKey: string;
largeImageText: string;
smallImageKey: string;
smallImageText: string;
buttonLabel: string;
buttonUrl: string;
}
const PRESENCE_STYLES: Record<DiscordPresenceStylePreset, PresenceStyleDefinition> = {
default: {
fallbackDetails: 'Sentence Mining',
largeImageKey: 'subminer-logo',
largeImageText: 'SubMiner',
smallImageKey: 'study',
smallImageText: '日本語学習中',
buttonLabel: '',
buttonUrl: '',
},
meme: {
fallbackDetails: 'Mining and crafting (Anki cards)',
largeImageKey: 'subminer-logo',
largeImageText: 'SubMiner',
smallImageKey: 'study',
smallImageText: 'Sentence Mining',
buttonLabel: '',
buttonUrl: '',
},
japanese: {
fallbackDetails: '文の採掘中',
largeImageKey: 'subminer-logo',
largeImageText: 'SubMiner',
smallImageKey: 'study',
smallImageText: 'イマージョン学習',
buttonLabel: '',
buttonUrl: '',
},
minimal: {
fallbackDetails: 'SubMiner',
largeImageKey: 'subminer-logo',
largeImageText: 'SubMiner',
smallImageKey: '',
smallImageText: '',
buttonLabel: '',
buttonUrl: '',
},
};
function resolvePresenceStyle(preset: DiscordPresenceStylePreset | undefined): PresenceStyleDefinition {
return PRESENCE_STYLES[preset ?? 'default'] ?? PRESENCE_STYLES.default;
}
function trimField(value: string, maxLength = 128): string {
if (value.length <= maxLength) return value;
@@ -79,15 +123,16 @@ function formatClock(totalSeconds: number | null | undefined): string {
}
export function buildDiscordPresenceActivity(
config: DiscordPresenceConfig,
snapshot: DiscordPresenceSnapshot,
): DiscordActivityPayload {
const style = resolvePresenceStyle(config.presenceStyle);
const status = buildStatus(snapshot);
const title = sanitizeText(snapshot.mediaTitle, basename(snapshot.mediaPath) || 'Unknown media');
const details =
snapshot.connected && snapshot.mediaPath
? trimField(title)
: style.fallbackDetails;
const timeline = `${formatClock(snapshot.currentTimeSec)} / ${formatClock(snapshot.mediaDurationSec)}`;
const state =
snapshot.connected && snapshot.mediaPath
@@ -100,26 +145,26 @@ export function buildDiscordPresenceActivity(
startTimestamp: Math.floor(snapshot.sessionStartedAtMs / 1000),
};
if (style.largeImageKey.trim().length > 0) {
activity.largeImageKey = style.largeImageKey.trim();
}
if (style.largeImageText.trim().length > 0) {
activity.largeImageText = trimField(style.largeImageText.trim());
}
if (style.smallImageKey.trim().length > 0) {
activity.smallImageKey = style.smallImageKey.trim();
}
if (style.smallImageText.trim().length > 0) {
activity.smallImageText = trimField(style.smallImageText.trim());
}
if (
style.buttonLabel.trim().length > 0 &&
/^https?:\/\//.test(style.buttonUrl.trim())
) {
activity.buttons = [
{
label: trimField(style.buttonLabel.trim(), 32),
url: style.buttonUrl.trim(),
},
];
}


@@ -77,10 +77,6 @@ function makeDbPath(): string {
return path.join(dir, 'immersion.sqlite');
}
- function stripDbMsSuffix(value: string | null | undefined): string {
- return (value ?? '0').replace(/\.0$/, '');
- }
function cleanupDbPath(dbPath: string): void {
const dir = path.dirname(dbPath);
if (!fs.existsSync(dir)) {
@@ -189,7 +185,7 @@ test('destroy finalizes active session and persists final telemetry', async () =
const db = new Database(dbPath);
const sessionRow = db.prepare('SELECT ended_at_ms FROM imm_sessions LIMIT 1').get() as {
- ended_at_ms: string | null;
+ ended_at_ms: number | null;
} | null;
const telemetryCountRow = db
.prepare('SELECT COUNT(*) AS total FROM imm_session_telemetry')
@@ -197,7 +193,7 @@ test('destroy finalizes active session and persists final telemetry', async () =
db.close();
assert.ok(sessionRow);
- assert.ok(BigInt(stripDbMsSuffix(sessionRow?.ended_at_ms)) > 0n);
+ assert.ok(Number(sessionRow?.ended_at_ms ?? 0) > 0);
assert.ok(Number(telemetryCountRow.total) >= 2);
} finally {
tracker?.destroy();
@@ -508,7 +504,7 @@ test('rebuildLifetimeSummaries backfills retained ended sessions and resets stal
episodes_started: number;
episodes_completed: number;
anime_completed: number;
- last_rebuilt_ms: string | null;
+ last_rebuilt_ms: number | null;
} | null;
const appliedSessions = rebuildApi.db
.prepare('SELECT COUNT(*) AS total FROM imm_lifetime_applied_sessions')
@@ -522,7 +518,7 @@ test('rebuildLifetimeSummaries backfills retained ended sessions and resets stal
assert.equal(globalRow?.episodes_started, 2);
assert.equal(globalRow?.episodes_completed, 2);
assert.equal(globalRow?.anime_completed, 1);
- assert.ok(BigInt(stripDbMsSuffix(globalRow?.last_rebuilt_ms)) > 0n);
+ assert.equal(globalRow?.last_rebuilt_ms, rebuild.rebuiltAtMs);
assert.equal(appliedSessions?.total, 2);
} finally {
tracker?.destroy();
@@ -728,8 +724,24 @@ test('startup finalizes stale active sessions and applies lifetime summaries', a
tracker.destroy();
tracker = new Ctor({ dbPath });
- const verificationDb = new Database(dbPath);
- const globalRow = verificationDb
+ const restartedApi = tracker as unknown as { db: DatabaseSync };
+ const sessionRow = restartedApi.db
+ .prepare(
+ `
+ SELECT ended_at_ms, status, ended_media_ms, active_watched_ms, tokens_seen, cards_mined
+ FROM imm_sessions
+ WHERE session_id = 1
+ `,
+ )
+ .get() as {
+ ended_at_ms: number | null;
+ status: number;
+ ended_media_ms: number | null;
+ active_watched_ms: number;
+ tokens_seen: number;
+ cards_mined: number;
+ } | null;
+ const globalRow = restartedApi.db
.prepare(
`
SELECT total_sessions, total_active_ms, total_cards, active_days, episodes_started,
@@ -746,13 +758,23 @@ test('startup finalizes stale active sessions and applies lifetime summaries', a
episodes_started: number;
episodes_completed: number;
} | null;
- const mediaRows = verificationDb
+ const mediaRows = restartedApi.db
.prepare('SELECT COUNT(*) AS total FROM imm_lifetime_media')
.get() as { total: number } | null;
- const animeRows = verificationDb
+ const animeRows = restartedApi.db
.prepare('SELECT COUNT(*) AS total FROM imm_lifetime_anime')
.get() as { total: number } | null;
- verificationDb.close();
+ const appliedRows = restartedApi.db
+ .prepare('SELECT COUNT(*) AS total FROM imm_lifetime_applied_sessions')
+ .get() as { total: number } | null;
+ assert.ok(sessionRow);
+ assert.ok(Number(sessionRow?.ended_at_ms ?? 0) >= sampleMs);
+ assert.equal(sessionRow?.status, 2);
+ assert.equal(sessionRow?.ended_media_ms, 321_000);
+ assert.equal(sessionRow?.active_watched_ms, 4000);
+ assert.equal(sessionRow?.tokens_seen, 120);
+ assert.equal(sessionRow?.cards_mined, 2);
assert.ok(globalRow);
assert.equal(globalRow?.total_sessions, 1);
@@ -763,6 +785,7 @@ test('startup finalizes stale active sessions and applies lifetime summaries', a
assert.equal(globalRow?.episodes_completed, 1);
assert.equal(mediaRows?.total, 1);
assert.equal(animeRows?.total, 1);
+ assert.equal(appliedRows?.total, 1);
} finally {
tracker?.destroy();
cleanupDbPath(dbPath);
@@ -1567,12 +1590,12 @@ test('applies configurable queue, flush, and retention policy', async () => {
queueCap: number;
maxPayloadBytes: number;
maintenanceIntervalMs: number;
- eventsRetentionMs: string | null;
- telemetryRetentionMs: string | null;
- sessionsRetentionMs: string | null;
- dailyRollupRetentionMs: string | null;
- monthlyRollupRetentionMs: string | null;
- vacuumIntervalMs: string | null;
+ eventsRetentionMs: number;
+ telemetryRetentionMs: number;
+ sessionsRetentionMs: number;
+ dailyRollupRetentionMs: number;
+ monthlyRollupRetentionMs: number;
+ vacuumIntervalMs: number;
};
assert.equal(privateApi.batchSize, 10);
@@ -1580,12 +1603,12 @@ test('applies configurable queue, flush, and retention policy', async () => {
assert.equal(privateApi.queueCap, 1500);
assert.equal(privateApi.maxPayloadBytes, 512);
assert.equal(privateApi.maintenanceIntervalMs, 7_200_000);
- assert.equal(privateApi.eventsRetentionMs, '1209600000');
- assert.equal(privateApi.telemetryRetentionMs, '3888000000');
- assert.equal(privateApi.sessionsRetentionMs, '5184000000');
- assert.equal(privateApi.dailyRollupRetentionMs, '63072000000');
- assert.equal(privateApi.monthlyRollupRetentionMs, '315360000000');
- assert.equal(privateApi.vacuumIntervalMs, '1209600000');
+ assert.equal(privateApi.eventsRetentionMs, 14 * 86_400_000);
+ assert.equal(privateApi.telemetryRetentionMs, 45 * 86_400_000);
+ assert.equal(privateApi.sessionsRetentionMs, 60 * 86_400_000);
+ assert.equal(privateApi.dailyRollupRetentionMs, 730 * 86_400_000);
+ assert.equal(privateApi.monthlyRollupRetentionMs, 3650 * 86_400_000);
+ assert.equal(privateApi.vacuumIntervalMs, 14 * 86_400_000);
} finally {
tracker?.destroy();
cleanupDbPath(dbPath);
@@ -1615,21 +1638,21 @@ test('zero retention days disables prune checks while preserving rollups', async
const privateApi = tracker as unknown as {
runMaintenance: () => void;
db: DatabaseSync;
- eventsRetentionMs: string | null;
- telemetryRetentionMs: string | null;
- sessionsRetentionMs: string | null;
- dailyRollupRetentionMs: string | null;
- monthlyRollupRetentionMs: string | null;
- vacuumIntervalMs: string | null;
+ eventsRetentionMs: number;
+ telemetryRetentionMs: number;
+ sessionsRetentionMs: number;
+ dailyRollupRetentionMs: number;
+ monthlyRollupRetentionMs: number;
+ vacuumIntervalMs: number;
lastVacuumMs: number;
};
- assert.equal(privateApi.eventsRetentionMs, null);
- assert.equal(privateApi.telemetryRetentionMs, null);
- assert.equal(privateApi.sessionsRetentionMs, null);
- assert.equal(privateApi.dailyRollupRetentionMs, null);
- assert.equal(privateApi.monthlyRollupRetentionMs, null);
- assert.equal(privateApi.vacuumIntervalMs, null);
+ assert.equal(privateApi.eventsRetentionMs, Number.POSITIVE_INFINITY);
+ assert.equal(privateApi.telemetryRetentionMs, Number.POSITIVE_INFINITY);
+ assert.equal(privateApi.sessionsRetentionMs, Number.POSITIVE_INFINITY);
+ assert.equal(privateApi.dailyRollupRetentionMs, Number.POSITIVE_INFINITY);
+ assert.equal(privateApi.monthlyRollupRetentionMs, Number.POSITIVE_INFINITY);
+ assert.equal(privateApi.vacuumIntervalMs, Number.POSITIVE_INFINITY);
assert.equal(privateApi.lastVacuumMs, 0);
const nowMs = trackerNowMs();


@@ -101,13 +101,18 @@ import {
import { DEFAULT_MIN_WATCH_RATIO } from '../../shared/watch-threshold';
import { enqueueWrite } from './immersion-tracker/queue';
import { nowMs } from './immersion-tracker/time';
- import { toDbMs } from './immersion-tracker/query-shared';
import {
DEFAULT_BATCH_SIZE,
+ DEFAULT_DAILY_ROLLUP_RETENTION_MS,
+ DEFAULT_EVENTS_RETENTION_MS,
DEFAULT_FLUSH_INTERVAL_MS,
DEFAULT_MAINTENANCE_INTERVAL_MS,
DEFAULT_MAX_PAYLOAD_BYTES,
+ DEFAULT_MONTHLY_ROLLUP_RETENTION_MS,
DEFAULT_QUEUE_CAP,
+ DEFAULT_SESSIONS_RETENTION_MS,
+ DEFAULT_TELEMETRY_RETENTION_MS,
+ DEFAULT_VACUUM_INTERVAL_MS,
EVENT_CARD_MINED,
EVENT_LOOKUP,
EVENT_MEDIA_BUFFER,
@@ -301,12 +306,12 @@ export class ImmersionTrackerService {
private readonly flushIntervalMs: number;
private readonly maintenanceIntervalMs: number;
private readonly maxPayloadBytes: number;
- private readonly eventsRetentionMs: string | null;
- private readonly telemetryRetentionMs: string | null;
- private readonly sessionsRetentionMs: string | null;
- private readonly dailyRollupRetentionMs: string | null;
- private readonly monthlyRollupRetentionMs: string | null;
- private readonly vacuumIntervalMs: string | null;
+ private readonly eventsRetentionMs: number;
+ private readonly telemetryRetentionMs: number;
+ private readonly sessionsRetentionMs: number;
+ private readonly dailyRollupRetentionMs: number;
+ private readonly monthlyRollupRetentionMs: number;
+ private readonly vacuumIntervalMs: number;
private readonly dbPath: string;
private readonly writeLock = { locked: false };
private flushTimer: ReturnType<typeof setTimeout> | null = null;
@@ -338,12 +343,6 @@ export class ImmersionTrackerService {
}
const policy = options.policy ?? {};
- const DEFAULT_EVENTS_RETENTION_DAYS = 7;
- const DEFAULT_TELEMETRY_RETENTION_DAYS = 30;
- const DEFAULT_SESSIONS_RETENTION_DAYS = 30;
- const DEFAULT_DAILY_ROLLUP_RETENTION_DAYS = 365;
- const DEFAULT_MONTHLY_ROLLUP_RETENTION_DAYS = 5 * 365;
- const DEFAULT_VACUUM_INTERVAL_DAYS = 7;
this.queueCap = resolveBoundedInt(policy.queueCap, DEFAULT_QUEUE_CAP, 100, 100_000);
this.batchSize = resolveBoundedInt(policy.batchSize, DEFAULT_BATCH_SIZE, 1, 10_000);
this.flushIntervalMs = resolveBoundedInt(
@@ -368,43 +367,42 @@ export class ImmersionTrackerService {
const retention = policy.retention ?? {};
const daysToRetentionMs = (
value: number | undefined,
- fallbackDays: number,
+ fallbackMs: number,
maxDays: number,
- ): string | null => {
+ ): number => {
+ const fallbackDays = Math.floor(fallbackMs / 86_400_000);
const resolvedDays = resolveBoundedInt(value, fallbackDays, 0, maxDays);
- return resolvedDays === 0
- ? null
- : (BigInt(`${resolvedDays}`) * 86_400_000n).toString();
+ return resolvedDays === 0 ? Number.POSITIVE_INFINITY : resolvedDays * 86_400_000;
};
this.eventsRetentionMs = daysToRetentionMs(
retention.eventsDays,
- DEFAULT_EVENTS_RETENTION_DAYS,
+ DEFAULT_EVENTS_RETENTION_MS,
3650,
);
this.telemetryRetentionMs = daysToRetentionMs(
retention.telemetryDays,
- DEFAULT_TELEMETRY_RETENTION_DAYS,
+ DEFAULT_TELEMETRY_RETENTION_MS,
3650,
);
this.sessionsRetentionMs = daysToRetentionMs(
retention.sessionsDays,
- DEFAULT_SESSIONS_RETENTION_DAYS,
+ DEFAULT_SESSIONS_RETENTION_MS,
3650,
);
this.dailyRollupRetentionMs = daysToRetentionMs(
retention.dailyRollupsDays,
- DEFAULT_DAILY_ROLLUP_RETENTION_DAYS,
+ DEFAULT_DAILY_ROLLUP_RETENTION_MS,
36500,
);
this.monthlyRollupRetentionMs = daysToRetentionMs(
retention.monthlyRollupsDays,
- DEFAULT_MONTHLY_ROLLUP_RETENTION_DAYS,
+ DEFAULT_MONTHLY_ROLLUP_RETENTION_MS,
36500,
);
this.vacuumIntervalMs = daysToRetentionMs(
retention.vacuumIntervalDays,
- DEFAULT_VACUUM_INTERVAL_DAYS,
+ DEFAULT_VACUUM_INTERVAL_MS,
3650,
);
this.db = new Database(this.dbPath);
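The hunk above replaces the old string/BigInt retention representation with plain numbers, using `Number.POSITIVE_INFINITY` for a disabled window. A standalone sketch of that conversion — with a hypothetical `clampInt` standing in for the repo's `resolveBoundedInt`, whose exact signature is not shown in this diff:

```typescript
// Sketch of the days→ms policy conversion, assuming the contract shown in the
// diff: 0 days disables a retention window, modelled as POSITIVE_INFINITY so a
// plain `now - last >= interval` comparison never fires for disabled windows.
const DAY_MS = 86_400_000;

// Hypothetical stand-in for the repo's resolveBoundedInt helper.
function clampInt(value: number | undefined, fallback: number, min: number, max: number): number {
  if (value === undefined || !Number.isFinite(value)) return fallback;
  return Math.min(max, Math.max(min, Math.floor(value)));
}

function daysToRetentionMs(value: number | undefined, fallbackMs: number, maxDays: number): number {
  const fallbackDays = Math.floor(fallbackMs / DAY_MS);
  const resolvedDays = clampInt(value, fallbackDays, 0, maxDays);
  return resolvedDays === 0 ? Number.POSITIVE_INFINITY : resolvedDays * DAY_MS;
}

console.log(daysToRetentionMs(14, 7 * DAY_MS, 3650)); // 1209600000 (14 days)
console.log(daysToRetentionMs(0, 7 * DAY_MS, 3650)); // Infinity (disabled)
```

This is why the maintenance path can drop its `!== null` checks: an infinite interval simply never elapses.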
@@ -1598,9 +1596,9 @@ export class ImmersionTrackerService {
const maintenanceNowMs = nowMs();
this.runRollupMaintenance(false);
if (
- this.eventsRetentionMs !== null ||
- this.telemetryRetentionMs !== null ||
- this.sessionsRetentionMs !== null
+ Number.isFinite(this.eventsRetentionMs) ||
+ Number.isFinite(this.telemetryRetentionMs) ||
+ Number.isFinite(this.sessionsRetentionMs)
) {
pruneRawRetention(this.db, maintenanceNowMs, {
eventsRetentionMs: this.eventsRetentionMs,
@@ -1609,8 +1607,8 @@ export class ImmersionTrackerService {
});
}
if (
- this.dailyRollupRetentionMs !== null ||
- this.monthlyRollupRetentionMs !== null
+ Number.isFinite(this.dailyRollupRetentionMs) ||
+ Number.isFinite(this.monthlyRollupRetentionMs)
) {
pruneRollupRetention(this.db, maintenanceNowMs, {
dailyRollupRetentionMs: this.dailyRollupRetentionMs,
@@ -1619,9 +1617,8 @@ export class ImmersionTrackerService {
}
if (
- this.vacuumIntervalMs !== null &&
- BigInt(toDbMs(maintenanceNowMs)) - BigInt(toDbMs(this.lastVacuumMs)) >=
- BigInt(this.vacuumIntervalMs) &&
+ this.vacuumIntervalMs > 0 &&
+ maintenanceNowMs - this.lastVacuumMs >= this.vacuumIntervalMs &&
!this.writeLock.locked
) {
this.db.exec('VACUUM');


@@ -3,7 +3,7 @@ import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import test from 'node:test';
- import { Database, type DatabaseSync } from '../sqlite.js';
+ import { Database } from '../sqlite.js';
import {
createTrackerPreparedStatements,
ensureSchema,
@@ -44,7 +44,6 @@ import {
EVENT_SUBTITLE_LINE,
EVENT_YOMITAN_LOOKUP,
} from '../types.js';
- import { nowMs } from '../time.js';
function makeDbPath(): string {
const dir = fs.mkdtempSync(path.join(os.tmpdir(), 'subminer-imm-query-test-'));
@@ -82,22 +81,6 @@ function cleanupDbPath(dbPath: string): void {
}
}
- function getSqliteLocalMidnightMs(db: DatabaseSync, epochSeconds = Math.floor(nowMs() / 1000)): number {
- const row = db
- .prepare(
- `
- SELECT (
- ?
- - CAST(strftime('%H', ?,'unixepoch','localtime') AS INTEGER) * 3600
- - CAST(strftime('%M', ?,'unixepoch','localtime') AS INTEGER) * 60
- - CAST(strftime('%S', ?,'unixepoch','localtime') AS INTEGER)
- ) AS value
- `,
- )
- .get(epochSeconds, epochSeconds, epochSeconds, epochSeconds) as { value: number } | null;
- return row?.value ?? 0;
- }
function withMockDate<T>(fixedDate: Date, run: (realDate: typeof Date) => T): T {
const realDate = Date;
const fixedDateMs = fixedDate.getTime();
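Only the first lines of the shared `withMockDate` helper appear in this hunk. Going by the commit notes ("use explicit super args for MockDate constructors", "remove strict spread usage in Date mocks"), a plausible sketch of the full helper — illustrative only, not the repo's exact code:

```typescript
// Hypothetical sketch of a withMockDate helper: swap the global Date for a
// subclass pinned to `fixedDate`, run the callback with the real constructor
// available as an escape hatch, and always restore the original afterwards.
function withMockDate<T>(fixedDate: Date, run: (realDate: DateConstructor) => T): T {
  const realDate = Date;
  const fixedDateMs = fixedDate.getTime();
  class MockDate extends realDate {
    constructor(...args: [] | [number | string | Date]) {
      if (args.length === 0) {
        super(fixedDateMs); // explicit super arg: no-arg construction yields the pinned instant
      } else {
        super(args[0]);
      }
    }
    static now(): number {
      return fixedDateMs;
    }
  }
  globalThis.Date = MockDate as DateConstructor;
  try {
    return run(realDate);
  } finally {
    globalThis.Date = realDate; // restore even if `run` throws
  }
}
```

Passing the real constructor to the callback is what lets the tests below build fixture timestamps like `new RealDate(2026, 2, 7, 23, 30, 0)` without the mock interfering.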
@@ -760,30 +743,18 @@ test('getTrendsDashboard keeps local-midnight session buckets separate', () => {
parseMetadataJson: null,
});
- const baseMidnightSec = getSqliteLocalMidnightMs(db, 1_735_689_600);
- const beforeMidnightSec = baseMidnightSec - 30 * 60;
- const afterMidnightSec = baseMidnightSec + 30 * 60;
- const beforeMidnight = `${beforeMidnightSec}000`;
- const afterMidnight = `${afterMidnightSec}000`;
- const firstSessionId = startSessionRecord(
- db,
- videoId,
- beforeMidnight as unknown as number,
- ).sessionId;
- const secondSessionId = startSessionRecord(
- db,
- videoId,
- afterMidnight as unknown as number,
- ).sessionId;
+ const beforeMidnight = new Date(2026, 2, 1, 23, 30).getTime();
+ const afterMidnight = new Date(2026, 2, 2, 0, 30).getTime();
+ const firstSessionId = startSessionRecord(db, videoId, beforeMidnight).sessionId;
+ const secondSessionId = startSessionRecord(db, videoId, afterMidnight).sessionId;
for (const [sessionId, startedAtMs, tokensSeen, lookupCount] of [
[firstSessionId, beforeMidnight, 100, 4],
[secondSessionId, afterMidnight, 120, 6],
] as const) {
- const startedAtPlus60Ms = `${BigInt(startedAtMs) + 60000n}`;
stmts.telemetryInsertStmt.run(
sessionId,
- startedAtPlus60Ms as unknown as number,
+ startedAtMs + 60_000,
60_000,
60_000,
1,
@@ -796,8 +767,8 @@ test('getTrendsDashboard keeps local-midnight session buckets separate', () => {
0,
0,
0,
- startedAtPlus60Ms as unknown as number,
- startedAtPlus60Ms as unknown as number,
+ startedAtMs + 60_000,
+ startedAtMs + 60_000,
);
db.prepare(
`
@@ -816,7 +787,7 @@ test('getTrendsDashboard keeps local-midnight session buckets separate', () => {
WHERE session_id = ?
`,
).run(
- startedAtPlus60Ms as unknown as number,
+ startedAtMs + 60_000,
60_000,
60_000,
1,
@@ -824,24 +795,18 @@ test('getTrendsDashboard keeps local-midnight session buckets separate', () => {
lookupCount,
lookupCount,
lookupCount,
- startedAtPlus60Ms as unknown as number,
+ startedAtMs + 60_000,
sessionId,
);
}
const dashboard = getTrendsDashboard(db, 'all', 'day');
- const lookupValues = dashboard.progress.lookups.map((point) => point.value);
- assert.ok(
- lookupValues.length === 1 || lookupValues.length === 2,
- `unexpected lookup bucket count: ${lookupValues.length}`,
+ assert.equal(dashboard.progress.lookups.length, 2);
+ assert.deepEqual(
+ dashboard.progress.lookups.map((point) => point.value),
+ [4, 10],
);
- if (lookupValues.length === 2) {
- assert.deepEqual(lookupValues, [4, 10]);
- } else {
- assert.deepEqual(lookupValues, [10]);
- }
- assert.equal(lookupValues.at(-1), 10);
- assert.ok(dashboard.ratios.lookupsPerHundred.length >= 1);
+ assert.equal(dashboard.ratios.lookupsPerHundred.length, 2);
} finally {
db.close();
cleanupDbPath(dbPath);
@@ -851,7 +816,8 @@ test('getTrendsDashboard keeps local-midnight session buckets separate', () => {
test('getTrendsDashboard month grouping spans every touched calendar month and keeps progress monthly', () => {
const dbPath = makeDbPath();
const db = new Database(dbPath);
- try {
+ withMockDate(new Date(2026, 2, 1, 12, 0, 0), (RealDate) => {
+ try {
ensureSchema(db);
const stmts = createTrackerPreparedStatements(db);
const febVideoId = getOrCreateVideoRecord(db, 'local:/tmp/feb-trends.mkv', {
@@ -896,30 +862,18 @@ test('getTrendsDashboard month grouping spans every touched calendar month and k
parseMetadataJson: null,
});
- const baseMidnightSec = getSqliteLocalMidnightMs(db);
- const febStartedAtSec = baseMidnightSec - 40 * 86_400;
- const marStartedAtSec = baseMidnightSec - 10 * 86_400;
- const febStartedAtMs = `${febStartedAtSec}000`;
- const marStartedAtMs = `${marStartedAtSec}000`;
- const febSessionId = startSessionRecord(
- db,
- febVideoId,
- febStartedAtMs as unknown as number,
- ).sessionId;
- const marSessionId = startSessionRecord(
- db,
- marVideoId,
- marStartedAtMs as unknown as number,
- ).sessionId;
+ const febStartedAtMs = new RealDate(2026, 1, 15, 20, 0, 0).getTime();
+ const marStartedAtMs = new RealDate(2026, 2, 1, 9, 0, 0).getTime();
+ const febSessionId = startSessionRecord(db, febVideoId, febStartedAtMs).sessionId;
+ const marSessionId = startSessionRecord(db, marVideoId, marStartedAtMs).sessionId;
for (const [sessionId, startedAtMs, tokensSeen, cardsMined, yomitanLookupCount] of [
[febSessionId, febStartedAtMs, 100, 2, 3],
[marSessionId, marStartedAtMs, 120, 4, 5],
] as const) {
- const startedAtPlus60Ms = `${BigInt(startedAtMs) + 60000n}`;
stmts.telemetryInsertStmt.run(
sessionId,
- startedAtPlus60Ms as unknown as number,
+ startedAtMs + 60_000,
30 * 60_000,
30 * 60_000,
4,
@@ -932,8 +886,8 @@ test('getTrendsDashboard month grouping spans every touched calendar month and k
0,
0,
0,
- startedAtPlus60Ms as unknown as number,
- startedAtPlus60Ms as unknown as number,
+ startedAtMs + 60_000,
+ startedAtMs + 60_000,
);
db.prepare(
`
@@ -953,16 +907,16 @@ test('getTrendsDashboard month grouping spans every touched calendar month and k
WHERE session_id = ?
`,
).run(
- startedAtPlus60Ms as unknown as number,
- `${30 * 60_000}`,
- `${30 * 60_000}`,
+ startedAtMs + 60_000,
+ 30 * 60_000,
+ 30 * 60_000,
4,
tokensSeen,
cardsMined,
yomitanLookupCount,
yomitanLookupCount,
yomitanLookupCount,
- startedAtPlus60Ms as unknown as number,
+ startedAtMs + 60_000,
sessionId,
);
}
@@ -983,30 +937,12 @@ test('getTrendsDashboard month grouping spans every touched calendar month and k
) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?)
`,
);
- const febEpochDay = db
- .prepare(
- `SELECT CAST(julianday(?,'unixepoch','localtime') - 2440587.5 AS INTEGER) AS value`,
- )
- .get(febStartedAtSec) as { value: number } | null;
- const marEpochDay = db
- .prepare(
- `SELECT CAST(julianday(?,'unixepoch','localtime') - 2440587.5 AS INTEGER) AS value`,
- )
- .get(marStartedAtSec) as { value: number } | null;
- const febMonthKey = db
- .prepare(
- `SELECT CAST(strftime('%Y%m', ?,'unixepoch','localtime') AS INTEGER) AS value`,
- )
- .get(febStartedAtSec) as { value: number } | null;
- const marMonthKey = db
- .prepare(
- `SELECT CAST(strftime('%Y%m', ?,'unixepoch','localtime') AS INTEGER) AS value`,
- )
- .get(marStartedAtSec) as { value: number } | null;
- insertDailyRollup.run(febEpochDay?.value ?? 0, febVideoId, 1, 30, 4, 100, 2, febStartedAtMs, febStartedAtMs);
- insertDailyRollup.run(marEpochDay?.value ?? 0, marVideoId, 1, 30, 4, 120, 4, marStartedAtMs, marStartedAtMs);
- insertMonthlyRollup.run(febMonthKey?.value ?? 0, febVideoId, 1, 30, 4, 100, 2, febStartedAtMs, febStartedAtMs);
- insertMonthlyRollup.run(marMonthKey?.value ?? 0, marVideoId, 1, 30, 4, 120, 4, marStartedAtMs, marStartedAtMs);
+ const febEpochDay = Math.floor(febStartedAtMs / 86_400_000);
+ const marEpochDay = Math.floor(marStartedAtMs / 86_400_000);
+ insertDailyRollup.run(febEpochDay, febVideoId, 1, 30, 4, 100, 2, febStartedAtMs, febStartedAtMs);
+ insertDailyRollup.run(marEpochDay, marVideoId, 1, 30, 4, 120, 4, marStartedAtMs, marStartedAtMs);
+ insertMonthlyRollup.run(202602, febVideoId, 1, 30, 4, 100, 2, febStartedAtMs, febStartedAtMs);
+ insertMonthlyRollup.run(202603, marVideoId, 1, 30, 4, 120, 4, marStartedAtMs, marStartedAtMs);
db.prepare(
`
@@ -1022,8 +958,8 @@ test('getTrendsDashboard month grouping spans every touched calendar month and k
'名詞',
'',
'',
- febStartedAtSec,
- febStartedAtSec,
+ Math.floor(febStartedAtMs / 1000),
+ Math.floor(febStartedAtMs / 1000),
1,
);
db.prepare(
@@ -1040,12 +976,12 @@ test('getTrendsDashboard month grouping spans every touched calendar month and k
'名詞',
'',
'',
- marStartedAtSec,
- marStartedAtSec,
+ Math.floor(marStartedAtMs / 1000),
+ Math.floor(marStartedAtMs / 1000),
1,
);
- const dashboard = getTrendsDashboard(db, '90d', 'month');
+ const dashboard = getTrendsDashboard(db, '30d', 'month');
assert.equal(dashboard.activity.watchTime.length, 2);
assert.deepEqual(
@@ -1060,10 +996,11 @@ test('getTrendsDashboard month grouping spans every touched calendar month and k
dashboard.progress.lookups.map((point) => point.label),
dashboard.activity.watchTime.map((point) => point.label),
);
- } finally {
- db.close();
- cleanupDbPath(dbPath);
- }
+ } finally {
+ db.close();
+ cleanupDbPath(dbPath);
+ }
});
+ });
test('getQueryHints reads all-time totals from lifetime summary', () => {
@@ -1140,51 +1077,55 @@ test('getQueryHints computes weekly new-word cutoff from calendar midnights', ()
const dbPath = makeDbPath();
const db = new Database(dbPath);
- try {
- ensureSchema(db);
+ withMockDate(new Date(2026, 2, 15, 12, 0, 0), (RealDate) => {
+ try {
+ ensureSchema(db);
- const insertWord = db.prepare(
- `
- INSERT INTO imm_words (
- headword, word, reading, part_of_speech, pos1, pos2, pos3, first_seen, last_seen, frequency
- ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
- `,
- );
- const todayStartSec = getSqliteLocalMidnightMs(db);
- const weekBoundarySec = todayStartSec - 7 * 86_400;
- const justBeforeWeekBoundary = weekBoundarySec - 30 * 60;
- const justAfterWeekBoundary = weekBoundarySec + 30 * 60;
- insertWord.run(
- '境界前',
- '境界前',
- 'きょうかいまえ',
- 'noun',
- '名詞',
- '',
- '',
- justBeforeWeekBoundary,
- justBeforeWeekBoundary,
- 1,
- );
- insertWord.run(
- '境界後',
- '境界後',
- 'きょうかいご',
- 'noun',
- '名詞',
- '',
- '',
- justAfterWeekBoundary,
- justAfterWeekBoundary,
- 1,
- );
+ const insertWord = db.prepare(
+ `
+ INSERT INTO imm_words (
+ headword, word, reading, part_of_speech, pos1, pos2, pos3, first_seen, last_seen, frequency
+ ) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
+ `,
+ );
+ const justBeforeWeekBoundary = Math.floor(
+ new RealDate(2026, 2, 7, 23, 30, 0).getTime() / 1000,
+ );
+ const justAfterWeekBoundary = Math.floor(
+ new RealDate(2026, 2, 8, 0, 30, 0).getTime() / 1000,
+ );
+ insertWord.run(
+ '境界前',
+ '境界前',
+ 'きょうかいまえ',
+ 'noun',
+ '名詞',
+ '',
+ '',
+ justBeforeWeekBoundary,
+ justBeforeWeekBoundary,
+ 1,
+ );
+ insertWord.run(
+ '境界後',
+ '境界後',
+ 'きょうかいご',
+ 'noun',
+ '名詞',
+ '',
+ '',
+ justAfterWeekBoundary,
+ justAfterWeekBoundary,
+ 1,
+ );
- const hints = getQueryHints(db);
- assert.equal(hints.newWordsThisWeek, 1);
- } finally {
- db.close();
- cleanupDbPath(dbPath);
- }
+ const hints = getQueryHints(db);
+ assert.equal(hints.newWordsThisWeek, 1);
+ } finally {
+ db.close();
+ cleanupDbPath(dbPath);
+ }
+ });
});
test('getQueryHints counts new words by distinct headword first-seen time', () => {
@@ -1194,7 +1135,9 @@ test('getQueryHints counts new words by distinct headword first-seen time', () =
try {
ensureSchema(db);
- const todayStartSec = getSqliteLocalMidnightMs(db);
+ const now = new Date();
+ const todayStartSec =
+ new Date(now.getFullYear(), now.getMonth(), now.getDate()).getTime() / 1000;
const oneHourAgo = todayStartSec + 3_600;
const twoDaysAgo = todayStartSec - 2 * 86_400;
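The inlined replacement above works because `new Date(year, month, day)` truncates to local midnight, which is exactly what the removed SQLite `strftime`-based helper computed. Extracted as a helper for illustration (the name is hypothetical, not from the codebase):

```typescript
// Illustrative helper: local midnight of a given instant, in epoch seconds.
// Mirrors in-process what the removed strftime-based SQLite query returned.
function localMidnightSec(ms: number): number {
  const d = new Date(ms);
  // Date(year, month, day) with no time fields is local midnight of that day.
  return new Date(d.getFullYear(), d.getMonth(), d.getDate()).getTime() / 1000;
}
```

Computing the boundary in JavaScript instead of SQL keeps the test independent of the database connection while still agreeing with SQLite's `'localtime'` modifier, assuming both run under the same system timezone.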


@@ -1,8 +1,6 @@
import type { DatabaseSync } from './sqlite';
import { finalizeSessionRecord } from './session';
import { nowMs } from './time';
- import { toDbMs } from './query-shared';
- import { toDbSeconds } from './query-shared';
import type { LifetimeRebuildSummary, SessionState } from './types';
interface TelemetryRow {
@@ -21,12 +19,11 @@ interface AnimeRow {
episodes_total: number | null;
}
- function asPositiveNumber(value: number | string | null, fallback: number): number {
- const numericValue = typeof value === 'number' ? value : Number(value);
- if (!Number.isFinite(numericValue)) {
+ function asPositiveNumber(value: number | null, fallback: number): number {
+ if (value === null || !Number.isFinite(value)) {
return fallback;
}
- return Math.max(0, Math.floor(numericValue));
+ return Math.max(0, Math.floor(value));
}
interface ExistenceRow {
@@ -44,31 +41,30 @@ interface LifetimeAnimeStateRow {
interface RetainedSessionRow {
sessionId: number;
videoId: number;
- startedAtMs: number | string;
- endedAtMs: number | string;
- lastMediaMs: number | string | null;
- totalWatchedMs: number | string;
- activeWatchedMs: number | string;
- linesSeen: number | string;
- tokensSeen: number | string;
- cardsMined: number | string;
- lookupCount: number | string;
- lookupHits: number | string;
- yomitanLookupCount: number | string;
- pauseCount: number | string;
- pauseMs: number | string;
- seekForwardCount: number | string;
- seekBackwardCount: number | string;
- mediaBufferEvents: number | string;
+ startedAtMs: number;
+ endedAtMs: number;
+ lastMediaMs: number | null;
+ totalWatchedMs: number;
+ activeWatchedMs: number;
+ linesSeen: number;
+ tokensSeen: number;
+ cardsMined: number;
+ lookupCount: number;
+ lookupHits: number;
+ yomitanLookupCount: number;
+ pauseCount: number;
+ pauseMs: number;
+ seekForwardCount: number;
+ seekBackwardCount: number;
+ mediaBufferEvents: number;
}
function hasRetainedPriorSession(
db: DatabaseSync,
videoId: number,
- startedAtMs: number | string,
+ startedAtMs: number,
currentSessionId: number,
): boolean {
- const startedAtDbMs = toDbMs(startedAtMs);
return (
Number(
(
@@ -84,7 +80,7 @@ function hasRetainedPriorSession(
)
`,
)
- .get(videoId, startedAtDbMs, startedAtDbMs, currentSessionId) as ExistenceRow | null
+ .get(videoId, startedAtMs, startedAtMs, currentSessionId) as ExistenceRow | null
)?.count ?? 0,
) > 0
);
@@ -93,25 +89,25 @@ function hasRetainedPriorSession(
function isFirstSessionForLocalDay(
db: DatabaseSync,
currentSessionId: number,
- startedAtMs: number | string,
+ startedAtMs: number,
): boolean {
- const startedAtDbSeconds = toDbSeconds(startedAtMs);
- const sameDayCount = Number(
+ return (
+ (
- db.prepare(`
+ db
+ .prepare(
+ `
SELECT COUNT(*) AS count
FROM imm_sessions
- WHERE date(started_at_ms / 1000, 'unixepoch', 'localtime') = date(?,'unixepoch','localtime')
+ WHERE date(started_at_ms / 1000, 'unixepoch', 'localtime') = date(? / 1000, 'unixepoch', 'localtime')
AND (
started_at_ms < ?
OR (started_at_ms = ? AND session_id < ?)
)
`,
)
- .get(startedAtDbSeconds, toDbMs(startedAtMs), toDbMs(startedAtMs), currentSessionId) as ExistenceRow | null
- )?.count ?? 0
- )
+ .get(startedAtMs, startedAtMs, startedAtMs, currentSessionId) as ExistenceRow | null
+ )?.count === 0
+ );
- return sameDayCount === 0;
}
function resetLifetimeSummaries(db: DatabaseSync, nowMs: number): void {
@@ -135,7 +131,7 @@ function resetLifetimeSummaries(db: DatabaseSync, nowMs: number): void {
LAST_UPDATE_DATE = ?
WHERE global_id = 1
`,
- ).run(toDbMs(nowMs), toDbMs(nowMs));
+ ).run(nowMs, nowMs);
}
function rebuildLifetimeSummariesInternal(
@@ -148,8 +144,8 @@ function rebuildLifetimeSummariesInternal(
SELECT
session_id AS sessionId,
video_id AS videoId,
- CAST(started_at_ms AS INTEGER) AS startedAtMs,
- CAST(ended_at_ms AS INTEGER) AS endedAtMs,
+ started_at_ms AS startedAtMs,
+ ended_at_ms AS endedAtMs,
total_watched_ms AS totalWatchedMs,
active_watched_ms AS activeWatchedMs,
lines_seen AS linesSeen,
@@ -185,27 +181,27 @@ function toRebuildSessionState(row: RetainedSessionRow): SessionState {
return {
sessionId: row.sessionId,
videoId: row.videoId,
- startedAtMs: row.startedAtMs as unknown as number,
+ startedAtMs: row.startedAtMs,
currentLineIndex: 0,
- lastWallClockMs: row.endedAtMs as unknown as number,
- lastMediaMs: row.lastMediaMs === null ? null : (row.lastMediaMs as unknown as number),
+ lastWallClockMs: row.endedAtMs,
+ lastMediaMs: row.lastMediaMs,
lastPauseStartMs: null,
isPaused: false,
pendingTelemetry: false,
markedWatched: false,
- totalWatchedMs: asPositiveNumber(row.totalWatchedMs, 0),
- activeWatchedMs: asPositiveNumber(row.activeWatchedMs, 0),
- linesSeen: asPositiveNumber(row.linesSeen, 0),
- tokensSeen: asPositiveNumber(row.tokensSeen, 0),
- cardsMined: asPositiveNumber(row.cardsMined, 0),
- lookupCount: asPositiveNumber(row.lookupCount, 0),
- lookupHits: asPositiveNumber(row.lookupHits, 0),
- yomitanLookupCount: asPositiveNumber(row.yomitanLookupCount, 0),
- pauseCount: asPositiveNumber(row.pauseCount, 0),
- pauseMs: asPositiveNumber(row.pauseMs, 0),
- seekForwardCount: asPositiveNumber(row.seekForwardCount, 0),
- seekBackwardCount: asPositiveNumber(row.seekBackwardCount, 0),
- mediaBufferEvents: asPositiveNumber(row.mediaBufferEvents, 0),
+ totalWatchedMs: Math.max(0, row.totalWatchedMs),
+ activeWatchedMs: Math.max(0, row.activeWatchedMs),
+ linesSeen: Math.max(0, row.linesSeen),
+ tokensSeen: Math.max(0, row.tokensSeen),
+ cardsMined: Math.max(0, row.cardsMined),
+ lookupCount: Math.max(0, row.lookupCount),
+ lookupHits: Math.max(0, row.lookupHits),
+ yomitanLookupCount: Math.max(0, row.yomitanLookupCount),
+ pauseCount: Math.max(0, row.pauseCount),
+ pauseMs: Math.max(0, row.pauseMs),
+ seekForwardCount: Math.max(0, row.seekForwardCount),
+ seekBackwardCount: Math.max(0, row.seekBackwardCount),
+ mediaBufferEvents: Math.max(0, row.mediaBufferEvents),
};
}
@@ -216,8 +212,8 @@ function getRetainedStaleActiveSessions(db: DatabaseSync): RetainedSessionRow[]
SELECT
s.session_id AS sessionId,
s.video_id AS videoId,
- CAST(s.started_at_ms AS INTEGER) AS startedAtMs,
- CAST(COALESCE(t.sample_ms, s.LAST_UPDATE_DATE, s.started_at_ms) AS INTEGER) AS endedAtMs,
+ s.started_at_ms AS startedAtMs,
+ COALESCE(t.sample_ms, s.LAST_UPDATE_DATE, s.started_at_ms) AS endedAtMs,
s.ended_media_ms AS lastMediaMs,
COALESCE(t.total_watched_ms, s.total_watched_ms, 0) AS totalWatchedMs,
COALESCE(t.active_watched_ms, s.active_watched_ms, 0) AS activeWatchedMs,
@@ -251,14 +247,14 @@ function getRetainedStaleActiveSessions(db: DatabaseSync): RetainedSessionRow[]
function upsertLifetimeMedia(
db: DatabaseSync,
videoId: number,
- nowMs: string,
+ nowMs: number,
activeMs: number,
cardsMined: number,
linesSeen: number,
tokensSeen: number,
completed: number,
- startedAtMs: number | string,
- endedAtMs: number | string,
+ startedAtMs: number,
+ endedAtMs: number,
): void {
db.prepare(
`
@@ -314,15 +310,15 @@ function upsertLifetimeMedia(
function upsertLifetimeAnime(
db: DatabaseSync,
animeId: number,
- nowMs: string,
+ nowMs: number,
activeMs: number,
cardsMined: number,
linesSeen: number,
tokensSeen: number,
episodesStartedDelta: number,
episodesCompletedDelta: number,
- startedAtMs: number | string,
- endedAtMs: number | string,
+ startedAtMs: number,
+ endedAtMs: number,
): void {
db.prepare(
`
@@ -381,7 +377,7 @@ function upsertLifetimeAnime(
export function applySessionLifetimeSummary(
db: DatabaseSync,
session: SessionState,
- endedAtMs: number | string,
+ endedAtMs: number,
): void {
const applyResult = db
.prepare(
@@ -396,8 +392,8 @@ export function applySessionLifetimeSummary(
)
ON CONFLICT(session_id) DO NOTHING
`,
- )
- .run(session.sessionId, toDbMs(endedAtMs), toDbMs(nowMs()), toDbMs(nowMs()));
+ )
+ .run(session.sessionId, endedAtMs, nowMs(), nowMs());
if ((applyResult.changes ?? 0) <= 0) {
return;
@@ -472,7 +468,7 @@ export function applySessionLifetimeSummary(
? 1
: 0;
- const updatedAtMs = toDbMs(nowMs());
+ const updatedAtMs = nowMs();
db.prepare(
`
UPDATE imm_lifetime_global


@@ -54,9 +54,9 @@ test('pruneRawRetention uses session retention separately from telemetry retenti
`);
const result = pruneRawRetention(db, nowMs, {
- eventsRetentionMs: '120000000',
- telemetryRetentionMs: '80000000',
- sessionsRetentionMs: '300000000',
+ eventsRetentionMs: 120_000_000,
+ telemetryRetentionMs: 80_000_000,
+ sessionsRetentionMs: 300_000_000,
});
const remainingSessions = db
@@ -82,6 +82,65 @@ test('pruneRawRetention uses session retention separately from telemetry retenti
}
});
+ test('pruneRawRetention skips disabled retention windows', () => {
+ const dbPath = makeDbPath();
+ const db = new Database(dbPath);
+ try {
+ ensureSchema(db);
+ const nowMs = 1_000_000_000;
+ db.exec(`
+ INSERT INTO imm_videos (
+ video_id, video_key, canonical_title, source_type, duration_ms, CREATED_DATE, LAST_UPDATE_DATE
+ ) VALUES (
+ 1, 'local:/tmp/video.mkv', 'Video', 1, 0, ${nowMs}, ${nowMs}
+ );
+ INSERT INTO imm_sessions (
+ session_id, session_uuid, video_id, started_at_ms, ended_at_ms, status, CREATED_DATE, LAST_UPDATE_DATE
+ ) VALUES (
+ 1, 'session-1', 1, ${nowMs - 1_000}, ${nowMs - 500}, 2, ${nowMs}, ${nowMs}
+ );
+ INSERT INTO imm_session_telemetry (
+ session_id, sample_ms, total_watched_ms, active_watched_ms, CREATED_DATE, LAST_UPDATE_DATE
+ ) VALUES (
+ 1, ${nowMs - 2_000}, 0, 0, ${nowMs}, ${nowMs}
+ );
+ INSERT INTO imm_session_events (
+ session_id, event_type, ts_ms, payload_json, CREATED_DATE, LAST_UPDATE_DATE
+ ) VALUES (
+ 1, 1, ${nowMs - 3_000}, '{}', ${nowMs}, ${nowMs}
+ );
+ `);
+ const result = pruneRawRetention(db, nowMs, {
+ eventsRetentionMs: Number.POSITIVE_INFINITY,
+ telemetryRetentionMs: Number.POSITIVE_INFINITY,
+ sessionsRetentionMs: Number.POSITIVE_INFINITY,
+ });
+ const remainingSessionEvents = db
+ .prepare('SELECT COUNT(*) AS count FROM imm_session_events')
+ .get() as { count: number };
+ const remainingTelemetry = db
+ .prepare('SELECT COUNT(*) AS count FROM imm_session_telemetry')
+ .get() as { count: number };
+ const remainingSessions = db
+ .prepare('SELECT COUNT(*) AS count FROM imm_sessions')
+ .get() as { count: number };
+ assert.equal(result.deletedSessionEvents, 0);
+ assert.equal(result.deletedTelemetryRows, 0);
+ assert.equal(result.deletedEndedSessions, 0);
+ assert.equal(remainingSessionEvents.count, 1);
+ assert.equal(remainingTelemetry.count, 1);
+ assert.equal(remainingSessions.count, 1);
+ } finally {
+ db.close();
+ cleanupDbPath(dbPath);
+ }
+ });
test('toMonthKey floors negative timestamps into the prior UTC month', () => {
assert.equal(toMonthKey(-1), 196912);
assert.equal(toMonthKey(-86_400_000), 196912);
@@ -129,9 +188,9 @@ test('raw retention keeps rollups and rollup retention prunes them separately',
`);
pruneRawRetention(db, nowMs, {
eventsRetentionMs: '120000000',
telemetryRetentionMs: '120000000',
sessionsRetentionMs: '120000000',
eventsRetentionMs: 120_000_000,
telemetryRetentionMs: 120_000_000,
sessionsRetentionMs: 120_000_000,
});
const rollupsAfterRawPrune = db
@@ -145,8 +204,8 @@ test('raw retention keeps rollups and rollup retention prunes them separately',
assert.equal(monthlyAfterRawPrune?.total, 1);
const rollupPrune = pruneRollupRetention(db, nowMs, {
dailyRollupRetentionMs: '120000000',
monthlyRollupRetentionMs: '1',
dailyRollupRetentionMs: 120_000_000,
monthlyRollupRetentionMs: 1,
});
const rollupsAfterRollupPrune = db

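The disabled-window test above treats retention values as plain numbers, with `Number.POSITIVE_INFINITY` meaning "never prune". A minimal sketch of that convention (the helper name `cutoffMs` is hypothetical, not from this diff):

```typescript
// Hypothetical helper illustrating the disabled-window guard used by
// pruneRawRetention: a finite retention yields a prune cutoff, while
// Infinity yields null, meaning the DELETE is skipped entirely.
function cutoffMs(currentMs: number, retentionMs: number): number | null {
  return Number.isFinite(retentionMs) ? currentMs - retentionMs : null;
}
```

A `null` cutoff skipping the delete is exactly what the zero deletion counts and surviving rows in the test assert.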

@@ -1,13 +1,13 @@
import type { DatabaseSync } from './sqlite';
import { nowMs } from './time';
import { subtractDbMs, toDbMs, toDbSeconds } from './query-shared';
import { toDbMs } from './query-shared';
const ROLLUP_STATE_KEY = 'last_rollup_sample_ms';
const DAILY_MS = 86_400_000;
const ZERO_ID = 0;
interface RollupStateRow {
state_value: string;
state_value: number;
}
interface RollupGroupRow {
@@ -46,37 +46,34 @@ export function toMonthKey(timestampMs: number): number {
export function pruneRawRetention(
db: DatabaseSync,
nowMs: number,
currentMs: number,
policy: {
eventsRetentionMs: string | null;
telemetryRetentionMs: string | null;
sessionsRetentionMs: string | null;
eventsRetentionMs: number;
telemetryRetentionMs: number;
sessionsRetentionMs: number;
},
): RawRetentionResult {
const deletedSessionEvents =
policy.eventsRetentionMs === null
? 0
: (
db
.prepare(`DELETE FROM imm_session_events WHERE ts_ms < ?`)
.run(subtractDbMs(nowMs, policy.eventsRetentionMs)) as { changes: number }
).changes;
const deletedTelemetryRows =
policy.telemetryRetentionMs === null
? 0
: (
db
.prepare(`DELETE FROM imm_session_telemetry WHERE sample_ms < ?`)
.run(subtractDbMs(nowMs, policy.telemetryRetentionMs)) as { changes: number }
).changes;
const deletedEndedSessions =
policy.sessionsRetentionMs === null
? 0
: (
db
.prepare(`DELETE FROM imm_sessions WHERE ended_at_ms IS NOT NULL AND ended_at_ms < ?`)
.run(subtractDbMs(nowMs, policy.sessionsRetentionMs)) as { changes: number }
).changes;
const deletedSessionEvents = Number.isFinite(policy.eventsRetentionMs)
? (
db.prepare(`DELETE FROM imm_session_events WHERE ts_ms < ?`).run(
toDbMs(currentMs - policy.eventsRetentionMs),
) as { changes: number }
).changes
: 0;
const deletedTelemetryRows = Number.isFinite(policy.telemetryRetentionMs)
? (
db
.prepare(`DELETE FROM imm_session_telemetry WHERE sample_ms < ?`)
.run(toDbMs(currentMs - policy.telemetryRetentionMs)) as { changes: number }
).changes
: 0;
const deletedEndedSessions = Number.isFinite(policy.sessionsRetentionMs)
? (
db
.prepare(`DELETE FROM imm_sessions WHERE ended_at_ms IS NOT NULL AND ended_at_ms < ?`)
.run(toDbMs(currentMs - policy.sessionsRetentionMs)) as { changes: number }
).changes
: 0;
return {
deletedSessionEvents,
@@ -87,42 +84,30 @@ export function pruneRawRetention(
export function pruneRollupRetention(
db: DatabaseSync,
nowMs: number,
currentMs: number,
policy: {
dailyRollupRetentionMs: string | null;
monthlyRollupRetentionMs: string | null;
dailyRollupRetentionMs: number;
monthlyRollupRetentionMs: number;
},
): { deletedDailyRows: number; deletedMonthlyRows: number } {
const currentMs = toDbMs(nowMs);
const deletedDailyRows =
policy.dailyRollupRetentionMs === null
? 0
: (
const deletedDailyRows = Number.isFinite(policy.dailyRollupRetentionMs)
? (
db
.prepare(
`DELETE FROM imm_daily_rollups
WHERE rollup_day < CAST(julianday(date(?,'unixepoch','localtime')) - 2440587.5 AS INTEGER) - ?`,
)
.run(
toDbSeconds(currentMs),
Number(BigInt(policy.dailyRollupRetentionMs) / BigInt(DAILY_MS)),
) as {
changes: number;
}
).changes;
const deletedMonthlyRows =
policy.monthlyRollupRetentionMs === null
? 0
: (
db
.prepare(
`DELETE FROM imm_monthly_rollups
WHERE rollup_month < CAST(strftime('%Y%m', ?,'unixepoch','localtime') AS INTEGER)`,
)
.run(toDbSeconds(subtractDbMs(currentMs, policy.monthlyRollupRetentionMs))) as {
changes: number;
}
).changes;
.prepare(`DELETE FROM imm_daily_rollups WHERE rollup_day < ?`)
.run(Math.floor((currentMs - policy.dailyRollupRetentionMs) / DAILY_MS)) as {
changes: number;
}
).changes
: 0;
const deletedMonthlyRows = Number.isFinite(policy.monthlyRollupRetentionMs)
? (
db
.prepare(`DELETE FROM imm_monthly_rollups WHERE rollup_month < ?`)
.run(toMonthKey(currentMs - policy.monthlyRollupRetentionMs)) as {
changes: number;
}
).changes
: 0;
return {
deletedDailyRows,
@@ -130,19 +115,19 @@ export function pruneRollupRetention(
};
}
function getLastRollupSampleMs(db: DatabaseSync): string {
function getLastRollupSampleMs(db: DatabaseSync): number {
const row = db
.prepare(`SELECT state_value FROM imm_rollup_state WHERE state_key = ? LIMIT 1`)
.get(ROLLUP_STATE_KEY) as unknown as RollupStateRow | null;
return row ? row.state_value : ZERO_ID.toString();
return row ? Number(row.state_value) : ZERO_ID;
}
function setLastRollupSampleMs(db: DatabaseSync, sampleMs: string | number | bigint): void {
function setLastRollupSampleMs(db: DatabaseSync, sampleMs: number | bigint): void {
db.prepare(
`INSERT INTO imm_rollup_state (state_key, state_value)
VALUES (?, ?)
ON CONFLICT(state_key) DO UPDATE SET state_value = excluded.state_value`,
).run(ROLLUP_STATE_KEY, toDbMs(sampleMs));
).run(ROLLUP_STATE_KEY, sampleMs);
}
function resetRollups(db: DatabaseSync): void {
@@ -156,7 +141,7 @@ function resetRollups(db: DatabaseSync): void {
function upsertDailyRollupsForGroups(
db: DatabaseSync,
groups: Array<{ rollupDay: number; videoId: number }>,
rollupNowMs: string,
rollupNowMs: bigint,
): void {
if (groups.length === 0) {
return;
@@ -172,29 +157,32 @@ function upsertDailyRollupsForGroups(
CAST(julianday(s.started_at_ms / 1000, 'unixepoch', 'localtime') - 2440587.5 AS INTEGER) AS rollup_day,
s.video_id AS video_id,
COUNT(DISTINCT s.session_id) AS total_sessions,
COALESCE(SUM(sm.max_active_ms), 0) / 60000.0 AS total_active_min,
COALESCE(SUM(sm.max_lines), 0) AS total_lines_seen,
COALESCE(SUM(sm.max_tokens), 0) AS total_tokens_seen,
COALESCE(SUM(sm.max_cards), 0) AS total_cards,
COALESCE(SUM(COALESCE(sm.max_active_ms, s.active_watched_ms)), 0) / 60000.0 AS total_active_min,
COALESCE(SUM(COALESCE(sm.max_lines, s.lines_seen)), 0) AS total_lines_seen,
COALESCE(SUM(COALESCE(sm.max_tokens, s.tokens_seen)), 0) AS total_tokens_seen,
COALESCE(SUM(COALESCE(sm.max_cards, s.cards_mined)), 0) AS total_cards,
CASE
WHEN COALESCE(SUM(sm.max_active_ms), 0) > 0
THEN (COALESCE(SUM(sm.max_cards), 0) * 60.0) / (COALESCE(SUM(sm.max_active_ms), 0) / 60000.0)
WHEN COALESCE(SUM(COALESCE(sm.max_active_ms, s.active_watched_ms)), 0) > 0
THEN (COALESCE(SUM(COALESCE(sm.max_cards, s.cards_mined)), 0) * 60.0)
/ (COALESCE(SUM(COALESCE(sm.max_active_ms, s.active_watched_ms)), 0) / 60000.0)
ELSE NULL
END AS cards_per_hour,
CASE
WHEN COALESCE(SUM(sm.max_active_ms), 0) > 0
THEN COALESCE(SUM(sm.max_tokens), 0) / (COALESCE(SUM(sm.max_active_ms), 0) / 60000.0)
WHEN COALESCE(SUM(COALESCE(sm.max_active_ms, s.active_watched_ms)), 0) > 0
THEN COALESCE(SUM(COALESCE(sm.max_tokens, s.tokens_seen)), 0)
/ (COALESCE(SUM(COALESCE(sm.max_active_ms, s.active_watched_ms)), 0) / 60000.0)
ELSE NULL
END AS tokens_per_min,
CASE
WHEN COALESCE(SUM(sm.max_lookups), 0) > 0
THEN CAST(COALESCE(SUM(sm.max_hits), 0) AS REAL) / CAST(SUM(sm.max_lookups) AS REAL)
WHEN COALESCE(SUM(COALESCE(sm.max_lookups, s.lookup_count)), 0) > 0
THEN CAST(COALESCE(SUM(COALESCE(sm.max_hits, s.lookup_hits)), 0) AS REAL)
/ CAST(COALESCE(SUM(COALESCE(sm.max_lookups, s.lookup_count)), 0) AS REAL)
ELSE NULL
END AS lookup_hit_rate,
? AS CREATED_DATE,
? AS LAST_UPDATE_DATE
FROM imm_sessions s
JOIN (
LEFT JOIN (
SELECT
t.session_id,
MAX(t.active_watched_ms) AS max_active_ms,
@@ -229,7 +217,7 @@ function upsertDailyRollupsForGroups(
function upsertMonthlyRollupsForGroups(
db: DatabaseSync,
groups: Array<{ rollupMonth: number; videoId: number }>,
rollupNowMs: string,
rollupNowMs: bigint,
): void {
if (groups.length === 0) {
return;
@@ -244,14 +232,14 @@ function upsertMonthlyRollupsForGroups(
CAST(strftime('%Y%m', s.started_at_ms / 1000, 'unixepoch', 'localtime') AS INTEGER) AS rollup_month,
s.video_id AS video_id,
COUNT(DISTINCT s.session_id) AS total_sessions,
COALESCE(SUM(sm.max_active_ms), 0) / 60000.0 AS total_active_min,
COALESCE(SUM(sm.max_lines), 0) AS total_lines_seen,
COALESCE(SUM(sm.max_tokens), 0) AS total_tokens_seen,
COALESCE(SUM(sm.max_cards), 0) AS total_cards,
COALESCE(SUM(COALESCE(sm.max_active_ms, s.active_watched_ms)), 0) / 60000.0 AS total_active_min,
COALESCE(SUM(COALESCE(sm.max_lines, s.lines_seen)), 0) AS total_lines_seen,
COALESCE(SUM(COALESCE(sm.max_tokens, s.tokens_seen)), 0) AS total_tokens_seen,
COALESCE(SUM(COALESCE(sm.max_cards, s.cards_mined)), 0) AS total_cards,
? AS CREATED_DATE,
? AS LAST_UPDATE_DATE
FROM imm_sessions s
JOIN (
LEFT JOIN (
SELECT
t.session_id,
MAX(t.active_watched_ms) AS max_active_ms,
@@ -280,7 +268,7 @@ function upsertMonthlyRollupsForGroups(
function getAffectedRollupGroups(
db: DatabaseSync,
lastRollupSampleMs: string,
lastRollupSampleMs: number,
): Array<{ rollupDay: number; rollupMonth: number; videoId: number }> {
return (
db
@@ -293,7 +281,7 @@ function getAffectedRollupGroups(
FROM imm_session_telemetry t
JOIN imm_sessions s
ON s.session_id = t.session_id
WHERE t.sample_ms > ?
WHERE t.sample_ms >= ?
`,
)
.all(lastRollupSampleMs) as unknown as RollupGroupRow[]
@@ -387,7 +375,7 @@ export function rebuildRollupsInTransaction(db: DatabaseSync): void {
return;
}
const affectedGroups = getAffectedRollupGroups(db, ZERO_ID.toString());
const affectedGroups = getAffectedRollupGroups(db, ZERO_ID);
if (affectedGroups.length === 0) {
setLastRollupSampleMs(db, toDbMs(maxSampleRow.maxSampleMs ?? ZERO_ID));
return;

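The monthly-retention branch above compares `rollup_month` against `toMonthKey(currentMs - retentionMs)`. A sketch of a UTC month key consistent with the `toMonthKey` tests earlier in this diff, assuming the key is `year * 100 + month`:

```typescript
// Month key as YYYYMM in UTC; Date handles the flooring, so -1 ms
// lands in December 1969 (196912) rather than January 1970.
function toMonthKey(timestampMs: number): number {
  const date = new Date(timestampMs);
  return date.getUTCFullYear() * 100 + date.getUTCMonth() + 1;
}
```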

@@ -131,7 +131,7 @@ export function getSessionEvents(
): SessionEventRow[] {
if (!eventTypes || eventTypes.length === 0) {
const stmt = db.prepare(`
SELECT event_type AS eventType, CAST(ts_ms AS INTEGER) AS tsMs, payload_json AS payload
SELECT event_type AS eventType, ts_ms AS tsMs, payload_json AS payload
FROM imm_session_events WHERE session_id = ? ORDER BY ts_ms ASC LIMIT ?
`);
return stmt.all(sessionId, limit) as SessionEventRow[];
@@ -139,7 +139,7 @@ export function getSessionEvents(
const placeholders = eventTypes.map(() => '?').join(', ');
const stmt = db.prepare(`
SELECT event_type AS eventType, CAST(ts_ms AS INTEGER) AS tsMs, payload_json AS payload
SELECT event_type AS eventType, ts_ms AS tsMs, payload_json AS payload
FROM imm_session_events
WHERE session_id = ? AND event_type IN (${placeholders})
ORDER BY ts_ms ASC
@@ -186,7 +186,7 @@ export function getSimilarWords(db: DatabaseSync, wordId: number, limit = 10): S
headword: string;
reading: string;
} | null;
if (!word) return [];
if (!word || word.headword.trim() === '') return [];
return db
.prepare(
`

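The guard added to `getSimilarWords` above now also short-circuits on whitespace-only headwords, not just missing rows. Isolated as a sketch (the predicate name is illustrative):

```typescript
// Sketch of the new short-circuit: a null row or a blank headword both
// yield an empty similar-words result before any SQL runs.
interface WordRow {
  headword: string;
  reading: string;
}

function shouldSkipSimilarLookup(word: WordRow | null): boolean {
  return !word || word.headword.trim() === '';
}
```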

@@ -32,7 +32,7 @@ export function getAnimeLibrary(db: DatabaseSync): AnimeLibraryRow[] {
COALESCE(lm.total_tokens_seen, 0) AS totalTokensSeen,
COUNT(DISTINCT v.video_id) AS episodeCount,
a.episodes_total AS episodesTotal,
CAST(COALESCE(lm.last_watched_ms, 0) AS INTEGER) AS lastWatchedMs
COALESCE(lm.last_watched_ms, 0) AS lastWatchedMs
FROM imm_anime a
JOIN imm_lifetime_anime lm ON lm.anime_id = a.anime_id
JOIN imm_videos v ON v.anime_id = a.anime_id
@@ -65,7 +65,7 @@ export function getAnimeDetail(db: DatabaseSync, animeId: number): AnimeDetailRo
COALESCE(SUM(COALESCE(asm.lookupHits, s.lookup_hits, 0)), 0) AS totalLookupHits,
COALESCE(SUM(COALESCE(asm.yomitanLookupCount, s.yomitan_lookup_count, 0)), 0) AS totalYomitanLookupCount,
COUNT(DISTINCT v.video_id) AS episodeCount,
CAST(COALESCE(lm.last_watched_ms, 0) AS INTEGER) AS lastWatchedMs
COALESCE(lm.last_watched_ms, 0) AS lastWatchedMs
FROM imm_anime a
JOIN imm_lifetime_anime lm ON lm.anime_id = a.anime_id
JOIN imm_videos v ON v.anime_id = a.anime_id
@@ -110,7 +110,7 @@ export function getAnimeEpisodes(db: DatabaseSync, animeId: number): AnimeEpisod
v.parsed_season AS season,
v.parsed_episode AS episode,
v.duration_ms AS durationMs,
CAST((
(
SELECT COALESCE(
NULLIF(s_recent.ended_media_ms, 0),
(
@@ -147,14 +147,14 @@ export function getAnimeEpisodes(db: DatabaseSync, animeId: number): AnimeEpisod
COALESCE(s_recent.ended_at_ms, s_recent.LAST_UPDATE_DATE, s_recent.started_at_ms) DESC,
s_recent.session_id DESC
LIMIT 1
) AS INTEGER) AS endedMediaMs,
) AS endedMediaMs,
v.watched AS watched,
COUNT(DISTINCT s.session_id) AS totalSessions,
COALESCE(SUM(COALESCE(asm.activeWatchedMs, s.active_watched_ms, 0)), 0) AS totalActiveMs,
COALESCE(SUM(COALESCE(asm.cardsMined, s.cards_mined, 0)), 0) AS totalCards,
COALESCE(SUM(COALESCE(asm.tokensSeen, s.tokens_seen, 0)), 0) AS totalTokensSeen,
COALESCE(SUM(COALESCE(asm.yomitanLookupCount, s.yomitan_lookup_count, 0)), 0) AS totalYomitanLookupCount,
CAST(MAX(s.started_at_ms) AS INTEGER) AS lastWatchedMs
MAX(s.started_at_ms) AS lastWatchedMs
FROM imm_videos v
LEFT JOIN imm_sessions s ON s.video_id = v.video_id
LEFT JOIN active_session_metrics asm ON asm.sessionId = s.session_id
@@ -182,7 +182,7 @@ export function getMediaLibrary(db: DatabaseSync): MediaLibraryRow[] {
COALESCE(lm.total_active_ms, 0) AS totalActiveMs,
COALESCE(lm.total_cards, 0) AS totalCards,
COALESCE(lm.total_tokens_seen, 0) AS totalTokensSeen,
CAST(COALESCE(lm.last_watched_ms, 0) AS INTEGER) AS lastWatchedMs,
COALESCE(lm.last_watched_ms, 0) AS lastWatchedMs,
yv.youtube_video_id AS youtubeVideoId,
yv.video_url AS videoUrl,
yv.video_title AS videoTitle,
@@ -261,8 +261,8 @@ export function getMediaSessions(
s.session_id AS sessionId,
s.video_id AS videoId,
v.canonical_title AS canonicalTitle,
CAST(s.started_at_ms AS INTEGER) AS startedAtMs,
CAST(s.ended_at_ms AS INTEGER) AS endedAtMs,
s.started_at_ms AS startedAtMs,
s.ended_at_ms AS endedAtMs,
COALESCE(asm.totalWatchedMs, s.total_watched_ms, 0) AS totalWatchedMs,
COALESCE(asm.activeWatchedMs, s.active_watched_ms, 0) AS activeWatchedMs,
COALESCE(asm.linesSeen, s.lines_seen, 0) AS linesSeen,
@@ -517,7 +517,7 @@ export function getEpisodeSessions(db: DatabaseSync, videoId: number): SessionSu
SELECT
s.session_id AS sessionId, s.video_id AS videoId,
v.canonical_title AS canonicalTitle,
CAST(s.started_at_ms AS INTEGER) AS startedAtMs, CAST(s.ended_at_ms AS INTEGER) AS endedAtMs,
s.started_at_ms AS startedAtMs, s.ended_at_ms AS endedAtMs,
COALESCE(asm.totalWatchedMs, s.total_watched_ms, 0) AS totalWatchedMs,
COALESCE(asm.activeWatchedMs, s.active_watched_ms, 0) AS activeWatchedMs,
COALESCE(asm.linesSeen, s.lines_seen, 0) AS linesSeen,
@@ -541,7 +541,7 @@ export function getEpisodeCardEvents(db: DatabaseSync, videoId: number): Episode
.prepare(
`
SELECT e.event_id AS eventId, e.session_id AS sessionId,
CAST(e.ts_ms AS INTEGER) AS tsMs, e.cards_delta AS cardsDelta,
e.ts_ms AS tsMs, e.cards_delta AS cardsDelta,
e.payload_json AS payloadJson
FROM imm_session_events e
JOIN imm_sessions s ON s.session_id = e.session_id


@@ -5,13 +5,7 @@ import type {
SessionSummaryQueryRow,
SessionTimelineRow,
} from './types';
import { ACTIVE_SESSION_METRICS_CTE, subtractDbMs, toDbMs, toDbSeconds } from './query-shared';
const THIRTY_DAYS_MS = '2592000000';
function localMidnightSecondsExpr(): string {
return `(CAST(strftime('%s', 'now') AS INTEGER) - CAST(strftime('%H', 'now', 'localtime') AS INTEGER) * 3600 - CAST(strftime('%M', 'now', 'localtime') AS INTEGER) * 60 - CAST(strftime('%S', 'now', 'localtime') AS INTEGER))`;
}
import { ACTIVE_SESSION_METRICS_CTE } from './query-shared';
export function getSessionSummaries(db: DatabaseSync, limit = 50): SessionSummaryQueryRow[] {
const prepared = db.prepare(`
@@ -22,8 +16,8 @@ export function getSessionSummaries(db: DatabaseSync, limit = 50): SessionSummar
v.canonical_title AS canonicalTitle,
v.anime_id AS animeId,
a.canonical_title AS animeTitle,
CAST(s.started_at_ms AS INTEGER) AS startedAtMs,
CAST(s.ended_at_ms AS INTEGER) AS endedAtMs,
s.started_at_ms AS startedAtMs,
s.ended_at_ms AS endedAtMs,
COALESCE(asm.totalWatchedMs, s.total_watched_ms, 0) AS totalWatchedMs,
COALESCE(asm.activeWatchedMs, s.active_watched_ms, 0) AS activeWatchedMs,
COALESCE(asm.linesSeen, s.lines_seen, 0) AS linesSeen,
@@ -49,7 +43,7 @@ export function getSessionTimeline(
): SessionTimelineRow[] {
const select = `
SELECT
CAST(sample_ms AS INTEGER) AS sampleMs,
sample_ms AS sampleMs,
total_watched_ms AS totalWatchedMs,
active_watched_ms AS activeWatchedMs,
lines_seen AS linesSeen,
@@ -135,13 +129,18 @@ export function getSessionWordsByLine(
}
function getNewWordCounts(db: DatabaseSync): { newWordsToday: number; newWordsThisWeek: number } {
const now = new Date();
const todayStartSec = new Date(now.getFullYear(), now.getMonth(), now.getDate()).getTime() / 1000;
const weekAgoSec =
new Date(now.getFullYear(), now.getMonth(), now.getDate() - 7).getTime() / 1000;
const row = db
.prepare(
`
WITH headword_first_seen AS (
SELECT
headword,
CAST(MIN(first_seen) AS INTEGER) AS first_seen
MIN(first_seen) AS first_seen
FROM imm_words
WHERE first_seen IS NOT NULL
AND headword IS NOT NULL
@@ -149,12 +148,13 @@ function getNewWordCounts(db: DatabaseSync): { newWordsToday: number; newWordsTh
GROUP BY headword
)
SELECT
COALESCE(SUM(CASE WHEN first_seen >= (${localMidnightSecondsExpr()}) THEN 1 ELSE 0 END), 0) AS today,
COALESCE(SUM(CASE WHEN first_seen >= (${localMidnightSecondsExpr()} - 7 * 86400) THEN 1 ELSE 0 END), 0) AS week
COALESCE(SUM(CASE WHEN first_seen >= ? THEN 1 ELSE 0 END), 0) AS today,
COALESCE(SUM(CASE WHEN first_seen >= ? THEN 1 ELSE 0 END), 0) AS week
FROM headword_first_seen
`,
`,
)
.get() as { today: number; week: number } | null;
.get(todayStartSec, weekAgoSec) as { today: number; week: number } | null;
return {
newWordsToday: Number(row?.today ?? 0),
newWordsThisWeek: Number(row?.week ?? 0),
@@ -203,7 +203,10 @@ export function getQueryHints(db: DatabaseSync): {
animeCompleted: number;
} | null;
const nowSeconds = (BigInt(toDbMs(nowMs())) / 1000n).toString();
const now = new Date();
const todayLocal = Math.floor(
new Date(now.getFullYear(), now.getMonth(), now.getDate()).getTime() / 86_400_000,
);
const episodesToday =
(
@@ -212,13 +215,13 @@ export function getQueryHints(db: DatabaseSync): {
`
SELECT COUNT(DISTINCT s.video_id) AS count
FROM imm_sessions s
WHERE date(s.started_at_ms / 1000, 'unixepoch', 'localtime') = date(?,'unixepoch','localtime')
WHERE CAST(julianday(s.started_at_ms / 1000, 'unixepoch', 'localtime') - 2440587.5 AS INTEGER) = ?
`,
)
.get(nowSeconds) as { count: number }
.get(todayLocal) as { count: number }
)?.count ?? 0;
const activeAnimeCutoffMs = subtractDbMs(toDbMs(nowMs()), `${THIRTY_DAYS_MS}`);
const thirtyDaysAgoMs = nowMs() - 30 * 86400000;
const activeAnimeCount =
(
db
@@ -231,7 +234,7 @@ export function getQueryHints(db: DatabaseSync): {
AND s.started_at_ms >= ?
`,
)
.get(activeAnimeCutoffMs) as { count: number }
.get(thirtyDaysAgoMs) as { count: number }
)?.count ?? 0;
const totalEpisodesWatched = Number(lifetime?.episodesCompleted ?? 0);

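Both `getNewWordCounts` and `getQueryHints` above move local-midnight math from SQL `strftime` expressions into JS `Date` construction. The cutoff computation, wrapped in a hypothetical helper for illustration:

```typescript
// Sketch of the local-midnight cutoffs now computed in JS: today's local
// midnight and the midnight seven calendar days earlier, in epoch seconds.
// Date's day arithmetic absorbs month and year boundaries.
function newWordCutoffsSec(now: Date): { todayStartSec: number; weekAgoSec: number } {
  const todayStartSec =
    new Date(now.getFullYear(), now.getMonth(), now.getDate()).getTime() / 1000;
  const weekAgoSec =
    new Date(now.getFullYear(), now.getMonth(), now.getDate() - 7).getTime() / 1000;
  return { todayStartSec, weekAgoSec };
}
```

Passing these as bind parameters replaces the previous `localMidnightSecondsExpr()` string splicing.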

@@ -271,28 +271,12 @@ export function deleteSessionsByIds(db: DatabaseSync, sessionIds: number[]): voi
db.prepare(`DELETE FROM imm_sessions WHERE session_id IN (${placeholders})`).run(...sessionIds);
}
export function toDbMs(ms: number | bigint | string): string {
export function toDbMs(ms: number | bigint): bigint {
if (typeof ms === 'bigint') {
return ms.toString();
}
if (typeof ms === 'string') {
const text = ms.trim().replace(/\.0+$/, '');
return /^-?\d+$/.test(text) ? text : '0';
return ms;
}
if (!Number.isFinite(ms)) {
return '0';
throw new TypeError(`Invalid database timestamp: ${ms}`);
}
return ms.toFixed(0);
}
export function toDbSeconds(ms: number | bigint | string): string {
const dbMs = toDbMs(ms);
if (dbMs === '0') {
return '0';
}
return (BigInt(dbMs) / 1000n).toString();
}
export function subtractDbMs(timestampMs: number | bigint | string, deltaMs: number | string): string {
return (BigInt(toDbMs(timestampMs)) - BigInt(`${deltaMs}`)).toString();
return BigInt(Math.trunc(ms));
}
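Reassembled from the added lines above, the narrowed `toDbMs` now throws on non-finite input instead of silently coercing it to `'0'` (shown without `export` so the sketch stands alone):

```typescript
// toDbMs as reassembled from the diff: bigints pass through, finite
// numbers truncate to integer bigints, anything else throws.
function toDbMs(ms: number | bigint): bigint {
  if (typeof ms === 'bigint') {
    return ms;
  }
  if (!Number.isFinite(ms)) {
    throw new TypeError(`Invalid database timestamp: ${ms}`);
  }
  return BigInt(Math.trunc(ms));
}
```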


@@ -19,10 +19,6 @@ interface TrendPerAnimePoint {
interface TrendSessionMetricRow {
startedAtMs: number;
localEpochDay: number;
localMonthKey: number;
localDayOfWeek: number;
localHour: number;
videoId: number | null;
canonicalTitle: string | null;
animeTitle: string | null;
@@ -78,60 +74,63 @@ const TREND_DAY_LIMITS: Record<Exclude<TrendRange, 'all'>, number> = {
};
const DAY_NAMES = ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat'];
const MONTH_NAMES = ['Jan', 'Feb', 'Mar', 'Apr', 'May', 'Jun', 'Jul', 'Aug', 'Sep', 'Oct', 'Nov', 'Dec'];
function getTrendDayLimit(range: TrendRange): number {
return range === 'all' ? 365 : TREND_DAY_LIMITS[range];
}
function getTrendMonthlyLimit(range: TrendRange): number {
switch (range) {
case 'all':
return 120;
case '7d':
return 1;
case '30d':
return 2;
case '90d':
return 4;
if (range === 'all') {
return 120;
}
}
function epochDayToCivil(epochDay: number): { year: number; month: number; day: number } {
const z = epochDay + 719468;
const era = Math.floor(z / 146097);
const doe = z - era * 146097;
const yoe = Math.floor(
(doe - Math.floor(doe / 1460) + Math.floor(doe / 36524) - Math.floor(doe / 146096)) / 365,
const now = new Date();
const cutoff = new Date(
now.getFullYear(),
now.getMonth(),
now.getDate() - (TREND_DAY_LIMITS[range] - 1),
);
let year = yoe + era * 400;
const doy = doe - (365 * yoe + Math.floor(yoe / 4) - Math.floor(yoe / 100));
const mp = Math.floor((5 * doy + 2) / 153);
const day = doy - Math.floor((153 * mp + 2) / 5) + 1;
const month = mp < 10 ? mp + 3 : mp - 9;
if (month <= 2) {
year += 1;
return Math.max(1, (now.getFullYear() - cutoff.getFullYear()) * 12 + now.getMonth() - cutoff.getMonth() + 1);
}
function getTrendCutoffMs(range: TrendRange): number | null {
if (range === 'all') {
return null;
}
return { year, month, day };
const dayLimit = getTrendDayLimit(range);
const now = new Date();
const localMidnight = new Date(now.getFullYear(), now.getMonth(), now.getDate()).getTime();
return localMidnight - (dayLimit - 1) * 86_400_000;
}
function formatEpochDayLabel(epochDay: number): string {
const { month, day } = epochDayToCivil(epochDay);
return `${MONTH_NAMES[month - 1]} ${day}`;
function makeTrendLabel(value: number): string {
if (value > 100_000) {
const year = Math.floor(value / 100);
const month = value % 100;
return new Date(Date.UTC(year, month - 1, 1)).toLocaleDateString(undefined, {
month: 'short',
year: '2-digit',
});
}
return new Date(value * 86_400_000).toLocaleDateString(undefined, {
month: 'short',
day: 'numeric',
});
}
function formatMonthKeyLabel(monthKey: number): string {
const year = Math.floor(monthKey / 100);
const month = monthKey % 100;
return `${MONTH_NAMES[month - 1]} ${String(year).slice(-2)}`;
function getLocalEpochDay(timestampMs: number): number {
const date = new Date(timestampMs);
return Math.floor((timestampMs - date.getTimezoneOffset() * 60_000) / 86_400_000);
}
function formatTrendLabel(value: number): string {
return value > 100_000 ? formatMonthKeyLabel(value) : formatEpochDayLabel(value);
function getLocalDateForEpochDay(epochDay: number): Date {
const utcDate = new Date(epochDay * 86_400_000);
return new Date(utcDate.getTime() + utcDate.getTimezoneOffset() * 60_000);
}
function localMidnightSecondsExpr(): string {
return `(CAST(strftime('%s', 'now') AS INTEGER) - CAST(strftime('%H', 'now', 'localtime') AS INTEGER) * 3600 - CAST(strftime('%M', 'now', 'localtime') AS INTEGER) * 60 - CAST(strftime('%S', 'now', 'localtime') AS INTEGER))`;
function getLocalMonthKey(timestampMs: number): number {
const date = new Date(timestampMs);
return date.getFullYear() * 100 + date.getMonth() + 1;
}
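These bucketing helpers replace the `strftime`/`julianday` columns the query used to compute in SQL. `getLocalEpochDay`, as assembled from the diff, shifts the timestamp by the local zone offset before flooring into whole days:

```typescript
// Local calendar day as days since the local 1970-01-01: subtract the
// zone offset (getTimezoneOffset is minutes, positive west of UTC),
// then floor the shifted timestamp into 86,400,000 ms days.
function getLocalEpochDay(timestampMs: number): number {
  const date = new Date(timestampMs);
  return Math.floor((timestampMs - date.getTimezoneOffset() * 60_000) / 86_400_000);
}
```

Because the offset is sampled per timestamp, consecutive days keep consecutive keys except across a DST transition, which is the same behavior the `'localtime'` SQL modifier had.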
function getTrendSessionWordCount(session: Pick<TrendSessionMetricRow, 'tokensSeen'>): number {
@@ -169,7 +168,7 @@ function buildAggregatedTrendRows(rollups: ImmersionSessionRollupRow[]) {
words: 0,
sessions: 0,
};
existing.activeMin += Math.round(rollup.totalActiveMin);
existing.activeMin += rollup.totalActiveMin;
existing.cards += rollup.totalCards;
existing.words += rollup.totalTokensSeen;
existing.sessions += rollup.totalSessions;
@@ -179,8 +178,8 @@ function buildAggregatedTrendRows(rollups: ImmersionSessionRollupRow[]) {
return Array.from(byKey.entries())
.sort(([left], [right]) => left - right)
.map(([key, value]) => ({
label: formatTrendLabel(key),
activeMin: value.activeMin,
label: makeTrendLabel(key),
activeMin: Math.round(value.activeMin),
cards: value.cards,
words: value.words,
sessions: value.sessions,
@@ -190,7 +189,7 @@ function buildAggregatedTrendRows(rollups: ImmersionSessionRollupRow[]) {
function buildWatchTimeByDayOfWeek(sessions: TrendSessionMetricRow[]): TrendChartPoint[] {
const totals = new Array(7).fill(0);
for (const session of sessions) {
totals[session.localDayOfWeek] += session.activeWatchedMs;
totals[new Date(session.startedAtMs).getDay()] += session.activeWatchedMs;
}
return DAY_NAMES.map((name, index) => ({
label: name,
@@ -201,7 +200,7 @@ function buildWatchTimeByDayOfWeek(sessions: TrendSessionMetricRow[]): TrendChar
function buildWatchTimeByHour(sessions: TrendSessionMetricRow[]): TrendChartPoint[] {
const totals = new Array(24).fill(0);
for (const session of sessions) {
totals[session.localHour] += session.activeWatchedMs;
totals[new Date(session.startedAtMs).getHours()] += session.activeWatchedMs;
}
return totals.map((ms, index) => ({
label: `${String(index).padStart(2, '0')}:00`,
@@ -209,18 +208,25 @@ function buildWatchTimeByHour(sessions: TrendSessionMetricRow[]): TrendChartPoin
}));
}
function dayLabel(epochDay: number): string {
return getLocalDateForEpochDay(epochDay).toLocaleDateString(undefined, {
month: 'short',
day: 'numeric',
});
}
function buildSessionSeriesByDay(
sessions: TrendSessionMetricRow[],
getValue: (session: TrendSessionMetricRow) => number,
): TrendChartPoint[] {
const byDay = new Map<number, number>();
for (const session of sessions) {
const epochDay = session.localEpochDay;
const epochDay = getLocalEpochDay(session.startedAtMs);
byDay.set(epochDay, (byDay.get(epochDay) ?? 0) + getValue(session));
}
return Array.from(byDay.entries())
.sort(([left], [right]) => left - right)
.map(([epochDay, value]) => ({ label: formatEpochDayLabel(epochDay), value }));
.map(([epochDay, value]) => ({ label: dayLabel(epochDay), value }));
}
function buildSessionSeriesByMonth(
@@ -229,30 +235,40 @@ function buildSessionSeriesByMonth(
): TrendChartPoint[] {
const byMonth = new Map<number, number>();
for (const session of sessions) {
const monthKey = session.localMonthKey;
const monthKey = getLocalMonthKey(session.startedAtMs);
byMonth.set(monthKey, (byMonth.get(monthKey) ?? 0) + getValue(session));
}
return Array.from(byMonth.entries())
.sort(([left], [right]) => left - right)
.map(([monthKey, value]) => ({ label: formatMonthKeyLabel(monthKey), value }));
.map(([monthKey, value]) => ({ label: makeTrendLabel(monthKey), value }));
}
function buildLookupsPerHundredWords(sessions: TrendSessionMetricRow[]): TrendChartPoint[] {
const lookupsByDay = new Map<number, number>();
const wordsByDay = new Map<number, number>();
function buildLookupsPerHundredWords(
sessions: TrendSessionMetricRow[],
groupBy: TrendGroupBy,
): TrendChartPoint[] {
const lookupsByBucket = new Map<number, number>();
const wordsByBucket = new Map<number, number>();
for (const session of sessions) {
const epochDay = session.localEpochDay;
lookupsByDay.set(epochDay, (lookupsByDay.get(epochDay) ?? 0) + session.yomitanLookupCount);
wordsByDay.set(epochDay, (wordsByDay.get(epochDay) ?? 0) + getTrendSessionWordCount(session));
const bucketKey =
groupBy === 'month' ? getLocalMonthKey(session.startedAtMs) : getLocalEpochDay(session.startedAtMs);
lookupsByBucket.set(
bucketKey,
(lookupsByBucket.get(bucketKey) ?? 0) + session.yomitanLookupCount,
);
wordsByBucket.set(
bucketKey,
(wordsByBucket.get(bucketKey) ?? 0) + getTrendSessionWordCount(session),
);
}
return Array.from(lookupsByDay.entries())
return Array.from(lookupsByBucket.entries())
.sort(([left], [right]) => left - right)
.map(([epochDay, lookups]) => {
const words = wordsByDay.get(epochDay) ?? 0;
.map(([bucketKey, lookups]) => {
const words = wordsByBucket.get(bucketKey) ?? 0;
return {
label: formatEpochDayLabel(epochDay),
label: groupBy === 'month' ? makeTrendLabel(bucketKey) : dayLabel(bucketKey),
value: words > 0 ? +((lookups / words) * 100).toFixed(1) : 0,
};
});
@@ -266,7 +282,7 @@ function buildPerAnimeFromSessions(
for (const session of sessions) {
const animeTitle = resolveTrendAnimeTitle(session);
const epochDay = session.localEpochDay;
const epochDay = getLocalEpochDay(session.startedAtMs);
const dayMap = byAnime.get(animeTitle) ?? new Map();
dayMap.set(epochDay, (dayMap.get(epochDay) ?? 0) + getValue(session));
byAnime.set(animeTitle, dayMap);
@@ -287,7 +303,7 @@ function buildLookupsPerHundredPerAnime(sessions: TrendSessionMetricRow[]): Tren
for (const session of sessions) {
const animeTitle = resolveTrendAnimeTitle(session);
const epochDay = session.localEpochDay;
const epochDay = getLocalEpochDay(session.startedAtMs);
const lookupMap = lookups.get(animeTitle) ?? new Map();
lookupMap.set(epochDay, (lookupMap.get(epochDay) ?? 0) + session.yomitanLookupCount);
@@ -455,7 +471,7 @@ function buildEpisodesPerDayFromDailyRollups(
return Array.from(byDay.entries())
.sort(([left], [right]) => left - right)
.map(([epochDay, videoIds]) => ({
label: formatEpochDayLabel(epochDay),
label: dayLabel(epochDay),
value: videoIds.size,
}));
}
@@ -475,25 +491,20 @@ function buildEpisodesPerMonthFromRollups(rollups: ImmersionSessionRollupRow[]):
return Array.from(byMonth.entries())
.sort(([left], [right]) => left - right)
.map(([monthKey, videoIds]) => ({
label: formatTrendLabel(monthKey),
label: makeTrendLabel(monthKey),
value: videoIds.size,
}));
}
function getTrendSessionMetrics(db: DatabaseSync, range: TrendRange): TrendSessionMetricRow[] {
const dayLimit = getTrendDayLimit(range);
const cutoffClause =
range === 'all'
? ''
: `WHERE CAST(s.started_at_ms AS INTEGER) >= (${localMidnightSecondsExpr()} - ${(dayLimit - 1) * 86400}) * 1000`;
function getTrendSessionMetrics(
db: DatabaseSync,
cutoffMs: number | null,
): TrendSessionMetricRow[] {
const whereClause = cutoffMs === null ? '' : 'WHERE s.started_at_ms >= ?';
const prepared = db.prepare(`
${ACTIVE_SESSION_METRICS_CTE}
SELECT
- CAST(s.started_at_ms AS INTEGER) AS startedAtMs,
- CAST(julianday(s.started_at_ms / 1000, 'unixepoch', 'localtime') - 2440587.5 AS INTEGER) AS localEpochDay,
- CAST(strftime('%Y%m', s.started_at_ms / 1000, 'unixepoch', 'localtime') AS INTEGER) AS localMonthKey,
- CAST(strftime('%w', s.started_at_ms / 1000, 'unixepoch', 'localtime') AS INTEGER) AS localDayOfWeek,
- CAST(strftime('%H', s.started_at_ms / 1000, 'unixepoch', 'localtime') AS INTEGER) AS localHour,
+ s.started_at_ms AS startedAtMs,
s.video_id AS videoId,
v.canonical_title AS canonicalTitle,
a.canonical_title AS animeTitle,
@@ -505,79 +516,61 @@ function getTrendSessionMetrics(db: DatabaseSync, range: TrendRange): TrendSessi
LEFT JOIN active_session_metrics asm ON asm.sessionId = s.session_id
LEFT JOIN imm_videos v ON v.video_id = s.video_id
LEFT JOIN imm_anime a ON a.anime_id = v.anime_id
- ${cutoffClause}
- ORDER BY CAST(s.started_at_ms AS INTEGER) ASC
+ ${whereClause}
+ ORDER BY s.started_at_ms ASC
`);
- const rows = prepared.all() as Array<{
- startedAtMs: number | string;
- localEpochDay: number | string;
- localMonthKey: number | string;
- localDayOfWeek: number | string;
- localHour: number | string;
- videoId: number | null;
- canonicalTitle: string | null;
- animeTitle: string | null;
- activeWatchedMs: number | string;
- tokensSeen: number | string;
- cardsMined: number | string;
- yomitanLookupCount: number | string;
- }>;
- return rows.map((row) => ({
- startedAtMs: Number.parseInt(String(row.startedAtMs), 10),
- localEpochDay: Number.parseInt(String(row.localEpochDay), 10),
- localMonthKey: Number.parseInt(String(row.localMonthKey), 10),
- localDayOfWeek: Number.parseInt(String(row.localDayOfWeek), 10),
- localHour: Number.parseInt(String(row.localHour), 10),
- videoId: row.videoId,
- canonicalTitle: row.canonicalTitle,
- animeTitle: row.animeTitle,
- activeWatchedMs: Number(row.activeWatchedMs),
- tokensSeen: Number(row.tokensSeen),
- cardsMined: Number(row.cardsMined),
- yomitanLookupCount: Number(row.yomitanLookupCount),
- }));
+ return (cutoffMs === null ? prepared.all() : prepared.all(cutoffMs)) as TrendSessionMetricRow[];
}
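The hunks above move local-day bucketing out of SQL (`julianday`/`strftime` with `'localtime'`) into a JS helper, `getLocalEpochDay`, whose body is not shown in this diff. A minimal sketch of what such a helper typically looks like — the implementation below is an assumption, not the project's code:

```typescript
// Assumed sketch of getLocalEpochDay (the real helper is not in this diff).
// Mirrors SQLite's julianday(..., 'localtime') - 2440587.5 bucketing: shift
// the timestamp by the local UTC offset, then count whole days since epoch.
function getLocalEpochDay(ms: number): number {
  const offsetMinutes = new Date(ms).getTimezoneOffset(); // minutes behind UTC
  const localMs = ms - offsetMinutes * 60_000;
  return Math.floor(localMs / 86_400_000);
}
```

Computing the bucket once per row in JS keeps the SQL portable and lets `started_at_ms` stay a plain integer comparison in the `WHERE` clause.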
- function buildNewWordsPerDay(db: DatabaseSync, dayLimit: number | null): TrendChartPoint[] {
- const cutoffExpr =
- dayLimit === null ? '' : `AND CAST(first_seen AS INTEGER) >= (${localMidnightSecondsExpr()} - ${(dayLimit - 1) * 86400})`;
+ function buildNewWordsPerDay(db: DatabaseSync, cutoffMs: number | null): TrendChartPoint[] {
+ const whereClause = cutoffMs === null ? '' : 'AND first_seen >= ?';
const prepared = db.prepare(`
SELECT
CAST(julianday(first_seen, 'unixepoch', 'localtime') - 2440587.5 AS INTEGER) AS epochDay,
COUNT(*) AS wordCount
FROM imm_words
WHERE first_seen IS NOT NULL
- ${cutoffExpr}
+ ${whereClause}
GROUP BY epochDay
ORDER BY epochDay ASC
`);
- const rows = prepared.all() as Array<{ epochDay: number; wordCount: number }>;
+ const rows = (
+ cutoffMs === null ? prepared.all() : prepared.all(Math.floor(cutoffMs / 1000))
+ ) as Array<{
+ epochDay: number;
+ wordCount: number;
+ }>;
return rows.map((row) => ({
- label: formatEpochDayLabel(row.epochDay),
+ label: dayLabel(row.epochDay),
value: row.wordCount,
}));
}
- function buildNewWordsPerMonth(db: DatabaseSync, dayLimit: number | null): TrendChartPoint[] {
- const cutoffExpr =
- dayLimit === null ? '' : `AND CAST(first_seen AS INTEGER) >= (${localMidnightSecondsExpr()} - ${(dayLimit - 1) * 86400})`;
+ function buildNewWordsPerMonth(db: DatabaseSync, cutoffMs: number | null): TrendChartPoint[] {
+ const whereClause = cutoffMs === null ? '' : 'AND first_seen >= ?';
const prepared = db.prepare(`
SELECT
CAST(strftime('%Y%m', first_seen, 'unixepoch', 'localtime') AS INTEGER) AS monthKey,
COUNT(*) AS wordCount
FROM imm_words
WHERE first_seen IS NOT NULL
- ${cutoffExpr}
+ ${whereClause}
GROUP BY monthKey
ORDER BY monthKey ASC
`);
- const rows = prepared.all() as Array<{ monthKey: number; wordCount: number }>;
+ const rows = (
+ cutoffMs === null ? prepared.all() : prepared.all(Math.floor(cutoffMs / 1000))
+ ) as Array<{
+ monthKey: number;
+ wordCount: number;
+ }>;
return rows.map((row) => ({
- label: formatMonthKeyLabel(row.monthKey),
+ label: makeTrendLabel(row.monthKey),
value: row.wordCount,
}));
}
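`buildNewWordsPerDay` and `buildNewWordsPerMonth` above bind `Math.floor(cutoffMs / 1000)` because `first_seen` is compared in epoch seconds while `getTrendCutoffMs` (per the signature changes in this diff) yields milliseconds. A small sketch of that unit conversion — the 30-day window is an assumed example range:

```typescript
// The trend cutoff is computed in milliseconds, but imm_words.first_seen is
// queried in epoch seconds, hence the floor-division before binding.
// The 30-day window below is an assumed example, not a value from the diff.
const cutoffMs = Date.now() - 29 * 86_400_000;
const cutoffSeconds = Math.floor(cutoffMs / 1000);
```

Flooring (rather than rounding) keeps the cutoff from excluding a word first seen within the same second as the boundary.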
@@ -589,12 +582,13 @@ export function getTrendsDashboard(
): TrendsDashboardQueryResult {
const dayLimit = getTrendDayLimit(range);
const monthlyLimit = getTrendMonthlyLimit(range);
+ const cutoffMs = getTrendCutoffMs(range);
const useMonthlyBuckets = groupBy === 'month';
const dailyRollups = getDailyRollups(db, dayLimit);
const monthlyRollups = getMonthlyRollups(db, monthlyLimit);
const chartRollups = useMonthlyBuckets ? monthlyRollups : dailyRollups;
- const sessions = getTrendSessionMetrics(db, range);
+ const sessions = getTrendSessionMetrics(db, cutoffMs);
const titlesByVideoId = getVideoAnimeTitleMap(
db,
dailyRollups.map((rollup) => rollup.videoId),
@@ -611,7 +605,7 @@ export function getTrendsDashboard(
const animePerDay = {
episodes: buildEpisodesPerAnimeFromDailyRollups(dailyRollups, titlesByVideoId),
watchTime: buildPerAnimeFromDailyRollups(dailyRollups, titlesByVideoId, (rollup) =>
- Math.round(rollup.totalActiveMin),
+ rollup.totalActiveMin,
),
cards: buildPerAnimeFromDailyRollups(
dailyRollups,
@@ -634,7 +628,7 @@ export function getTrendsDashboard(
sessions: accumulatePoints(activity.sessions),
words: accumulatePoints(activity.words),
newWords: accumulatePoints(
- useMonthlyBuckets ? buildNewWordsPerMonth(db, range === 'all' ? null : dayLimit) : buildNewWordsPerDay(db, range === 'all' ? null : dayLimit),
+ useMonthlyBuckets ? buildNewWordsPerMonth(db, cutoffMs) : buildNewWordsPerDay(db, cutoffMs),
),
cards: accumulatePoints(activity.cards),
episodes: accumulatePoints(
@@ -649,7 +643,7 @@ export function getTrendsDashboard(
),
},
ratios: {
- lookupsPerHundred: buildLookupsPerHundredWords(sessions),
+ lookupsPerHundred: buildLookupsPerHundredWords(sessions, groupBy),
},
animePerDay,
animeCumulative: {

View File

@@ -40,7 +40,7 @@ export function startSessionRecord(
export function finalizeSessionRecord(
db: DatabaseSync,
sessionState: SessionState,
- endedAtMs: number | string = nowMs(),
+ endedAtMs = nowMs(),
): void {
db.prepare(
`

View File

@@ -143,10 +143,10 @@ test('ensureSchema creates immersion core tables', () => {
const rollupStateRow = db
.prepare('SELECT state_value FROM imm_rollup_state WHERE state_key = ?')
.get('last_rollup_sample_ms') as {
- state_value: string;
+ state_value: number;
} | null;
assert.ok(rollupStateRow);
- assert.equal(rollupStateRow?.state_value, '0');
+ assert.equal(rollupStateRow?.state_value, 0);
} finally {
db.close();
cleanupDbPath(dbPath);
@@ -965,12 +965,12 @@ test('start/finalize session updates ended_at and status', () => {
const row = db
.prepare('SELECT ended_at_ms, status FROM imm_sessions WHERE session_id = ?')
.get(sessionId) as {
- ended_at_ms: string | null;
+ ended_at_ms: number | null;
status: number;
} | null;
assert.ok(row);
- assert.equal(row?.ended_at_ms, String(endedAtMs));
+ assert.equal(row?.ended_at_ms, endedAtMs);
assert.equal(row?.status, SESSION_STATUS_ENDED);
} finally {
db.close();

View File

@@ -287,9 +287,9 @@ function ensureLifetimeSummaryTables(db: DatabaseSync): void {
episodes_started INTEGER NOT NULL DEFAULT 0,
episodes_completed INTEGER NOT NULL DEFAULT 0,
anime_completed INTEGER NOT NULL DEFAULT 0,
- last_rebuilt_ms TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT
+ last_rebuilt_ms INTEGER,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER
)
`);
@@ -332,10 +332,10 @@ function ensureLifetimeSummaryTables(db: DatabaseSync): void {
total_tokens_seen INTEGER NOT NULL DEFAULT 0,
episodes_started INTEGER NOT NULL DEFAULT 0,
episodes_completed INTEGER NOT NULL DEFAULT 0,
- first_watched_ms TEXT,
- last_watched_ms TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ first_watched_ms INTEGER,
+ last_watched_ms INTEGER,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(anime_id) REFERENCES imm_anime(anime_id) ON DELETE CASCADE
)
`);
@@ -349,10 +349,10 @@ function ensureLifetimeSummaryTables(db: DatabaseSync): void {
total_lines_seen INTEGER NOT NULL DEFAULT 0,
total_tokens_seen INTEGER NOT NULL DEFAULT 0,
completed INTEGER NOT NULL DEFAULT 0,
- first_watched_ms TEXT,
- last_watched_ms TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ first_watched_ms INTEGER,
+ last_watched_ms INTEGER,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(video_id) REFERENCES imm_videos(video_id) ON DELETE CASCADE
)
`);
@@ -360,9 +360,9 @@ function ensureLifetimeSummaryTables(db: DatabaseSync): void {
db.exec(`
CREATE TABLE IF NOT EXISTS imm_lifetime_applied_sessions(
session_id INTEGER PRIMARY KEY,
- applied_at_ms TEXT NOT NULL,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ applied_at_ms INTEGER NOT NULL,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(session_id) REFERENCES imm_sessions(session_id) ON DELETE CASCADE
)
`);
@@ -562,18 +562,18 @@ export function ensureSchema(db: DatabaseSync): void {
db.exec(`
CREATE TABLE IF NOT EXISTS imm_schema_version (
schema_version INTEGER PRIMARY KEY,
- applied_at_ms TEXT NOT NULL
+ applied_at_ms INTEGER NOT NULL
);
`);
db.exec(`
CREATE TABLE IF NOT EXISTS imm_rollup_state(
state_key TEXT PRIMARY KEY,
- state_value TEXT NOT NULL
+ state_value INTEGER NOT NULL
);
`);
db.exec(`
INSERT INTO imm_rollup_state(state_key, state_value)
- VALUES ('last_rollup_sample_ms', '0')
+ VALUES ('last_rollup_sample_ms', 0)
ON CONFLICT(state_key) DO NOTHING
`);
@@ -597,8 +597,8 @@ export function ensureSchema(db: DatabaseSync): void {
episodes_total INTEGER,
description TEXT,
metadata_json TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER
);
`);
db.exec(`
@@ -625,8 +625,8 @@ export function ensureSchema(db: DatabaseSync): void {
bitrate_kbps INTEGER, audio_codec_id INTEGER,
hash_sha256 TEXT, screenshot_path TEXT,
metadata_json TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(anime_id) REFERENCES imm_anime(anime_id) ON DELETE SET NULL
);
`);
@@ -635,8 +635,7 @@ export function ensureSchema(db: DatabaseSync): void {
session_id INTEGER PRIMARY KEY AUTOINCREMENT,
session_uuid TEXT NOT NULL UNIQUE,
video_id INTEGER NOT NULL,
- started_at_ms TEXT NOT NULL,
- ended_at_ms TEXT,
+ started_at_ms INTEGER NOT NULL, ended_at_ms INTEGER,
status INTEGER NOT NULL,
locale_id INTEGER, target_lang_id INTEGER,
difficulty_tier INTEGER, subtitle_mode INTEGER,
@@ -654,8 +653,8 @@ export function ensureSchema(db: DatabaseSync): void {
seek_forward_count INTEGER NOT NULL DEFAULT 0,
seek_backward_count INTEGER NOT NULL DEFAULT 0,
media_buffer_events INTEGER NOT NULL DEFAULT 0,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(video_id) REFERENCES imm_videos(video_id)
);
`);
@@ -663,7 +662,7 @@ export function ensureSchema(db: DatabaseSync): void {
CREATE TABLE IF NOT EXISTS imm_session_telemetry(
telemetry_id INTEGER PRIMARY KEY AUTOINCREMENT,
session_id INTEGER NOT NULL,
- sample_ms TEXT NOT NULL,
+ sample_ms INTEGER NOT NULL,
total_watched_ms INTEGER NOT NULL DEFAULT 0,
active_watched_ms INTEGER NOT NULL DEFAULT 0,
lines_seen INTEGER NOT NULL DEFAULT 0,
@@ -677,8 +676,8 @@ export function ensureSchema(db: DatabaseSync): void {
seek_forward_count INTEGER NOT NULL DEFAULT 0,
seek_backward_count INTEGER NOT NULL DEFAULT 0,
media_buffer_events INTEGER NOT NULL DEFAULT 0,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(session_id) REFERENCES imm_sessions(session_id) ON DELETE CASCADE
);
`);
@@ -686,7 +685,7 @@ export function ensureSchema(db: DatabaseSync): void {
CREATE TABLE IF NOT EXISTS imm_session_events(
event_id INTEGER PRIMARY KEY AUTOINCREMENT,
session_id INTEGER NOT NULL,
- ts_ms TEXT NOT NULL,
+ ts_ms INTEGER NOT NULL,
event_type INTEGER NOT NULL,
line_index INTEGER,
segment_start_ms INTEGER,
@@ -694,8 +693,8 @@ export function ensureSchema(db: DatabaseSync): void {
tokens_delta INTEGER NOT NULL DEFAULT 0,
cards_delta INTEGER NOT NULL DEFAULT 0,
payload_json TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(session_id) REFERENCES imm_sessions(session_id) ON DELETE CASCADE
);
`);
@@ -711,8 +710,8 @@ export function ensureSchema(db: DatabaseSync): void {
cards_per_hour REAL,
tokens_per_min REAL,
lookup_hit_rate REAL,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
PRIMARY KEY (rollup_day, video_id)
);
`);
@@ -725,8 +724,8 @@ export function ensureSchema(db: DatabaseSync): void {
total_lines_seen INTEGER NOT NULL DEFAULT 0,
total_tokens_seen INTEGER NOT NULL DEFAULT 0,
total_cards INTEGER NOT NULL DEFAULT 0,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
PRIMARY KEY (rollup_month, video_id)
);
`);
@@ -769,8 +768,8 @@ export function ensureSchema(db: DatabaseSync): void {
segment_end_ms INTEGER,
text TEXT NOT NULL,
secondary_text TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(session_id) REFERENCES imm_sessions(session_id) ON DELETE CASCADE,
FOREIGN KEY(event_id) REFERENCES imm_session_events(event_id) ON DELETE SET NULL,
FOREIGN KEY(video_id) REFERENCES imm_videos(video_id) ON DELETE CASCADE,
@@ -807,9 +806,9 @@ export function ensureSchema(db: DatabaseSync): void {
title_romaji TEXT,
title_english TEXT,
episodes_total INTEGER,
- fetched_at_ms TEXT NOT NULL,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ fetched_at_ms INTEGER NOT NULL,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(video_id) REFERENCES imm_videos(video_id) ON DELETE CASCADE
);
`);
@@ -828,9 +827,9 @@ export function ensureSchema(db: DatabaseSync): void {
uploader_url TEXT,
description TEXT,
metadata_json TEXT,
- fetched_at_ms TEXT NOT NULL,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ fetched_at_ms INTEGER NOT NULL,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(video_id) REFERENCES imm_videos(video_id) ON DELETE CASCADE
);
`);
@@ -838,24 +837,24 @@ export function ensureSchema(db: DatabaseSync): void {
CREATE TABLE IF NOT EXISTS imm_cover_art_blobs(
blob_hash TEXT PRIMARY KEY,
cover_blob BLOB NOT NULL,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER
);
`);
if (currentVersion?.schema_version === 1) {
- addColumnIfMissing(db, 'imm_videos', 'CREATED_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_videos', 'LAST_UPDATE_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_sessions', 'CREATED_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_sessions', 'LAST_UPDATE_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_session_telemetry', 'CREATED_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_session_telemetry', 'LAST_UPDATE_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_session_events', 'CREATED_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_session_events', 'LAST_UPDATE_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_daily_rollups', 'CREATED_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_daily_rollups', 'LAST_UPDATE_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_monthly_rollups', 'CREATED_DATE', 'TEXT');
- addColumnIfMissing(db, 'imm_monthly_rollups', 'LAST_UPDATE_DATE', 'TEXT');
+ addColumnIfMissing(db, 'imm_videos', 'CREATED_DATE');
+ addColumnIfMissing(db, 'imm_videos', 'LAST_UPDATE_DATE');
+ addColumnIfMissing(db, 'imm_sessions', 'CREATED_DATE');
+ addColumnIfMissing(db, 'imm_sessions', 'LAST_UPDATE_DATE');
+ addColumnIfMissing(db, 'imm_session_telemetry', 'CREATED_DATE');
+ addColumnIfMissing(db, 'imm_session_telemetry', 'LAST_UPDATE_DATE');
+ addColumnIfMissing(db, 'imm_session_events', 'CREATED_DATE');
+ addColumnIfMissing(db, 'imm_session_events', 'LAST_UPDATE_DATE');
+ addColumnIfMissing(db, 'imm_daily_rollups', 'CREATED_DATE');
+ addColumnIfMissing(db, 'imm_daily_rollups', 'LAST_UPDATE_DATE');
+ addColumnIfMissing(db, 'imm_monthly_rollups', 'CREATED_DATE');
+ addColumnIfMissing(db, 'imm_monthly_rollups', 'LAST_UPDATE_DATE');
const migratedAtMs = toDbMs(nowMs());
db.prepare(
@@ -939,8 +938,8 @@ export function ensureSchema(db: DatabaseSync): void {
segment_end_ms INTEGER,
text TEXT NOT NULL,
secondary_text TEXT,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT,
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER,
FOREIGN KEY(session_id) REFERENCES imm_sessions(session_id) ON DELETE CASCADE,
FOREIGN KEY(event_id) REFERENCES imm_session_events(event_id) ON DELETE SET NULL,
FOREIGN KEY(video_id) REFERENCES imm_videos(video_id) ON DELETE CASCADE,
@@ -1089,8 +1088,8 @@ export function ensureSchema(db: DatabaseSync): void {
CREATE TABLE IF NOT EXISTS imm_cover_art_blobs(
blob_hash TEXT PRIMARY KEY,
cover_blob BLOB NOT NULL,
- CREATED_DATE TEXT,
- LAST_UPDATE_DATE TEXT
+ CREATED_DATE INTEGER,
+ LAST_UPDATE_DATE INTEGER
)
`);
deduplicateExistingCoverArtRows(db);
@@ -1238,7 +1237,7 @@ export function ensureSchema(db: DatabaseSync): void {
db.exec('DELETE FROM imm_daily_rollups');
db.exec('DELETE FROM imm_monthly_rollups');
db.exec(
- `UPDATE imm_rollup_state SET state_value = '0' WHERE state_key = 'last_rollup_sample_ms'`,
+ `UPDATE imm_rollup_state SET state_value = 0 WHERE state_key = 'last_rollup_sample_ms'`,
);
}
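The TEXT-to-INTEGER migration above matters for more than storage size: SQLite compares TEXT-affinity values lexicographically, so millisecond timestamps stored as strings order incorrectly once their digit counts differ — which is why the old queries had to wrap every comparison in `CAST(... AS INTEGER)`. A plain-JS illustration of the same pitfall:

```typescript
// Lexicographic vs numeric ordering: the reason *_ms columns move to INTEGER.
// As strings, '1000' sorts before '999' because '1' < '9'.
const asText = ['999', '1000'];
const asInteger = [999, 1000];

const textOrder = [...asText].sort(); // default sort compares as strings
const integerOrder = [...asInteger].sort((a, b) => a - b); // numeric compare
```

With INTEGER columns, range filters like `started_at_ms >= ?` compare numerically without any casting.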
@@ -1421,7 +1420,7 @@ export function executeQueuedWrite(write: QueuedWrite, stmts: TrackerPreparedSta
) {
throw new Error('Incomplete telemetry write');
}
- const telemetrySampleMs = write.sampleMs === undefined ? currentMs : toDbMs(write.sampleMs);
+ const telemetrySampleMs = toDbMs(write.sampleMs ?? Number(currentMs));
stmts.telemetryInsertStmt.run(
write.sessionId,
telemetrySampleMs,
@@ -1496,7 +1495,7 @@ export function executeQueuedWrite(write: QueuedWrite, stmts: TrackerPreparedSta
stmts.eventInsertStmt.run(
write.sessionId,
- write.sampleMs === undefined ? currentMs : toDbMs(write.sampleMs),
+ toDbMs(write.sampleMs ?? Number(currentMs)),
write.eventType ?? 0,
write.lineIndex ?? null,
write.segmentStartMs ?? null,

View File

@@ -59,3 +59,21 @@ export function handleOverlayWindowBeforeInputEvent(options: {
options.preventDefault();
return true;
}
+ export function handleOverlayWindowBlurred(options: {
+ kind: OverlayWindowKind;
+ windowVisible: boolean;
+ isOverlayVisible: (kind: OverlayWindowKind) => boolean;
+ ensureOverlayWindowLevel: () => void;
+ moveWindowTop: () => void;
+ }): boolean {
+ if (options.kind === 'visible' && !options.isOverlayVisible(options.kind)) {
+ return false;
+ }
+ options.ensureOverlayWindowLevel();
+ if (options.kind === 'visible' && options.windowVisible) {
+ options.moveWindowTop();
+ }
+ return true;
+ }

View File

@@ -2,6 +2,7 @@ import assert from 'node:assert/strict';
import test from 'node:test';
import {
handleOverlayWindowBeforeInputEvent,
+ handleOverlayWindowBlurred,
isTabInputForMpvForwarding,
} from './overlay-window-input';
@@ -82,3 +83,58 @@ test('handleOverlayWindowBeforeInputEvent leaves modal Tab handling alone', () =
assert.equal(handled, false);
assert.deepEqual(calls, []);
});
+ test('handleOverlayWindowBlurred skips visible overlay restacking after manual hide', () => {
+ const calls: string[] = [];
+ const handled = handleOverlayWindowBlurred({
+ kind: 'visible',
+ windowVisible: true,
+ isOverlayVisible: () => false,
+ ensureOverlayWindowLevel: () => {
+ calls.push('ensure-level');
+ },
+ moveWindowTop: () => {
+ calls.push('move-top');
+ },
+ });
+ assert.equal(handled, false);
+ assert.deepEqual(calls, []);
+ });
+ test('handleOverlayWindowBlurred preserves active visible/modal window stacking', () => {
+ const calls: string[] = [];
+ assert.equal(
+ handleOverlayWindowBlurred({
+ kind: 'visible',
+ windowVisible: true,
+ isOverlayVisible: () => true,
+ ensureOverlayWindowLevel: () => {
+ calls.push('ensure-visible');
+ },
+ moveWindowTop: () => {
+ calls.push('move-visible');
+ },
+ }),
+ true,
+ );
+ assert.equal(
+ handleOverlayWindowBlurred({
+ kind: 'modal',
+ windowVisible: true,
+ isOverlayVisible: () => false,
+ ensureOverlayWindowLevel: () => {
+ calls.push('ensure-modal');
+ },
+ moveWindowTop: () => {
+ calls.push('move-modal');
+ },
+ }),
+ true,
+ );
+ assert.deepEqual(calls, ['ensure-visible', 'move-visible', 'ensure-modal']);
+ });

View File

@@ -5,6 +5,7 @@ import { createLogger } from '../../logger';
import { IPC_CHANNELS } from '../../shared/ipc/contracts';
import {
handleOverlayWindowBeforeInputEvent,
+ handleOverlayWindowBlurred,
type OverlayWindowKind,
} from './overlay-window-input';
import { buildOverlayWindowOptions } from './overlay-window-options';
@@ -124,12 +125,18 @@ export function createOverlayWindow(
});
window.on('blur', () => {
- if (!window.isDestroyed()) {
- options.ensureOverlayWindowLevel(window);
- if (kind === 'visible' && window.isVisible()) {
+ if (window.isDestroyed()) return;
+ handleOverlayWindowBlurred({
+ kind,
+ windowVisible: window.isVisible(),
+ isOverlayVisible: options.isOverlayVisible,
+ ensureOverlayWindowLevel: () => {
+ options.ensureOverlayWindowLevel(window);
+ },
+ moveWindowTop: () => {
window.moveTop();
}
}
},
});
});
if (options.isDev && kind === 'visible') {

View File

@@ -1,6 +1,6 @@
import { Hono } from 'hono';
import type { ImmersionTrackerService } from './immersion-tracker-service.js';
- import http from 'node:http';
+ import http, { type IncomingMessage, type ServerResponse } from 'node:http';
import { basename, extname, resolve, sep } from 'node:path';
import { readFileSync, existsSync, statSync } from 'node:fs';
import { Readable } from 'node:stream';
@@ -61,6 +61,71 @@ function resolveStatsNoteFieldName(
return null;
}
+ function toFetchHeaders(headers: IncomingMessage['headers']): Headers {
+ const fetchHeaders = new Headers();
+ for (const [name, value] of Object.entries(headers)) {
+ if (value === undefined) continue;
+ if (Array.isArray(value)) {
+ for (const entry of value) {
+ fetchHeaders.append(name, entry);
+ }
+ continue;
+ }
+ fetchHeaders.set(name, value);
+ }
+ return fetchHeaders;
+ }
+ function toFetchRequest(req: IncomingMessage): Request {
+ const method = req.method ?? 'GET';
+ const url = new URL(req.url ?? '/', `http://${req.headers.host ?? '127.0.0.1'}`);
+ const init: RequestInit & { duplex?: 'half' } = {
+ method,
+ headers: toFetchHeaders(req.headers),
+ };
+ if (method !== 'GET' && method !== 'HEAD') {
+ init.body = Readable.toWeb(req) as BodyInit;
+ init.duplex = 'half';
+ }
+ return new Request(url, init);
+ }
+ async function writeFetchResponse(res: ServerResponse, response: Response): Promise<void> {
+ res.statusCode = response.status;
+ response.headers.forEach((value, key) => {
+ res.setHeader(key, value);
+ });
+ const body = await response.arrayBuffer();
+ res.end(Buffer.from(body));
+ }
+ function startNodeHttpServer(
+ app: Hono,
+ config: StatsServerConfig,
+ ): { close: () => void } {
+ const server = http.createServer((req, res) => {
+ void (async () => {
+ try {
+ await writeFetchResponse(res, await app.fetch(toFetchRequest(req)));
+ } catch {
+ res.statusCode = 500;
+ res.end('Internal Server Error');
+ }
+ })();
+ });
+ server.listen(config.port, '127.0.0.1');
+ return {
+ close: () => {
+ server.close();
+ },
+ };
+ }
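One detail worth calling out in `toFetchRequest` above: constructing a WHATWG `Request` from a Node stream requires `duplex: 'half'`, otherwise undici rejects the stream body. A self-contained sketch of just that piece (the URL and payload are illustrative):

```typescript
import { Readable } from 'node:stream';

// Minimal reproduction of the stream-bodied Request that toFetchRequest builds.
// `duplex: 'half'` is mandatory whenever the body is a ReadableStream.
const nodeBody = Readable.from([Buffer.from('hello')]);
const init: RequestInit & { duplex?: 'half' } = {
  method: 'POST',
  body: Readable.toWeb(nodeBody) as BodyInit,
  duplex: 'half',
};
const request = new Request('http://127.0.0.1/echo', init);
```

GET and HEAD requests must omit the body entirely, which is why the helper branches on method before attaching one.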
/** Load known words cache from disk into a Set. Returns null if unavailable. */
function loadKnownWordsSet(cachePath: string | undefined): Set<string> | null {
if (!cachePath || !existsSync(cachePath)) return null;
@@ -1008,21 +1073,18 @@ export function startStatsServer(config: StatsServerConfig): { close: () => void
resolveAnkiNoteId: config.resolveAnkiNoteId,
});
- const bunServe =
- (
- globalThis as typeof globalThis & {
- Bun?: {
- serve?: (options: {
- fetch: (typeof app)['fetch'];
- port: number;
- hostname: string;
- }) => { stop: () => void };
- };
- }
- ).Bun?.serve;
+ const bunRuntime = globalThis as typeof globalThis & {
+ Bun?: {
+ serve?: (options: {
+ fetch: (typeof app)['fetch'];
+ port: number;
+ hostname: string;
+ }) => { stop: () => void };
+ };
+ };
- if (typeof bunServe === 'function') {
- const server = bunServe({
+ if (bunRuntime.Bun?.serve) {
+ const server = bunRuntime.Bun.serve({
fetch: app.fetch,
port: config.port,
hostname: '127.0.0.1',
@@ -1035,52 +1097,5 @@ export function startStatsServer(config: StatsServerConfig): { close: () => void
};
}
- const server = http.createServer(async (req, res) => {
- const url = new URL(`http://127.0.0.1:${config.port}${req.url}`);
- const headers = new Headers();
- for (const [name, value] of Object.entries(req.headers)) {
- if (value === undefined) continue;
- if (Array.isArray(value)) {
- for (const entry of value) {
- headers.append(name, entry);
- }
- } else {
- headers.set(name, value);
- }
- }
- const body =
- req.method === 'GET' || req.method === 'HEAD'
- ? undefined
- : (Readable.toWeb(req) as unknown as BodyInit);
- const response = await app.fetch(
- new Request(url.toString(), {
- method: req.method,
- headers,
- body,
- }),
- );
- res.statusCode = response.status;
- for (const [name, value] of response.headers) {
- res.setHeader(name, value);
- }
- const responseBody = await response.arrayBuffer();
- if (responseBody.byteLength > 0) {
- res.end(Buffer.from(responseBody));
- return;
- }
- res.end();
- });
- server.listen(config.port, '127.0.0.1');
- return {
- close: () => {
- server.close();
- },
- };
+ return startNodeHttpServer(app, config);
}

View File

@@ -35,6 +35,21 @@ test('parseSrtCues handles multi-line subtitle text', () => {
assert.equal(cues[0]!.text, 'これは\nテストです');
});
+ test('parseSrtCues strips HTML-like markup while preserving line breaks', () => {
+ const content = [
+ '1',
+ '00:01:00,000 --> 00:01:05,000',
+ '<font color="japanese">これは</font>',
+ '<font color="japanese">テストです</font>',
+ '',
+ ].join('\n');
+ const cues = parseSrtCues(content);
+ assert.equal(cues.length, 1);
+ assert.equal(cues[0]!.text, 'これは\nテストです');
+ });
test('parseSrtCues handles hours in timestamps', () => {
const content = ['1', '01:30:00,000 --> 01:30:05,000', 'テスト', ''].join('\n');
@@ -134,6 +149,18 @@ test('parseAssCues handles \\N line breaks', () => {
assert.equal(cues[0]!.text, '一行目\\N二行目');
});
+ test('parseAssCues strips HTML-like markup while preserving ASS line breaks', () => {
+ const content = [
+ '[Events]',
+ 'Format: Layer, Start, End, Style, Name, MarginL, MarginR, MarginV, Effect, Text',
+ 'Dialogue: 0,0:00:01.00,0:00:04.00,Default,,0,0,0,,<font color="japanese">一行目</font>\\N<font color="japanese">二行目</font>',
+ ].join('\n');
+ const cues = parseAssCues(content);
+ assert.equal(cues[0]!.text, '一行目\\N二行目');
+ });
test('parseAssCues returns empty for content without Events section', () => {
const content = ['[Script Info]', 'Title: Test'].join('\n');

View File

@@ -4,6 +4,8 @@ export interface SubtitleCue {
text: string;
}
+ const HTML_SUBTITLE_TAG_PATTERN = /<\/?[A-Za-z][^>\n]*>/g;
const SRT_TIMING_PATTERN =
/^\s*(?:(\d{1,2}):)?(\d{2}):(\d{2})[,.](\d{1,3})\s*-->\s*(?:(\d{1,2}):)?(\d{2}):(\d{2})[,.](\d{1,3})/;
@@ -21,6 +23,10 @@ function parseTimestamp(
);
}
+ function sanitizeSubtitleCueText(text: string): string {
+ return text.replace(ASS_OVERRIDE_TAG_PATTERN, '').replace(HTML_SUBTITLE_TAG_PATTERN, '').trim();
+ }
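The new `sanitizeSubtitleCueText` above composes two strip passes: ASS override blocks, then HTML-style tags. A standalone sketch — `ASS_OVERRIDE_TAG_PATTERN` is not shown in this hunk, so the `/\{[^}]*\}/g` used here is an assumed stand-in for it:

```typescript
// HTML pattern copied from the diff; the ASS pattern is an assumed stand-in.
const ASS_OVERRIDE_TAG_PATTERN = /\{[^}]*\}/g;
const HTML_SUBTITLE_TAG_PATTERN = /<\/?[A-Za-z][^>\n]*>/g;

function sanitizeSubtitleCueText(text: string): string {
  // Strip {\...} override blocks first, then <tag> markup, then outer whitespace.
  return text.replace(ASS_OVERRIDE_TAG_PATTERN, '').replace(HTML_SUBTITLE_TAG_PATTERN, '').trim();
}
```

Excluding `\n` from the tag body keeps a match from spanning lines, which helps preserve the line breaks the new tests assert on.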
export function parseSrtCues(content: string): SubtitleCue[] {
const cues: SubtitleCue[] = [];
const lines = content.split(/\r?\n/);
@@ -54,7 +60,7 @@ export function parseSrtCues(content: string): SubtitleCue[] {
i += 1;
}
- const text = textLines.join('\n').trim();
+ const text = sanitizeSubtitleCueText(textLines.join('\n'));
if (text) {
cues.push({ startTime, endTime, text });
}
@@ -140,13 +146,9 @@ export function parseAssCues(content: string): SubtitleCue[] {
continue;
}
- const rawText = fields
- .slice(textFieldIndex)
- .join(',')
- .replace(ASS_OVERRIDE_TAG_PATTERN, '')
- .trim();
- if (rawText) {
- cues.push({ startTime, endTime, text: rawText });
+ const text = sanitizeSubtitleCueText(fields.slice(textFieldIndex).join(','));
+ if (text) {
+ cues.push({ startTime, endTime, text });
}
}

View File

@@ -31,6 +31,7 @@ import {
screen,
} from 'electron';
import { applyControllerConfigUpdate } from './main/controller-config-update.js';
+ import { createDiscordRpcClient } from './main/runtime/discord-rpc-client.js';
import { mergeAiConfig } from './ai/config';
function getPasswordStoreArg(argv: string[]): string | null {
@@ -68,6 +69,26 @@ function getDefaultPasswordStore(): string {
return 'gnome-libsecret';
}
+ function getStartupModeFlags(initialArgs: CliArgs | null | undefined): {
+ shouldUseMinimalStartup: boolean;
+ shouldSkipHeavyStartup: boolean;
+ } {
+ return {
+ shouldUseMinimalStartup: Boolean(
+ initialArgs?.texthooker ||
+ (initialArgs?.stats &&
+ (initialArgs.statsCleanup || initialArgs.statsBackground || initialArgs.statsStop)),
+ ),
+ shouldSkipHeavyStartup: Boolean(
+ initialArgs &&
+ (shouldRunSettingsOnlyStartup(initialArgs) ||
+ initialArgs.stats ||
+ initialArgs.dictionary ||
+ initialArgs.setup),
+ ),
+ };
+ }
protocol.registerSchemesAsPrivileged([
{
scheme: 'chrome-extension',
@@ -335,6 +356,7 @@ import {
import {
detectInstalledFirstRunPlugin,
installFirstRunPluginToDefaultLocation,
+ syncInstalledFirstRunPluginBinaryPath,
} from './main/runtime/first-run-setup-plugin';
import {
applyWindowsMpvShortcuts,
@@ -399,7 +421,7 @@ import {
import { handleMpvCommandFromIpcRuntime } from './main/ipc-mpv-command';
import { registerIpcRuntimeServices } from './main/ipc-runtime';
import { createAnkiJimakuIpcRuntimeServiceDeps } from './main/dependencies';
- import { createMainBootServices } from './main/boot/services';
+ import { createMainBootServices, type MainBootServicesResult } from './main/boot/services';
import { handleCliCommandRuntimeServiceWithContext } from './main/cli-runtime';
import { createOverlayModalRuntimeService } from './main/overlay-runtime';
import { createOverlayModalInputState } from './main/runtime/overlay-modal-input-state';
@@ -596,6 +618,28 @@ const getDefaultSocketPathHandler = createGetDefaultSocketPathHandler(getDefault
function getDefaultSocketPath(): string {
return getDefaultSocketPathHandler();
}
+ type BootServices = MainBootServicesResult<
+ ConfigService,
+ ReturnType<typeof createAnilistTokenStore>,
+ ReturnType<typeof createJellyfinTokenStore>,
+ ReturnType<typeof createAnilistUpdateQueue>,
+ SubtitleWebSocket,
+ ReturnType<typeof createLogger>,
+ ReturnType<typeof createMainRuntimeRegistry>,
+ ReturnType<typeof createOverlayManager>,
+ ReturnType<typeof createOverlayModalInputState>,
+ ReturnType<typeof createOverlayContentMeasurementStore>,
+ ReturnType<typeof createOverlayModalRuntimeService>,
+ ReturnType<typeof createAppState>,
+ {
+ requestSingleInstanceLock: () => boolean;
+ quit: () => void;
+ on: (event: string, listener: (...args: unknown[]) => void) => unknown;
+ whenReady: () => Promise<void>;
+ }
+ >;
const bootServices = createMainBootServices({
platform: process.platform,
argv: process.argv,
@@ -675,31 +719,7 @@ const bootServices = createMainBootServices({
});
},
createAppState,
- }) as {
- configDir: string;
- userDataPath: string;
- defaultMpvLogPath: string;
- defaultImmersionDbPath: string;
- configService: ConfigService;
- anilistTokenStore: ReturnType<typeof createAnilistTokenStore>;
- jellyfinTokenStore: ReturnType<typeof createJellyfinTokenStore>;
- anilistUpdateQueue: ReturnType<typeof createAnilistUpdateQueue>;
- subtitleWsService: SubtitleWebSocket;
- annotationSubtitleWsService: SubtitleWebSocket;
- logger: ReturnType<typeof createLogger>;
- runtimeRegistry: ReturnType<typeof createMainRuntimeRegistry>;
- overlayManager: ReturnType<typeof createOverlayManager>;
- overlayModalInputState: ReturnType<typeof createOverlayModalInputState>;
- overlayContentMeasurementStore: ReturnType<typeof createOverlayContentMeasurementStore>;
- overlayModalRuntime: ReturnType<typeof createOverlayModalRuntimeService>;
- appState: ReturnType<typeof createAppState>;
- appLifecycleApp: {
- requestSingleInstanceLock: () => boolean;
- quit: () => void;
- on: (event: string, listener: (...args: unknown[]) => void) => unknown;
- whenReady: () => Promise<void>;
- };
- };
+ }) as BootServices;
const {
configDir: CONFIG_DIR,
userDataPath: USER_DATA_PATH,
@@ -983,7 +1003,7 @@ const youtubePlaybackRuntime = createYoutubePlaybackRuntime({
mpvYtdlFormat: YOUTUBE_MPV_YTDL_FORMAT,
autoLaunchTimeoutMs: YOUTUBE_MPV_AUTO_LAUNCH_TIMEOUT_MS,
connectTimeoutMs: YOUTUBE_MPV_CONNECT_TIMEOUT_MS,
- socketPath: appState.mpvSocketPath,
+ getSocketPath: () => appState.mpvSocketPath,
getMpvConnected: () => Boolean(appState.mpvClient?.connected),
invalidatePendingAutoplayReadyFallbacks: () =>
autoplayReadyGate.invalidatePendingAutoplayReadyFallbacks(),
@@ -1017,6 +1037,12 @@ const resolveWindowsMpvShortcutRuntimePaths = () =>
appDataDir: app.getPath('appData'),
desktopDir: app.getPath('desktop'),
});
+ syncInstalledFirstRunPluginBinaryPath({
+ platform: process.platform,
+ homeDir: os.homedir(),
+ xdgConfigHome: process.env.XDG_CONFIG_HOME,
+ binaryPath: process.execPath,
+ });
const firstRunSetupService = createFirstRunSetupService({
platform: process.platform,
configDir: CONFIG_DIR,
@@ -1046,6 +1072,7 @@ const firstRunSetupService = createFirstRunSetupService({
dirname: __dirname,
appPath: app.getAppPath(),
resourcesPath: process.resourcesPath,
binaryPath: process.execPath,
}),
detectWindowsMpvShortcuts: () => {
if (process.platform !== 'win32') {
@@ -1097,26 +1124,6 @@ const discordPresenceRuntime = createDiscordPresenceRuntime({
},
});
function createDiscordRpcClient() {
const discordRpc = require('discord-rpc') as {
Client: new (opts: { transport: 'ipc' }) => {
login: (opts: { clientId: string }) => Promise<void>;
setActivity: (activity: Record<string, unknown>) => Promise<void>;
clearActivity: () => Promise<void>;
destroy: () => void;
};
};
const client = new discordRpc.Client({ transport: 'ipc' });
return {
login: () => client.login({ clientId: DISCORD_PRESENCE_APP_ID }),
setActivity: (activity: unknown) =>
client.setActivity(activity as unknown as Record<string, unknown>),
clearActivity: () => client.clearActivity(),
destroy: () => client.destroy(),
};
}
async function initializeDiscordPresenceService(): Promise<void> {
if (getResolvedConfig().discordPresence.enabled !== true) {
appState.discordPresenceService = null;
@@ -1125,7 +1132,7 @@ async function initializeDiscordPresenceService(): Promise<void> {
appState.discordPresenceService = createDiscordPresenceService({
config: getResolvedConfig().discordPresence,
createClient: () => createDiscordRpcClient(),
createClient: () => createDiscordRpcClient(DISCORD_PRESENCE_APP_ID),
logDebug: (message, meta) => logger.debug(message, meta),
});
await appState.discordPresenceService.start();
@@ -3186,21 +3193,9 @@ const { appReadyRuntimeRunner } = composeAppReadyRuntime({
shouldRunHeadlessInitialCommand: () =>
Boolean(appState.initialArgs && isHeadlessInitialCommand(appState.initialArgs)),
shouldUseMinimalStartup: () =>
Boolean(
appState.initialArgs?.texthooker ||
(appState.initialArgs?.stats &&
(appState.initialArgs?.statsCleanup ||
appState.initialArgs?.statsBackground ||
appState.initialArgs?.statsStop)),
),
getStartupModeFlags(appState.initialArgs).shouldUseMinimalStartup,
shouldSkipHeavyStartup: () =>
Boolean(
appState.initialArgs &&
(shouldRunSettingsOnlyStartup(appState.initialArgs) ||
appState.initialArgs.stats ||
appState.initialArgs.dictionary ||
appState.initialArgs.setup),
),
getStartupModeFlags(appState.initialArgs).shouldSkipHeavyStartup,
createImmersionTracker: () => {
ensureImmersionTrackerStarted();
},
@@ -4221,16 +4216,16 @@ const { registerIpcRuntimeHandlers } = composeIpcRuntimeHandlers({
};
}
if (appState.activeParsedSubtitleSource === resolvedSource.sourceKey) {
return {
cues: appState.activeParsedSubtitleCues,
currentTimeSec,
currentSubtitle,
config,
};
}
try {
if (appState.activeParsedSubtitleSource === resolvedSource.sourceKey) {
return {
cues: appState.activeParsedSubtitleCues,
currentTimeSec,
currentSubtitle,
config,
};
}
const content = await loadSubtitleSourceText(resolvedSource.path);
const cues = parseSubtitleCues(content, resolvedSource.path);
appState.activeParsedSubtitleCues = cues;
@@ -4480,11 +4475,20 @@ const { runAndApplyStartupState } = composeHeadlessStartupHandlers<
});
runAndApplyStartupState();
if (isAnilistTrackingEnabled(getResolvedConfig())) {
void refreshAnilistClientSecretStateIfEnabled({ force: true });
anilistStateRuntime.refreshRetryQueueState();
const startupModeFlags = getStartupModeFlags(appState.initialArgs);
const shouldUseMinimalStartup = startupModeFlags.shouldUseMinimalStartup;
const shouldSkipHeavyStartup = startupModeFlags.shouldSkipHeavyStartup;
if (!appState.initialArgs || (!shouldUseMinimalStartup && !shouldSkipHeavyStartup)) {
if (isAnilistTrackingEnabled(getResolvedConfig())) {
void refreshAnilistClientSecretStateIfEnabled({ force: true }).catch((error) => {
logger.error('Failed to refresh AniList client secret state during startup', error);
});
anilistStateRuntime.refreshRetryQueueState();
}
void initializeDiscordPresenceService().catch((error) => {
logger.error('Failed to initialize Discord presence service during startup', error);
});
}
void initializeDiscordPresenceService();
const { createMainWindow: createMainWindowHandler, createModalWindow: createModalWindowHandler } =
createOverlayWindowRuntimeHandlers<BrowserWindow>({
createOverlayWindowDeps: {

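The main.ts diff above replaces two inline boolean predicates with a shared `getStartupModeFlags` helper. A hedged reconstruction from the removed expressions (the argument type is an assumption, and the real helper also consults `shouldRunSettingsOnlyStartup(args)`, elided here):

```typescript
// Hedged reconstruction of the startup-mode flags from the removed inline
// predicates above. The arg shape is assumed; the real helper also ORs in
// shouldRunSettingsOnlyStartup(args) for shouldSkipHeavyStartup.
type InitialArgsLike = {
  texthooker?: boolean;
  stats?: boolean;
  statsCleanup?: boolean;
  statsBackground?: boolean;
  statsStop?: boolean;
  dictionary?: boolean;
  setup?: boolean;
};

function getStartupModeFlags(args: InitialArgsLike | null | undefined): {
  shouldUseMinimalStartup: boolean;
  shouldSkipHeavyStartup: boolean;
} {
  // Minimal startup: texthooker-only, or stats runs that never show a window.
  const shouldUseMinimalStartup = Boolean(
    args?.texthooker ||
      (args?.stats && (args?.statsCleanup || args?.statsBackground || args?.statsStop)),
  );
  // Heavy startup is skipped for stats/dictionary/setup entry points.
  const shouldSkipHeavyStartup = Boolean(
    args && (args.stats || args.dictionary || args.setup),
  );
  return { shouldUseMinimalStartup, shouldSkipHeavyStartup };
}
```

Centralizing the predicates keeps the call sites in `composeAppReadyRuntime` and the post-startup AniList/Discord gate in agreement instead of duplicating the boolean logic.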
View File

@@ -33,13 +33,66 @@ test('autoplay ready gate suppresses duplicate media signals unless forced while
gate.maybeSignalPluginAutoplayReady({ text: '字幕', tokens: null });
gate.maybeSignalPluginAutoplayReady({ text: '字幕', tokens: null }, { forceWhilePaused: true });
await new Promise((resolve) => setTimeout(resolve, 0));
const firstScheduled = scheduled.shift();
firstScheduled?.();
await new Promise((resolve) => setTimeout(resolve, 0));
assert.deepEqual(commands.slice(0, 3), [
['script-message', 'subminer-autoplay-ready'],
['script-message', 'subminer-autoplay-ready'],
assert.deepEqual(commands.filter((command) => command[0] === 'script-message'), [
['script-message', 'subminer-autoplay-ready'],
]);
assert.ok(commands.some((command) => command[0] === 'set_property' && command[1] === 'pause'));
assert.ok(
commands.some(
(command) =>
command[0] === 'set_property' && command[1] === 'pause' && command[2] === false,
),
);
assert.equal(scheduled.length > 0, true);
});
test('autoplay ready gate retry loop does not re-signal plugin readiness', async () => {
const commands: Array<Array<string | boolean>> = [];
const scheduled: Array<() => void> = [];
const gate = createAutoplayReadyGate({
isAppOwnedFlowInFlight: () => false,
getCurrentMediaPath: () => '/media/video.mkv',
getCurrentVideoPath: () => null,
getPlaybackPaused: () => true,
getMpvClient: () =>
({
connected: true,
requestProperty: async () => true,
send: ({ command }: { command: Array<string | boolean> }) => {
commands.push(command);
},
}) as never,
signalPluginAutoplayReady: () => {
commands.push(['script-message', 'subminer-autoplay-ready']);
},
schedule: (callback) => {
scheduled.push(callback);
return 1 as never;
},
logDebug: () => {},
});
gate.maybeSignalPluginAutoplayReady({ text: '字幕', tokens: null }, { forceWhilePaused: true });
await new Promise((resolve) => setTimeout(resolve, 0));
for (const callback of scheduled.splice(0, 3)) {
callback();
await new Promise((resolve) => setTimeout(resolve, 0));
}
assert.deepEqual(commands.filter((command) => command[0] === 'script-message'), [
['script-message', 'subminer-autoplay-ready'],
]);
assert.equal(
commands.filter(
(command) =>
command[0] === 'set_property' && command[1] === 'pause' && command[2] === false,
).length > 0,
true,
);
});

View File

@@ -46,19 +46,6 @@ export function createAutoplayReadyGate(deps: AutoplayReadyGateDeps) {
const duplicateMediaSignal = autoPlayReadySignalMediaPath === mediaPath;
const allowDuplicateWhilePaused =
options?.forceWhilePaused === true && deps.getPlaybackPaused() !== false;
if (duplicateMediaSignal && !allowDuplicateWhilePaused) {
return;
}
if (duplicateMediaSignal && allowDuplicateWhilePaused) {
deps.signalPluginAutoplayReady();
return;
}
autoPlayReadySignalMediaPath = mediaPath;
const playbackGeneration = ++autoPlayReadySignalGeneration;
deps.signalPluginAutoplayReady();
const releaseRetryDelayMs = 200;
const maxReleaseAttempts = resolveAutoplayReadyMaxReleaseAttempts({
forceWhilePaused: options?.forceWhilePaused === true,
@@ -88,7 +75,7 @@ export function createAutoplayReadyGate(deps: AutoplayReadyGateDeps) {
return true;
};
const attemptRelease = (attempt: number): void => {
const attemptRelease = (playbackGeneration: number, attempt: number): void => {
void (async () => {
if (
autoPlayReadySignalMediaPath !== mediaPath ||
@@ -100,7 +87,7 @@ export function createAutoplayReadyGate(deps: AutoplayReadyGateDeps) {
const mpvClient = deps.getMpvClient();
if (!mpvClient?.connected) {
if (attempt < maxReleaseAttempts) {
deps.schedule(() => attemptRelease(attempt + 1), releaseRetryDelayMs);
deps.schedule(() => attemptRelease(playbackGeneration, attempt + 1), releaseRetryDelayMs);
}
return;
}
@@ -110,15 +97,27 @@ export function createAutoplayReadyGate(deps: AutoplayReadyGateDeps) {
return;
}
deps.signalPluginAutoplayReady();
mpvClient.send({ command: ['set_property', 'pause', false] });
if (attempt < maxReleaseAttempts) {
deps.schedule(() => attemptRelease(attempt + 1), releaseRetryDelayMs);
deps.schedule(() => attemptRelease(playbackGeneration, attempt + 1), releaseRetryDelayMs);
}
})();
};
attemptRelease(0);
if (duplicateMediaSignal && !allowDuplicateWhilePaused) {
return;
}
if (!duplicateMediaSignal) {
autoPlayReadySignalMediaPath = mediaPath;
const playbackGeneration = ++autoPlayReadySignalGeneration;
deps.signalPluginAutoplayReady();
attemptRelease(playbackGeneration, 0);
return;
}
const playbackGeneration = ++autoPlayReadySignalGeneration;
attemptRelease(playbackGeneration, 0);
};
return {

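The gate refactor above threads an explicit `playbackGeneration` into `attemptRelease` so a retry chain started for one media signal stops firing once a newer signal bumps the generation counter. A minimal, synchronous sketch of that generation-token pattern (names hypothetical, not the project's API):

```typescript
// Generation-token cancellation sketch: each start() claims a fresh token,
// and in-flight retry steps bail out once a newer start() supersedes them.
function createRetryLoop(maxAttempts: number, runAttempt: (attempt: number) => boolean) {
  let generation = 0;
  return {
    start(): void {
      const myGeneration = ++generation;
      const step = (attempt: number): void => {
        if (myGeneration !== generation) return; // superseded by a newer start()
        if (runAttempt(attempt)) return; // success: stop retrying
        if (attempt + 1 < maxAttempts) step(attempt + 1);
      };
      step(0);
    },
  };
}

const attempts: number[] = [];
const loop = createRetryLoop(3, (attempt) => {
  attempts.push(attempt);
  return false; // never succeeds, so the loop exhausts its attempts
});
loop.start(); // attempts is now [0, 1, 2]
```

In the real gate each step is scheduled asynchronously via `deps.schedule(...)`, so the generation check is what keeps a stale chain from re-signaling after a new media path arrives.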
View File

@@ -0,0 +1,38 @@
import assert from 'node:assert/strict';
import test from 'node:test';
import { createDiscordRpcClient } from './discord-rpc-client';
test('createDiscordRpcClient forwards rich presence calls through client.user', async () => {
const calls: Array<string> = [];
const rpcClient = createDiscordRpcClient('123456789012345678', {
createClient: () =>
({
login: async () => {
calls.push('login');
},
user: {
setActivity: async () => {
calls.push('setActivity');
},
clearActivity: async () => {
calls.push('clearActivity');
},
},
destroy: async () => {
calls.push('destroy');
},
}) as never,
});
await rpcClient.login();
await rpcClient.setActivity({
details: 'Title',
state: 'Playing 00:01 / 00:02',
startTimestamp: 1_700_000_000,
});
await rpcClient.clearActivity();
await rpcClient.destroy();
assert.deepEqual(calls, ['login', 'setActivity', 'clearActivity', 'destroy']);
});

View File

@@ -0,0 +1,49 @@
import { Client } from '@xhayper/discord-rpc';
import type { DiscordActivityPayload } from '../../core/services/discord-presence';
type DiscordRpcClientUserLike = {
setActivity: (activity: DiscordActivityPayload) => Promise<unknown>;
clearActivity: () => Promise<void>;
};
type DiscordRpcRawClient = {
login: () => Promise<void>;
destroy: () => Promise<void>;
user?: DiscordRpcClientUserLike;
};
export type DiscordRpcClient = {
login: () => Promise<void>;
setActivity: (activity: DiscordActivityPayload) => Promise<void>;
clearActivity: () => Promise<void>;
destroy: () => Promise<void>;
};
function requireUser(client: DiscordRpcRawClient): DiscordRpcClientUserLike {
if (!client.user) {
throw new Error('Discord RPC client user is unavailable');
}
return client.user;
}
export function wrapDiscordRpcClient(client: DiscordRpcRawClient): DiscordRpcClient {
return {
login: () => client.login(),
setActivity: (activity) => requireUser(client).setActivity(activity).then(() => undefined),
clearActivity: () => requireUser(client).clearActivity(),
destroy: () => client.destroy(),
};
}
export function createDiscordRpcClient(
clientId: string,
deps?: { createClient?: (options: { clientId: string; transport: { type: 'ipc' } }) => DiscordRpcRawClient },
): DiscordRpcClient {
const client =
deps?.createClient?.({ clientId, transport: { type: 'ipc' } }) ??
new Client({ clientId, transport: { type: 'ipc' } });
return wrapDiscordRpcClient(client);
}

View File

@@ -7,6 +7,7 @@ import {
detectInstalledFirstRunPlugin,
installFirstRunPluginToDefaultLocation,
resolvePackagedFirstRunPluginAssets,
syncInstalledFirstRunPluginBinaryPath,
} from './first-run-setup-plugin';
import { resolveDefaultMpvInstallPaths } from '../../shared/setup-state';
@@ -68,13 +69,17 @@ test('installFirstRunPluginToDefaultLocation installs plugin and backs up existi
dirname: path.join(root, 'dist', 'main', 'runtime'),
appPath: path.join(root, 'app'),
resourcesPath,
binaryPath: '/Applications/SubMiner.app/Contents/MacOS/SubMiner',
});
assert.equal(result.ok, true);
assert.equal(result.pluginInstallStatus, 'installed');
assert.equal(detectInstalledFirstRunPlugin(installPaths), true);
assert.equal(fs.readFileSync(installPaths.pluginEntrypointPath, 'utf8'), '-- packaged plugin');
assert.equal(fs.readFileSync(installPaths.pluginConfigPath, 'utf8'), 'configured=true\n');
assert.equal(
fs.readFileSync(installPaths.pluginConfigPath, 'utf8'),
'configured=true\nbinary_path=/Applications/SubMiner.app/Contents/MacOS/SubMiner\n',
);
const scriptsDirEntries = fs.readdirSync(installPaths.scriptsDir);
const scriptOptsEntries = fs.readdirSync(installPaths.scriptOptsDir);
@@ -113,13 +118,17 @@ test('installFirstRunPluginToDefaultLocation installs plugin to Windows mpv defa
dirname: path.join(root, 'dist', 'main', 'runtime'),
appPath: path.join(root, 'app'),
resourcesPath,
binaryPath: 'C:\\Program Files\\SubMiner\\SubMiner.exe',
});
assert.equal(result.ok, true);
assert.equal(result.pluginInstallStatus, 'installed');
assert.equal(detectInstalledFirstRunPlugin(installPaths), true);
assert.equal(fs.readFileSync(installPaths.pluginEntrypointPath, 'utf8'), '-- packaged plugin');
assert.equal(fs.readFileSync(installPaths.pluginConfigPath, 'utf8'), 'configured=true\n');
assert.equal(
fs.readFileSync(installPaths.pluginConfigPath, 'utf8'),
'configured=true\nbinary_path=C:\\Program Files\\SubMiner\\SubMiner.exe\n',
);
});
});
@@ -146,12 +155,70 @@ test('installFirstRunPluginToDefaultLocation rewrites Windows plugin socket_path
dirname: path.join(root, 'dist', 'main', 'runtime'),
appPath: path.join(root, 'app'),
resourcesPath,
binaryPath: 'C:\\Program Files\\SubMiner\\SubMiner.exe',
});
assert.equal(result.ok, true);
assert.equal(
fs.readFileSync(installPaths.pluginConfigPath, 'utf8'),
'binary_path=\nsocket_path=\\\\.\\pipe\\subminer-socket\n',
'binary_path=C:\\Program Files\\SubMiner\\SubMiner.exe\nsocket_path=\\\\.\\pipe\\subminer-socket\n',
);
});
});
test('syncInstalledFirstRunPluginBinaryPath fills blank binary_path for existing installs', () => {
withTempDir((root) => {
const homeDir = path.join(root, 'home');
const xdgConfigHome = path.join(root, 'xdg');
const installPaths = resolveDefaultMpvInstallPaths('linux', homeDir, xdgConfigHome);
fs.mkdirSync(path.dirname(installPaths.pluginConfigPath), { recursive: true });
fs.writeFileSync(installPaths.pluginConfigPath, 'binary_path=\nsocket_path=/tmp/subminer-socket\n');
const result = syncInstalledFirstRunPluginBinaryPath({
platform: 'linux',
homeDir,
xdgConfigHome,
binaryPath: '/Applications/SubMiner.app/Contents/MacOS/SubMiner',
});
assert.deepEqual(result, {
updated: true,
configPath: installPaths.pluginConfigPath,
});
assert.equal(
fs.readFileSync(installPaths.pluginConfigPath, 'utf8'),
'binary_path=/Applications/SubMiner.app/Contents/MacOS/SubMiner\nsocket_path=/tmp/subminer-socket\n',
);
});
});
test('syncInstalledFirstRunPluginBinaryPath preserves explicit binary_path overrides', () => {
withTempDir((root) => {
const homeDir = path.join(root, 'home');
const xdgConfigHome = path.join(root, 'xdg');
const installPaths = resolveDefaultMpvInstallPaths('linux', homeDir, xdgConfigHome);
fs.mkdirSync(path.dirname(installPaths.pluginConfigPath), { recursive: true });
fs.writeFileSync(
installPaths.pluginConfigPath,
'binary_path=/tmp/SubMiner/scripts/subminer-dev.sh\nsocket_path=/tmp/subminer-socket\n',
);
const result = syncInstalledFirstRunPluginBinaryPath({
platform: 'linux',
homeDir,
xdgConfigHome,
binaryPath: '/Applications/SubMiner.app/Contents/MacOS/SubMiner',
});
assert.deepEqual(result, {
updated: false,
configPath: installPaths.pluginConfigPath,
});
assert.equal(
fs.readFileSync(installPaths.pluginConfigPath, 'utf8'),
'binary_path=/tmp/SubMiner/scripts/subminer-dev.sh\nsocket_path=/tmp/subminer-socket\n',
);
});
});

View File

@@ -28,6 +28,43 @@ function rewriteInstalledWindowsPluginConfig(configPath: string): void {
}
}
function sanitizePluginConfigValue(value: string): string {
return value.replace(/[\r\n]/g, '').trim();
}
function upsertPluginConfigLine(content: string, key: string, value: string): string {
const normalizedValue = sanitizePluginConfigValue(value);
const line = `${key}=${normalizedValue}`;
const pattern = new RegExp(`^${key}=.*$`, 'm');
if (pattern.test(content)) {
return content.replace(pattern, line);
}
const suffix = content.endsWith('\n') || content.length === 0 ? '' : '\n';
return `${content}${suffix}${line}\n`;
}
function rewriteInstalledPluginBinaryPath(configPath: string, binaryPath: string): boolean {
const content = fs.readFileSync(configPath, 'utf8');
const updated = upsertPluginConfigLine(content, 'binary_path', binaryPath);
if (updated === content) {
return false;
}
fs.writeFileSync(configPath, updated, 'utf8');
return true;
}
function readInstalledPluginBinaryPath(configPath: string): string | null {
const content = fs.readFileSync(configPath, 'utf8');
const match = content.match(/^binary_path=(.*)$/m);
if (!match) {
return null;
}
const rawValue = match[1] ?? '';
const value = sanitizePluginConfigValue(rawValue);
return value.length > 0 ? value : null;
}
export function resolvePackagedFirstRunPluginAssets(deps: {
dirname: string;
appPath: string;
@@ -79,6 +116,7 @@ export function installFirstRunPluginToDefaultLocation(options: {
dirname: string;
appPath: string;
resourcesPath: string;
binaryPath: string;
}): PluginInstallResult {
const installPaths = resolveDefaultMpvInstallPaths(
options.platform,
@@ -116,6 +154,7 @@ export function installFirstRunPluginToDefaultLocation(options: {
backupExistingPath(installPaths.pluginConfigPath);
fs.cpSync(assets.pluginDirSource, installPaths.pluginDir, { recursive: true });
fs.copyFileSync(assets.pluginConfigSource, installPaths.pluginConfigPath);
rewriteInstalledPluginBinaryPath(installPaths.pluginConfigPath, options.binaryPath);
if (options.platform === 'win32') {
rewriteInstalledWindowsPluginConfig(installPaths.pluginConfigPath);
}
@@ -127,3 +166,33 @@ export function installFirstRunPluginToDefaultLocation(options: {
message: `Installed mpv plugin to ${installPaths.mpvConfigDir}.`,
};
}
export function syncInstalledFirstRunPluginBinaryPath(options: {
platform: NodeJS.Platform;
homeDir: string;
xdgConfigHome?: string;
binaryPath: string;
}): { updated: boolean; configPath: string | null } {
const installPaths = resolveDefaultMpvInstallPaths(
options.platform,
options.homeDir,
options.xdgConfigHome,
);
if (!installPaths.supported || !fs.existsSync(installPaths.pluginConfigPath)) {
return { updated: false, configPath: null };
}
const configuredBinaryPath = readInstalledPluginBinaryPath(installPaths.pluginConfigPath);
if (configuredBinaryPath) {
return { updated: false, configPath: installPaths.pluginConfigPath };
}
const updated = rewriteInstalledPluginBinaryPath(installPaths.pluginConfigPath, options.binaryPath);
if (options.platform === 'win32') {
rewriteInstalledWindowsPluginConfig(installPaths.pluginConfigPath);
}
return {
updated,
configPath: installPaths.pluginConfigPath,
};
}

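A standalone usage sketch of the `upsertPluginConfigLine` helper above, copied here verbatim with its sanitizer so the snippet runs on its own:

```typescript
// Self-contained copy of the upsert helper from the diff above, plus a demo.
function sanitizePluginConfigValue(value: string): string {
  return value.replace(/[\r\n]/g, '').trim();
}

function upsertPluginConfigLine(content: string, key: string, value: string): string {
  const normalizedValue = sanitizePluginConfigValue(value);
  const line = `${key}=${normalizedValue}`;
  const pattern = new RegExp(`^${key}=.*$`, 'm');
  if (pattern.test(content)) {
    return content.replace(pattern, line);
  }
  const suffix = content.endsWith('\n') || content.length === 0 ? '' : '\n';
  return `${content}${suffix}${line}\n`;
}

// An existing key (even with a blank value) is rewritten in place:
const updated = upsertPluginConfigLine(
  'binary_path=\nsocket_path=/tmp/subminer-socket\n',
  'binary_path',
  '/usr/bin/subminer',
);
// updated === 'binary_path=/usr/bin/subminer\nsocket_path=/tmp/subminer-socket\n'

// A missing key is appended on its own line:
const appended = upsertPluginConfigLine('socket_path=/tmp/s\n', 'binary_path', '/usr/bin/subminer');
// appended === 'socket_path=/tmp/s\nbinary_path=/usr/bin/subminer\n'
```

Note that the key is interpolated into the `RegExp` unescaped, which is fine for the fixed keys this module uses (`binary_path`, `socket_path`) but would need escaping for arbitrary keys.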
View File

@@ -16,21 +16,20 @@ test('createCreateFirstRunSetupWindowHandler builds first-run setup window', ()
});
assert.deepEqual(createSetupWindow(), { id: 'first-run' });
const { resizable, minimizable, maximizable, ...firstRunWindowOptions } = options!;
assert.deepEqual(firstRunWindowOptions, {
assert.deepEqual(options, {
width: 480,
height: 460,
title: 'SubMiner Setup',
show: true,
autoHideMenuBar: true,
resizable: false,
minimizable: false,
maximizable: false,
webPreferences: {
nodeIntegration: false,
contextIsolation: true,
},
});
assert.equal(resizable, false);
assert.equal(minimizable, false);
assert.equal(maximizable, false);
});
test('createCreateJellyfinSetupWindowHandler builds jellyfin setup window', () => {
@@ -43,13 +42,7 @@ test('createCreateJellyfinSetupWindowHandler builds jellyfin setup window', () =
});
assert.deepEqual(createSetupWindow(), { id: 'jellyfin' });
const {
resizable: jellyfinResizable,
minimizable: jellyfinMinimizable,
maximizable: jellyfinMaximizable,
...jellyfinWindowOptions
} = options!;
assert.deepEqual(jellyfinWindowOptions, {
assert.deepEqual(options, {
width: 520,
height: 560,
title: 'Jellyfin Setup',
@@ -60,9 +53,6 @@ test('createCreateJellyfinSetupWindowHandler builds jellyfin setup window', () =
contextIsolation: true,
},
});
assert.equal(jellyfinResizable, undefined);
assert.equal(jellyfinMinimizable, undefined);
assert.equal(jellyfinMaximizable, undefined);
});
test('createCreateAnilistSetupWindowHandler builds anilist setup window', () => {
@@ -75,13 +65,7 @@ test('createCreateAnilistSetupWindowHandler builds anilist setup window', () =>
});
assert.deepEqual(createSetupWindow(), { id: 'anilist' });
const {
resizable: anilistResizable,
minimizable: anilistMinimizable,
maximizable: anilistMaximizable,
...anilistWindowOptions
} = options!;
assert.deepEqual(anilistWindowOptions, {
assert.deepEqual(options, {
width: 1000,
height: 760,
title: 'Anilist Setup',
@@ -92,7 +76,4 @@ test('createCreateAnilistSetupWindowHandler builds anilist setup window', () =>
contextIsolation: true,
},
});
assert.equal(anilistResizable, undefined);
assert.equal(anilistMinimizable, undefined);
assert.equal(anilistMaximizable, undefined);
});

View File

@@ -11,23 +11,21 @@ function createSetupWindowHandler<TWindow>(
deps: { createBrowserWindow: (options: Electron.BrowserWindowConstructorOptions) => TWindow },
config: SetupWindowConfig,
) {
return (): TWindow => {
const options: Electron.BrowserWindowConstructorOptions = {
return (): TWindow =>
deps.createBrowserWindow({
width: config.width,
height: config.height,
title: config.title,
show: true,
autoHideMenuBar: true,
...(config.resizable === undefined ? {} : { resizable: config.resizable }),
...(config.minimizable === undefined ? {} : { minimizable: config.minimizable }),
...(config.maximizable === undefined ? {} : { maximizable: config.maximizable }),
webPreferences: {
nodeIntegration: false,
contextIsolation: true,
},
};
if (config.resizable !== undefined) options.resizable = config.resizable;
if (config.minimizable !== undefined) options.minimizable = config.minimizable;
if (config.maximizable !== undefined) options.maximizable = config.maximizable;
return deps.createBrowserWindow(options);
};
});
}
export function createCreateFirstRunSetupWindowHandler<TWindow>(deps: {

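The setup-window refactor above swaps imperative `if (...) options.x = ...` assignments for conditional spreads, so an unset flag never appears as an `undefined`-valued key and the tests can `deepEqual` the options object directly. A sketch of the pattern (the types here are hypothetical, not Electron's):

```typescript
// Sketch of the conditional-spread pattern from the refactor above.
type WindowOptions = { width: number; resizable?: boolean };

function buildOptions(width: number, resizable?: boolean): WindowOptions {
  return {
    width,
    // Spreading {} leaves `resizable` absent instead of set to undefined,
    // so deep-equality assertions and `'resizable' in options` stay exact.
    ...(resizable === undefined ? {} : { resizable }),
  };
}

const fixed = buildOptions(480, false); // { width: 480, resizable: false }
const plain = buildOptions(520); // { width: 520 } — no `resizable` key at all
```

This is also the shape TypeScript's `exactOptionalPropertyTypes` pushes toward: the key is either present with a defined value or absent entirely.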
View File

@@ -6,6 +6,7 @@ test('youtube playback runtime resets flow ownership after a successful run', as
const calls: string[] = [];
let appOwnedFlowInFlight = false;
let timeoutCallback: (() => void) | null = null;
let socketPath = '/tmp/mpv.sock';
const runtime = createYoutubePlaybackRuntime({
platform: 'linux',
@@ -13,7 +14,7 @@ test('youtube playback runtime resets flow ownership after a successful run', as
mpvYtdlFormat: 'bestvideo+bestaudio',
autoLaunchTimeoutMs: 2_000,
connectTimeoutMs: 1_000,
socketPath: '/tmp/mpv.sock',
getSocketPath: () => socketPath,
getMpvConnected: () => true,
invalidatePendingAutoplayReadyFallbacks: () => {
calls.push('invalidate-autoplay');
@@ -78,3 +79,70 @@ test('youtube playback runtime resets flow ownership after a successful run', as
scheduledCallback();
assert.equal(runtime.getQuitOnDisconnectArmed(), true);
});
test('youtube playback runtime resolves the socket path lazily for windows startup', async () => {
const calls: string[] = [];
let socketPath = '/tmp/initial.sock';
const runtime = createYoutubePlaybackRuntime({
platform: 'win32',
directPlaybackFormat: 'best',
mpvYtdlFormat: 'bestvideo+bestaudio',
autoLaunchTimeoutMs: 2_000,
connectTimeoutMs: 1_000,
getSocketPath: () => socketPath,
getMpvConnected: () => false,
invalidatePendingAutoplayReadyFallbacks: () => {
calls.push('invalidate-autoplay');
},
setAppOwnedFlowInFlight: (next) => {
calls.push(`app-owned:${next}`);
},
ensureYoutubePlaybackRuntimeReady: async () => {
calls.push('ensure-runtime-ready');
},
resolveYoutubePlaybackUrl: async (url, format) => {
calls.push(`resolve:${url}:${format}`);
return 'https://example.com/direct';
},
launchWindowsMpv: (_playbackUrl, args) => {
calls.push(`launch:${args.join(' ')}`);
return { ok: true, mpvPath: '/usr/bin/mpv' };
},
waitForYoutubeMpvConnected: async (timeoutMs) => {
calls.push(`wait-connected:${timeoutMs}`);
return true;
},
prepareYoutubePlaybackInMpv: async ({ url }) => {
calls.push(`prepare:${url}`);
return true;
},
runYoutubePlaybackFlow: async ({ url, mode }) => {
calls.push(`run-flow:${url}:${mode}`);
},
logInfo: (message) => {
calls.push(`info:${message}`);
},
logWarn: (message) => {
calls.push(`warn:${message}`);
},
schedule: (callback) => {
calls.push('schedule-arm');
callback();
return 1 as never;
},
clearScheduled: () => {
calls.push('clear-scheduled');
},
});
socketPath = '/tmp/updated.sock';
await runtime.runYoutubePlaybackFlow({
url: 'https://youtu.be/demo',
mode: 'download',
source: 'initial',
});
assert.ok(calls.some((entry) => entry.includes('--input-ipc-server=/tmp/updated.sock')));
});

View File

@@ -11,7 +11,7 @@ export type YoutubePlaybackRuntimeDeps = {
mpvYtdlFormat: string;
autoLaunchTimeoutMs: number;
connectTimeoutMs: number;
socketPath: string;
getSocketPath: () => string;
getMpvConnected: () => boolean;
invalidatePendingAutoplayReadyFallbacks: () => void;
setAppOwnedFlowInFlight: (next: boolean) => void;
@@ -76,6 +76,7 @@ export function createYoutubePlaybackRuntime(deps: YoutubePlaybackRuntimeDeps) {
}
if (deps.platform === 'win32' && !deps.getMpvConnected()) {
const socketPath = deps.getSocketPath();
const launchResult = deps.launchWindowsMpv(playbackUrl, [
'--pause=yes',
'--ytdl=yes',
@@ -87,7 +88,7 @@ export function createYoutubePlaybackRuntime(deps: YoutubePlaybackRuntimeDeps) {
'--secondary-sub-visibility=no',
'--alang=ja,jp,jpn,japanese,en,eng,english,enus,en-us',
'--slang=ja,jp,jpn,japanese,en,eng,english,enus,en-us',
`--input-ipc-server=${deps.socketPath}`,
`--input-ipc-server=${socketPath}`,
]);
launchedWindowsMpv = launchResult.ok;
if (launchResult.ok && launchResult.mpvPath) {

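The `socketPath: string` to `getSocketPath: () => string` change above turns a value captured at construction time into one read at launch time, so a socket path reassigned after the runtime is built (as the new Windows startup test exercises) is still honored. A minimal sketch with hypothetical names:

```typescript
// Sketch: a captured value freezes at construction time; a getter stays live.
type Runtime = { launchArgs: () => string[] };

function createRuntimeWithValue(socketPath: string): Runtime {
  // `socketPath` is copied once, when the runtime is created.
  return { launchArgs: () => [`--input-ipc-server=${socketPath}`] };
}

function createRuntimeWithGetter(getSocketPath: () => string): Runtime {
  // The getter is re-evaluated on every launch.
  return { launchArgs: () => [`--input-ipc-server=${getSocketPath()}`] };
}

let socketPath = '/tmp/initial.sock';
const frozen = createRuntimeWithValue(socketPath);
const live = createRuntimeWithGetter(() => socketPath);
socketPath = '/tmp/updated.sock';
// frozen still reports the stale path; live picks up the reassignment.
```

The same dependency-injection shape already appears elsewhere in the diff (`getMpvConnected`, `getMpvClient`), which keeps runtime modules decoupled from mutable `appState` fields.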
View File

@@ -518,6 +518,26 @@ test('popup-visible mpv keybindings still fire for bound keys', async () => {
}
});
test('visible-layer y-t dispatches mpv plugin toggle while overlay owns focus', async () => {
const { handlers, testGlobals } = createKeyboardHandlerHarness();
try {
await handlers.setupMpvInputForwarding();
testGlobals.dispatchKeydown({ key: 'y', code: 'KeyY' });
testGlobals.dispatchKeydown({ key: 't', code: 'KeyT' });
assert.equal(
testGlobals.mpvCommands.some(
(command) => command[0] === 'script-message' && command[1] === 'subminer-toggle',
),
true,
);
} finally {
testGlobals.restore();
}
});
test('keyboard mode: controller helpers dispatch popup audio play/cycle and scroll bridge commands', async () => {
const { ctx, handlers, testGlobals } = createKeyboardHandlerHarness();

View File

@@ -1241,6 +1241,7 @@ test('subtitle sidebar closes and resumes a hover pause', async () => {
const previousDocument = globals.document;
const mpvCommands: Array<Array<string | number>> = [];
const modalListeners = new Map<string, Array<() => void>>();
const contentListeners = new Map<string, Array<() => void>>();
const snapshot: SubtitleSidebarSnapshot = {
cues: [{ startTime: 1, endTime: 2, text: 'first' }],
@@ -1317,6 +1318,11 @@ test('subtitle sidebar closes and resumes a hover pause', async () => {
subtitleSidebarContent: {
classList: createClassList(),
getBoundingClientRect: () => ({ width: 420 }),
addEventListener: (type: string, listener: () => void) => {
const bucket = contentListeners.get(type) ?? [];
bucket.push(listener);
contentListeners.set(type, bucket);
},
},
subtitleSidebarClose: { addEventListener: () => {} },
subtitleSidebarStatus: { textContent: '' },
@@ -1333,7 +1339,7 @@ test('subtitle sidebar closes and resumes a hover pause', async () => {
await modal.openSubtitleSidebarModal();
await modal.refreshSubtitleSidebarSnapshot();
mpvCommands.length = 0;
await modalListeners.get('mouseenter')?.[0]?.();
await contentListeners.get('mouseenter')?.[0]?.();
assert.deepEqual(mpvCommands.at(-1), ['set_property', 'pause', 'yes']);
@@ -1353,6 +1359,7 @@ test('subtitle sidebar hover pause ignores playback-state IPC failures', async (
const previousDocument = globals.document;
const mpvCommands: Array<Array<string | number>> = [];
const modalListeners = new Map<string, Array<() => Promise<void> | void>>();
const contentListeners = new Map<string, Array<() => Promise<void> | void>>();
const snapshot: SubtitleSidebarSnapshot = {
cues: [{ startTime: 1, endTime: 2, text: 'first' }],
@@ -1431,6 +1438,11 @@ test('subtitle sidebar hover pause ignores playback-state IPC failures', async (
subtitleSidebarContent: {
classList: createClassList(),
getBoundingClientRect: () => ({ width: 420 }),
addEventListener: (type: string, listener: () => Promise<void> | void) => {
const bucket = contentListeners.get(type) ?? [];
bucket.push(listener);
contentListeners.set(type, bucket);
},
},
subtitleSidebarClose: { addEventListener: () => {} },
subtitleSidebarStatus: { textContent: '' },
@@ -1446,7 +1458,7 @@ test('subtitle sidebar hover pause ignores playback-state IPC failures', async (
await modal.openSubtitleSidebarModal();
await assert.doesNotReject(async () => {
await modalListeners.get('mouseenter')?.[0]?.();
await contentListeners.get('mouseenter')?.[0]?.();
});
assert.equal(state.subtitleSidebarPausedByHover, false);
@@ -1744,6 +1756,7 @@ test('subtitle sidebar embedded layout restores macOS and Windows passthrough ou
const mpvCommands: Array<Array<string | number>> = [];
const ignoreMouseCalls: Array<[boolean, { forward?: boolean } | undefined]> = [];
const modalListeners = new Map<string, Array<() => void>>();
const contentListeners = new Map<string, Array<() => void>>();
const snapshot: SubtitleSidebarSnapshot = {
cues: [{ startTime: 1, endTime: 2, text: 'first' }],
@@ -1823,6 +1836,11 @@ test('subtitle sidebar embedded layout restores macOS and Windows passthrough ou
subtitleSidebarContent: {
classList: createClassList(),
getBoundingClientRect: () => ({ width: 360 }),
addEventListener: (type: string, listener: () => void) => {
const bucket = contentListeners.get(type) ?? [];
bucket.push(listener);
contentListeners.set(type, bucket);
},
},
subtitleSidebarClose: { addEventListener: () => {} },
subtitleSidebarStatus: { textContent: '' },
@@ -1842,15 +1860,15 @@ test('subtitle sidebar embedded layout restores macOS and Windows passthrough ou
await modal.openSubtitleSidebarModal();
assert.deepEqual(ignoreMouseCalls.at(-1), [true, { forward: true }]);
modalListeners.get('mouseenter')?.[0]?.();
contentListeners.get('mouseenter')?.[0]?.();
assert.deepEqual(ignoreMouseCalls.at(-1), [false, undefined]);
modalListeners.get('mouseleave')?.[0]?.();
contentListeners.get('mouseleave')?.[0]?.();
assert.deepEqual(ignoreMouseCalls.at(-1), [true, { forward: true }]);
state.isOverSubtitle = true;
modalListeners.get('mouseenter')?.[0]?.();
modalListeners.get('mouseleave')?.[0]?.();
contentListeners.get('mouseenter')?.[0]?.();
contentListeners.get('mouseleave')?.[0]?.();
assert.deepEqual(ignoreMouseCalls.at(-1), [false, undefined]);
void mpvCommands;
@@ -1860,6 +1878,251 @@ test('subtitle sidebar embedded layout restores macOS and Windows passthrough ou
}
});
test('subtitle sidebar overlay layout restores macOS and Windows passthrough outside sidebar hover', async () => {
const globals = globalThis as typeof globalThis & { window?: unknown; document?: unknown };
const previousWindow = globals.window;
const previousDocument = globals.document;
const mpvCommands: Array<Array<string | number>> = [];
const ignoreMouseCalls: Array<[boolean, { forward?: boolean } | undefined]> = [];
const modalListeners = new Map<string, Array<() => void>>();
const contentListeners = new Map<string, Array<() => void>>();
const snapshot: SubtitleSidebarSnapshot = {
cues: [{ startTime: 1, endTime: 2, text: 'first' }],
currentSubtitle: {
text: 'first',
startTime: 1,
endTime: 2,
},
currentTimeSec: 1.1,
config: {
enabled: true,
autoOpen: false,
layout: 'overlay',
toggleKey: 'Backslash',
pauseVideoOnHover: false,
autoScroll: true,
maxWidth: 360,
opacity: 0.92,
backgroundColor: 'rgba(54, 58, 79, 0.88)',
textColor: '#cad3f5',
fontFamily: '"Iosevka Aile", sans-serif',
fontSize: 17,
timestampColor: '#a5adcb',
activeLineColor: '#f5bde6',
activeLineBackgroundColor: 'rgba(138, 173, 244, 0.22)',
hoverLineBackgroundColor: 'rgba(54, 58, 79, 0.84)',
},
};
Object.defineProperty(globalThis, 'window', {
configurable: true,
value: {
innerWidth: 1200,
electronAPI: {
getSubtitleSidebarSnapshot: async () => snapshot,
sendMpvCommand: (command: Array<string | number>) => {
mpvCommands.push(command);
},
setIgnoreMouseEvents: (ignore: boolean, options?: { forward?: boolean }) => {
ignoreMouseCalls.push([ignore, options]);
},
} as unknown as ElectronAPI,
addEventListener: () => {},
removeEventListener: () => {},
},
});
Object.defineProperty(globalThis, 'document', {
configurable: true,
value: {
createElement: () => createCueRow(),
body: {
classList: createClassList(),
},
documentElement: {
style: {
setProperty: () => {},
},
},
},
});
try {
const state = createRendererState();
const ctx = {
dom: {
overlay: { classList: createClassList() },
subtitleSidebarModal: {
classList: createClassList(['hidden']),
setAttribute: () => {},
style: { setProperty: () => {} },
addEventListener: (type: string, listener: () => void) => {
const bucket = modalListeners.get(type) ?? [];
bucket.push(listener);
modalListeners.set(type, bucket);
},
},
subtitleSidebarContent: {
classList: createClassList(),
getBoundingClientRect: () => ({ width: 360 }),
addEventListener: (type: string, listener: () => void) => {
const bucket = contentListeners.get(type) ?? [];
bucket.push(listener);
contentListeners.set(type, bucket);
},
},
subtitleSidebarClose: { addEventListener: () => {} },
subtitleSidebarStatus: { textContent: '' },
subtitleSidebarList: createListStub(),
},
platform: {
shouldToggleMouseIgnore: true,
},
state,
};
const modal = createSubtitleSidebarModal(ctx as never, {
modalStateReader: { isAnyModalOpen: () => false },
});
modal.wireDomEvents();
assert.equal(modalListeners.get('mouseenter')?.length ?? 0, 0);
assert.equal(modalListeners.get('mouseleave')?.length ?? 0, 0);
assert.equal(contentListeners.get('mouseenter')?.length ?? 0, 1);
assert.equal(contentListeners.get('mouseleave')?.length ?? 0, 1);
await modal.openSubtitleSidebarModal();
assert.deepEqual(ignoreMouseCalls.at(-1), [true, { forward: true }]);
contentListeners.get('mouseenter')?.[0]?.();
assert.deepEqual(ignoreMouseCalls.at(-1), [false, undefined]);
contentListeners.get('mouseleave')?.[0]?.();
assert.deepEqual(ignoreMouseCalls.at(-1), [true, { forward: true }]);
void mpvCommands;
} finally {
Object.defineProperty(globalThis, 'window', { configurable: true, value: previousWindow });
Object.defineProperty(globalThis, 'document', { configurable: true, value: previousDocument });
}
});
test('subtitle sidebar overlay layout only stays interactive while focus remains inside the sidebar panel', async () => {
const globals = globalThis as typeof globalThis & { window?: unknown; document?: unknown };
const previousWindow = globals.window;
const previousDocument = globals.document;
const ignoreMouseCalls: Array<[boolean, { forward?: boolean } | undefined]> = [];
const contentListeners = new Map<string, Array<(event?: FocusEvent) => void>>();
const snapshot: SubtitleSidebarSnapshot = {
cues: [{ startTime: 1, endTime: 2, text: 'first' }],
currentSubtitle: {
text: 'first',
startTime: 1,
endTime: 2,
},
currentTimeSec: 1.1,
config: {
enabled: true,
autoOpen: false,
layout: 'overlay',
toggleKey: 'Backslash',
pauseVideoOnHover: false,
autoScroll: true,
maxWidth: 360,
opacity: 0.92,
backgroundColor: 'rgba(54, 58, 79, 0.88)',
textColor: '#cad3f5',
fontFamily: '"Iosevka Aile", sans-serif',
fontSize: 17,
timestampColor: '#a5adcb',
activeLineColor: '#f5bde6',
activeLineBackgroundColor: 'rgba(138, 173, 244, 0.22)',
hoverLineBackgroundColor: 'rgba(54, 58, 79, 0.84)',
},
};
Object.defineProperty(globalThis, 'window', {
configurable: true,
value: {
innerWidth: 1200,
electronAPI: {
getSubtitleSidebarSnapshot: async () => snapshot,
sendMpvCommand: () => {},
setIgnoreMouseEvents: (ignore: boolean, options?: { forward?: boolean }) => {
ignoreMouseCalls.push([ignore, options]);
},
} as unknown as ElectronAPI,
addEventListener: () => {},
removeEventListener: () => {},
},
});
Object.defineProperty(globalThis, 'document', {
configurable: true,
value: {
createElement: () => createCueRow(),
body: {
classList: createClassList(),
},
documentElement: {
style: {
setProperty: () => {},
},
},
},
});
try {
const state = createRendererState();
const sidebarContent = {
classList: createClassList(),
getBoundingClientRect: () => ({ width: 360 }),
addEventListener: (type: string, listener: (event?: FocusEvent) => void) => {
const bucket = contentListeners.get(type) ?? [];
bucket.push(listener);
contentListeners.set(type, bucket);
},
contains: () => false,
};
const ctx = {
dom: {
overlay: { classList: createClassList() },
subtitleSidebarModal: {
classList: createClassList(['hidden']),
setAttribute: () => {},
style: { setProperty: () => {} },
addEventListener: () => {},
},
subtitleSidebarContent: sidebarContent,
subtitleSidebarClose: { addEventListener: () => {} },
subtitleSidebarStatus: { textContent: '' },
subtitleSidebarList: createListStub(),
},
platform: {
shouldToggleMouseIgnore: true,
},
state,
};
const modal = createSubtitleSidebarModal(ctx as never, {
modalStateReader: { isAnyModalOpen: () => false },
});
modal.wireDomEvents();
await modal.openSubtitleSidebarModal();
assert.deepEqual(ignoreMouseCalls.at(-1), [true, { forward: true }]);
contentListeners.get('focusin')?.[0]?.();
assert.deepEqual(ignoreMouseCalls.at(-1), [false, undefined]);
contentListeners.get('focusout')?.[0]?.({ relatedTarget: null } as FocusEvent);
assert.deepEqual(ignoreMouseCalls.at(-1), [true, { forward: true }]);
} finally {
Object.defineProperty(globalThis, 'window', { configurable: true, value: previousWindow });
Object.defineProperty(globalThis, 'document', { configurable: true, value: previousDocument });
}
});
test('closing embedded subtitle sidebar recomputes passthrough from remaining subtitle hover state', async () => {
const globals = globalThis as typeof globalThis & { window?: unknown; document?: unknown };
const previousWindow = globals.window;

View File

@@ -143,11 +143,23 @@ export function createSubtitleSidebarModal(
let lastAppliedVideoMarginRatio: number | null = null;
let subtitleSidebarHoverRequestId = 0;
let disposeDomEvents: (() => void) | null = null;
let subtitleSidebarHovered = false;
let subtitleSidebarFocusedWithin = false;
function restoreEmbeddedSidebarPassthrough(): void {
syncOverlayMouseIgnoreState(ctx);
}
function syncSidebarInteractionState(): void {
ctx.state.isOverSubtitleSidebar = subtitleSidebarHovered || subtitleSidebarFocusedWithin;
}
function clearSidebarInteractionState(): void {
subtitleSidebarHovered = false;
subtitleSidebarFocusedWithin = false;
syncSidebarInteractionState();
}
function setStatus(message: string): void {
ctx.dom.subtitleSidebarStatus.textContent = message;
}
@@ -379,6 +391,7 @@ export function createSubtitleSidebarModal(
applyConfig(snapshot);
if (!snapshot.config.enabled) {
resumeSubtitleSidebarHoverPause();
clearSidebarInteractionState();
ctx.state.subtitleSidebarCues = [];
ctx.state.subtitleSidebarModalOpen = false;
ctx.dom.subtitleSidebarModal.classList.add('hidden');
@@ -450,7 +463,7 @@ export function createSubtitleSidebarModal(
}
ctx.state.subtitleSidebarModalOpen = true;
ctx.state.isOverSubtitleSidebar = false;
clearSidebarInteractionState();
ctx.dom.subtitleSidebarModal.classList.remove('hidden');
ctx.dom.subtitleSidebarModal.setAttribute('aria-hidden', 'false');
renderCueList();
@@ -478,7 +491,7 @@ export function createSubtitleSidebarModal(
return;
}
resumeSubtitleSidebarHoverPause();
ctx.state.isOverSubtitleSidebar = false;
clearSidebarInteractionState();
ctx.state.subtitleSidebarModalOpen = false;
ctx.dom.subtitleSidebarModal.classList.add('hidden');
ctx.dom.subtitleSidebarModal.setAttribute('aria-hidden', 'true');
@@ -536,8 +549,9 @@ export function createSubtitleSidebarModal(
ctx.dom.subtitleSidebarList.addEventListener('wheel', () => {
ctx.state.subtitleSidebarManualScrollUntilMs = Date.now() + MANUAL_SCROLL_HOLD_MS;
});
ctx.dom.subtitleSidebarModal.addEventListener('mouseenter', async () => {
ctx.state.isOverSubtitleSidebar = true;
ctx.dom.subtitleSidebarContent.addEventListener('mouseenter', async () => {
subtitleSidebarHovered = true;
syncSidebarInteractionState();
restoreEmbeddedSidebarPassthrough();
if (!ctx.state.subtitleSidebarPauseVideoOnHover || ctx.state.subtitleSidebarPausedByHover) {
return;
@@ -557,8 +571,36 @@ export function createSubtitleSidebarModal(
ctx.state.subtitleSidebarPausedByHover = true;
}
});
ctx.dom.subtitleSidebarModal.addEventListener('mouseleave', () => {
ctx.state.isOverSubtitleSidebar = false;
ctx.dom.subtitleSidebarContent.addEventListener('mouseleave', () => {
subtitleSidebarHovered = false;
syncSidebarInteractionState();
if (ctx.state.isOverSubtitleSidebar) {
restoreEmbeddedSidebarPassthrough();
return;
}
resumeSubtitleSidebarHoverPause();
});
ctx.dom.subtitleSidebarContent.addEventListener('focusin', () => {
subtitleSidebarFocusedWithin = true;
syncSidebarInteractionState();
restoreEmbeddedSidebarPassthrough();
});
ctx.dom.subtitleSidebarContent.addEventListener('focusout', (event: FocusEvent) => {
const relatedTarget = event.relatedTarget;
if (
typeof Node !== 'undefined' &&
relatedTarget instanceof Node &&
ctx.dom.subtitleSidebarContent.contains(relatedTarget)
) {
return;
}
subtitleSidebarFocusedWithin = false;
syncSidebarInteractionState();
if (ctx.state.isOverSubtitleSidebar) {
restoreEmbeddedSidebarPassthrough();
return;
}
resumeSubtitleSidebarHoverPause();
});
const resizeHandler = () => {

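The hunk above merges two independent signals, pointer hover and focus-within, into one `isOverSubtitleSidebar` flag, so passthrough is only re-enabled once both have cleared. A minimal sketch of that merge (the `state` shape here is illustrative, not the full `RendererState`):

```typescript
// Passthrough stays disabled while EITHER the pointer hovers the
// sidebar OR keyboard focus remains inside it.
const state = { isOverSubtitleSidebar: false };
let hovered = false;
let focusedWithin = false;

function syncSidebarInteractionState(): void {
  state.isOverSubtitleSidebar = hovered || focusedWithin;
}

const history: boolean[] = [];

hovered = true;              // pointer enters the sidebar content
syncSidebarInteractionState();
history.push(state.isOverSubtitleSidebar);

focusedWithin = true;        // a control inside the sidebar gains focus
hovered = false;             // pointer leaves, but focus keeps it interactive
syncSidebarInteractionState();
history.push(state.isOverSubtitleSidebar);

focusedWithin = false;       // focus leaves too; passthrough can resume
syncSidebarInteractionState();
history.push(state.isOverSubtitleSidebar);
```

This is why the `mouseleave` and `focusout` handlers both check `ctx.state.isOverSubtitleSidebar` after clearing their own flag: the other signal may still be holding the sidebar interactive.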
View File

@@ -2,9 +2,6 @@ import type { RendererContext } from './context';
import type { RendererState } from './state';
function isBlockingOverlayModalOpen(state: RendererState): boolean {
const embeddedSidebarOpen =
state.subtitleSidebarModalOpen && state.subtitleSidebarConfig?.layout === 'embedded';
return Boolean(
state.controllerSelectModalOpen ||
state.controllerDebugModalOpen ||
@@ -13,8 +10,7 @@ function isBlockingOverlayModalOpen(state: RendererState): boolean {
state.kikuModalOpen ||
state.runtimeOptionsModalOpen ||
state.subsyncModalOpen ||
state.sessionHelpModalOpen ||
(state.subtitleSidebarModalOpen && !embeddedSidebarOpen),
state.sessionHelpModalOpen,
);
}

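After this hunk, the subtitle sidebar is dropped from the blocking-modal predicate entirely: an open sidebar (embedded or overlay) no longer blocks the overlay. A reduced sketch of the resulting behavior, using a trimmed-down state shape (the real `RendererState` has more flags):

```typescript
// Trimmed-down version of the predicate after this change: the
// subtitle sidebar flag is intentionally ignored.
interface ModalStateSketch {
  controllerSelectModalOpen: boolean;
  sessionHelpModalOpen: boolean;
  subtitleSidebarModalOpen: boolean;
}

function isBlockingOverlayModalOpen(state: ModalStateSketch): boolean {
  return Boolean(state.controllerSelectModalOpen || state.sessionHelpModalOpen);
}

const sidebarOnly = isBlockingOverlayModalOpen({
  controllerSelectModalOpen: false,
  sessionHelpModalOpen: false,
  subtitleSidebarModalOpen: true, // open, but no longer blocking
});

const helpOpen = isBlockingOverlayModalOpen({
  controllerSelectModalOpen: false,
  sessionHelpModalOpen: true,
  subtitleSidebarModalOpen: true,
});
```

Passthrough decisions for the sidebar now live entirely in the hover/focus tracking shown earlier, instead of being special-cased per layout here.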
View File

@@ -273,6 +273,7 @@ export interface ResolvedConfig {
};
discordPresence: {
enabled: boolean;
presenceStyle: import('./integrations').DiscordPresenceStylePreset;
updateIntervalMs: number;
debounceMs: number;
};

View File

@@ -101,8 +101,11 @@ export interface JellyfinConfig {
transcodeVideoCodec?: string;
}
export type DiscordPresenceStylePreset = 'default' | 'meme' | 'japanese' | 'minimal';
export interface DiscordPresenceConfig {
enabled?: boolean;
presenceStyle?: DiscordPresenceStylePreset;
updateIntervalMs?: number;
debounceMs?: number;
}
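The last two hunks pair an optional `DiscordPresenceConfig` (user input) with a required `ResolvedConfig` field, so resolution has to fill in defaults. A sketch of that mapping, assuming hypothetical default values (the actual defaults are defined elsewhere in the codebase):

```typescript
// Resolve the optional user-facing config into the required shape.
type DiscordPresenceStylePreset = 'default' | 'meme' | 'japanese' | 'minimal';

interface DiscordPresenceConfig {
  enabled?: boolean;
  presenceStyle?: DiscordPresenceStylePreset;
  updateIntervalMs?: number;
  debounceMs?: number;
}

interface ResolvedDiscordPresence {
  enabled: boolean;
  presenceStyle: DiscordPresenceStylePreset;
  updateIntervalMs: number;
  debounceMs: number;
}

function resolveDiscordPresence(config: DiscordPresenceConfig = {}): ResolvedDiscordPresence {
  return {
    enabled: config.enabled ?? false,
    presenceStyle: config.presenceStyle ?? 'default', // hypothetical defaults
    updateIntervalMs: config.updateIntervalMs ?? 15_000,
    debounceMs: config.debounceMs ?? 2_000,
  };
}

const resolved = resolveDiscordPresence({ presenceStyle: 'minimal' });
```

Because `presenceStyle` is typed as a string-literal union, a typo like `'minmal'` fails at compile time rather than silently falling back at runtime.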