Refactor startup, queries, and workflow into focused modules (#36)

* chore(backlog): add mining workflow milestone and tasks

* refactor: split character dictionary runtime modules

* refactor: split shared type entrypoints

* refactor: use bun serve for stats server

* feat: add repo-local subminer workflow plugin

* fix: add stats server node fallback

* refactor: split immersion tracker query modules

* chore: update backlog task records

* refactor: migrate shared type imports

* refactor: compose startup and setup window wiring

* Add backlog tasks and launcher time helper tests

- Track follow-up cleanup work in Backlog.md
- Replace Date.now usage with shared nowMs helper
- Add launcher args/parser and core regression tests

* test: increase launcher test timeout for CI stability

* fix: address CodeRabbit review feedback

* refactor(main): extract remaining inline runtime logic from main

* chore(backlog): update task notes and changelog fragment

* refactor: split main boot phases

* test: stabilize bun coverage reporting

* Switch Plausible endpoint and harden coverage lane parsing

- update docs-site tracking to use the Plausible capture endpoint
- tighten coverage lane argument and LCOV parsing checks
- make script entrypoint use CommonJS main guard

* Restrict docs analytics and build coverage input

- limit Plausible init to docs.subminer.moe
- build Yomitan before src coverage lane

* fix(ci): normalize Windows shortcut paths for cross-platform tests

* Fix verification and immersion-tracker grouping

- isolate verifier artifacts and lease handling
- switch weekly/monthly tracker cutoffs to calendar boundaries
- tighten boot lifecycle and zip writer tests
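The calendar-boundary change above can be sketched as follows (a minimal sketch with invented helper names; the tracker's real code is not shown here). A rolling "now minus 7 days" window becomes "since the most recent week boundary", and the monthly window starts on the 1st of the current month:

```typescript
// Hypothetical helpers illustrating calendar-boundary cutoffs.
function startOfWeekMs(nowMs: number, weekStartsOn = 0): number {
  const d = new Date(nowMs);
  // Local midnight today, then step back to the configured week start.
  const local = new Date(d.getFullYear(), d.getMonth(), d.getDate());
  const daysPastBoundary = (local.getDay() - weekStartsOn + 7) % 7;
  local.setDate(local.getDate() - daysPastBoundary);
  return local.getTime();
}

function startOfMonthMs(nowMs: number): number {
  // The monthly cutoff is simply local midnight on the 1st.
  const d = new Date(nowMs);
  return new Date(d.getFullYear(), d.getMonth(), 1).getTime();
}
```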

* fix: resolve CI type failures in boot and immersion query tests

* fix: remove strict spread usage in Date mocks

* fix: use explicit super args for MockDate constructors

* Factor out mock date helper in tracker tests

- reuse a shared `withMockDate` helper for date-sensitive query tests
- make monthly rollup assertions key off `videoId` instead of row order
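A helper of the kind named above might look like this (the commit names `withMockDate`; the body and signature here are assumptions). The key detail is the try/finally, which restores the real Date even when a test throws:

```typescript
// Assumed shape of a shared mock-date helper for date-sensitive tests.
function withMockDate<T>(fixedNowMs: number, run: () => T): T {
  const RealDate = Date;
  class Mocked extends RealDate {
    constructor(...args: Array<number | string>) {
      if (args.length === 0) {
        super(fixedNowMs); // zero-arg "new Date()" yields the mocked now
      } else {
        super(...(args as [number])); // forward explicit construction args
      }
    }
    static now(): number {
      return fixedNowMs;
    }
  }
  globalThis.Date = Mocked as unknown as DateConstructor;
  try {
    return run();
  } finally {
    globalThis.Date = RealDate; // always restore the real Date
  }
}
```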

* fix: use variadic array type for MockDate constructor args

TS2367: the fixed-length tuple type made the `args.length === 0` check statically unreachable.
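A minimal sketch of the fix, with illustrative identifiers: given a fixed-length tuple parameter, TypeScript narrows `args.length` to that exact length, so comparing it to 0 is reported as an unintentional comparison (TS2367). A variadic array type keeps the branch reachable:

```typescript
// Hypothetical mock; the constant and class names are not the project's.
const FIXED_NOW_MS = 1_700_000_000_000;

class MockDate extends Date {
  // before: constructor(...args: [number]) typed args.length as exactly 1
  // after: a variadic array type allows length 0..n
  constructor(...args: Array<number | string>) {
    if (args.length === 0) {
      super(FIXED_NOW_MS); // mocked "now"
    } else {
      // The cast is type-level only; all runtime arguments are forwarded.
      super(...(args as [number]));
    }
  }
}
```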

* refactor: remove unused createMainBootRuntimes/Handlers aggregate functions

These functions were never called by production code — main.ts imports
the individual composeBoot* re-exports directly.

* refactor: remove boot re-export alias layer

main.ts now imports directly from the runtime/composers and runtime/domains
modules, eliminating the intermediate boot/ indirection.

* refactor: consolidate 3 near-identical setup window factories

Extract shared createSetupWindowHandler with a config parameter.
Public API unchanged.
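The consolidation could look like this (a sketch; the config fields and the concrete factory name below are hypothetical, only `createSetupWindowHandler` comes from the commit). Three near-identical factories differed only in a few values, which become a config parameter:

```typescript
// Assumed config shape for the shared setup window handler.
interface SetupWindowConfig {
  title: string;
  width: number;
  height: number;
}

function createSetupWindowHandler(config: SetupWindowConfig) {
  return () => ({
    // Shared wiring that was previously duplicated three times.
    title: config.title,
    width: config.width,
    height: config.height,
  });
}

// Public API unchanged: each original factory is a configured instance.
const createWelcomeSetupWindow = createSetupWindowHandler({
  title: 'Welcome',
  width: 960,
  height: 640,
});
```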

* refactor: parameterize duplicated getAffected*Ids query helpers

Four structurally identical functions collapsed into two parameterized
helpers while preserving the existing public API.
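The collapse described above can be sketched like so (table and column names are illustrative, not the real schema; the real helpers are not shown). Structurally identical functions that differ only in which table and id column they query become one parameterized helper plus thin wrappers:

```typescript
// Minimal prepared-statement interface, assumed for illustration.
interface MinimalDb {
  prepare(sql: string): { all(...params: unknown[]): unknown[] };
}

function getAffectedIds(
  db: MinimalDb,
  table: string,
  idColumn: string,
  sinceMs: number,
): number[] {
  const rows = db
    .prepare(`SELECT ${idColumn} AS id FROM ${table} WHERE updated_at_ms >= ?`)
    .all(sinceMs) as Array<{ id: number }>;
  return rows.map((row) => row.id);
}

// Thin wrappers preserve the existing public API.
const getAffectedVideoIds = (db: MinimalDb, sinceMs: number) =>
  getAffectedIds(db, 'imm_videos', 'video_id', sinceMs);
```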

* refactor: inline identity composers (stats-startup, overlay-window)

composeStatsStartupRuntime was a no-op that returned its input.
composeOverlayWindowHandlers was a 1-line delegation.
Both removed in favor of direct usage.

* chore: remove unused token/queue file path constants from main.ts

* fix: replace any types in boot services with proper signatures

* refactor: deduplicate ensureDir into shared/fs-utils

5 copies of mkdir-p-if-not-exists consolidated into one shared module
with ensureDir (directory path) and ensureDirForFile (file path) variants.
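A minimal sketch of the two variants (the function names come from the commit; the bodies are an assumed minimal implementation):

```typescript
import { mkdirSync } from 'node:fs';
import { dirname } from 'node:path';

// mkdir -p semantics: creates intermediate directories, no-op if present.
function ensureDir(dirPath: string): void {
  mkdirSync(dirPath, { recursive: true });
}

// File-path variant: ensure the file's containing directory exists.
function ensureDirForFile(filePath: string): void {
  ensureDir(dirname(filePath));
}
```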

* fix: tighten type safety in boot services

- Add AppLifecycleShape and OverlayModalInputStateShape constraints
  so TAppLifecycleApp and TOverlayModalInputState generics are bounded
- Remove unsafe `as { handleModalInputStateChange? }` cast — now
  directly callable via the constraint
- Use `satisfies AppLifecycleShape` for structural validation on the
  appLifecycleApp object literal
- Document Electron App.on incompatibility with simple signatures

* refactor: inline subtitle-prefetch-runtime-composer

The composer was a pure pass-through that destructured an object and
reassembled it with the same fields. Inlined at the call site.

* chore: consolidate duplicate import paths in main.ts

* test: extract mpv composer test fixture factory to reduce duplication

* test: add behavioral assertions to composer tests

Upgrade 8 composer test files from shape-only typeof checks to behavioral
assertions that invoke returned handlers and verify injected dependencies are
actually called, following the mpv-runtime-composer pattern.
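The upgrade can be illustrated with a toy composer (the composer and its dependency below are invented; the real tests follow the mpv-runtime-composer pattern):

```typescript
// Hypothetical composer wiring an injected dependency into a handler.
function composeGreeterHandlers(deps: { log: (msg: string) => void }) {
  return { greet: (name: string) => deps.log(`hello ${name}`) };
}

const calls: string[] = [];
const handlers = composeGreeterHandlers({ log: (msg) => calls.push(msg) });

// Shape-only check: proves a function exists, not that it works.
if (typeof handlers.greet !== 'function') throw new Error('missing handler');

// Behavioral check: invoke the handler and verify the injected dependency
// was actually called with the expected argument.
handlers.greet('world');
if (calls[0] !== 'hello world') throw new Error('injected dependency not invoked');
```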

* refactor: normalize import extensions in query modules

* refactor: consolidate toDbMs into query-shared.ts

* refactor: remove Node.js fallback from stats-server, use Bun only

* Fix monthly rollup test expectations

- Preserve multi-arg Date construction in mock helper
- Align rollup assertions with the correct videoId

* fix: address PR 36 CodeRabbit follow-ups

* fix: harden coverage lane cleanup

* fix(stats): fall back to node server when Bun.serve unavailable

* fix(ci): restore coverage lane compatibility

* chore(backlog): close TASK-242

* fix: address latest CodeRabbit review round

* fix: guard disabled immersion retention windows

* fix: migrate discord rpc wrapper

* fix(ci): add changelog fragment for PR 36

* fix: stabilize macOS visible overlay toggle

* fix: pin installed mpv plugin to current binary

* fix: strip inline subtitle markup from sidebar cues

* fix(renderer): restore subtitle sidebar mpv passthrough

* feat(discord): add configurable presence style presets

Replace the hardcoded "Mining and crafting (Anki cards)" meme message
with a preset system. New `discordPresence.presenceStyle` option
supports four presets: "default" (clean bilingual), "meme" (the OG
Minecraft joke), "japanese" (fully JP), and "minimal". The default
preset shows "Sentence Mining" with 日本語学習中 as the small image
tooltip. Existing users can set presenceStyle to "meme" to keep the
old behavior.
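The preset lookup might be shaped like this (a sketch; the field names and the non-meme strings other than those quoted above are assumptions, while the option key `discordPresence.presenceStyle` comes from the commit):

```typescript
type PresenceStyle = 'default' | 'meme' | 'japanese' | 'minimal';

interface PresencePreset {
  details: string;
  smallImageText?: string;
}

// Preset texts paraphrase the commit description; only "default" and
// "meme" strings are quoted from it.
const PRESENCE_PRESETS: Record<PresenceStyle, PresencePreset> = {
  default: { details: 'Sentence Mining', smallImageText: '日本語学習中' },
  meme: { details: 'Mining and crafting (Anki cards)' },
  japanese: { details: '日本語学習中' },
  minimal: { details: 'Watching' },
};

function resolvePresencePreset(style: PresenceStyle = 'default'): PresencePreset {
  return PRESENCE_PRESETS[style];
}
```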

* fix: finalize v0.10.0 release prep

* docs: add subtitle sidebar guide and release note

* chore(backlog): mark docs task done

* fix: lazily resolve youtube playback socket path

* chore(release): build v0.10.0 changelog

* Revert "chore(release): build v0.10.0 changelog"

This reverts commit 9741c0f020.
Committed 2026-03-29 16:16:29 -07:00 (committed by GitHub)
parent 2d4f2d1139
commit 35adf8299c
297 changed files with 17,713 additions and 9,147 deletions


@@ -0,0 +1,660 @@
import type { DatabaseSync } from './sqlite';
import type { ImmersionSessionRollupRow } from './types';
import { ACTIVE_SESSION_METRICS_CTE, makePlaceholders } from './query-shared';
import { getDailyRollups, getMonthlyRollups } from './query-sessions';
type TrendRange = '7d' | '30d' | '90d' | 'all';
type TrendGroupBy = 'day' | 'month';
interface TrendChartPoint {
label: string;
value: number;
}
interface TrendPerAnimePoint {
epochDay: number;
animeTitle: string;
value: number;
}
interface TrendSessionMetricRow {
startedAtMs: number;
videoId: number | null;
canonicalTitle: string | null;
animeTitle: string | null;
activeWatchedMs: number;
tokensSeen: number;
cardsMined: number;
yomitanLookupCount: number;
}
export interface TrendsDashboardQueryResult {
activity: {
watchTime: TrendChartPoint[];
cards: TrendChartPoint[];
words: TrendChartPoint[];
sessions: TrendChartPoint[];
};
progress: {
watchTime: TrendChartPoint[];
sessions: TrendChartPoint[];
words: TrendChartPoint[];
newWords: TrendChartPoint[];
cards: TrendChartPoint[];
episodes: TrendChartPoint[];
lookups: TrendChartPoint[];
};
ratios: {
lookupsPerHundred: TrendChartPoint[];
};
animePerDay: {
episodes: TrendPerAnimePoint[];
watchTime: TrendPerAnimePoint[];
cards: TrendPerAnimePoint[];
words: TrendPerAnimePoint[];
lookups: TrendPerAnimePoint[];
lookupsPerHundred: TrendPerAnimePoint[];
};
animeCumulative: {
watchTime: TrendPerAnimePoint[];
episodes: TrendPerAnimePoint[];
cards: TrendPerAnimePoint[];
words: TrendPerAnimePoint[];
};
patterns: {
watchTimeByDayOfWeek: TrendChartPoint[];
watchTimeByHour: TrendChartPoint[];
};
}
const TREND_DAY_LIMITS: Record<Exclude<TrendRange, 'all'>, number> = {
'7d': 7,
'30d': 30,
'90d': 90,
};
const DAY_NAMES = ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat'];
function getTrendDayLimit(range: TrendRange): number {
return range === 'all' ? 365 : TREND_DAY_LIMITS[range];
}
function getTrendMonthlyLimit(range: TrendRange): number {
if (range === 'all') {
return 120;
}
const now = new Date();
const cutoff = new Date(
now.getFullYear(),
now.getMonth(),
now.getDate() - (TREND_DAY_LIMITS[range] - 1),
);
return Math.max(1, (now.getFullYear() - cutoff.getFullYear()) * 12 + now.getMonth() - cutoff.getMonth() + 1);
}
function getTrendCutoffMs(range: TrendRange): number | null {
if (range === 'all') {
return null;
}
const dayLimit = getTrendDayLimit(range);
const now = new Date();
const localMidnight = new Date(now.getFullYear(), now.getMonth(), now.getDate()).getTime();
return localMidnight - (dayLimit - 1) * 86_400_000;
}
function makeTrendLabel(value: number): string {
if (value > 100_000) {
// Month keys are YYYYMM; build a local-time date so the label does not
// slip to the previous month in negative-UTC-offset timezones.
const year = Math.floor(value / 100);
const month = value % 100;
return new Date(year, month - 1, 1).toLocaleDateString(undefined, {
month: 'short',
year: '2-digit',
});
}
// Epoch-day values: convert to a local-time date (as dayLabel does) rather
// than formatting raw UTC midnight, which shows the previous day west of UTC.
return getLocalDateForEpochDay(value).toLocaleDateString(undefined, {
month: 'short',
day: 'numeric',
});
}
function getLocalEpochDay(timestampMs: number): number {
const date = new Date(timestampMs);
return Math.floor((timestampMs - date.getTimezoneOffset() * 60_000) / 86_400_000);
}
function getLocalDateForEpochDay(epochDay: number): Date {
const utcDate = new Date(epochDay * 86_400_000);
return new Date(utcDate.getTime() + utcDate.getTimezoneOffset() * 60_000);
}
function getLocalMonthKey(timestampMs: number): number {
const date = new Date(timestampMs);
return date.getFullYear() * 100 + date.getMonth() + 1;
}
function getTrendSessionWordCount(session: Pick<TrendSessionMetricRow, 'tokensSeen'>): number {
return session.tokensSeen;
}
function resolveTrendAnimeTitle(value: {
animeTitle: string | null;
canonicalTitle: string | null;
}): string {
return value.animeTitle ?? value.canonicalTitle ?? 'Unknown';
}
function accumulatePoints(points: TrendChartPoint[]): TrendChartPoint[] {
let sum = 0;
return points.map((point) => {
sum += point.value;
return {
label: point.label,
value: sum,
};
});
}
function buildAggregatedTrendRows(rollups: ImmersionSessionRollupRow[]) {
const byKey = new Map<
number,
{ activeMin: number; cards: number; words: number; sessions: number }
>();
for (const rollup of rollups) {
const existing = byKey.get(rollup.rollupDayOrMonth) ?? {
activeMin: 0,
cards: 0,
words: 0,
sessions: 0,
};
existing.activeMin += rollup.totalActiveMin;
existing.cards += rollup.totalCards;
existing.words += rollup.totalTokensSeen;
existing.sessions += rollup.totalSessions;
byKey.set(rollup.rollupDayOrMonth, existing);
}
return Array.from(byKey.entries())
.sort(([left], [right]) => left - right)
.map(([key, value]) => ({
label: makeTrendLabel(key),
activeMin: Math.round(value.activeMin),
cards: value.cards,
words: value.words,
sessions: value.sessions,
}));
}
function buildWatchTimeByDayOfWeek(sessions: TrendSessionMetricRow[]): TrendChartPoint[] {
const totals = new Array(7).fill(0);
for (const session of sessions) {
totals[new Date(session.startedAtMs).getDay()] += session.activeWatchedMs;
}
return DAY_NAMES.map((name, index) => ({
label: name,
value: Math.round(totals[index] / 60_000),
}));
}
function buildWatchTimeByHour(sessions: TrendSessionMetricRow[]): TrendChartPoint[] {
const totals = new Array(24).fill(0);
for (const session of sessions) {
totals[new Date(session.startedAtMs).getHours()] += session.activeWatchedMs;
}
return totals.map((ms, index) => ({
label: `${String(index).padStart(2, '0')}:00`,
value: Math.round(ms / 60_000),
}));
}
function dayLabel(epochDay: number): string {
return getLocalDateForEpochDay(epochDay).toLocaleDateString(undefined, {
month: 'short',
day: 'numeric',
});
}
function buildSessionSeriesByDay(
sessions: TrendSessionMetricRow[],
getValue: (session: TrendSessionMetricRow) => number,
): TrendChartPoint[] {
const byDay = new Map<number, number>();
for (const session of sessions) {
const epochDay = getLocalEpochDay(session.startedAtMs);
byDay.set(epochDay, (byDay.get(epochDay) ?? 0) + getValue(session));
}
return Array.from(byDay.entries())
.sort(([left], [right]) => left - right)
.map(([epochDay, value]) => ({ label: dayLabel(epochDay), value }));
}
function buildSessionSeriesByMonth(
sessions: TrendSessionMetricRow[],
getValue: (session: TrendSessionMetricRow) => number,
): TrendChartPoint[] {
const byMonth = new Map<number, number>();
for (const session of sessions) {
const monthKey = getLocalMonthKey(session.startedAtMs);
byMonth.set(monthKey, (byMonth.get(monthKey) ?? 0) + getValue(session));
}
return Array.from(byMonth.entries())
.sort(([left], [right]) => left - right)
.map(([monthKey, value]) => ({ label: makeTrendLabel(monthKey), value }));
}
function buildLookupsPerHundredWords(
sessions: TrendSessionMetricRow[],
groupBy: TrendGroupBy,
): TrendChartPoint[] {
const lookupsByBucket = new Map<number, number>();
const wordsByBucket = new Map<number, number>();
for (const session of sessions) {
const bucketKey =
groupBy === 'month' ? getLocalMonthKey(session.startedAtMs) : getLocalEpochDay(session.startedAtMs);
lookupsByBucket.set(
bucketKey,
(lookupsByBucket.get(bucketKey) ?? 0) + session.yomitanLookupCount,
);
wordsByBucket.set(
bucketKey,
(wordsByBucket.get(bucketKey) ?? 0) + getTrendSessionWordCount(session),
);
}
return Array.from(lookupsByBucket.entries())
.sort(([left], [right]) => left - right)
.map(([bucketKey, lookups]) => {
const words = wordsByBucket.get(bucketKey) ?? 0;
return {
label: groupBy === 'month' ? makeTrendLabel(bucketKey) : dayLabel(bucketKey),
value: words > 0 ? +((lookups / words) * 100).toFixed(1) : 0,
};
});
}
function buildPerAnimeFromSessions(
sessions: TrendSessionMetricRow[],
getValue: (session: TrendSessionMetricRow) => number,
): TrendPerAnimePoint[] {
const byAnime = new Map<string, Map<number, number>>();
for (const session of sessions) {
const animeTitle = resolveTrendAnimeTitle(session);
const epochDay = getLocalEpochDay(session.startedAtMs);
const dayMap = byAnime.get(animeTitle) ?? new Map();
dayMap.set(epochDay, (dayMap.get(epochDay) ?? 0) + getValue(session));
byAnime.set(animeTitle, dayMap);
}
const result: TrendPerAnimePoint[] = [];
for (const [animeTitle, dayMap] of byAnime) {
for (const [epochDay, value] of dayMap) {
result.push({ epochDay, animeTitle, value });
}
}
return result;
}
function buildLookupsPerHundredPerAnime(sessions: TrendSessionMetricRow[]): TrendPerAnimePoint[] {
const lookups = new Map<string, Map<number, number>>();
const words = new Map<string, Map<number, number>>();
for (const session of sessions) {
const animeTitle = resolveTrendAnimeTitle(session);
const epochDay = getLocalEpochDay(session.startedAtMs);
const lookupMap = lookups.get(animeTitle) ?? new Map();
lookupMap.set(epochDay, (lookupMap.get(epochDay) ?? 0) + session.yomitanLookupCount);
lookups.set(animeTitle, lookupMap);
const wordMap = words.get(animeTitle) ?? new Map();
wordMap.set(epochDay, (wordMap.get(epochDay) ?? 0) + getTrendSessionWordCount(session));
words.set(animeTitle, wordMap);
}
const result: TrendPerAnimePoint[] = [];
for (const [animeTitle, dayMap] of lookups) {
const wordMap = words.get(animeTitle) ?? new Map();
for (const [epochDay, lookupCount] of dayMap) {
const wordCount = wordMap.get(epochDay) ?? 0;
result.push({
epochDay,
animeTitle,
value: wordCount > 0 ? +((lookupCount / wordCount) * 100).toFixed(1) : 0,
});
}
}
return result;
}
function buildCumulativePerAnime(points: TrendPerAnimePoint[]): TrendPerAnimePoint[] {
const byAnime = new Map<string, Map<number, number>>();
const allDays = new Set<number>();
for (const point of points) {
const dayMap = byAnime.get(point.animeTitle) ?? new Map();
dayMap.set(point.epochDay, (dayMap.get(point.epochDay) ?? 0) + point.value);
byAnime.set(point.animeTitle, dayMap);
allDays.add(point.epochDay);
}
const sortedDays = [...allDays].sort((left, right) => left - right);
if (sortedDays.length === 0) {
return [];
}
const minDay = sortedDays[0]!;
const maxDay = sortedDays[sortedDays.length - 1]!;
const result: TrendPerAnimePoint[] = [];
for (const [animeTitle, dayMap] of byAnime) {
const firstDay = Math.min(...dayMap.keys());
let cumulative = 0;
for (let epochDay = minDay; epochDay <= maxDay; epochDay += 1) {
if (epochDay < firstDay) {
continue;
}
cumulative += dayMap.get(epochDay) ?? 0;
result.push({ epochDay, animeTitle, value: cumulative });
}
}
return result;
}
function getVideoAnimeTitleMap(
db: DatabaseSync,
videoIds: Array<number | null>,
): Map<number, string> {
const uniqueIds = [
...new Set(videoIds.filter((value): value is number => typeof value === 'number')),
];
if (uniqueIds.length === 0) {
return new Map();
}
const rows = db
.prepare(
`
SELECT
v.video_id AS videoId,
COALESCE(a.canonical_title, v.canonical_title, 'Unknown') AS animeTitle
FROM imm_videos v
LEFT JOIN imm_anime a ON a.anime_id = v.anime_id
WHERE v.video_id IN (${makePlaceholders(uniqueIds)})
`,
)
.all(...uniqueIds) as Array<{ videoId: number; animeTitle: string }>;
return new Map(rows.map((row) => [row.videoId, row.animeTitle]));
}
function resolveVideoAnimeTitle(
videoId: number | null,
titlesByVideoId: Map<number, string>,
): string {
if (videoId === null) {
return 'Unknown';
}
return titlesByVideoId.get(videoId) ?? 'Unknown';
}
function buildPerAnimeFromDailyRollups(
rollups: ImmersionSessionRollupRow[],
titlesByVideoId: Map<number, string>,
getValue: (rollup: ImmersionSessionRollupRow) => number,
): TrendPerAnimePoint[] {
const byAnime = new Map<string, Map<number, number>>();
for (const rollup of rollups) {
const animeTitle = resolveVideoAnimeTitle(rollup.videoId, titlesByVideoId);
const dayMap = byAnime.get(animeTitle) ?? new Map();
dayMap.set(
rollup.rollupDayOrMonth,
(dayMap.get(rollup.rollupDayOrMonth) ?? 0) + getValue(rollup),
);
byAnime.set(animeTitle, dayMap);
}
const result: TrendPerAnimePoint[] = [];
for (const [animeTitle, dayMap] of byAnime) {
for (const [epochDay, value] of dayMap) {
result.push({ epochDay, animeTitle, value });
}
}
return result;
}
function buildEpisodesPerAnimeFromDailyRollups(
rollups: ImmersionSessionRollupRow[],
titlesByVideoId: Map<number, string>,
): TrendPerAnimePoint[] {
const byAnime = new Map<string, Map<number, Set<number>>>();
for (const rollup of rollups) {
if (rollup.videoId === null) {
continue;
}
const animeTitle = resolveVideoAnimeTitle(rollup.videoId, titlesByVideoId);
const dayMap = byAnime.get(animeTitle) ?? new Map();
const videoIds = dayMap.get(rollup.rollupDayOrMonth) ?? new Set<number>();
videoIds.add(rollup.videoId);
dayMap.set(rollup.rollupDayOrMonth, videoIds);
byAnime.set(animeTitle, dayMap);
}
const result: TrendPerAnimePoint[] = [];
for (const [animeTitle, dayMap] of byAnime) {
for (const [epochDay, videoIds] of dayMap) {
result.push({ epochDay, animeTitle, value: videoIds.size });
}
}
return result;
}
function buildEpisodesPerDayFromDailyRollups(
rollups: ImmersionSessionRollupRow[],
): TrendChartPoint[] {
const byDay = new Map<number, Set<number>>();
for (const rollup of rollups) {
if (rollup.videoId === null) {
continue;
}
const videoIds = byDay.get(rollup.rollupDayOrMonth) ?? new Set<number>();
videoIds.add(rollup.videoId);
byDay.set(rollup.rollupDayOrMonth, videoIds);
}
return Array.from(byDay.entries())
.sort(([left], [right]) => left - right)
.map(([epochDay, videoIds]) => ({
label: dayLabel(epochDay),
value: videoIds.size,
}));
}
function buildEpisodesPerMonthFromRollups(rollups: ImmersionSessionRollupRow[]): TrendChartPoint[] {
const byMonth = new Map<number, Set<number>>();
for (const rollup of rollups) {
if (rollup.videoId === null) {
continue;
}
const videoIds = byMonth.get(rollup.rollupDayOrMonth) ?? new Set<number>();
videoIds.add(rollup.videoId);
byMonth.set(rollup.rollupDayOrMonth, videoIds);
}
return Array.from(byMonth.entries())
.sort(([left], [right]) => left - right)
.map(([monthKey, videoIds]) => ({
label: makeTrendLabel(monthKey),
value: videoIds.size,
}));
}
function getTrendSessionMetrics(
db: DatabaseSync,
cutoffMs: number | null,
): TrendSessionMetricRow[] {
const whereClause = cutoffMs === null ? '' : 'WHERE s.started_at_ms >= ?';
const prepared = db.prepare(`
${ACTIVE_SESSION_METRICS_CTE}
SELECT
s.started_at_ms AS startedAtMs,
s.video_id AS videoId,
v.canonical_title AS canonicalTitle,
a.canonical_title AS animeTitle,
COALESCE(asm.activeWatchedMs, s.active_watched_ms, 0) AS activeWatchedMs,
COALESCE(asm.tokensSeen, s.tokens_seen, 0) AS tokensSeen,
COALESCE(asm.cardsMined, s.cards_mined, 0) AS cardsMined,
COALESCE(asm.yomitanLookupCount, s.yomitan_lookup_count, 0) AS yomitanLookupCount
FROM imm_sessions s
LEFT JOIN active_session_metrics asm ON asm.sessionId = s.session_id
LEFT JOIN imm_videos v ON v.video_id = s.video_id
LEFT JOIN imm_anime a ON a.anime_id = v.anime_id
${whereClause}
ORDER BY s.started_at_ms ASC
`);
return (cutoffMs === null ? prepared.all() : prepared.all(cutoffMs)) as TrendSessionMetricRow[];
}
function buildNewWordsPerDay(db: DatabaseSync, cutoffMs: number | null): TrendChartPoint[] {
const whereClause = cutoffMs === null ? '' : 'AND first_seen >= ?';
const prepared = db.prepare(`
SELECT
CAST(julianday(first_seen, 'unixepoch', 'localtime') - 2440587.5 AS INTEGER) AS epochDay,
COUNT(*) AS wordCount
FROM imm_words
WHERE first_seen IS NOT NULL
${whereClause}
GROUP BY epochDay
ORDER BY epochDay ASC
`);
const rows = (
cutoffMs === null ? prepared.all() : prepared.all(Math.floor(cutoffMs / 1000))
) as Array<{
epochDay: number;
wordCount: number;
}>;
return rows.map((row) => ({
label: dayLabel(row.epochDay),
value: row.wordCount,
}));
}
function buildNewWordsPerMonth(db: DatabaseSync, cutoffMs: number | null): TrendChartPoint[] {
const whereClause = cutoffMs === null ? '' : 'AND first_seen >= ?';
const prepared = db.prepare(`
SELECT
CAST(strftime('%Y%m', first_seen, 'unixepoch', 'localtime') AS INTEGER) AS monthKey,
COUNT(*) AS wordCount
FROM imm_words
WHERE first_seen IS NOT NULL
${whereClause}
GROUP BY monthKey
ORDER BY monthKey ASC
`);
const rows = (
cutoffMs === null ? prepared.all() : prepared.all(Math.floor(cutoffMs / 1000))
) as Array<{
monthKey: number;
wordCount: number;
}>;
return rows.map((row) => ({
label: makeTrendLabel(row.monthKey),
value: row.wordCount,
}));
}
export function getTrendsDashboard(
db: DatabaseSync,
range: TrendRange = '30d',
groupBy: TrendGroupBy = 'day',
): TrendsDashboardQueryResult {
const dayLimit = getTrendDayLimit(range);
const monthlyLimit = getTrendMonthlyLimit(range);
const cutoffMs = getTrendCutoffMs(range);
const useMonthlyBuckets = groupBy === 'month';
const dailyRollups = getDailyRollups(db, dayLimit);
const monthlyRollups = getMonthlyRollups(db, monthlyLimit);
const chartRollups = useMonthlyBuckets ? monthlyRollups : dailyRollups;
const sessions = getTrendSessionMetrics(db, cutoffMs);
const titlesByVideoId = getVideoAnimeTitleMap(
db,
dailyRollups.map((rollup) => rollup.videoId),
);
const aggregatedRows = buildAggregatedTrendRows(chartRollups);
const activity = {
watchTime: aggregatedRows.map((row) => ({ label: row.label, value: row.activeMin })),
cards: aggregatedRows.map((row) => ({ label: row.label, value: row.cards })),
words: aggregatedRows.map((row) => ({ label: row.label, value: row.words })),
sessions: aggregatedRows.map((row) => ({ label: row.label, value: row.sessions })),
};
const animePerDay = {
episodes: buildEpisodesPerAnimeFromDailyRollups(dailyRollups, titlesByVideoId),
watchTime: buildPerAnimeFromDailyRollups(dailyRollups, titlesByVideoId, (rollup) =>
rollup.totalActiveMin,
),
cards: buildPerAnimeFromDailyRollups(
dailyRollups,
titlesByVideoId,
(rollup) => rollup.totalCards,
),
words: buildPerAnimeFromDailyRollups(
dailyRollups,
titlesByVideoId,
(rollup) => rollup.totalTokensSeen,
),
lookups: buildPerAnimeFromSessions(sessions, (session) => session.yomitanLookupCount),
lookupsPerHundred: buildLookupsPerHundredPerAnime(sessions),
};
return {
activity,
progress: {
watchTime: accumulatePoints(activity.watchTime),
sessions: accumulatePoints(activity.sessions),
words: accumulatePoints(activity.words),
newWords: accumulatePoints(
useMonthlyBuckets ? buildNewWordsPerMonth(db, cutoffMs) : buildNewWordsPerDay(db, cutoffMs),
),
cards: accumulatePoints(activity.cards),
episodes: accumulatePoints(
useMonthlyBuckets
? buildEpisodesPerMonthFromRollups(monthlyRollups)
: buildEpisodesPerDayFromDailyRollups(dailyRollups),
),
lookups: accumulatePoints(
useMonthlyBuckets
? buildSessionSeriesByMonth(sessions, (session) => session.yomitanLookupCount)
: buildSessionSeriesByDay(sessions, (session) => session.yomitanLookupCount),
),
},
ratios: {
lookupsPerHundred: buildLookupsPerHundredWords(sessions, groupBy),
},
animePerDay,
animeCumulative: {
watchTime: buildCumulativePerAnime(animePerDay.watchTime),
episodes: buildCumulativePerAnime(animePerDay.episodes),
cards: buildCumulativePerAnime(animePerDay.cards),
words: buildCumulativePerAnime(animePerDay.words),
},
patterns: {
watchTimeByDayOfWeek: buildWatchTimeByDayOfWeek(sessions),
watchTimeByHour: buildWatchTimeByHour(sessions),
},
};
}