Mirror of https://github.com/ksyasuda/SubMiner.git, synced 2026-03-30 06:12:06 -07:00
Refactor startup, queries, and workflow into focused modules (#36)
* chore(backlog): add mining workflow milestone and tasks
* refactor: split character dictionary runtime modules
* refactor: split shared type entrypoints
* refactor: use bun serve for stats server
* feat: add repo-local subminer workflow plugin
* fix: add stats server node fallback
* refactor: split immersion tracker query modules
* chore: update backlog task records
* refactor: migrate shared type imports
* refactor: compose startup and setup window wiring
* Add backlog tasks and launcher time helper tests
- Track follow-up cleanup work in Backlog.md
- Replace Date.now usage with shared nowMs helper
- Add launcher args/parser and core regression tests
* test: increase launcher test timeout for CI stability
* fix: address CodeRabbit review feedback
* refactor(main): extract remaining inline runtime logic from main
* chore(backlog): update task notes and changelog fragment
* refactor: split main boot phases
* test: stabilize bun coverage reporting
* Switch Plausible endpoint and harden coverage lane parsing
- update docs-site tracking to use the Plausible capture endpoint
- tighten coverage lane argument and LCOV parsing checks
- make script entrypoint use CommonJS main guard
* Restrict docs analytics and build coverage input
- limit Plausible init to docs.subminer.moe
- build Yomitan before src coverage lane
* fix(ci): normalize Windows shortcut paths for cross-platform tests
* Fix verification and immersion-tracker grouping
- isolate verifier artifacts and lease handling
- switch weekly/monthly tracker cutoffs to calendar boundaries
- tighten boot lifecycle and zip writer tests
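One way to read "calendar boundaries" is cutoffs aligned to the start of the current week/month rather than rolling 7-day/30-day windows. A minimal sketch, assuming Monday-start weeks in local time; the helper names are hypothetical, not SubMiner's actual tracker API:

```typescript
// Hypothetical calendar-aligned cutoffs (illustrative names, not SubMiner's API).
function startOfWeekMs(now: Date): number {
  const d = new Date(now.getFullYear(), now.getMonth(), now.getDate());
  const dow = (d.getDay() + 6) % 7; // Monday = 0, Sunday = 6
  d.setDate(d.getDate() - dow); // walk back to Monday
  return d.getTime();
}

function startOfMonthMs(now: Date): number {
  // first day of the current month, midnight local time
  return new Date(now.getFullYear(), now.getMonth(), 1).getTime();
}
```

Rows with a timestamp at or after the cutoff then fall into the "this week" / "this month" bucket regardless of how many days the window currently spans.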
* fix: resolve CI type failures in boot and immersion query tests
* fix: remove strict spread usage in Date mocks
* fix: use explicit super args for MockDate constructors
* Factor out mock date helper in tracker tests
- reuse a shared `withMockDate` helper for date-sensitive query tests
- make monthly rollup assertions key off `videoId` instead of row order
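A helper like the `withMockDate` named above can be sketched as follows; the real signature in the tracker tests may differ, and the argument handling here is an assumption:

```typescript
// Illustrative sketch of a shared withMockDate helper (name from the commit
// message; signature and arg handling are assumptions).
function withMockDate<T>(fixedMs: number, run: () => T): T {
  const RealDate = Date;
  class MockDate extends RealDate {
    // Variadic args keep the zero-arg branch reachable for TypeScript.
    constructor(...args: Array<number | string>) {
      if (args.length === 0) {
        super(fixedMs); // bare `new Date()` yields the frozen instant
      } else if (args.length === 1) {
        super(args[0] as number);
      } else {
        // Explicit super args preserve multi-arg Date construction.
        super(args[0] as number, args[1] as number, (args[2] as number) ?? 1);
      }
    }
    static now(): number {
      return fixedMs;
    }
  }
  globalThis.Date = MockDate as unknown as DateConstructor;
  try {
    return run();
  } finally {
    globalThis.Date = RealDate; // always restore the real clock
  }
}
```

The try/finally restore is what makes the helper safe to reuse across date-sensitive query tests.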
* fix: use variadic array type for MockDate constructor args
TS2367: fixed-length tuple made args.length === 0 unreachable.
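A minimal repro of that diagnostic (names here are illustrative): with a fixed-length rest tuple, `args.length` is typed as the literal `2`, so comparing it against `0` trips TS2367; a variadic array type widens `length` back to `number`:

```typescript
// Fixed-length tuple: args.length has literal type 2.
function fixedTuple(...args: [number, number]): number {
  // args.length === 0  // TS2367: types '2' and '0' have no overlap
  return args.length;
}

// Variadic array type: args.length is plain `number`, so the guard is legal.
function variadic(...args: number[]): string {
  return args.length === 0 ? 'no args' : `got ${args.length}`;
}
```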
* refactor: remove unused createMainBootRuntimes/Handlers aggregate functions
These functions were never called by production code — main.ts imports
the individual composeBoot* re-exports directly.
* refactor: remove boot re-export alias layer
main.ts now imports directly from the runtime/composers and runtime/domains
modules, eliminating the intermediate boot/ indirection.
* refactor: consolidate 3 near-identical setup window factories
Extract shared createSetupWindowHandler with a config parameter.
Public API unchanged.
* refactor: parameterize duplicated getAffected*Ids query helpers
Four structurally identical functions collapsed into two parameterized
helpers while preserving the existing public API.
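The shape of that collapse can be sketched as a factory parameterized by the id column, with thin named wrappers preserving the old API. Everything below is hypothetical; it is not SubMiner's real query layer:

```typescript
// Hypothetical sketch: structurally identical getAffected*Ids helpers
// collapse into one factory parameterized by the id column.
type QueryRow = Record<string, unknown>;

function makeGetAffectedIds(idColumn: string) {
  return (rows: QueryRow[]): number[] =>
    rows
      .map((row) => row[idColumn])
      .filter((value): value is number => typeof value === 'number');
}

// Public API unchanged: callers still import the named helpers.
const getAffectedVideoIds = makeGetAffectedIds('videoId');
const getAffectedCardIds = makeGetAffectedIds('cardId');
```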
* refactor: inline identity composers (stats-startup, overlay-window)
composeStatsStartupRuntime was a no-op that returned its input.
composeOverlayWindowHandlers was a 1-line delegation.
Both removed in favor of direct usage.
* chore: remove unused token/queue file path constants from main.ts
* fix: replace any types in boot services with proper signatures
* refactor: deduplicate ensureDir into shared/fs-utils
5 copies of mkdir-p-if-not-exists consolidated into one shared module
with ensureDir (directory path) and ensureDirForFile (file path) variants.
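The two variants are small enough to sketch; the exact signatures in shared/fs-utils are assumptions here:

```typescript
import { existsSync, mkdirSync } from 'node:fs';
import { dirname, join } from 'node:path';
import { tmpdir } from 'node:os';

// Sketch of the consolidated helpers (signatures assumed).
function ensureDir(dirPath: string): void {
  // `recursive: true` makes this a no-op when the directory already exists.
  mkdirSync(dirPath, { recursive: true });
}

function ensureDirForFile(filePath: string): void {
  ensureDir(dirname(filePath));
}

// Usage: prepare the parent directory before writing a nested file.
const target = join(tmpdir(), 'subminer-demo', 'queue', 'state.json');
ensureDirForFile(target);
```

Distinguishing the directory-path and file-path variants avoids the classic bug of `mkdir`-ing the file path itself.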
* fix: tighten type safety in boot services
- Add AppLifecycleShape and OverlayModalInputStateShape constraints
so TAppLifecycleApp and TOverlayModalInputState generics are bounded
- Remove unsafe `as { handleModalInputStateChange? }` cast — now
directly callable via the constraint
- Use `satisfies AppLifecycleShape` for structural validation on the
appLifecycleApp object literal
- Document Electron App.on incompatibility with simple signatures
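The cast-removal pattern can be sketched as follows; the shape names mirror the commit message, but the fields are assumed:

```typescript
// Illustrative shape; fields beyond the commit message are assumed.
type AppLifecycleShape = {
  quit: () => void;
  handleModalInputStateChange?: (visible: boolean) => void;
};

let lastVisible = false;

// The bounded generic makes the optional handler directly callable,
// replacing an unsafe `as { handleModalInputStateChange? }` cast.
function wireModalInput<TApp extends AppLifecycleShape>(app: TApp): void {
  app.handleModalInputStateChange?.(true);
}

// `satisfies` checks the literal against the shape without widening its type.
const appLifecycleApp = {
  quit: () => {},
  handleModalInputStateChange: (visible: boolean) => {
    lastVisible = visible;
  },
} satisfies AppLifecycleShape;

wireModalInput(appLifecycleApp);
```

Unlike an `as` cast, `satisfies` rejects extra or mistyped fields while keeping the literal's narrower inferred type available to callers.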
* refactor: inline subtitle-prefetch-runtime-composer
The composer was a pure pass-through that destructured an object and
reassembled it with the same fields. Inlined at the call site.
* chore: consolidate duplicate import paths in main.ts
* test: extract mpv composer test fixture factory to reduce duplication
* test: add behavioral assertions to composer tests
Upgrade 8 composer test files from shape-only typeof checks to behavioral
assertions that invoke returned handlers and verify injected dependencies are
actually called, following the mpv-runtime-composer pattern.
* refactor: normalize import extensions in query modules
* refactor: consolidate toDbMs into query-shared.ts
* refactor: remove Node.js fallback from stats-server, use Bun only
* Fix monthly rollup test expectations
- Preserve multi-arg Date construction in mock helper
- Align rollup assertions with the correct videoId
* fix: address PR 36 CodeRabbit follow-ups
* fix: harden coverage lane cleanup
* fix(stats): fallback to node server when Bun.serve unavailable
* fix(ci): restore coverage lane compatibility
* chore(backlog): close TASK-242
* fix: address latest CodeRabbit review round
* fix: guard disabled immersion retention windows
* fix: migrate discord rpc wrapper
* fix(ci): add changelog fragment for PR 36
* fix: stabilize macOS visible overlay toggle
* fix: pin installed mpv plugin to current binary
* fix: strip inline subtitle markup from sidebar cues
* fix(renderer): restore subtitle sidebar mpv passthrough
* feat(discord): add configurable presence style presets
Replace the hardcoded "Mining and crafting (Anki cards)" meme message
with a preset system. New `discordPresence.presenceStyle` option
supports four presets: "default" (clean bilingual), "meme" (the OG
Minecraft joke), "japanese" (fully JP), and "minimal". The default
preset shows "Sentence Mining" with 日本語学習中 as the small image
tooltip. Existing users can set presenceStyle to "meme" to keep the
old behavior.
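The preset lookup described above can be sketched as a record keyed by style with a safe fallback. Only the "default" and "meme" strings come from the commit message; the "japanese" and "minimal" strings below are placeholders:

```typescript
// Sketch of a presenceStyle preset table (strings beyond the commit
// message are placeholders, not SubMiner's actual values).
type PresenceStyle = 'default' | 'meme' | 'japanese' | 'minimal';

const presenceDetails: Record<PresenceStyle, string> = {
  default: 'Sentence Mining',
  meme: 'Mining and crafting (Anki cards)', // the original hardcoded message
  japanese: '文章マイニング中', // placeholder; the real JP strings may differ
  minimal: 'SubMiner',
};

function resolvePresenceDetails(style?: string): string {
  // Missing or unknown presenceStyle falls back to the default preset.
  const key = (style ?? 'default') as PresenceStyle;
  return presenceDetails[key] ?? presenceDetails.default;
}
```

Falling back on unknown keys keeps old configs (or typos) from crashing the Discord RPC wrapper.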
* fix: finalize v0.10.0 release prep
* docs: add subtitle sidebar guide and release note
* chore(backlog): mark docs task done
* fix: lazily resolve youtube playback socket path
* chore(release): build v0.10.0 changelog
* Revert "chore(release): build v0.10.0 changelog"
This reverts commit 9741c0f020.
@@ -111,7 +111,11 @@ test('writeChangelogArtifacts skips changelog prepend when release section alrea
   fs.mkdirSync(projectRoot, { recursive: true });
   fs.mkdirSync(path.join(projectRoot, 'changes'), { recursive: true });
   fs.writeFileSync(path.join(projectRoot, 'CHANGELOG.md'), existingChangelog, 'utf8');
-  fs.writeFileSync(path.join(projectRoot, 'changes', '001.md'), ['type: added', 'area: overlay', '', '- Stale release fragment.'].join('\n'), 'utf8');
+  fs.writeFileSync(
+    path.join(projectRoot, 'changes', '001.md'),
+    ['type: added', 'area: overlay', '', '- Stale release fragment.'].join('\n'),
+    'utf8',
+  );

   try {
     const result = writeChangelogArtifacts({
@@ -125,7 +129,10 @@ test('writeChangelogArtifacts skips changelog prepend when release section alrea

   const changelog = fs.readFileSync(path.join(projectRoot, 'CHANGELOG.md'), 'utf8');
   assert.equal(changelog, existingChangelog);
-  const releaseNotes = fs.readFileSync(path.join(projectRoot, 'release', 'release-notes.md'), 'utf8');
+  const releaseNotes = fs.readFileSync(
+    path.join(projectRoot, 'release', 'release-notes.md'),
+    'utf8',
+  );
   assert.match(releaseNotes, /## Highlights\n### Added\n- Existing release bullet\./);
 } finally {
   fs.rmSync(workspace, { recursive: true, force: true });
@@ -354,11 +354,7 @@ export function writeChangelogArtifacts(options?: ChangelogOptions): {
     log(`Removed ${fragment.path}`);
   }

-  const releaseNotesPath = writeReleaseNotesFile(
-    cwd,
-    existingReleaseSection,
-    options?.deps,
-  );
+  const releaseNotesPath = writeReleaseNotesFile(cwd, existingReleaseSection, options?.deps);
   log(`Generated ${releaseNotesPath}`);

   return {
@@ -55,19 +55,15 @@ exit 1
 `,
   );

-  const result = spawnSync(
-    'bash',
-    ['scripts/patch-modernz.sh', '--target', target],
-    {
-      cwd: process.cwd(),
-      encoding: 'utf8',
-      env: {
-        ...process.env,
-        HOME: path.join(root, 'home'),
-        PATH: `${binDir}:${process.env.PATH || ''}`,
-      },
-    },
-  );
+  const result = spawnSync('bash', ['scripts/patch-modernz.sh', '--target', target], {
+    cwd: process.cwd(),
+    encoding: 'utf8',
+    env: {
+      ...process.env,
+      HOME: path.join(root, 'home'),
+      PATH: `${binDir}:${process.env.PATH || ''}`,
+    },
+  });

   assert.equal(result.status, 1, result.stderr || result.stdout);
   assert.match(result.stderr, /failed to apply patch to/);
scripts/run-coverage-lane.test.ts (new file, 74 lines)
@@ -0,0 +1,74 @@
import assert from 'node:assert/strict';
import { resolve } from 'node:path';
import test from 'node:test';

import { mergeLcovReports, resolveCoverageDir } from './run-coverage-lane';

test('mergeLcovReports combines duplicate source-file counters across shard outputs', () => {
  const merged = mergeLcovReports([
    [
      'SF:src/example.ts',
      'FN:10,alpha',
      'FNDA:1,alpha',
      'DA:10,1',
      'DA:11,0',
      'BRDA:10,0,0,1',
      'BRDA:10,0,1,-',
      'end_of_record',
      '',
    ].join('\n'),
    [
      'SF:src/example.ts',
      'FN:10,alpha',
      'FN:20,beta',
      'FNDA:2,alpha',
      'FNDA:1,beta',
      'DA:10,2',
      'DA:11,1',
      'DA:20,1',
      'BRDA:10,0,0,0',
      'BRDA:10,0,1,1',
      'end_of_record',
      '',
    ].join('\n'),
  ]);

  assert.match(merged, /SF:src\/example\.ts/);
  assert.match(merged, /FN:10,alpha/);
  assert.match(merged, /FN:20,beta/);
  assert.match(merged, /FNDA:3,alpha/);
  assert.match(merged, /FNDA:1,beta/);
  assert.match(merged, /FNF:2/);
  assert.match(merged, /FNH:2/);
  assert.match(merged, /DA:10,3/);
  assert.match(merged, /DA:11,1/);
  assert.match(merged, /DA:20,1/);
  assert.match(merged, /LF:3/);
  assert.match(merged, /LH:3/);
  assert.match(merged, /BRDA:10,0,0,1/);
  assert.match(merged, /BRDA:10,0,1,1/);
  assert.match(merged, /BRF:2/);
  assert.match(merged, /BRH:2/);
});

test('mergeLcovReports keeps distinct source files as separate records', () => {
  const merged = mergeLcovReports([
    ['SF:src/a.ts', 'DA:1,1', 'end_of_record', ''].join('\n'),
    ['SF:src/b.ts', 'DA:2,1', 'end_of_record', ''].join('\n'),
  ]);

  assert.match(merged, /SF:src\/a\.ts[\s\S]*end_of_record/);
  assert.match(merged, /SF:src\/b\.ts[\s\S]*end_of_record/);
});

test('resolveCoverageDir keeps coverage output inside the repository', () => {
  const repoRoot = resolve('/tmp', 'subminer-repo-root');

  assert.equal(resolveCoverageDir(repoRoot, []), resolve(repoRoot, 'coverage'));
  assert.equal(
    resolveCoverageDir(repoRoot, ['--coverage-dir', 'coverage/test-src']),
    resolve(repoRoot, 'coverage/test-src'),
  );
  assert.throws(() => resolveCoverageDir(repoRoot, ['--coverage-dir', '../escape']));
  assert.throws(() => resolveCoverageDir(repoRoot, ['--coverage-dir', '/tmp/escape']));
});
scripts/run-coverage-lane.ts (new file, 311 lines)
@@ -0,0 +1,311 @@
import { existsSync, mkdirSync, readFileSync, readdirSync, rmSync, writeFileSync } from 'node:fs';
import { spawnSync } from 'node:child_process';
import { isAbsolute, join, relative, resolve } from 'node:path';

type LaneConfig = {
  roots: string[];
  include: string[];
  exclude: Set<string>;
};

type LcovRecord = {
  sourceFile: string;
  functions: Map<string, number>;
  functionHits: Map<string, number>;
  lines: Map<number, number>;
  branches: Map<string, { line: number; block: string; branch: string; hits: number | null }>;
};

const repoRoot = resolve(__dirname, '..');

const lanes: Record<string, LaneConfig> = {
  'bun-src-full': {
    roots: ['src'],
    include: ['.test.ts', '.type-test.ts'],
    exclude: new Set([
      'src/core/services/anki-jimaku-ipc.test.ts',
      'src/core/services/ipc.test.ts',
      'src/core/services/overlay-manager.test.ts',
      'src/main/config-validation.test.ts',
      'src/main/runtime/registry.test.ts',
      'src/main/runtime/startup-config.test.ts',
    ]),
  },
  'bun-launcher-unit': {
    roots: ['launcher'],
    include: ['.test.ts'],
    exclude: new Set(['launcher/smoke.e2e.test.ts']),
  },
};

function collectFiles(rootDir: string, includeSuffixes: string[], excludeSet: Set<string>): string[] {
  const out: string[] = [];
  const visit = (currentDir: string) => {
    for (const entry of readdirSync(currentDir, { withFileTypes: true })) {
      const fullPath = resolve(currentDir, entry.name);
      if (entry.isDirectory()) {
        visit(fullPath);
        continue;
      }
      const relPath = relative(repoRoot, fullPath).replaceAll('\\', '/');
      if (excludeSet.has(relPath)) continue;
      if (includeSuffixes.some((suffix) => relPath.endsWith(suffix))) {
        out.push(relPath);
      }
    }
  };

  visit(resolve(repoRoot, rootDir));
  out.sort();
  return out;
}

function getLaneFiles(laneName: string): string[] {
  const lane = lanes[laneName];
  if (!lane) {
    throw new Error(`Unknown coverage lane: ${laneName}`);
  }
  const files = lane.roots.flatMap((rootDir) => collectFiles(rootDir, lane.include, lane.exclude));
  if (files.length === 0) {
    throw new Error(`No test files found for coverage lane: ${laneName}`);
  }
  return files;
}

function parseCoverageDirArg(argv: string[]): string {
  for (let index = 0; index < argv.length; index += 1) {
    if (argv[index] === '--coverage-dir') {
      const next = argv[index + 1];
      if (typeof next !== 'string') {
        throw new Error('Missing value for --coverage-dir');
      }
      return next;
    }
  }
  return 'coverage';
}

export function resolveCoverageDir(repoRootDir: string, argv: string[]): string {
  const candidate = resolve(repoRootDir, parseCoverageDirArg(argv));
  const rel = relative(repoRootDir, candidate);
  if (isAbsolute(rel) || rel.startsWith('..')) {
    throw new Error(`--coverage-dir must be within repository: ${candidate}`);
  }
  return candidate;
}

function parseLcovReport(report: string): LcovRecord[] {
  const records: LcovRecord[] = [];
  let current: LcovRecord | null = null;

  const ensureCurrent = (): LcovRecord => {
    if (!current) {
      throw new Error('Malformed lcov report: record data before SF');
    }
    return current;
  };

  for (const rawLine of report.split(/\r?\n/)) {
    const line = rawLine.trim();
    if (!line) continue;
    if (line.startsWith('TN:')) {
      continue;
    }
    if (line.startsWith('SF:')) {
      current = {
        sourceFile: line.slice(3),
        functions: new Map(),
        functionHits: new Map(),
        lines: new Map(),
        branches: new Map(),
      };
      continue;
    }
    if (line === 'end_of_record') {
      if (current) {
        records.push(current);
        current = null;
      }
      continue;
    }
    if (line.startsWith('FN:')) {
      const [lineNumber, ...nameParts] = line.slice(3).split(',');
      ensureCurrent().functions.set(nameParts.join(','), Number(lineNumber));
      continue;
    }
    if (line.startsWith('FNDA:')) {
      const [hits, ...nameParts] = line.slice(5).split(',');
      ensureCurrent().functionHits.set(nameParts.join(','), Number(hits));
      continue;
    }
    if (line.startsWith('DA:')) {
      const [lineNumber, hits] = line.slice(3).split(',');
      ensureCurrent().lines.set(Number(lineNumber), Number(hits));
      continue;
    }
    if (line.startsWith('BRDA:')) {
      const [lineNumber, block, branch, hits] = line.slice(5).split(',');
      if (lineNumber === undefined || block === undefined || branch === undefined || hits === undefined) {
        continue;
      }
      ensureCurrent().branches.set(`${lineNumber}:${block}:${branch}`, {
        line: Number(lineNumber),
        block,
        branch,
        hits: hits === '-' ? null : Number(hits),
      });
    }
  }

  if (current) {
    records.push(current);
  }

  return records;
}

export function mergeLcovReports(reports: string[]): string {
  const merged = new Map<string, LcovRecord>();

  for (const report of reports) {
    for (const record of parseLcovReport(report)) {
      let target = merged.get(record.sourceFile);
      if (!target) {
        target = {
          sourceFile: record.sourceFile,
          functions: new Map(),
          functionHits: new Map(),
          lines: new Map(),
          branches: new Map(),
        };
        merged.set(record.sourceFile, target);
      }

      for (const [name, line] of record.functions) {
        if (!target.functions.has(name)) {
          target.functions.set(name, line);
        }
      }

      for (const [name, hits] of record.functionHits) {
        target.functionHits.set(name, (target.functionHits.get(name) ?? 0) + hits);
      }

      for (const [lineNumber, hits] of record.lines) {
        target.lines.set(lineNumber, (target.lines.get(lineNumber) ?? 0) + hits);
      }

      for (const [branchKey, branchRecord] of record.branches) {
        const existing = target.branches.get(branchKey);
        if (!existing) {
          target.branches.set(branchKey, { ...branchRecord });
          continue;
        }
        if (branchRecord.hits === null) {
          continue;
        }
        existing.hits = (existing.hits ?? 0) + branchRecord.hits;
      }
    }
  }

  const chunks: string[] = [];
  for (const sourceFile of [...merged.keys()].sort()) {
    const record = merged.get(sourceFile)!;
    chunks.push(`SF:${record.sourceFile}`);

    const functions = [...record.functions.entries()].sort((a, b) =>
      a[1] === b[1] ? a[0].localeCompare(b[0]) : a[1] - b[1],
    );
    for (const [name, line] of functions) {
      chunks.push(`FN:${line},${name}`);
    }
    for (const [name] of functions) {
      chunks.push(`FNDA:${record.functionHits.get(name) ?? 0},${name}`);
    }
    chunks.push(`FNF:${functions.length}`);
    chunks.push(`FNH:${functions.filter(([name]) => (record.functionHits.get(name) ?? 0) > 0).length}`);

    const branches = [...record.branches.values()].sort((a, b) =>
      a.line === b.line
        ? a.block === b.block
          ? a.branch.localeCompare(b.branch)
          : a.block.localeCompare(b.block)
        : a.line - b.line,
    );
    for (const branch of branches) {
      chunks.push(
        `BRDA:${branch.line},${branch.block},${branch.branch},${branch.hits === null ? '-' : branch.hits}`,
      );
    }
    chunks.push(`BRF:${branches.length}`);
    chunks.push(`BRH:${branches.filter((branch) => (branch.hits ?? 0) > 0).length}`);

    const lines = [...record.lines.entries()].sort((a, b) => a[0] - b[0]);
    for (const [lineNumber, hits] of lines) {
      chunks.push(`DA:${lineNumber},${hits}`);
    }
    chunks.push(`LF:${lines.length}`);
    chunks.push(`LH:${lines.filter(([, hits]) => hits > 0).length}`);
    chunks.push('end_of_record');
  }

  return chunks.length > 0 ? `${chunks.join('\n')}\n` : '';
}

function runCoverageLane(): number {
  const laneName = process.argv[2];
  if (laneName === undefined) {
    process.stderr.write('Missing coverage lane name\n');
    return 1;
  }

  const coverageDir = resolveCoverageDir(repoRoot, process.argv.slice(3));
  const shardRoot = join(coverageDir, '.shards');
  mkdirSync(coverageDir, { recursive: true });
  rmSync(shardRoot, { recursive: true, force: true });
  mkdirSync(shardRoot, { recursive: true });

  const files = getLaneFiles(laneName);
  const reports: string[] = [];

  try {
    for (const [index, file] of files.entries()) {
      const shardDir = join(shardRoot, `${String(index + 1).padStart(3, '0')}`);
      const result = spawnSync(
        'bun',
        ['test', '--coverage', '--coverage-reporter=lcov', '--coverage-dir', shardDir, `./${file}`],
        {
          cwd: repoRoot,
          stdio: 'inherit',
        },
      );

      if (result.error) {
        throw result.error;
      }
      if ((result.status ?? 1) !== 0) {
        return result.status ?? 1;
      }

      const lcovPath = join(shardDir, 'lcov.info');
      if (!existsSync(lcovPath)) {
        process.stdout.write(`Skipping empty coverage shard for ${file}\n`);
        continue;
      }

      reports.push(readFileSync(lcovPath, 'utf8'));
    }

    writeFileSync(join(coverageDir, 'lcov.info'), mergeLcovReports(reports), 'utf8');
    process.stdout.write(`Merged LCOV written to ${relative(repoRoot, join(coverageDir, 'lcov.info'))}\n`);
    return 0;
  } finally {
    rmSync(shardRoot, { recursive: true, force: true });
  }
}

// @ts-ignore Bun entrypoint detection; TS config for scripts still targets CommonJS.
if (import.meta.main) {
  process.exit(runCoverageLane());
}
@@ -33,7 +33,7 @@ function runBash(args: string[]) {
 }

 function parseArtifactDir(stdout: string): string {
-  const match = stdout.match(/^artifact_dir=(.+)$/m);
+  const match = stdout.match(/^artifacts: (.+)$/m);
   assert.ok(match, `expected artifact_dir in stdout, got:\n${stdout}`);
   return match[1] ?? '';
 }
@@ -42,10 +42,17 @@ function readSummaryJson(artifactDir: string) {
   return JSON.parse(fs.readFileSync(path.join(artifactDir, 'summary.json'), 'utf8')) as {
     sessionId: string;
     status: string;
-    selectedLanes: string[];
+    lanes: string[];
     blockers?: string[];
     artifactDir: string;
     pathSelectionMode?: string;
+    steps: Array<{
+      lane: string;
+      name: string;
+      stdout: string;
+      stderr: string;
+      note: string;
+    }>;
   };
 }
@@ -71,15 +78,14 @@ test('verifier blocks requested real-runtime lane when runtime execution is not
       'launcher/mpv.ts',
     ]);

-    assert.notEqual(result.status, 0, result.stdout);
-    assert.match(result.stdout, /^result=blocked$/m);
+    assert.equal(result.status, 0, result.stdout);

     const summary = readSummaryJson(artifactDir);
     assert.equal(summary.status, 'blocked');
-    assert.deepEqual(summary.selectedLanes, ['real-runtime']);
+    assert.deepEqual(summary.lanes, ['real-runtime']);
     assert.ok(summary.sessionId.length > 0);
     assert.ok(summary.blockers?.some((entry) => entry.includes('--allow-real-runtime')));
-    assert.equal(fs.existsSync(path.join(artifactDir, 'reports', 'summary.json')), true);
+    assert.equal(fs.existsSync(path.join(artifactDir, 'summary.json')), true);
   });
 });
@@ -96,16 +102,81 @@ test('verifier fails closed for unknown lanes', () => {
       'src/main.ts',
     ]);

-    assert.notEqual(result.status, 0, result.stdout);
-    assert.match(result.stdout, /^result=failed$/m);
+    assert.equal(result.status, 0, result.stdout);

     const summary = readSummaryJson(artifactDir);
-    assert.equal(summary.status, 'failed');
-    assert.deepEqual(summary.selectedLanes, ['not-a-lane']);
+    assert.equal(summary.status, 'blocked');
+    assert.deepEqual(summary.lanes, ['not-a-lane']);
     assert.ok(summary.blockers?.some((entry) => entry.includes('unknown lane')));
   });
 });

+test('verifier keeps non-passing step artifacts distinct across lanes', () => {
+  withTempDir((root) => {
+    const artifactDir = path.join(root, 'artifacts');
+    const result = runBash([
+      verifyScript,
+      '--dry-run',
+      '--artifact-dir',
+      artifactDir,
+      '--lane',
+      'docs',
+      '--lane',
+      'not-a-lane',
+      'src/main.ts',
+    ]);
+
+    assert.equal(result.status, 0, result.stdout);
+
+    const summary = readSummaryJson(artifactDir);
+    const docsStep = summary.steps.find((step) => step.lane === 'docs' && step.name === 'docs-kb');
+    const unknownStep = summary.steps.find(
+      (step) => step.lane === 'not-a-lane' && step.name === 'unknown-lane',
+    );
+
+    assert.ok(docsStep);
+    assert.ok(unknownStep);
+    assert.notEqual(docsStep?.stdout, unknownStep?.stdout);
+    assert.equal(fs.existsSync(path.join(artifactDir, docsStep!.stdout)), true);
+    assert.equal(fs.existsSync(path.join(artifactDir, unknownStep!.stdout)), true);
+  });
+});
+
+test('verifier records the real-runtime lease blocker once', () => {
+  withTempDir((root) => {
+    const artifactDir = path.join(root, 'artifacts');
+    const leaseDir = path.join(
+      repoRoot,
+      '.tmp',
+      'skill-verification',
+      'locks',
+      'exclusive-real-runtime',
+    );
+    fs.mkdirSync(leaseDir, { recursive: true });
+    fs.writeFileSync(path.join(leaseDir, 'session_id'), 'other-session');
+
+    try {
+      const result = runBash([
+        verifyScript,
+        '--dry-run',
+        '--artifact-dir',
+        artifactDir,
+        '--allow-real-runtime',
+        '--lane',
+        'real-runtime',
+        'launcher/mpv.ts',
+      ]);
+
+      assert.equal(result.status, 0, result.stdout);
+
+      const summary = readSummaryJson(artifactDir);
+      assert.deepEqual(summary.blockers, ['real-runtime lease already held by other-session']);
+    } finally {
+      fs.rmSync(leaseDir, { recursive: true, force: true });
+    }
+  });
+});

 test('verifier allocates unique session ids and artifact roots by default', () => {
   const first = runBash([verifyScript, '--dry-run', '--lane', 'core', 'src/main.ts']);
   const second = runBash([verifyScript, '--dry-run', '--lane', 'core', 'src/main.ts']);
@@ -121,9 +192,9 @@ test('verifier allocates unique session ids and artifact roots by default', () =
     const secondSummary = readSummaryJson(secondArtifactDir);

     assert.notEqual(firstSummary.sessionId, secondSummary.sessionId);
-    assert.notEqual(firstSummary.artifactDir, secondSummary.artifactDir);
-    assert.equal(firstSummary.pathSelectionMode, 'explicit');
-    assert.equal(secondSummary.pathSelectionMode, 'explicit');
+    assert.notEqual(firstArtifactDir, secondArtifactDir);
+    assert.equal(firstSummary.pathSelectionMode, 'explicit-lanes');
+    assert.equal(secondSummary.pathSelectionMode, 'explicit-lanes');
   } finally {
     fs.rmSync(firstArtifactDir, { recursive: true, force: true });
     fs.rmSync(secondArtifactDir, { recursive: true, force: true });
@@ -822,6 +822,92 @@ do
   )
 end

+do
+  local recorded, err = run_plugin_scenario({
+    process_list = "",
+    option_overrides = {
+      binary_path = binary_path,
+      auto_start = "yes",
+      auto_start_visible_overlay = "yes",
+      auto_start_pause_until_ready = "yes",
+      socket_path = "/tmp/subminer-socket",
+    },
+    input_ipc_server = "/tmp/subminer-socket",
+    media_title = "Random Movie",
+    files = {
+      [binary_path] = true,
+    },
+  })
+  assert_true(recorded ~= nil, "plugin failed to load for manual toggle-off ready scenario: " .. tostring(err))
+  fire_event(recorded, "file-loaded")
+  assert_true(recorded.script_messages["subminer-toggle"] ~= nil, "subminer-toggle script message not registered")
+  recorded.script_messages["subminer-toggle"]()
+  assert_true(
+    count_control_calls(recorded.async_calls, "--toggle-visible-overlay") == 1,
+    "manual toggle should use explicit visible-overlay toggle command"
+  )
+  recorded.script_messages["subminer-autoplay-ready"]()
+  assert_true(
+    count_control_calls(recorded.async_calls, "--show-visible-overlay") == 1,
+    "manual toggle-off before readiness should suppress ready-time visible overlay restore"
+  )
+end
+
+do
+  local recorded, err = run_plugin_scenario({
+    process_list = "",
+    option_overrides = {
+      binary_path = binary_path,
+      auto_start = "yes",
+      auto_start_visible_overlay = "yes",
+      auto_start_pause_until_ready = "yes",
+      socket_path = "/tmp/subminer-socket",
+    },
+    input_ipc_server = "/tmp/subminer-socket",
+    media_title = "Random Movie",
+    files = {
+      [binary_path] = true,
+    },
+  })
+  assert_true(
+    recorded ~= nil,
+    "plugin failed to load for repeated ready restore suppression scenario: " .. tostring(err)
+  )
+  fire_event(recorded, "file-loaded")
+  assert_true(recorded.script_messages["subminer-toggle"] ~= nil, "subminer-toggle script message not registered")
+  recorded.script_messages["subminer-toggle"]()
+  recorded.script_messages["subminer-autoplay-ready"]()
+  recorded.script_messages["subminer-autoplay-ready"]()
+  assert_true(
+    count_control_calls(recorded.async_calls, "--show-visible-overlay") == 1,
+    "manual toggle-off should suppress repeated ready-time visible overlay restores for the same session"
+  )
+end
+
+do
+  local recorded, err = run_plugin_scenario({
+    process_list = "",
+    option_overrides = {
+      binary_path = binary_path,
+      auto_start = "no",
+    },
+    files = {
+      [binary_path] = true,
+    },
+  })
+  assert_true(recorded ~= nil, "plugin failed to load for manual toggle command scenario: " .. tostring(err))
+  assert_true(recorded.script_messages["subminer-toggle"] ~= nil, "subminer-toggle script message not registered")
+  recorded.script_messages["subminer-toggle"]()
+  assert_true(
+    count_control_calls(recorded.async_calls, "--toggle-visible-overlay") == 1,
+    "script-message toggle should issue explicit visible-overlay toggle command"
+  )
+  assert_true(
+    count_control_calls(recorded.async_calls, "--toggle") == 0,
+    "script-message toggle should not issue legacy generic toggle command"
+  )
+end

 do
   local recorded, err = run_plugin_scenario({
     process_list = "",
@@ -47,8 +47,8 @@ test('update-aur-package updates PKGBUILD and .SRCINFO without makepkg', () => {

   const pkgbuild = fs.readFileSync(path.join(pkgDir, 'PKGBUILD'), 'utf8');
   const srcinfo = fs.readFileSync(path.join(pkgDir, '.SRCINFO'), 'utf8');
-  const expectedSums = [appImagePath, wrapperPath, assetsPath].map((filePath) =>
-    execFileSync('sha256sum', [filePath], { encoding: 'utf8' }).split(/\s+/)[0],
+  const expectedSums = [appImagePath, wrapperPath, assetsPath].map(
+    (filePath) => execFileSync('sha256sum', [filePath], { encoding: 'utf8' }).split(/\s+/)[0],
   );

   assert.match(pkgbuild, /^pkgver=0\.6\.3$/m);