Mirror of https://github.com/ksyasuda/SubMiner.git
Synced 2026-04-09 16:19:25 -07:00

Compare commits: v0.6.3 ... 42abdd1268 (5 commits)
Commits:
- 42abdd1268
- 5d914b1547
- 50b45cac0b
- e35aac6ee0
- fe2da22d29
.github/workflows/release.yml (vendored): 6 changed lines
@@ -386,11 +386,6 @@ jobs:
             exit 1
           fi
 
-      - name: Install makepkg
-        run: |
-          sudo apt-get update
-          sudo apt-get install -y makepkg
-
       - name: Configure SSH for AUR
         env:
           AUR_SSH_PRIVATE_KEY: ${{ secrets.AUR_SSH_PRIVATE_KEY }}
@@ -426,6 +421,7 @@ jobs:
           version_no_v="${{ steps.version.outputs.VERSION }}"
           version_no_v="${version_no_v#v}"
           cp packaging/aur/subminer-bin/PKGBUILD aur-subminer-bin/PKGBUILD
+          cp packaging/aur/subminer-bin/.SRCINFO aur-subminer-bin/.SRCINFO
           bash scripts/update-aur-package.sh \
             --pkg-dir aur-subminer-bin \
             --version "${{ steps.version.outputs.VERSION }}" \
CHANGELOG.md: 10 changed lines

@@ -1,5 +1,15 @@
 # Changelog
 
+## v0.6.5 (2026-03-15)
+
+### Internal
+- Release: Seed the AUR checkout with the repo `.SRCINFO` template before rewriting metadata so tagged releases do not depend on prior AUR state.
+
+## v0.6.4 (2026-03-15)
+
+### Internal
+- Release: Reworked AUR metadata generation to update `.SRCINFO` directly instead of depending on runner `makepkg`, fixing tagged release publishing for `subminer-bin`.
+
 ## v0.6.3 (2026-03-15)
 
 ### Changed
(new file, 73 lines)

@@ -0,0 +1,73 @@
---
id: TASK-166
title: Prevent AUR upgrade cache collisions for unversioned release assets
status: Done
assignee:
  - Codex
created_date: '2026-03-17 18:10'
updated_date: '2026-03-17 18:14'
labels:
  - release
  - packaging
  - linux
dependencies:
  - TASK-165
references:
  - /home/sudacode/projects/japanese/SubMiner/.github/workflows/release.yml
  - /home/sudacode/projects/japanese/SubMiner/scripts/update-aur-package.sh
  - /home/sudacode/projects/japanese/SubMiner/scripts/update-aur-package.test.ts
  - /home/sudacode/projects/japanese/SubMiner/packaging/aur/subminer-bin/PKGBUILD
  - /home/sudacode/projects/japanese/SubMiner/packaging/aur/subminer-bin/.SRCINFO
priority: medium
---

## Description

<!-- SECTION:DESCRIPTION:BEGIN -->
Fix the AUR release metadata generated by the tagged-release workflow so end-user upgrades do not reuse stale cached downloads for unversioned `subminer` and `subminer-assets.tar.gz` source names.
<!-- SECTION:DESCRIPTION:END -->

## Acceptance Criteria

<!-- AC:BEGIN -->
- [x] #1 AUR packaging generated for a new `pkgver` uses versioned local source aliases for the non-versioned GitHub release assets.
- [x] #2 The package install step references the versioned local launcher filename correctly.
- [x] #3 Regression coverage fails if metadata generation reintroduces stable cache-colliding source aliases.
- [x] #4 Targeted verification records the commands run and results.
<!-- AC:END -->

## Implementation Plan

<!-- SECTION:PLAN:BEGIN -->
1. Add a failing regression test around `scripts/update-aur-package.sh` output for versioned local source aliases.
2. Update the repo AUR template and `.SRCINFO` rewrite logic to stamp versioned alias names for `subminer` and `subminer-assets`.
3. Verify the generated metadata and targeted workflow/package tests, then record results here.
<!-- SECTION:PLAN:END -->
## Implementation Notes

<!-- SECTION:NOTES:BEGIN -->
Root cause: the AUR package used stable local source aliases for the unversioned `subminer` and `subminer-assets.tar.gz` GitHub release assets. `makepkg`/AUR helpers can reuse those cached filenames across upgrades, so a stale cached download survives into a newer `pkgver` and then fails checksum validation.

Patched the repo AUR template to version the local cache aliases:
- `subminer-${pkgver}::.../subminer`
- `subminer-assets-${pkgver}.tar.gz::.../subminer-assets.tar.gz`

Updated `package()` to install the versioned local wrapper filename, and updated `scripts/update-aur-package.sh` so the generated `.SRCINFO` stamps matching concrete versioned aliases for release automation.

Added regression assertions in `scripts/update-aur-package.test.ts` covering both versioned source aliases and the launcher install path, then watched that test fail before the patch and pass after it.

Verification:
- `bun test scripts/update-aur-package.test.ts`
- `bash -n scripts/update-aur-package.sh && bash -n packaging/aur/subminer-bin/PKGBUILD`
- `bun run typecheck`
- `bun run test:fast`
- `bun run test:env`
- `bun run build`
- `bun run test:smoke:dist`
<!-- SECTION:NOTES:END -->
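The cache-collision mechanics described above come down to how `makepkg` derives a local filename from a `name::url` source entry: everything before the first `::` becomes the cached file's name. A minimal standalone sketch (not the repo's code; the `local_name` helper is hypothetical, though the `%%::*` split mirrors the documented alias syntax):

```shell
# makepkg caches each source entry under the alias before `::`.
# A stable alias means every pkgver reuses the same cached filename.
local_name() {
  # Everything before the first `::` is the local filename alias.
  printf '%s\n' "${1%%::*}"
}

old="subminer::https://github.com/ksyasuda/SubMiner/releases/download/v0.6.2/subminer"
new="subminer-0.6.3::https://github.com/ksyasuda/SubMiner/releases/download/v0.6.3/subminer"

local_name "$old"   # stable alias "subminer": collides across upgrades
local_name "$new"   # versioned alias: unique per pkgver, no stale cache hit
```

With the stable alias, an upgrade from 0.6.2 to 0.6.3 finds an existing `subminer` in the cache, skips the download, and then fails the 0.6.3 checksum; the versioned alias sidesteps this entirely.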

## Final Summary

<!-- SECTION:FINAL_SUMMARY:BEGIN -->
The tagged-release AUR metadata path now emits versioned local source aliases for the non-versioned GitHub release assets, preventing stale `makepkg` cache reuse across `subminer-bin` upgrades. The change is covered by a regression test and passed the repo's maintained verification gate.
<!-- SECTION:FINAL_SUMMARY:END -->
changes/jlpt-duplicate-log-summary.md (new file): 4 lines

@@ -0,0 +1,4 @@
type: fixed
area: jlpt

- Reduced JLPT dictionary startup log noise by summarizing duplicate surface-form collisions instead of logging one line per duplicate entry.
@@ -1,5 +1,11 @@
 # Changelog
 
+## v0.6.5 (2026-03-15)
+- Seeded the AUR checkout with the repo `.SRCINFO` template before rewriting metadata so tagged releases do not depend on prior AUR state.
+
+## v0.6.4 (2026-03-15)
+- Reworked AUR metadata generation to update `.SRCINFO` directly instead of depending on runner `makepkg`, fixing tagged release publishing for `subminer-bin`.
+
 ## v0.6.3 (2026-03-15)
 - Expanded `Alt+C` into an inline controller config/remap flow with preferred-controller saving and per-action learn mode for buttons, triggers, and stick directions.
 - Automated `subminer-bin` AUR package updates from the tagged release workflow.
package.json

@@ -1,6 +1,6 @@
 {
   "name": "subminer",
-  "version": "0.6.3",
+  "version": "0.6.5",
   "description": "All-in-one sentence mining overlay with AnkiConnect and dictionary integration",
   "packageManager": "bun@1.3.5",
   "main": "dist/main-entry.js",
@@ -54,7 +54,7 @@
   "test:launcher": "bun run test:launcher:src",
   "test:core": "bun run test:core:src",
   "test:subtitle": "bun run test:subtitle:src",
-  "test:fast": "bun run test:config:src && bun run test:core:src && bun test src/main-entry-runtime.test.ts src/anki-integration/anki-connect-proxy.test.ts src/release-workflow.test.ts src/ci-workflow.test.ts scripts/build-changelog.test.ts scripts/mkv-to-readme-video.test.ts && bun run tsc && bun test dist/main/runtime/registry.test.js",
+  "test:fast": "bun run test:config:src && bun run test:core:src && bun test src/main-entry-runtime.test.ts src/anki-integration/anki-connect-proxy.test.ts src/release-workflow.test.ts src/ci-workflow.test.ts scripts/build-changelog.test.ts scripts/mkv-to-readme-video.test.ts scripts/update-aur-package.test.ts && bun run tsc && bun test dist/main/runtime/registry.test.js",
   "generate:config-example": "bun run src/generate-config-example.ts",
   "verify:config-example": "bun run src/verify-config-example.ts",
   "start": "bun run build && electron . --start",
packaging/aur/subminer-bin/.SRCINFO

@@ -27,8 +27,8 @@ pkgbase = subminer-bin
 	options = !strip
 	options = !debug
 	source = SubMiner-0.6.2.AppImage::https://github.com/ksyasuda/SubMiner/releases/download/v0.6.2/SubMiner-0.6.2.AppImage
-	source = subminer::https://github.com/ksyasuda/SubMiner/releases/download/v0.6.2/subminer
-	source = subminer-assets.tar.gz::https://github.com/ksyasuda/SubMiner/releases/download/v0.6.2/subminer-assets.tar.gz
+	source = subminer-0.6.2::https://github.com/ksyasuda/SubMiner/releases/download/v0.6.2/subminer
+	source = subminer-assets-0.6.2.tar.gz::https://github.com/ksyasuda/SubMiner/releases/download/v0.6.2/subminer-assets.tar.gz
 	sha256sums = c91667adbbc47a0fba34855358233454a9ea442ab57510546b2219abd1f2461e
 	sha256sums = 85050918e14cb2512fcd34be83387a2383fa5c206dc1bdc11e8d98f7d37817e5
 	sha256sums = 210113be64a06840f4dfaebc22a8e6fc802392f1308413aa00d9348c804ab2a1
packaging/aur/subminer-bin/PKGBUILD

@@ -32,8 +32,8 @@ provides=("subminer=${pkgver}")
 conflicts=('subminer')
 source=(
   "SubMiner-${pkgver}.AppImage::https://github.com/ksyasuda/SubMiner/releases/download/v${pkgver}/SubMiner-${pkgver}.AppImage"
-  "subminer::https://github.com/ksyasuda/SubMiner/releases/download/v${pkgver}/subminer"
-  "subminer-assets.tar.gz::https://github.com/ksyasuda/SubMiner/releases/download/v${pkgver}/subminer-assets.tar.gz"
+  "subminer-${pkgver}::https://github.com/ksyasuda/SubMiner/releases/download/v${pkgver}/subminer"
+  "subminer-assets-${pkgver}.tar.gz::https://github.com/ksyasuda/SubMiner/releases/download/v${pkgver}/subminer-assets.tar.gz"
 )
 sha256sums=(
   'c91667adbbc47a0fba34855358233454a9ea442ab57510546b2219abd1f2461e'
@@ -50,7 +50,7 @@ package() {
   install -dm755 "${pkgdir}/opt/SubMiner"
   ln -s '/opt/SubMiner/SubMiner.AppImage' "${pkgdir}/usr/bin/SubMiner.AppImage"
 
-  install -Dm755 "${srcdir}/subminer" "${pkgdir}/usr/bin/subminer"
+  install -Dm755 "${srcdir}/subminer-${pkgver}" "${pkgdir}/usr/bin/subminer"
 
   install -Dm644 "${srcdir}/config.example.jsonc" \
     "${pkgdir}/usr/share/SubMiner/config.example.jsonc"
@@ -1,10 +1,6 @@
 ## Highlights
-### Changed
-- Overlay: Expanded the `Alt+C` controller modal into an inline config/remap flow with preferred-controller saving and per-action learn mode for buttons, triggers, and stick directions.
-
 ### Internal
-- Workflow: Hardened the `subminer-scrum-master` skill to explicitly answer whether docs updates and changelog fragments are required before handoff.
-- Release: Automate `subminer-bin` AUR package updates from the tagged release workflow.
+- Release: Seed the AUR checkout with the repo `.SRCINFO` template before rewriting metadata so tagged releases do not depend on prior AUR state.
 
 ## Installation
 
scripts/update-aur-package.sh

@@ -54,6 +54,7 @@ fi
 
 version="${version#v}"
 pkgbuild="${pkg_dir}/PKGBUILD"
+srcinfo="${pkg_dir}/.SRCINFO"
 
 if [[ ! -f "$pkgbuild" ]]; then
   echo "Missing PKGBUILD at $pkgbuild" >&2
@@ -118,7 +119,96 @@ awk \
 ' "$pkgbuild" > "$tmpfile"
 mv "$tmpfile" "$pkgbuild"
 
-(
-  cd "$pkg_dir"
-  makepkg --printsrcinfo > .SRCINFO
-)
+if [[ ! -f "$srcinfo" ]]; then
+  echo "Missing .SRCINFO at $srcinfo" >&2
+  exit 1
+fi
+
+tmpfile="$(mktemp)"
+awk \
+  -v version="$version" \
+  -v sum_appimage="${sha256sums[0]}" \
+  -v sum_wrapper="${sha256sums[1]}" \
+  -v sum_assets="${sha256sums[2]}" \
+  '
+  BEGIN {
+    sha_index = 0
+    found_pkgver = 0
+    found_provides = 0
+    found_noextract = 0
+    found_source_appimage = 0
+    found_source_wrapper = 0
+    found_source_assets = 0
+  }
+  /^\tpkgver = / {
+    print "\tpkgver = " version
+    found_pkgver = 1
+    next
+  }
+  /^\tprovides = subminer=/ {
+    print "\tprovides = subminer=" version
+    found_provides = 1
+    next
+  }
+  /^\tnoextract = SubMiner-.*\.AppImage$/ {
+    print "\tnoextract = SubMiner-" version ".AppImage"
+    found_noextract = 1
+    next
+  }
+  /^\tsource = SubMiner-.*\.AppImage::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v.*\/SubMiner-.*\.AppImage$/ {
+    print "\tsource = SubMiner-" version ".AppImage::https://github.com/ksyasuda/SubMiner/releases/download/v" version "/SubMiner-" version ".AppImage"
+    found_source_appimage = 1
+    next
+  }
+  /^\tsource = subminer-.*::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v.*\/subminer$/ {
+    print "\tsource = subminer-" version "::https://github.com/ksyasuda/SubMiner/releases/download/v" version "/subminer"
+    found_source_wrapper = 1
+    next
+  }
+  /^\tsource = subminer-assets-.*\.tar\.gz::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v.*\/subminer-assets\.tar\.gz$/ {
+    print "\tsource = subminer-assets-" version ".tar.gz::https://github.com/ksyasuda/SubMiner/releases/download/v" version "/subminer-assets.tar.gz"
+    found_source_assets = 1
+    next
+  }
+  /^\tsha256sums = / {
+    sha_index += 1
+    if (sha_index == 1) {
+      print "\tsha256sums = " sum_appimage
+      next
+    }
+    if (sha_index == 2) {
+      print "\tsha256sums = " sum_wrapper
+      next
+    }
+    if (sha_index == 3) {
+      print "\tsha256sums = " sum_assets
+      next
+    }
+  }
+  {
+    print
+  }
+  END {
+    if (!found_pkgver) {
+      print "Missing pkgver entry in .SRCINFO" > "/dev/stderr"
+      exit 1
+    }
+    if (!found_provides) {
+      print "Missing provides entry in .SRCINFO" > "/dev/stderr"
+      exit 1
+    }
+    if (!found_noextract) {
+      print "Missing noextract entry in .SRCINFO" > "/dev/stderr"
+      exit 1
+    }
+    if (!found_source_appimage || !found_source_wrapper || !found_source_assets) {
+      print "Missing source entry in .SRCINFO" > "/dev/stderr"
+      exit 1
+    }
+    if (sha_index < 3) {
+      print "Missing sha256sums entries in .SRCINFO" > "/dev/stderr"
+      exit 1
+    }
+  }
+  ' "$srcinfo" > "$tmpfile"
+mv "$tmpfile" "$srcinfo"
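The rewrite technique in this script can be exercised in isolation. Below is a reduced, hypothetical stand-alone demo of the same awk pattern (not the repo script): stamp `pkgver` in a tab-indented `.SRCINFO` fragment and fail loudly in `END` if the expected line never appeared.

```shell
srcinfo="$(mktemp)"
# .SRCINFO fields are tab-indented; the awk patterns anchor on that tab.
printf 'pkgbase = subminer-bin\n\tpkgver = 0.6.2\n\tpkgrel = 1\n' > "$srcinfo"

awk -v version="0.6.3" '
  /^\tpkgver = / { print "\tpkgver = " version; found = 1; next }
  { print }
  END { if (!found) { print "Missing pkgver entry" > "/dev/stderr"; exit 1 } }
' "$srcinfo" > "${srcinfo}.new"
mv "${srcinfo}.new" "$srcinfo"

grep 'pkgver' "$srcinfo"
```

The `found` flag plus `END` guard is what lets the release workflow fail a tagged publish immediately when the template drifts, instead of silently pushing incomplete metadata.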
scripts/update-aur-package.test.ts (new file): 87 lines

@@ -0,0 +1,87 @@
import assert from 'node:assert/strict';
import { execFileSync } from 'node:child_process';
import fs from 'node:fs';
import os from 'node:os';
import path from 'node:path';
import test from 'node:test';

function createWorkspace(name: string): string {
  return fs.mkdtempSync(path.join(os.tmpdir(), `${name}-`));
}

test('update-aur-package updates PKGBUILD and .SRCINFO without makepkg', () => {
  const workspace = createWorkspace('subminer-aur-package');
  const pkgDir = path.join(workspace, 'aur-subminer-bin');
  const appImagePath = path.join(workspace, 'SubMiner-0.6.3.AppImage');
  const wrapperPath = path.join(workspace, 'subminer');
  const assetsPath = path.join(workspace, 'subminer-assets.tar.gz');

  fs.mkdirSync(pkgDir, { recursive: true });
  fs.copyFileSync('packaging/aur/subminer-bin/PKGBUILD', path.join(pkgDir, 'PKGBUILD'));
  fs.copyFileSync('packaging/aur/subminer-bin/.SRCINFO', path.join(pkgDir, '.SRCINFO'));
  fs.writeFileSync(appImagePath, 'appimage');
  fs.writeFileSync(wrapperPath, 'wrapper');
  fs.writeFileSync(assetsPath, 'assets');

  try {
    execFileSync(
      'bash',
      [
        'scripts/update-aur-package.sh',
        '--pkg-dir',
        pkgDir,
        '--version',
        'v0.6.3',
        '--appimage',
        appImagePath,
        '--wrapper',
        wrapperPath,
        '--assets',
        assetsPath,
      ],
      {
        cwd: process.cwd(),
        encoding: 'utf8',
      },
    );

    const pkgbuild = fs.readFileSync(path.join(pkgDir, 'PKGBUILD'), 'utf8');
    const srcinfo = fs.readFileSync(path.join(pkgDir, '.SRCINFO'), 'utf8');
    const expectedSums = [appImagePath, wrapperPath, assetsPath].map((filePath) =>
      execFileSync('sha256sum', [filePath], { encoding: 'utf8' }).split(/\s+/)[0],
    );

    assert.match(pkgbuild, /^pkgver=0\.6\.3$/m);
    assert.match(
      pkgbuild,
      /^\s*"subminer-\$\{pkgver\}::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v\$\{pkgver\}\/subminer"$/m,
    );
    assert.match(
      pkgbuild,
      /^\s*"subminer-assets-\$\{pkgver\}\.tar\.gz::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v\$\{pkgver\}\/subminer-assets\.tar\.gz"$/m,
    );
    assert.match(
      pkgbuild,
      /^\s*install -Dm755 "\$\{srcdir\}\/subminer-\$\{pkgver\}" "\$\{pkgdir\}\/usr\/bin\/subminer"$/m,
    );
    assert.match(srcinfo, /^\tpkgver = 0\.6\.3$/m);
    assert.match(srcinfo, /^\tprovides = subminer=0\.6\.3$/m);
    assert.match(
      srcinfo,
      /^\tsource = SubMiner-0\.6\.3\.AppImage::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v0\.6\.3\/SubMiner-0\.6\.3\.AppImage$/m,
    );
    assert.match(
      srcinfo,
      /^\tsource = subminer-0\.6\.3::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v0\.6\.3\/subminer$/m,
    );
    assert.match(
      srcinfo,
      /^\tsource = subminer-assets-0\.6\.3\.tar\.gz::https:\/\/github\.com\/ksyasuda\/SubMiner\/releases\/download\/v0\.6\.3\/subminer-assets\.tar\.gz$/m,
    );
    assert.match(srcinfo, new RegExp(`^\\tsha256sums = ${expectedSums[0]}$`, 'm'));
    assert.match(srcinfo, new RegExp(`^\\tsha256sums = ${expectedSums[1]}$`, 'm'));
    assert.match(srcinfo, new RegExp(`^\\tsha256sums = ${expectedSums[2]}$`, 'm'));
  } finally {
    fs.rmSync(workspace, { recursive: true, force: true });
  }
});
jlpt-vocab.test.ts

@@ -6,9 +6,16 @@ import test from 'node:test';
 
 import { createJlptVocabularyLookup } from './jlpt-vocab';
 
-test('createJlptVocabularyLookup loads JLPT bank entries and resolves known levels', async () => {
+function createTempDir(): string {
+  return fs.mkdtempSync(path.join(os.tmpdir(), 'subminer-jlpt-dict-'));
+}
+
+test('createJlptVocabularyLookup loads JLPT bank entries and resolves known levels', async (t) => {
   const logs: string[] = [];
-  const tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'subminer-jlpt-dict-'));
+  const tempDir = createTempDir();
+  t.after(() => {
+    fs.rmSync(tempDir, { recursive: true, force: true });
+  });
   fs.writeFileSync(
     path.join(tempDir, 'term_meta_bank_5.json'),
     JSON.stringify([
@@ -37,8 +44,11 @@ test('createJlptVocabularyLookup loads JLPT bank entries and resolves known leve
   );
 });
 
-test('createJlptVocabularyLookup does not require synchronous fs APIs', async () => {
-  const tempDir = fs.mkdtempSync(path.join(os.tmpdir(), 'subminer-jlpt-dict-'));
+test('createJlptVocabularyLookup does not require synchronous fs APIs', async (t) => {
+  const tempDir = createTempDir();
+  t.after(() => {
+    fs.rmSync(tempDir, { recursive: true, force: true });
+  });
   fs.writeFileSync(
     path.join(tempDir, 'term_meta_bank_4.json'),
     JSON.stringify([['見る', 1, { frequency: { displayValue: 3 } }]]),
@@ -73,3 +83,47 @@ test('createJlptVocabularyLookup does not require synchronous fs APIs', async ()
     (fs as unknown as Record<string, unknown>).existsSync = existsSync;
   }
 });
+
+test('createJlptVocabularyLookup summarizes duplicate JLPT terms without per-entry log spam', async (t) => {
+  const logs: string[] = [];
+  const tempDir = createTempDir();
+  t.after(() => {
+    fs.rmSync(tempDir, { recursive: true, force: true });
+  });
+  fs.writeFileSync(
+    path.join(tempDir, 'term_meta_bank_1.json'),
+    JSON.stringify([
+      ['余り', 1, { frequency: { displayValue: 'N1' }, reading: 'あんまり' }],
+      ['私', 2, { frequency: { displayValue: 'N1' }, reading: 'あたし' }],
+    ]),
+  );
+  fs.writeFileSync(path.join(tempDir, 'term_meta_bank_2.json'), JSON.stringify([]));
+  fs.writeFileSync(path.join(tempDir, 'term_meta_bank_3.json'), JSON.stringify([]));
+  fs.writeFileSync(path.join(tempDir, 'term_meta_bank_4.json'), JSON.stringify([]));
+  fs.writeFileSync(
+    path.join(tempDir, 'term_meta_bank_5.json'),
+    JSON.stringify([
+      ['余り', 3, { frequency: { displayValue: 'N5' }, reading: 'あまり' }],
+      ['私', 4, { frequency: { displayValue: 'N5' }, reading: 'わたし' }],
+      ['私', 5, { frequency: { displayValue: 'N5' }, reading: 'わたくし' }],
+    ]),
+  );
+
+  const lookup = await createJlptVocabularyLookup({
+    searchPaths: [tempDir],
+    log: (message) => {
+      logs.push(message);
+    },
+  });
+
+  assert.equal(lookup('余り'), 'N1');
+  assert.equal(lookup('私'), 'N1');
+  assert.equal(
+    logs.some((entry) => entry.includes('keeping') && entry.includes('instead')),
+    false,
+  );
+  assert.equal(
+    logs.some((entry) => entry.includes('collapsed') && entry.includes('duplicate')),
+    true,
+  );
+});
jlpt-vocab.ts

@@ -22,10 +22,17 @@ const JLPT_LEVEL_PRECEDENCE: Record<JlptLevel, number> = {
   N4: 2,
   N5: 1,
 };
+const JLPT_DUPLICATE_LOG_EXAMPLE_LIMIT = 5;
 
 const NOOP_LOOKUP = (): null => null;
 const ENTRY_YIELD_INTERVAL = 5000;
 
+interface JlptDuplicateStats {
+  duplicateEntryCount: number;
+  duplicateTerms: Set<string>;
+  exampleTerms: string[];
+}
+
 function isErrorCode(error: unknown, code: string): boolean {
   return Boolean(error && typeof error === 'object' && (error as { code?: unknown }).code === code);
 }
@@ -47,11 +54,30 @@ function hasFrequencyDisplayValue(meta: unknown): boolean {
   return Object.prototype.hasOwnProperty.call(frequency as Record<string, unknown>, 'displayValue');
 }
 
+function createJlptDuplicateStats(): JlptDuplicateStats {
+  return {
+    duplicateEntryCount: 0,
+    duplicateTerms: new Set<string>(),
+    exampleTerms: [],
+  };
+}
+
+function recordJlptDuplicate(stats: JlptDuplicateStats, term: string): void {
+  stats.duplicateEntryCount += 1;
+  stats.duplicateTerms.add(term);
+  if (
+    stats.exampleTerms.length < JLPT_DUPLICATE_LOG_EXAMPLE_LIMIT &&
+    !stats.exampleTerms.includes(term)
+  ) {
+    stats.exampleTerms.push(term);
+  }
+}
+
 async function addEntriesToMap(
   rawEntries: unknown,
   level: JlptLevel,
   terms: Map<string, JlptLevel>,
-  log: (message: string) => void,
+  duplicateStats: JlptDuplicateStats,
 ): Promise<void> {
   const shouldUpdateLevel = (
     existingLevel: JlptLevel | undefined,
@@ -90,14 +116,13 @@ async function addEntriesToMap(
     }
 
     const existingLevel = terms.get(normalizedTerm);
-    if (shouldUpdateLevel(existingLevel, level)) {
-      terms.set(normalizedTerm, level);
-      continue;
+    if (existingLevel !== undefined) {
+      recordJlptDuplicate(duplicateStats, normalizedTerm);
     }
 
-    log(
-      `JLPT dictionary already has ${normalizedTerm} as ${existingLevel}; keeping that level instead of ${level}`,
-    );
+    if (shouldUpdateLevel(existingLevel, level)) {
+      terms.set(normalizedTerm, level);
+    }
   }
 }
@@ -106,6 +131,7 @@ async function collectDictionaryFromPath(
   log: (message: string) => void,
 ): Promise<Map<string, JlptLevel>> {
   const terms = new Map<string, JlptLevel>();
+  const duplicateStats = createJlptDuplicateStats();
 
   for (const bank of JLPT_BANK_FILES) {
     const bankPath = path.join(dictionaryPath, bank.filename);
@@ -146,12 +172,22 @@ async function collectDictionaryFromPath(
     }
 
     const beforeSize = terms.size;
-    await addEntriesToMap(rawEntries, bank.level, terms, log);
+    await addEntriesToMap(rawEntries, bank.level, terms, duplicateStats);
     if (terms.size === beforeSize) {
       log(`JLPT bank file contained no extractable entries: ${bankPath}`);
     }
   }
 
+  if (duplicateStats.duplicateEntryCount > 0) {
+    const examples =
+      duplicateStats.exampleTerms.length > 0
+        ? `; examples: ${duplicateStats.exampleTerms.join(', ')}`
+        : '';
+    log(
+      `JLPT dictionary collapsed ${duplicateStats.duplicateEntryCount} duplicate JLPT entries across ${duplicateStats.duplicateTerms.size} terms; keeping highest-precedence level per surface form${examples}`,
+    );
+  }
+
   return terms;
 }
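The summary-log shape this diff introduces can be illustrated generically: tally duplicate occurrences and distinct colliding terms, then emit one collapsed line instead of a line per duplicate. A sketch of the idea only (the repo does this in TypeScript; the awk pipeline below is an illustrative stand-in):

```shell
# Count how many entries are duplicates and how many distinct terms collide,
# then print a single summary line in place of per-entry log spam.
printf '%s\n' 余り 私 余り 私 私 |
awk '
  { count[$0]++ }
  END {
    for (t in count) if (count[t] > 1) { dups += count[t] - 1; terms++ }
    printf "collapsed %d duplicate entries across %d terms\n", dups, terms
  }
'
```

For the five entries above (余り twice, 私 three times) this prints one line reporting 3 duplicate entries across 2 terms, mirroring the `duplicateEntryCount`/`duplicateTerms` split in `JlptDuplicateStats`.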
src/release-workflow.test.ts

@@ -72,14 +72,18 @@ test('release workflow publishes subminer-bin to AUR from tagged release artifac
   assert.match(releaseWorkflow, /needs:\s*\[release\]/);
   assert.match(releaseWorkflow, /AUR_SSH_PRIVATE_KEY/);
   assert.match(releaseWorkflow, /ssh:\/\/aur@aur\.archlinux\.org\/subminer-bin\.git/);
-  assert.match(releaseWorkflow, /Install makepkg/);
   assert.match(releaseWorkflow, /scripts\/update-aur-package\.sh/);
+  assert.match(
+    releaseWorkflow,
+    /cp packaging\/aur\/subminer-bin\/\.SRCINFO aur-subminer-bin\/\.SRCINFO/,
+  );
   assert.match(releaseWorkflow, /version_no_v="\$\{\{ steps\.version\.outputs\.VERSION \}\}"/);
   assert.match(releaseWorkflow, /SubMiner-\$\{version_no_v\}\.AppImage/);
   assert.doesNotMatch(
     releaseWorkflow,
     /SubMiner-\$\{\{ steps\.version\.outputs\.VERSION \}\}\.AppImage/,
   );
+  assert.doesNotMatch(releaseWorkflow, /Install makepkg/);
 });
 
 test('release workflow skips empty AUR sync commits', () => {
vendor/subminer-yomitan (vendored submodule): updated 979a162904...0cee7435e8
vendor/texthooker-ui (vendored submodule): updated e8c7ae1122...96e8404130