Compare commits

42 Commits
v1.7.1 ... main

Author SHA1 Message Date
Kevin Veen-Birkenbach
3d5c770def Solved ruff F401
Some checks are pending
Mark stable commit / test-unit (push) Waiting to run
Mark stable commit / test-integration (push) Waiting to run
Mark stable commit / test-env-virtual (push) Waiting to run
Mark stable commit / test-env-nix (push) Waiting to run
Mark stable commit / test-e2e (push) Waiting to run
Mark stable commit / test-virgin-user (push) Waiting to run
Mark stable commit / test-virgin-root (push) Waiting to run
Mark stable commit / lint-shell (push) Waiting to run
Mark stable commit / lint-python (push) Waiting to run
Mark stable commit / mark-stable (push) Blocked by required conditions
2025-12-18 19:16:15 +01:00
Kevin Veen-Birkenbach
f4339a746a executed 'ruff format --check .'
Some checks failed
Mark stable commit / test-unit (push) Has been cancelled
Mark stable commit / test-integration (push) Has been cancelled
Mark stable commit / test-env-virtual (push) Has been cancelled
Mark stable commit / test-env-nix (push) Has been cancelled
Mark stable commit / test-e2e (push) Has been cancelled
Mark stable commit / test-virgin-user (push) Has been cancelled
Mark stable commit / test-virgin-root (push) Has been cancelled
Mark stable commit / lint-shell (push) Has been cancelled
Mark stable commit / lint-python (push) Has been cancelled
Mark stable commit / mark-stable (push) Has been cancelled
2025-12-18 14:04:44 +01:00
Kevin Veen-Birkenbach
763f02a9a4 Release version 1.8.6
Some checks failed
2025-12-17 23:50:31 +01:00
Kevin Veen-Birkenbach
2eec873a17 Solved Debian Bug
Some checks failed
https://chatgpt.com/share/69432655-a948-800f-8c0d-353921cdf644
2025-12-17 23:29:04 +01:00
Kevin Veen-Birkenbach
17ee947930 ci: pass NIX_CONFIG with GitHub token into all test containers
- Add NIX_CONFIG with GitHub access token to all CI test workflows
- Export NIX_CONFIG in Makefile for propagation to test scripts
- Forward NIX_CONFIG explicitly into all docker run invocations
- Prevent GitHub API rate limit errors during Nix-based tests

https://chatgpt.com/share/69432655-a948-800f-8c0d-353921cdf644
2025-12-17 23:29:04 +01:00
Kevin Veen-Birkenbach
b989bdd4eb Release version 1.8.5 2025-12-17 23:29:04 +01:00
Kevin Veen-Birkenbach
c4da8368d8 --- Release Error --- 2025-12-17 23:28:45 +01:00
Kevin Veen-Birkenbach
997c265cfb refactor(git): introduce GitRunError hierarchy, surface non-repo errors, and improve verification queries
Some checks failed
* Replace legacy GitError usage with a clearer exception hierarchy:

  * GitBaseError as the common root for all git-related failures
  * GitRunError for subprocess execution failures
  * GitQueryError for read-only query failures
  * GitCommandError for state-changing command failures
  * GitNotRepositoryError to explicitly signal “not a git repository” situations
* Update git runner to detect “not a git repository” stderr and raise GitNotRepositoryError with rich context (cwd, command, stderr)
* Refactor repository verification to use dedicated query helpers instead of ad-hoc subprocess calls:

  * get_remote_head_commit (ls-remote) for pull mode
  * get_head_commit for local mode
  * get_latest_signing_key (%GK) for signature verification
* Add strict vs best-effort behavior in verify_repository:

  * Best-effort collection for reporting (does not block when no verification config exists)
  * Strict retrieval and explicit error messages when verification is configured
  * Clear failure cases when commit/signing key cannot be determined
* Add new unit tests covering:

  * get_latest_signing_key output stripping and error wrapping
  * get_remote_head_commit parsing, empty output, and error wrapping
  * verify_repository success/failure scenarios and “do not swallow GitNotRepositoryError”
* Adjust imports and exception handling across actions/commands/queries to align with GitRunError-based handling while keeping GitNotRepositoryError uncaught for debugging clarity
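The hierarchy described above can be sketched in plain Python. This is a minimal sketch, not the actual pkgmgr code: the class names and the cwd/command/stderr context come from the commit text, but the constructor signatures are assumptions, and placing GitNotRepositoryError directly under GitBaseError (rather than under GitRunError) is inferred from the note that it stays uncaught by GitRunError-based handlers.

```python
class GitBaseError(Exception):
    """Common root for all git-related failures."""


class GitRunError(GitBaseError):
    """A git subprocess failed to execute."""

    def __init__(self, command, cwd, stderr):
        super().__init__(f"git {' '.join(command)} failed in {cwd}: {stderr}")
        self.command = command
        self.cwd = cwd
        self.stderr = stderr


class GitQueryError(GitBaseError):
    """A read-only git query failed."""


class GitCommandError(GitBaseError):
    """A state-changing git command failed."""


class GitNotRepositoryError(GitBaseError):
    """Raised when git reports 'not a git repository' (kept uncaught
    by GitRunError handlers for debugging clarity)."""

    def __init__(self, command, cwd, stderr):
        super().__init__(f"not a git repository: {cwd}")
        self.command = command
        self.cwd = cwd
        self.stderr = stderr
```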

https://chatgpt.com/share/6943173c-508c-800f-8879-af75d131c79b
2025-12-17 21:48:03 +01:00
Kevin Veen-Birkenbach
955028288f Release version 1.8.4
Some checks failed
2025-12-17 11:20:16 +01:00
Kevin Veen-Birkenbach
866572e252 ci(docker): fix repo mount path for pkgmgr as base layer of Infinito.Nexus
Some checks failed
Standardize Docker/CI/test environments to mount pkgmgr at /opt/src/pkgmgr.
This makes the layering explicit: pkgmgr is the lower-level foundation used by
Infinito.Nexus.

Infra-only change (Docker, CI, shell scripts). No runtime or Nix semantics changed.

https://chatgpt.com/share/69427fe7-e288-800f-90a4-c1c3c11a8484
2025-12-17 11:03:02 +01:00
Kevin Veen-Birkenbach
b0a733369e Optimized output for debugging
Some checks failed
2025-12-17 10:51:56 +01:00
Kevin Veen-Birkenbach
c5843ccd30 Release version 1.8.3
Some checks failed
2025-12-16 19:49:51 +01:00
Kevin Veen-Birkenbach
3cb7852cb4 feat(mirrors): support URL-only MIRRORS entries and keep git config clean
- Allow MIRRORS to contain plain URLs (one per line) in addition to legacy "NAME URL"
- Treat strings as single URLs to avoid iterable pitfalls
- Write PyPI URLs as metadata-only entries (never added to git config)
- Keep MIRRORS as the single source of truth for mirror setup
- Update integration test to assert URL-only MIRRORS output
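The two accepted entry formats can be distinguished per line. The function below is a hypothetical sketch of that parsing rule (the real parser in pkgmgr may use different names and return types): a line is treated as legacy "NAME URL" only when its first token does not itself look like a URL, otherwise the whole line is a URL-only entry.

```python
def parse_mirrors_line(line: str):
    """Return (name, url) for a MIRRORS line.

    Supports both the legacy "NAME URL" form and URL-only lines;
    URL-only entries carry no explicit name.
    """
    parts = line.strip().split(maxsplit=1)
    if len(parts) == 2 and "://" not in parts[0] and not parts[0].startswith("git@"):
        name, url = parts
        return name, url
    # Plain URL line (URL-only entry): the whole line is the URL.
    return None, line.strip()
```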

https://chatgpt.com/share/6941a9aa-b8b4-800f-963d-2486b34856b1
2025-12-16 19:49:09 +01:00
Kevin Veen-Birkenbach
f995e3d368 Release version 1.8.2
Some checks failed
2025-12-16 19:22:41 +01:00
Kevin Veen-Birkenbach
ffa9d9660a gpt-5.2 ChatGPT: refactor tools code into cli.tools.vscode and add unit tests
Some checks failed
* Move VS Code workspace logic (incl. guards) from cli.commands.tools into cli.tools.vscode
* Extract shared repo path resolution into cli.tools.paths and reuse for explore/terminal
* Simplify cli.commands.tools to pure orchestration via open_vscode_workspace
* Update existing tools command unit test to assert delegation instead of patching removed internals
* Add new unit tests for cli.tools.paths and cli.tools.vscode (workspace creation, reuse, guard errors)

https://chatgpt.com/share/69419a6a-c9e4-800f-9538-b6652b2da6b3
2025-12-16 18:43:56 +01:00
Kevin Veen-Birkenbach
be70dd4239 Release version 1.8.1
Some checks failed
2025-12-16 18:06:35 +01:00
Kevin Veen-Birkenbach
74876e2e15 Fixed ruff
Some checks failed
2025-12-16 18:00:56 +01:00
Kevin Veen-Birkenbach
54058c7f4d gpt-5.2 ChatGPT: integrate gh-based credential resolution with full integration test
Some checks failed
- Add GhTokenProvider to read GitHub tokens via `gh auth token`
- Extend TokenResolver policy: ENV → gh → keyring (validate) → prompt (overwrite)
- Introduce provider-specific token validation for GitHub
- Ensure invalid keyring tokens trigger interactive re-prompt and overwrite
- Add end-to-end integration test covering gh → keyring → prompt flow
- Clean up credentials package exports and documentation
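The ENV → gh → keyring (validate) → prompt (overwrite) policy can be sketched as a chain of providers. This is an illustrative sketch only: the injected callables (env_token, gh_token, keyring_get, keyring_set, validate, prompt) are hypothetical stand-ins for the actual TokenResolver collaborators.

```python
def resolve_github_token(env_token, gh_token, keyring_get, keyring_set, validate, prompt):
    """Resolve a GitHub token following ENV -> gh -> keyring -> prompt."""
    # 1. Environment variable wins outright.
    token = env_token()
    if token:
        return token
    # 2. Token from the gh CLI (`gh auth token`), if a login exists.
    token = gh_token()
    if token:
        return token
    # 3. Keyring entry, but only if it still validates against GitHub.
    token = keyring_get()
    if token and validate(token):
        return token
    # 4. Interactive prompt; overwrite the stale keyring entry.
    token = prompt()
    keyring_set(token)
    return token
```

An invalid keyring token thus falls through to the prompt, and the freshly entered token overwrites the stale entry, matching the re-prompt behavior described above.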

https://chatgpt.com/share/69418c81-6748-800f-8fec-616684746e3c
2025-12-16 17:44:44 +01:00
Kevin Veen-Birkenbach
8583fdf172 feat(mirror,create): make MIRRORS single source of truth and exclude PyPI from git config
Some checks failed
- Treat MIRRORS as the only authority for mirror URLs
- Filter non-git URLs (e.g. PyPI) from git remotes and push URLs
- Prefer SSH git URLs when determining primary origin
- Ensure mirror probing only targets valid git remotes
- Refactor repository create into service-based architecture
- Write PyPI metadata exclusively to MIRRORS, never to git config
- Add integration test verifying PyPI is not written into .git/config
- Update preview and unit tests to match new create flow
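The filtering and SSH preference described above can be sketched as two small helpers. The heuristics here (URL prefixes, excluding pypi.org hosts) are assumptions for illustration, not the actual pkgmgr rules.

```python
def is_git_url(url: str) -> bool:
    """Heuristic: may this URL be used as a git remote/push URL?"""
    if url.startswith(("git@", "ssh://")):
        return True
    if url.startswith(("https://", "http://")):
        return "pypi.org" not in url  # PyPI entries are metadata-only
    return False


def git_remote_urls(mirror_urls):
    """Keep only URLs eligible for git remotes; PyPI stays in MIRRORS."""
    return [u for u in mirror_urls if is_git_url(u)]


def primary_origin(mirror_urls):
    """Prefer an SSH git URL as the primary origin, else any git URL."""
    candidates = git_remote_urls(mirror_urls)
    ssh = [u for u in candidates if u.startswith(("git@", "ssh://"))]
    preferred = ssh or candidates
    return preferred[0] if preferred else None
```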

https://chatgpt.com/share/69415c61-1c5c-800f-86dd-0405edec25db
2025-12-16 14:19:19 +01:00
Kevin Veen-Birkenbach
374f4ed745 test(integration): move create repo preview test from e2e and mock git commands
Some checks failed
- Reclassify create repo preview test as integration test
- Rename test class to drop E2E naming
- Replace subprocess mock with core.git command mocks (init/add_all/commit)
- Patch get_config_value to avoid git config dependency

https://chatgpt.com/share/694150de-873c-800f-a01d-df3cc7ce25df
2025-12-16 13:30:19 +01:00
Kevin Veen-Birkenbach
63e1b3d145 core.git: add get_repo_root query and use it in repository scaffold
Some checks failed
https://chatgpt.com/share/69414f70-fc60-800f-ba6a-cbea426ea913
2025-12-16 13:23:36 +01:00
Kevin Veen-Birkenbach
2f89de1ff5 refactor(pull): switch repository pull to core.git commands
Some checks failed
- Replace raw subprocess git pull with core.git.commands.pull_args
- Remove shell-based command execution
- Add GitPullArgsError wrapper for consistent error handling
- Align unit tests to mock pull_args instead of subprocess.run
- Preserve verification and prompt logic

https://chatgpt.com/share/69414dc9-5b30-800f-88b2-bd27a873580b
2025-12-16 13:17:04 +01:00
Kevin Veen-Birkenbach
019aa4b0d9 refactor(git): migrate repository creation to core.git commands
Some checks failed
- Replace direct subprocess git calls with core.git commands (init, add_all, commit, branch_move, push_upstream)
- Introduce add_all, init, and branch_move command wrappers with preview support
- Use git config queries via get_config_value instead of shell access
- Preserve main → master fallback logic with explicit error handling
- Improve error transparency while keeping previous non-fatal behavior

https://chatgpt.com/share/69414b77-b4d4-800f-a189-463b489664b3
2025-12-16 13:05:42 +01:00
Kevin Veen-Birkenbach
9c22c7dbb4 refactor(git): introduce structured core.git command/query API and adapt actions & tests
Some checks failed
- Replace direct subprocess usage with core.git.run wrapper
- Introduce dedicated core.git.commands (add, commit, fetch, pull_ff_only, push, clone, tag_annotated, tag_force_annotated, etc.)
- Introduce core.git.queries (list_tags, get_upstream_ref, get_config_value, changelog helpers, etc.)
- Refactor release workflow and git_ops to use command/query split
- Implement semantic vX.Y.Z comparison with safe fallback for non-parseable tags
- Refactor repository clone logic to use core.git.commands.clone with preview support and ssh→https fallback
- Remove legacy run_git_command helpers
- Split and update unit tests to mock command/query boundaries instead of subprocess
- Add comprehensive tests for clone modes, preview behavior, ssh→https fallback, and verification prompts
- Add unit tests for core.git.run error handling and preview mode
- Align public exports (__all__) with new structure
- Improve type hints, docstrings, and error specificity across git helpers
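A strict run() wrapper with preview support, in the spirit of the core.git.run described above, could look like the following. The signature and preview output format are assumptions; only the general shape (build argv, short-circuit in preview mode, raise on nonzero exit) reflects the commit text.

```python
import subprocess


class GitRunError(Exception):
    """A git subprocess exited nonzero."""


def run(args, cwd=".", preview=False):
    """Run `git <args>` in cwd; raise GitRunError on any failure.

    In preview mode the command is only printed, never executed.
    """
    cmd = ["git", *args]
    if preview:
        print("[preview]", " ".join(cmd))
        return ""
    result = subprocess.run(cmd, cwd=cwd, capture_output=True, text=True)
    if result.returncode != 0:
        raise GitRunError(f"{' '.join(cmd)} failed: {result.stderr.strip()}")
    return result.stdout
```

Commands and queries then call run() and wrap GitRunError into their own, more specific exception types.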

https://chatgpt.com/share/69414735-51d4-800f-bc7b-4b90e35f71e5
2025-12-16 12:49:03 +01:00
Kevin Veen-Birkenbach
f83e192e37 refactor(release/git): replace shell git calls with command/query helpers
Some checks failed
- Remove legacy shell-based git helpers from release workflow
- Introduce typed git command wrappers (add, commit, fetch, pull_ff_only, push, tag*)
- Add git queries for upstream detection and tag listing
- Refactor release workflow to use core git commands consistently
- Implement semantic vX.Y.Z tag comparison without external sort
- Ensure prerelease tags (e.g. -rc) do not outrank final releases
- Split and update unit tests to match new command/query architecture
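The vX.Y.Z comparison without external sort, including the prerelease rule, can be sketched as a sort key. This is an illustrative sketch (names and the exact fallback tuple are assumptions): unparseable tags sort lowest, and a final release outranks any prerelease of the same version.

```python
import re

_TAG_RE = re.compile(r"^v(\d+)\.(\d+)\.(\d+)(?:-(.+))?$")


def tag_sort_key(tag: str):
    m = _TAG_RE.match(tag)
    if not m:
        # Safe fallback: tags that don't parse sort before any real version.
        return (0, 0, 0, 0, "")
    major, minor, patch = (int(x) for x in m.group(1, 2, 3))
    prerelease = m.group(4)
    # A final release (no suffix) outranks any -rc/-beta prerelease.
    return (1, major, minor, patch) + ((1, "") if prerelease is None else (0, prerelease))


def latest_tag(tags):
    return max(tags, key=tag_sort_key) if tags else None
```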
2025-12-16 12:30:36 +01:00
Kevin Veen-Birkenbach
486863eb58 Solved ruff linting hints
Some checks failed
2025-12-16 12:04:16 +01:00
Kevin Veen-Birkenbach
bb23bd94f2 refactor(git): add get_latest_commit query and remove subprocess usage
Some checks failed
- Introduce core.git query get_latest_commit()
- Refactor config init to use git query instead of subprocess
- Fix __future__ import order in core.git package
- Export new query via core.git.queries API

https://chatgpt.com/share/69413c3e-3bcc-800f-b3b0-a3bf3b7bb875
2025-12-16 12:02:09 +01:00
Kevin Veen-Birkenbach
2a66c082eb gpt-5.2 ChatGPT: move git config lookup into core.git query
Some checks failed
- Replace inline `git config --get` subprocess usage in release/files.py
  with core.git.queries.get_config_value()
- Keep core.git.run() strict; interpret exit code 1 for missing config keys
  at the query layer
- Export get_config_value via core.git.queries
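The query-layer behavior described above (strict runner, exit code 1 interpreted as "key not set") can be sketched as follows. The runner parameter is injected here purely for illustration; the real get_config_value sits on top of core.git.run and its signature is an assumption.

```python
import subprocess


def get_config_value(key: str, cwd: str = ".", runner=subprocess.run):
    """Return the value of `git config --get <key>`, or None if unset.

    `git config --get` exits 1 when the key is missing; only other
    nonzero exit codes are treated as real failures.
    """
    result = runner(
        ["git", "config", "--get", key],
        cwd=cwd, capture_output=True, text=True,
    )
    if result.returncode == 1:
        return None  # key not set: not an error at the query layer
    if result.returncode != 0:
        raise RuntimeError(f"git config --get {key} failed: {result.stderr.strip()}")
    return result.stdout.strip()
```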

https://chatgpt.com/share/69413aef-9814-800f-a9c3-e98666a4204a
2025-12-16 11:56:24 +01:00
Kevin Veen-Birkenbach
ee9d7758ed Solved ruff linting hints
Some checks failed
2025-12-16 11:42:40 +01:00
Kevin Veen-Birkenbach
0119af330f gpt-5.2: fix tests and imports after git queries split
Some checks failed
https://chatgpt.com/share/694135eb-10a8-800f-8b12-968612f605c7

Gemini
https://ai.studio/apps/drive/1QO9MaEklm2zZMDZ6XPP0LStuAooXs1NO
2025-12-16 11:35:10 +01:00
Kevin Veen-Birkenbach
e117115b7f gpt-5.2 ChatGPT: adapt tests to new core.git commands/queries split
- Update mirror integration tests to use probe_remote_reachable
- Refactor branch action tests to mock git command helpers instead of run_git
- Align changelog tests with get_changelog query API
- Update git core tests to cover run() and query helpers
- Remove legacy run_git assumptions from tests

https://chatgpt.com/share/69412008-9e8c-800f-9ac9-90f390d55380

**Validated by Google's model.**

**Summary:**
The test modifications have been correctly implemented to cover the Git refactoring changes:

1.  **Granular Mocking:** The tests have shifted from mocking the monolithic `run_git` or `subprocess` to mocking the new, specific wrapper functions (e.g., `pkgmgr.core.git.commands.fetch`, `pkgmgr.core.git.queries.probe_remote_reachable`). This accurately reflects the architectural change in the source code where business logic now relies on these granular imports.
2.  **Structural Alignment:** The test directory structure was updated (e.g., moving tests to `tests/unit/pkgmgr/core/git/queries/`) to match the new source code organization, ensuring logical consistency.
3.  **Exception Handling:** The tests were updated to verify specific exception types (like `GitDeleteRemoteBranchError`) rather than generic errors, ensuring the improved error granularity is correctly handled by the CLI.
4.  **Integration Safety:** The integration tests in `test_mirror_commands.py` were correctly updated to patch the new query paths, ensuring that network operations remain disabled during testing.

The test changes are consistent with the refactor and provide complete coverage for the new code structure.
https://aistudio.google.com/app/prompts?state=%7B%22ids%22:%5B%2214Br1JG1hxuntmoRzuvme3GKUvQ0heqRn%22%5D,%22action%22:%22open%22,%22userId%22:%22109171005420801378245%22,%22resourceKeys%22:%7B%7D%7D&usp=sharing
2025-12-16 10:01:30 +01:00
Kevin Veen-Birkenbach
755b78fcb7 refactor(git): split git helpers into run/commands/queries and update branch, mirror and changelog actions
https://chatgpt.com/share/69411b4a-fcf8-800f-843d-61c913f388eb
2025-12-16 09:41:35 +01:00
Kevin Veen-Birkenbach
9485bc9e3f Release version 1.8.0
Some checks failed
2025-12-15 13:37:42 +01:00
Kevin Veen-Birkenbach
dcda23435d feat(update): add --silent mode with continue-on-failure and unified summary
Some checks failed
- Introduce --silent flag for install/update to downgrade per-repo errors to warnings
- Continue processing remaining repositories on pull/install failures
- Emit a single summary at the end (suppress per-repo summaries during update)
- Preserve interactive verification behavior when not silent
- Add integration test covering silent vs non-silent update behavior
- Update e2e tests to use --silent for stability
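The --silent behavior described above (per-repo errors downgraded to warnings, processing continues, a single summary at the end) can be sketched like this. All names here are hypothetical; only the control flow mirrors the commit text.

```python
def update_all(repos, update_one, silent=False):
    """Update each repo; in silent mode, collect failures instead of aborting."""
    failures = []
    for repo in repos:
        try:
            update_one(repo)
        except Exception as exc:
            if not silent:
                raise  # keep the original fail-fast behavior
            failures.append((repo, exc))
            print(f"WARNING: {repo}: {exc}")
    # One unified summary at the end instead of per-repo summaries.
    print(f"Updated {len(repos) - len(failures)}/{len(repos)} repositories; "
          f"{len(failures)} failed.")
    return failures
```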

https://chatgpt.com/share/693ffcca-f680-800f-9f95-9d8c52a9a678
2025-12-15 13:19:14 +01:00
Kevin Veen-Birkenbach
a69e81c44b fix(dependencies): install python-pip for all supported distributions
Some checks failed
- Added python-pip for Arch, python3-pip for CentOS, Debian, Fedora, and Ubuntu.
- Ensures that pip is available for Python package installations across systems.

https://chatgpt.com/share/693fedab-69ac-800f-a8f9-19d504787565
2025-12-15 12:14:48 +01:00
Kevin Veen-Birkenbach
2ca004d056 fix(arch/dependencies): initialize pacman keyring before package installation
- Added pacman-key initialization to ensure keyring is properly set up before installing packages.
- This prevents errors related to missing secret keys during package signing.
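The guarded initialization this commit describes can be sketched as a small shell function; this is a sketch of the approach, and the shipped script may differ.

```shell
# Sketch: initialize the pacman keyring only if it is not usable yet,
# so package signature verification has the keys it needs (Arch Linux).
ensure_pacman_keyring() {
  if ! pacman-key --list-sigs &>/dev/null; then
    echo "Initializing pacman keyring..."
    pacman-key --init                # create the local keyring
    pacman-key --populate archlinux  # import Arch's master signing keys
  fi
}
```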

https://chatgpt.com/share/693fddec-3800-800f-9ad8-6f2d3cd90cc6
2025-12-15 11:07:31 +01:00
Kevin Veen-Birkenbach
f7bd5bfd0b Optimized linters and solved linting bugs
2025-12-15 11:00:17 +01:00
Kevin Veen-Birkenbach
2c15a4016b feat(create): scaffold repositories via templates with preview and mirror setup
https://chatgpt.com/share/693f5bdb-1780-800f-a772-0ecf399627fc
2025-12-15 01:52:38 +01:00
Kevin Veen-Birkenbach
9e3ce34626 Release version 1.7.2
2025-12-15 00:53:26 +01:00
Kevin Veen-Birkenbach
1a13fcaa4e refactor(mirror): enforce primary origin URL and align mirror resolution logic
- Resolve primary remote via RepoMirrorContext (origin → file order → config → default)
- Always set origin fetch and push URL to primary
- Add additional mirrors as extra push URLs without duplication
- Update remote provisioning and setup commands to use context-based resolution
- Adjust and extend unit tests to cover new origin/push behavior
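The resulting remote layout can be sketched with plain git commands. The URLs below are illustrative examples, not the project's real remotes.

```shell
# Illustrative: enforce a primary URL on origin for fetch and push, then
# add a mirror as an extra push target without duplicating it.
set -eu
repo="$(mktemp -d)"
git init -q "$repo"
cd "$repo"

PRIMARY="https://example.com/kevin/pkgmgr.git"
MIRROR="https://mirror.example.org/kevin/pkgmgr.git"

git remote add origin "$PRIMARY"
# Fetch and push both point at the primary...
git remote set-url origin "$PRIMARY"
git remote set-url --push origin "$PRIMARY"
# ...and the mirror becomes an additional push URL, skipped if present.
git remote get-url --push --all origin | grep -qx "$MIRROR" \
  || git remote set-url --add --push origin "$MIRROR"

git remote get-url --push --all origin
# lists the primary push URL first, then the mirror
```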

https://chatgpt.com/share/693f4538-42d4-800f-98c2-2ec264fd2e19
2025-12-15 00:16:04 +01:00
Kevin Veen-Birkenbach
48a0d1d458 feat(release): auto-run publish after release with --no-publish opt-out
- Run publish automatically after successful release
- Add --no-publish flag to disable auto-publish
- Respect TTY for interactive/credential prompts
- Harden repo directory resolution
- Add integration and unit tests for release→publish hook

https://chatgpt.com/share/693f335b-b820-800f-8666-68355f3c938f
2025-12-14 22:59:43 +01:00
Kevin Veen-Birkenbach
783d2b921a fix(publish): store PyPI token per user
https://chatgpt.com/share/693f2e20-b94c-800f-9d8e-0c88187f7be6
2025-12-14 22:37:28 +01:00
272 changed files with 6402 additions and 2558 deletions

View File

@@ -28,8 +28,8 @@ jobs:
   test-virgin-root:
     uses: ./.github/workflows/test-virgin-root.yml
-  linter-shell:
-    uses: ./.github/workflows/linter-shell.yml
-  linter-python:
-    uses: ./.github/workflows/linter-python.yml
+  lint-shell:
+    uses: ./.github/workflows/lint-shell.yml
+  lint-python:
+    uses: ./.github/workflows/lint-python.yml

View File

@@ -4,7 +4,7 @@ on:
   workflow_call:
 jobs:
-  linter-python:
+  lint-python:
     runs-on: ubuntu-latest
     steps:

View File

@@ -4,7 +4,7 @@ on:
   workflow_call:
 jobs:
-  linter-shell:
+  lint-shell:
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v4

View File

@@ -29,16 +29,16 @@ jobs:
   test-virgin-root:
     uses: ./.github/workflows/test-virgin-root.yml
-  linter-shell:
-    uses: ./.github/workflows/linter-shell.yml
-  linter-python:
-    uses: ./.github/workflows/linter-python.yml
+  lint-shell:
+    uses: ./.github/workflows/lint-shell.yml
+  lint-python:
+    uses: ./.github/workflows/lint-python.yml
   mark-stable:
     needs:
-      - linter-shell
-      - linter-python
+      - lint-shell
+      - lint-python
       - test-unit
       - test-integration
       - test-env-nix

View File

@@ -11,7 +11,9 @@ jobs:
       fail-fast: false
       matrix:
         distro: [arch, debian, ubuntu, fedora, centos]
+    env:
+      NIX_CONFIG: |
+        access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4
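The `NIX_CONFIG` env block added in these workflow hunks feeds the job's GitHub token to Nix, so fetches of `github:` flake inputs are authenticated instead of hitting anonymous API rate limits. Roughly the same effect locally (the token value below is a placeholder, not a real credential):

```shell
# Local equivalent of the added NIX_CONFIG env block.
# The token is a placeholder; substitute a real GitHub token.
export NIX_CONFIG='access-tokens = github.com=ghp_placeholder_token'

# With this set, commands like `nix run github:owner/repo#app`
# authenticate against GitHub instead of fetching anonymously.
printf '%s\n' "$NIX_CONFIG"
```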

View File

@@ -12,7 +12,9 @@ jobs:
       fail-fast: false
       matrix:
         distro: [arch, debian, ubuntu, fedora, centos]
+    env:
+      NIX_CONFIG: |
+        access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4

View File

@@ -11,7 +11,9 @@ jobs:
       fail-fast: false
       matrix:
         distro: [arch, debian, ubuntu, fedora, centos]
+    env:
+      NIX_CONFIG: |
+        access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
     steps:
      - name: Checkout repository
         uses: actions/checkout@v4

View File

@@ -7,7 +7,9 @@ jobs:
   test-integration:
     runs-on: ubuntu-latest
     timeout-minutes: 30
+    env:
+      NIX_CONFIG: |
+        access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4

View File

@@ -7,7 +7,9 @@ jobs:
   test-unit:
     runs-on: ubuntu-latest
     timeout-minutes: 30
+    env:
+      NIX_CONFIG: |
+        access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4

View File

@@ -11,7 +11,9 @@ jobs:
       fail-fast: false
       matrix:
         distro: [arch, debian, ubuntu, fedora, centos]
+    env:
+      NIX_CONFIG: |
+        access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4
@@ -19,27 +21,26 @@ jobs:
       - name: Show Docker version
         run: docker version
-      # 🔹 BUILD virgin image if missing
       - name: Build virgin container (${{ matrix.distro }})
         run: |
           set -euo pipefail
           PKGMGR_DISTRO="${{ matrix.distro }}" make build-missing-virgin
-      # 🔹 RUN test inside virgin image
       - name: Virgin ${{ matrix.distro }} pkgmgr test (root)
         run: |
           set -euo pipefail
           docker run --rm \
-            -v "$PWD":/src \
+            -v "$PWD":/opt/src/pkgmgr \
             -v pkgmgr_repos:/root/Repositories \
             -v pkgmgr_pip_cache:/root/.cache/pip \
-            -w /src \
+            -e NIX_CONFIG="${NIX_CONFIG}" \
+            -w /opt/src/pkgmgr \
             "pkgmgr-${{ matrix.distro }}-virgin" \
             bash -lc '
               set -euo pipefail
-              git config --global --add safe.directory /src
+              git config --global --add safe.directory /opt/src/pkgmgr
               make install
               make setup
@@ -50,5 +51,5 @@ jobs:
               pkgmgr version pkgmgr
               echo ">>> Running Nix-based: nix run .#pkgmgr -- version pkgmgr"
-              nix run /src#pkgmgr -- version pkgmgr
+              nix run /opt/src/pkgmgr#pkgmgr -- version pkgmgr
             '

View File

@@ -11,7 +11,9 @@ jobs:
       fail-fast: false
       matrix:
         distro: [arch, debian, ubuntu, fedora, centos]
+    env:
+      NIX_CONFIG: |
+        access-tokens = github.com=${{ secrets.GITHUB_TOKEN }}
     steps:
       - name: Checkout repository
         uses: actions/checkout@v4
@@ -19,20 +21,19 @@ jobs:
       - name: Show Docker version
         run: docker version
-      # 🔹 BUILD virgin image if missing
       - name: Build virgin container (${{ matrix.distro }})
         run: |
           set -euo pipefail
           PKGMGR_DISTRO="${{ matrix.distro }}" make build-missing-virgin
-      # 🔹 RUN test inside virgin image as non-root
       - name: Virgin ${{ matrix.distro }} pkgmgr test (user)
         run: |
           set -euo pipefail
           docker run --rm \
-            -v "$PWD":/src \
-            -w /src \
+            -v "$PWD":/opt/src/pkgmgr \
+            -e NIX_CONFIG="${NIX_CONFIG}" \
+            -w /opt/src/pkgmgr \
             "pkgmgr-${{ matrix.distro }}-virgin" \
             bash -lc '
               set -euo pipefail
@@ -42,7 +43,7 @@ jobs:
               useradd -m dev
               echo "dev ALL=(ALL) NOPASSWD: ALL" > /etc/sudoers.d/dev
               chmod 0440 /etc/sudoers.d/dev
-              chown -R dev:dev /src
+              chown -R dev:dev /opt/src/pkgmgr
               mkdir -p /nix/store /nix/var/nix /nix/var/log/nix /nix/var/nix/profiles
               chown -R dev:dev /nix
@@ -51,7 +52,7 @@ jobs:
               sudo -H -u dev env HOME=/home/dev PKGMGR_DISABLE_NIX_FLAKE_INSTALLER=1 bash -lc "
                 set -euo pipefail
-                cd /src
+                cd /opt/src/pkgmgr
                 make setup-venv
                 . \"\$HOME/.venvs/pkgmgr/bin/activate\"
@@ -59,6 +60,6 @@ jobs:
                 pkgmgr version pkgmgr
                 export NIX_REMOTE=local
-                nix run /src#pkgmgr -- version pkgmgr
+                nix run /opt/src/pkgmgr#pkgmgr -- version pkgmgr
               "
             '

View File

@@ -1,3 +1,63 @@
+## [1.8.6] - 2025-12-17
+* Prevent Rate Limits during GitHub Nix Setups
+
+## [1.8.5] - 2025-12-17
+* * Clearer Git error handling, especially when a directory is not a Git repository.
+* More reliable repository verification with improved commit and GPG signature checks.
+* Better error messages and overall robustness when working with Git-based workflows.
+
+## [1.9.0] - 2025-12-17
+* Automated release.
+
+## [1.8.4] - 2025-12-17
+* * Made pkgmgrs base-layer role explicit by standardizing the Docker/CI mount path to *`/opt/src/pkgmgr`*.
+
+## [1.8.3] - 2025-12-16
+* MIRRORS now supports plain URL entries, ensuring metadata-only sources like PyPI are recorded without ever being added to the Git configuration.
+
+## [1.8.2] - 2025-12-16
+* * ***pkgmgr tools code*** is more robust and predictable: it now fails early with clear errors if VS Code is not installed or a repository is not yet identified.
+
+## [1.8.1] - 2025-12-16
+* * Improved stability and consistency of all Git operations (clone, pull, push, release, branch handling) with clearer error messages and predictable preview behavior.
+* Mirrors are now handled cleanly: only valid Git remotes are used for Git operations, while non-Git URLs (e.g. PyPI) are excluded, preventing broken or confusing repository configs.
+* GitHub authentication is more robust: tokens are automatically resolved via the GitHub CLI (`gh`), invalid stored tokens are replaced, and interactive prompts occur only when necessary.
+* Repository creation and release workflows are more reliable, producing cleaner Git configurations and more predictable version handling.
+
+## [1.8.0] - 2025-12-15
+* *** New Features: ***
+- **Silent Updates**: You can now use the `--silent` flag during installs and updates to suppress error messages for individual repositories and get a single summary at the end. This ensures the process continues even if some repositories fail, while still preserving interactive checks when not in silent mode.
+- **Repository Scaffolding**: The process for creating new repositories has been improved. You can now use templates to scaffold repositories with a preview and automatic mirror setup.
+*** Bug Fixes: ***
+- **Pip Installation**: Pip is now installed automatically on all supported systems. This includes `python-pip` for Arch and `python3-pip` for CentOS, Debian, Fedora, and Ubuntu, ensuring that pip is available for Python package installations.
+- **Pacman Keyring**: Fixed an issue on Arch Linux where package installation would fail due to missing keys. The pacman keyring is now properly initialized before installing packages.
+
+## [1.7.2] - 2025-12-15
+* * Git mirrors are now resolved consistently (origin → MIRRORS file → config → default).
+* The `origin` remote is always enforced to use the primary URL for both fetch and push.
+* Additional mirrors are added as extra push targets without duplication.
+* Local and remote mirror setup behaves more predictably and consistently.
+* Improved test coverage ensures stable origin and push URL handling.
+
 ## [1.7.1] - 2025-12-14
 * Patched package-manager to kpmx to publish on pypi

View File

@@ -50,6 +50,6 @@ RUN set -euo pipefail; \
 # Entry point
 COPY scripts/docker/entry.sh /usr/local/bin/docker-entry.sh
-WORKDIR /src
+WORKDIR /opt/src/pkgmgr
 ENTRYPOINT ["/usr/local/bin/docker-entry.sh"]
 CMD ["pkgmgr", "--help"]

View File

@@ -10,6 +10,10 @@ DISTROS ?= arch debian ubuntu fedora centos
 PKGMGR_DISTRO ?= arch
 export PKGMGR_DISTRO
+
+# Nix Config Variable (To avoid rate limit)
+NIX_CONFIG ?=
+export NIX_CONFIG
 # ------------------------------------------------------------
 # Base images
 # (kept for documentation/reference; actual build logic is in scripts/build)

View File

@@ -32,7 +32,7 @@
     rec {
       pkgmgr = pyPkgs.buildPythonApplication {
         pname = "package-manager";
-        version = "1.7.1";
+        version = "1.8.6";
         # Use the git repo as source
         src = ./.;
@@ -49,6 +49,7 @@
         # Runtime dependencies (matches [project.dependencies] in pyproject.toml)
         propagatedBuildInputs = [
           pyPkgs.pyyaml
+          pyPkgs.jinja2
           pyPkgs.pip
         ];
@@ -78,6 +79,7 @@
       pythonWithDeps = python.withPackages (ps: [
         ps.pip
         ps.pyyaml
+        ps.jinja2
       ]);
     in
     {

View File

@@ -1,7 +1,7 @@
 # Maintainer: Kevin Veen-Birkenbach <info@veen.world>
 pkgname=package-manager
-pkgver=1.7.1
+pkgver=1.8.6
 pkgrel=1
 pkgdesc="Local-flake wrapper for Kevin's package-manager (Nix-based)."
 arch=('any')

View File

@@ -1,3 +1,72 @@
package-manager (1.8.6-1) unstable; urgency=medium
* Prevent Rate Limits during GitHub Nix Setups
-- Kevin Veen-Birkenbach <kevin@veen.world> Wed, 17 Dec 2025 23:50:31 +0100
package-manager (1.8.5-1) unstable; urgency=medium
* * Clearer Git error handling, especially when a directory is not a Git repository.
* More reliable repository verification with improved commit and GPG signature checks.
* Better error messages and overall robustness when working with Git-based workflows.
-- Kevin Veen-Birkenbach <kevin@veen.world> Wed, 17 Dec 2025 22:15:48 +0100
package-manager (1.9.0-1) unstable; urgency=medium
* Automated release.
-- Kevin Veen-Birkenbach <kevin@veen.world> Wed, 17 Dec 2025 22:10:31 +0100
package-manager (1.8.4-1) unstable; urgency=medium
* * Made pkgmgrs base-layer role explicit by standardizing the Docker/CI mount path to *`/opt/src/pkgmgr`*.
-- Kevin Veen-Birkenbach <kevin@veen.world> Wed, 17 Dec 2025 11:20:16 +0100
package-manager (1.8.3-1) unstable; urgency=medium
* MIRRORS now supports plain URL entries, ensuring metadata-only sources like PyPI are recorded without ever being added to the Git configuration.
-- Kevin Veen-Birkenbach <kevin@veen.world> Tue, 16 Dec 2025 19:49:51 +0100
package-manager (1.8.2-1) unstable; urgency=medium
* * ***pkgmgr tools code*** is more robust and predictable: it now fails early with clear errors if VS Code is not installed or a repository is not yet identified.
-- Kevin Veen-Birkenbach <kevin@veen.world> Tue, 16 Dec 2025 19:22:41 +0100
package-manager (1.8.1-1) unstable; urgency=medium
* * Improved stability and consistency of all Git operations (clone, pull, push, release, branch handling) with clearer error messages and predictable preview behavior.
* Mirrors are now handled cleanly: only valid Git remotes are used for Git operations, while non-Git URLs (e.g. PyPI) are excluded, preventing broken or confusing repository configs.
* GitHub authentication is more robust: tokens are automatically resolved via the GitHub CLI (`gh`), invalid stored tokens are replaced, and interactive prompts occur only when necessary.
* Repository creation and release workflows are more reliable, producing cleaner Git configurations and more predictable version handling.
-- Kevin Veen-Birkenbach <kevin@veen.world> Tue, 16 Dec 2025 18:06:35 +0100
package-manager (1.8.0-1) unstable; urgency=medium
* *** New Features: ***
- **Silent Updates**: You can now use the `--silent` flag during installs and updates to suppress error messages for individual repositories and get a single summary at the end. This ensures the process continues even if some repositories fail, while still preserving interactive checks when not in silent mode.
- **Repository Scaffolding**: The process for creating new repositories has been improved. You can now use templates to scaffold repositories with a preview and automatic mirror setup.
*** Bug Fixes: ***
- **Pip Installation**: Pip is now installed automatically on all supported systems. This includes `python-pip` for Arch and `python3-pip` for CentOS, Debian, Fedora, and Ubuntu, ensuring that pip is available for Python package installations.
- **Pacman Keyring**: Fixed an issue on Arch Linux where package installation would fail due to missing keys. The pacman keyring is now properly initialized before installing packages.
-- Kevin Veen-Birkenbach <kevin@veen.world> Mon, 15 Dec 2025 13:37:42 +0100
package-manager (1.7.2-1) unstable; urgency=medium
* * Git mirrors are now resolved consistently (origin → MIRRORS file → config → default).
* The `origin` remote is always enforced to use the primary URL for both fetch and push.
* Additional mirrors are added as extra push targets without duplication.
* Local and remote mirror setup behaves more predictably and consistently.
* Improved test coverage ensures stable origin and push URL handling.
-- Kevin Veen-Birkenbach <kevin@veen.world> Mon, 15 Dec 2025 00:53:26 +0100
package-manager (1.7.1-1) unstable; urgency=medium package-manager (1.7.1-1) unstable; urgency=medium
* Patched package-manager to kpmx to publish on pypi * Patched package-manager to kpmx to publish on pypi

View File

@@ -1,5 +1,5 @@
 Name: package-manager
-Version: 1.7.1
+Version: 1.8.6
 Release: 1%{?dist}
 Summary: Wrapper that runs Kevin's package-manager via Nix flake
@@ -74,6 +74,48 @@ echo ">>> package-manager removed. Nix itself was not removed."
 /usr/lib/package-manager/
 
 %changelog
+* Wed Dec 17 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.8.6-1
+- Prevent Rate Limits during GitHub Nix Setups
+
+* Wed Dec 17 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.8.5-1
+- * Clearer Git error handling, especially when a directory is not a Git repository.
+* More reliable repository verification with improved commit and GPG signature checks.
+* Better error messages and overall robustness when working with Git-based workflows.
+
+* Wed Dec 17 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.9.0-1
+- Automated release.
+
+* Wed Dec 17 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.8.4-1
+- * Made pkgmgrs base-layer role explicit by standardizing the Docker/CI mount path to *`/opt/src/pkgmgr`*.
+
+* Tue Dec 16 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.8.3-1
+- MIRRORS now supports plain URL entries, ensuring metadata-only sources like PyPI are recorded without ever being added to the Git configuration.
+
+* Tue Dec 16 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.8.2-1
+- * ***pkgmgr tools code*** is more robust and predictable: it now fails early with clear errors if VS Code is not installed or a repository is not yet identified.
+
+* Tue Dec 16 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.8.1-1
+- * Improved stability and consistency of all Git operations (clone, pull, push, release, branch handling) with clearer error messages and predictable preview behavior.
+* Mirrors are now handled cleanly: only valid Git remotes are used for Git operations, while non-Git URLs (e.g. PyPI) are excluded, preventing broken or confusing repository configs.
+* GitHub authentication is more robust: tokens are automatically resolved via the GitHub CLI (`gh`), invalid stored tokens are replaced, and interactive prompts occur only when necessary.
+* Repository creation and release workflows are more reliable, producing cleaner Git configurations and more predictable version handling.
+
+* Mon Dec 15 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.8.0-1
+- *** New Features: ***
+- **Silent Updates**: You can now use the `--silent` flag during installs and updates to suppress error messages for individual repositories and get a single summary at the end. This ensures the process continues even if some repositories fail, while still preserving interactive checks when not in silent mode.
+- **Repository Scaffolding**: The process for creating new repositories has been improved. You can now use templates to scaffold repositories with a preview and automatic mirror setup.
+*** Bug Fixes: ***
+- **Pip Installation**: Pip is now installed automatically on all supported systems. This includes `python-pip` for Arch and `python3-pip` for CentOS, Debian, Fedora, and Ubuntu, ensuring that pip is available for Python package installations.
+- **Pacman Keyring**: Fixed an issue on Arch Linux where package installation would fail due to missing keys. The pacman keyring is now properly initialized before installing packages.
+
+* Mon Dec 15 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.7.2-1
+- * Git mirrors are now resolved consistently (origin → MIRRORS file → config → default).
+* The `origin` remote is always enforced to use the primary URL for both fetch and push.
+* Additional mirrors are added as extra push targets without duplication.
+* Local and remote mirror setup behaves more predictably and consistently.
+* Improved test coverage ensures stable origin and push URL handling.
+
 * Sun Dec 14 2025 Kevin Veen-Birkenbach <kevin@veen.world> - 1.7.1-1
 - Patched package-manager to kpmx to publish on pypi

View File

@@ -7,7 +7,7 @@ build-backend = "setuptools.build_meta"
 [project]
 name = "kpmx"
-version = "1.7.1"
+version = "1.8.6"
 description = "Kevin's package-manager tool (pkgmgr)"
 readme = "README.md"
 requires-python = ">=3.9"
@@ -21,6 +21,7 @@ authors = [
 dependencies = [
     "PyYAML>=6.0",
     "tomli; python_version < \"3.11\"",
+    "jinja2>=3.1"
 ]
 
 [project.urls]

View File

@@ -1,7 +1,7 @@
 #!/usr/bin/env bash
 set -euo pipefail
-echo "[docker] Starting package-manager container"
+echo "[docker-pkgmgr] Starting package-manager container"
 # ---------------------------------------------------------------------------
 # Log distribution info
@@ -9,19 +9,19 @@
 if [[ -f /etc/os-release ]]; then
   # shellcheck disable=SC1091
   . /etc/os-release
-  echo "[docker] Detected distro: ${ID:-unknown} (like: ${ID_LIKE:-})"
+  echo "[docker-pkgmgr] Detected distro: ${ID:-unknown} (like: ${ID_LIKE:-})"
 fi
 
-# Always use /src (mounted from host) as working directory
-echo "[docker] Using /src as working directory"
-cd /src
+# Always use /opt/src/pkgmgr (mounted from host) as working directory
+echo "[docker-pkgmgr] Using /opt/src/pkgmgr as working directory"
+cd /opt/src/pkgmgr
 
 # ---------------------------------------------------------------------------
-# DEV mode: rebuild package-manager from the mounted /src tree
+# DEV mode: rebuild package-manager from the mounted /opt/src/pkgmgr tree
 # ---------------------------------------------------------------------------
 if [[ "${REINSTALL_PKGMGR:-0}" == "1" ]]; then
-  echo "[docker] DEV mode enabled (REINSTALL_PKGMGR=1)"
-  echo "[docker] Rebuilding package-manager from /src via scripts/installation/package.sh..."
+  echo "[docker-pkgmgr] DEV mode enabled (REINSTALL_PKGMGR=1)"
+  echo "[docker-pkgmgr] Rebuilding package-manager from /opt/src/pkgmgr via scripts/installation/package.sh..."
   bash scripts/installation/package.sh || exit 1
 fi
@@ -29,9 +29,9 @@ fi
 # Hand off to pkgmgr or arbitrary command
 # ---------------------------------------------------------------------------
 if [[ $# -eq 0 ]]; then
-  echo "[docker] No arguments provided. Showing pkgmgr help..."
+  echo "[docker-pkgmgr] No arguments provided. Showing pkgmgr help..."
   exec pkgmgr --help
 else
-  echo "[docker] Executing command: $*"
+  echo "[docker-pkgmgr] Executing command: $*"
   exec "$@"
 fi

View File

@@ -6,6 +6,13 @@ SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 echo "[arch/dependencies] Installing Arch build dependencies..."
 pacman -Syu --noconfirm
+
+if ! pacman-key --list-sigs &>/dev/null; then
+  echo "[arch/dependencies] Initializing pacman keyring..."
+  pacman-key --init
+  pacman-key --populate archlinux
+fi
+
 pacman -S --noconfirm --needed \
   base-devel \
   git \
@@ -13,6 +20,7 @@ pacman -S --noconfirm --needed \
   curl \
   ca-certificates \
   python \
+  python-pip \
   xz
 
 pacman -Scc --noconfirm

View File

@@ -6,7 +6,7 @@ echo "[arch/package] Building Arch package (makepkg --nodeps) in an isolated bui
 SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
 PROJECT_ROOT="$(cd "${SCRIPT_DIR}/../../.." && pwd)"
 
-# We must not build inside /src (mounted repo). Build in /tmp to avoid permission issues.
+# We must not build inside /opt/src/pkgmgr (mounted repo). Build in /tmp to avoid permission issues.
 BUILD_ROOT="/tmp/package-manager-arch-build"
 PKG_SRC_DIR="${PROJECT_ROOT}/packaging/arch"
 PKG_BUILD_DIR="${BUILD_ROOT}/packaging/arch"


@@ -14,6 +14,7 @@ dnf -y install \
     curl-minimal \
     ca-certificates \
     python3 \
+    python3-pip \
     sudo \
     xz


@@ -15,6 +15,7 @@ DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
     ca-certificates \
     python3 \
     python3-venv \
+    python3-pip \
     xz-utils
 rm -rf /var/lib/apt/lists/*


@@ -14,6 +14,7 @@ dnf -y install \
     curl \
     ca-certificates \
     python3 \
+    python3-pip \
     xz
 dnf clean all


@@ -17,6 +17,7 @@ DEBIAN_FRONTEND=noninteractive apt-get install -y --no-install-recommends \
     make \
     python3 \
     python3-venv \
+    python3-pip \
     ca-certificates \
     xz-utils


@@ -6,12 +6,13 @@ echo ">>> Running E2E tests: $PKGMGR_DISTRO"
 echo "============================================================"
 docker run --rm \
-    -v "$(pwd):/src" \
+    -v "$(pwd):/opt/src/pkgmgr" \
     -v "pkgmgr_nix_store_${PKGMGR_DISTRO}:/nix" \
     -v "pkgmgr_nix_cache_${PKGMGR_DISTRO}:/root/.cache/nix" \
     -e REINSTALL_PKGMGR=1 \
     -e TEST_PATTERN="${TEST_PATTERN}" \
-    --workdir /src \
+    -e NIX_CONFIG="${NIX_CONFIG}" \
+    --workdir /opt/src/pkgmgr \
     "pkgmgr-${PKGMGR_DISTRO}" \
     bash -lc '
         set -euo pipefail
@@ -40,14 +41,14 @@ docker run --rm \
         }
         # Mark the mounted repository as safe to avoid Git ownership errors.
-        # Newer Git (e.g. on Ubuntu) complains about the gitdir (/src/.git),
-        # older versions about the worktree (/src). Nix turns "." into the
-        # flake input "git+file:///src", which then uses Git under the hood.
+        # Newer Git (e.g. on Ubuntu) complains about the gitdir (/opt/src/pkgmgr/.git),
+        # older versions about the worktree (/opt/src/pkgmgr). Nix turns "." into the
+        # flake input "git+file:///opt/src/pkgmgr", which then uses Git under the hood.
         if command -v git >/dev/null 2>&1; then
             # Worktree path
-            git config --global --add safe.directory /src || true
+            git config --global --add safe.directory /opt/src/pkgmgr || true
             # Gitdir path shown in the "dubious ownership" error
-            git config --global --add safe.directory /src/.git || true
+            git config --global --add safe.directory /opt/src/pkgmgr/.git || true
             # Ephemeral CI containers: allow all paths as a last resort
             git config --global --add safe.directory "*" || true
         fi
@@ -55,6 +56,6 @@ docker run --rm \
         # Run the E2E tests inside the Nix development shell
         nix develop .#default --no-write-lock-file -c \
             python3 -m unittest discover \
-                -s /src/tests/e2e \
+                -s /opt/src/pkgmgr/tests/e2e \
                 -p "$TEST_PATTERN"
     '


@@ -9,18 +9,19 @@ echo ">>> Image: ${IMAGE}"
 echo "============================================================"
 docker run --rm \
-    -v "$(pwd):/src" \
+    -v "$(pwd):/opt/src/pkgmgr" \
     -v "pkgmgr_nix_store_${PKGMGR_DISTRO}:/nix" \
     -v "pkgmgr_nix_cache_${PKGMGR_DISTRO}:/root/.cache/nix" \
-    --workdir /src \
+    --workdir /opt/src/pkgmgr \
     -e REINSTALL_PKGMGR=1 \
+    -e NIX_CONFIG="${NIX_CONFIG}" \
     "${IMAGE}" \
     bash -lc '
         set -euo pipefail
         if command -v git >/dev/null 2>&1; then
-            git config --global --add safe.directory /src || true
-            git config --global --add safe.directory /src/.git || true
+            git config --global --add safe.directory /opt/src/pkgmgr || true
+            git config --global --add safe.directory /opt/src/pkgmgr/.git || true
             git config --global --add safe.directory "*" || true
         fi
@@ -38,9 +39,9 @@ docker run --rm \
         # ------------------------------------------------------------
         # Retry helper for GitHub API rate-limit (HTTP 403)
         # ------------------------------------------------------------
-        if [[ -f /src/scripts/nix/lib/retry_403.sh ]]; then
+        if [[ -f /opt/src/pkgmgr/scripts/nix/lib/retry_403.sh ]]; then
             # shellcheck source=./scripts/nix/lib/retry_403.sh
-            source /src/scripts/nix/lib/retry_403.sh
+            source /opt/src/pkgmgr/scripts/nix/lib/retry_403.sh
         elif [[ -f ./scripts/nix/lib/retry_403.sh ]]; then
             # shellcheck source=./scripts/nix/lib/retry_403.sh
             source ./scripts/nix/lib/retry_403.sh


@@ -17,8 +17,9 @@ echo
 # ------------------------------------------------------------
 if OUTPUT=$(docker run --rm \
     -e REINSTALL_PKGMGR=1 \
-    -v "$(pwd):/src" \
-    -w /src \
+    -v "$(pwd):/opt/src/pkgmgr" \
+    -w /opt/src/pkgmgr \
+    -e NIX_CONFIG="${NIX_CONFIG}" \
     "${IMAGE}" \
     bash -lc '
         set -euo pipefail


@@ -6,19 +6,20 @@ echo ">>> Running INTEGRATION tests in ${PKGMGR_DISTRO} container"
 echo "============================================================"
 docker run --rm \
-    -v "$(pwd):/src" \
+    -v "$(pwd):/opt/src/pkgmgr" \
     -v "pkgmgr_nix_store_${PKGMGR_DISTRO}:/nix" \
     -v "pkgmgr_nix_cache_${PKGMGR_DISTRO}:/root/.cache/nix" \
-    --workdir /src \
+    --workdir /opt/src/pkgmgr \
     -e REINSTALL_PKGMGR=1 \
     -e TEST_PATTERN="${TEST_PATTERN}" \
+    -e NIX_CONFIG="${NIX_CONFIG}" \
     "pkgmgr-${PKGMGR_DISTRO}" \
     bash -lc '
         set -e;
-        git config --global --add safe.directory /src || true;
+        git config --global --add safe.directory /opt/src/pkgmgr || true;
         nix develop .#default --no-write-lock-file -c \
             python3 -m unittest discover \
                 -s tests/integration \
-                -t /src \
+                -t /opt/src/pkgmgr \
                 -p "$TEST_PATTERN";
     '


@@ -6,19 +6,20 @@ echo ">>> Running UNIT tests in ${PKGMGR_DISTRO} container"
 echo "============================================================"
 docker run --rm \
-    -v "$(pwd):/src" \
+    -v "$(pwd):/opt/src/pkgmgr" \
     -v "pkgmgr_nix_cache_${PKGMGR_DISTRO}:/root/.cache/nix" \
     -v "pkgmgr_nix_store_${PKGMGR_DISTRO}:/nix" \
-    --workdir /src \
+    --workdir /opt/src/pkgmgr \
     -e REINSTALL_PKGMGR=1 \
     -e TEST_PATTERN="${TEST_PATTERN}" \
+    -e NIX_CONFIG="${NIX_CONFIG}" \
     "pkgmgr-${PKGMGR_DISTRO}" \
     bash -lc '
         set -e;
-        git config --global --add safe.directory /src || true;
+        git config --global --add safe.directory /opt/src/pkgmgr || true;
         nix develop .#default --no-write-lock-file -c \
             python3 -m unittest discover \
                 -s tests/unit \
-                -t /src \
+                -t /opt/src/pkgmgr \
                 -p "$TEST_PATTERN";
     '


@@ -25,12 +25,12 @@ __all__ = ["cli"]
 def __getattr__(name: str) -> Any:
     """
     Lazily expose ``pkgmgr.cli`` as attribute on the top-level package.
     This keeps ``import pkgmgr`` lightweight while still allowing
     ``from pkgmgr import cli`` in tests and entry points.
     """
     if name == "cli":
         return import_module("pkgmgr.cli")
     raise AttributeError(f"module 'pkgmgr' has no attribute {name!r}")
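The module-level `__getattr__` above is the PEP 562 lazy-import pattern. A minimal self-contained sketch of the same mechanism (the `demo_pkg` modules here are hypothetical, built in memory purely for illustration):

```python
import sys
import types
from importlib import import_module

# Build a throwaway package in memory (hypothetical, illustration only).
pkg = types.ModuleType("demo_pkg")
cli = types.ModuleType("demo_pkg.cli")
cli.main = lambda: "ok"
sys.modules["demo_pkg"] = pkg
sys.modules["demo_pkg.cli"] = cli

def _pkg_getattr(name):
    # PEP 562: invoked only when the attribute is missing from the module
    # dict, so ``import demo_pkg`` stays cheap until ``demo_pkg.cli`` is touched.
    if name == "cli":
        return import_module("demo_pkg.cli")
    raise AttributeError(f"module 'demo_pkg' has no attribute {name!r}")

pkg.__getattr__ = _pkg_getattr

import demo_pkg
print(demo_pkg.cli.main())  # -> ok
```

The submodule is imported on first attribute access rather than at package import time, which is exactly what keeps `import pkgmgr` lightweight.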


@@ -0,0 +1,6 @@
+from __future__ import annotations
+
+# expose subpackages for patch() / resolve_name() friendliness
+from . import release as release  # noqa: F401
+
+__all__ = ["release"]


@@ -1,7 +1,21 @@
 from __future__ import annotations
 from typing import Optional
-from pkgmgr.core.git import run_git, GitError, get_current_branch
-from .utils import _resolve_base_branch
+from pkgmgr.core.git.errors import GitRunError
+from pkgmgr.core.git.queries import get_current_branch
+from pkgmgr.core.git.commands import (
+    GitDeleteRemoteBranchError,
+    checkout,
+    delete_local_branch,
+    delete_remote_branch,
+    fetch,
+    merge_no_ff,
+    pull,
+    push,
+)
+from pkgmgr.core.git.queries import resolve_base_branch
 def close_branch(
@@ -14,18 +28,17 @@ def close_branch(
     """
     Merge a feature branch into the base branch and delete it afterwards.
     """
     # Determine branch name
     if not name:
         try:
             name = get_current_branch(cwd=cwd)
-        except GitError as exc:
+        except GitRunError as exc:
             raise RuntimeError(f"Failed to detect current branch: {exc}") from exc
     if not name:
         raise RuntimeError("Branch name must not be empty.")
-    target_base = _resolve_base_branch(base_branch, fallback_base, cwd=cwd)
+    target_base = resolve_base_branch(base_branch, fallback_base, cwd=cwd)
     if name == target_base:
         raise RuntimeError(
@@ -35,65 +48,31 @@ def close_branch(
     # Confirmation
     if not force:
-        answer = input(
-            f"Merge branch '{name}' into '{target_base}' and delete it afterwards? (y/N): "
-        ).strip().lower()
+        answer = (
+            input(
+                f"Merge branch '{name}' into '{target_base}' and delete it afterwards? (y/N): "
+            )
+            .strip()
+            .lower()
+        )
         if answer != "y":
             print("Aborted closing branch.")
             return
-    # Fetch
-    try:
-        run_git(["fetch", "origin"], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to fetch from origin before closing branch {name!r}: {exc}"
-        ) from exc
-    # Checkout base
-    try:
-        run_git(["checkout", target_base], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to checkout base branch {target_base!r}: {exc}"
-        ) from exc
-    # Pull latest
-    try:
-        run_git(["pull", "origin", target_base], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to pull latest changes for base branch {target_base!r}: {exc}"
-        ) from exc
-    # Merge
-    try:
-        run_git(["merge", "--no-ff", name], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to merge branch {name!r} into {target_base!r}: {exc}"
-        ) from exc
-    # Push result
-    try:
-        run_git(["push", "origin", target_base], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to push base branch {target_base!r} after merge: {exc}"
-        ) from exc
-    # Delete local
-    try:
-        run_git(["branch", "-d", name], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to delete local branch {name!r}: {exc}"
-        ) from exc
-    # Delete remote
-    try:
-        run_git(["push", "origin", "--delete", name], cwd=cwd)
-    except GitError as exc:
+    # Execute workflow (commands raise specific GitRunError subclasses)
+    fetch("origin", cwd=cwd)
+    checkout(target_base, cwd=cwd)
+    pull("origin", target_base, cwd=cwd)
+    merge_no_ff(name, cwd=cwd)
+    push("origin", target_base, cwd=cwd)
+    # Delete local branch (safe delete by default)
+    delete_local_branch(name, cwd=cwd, force=False)
+    # Delete remote branch (special-case error message)
+    try:
+        delete_remote_branch("origin", name, cwd=cwd)
+    except GitDeleteRemoteBranchError as exc:
         raise RuntimeError(
             f"Branch {name!r} deleted locally, but remote deletion failed: {exc}"
         ) from exc
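The refactor above replaces try/except wrappers around every `run_git` call with command helpers that raise specific error subclasses. A hedged sketch of that idea (the `run` callable is injected here so the example runs without Git; the real helpers in `pkgmgr.core.git.commands` may look different):

```python
class GitRunError(RuntimeError):
    """Base error for a failed git invocation."""

class GitDeleteRemoteBranchError(GitRunError):
    """Raised when deleting a remote branch fails."""

def delete_remote_branch(remote: str, branch: str, run) -> None:
    # ``run`` executes a git argv and returns its exit code
    # (injected for testing; assumption, not the library's signature).
    if run(["push", remote, "--delete", branch]) != 0:
        raise GitDeleteRemoteBranchError(
            f"could not delete {branch!r} on {remote!r}"
        )

# Callers map one specific error type to one user-facing message:
try:
    delete_remote_branch("origin", "feature/x", run=lambda argv: 1)
except GitDeleteRemoteBranchError as exc:
    print(f"Branch deleted locally, but remote deletion failed: {exc}")
```

Because the subclass inherits from the base error, callers that only care about "a git command failed" can still catch `GitRunError`.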


@@ -1,7 +1,16 @@
 from __future__ import annotations
 from typing import Optional
-from pkgmgr.core.git import run_git, GitError, get_current_branch
-from .utils import _resolve_base_branch
+from pkgmgr.core.git.errors import GitRunError
+from pkgmgr.core.git.queries import get_current_branch
+from pkgmgr.core.git.commands import (
+    GitDeleteRemoteBranchError,
+    delete_local_branch,
+    delete_remote_branch,
+)
+from pkgmgr.core.git.queries import resolve_base_branch
 def drop_branch(
@@ -14,17 +23,16 @@ def drop_branch(
     """
     Delete a branch locally and remotely without merging.
     """
     if not name:
         try:
             name = get_current_branch(cwd=cwd)
-        except GitError as exc:
+        except GitRunError as exc:
             raise RuntimeError(f"Failed to detect current branch: {exc}") from exc
     if not name:
         raise RuntimeError("Branch name must not be empty.")
-    target_base = _resolve_base_branch(base_branch, fallback_base, cwd=cwd)
+    target_base = resolve_base_branch(base_branch, fallback_base, cwd=cwd)
     if name == target_base:
         raise RuntimeError(
@@ -33,23 +41,23 @@ def drop_branch(
     # Confirmation
     if not force:
-        answer = input(
-            f"Delete branch '{name}' locally and on origin? This is destructive! (y/N): "
-        ).strip().lower()
+        answer = (
+            input(
+                f"Delete branch '{name}' locally and on origin? This is destructive! (y/N): "
+            )
+            .strip()
+            .lower()
+        )
         if answer != "y":
             print("Aborted dropping branch.")
             return
-    # Local delete
-    try:
-        run_git(["branch", "-d", name], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(f"Failed to delete local branch {name!r}: {exc}") from exc
+    delete_local_branch(name, cwd=cwd, force=False)
-    # Remote delete
-    try:
-        run_git(["push", "origin", "--delete", name], cwd=cwd)
-    except GitError as exc:
+    # Remote delete (special-case message)
+    try:
+        delete_remote_branch("origin", name, cwd=cwd)
+    except GitDeleteRemoteBranchError as exc:
         raise RuntimeError(
             f"Branch {name!r} was deleted locally, but remote deletion failed: {exc}"
         ) from exc


@@ -1,7 +1,15 @@
 from __future__ import annotations
 from typing import Optional
-from pkgmgr.core.git import run_git, GitError
-from .utils import _resolve_base_branch
+from pkgmgr.core.git.commands import (
+    checkout,
+    create_branch,
+    fetch,
+    pull,
+    push_upstream,
+)
+from pkgmgr.core.git.queries import resolve_base_branch
 def open_branch(
@@ -13,7 +21,6 @@ def open_branch(
     """
     Create and push a new feature branch on top of a base branch.
     """
-
     # Request name interactively if not provided
     if not name:
         name = input("Enter new branch name: ").strip()
@@ -21,44 +28,13 @@ def open_branch(
     if not name:
         raise RuntimeError("Branch name must not be empty.")
-    resolved_base = _resolve_base_branch(base_branch, fallback_base, cwd=cwd)
+    resolved_base = resolve_base_branch(base_branch, fallback_base, cwd=cwd)
-    # 1) Fetch from origin
-    try:
-        run_git(["fetch", "origin"], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to fetch from origin before creating branch {name!r}: {exc}"
-        ) from exc
-    # 2) Checkout base branch
-    try:
-        run_git(["checkout", resolved_base], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to checkout base branch {resolved_base!r}: {exc}"
-        ) from exc
-    # 3) Pull latest changes
-    try:
-        run_git(["pull", "origin", resolved_base], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to pull latest changes for base branch {resolved_base!r}: {exc}"
-        ) from exc
-    # 4) Create new branch
-    try:
-        run_git(["checkout", "-b", name], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to create new branch {name!r} from base {resolved_base!r}: {exc}"
-        ) from exc
-    # 5) Push new branch
-    try:
-        run_git(["push", "-u", "origin", name], cwd=cwd)
-    except GitError as exc:
-        raise RuntimeError(
-            f"Failed to push new branch {name!r} to origin: {exc}"
-        ) from exc
+    # Workflow (commands raise specific GitBaseError subclasses)
+    fetch("origin", cwd=cwd)
+    checkout(resolved_base, cwd=cwd)
+    pull("origin", resolved_base, cwd=cwd)
+    # Create new branch from resolved base and push it with upstream tracking
+    create_branch(name, resolved_base, cwd=cwd)
+    push_upstream("origin", name, cwd=cwd)


@@ -1,27 +0,0 @@
-from __future__ import annotations
-from pkgmgr.core.git import run_git, GitError
-def _resolve_base_branch(
-    preferred: str,
-    fallback: str,
-    cwd: str,
-) -> str:
-    """
-    Resolve the base branch to use.
-    Try `preferred` first (default: main),
-    fall back to `fallback` (default: master).
-    Raise RuntimeError if neither exists.
-    """
-    for candidate in (preferred, fallback):
-        try:
-            run_git(["rev-parse", "--verify", candidate], cwd=cwd)
-            return candidate
-        except GitError:
-            continue
-    raise RuntimeError(
-        f"Neither {preferred!r} nor {fallback!r} exist in this repository."
-    )
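The deleted helper above moves into `pkgmgr.core.git.queries.resolve_base_branch`. Its core logic can be sketched with an injected existence check standing in for `git rev-parse --verify <ref>`, so the sketch runs without a repository (the injected `ref_exists` parameter is an assumption for illustration, not the real signature):

```python
def resolve_base_branch(preferred: str, fallback: str, ref_exists) -> str:
    # Try ``preferred`` first (e.g. main), then ``fallback`` (e.g. master);
    # ``ref_exists`` plays the role of ``git rev-parse --verify``.
    for candidate in (preferred, fallback):
        if ref_exists(candidate):
            return candidate
    raise RuntimeError(
        f"Neither {preferred!r} nor {fallback!r} exist in this repository."
    )

# A repo that only has a master branch resolves to the fallback:
print(resolve_base_branch("main", "master", ref_exists=lambda r: r == "master"))  # -> master
```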


@@ -3,17 +3,16 @@
 """
 Helpers to generate changelog information from Git history.
-This module provides a small abstraction around `git log` so that
-CLI commands can request a changelog between two refs (tags, branches,
-commits) without dealing with raw subprocess calls.
 """
 from __future__ import annotations
 from typing import Optional
-from pkgmgr.core.git import run_git, GitError
+from pkgmgr.core.git.queries import (
+    get_changelog,
+    GitChangelogQueryError,
+)
 def generate_changelog(
@@ -25,48 +24,20 @@ def generate_changelog(
     """
     Generate a plain-text changelog between two Git refs.
-    Parameters
-    ----------
-    cwd:
-        Repository directory in which to run Git commands.
-    from_ref:
-        Optional starting reference (exclusive). If provided together
-        with `to_ref`, the range `from_ref..to_ref` is used.
-        If only `from_ref` is given, the range `from_ref..HEAD` is used.
-    to_ref:
-        Optional end reference (inclusive). If omitted, `HEAD` is used.
-    include_merges:
-        If False (default), merge commits are filtered out.
-    Returns
-    -------
-    str
-        The output of `git log` formatted as a simple text changelog.
-        If no commits are found or Git fails, an explanatory message
-        is returned instead of raising.
+    Returns a human-readable message instead of raising.
     """
-    # Determine the revision range
     if to_ref is None:
         to_ref = "HEAD"
-    if from_ref:
-        rev_range = f"{from_ref}..{to_ref}"
-    else:
-        rev_range = to_ref
+    rev_range = f"{from_ref}..{to_ref}" if from_ref else to_ref
-    # Use a custom pretty format that includes tags/refs (%d)
-    cmd = [
-        "log",
-        "--pretty=format:%h %d %s",
-    ]
-    if not include_merges:
-        cmd.append("--no-merges")
-    cmd.append(rev_range)
     try:
-        output = run_git(cmd, cwd=cwd)
-    except GitError as exc:
-        # Do not raise to the CLI, return a human-readable error instead.
+        output = get_changelog(
+            cwd=cwd,
+            from_ref=from_ref,
+            to_ref=to_ref,
+            include_merges=include_merges,
+        )
+    except GitChangelogQueryError as exc:
         return (
             f"[ERROR] Failed to generate changelog in {cwd!r} "
             f"for range {rev_range!r}:\n{exc}"
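The revision-range logic in the hunk above collapses into a single conditional expression. A minimal standalone sketch of exactly that rule:

```python
from typing import Optional

def build_rev_range(from_ref: Optional[str], to_ref: Optional[str]) -> str:
    # Mirrors the logic above: HEAD by default; ``from..to`` when a
    # starting ref is given; otherwise just the end ref.
    if to_ref is None:
        to_ref = "HEAD"
    return f"{from_ref}..{to_ref}" if from_ref else to_ref

print(build_rev_range("v1.7.1", None))  # -> v1.7.1..HEAD
print(build_rev_range(None, "main"))    # -> main
```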


@@ -2,14 +2,17 @@ import yaml
 import os
 from pkgmgr.core.config.save import save_user_config
-def interactive_add(config,USER_CONFIG_PATH:str):
+
+def interactive_add(config, USER_CONFIG_PATH: str):
     """Interactively prompt the user to add a new repository entry to the user config."""
     print("Adding a new repository configuration entry.")
     new_entry = {}
     new_entry["provider"] = input("Provider (e.g., github.com): ").strip()
     new_entry["account"] = input("Account (e.g., yourusername): ").strip()
     new_entry["repository"] = input("Repository name (e.g., mytool): ").strip()
-    new_entry["command"] = input("Command (optional, leave blank to auto-detect): ").strip()
+    new_entry["command"] = input(
+        "Command (optional, leave blank to auto-detect): "
+    ).strip()
     new_entry["description"] = input("Description (optional): ").strip()
     new_entry["replacement"] = input("Replacement (optional): ").strip()
     new_entry["alias"] = input("Alias (optional): ").strip()
@@ -25,12 +28,12 @@ def interactive_add(config, USER_CONFIG_PATH: str):
     confirm = input("Add this entry to user config? (y/N): ").strip().lower()
     if confirm == "y":
         if os.path.exists(USER_CONFIG_PATH):
-            with open(USER_CONFIG_PATH, 'r') as f:
+            with open(USER_CONFIG_PATH, "r") as f:
                 user_config = yaml.safe_load(f) or {}
         else:
             user_config = {"repositories": []}
         user_config.setdefault("repositories", [])
         user_config["repositories"].append(new_entry)
-        save_user_config(user_config,USER_CONFIG_PATH)
+        save_user_config(user_config, USER_CONFIG_PATH)
     else:
         print("Entry not added.")


@@ -14,7 +14,7 @@ with the expected structure:
 For each discovered repository, the function:
     • derives provider, account, repository from the folder structure
-    • (optionally) determines the latest commit hash via git log
+    • (optionally) determines the latest commit hash via git
     • generates a unique CLI alias
     • marks ignore=True for newly discovered repos
     • skips repos already known in defaults or user config
@@ -23,11 +23,11 @@ For each discovered repository, the function:
 from __future__ import annotations
 import os
-import subprocess
 from typing import Any, Dict
 from pkgmgr.core.command.alias import generate_alias
 from pkgmgr.core.config.save import save_user_config
+from pkgmgr.core.git.queries import get_latest_commit
 def config_init(
@@ -107,36 +107,33 @@ def config_init(
         # Already known?
         if key in default_keys:
             skipped += 1
-            print(f"[SKIP] (defaults) {provider}/{account}/{repo_name}")
+            print(
+                f"[SKIP] (defaults) {provider}/{account}/{repo_name}"
+            )
             continue
         if key in existing_keys:
             skipped += 1
-            print(f"[SKIP] (user-config) {provider}/{account}/{repo_name}")
+            print(
+                f"[SKIP] (user-config) {provider}/{account}/{repo_name}"
+            )
             continue
         print(f"[ADD] {provider}/{account}/{repo_name}")
-        # Determine commit hash
-        try:
-            result = subprocess.run(
-                ["git", "log", "-1", "--format=%H"],
-                cwd=repo_path,
-                stdout=subprocess.PIPE,
-                stderr=subprocess.PIPE,
-                text=True,
-                check=True,
-            )
-            verified = result.stdout.strip()
-            print(f"[INFO] Latest commit: {verified}")
-        except Exception as exc:
-            verified = ""
-            print(f"[WARN] Could not read commit: {exc}")
+        # Determine commit hash via git query
+        verified_commit = get_latest_commit(repo_path) or ""
+        if verified_commit:
+            print(f"[INFO] Latest commit: {verified_commit}")
+        else:
+            print(
+                "[WARN] Could not read commit (not a git repo or no commits)."
+            )
-        entry = {
+        entry: Dict[str, Any] = {
             "provider": provider,
             "account": account,
             "repository": repo_name,
-            "verified": {"commit": verified},
+            "verified": {"commit": verified_commit},
             "ignore": True,
         }


@@ -1,6 +1,7 @@
 import yaml
 from pkgmgr.core.config.load import load_config
+
 def show_config(selected_repos, user_config_path, full_config=False):
     """Display configuration for one or more repositories, or the entire merged config."""
     if full_config:
@@ -8,7 +9,9 @@ def show_config(selected_repos, user_config_path, full_config=False):
         print(yaml.dump(merged, default_flow_style=False))
     else:
         for repo in selected_repos:
-            identifier = f'{repo.get("provider")}/{repo.get("account")}/{repo.get("repository")}'
+            identifier = (
+                f"{repo.get('provider')}/{repo.get('account')}/{repo.get('repository')}"
+            )
             print(f"Repository: {identifier}")
             for key, value in repo.items():
                 print(f"  {key}: {value}")


@@ -16,7 +16,7 @@ Responsibilities:
from __future__ import annotations from __future__ import annotations
import os import os
from typing import Any, Dict, List, Optional from typing import Any, Dict, List, Optional, Tuple
from pkgmgr.core.repository.identifier import get_repo_identifier from pkgmgr.core.repository.identifier import get_repo_identifier
from pkgmgr.core.repository.dir import get_repo_dir from pkgmgr.core.repository.dir import get_repo_dir
@@ -66,10 +66,7 @@ def _ensure_repo_dir(
repo_dir = get_repo_dir(repositories_base_dir, repo) repo_dir = get_repo_dir(repositories_base_dir, repo)
if not os.path.exists(repo_dir): if not os.path.exists(repo_dir):
print( print(f"Repository directory '{repo_dir}' does not exist. Cloning it now...")
f"Repository directory '{repo_dir}' does not exist. "
"Cloning it now..."
)
clone_repos( clone_repos(
[repo], [repo],
repositories_base_dir, repositories_base_dir,
@@ -79,10 +76,7 @@ def _ensure_repo_dir(
clone_mode, clone_mode,
) )
if not os.path.exists(repo_dir): if not os.path.exists(repo_dir):
print( print(f"Cloning failed for repository {identifier}. Skipping installation.")
f"Cloning failed for repository {identifier}. "
"Skipping installation."
)
return None return None
return repo_dir return repo_dir
@@ -93,6 +87,7 @@ def _verify_repo(
repo_dir: str, repo_dir: str,
no_verification: bool, no_verification: bool,
identifier: str, identifier: str,
silent: bool,
) -> bool: ) -> bool:
""" """
Verify a repository using the configured verification data. Verify a repository using the configured verification data.
@@ -111,10 +106,17 @@ def _verify_repo(
print(f"Warning: Verification failed for {identifier}:") print(f"Warning: Verification failed for {identifier}:")
for err in errors: for err in errors:
print(f" - {err}") print(f" - {err}")
choice = input("Continue anyway? [y/N]: ").strip().lower()
if choice != "y": if silent:
print(f"Skipping installation for {identifier}.") # Non-interactive mode: continue with a warning.
return False print(
f"[Warning] Continuing despite verification failure for {identifier} (--silent)."
)
else:
choice = input("Continue anyway? [y/N]: ").strip().lower()
if choice != "y":
print(f"Skipping installation for {identifier}.")
return False
return True return True
@@ -163,6 +165,8 @@ def install_repos(
clone_mode: str, clone_mode: str,
update_dependencies: bool, update_dependencies: bool,
force_update: bool = False, force_update: bool = False,
silent: bool = False,
emit_summary: bool = True,
) -> None: ) -> None:
""" """
    Install one or more repositories according to the configured installers

@@ -170,45 +174,76 @@ def install_repos(
     If force_update=True, installers of the currently active layer are allowed
     to run again (upgrade/refresh), even if that layer is already loaded.
+
+    If silent=True, repository failures are downgraded to warnings and the
+    overall command never exits non-zero because of per-repository failures.
     """
     pipeline = InstallationPipeline(INSTALLERS)
+    failures: List[Tuple[str, str]] = []

     for repo in selected_repos:
         identifier = get_repo_identifier(repo, all_repos)
-        repo_dir = _ensure_repo_dir(
-            repo=repo,
-            repositories_base_dir=repositories_base_dir,
-            all_repos=all_repos,
-            preview=preview,
-            no_verification=no_verification,
-            clone_mode=clone_mode,
-            identifier=identifier,
-        )
-        if not repo_dir:
-            continue
-        if not _verify_repo(
-            repo=repo,
-            repo_dir=repo_dir,
-            no_verification=no_verification,
-            identifier=identifier,
-        ):
-            continue
-        ctx = _create_context(
-            repo=repo,
-            identifier=identifier,
-            repo_dir=repo_dir,
-            repositories_base_dir=repositories_base_dir,
-            bin_dir=bin_dir,
-            all_repos=all_repos,
-            no_verification=no_verification,
-            preview=preview,
-            quiet=quiet,
-            clone_mode=clone_mode,
-            update_dependencies=update_dependencies,
-            force_update=force_update,
-        )
-        pipeline.run(ctx)
+        try:
+            repo_dir = _ensure_repo_dir(
+                repo=repo,
+                repositories_base_dir=repositories_base_dir,
+                all_repos=all_repos,
+                preview=preview,
+                no_verification=no_verification,
+                clone_mode=clone_mode,
+                identifier=identifier,
+            )
+            if not repo_dir:
+                failures.append((identifier, "clone/ensure repo directory failed"))
+                continue
+            if not _verify_repo(
+                repo=repo,
+                repo_dir=repo_dir,
+                no_verification=no_verification,
+                identifier=identifier,
+                silent=silent,
+            ):
+                continue
+            ctx = _create_context(
+                repo=repo,
+                identifier=identifier,
+                repo_dir=repo_dir,
+                repositories_base_dir=repositories_base_dir,
+                bin_dir=bin_dir,
+                all_repos=all_repos,
+                no_verification=no_verification,
+                preview=preview,
+                quiet=quiet,
+                clone_mode=clone_mode,
+                update_dependencies=update_dependencies,
+                force_update=force_update,
+            )
+            pipeline.run(ctx)
+        except SystemExit as exc:
+            code = exc.code if isinstance(exc.code, int) else str(exc.code)
+            failures.append((identifier, f"installer failed (exit={code})"))
+            if not quiet:
+                print(
+                    f"[Warning] install: repository {identifier} failed (exit={code}). Continuing..."
+                )
+            continue
+        except Exception as exc:
+            failures.append((identifier, f"unexpected error: {exc}"))
+            if not quiet:
+                print(
+                    f"[Warning] install: repository {identifier} hit an unexpected error: {exc}. Continuing..."
+                )
+            continue
+
+    if failures and emit_summary and not quiet:
+        print("\n[pkgmgr] Installation finished with warnings:")
+        for ident, msg in failures:
+            print(f"  - {ident}: {msg}")
+
+    if failures and not silent:
+        raise SystemExit(1)
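The new error handling above follows a collect-then-fail pattern: every repository is attempted, failures are recorded, and the non-zero exit is deferred until the loop finishes. A minimal standalone sketch of that pattern (hypothetical `process_all` and `flaky` helpers; nothing here is pkgmgr API):

```python
from typing import Callable, List, Tuple


def process_all(
    items: List[str],
    action: Callable[[str], None],
    silent: bool = False,
) -> List[Tuple[str, str]]:
    """Run action for every item, collecting failures instead of aborting."""
    failures: List[Tuple[str, str]] = []
    for item in items:
        try:
            action(item)
        except SystemExit as exc:  # an installer signalling a hard failure
            failures.append((item, f"exit={exc.code}"))
        except Exception as exc:  # unexpected errors are downgraded too
            failures.append((item, str(exc)))
    # Defer the non-zero exit until every item has been attempted.
    if failures and not silent:
        raise SystemExit(1)
    return failures


def flaky(name: str) -> None:
    if name == "bad":
        raise SystemExit(3)


failures = process_all(["ok", "bad", "also-ok"], flaky, silent=True)
print(failures)  # [('bad', 'exit=3')]
```

With `silent=False` the same call would still process all three items but raise `SystemExit(1)` at the end, matching the behaviour described in the docstring.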

View File

@@ -14,6 +14,10 @@ from pkgmgr.actions.install.installers.python import PythonInstaller  # noqa: F4
 from pkgmgr.actions.install.installers.makefile import MakefileInstaller  # noqa: F401

 # OS-specific installers
-from pkgmgr.actions.install.installers.os_packages.arch_pkgbuild import ArchPkgbuildInstaller  # noqa: F401
-from pkgmgr.actions.install.installers.os_packages.debian_control import DebianControlInstaller  # noqa: F401
+from pkgmgr.actions.install.installers.os_packages.arch_pkgbuild import (
+    ArchPkgbuildInstaller as ArchPkgbuildInstaller,
+)  # noqa: F401
+from pkgmgr.actions.install.installers.os_packages.debian_control import (
+    DebianControlInstaller as DebianControlInstaller,
+)  # noqa: F401
 from pkgmgr.actions.install.installers.os_packages.rpm_spec import RpmSpecInstaller  # noqa: F401

View File

@@ -41,7 +41,9 @@ class BaseInstaller(ABC):
             return caps

         for matcher in CAPABILITY_MATCHERS:
-            if matcher.applies_to_layer(self.layer) and matcher.is_provided(ctx, self.layer):
+            if matcher.applies_to_layer(self.layer) and matcher.is_provided(
+                ctx, self.layer
+            ):
                 caps.add(matcher.name)

         return caps

View File

@@ -16,7 +16,9 @@ class MakefileInstaller(BaseInstaller):
     def supports(self, ctx: RepoContext) -> bool:
         if os.environ.get("PKGMGR_DISABLE_MAKEFILE_INSTALLER") == "1":
             if not ctx.quiet:
-                print("[INFO] PKGMGR_DISABLE_MAKEFILE_INSTALLER=1 skipping MakefileInstaller.")
+                print(
+                    "[INFO] PKGMGR_DISABLE_MAKEFILE_INSTALLER=1 skipping MakefileInstaller."
+                )
             return False

         makefile_path = os.path.join(ctx.repo_dir, self.MAKEFILE_NAME)
@@ -46,7 +48,9 @@ class MakefileInstaller(BaseInstaller):
             return

         if not ctx.quiet:
-            print(f"[pkgmgr] Running make install for {ctx.identifier} (MakefileInstaller)")
+            print(
+                f"[pkgmgr] Running make install for {ctx.identifier} (MakefileInstaller)"
+            )

         run_command("make install", cwd=ctx.repo_dir, preview=ctx.preview)

View File

@@ -57,7 +57,9 @@ class NixConflictResolver:
         # 3) Fallback: output-name based lookup (also covers nix suggesting: `nix profile remove pkgmgr`)
         if not tokens:
-            tokens = self._profile.find_remove_tokens_for_output(ctx, self._runner, output)
+            tokens = self._profile.find_remove_tokens_for_output(
+                ctx, self._runner, output
+            )

         if tokens:
             if not quiet:
@@ -94,7 +96,9 @@ class NixConflictResolver:
                 continue

             if not quiet:
-                print("[nix] conflict detected but could not resolve profile entries to remove.")
+                print(
+                    "[nix] conflict detected but could not resolve profile entries to remove."
+                )
             return False

         return False

View File

@@ -75,7 +75,9 @@ class NixFlakeInstaller(BaseInstaller):
     # Core install path
     # ---------------------------------------------------------------------
-    def _install_only(self, ctx: "RepoContext", output: str, allow_failure: bool) -> None:
+    def _install_only(
+        self, ctx: "RepoContext", output: str, allow_failure: bool
+    ) -> None:
         install_cmd = f"nix profile install {self._installable(ctx, output)}"

         if not ctx.quiet:
@@ -96,7 +98,9 @@ class NixFlakeInstaller(BaseInstaller):
                 output=output,
             ):
                 if not ctx.quiet:
-                    print(f"[nix] output '{output}' successfully installed after conflict cleanup.")
+                    print(
+                        f"[nix] output '{output}' successfully installed after conflict cleanup."
+                    )
                 return

             if not ctx.quiet:
@@ -107,20 +111,26 @@ class NixFlakeInstaller(BaseInstaller):
         # If indices are supported, try legacy index-upgrade path.
         if self._indices_supported is not False:
-            indices = self._profile.find_installed_indices_for_output(ctx, self._runner, output)
+            indices = self._profile.find_installed_indices_for_output(
+                ctx, self._runner, output
+            )
             upgraded = False
             for idx in indices:
                 if self._upgrade_index(ctx, idx):
                     upgraded = True
                     if not ctx.quiet:
-                        print(f"[nix] output '{output}' successfully upgraded (index {idx}).")
+                        print(
+                            f"[nix] output '{output}' successfully upgraded (index {idx})."
+                        )

             if upgraded:
                 return

             if indices and not ctx.quiet:
-                print(f"[nix] upgrade failed; removing indices {indices} and reinstalling '{output}'.")
+                print(
+                    f"[nix] upgrade failed; removing indices {indices} and reinstalling '{output}'."
+                )

             for idx in indices:
                 self._remove_index(ctx, idx)
@@ -139,7 +149,9 @@ class NixFlakeInstaller(BaseInstaller):
                 print(f"[nix] output '{output}' successfully re-installed.")
             return

-        print(f"[ERROR] Failed to install Nix flake output '{output}' (exit {final.returncode})")
+        print(
+            f"[ERROR] Failed to install Nix flake output '{output}' (exit {final.returncode})"
+        )
         if not allow_failure:
             raise SystemExit(final.returncode)
@@ -149,7 +161,9 @@ class NixFlakeInstaller(BaseInstaller):
     # force_update path
     # ---------------------------------------------------------------------
-    def _force_upgrade_output(self, ctx: "RepoContext", output: str, allow_failure: bool) -> None:
+    def _force_upgrade_output(
+        self, ctx: "RepoContext", output: str, allow_failure: bool
+    ) -> None:
         # Prefer token path if indices unsupported (new nix)
         if self._indices_supported is False:
             self._remove_tokens_for_output(ctx, output)
@@ -158,14 +172,18 @@ class NixFlakeInstaller(BaseInstaller):
                     print(f"[nix] output '{output}' successfully upgraded.")
             return

-        indices = self._profile.find_installed_indices_for_output(ctx, self._runner, output)
+        indices = self._profile.find_installed_indices_for_output(
+            ctx, self._runner, output
+        )
         upgraded_any = False
         for idx in indices:
             if self._upgrade_index(ctx, idx):
                 upgraded_any = True
                 if not ctx.quiet:
-                    print(f"[nix] output '{output}' successfully upgraded (index {idx}).")
+                    print(
+                        f"[nix] output '{output}' successfully upgraded (index {idx})."
+                    )

         if upgraded_any:
             if not ctx.quiet:
@@ -173,7 +191,9 @@ class NixFlakeInstaller(BaseInstaller):
             return

         if indices and not ctx.quiet:
-            print(f"[nix] upgrade failed; removing indices {indices} and reinstalling '{output}'.")
+            print(
+                f"[nix] upgrade failed; removing indices {indices} and reinstalling '{output}'."
+            )

         for idx in indices:
             self._remove_index(ctx, idx)
@@ -223,7 +243,9 @@ class NixFlakeInstaller(BaseInstaller):
             return

         if not ctx.quiet:
-            print(f"[nix] indices unsupported; removing by token(s): {', '.join(tokens)}")
+            print(
+                f"[nix] indices unsupported; removing by token(s): {', '.join(tokens)}"
+            )

         for t in tokens:
             self._runner.run(ctx, f"nix profile remove {t}", allow_failure=True)

View File

@@ -101,7 +101,9 @@ class NixProfileInspector:
         data = self.list_json(ctx, runner)
         entries = normalize_elements(data)

-        tokens: List[str] = [out]  # critical: matches nix's own suggestion for conflicts
+        tokens: List[str] = [
+            out
+        ]  # critical: matches nix's own suggestion for conflicts

         for e in entries:
             if entry_matches_output(e, out):

View File

@@ -48,7 +48,9 @@ class NixProfileListReader:
         return uniq

-    def indices_matching_store_prefixes(self, ctx: "RepoContext", prefixes: List[str]) -> List[int]:
+    def indices_matching_store_prefixes(
+        self, ctx: "RepoContext", prefixes: List[str]
+    ) -> List[int]:
         prefixes = [self._store_prefix(p) for p in prefixes if p]
         prefixes = [p for p in prefixes if p]
         if not prefixes:

View File

@@ -11,6 +11,7 @@ if TYPE_CHECKING:
     from pkgmgr.actions.install.context import RepoContext
     from .runner import CommandRunner

+
 @dataclass(frozen=True)
 class RetryPolicy:
     max_attempts: int = 7
@@ -35,13 +36,19 @@ class GitHubRateLimitRetry:
         install_cmd: str,
     ) -> RunResult:
         quiet = bool(getattr(ctx, "quiet", False))
-        delays = list(self._fibonacci_backoff(self._policy.base_delay_seconds, self._policy.max_attempts))
+        delays = list(
+            self._fibonacci_backoff(
+                self._policy.base_delay_seconds, self._policy.max_attempts
+            )
+        )
         last: RunResult | None = None

         for attempt, base_delay in enumerate(delays, start=1):
             if not quiet:
-                print(f"[nix] attempt {attempt}/{self._policy.max_attempts}: {install_cmd}")
+                print(
+                    f"[nix] attempt {attempt}/{self._policy.max_attempts}: {install_cmd}"
+                )

             res = runner.run(ctx, install_cmd, allow_failure=True)
             last = res
@@ -56,7 +63,9 @@ class GitHubRateLimitRetry:
             if attempt >= self._policy.max_attempts:
                 break

-            jitter = random.randint(self._policy.jitter_seconds_min, self._policy.jitter_seconds_max)
+            jitter = random.randint(
+                self._policy.jitter_seconds_min, self._policy.jitter_seconds_max
+            )
             wait_time = base_delay + jitter

             if not quiet:
@@ -67,7 +76,11 @@ class GitHubRateLimitRetry:
             time.sleep(wait_time)

-        return last if last is not None else RunResult(returncode=1, stdout="", stderr="nix install retry failed")
+        return (
+            last
+            if last is not None
+            else RunResult(returncode=1, stdout="", stderr="nix install retry failed")
+        )

     @staticmethod
     def _is_github_rate_limit_error(text: str) -> bool:
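The retry loop above draws its delays from a Fibonacci backoff plus random jitter. A minimal sketch of what such a backoff generator could look like (the real `_fibonacci_backoff` in pkgmgr is not shown in this diff, so the implementation below is an assumption; only the call shape `(base_delay, attempts)` comes from the hunk):

```python
import random
from typing import Iterator


def fibonacci_backoff(base_delay: int, attempts: int) -> Iterator[int]:
    """Yield base_delay * fib(n) for n = 1..attempts (fib: 1, 1, 2, 3, 5, ...)."""
    a, b = 1, 1
    for _ in range(attempts):
        yield base_delay * a
        a, b = b, a + b


# Matches RetryPolicy defaults in spirit: 7 attempts, a small base delay.
delays = list(fibonacci_backoff(base_delay=5, attempts=7))
print(delays)  # [5, 5, 10, 15, 25, 40, 65]

# Per-attempt jitter keeps concurrent clients from retrying in lockstep.
wait_time = delays[0] + random.randint(1, 5)
```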

View File

@@ -9,6 +9,7 @@ from .types import RunResult
 if TYPE_CHECKING:
     from pkgmgr.actions.install.context import RepoContext

+
 class CommandRunner:
     """
     Executes commands (shell=True) inside a repository directory (if provided).
@@ -40,7 +41,9 @@ class CommandRunner:
                 raise
             return RunResult(returncode=1, stdout="", stderr=str(e))

-        res = RunResult(returncode=p.returncode, stdout=p.stdout or "", stderr=p.stderr or "")
+        res = RunResult(
+            returncode=p.returncode, stdout=p.stdout or "", stderr=p.stderr or ""
+        )
         if res.returncode != 0 and not quiet:
             self._print_compact_failure(res)

View File

@@ -20,7 +20,9 @@ class NixConflictTextParser:
         tokens: List[str] = []
         for m in pat.finditer(text or ""):
             t = (m.group(1) or "").strip()
-            if (t.startswith("'") and t.endswith("'")) or (t.startswith('"') and t.endswith('"')):
+            if (t.startswith("'") and t.endswith("'")) or (
+                t.startswith('"') and t.endswith('"')
+            ):
                 t = t[1:-1]
             if t:
                 tokens.append(t)
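The reflowed condition above strips one pair of matching quotes from a token parsed out of nix's conflict message. Isolated as a helper (a sketch for illustration; `strip_matching_quotes` is not a pkgmgr function):

```python
def strip_matching_quotes(token: str) -> str:
    """Remove one pair of matching single or double quotes, if present."""
    t = token.strip()
    if (t.startswith("'") and t.endswith("'")) or (
        t.startswith('"') and t.endswith('"')
    ):
        t = t[1:-1]
    return t


print(strip_matching_quotes("'pkgmgr'"))  # pkgmgr
print(strip_matching_quotes('"pkgmgr"'))  # pkgmgr
print(strip_matching_quotes("pkgmgr"))    # pkgmgr (unchanged)
```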

View File

@@ -14,7 +14,9 @@ class PythonInstaller(BaseInstaller):
     def supports(self, ctx: RepoContext) -> bool:
         if os.environ.get("PKGMGR_DISABLE_PYTHON_INSTALLER") == "1":
-            print("[INFO] PythonInstaller disabled via PKGMGR_DISABLE_PYTHON_INSTALLER.")
+            print(
+                "[INFO] PythonInstaller disabled via PKGMGR_DISABLE_PYTHON_INSTALLER."
+            )
             return False
         return os.path.exists(os.path.join(ctx.repo_dir, "pyproject.toml"))

View File

@@ -132,7 +132,11 @@ class InstallationPipeline:
                 continue

             if not quiet:
-                if ctx.force_update and state.layer is not None and installer_layer == state.layer:
+                if (
+                    ctx.force_update
+                    and state.layer is not None
+                    and installer_layer == state.layer
+                ):
                     print(
                         f"[pkgmgr] Running installer {installer.__class__.__name__} "
                         f"for {identifier} in '{repo_dir}' (upgrade requested)..."

View File

@@ -1,20 +1,50 @@
 from __future__ import annotations

 import os
+from typing import Optional, Set

-from pkgmgr.core.command.run import run_command
-from pkgmgr.core.git import GitError, run_git
-from typing import List, Optional, Set
+from pkgmgr.core.git.errors import GitRunError
+from pkgmgr.core.git.commands import (
+    GitAddRemoteError,
+    GitAddRemotePushUrlError,
+    GitSetRemoteUrlError,
+    add_remote,
+    add_remote_push_url,
+    set_remote_url,
+)
+from pkgmgr.core.git.queries import get_remote_push_urls, list_remotes

 from .types import MirrorMap, RepoMirrorContext, Repository


-def build_default_ssh_url(repo: Repository) -> Optional[str]:
+def _is_git_remote_url(url: str) -> bool:
     """
-    Build a simple SSH URL from repo config if no explicit mirror is defined.
-
-    Example: git@github.com:account/repository.git
+    True only for URLs that should become git remotes / push URLs.
+
+    Accepted:
+      - git@host:owner/repo(.git)                (SCP-like SSH)
+      - ssh://git@host(:port)/owner/repo(.git)   (SSH URL)
+      - https://host/owner/repo.git              (HTTPS git remote)
+      - http://host/owner/repo.git               (rare, but possible)
+
+    Everything else (e.g. PyPI project page) stays metadata only.
     """
+    u = (url or "").strip()
+    if not u:
+        return False
+    if u.startswith("git@"):
+        return True
+    if u.startswith("ssh://"):
+        return True
+    if (u.startswith("https://") or u.startswith("http://")) and u.endswith(".git"):
+        return True
+    return False
+
+
+def build_default_ssh_url(repo: Repository) -> Optional[str]:
     provider = repo.get("provider")
     account = repo.get("account")
     name = repo.get("repository")
@@ -23,96 +53,80 @@ def build_default_ssh_url(repo: Repository) -> Optional[str]:
     if not provider or not account or not name:
         return None

+    provider = str(provider)
+    account = str(account)
+    name = str(name)
+
     if port:
         return f"ssh://git@{provider}:{port}/{account}/{name}.git"

-    # GitHub-style shorthand
     return f"git@{provider}:{account}/{name}.git"


+def _git_mirrors_only(m: MirrorMap) -> MirrorMap:
+    return {k: v for k, v in m.items() if v and _is_git_remote_url(v)}
+
+
 def determine_primary_remote_url(
     repo: Repository,
-    resolved_mirrors: MirrorMap,
+    ctx: RepoMirrorContext,
 ) -> Optional[str]:
     """
-    Determine the primary remote URL in a consistent way:
+    Priority order (GIT URLS ONLY):

-    1. resolved_mirrors["origin"]
-    2. any resolved mirror (first by name)
-    3. default SSH URL from provider/account/repository
+    1. origin from resolved mirrors (if it is a git URL)
+    2. first git URL from MIRRORS file (in file order)
+    3. first git URL from config mirrors (in config order)
+    4. default SSH URL
     """
-    if "origin" in resolved_mirrors:
-        return resolved_mirrors["origin"]
+    resolved = ctx.resolved_mirrors
+    origin = resolved.get("origin")
+    if origin and _is_git_remote_url(origin):
+        return origin

-    if resolved_mirrors:
-        first_name = sorted(resolved_mirrors.keys())[0]
-        return resolved_mirrors[first_name]
+    for mirrors in (ctx.file_mirrors, ctx.config_mirrors):
+        for _, url in mirrors.items():
+            if url and _is_git_remote_url(url):
+                return url

     return build_default_ssh_url(repo)


-def _safe_git_output(args: List[str], cwd: str) -> Optional[str]:
-    """
-    Run a Git command via run_git and return its stdout, or None on failure.
-    """
-    try:
-        return run_git(args, cwd=cwd)
-    except GitError:
-        return None
-
-
-def current_origin_url(repo_dir: str) -> Optional[str]:
-    """
-    Return the current URL for remote 'origin', or None if not present.
-    """
-    output = _safe_git_output(["remote", "get-url", "origin"], cwd=repo_dir)
-    if not output:
-        return None
-    url = output.strip()
-    return url or None
-
-
 def has_origin_remote(repo_dir: str) -> bool:
-    """
-    Check whether a remote called 'origin' exists in the repository.
-    """
-    output = _safe_git_output(["remote"], cwd=repo_dir)
-    if not output:
+    try:
+        return "origin" in list_remotes(cwd=repo_dir)
+    except GitRunError:
         return False
-    names = output.split()
-    return "origin" in names


-def _ensure_push_urls_for_origin(
+def _set_origin_fetch_and_push(repo_dir: str, url: str, preview: bool) -> None:
+    """
+    Ensure origin has fetch URL and push URL set to the primary URL.
+    Preview is handled by the underlying git runner.
+    """
+    set_remote_url("origin", url, cwd=repo_dir, push=False, preview=preview)
+    set_remote_url("origin", url, cwd=repo_dir, push=True, preview=preview)
+
+
+def _ensure_additional_push_urls(
     repo_dir: str,
     mirrors: MirrorMap,
+    primary: str,
     preview: bool,
 ) -> None:
     """
-    Ensure that all mirror URLs are present as push URLs on 'origin'.
+    Ensure all *git* mirror URLs (except primary) are configured as additional
+    push URLs for origin.
+
+    Non-git URLs (like PyPI) are ignored and will never land in git config.
     """
-    desired: Set[str] = {url for url in mirrors.values() if url}
+    git_only = _git_mirrors_only(mirrors)
+    desired: Set[str] = {u for u in git_only.values() if u and u != primary}
     if not desired:
         return

-    existing_output = _safe_git_output(
-        ["remote", "get-url", "--push", "--all", "origin"],
-        cwd=repo_dir,
-    )
-    existing = set(existing_output.splitlines()) if existing_output else set()
+    try:
+        existing = get_remote_push_urls("origin", cwd=repo_dir)
+    except GitRunError:
+        existing = set()

-    missing = sorted(desired - existing)
-    for url in missing:
-        cmd = f"git remote set-url --add --push origin {url}"
-        if preview:
-            print(f"[PREVIEW] Would run in {repo_dir!r}: {cmd}")
-        else:
-            print(f"[INFO] Adding push URL to 'origin': {url}")
-            run_command(cmd, cwd=repo_dir, preview=False)
+    for url in sorted(desired - existing):
+        add_remote_push_url("origin", url, cwd=repo_dir, preview=preview)


 def ensure_origin_remote(
@@ -120,60 +134,33 @@ def ensure_origin_remote(
     ctx: RepoMirrorContext,
     preview: bool,
 ) -> None:
-    """
-    Ensure that a usable 'origin' remote exists and has all push URLs.
-    """
     repo_dir = ctx.repo_dir
-    resolved_mirrors = ctx.resolved_mirrors

     if not os.path.isdir(os.path.join(repo_dir, ".git")):
-        print(f"[WARN] {repo_dir} is not a Git repository (no .git directory).")
+        print(f"[WARN] {repo_dir} is not a Git repository.")
         return

-    url = determine_primary_remote_url(repo, resolved_mirrors)
+    primary = determine_primary_remote_url(repo, ctx)
+    if not primary or not _is_git_remote_url(primary):
+        print("[WARN] No valid git primary mirror URL could be determined.")
+        return

+    # 1) Ensure origin exists
     if not has_origin_remote(repo_dir):
-        if not url:
-            print(
-                "[WARN] Could not determine URL for 'origin' remote. "
-                "Please configure mirrors or provider/account/repository."
-            )
-            return
-
-        cmd = f"git remote add origin {url}"
-        if preview:
-            print(f"[PREVIEW] Would run in {repo_dir!r}: {cmd}")
-        else:
-            print(f"[INFO] Adding 'origin' remote in {repo_dir}: {url}")
-            run_command(cmd, cwd=repo_dir, preview=False)
-    else:
-        current = current_origin_url(repo_dir)
-        if current == url or not url:
-            print(
-                "[INFO] 'origin' already points to "
-                f"{current or '<unknown>'} (no change needed)."
-            )
-        else:
-            # We do not auto-change origin here, only log the mismatch.
-            print(
-                "[INFO] 'origin' exists with URL "
-                f"{current or '<unknown>'}; not changing to {url}."
-            )
-
-    # Ensure all mirrors are present as push URLs
-    _ensure_push_urls_for_origin(repo_dir, resolved_mirrors, preview)
-
-
-def is_remote_reachable(url: str, cwd: Optional[str] = None) -> bool:
-    """
-    Check whether a remote repository is reachable via `git ls-remote`.
-
-    This does NOT modify anything; it only probes the remote.
-    """
-    workdir = cwd or os.getcwd()
-    try:
-        # --exit-code → non-zero exit code if the remote does not exist
-        run_git(["ls-remote", "--exit-code", url], cwd=workdir)
-        return True
-    except GitError:
-        return False
+        try:
+            add_remote("origin", primary, cwd=repo_dir, preview=preview)
+        except GitAddRemoteError as exc:
+            print(f"[WARN] Failed to add origin remote: {exc}")
+            return  # without origin we cannot reliably proceed
+
+    # 2) Ensure origin fetch+push URLs are correct
+    try:
+        _set_origin_fetch_and_push(repo_dir, primary, preview)
+    except GitSetRemoteUrlError as exc:
+        print(f"[WARN] Failed to set origin URLs: {exc}")

+    # 3) Ensure additional push URLs for mirrors (git urls only)
+    try:
+        _ensure_additional_push_urls(repo_dir, ctx.resolved_mirrors, primary, preview)
+    except GitAddRemotePushUrlError as exc:
+        print(f"[WARN] Failed to add additional push URLs: {exc}")
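The `_is_git_remote_url` filter introduced in this file is shown in full above, so it can be exercised standalone to see which mirror entries would become git remotes. The mirror map below is a made-up example (account, host, and port are illustrative only):

```python
def is_git_remote_url(url: str) -> bool:
    """Same semantics as _is_git_remote_url in git_remote.py (copied from the diff)."""
    u = (url or "").strip()
    if not u:
        return False
    if u.startswith("git@"):
        return True
    if u.startswith("ssh://"):
        return True
    if (u.startswith("https://") or u.startswith("http://")) and u.endswith(".git"):
        return True
    return False


mirrors = {
    "origin": "git@github.com:account/repository.git",     # SCP-like SSH: git remote
    "gitea": "ssh://git@example.org:2222/account/repository.git",  # SSH URL: git remote
    "pypi": "https://pypi.org/project/package-manager/",   # project page: metadata only
}
git_only = {k: v for k, v in mirrors.items() if is_git_remote_url(v)}
print(sorted(git_only))  # ['gitea', 'origin']
```

The PyPI entry survives in the MIRRORS metadata but is never turned into a push URL, which is exactly the behaviour the new `_ensure_additional_push_urls` guarantees.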

View File

@@ -1,8 +1,9 @@
 from __future__ import annotations

 import os
+from collections.abc import Iterable, Mapping
+from typing import Union
 from urllib.parse import urlparse
-from typing import Mapping

 from .types import MirrorMap, Repository
@@ -32,7 +33,7 @@ def read_mirrors_file(repo_dir: str, filename: str = "MIRRORS") -> MirrorMap:
     """
     Supports:
         NAME URL
-        URL            auto name = hostname
+        URL            -> auto-generate name from hostname
     """
     path = os.path.join(repo_dir, filename)
     mirrors: MirrorMap = {}
@@ -52,7 +53,8 @@ def read_mirrors_file(repo_dir: str, filename: str = "MIRRORS") -> MirrorMap:
         # Case 1: "name url"
         if len(parts) == 2:
             name, url = parts
-        # Case 2: "url" → auto-generate name
+
+        # Case 2: "url" -> auto name
         elif len(parts) == 1:
             url = parts[0]
             parsed = urlparse(url)
@@ -67,21 +69,56 @@ def read_mirrors_file(repo_dir: str, filename: str = "MIRRORS") -> MirrorMap:
             continue

         mirrors[name] = url

     except OSError as exc:
         print(f"[WARN] Could not read MIRRORS file at {path}: {exc}")

     return mirrors


+MirrorsInput = Union[Mapping[str, str], Iterable[str]]
+
+
 def write_mirrors_file(
     repo_dir: str,
-    mirrors: Mapping[str, str],
+    mirrors: MirrorsInput,
     filename: str = "MIRRORS",
     preview: bool = False,
 ) -> None:
+    """
+    Write MIRRORS in one of two formats:
+
+    1) Mapping[str, str] -> "NAME URL" per line (legacy / compatible)
+    2) Iterable[str]     -> "URL" per line (new preferred)
+
+    Strings are treated as a single URL (not iterated character-by-character).
+    """
     path = os.path.join(repo_dir, filename)
-    lines = [f"{name} {url}" for name, url in sorted(mirrors.items())]
+
+    lines: list[str]
+    if isinstance(mirrors, Mapping):
+        items = [
+            (str(name), str(url))
+            for name, url in mirrors.items()
+            if url is not None and str(url).strip()
+        ]
+        items.sort(key=lambda x: (x[0], x[1]))
+        lines = [f"{name} {url}" for name, url in items]
+    else:
+        if isinstance(mirrors, (str, bytes)):
+            urls = [str(mirrors).strip()]
+        else:
+            urls = [
+                str(url).strip()
+                for url in mirrors
+                if url is not None and str(url).strip()
+            ]
+        urls = sorted(set(urls))
+        lines = urls
+
     content = "\n".join(lines) + ("\n" if lines else "")

     if preview:
@@ -94,5 +131,6 @@ def write_mirrors_file(
         with open(path, "w", encoding="utf-8") as fh:
             fh.write(content)
+
         print(f"[INFO] Wrote MIRRORS file at {path}")
     except OSError as exc:
         print(f"[ERROR] Failed to write MIRRORS file at {path}: {exc}")
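The two input shapes that `write_mirrors_file` now accepts can be demonstrated with the same normalisation logic, pulled out as a standalone sketch that only formats lines and never touches the filesystem (`format_mirrors` is a hypothetical helper, not part of pkgmgr):

```python
from collections.abc import Iterable, Mapping
from typing import Union

MirrorsInput = Union[Mapping[str, str], Iterable[str]]


def format_mirrors(mirrors: MirrorsInput) -> list[str]:
    """Produce MIRRORS lines: 'NAME URL' for mappings, bare 'URL' for iterables."""
    if isinstance(mirrors, Mapping):
        items = [
            (str(name), str(url))
            for name, url in mirrors.items()
            if url is not None and str(url).strip()
        ]
        items.sort(key=lambda x: (x[0], x[1]))
        return [f"{name} {url}" for name, url in items]
    if isinstance(mirrors, (str, bytes)):
        # A plain string is one URL, not an iterable of characters.
        return [str(mirrors).strip()]
    urls = {str(u).strip() for u in mirrors if u is not None and str(u).strip()}
    return sorted(urls)


print(format_mirrors({"origin": "git@github.com:account/repository.git"}))
print(format_mirrors(["https://b.example/r.git", "https://a.example/r.git"]))
print(format_mirrors("https://a.example/r.git"))  # single URL, one line
```

The explicit `isinstance(mirrors, (str, bytes))` check is the design point: without it, a bare URL string would fall through to the iterable branch and be split into one line per character.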

View File

@@ -16,6 +16,7 @@ from .types import MirrorMap, Repository
 # Helpers
 # -----------------------------------------------------------------------------

+
 def _repo_key(repo: Repository) -> Tuple[str, str, str]:
     """
     Normalised key for identifying a repository in config files.
@@ -47,6 +48,7 @@ def _load_user_config(path: str) -> Dict[str, object]:
 # Main merge command
 # -----------------------------------------------------------------------------

+
 def merge_mirrors(
     selected_repos: List[Repository],
     repositories_base_dir: str,

View File

@@ -1,21 +0,0 @@
-# src/pkgmgr/actions/mirror/remote_check.py
-from __future__ import annotations
-
-from typing import Tuple
-
-from pkgmgr.core.git import GitError, run_git
-
-
-def probe_mirror(url: str, repo_dir: str) -> Tuple[bool, str]:
-    """
-    Probe a remote mirror URL using `git ls-remote`.
-
-    Returns:
-        (True, "") on success,
-        (False, error_message) on failure.
-    """
-    try:
-        run_git(["ls-remote", url], cwd=repo_dir)
-        return True, ""
-    except GitError as exc:
-        return False, str(exc)

View File

@@ -1,4 +1,3 @@
-# src/pkgmgr/actions/mirror/remote_provision.py
 from __future__ import annotations

 from typing import List
@@ -19,36 +18,28 @@ def ensure_remote_repository(
     preview: bool,
 ) -> None:
     ctx = build_context(repo, repositories_base_dir, all_repos)
-    resolved_mirrors = ctx.resolved_mirrors

-    primary_url = determine_primary_remote_url(repo, resolved_mirrors)
+    primary_url = determine_primary_remote_url(repo, ctx)
     if not primary_url:
-        print("[INFO] No remote URL could be derived; skipping remote provisioning.")
+        print("[INFO] No primary URL found; skipping remote provisioning.")
         return

-    host_raw, owner_from_url, name_from_url = parse_repo_from_git_url(primary_url)
+    host_raw, owner, name = parse_repo_from_git_url(primary_url)
     host = normalize_provider_host(host_raw)
-
-    if not host or not owner_from_url or not name_from_url:
-        print("[WARN] Could not derive host/owner/repository from URL; cannot ensure remote repo.")
-        print(f"       url={primary_url!r}")
-        print(f"       host={host!r}, owner={owner_from_url!r}, repository={name_from_url!r}")
+    if not host or not owner or not name:
+        print("[WARN] Could not parse remote URL:", primary_url)
         return

-    print("------------------------------------------------------------")
-    print(f"[REMOTE ENSURE] {ctx.identifier}")
-    print(f"[REMOTE ENSURE] host: {host}")
-    print("------------------------------------------------------------")
-
     spec = RepoSpec(
-        host=str(host),
-        owner=str(owner_from_url),
-        name=str(name_from_url),
+        host=host,
+        owner=owner,
+        name=name,
         private=bool(repo.get("private", True)),
         description=str(repo.get("description", "")),
     )

-    provider_kind = str(repo.get("provider", "")).strip().lower() or None
+    provider_kind = str(repo.get("provider", "")).lower() or None

     try:
         result = ensure_remote_repo(
@@ -66,5 +57,3 @@ def ensure_remote_repository(
         print(f"[REMOTE ENSURE] URL: {result.url}")
     except Exception as exc:  # noqa: BLE001
         print(f"[ERROR] Remote provisioning failed: {exc}")
-
-    print()

View File

@@ -1,14 +1,30 @@
+# src/pkgmgr/actions/mirror/setup_cmd.py
 from __future__ import annotations

 from typing import List

+from pkgmgr.core.git.queries import probe_remote_reachable
+
 from .context import build_context
 from .git_remote import ensure_origin_remote, determine_primary_remote_url
-from .remote_check import probe_mirror
 from .remote_provision import ensure_remote_repository
 from .types import Repository


+def _is_git_remote_url(url: str) -> bool:
+    # Keep the same filtering semantics as in git_remote.py (duplicated on purpose
+    # to keep setup_cmd independent of private helpers).
+    u = (url or "").strip()
+    if not u:
+        return False
+    if u.startswith("git@"):
+        return True
+    if u.startswith("ssh://"):
+        return True
+    if (u.startswith("https://") or u.startswith("http://")) and u.endswith(".git"):
+        return True
+    return False
+
+
 def _setup_local_mirrors_for_repo(
     repo: Repository,
     repositories_base_dir: str,
@@ -22,7 +38,7 @@ def _setup_local_mirrors_for_repo(
     print(f"[MIRROR SETUP:LOCAL] dir: {ctx.repo_dir}")
     print("------------------------------------------------------------")

-    ensure_origin_remote(repo, ctx, preview=preview)
+    ensure_origin_remote(repo, ctx, preview)
     print()

@@ -34,7 +50,6 @@ def _setup_remote_mirrors_for_repo(
     ensure_remote: bool,
 ) -> None:
     ctx = build_context(repo, repositories_base_dir, all_repos)
-    resolved_mirrors = ctx.resolved_mirrors

     print("------------------------------------------------------------")
     print(f"[MIRROR SETUP:REMOTE] {ctx.identifier}")
@@ -44,37 +59,32 @@ def _setup_remote_mirrors_for_repo(
     if ensure_remote:
         ensure_remote_repository(
             repo,
-            repositories_base_dir=repositories_base_dir,
-            all_repos=all_repos,
-            preview=preview,
+            repositories_base_dir,
+            all_repos,
+            preview,
         )

-    if not resolved_mirrors:
-        primary_url = determine_primary_remote_url(repo, resolved_mirrors)
-        if not primary_url:
-            print("[INFO] No mirrors configured and no primary URL available.")
+    # Probe only git URLs (do not try ls-remote against PyPI etc.)
+    # If there are no mirrors at all, probe the primary git URL.
+    git_mirrors = {
+        k: v for k, v in ctx.resolved_mirrors.items() if _is_git_remote_url(v)
+    }
+    if not git_mirrors:
+        primary = determine_primary_remote_url(repo, ctx)
+        if not primary or not _is_git_remote_url(primary):
+            print("[INFO] No git mirrors to probe.")
             print()
             return
-        ok, error_message = probe_mirror(primary_url, ctx.repo_dir)
-        if ok:
-            print(f"[OK] primary: {primary_url}")
-        else:
-            print(f"[WARN] primary: {primary_url}")
-            for line in error_message.splitlines():
-                print(f"    {line}")
+        ok = probe_remote_reachable(primary, cwd=ctx.repo_dir)
+        print("[OK]" if ok else "[WARN]", primary)
         print()
         return

-    for name, url in sorted(resolved_mirrors.items()):
-        ok, error_message = probe_mirror(url, ctx.repo_dir)
-        if ok:
-            print(f"[OK] {name}: {url}")
-        else:
-            print(f"[WARN] {name}: {url}")
-            for line in error_message.splitlines():
-                print(f"    {line}")
+    for name, url in git_mirrors.items():
+        ok = probe_remote_reachable(url, cwd=ctx.repo_dir)
+        print(f"[OK] {name}: {url}" if ok else f"[WARN] {name}: {url}")
     print()
@@ -91,17 +101,17 @@ def setup_mirrors(
     for repo in selected_repos:
         if local:
             _setup_local_mirrors_for_repo(
-                repo=repo,
-                repositories_base_dir=repositories_base_dir,
-                all_repos=all_repos,
-                preview=preview,
+                repo,
+                repositories_base_dir,
+                all_repos,
+                preview,
             )
         if remote:
             _setup_remote_mirrors_for_repo(
-                repo=repo,
-                repositories_base_dir=repositories_base_dir,
-                all_repos=all_repos,
-                preview=preview,
-                ensure_remote=ensure_remote,
+                repo,
+                repositories_base_dir,
+                all_repos,
+                preview,
+                ensure_remote,
             )

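Editor's note: as a standalone illustration of the git-URL filter introduced above. The helper name and sample URLs below are ours, not from the repository; the behavior mirrors `_is_git_remote_url` in the diff.

```python
def is_git_remote_url(url: str) -> bool:
    """Classify a mirror URL as a probe-able git remote (vs. e.g. a PyPI URL)."""
    u = (url or "").strip()
    if not u:
        return False
    # SSH-style remotes are always git remotes.
    if u.startswith(("git@", "ssh://")):
        return True
    # HTTP(S) remotes only count when they end in ".git".
    return u.startswith(("https://", "http://")) and u.endswith(".git")


print(is_git_remote_url("git@github.com:owner/repo.git"))     # True
print(is_git_remote_url("https://pypi.org/project/pkgmgr/"))  # False
```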
View File

@@ -17,7 +17,7 @@ def hostport_from_git_url(url: str) -> Tuple[str, Optional[str]]:
         netloc = netloc.split("@", 1)[1]
     if netloc.startswith("[") and "]" in netloc:
-        host = netloc[1:netloc.index("]")]
+        host = netloc[1 : netloc.index("]")]
         rest = netloc[netloc.index("]") + 1 :]
         port = rest[1:] if rest.startswith(":") else None
         return host.strip(), (port.strip() if port else None)
@@ -43,7 +43,7 @@ def normalize_provider_host(host: str) -> str:
         return ""
     if host.startswith("[") and "]" in host:
-        host = host[1:host.index("]")]
+        host = host[1 : host.index("]")]
     if ":" in host and host.count(":") == 1:
         host = host.rsplit(":", 1)[0]

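Editor's note: the bracketed-IPv6 slicing above is easy to misread, so here is the same slice logic run on a sample netloc (the address is illustrative):

```python
# A bracketed IPv6 netloc with an explicit port, as in ssh://git@[...]:2222/...
netloc = "[2001:db8::1]:2222"

# Strip the brackets to get the bare host, then look at what follows "]".
host = netloc[1 : netloc.index("]")]
rest = netloc[netloc.index("]") + 1 :]
port = rest[1:] if rest.startswith(":") else None

print(host, port)  # 2001:db8::1 2222
```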
View File

@@ -4,7 +4,16 @@ from pkgmgr.core.repository.dir import get_repo_dir
 from pkgmgr.core.command.run import run_command
 import sys

-def exec_proxy_command(proxy_prefix: str, selected_repos, repositories_base_dir, all_repos, proxy_command: str, extra_args, preview: bool):
+
+def exec_proxy_command(
+    proxy_prefix: str,
+    selected_repos,
+    repositories_base_dir,
+    all_repos,
+    proxy_command: str,
+    extra_args,
+    preview: bool,
+):
     """Execute a given proxy command with extra arguments for each repository."""
     error_repos = []
     max_exit_code = 0
@@ -22,7 +31,9 @@ def exec_proxy_command(proxy_prefix: str, selected_repos, repositories_base_dir,
         try:
             run_command(full_cmd, cwd=repo_dir, preview=preview)
         except SystemExit as e:
-            print(f"[ERROR] Command failed in {repo_identifier} with exit code {e.code}.")
+            print(
+                f"[ERROR] Command failed in {repo_identifier} with exit code {e.code}."
+            )
             error_repos.append((repo_identifier, e.code))
             max_exit_code = max(max_exit_code, e.code)

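Editor's note: the exit-code aggregation pattern in `exec_proxy_command` (collect failures, report the worst exit code) in isolation; the repository identifiers and codes below are made up:

```python
error_repos = []
max_exit_code = 0

# Simulated (identifier, exit code) results from running a command per repo.
for ident, code in [("a/repo1", 0), ("a/repo2", 2), ("a/repo3", 1)]:
    if code:
        # Remember which repos failed and keep the highest exit code seen,
        # so the overall process can exit with the worst failure.
        error_repos.append((ident, code))
        max_exit_code = max(max_exit_code, code)

print(max_exit_code)  # 2
```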
View File

@@ -1,17 +1,10 @@
 from __future__ import annotations

-from pkgmgr.core.git import run_git
+from pkgmgr.core.git.queries import get_tags_at_ref
 from pkgmgr.core.version.semver import SemVer, is_semver_tag


 def head_semver_tags(cwd: str = ".") -> list[str]:
-    out = run_git(["tag", "--points-at", "HEAD"], cwd=cwd)
-    if not out:
-        return []
-    tags = [t.strip() for t in out.splitlines() if t.strip()]
+    tags = get_tags_at_ref("HEAD", cwd=cwd)
     tags = [t for t in tags if is_semver_tag(t) and t.startswith("v")]
-    if not tags:
-        return []
     return sorted(tags, key=SemVer.parse)

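Editor's note: the `sorted(tags, key=SemVer.parse)` call above sorts numerically rather than lexically. A minimal sketch of such a key function (`semver_key` is our name; we assume `SemVer.parse` orders plain `vX.Y.Z` tags the same way):

```python
def semver_key(tag: str) -> tuple[int, ...]:
    # "v1.10.0" -> (1, 10, 0); numeric tuples compare element-wise.
    return tuple(int(p) for p in tag.lstrip("v").split("."))


tags = ["v1.10.0", "v1.2.3", "v1.9.9"]
# A plain string sort would put "v1.10.0" before "v1.2.3".
print(sorted(tags, key=semver_key))  # ['v1.2.3', 'v1.9.9', 'v1.10.0']
```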
View File

@@ -84,10 +84,13 @@ def publish(
         raise RuntimeError("No build artifacts found in dist/.")

     resolver = TokenResolver()
+    # Store PyPI token per OS user (keyring is already user-scoped).
+    # Do NOT scope by project name.
     token = resolver.get_token(
         provider_kind="pypi",
         host=target.host,
-        owner=target.project,
+        owner=None,
         options=ResolutionOptions(
             interactive=interactive,
             allow_prompt=allow_prompt,

View File

@@ -24,6 +24,8 @@ import tempfile
 from datetime import date, datetime
 from typing import Optional, Tuple

+from pkgmgr.core.git.queries import get_config_value
+
 # ---------------------------------------------------------------------------
 # Editor helper for interactive changelog messages
@@ -74,10 +76,7 @@ def _open_editor_for_changelog(initial_message: Optional[str] = None) -> str:
     except OSError:
         pass

-    lines = [
-        line for line in content.splitlines()
-        if not line.strip().startswith("#")
-    ]
+    lines = [line for line in content.splitlines() if not line.strip().startswith("#")]
     return "\n".join(lines).strip()
@@ -85,6 +84,7 @@ def _open_editor_for_changelog(initial_message: Optional[str] = None) -> str:
 # File update helpers (pyproject + extra packaging + changelog)
 # ---------------------------------------------------------------------------

+
 def update_pyproject_version(
     pyproject_path: str,
     new_version: str,
@@ -121,7 +121,7 @@ def update_pyproject_version(
     pattern = r'^(version\s*=\s*")([^"]+)(")'
     new_content, count = re.subn(
         pattern,
-        lambda m: f'{m.group(1)}{new_version}{m.group(3)}',
+        lambda m: f"{m.group(1)}{new_version}{m.group(3)}",
         content,
         flags=re.MULTILINE,
     )
@@ -162,7 +162,7 @@ def update_flake_version(
     pattern = r'(version\s*=\s*")([^"]+)(")'
     new_content, count = re.subn(
         pattern,
-        lambda m: f'{m.group(1)}{new_version}{m.group(3)}',
+        lambda m: f"{m.group(1)}{new_version}{m.group(3)}",
         content,
     )
@@ -365,24 +365,6 @@ def update_changelog(
 # ---------------------------------------------------------------------------

-def _get_git_config_value(key: str) -> Optional[str]:
-    """
-    Try to read a value from `git config --get <key>`.
-    """
-    try:
-        result = subprocess.run(
-            ["git", "config", "--get", key],
-            capture_output=True,
-            text=True,
-            check=False,
-        )
-    except Exception:
-        return None
-    value = result.stdout.strip()
-    return value or None
-
-
 def _get_debian_author() -> Tuple[str, str]:
     """
     Determine the maintainer name/email for debian/changelog entries.
@@ -396,9 +378,9 @@ def _get_debian_author() -> Tuple[str, str]:
     email = os.environ.get("GIT_AUTHOR_EMAIL")

     if not name:
-        name = _get_git_config_value("user.name")
+        name = get_config_value("user.name")
     if not email:
-        email = _get_git_config_value("user.email")
+        email = get_config_value("user.email")

     if not name:
         name = "Unknown Maintainer"

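Editor's note: the `re.subn` version bump above, run against a minimal pyproject-style snippet. The file content and target version here are illustrative:

```python
import re

content = 'name = "pkgmgr"\nversion = "1.8.5"\n'
pattern = r'^(version\s*=\s*")([^"]+)(")'

# MULTILINE makes ^ anchor at each line start, so only the version line matches;
# the lambda keeps the surrounding quotes and swaps in the new version.
new_content, count = re.subn(
    pattern,
    lambda m: f"{m.group(1)}1.8.6{m.group(3)}",
    content,
    flags=re.MULTILINE,
)

print(count)  # 1
print(new_content)
```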
View File

@@ -1,73 +1,92 @@
 from __future__ import annotations

-import subprocess
-
-from pkgmgr.core.git import GitError
-
-
-def run_git_command(cmd: str) -> None:
-    print(f"[GIT] {cmd}")
-    try:
-        subprocess.run(
-            cmd,
-            shell=True,
-            check=True,
-            text=True,
-            capture_output=True,
-        )
-    except subprocess.CalledProcessError as exc:
-        print(f"[ERROR] Git command failed: {cmd}")
-        print(f"        Exit code: {exc.returncode}")
-        if exc.stdout:
-            print("\n" + exc.stdout)
-        if exc.stderr:
-            print("\n" + exc.stderr)
-        raise GitError(f"Git command failed: {cmd}") from exc
-
-
-def _capture(cmd: str) -> str:
-    res = subprocess.run(cmd, shell=True, check=False, capture_output=True, text=True)
-    return (res.stdout or "").strip()
-
-
-def ensure_clean_and_synced(preview: bool = False) -> None:
+from pkgmgr.core.git.commands import (
+    fetch,
+    pull_ff_only,
+    push,
+    tag_force_annotated,
+)
+from pkgmgr.core.git.queries import get_upstream_ref, list_tags
+
+
+def ensure_clean_and_synced(*, preview: bool = False) -> None:
     """
     Always run a pull BEFORE modifying anything.
     Uses --ff-only to avoid creating merge commits automatically.
     If no upstream is configured, we skip.
     """
-    upstream = _capture("git rev-parse --abbrev-ref --symbolic-full-name @{u} 2>/dev/null")
+    upstream = get_upstream_ref()
     if not upstream:
         print("[INFO] No upstream configured for current branch. Skipping pull.")
         return

-    if preview:
-        print("[PREVIEW] Would run: git fetch origin --prune --tags --force")
-        print("[PREVIEW] Would run: git pull --ff-only")
-        return
-
     print("[INFO] Syncing with remote before making any changes...")
-    run_git_command("git fetch origin --prune --tags --force")
-    run_git_command("git pull --ff-only")
+    # Mirrors old behavior:
+    #   git fetch origin --prune --tags --force
+    #   git pull --ff-only
+    fetch(remote="origin", prune=True, tags=True, force=True, preview=preview)
+    pull_ff_only(preview=preview)
+
+
+def _parse_v_tag(tag: str) -> tuple[int, ...] | None:
+    """
+    Parse tags like 'v1.2.3' into (1, 2, 3).
+    Returns None if parsing is not possible.
+    """
+    if not tag.startswith("v"):
+        return None
+    raw = tag[1:]
+    if not raw:
+        return None
+    parts = raw.split(".")
+    out: list[int] = []
+    for p in parts:
+        if not p.isdigit():
+            return None
+        out.append(int(p))
+    return tuple(out) if out else None


 def is_highest_version_tag(tag: str) -> bool:
     """
     Return True if `tag` is the highest version among all tags matching v*.
-    Comparison uses `sort -V` for natural version ordering.
+
+    We avoid shelling out to `sort -V` and implement a small vX.Y.Z parser.
+    Non-parseable v* tags are ignored for version comparison.
     """
-    all_v = _capture("git tag --list 'v*'")
+    all_v = list_tags("v*")
     if not all_v:
-        return True  # No tags yet, so the current tag is the highest
+        return True  # No tags yet -> current is highest by definition

-    # Get the latest tag in natural version order
-    latest = _capture("git tag --list 'v*' | sort -V | tail -n1")
-    print(f"[INFO] Latest tag: {latest}, Current tag: {tag}")
+    parsed_current = _parse_v_tag(tag)
+    if parsed_current is None:
+        # If the "current" tag isn't parseable, fall back to conservative behavior:
+        # treat it as highest only if it matches the max lexicographically.
+        latest_lex = max(all_v)
+        print(f"[INFO] Latest tag (lex): {latest_lex}, Current tag: {tag}")
+        return tag >= latest_lex

-    # Ensure that the current tag is always considered the highest if it's the latest one
-    return tag >= latest  # Use comparison operator to consider all future tags
+    parsed_all: list[tuple[int, ...]] = []
+    for t in all_v:
+        parsed = _parse_v_tag(t)
+        if parsed is not None:
+            parsed_all.append(parsed)
+
+    if not parsed_all:
+        # No parseable tags -> nothing to compare against
+        return True
+
+    latest = max(parsed_all)
+    print(
+        f"[INFO] Latest tag (parsed): v{'.'.join(map(str, latest))}, Current tag: {tag}"
+    )
+    return parsed_current >= latest


-def update_latest_tag(new_tag: str, preview: bool = False) -> None:
+def update_latest_tag(new_tag: str, *, preview: bool = False) -> None:
     """
     Move the floating 'latest' tag to the newly created release tag.
@@ -76,17 +95,14 @@ def update_latest_tag(new_tag: str, preview: bool = False) -> None:
     - 'latest' is forced (floating tag), therefore the push uses --force.
     """
     target_ref = f"{new_tag}^{{}}"
-    print(f"[INFO] Updating 'latest' tag to point at {new_tag} (commit {target_ref})...")
-
-    if preview:
-        print(
-            f'[PREVIEW] Would run: git tag -f -a latest {target_ref} '
-            f'-m "Floating latest tag for {new_tag}"'
-        )
-        print("[PREVIEW] Would run: git push origin latest --force")
-        return
-
-    run_git_command(
-        f'git tag -f -a latest {target_ref} -m "Floating latest tag for {new_tag}"'
-    )
-    run_git_command("git push origin latest --force")
+    print(
+        f"[INFO] Updating 'latest' tag to point at {new_tag} (commit {target_ref})..."
+    )
+    tag_force_annotated(
+        name="latest",
+        target=target_ref,
+        message=f"Floating latest tag for {new_tag}",
+        preview=preview,
+    )
+    push("origin", "latest", force=True, preview=preview)

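Editor's note: the tuple comparison that replaces `sort -V` above relies on Python comparing integer tuples element-wise. A standalone check of the parsing and `max` logic (tag names below are examples):

```python
def parse_v_tag(tag: str):
    """Parse 'v1.2.3' into (1, 2, 3); return None for anything non-numeric."""
    if not tag.startswith("v") or len(tag) == 1:
        return None
    out = []
    for p in tag[1:].split("."):
        if not p.isdigit():
            return None
        out.append(int(p))
    return tuple(out)


tags = ["v1.9.0", "v1.10.0", "latest", "v2.0.0-rc1"]
# 'latest' has no v-prefix and 'v2.0.0-rc1' has a non-numeric part,
# so both are ignored, exactly as in is_highest_version_tag.
parsed = [t for t in (parse_v_tag(x) for x in tags) if t is not None]
print(max(parsed))  # (1, 10, 0)
```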
View File

@@ -7,7 +7,7 @@ Version discovery and bumping helpers for the release workflow.
 from __future__ import annotations

-from pkgmgr.core.git import get_tags
+from pkgmgr.core.git.queries import get_tags
 from pkgmgr.core.version.semver import (
     SemVer,
     find_latest_version,

View File

@@ -1,4 +1,3 @@
-# src/pkgmgr/actions/release/workflow.py
 from __future__ import annotations

 import os
@@ -6,7 +5,9 @@ import sys
 from typing import Optional

 from pkgmgr.actions.branch import close_branch
-from pkgmgr.core.git import get_current_branch, GitError
+from pkgmgr.core.git import GitRunError
+from pkgmgr.core.git.commands import add, commit, push, tag_annotated
+from pkgmgr.core.git.queries import get_current_branch
 from pkgmgr.core.repository.paths import resolve_repo_paths

 from .files import (
@@ -21,7 +22,6 @@ from .files import (
 from .git_ops import (
     ensure_clean_and_synced,
     is_highest_version_tag,
-    run_git_command,
     update_latest_tag,
 )
 from .prompts import confirm_proceed_release, should_delete_branch
@@ -40,7 +40,7 @@ def _release_impl(
     # Determine current branch early
     try:
         branch = get_current_branch() or "main"
-    except GitError:
+    except GitRunError:
         branch = "main"

     print(f"Releasing on branch: {branch}")
@@ -76,7 +76,9 @@ def _release_impl(
     if paths.arch_pkgbuild:
         update_pkgbuild_version(paths.arch_pkgbuild, new_ver_str, preview=preview)
     else:
-        print("[INFO] No PKGBUILD found (packaging/arch/PKGBUILD or PKGBUILD). Skipping.")
+        print(
+            "[INFO] No PKGBUILD found (packaging/arch/PKGBUILD or PKGBUILD). Skipping."
+        )

     if paths.rpm_spec:
         update_spec_version(paths.rpm_spec, new_ver_str, preview=preview)
@@ -123,45 +125,50 @@ def _release_impl(
         paths.rpm_spec,
         paths.debian_changelog,
     ]
-    existing_files = [p for p in files_to_add if isinstance(p, str) and p and os.path.exists(p)]
+    existing_files = [
+        p for p in files_to_add if isinstance(p, str) and p and os.path.exists(p)
+    ]

     if preview:
-        for path in existing_files:
-            print(f"[PREVIEW] Would run: git add {path}")
-        print(f'[PREVIEW] Would run: git commit -am "{commit_msg}"')
-        print(f'[PREVIEW] Would run: git tag -a {new_tag} -m "{tag_msg}"')
-        print(f"[PREVIEW] Would run: git push origin {branch}")
-        print(f"[PREVIEW] Would run: git push origin {new_tag}")
+        add(existing_files, preview=True)
+        commit(commit_msg, all=True, preview=True)
+        tag_annotated(new_tag, tag_msg, preview=True)
+        push("origin", branch, preview=True)
+        push("origin", new_tag, preview=True)
         if is_highest_version_tag(new_tag):
             update_latest_tag(new_tag, preview=True)
         else:
-            print(f"[PREVIEW] Skipping 'latest' update (tag {new_tag} is not the highest).")
+            print(
+                f"[PREVIEW] Skipping 'latest' update (tag {new_tag} is not the highest)."
+            )
         if close and branch not in ("main", "master"):
             if force:
                 print(f"[PREVIEW] Would delete branch {branch} (forced).")
             else:
-                print(f"[PREVIEW] Would ask whether to delete branch {branch} after release.")
+                print(
+                    f"[PREVIEW] Would ask whether to delete branch {branch} after release."
+                )
         return

-    for path in existing_files:
-        run_git_command(f"git add {path}")
-
-    run_git_command(f'git commit -am "{commit_msg}"')
-    run_git_command(f'git tag -a {new_tag} -m "{tag_msg}"')
+    add(existing_files, preview=False)
+    commit(commit_msg, all=True, preview=False)
+    tag_annotated(new_tag, tag_msg, preview=False)

     # Push branch and ONLY the newly created version tag (no --tags)
-    run_git_command(f"git push origin {branch}")
-    run_git_command(f"git push origin {new_tag}")
+    push("origin", branch, preview=False)
+    push("origin", new_tag, preview=False)

     # Update 'latest' only if this is the highest version tag
     try:
         if is_highest_version_tag(new_tag):
             update_latest_tag(new_tag, preview=False)
         else:
-            print(f"[INFO] Skipping 'latest' update (tag {new_tag} is not the highest).")
-    except GitError as exc:
+            print(
+                f"[INFO] Skipping 'latest' update (tag {new_tag} is not the highest)."
+            )
+    except GitRunError as exc:
         print(f"[WARN] Failed to update floating 'latest' tag for {new_tag}: {exc}")
         print("'latest' tag was not updated.")
@@ -169,7 +176,9 @@ def _release_impl(
     if close:
         if branch in ("main", "master"):
-            print(f"[INFO] close=True but current branch is {branch}; skipping branch deletion.")
+            print(
+                f"[INFO] close=True but current branch is {branch}; skipping branch deletion."
+            )
             return

         if not should_delete_branch(force=force):

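Editor's note: the release workflow above threads a `preview=` flag through every git helper so a dry run prints the commands instead of executing them. A minimal sketch of that pattern (the `run` helper is ours; the real helpers live in `pkgmgr.core.git.commands`):

```python
import shlex
import subprocess


def run(cmd: list[str], preview: bool = False):
    """Print the command in preview mode; execute it otherwise."""
    if preview:
        msg = f"[PREVIEW] Would run: {shlex.join(cmd)}"
        print(msg)
        return msg
    return subprocess.run(cmd, check=True)


run(["git", "push", "origin", "v1.8.6"], preview=True)
# [PREVIEW] Would run: git push origin v1.8.6
```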
View File

@@ -1,103 +1,138 @@
-import subprocess
+from __future__ import annotations
+
 import os
+from typing import Any, Dict, List, Optional

+from pkgmgr.core.git.commands import clone as git_clone, GitCloneError
 from pkgmgr.core.repository.dir import get_repo_dir
 from pkgmgr.core.repository.identifier import get_repo_identifier
 from pkgmgr.core.repository.verify import verify_repository

+Repository = Dict[str, Any]
+
+
+def _build_clone_url(repo: Repository, clone_mode: str) -> Optional[str]:
+    provider = repo.get("provider")
+    account = repo.get("account")
+    name = repo.get("repository")
+    replacement = repo.get("replacement")
+
+    if clone_mode == "ssh":
+        if not provider or not account or not name:
+            return None
+        return f"git@{provider}:{account}/{name}.git"
+
+    if clone_mode in ("https", "shallow"):
+        if replacement:
+            return f"https://{replacement}.git"
+        if not provider or not account or not name:
+            return None
+        return f"https://{provider}/{account}/{name}.git"
+
+    return None
+

 def clone_repos(
-    selected_repos,
+    selected_repos: List[Repository],
     repositories_base_dir: str,
-    all_repos,
+    all_repos: List[Repository],
     preview: bool,
     no_verification: bool,
-    clone_mode: str
-):
+    clone_mode: str,
+) -> None:
     for repo in selected_repos:
         repo_identifier = get_repo_identifier(repo, all_repos)
         repo_dir = get_repo_dir(repositories_base_dir, repo)
         if os.path.exists(repo_dir):
-            print(f"[INFO] Repository '{repo_identifier}' already exists at '{repo_dir}'. Skipping clone.")
+            print(
+                f"[INFO] Repository '{repo_identifier}' already exists at '{repo_dir}'. Skipping clone."
+            )
             continue

         parent_dir = os.path.dirname(repo_dir)
         os.makedirs(parent_dir, exist_ok=True)

         # Build clone URL based on the clone_mode
-        if clone_mode == "ssh":
-            clone_url = (
-                f"git@{repo.get('provider')}:"
-                f"{repo.get('account')}/"
-                f"{repo.get('repository')}.git"
-            )
-        elif clone_mode in ("https", "shallow"):
-            # Use replacement if defined, otherwise construct from provider/account/repository
-            if repo.get("replacement"):
-                clone_url = f"https://{repo.get('replacement')}.git"
-            else:
-                clone_url = (
-                    f"https://{repo.get('provider')}/"
-                    f"{repo.get('account')}/"
-                    f"{repo.get('repository')}.git"
-                )
-        else:
-            print(f"Unknown clone mode '{clone_mode}'. Aborting clone for {repo_identifier}.")
+        clone_url = _build_clone_url(repo, clone_mode)
+        if not clone_url:
+            print(
+                f"[WARNING] Cannot build clone URL for '{repo_identifier}'. Skipping."
+            )
             continue

-        # Build base clone command
-        base_clone_cmd = "git clone"
-        if clone_mode == "shallow":
-            # Shallow clone: only latest state via HTTPS, no full history
-            base_clone_cmd += " --depth 1 --single-branch"
-
-        mode_label = "HTTPS (shallow)" if clone_mode == "shallow" else clone_mode.upper()
+        shallow = clone_mode == "shallow"
+        mode_label = "HTTPS (shallow)" if shallow else clone_mode.upper()
         print(
             f"[INFO] Attempting to clone '{repo_identifier}' using {mode_label} "
             f"from {clone_url} into '{repo_dir}'."
         )

-        if preview:
-            print(f"[Preview] Would run: {base_clone_cmd} {clone_url} {repo_dir} in {parent_dir}")
-            result = subprocess.CompletedProcess(args=[], returncode=0)
-        else:
-            result = subprocess.run(
-                f"{base_clone_cmd} {clone_url} {repo_dir}",
-                cwd=parent_dir,
-                shell=True,
-            )
-
-        if result.returncode != 0:
-            # Only offer fallback if the original mode was SSH.
-            if clone_mode == "ssh":
-                print(f"[WARNING] SSH clone failed for '{repo_identifier}' with return code {result.returncode}.")
-                choice = input("Do you want to attempt HTTPS clone instead? (y/N): ").strip().lower()
-                if choice == 'y':
-                    # Attempt HTTPS clone
-                    if repo.get("replacement"):
-                        clone_url = f"https://{repo.get('replacement')}.git"
-                    else:
-                        clone_url = f"https://{repo.get('provider')}/{repo.get('account')}/{repo.get('repository')}.git"
-                    print(f"[INFO] Attempting to clone '{repo_identifier}' using HTTPS from {clone_url} into '{repo_dir}'.")
-                    if preview:
-                        print(f"[Preview] Would run: git clone {clone_url} {repo_dir} in {parent_dir}")
-                        result = subprocess.CompletedProcess(args=[], returncode=0)
-                    else:
-                        result = subprocess.run(f"git clone {clone_url} {repo_dir}", cwd=parent_dir, shell=True)
-                else:
-                    print(f"[INFO] HTTPS clone not attempted for '{repo_identifier}'.")
-                    continue
-            else:
-                # For https mode, do not attempt fallback.
-                print(f"[WARNING] HTTPS clone failed for '{repo_identifier}' with return code {result.returncode}.")
+        try:
+            args = []
+            if shallow:
+                args += ["--depth", "1", "--single-branch"]
+            args += [clone_url, repo_dir]
+
+            git_clone(
+                args,
+                cwd=parent_dir,
+                preview=preview,
+            )
+        except GitCloneError as exc:
+            if clone_mode != "ssh":
+                print(f"[WARNING] Clone failed for '{repo_identifier}': {exc}")
+                continue
+
+            print(f"[WARNING] SSH clone failed for '{repo_identifier}': {exc}")
+            choice = (
+                input("Do you want to attempt HTTPS clone instead? (y/N): ")
+                .strip()
+                .lower()
+            )
+            if choice != "y":
+                print(f"[INFO] HTTPS clone not attempted for '{repo_identifier}'.")
+                continue
+
+            fallback_url = _build_clone_url(repo, "https")
+            if not fallback_url:
+                print(f"[WARNING] Cannot build HTTPS URL for '{repo_identifier}'.")
+                continue
+
+            print(
+                f"[INFO] Attempting to clone '{repo_identifier}' using HTTPS "
+                f"from {fallback_url} into '{repo_dir}'."
+            )
+            try:
+                git_clone(
+                    [fallback_url, repo_dir],
+                    cwd=parent_dir,
+                    preview=preview,
+                )
+            except GitCloneError as exc2:
+                print(f"[WARNING] HTTPS clone failed for '{repo_identifier}': {exc2}")
                 continue

-        # After cloning, perform verification in local mode.
         verified_info = repo.get("verified")
-        if verified_info:
-            verified_ok, errors, commit_hash, signing_key = verify_repository(repo, repo_dir, mode="local", no_verification=no_verification)
-            if not no_verification and not verified_ok:
-                print(f"Warning: Verification failed for {repo_identifier} after cloning:")
-                for err in errors:
-                    print(f"  - {err}")
-                choice = input("Proceed anyway? (y/N): ").strip().lower()
-                if choice != "y":
-                    print(f"Skipping repository {repo_identifier} due to failed verification.")
+        if not verified_info:
+            continue
+
+        verified_ok, errors, _commit_hash, _signing_key = verify_repository(
+            repo,
+            repo_dir,
+            mode="local",
+            no_verification=no_verification,
+        )
+        if no_verification or verified_ok:
+            continue
+
+        print(f"Warning: Verification failed for {repo_identifier} after cloning:")
+        for err in errors:
+            print(f"  - {err}")
+        choice = input("Proceed anyway? (y/N): ").strip().lower()
+        if choice != "y":
+            print(f"Skipping repository {repo_identifier} due to failed verification.")

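Editor's note: a standalone sketch of the URL construction that `_build_clone_url` centralizes above. The provider, account, and repository values below are illustrative, not taken from a real config:

```python
from typing import Optional


def build_clone_url(repo: dict, clone_mode: str) -> Optional[str]:
    provider = repo.get("provider")
    account = repo.get("account")
    name = repo.get("repository")
    replacement = repo.get("replacement")
    if clone_mode == "ssh":
        if not (provider and account and name):
            return None
        return f"git@{provider}:{account}/{name}.git"
    if clone_mode in ("https", "shallow"):
        # An explicit replacement URL wins over provider/account/repository.
        if replacement:
            return f"https://{replacement}.git"
        if not (provider and account and name):
            return None
        return f"https://{provider}/{account}/{name}.git"
    return None


repo = {"provider": "github.com", "account": "alice", "repository": "tool"}
print(build_clone_url(repo, "ssh"))    # git@github.com:alice/tool.git
print(build_clone_url(repo, "https"))  # https://github.com/alice/tool.git
```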
View File

@@ -1,143 +0,0 @@
import os
import subprocess
import yaml
from pkgmgr.core.command.alias import generate_alias
from pkgmgr.core.config.save import save_user_config
def create_repo(identifier, config_merged, user_config_path, bin_dir, remote=False, preview=False):
"""
Creates a new repository by performing the following steps:
1. Parses the identifier (provider:port/account/repository) and adds a new entry to the user config
if it is not already present. The provider part is split into provider and port (if provided).
2. Creates the local repository directory and initializes a Git repository.
3. If --remote is set, checks for an existing "origin" remote (removing it if found),
adds the remote using a URL built from provider, port, account, and repository,
creates an initial commit (e.g. with a README.md), and pushes to the remote.
The push is attempted on both "main" and "master" branches.
"""
parts = identifier.split("/")
if len(parts) != 3:
print("Identifier must be in the format 'provider:port/account/repository' (port is optional).")
return
provider_with_port, account, repository = parts
# Split provider and port if a colon is present.
if ":" in provider_with_port:
provider_name, port = provider_with_port.split(":", 1)
else:
provider_name = provider_with_port
port = None
# Check if the repository is already present in the merged config (including port)
exists = False
for repo in config_merged.get("repositories", []):
if (repo.get("provider") == provider_name and
repo.get("account") == account and
repo.get("repository") == repository):
exists = True
print(f"Repository {identifier} already exists in the configuration.")
break
if not exists:
# Create a new entry with an automatically generated alias.
new_entry = {
"provider": provider_name,
"port": port,
"account": account,
"repository": repository,
"alias": generate_alias({"repository": repository, "provider": provider_name, "account": account}, bin_dir, existing_aliases=set()),
        "verified": {}  # No initial verification info
    }

    # Load or initialize the user configuration.
    if os.path.exists(user_config_path):
        with open(user_config_path, "r") as f:
            user_config = yaml.safe_load(f) or {}
    else:
        user_config = {"repositories": []}
    user_config.setdefault("repositories", [])
    user_config["repositories"].append(new_entry)
    save_user_config(user_config, user_config_path)
    print(f"Repository {identifier} added to the configuration.")

    # Also update the merged configuration object.
    config_merged.setdefault("repositories", []).append(new_entry)

    # Create the local repository directory based on the configured base directory.
    base_dir = os.path.expanduser(config_merged["directories"]["repositories"])
    repo_dir = os.path.join(base_dir, provider_name, account, repository)
    if not os.path.exists(repo_dir):
        os.makedirs(repo_dir, exist_ok=True)
        print(f"Local repository directory created: {repo_dir}")
    else:
        print(f"Local repository directory already exists: {repo_dir}")

    # Initialize a Git repository if not already initialized.
    if not os.path.exists(os.path.join(repo_dir, ".git")):
        cmd_init = "git init"
        if preview:
            print(f"[Preview] Would execute: '{cmd_init}' in {repo_dir}")
        else:
            subprocess.run(cmd_init, cwd=repo_dir, shell=True, check=True)
            print(f"Git repository initialized in {repo_dir}.")
    else:
        print("Git repository is already initialized.")

    if remote:
        # Create a README.md if it does not exist to have content for an initial commit.
        readme_path = os.path.join(repo_dir, "README.md")
        if not os.path.exists(readme_path):
            if preview:
                print(f"[Preview] Would create README.md in {repo_dir}.")
            else:
                with open(readme_path, "w") as f:
                    f.write(f"# {repository}\n")
                subprocess.run("git add README.md", cwd=repo_dir, shell=True, check=True)
                subprocess.run('git commit -m "Initial commit"', cwd=repo_dir, shell=True, check=True)
                print("README.md created and initial commit made.")

        # Build the remote URL.
        if provider_name.lower() == "github.com":
            remote_url = f"git@{provider_name}:{account}/{repository}.git"
        else:
            if port:
                remote_url = f"ssh://git@{provider_name}:{port}/{account}/{repository}.git"
            else:
                remote_url = f"ssh://git@{provider_name}/{account}/{repository}.git"

        # Check if the remote "origin" already exists.
        cmd_list = "git remote"
        if preview:
            print(f"[Preview] Would check for existing remotes in {repo_dir}")
            remote_exists = False  # Assume no remote in preview mode.
        else:
            result = subprocess.run(cmd_list, cwd=repo_dir, shell=True, capture_output=True, text=True, check=True)
            remote_list = result.stdout.strip().split()
            remote_exists = "origin" in remote_list

        if remote_exists:
            # Remove the existing remote "origin".
            cmd_remove = "git remote remove origin"
            if preview:
                print(f"[Preview] Would execute: '{cmd_remove}' in {repo_dir}")
            else:
                subprocess.run(cmd_remove, cwd=repo_dir, shell=True, check=True)
                print("Existing remote 'origin' removed.")

        # Now add the new remote.
        cmd_remote = f"git remote add origin {remote_url}"
        if preview:
            print(f"[Preview] Would execute: '{cmd_remote}' in {repo_dir}")
        else:
            try:
                subprocess.run(cmd_remote, cwd=repo_dir, shell=True, check=True)
                print(f"Remote 'origin' added: {remote_url}")
            except subprocess.CalledProcessError:
                print(f"Failed to add remote using URL: {remote_url}.")

        # Push the initial commit to the remote repository.
        cmd_push = "git push -u origin master"
        if preview:
            print(f"[Preview] Would execute: '{cmd_push}' in {repo_dir}")
        else:
            subprocess.run(cmd_push, cwd=repo_dir, shell=True, check=True)
            print("Initial push to the remote repository completed.")

View File

@@ -0,0 +1,28 @@
from __future__ import annotations

from typing import Any, Dict

from .service import CreateRepoService

RepositoryConfig = Dict[str, Any]

__all__ = [
    "CreateRepoService",
    "create_repo",
]


def create_repo(
    identifier: str,
    config_merged: RepositoryConfig,
    user_config_path: str,
    bin_dir: str,
    *,
    remote: bool = False,
    preview: bool = False,
) -> None:
    CreateRepoService(
        config_merged=config_merged,
        user_config_path=user_config_path,
        bin_dir=bin_dir,
    ).run(identifier=identifier, preview=preview, remote=remote)

View File

@@ -0,0 +1,84 @@
from __future__ import annotations

import os
from typing import Dict, Any, Set

import yaml

from pkgmgr.core.command.alias import generate_alias
from pkgmgr.core.config.save import save_user_config

Repository = Dict[str, Any]


class ConfigRepoWriter:
    def __init__(
        self,
        *,
        config_merged: Dict[str, Any],
        user_config_path: str,
        bin_dir: str,
    ):
        self.config_merged = config_merged
        self.user_config_path = user_config_path
        self.bin_dir = bin_dir

    def ensure_repo_entry(
        self,
        *,
        host: str,
        port: str | None,
        owner: str,
        name: str,
        homepage: str,
        preview: bool,
    ) -> Repository:
        repositories = self.config_merged.setdefault("repositories", [])
        for repo in repositories:
            if (
                repo.get("provider") == host
                and repo.get("account") == owner
                and repo.get("repository") == name
            ):
                return repo

        existing_aliases: Set[str] = {
            str(r.get("alias")) for r in repositories if r.get("alias")
        }

        repo: Repository = {
            "provider": host,
            "port": port,
            "account": owner,
            "repository": name,
            "homepage": homepage,
            "alias": generate_alias(
                {
                    "repository": name,
                    "provider": host,
                    "account": owner,
                },
                self.bin_dir,
                existing_aliases=existing_aliases,
            ),
            "verified": {},
        }

        if preview:
            print(f"[Preview] Would add repository to config: {repo}")
            return repo

        if os.path.exists(self.user_config_path):
            with open(self.user_config_path, "r", encoding="utf-8") as f:
                user_cfg = yaml.safe_load(f) or {}
        else:
            user_cfg = {}

        user_cfg.setdefault("repositories", []).append(repo)
        save_user_config(user_cfg, self.user_config_path)

        repositories.append(repo)
        print(f"[INFO] Added repository to configuration: {host}/{owner}/{name}")
        return repo

View File

@@ -0,0 +1,35 @@
from __future__ import annotations

from pkgmgr.core.git.commands import (
    GitCommitError,
    GitPushUpstreamError,
    add_all,
    branch_move,
    commit,
    init,
    push_upstream,
)


class GitBootstrapper:
    def init_repo(self, repo_dir: str, preview: bool) -> None:
        init(cwd=repo_dir, preview=preview)
        add_all(cwd=repo_dir, preview=preview)
        try:
            commit("Initial commit", cwd=repo_dir, preview=preview)
        except GitCommitError as exc:
            print(f"[WARN] Initial commit failed (continuing): {exc}")

    def push_default_branch(self, repo_dir: str, preview: bool) -> None:
        try:
            branch_move("main", cwd=repo_dir, preview=preview)
            push_upstream("origin", "main", cwd=repo_dir, preview=preview)
            return
        except GitPushUpstreamError:
            pass

        try:
            branch_move("master", cwd=repo_dir, preview=preview)
            push_upstream("origin", "master", cwd=repo_dir, preview=preview)
        except GitPushUpstreamError as exc:
            print(f"[WARN] Push failed: {exc}")

View File

@@ -0,0 +1,53 @@
from __future__ import annotations

from typing import Any, Dict

from pkgmgr.actions.mirror.io import write_mirrors_file
from pkgmgr.actions.mirror.setup_cmd import setup_mirrors

Repository = Dict[str, Any]


class MirrorBootstrapper:
    """
    MIRRORS is the single source of truth.

    Defaults are written to MIRRORS and mirror setup derives
    git remotes exclusively from that file (git URLs only).
    """

    def write_defaults(
        self,
        *,
        repo_dir: str,
        primary: str,
        name: str,
        preview: bool,
    ) -> None:
        mirrors = {
            primary,
            f"https://pypi.org/project/{name}/",
        }
        write_mirrors_file(repo_dir, mirrors, preview=preview)

    def setup(
        self,
        *,
        repo: Repository,
        repositories_base_dir: str,
        all_repos: list[Repository],
        preview: bool,
        remote: bool,
    ) -> None:
        # IMPORTANT:
        # Do NOT set repo["mirrors"] here.
        # MIRRORS file is the single source of truth.
        setup_mirrors(
            selected_repos=[repo],
            repositories_base_dir=repositories_base_dir,
            all_repos=all_repos,
            preview=preview,
            local=True,
            remote=True,
            ensure_remote=remote,
        )

View File

@@ -0,0 +1,12 @@
from __future__ import annotations

from dataclasses import dataclass
from typing import Optional


@dataclass(frozen=True)
class RepoParts:
    host: str
    port: Optional[str]
    owner: str
    name: str

View File

@@ -0,0 +1,66 @@
from __future__ import annotations

import re
from typing import Tuple
from urllib.parse import urlparse

from .model import RepoParts

_NAME_RE = re.compile(r"^[a-z0-9_-]+$")


def parse_identifier(identifier: str) -> RepoParts:
    ident = identifier.strip()
    if "://" in ident or ident.startswith("git@"):
        return _parse_git_url(ident)

    parts = ident.split("/")
    if len(parts) != 3:
        raise ValueError("Identifier must be URL or 'provider(:port)/owner/repo'.")

    host_with_port, owner, name = parts
    host, port = _split_host_port(host_with_port)
    _ensure_valid_repo_name(name)
    return RepoParts(host=host, port=port, owner=owner, name=name)


def _parse_git_url(url: str) -> RepoParts:
    if url.startswith("git@") and "://" not in url:
        left, right = url.split(":", 1)
        host = left.split("@", 1)[1]
        owner, name = right.lstrip("/").split("/", 1)
        name = _strip_git_suffix(name)
        _ensure_valid_repo_name(name)
        return RepoParts(host=host, port=None, owner=owner, name=name)

    parsed = urlparse(url)
    host = parsed.hostname or ""
    port = str(parsed.port) if parsed.port else None
    path = (parsed.path or "").strip("/")
    if not host or "/" not in path:
        raise ValueError(f"Could not parse git URL: {url}")

    owner, name = path.split("/", 1)
    name = _strip_git_suffix(name)
    _ensure_valid_repo_name(name)
    return RepoParts(host=host, port=port, owner=owner, name=name)


def _split_host_port(host: str) -> Tuple[str, str | None]:
    if ":" in host:
        h, p = host.split(":", 1)
        return h, p or None
    return host, None


def _strip_git_suffix(name: str) -> str:
    return name[:-4] if name.endswith(".git") else name


def _ensure_valid_repo_name(name: str) -> None:
    if not _NAME_RE.fullmatch(name):
        raise ValueError("Repository name must match: lowercase a-z, 0-9, '_' and '-'.")
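The identifier grammar accepted above (full URL, SCP-style `git@host:owner/repo`, or `provider(:port)/owner/repo`) can be sketched standalone. This is a simplified, hypothetical re-implementation that returns a plain tuple instead of the `RepoParts` dataclass and omits the name validation:

```python
# Minimal sketch of the identifier parsing rules above (hypothetical
# standalone version; the real code lives in pkgmgr's parser module).
from urllib.parse import urlparse


def parse_identifier(ident: str):
    ident = ident.strip()
    # SCP-style form: git@host:owner/repo(.git)
    if ident.startswith("git@") and "://" not in ident:
        left, right = ident.split(":", 1)
        host = left.split("@", 1)[1]
        owner, name = right.lstrip("/").split("/", 1)
        return host, None, owner, name.removesuffix(".git")
    # Full URL form: scheme://[user@]host[:port]/owner/repo(.git)
    if "://" in ident:
        p = urlparse(ident)
        owner, name = (p.path or "").strip("/").split("/", 1)
        return p.hostname, str(p.port) if p.port else None, owner, name.removesuffix(".git")
    # Short form: provider(:port)/owner/repo
    host_port, owner, name = ident.split("/")
    host, _, port = host_port.partition(":")
    return host, port or None, owner, name


print(parse_identifier("git@github.com:kevin/pkgmgr.git"))
# ('github.com', None, 'kevin', 'pkgmgr')
```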

View File

@@ -0,0 +1,52 @@
from __future__ import annotations

import os
from typing import Dict, Any

from .model import RepoParts


class CreateRepoPlanner:
    def __init__(self, parts: RepoParts, repositories_base_dir: str):
        self.parts = parts
        self.repositories_base_dir = os.path.expanduser(repositories_base_dir)

    @property
    def repo_dir(self) -> str:
        return os.path.join(
            self.repositories_base_dir,
            self.parts.host,
            self.parts.owner,
            self.parts.name,
        )

    @property
    def homepage(self) -> str:
        return f"https://{self.parts.host}/{self.parts.owner}/{self.parts.name}"

    @property
    def primary_remote(self) -> str:
        if self.parts.port:
            return (
                f"ssh://git@{self.parts.host}:{self.parts.port}/"
                f"{self.parts.owner}/{self.parts.name}.git"
            )
        return f"git@{self.parts.host}:{self.parts.owner}/{self.parts.name}.git"

    def template_context(
        self,
        *,
        author_name: str,
        author_email: str,
    ) -> Dict[str, Any]:
        return {
            "provider": self.parts.host,
            "port": self.parts.port,
            "account": self.parts.owner,
            "repository": self.parts.name,
            "homepage": self.homepage,
            "author_name": author_name,
            "author_email": author_email,
            "license_text": f"All rights reserved by {author_name}",
            "primary_remote": self.primary_remote,
        }

View File

@@ -0,0 +1,97 @@
from __future__ import annotations

import os
from typing import Dict, Any

from pkgmgr.core.git.queries import get_config_value

from .parser import parse_identifier
from .planner import CreateRepoPlanner
from .config_writer import ConfigRepoWriter
from .templates import TemplateRenderer
from .git_bootstrap import GitBootstrapper
from .mirrors import MirrorBootstrapper


class CreateRepoService:
    def __init__(
        self,
        *,
        config_merged: Dict[str, Any],
        user_config_path: str,
        bin_dir: str,
    ):
        self.config_merged = config_merged
        self.user_config_path = user_config_path
        self.bin_dir = bin_dir
        self.templates = TemplateRenderer()
        self.git = GitBootstrapper()
        self.mirrors = MirrorBootstrapper()

    def run(
        self,
        *,
        identifier: str,
        preview: bool,
        remote: bool,
    ) -> None:
        parts = parse_identifier(identifier)
        base_dir = self.config_merged.get("directories", {}).get(
            "repositories", "~/Repositories"
        )
        planner = CreateRepoPlanner(parts, base_dir)

        writer = ConfigRepoWriter(
            config_merged=self.config_merged,
            user_config_path=self.user_config_path,
            bin_dir=self.bin_dir,
        )
        repo = writer.ensure_repo_entry(
            host=parts.host,
            port=parts.port,
            owner=parts.owner,
            name=parts.name,
            homepage=planner.homepage,
            preview=preview,
        )

        if preview:
            print(f"[Preview] Would ensure directory exists: {planner.repo_dir}")
        else:
            os.makedirs(planner.repo_dir, exist_ok=True)

        author_name = get_config_value("user.name") or "Unknown Author"
        author_email = get_config_value("user.email") or "unknown@example.invalid"

        self.templates.render(
            repo_dir=planner.repo_dir,
            context=planner.template_context(
                author_name=author_name,
                author_email=author_email,
            ),
            preview=preview,
        )

        self.git.init_repo(planner.repo_dir, preview=preview)

        self.mirrors.write_defaults(
            repo_dir=planner.repo_dir,
            primary=planner.primary_remote,
            name=parts.name,
            preview=preview,
        )

        self.mirrors.setup(
            repo=repo,
            repositories_base_dir=os.path.expanduser(base_dir),
            all_repos=self.config_merged.get("repositories", []),
            preview=preview,
            remote=remote,
        )

        if remote:
            self.git.push_default_branch(planner.repo_dir, preview=preview)

View File

@@ -0,0 +1,78 @@
from __future__ import annotations

import os
from pathlib import Path
from typing import Dict, Any

from pkgmgr.core.git.queries import get_repo_root

try:
    from jinja2 import Environment, FileSystemLoader, StrictUndefined
except Exception as exc:  # pragma: no cover
    Environment = None  # type: ignore
    FileSystemLoader = None  # type: ignore
    StrictUndefined = None  # type: ignore
    _JINJA_IMPORT_ERROR = exc
else:
    _JINJA_IMPORT_ERROR = None


class TemplateRenderer:
    def __init__(self) -> None:
        self.templates_dir = self._resolve_templates_dir()

    def render(
        self,
        *,
        repo_dir: str,
        context: Dict[str, Any],
        preview: bool,
    ) -> None:
        if preview:
            self._preview()
            return

        if Environment is None:
            raise RuntimeError(
                "Jinja2 is required but not available. "
                f"Import error: {_JINJA_IMPORT_ERROR}"
            )

        env = Environment(
            loader=FileSystemLoader(self.templates_dir),
            undefined=StrictUndefined,
            autoescape=False,
            keep_trailing_newline=True,
        )

        for root, _, files in os.walk(self.templates_dir):
            for fn in files:
                if not fn.endswith(".j2"):
                    continue
                abs_src = os.path.join(root, fn)
                rel_src = os.path.relpath(abs_src, self.templates_dir)
                rel_out = rel_src[:-3]
                abs_out = os.path.join(repo_dir, rel_out)
                os.makedirs(os.path.dirname(abs_out), exist_ok=True)
                template = env.get_template(rel_src)
                rendered = template.render(**context)
                with open(abs_out, "w", encoding="utf-8") as f:
                    f.write(rendered)

    def _preview(self) -> None:
        for root, _, files in os.walk(self.templates_dir):
            for fn in files:
                if fn.endswith(".j2"):
                    rel = os.path.relpath(os.path.join(root, fn), self.templates_dir)
                    print(f"[Preview] Would render template: {rel} -> {rel[:-3]}")

    @staticmethod
    def _resolve_templates_dir() -> str:
        here = Path(__file__).resolve().parent
        root = get_repo_root(cwd=str(here))
        if not root:
            raise RuntimeError("Could not determine repository root for templates.")
        return os.path.join(root, "templates", "default")
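The walk-and-render pattern used by `TemplateRenderer` (find every template under a directory, strip the template suffix, write the rendered result to the mirrored path) can be sketched dependency-free with stdlib `string.Template` instead of Jinja2. The `.tpl` suffix and context keys here are illustrative, not pkgmgr's real templates:

```python
# Minimal sketch of the template walk above, using string.Template
# so it runs without Jinja2 installed (hypothetical standalone version).
import os
from string import Template


def render_templates(templates_dir: str, repo_dir: str, context: dict) -> list[str]:
    written = []
    for root, _, files in os.walk(templates_dir):
        for fn in files:
            if not fn.endswith(".tpl"):
                continue
            rel = os.path.relpath(os.path.join(root, fn), templates_dir)
            out = os.path.join(repo_dir, rel[:-4])  # strip ".tpl"
            os.makedirs(os.path.dirname(out), exist_ok=True)
            with open(os.path.join(root, fn), encoding="utf-8") as f:
                text = Template(f.read()).substitute(context)
            with open(out, "w", encoding="utf-8") as f:
                f.write(text)
            written.append(out)
    return written
```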

View File

@@ -24,9 +24,13 @@ def deinstall_repos(
         # Remove alias link/file (interactive)
         if os.path.exists(alias_path):
-            confirm = input(
-                f"Are you sure you want to delete link '{alias_path}' for {repo_identifier}? [y/N]: "
-            ).strip().lower()
+            confirm = (
+                input(
+                    f"Are you sure you want to delete link '{alias_path}' for {repo_identifier}? [y/N]: "
+                )
+                .strip()
+                .lower()
+            )
             if confirm == "y":
                 if preview:
                     print(f"[Preview] Would remove link '{alias_path}'.")

View File

@@ -3,19 +3,30 @@ import os
 from pkgmgr.core.repository.identifier import get_repo_identifier
 from pkgmgr.core.repository.dir import get_repo_dir

 def delete_repos(selected_repos, repositories_base_dir, all_repos, preview=False):
     for repo in selected_repos:
         repo_identifier = get_repo_identifier(repo, all_repos)
         repo_dir = get_repo_dir(repositories_base_dir, repo)
         if os.path.exists(repo_dir):
-            confirm = input(f"Are you sure you want to delete directory '{repo_dir}' for {repo_identifier}? [y/N]: ").strip().lower()
+            confirm = (
+                input(
+                    f"Are you sure you want to delete directory '{repo_dir}' for {repo_identifier}? [y/N]: "
+                )
+                .strip()
+                .lower()
+            )
             if confirm == "y":
                 if preview:
-                    print(f"[Preview] Would delete directory '{repo_dir}' for {repo_identifier}.")
+                    print(
+                        f"[Preview] Would delete directory '{repo_dir}' for {repo_identifier}."
+                    )
                 else:
                     try:
                         shutil.rmtree(repo_dir)
-                        print(f"Deleted repository directory '{repo_dir}' for {repo_identifier}.")
+                        print(
+                            f"Deleted repository directory '{repo_dir}' for {repo_identifier}."
+                        )
                     except Exception as e:
                         print(f"Error deleting '{repo_dir}' for {repo_identifier}: {e}")
         else:

View File

@@ -233,9 +233,7 @@ def list_repositories(
             categories.append(str(repo["category"]))

         yaml_tags: List[str] = list(map(str, repo.get("tags", [])))
-        display_tags: List[str] = sorted(
-            set(yaml_tags + list(map(str, extra_tags)))
-        )
+        display_tags: List[str] = sorted(set(yaml_tags + list(map(str, extra_tags))))

         rows.append(
             {
@@ -288,13 +286,7 @@ def list_repositories(
         status_padded = status.ljust(status_width)
         status_colored = _color_status(status_padded)
-        print(
-            f"{ident_col} "
-            f"{status_colored} "
-            f"{cat_col} "
-            f"{tag_col} "
-            f"{dir_col}"
-        )
+        print(f"{ident_col} {status_colored} {cat_col} {tag_col} {dir_col}")

     # ------------------------------------------------------------------
     # Detailed section (alias value red, same status coloring)

View File

@@ -1,25 +1,30 @@
-#!/usr/bin/env python3
-# -*- coding: utf-8 -*-
+from __future__ import annotations

 import os
-import subprocess
 import sys
+from typing import List, Dict, Any

+from pkgmgr.core.git.commands import pull_args, GitPullArgsError
 from pkgmgr.core.repository.dir import get_repo_dir
 from pkgmgr.core.repository.identifier import get_repo_identifier
 from pkgmgr.core.repository.verify import verify_repository

+Repository = Dict[str, Any]

 def pull_with_verification(
-    selected_repos,
-    repositories_base_dir,
-    all_repos,
-    extra_args,
-    no_verification,
+    selected_repos: List[Repository],
+    repositories_base_dir: str,
+    all_repos: List[Repository],
+    extra_args: List[str],
+    no_verification: bool,
     preview: bool,
 ) -> None:
     """
     Execute `git pull` for each repository with verification.
+
+    - If verification fails and verification is enabled, prompt user to continue.
+    - Uses core.git.commands.pull_args() (no raw subprocess usage).
     """
     for repo in selected_repos:
         repo_identifier = get_repo_identifier(repo, all_repos)
@@ -37,12 +42,7 @@ def pull_with_verification(
             no_verification=no_verification,
         )
-        if (
-            not preview
-            and not no_verification
-            and verified_info
-            and not verified_ok
-        ):
+        if not preview and not no_verification and verified_info and not verified_ok:
             print(f"Warning: Verification failed for {repo_identifier}:")
             for err in errors:
                 print(f"  - {err}")
@@ -50,17 +50,10 @@ def pull_with_verification(
             if choice != "y":
                 continue

-        args_part = " ".join(extra_args) if extra_args else ""
-        full_cmd = f"git pull{(' ' + args_part) if args_part else ''}"
-
-        if preview:
-            print(f"[Preview] In '{repo_dir}': {full_cmd}")
-        else:
-            print(f"Running in '{repo_dir}': {full_cmd}")
-            result = subprocess.run(full_cmd, cwd=repo_dir, shell=True, check=False)
-            if result.returncode != 0:
-                print(
-                    f"'git pull' for {repo_identifier} failed "
-                    f"with exit code {result.returncode}."
-                )
-                sys.exit(result.returncode)
+        try:
+            pull_args(extra_args, cwd=repo_dir, preview=preview)
+        except GitPullArgsError as exc:
+            # Keep behavior consistent with previous implementation:
+            # stop on first failure and propagate return code as generic failure.
+            print(str(exc))
+            sys.exit(1)

View File

@@ -3,7 +3,7 @@
 from __future__ import annotations

-from typing import Any, Iterable
+from typing import Any, Iterable, List, Tuple

 from pkgmgr.actions.update.system_updater import SystemUpdater
@@ -30,32 +30,81 @@ class UpdateManager:
         quiet: bool,
         update_dependencies: bool,
         clone_mode: str,
+        silent: bool = False,
         force_update: bool = True,
     ) -> None:
         from pkgmgr.actions.install import install_repos
         from pkgmgr.actions.repository.pull import pull_with_verification
+        from pkgmgr.core.repository.identifier import get_repo_identifier

-        pull_with_verification(
-            selected_repos,
-            repositories_base_dir,
-            all_repos,
-            [],
-            no_verification,
-            preview,
-        )
+        failures: List[Tuple[str, str]] = []

-        install_repos(
-            selected_repos,
-            repositories_base_dir,
-            bin_dir,
-            all_repos,
-            no_verification,
-            preview,
-            quiet,
-            clone_mode,
-            update_dependencies,
-            force_update=force_update,
-        )
+        for repo in list(selected_repos):
+            identifier = get_repo_identifier(repo, all_repos)
+
+            try:
+                pull_with_verification(
+                    [repo],
+                    repositories_base_dir,
+                    all_repos,
+                    [],
+                    no_verification,
+                    preview,
+                )
+            except SystemExit as exc:
+                code = exc.code if isinstance(exc.code, int) else str(exc.code)
+                failures.append((identifier, f"pull failed (exit={code})"))
+                if not quiet:
+                    print(
+                        f"[Warning] update: pull failed for {identifier} (exit={code}). Continuing..."
+                    )
+                continue
+            except Exception as exc:
+                failures.append((identifier, f"pull failed: {exc}"))
+                if not quiet:
+                    print(
+                        f"[Warning] update: pull failed for {identifier}: {exc}. Continuing..."
+                    )
+                continue
+
+            try:
+                install_repos(
+                    [repo],
+                    repositories_base_dir,
+                    bin_dir,
+                    all_repos,
+                    no_verification,
+                    preview,
+                    quiet,
+                    clone_mode,
+                    update_dependencies,
+                    force_update=force_update,
+                    silent=silent,
+                    emit_summary=False,
+                )
+            except SystemExit as exc:
+                code = exc.code if isinstance(exc.code, int) else str(exc.code)
+                failures.append((identifier, f"install failed (exit={code})"))
+                if not quiet:
+                    print(
+                        f"[Warning] update: install failed for {identifier} (exit={code}). Continuing..."
+                    )
+                continue
+            except Exception as exc:
+                failures.append((identifier, f"install failed: {exc}"))
+                if not quiet:
+                    print(
+                        f"[Warning] update: install failed for {identifier}: {exc}. Continuing..."
+                    )
+                continue
+
+        if failures and not quiet:
+            print("\n[pkgmgr] Update finished with warnings:")
+            for ident, msg in failures:
+                print(f"  - {ident}: {msg}")
+
+        if failures and not silent:
+            raise SystemExit(1)

         if system_update:
             self._system_updater.run(preview=preview)
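The control flow this hunk introduces (record each per-repository failure, keep going, and signal a non-zero exit only at the end) can be sketched independently of pkgmgr; `update_all` and `step` below are illustrative names, not pkgmgr API:

```python
# Hypothetical standalone sketch of the failure-collection loop above.
from typing import Callable, List, Tuple


def update_all(
    repos: List[str],
    step: Callable[[str], None],
    quiet: bool = True,
) -> List[Tuple[str, str]]:
    failures: List[Tuple[str, str]] = []
    for repo in repos:
        try:
            step(repo)  # e.g. pull + install for one repository
        except Exception as exc:
            # Record the failure and continue with the next repository.
            failures.append((repo, str(exc)))
            if not quiet:
                print(f"[Warning] update failed for {repo}: {exc}. Continuing...")
    return failures


# Callers would then do: `if failures: raise SystemExit(1)`.
```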

View File

@@ -31,6 +31,7 @@ class OSReleaseInfo:
     """
     Minimal /etc/os-release representation for distro detection.
     """
+
     id: str = ""
     id_like: str = ""
     pretty_name: str = ""
@@ -63,4 +64,6 @@ class OSReleaseInfo:
     def is_fedora_family(self) -> bool:
         ids = self.ids()
-        return bool(ids.intersection({"fedora", "rhel", "centos", "rocky", "almalinux"}))
+        return bool(
+            ids.intersection({"fedora", "rhel", "centos", "rocky", "almalinux"})
+        )

View File

@@ -58,7 +58,9 @@ class SystemUpdater:
             run_command("sudo pacman -Syu --noconfirm", preview=preview)
             return

-        print("[Warning] Cannot update Arch system: missing required tools (sudo/yay/pacman).")
+        print(
+            "[Warning] Cannot update Arch system: missing required tools (sudo/yay/pacman)."
+        )

     def _update_debian(self, *, preview: bool) -> None:
         from pkgmgr.core.command.run import run_command
@@ -67,7 +69,9 @@ class SystemUpdater:
         apt_get = shutil.which("apt-get")
         if not (sudo and apt_get):
-            print("[Warning] Cannot update Debian/Ubuntu system: missing required tools (sudo/apt-get).")
+            print(
+                "[Warning] Cannot update Debian/Ubuntu system: missing required tools (sudo/apt-get)."
+            )
             return

         env = "DEBIAN_FRONTEND=noninteractive"

View File

@@ -29,6 +29,7 @@ For details on any command, run:
   \033[1mpkgmgr <command> --help\033[0m
 """

+
 def main() -> None:
     """
     Entry point for the pkgmgr CLI.
@@ -41,9 +42,7 @@ def main() -> None:
     repositories_dir = os.path.expanduser(
         directories.get("repositories", "~/Repositories")
     )
-    binaries_dir = os.path.expanduser(
-        directories.get("binaries", "~/.local/bin")
-    )
+    binaries_dir = os.path.expanduser(directories.get("binaries", "~/.local/bin"))

     # Ensure the merged config actually contains the resolved directories
     config_merged.setdefault("directories", {})

View File

@@ -7,7 +7,7 @@ from typing import Any, Dict, List, Optional, Tuple
 from pkgmgr.cli.context import CLIContext
 from pkgmgr.core.repository.dir import get_repo_dir
 from pkgmgr.core.repository.identifier import get_repo_identifier
-from pkgmgr.core.git import get_tags
+from pkgmgr.core.git.queries import get_tags
 from pkgmgr.core.version.semver import extract_semver_from_tags
 from pkgmgr.actions.changelog import generate_changelog
@@ -135,9 +135,7 @@ def handle_changelog(
             target_tag=range_arg,
         )
         if cur_tag is None:
-            print(
-                f"[WARN] Tag {range_arg!r} not found or not a SemVer tag."
-            )
+            print(f"[WARN] Tag {range_arg!r} not found or not a SemVer tag.")
             print("[INFO] Falling back to full history.")
             from_ref = None
             to_ref = None

View File

@@ -213,9 +213,7 @@ def handle_config(args, ctx: CLIContext) -> None:
                 )
                 if key == mod_key:
                     entry["ignore"] = args.set == "true"
-                    print(
-                        f"Set ignore for {key} to {entry['ignore']}"
-                    )
+                    print(f"Set ignore for {key} to {entry['ignore']}")
                     save_user_config(user_config, user_config_path)
                     return

View File

@@ -4,7 +4,12 @@ from __future__ import annotations
 import sys
 from typing import Any, Dict, List

-from pkgmgr.actions.mirror import diff_mirrors, list_mirrors, merge_mirrors, setup_mirrors
+from pkgmgr.actions.mirror import (
+    diff_mirrors,
+    list_mirrors,
+    merge_mirrors,
+    setup_mirrors,
+)
 from pkgmgr.cli.context import CLIContext

 Repository = Dict[str, Any]
@@ -56,11 +61,15 @@ def handle_mirror_command(
         preview = getattr(args, "preview", False)

         if source == target:
-            print("[ERROR] For 'mirror merge', source and target must differ (config vs file).")
+            print(
+                "[ERROR] For 'mirror merge', source and target must differ (config vs file)."
+            )
             sys.exit(2)

         explicit_config_path = getattr(args, "config_path", None)
-        user_config_path = explicit_config_path or getattr(ctx, "user_config_path", None)
+        user_config_path = explicit_config_path or getattr(
+            ctx, "user_config_path", None
+        )

         merge_mirrors(
             selected_repos=selected,
View File

@@ -18,7 +18,9 @@ def handle_publish(args, ctx: CLIContext, selected: List[Repository]) -> None:
     for repo in selected:
         identifier = get_repo_identifier(repo, ctx.all_repositories)
-        repo_dir = repo.get("directory") or get_repo_dir(ctx.repositories_base_dir, repo)
+        repo_dir = repo.get("directory") or get_repo_dir(
+            ctx.repositories_base_dir, repo
+        )
         if not os.path.isdir(repo_dir):
             print(f"[WARN] Skipping {identifier}: directory missing.")

View File

@@ -1,31 +1,17 @@
 #!/usr/bin/env python3
 # -*- coding: utf-8 -*-
-"""
-Release command wiring for the pkgmgr CLI.
-
-This module implements the `pkgmgr release` subcommand on top of the
-generic selection logic from cli.dispatch. It does not define its
-own subparser; the CLI surface is configured in cli.parser.
-
-Responsibilities:
-- Take the parsed argparse.Namespace for the `release` command.
-- Use the list of selected repositories provided by dispatch_command().
-- Optionally list affected repositories when --list is set.
-- For each selected repository, run pkgmgr.actions.release.release(...) in
-  the context of that repository directory.
-"""
 from __future__ import annotations

 import os
+import sys
 from typing import Any, Dict, List

+from pkgmgr.actions.publish import publish as run_publish
+from pkgmgr.actions.release import release as run_release
 from pkgmgr.cli.context import CLIContext
 from pkgmgr.core.repository.dir import get_repo_dir
 from pkgmgr.core.repository.identifier import get_repo_identifier
-from pkgmgr.actions.release import release as run_release

 Repository = Dict[str, Any]
@@ -35,23 +21,10 @@ def handle_release(
     ctx: CLIContext,
     selected: List[Repository],
 ) -> None:
-    """
-    Handle the `pkgmgr release` subcommand.
-
-    Flow:
-      1) Use the `selected` repositories as computed by dispatch_command().
-      2) If --list is given, print the identifiers of the selected repos
-         and return without running any release.
-      3) For each selected repository:
-         - Resolve its identifier and local directory.
-         - Change into that directory.
-         - Call pkgmgr.actions.release.release(...) with the parsed options.
-    """
     if not selected:
         print("[pkgmgr] No repositories selected for release.")
         return

-    # List-only mode: show which repositories would be affected.
     if getattr(args, "list", False):
         print("[pkgmgr] Repositories that would be affected by this release:")
         for repo in selected:
@@ -62,29 +35,26 @@ def handle_release(
     for repo in selected:
         identifier = get_repo_identifier(repo, ctx.all_repositories)

-        repo_dir = repo.get("directory")
-        if not repo_dir:
-            try:
-                repo_dir = get_repo_dir(ctx.repositories_base_dir, repo)
-            except Exception:
-                repo_dir = None
-
-        if not repo_dir or not os.path.isdir(repo_dir):
-            print(
-                f"[WARN] Skipping repository {identifier}: "
-                "local directory does not exist."
-            )
+        try:
+            repo_dir = repo.get("directory") or get_repo_dir(
+                ctx.repositories_base_dir, repo
+            )
+        except Exception as exc:
+            print(
+                f"[WARN] Skipping repository {identifier}: failed to resolve directory: {exc}"
+            )
             continue

-        print(
-            f"[pkgmgr] Running release for repository {identifier} "
-            f"in '{repo_dir}'..."
-        )
+        if not os.path.isdir(repo_dir):
+            print(f"[WARN] Skipping repository {identifier}: directory missing.")
+            continue
+
+        print(f"[pkgmgr] Running release for repository {identifier}...")

-        # Change to repo directory and invoke the helper.
         cwd_before = os.getcwd()
         try:
             os.chdir(repo_dir)
             run_release(
                 pyproject_path="pyproject.toml",
                 changelog_path="CHANGELOG.md",
@@ -94,5 +64,17 @@ def handle_release(
                 force=getattr(args, "force", False),
                 close=getattr(args, "close", False),
             )
+            if not getattr(args, "no_publish", False):
+                print(f"[pkgmgr] Running publish for repository {identifier}...")
+                is_tty = sys.stdin.isatty()
+                run_publish(
+                    repo=repo,
+                    repo_dir=repo_dir,
+                    preview=getattr(args, "preview", False),
+                    interactive=is_tty,
+                    allow_prompt=is_tty,
+                )
         finally:
             os.chdir(cwd_before)

View File

@@ -32,9 +32,8 @@ def _resolve_repository_directory(repository: Repository, ctx: CLIContext) -> st
     if repo_dir:
         return repo_dir
-    base_dir = (
-        getattr(ctx, "repositories_base_dir", None)
-        or getattr(ctx, "repositories_dir", None)
-    )
+    base_dir = getattr(ctx, "repositories_base_dir", None) or getattr(
+        ctx, "repositories_dir", None
+    )
     if not base_dir:
         raise RuntimeError(
@@ -68,6 +67,7 @@ def handle_repos_command(
             args.clone_mode,
             args.dependencies,
             force_update=getattr(args, "update", False),
+            silent=getattr(args, "silent", False),
         )
         return
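The `getattr(..., None) or getattr(..., None)` chain above tolerates two context shapes. A minimal sketch of that pattern, assuming a bare context object in place of the real `CLIContext`:

```python
class Ctx:
    """Stand-in for CLIContext; attributes are set ad hoc."""
    pass


def resolve_base_dir(ctx) -> str:
    # Accept either attribute name so old and new context shapes both work.
    base_dir = getattr(ctx, "repositories_base_dir", None) or getattr(
        ctx, "repositories_dir", None
    )
    if not base_dir:
        raise RuntimeError(
            "Cannot resolve repositories base directory from context; "
            "expected ctx.repositories_base_dir or ctx.repositories_dir."
        )
    return base_dir
```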

View File

@@ -1,64 +1,27 @@
 from __future__ import annotations
-import json
-import os
 from typing import Any, Dict, List
 from pkgmgr.cli.context import CLIContext
+from pkgmgr.cli.tools import open_vscode_workspace
+from pkgmgr.cli.tools.paths import resolve_repository_path
 from pkgmgr.core.command.run import run_command
-from pkgmgr.core.repository.identifier import get_repo_identifier
-from pkgmgr.core.repository.dir import get_repo_dir
 
 Repository = Dict[str, Any]
 
-
-def _resolve_repository_path(repository: Repository, ctx: CLIContext) -> str:
-    """
-    Resolve the filesystem path for a repository.
-    Priority:
-    1. Use explicit keys if present (directory / path / workspace / workspace_dir).
-    2. Fallback to get_repo_dir(...) using the repositories base directory
-       from the CLI context.
-    """
-    # 1) Explicit path-like keys on the repository object
-    for key in ("directory", "path", "workspace", "workspace_dir"):
-        value = repository.get(key)
-        if value:
-            return value
-    # 2) Fallback: compute from base dir + repository metadata
-    base_dir = (
-        getattr(ctx, "repositories_base_dir", None)
-        or getattr(ctx, "repositories_dir", None)
-    )
-    if not base_dir:
-        raise RuntimeError(
-            "Cannot resolve repositories base directory from context; "
-            "expected ctx.repositories_base_dir or ctx.repositories_dir."
-        )
-    return get_repo_dir(base_dir, repository)
-
 
 def handle_tools_command(
     args,
     ctx: CLIContext,
     selected: List[Repository],
 ) -> None:
     # ------------------------------------------------------------------
     # nautilus "explore" command
     # ------------------------------------------------------------------
     if args.command == "explore":
         for repository in selected:
-            repo_path = _resolve_repository_path(repository, ctx)
-            run_command(
-                f'nautilus "{repo_path}" & disown'
-            )
+            repo_path = resolve_repository_path(repository, ctx)
+            run_command(f'nautilus "{repo_path}" & disown')
         return
     # ------------------------------------------------------------------
@@ -66,50 +29,13 @@ def handle_tools_command(
     # ------------------------------------------------------------------
     if args.command == "terminal":
         for repository in selected:
-            repo_path = _resolve_repository_path(repository, ctx)
-            run_command(
-                f'gnome-terminal --tab --working-directory="{repo_path}"'
-            )
+            repo_path = resolve_repository_path(repository, ctx)
+            run_command(f'gnome-terminal --tab --working-directory="{repo_path}"')
         return
     # ------------------------------------------------------------------
     # VS Code workspace command
     # ------------------------------------------------------------------
     if args.command == "code":
-        if not selected:
-            print("No repositories selected.")
-            return
-        identifiers = [
-            get_repo_identifier(repo, ctx.all_repositories)
-            for repo in selected
-        ]
-        sorted_identifiers = sorted(identifiers)
-        workspace_name = "_".join(sorted_identifiers) + ".code-workspace"
-        directories_cfg = ctx.config_merged.get("directories") or {}
-        workspaces_dir = os.path.expanduser(
-            directories_cfg.get("workspaces", "~/Workspaces")
-        )
-        os.makedirs(workspaces_dir, exist_ok=True)
-        workspace_file = os.path.join(workspaces_dir, workspace_name)
-        folders = [
-            {"path": _resolve_repository_path(repository, ctx)}
-            for repository in selected
-        ]
-        workspace_data = {
-            "folders": folders,
-            "settings": {},
-        }
-        if not os.path.exists(workspace_file):
-            with open(workspace_file, "w", encoding="utf-8") as f:
-                json.dump(workspace_data, f, indent=4)
-            print(f"Created workspace file: {workspace_file}")
-        else:
-            print(f"Using existing workspace file: {workspace_file}")
-        run_command(f'code "{workspace_file}"')
+        open_vscode_workspace(ctx, selected)
         return
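The inline workspace logic removed here (and presumably relocated into `open_vscode_workspace`) can be condensed into a small, testable sketch. `build_workspace_file` is an illustrative name, not pkgmgr's API; it keeps the removed code's behavior: a deterministic file name from sorted identifiers, one `folders` entry per path, and no overwrite of an existing file:

```python
import json
import os


def build_workspace_file(identifiers: list, paths: list, workspaces_dir: str) -> str:
    """Condensed sketch of the removed inline 'code' command logic."""
    name = "_".join(sorted(identifiers)) + ".code-workspace"
    os.makedirs(workspaces_dir, exist_ok=True)
    workspace_file = os.path.join(workspaces_dir, name)
    data = {"folders": [{"path": p} for p in paths], "settings": {}}
    # Only create the file once; an existing workspace is reused as-is.
    if not os.path.exists(workspace_file):
        with open(workspace_file, "w", encoding="utf-8") as f:
            json.dump(data, f, indent=4)
    return workspace_file
```

The caller would then hand the returned path to `code "<file>"`, as the removed code did via `run_command`.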

View File

@@ -7,7 +7,7 @@ from typing import Any, Dict, List, Optional, Tuple
 from pkgmgr.cli.context import CLIContext
 from pkgmgr.core.repository.dir import get_repo_dir
 from pkgmgr.core.repository.identifier import get_repo_identifier
-from pkgmgr.core.git import get_tags
+from pkgmgr.core.git.queries import get_tags
 from pkgmgr.core.version.semver import SemVer, find_latest_version
 from pkgmgr.core.version.installed import (
     get_installed_python_version,
@@ -38,9 +38,9 @@ def _print_pkgmgr_self_version() -> None:
     # Common distribution/module naming variants.
     python_candidates = [
-        "package-manager", # PyPI dist name in your project
-        "package_manager", # module-ish variant
-        "pkgmgr", # console/alias-ish
+        "package-manager",  # PyPI dist name in your project
+        "package_manager",  # module-ish variant
+        "pkgmgr",  # console/alias-ish
     ]
     nix_candidates = [
         "pkgmgr",

View File

@@ -105,6 +105,7 @@ def dispatch_command(args, ctx: CLIContext) -> None:
if args.command == "update": if args.command == "update":
from pkgmgr.actions.update import UpdateManager from pkgmgr.actions.update import UpdateManager
UpdateManager().run( UpdateManager().run(
selected_repos=selected, selected_repos=selected,
repositories_base_dir=ctx.repositories_base_dir, repositories_base_dir=ctx.repositories_base_dir,
@@ -116,6 +117,7 @@ def dispatch_command(args, ctx: CLIContext) -> None:
quiet=args.quiet, quiet=args.quiet,
update_dependencies=args.dependencies, update_dependencies=args.dependencies,
clone_mode=args.clone_mode, clone_mode=args.clone_mode,
silent=getattr(args, "silent", False),
force_update=True, force_update=True,
) )
return return
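`getattr(args, "silent", False)` is used here (rather than `args.silent`) so dispatch keeps working for subcommands whose parsers never registered `--silent`. The pattern in isolation:

```python
import argparse


def silent_flag(args: argparse.Namespace) -> bool:
    # A default via getattr tolerates namespaces parsed without --silent.
    return getattr(args, "silent", False)
```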

View File

@@ -4,18 +4,18 @@ import argparse
 from pkgmgr.cli.proxy import register_proxy_commands
-from .common import SortedSubParsersAction
-from .install_update import add_install_update_subparsers
-from .config_cmd import add_config_subparsers
-from .navigation_cmd import add_navigation_subparsers
 from .branch_cmd import add_branch_subparsers
-from .release_cmd import add_release_subparser
-from .publish_cmd import add_publish_subparser
-from .version_cmd import add_version_subparser
 from .changelog_cmd import add_changelog_subparser
+from .common import SortedSubParsersAction
+from .config_cmd import add_config_subparsers
+from .install_update import add_install_update_subparsers
 from .list_cmd import add_list_subparser
 from .make_cmd import add_make_subparsers
 from .mirror_cmd import add_mirror_subparsers
+from .navigation_cmd import add_navigation_subparsers
+from .publish_cmd import add_publish_subparser
+from .release_cmd import add_release_subparser
+from .version_cmd import add_version_subparser
 
 
 def create_parser(description_text: str) -> argparse.ArgumentParser:
@@ -23,12 +23,34 @@ def create_parser(description_text: str) -> argparse.ArgumentParser:
         description=description_text,
         formatter_class=argparse.RawTextHelpFormatter,
     )
     subparsers = parser.add_subparsers(
         dest="command",
         help="Subcommands",
         action=SortedSubParsersAction,
     )
+    # create
+    p_create = subparsers.add_parser(
+        "create",
+        help="Create a new repository (scaffold + config).",
+    )
+    p_create.add_argument(
+        "identifiers",
+        nargs="+",
+        help="Repository identifier(s): URL or 'provider(:port)/owner/repo'.",
+    )
+    p_create.add_argument(
+        "--remote",
+        action="store_true",
+        help="Also push an initial commit to the remote (main/master).",
+    )
+    p_create.add_argument(
+        "--preview",
+        action="store_true",
+        help="Print actions without writing files or executing commands.",
+    )
     add_install_update_subparsers(subparsers)
     add_config_subparsers(subparsers)
     add_navigation_subparsers(subparsers)
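The new `create` subparser above can be exercised standalone; a stripped-down version (without `SortedSubParsersAction`, which lives in the project) parses as expected:

```python
import argparse

# Minimal reproduction of the 'create' subcommand wiring from this hunk.
parser = argparse.ArgumentParser()
subparsers = parser.add_subparsers(dest="command")
p_create = subparsers.add_parser("create", help="Create a new repository.")
p_create.add_argument("identifiers", nargs="+")
p_create.add_argument("--remote", action="store_true")
p_create.add_argument("--preview", action="store_true")

args = parser.parse_args(["create", "github.com/owner/repo", "--remote"])
```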

View File

@@ -33,8 +33,7 @@ def add_branch_subparsers(
"name", "name",
nargs="?", nargs="?",
help=( help=(
"Name of the new branch (optional; will be asked interactively " "Name of the new branch (optional; will be asked interactively if omitted)"
"if omitted)"
), ),
) )
branch_open.add_argument( branch_open.add_argument(
@@ -54,8 +53,7 @@ def add_branch_subparsers(
"name", "name",
nargs="?", nargs="?",
help=( help=(
"Name of the branch to close (optional; current branch is used " "Name of the branch to close (optional; current branch is used if omitted)"
"if omitted)"
), ),
) )
branch_close.add_argument( branch_close.add_argument(
@@ -84,8 +82,7 @@ def add_branch_subparsers(
"name", "name",
nargs="?", nargs="?",
help=( help=(
"Name of the branch to drop (optional; current branch is used " "Name of the branch to drop (optional; current branch is used if omitted)"
"if omitted)"
), ),
) )
branch_drop.add_argument( branch_drop.add_argument(
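All three branch subcommands rely on `nargs="?"` to make the positional `name` optional, defaulting to `None` so the command can fall back to prompting or to the current branch. In isolation:

```python
import argparse

p = argparse.ArgumentParser()
# Optional positional: parsing succeeds with or without a branch name.
p.add_argument("name", nargs="?", help="Branch name (optional)")

with_name = p.parse_args(["feature-x"])
without_name = p.parse_args([])
```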

View File

@@ -168,3 +168,10 @@ def add_install_update_arguments(subparser: argparse.ArgumentParser) -> None:
default="ssh", default="ssh",
help="Specify clone mode (default: ssh).", help="Specify clone mode (default: ssh).",
) )
_add_option_if_missing(
subparser,
"--silent",
action="store_true",
help="Continue with other repositories if one fails; downgrade errors to warnings.",
)
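`_add_option_if_missing` is not shown in this diff; presumably it guards against registering the same option twice on shared subparsers, since a second `add_argument("--silent")` would raise `argparse.ArgumentError`. A plausible sketch (the implementation below is a guess, and it peeks at the semi-private `parser._actions` list):

```python
import argparse


def add_option_if_missing(parser: argparse.ArgumentParser, flag: str, **kwargs) -> None:
    """Add an option only if no existing action already claims the flag,
    so shared options can be registered repeatedly without errors."""
    for action in parser._actions:
        if flag in action.option_strings:
            return
    parser.add_argument(flag, **kwargs)
```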

View File

@@ -20,7 +20,9 @@ def add_mirror_subparsers(subparsers: argparse._SubParsersAction) -> None:
         required=True,
     )
-    mirror_list = mirror_subparsers.add_parser("list", help="List configured mirrors for repositories")
+    mirror_list = mirror_subparsers.add_parser(
+        "list", help="List configured mirrors for repositories"
+    )
     add_identifier_arguments(mirror_list)
     mirror_list.add_argument(
         "--source",
@@ -29,15 +31,21 @@ def add_mirror_subparsers(subparsers: argparse._SubParsersAction) -> None:
         help="Which mirror source to show.",
     )
-    mirror_diff = mirror_subparsers.add_parser("diff", help="Show differences between config mirrors and MIRRORS file")
+    mirror_diff = mirror_subparsers.add_parser(
+        "diff", help="Show differences between config mirrors and MIRRORS file"
+    )
     add_identifier_arguments(mirror_diff)
     mirror_merge = mirror_subparsers.add_parser(
         "merge",
         help="Merge mirrors between config and MIRRORS file (example: pkgmgr mirror merge config file --all)",
     )
-    mirror_merge.add_argument("source", choices=["config", "file"], help="Source of mirrors.")
-    mirror_merge.add_argument("target", choices=["config", "file"], help="Target of mirrors.")
+    mirror_merge.add_argument(
+        "source", choices=["config", "file"], help="Source of mirrors."
+    )
+    mirror_merge.add_argument(
+        "target", choices=["config", "file"], help="Target of mirrors."
+    )
     add_identifier_arguments(mirror_merge)
     mirror_merge.add_argument(
         "--config-path",

Some files were not shown because too many files have changed in this diff.