repomatic.github package

GitHub integration package.

Submodules provide GitHub Actions utilities (actions), a gh CLI wrapper (gh), and helpers for issues, matrices, PR bodies, tokens, unsubscribing, and workflow syncing.

Submodules

repomatic.github.actions module

GitHub Actions output formatting, annotations, and workflow events.

This module provides utilities for working with GitHub Actions: multiline output formatting, workflow annotations, event payload loading, and GitHub-specific constants and enums shared across multiple modules.

Note

Concurrency quirks addressed by the workflows

SHA-based groups (``release.yaml``): cancel-in-progress is evaluated on the new workflow, not the old one. If a regular commit is pushed while a release workflow is running, the new workflow would cancel it (same group). Solution: release commits (freeze and unfreeze) get a unique group keyed by github.sha, so they can never be cancelled.

Event-scoped groups (``changelog.yaml``): changelog.yaml has both push and workflow_run triggers. Without event_name in the concurrency group, a fast-completing workflow_run event would cancel the push event’s prepare-release job, then skip prepare-release itself (guarded by if: event_name != 'workflow_run'), so it would never run. Including event_name prevents cross-event cancellation.

``workflow_run`` checkout ref: Always use github.sha (latest commit on the default branch), never workflow_run.head_sha (the commit that triggered the upstream workflow). After a release cycle adds commits (freeze + unfreeze), head_sha is stale and produces a tree that conflicts with current main.
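
As an illustration, the three rules above could be expressed with concurrency blocks along these lines (the group names and the checkout step are sketches, not the repository's actual workflow contents):

```yaml
# release.yaml: key release commits by SHA so nothing shares their group
# and a later push can never cancel them.
concurrency:
  group: release-${{ github.sha }}
  cancel-in-progress: true

# changelog.yaml: include event_name so push and workflow_run events
# never cancel each other's jobs.
concurrency:
  group: changelog-${{ github.event_name }}-${{ github.ref }}
  cancel-in-progress: true

# workflow_run jobs: check out the tip of the default branch, not the
# stale commit that triggered the upstream workflow.
steps:
  - uses: actions/checkout@v4
    with:
      ref: ${{ github.sha }}
```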

repomatic.github.actions.NULL_SHA = '0000000000000000000000000000000000000000'

The null SHA used by Git to represent a non-existent commit.

GitHub sends this value as the before SHA when a tag is created, since there is no previous commit to compare against.
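
For example, a script consuming push-event payloads might use the constant to detect ref creation (is_tag_creation is an illustrative helper, not part of this module):

```python
NULL_SHA = "0" * 40

def is_tag_creation(event: dict) -> bool:
    # GitHub sets "before" to the null SHA when the pushed ref is new,
    # e.g. a freshly created tag with no prior commit to diff against.
    return event.get("before") == NULL_SHA
```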

class repomatic.github.actions.WorkflowEvent(*values)[source]

Bases: StrEnum

Workflow events that can trigger a workflow run.

branch_protection_rule = 'branch_protection_rule'
check_run = 'check_run'
check_suite = 'check_suite'
create = 'create'
delete = 'delete'
deployment = 'deployment'
deployment_status = 'deployment_status'
discussion = 'discussion'
discussion_comment = 'discussion_comment'
fork = 'fork'
gollum = 'gollum'
issue_comment = 'issue_comment'
issues = 'issues'
label = 'label'
merge_group = 'merge_group'
milestone = 'milestone'
page_build = 'page_build'
project = 'project'
project_card = 'project_card'
project_column = 'project_column'
public = 'public'
pull_request = 'pull_request'
pull_request_comment = 'pull_request_comment'
pull_request_review = 'pull_request_review'
pull_request_review_comment = 'pull_request_review_comment'
pull_request_target = 'pull_request_target'
push = 'push'
registry_package = 'registry_package'
release = 'release'
repository_dispatch = 'repository_dispatch'
schedule = 'schedule'
status = 'status'
watch = 'watch'
workflow_call = 'workflow_call'
workflow_dispatch = 'workflow_dispatch'
workflow_run = 'workflow_run'
class repomatic.github.actions.AnnotationLevel(*values)[source]

Bases: Enum

Annotation levels for GitHub Actions workflow commands.

ERROR = 'error'
WARNING = 'warning'
NOTICE = 'notice'
repomatic.github.actions.generate_delimiter()[source]

Generate a unique delimiter for GitHub Actions multiline output.

GitHub Actions requires a unique delimiter to encode multiline values in $GITHUB_OUTPUT. This function generates a random delimiter that is extremely unlikely to appear in the output content.

The delimiter format is GHA_DELIMITER_NNNNNNNNN where N is a digit, producing a 9-digit random suffix.

Return type:

str

Returns:

A unique delimiter string.

repomatic.github.actions.format_multiline_output(name, value)[source]

Format a multiline value for GitHub Actions output.

Produces output in the heredoc format required by $GITHUB_OUTPUT:

name<<GHA_DELIMITER_NNNNNNNNN
value line 1
value line 2
GHA_DELIMITER_NNNNNNNNN
Parameters:
  • name (str) – The output variable name.

  • value (str) – The multiline value.

Return type:

str

Returns:

Formatted string for $GITHUB_OUTPUT.
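
A minimal sketch of the two functions above, assuming the documented delimiter and heredoc formats:

```python
import random

def generate_delimiter() -> str:
    # GHA_DELIMITER_ followed by a 9-digit random suffix.
    return f"GHA_DELIMITER_{random.randint(0, 999_999_999):09d}"

def format_multiline_output(name: str, value: str) -> str:
    # Heredoc encoding required by $GITHUB_OUTPUT for multiline values.
    delimiter = generate_delimiter()
    return f"{name}<<{delimiter}\n{value}\n{delimiter}"
```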

repomatic.github.actions.emit_annotation(level, message)[source]

Emit a GitHub Actions workflow annotation.

Prints a workflow command that creates an annotation visible in the GitHub Actions UI and PR checks.

Parameters:
  • level (AnnotationLevel | Literal['error', 'warning', 'notice']) – The annotation level (error, warning, or notice).

  • message (str) – The annotation message.

Return type:

None

repomatic.github.actions.get_github_event() dict[str, Any][source]

Load the GitHub event payload from GITHUB_EVENT_PATH.

Return type:

dict[str, Any]

Returns:

The parsed event payload, or empty dict if not available.

repomatic.github.dev_release module

Sync a rolling dev pre-release on GitHub.

Maintains a single draft pre-release that mirrors the unreleased changelog section and always carries the latest successful dev binaries and Python package. The dev tag (e.g. v6.1.1.dev0) is force-updated to point to the latest main commit — no tag proliferation.

When the current version’s dev release already exists, it is edited (not deleted and recreated) so that previously uploaded assets — especially compiled binaries — survive pushes that skip binary compilation (e.g. documentation-only changes). The upload_release_assets() function deletes all existing assets before uploading new ones, preventing stale files from accumulating when the naming scheme changes. Stale dev releases from previous versions are always deleted.

Note

Dev releases are created as drafts so they remain mutable even when GitHub’s immutable releases setting is enabled. Immutability only blocks asset uploads on published releases — deletion still works. But because the workflow needs to upload binaries after creation, the release must stay as a draft throughout its lifetime to allow asset uploads. See CLAUDE.md § Immutable releases.

repomatic.github.dev_release.DEV_ASSET_PATTERNS = ('*.bin', '*.exe', '*.whl', '*.tar.gz')

Glob patterns for dev release assets.

Note

Bare extensions (no repomatic- prefix) keep patterns generic so downstream repositories can reuse the same logic regardless of their package name.

repomatic.github.dev_release.sync_dev_release(changelog_path, version, nwo, dry_run=True, asset_dir=None)[source]

Create or update the dev pre-release on GitHub.

Reads the changelog, renders the release body for the given version via build_expected_body(), then either edits the existing dev release or creates a new one. Stale dev releases from previous versions are always cleaned up.

Existing releases are edited (not deleted and recreated) to preserve assets like compiled binaries from previous successful builds. When asset_dir is provided, existing assets are deleted and new ones uploaded via upload_release_assets().

Parameters:
  • changelog_path (Path) – Path to changelog.md.

  • version (str) – Current version string (e.g. 6.1.1.dev0).

  • nwo (str) – Repository name-with-owner (e.g. user/repo).

  • dry_run (bool) – If True, report without making changes.

  • asset_dir (Path | None) – Directory containing assets to upload. If None, no asset upload is performed.

Return type:

bool

Returns:

True if the release was synced (or would be in dry-run), False if the changelog section is empty.

repomatic.github.dev_release.upload_release_assets(tag, nwo, asset_dir)[source]

Upload assets to a GitHub release.

Scans asset_dir for files matching DEV_ASSET_PATTERNS. If no matching files are found, returns immediately without modifying the release — this preserves existing assets for documentation-only pushes. When files are found, all existing assets are deleted first to prevent stale files from accumulating when the naming scheme changes.

Parameters:
  • tag (str) – Git tag name (e.g. v6.1.1.dev0).

  • nwo (str) – Repository name-with-owner (e.g. user/repo).

  • asset_dir (Path) – Directory containing assets to upload.

Return type:

list[Path]

Returns:

List of uploaded file paths.

repomatic.github.dev_release.cleanup_dev_releases(nwo, *, keep_tag=None)[source]

Delete stale dev pre-releases from GitHub.

Lists all releases and deletes any whose tag ends with .dev0, except keep_tag which is preserved so its assets (e.g. compiled binaries) survive. This handles stale dev releases left behind after version bumps. Silently succeeds if no dev releases exist or if individual deletions fail.

Parameters:
  • nwo (str) – Repository name-with-owner (e.g. user/repo).

  • keep_tag (str | None) – Tag to preserve (e.g. v6.2.0.dev0). If None, all dev releases are deleted.

Return type:

None

repomatic.github.dev_release.delete_dev_release(version, nwo)[source]

Delete the dev pre-release and its tag from GitHub.

Silently succeeds if no dev release exists. This is used during real releases to clean up the dev pre-release for the version being released.

Parameters:
  • version (str) – Dev version string (e.g. 6.1.1.dev0).

  • nwo (str) – Repository name-with-owner (e.g. user/repo).

Return type:

None

repomatic.github.dev_release.delete_release_by_tag(tag, nwo)[source]

Delete a release and its tag from GitHub.

Silently succeeds if the release does not exist or cannot be deleted.

Parameters:
  • tag (str) – Git tag name (e.g. v6.1.1.dev0).

  • nwo (str) – Repository name-with-owner (e.g. user/repo).

Return type:

None

repomatic.github.gh module

Generic wrapper for the gh CLI.

Note

Workflow steps must set GH_TOKEN explicitly: GITHUB_TOKEN is a secret expression in GitHub Actions, not an automatic environment variable. The standard pattern is GH_TOKEN: ${{ secrets.REPOMATIC_PAT || github.token }} for steps that prefer a PAT, or GH_TOKEN: ${{ github.token }} otherwise.

As defense-in-depth, run_gh_command() promotes REPOMATIC_PAT to GH_TOKEN when set, and promotes GITHUB_TOKEN to GH_TOKEN when GH_TOKEN is absent. On 401 Bad Credentials (expired or revoked PAT), it retries with GITHUB_TOKEN if available and different.

repomatic.github.gh.run_gh_command(args)[source]

Run a gh CLI command and return stdout.

Token priority: REPOMATIC_PAT > GH_TOKEN > GITHUB_TOKEN. The gh CLI does not recognize REPOMATIC_PAT, so when set it is injected as GH_TOKEN. On 401 Bad Credentials the command is retried with GITHUB_TOKEN if available and different, letting CI jobs degrade gracefully to the standard Actions token instead of failing outright on a stale PAT.

Parameters:

args (list[str]) – Command arguments to pass to gh.

Return type:

str

Returns:

The stdout output from the command.

Raises:

RuntimeError – If the command fails (after fallback, if attempted).

repomatic.github.issue module

GitHub issue lifecycle management.

Generic primitives for listing, creating, updating, closing, and triaging GitHub issues via the gh CLI. Used by broken_links and potentially other modules that manage bot-created issues.

The lifecycle of issues created in CI jobs must be managed manually because the create-issue-from-file action blindly creates new issues ad nauseam.

See:

  • https://github.com/peter-evans/create-issue-from-file/issues/298

  • https://github.com/lycheeverse/lychee-action/issues/74#issuecomment-1587089689

repomatic.github.issue.list_issues(title='')[source]

List all issues (open and closed), optionally filtered by title.

Note

No --author filter is applied. When REPOMATIC_PAT is configured, gh authenticates as the token owner (not github-actions[bot]), so issues may be authored by either identity. Filtering by author would miss issues created under the other identity, breaking deduplication. The caller (triage_issues()) already matches by exact title, so author-agnostic listing is safe.

Parameters:

title (str) – If provided, only return issues whose title matches exactly.

Return type:

list[dict[str, Any]]

Returns:

List of issue dicts with number, title, createdAt, and state.

repomatic.github.issue.list_open_issues(title='')[source]

List open issues, optionally filtered by title.

Convenience wrapper around list_issues() that filters to open issues only and strips the state field for backward compatibility.

Parameters:

title (str) – If provided, only return issues whose title matches exactly.

Return type:

list[dict[str, Any]]

Returns:

List of issue dicts with number, title, and createdAt.

repomatic.github.issue.close_issue(number, comment)[source]

Close an issue with a comment.

Parameters:
  • number (int) – The issue number to close.

  • comment (str) – The comment to add when closing.

Return type:

None

repomatic.github.issue.reopen_issue(number, comment='')[source]

Reopen a previously closed issue.

Parameters:
  • number (int) – The issue number to reopen.

  • comment (str) – Optional comment to add when reopening.

Return type:

None

repomatic.github.issue.create_issue(body_file, labels, title)[source]

Create a new issue.

Parameters:
  • body_file (Path) – Path to the file containing the issue body.

  • labels (list[str]) – List of labels to apply.

  • title (str) – Issue title.

Return type:

int

Returns:

The created issue number.

repomatic.github.issue.update_issue(number, body_file)[source]

Update an existing issue body.

Parameters:
  • number (int) – The issue number to update.

  • body_file (Path) – Path to the file containing the new issue body.

Return type:

None

repomatic.github.issue.triage_issues(issues, title, needed)[source]

Triage issues matching a title for deduplication.

Parameters:
  • issues (list[dict]) – List of issue dicts from gh issue list --json number,title,createdAt,state. The state field is optional for backward compatibility; when absent it defaults to "OPEN".

  • title (str) – Issue title to match against.

  • needed (bool) – Whether an issue with this title should exist.

Return type:

tuple[bool, int | None, str | None, set[int]]

Returns:

A tuple of (issue_needed, issue_to_update, issue_state, issues_to_close).

If needed is True, the most recent matching issue is kept as issue_to_update (with its issue_state) and all older matching issues are collected in issues_to_close. If needed is False, all open matching issues are placed in issues_to_close (already-closed issues are skipped).
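
One plausible implementation of the documented contract (treating issue_needed as True only when no matching issue exists and a fresh one must be created is an assumption):

```python
def triage_issues(issues, title, needed):
    # Newest first, so the most recent matching issue is kept.
    matching = sorted(
        (i for i in issues if i["title"] == title),
        key=lambda i: i["createdAt"],
        reverse=True,
    )
    if needed:
        if not matching:
            # No existing issue: a new one must be created.
            return True, None, None, set()
        newest = matching[0]
        return (
            False,
            newest["number"],
            newest.get("state", "OPEN"),
            {i["number"] for i in matching[1:]},
        )
    # Condition cleared: close every still-open match, skip closed ones.
    return (
        False,
        None,
        None,
        {i["number"] for i in matching if i.get("state", "OPEN") == "OPEN"},
    )
```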

repomatic.github.issue.manage_issue_lifecycle(has_issues, body_file, labels, title, no_issues_comment='No more issues.')[source]

Manage the full issue lifecycle: list, triage, close, create/update.

This function handles:

  1. Listing all issues (open and closed) via gh issue list.

  2. Triaging matching issues (keep newest if needed, close duplicates).

  3. Closing duplicate open issues via gh issue close.

  4. Creating, updating, or reopening the main issue via gh issue create, gh issue edit, or gh issue reopen.

When has_issues is True and the most recent matching issue is closed, it is reopened and updated rather than creating a duplicate.

Parameters:
  • has_issues (bool) – Whether issues were found that warrant an open issue.

  • body_file (Path) – Path to the file containing the issue body.

  • labels (list[str]) – Labels to apply when creating a new issue.

  • title (str) – Issue title to match and create.

  • no_issues_comment (str) – Comment to add when closing issues because the condition no longer applies.

Return type:

None

repomatic.github.matrix module

class repomatic.github.matrix.Matrix(*args, **kwargs)[source]

Bases: object

A matrix as defined by GitHub’s actions workflows.

See the official GitHub documentation on how to implement job variations in a workflow.

Note

Why matrices are pre-computed in the metadata job

GitHub Actions matrix outputs are not cumulative — the last job in a matrix wins (community discussion). This makes a matrix-based job terminal in a dependency graph: no downstream job can depend on its aggregated outputs.

The workaround is a single preliminary metadata job that computes all matrices upfront. Downstream jobs depend on that job and consume the pre-built matrices, rather than computing them themselves.

This Matrix behaves like a dict and works everywhere a dict would, except that it is immutable, being based on FrozenDict. To populate the matrix, use the methods below.

The implementation preserves the order in which items were inserted. This provides a natural, visual ordering that eases inspection and debugging of large matrices.

matrix(ignore_includes=False, ignore_excludes=False)[source]

Returns a copy of the matrix.

The special include and exclude directives are added by default. You can selectively ignore either by passing the corresponding boolean parameter.

Return type:

FrozenDict[str, tuple[str, ...] | tuple[dict[str, str], ...]]

add_variation(variation_id, values)[source]
Return type:

None

replace_variation_value(variation_id, old, new)[source]

Replace a single value within a variation axis.

The new value takes the position of the old value. If the new value already exists elsewhere in the axis, the duplicate is removed by boltons.iterutils.unique().

Silently skips if the axis does not exist or does not contain the old value, making the operation idempotent.

Return type:

None

remove_variation_value(variation_id, value)[source]

Remove a single value from a variation axis.

If the axis becomes empty after removal, it is deleted entirely.

Silently skips if the axis does not exist or does not contain the value, making the operation idempotent.

Return type:

None

add_includes(*new_includes)[source]

Add one or more include special directives to the matrix.

Return type:

None

add_excludes(*new_excludes)[source]

Add one or more exclude special directives to the matrix.

Return type:

None

prune()[source]

Remove no-op exclude directives and log about them.

An exclude is a no-op when it references a key that is not a variation axis at all, or when the key exists but the value is not present in that axis. Either way the exclude can never match any combination produced by product(), and GitHub Actions rejects excludes that reference non-existent matrix keys.

Return type:

None

all_variations(with_matrix=True, with_includes=False, with_excludes=False)[source]

Collect all variations encountered in the matrix.

Extra variations mentioned in the special include and exclude directives will be ignored by default.

You can selectively expand or restrict the resulting inventory of variations by passing the corresponding with_matrix, with_includes and with_excludes boolean filter parameters.

Return type:

dict[str, tuple[str, ...]]

product(with_includes=False, with_excludes=False)[source]

Returns only the combinations of the base matrix by default.

You can optionally add any variation referenced in the include and exclude special directives.

Respects the order of variations and their values.

Return type:

Iterator[dict[str, str]]

solve(strict=False)[source]

Returns all combinations after applying the include and exclude constraints.

Caution

As per GitHub specifications, all include combinations are processed after exclude. This allows you to use include to add back combinations that were previously excluded.

Return type:

Iterator[dict[str, str]]
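
The solve semantics can be sketched with itertools.product. This simplified version treats an include purely as an extra combination, whereas GitHub's real include rules also merge keys into existing combinations:

```python
from itertools import product

def solve(axes, includes=(), excludes=()):
    # Cross-product of all variation axes, preserving insertion order.
    combos = [dict(zip(axes, values)) for values in product(*axes.values())]
    # An exclude removes every combination containing all of its pairs.
    combos = [
        combo for combo in combos
        if not any(
            all(combo.get(key) == value for key, value in exclude.items())
            for exclude in excludes
        )
    ]
    # Includes are processed after excludes, so they can re-add
    # combinations that an exclude removed.
    combos.extend(dict(include) for include in includes)
    return combos
```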

repomatic.github.pr_body module

Generate PR body with workflow metadata for auto-created pull requests.

Uses Metadata for CI context to produce a collapsible <details> block containing a metadata table. Template prefixes are loaded from markdown files in repomatic/templates/, optionally with YAML frontmatter for templates that require arguments.

Also provides sanitize_markdown_mentions() for neutralizing @mentions, #issue refs, and GitHub URLs in externally-sourced markdown before embedding it in PR or issue bodies.

repomatic.github.pr_body.sanitize_markdown_mentions(text)[source]

Neutralize @mentions, #issue refs, and GitHub URLs in markdown.

Prevents GitHub from auto-linking mentions and issue references in externally-sourced markdown (upstream release notes, third-party tool output) that would cause notification spam or accidental issue closure.

Uses a placeholder extraction approach: fenced code blocks and inline code spans are temporarily replaced with unique placeholders before sanitization, then restored afterward. This avoids the fragile “sanitize then restore” pattern that caused bugs in both Dependabot (2019 code-fence regression, dependabot/dependabot-core#1421) and Renovate (ongoing restoration pass edge cases, renovatebot/renovate#8823, renovatebot/renovate#2554).

Inserts a Unicode zero-width space (U+200B) after @ and # to break GitHub’s mention and issue parsers without affecting visual rendering. Rewrites github.com URLs to redirect.github.com to prevent backlink cross-references on upstream issues.

Parameters:

text (str) – Raw markdown text from an external source.

Return type:

str

Returns:

Sanitized markdown safe for embedding in a GitHub PR or issue body.

Note

Only call this on externally-sourced content (upstream release notes, third-party tool output). Do not call on content authored by the repository owner where mentions are intentional.
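
A simplified sketch of the core substitutions; it omits the code-block placeholder extraction described above, so it is not a faithful reimplementation:

```python
import re

ZWS = "\u200b"  # Unicode zero-width space

def sanitize_markdown_mentions(text: str) -> str:
    # Break @mentions and #123 issue refs with a zero-width space.
    text = re.sub(r"@(?=\w)", f"@{ZWS}", text)
    text = re.sub(r"#(?=\d)", f"#{ZWS}", text)
    # Neutralize backlinks by routing through redirect.github.com.
    return text.replace("https://github.com/", "https://redirect.github.com/")
```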

repomatic.github.pr_body.load_template(name)[source]

Load a PR body template from the repomatic/templates/ package.

Tries {name}.md.noformat first, then {name}.md. The .md.noformat extension is used for templates whose string.Template placeholders confuse mdformat (e.g. $rerun_row at the start of a table row is parsed as a cell value, breaking the table structure). See pr-metadata.md.noformat for the canonical example.

Parameters:

name (str) – Template name without extension (e.g. bump-version).

Return type:

tuple[dict[str, object], str]

Returns:

A tuple of (frontmatter metadata dict, template body string).

Raises:

FileNotFoundError – If neither file exists.

repomatic.github.pr_body.render_template(*names, **kwargs)[source]

Load and render one or more templates with variable substitution.

When multiple template names are given, each is rendered and joined with a blank line. The generated-footer attribution is appended once at the end if any of the templates wants it (i.e. does not have footer: false in its frontmatter).

Static templates (no $variable placeholders) are returned as-is. Dynamic templates use string.Template ($variable syntax) to avoid conflicts with markdown braces like [tool.repomatic].

Consecutive blank lines left by empty variables are collapsed to a single blank line.

Parameters:
  • names (str) – One or more template names without .md extension.

  • kwargs (str | None) – Variables to substitute into all templates.

Return type:

str

Returns:

The rendered markdown string.

repomatic.github.pr_body.render_title(name, **kwargs)[source]

Load and render a template’s PR title with variable substitution.

Parameters:
  • name (str) – Template name without .md extension.

  • kwargs (str | None) – Variables to substitute into the title.

Return type:

str

Returns:

The rendered title string.

Raises:

KeyError – If the template has no title in its frontmatter.

repomatic.github.pr_body.render_commit_message(name, **kwargs)[source]

Load and render a template’s commit message with variable substitution.

Falls back to the title if no commit_message is defined.

Parameters:
  • name (str) – Template name without .md extension.

  • kwargs (str | None) – Variables to substitute into the commit message.

Return type:

str

Returns:

The rendered commit message string.

repomatic.github.pr_body.template_args(name)[source]

Return the list of required arguments for a template.

Parameters:

name (str) – Template name without .md extension.

Return type:

list[str]

Returns:

List of argument names from the frontmatter args field.

repomatic.github.pr_body.get_template_names()[source]

Discover all available template names from the templates package.

Return type:

list[str]

Returns:

Sorted list of template names (without .md extension).

repomatic.github.pr_body.extract_workflow_filename(workflow_ref)[source]

Extract the workflow filename from GITHUB_WORKFLOW_REF.

Parameters:

workflow_ref (str | None) – The full workflow reference, e.g. owner/repo/.github/workflows/name.yaml@refs/heads/branch.

Return type:

str

Returns:

The workflow filename (e.g. name.yaml), or an empty string if the reference is empty or malformed.

repomatic.github.pr_body.generate_pr_metadata_block()[source]

Generate a collapsible metadata block from CI context.

Uses Metadata to read GITHUB_* environment variables and returns a markdown <details> block containing a table of workflow metadata fields.

Return type:

str

Returns:

A markdown string with the metadata block.

repomatic.github.pr_body.generate_refresh_tip()[source]

Generate a tip admonition inviting users to refresh the PR manually.

Uses Metadata for the repository URL and GITHUB_WORKFLOW_REF to build the workflow dispatch URL.

Return type:

str

Returns:

A GitHub-flavored markdown [!TIP] blockquote, or an empty string if the workflow reference is unavailable.

repomatic.github.pr_body.build_pr_body(prefix, metadata_block)[source]

Concatenate prefix, refresh tip, and metadata block into a PR body.

The metadata_block already includes the attribution footer (appended automatically by render_template()).

Parameters:
  • prefix (str) – Content to prepend before the metadata block. Can be empty.

  • metadata_block (str) – The collapsible metadata block from generate_pr_metadata_block(), with footer.

Return type:

str

Returns:

The complete PR body string.

repomatic.github.release_sync module

Sync GitHub release notes from changelog.md.

Compares each GitHub release body against the corresponding changelog.md section and updates any that have drifted. changelog.md is the single source of truth.

class repomatic.github.release_sync.SyncAction(*values)[source]

Bases: Enum

Action taken (or to be taken) on a release body.

DRY_RUN = 'dry_run'
FAILED = 'failed'
SKIPPED = 'skipped'
UPDATED = 'updated'
class repomatic.github.release_sync.SyncRow(action, version, release_url)[source]

Bases: object

Per-release detail for the markdown report table.

action: SyncAction
version: str
release_url: str
class repomatic.github.release_sync.SyncResult(dry_run=True, rows=<factory>, total=0, in_sync=0, drifted=0, updated=0, failed=0, missing_changelog=0)[source]

Bases: object

Accumulated results from a release-notes sync run.

dry_run: bool = True
rows: list[SyncRow]
total: int = 0
in_sync: int = 0
drifted: int = 0
updated: int = 0
failed: int = 0
missing_changelog: int = 0
repomatic.github.release_sync.build_expected_body(changelog, version, *, admonition_override=None)[source]

Build the expected release body from the changelog.

Decomposes the changelog section into discrete elements and renders them through the github-releases template. This allows the GitHub release body to include a different subset of elements than the release-notes template used for changelog.md entries.

Parameters:
  • changelog (Changelog) – Parsed changelog instance.

  • version (str) – Version string (e.g. 1.2.3).

  • admonition_override (str | None) – If provided, replaces the availability_admonition from the changelog. Used by release_notes_with_admonition to inject a pre-computed admonition at release time.

Return type:

str

Returns:

The rendered release body, or empty string if the version has no changelog section.

repomatic.github.release_sync.sync_github_releases(repo_url, changelog_path, dry_run=True)[source]

Sync GitHub release bodies from changelog.md.

For each released version in the changelog, compares the expected body (from changelog.md) with the actual GitHub release body. In live mode, updates drifted releases via gh release edit.

Parameters:
  • repo_url (str) – Repository URL (e.g. https://github.com/user/repo).

  • changelog_path (Path) – Path to changelog.md.

  • dry_run (bool) – If True, report without making changes.

Return type:

SyncResult

Returns:

Structured sync results.

repomatic.github.release_sync.render_sync_report(result)[source]

Render a markdown report from sync results.

Parameters:

result (SyncResult) – Structured results from the sync run.

Return type:

str

Returns:

Markdown report string.

repomatic.github.releases module

GitHub Releases API integration.

repomatic.github.releases.GITHUB_API_RELEASES_URL = 'https://api.github.com/repos/{owner}/{repo}/releases'

GitHub API URL for fetching all releases for a repository.

class repomatic.github.releases.GitHubRelease(date: str, body: str)[source]

Bases: NamedTuple

Release metadata for a single version from GitHub.

Create new instance of GitHubRelease(date, body)

date: str

Publication date in YYYY-MM-DD format.

body: str

Release description body (markdown).

repomatic.github.releases.get_github_releases(repo_url)[source]

Get versions and dates for all GitHub releases.

Fetches all releases via the GitHub API with pagination. Extracts version numbers by stripping the v prefix from tag names. Uses published_at (falling back to created_at) for the date.

Parameters:

repo_url (str) – Repository URL (e.g. https://github.com/user/repo).

Return type:

dict[str, GitHubRelease]

Returns:

Dict mapping version strings to GitHubRelease tuples. Empty dict if the request fails.

repomatic.github.token module

GitHub token validation utilities.

Provides early validation for CLI commands that depend on the GitHub API, so users get clear error messages at startup rather than opaque failures mid-execution.

Note

Why REPOMATIC_PAT is needed

GitHub’s GITHUB_TOKEN cannot modify workflow files in .github/. Neither contents: write, actions: write, nor permissions: write-all grants this ability. The only way to push changes to workflow YAML files is via a fine-grained Personal Access Token with the Workflows permission. Without it, pushes are rejected with:

! [remote rejected] branch_xxx -> branch_xxx (refusing to allow a
GitHub App to create or update workflow
`.github/workflows/my_workflow.yaml` without `workflows` permission)

Additionally, events triggered by GITHUB_TOKEN do not start new workflow runs (see GitHub docs), so tag pushes also need the PAT to trigger downstream workflows.

The Settings → Actions → General → Workflow permissions setting has no effect on this limitation — it’s a hard security boundary enforced by GitHub regardless of repository-level settings.

Jobs that use REPOMATIC_PAT:

  • autofix.yaml: fix-typos, sync-repomatic (PRs touching .github/workflows/ files).

  • changelog.yaml: prepare-release (freezes versions in workflow files).

  • release.yaml: create-tag (push triggers on.push.tags), create-release (triggers downstream workflows).

  • renovate.yaml: renovate (dependency PRs, status checks, dashboard, vulnerability alerts).

All jobs fall back to GITHUB_TOKEN when the PAT is unavailable (secrets.REPOMATIC_PAT || github.token), but operations requiring the workflows permission or workflow triggering will silently fail.

Token permission mapping:

  • Workflows — PRs that touch .github/workflows/ files.

  • Contents — Tag pushes, release publishing, PR branch creation.

  • Pull requests — All PR-creating jobs.

  • Commit statuses — Renovate stability-days status checks.

  • Dependabot alerts — Renovate vulnerability alert reading.

  • Issues — Renovate Dependency Dashboard.

  • Metadata — Required for all fine-grained token API operations.

repomatic.github.token.check_pat_contents_permission(repo)[source]

Check that the token has contents permission.

Tests read access via GET /repos/{owner}/{repo}/contents/.github.

Parameters:

repo (str) – Repository in ‘owner/repo’ format.

Return type:

tuple[bool, str]

Returns:

Tuple of (passed, message).

repomatic.github.token.check_pat_issues_permission(repo)[source]

Check that the token has issues permission.

Tests read access via GET /repos/{owner}/{repo}/issues.

Parameters:

repo (str) – Repository in ‘owner/repo’ format.

Return type:

tuple[bool, str]

Returns:

Tuple of (passed, message).

repomatic.github.token.check_pat_pull_requests_permission(repo)[source]

Check that the token has pull requests permission.

Tests read access via GET /repos/{owner}/{repo}/pulls.

Parameters:

repo (str) – Repository in ‘owner/repo’ format.

Return type:

tuple[bool, str]

Returns:

Tuple of (passed, message).

repomatic.github.token.check_pat_vulnerability_alerts_permission(repo)[source]

Check that the token has Dependabot alerts permission and alerts are enabled.

Tests access via GET /repos/{owner}/{repo}/dependabot/alerts?per_page=1. Returns 200 when the token has the vulnerability_alerts permission and alerts are enabled. Fails on 403 (token lacks the permission) or 404 (alerts not enabled).

Note

The older GET /repos/{owner}/{repo}/vulnerability-alerts endpoint requires the Administration: read fine-grained token permission, not Dependabot alerts. The Dependabot alerts listing endpoint used here correctly maps to the vulnerability_alerts permission scope.

Parameters:

repo (str) – Repository in ‘owner/repo’ format.

Return type:

tuple[bool, str]

Returns:

Tuple of (passed, message).
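The 200/403/404 semantics described above can be sketched as a small status classifier. This is a hypothetical helper, not part of the module's API — the real check performs the HTTP request itself:

```python
def classify_alerts_probe(status: int) -> tuple[bool, str]:
    """Map an HTTP status from the Dependabot alerts probe to (passed, message).

    Sketch of the semantics documented above: 200 means the token has the
    vulnerability_alerts permission and alerts are enabled; 403 means the
    token lacks the permission; 404 means alerts are not enabled.
    """
    if status == 200:
        return True, "Dependabot alerts permission granted and alerts enabled"
    if status == 403:
        return False, "token lacks the Dependabot alerts permission"
    if status == 404:
        return False, "Dependabot alerts are not enabled on this repository"
    return False, f"unexpected HTTP status {status}"

passed, message = classify_alerts_probe(403)
```

Every check_pat_* function in this module follows the same (passed, message) tuple convention, which is what lets PatPermissionResults aggregate them uniformly.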

repomatic.github.token.check_pat_workflows_permission(repo)[source]

Check that the token has workflows permission.

Tests access via GET /repos/{owner}/{repo}/actions/workflows. Fine-grained PATs with the Workflows permission get actions:read access. Without it, this endpoint returns 403.

Parameters:

repo (str) – Repository in ‘owner/repo’ format.

Return type:

tuple[bool, str]

Returns:

Tuple of (passed, message).

repomatic.github.token.check_commit_statuses_permission(repo, sha)[source]

Check that the token has commit statuses permission.

Required for Renovate to set stability-days status checks.

Parameters:
  • repo (str) – Repository in ‘owner/repo’ format.

  • sha (str) – Commit SHA to check.

Return type:

tuple[bool, str]

Returns:

Tuple of (passed, message).

class repomatic.github.token.PatPermissionResults(contents, issues, pull_requests, vulnerability_alerts, workflows, commit_statuses=None)[source]

Bases: object

Results of all PAT permission checks.

Each field holds a (passed, message) tuple from the corresponding check_pat_* function. The commit_statuses field is None when no commit SHA was available to probe.

contents: tuple[bool, str]

Result of check_pat_contents_permission().

issues: tuple[bool, str]

Result of check_pat_issues_permission().

pull_requests: tuple[bool, str]

Result of check_pat_pull_requests_permission().

vulnerability_alerts: tuple[bool, str]

Result of check_pat_vulnerability_alerts_permission().

workflows: tuple[bool, str]

Result of check_pat_workflows_permission().

commit_statuses: tuple[bool, str] | None = None

Result of check_commit_statuses_permission(), or None if skipped.

all_passed()[source]

Return True when every executed check passed.

Return type:

bool

failed()[source]

Return (field_name, message) pairs for each failed check.

Return type:

list[tuple[str, str]]

iter_results()[source]

Yield all non-None (passed, message) tuples.

Return type:

list[tuple[bool, str]]

repomatic.github.token.check_all_pat_permissions(repo, sha=None)[source]

Run all PAT permission checks and return structured results.

This is the single entry point for PAT permission validation. Both lint-repo and setup-guide call this function so that adding a new permission check benefits all consumers automatically.

Parameters:
  • repo (str) – Repository in ‘owner/repo’ format.

  • sha (str | None) – Commit SHA for the statuses check. When None, the commit_statuses field is set to None (skipped).

Return type:

PatPermissionResults

Returns:

PatPermissionResults with all check outcomes.

repomatic.github.token.validate_gh_token_env()[source]

Check that a GitHub token environment variable is set.

Lookup order: REPOMATIC_PAT > GH_TOKEN > GITHUB_TOKEN, matching run_gh_command.

Raises:

RuntimeError – If no variable is set.

Return type:

None

repomatic.github.token.validate_gh_api_access()[source]

Smoke-test the GitHub API and return parsed response.

Calls GET https://api.github.com/rate_limit with the token from environment variables.

Return type:

tuple[int, dict[str, str], str]

Returns:

Tuple of (status_code, headers, body).

Raises:

RuntimeError – If the API returns a 4xx/5xx status.

repomatic.github.token.validate_classic_pat_scope(required_scope)[source]

Validate that the GitHub token is a classic PAT with the required scope.

Checks:

  1. A GitHub token environment variable is set.

  2. GitHub API is reachable (smoke-test GET).

  3. Token is a classic PAT (has X-OAuth-Scopes header).

  4. Token has the required scope.

Parameters:

required_scope (str) – The OAuth scope to require (e.g. "notifications").

Return type:

list[str]

Returns:

The full list of scopes on the token.

Raises:

RuntimeError – If any check fails.

repomatic.github.unsubscribe module

Unsubscribe from closed, inactive GitHub notification threads.

Processes notification threads in two phases:

  1. REST notification threads — Fetches all Issue/PullRequest notification threads via /notifications, inspects each for closed + stale status, and unsubscribes via DELETE + PATCH.

  2. GraphQL threadless subscriptions — Searches for closed issues/PRs the user is involved in but that lack notification threads, and unsubscribes via the updateSubscription mutation.

Requires the gh CLI to be installed and authenticated with a token that has the notifications scope (classic PAT) or equivalent fine-grained permissions.

repomatic.github.unsubscribe.GRAPHQL_PAGE_SIZE = 25

Per-page count for GraphQL search results.

repomatic.github.unsubscribe.NOTIFICATION_PAGE_SIZE = 50

Per-page count for REST /notifications results.

repomatic.github.unsubscribe.NOTIFICATION_SUBJECT_TYPES = frozenset({'Issue', 'PullRequest'})

Notification subject types to process.

class repomatic.github.unsubscribe.ItemAction(*values)[source]

Bases: Enum

Action taken (or to be taken) on a notification item.

DRY_RUN = 'dry_run'
FAILED = 'failed'
UNSUBSCRIBED = 'unsubscribed'

class repomatic.github.unsubscribe.DetailRow(action, html_url, number, repo, title, updated_at)[source]

Bases: object

Per-item detail for the markdown report table.

action: ItemAction
html_url: str
number: int | None
repo: str
title: str
updated_at: datetime | None

class repomatic.github.unsubscribe.Phase1Result(batch_size=0, cutoff=None, newest_updated=None, oldest_updated=None, rows=<factory>, threads_failed=0, threads_inspected=0, threads_skipped_open=0, threads_skipped_recent=0, threads_skipped_unknown=0, threads_total=0, threads_unsubscribed=0)[source]

Bases: object

Accumulated counts and details from REST notification phase.

batch_size: int = 0
cutoff: datetime | None = None
newest_updated: datetime | None = None
oldest_updated: datetime | None = None
rows: list[DetailRow]
threads_failed: int = 0
threads_inspected: int = 0
threads_skipped_open: int = 0
threads_skipped_recent: int = 0
threads_skipped_unknown: int = 0
threads_total: int = 0
threads_unsubscribed: int = 0

class repomatic.github.unsubscribe.Phase2Result(batch_size=0, cutoff=None, graphql_failed=0, graphql_not_subscribed=0, graphql_total=0, graphql_unsubscribed=0, rows=<factory>, search_query='', skipped=False, skip_reason='')[source]

Bases: object

Accumulated counts and details from GraphQL threadless phase.

batch_size: int = 0
cutoff: datetime | None = None
graphql_failed: int = 0
graphql_not_subscribed: int = 0
graphql_total: int = 0
graphql_unsubscribed: int = 0
rows: list[DetailRow]
search_query: str = ''
skipped: bool = False
skip_reason: str = ''

class repomatic.github.unsubscribe.UnsubscribeResult(dry_run=False, months=3, phase1=<factory>, phase2=<factory>)[source]

Bases: object

Accumulated results from both unsubscribe phases.

dry_run: bool = False
months: int = 3
phase1: Phase1Result
phase2: Phase2Result

repomatic.github.unsubscribe.render_report(result)[source]

Render a markdown report from unsubscribe results.

Pure function that produces the same markdown structure as the downstream unsubscribe.yaml workflow’s $GITHUB_STEP_SUMMARY.

Parameters:

result (UnsubscribeResult) – Structured results from both phases.

Return type:

str

Returns:

Markdown report string.
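A sketch of the per-item detail table render_report produces from DetailRow entries. The column layout here is illustrative only: the real DetailRow also carries html_url, number, and updated_at, and the workflow summary's exact columns may differ:

```python
def detail_table(rows: list[tuple[str, str, str]]) -> str:
    """Render (action, repo, title) triples as a markdown table.

    Illustrative sketch of the report's detail section, not the module's
    exact output format.
    """
    lines = ["| Action | Repo | Title |", "| --- | --- | --- |"]
    lines += [f"| {action} | {repo} | {title} |" for action, repo, title in rows]
    return "\n".join(lines)


report = detail_table([("unsubscribed", "octocat/hello", "Fix the widget")])
```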

repomatic.github.unsubscribe.unsubscribe_threads(months, batch_size, dry_run)[source]

Unsubscribe from closed, inactive notification threads.

Runs two phases:

  1. REST notification threads — Fetches notification threads, inspects each subject for closed + stale status, and unsubscribes.

  2. GraphQL threadless subscriptions — Searches for closed issues/PRs the user is involved in and unsubscribes via mutation.

Parameters:
  • months (int) – Inactivity threshold in months.

  • batch_size (int) – Maximum threads/items to process per phase.

  • dry_run (bool) – If True, report what would be done without acting.

Return type:

UnsubscribeResult

Returns:

Structured results from both phases.

repomatic.github.workflow_sync module

Generation, sync, and lint for downstream workflows.

Downstream repositories consuming reusable workflows from kdeldycke/repomatic manually write caller workflows that often miss triggers like workflow_dispatch. This module provides tools to generate, synchronize, and lint those callers by parsing the canonical workflow definitions.

See WorkflowFormat for available output formats and their behavior.

Caution

PyYAML destroys formatting and comments on round-trip. Until we find a layout-preserving YAML parsing and rendering solution, we use raw text extraction to manipulate workflow files while preserving formatting and comments.
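The raw-text approach can be sketched as splitting on the top-level jobs: line, which keeps comments and formatting in the header intact (a simplified illustration of the technique, not the module's actual implementation):

```python
def split_at_jobs(content: str) -> tuple[str, str]:
    """Split workflow text into (header, jobs_section) at the top-level jobs: line.

    Sketch of the raw-text technique: no YAML round-trip, so comments and
    formatting in the header survive untouched.
    """
    lines = content.splitlines(keepends=True)
    for i, line in enumerate(lines):
        if line.rstrip("\n") == "jobs:":  # top-level key, no indentation
            return "".join(lines[:i]), "".join(lines[i:])
    raise ValueError("no top-level jobs: line found")


header, jobs = split_at_jobs("name: Tests  # CI\non:\n  push:\njobs:\n  test:\n    uses: x\n")
```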

class repomatic.github.workflow_sync.WorkflowFormat(*values)[source]

Bases: StrEnum

Output format for generated workflow files.

FULL_COPY = 'full-copy'

Verbatim copy of the canonical workflow file.

Creates or overwrites the target with the full upstream content. Useful for workflows that need no downstream customization.

HEADER_ONLY = 'header-only'

Sync only the header (name, on, concurrency) from upstream.

Replaces everything before the jobs: line in an existing downstream file with the canonical header. The downstream jobs: section is preserved. Requires the target file to already exist; does not create new files.

SYMLINK = 'symlink'

Create a symbolic link to the canonical workflow file.

Creates or overwrites the target as a symlink pointing to the upstream workflow in the bundled data directory.

THIN_CALLER = 'thin-caller'

Generate a minimal caller that delegates to the reusable upstream workflow.

Creates or overwrites the target with a lightweight workflow containing only name, on triggers, and a jobs: section that calls the upstream workflow via workflow_call. Only works for reusable workflows (those with a workflow_call trigger).

When the target file already exists and contains extra jobs beyond the managed caller job, those jobs are preserved and appended after the regenerated content.

repomatic.github.workflow_sync.DEFAULT_VERSION: Final[str] = 'main'

Default version reference for upstream workflows.

For release builds (e.g., repomatic==5.11.0), this resolves to the corresponding tag (v5.11.0). For development builds (5.11.1.dev0), it falls back to main since the tag does not exist yet.
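The resolution described above can be sketched as a hypothetical helper (the real logic lives alongside DEFAULT_VERSION and may differ in detail):

```python
def resolve_version_ref(package_version: str) -> str:
    """Map the installed package version to an upstream git ref.

    Sketch: release builds map to their v-prefixed tag; development builds
    (PEP 440 .dev suffix) fall back to main since no tag exists yet.
    """
    if ".dev" in package_version:
        return "main"
    return f"v{package_version}"


ref_release = resolve_version_ref("5.11.0")
ref_dev = resolve_version_ref("5.11.1.dev0")
```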

class repomatic.github.workflow_sync.WorkflowTriggerInfo(name, filename, non_call_triggers, call_inputs, call_secrets, has_workflow_call, concurrency, raw_concurrency)[source]

Bases: object

Parsed trigger information from a canonical workflow.

name: str

Workflow display name from the name: field.

filename: str

Workflow filename (e.g., release.yaml).

non_call_triggers: dict[str, Any]

All triggers except workflow_call, preserving their configuration.

call_inputs: dict[str, Any]

Inputs defined under workflow_call.inputs.

call_secrets: dict[str, Any]

Secrets defined under workflow_call.secrets.

has_workflow_call: bool

Whether the workflow defines a workflow_call trigger.

concurrency: dict[str, Any] | None

Parsed concurrency configuration, or None if absent.

raw_concurrency: str | None

Raw text of the concurrency block, preserving formatting and comments.

class repomatic.github.workflow_sync.LintResult(message, is_issue, level=AnnotationLevel.WARNING)[source]

Bases: object

Result of a single lint check.

message: str

Human-readable description of the finding.

is_issue: bool

Whether this result represents a problem.

level: AnnotationLevel = 'warning'

Severity level for GitHub Actions annotations.

repomatic.github.workflow_sync.extract_trigger_info(filename)[source]

Extract trigger information from a bundled canonical workflow.

Parses the workflow YAML and separates workflow_call configuration from other triggers.

Parameters:

filename (str) – Workflow filename (e.g., release.yaml).

Return type:

WorkflowTriggerInfo

Returns:

Parsed trigger information.

Raises:

FileNotFoundError – If the workflow file is not bundled.

repomatic.github.workflow_sync.generate_thin_caller(filename, repo='kdeldycke/repomatic', version='main', source_paths=None, commit_sha=None)[source]

Generate a thin caller workflow for a reusable canonical workflow.

The generated caller includes all non-workflow_call triggers from the canonical workflow, always ensures workflow_dispatch is present, and delegates to the upstream workflow via uses:.

When source_paths is provided, canonical paths: filters are adapted for the downstream project by replacing the upstream source directory glob with downstream equivalents. When None, paths are stripped entirely (conservative but correct — triggers on any file change).

When commit_sha is provided, the uses: reference is SHA-pinned (@sha # version) matching Renovate’s pin format. This eliminates Renovate’s initial “pin dependencies” PR on downstream repos.

Parameters:
  • filename (str) – Canonical workflow filename (e.g., release.yaml).

  • repo (str) – Upstream repository (default: kdeldycke/repomatic).

  • version (str) – Version reference (default: main).

  • source_paths (list[str] | None) – Downstream source directory names (e.g., ["extra_platforms"]). None strips all path filters.

  • commit_sha (str | None) – Full 40-character commit SHA for the version tag. When provided, produces @sha # version. When None, produces @version.

Return type:

str

Returns:

Complete YAML content for the thin caller workflow.

Raises:

ValueError – If the workflow does not support workflow_call.

repomatic.github.workflow_sync.identify_canonical_workflow(workflow_path, repo='kdeldycke/repomatic')[source]

Identify if a workflow is a thin caller for a canonical upstream workflow.

Scans jobs for a uses: reference matching the upstream repository pattern.

Parameters:
  • workflow_path (Path) – Path to the workflow file.

  • repo (str) – Upstream repository to match against.

Return type:

str | None

Returns:

Canonical workflow filename, or None if not a thin caller.

repomatic.github.workflow_sync.extract_extra_jobs(content, repo='kdeldycke/repomatic')[source]

Extract extra downstream jobs from an existing thin-caller workflow.

Parses the file with YAML to identify the managed thin-caller job (the one whose uses: references the upstream repository), then returns all raw text after that job: blank lines, comments, and additional job definitions.

Uses raw text slicing (not YAML round-tripping) to preserve formatting and comments, consistent with the rest of the module.

Parameters:
  • content (str) – Full workflow file content.

  • repo (str) – Upstream repository to match against.

Return type:

str

Returns:

Raw text of extra jobs (empty string when there are none).

repomatic.github.workflow_sync.check_has_workflow_dispatch(workflow_path)[source]

Check that a workflow has a workflow_dispatch trigger.

Parameters:

workflow_path (Path) – Path to the workflow file.

Return type:

LintResult

Returns:

Lint result.

repomatic.github.workflow_sync.check_version_pinned(workflow_path, repo='kdeldycke/repomatic')[source]

Check that a thin caller pins to a version tag, not @main.

Parameters:
  • workflow_path (Path) – Path to the workflow file.

  • repo (str) – Upstream repository to match against.

Return type:

LintResult

Returns:

Lint result.

repomatic.github.workflow_sync.check_triggers_match(workflow_path, canonical_filename)[source]

Check that a thin caller’s triggers match the canonical workflow.

Verifies that the caller includes all non-workflow_call triggers defined in the canonical workflow.

Parameters:
  • workflow_path (Path) – Path to the caller workflow file.

  • canonical_filename (str) – Filename of the canonical upstream workflow.

Return type:

LintResult

Returns:

Lint result.

repomatic.github.workflow_sync.check_secrets_passed(workflow_path, canonical_filename)[source]

Check that a thin caller passes all required secrets explicitly.

Verifies that every secret declared by the canonical workflow is forwarded by the caller, either via explicit secrets: mapping or via secrets: inherit.

Parameters:
  • workflow_path (Path) – Path to the caller workflow file.

  • canonical_filename (str) – Filename of the canonical upstream workflow.

Return type:

LintResult

Returns:

Lint result.

repomatic.github.workflow_sync.check_concurrency_match(workflow_path, canonical_filename)[source]

Check that a thin caller’s concurrency block matches the canonical workflow.

Compares parsed concurrency dicts so formatting differences are ignored.

Parameters:
  • workflow_path (Path) – Path to the caller workflow file.

  • canonical_filename (str) – Filename of the canonical upstream workflow.

Return type:

LintResult

Returns:

Lint result.

repomatic.github.workflow_sync.generate_workflow_header(filename, source_paths=None)[source]

Return the raw header of a canonical workflow.

The header is everything before the jobs: line (name, on triggers, concurrency, and any comments).

When source_paths is provided, upstream source directory references in paths: filters are replaced with downstream equivalents via text substitution. When None, the header is returned unmodified.

Parameters:
  • filename (str) – Canonical workflow filename (e.g., tests.yaml).

  • source_paths (list[str] | None) – Downstream source directory names (e.g., ["extra_platforms"]). None leaves paths unmodified.

Return type:

str

Returns:

Raw header text.

Raises:

FileNotFoundError – If the workflow file is not bundled.

repomatic.github.workflow_sync.run_workflow_lint(workflow_dir, repo='kdeldycke/repomatic', fatal=False)[source]

Lint all workflow files in a directory.

Runs check_has_workflow_dispatch on all YAML files, and caller-specific checks on files identified as thin callers.

Parameters:
  • workflow_dir (Path) – Directory containing workflow YAML files.

  • repo (str) – Upstream repository to match against.

  • fatal (bool) – If True, return exit code 1 when issues are found.

Return type:

int

Returns:

Exit code (0 for clean, 1 if fatal and issues found).

repomatic.github.workflow_sync.generate_workflows(names, output_format, version, repo, output_dir, overwrite, source_paths=None, commit_sha=None)[source]

Generate workflow files in the specified format.

Shared logic for the create and sync subcommands.

Parameters:
  • names (tuple[str, ...]) – Workflow filenames to generate. Empty tuple means all.

  • output_format (WorkflowFormat) – See WorkflowFormat for available formats.

  • version (str) – Version reference for thin callers.

  • repo (str) – Upstream repository.

  • output_dir (Path) – Directory to write files to.

  • overwrite (bool) – Whether to overwrite existing files.

  • source_paths (list[str] | None) – Downstream source directory names for paths: filters. None strips all path filters (conservative default).

  • commit_sha (str | None) – Full 40-character commit SHA for SHA-pinned uses: references. Passed through to generate_thin_caller().

Return type:

int

Returns:

Exit code (0 for success, 1 for errors).